WorldWideScience

Sample records for filters development verification

  1. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT: IN-DRAIN TREATMENT DEVICE. HYDRO INTERNATIONAL UP-FLO™ FILTER

    Science.gov (United States)

    Verification testing of the Hydro International Up-Flo™ Filter with one filter module and CPZ Mix™ filter media was conducted at the Penn State Harrisburg Environmental Engineering Laboratory in Middletown, Pennsylvania. The Up-Flo™ Filter is designed as a passive, modular filtr...

  2. Development and verification of a new wind speed forecasting system using an ensemble Kalman filter data assimilation technique in a fully coupled hydrologic and atmospheric model

    Science.gov (United States)

    Williams, John L.; Maxwell, Reed M.; Monache, Luca Delle

    2013-12-01

    Wind power is rapidly gaining prominence as a major source of renewable energy. Harnessing this promising energy source is challenging because of the chaotic and inherently intermittent nature of wind. Accurate forecasting tools are critical to support the integration of wind energy into power grids and to maximize its impact on renewable energy portfolios. We have adapted the Data Assimilation Research Testbed (DART), a community software facility which includes the ensemble Kalman filter (EnKF) algorithm, to expand our capability to use observational data to improve forecasts produced with a fully coupled hydrologic and atmospheric modeling system: the ParFlow (PF) hydrologic model and the Weather Research and Forecasting (WRF) mesoscale atmospheric model, coupled via mass and energy fluxes across the land surface to form the PF.WRF model. Numerous studies have shown that soil moisture distribution and land surface vegetative processes profoundly influence atmospheric boundary layer development and weather processes on local and regional scales. We have used the PF.WRF model to explore the connections between the land surface and the atmosphere in terms of land surface energy flux partitioning and coupled variable fields, including hydraulic conductivity, soil moisture, and wind speed, and demonstrated that reductions in uncertainty in these coupled fields realized through assimilation of soil moisture observations propagate through the hydrologic and atmospheric system. The sensitivities found in this study will enable further studies to optimize observation strategies to maximize the utility of the PF.WRF-DART forecasting system.
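
    As a point of reference for the assimilation step described above, the sketch below shows a generic stochastic (perturbed-observation) EnKF analysis in NumPy. It is not the DART or PF.WRF-DART implementation; the array shapes and the linear observation operator H are illustrative assumptions.

```python
import numpy as np

def enkf_update(ensemble, obs, obs_err_var, H):
    """Stochastic (perturbed-observation) EnKF analysis step.

    ensemble    : (n_state, n_members) forecast states
    obs         : (n_obs,) observation vector
    obs_err_var : (n_obs,) observation error variances
    H           : (n_obs, n_state) linear observation operator
    """
    obs = np.asarray(obs, dtype=float)
    obs_err_var = np.asarray(obs_err_var, dtype=float)
    n_state, n_members = ensemble.shape

    X = ensemble - ensemble.mean(axis=1, keepdims=True)   # state anomalies
    Y = H @ ensemble                                       # ensemble in obs space
    Yp = Y - Y.mean(axis=1, keepdims=True)                 # obs-space anomalies

    # Sample covariances (normalized by n_members - 1)
    Pxy = X @ Yp.T / (n_members - 1)
    Pyy = Yp @ Yp.T / (n_members - 1) + np.diag(obs_err_var)

    K = Pxy @ np.linalg.inv(Pyy)                           # Kalman gain

    # Perturb observations so the analysis spread stays statistically consistent
    rng = np.random.default_rng(0)
    obs_pert = obs[:, None] + rng.normal(
        0.0, np.sqrt(obs_err_var)[:, None], size=(obs.size, n_members))

    return ensemble + K @ (obs_pert - Y)
```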

  3. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT, BAGHOUSE FILTRATION PRODUCTS, TETRATEC PTFE PRODUCTS, TETRATEX 6212 FILTER SAMPLE

    Science.gov (United States)

    Baghouse filtration products (BFPs) were evaluated by the Air Pollution Control Technology (APCT) Verification Center. The performance factor verified was the mean outlet particle concentration for the filter fabric as a function of the size of those particles equal to and smalle...

  4. Software Verification and Validation Test Report for the HEPA filter Differential Pressure Fan Interlock System

    International Nuclear Information System (INIS)

    ERMI, A.M.

    2000-01-01

    The HEPA Filter Differential Pressure Fan Interlock System PLC ladder logic software was tested using a Software Verification and Validation (V&V) Test Plan as required by the "Computer Software Quality Assurance Requirements". The purpose of this document is to report on the results of the software qualification.

  5. Development of the code for filter calculation

    International Nuclear Information System (INIS)

    Gritzay, O.O.; Vakulenko, M.M.

    2012-01-01

    This paper describes a calculation method commonly used in the Neutron Physics Department to develop a new neutron filter or to improve an existing one. This calculation is the first step of the traditional filter development procedure. It allows easy selection of the qualitative and quantitative composition of a composite filter in order to obtain a filtered neutron beam with the given parameters.
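
    The record does not give the underlying relation, but composite neutron filters are conventionally sized with the exponential attenuation law T(E) = exp(-Σ_i N_i σ_i(E) t_i) over the filter components. The sketch below applies that relation at a single energy; the number densities, cross sections and thicknesses are hypothetical placeholders, not values from the paper.

```python
import numpy as np

BARN = 1.0e-24  # cm^2

# Illustrative two-component filter: (number density [atoms/cm^3],
# total cross section [barns] at the energy of interest, thickness [cm]).
components = [
    (8.49e22, 3.0, 20.0),   # main filter element (made-up values)
    (6.02e22, 0.5, 5.0),    # additional trimming element (made-up values)
]

def transmission(components):
    """Beam transmission through a composite filter: T = exp(-sum N*sigma*t)."""
    optical_thickness = sum(n * sigma * BARN * t for n, sigma, t in components)
    return np.exp(-optical_thickness)

print(f"transmission = {transmission(components):.3e}")
```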

  6. Development and verification of the CATHENA GUI

    International Nuclear Information System (INIS)

    Chin, T.

    2008-01-01

    This paper presents the development and verification of a graphical user interface for CATHENA MOD-3.5d. The thermalhydraulic computer code CATHENA has been developed to simulate the physical behaviour of the hydraulic components in nuclear reactors and experimental facilities. A representation of the facility is developed as an ASCII text file and used by CATHENA to perform the simulation. The existing method of manual generation of idealizations of a physical system for performing thermal hydraulic analysis is complex, time-consuming and prone to errors. An overview is presented of the CATHENA GUI and its depiction of a CATHENA idealization through the manipulation of a visual collection of objects. The methodologies and rigour involved in the verification of the CATHENA GUI will be discussed. (author)

  7. Technology development for producing nickel metallic filters

    International Nuclear Information System (INIS)

    Hubler, C.H.

    1990-01-01

    A technology for producing metallic filters was developed at the Instituto de Engenharia Nuclear (IEN-Brazilian CNEN) to supply the Instituto de Pesquisas Energeticas e Nucleares (IPEN-Brazilian CNEN) with nickel alloy filters used in the filtration of uranium hexafluoride. The experiments carried out to produce truncated-cone nickel filters by powder metallurgy are described. (M.C.K.)

  8. Development of film dosimetric measurement system for verification of RTP

    International Nuclear Information System (INIS)

    Chen Yong; Bao Shanglian; Ji Changguo; Zhang Xin; Wu Hao; Han Shukui; Xiao Guiping

    2007-01-01

    Objective: To develop a novel film dosimetry system based on a general laser scanner in order to verify patient-specific Radiotherapy Treatment Plans (RTPs) in three-Dimensional Adaptable Radiotherapy (3D ART) and Intensity Modulated Radiotherapy (IMRT). Methods: Several advanced methods, including film saturated development, wavelet filtering with multi-resolution thresholds and discrete Fourier reconstruction, are employed in this system to reduce the artifacts, noise and distortion induced by film digitizing with a general scanner; a set of coefficients derived from Monte Carlo (MC) simulation is adopted to correct the film over-response to low-energy scattered photons; and a set of newly emerging criteria, including the γ index and the Normalized Agreement Test (NAT) method, is employed to quantitatively evaluate the agreement of 2D dose distributions between the results measured by the films and those calculated by the Treatment Planning System (TPS), so as to obtain straightforward presentations, displays and results with high accuracy and reliability. Results: Radiotherapy doses measured by the developed system agree within 2% with those measured by an ionization chamber and the VeriSoft Film Dosimetry System, and the quantitative evaluation indexes are within 3%. Conclusions: The developed system can be used to accurately measure the radiotherapy dose and to reliably make quantitative evaluations for RTP dose verification. (authors)
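
    The γ index mentioned in the Methods can be illustrated with the brute-force 2D sketch below. It is a generic global-dose-difference implementation (3 mm / 3% by default), not the code used in the paper; the grid spacing and criteria are assumptions.

```python
import numpy as np

def gamma_index(dose_eval, dose_ref, spacing_mm, dta_mm=3.0, dd_frac=0.03):
    """Brute-force 2D gamma index with a global dose-difference criterion.

    dose_eval, dose_ref : 2D dose arrays on the same grid (e.g. film vs. TPS)
    spacing_mm          : pixel spacing in mm
    dta_mm, dd_frac     : distance-to-agreement and dose-difference criteria
    """
    ny, nx = dose_ref.shape
    yy, xx = np.mgrid[0:ny, 0:nx]
    dd_tol = dd_frac * dose_ref.max()
    gamma = np.empty_like(dose_eval, dtype=float)

    for iy in range(ny):
        for ix in range(nx):
            dist2 = ((yy - iy) ** 2 + (xx - ix) ** 2) * spacing_mm ** 2
            dose2 = (dose_ref - dose_eval[iy, ix]) ** 2
            gamma[iy, ix] = np.sqrt(np.min(dist2 / dta_mm ** 2 + dose2 / dd_tol ** 2))
    return gamma

# Pass rate: fraction of points with gamma <= 1, e.g.
# pass_rate = np.mean(gamma_index(film_dose, tps_dose, spacing_mm=1.0) <= 1.0)
```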

  9. Development of circular filters for active facilities

    International Nuclear Information System (INIS)

    Pratt, R.P.

    1986-01-01

    An assessment of problems associated with remote handling, changing and disposal of filters suggested that significant improvements to filtration systems could be made if circular geometries were adopted in place of conventional systems. Improved systems have been developed and are now available for a range of applications and air flow rates. Where primary filters are installed within the active cell or cave, circular filters incorporating a lip seal have been developed which enable the filters to be sealed into the facility without recourse to clamping. For smaller cells, a range of push-through filter change systems have been developed, the principal feature being that the filter is passed into the housing from the clean side, but transferred from the housing directly into the cell for subsequent disposal. For plant room applications, circular bag change canister systems have been developed which ease the sealing and bag change operation. Such systems have a rated air flow of up to 3000 m³/h whilst still allowing ultimate disposal via the 200 litre waste drum route without prior volume reduction of the filter inserts. (author)

  10. ENVIRONMENTAL TECHNOLOGY VERIFICATION, TEST REPORT OF CONTROL OF BIOAEROSOLS IN HVAC SYSTEMS, FILTRATION GROUP, AEROSTAR FP-98 MINIPLEAT V-BANK FILTER

    Science.gov (United States)

    The Environmental Technology Verification report discusses the technology and performance of the AeroStar FP-98 Minipleat V-Bank Filter air filter for dust and bioaerosol filtration manufactured by Filtration Group. The pressure drop across the filter was 137 Pa clean and 348 Pa ...

  11. Design and experimental verification of a dual-band metamaterial filter

    Science.gov (United States)

    Zhu, Hong-Yang; Yao, Ai-Qin; Zhong, Min

    2016-10-01

    In this paper, we present the design, simulation, and experimental verification of a dual-band free-standing metamaterial filter operating in a frequency range of 1 THz-30 THz. The proposed structure consists of periodically arranged composite air holes, and exhibits two broad and flat transmission bands. To clarify the effects of the structural parameters on both resonant transmission bands, three sets of experiments are performed. The first resonant transmission band shows a shift towards higher frequency when the side width w₁ of the main air hole is increased. In contrast, the second resonant transmission band displays a shift towards lower frequency when the side width w₂ of the sub-holes is increased, while the first resonant transmission band is unchanged. The measured results indicate that these resonant bands can be modulated individually by simply optimizing the relevant structural parameters (w₁ or w₂) for the required band. In addition, these resonant bands merge into a single resonant band with a bandwidth of 7.7 THz when w₁ and w₂ are optimized simultaneously. The structure proposed in this paper adopts different resonant mechanisms for transmission at different frequencies and thus offers a method to achieve a dual-band and low-loss filter. Project supported by the Doctorate Scientific Research Foundation of Hezhou University, China (Grant No. HZUBS201503), the Promotion of the Basic Ability of Young and Middle-aged Teachers in Universities Project of Guangxi Zhuang Autonomous Region, China (Grant No. KY2016YB453), the Guangxi Colleges and Universities Key Laboratory Symbolic Computation, China, Engineering Data Processing and Mathematical Support Autonomous Discipline Project of Hezhou University, China (Grant No. 2016HZXYSX01).

  12. Environmental Technology Verification: Baghouse Filtration Products--TDC Filter Manufacturing, Inc., SB025 Filtration Media

    Science.gov (United States)

    The U.S. Environmental Protection Agency (EPA) has created the Environmental Technology Verification (ETV) Program to facilitate the deployment of innovative or improved environmental technologies through performance verification and dissemination of information. ETV seeks to ach...

  13. The development of the spatially correlated adjustment wavelet filter for atomic force microscopy data.

    Science.gov (United States)

    Sikora, Andrzej; Rodak, Aleksander; Unold, Olgierd; Klapetek, Petr

    2016-12-01

    In this paper a novel approach to the practical use of a 2D wavelet filter for removing artifacts from atomic force microscopy measurement results is presented. Additional data, such as the summed photodiode signal map, are used to identify the areas requiring processing, to optimize the filtering settings and to verify the performance of the process. Such an approach allows the filtering parameters to be adjusted by an average user, whereas the straightforward method requires expertise in this field. The procedure was implemented as a function of the Gwyddion software. Examples of filtering phase-imaging and Electrostatic Force Microscopy measurement results are presented. As the wavelet filter can remove local artifacts, its superior efficiency over a similar approach based on a 2D Fast Fourier Transform (2D FFT) filter can be noticed. Copyright © 2016 Elsevier B.V. All rights reserved.
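
    As an illustration of the general class of filter discussed here (not the spatially correlated adjustment filter itself), the sketch below shows a plain 2D wavelet soft-threshold denoise using PyWavelets; the wavelet, decomposition level and threshold rule are arbitrary choices.

```python
import numpy as np
import pywt

def wavelet_denoise_2d(image, wavelet="db4", level=3, threshold=None):
    """Generic 2D wavelet soft-threshold denoising (not the paper's exact filter)."""
    coeffs = pywt.wavedec2(image, wavelet, level=level)
    if threshold is None:
        # Universal threshold from a noise estimate on the finest diagonal detail
        sigma = np.median(np.abs(coeffs[-1][-1])) / 0.6745
        threshold = sigma * np.sqrt(2.0 * np.log(image.size))
    new_coeffs = [coeffs[0]] + [
        tuple(pywt.threshold(d, threshold, mode="soft") for d in detail)
        for detail in coeffs[1:]
    ]
    rec = pywt.waverec2(new_coeffs, wavelet)
    return rec[:image.shape[0], :image.shape[1]]  # crop padding for odd sizes
```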

  14. MACCS2 development and verification efforts

    International Nuclear Information System (INIS)

    Young, M.; Chanin, D.

    1997-01-01

    MACCS2 represents a major enhancement of the capabilities of its predecessor MACCS, the MELCOR Accident Consequence Code System. MACCS, released in 1987, was developed to estimate the potential impacts to the surrounding public of severe accidents at nuclear power plants. The principal phenomena considered in MACCS/MACCS2 are atmospheric transport and deposition under time-variant meteorology, short-term and long-term mitigative actions and exposure pathways, deterministic and stochastic health effects, and economic costs. MACCS2 was developed as a general-purpose analytical tool applicable to diverse reactor and nonreactor facilities. The MACCS2 package includes three primary enhancements: (1) a more flexible emergency response model, (2) an expanded library of radionuclides, and (3) a semidynamic food-chain model. In addition, errors that had been identified in MACCS version 1.5.11.1 were corrected, including an error that prevented the code from providing intermediate-phase results. The MACCS2 version 1.10 beta test was released to the beta-test group in May 1995. In addition, the University of New Mexico (UNM) has completed an independent verification study of the code package. Since the beta-test release of MACCS2 version 1.10, a number of minor errors have been identified and corrected, and a number of enhancements have been added to the code package. The code enhancements added since the beta-test release of version 1.10 include: (1) an option to allow the user to input the σ_y and σ_z plume expansion parameters in a table-lookup form for incremental downwind distances, (2) an option to define different initial dimensions for up to four segments of a release, (3) an enhancement to the COMIDA2 food-chain model preprocessor to allow the user to supply externally calculated tables of tritium food-chain dose per unit deposition on farmland to support analyses of tritium releases, and (4) the capability to calculate direction-dependent doses
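
    Enhancement (1) above amounts to interpolating user-supplied dispersion tables. A minimal sketch of such a table lookup is given below; the tabulated values and the log-log interpolation rule are illustrative assumptions, not MACCS2's internal scheme.

```python
import numpy as np

# Hypothetical tabulated plume-expansion parameters at incremental downwind distances
distances_m = np.array([100.0, 300.0, 1000.0, 3000.0, 10000.0])
sigma_y_m   = np.array([  8.0,  22.0,   68.0,  180.0,   510.0])
sigma_z_m   = np.array([  5.0,  14.0,   45.0,  110.0,   290.0])

def lookup_sigma(x_m):
    """Log-log interpolation between tabulated sigma_y / sigma_z values."""
    logx = np.log(x_m)
    sy = np.exp(np.interp(logx, np.log(distances_m), np.log(sigma_y_m)))
    sz = np.exp(np.interp(logx, np.log(distances_m), np.log(sigma_z_m)))
    return sy, sz

print(lookup_sigma(500.0))  # interpolated (sigma_y, sigma_z) at 500 m downwind
```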

  15. MCNP5 development, verification, and performance

    International Nuclear Information System (INIS)

    Brown, Forrest B.

    2003-01-01

    MCNP is a well-known and widely used Monte Carlo code for neutron, photon, and electron transport simulations. During the past 18 months, MCNP was completely reworked to provide MCNP5, a modernized version with many new features, including plotting enhancements, photon Doppler broadening, radiography image tallies, enhancements to source definitions, improved variance reduction, improved random number generator, tallies on a superimposed mesh, and edits of criticality safety parameters. Significant improvements in software engineering and adherence to standards have been made. Over 100 verification problems have been used to ensure that MCNP5 produces the same results as before and that all capabilities have been preserved. Testing on large parallel systems shows excellent parallel scaling. (author)

  16. MCNP5 development, verification, and performance

    Energy Technology Data Exchange (ETDEWEB)

    Brown, Forrest B. [Los Alamos National Laboratory (United States)]

    2003-07-01

    MCNP is a well-known and widely used Monte Carlo code for neutron, photon, and electron transport simulations. During the past 18 months, MCNP was completely reworked to provide MCNP5, a modernized version with many new features, including plotting enhancements, photon Doppler broadening, radiography image tallies, enhancements to source definitions, improved variance reduction, improved random number generator, tallies on a superimposed mesh, and edits of criticality safety parameters. Significant improvements in software engineering and adherence to standards have been made. Over 100 verification problems have been used to ensure that MCNP5 produces the same results as before and that all capabilities have been preserved. Testing on large parallel systems shows excellent parallel scaling. (author)

  17. Efficient Development and Verification of Safe Railway Control Software

    DEFF Research Database (Denmark)

    Haxthausen, Anne Elisabeth; Peleska, Jan

    2013-01-01

    the monitoring process; hydraulic absorbers as dampers to dissipate the energy of oscillations in railway electric equipment; development of train fare calculation and adjustment systems using VDM++; efficient development and verification of safe railway control software; and evolution of the connectivity...

  18. An Unattended Verification Station for UF6 Cylinders: Development Status

    International Nuclear Information System (INIS)

    Smith, E.; McDonald, B.; Miller, K.; Garner, J.; March-Leuba, J.; Poland, R.

    2015-01-01

    In recent years, the International Atomic Energy Agency (IAEA) has pursued innovative techniques and an integrated suite of safeguards measures to address the verification challenges posed by advanced centrifuge technologies and the growth in separative work unit capacity at modern centrifuge enrichment plants. These measures would include permanently installed, unattended instruments capable of performing the routine and repetitive measurements previously performed by inspectors. Among the unattended instruments currently being explored by the IAEA is an Unattended Cylinder Verification Station (UCVS) that could provide independent verification of the declared relative enrichment, U-235 mass and total uranium mass of all declared cylinders moving through the plant, as well as the application and verification of a "Non-destructive Assay Fingerprint" to preserve verification knowledge of the contents of each cylinder throughout its life in the facility. As the IAEA's vision for a UCVS has evolved, Pacific Northwest National Laboratory (PNNL) and Los Alamos National Laboratory have been developing and testing candidate non-destructive assay (NDA) methods for inclusion in a UCVS. Modeling and multiple field campaigns have indicated that these methods are capable of assaying relative cylinder enrichment with a precision comparable to or substantially better than today's high-resolution handheld devices, without the need for manual wall-thickness corrections. In addition, the methods interrogate the full volume of the cylinder, thereby offering the IAEA a new capability to assay the absolute U-235 mass in the cylinder, and much-improved sensitivity to substituted or removed material. Building on this prior work, and under the auspices of the United States Support Programme to the IAEA, a UCVS field prototype is being developed and tested. This paper provides an overview of: a) hardware and software design of the prototypes, b) preparation

  19. The development of the spatially correlated adjustment wavelet filter for atomic force microscopy data

    Energy Technology Data Exchange (ETDEWEB)

    Sikora, Andrzej, E-mail: sikora@iel.wroc.pl [Electrotechnical Institute, Division of Electrotechnology and Materials Science, M. Skłodowskiej-Curie 55/61, 50-369 Wrocław (Poland)]; Rodak, Aleksander [Faculty of Electronics, Wrocław University of Technology, Janiszewskiego 11/17, 50-372 Wrocław (Poland)]; Unold, Olgierd [Institute of Computer Engineering, Control and Robotics, Faculty of Electronics, Wrocław University of Technology, Janiszewskiego 11/17, 50-372 Wrocław (Poland)]; Klapetek, Petr [Czech Metrology Institute, Okružní 31, 638 00 Brno (Czech Republic)]

    2016-12-15

    In this paper a novel approach to the practical use of a 2D wavelet filter for removing artifacts from atomic force microscopy measurement results is presented. Additional data, such as the summed photodiode signal map, are used to identify the areas requiring processing, to optimize the filtering settings and to verify the performance of the process. Such an approach allows the filtering parameters to be adjusted by an average user, whereas the straightforward method requires expertise in this field. The procedure was implemented as a function of the Gwyddion software. Examples of filtering phase-imaging and Electrostatic Force Microscopy measurement results are presented. As the wavelet filter can remove local artifacts, its superior efficiency over a similar approach based on a 2D Fast Fourier Transform (2D FFT) filter can be noticed. - Highlights: • A novel approach to a 2D wavelet-based filter for atomic force microscopy is shown. • An additional AFM measurement signal is used to adjust the filter. • Efficient removal of artifacts caused by local interference phenomena is presented.

  20. The development of the spatially correlated adjustment wavelet filter for atomic force microscopy data

    International Nuclear Information System (INIS)

    Sikora, Andrzej; Rodak, Aleksander; Unold, Olgierd; Klapetek, Petr

    2016-01-01

    In this paper a novel approach to the practical use of a 2D wavelet filter for removing artifacts from atomic force microscopy measurement results is presented. Additional data, such as the summed photodiode signal map, are used to identify the areas requiring processing, to optimize the filtering settings and to verify the performance of the process. Such an approach allows the filtering parameters to be adjusted by an average user, whereas the straightforward method requires expertise in this field. The procedure was implemented as a function of the Gwyddion software. Examples of filtering phase-imaging and Electrostatic Force Microscopy measurement results are presented. As the wavelet filter can remove local artifacts, its superior efficiency over a similar approach based on a 2D Fast Fourier Transform (2D FFT) filter can be noticed. - Highlights: • A novel approach to a 2D wavelet-based filter for atomic force microscopy is shown. • An additional AFM measurement signal is used to adjust the filter. • Efficient removal of artifacts caused by local interference phenomena is presented.

  1. Enrichment Assay Methods Development for the Integrated Cylinder Verification System

    International Nuclear Information System (INIS)

    Smith, Leon E.; Misner, Alex C.; Hatchell, Brian K.; Curtis, Michael M.

    2009-01-01

    International Atomic Energy Agency (IAEA) inspectors currently perform periodic inspections at uranium enrichment plants to verify UF6 cylinder enrichment declarations. Measurements are typically performed with handheld high-resolution sensors on a sampling of cylinders taken to be representative of the facility's entire product-cylinder inventory. Pacific Northwest National Laboratory (PNNL) is developing a concept to automate the verification of enrichment plant cylinders to enable 100 percent product-cylinder verification and potentially, mass-balance calculations on the facility as a whole (by also measuring feed and tails cylinders). The Integrated Cylinder Verification System (ICVS) could be located at key measurement points to positively identify each cylinder, measure its mass and enrichment, store the collected data in a secure database, and maintain continuity of knowledge on measured cylinders until IAEA inspector arrival. The three main objectives of this FY09 project are summarized here and described in more detail in the report: (1) Develop a preliminary design for a prototype NDA system, (2) Refine PNNL's MCNP models of the NDA system, and (3) Procure and test key pulse-processing components. Progress against these tasks to date, and next steps, are discussed.

  2. Enrichment Assay Methods Development for the Integrated Cylinder Verification System

    Energy Technology Data Exchange (ETDEWEB)

    Smith, Leon E.; Misner, Alex C.; Hatchell, Brian K.; Curtis, Michael M.

    2009-10-22

    International Atomic Energy Agency (IAEA) inspectors currently perform periodic inspections at uranium enrichment plants to verify UF6 cylinder enrichment declarations. Measurements are typically performed with handheld high-resolution sensors on a sampling of cylinders taken to be representative of the facility's entire product-cylinder inventory. Pacific Northwest National Laboratory (PNNL) is developing a concept to automate the verification of enrichment plant cylinders to enable 100 percent product-cylinder verification and potentially, mass-balance calculations on the facility as a whole (by also measuring feed and tails cylinders). The Integrated Cylinder Verification System (ICVS) could be located at key measurement points to positively identify each cylinder, measure its mass and enrichment, store the collected data in a secure database, and maintain continuity of knowledge on measured cylinders until IAEA inspector arrival. The three main objectives of this FY09 project are summarized here and described in more detail in the report: (1) Develop a preliminary design for a prototype NDA system, (2) Refine PNNL's MCNP models of the NDA system, and (3) Procure and test key pulse-processing components. Progress against these tasks to date, and next steps, are discussed.

  3. The development rainfall forecasting using kalman filter

    Science.gov (United States)

    Zulfi, Mohammad; Hasan, Moh.; Dwidja Purnomo, Kosala

    2018-04-01

    Rainfall forecasting is very useful for agricultural planning, since rainfall information supports decisions about planting plans for certain commodities. In this study, rainfall forecasting is carried out with ARIMA and Kalman filter methods. The Kalman filter method is used to express a time series model in linear state-space form in order to produce future forecasts, using a recursive solution that minimizes the error. The rainfall data in this research were clustered by K-means clustering, and the Kalman filter method was then used for modelling and forecasting rainfall in each cluster. We used an ARIMA(p,d,q) model to construct the state space for the Kalman filter model, so there are four groups of data and one model for each group. In conclusion, the Kalman filter method is better than the ARIMA model for rainfall forecasting in each group, as its forecasting error is smaller than that of the ARIMA model.
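
    For reference, the sketch below shows the standard linear Kalman filter recursion for a state-space model such as an ARIMA model cast in state-space form; the matrices F, H, Q and R are assumed inputs, and the code is not taken from the study.

```python
import numpy as np

def kalman_filter(y, F, H, Q, R, x0, P0):
    """Linear Kalman filter for x_t = F x_{t-1} + w_t,  y_t = H x_t + v_t.

    y      : iterable of scalar observations (e.g. monthly rainfall)
    F, Q   : state transition matrix and process noise covariance
    H, R   : observation matrix (1 x n) and observation noise covariance (1 x 1)
    x0, P0 : initial state estimate and covariance
    """
    x, P = x0.copy(), P0.copy()
    filtered = []
    for yt in y:
        # Predict
        x = F @ x
        P = F @ P @ F.T + Q
        # Update
        S = H @ P @ H.T + R                 # innovation covariance
        K = P @ H.T @ np.linalg.inv(S)      # Kalman gain
        x = x + K @ (np.atleast_1d(yt) - H @ x)
        P = (np.eye(len(x)) - K @ H) @ P
        filtered.append((H @ x)[0])         # one-step filtered observation
    return np.array(filtered)
```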

  4. Current Status of Aerosol Generation and Measurement Facilities for the Verification Test of Containment Filtered Venting System in KAERI

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Sung Il; An, Sang Mo; Ha, Kwang Soon; Kim, Hwan Yeol [KAERI, Daejeon (Korea, Republic of)

    2016-05-15

    In this study, the design of the aerosol generation and measurement systems is explained, the present circumstances are described, and the aerosol test plan is shown. The Containment Filtered Venting System (CFVS) is one of the safety features intended to reduce the amount of fission products released into the environment by depressurizing the containment. Since the Chernobyl accident, regulatory agencies in several European countries such as France, Germany and Sweden have demanded the installation of a CFVS, and a feasibility study on the CFVS was also performed in the U.S. After the Fukushima accident, there is a need in Korea to improve containment venting or to install a depressurization facility. As a part of a Ministry of Trade, Industry and Energy (MOTIE) project, KAERI has been conducting an integrated performance verification test of the CFVS. As a part of the test, aerosol generation and measurement systems were designed to simulate the fission product behavior, and they have been manufactured. The component operating conditions were determined in consideration of severe accident conditions. The test will be performed under normal conditions at first, and will then be conducted under severe conditions of high pressure and high temperature. Difficulties which may disturb the test are expected, such as thermophoresis on the pipe and vapor condensation on the aerosol.

  5. ENVIRONMENTAL TECHNOLOGY VERIFICATION, TEST REPORT OF MOBILE SOURCE EMISSIONS CONTROL DEVICES: CLEAN DIESEL TECHNOLOGIES FUEL-BORNE CATALYST WITH MITSUI/PUREARTH CATALYZED WIRE MESH FILTER

    Science.gov (United States)

    The Environmental Technology Verification report discusses the technology and performance of the Fuel-Borne Catalyst with Mitsui/PUREarth Catalyzed Wire Mesh Filter manufactured by Clean Diesel Technologies, Inc. The technology is a platinum/cerium fuel-borne catalyst in commerci...

  6. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT - PHYSICAL REMOVAL OF MICROBIOLOGICAL AND PARTICULATE CONTAMINANTS IN DRINKING WATER : SEPARMATIC™ FLUID SYSTEMS DIATOMACEOUS EARTH PRESSURE TYPE FILTER SYSTEM MODEL 12P-2

    Science.gov (United States)

    The verification test of the Separmatic™ DE Pressure Type Filter System Model 12P-2 was conducted at the UNH Water Treatment Technology Assistance Center (WTTAC) in Durham, New Hampshire. The source water was finished water from the Arthur Rollins Treatment Plant that was pretr...

  7. SSME Alternate Turbopump Development Program: Design verification specification for high-pressure fuel turbopump

    Science.gov (United States)

    1989-01-01

    The design and verification requirements appropriate to hardware at the detail, subassembly, component, and engine levels are defined and correlated to the development demonstrations that provide verification that the design objectives are achieved. The high-pressure fuel turbopump requirements verification matrix provides the correlation between the design requirements and the tests required to verify that the requirements have been met.

  8. SWAAM code development, verification and application to steam generator design

    International Nuclear Information System (INIS)

    Shin, Y.W.; Valentin, R.A.

    1990-01-01

    This paper describes the family of SWAAM codes developed by Argonne National Laboratory to analyze the effects of sodium/water reactions on LMR steam generators. The SWAAM codes were developed as design tools for analyzing various phenomena related to steam generator leaks and to predict the resulting thermal and hydraulic effects on the steam generator and the intermediate heat transport system (IHTS). The theoretical foundations and numerical treatments on which the codes are based are discussed, followed by a description of code capabilities and limitations, verification of the codes by comparison with experiment, and applications to steam generator and IHTS design. (author). 25 refs, 14 figs

  9. Development and verification of Monte Carlo burnup calculation system

    International Nuclear Information System (INIS)

    Ando, Yoshihira; Yoshioka, Kenichi; Mitsuhashi, Ishi; Sakurada, Koichi; Sakurai, Shungo

    2003-01-01

    A Monte Carlo burnup calculation code system has been developed to accurately evaluate various quantities required in the backend field. For verification of the code system, analyses have been performed using nuclide compositions measured in the Actinide Research in a Nuclear Element (ARIANE) program for fuel rods of assemblies irradiated in a commercial Netherlands BWR. The code system developed in this paper has been verified through analyses for MOX and UO2 fuel rods. This system makes it possible to reduce the large margin assumed in the present criticality analysis for LWR spent fuels. (J.P.N.)

  10. Development of filters and housings for use on active plant

    International Nuclear Information System (INIS)

    Hackney, S.; Pratt, R.P.

    1983-01-01

    New designs of housings for conventional HEPA filters have been developed and are now in use. A further design is planned for future use. The main features to be developed are the engineering of double door systems to replace bag posting and other methods of filter changing which expose personnel to hazardous environments and the addition of a secondary containment to reduce the role of the gasket seal in the filtration efficiency. Also under development are circular geometry filters of HEPA standard which offer significant advantages over rectangular filters for applications requiring remote shielded change facilities. Two types of filter construction are being evaluated, conventional radial flow cartridge filters and spiral-wound, axial-flow filters. The application of circular filters for primary filter systems on active plant is in hand. A push-through change system has been developed for a new cell facility under construction at Harwell. Existing rectangular filters on a high activity cell are being replaced with clusters of small cartridge filters to overcome changing and disposal problems. A similar system but using 1700 m³/h filters for large volume off-gas treatment is also being studied. A remote change shielded filter installation is being developed for use in high alpha, beta, gamma extract systems. The design incorporates large cartridge filters in sealed drums with remote transfer and connection to duct work in the facility. A novel application of the use of double-lid technology removes the need for separate shut off dampers and enables the drums to be sealed for all transfer operations

  11. Development and evaluation of a cleanable high efficiency steel filter

    International Nuclear Information System (INIS)

    Bergman, W.; Larsen, G.; Weber, F.; Wilson, P.; Lopez, R.; Valha, G.; Conner, J.; Garr, J.; Williams, K.; Biermann, A.; Wilson, K.; Moore, P.; Gellner, C.; Rapchun, D.; Simon, K.; Turley, J.; Frye, L.; Monroe, D.

    1993-01-01

    We have developed a high efficiency steel filter that can be cleaned in-situ by reverse air pulses. The filter consists of 64 pleated cylindrical filter elements packaged into a 610 x 610 x 292 mm aluminum frame and has 13.5 m² of filter area. The filter media consists of a sintered steel fiber mat using 2 μm diameter fibers. We conducted an optimization study for filter efficiency and pressure drop to determine the filter design parameters of pleat width, pleat depth, outside diameter of the cylinder, and the total number of cylinders. Several prototype cylinders were then built and evaluated in terms of filter cleaning by reverse air pulses. The results of these studies were used to build the high efficiency steel filter. We evaluated the prototype filter for efficiency and cleanability. The DOP filter certification test showed the filter has a passing efficiency of 99.99% but a failing pressure drop of 0.80 kPa at 1,700 m³/hr. Since we were not able to achieve a pressure drop less than 0.25 kPa, the steel filter does not meet all the criteria for a HEPA filter. Filter loading and cleaning tests using AC Fine dust showed the filter could be repeatedly cleaned by reverse air pulses. The next phase of the prototype evaluation consisted of installing the unit and support housing in the exhaust duct work of a uranium grit blaster for a field evaluation at the Y-12 Plant in Oak Ridge, TN. The grit blaster is used to clean the surface of uranium parts and generates a cloud of UO₂ aerosols. We used a 1,700 m³/hr slip stream from the 10,200 m³/hr exhaust system
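
    A quick back-of-the-envelope check of the operating point reported above (13.5 m² of media at 1,700 m³/hr, a measured 0.80 kPa against the 0.25 kPa target) can be written as follows; the numbers are taken directly from the abstract.

```python
# Operating point quoted in the abstract
flow_m3_per_hr = 1700.0
media_area_m2 = 13.5

# Media face velocity = volumetric flow / filter area
media_velocity_cm_s = flow_m3_per_hr / 3600.0 / media_area_m2 * 100.0
print(f"media face velocity ~ {media_velocity_cm_s:.1f} cm/s")  # roughly 3.5 cm/s

measured_dp_kPa, hepa_dp_limit_kPa = 0.80, 0.25
print("meets pressure-drop criterion:", measured_dp_kPa <= hepa_dp_limit_kPa)  # False
```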

  12. Development and evaluation of hot filters

    International Nuclear Information System (INIS)

    Thexton, H.E.

    1975-01-01

    High temperature, high flow filtration removes radioactive particles from the primary coolant, as well as inactive particles before they can become activated. Canadian experience with edge, graphite, and magnetic filters is described. (Author)

  13. Development of evaluation method for hydraulic behavior in Venturi scrubber for filtered venting

    International Nuclear Information System (INIS)

    Horiguchi, Naoki; Nakao, Yasuhiro; Kaneko, Akiko; Abe, Yutaka; Yoshida, Hiroyuki

    2016-01-01

    Filtered venting systems have been installed in order to restart nuclear power plants in Japan after the Fukushima Daiichi nuclear disaster, and the Venturi scrubber is a main component of one of these systems. To evaluate the decontamination performance of the Venturi scrubber for filtered venting, a mechanistic evaluation method for the hydrodynamic behavior is important, and our objective in this paper is to develop such a method. As approaches, we conducted experimental observation under adiabatic (air-water) conditions, developed a numerical simulation code with a one-dimensional two-fluid model, and performed verification and validation by comparing these results in terms of superficial gas velocity, static pressure, superficial liquid velocity, droplet ratio and droplet diameter in the Venturi scrubber. As a result, we observed the hydrodynamic behavior, developed the code and confirmed that it is capable of evaluating the parameters with the following accuracy: superficial gas velocity within +30%, static pressure in the throat part within ±10%, superficial liquid velocity within ±80%, droplet diameter within ±30% and droplet ratio within -50%. (author)

  14. Multidimensional filter banks and wavelets research developments and applications

    CERN Document Server

    Levy, Bernard

    1997-01-01

    Multidimensional Filter Banks and Wavelets: Research Developments and Applications brings together in one place important contributions and up-to-date research results in this important area. It serves as an excellent reference, providing insight into some of the most important research issues in the field.

  15. DEVELOPMENT OF AN ADHESIVE CANDLE FILTER SAFEGUARD DEVICE; F

    International Nuclear Information System (INIS)

    John P. Hurley; Ann K. Henderson; Jan W. Nowok; Michael L. Swanson

    2002-01-01

    In order to reach the highest possible efficiencies in a coal-fired turbine-based power system, the turbine should be directly fired with the products of coal conversion. Two main types of systems employ these turbines: those based on pressurized fluidized-bed combustors and those based on integrated gasification combined cycles. In both systems, suspended particulates must be cleaned from the gas stream before it enters the turbine so as to prevent fouling and erosion of the turbine blades. To produce the cleanest gas, barrier filters are being developed and are in use in several facilities. Barrier filters are composed of porous, high-temperature materials that allow the hot gas to pass but collect the particulates on the surface. The three main configurations of the barrier filters are candle, cross-flow, and tube filters. Both candle and tube filters have been tested extensively. They are composed of coarsely porous ceramic that serves as a structural support, overlain with a thin, microporous ceramic layer on the dirty gas side that serves as the primary filter surface. They are highly efficient at removing particulate matter from the gas stream and, because of their ceramic construction, are resistant to gas and ash corrosion. However, ceramics are brittle and individual elements can fail, allowing particulates to pass through the hole left by the filter element and erode the turbine. Preventing all failure of individual ceramic filter elements is not possible at the present state of development of the technology. Therefore, safeguard devices (SGDs) must be employed to prevent the particulates streaming through occasional broken filters from reaching the turbine. However, the SGD must allow for the free passage of gas when it is not activated. Upon breaking of a filter, the SGD must either mechanically close or quickly plug with filter dust to prevent additional dust from reaching the turbine. Production of a dependable rapidly closing autonomous mechanical

  16. Development and verifications of fast reactor fuel design code 'CEPTAR'

    International Nuclear Information System (INIS)

    Ozawa, T.; Nakazawa, H.; Abe, T.

    2001-01-01

    Annular fuel is very beneficial for fast reactors, because it allows both high power and high burn-up. Concerning the irradiation behavior of annular fuel, most annular pellets irradiated up to high burn-up showed shrinkage of the central hole due to deformation and restructuring of the pellets. It is necessary to precisely predict the shrinkage of the central hole during irradiation, because it has a great influence on power-to-melt. In this paper, an outline of the CEPTAR code (Calculation code to Evaluate fuel pin stability for annular fuel design), developed to meet this need, is presented. In this code, the radial profile of fuel density can be computed using a void migration model, and the law of conservation of mass defines the inner diameter. For the mechanical analysis, the fuel and cladding deformation caused by thermal expansion, swelling and creep is computed by a stress-strain analysis using the plane-strain approximation. In addition, CEPTAR can also take into account the effect of Joint-Oxide-Gain (JOG), which is observed in the fuel-cladding gap of high burn-up fuel. JOG has the effect of decreasing fuel swelling and improving the gap conductance due to the deposition of solid fission products. Based on post-irradiation data on PFR annular fuel, we developed an empirical model for JOG. For code verification, the thermal and mechanical data obtained from various irradiation tests and post-irradiation examinations were compared with the predictions of this code. In this study, the INTA (instrumented test assembly) test in JOYO, PTM (power-to-melt) tests in JOYO, EBR-II, FFTF and the MTR at the Harwell laboratory, and post-irradiation examinations on a number of PFR fuels were used as verification data. (author)

  17. Formal Development and Verification of a Distributed Railway Control System

    DEFF Research Database (Denmark)

    Haxthausen, Anne Elisabeth; Peleska, Jan

    1999-01-01

    In this article we introduce the concept for a distributed railway control system and present the specification and verification of the main algorithm used for safe distributed control. Our design and verification approach is based on the RAISE method, starting with highly abstract algebraic...

  18. Formal Development and Verification of a Distributed Railway Control System

    DEFF Research Database (Denmark)

    Haxthausen, Anne Elisabeth; Peleska, Jan

    1998-01-01

    In this article we introduce the concept for a distributed railway control system and present the specification and verification of the main algorithm used for safe distributed control. Our design and verification approach is based on the RAISE method, starting with highly abstract algebraic spec...

  19. Development and Verification of Body Armor Target Geometry Created Using Computed Tomography Scans

    Science.gov (United States)

    2017-07-13

    By Autumn R Kulaga, Kathryn L Loftis, and Eric Murray; US Army Research Laboratory. Approved for public release; distribution is...

  20. Development of evaluation and performance verification technology for radiotherapy radiation

    Energy Technology Data Exchange (ETDEWEB)

    Kim, J. Y.; Jang, S. Y.; Kim, B. H. and others

    2005-02-15

    However much its importance is emphasized, the exact assessment of the absorbed doses administered to patients treated with radiotherapy for diseases such as the recently increasing malignant tumors is the most important factor; in reality, several cases of patients over-exposed during radiotherapy have become very serious social issues. In particular, the development of a technology to exactly assess the high doses and high energies generated by radiation generators and irradiation equipment (in general, the doses administered to patients in radiotherapy are very large, about three times higher than lethal doses) is a pressing issue. Over fifty medical centers in Korea operate radiation generators and irradiation equipment for radiotherapy. However, the legal and regulatory systems needed to implement a quality assurance program are not sufficiently stipulated, nor are qualified personnel who could run a program to maintain the quality assurance and control of those generators and equipment sufficiently employed in the medical facilities. To overcome these deficiencies, a quality assurance program such as those developed in technically advanced countries should be established to exactly assess the doses administered to patients and to provide the procedures needed to maintain the continuing performance of the radiotherapy machines and equipment. The QA program and procedures should lead to proper calibration of the machines and equipment and definitely establish the safety of patients in radiotherapy. In this study, a methodology for the verification and evaluation of radiotherapy doses is developed, and accurate measurements, evaluations of the doses delivered to patients and verification of the performance of the therapy machine and equipment are

  1. Development and Verification of Behavior of Tritium Analytic Code (BOTANIC)

    International Nuclear Information System (INIS)

    Park, Min Young; Kim, Eung Soo

    2014-01-01

    VHTR, one of the Generation IV reactor concepts, has a relatively high operating temperature and is usually suggested as a heat source for many industrial processes, including hydrogen production. It is therefore vital to trace tritium behavior in the VHTR system and the potential permeation rate into the industrial process; in other words, tritium is a crucial safety issue in the fission reactor system, and a tool that enables its behavior to be analyzed is needed. In this study, the Behavior of Tritium Analytic Code (BOTANIC), an analytic tool capable of analyzing tritium behavior, was developed using a chemical process code called gPROMS, and it was then verified using analytic solutions and benchmark codes such as the Tritium Permeation Analysis Code (TPAC) and COMSOL. The code has several distinctive features, including a non-diluted assumption, flexible applications and the adoption of a distributed permeation model. Owing to these features, BOTANIC is able to analyze a wide range of tritium-level systems and achieves higher accuracy, as it can solve distributed models. BOTANIC was successfully developed and verified; the results showed very good agreement with the analytical solutions and with the calculation results of TPAC and COMSOL. Future work will be focused on total system verification.
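
    The record does not give BOTANIC's governing equations; as a generic illustration of the kind of quantity such a code evaluates, the sketch below estimates a steady-state, diffusion-limited permeation flux through a metal wall using Richardson/Sieverts scaling. The permeability pre-exponential, activation energy and geometry are made-up placeholder values, not data from the paper.

```python
import numpy as np

def permeation_flux(perm0, activation_energy_J_mol, temperature_K,
                    p_upstream_Pa, p_downstream_Pa, wall_thickness_m):
    """Steady-state diffusion-limited permeation flux through a metal wall
    (Richardson law with Sieverts' square-root pressure dependence).

    perm0 : permeability pre-exponential [mol m^-1 s^-1 Pa^-0.5]
    returns flux in mol m^-2 s^-1
    """
    R = 8.314  # J/(mol K)
    permeability = perm0 * np.exp(-activation_energy_J_mol / (R * temperature_K))
    driving = np.sqrt(p_upstream_Pa) - np.sqrt(p_downstream_Pa)
    return permeability * driving / wall_thickness_m

# Illustrative numbers only (placeholder material properties and conditions):
print(permeation_flux(4.0e-7, 6.5e4, 1023.0, 1.0e-1, 0.0, 5.0e-3))
```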

  2. Design Development and Verification of a System Integrated Modular PWR

    International Nuclear Information System (INIS)

    Kim, S.-H.; Kim, K. K.; Chang, M. H.; Kang, C. S.; Park, G.-C.

    2002-01-01

    An advanced PWR with a rated thermal power of 330 MW has been developed at the Korea Atomic Energy Research Institute (KAERI) for a dual purpose: seawater desalination and electricity generation. The conceptual design of SMART (System-Integrated Modular Advanced ReacTor) with a desalination system was completed in March of 1999. The basic design for the integrated nuclear desalination system is currently underway and will be finished by March of 2002. The SMART co-generation plant with the MED seawater desalination process is designed to supply forty thousand (40,000) tons of fresh water per day and ninety (90) MW of electricity to an area with a population of approximately one hundred thousand (100,000) or to an industrialized complex. This paper describes the advanced design features adopted in the SMART design and also introduces the design and engineering verification program. In the beginning stage of the SMART development, top-level requirements for safety and economics were imposed on the SMART design. To meet these requirements, highly advanced design features enhancing the safety, reliability, performance, and operability are introduced in the SMART design. SMART consists of proven KOFA (Korea Optimized Fuel Assembly), helical once-through steam generators, a self-controlled pressurizer, control element drive mechanisms, and main coolant pumps in a single pressure vessel. The innovative design features adopted in the SMART system to enhance its safety characteristics are a low core power density, a large negative Moderator Temperature Coefficient (MTC), high natural circulation capability and an integral arrangement that eliminates large-break loss-of-coolant accidents, etc. The progression of emergency situations into accidents is prevented with a number of advanced engineered safety features such as a passive residual heat removal system, a passive emergency core cooling system, a safeguard vessel, and passive containment over-pressure protection. The preliminary

  3. Software Testing and Verification in Climate Model Development

    Science.gov (United States)

    Clune, Thomas L.; Rood, RIchard B.

    2011-01-01

    Over the past 30 years most climate models have grown from relatively simple representations of a few atmospheric processes to a complex multi-disciplinary system. Computer infrastructure over that period has gone from punch card mainframes to modern parallel clusters. Model implementations have become complex, brittle, and increasingly difficult to extend and maintain. Existing verification processes for model implementations rely almost exclusively upon some combination of detailed analysis of output from full climate simulations and system-level regression tests. In addition to being quite costly in terms of developer time and computing resources, these testing methodologies are limited in terms of the types of defects that can be detected, isolated and diagnosed. Mitigating these weaknesses of coarse-grained testing with finer-grained "unit" tests has been perceived as cumbersome and counter-productive. In the commercial software sector, recent advances in tools and methodology have led to a renaissance for systematic fine-grained testing. We discuss the availability of analogous tools for scientific software and examine benefits that similar testing methodologies could bring to climate modeling software. We describe the unique challenges faced when testing complex numerical algorithms and suggest techniques to minimize and/or eliminate the difficulties.
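
    As an example of the fine-grained "unit" testing advocated above, the sketch below checks a small numerical routine against an analytic result with an explicit tolerance (pytest-style); the routine and tolerances are illustrative, not taken from any particular climate model.

```python
import numpy as np

def column_integrated_mass(q, dp, g=9.80665):
    """Vertically integrated tracer mass per unit area: sum(q * dp) / g."""
    return np.sum(q * dp, axis=0) / g

def test_column_integrated_mass_uniform_profile():
    # Fine-grained unit test: analytic result for a uniform profile,
    # checked to a tolerance rather than via a full-model regression run.
    q = np.full((10, 4), 2.0e-3)      # mixing ratio [kg/kg], 10 layers x 4 columns
    dp = np.full((10, 4), 5000.0)     # pressure thickness per layer [Pa]
    expected = 2.0e-3 * 5000.0 * 10 / 9.80665
    np.testing.assert_allclose(column_integrated_mass(q, dp), expected, rtol=1e-12)
```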

  4. Formal development and verification of a distributed railway control system

    DEFF Research Database (Denmark)

    Haxthausen, Anne Elisabeth; Peleska, J.

    2000-01-01

    The authors introduce the concept for a distributed railway control system and present the specification and verification of the main algorithm used for safe distributed control. Our design and verification approach is based on the RAISE method, starting with highly abstract algebraic specifications which are transformed into directly implementable distributed control processes by applying a series of refinement and verification steps. Concrete safety requirements are derived from an abstract version that can be easily validated with respect to soundness and completeness. Complexity is further reduced by separating the system model into a domain model and a controller model. The domain model describes the physical system in absence of control and the controller model introduces the safety-related control mechanisms as a separate entity monitoring observables of the physical system...

  5. Characteristics of Quoit filter, a digital filter developed for the extraction of circumscribed shadows, and its applications to mammograms

    International Nuclear Information System (INIS)

    Isobe, Yoshiaki; Ohkubo, Natsumi; Yamamoto, Shinji; Toriwaki, Jun-ichiro; Kobatake, Hidefumi.

    1993-01-01

    This paper presents a newly developed filter called the Quoit filter, which detects circumscribed shadows (concentric circular isolated image patterns), such as typical cancer regions. The Quoit filter is based on mathematical morphology and is found to have the following interesting properties. (1) The output of this filter can be expressed analytically when the input image is assumed to be a concentric circular model (so the output is predictable for typical inputs). (2) The filter has the ability to selectively reconstruct the original isolated models mentioned in (1) when it is applied sequentially twice. The filter was tested on the detection of cancer regions in X-ray mammograms, and for 12 cancer mammograms it achieved a true-positive cancer detection rate of 100%. (author)
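
    The abstract does not spell out the Quoit filter's exact definition, but its flavor can be suggested with the ring-based morphological operation sketched below (center brightness minus the grey-level dilation over a surrounding annulus), which responds strongly to isolated circumscribed bright shadows. The radii and the precise operation are assumptions, not the authors' formulation.

```python
import numpy as np
from scipy import ndimage

def ring_footprint(inner_radius, outer_radius):
    """Annular (quoit-shaped) structuring element."""
    r = outer_radius
    yy, xx = np.mgrid[-r:r + 1, -r:r + 1]
    d2 = xx ** 2 + yy ** 2
    return (d2 >= inner_radius ** 2) & (d2 <= outer_radius ** 2)

def ring_isolation_response(image, inner_radius=8, outer_radius=12):
    """Center brightness minus the maximum on a surrounding ring.

    Peaks for isolated, circumscribed bright regions; only a stand-in for the
    morphological Quoit filter described in the abstract.
    """
    ring_max = ndimage.grey_dilation(
        image, footprint=ring_footprint(inner_radius, outer_radius))
    return image - ring_max
```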

  6. Development of a noise filter for radiation thickness gagemeter

    International Nuclear Information System (INIS)

    Jee, C. W.; Kim, Y. T.; Lee, H. H.

    1995-01-01

    The objective of this study is to develop a filter which attenuates sensor noise in the radiation thickness gagemeters of the fifth stand of TCM No. 1 in the Pohang steel works. The thickness control loop of the fifth stand is modelled as a system for filter design, where the system input is the speed control input and the system output is the gagemeter output. In the filter design, the system is described by an ARMAX (AutoRegressive Moving-Average with auXiliary input) model, whose parameters are estimated using a recursive least squares method. The estimated ARMAX model is then transformed into an observer canonical state-space form, and Kalman filtering is applied to obtain optimal estimates of the state and hence of the thickness measurements of the steel strips. In addition, a separate low-pass filter is designed which is directly applicable to the gagemeter outputs. Finally, the designed filter algorithms are implemented and tested on a VMEbus board computer under the VxWorks real-time operating system. (author)
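
    The recursive least squares step used to fit the model parameters can be sketched as below. This is the generic exponentially weighted RLS update with illustrative variable names; for a full ARMAX fit the regressor would also carry estimated residuals (extended least squares), which is omitted here.

```python
import numpy as np

def rls_step(theta, P, phi, y, forgetting=0.99):
    """One recursive-least-squares update for a linear-in-parameters model
    y_t = phi_t . theta + e_t (e.g. the ARMAX model of the thickness loop).

    theta : current parameter estimate, shape (n,)
    P     : current covariance matrix, shape (n, n)
    phi   : regressor vector (past outputs and inputs), shape (n,)
    y     : new scalar measurement
    """
    phi = phi.reshape(-1, 1)
    gain = P @ phi / (forgetting + phi.T @ P @ phi)     # (n, 1) gain vector
    error = y - float(phi.T @ theta.reshape(-1, 1))     # prediction error
    theta_new = theta + gain.flatten() * error
    P_new = (P - gain @ phi.T @ P) / forgetting
    return theta_new, P_new
```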

  7. Further development and verification of the calculating programme Felix for the simulation of criticality excursions

    International Nuclear Information System (INIS)

    Weber, J.; Denk, W.

    1985-01-01

    An improved version of the FELIX programme was applied to verify excursion experiments 01, 03 through 07, and 13. The correspondence between the experiments and the calculated results was good. Points where the programme should be further developed are indicated. (orig.) [de]

  8. Development of requirements tracking and verification technology for the NPP software

    Energy Technology Data Exchange (ETDEWEB)

    Jung, Chul Hwan; Kim, Jang Yeol; Lee, Jang Soo; Song, Soon Ja; Lee, Dong Young; Kwon, Kee Choon

    1998-12-30

    Searched and analyzed the technology of requirements engineering in the areas of aerospace and defense industry, medical industry and nuclear industry. Summarized the status of tools for the software design and requirements management. Analyzed the software design methodology for the safety software of NPP. Development of the design requirements for the requirements tracking and verification system. Development of the background technology to design the prototype tool for the requirements tracking and verification.

  9. Development of requirements tracking and verification technology for the NPP software

    International Nuclear Information System (INIS)

    Jung, Chul Hwan; Kim, Jang Yeol; Lee, Jang Soo; Song, Soon Ja; Lee, Dong Young; Kwon, Kee Choon

    1998-01-01

    Searched and analyzed the technology of requirements engineering in the areas of aerospace and defense industry, medical industry and nuclear industry. Summarized the status of tools for the software design and requirements management. Analyzed the software design methodology for the safety software of NPP. Development of the design requirements for the requirements tracking and verification system. Development of the background technology to design the prototype tool for the requirements tracking and verification

  10. Development of requirements tracking and verification system for the software design of distributed control system

    Energy Technology Data Exchange (ETDEWEB)

    Jung, Chul Hwan; Kim, Jang Yeol; Kim, Jung Tack; Lee, Jang Soo; Ham, Chang Shik [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)

    1999-12-31

    In this paper a prototype of Requirement Tracking and Verification System(RTVS) for a Distributed Control System was implemented and tested. The RTVS is a software design and verification tool. The main functions required by the RTVS are managing, tracking and verification of the software requirements listed in the documentation of the DCS. The analysis of DCS software design procedures and interfaces with documents were performed to define the user of the RTVS, and the design requirements for RTVS were developed. 4 refs., 3 figs. (Author)

  11. Development of requirements tracking and verification system for the software design of distributed control system

    Energy Technology Data Exchange (ETDEWEB)

    Jung, Chul Hwan; Kim, Jang Yeol; Kim, Jung Tack; Lee, Jang Soo; Ham, Chang Shik [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)

    1998-12-31

    In this paper a prototype of Requirement Tracking and Verification System(RTVS) for a Distributed Control System was implemented and tested. The RTVS is a software design and verification tool. The main functions required by the RTVS are managing, tracking and verification of the software requirements listed in the documentation of the DCS. The analysis of DCS software design procedures and interfaces with documents were performed to define the user of the RTVS, and the design requirements for RTVS were developed. 4 refs., 3 figs. (Author)

  12. Development of membrane filters with nanostructured porous layer by coating of metal nanoparticles sintered onto a micro-filter

    International Nuclear Information System (INIS)

    Park, Seok Joo; Park, Young Ok; Lee, Dong Geun; Ryu, Jeong In

    2008-01-01

    The membrane filter with an adhered nanostructured porous layer was made by heat treatment after deposition of nanoparticle agglomerates, sintered in the aerosol phase, onto a conventional micron-fibrous metal filter used as the substrate filter. The Sintered-Nanoparticle-Agglomerates-coated NanoStructured porous layer Membrane Filter (SNA-NSMF), whose filtration performance is improved compared with conventional metal membrane filters, was developed by adhesion of nanoparticle agglomerates of dendrite structure sintered onto the micron-fibrous metal filter. The size of the nanoparticle agglomerates of dendrite structure decreased with increasing sintering temperature because the agglomerates shrank. When the shrunken nanoparticle agglomerates were deposited onto the conventional micron-fibrous metal filter and heat treated, the pore size of the nanostructured porous layer decreased. Therefore, the pressure drop of the SNA-NSMFs increased from 0.3 to 0.516 kPa and the filtration efficiency increased remarkably from 95.612 to 99.9993%
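
    One common way to judge whether the extra pressure drop is worth the efficiency gain is the filter quality factor qF = -ln(1 - E)/ΔP. The short sketch below applies it to the values quoted in the abstract; pairing the lower efficiency with the lower pressure drop (and vice versa) is an assumption made for illustration.

        import numpy as np

        # Filter quality factor qF = -ln(penetration) / pressure drop.
        # The pairs below are the efficiency / pressure-drop values quoted in the abstract.
        for eff, dp_kpa in [(0.95612, 0.3), (0.999993, 0.516)]:
            qf = -np.log(1.0 - eff) / dp_kpa          # units: 1/kPa
            print(f"efficiency {eff:.6%} at {dp_kpa} kPa  ->  qF = {qf:.1f} per kPa")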

  13. Development of the clearance level verification evaluation system. 2. Construction of the clearance data management system

    International Nuclear Information System (INIS)

    Kubota, Shintaro; Usui, Hideo; Kawagoshi, Hiroshi

    2014-06-01

    Clearance is defined as the removal of radioactive materials or radioactive objects within authorized practices from any further regulatory control by the regulatory body. In Japan, a clearance level and a procedure for its verification have been introduced under the laws and regulations, and solid clearance wastes inspected by the national authority can be handled and recycled as normal wastes. The most prevalent type of waste is generated by the dismantling of nuclear facilities, so the Japan Atomic Energy Agency (JAEA) has been developing the Clearance Level Verification Evaluation System (CLEVES) as a convenient tool. The Clearance Data Management System (CDMS), which is a part of CLEVES, has been developed to support measurement, evaluation, and the preparation and recording of documents associated with clearance level verification. In addition, the evaluation results of the CDMS were validated by inputting data from actual clearance activities at the JAEA. Clearance level verification can easily be applied to clearance activities by using the CDMS. (author)

  14. Development of Test Protocols for International Space Station Particulate Filters

    Science.gov (United States)

    Vijayakumar, R.; Green, Robert D.; Agui, Juan H.

    2015-01-01

    Air quality control on the International Space Station (ISS) is a vital requirement for maintaining a clean environment for the crew and the hardware. This becomes a serious challenge in pressurized space compartments since no outside air ventilation is possible, and a larger particulate load is imposed on the filtration system due to the lack of gravitational settling. The ISS Environmental Control and Life Support System (ECLSS) uses a filtration system that has been in use for over 14 years and has proven to meet this challenge. The heart of this system is a traditional High-Efficiency Particulate Air (HEPA) filter configured to interface with the rest of the life support elements and provide effective cabin filtration. The filter element for this system has a non-standard cross-section with a length-to-width ratio (L/W) of 6.6. A filter test setup was designed and built to meet industry testing standards. A CFD analysis was performed to initially determine the optimal duct geometry and flow configuration. Both a screen and a flow straightener were added to the test duct design to improve flow uniformity, and face velocity profiles were subsequently measured to confirm this. Flow quality and aerosol mixing assessments show that the duct flow is satisfactory for the intended leak testing. Preliminary leak testing was performed on two different ISS filters, one with known perforations and one with limited use, and results confirmed that the testing methods and photometer instrument are sensitive enough to detect and locate compromised sections of an ISS BFE. Given the engineering constraints in designing spacecraft life support systems, it is anticipated that non-industry-standard filters will be required in future designs. This work is focused on developing test protocols for testing the ISS BFE filters, but the methodology is general enough to be extended to other present and future spacecraft filters. These techniques for characterizing the test duct and performing leak testing

  15. Development of active porous medium filters based on plasma textiles

    International Nuclear Information System (INIS)

    Kuznetsov, Ivan A.; Saveliev, Alexei V.; Rasipuram, Srinivasan; Kuznetsov, Andrey V.; Brown, Alan; Jasper, Warren

    2012-01-01

    Inexpensive, flexible, washable, and durable materials that serve as antimicrobial filters and self-decontaminating fabrics are needed to provide active protection to people in areas regularly exposed to various biohazards, such as hospitals and bio research labs working with pathogens. Airlines and cruise lines need such material to combat the spread of infections. In households these materials can be used in HVAC filters to fight indoor pollution, which is especially dangerous to people suffering from asthma. Efficient filtering materials are also required in areas contaminated by other types of hazardous dust particulates, such as nuclear dust. The primary idea that guided the undertaken study is that a microplasma-generating structure can be embedded in a textile fabric to generate a plasma sheath (''plasma shield'') that kills bacterial agents coming in contact with the fabric. The research resulted in the development of a plasma textile that can be used for producing new types of self-decontaminating garments, fabrics, and filter materials, capable of activating a plasma sheath that would filter, capture, and destroy any bacteriological agent deposited on its surface. This new material relies on the unique antimicrobial and catalytic properties of cold (room temperature) plasma that is benign to people and does not cause thermal damage to many polymer textiles, such as Nomex and polypropylene. The uniqueness of cold plasma as a disinfecting agent lies in the inability of bacteria to develop resistance to plasma exposure, as they can for antibiotics. Plasma textiles could thus be utilized for microbial destruction in active antimicrobial filters (for continuous decontamination and disinfection of large amounts of air) as well as in self-decontaminating surfaces and antibacterial barriers (for example, for creating local antiseptic or sterile environments around wounds and burns).

  16. Development of active porous medium filters based on plasma textiles

    Science.gov (United States)

    Kuznetsov, Ivan A.; Saveliev, Alexei V.; Rasipuram, Srinivasan; Kuznetsov, Andrey V.; Brown, Alan; Jasper, Warren

    2012-05-01

    Inexpensive, flexible, washable, and durable materials that serve as antimicrobial filters and self-decontaminating fabrics are needed to provide active protection to people in areas regularly exposed to various biohazards, such as hospitals and bio research labs working with pathogens. Airlines and cruise lines need such material to combat the spread of infections. In households these materials can be used in HVAC filters to fight indoor pollution, which is especially dangerous to people suffering from asthma. Efficient filtering materials are also required in areas contaminated by other types of hazardous dust particulates, such as nuclear dust. The primary idea that guided the undertaken study is that a microplasma-generating structure can be embedded in a textile fabric to generate a plasma sheath ("plasma shield") that kills bacterial agents coming in contact with the fabric. The research resulted in the development of a plasma textile that can be used for producing new types of self-decontaminating garments, fabrics, and filter materials, capable of activating a plasma sheath that would filter, capture, and destroy any bacteriological agent deposited on its surface. This new material relies on the unique antimicrobial and catalytic properties of cold (room temperature) plasma that is benign to people and does not cause thermal damage to many polymer textiles, such as Nomex and polypropylene. The uniqueness of cold plasma as a disinfecting agent lies in the inability of bacteria to develop resistance to plasma exposure, as they can for antibiotics. Plasma textiles could thus be utilized for microbial destruction in active antimicrobial filters (for continuous decontamination and disinfection of large amounts of air) as well as in self-decontaminating surfaces and antibacterial barriers (for example, for creating local antiseptic or sterile environments around wounds and burns).

  17. Development of active porous medium filters based on plasma textiles

    Energy Technology Data Exchange (ETDEWEB)

    Kuznetsov, Ivan A.; Saveliev, Alexei V.; Rasipuram, Srinivasan; Kuznetsov, Andrey V.; Brown, Alan; Jasper, Warren [Mechanical and Aerospace Engineering, North Carolina State University, Raleigh, NC 27695 (United States); Textile Engineering Chemistry and Science, North Carolina State University, Raleigh, NC 27695 (United States)

    2012-05-15

    Inexpensive, flexible, washable, and durable materials that serve as antimicrobial filters and self-decontaminating fabrics are needed to provide active protection to people in areas regularly exposed to various biohazards, such as hospitals and bio research labs working with pathogens. Airlines and cruise lines need such material to combat the spread of infections. In households these materials can be used in HVAC filters to fight indoor pollution, which is especially dangerous to people suffering from asthma. Efficient filtering materials are also required in areas contaminated by other types of hazardous dust particulates, such as nuclear dust. The primary idea that guided the undertaken study is that a microplasma-generating structure can be embedded in a textile fabric to generate a plasma sheath (''plasma shield'') that kills bacterial agents coming in contact with the fabric. The research resulted in the development of a plasma textile that can be used for producing new types of self-decontaminating garments, fabrics, and filter materials, capable of activating a plasma sheath that would filter, capture, and destroy any bacteriological agent deposited on its surface. This new material relies on the unique antimicrobial and catalytic properties of cold (room temperature) plasma that is benign to people and does not cause thermal damage to many polymer textiles, such as Nomex and polypropylene. The uniqueness of cold plasma as a disinfecting agent lies in the inability of bacteria to develop resistance to plasma exposure, as they can for antibiotics. Plasma textiles could thus be utilized for microbial destruction in active antimicrobial filters (for continuous decontamination and disinfection of large amounts of air) as well as in self-decontaminating surfaces and antibacterial barriers (for example, for creating local antiseptic or sterile environments around wounds and burns).

  18. Baumot BA-B Diesel Particulate Filter with Pre-Catalyst (ETV Mobile Source Emissions Control Devices) Verification Report

    Science.gov (United States)

    The Baumot BA-B Diesel Particulate Filter with Pre-Catalyst is a diesel engine retrofit device for light, medium, and heavy heavy-duty diesel on-highway engines for use with commercial ultra-low-sulfur diesel (ULSD) fuel. The BA-B particulate filter is composed of a pre-catalyst ...

  19. EURATOM safeguards efforts in the development of spent fuel verification methods by non-destructive assay

    Energy Technology Data Exchange (ETDEWEB)

    Matloch, L.; Vaccaro, S.; Couland, M.; De Baere, P.; Schwalbach, P. [Euratom, Communaute europeenne de l' energie atomique - CEEA (European Commission (EC))

    2015-07-01

    The back end of the nuclear fuel cycle continues to develop. The European Commission, particularly the Nuclear Safeguards Directorate of the Directorate General for Energy, implements Euratom safeguards and needs to adapt to this situation. The verification methods for spent nuclear fuel, which EURATOM inspectors can use, require continuous improvement. Whereas the Euratom on-site laboratories provide accurate verification results for fuel undergoing reprocessing, the situation is different for spent fuel which is destined for final storage. In particular, new needs arise from the increasing number of cask loadings for interim dry storage and the advanced plans for the construction of encapsulation plants and geological repositories. Various scenarios present verification challenges. In this context, EURATOM Safeguards, often in cooperation with other stakeholders, is committed to further improvement of NDA methods for spent fuel verification. In this effort EURATOM plays various roles, ranging from definition of inspection needs to direct participation in development of measurement systems, including support of research in the framework of international agreements and via the EC Support Program to the IAEA. This paper presents recent progress in selected NDA methods. These methods have been conceived to satisfy different spent fuel verification needs, ranging from attribute testing to pin-level partial defect verification. (authors)

  20. Formal Development and Verification of Railway Control Systems

    DEFF Research Database (Denmark)

    Vu Hong, Linh; Haxthausen, Anne Elisabeth; Peleska, Jan

    done applying conventional methods where requirements and designs are described using natural language, diagrams and pseudo code, and the verification of requirements has been done by code inspection and non-exhaustive testing. These techniques are not sufficient, leading to errors and an ineffective...... for Strategic Research. The work is affiliated with a number of partners: DTU Compute, DTU Transport, DTU Management, DTU Fotonik, Bremen University, Banedanmark, Trafikstyrelsen, DSB, and DSB S-tog. More information about the RobustRails project is available at http://www.dtu.dk/subsites/robustrails/English.aspx...

  1. Development of Genetic Markers for Triploid Verification of the Pacific Oyster,

    Directory of Open Access Journals (Sweden)

    Jung-Ha Kang

    2013-07-01

    The triploid Pacific oyster, which is produced by mating tetraploid and diploid oysters, is favored by the aquaculture industry because of its better flavor and firmer texture, particularly during the summer. However, tetraploid oyster production is not feasible in all oysters; the development of tetraploid oysters is ongoing in some oyster species. Thus, a method for ploidy verification is necessary for this endeavor, in addition to ploidy verification in aquaculture farms and in the natural environment. In this study, a method for ploidy verification of triploid and diploid oysters was developed using multiplex polymerase chain reaction (PCR) panels containing primers for molecular microsatellite markers. Two microsatellite multiplex PCR panels consisting of three markers each were developed using previously developed microsatellite markers that were optimized for performance. Both panels were able to verify the ploidy levels of 30 triploid oysters with 100% accuracy, illustrating the utility of microsatellite markers as a tool for verifying the ploidy of individual oysters.

  2. Dissolution Model Development: Formulation Effects and Filter Complications

    DEFF Research Database (Denmark)

    Berthelsen, Ragna; Holm, Rene; Jacobsen, Jette

    2016-01-01

    This study describes various complications related to sample preparation (filtration) during development of a dissolution method intended to discriminate among different fenofibrate immediate-release formulations. Several dissolution apparatus and sample preparation techniques were tested. The fl....... With the tested drug–formulation combination, the best in vivo–in vitro correlation was found after filtration of the dissolution samples through 0.45-μm hydrophobic PTFE membrane filters....

  3. Design and development of laser eye protection filter

    International Nuclear Information System (INIS)

    Ahmed, K; Khan, A N; Rauf, A; Gul, A; Aslam, M

    2013-01-01

    Laser-based devices have been operational for the measurement of distances, horizontally and vertically, in the avionics and surveillance industries. These devices operate with pulsed Nd:YAG lasers at 1064 nm, a wavelength that raises the risk of eye exposure for personnel to unexpected levels. In this paper, eye protection filters for the 1064 nm wavelength were developed with soft (ZnS) and hard (TiO2) coating materials using a thin-film vacuum coating technique. The damage threshold of the filter is 0.2 J/cm2. Transmission characteristics are measured and discussed. The optical damage threshold (for the eye, 5 × 10−6 J/cm2) at various distances is also simulated.
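
    The protection requirement implied by the figures above can be phrased as a required optical density, OD ≥ log10(H/MPE), where H is the exposure reaching the eye and MPE is the maximum permissible exposure. The sketch below evaluates this standard relation using the MPE quoted in the abstract; the incident exposure values are assumed for illustration.

        import math

        MPE = 5e-6                      # J/cm^2, eye MPE quoted in the abstract
        for H in (0.2, 1e-2, 1e-3):     # assumed incident exposures at the eye, J/cm^2
            od = math.log10(H / MPE)    # required optical density at 1064 nm
            print(f"H = {H:g} J/cm^2  ->  required OD >= {od:.1f}")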

  4. Development of a tool for knowledge base verification of expert system based on Design/CPN

    International Nuclear Information System (INIS)

    Kim, Jong Hyun

    1998-02-01

    Verification is necessary work in developing a reliable expert system. Verification is a process aimed at demonstrating whether a system meets its specified requirements. As expert systems are used in various applications, knowledge base verification takes an important position. The conventional Petri net approach, studied recently for verifying knowledge bases, has been found inadequate for verifying the knowledge base of large and complex systems, such as the alarm processing system of a nuclear power plant. Thus, we propose an improved method that models the knowledge base as an enhanced colored Petri net. In this study, we analyze the reachability and the error characteristics of the knowledge base. Generally, the verification process requires computational support by automated tools. For this reason, this study developed a tool for knowledge base verification based on Design/CPN, which is a tool for editing, modeling, and simulating colored Petri nets. This tool uses the enhanced colored Petri net as its modeling method. By applying this tool to the knowledge base of a nuclear power plant, it was shown that it can successfully check most of the anomalies that can occur in a knowledge base
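
    Reachability analysis, mentioned above, can be illustrated on a plain place/transition net (a drastic simplification of the enhanced colored Petri nets actually used): starting from the initial marking, enabled transitions are fired breadth-first and every new marking is recorded. The function below is an illustrative sketch with assumed data structures, not code from the Design/CPN-based tool.

        from collections import deque

        def reachable_markings(pre, post, m0, limit=10000):
            """BFS over reachable markings of a place/transition net (illustrative).
            pre/post: per-transition token requirements/productions, m0: initial marking."""
            seen, queue = {tuple(m0)}, deque([tuple(m0)])
            while queue and len(seen) < limit:
                m = queue.popleft()
                for t_pre, t_post in zip(pre, post):
                    if all(mi >= pi for mi, pi in zip(m, t_pre)):   # transition enabled
                        m2 = tuple(mi - pi + qi for mi, pi, qi in zip(m, t_pre, t_post))
                        if m2 not in seen:
                            seen.add(m2)
                            queue.append(m2)
            return seen

        # toy net: two places, one transition moving a token from place 0 to place 1
        print(reachable_markings(pre=[(1, 0)], post=[(0, 1)], m0=(2, 0)))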

  5. Developing a NASA strategy for the verification of large space telescope observatories

    Science.gov (United States)

    Crooke, Julie A.; Gunderson, Johanna A.; Hagopian, John G.; Levine, Marie

    2006-06-01

    In July 2005, the Office of Program Analysis and Evaluation (PA&E) at NASA Headquarters was directed to develop a strategy for verification of the performance of large space telescope observatories, which occurs predominantly in a thermal vacuum test facility. A mission model of the expected astronomical observatory missions over the next 20 years was identified along with performance, facility and resource requirements. Ground testing versus alternatives was analyzed to determine the pros, cons and break points in the verification process. Existing facilities and their capabilities were examined across NASA, industry and other government agencies as well as the future demand for these facilities across NASA's Mission Directorates. Options were developed to meet the full suite of mission verification requirements, and performance, cost, risk and other analyses were performed. Findings and recommendations from the study were presented to the NASA Administrator and the NASA Strategic Management Council (SMC) in February 2006. This paper details the analysis, results, and findings from this study.

  6. Development of laundry drainage treatment system with ceramic ultra filter

    International Nuclear Information System (INIS)

    Kanda, Masanori; Kurahasi, Takafumi

    1995-01-01

    A compact laundry drainage treatment system (hereafter the UF system) with a ceramic ultra filter membrane (hereafter the UF membrane) has been developed to reduce radioactivity in laundry drainage from nuclear power plants. The UF membrane is made of sintered fine ceramic. It has 0.01 μm fine pores, resulting in a durable, heat-resistant, and corrosion-resistant porous ceramic filter medium. A cross-flow arrangement, in which the laundry drainage is filtered while it flows across the UF membrane, is used as the filtration method. This method creates less caking than other methods. The UF membrane is backwashed at regular intervals with permeated water to minimize caking of the filter. Together, the UF membrane and the cross-flow arrangement provide long, stable filtration. The ceramic UF membrane is strong enough to concentrate suspended solids in laundry drainage up to a weight concentration of 10%. The final concentrated laundry drainage can be treated in an incinerator. The performance of the UF system was checked using radioactive laundry drainage; the decontamination factor of the UF system was 25 or more. The laundry drainage treatment capacity and concentration ratio of the UF system, as well as the service life of the UF membrane, were also checked using simulated non-radioactive laundry drainage. Even though the laundry drainage was concentrated 1000 times, the UF system showed good permeated water quality and permeated water flux. (author)
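
    For reference, the decontamination factor (DF) quoted above is conventionally defined as the ratio of the activity concentration in the feed to that in the permeate; a minimal statement of the relation (the symbols are assumed for illustration, not taken from the paper) is:

        \mathrm{DF} = \frac{A_{\mathrm{feed}}}{A_{\mathrm{permeate}}},
        \qquad \mathrm{DF} \ge 25 \;\Longrightarrow\;
        \frac{A_{\mathrm{permeate}}}{A_{\mathrm{feed}}} \le 4\% \quad \text{(at least 96\% of the activity retained)}.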

  7. DNN Filter Bank Cepstral Coefficients for Spoofing Detection

    DEFF Research Database (Denmark)

    Yu, Hong; Tan, Zheng-Hua; Zhang, Yiming

    2017-01-01

    With the development of speech synthesis techniques, automatic speaker verification systems face the serious challenge of spoofing attacks. In order to improve the reliability of speaker verification systems, we develop a new filter bank-based cepstral feature, deep neural network (DNN) filter bank cepstral coefficients, to distinguish between natural and spoofed speech. The DNN filter bank is automatically generated by training a filter bank neural network (FBNN) using natural and synthetic speech. By adding restrictions on the training rules, the learned weight matrix of the FBNN is band limited and sorted by frequency, similar to a normal filter bank. Unlike the manually designed filter bank, the learned filter bank has different filter shapes in different channels, which can capture the differences between natural and synthetic speech more effectively. The experimental results on the ASVspoof
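
    The feature extraction described above (a learned filter bank followed by log compression and a cepstral transform) can be sketched as follows. The filter-bank weight matrix is simply passed in as an argument standing in for the FBNN's learned, band-limited, frequency-sorted weights; the function name and the use of numpy/scipy are assumptions made for illustration.

        import numpy as np
        from scipy.fft import dct

        def fbank_cepstra(frame, fbank_weights, n_ceps=13):
            """Cepstral coefficients from a (learned or fixed) filter bank (illustrative)."""
            spectrum = np.abs(np.fft.rfft(frame)) ** 2        # power spectrum of one frame
            energies = fbank_weights @ spectrum               # filter-bank energies
            log_e = np.log(energies + 1e-10)                  # log compression
            return dct(log_e, type=2, norm='ortho')[:n_ceps]  # keep the first coefficients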

  8. Development of a Torque Sensor-Based Test Bed for Attitude Control System Verification and Validation

    Science.gov (United States)

    2017-12-30

    Report documentation page only: AFRL-RV-PS-TR-2018-0008, "Development of a Torque Sensor-Based Test Bed for Attitude Control System Verification and Validation," contract FA9453-15-1-0315, author Norman Fitz-Coy.

  9. Development, characterization, and modeling of a tunable filter camera

    Science.gov (United States)

    Sartor, Mark Alan

    1999-10-01

    This paper describes the development, characterization, and modeling of a Tunable Filter Camera (TFC). The TFC is a new multispectral instrument with electronically tuned spectral filtering and low-light-level sensitivity. It represents a hybrid between hyperspectral and multispectral imaging spectrometers that incorporates advantages from each, addressing issues such as complexity, cost, lack of sensitivity, and adaptability. These capabilities allow the TFC to be applied to low-altitude video surveillance for real-time spectral and spatial target detection and image exploitation. Described herein are the theory and principles of operation of the TFC, which includes a liquid crystal tunable filter, an intensified CCD, and a custom apochromatic lens. The results of proof-of-concept testing and characterization of two prototype cameras are included, along with a summary of the design analyses for the development of a multiple-channel system. A significant result of this effort was the creation of a system-level model, which was used to facilitate development and predict performance. It includes models for the liquid crystal tunable filter and intensified CCD. Such modeling was necessary in the design of the system and is useful for evaluation of the system in remote-sensing applications. Also presented are characterization data from component testing, which included quantitative results for linearity, signal-to-noise ratio (SNR), and radiometric response. These data were used to help refine and validate the model. For a pre-defined source, the spatial and spectral response and the noise of the camera system can now be predicted. The innovation that sets this development apart is the fact that this instrument has been designed for integrated, multi-channel operation for the express purpose of real-time detection/identification in low-light-level conditions. Many of the requirements for the TFC were derived from this mission. In order to provide

  10. Development of NSSS Control System Performance Verification Tool

    International Nuclear Information System (INIS)

    Sohn, Suk Whun; Song, Myung Jun

    2007-01-01

    Thanks to its many control systems and control components, a nuclear power plant can be operated safely and efficiently under transient conditions as well as steady-state conditions. If a fault or an error exists in the control systems, the plant may experience unwanted and unexpected transients. Therefore, the performance of these control systems and components should be completely verified through the power ascension tests of the startup period. However, there are many occasions when control components must be replaced, control logic modified, or setpoints changed, and it is important to verify the performance of the changed control system without redoing the power ascension tests. Up to now, a simulation method using the computer codes employed for the design of nuclear power plants was commonly used to verify performance. However, if the hardware characteristics of the control system are changed, or the software in the control system has an unexpected fault or error, this simulation method is not sufficient to verify the performance of the changed control system. Many tests related to V and V (Verification and Validation) are performed in the factory as well as in the plant to eliminate errors that might be introduced during hardware manufacturing or software coding, but these field tests and the simulation method are still insufficient to guarantee the performance of a changed control system. Two unexpected transients that occurred during the YGN 5 and 6 startup period are good examples of this. One occurred at 50% reactor power and caused a reactor trip; the other occurred during the 70% loss-of-main-feedwater-pump test and caused an excessive turbine runback

  11. Development and verification of a reciprocating test rig designed for investigation of piston ring tribology

    DEFF Research Database (Denmark)

    Pedersen, Michael Torben; Imran, Tajammal; Klit, Peder

    2009-01-01

    This paper describes the development and verification of a reciprocating test rig, which was designed to study the piston ring tribology. A crank mechanism is used to generate a reciprocating motion for a moving plate, which acts as the liner. A stationary block acting as the ring package is loaded......, which is suitable for the study of piston ring tribology....

  12. Development Modules for Specification of Requirements for a System of Verification of Parallel Algorithms

    Directory of Open Access Journals (Sweden)

    Vasiliy Yu. Meltsov

    2012-05-01

    This paper presents the results of the development of one of the modules of a system for the verification of parallel algorithms, which is used to verify the inference engine. The module is designed to build the specification of requirements whose satisfaction by the algorithm must be proved (tested).

  13. Development and Implementation of Cgcre Accreditation Program for Greenhouse Gas Verification Bodies

    International Nuclear Information System (INIS)

    Fermam, Ricardo Kropf Santos; De Queiroz, Andrea Barroso Melo Monteiro

    2016-01-01

    An organizational innovation is defined as the implementation of a new organizational method in a firm's business practices, in the organization of its workplace, or in its external relations. This work illustrates such a Cgcre innovation by presenting the development process for greenhouse gas verification bodies in Brazil according to the Brazilian accreditation body, the General Coordination for Accreditation (Cgcre). (paper)

  14. Development of nuclear standard filter elements for PWR plant

    International Nuclear Information System (INIS)

    Weng Minghui; Wu Jidong; Gu Xiuzhang; Zhang Jinghua

    1988-11-01

    Model FRX-5 and FRX-10 nuclear standard filter elements are used for fluid clarification in the chemical and volume control system (CVCS), boron recycle system (BRS), spent fuel pit cooling system (SFPCS) and steam generator blowdown system (SGBS) of the Qinshan Nuclear Power Plant. Radioactive contaminants, resin fragments and other impurities are collected by these filter elements. The core of each filter element consists of polypropylene frames and a paper filter medium bonded with resin. A variety of filter papers were tested for optimization. The flow rate and overall performance were measured under simulated conditions. The results showed that the performance and lifetime meet the design requirements. The advantages of the filter elements are simple manufacture, lower cost and ease of waste disposal. At present, some of the filter elements have been produced and put into operation

  15. Development and verification of printed circuit board toroidal transformer model

    DEFF Research Database (Denmark)

    Pejtersen, Jens; Mønster, Jakob Døllner; Knott, Arnold

    2013-01-01

    An analytical model of an air core printed circuit board embedded toroidal transformer configuration is presented. The transformer has been developed for galvanic isolation of very high frequency switch-mode dc-dc power converter applications. The theoretical model is developed and verified by comparing calculated parameters with 3D finite element simulations and experimental measurement results. The developed transformer model shows good agreement with the simulated and measured results. The model can be used to predict the parameters of printed circuit board toroidal transformer configurations...
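
    The record does not reproduce the analytical model itself. As a point of reference, a textbook first approximation for an air-core toroid with rectangular cross-section is sketched below; the formula, the function name and the example dimensions are assumptions for illustration and are much simpler than the paper's model.

        import math

        def toroid_inductance(n_turns, height_m, r_inner_m, r_outer_m):
            """L = mu0 * N^2 * h * ln(r_out / r_in) / (2*pi) for a rectangular-section air-core toroid."""
            mu0 = 4e-7 * math.pi
            return mu0 * n_turns ** 2 * height_m * math.log(r_outer_m / r_inner_m) / (2 * math.pi)

        # e.g. 20 turns on a 1.6 mm thick board, 5 mm inner / 10 mm outer radius -> roughly 90 nH
        print(toroid_inductance(20, 1.6e-3, 5e-3, 10e-3))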

  16. Development and verification test of integral reactor major components

    Energy Technology Data Exchange (ETDEWEB)

    Kim, J. I.; Kim, Y. W.; Kim, J. H. and others

    1999-03-01

    The conceptual designs for the SG, MCP and CEDM to be installed in the integral reactor SMART were developed. Three-dimensional CAD models for the major components were developed to visualize the design concepts. A once-through helical steam generator was conceptually designed for SMART. A canned motor pump was adopted in the conceptual design of the MCP. Linear pulse motor type and ball-screw type CEDMs, which have fine control capabilities, were studied for adoption in SMART. In parallel with the structural design, the electro-magnetic design was performed for sizing the motors and electro-magnets. Prototypes of the CEDM and MCP sub-assemblies were developed and tested to verify their performance. The impeller design procedure and a computer program to analyze the dynamic characteristics of the MCP rotor shaft were developed. The design concepts of the SG, MCP and CEDM were also investigated for fabricability.

  17. Development and verification test of integral reactor major components

    International Nuclear Information System (INIS)

    Kim, J. I.; Kim, Y. W.; Kim, J. H. and others

    1999-03-01

    The conceptual designs for the SG, MCP and CEDM to be installed in the integral reactor SMART were developed. Three-dimensional CAD models for the major components were developed to visualize the design concepts. A once-through helical steam generator was conceptually designed for SMART. A canned motor pump was adopted in the conceptual design of the MCP. Linear pulse motor type and ball-screw type CEDMs, which have fine control capabilities, were studied for adoption in SMART. In parallel with the structural design, the electro-magnetic design was performed for sizing the motors and electro-magnets. Prototypes of the CEDM and MCP sub-assemblies were developed and tested to verify their performance. The impeller design procedure and a computer program to analyze the dynamic characteristics of the MCP rotor shaft were developed. The design concepts of the SG, MCP and CEDM were also investigated for fabricability

  18. Development of acid-resistant HEPA filter components

    International Nuclear Information System (INIS)

    Terada, K.; Woodard, R.W.; Buttedahl, O.I.

    1981-01-01

    Laboratory and in-service tests of various HEPA filter media and separators were conducted to establish their relative resistances to HNO3-HF vapors. Filter medium of glass fiber with Nomex additive and aluminum separators with an epoxy-vinyl coating have performed quite well in the acid environment in the laboratory, and in prototype filters placed in service in a plenum at Rocky Flats. Proprietary filters with new designs and/or components were also tested in service with generally good results

  19. Development of Real Time Implementation of 5/5 Rule based Fuzzy Logic Controller Shunt Active Power Filter for Power Quality Improvement

    Science.gov (United States)

    Puhan, Pratap Sekhar; Ray, Pravat Kumar; Panda, Gayadhar

    2016-12-01

    This paper presents the effectiveness of a 5/5-rule fuzzy logic controller, used in conjunction with an indirect control technique, in enhancing power quality in a single-phase system. An indirect current controller in conjunction with the fuzzy logic controller is applied to the proposed shunt active power filter to estimate the peak reference current and capacitor voltage. Current-controller-based pulse width modulation (CCPWM) is used to generate the switching signals of the voltage source inverter. Various simulation results are presented to verify the good behaviour of the shunt active power filter (SAPF) with the proposed two-level hysteresis current controller (HCC). For real-time verification of the shunt active power filter, the proposed control algorithm has been implemented on a laboratory setup on the dSPACE platform.
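
    The two-level hysteresis current controller mentioned above can be summarized in a few lines: the inverter switches whenever the current error leaves a tolerance band around its reference. The sketch below is a generic illustration with assumed signal names; it is not the dSPACE implementation from the paper.

        def hysteresis_switch(i_ref, i_meas, band, state):
            """Two-level hysteresis current controller (illustrative): returns the
            inverter switching state (+1 / -1) that drives the filter current
            toward its reference.  'state' is the previous switching state."""
            err = i_ref - i_meas
            if err > band:        # current too low  -> apply positive voltage
                return +1
            if err < -band:       # current too high -> apply negative voltage
                return -1
            return state          # inside the band  -> keep the previous state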

  20. Effective Development and Verification of Railway Control Software

    DEFF Research Database (Denmark)

    Haxthausen, Anne Elisabeth

    2011-01-01

    This document presents a method for the effective development of software for a product line of similar railway control systems. The software is constructed in three steps: first a specification in a domain-specific language is created, then a formal behavioural controller model is automatically......

  1. Status of development and verification of the CTFD code FLUBOX

    International Nuclear Information System (INIS)

    Graf, U.; Paradimitriou, P.

    2004-01-01

    The Computational Two-Fluid Dynamics (CTFD) code FLUBOX is being developed at GRS for the multidimensional simulation of two-phase flows. FLUBOX will also be used as a multidimensional module for the German system code ATHLET. The benchmark test cases of the European ASTAR project were used to verify the ability of the code FLUBOX to calculate typical two-phase flow phenomena and conditions: void and pressure wave propagation, phase transitions, countercurrent flows, sharp interface movements, compressible (vapour) and nearly incompressible (water) conditions, thermal and mechanical non-equilibrium, and stiff source terms due to mass and heat transfer between the phases. Realistic simulations of two-phase flow require, besides the pure conservation equations, additional transport equations for the interfacial area, turbulent energy and dissipation. A transport equation for the interfacial area density covering the whole two-phase flow range is in development; first validation calculations are presented in the paper. Turbulent shear stress for two-phase flows will be modelled through the development of transport equations for the turbulent kinetic energy and the turbulent dissipation rate. The development of the transport equations is mainly based on first principles applied to bubbles or drops and is largely free from empiricism. (author)

  2. Intelligent Tools for Planning Knowledge base Development and Verification

    Science.gov (United States)

    Chien, Steve A.

    1996-01-01

    A key obstacle hampering fielding of AI planning applications is the considerable expense of developing, verifying, updating, and maintaining the planning knowledge base (KB). Planning systems must be able to compare favorably in terms of software lifecycle costs to other means of automation such as scripts or rule-based expert systems.

  3. Developing a verification tool for calculations dissemination through COBAYA

    International Nuclear Information System (INIS)

    Sabater Alcaraz, A.; Rucabado Rucabado, G.; Cuervo Gomez, D.; Garcia Herranz, N.

    2014-01-01

    The development of a software tool that automates the comparison of results against previous versions of the code and against higher-accuracy reference models is crucial for implementing new functionalities in the code. The work presented here comprises the generation of that tool and of the set of reference cases that make up the verification matrix. (Author)

  4. Verification and Validation in a Rapid Software Development Process

    Science.gov (United States)

    Callahan, John R.; Easterbrook, Steve M.

    1997-01-01

    The high cost of software production is driving development organizations to adopt more automated design and analysis methods such as rapid prototyping, computer-aided software engineering (CASE) tools, and high-level code generators. Even developers of safety-critical software systems have adopted many of these new methods while striving to achieve high levels of quality and reliability. While these new methods may enhance productivity and quality in many cases, we examine some of the risks involved in the use of new methods in safety-critical contexts. We examine a case study involving the use of a CASE tool that automatically generates code from high-level system designs. We show that while high-level testing on the system structure is highly desirable, significant risks exist in the automatically generated code and in re-validating releases of the generated code after subsequent design changes. We identify these risks and suggest process improvements that retain the advantages of rapid, automated development methods within the quality and reliability contexts of safety-critical projects.

  5. Development and verification of symptom based emergency procedure support system

    International Nuclear Information System (INIS)

    Saijou, Nobuyuki; Sakuma, Akira; Takizawa, Yoji; Tamagawa, Naoko; Kubota, Ryuji; Satou, Hiroyuki; Ikeda, Koji; Taminami, Tatsuya

    1998-01-01

    A computerized Emergency Procedure Guideline (EPG) Support System has been developed for BWRs and evaluated using a training simulator. It aims to enhance the effective utilization of the EPG. The system automatically identifies suitable symptom-based operating procedures for the present plant status. It has two functions: a plant status identification function and a man-machine interface function. To realize the former, a method that identifies and prioritizes suitable symptom-based operating procedures for the present plant status has been developed. As the man-machine interface, an operation flow chart display has been developed, which graphically expresses the flow of the identified operating procedures. For easy understanding of the display, important information such as plant status changes, the priority of operating procedures, and the completion status of operations is shown on the operation flow display in different colors. As an evaluation test, the response of the system to design basis accidents was evaluated by actual plant operators using the training simulator at the BWR Training Center. Analysis of interviews with and questionnaires given to the operators showed that the system is effective and can be utilized in a real plant. (author)

  6. EVA Development and Verification Testing at NASA's Neutral Buoyancy Laboratory

    Science.gov (United States)

    Jairala, Juniper C.; Durkin, Robert; Marak, Ralph J.; Sipila, Stepahnie A.; Ney, Zane A.; Parazynski, Scott E.; Thomason, Arthur H.

    2012-01-01

    As an early step in the preparation for future Extravehicular Activities (EVAs), astronauts perform neutral buoyancy testing to develop and verify EVA hardware and operations. Neutral buoyancy demonstrations at NASA Johnson Space Center's Sonny Carter Training Facility to date have primarily evaluated assembly and maintenance tasks associated with several elements of the International Space Station (ISS). With the retirement of the Shuttle, completion of ISS assembly, and introduction of commercial players for human transportation to space, evaluations at the Neutral Buoyancy Laboratory (NBL) will take on a new focus. Test objectives are selected for their criticality, lack of previous testing, or design changes that justify retesting. Assembly tasks investigated are performed using procedures developed by the flight hardware providers and the Mission Operations Directorate (MOD). Orbital Replacement Unit (ORU) maintenance tasks are performed using a more systematic set of procedures, EVA Concept of Operations for the International Space Station (JSC-33408), also developed by the MOD. This paper describes the requirements and process for performing a neutral buoyancy test, including typical hardware and support equipment requirements, personnel and administrative resource requirements, examples of ISS systems and operations that are evaluated, and typical operational objectives that are evaluated.

  7. Performance verification of Surface Mapping Instrument developed at CGM

    DEFF Research Database (Denmark)

    Bariani, Paolo

    The need to measure narrow structures, at the micro and nano scale, over a broader range can be satisfied by the use of highly resolving techniques, such as atomic force microscopy (AFM), in combination with probe relocation and data file stitching. At the Technical University of Denmark, research has been carried out over the past years involving an AFM probe mounted on a coordinate measuring machine (CMM). Sensor repositioning by the CMM has made possible the inspection of relatively large samples, which are normally not investigable with AFMs. The latest step in the development...

  8. Development of a liquid filter testing technique using radioisotope

    International Nuclear Information System (INIS)

    Kumar, Surender; Ramarathinam, K.; Khan, A.A.

    1979-01-01

    Efficient removal of suspended matter from liquids has always been in demand in industry, as a process requirement and for the recovery of suspended materials. In the nuclear industry, filters are required to remove fine suspended matter from water in reactors, effluent treatment plants, fuel reprocessing plants, etc. The filters are used to maintain clarity and to keep the activity level to a minimum. In effluent treatment plants, low-level liquid waste is discharged to the environment after active suspended matter is removed by filters. Various types of liquid filters are available on the market to meet the demands of different industries. These filters must be evaluated for their effectiveness in removing particulate matter from liquids. The filters are evaluated using several techniques such as gravimetric analysis, turbidity measurement, and direct counting of particles using optical and electronic instruments. All these techniques have their own advantages and disadvantages. Counting of radioactive particles using radiation counters is a simple and sensitive technique. It involves the neutron activation of selected test powders, which are dispersed in the liquid and led through the test filter; the upstream and downstream concentrations are measured using a GM counter. This technique was found to be consistent and reproducible in the low, middle and high ranges of efficiency. The selection of a test powder, its activation, and its use for evaluating liquid filters are dealt with. (auth.)
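
    The measurement principle described above reduces to a simple calculation: the removal efficiency follows from background-corrected count rates upstream and downstream of the filter, assuming the count rate is proportional to particle concentration. The sketch below is illustrative; the function name and the example count rates are assumptions, not data from the study.

        def filtration_efficiency(counts_up, counts_down, background=0.0):
            """Removal efficiency from GM-counter measurements of the activated test
            powder upstream and downstream of the filter (background-corrected)."""
            up = counts_up - background
            down = counts_down - background
            return 1.0 - down / up

        print(f"{filtration_efficiency(12500, 85, background=25):.4%}")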

  9. STAMPS: development and verification of swallowing kinematic analysis software.

    Science.gov (United States)

    Lee, Woo Hyung; Chun, Changmook; Seo, Han Gil; Lee, Seung Hak; Oh, Byung-Mo

    2017-10-17

    Swallowing impairment is a common complication in various geriatric and neurodegenerative diseases. Swallowing kinematic analysis is essential to quantitatively evaluate the swallowing motion of the oropharyngeal structures. This study aims to develop a novel swallowing kinematic analysis software, called the spatio-temporal analyzer for motion and physiologic study (STAMPS), and to verify its validity and reliability. STAMPS was developed in MATLAB, which is one of the most popular platforms for biomedical analysis. The software was constructed to acquire, process, and analyze the data of swallowing motion. The target swallowing structures include bony structures (hyoid bone, mandible, maxilla, and cervical vertebral bodies), cartilages (epiglottis and arytenoid), soft tissues (larynx and upper esophageal sphincter), and the food bolus. Numerous functions are available for the spatiotemporal parameters of the swallowing structures. Testing for validity and reliability was performed in 10 dysphagia patients with diverse etiologies and using an instrumental swallowing model designed to mimic the motion of the hyoid bone and the epiglottis. The intra- and inter-rater reliability tests showed excellent agreement for displacement and moderate to excellent agreement for velocity. The Pearson correlation coefficients between the measured and instrumental reference values were nearly 1.00. The software is expected to be useful for researchers who are interested in swallowing motion analysis.
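
    The core spatiotemporal computation behind parameters such as hyoid displacement and velocity is straightforward once marker coordinates have been tracked and calibrated. The sketch below illustrates it in Python (STAMPS itself is implemented in MATLAB); the function name, frame rate and calibration factor are assumptions for illustration.

        import numpy as np

        def marker_kinematics(xy_pixels, fps, mm_per_pixel):
            """Displacement and velocity of a tracked structure (e.g. the hyoid bone)
            from frame-by-frame pixel coordinates (illustrative only)."""
            xy_mm = np.asarray(xy_pixels, dtype=float) * mm_per_pixel   # calibrate to mm
            t = np.arange(len(xy_mm)) / fps                             # time axis, s
            disp = np.linalg.norm(xy_mm - xy_mm[0], axis=1)             # mm from start
            vel = np.gradient(disp, t)                                  # mm/s
            return disp, vel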

  10. Bringing Automated Formal Verification to PLC Program Development

    CERN Document Server

    Fernández Adiego, Borja; Blanco Viñuela, Enrique

    Automation is the field of engineering that deals with the development of control systems for operating industrial processes, railways, machinery or aircraft without human intervention. In most cases, a failure in these control systems can cause a disaster in terms of economic losses, environmental damage or loss of human life. For that reason, providing safe, reliable and robust control systems is a first-priority goal for control engineers. Ideally, control engineers should be able to guarantee that both software and hardware fulfill the design requirements. This is an enormous challenge on which industry and academia have been working and making progress in the last decades. This thesis focuses on one particular type of control system that operates industrial processes, the PLC (Programmable Logic Controller)-based control system. Moreover, it targets one of the main challenges for these systems: guaranteeing that PLC programs are compliant with their specifications. Traditionally ...

  11. Spaceport Command and Control System Automated Verification Software Development

    Science.gov (United States)

    Backus, Michael W.

    2017-01-01

    For as long as we have walked the Earth, humans have always been explorers. We have visited our nearest celestial body and sent Voyager 1 beyond our solar system out into interstellar space. Now it is finally time for us to step beyond our home and onto another planet. The Spaceport Command and Control System (SCCS) is being developed along with the Space Launch System (SLS) to take us on a journey further than ever attempted. Within SCCS are separate subsystems and system-level software, each of which has to be tested and verified. Testing is a long and tedious process, so automating it is much more efficient and also helps to remove the possibility of human error from mission operations. I was part of a team of interns and full-time engineers who automated tests for the requirements on SCCS, and with that was able to help verify that the software systems are performing as expected.

  12. Development and Verification of Smoothed Particle Hydrodynamics Code for Analysis of Tsunami near NPP

    Energy Technology Data Exchange (ETDEWEB)

    Jo, Young Beom; Kim, Eung Soo [Seoul National Univ., Seoul (Korea, Republic of)

    2014-10-15

    The analysis becomes more complicated when the shape and phase of the ground below the seawater are considered; therefore, different approaches are required to precisely analyze the behavior of a tsunami. This paper introduces ongoing code development activities at SNU based on an unconventional mesh-free fluid analysis method called Smoothed Particle Hydrodynamics (SPH), together with verification work using some practice simulations. The paper summarizes the ongoing development and verification of the Lagrangian mesh-free SPH code at SNU. The newly developed code so far covers the equations of motion and the heat conduction equation, and verification of each model has been completed. In addition, parallel computation using GPUs is now possible, and a GUI has also been prepared. If users change the input geometry or input values, they can simulate various conditions and geometries. The SPH method has large advantages and potential for modeling free surfaces, highly deformable geometries and multi-phase problems that traditional grid-based codes have difficulty analyzing. Therefore, by incorporating more complex physical models such as turbulent flow, phase change, two-phase flow, and even solid mechanics, the application of the current SPH code is expected to be extended considerably, including to molten fuel behavior in severe accidents.
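
    At the heart of any SPH code are a smoothing kernel and the particle summations built on it. As a generic illustration (not code from the SNU development, whose kernel choice is not stated in the record), the sketch below implements the common 2-D cubic-spline kernel and the SPH density summation.

        import numpy as np

        def cubic_spline_w(r, h):
            """Standard 2-D cubic-spline smoothing kernel W(r, h) (one common choice)."""
            sigma = 10.0 / (7.0 * np.pi * h * h)
            q = r / h
            w = np.zeros_like(q)
            m1 = q < 1.0
            m2 = (q >= 1.0) & (q < 2.0)
            w[m1] = 1.0 - 1.5 * q[m1] ** 2 + 0.75 * q[m1] ** 3
            w[m2] = 0.25 * (2.0 - q[m2]) ** 3
            return sigma * w

        def density(positions, masses, h):
            """SPH density summation: rho_i = sum_j m_j W(|r_i - r_j|, h)."""
            d = np.linalg.norm(positions[:, None, :] - positions[None, :, :], axis=-1)
            return (masses[None, :] * cubic_spline_w(d, h)).sum(axis=1)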

  13. RAPID FREEFORM SHEET METAL FORMING: TECHNOLOGY DEVELOPMENT AND SYSTEM VERIFICATION

    Energy Technology Data Exchange (ETDEWEB)

    Kiridena, Vijitha [Ford Scientific Research Lab., Dearborn, MI (United States); Verma, Ravi [Boeing Research and Technology (BR& T), Seattle, WA (United States); Gutowski, Timothy [Massachusetts Inst. of Technology (MIT), Cambridge, MA (United States); Roth, John [Pennsylvania State Univ., University Park, PA (United States)

    2018-03-31

    The objective of this project is to develop a transformational RApid Freeform sheet metal Forming Technology (RAFFT) in an industrial environment, which has the potential to increase manufacturing energy efficiency up to ten times, at a fraction of the cost of conventional technologies. The RAFFT technology is a flexible and energy-efficient process that eliminates the need for having geometry-specific forming dies. The innovation lies in the idea of using the energy resource at the local deformation area which provides greater formability, process control, and process flexibility relative to traditional methods. Double-Sided Incremental Forming (DSIF), the core technology in RAFFT, is a new concept for sheet metal forming. A blank sheet is clamped around its periphery and gradually deformed into a complex 3D freeform part by two strategically aligned stylus-type tools that follow a pre-described toolpath. The two tools, one on each side of the blank, can form a part with sharp features for both concave and convex shapes. Since deformation happens locally, the forming force at any instant is significantly decreased when compared to traditional methods. The key advantages of DSIF are its high process flexibility, high energy-efficiency, low capital investment, and the elimination of the need for massive amounts of die casting and machining. Additionally, the enhanced formability and process flexibility of DSIF can open up design spaces and result in greater weight savings.

  14. Development of a Self-Sluicing Pressure Leaf Filter

    Science.gov (United States)

    Cousineau, Bernard L.; Lumsden, J. R.

    The cylindrical Kelly filter presses installed in the Ewarton Works "C" phase did not perform satisfactorily because of difficulties with head seals, locking rings, and shell retraction mechanisms. As rectification would have required major modifications, the concept of a press which did not need to be opened for sluicing was proposed. Test work on various sluicing and reslurrying spray arrangements was carried out, and this led to the design of a self-sluicing press which used the shell of an existing Kelly press with its main axis vertical. One press had been converted by July 1972, and a development period started. Although initial operation was encouraging, effective sluicing could not be guaranteed after 30 shifts. Modifications to leaf spacing, spray rotational speed, spray slot width, feed pressure and pre-coat control by November 1973, however, allowed effective performance over the full 800-hour canvas life. The advantages are: reduced operating and maintenance manpower, a clean environment, and reduced maintenance cost. The use of 1st wash overflow for sluicing has reduced caustic soda and canvas consumption. Ewarton Works now has four converted self-sluicing presses and is converting five more, and Arvida Works plans the installation of one for tests on red pressing (blow-off filtration). A side benefit of the development was the study of the benefits of constant pressure overflow filtration.

  15. Development and verification test of integral reactor major components - Development of MCP impeller design, performance prediction code and experimental verification

    Energy Technology Data Exchange (ETDEWEB)

    Chung, Myung Kyoon; Oh, Woo Hyoung; Song, Jae Wook [Korea Advanced Institute of Science and Technology, Taejon (Korea)

    1999-03-01

    The present study is aimed at developing a computational code for the design and performance prediction of an axial-flow pump. The proposed performance prediction method, based on the streamline curvature method, is tested against a model axial-flow pump. The preliminary design is made by using the ideal velocity triangles at inlet and exit, and the three-dimensional blade shape is calculated by employing the free-vortex design method. The detailed blading design is then carried out using an experimental database of double-circular-arc cambered hydrofoils. To computationally determine the design incidence, deviation, blade camber, solidity and stagger angle, a number of correlation equations are developed from the experimental database and a theoretical formula for the lift coefficient is adopted. A total of 8 equations are solved iteratively using an under-relaxation factor. Experimental measurements are conducted under non-cavitating conditions to obtain the off-design performance curve, and a cavitation test is also carried out by reducing the suction pressure. The experimental results compare very satisfactorily with the predictions of the streamline curvature method. 28 refs., 26 figs., 11 tabs. (Author)
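
    The free-vortex step mentioned above fixes the spanwise swirl distribution (r·Vθ = const), from which the relative flow angles, and hence the preliminary blade angles, follow from the velocity triangles. The sketch below illustrates only that step with assumed numbers; it is not the 8-equation correlation system of the report.

        import numpy as np

        def free_vortex_blade_angles(r, r_mid, u_mid, vx, vtheta_mid):
            """Free-vortex blading sketch: swirl satisfies V_theta * r = const, so the
            relative flow angle varies with radius (all inputs are assumed values)."""
            vtheta = vtheta_mid * r_mid / r                 # free-vortex swirl distribution
            u = u_mid * r / r_mid                           # blade speed scales with radius
            beta = np.degrees(np.arctan2(u - vtheta, vx))   # relative flow angle from axial
            return vtheta, beta

        r = np.linspace(0.06, 0.12, 5)                      # radii, m (assumed)
        print(free_vortex_blade_angles(r, r_mid=0.09, u_mid=20.0, vx=5.0, vtheta_mid=4.0))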

  16. DEVELOPMENT OF AN INNOVATIVE LASER SCANNER FOR GEOMETRICAL VERIFICATION OF METALLIC AND PLASTIC PARTS

    DEFF Research Database (Denmark)

    Carmignato, Simone; De Chiffre, Leonardo; Fisker, Rune

    2008-01-01

    and plastic parts. A first prototype of the novel measuring system has been developed, using laser triangulation. The system, besides ensuring the automatic reconstruction of complete surface models, has been designed to guarantee user-friendliness, versatility, reliability and speed. The paper focuses mainly...... on the metrological aspects of the system development. Details are given on procedures and artefacts developed for metrological performance verification and traceability establishment. Experimental results from measurements on metallic and plastic parts show that the system prototype is capable of performing...

  17. Status on development and verification of reactivity initiated accident analysis code for PWR (NODAL3)

    International Nuclear Information System (INIS)

    Peng Hong Liem; Surian Pinem; Tagor Malem Sembiring; Tran Hoai Nam

    2015-01-01

    A coupled neutronics thermal-hydraulics code NODAL3 has been developed based on the nodal few-group neutron diffusion theory in 3-dimensional Cartesian geometry for typical pressurized water reactor (PWR) static and transient analyses, especially for reactivity initiated accidents (RIA). The spatial variables are treated by using a polynomial nodal method (PNM), while for the neutron dynamic solver the adiabatic and improved quasi-static methods are adopted. A simple single-channel thermal-hydraulics module and its steam table are implemented in the code. Verification work on static and transient benchmarks is being conducted to assess the accuracy of the code. For the static benchmark verification, the IAEA-2D, IAEA-3D, BIBLIS and KOEBERG light water reactor (LWR) benchmark problems were selected, while for the transient benchmark verification, the OECD NEACRP 3-D LWR Core Transient Benchmark and NEA-NSC 3-D/1-D PWR Core Transient Benchmark (Uncontrolled Withdrawal of Control Rods at Zero Power) were used. Excellent agreement of the NODAL3 results with the reference solutions and other validated nodal codes was confirmed. (author)

  18. Development and evaluation of a HEPA filter for increased strength and resistance to elevated temperature

    International Nuclear Information System (INIS)

    Gilbert, H.; Bergman, W.; Fretthold, J.K.

    1992-01-01

    We have developed an improved HEPA filter for increased strength and resistance to elevated temperature to improve the reliability of HEPA filters under accident conditions. The improvements to the HEPA filter consist of a silicone rubber sealant and a new HEPA medium reinforced with a glass cloth. Several prototype filters were built and evaluated for temperature and pressure resistance and resistance to rough handling. The temperature resistance test consisted of exposing the HEPA filter to an air flow of 1,000 scfm at 700 degrees F for five minutes. The pressure resistance test consisted of exposing the HEPA filter to a differential pressure of 10 in. w.g. using a water-saturated air flow at 95 degrees F. For the rough handling test, we used a vibrating machine designated the Q110. DOP filter efficiency tests were performed before and after each of the environmental tests. In addition to following the standard practice of using a separate new filter for each environmental test, we also subjected the same filter to the elevated temperature test followed by the pressure resistance test. The efficiency test results show that the improved HEPA filter is significantly better than the standard HEPA filter.

  19. SWAAM-code development and verification and application to steam generator designs

    International Nuclear Information System (INIS)

    Shin, Y.W.; Valentin, R.A.

    1990-01-01

    This paper describes the family of SWAAM codes which were developed by Argonne National Laboratory to analyze the effects of sodium-water reactions on LMR steam generators. The SWAAM codes were developed as design tools for analyzing various phenomena related to steam generator leaks and the resulting thermal and hydraulic effects on the steam generator and the intermediate heat transport system (IHTS). The paper discusses the theoretical foundations and numerical treatments on which the codes are based, followed by a description of code capabilities and limitations, verification of the codes and applications to steam generator and IHTS designs. 25 refs., 14 figs

  20. Verification and validation as an integral part of the development of digital systems for nuclear applications

    International Nuclear Information System (INIS)

    Straker, E.A.; Thomas, N.C.

    1983-01-01

    The nuclear industry's current attitude toward verification and validation (V and V) is realized through the experiences gained to date. On the basis of these experiences, V and V can effectively be applied as an integral part of digital system development for nuclear electric power applications. An overview of a typical approach for integrating V and V with system development is presented. This approach represents a balance between V and V as applied in the aerospace industry and the standard practice commonly applied within the nuclear industry today

  1. Whole-core thermal-hydraulic transient code development and verification for LMFBR analysis

    International Nuclear Information System (INIS)

    Spencer, D.R.

    1979-04-01

    Predicted performance during both steady state and transient reactor operation determines the steady state operating limits on LMFBRs. Unnecessary conservatism in performance predictions will not contribute to safety, but will restrict the reactor to more conservative, less economical steady state operation. The most general method for reducing analytical conservatism in LMFBR's without compromising safety is to develop, validate and apply more sophisticated computer models to the limiting performance analyses. The purpose of the on-going Natural Circulation Verification Program (NCVP) is to develop and validate computer codes to analyze natural circulation transients in LMFBRs, and thus, replace unnecessary analytical conservatism with demonstrated calculational capability

  2. A standardized approach to verification and validation to assist in expert system development

    International Nuclear Information System (INIS)

    Hines, J.W.; Hajek, B.K.; Miller, D.W.; Haas, M.A.

    1992-01-01

    For the past six years, the Nuclear Engineering Program's Artificial Intelligence (AI) Group at The Ohio State University has been developing an integration of expert systems to act as an aid to nuclear power plant operators. This Operator Advisor consists of four modules that monitor plant parameters, detect deviations from normality, diagnose the root cause of the abnormality, manage procedures to effectively respond to the abnormality, and mitigate its consequences. To aid in the development of this new system, a standardized Verification and Validation (V and V) approach is being implemented. The primary functions are to guide the development of the expert system and to ensure that the end product fulfills the initial objectives. The development process has been divided into eight life-cycle V and V phases from concept to operation and maintenance. Each phase has specific V and V tasks to be performed to ensure a quality end product. Four documents are being used to guide development. The Software Verification and Validation Plan (SVVP) outlines the V and V tasks necessary to verify the product at the end of each software development phase, and to validate that the end product complies with the established software and system requirements and meets the needs of the user. The Software Requirements Specification (SRS) documents the essential requirements of the system. The Software Design Description (SDD) represents these requirements with a specific design. And lastly, the Software Test Document establishes a testing methodology to be used throughout the development life-cycle

  3. Development of independent MU/treatment time verification algorithm for non-IMRT treatment planning: A clinical experience

    Science.gov (United States)

    Tatli, Hamza; Yucel, Derya; Yilmaz, Sercan; Fayda, Merdan

    2018-02-01

    The aim of this study is to develop an algorithm for independent MU/treatment time (TT) verification for non-IMRT treatment plans, as part of a QA program to ensure treatment delivery accuracy. Two radiotherapy delivery units and their treatment planning systems (TPS) were commissioned in Liv Hospital Radiation Medicine Center, Tbilisi, Georgia. Beam data were collected according to the vendors' collection guidelines and AAPM report recommendations, and processed in Microsoft Excel during in-house algorithm development. The algorithm is designed and optimized for calculating SSD and SAD treatment plans, based on AAPM TG-114 dose calculation recommendations, and coded and embedded in an MS Excel spreadsheet as a preliminary verification algorithm (VA). Treatment verification plans were created by the TPSs based on IAEA TRS-430 recommendations and also calculated by the VA; point measurements were collected with a solid water phantom and compared. The study showed that the in-house VA can be used for MU/TT verification of non-IMRT plans.
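
    For illustration, a TG-114-style isocentric (SAD) monitor-unit check of the kind such a spreadsheet performs can be sketched in a few lines of Python; all beam-data values here are hypothetical and the function is not the authors' implementation:

        def monitor_units_sad(dose_cGy, dose_rate_ref=1.0, Sc=1.0, Sp=1.0,
                              TPR=1.0, WF=1.0, TF=1.0, SAD=100.0, SPD=100.0):
            """Simplified SAD (isocentric) MU check.
            dose_rate_ref: cGy/MU at the reference calibration point;
            Sc, Sp: collimator and phantom scatter factors; TPR: tissue-phantom
            ratio at the calculation depth; WF, TF: wedge and tray factors."""
            inverse_square = (SAD / SPD) ** 2
            return dose_cGy / (dose_rate_ref * Sc * Sp * TPR * WF * TF * inverse_square)

        # Hypothetical example: 200 cGy per fraction, open field, TPR = 0.74
        print(round(monitor_units_sad(200.0, TPR=0.74), 1))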

  4. Development of a double-layered ceramic filter for aerosol filtration at high-temperatures: the filter collection efficiency.

    Science.gov (United States)

    de Freitas, Normanda L; Gonçalves, José A S; Innocentini, Murilo D M; Coury, José R

    2006-08-25

    The performance of double-layered ceramic filters for aerosol filtration at high temperatures was evaluated in this work. The filtering structure was composed of two layers: a thin granular membrane deposited on a reticulate ceramic support of high porosity. The goal was to minimize the high pressure drop inherent in granular structures, without decreasing their high collection efficiency for small particles. The reticulate support was developed using the technique of ceramic replication of polyurethane foam substrates of 45 and 75 pores per inch (ppi). The filtering membrane was prepared by depositing a thin layer of granular alumina-clay paste on one face of the support. Filters had their permeability and fractional collection efficiency analyzed for filtration of an airborne suspension of phosphatic rock at temperatures ranging from ambient to 700 degrees C. Results revealed that collection efficiency decreased with gas temperature and was enhanced with filtration time. Also, the support layer influenced the collection efficiency: the 75 ppi support was more effective than the 45 ppi. Particle collection efficiency dropped considerably for particles below 2 µm in diameter. The maximum collection occurred for particle diameters of approximately 3 µm, and decreased again for diameters between 4 and 8 µm. This trend was successfully represented by the proposed correlation, which is based on the classical mechanisms acting on particle collection. Inertial impaction seems to be the predominant collection mechanism, with particle bouncing/re-entrainment acting as detachment mechanisms.
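
    The dependence of collection efficiency on bed depth, grain size and porosity noted above is often expressed through a log-penetration relation for granular layers; a minimal Python sketch under that assumption (the single-collector efficiency and geometry values are illustrative, not the filters tested here):

        import math

        def bed_efficiency(eta_single, bed_depth_m, grain_diam_m, porosity):
            """Overall collection efficiency of a granular layer from the
            log-penetration law:
            penetration = exp(-1.5 * (1 - eps)/eps * (L / d_c) * eta_single)."""
            exponent = (1.5 * (1.0 - porosity) / porosity
                        * bed_depth_m / grain_diam_m * eta_single)
            return 1.0 - math.exp(-exponent)

        # Hypothetical membrane layer: 2 mm deep, 100 um grains, porosity 0.45
        print(bed_efficiency(eta_single=0.02, bed_depth_m=2e-3,
                             grain_diam_m=100e-6, porosity=0.45))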

  5. Developing topic-specific search filters for PubMed with click-through data.

    Science.gov (United States)

    Li, J; Lu, Z

    2013-01-01

    Search filters have been developed and demonstrated for better information access to the immense and ever-growing body of publications in the biomedical domain. However, to date the number of filters remains quite limited because the current filter development methods require significant human efforts in manual document review and filter term selection. In this regard, we aim to investigate automatic methods for generating search filters. We present an automated method to develop topic-specific filters on the basis of users' search logs in PubMed. Specifically, for a given topic, we first detect its relevant user queries and then include their corresponding clicked articles to serve as the topic-relevant document set accordingly. Next, we statistically identify informative terms that best represent the topic-relevant document set using a background set composed of topic irrelevant articles. Lastly, the selected representative terms are combined with Boolean operators and evaluated on benchmark datasets to derive the final filter with the best performance. We applied our method to develop filters for four clinical topics: nephrology, diabetes, pregnancy, and depression. For the nephrology filter, our method obtained performance comparable to the state of the art (sensitivity of 91.3%, specificity of 98.7%, precision of 94.6%, and accuracy of 97.2%). Similarly, high-performing results (over 90% in all measures) were obtained for the other three search filters. Based on PubMed click-through data, we successfully developed a high-performance method for generating topic-specific search filters that is significantly more efficient than existing manual methods. All data sets (topic-relevant and irrelevant document sets) used in this study and a demonstration system are publicly available at http://www.ncbi.nlm.nih.gov/CBBresearch/Lu/downloads/CQ_filter/
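
    For reference, the benchmark evaluation step described above reduces to standard retrieval metrics computed from retrieved/relevant document counts; a minimal Python sketch (the counts in the example are hypothetical, not the study's data):

        def filter_metrics(tp, fp, fn, tn):
            """Standard metrics for a Boolean search filter evaluated against a
            gold-standard (relevant/irrelevant) document set."""
            sensitivity = tp / (tp + fn)          # also called recall
            specificity = tn / (tn + fp)
            precision = tp / (tp + fp)
            accuracy = (tp + tn) / (tp + fp + fn + tn)
            return sensitivity, specificity, precision, accuracy

        # Hypothetical counts for a candidate clinical-topic filter
        print([round(m, 3) for m in filter_metrics(tp=913, fp=52, fn=87, tn=3948)])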

  6. The measurement of X-rays radiation temperature with a new developed filter-fluorescence spectroscopy

    International Nuclear Information System (INIS)

    Zhang Chuanfei; Lin Libin; Lou Fuhong; Peng Taiping

    2001-01-01

    The authors describe how to measure the energy spectra of X-rays by filter-fluorescence spectroscopy. The design principle and structure of the newly developed double diaphragms and the five-channel filter-fluorescence spectrometer are depicted. The parameters of the spectrometer, optimized by a numerical method, are given. The filter-fluorescence spectrometer, designed according to the Rousseau balance principle, improves the signal-to-noise ratio.

  7. Autonomic networking-on-chip bio-inspired specification, development, and verification

    CERN Document Server

    Cong-Vinh, Phan

    2011-01-01

    Despite the growing mainstream importance and unique advantages of autonomic networking-on-chip (ANoC) technology, Autonomic Networking-On-Chip: Bio-Inspired Specification, Development, and Verification is among the first books to evaluate research results on formalizing this emerging NoC paradigm, which was inspired by the human nervous system. The first book to assess research results, opportunities, and trends in "BioChipNets", it is the third book in the Embedded Multi-Core Systems series from CRC Press: an advanced technical guide and reference composed of contributions from prominent researchers.

  8. Working Group 2: Future Directions for Safeguards and Verification, Technology, Research and Development

    International Nuclear Information System (INIS)

    Zykov, S.; Blair, D.

    2013-01-01

    For traditional safeguards it was recognized that the hardware presently available is, in general, addressing adequately fundamental IAEA needs, and that further developments should therefore focus mainly on improving efficiencies (i.e. increasing cost economies, reliability, maintainability and user-friendliness, keeping abreast of continual advancements in technologies and of the evolution of verification approaches). Specific technology areas that could benefit from further development include: -) Non-destructive measurement systems (NDA), in particular, gamma-spectroscopy and neutron counting techniques; -) Containment and surveillance tools, such as tamper indicating seals, video-surveillance, surface identification methods, etc.; -) Geophysical methods for design information verification (DIV) and safeguarding of geological repositories; and -) New tools and methods for real-time monitoring. Furthermore, the Working Group acknowledged that a 'building block' (or modular) approach should be adopted towards technology development, enabling equipment to be upgraded efficiently as technologies advance. Concerning non-traditional safeguards, in the area of satellite-based sensors, increased spatial resolution and broadened spectral range were identified as priorities. In the area of wide area surveillance, the development of LIDAR-like tools for atmospheric sensing was discussed from the perspective of both potential benefits and certain limitations. Recognizing the limitations imposed by the human brain in terms of information assessment and analysis, technologies are needed that will enable the more effective utilization of all information, regardless of its format and origin. The paper is followed by the slides of the presentation. (A.C.)

  9. Developing particulate thin filter using coconut fiber for motor vehicle emission

    Science.gov (United States)

    Wardoyo, A. Y. P.; Juswono, U. P.; Riyanto, S.

    2016-03-01

    The number of motor vehicles in Indonesia has increased sharply from year to year, with the increment reaching 22% per annum. Meanwhile, motor vehicles produce particulate emissions of different sizes and high concentrations, depending on the type of vehicle, fuel, and engine capacity. Particle emissions from motor vehicles not only contribute significantly to atmospheric particles but are also adverse to human health. In order to reduce these particle emissions, a filter is needed. This study aimed to develop a thin filter using coconut fiber to reduce particulate emissions from motor vehicles. The filter was made of coconut fibers that were ground into powder and mixed with glue. The filter was tested by measuring the particle concentrations coming directly out of the vehicle exhaust and the particle concentrations after passing through the filter. The efficiency of the filter was calculated from the ratio of the particle concentration entering the filter to the particle concentration after passing through the filter. The results showed that the efficiency of the filter was more than 30%. The efficiency increases sharply when a number of the filters are arranged in parallel.
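
    The efficiency definition used above is a simple concentration ratio; a minimal Python sketch with hypothetical concentrations:

        def filtration_efficiency(c_in, c_out):
            """Collection efficiency (%) from particle concentrations measured
            before (c_in) and after (c_out) the filter."""
            return 100.0 * (1.0 - c_out / c_in)

        # Hypothetical exhaust measurement (particles per cm^3)
        print(filtration_efficiency(c_in=5.2e5, c_out=3.5e5))  # about 32.7 %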

  10. Development of a computerized portal verification scheme for pelvic treatment fields

    International Nuclear Information System (INIS)

    Nie, K.; Yin, F.-F.; Gao, Q.; Brasacchio, R.

    1996-01-01

    Purpose/Objective: At present, treatment verification between portal and reference images is performed based on manually-identified features by radiation oncologist, which is both time-consuming and potentially error-prone. There is a demand for the computerized verification procedure in clinical application. The purpose of this study is to develop a computerized portal verification scheme for pelvic treatment fields. Materials/Methods: The automated verification system involves image acquisition, image feature extraction, feature matching between reference and portal images and quantitative evaluation of patient setup. Electronic portal images with a matrix size of 256 x 256 and 12 bit gray levels were acquired using a liquid matrix electronic portal imaging device. Simulation images were acquired by digitizing simulation films using a TV camera into images with 256 x 256 matrix and 8 bit gray levels. Initially a Canny edge detector is applied to identify the field edges and an elliptic Fourier transformation is used to correlate the size and shape information between the reference and portal field edges. Several measures can be calculated using the transformation coefficients to describe the field shape, size and orientation. The quantitative information regarding to the relative shifts, rotation and magnification factor between portal and reference field edges can then be determined based on these measures. Next the pelvic brim, which is typically used as the landmark for radiation treatment verification, is identified by a pyramid searching process with double snakes defined from initial global area to final local area. A snake is an active model and energy-minimizing spline guided by external constraint forces and influenced by image forces that pull it toward features such as lines and edges. The search range is limited to the region between two snakes. Sobel edge detector and wavelet transformation approach are used to generate a serial image forces at
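
    The shift, rotation, and magnification estimation described above can be approximated, for illustration, with simple image moments of binary field masks; a minimal Python sketch (a simplified stand-in, not the elliptic Fourier implementation used by the authors):

        import numpy as np

        def field_shape_params(mask):
            """Centroid, area, and principal-axis angle of a binary field mask."""
            ys, xs = np.nonzero(mask)
            cx, cy = xs.mean(), ys.mean()
            area = float(len(xs))
            mu20 = ((xs - cx) ** 2).mean()
            mu02 = ((ys - cy) ** 2).mean()
            mu11 = ((xs - cx) * (ys - cy)).mean()
            angle = 0.5 * np.arctan2(2.0 * mu11, mu20 - mu02)
            return (cx, cy), area, angle

        def setup_deviation(ref_mask, portal_mask):
            (rx, ry), ref_area, ref_angle = field_shape_params(ref_mask)
            (px, py), por_area, por_angle = field_shape_params(portal_mask)
            shift = (px - rx, py - ry)                    # pixels
            rotation = np.degrees(por_angle - ref_angle)  # degrees
            magnification = np.sqrt(por_area / ref_area)  # linear scale factor
            return shift, rotation, magnification

        # Hypothetical rectangular apertures used only to exercise the sketch
        ref = np.zeros((64, 64), bool); ref[20:40, 22:42] = True
        por = np.zeros((64, 64), bool); por[22:42, 25:45] = True
        print(setup_deviation(ref, por))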

  11. Progress on the development of NbZr Radio frequency band reject filters

    International Nuclear Information System (INIS)

    Hudak, J.J.; Alper, M.; Cotte, D.; Gardner, C.G.; Harvey, A.

    1983-01-01

    This chapter reports on the design and testing of a tunable superconducting filter element fabricated from Nb25%Zr having a transition temperature of 11 K. The filter element will serve as a component in a multielement filter bank to be cooled to less than 10 K by a two stage Gifford-McMahon refrigerator. A radio frequency (RF) interference rejection system composed of a set of tunable superconducting filter elements is being developed to supplement conventional interference rejection tehcniques. The thermal loading performance of the 8.5 K Gifford-McMahon refrigerator is found to exceed 2 watts at 10 K on the second stage with a 10 watt loading on the first stage. A superconducting filter bank consisting of tunable narrow band RF filters applied to strong interfering signals can be used to match the dynamic range of the RF signal environment to that of the receiving system

  12. Verification of gamma knife based fractionated radiosurgery with newly developed head-thorax phantom

    International Nuclear Information System (INIS)

    Bisht, Raj Kishor; Kale, Shashank Sharad; Natanasabapathi, Gopishankar; Singh, Manmohan Jit; Agarwal, Deepak; Garg, Ajay; Rath, Goura Kishore; Julka, Pramod Kumar; Kumar, Pratik; Thulkar, Sanjay; Sharma, Bhawani Shankar

    2016-01-01

    Objective: Purpose of the study is to verify the Gamma Knife Extend™ system (ES) based fractionated stereotactic radiosurgery with newly developed head-thorax phantom. Methods: Phantoms are extensively used to measure radiation dose and verify treatment plan in radiotherapy. A human upper body shaped phantom with thorax was designed to simulate fractionated stereotactic radiosurgery using Extend™ system of Gamma Knife. The central component of the phantom aids in performing radiological precision test, dosimetric evaluation and treatment verification. A hollow right circular cylindrical space of diameter 7.0 cm was created at the centre of this component to place various dosimetric devices using suitable adaptors. The phantom is made of poly methyl methacrylate (PMMA), a transparent thermoplastic material. Two sets of disk assemblies were designed to place dosimetric films in (1) horizontal (xy) and (2) vertical (xz) planes. Specific cylindrical adaptors were designed to place thimble ionization chamber inside phantom for point dose recording along xz axis. EBT3 Gafchromic films were used to analyze and map radiation field. The focal precision test was performed using 4 mm collimator shot in phantom to check radiological accuracy of treatment. The phantom head position within the Extend™ frame was estimated using encoded aperture measurement of repositioning check tool (RCT). For treatment verification, the phantom with inserts for film and ion chamber was scanned in reference treatment position using X-ray computed tomography (CT) machine and acquired stereotactic images were transferred into Leksell Gammaplan (LGP). A patient treatment plan with hypo-fractionated regimen was delivered and identical fractions were compared using EBT3 films and in-house MATLAB codes. Results: RCT measurement showed an overall positional accuracy of 0.265 mm (range 0.223 mm–0.343 mm). Gamma index analysis across fractions exhibited close agreement between LGP and film
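
    The fraction-to-fraction film comparison mentioned above relies on gamma index analysis, which the authors performed with in-house MATLAB code; for illustration only, a brute-force global 2D gamma computation in Python, where the criteria, pixel size, and dose-map names are assumptions:

        import numpy as np

        def gamma_2d(ref, evl, pixel_mm, dose_tol=0.03, dta_mm=1.0):
            """Brute-force global 2D gamma index; dose_tol is relative to the
            maximum of the reference map, dta_mm is the distance-to-agreement."""
            dd = dose_tol * ref.max()
            search = int(np.ceil(3 * dta_mm / pixel_mm))  # limit the search window
            ny, nx = ref.shape
            gamma = np.full_like(ref, np.inf, dtype=float)
            for j in range(ny):
                for i in range(nx):
                    j0, j1 = max(0, j - search), min(ny, j + search + 1)
                    i0, i1 = max(0, i - search), min(nx, i + search + 1)
                    jj, ii = np.mgrid[j0:j1, i0:i1]
                    dist2 = ((jj - j) ** 2 + (ii - i) ** 2) * pixel_mm ** 2
                    dose2 = (evl[j0:j1, i0:i1] - ref[j, i]) ** 2
                    gamma[j, i] = np.sqrt((dist2 / dta_mm ** 2 + dose2 / dd ** 2).min())
            return gamma

        # Hypothetical pass-rate check between two fraction dose maps:
        # pass_rate = 100.0 * (gamma_2d(ref_map, evl_map, pixel_mm=0.4) <= 1.0).mean()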

  13. Development of exhaust air filters for reprocessing plants

    International Nuclear Information System (INIS)

    Furrer, J.; Kaempffer, R.; Jannakos, K.; Apenberg, W.

    1975-01-01

    Investigations of the iodine loading capacity of highly impregnated iodine sorption material (AC 6,120/H1) for the GWA filters (GWA: reprocessing plant for 1,500 metric tons per year of uranium) have been continued for low NO2 contents of the simulated dissolver offgas from GWA. When AC 6,120/H1 was fully loaded, conversion of about 80% of the Ag+ of the impregnation to silver iodide was reached in experiments with 1% NO2 in the carrier gas. Despite the consumption of a substantial portion of the impregnation, removal efficiencies > 99.99% were measured for a bed depth corresponding to a GWA filter stage. The test facility for examining the behavior and capacity of the AC 6,120/H1 iodine sorption material under actual conditions at SAP Marcoule (reprocessing plant) has been completed except for installation in the reprocessing plant. (orig.) [de]

  14. A standardized approach to verification and validation to assist in expert system development

    International Nuclear Information System (INIS)

    Hines, J.W.; Hajek, B.K.; Miller, D.W.; Haas, M.A.

    1992-01-01

    For the past six years, the Nuclear Engineering Program's Artificial Intelligence (AI) Group at The Ohio State University has been developing an integration of expert systems to act as an aid to nuclear power plant operators. This Operator Advisor consists of four modules that monitor plant parameters, detect deviations from normality, diagnose the root cause of the abnormality, manage procedures to effectively respond to the abnormality, and mitigate its consequences. Recently, Ohio State University received a grant from the Department of Energy's Special Research Grant Program to utilize the methodologies developed for the Operator Advisor for Heavy Water Reactor (HWR) malfunction root cause diagnosis. To aid in the development of this new system, a standardized Verification and Validation (V and V) approach is being implemented. Its primary functions are to guide the development of the expert system and to ensure that the end product fulfills the initial objectives. The development process has been divided into eight life-cycle V and V phases from concept to operation and maintenance. Each phase has specific V and V tasks to be performed to ensure a quality end product. Four documents are being used to guide development. The Software Verification and Validation Plan (SVVP) outlines the V and V tasks necessary to verify the product at the end of each software development phase, and to validate that the end product complies with the established software and system requirements and meets the needs of the user. The Software Requirements Specification (SRS) documents the essential requirements of the system. The Software Design Description (SDD) represents these requirements with a specific design. And lastly, the Software Test Document establishes a testing methodology to be used throughout the development life-cycle. 10 refs., 1 fig.

  15. Online fingerprint verification.

    Science.gov (United States)

    Upendra, K; Singh, S; Kumar, V; Verma, H K

    2007-01-01

    As organizations search for more secure authentication methods for user access, e-commerce, and other security applications, biometrics is gaining increasing attention. With an increasing emphasis on emerging automatic personal identification applications, fingerprint-based identification is becoming more popular. The most widely used fingerprint representation is the minutiae-based representation. The main drawback of this representation is that it does not utilize a significant component of the rich discriminatory information available in the fingerprints. Local ridge structures cannot be completely characterized by minutiae. Also, it is difficult to quickly match two fingerprint images containing different numbers of unregistered minutiae points. In this study a filter-bank-based representation, which eliminates these weaknesses, is implemented and the overall performance of the developed system is tested. The results have shown that this system can be used effectively for secure online verification applications.
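
    The filter-bank representation referred to above is typically built from Gabor filters applied at several orientations; a minimal Python sketch of such a bank (the kernel size, wavelength, and sigma are illustrative choices, not the study's parameters):

        import numpy as np
        from scipy.signal import convolve2d

        def gabor_kernel(size, wavelength, theta, sigma):
            """Real (even-symmetric) Gabor kernel at orientation theta (radians)."""
            half = size // 2
            y, x = np.mgrid[-half:half + 1, -half:half + 1]
            xr = x * np.cos(theta) + y * np.sin(theta)
            yr = -x * np.sin(theta) + y * np.cos(theta)
            envelope = np.exp(-(xr ** 2 + yr ** 2) / (2.0 * sigma ** 2))
            carrier = np.cos(2.0 * np.pi * xr / wavelength)
            return envelope * carrier

        def filter_bank_responses(image, n_orientations=8, wavelength=10.0, sigma=4.0):
            """Apply a bank of Gabor filters; cell-wise ridge energy of each
            response would form the feature vector in a FingerCode-style scheme."""
            responses = []
            for k in range(n_orientations):
                theta = k * np.pi / n_orientations
                kernel = gabor_kernel(33, wavelength, theta, sigma)
                responses.append(convolve2d(image, kernel, mode='same', boundary='symm'))
            return responses

        # responses = filter_bank_responses(fingerprint_image)  # image supplied by the caller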

  16. Development of an automated testing system for verification and validation of nuclear data

    International Nuclear Information System (INIS)

    Triplett, B. S.; Anghaie, S.; White, M. C.

    2008-01-01

    Verification and validation of nuclear data is critical to the accuracy of both stochastic and deterministic particle transport codes. In order to effectively test a set of nuclear data, the data must be applied to a wide variety of transport problems. Performing this task in a timely, efficient manner is tedious. The nuclear data team at Los Alamos National Laboratory (LANL), in collaboration with the University of Florida, is developing a methodology to automate the process of nuclear data verification and validation. The International Criticality Safety Benchmark Experiment Project (ICSBEP) provides a set of criticality problems that may be used to evaluate nuclear data. This process tests a number of data libraries using cases from the ICSBEP benchmark set to demonstrate how automation of these tasks may reduce errors and increase efficiency. The process is driven by an integrated set of Python scripts. Material and geometry data may be read from an existing code input file to generate a standardized template, or the template may be generated directly by the user. The user specifies the desired precision and other vital problem parameters. The Python scripts generate input decks for multiple transport codes from these templates, run and monitor individual jobs, and parse the relevant output. This output can then be used to generate reports directly or can be stored in a database for later analysis. This methodology eases the burden on the user by reducing the amount of time and effort required for obtaining and compiling calculation results. (authors)
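
    As an illustration of the driver scripts described above, a stripped-down Python sketch that fills an input template, runs a transport code, and parses a k-effective value; the command name, template fields, file names, and output pattern are assumptions, not the LANL implementation:

        import re
        import subprocess
        from string import Template

        def run_benchmark(case_name, template_path, params, code_cmd="transport_code"):
            """Fill an input template, run the (hypothetical) transport code,
            and parse a k-effective value from its standard output."""
            with open(template_path) as f:
                deck = Template(f.read()).substitute(params)
            input_file = f"{case_name}.inp"
            with open(input_file, "w") as f:
                f.write(deck)
            result = subprocess.run([code_cmd, input_file],
                                    capture_output=True, text=True, check=True)
            match = re.search(r"k-effective\s*=\s*([\d.]+)", result.stdout)
            return float(match.group(1)) if match else None

        # results = {case: run_benchmark(case, "icsbep_template.inp",
        #                                {"library": "endf80", "histories": 1_000_000})
        #            for case in ["heu-met-fast-001", "pu-sol-therm-011"]}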

  17. Development and analysis of vent-filtered containment conceptual designs

    International Nuclear Information System (INIS)

    Benjamin, A.S.; Walling, H.C.

    1980-01-01

    Conceptual filtered-vented containment systems have been postulated for a reference large, dry, pressurized water reactor containment, and the systems have been analyzed to determine design parameters, actuation/operation requirements, and overall feasibility. The primary design challenge has been found to emanate from pressure spikes caused by core debris bed interactions with water and by hydrogen deflagrations. Circumvention of the pressure spikes may require a more complicated actuation logic than has previously been considered. Otherwise, major reductions in consequences for certain severe accidents appear to be possible with relatively simple systems. A probabilistic assessment of competing risks remains to be performed

  18. CANDU RU fuel manufacturing basic technology development and advanced fuel verification tests

    International Nuclear Information System (INIS)

    Chung, Chang Hwan; Chang, S.K.; Hong, S.D.

    1999-04-01

    An advanced PHWR fuel, the CANFLEX fuel, has been developed through a KAERI/AECL joint program. The KAERI-made fuel bundle was tested at the KAERI Hot Test Loop for performance verification of the bundle design. The major test activities were the fuel bundle cross-flow test, the endurance fretting/vibration test, the freon CHF test, and the fuel bundle heat-up test. KAERI has also been developing a more advanced PHWR fuel, the CANFLEX-RU fuel, using recovered uranium to extend fuel burn-up in CANDU reactors. For the purpose of proving the safety of the RU handling techniques and appraising the feasibility of CANFLEX-RU fuel fabrication in the near future, a physical, chemical and radiological characterization of the RU powder and pellets was performed. (author). 54 refs., 46 tabs., 62 figs

  19. CANDU RU fuel manufacturing basic technology development and advanced fuel verification tests

    Energy Technology Data Exchange (ETDEWEB)

    Chung, Chang Hwan; Chang, S.K.; Hong, S.D. [and others

    1999-04-01

    An advanced PHWR fuel, the CANFLEX fuel, has been developed through a KAERI/AECL joint program. The KAERI-made fuel bundle was tested at the KAERI Hot Test Loop for performance verification of the bundle design. The major test activities were the fuel bundle cross-flow test, the endurance fretting/vibration test, the freon CHF test, and the fuel bundle heat-up test. KAERI has also been developing a more advanced PHWR fuel, the CANFLEX-RU fuel, using recovered uranium to extend fuel burn-up in CANDU reactors. For the purpose of proving the safety of the RU handling techniques and appraising the feasibility of CANFLEX-RU fuel fabrication in the near future, a physical, chemical and radiological characterization of the RU powder and pellets was performed. (author). 54 refs., 46 tabs., 62 figs.

  20. A Translator Verification Technique for FPGA Software Development in Nuclear Power Plants

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Jae Yeob; Kim, Eui Sub; Yoo, Jun Beom [Konkuk University, Seoul (Korea, Republic of)

    2014-10-15

    Although FPGAs give higher performance than a PLC (Programmable Logic Controller), the platform change from PLC to FPGA forces all PLC software engineers to give up the experience, knowledge and practices accumulated over decades, and to start FPGA-based hardware development from scratch. We have researched a solution to this problem that reduces the risk and preserves the experience and knowledge. One solution is to use the FBDtoVerilog translator, which translates the FBD programs into behavior-preserving Verilog programs. In general, PLCs are usually designed with an FBD, while FPGAs are described with an HDL (Hardware Description Language) such as Verilog or VHDL. Once the PLC designer has designed the FBD programs, FBDtoVerilog translates the FBD into Verilog mechanically. The designers, therefore, need not consider the rest of the FPGA development process (e.g., synthesis and place-and-route) and can preserve the accumulated experience and knowledge. Even if we assure that the translation from FBD to Verilog is correct, it must be verified rigorously and thoroughly since it is used in nuclear power plants, which are among the most safety-critical systems. While the designer develops the FPGA software with the FBD program translated by the translator, there are other translation tools such as the synthesis tool and the place-and-route tool. This paper also focuses on verifying them rigorously and thoroughly. There are several verification techniques for translator correctness, but they are hard to apply because of their prohibitive cost and run time. Instead, this paper uses an indirect verification technique for demonstrating the correctness of the translator, based on co-simulation. We intend to prove correctness only against the specific inputs under development for a target I and C system, not against all possible input cases.

  1. A Translator Verification Technique for FPGA Software Development in Nuclear Power Plants

    International Nuclear Information System (INIS)

    Kim, Jae Yeob; Kim, Eui Sub; Yoo, Jun Beom

    2014-01-01

    Although FPGAs give higher performance than a PLC (Programmable Logic Controller), the platform change from PLC to FPGA forces all PLC software engineers to give up the experience, knowledge and practices accumulated over decades, and to start FPGA-based hardware development from scratch. We have researched a solution to this problem that reduces the risk and preserves the experience and knowledge. One solution is to use the FBDtoVerilog translator, which translates the FBD programs into behavior-preserving Verilog programs. In general, PLCs are usually designed with an FBD, while FPGAs are described with an HDL (Hardware Description Language) such as Verilog or VHDL. Once the PLC designer has designed the FBD programs, FBDtoVerilog translates the FBD into Verilog mechanically. The designers, therefore, need not consider the rest of the FPGA development process (e.g., synthesis and place-and-route) and can preserve the accumulated experience and knowledge. Even if we assure that the translation from FBD to Verilog is correct, it must be verified rigorously and thoroughly since it is used in nuclear power plants, which are among the most safety-critical systems. While the designer develops the FPGA software with the FBD program translated by the translator, there are other translation tools such as the synthesis tool and the place-and-route tool. This paper also focuses on verifying them rigorously and thoroughly. There are several verification techniques for translator correctness, but they are hard to apply because of their prohibitive cost and run time. Instead, this paper uses an indirect verification technique for demonstrating the correctness of the translator, based on co-simulation. We intend to prove correctness only against the specific inputs under development for a target I and C system, not against all possible input cases.

  2. Towards the development of a rapid, portable, surface enhanced Raman spectroscopy based cleaning verification system for the drug nelarabine.

    Science.gov (United States)

    Corrigan, Damion K; Salton, Neale A; Preston, Chris; Piletsky, Sergey

    2010-09-01

    Cleaning verification is a scientific and economic problem for the pharmaceutical industry. A large amount of potential manufacturing time is lost to the process of cleaning verification. This involves the analysis of residues on spoiled manufacturing equipment, with high-performance liquid chromatography (HPLC) being the predominantly employed analytical technique. The aim of this study was to develop a portable cleaning verification system for nelarabine using surface enhanced Raman spectroscopy (SERS). SERS was conducted using a portable Raman spectrometer and a commercially available SERS substrate to develop a rapid and portable cleaning verification system for nelarabine. Samples of standard solutions and swab extracts were deposited onto the SERS active surfaces, allowed to dry and then subjected to spectroscopic analysis. Nelarabine was amenable to analysis by SERS and the necessary levels of sensitivity were achievable. It is possible to use this technology for a semi-quantitative limits test. Replicate precision, however, was poor due to the heterogeneous drying pattern of nelarabine on the SERS active surface. Understanding and improving the drying process in order to produce a consistent SERS signal for quantitative analysis is desirable. This work shows the potential application of SERS for cleaning verification analysis. SERS may not replace HPLC as the definitive analytical technique, but it could be used in conjunction with HPLC so that swabbing is only carried out once the portable SERS equipment has demonstrated that the manufacturing equipment is below the threshold contamination level.

  3. Filter testing and development for prolonged transuranic service and waste reduction

    International Nuclear Information System (INIS)

    Geer, J.A.; Buttedahl, O.I.; Skaats, C.D.; Terada, K.; Woodard, R.W.

    1977-02-01

    The life of High Efficiency Particulate Air (HEPA) filters used in transuranic service is influenced greatly by the gaseous and particulate matter to which the filters are exposed. The most severe conditions encountered at Rocky Flats are at the ventilation systems serving the plutonium recovery operations in Bldg. 771. A project of filter testing and development for prolonged transuranic service and waste reduction was formally initiated at Rocky Flats on July 1, 1975. The project is directed toward improving filtration methods which will prolong the life of HEPA filter systems without sacrificing effectiveness. Another important aspect of the project is to reduce the volume of HEPA filter waste shipped from the plant for long-term storage. Progress to September 30, 1976, is reported

  4. The construction of environments for development of test and verification technology -The development of advanced instrumentation and control technology-

    International Nuclear Information System (INIS)

    Kwon, Kee Choon; Ham, Chang Shick; Lee, Byung Sun; Kim, Jung Taek; Park, Won Man; Park, Jae Chang; Kim, Jae Hee; Lee, Chang Soo

    1994-07-01

    Several problems were identified in digitalizing the I and C systems of NPPs. The resolution scheme is divided into hardware and software aspects. Hardware verification and validation addressed common mode failure, the commercial grade dedication process, and electromagnetic compatibility. We have reviewed codes and standards that could serve as consensus criteria among vendors and licensers. Then we have described the software licensing procedures under 10 CFR 50 and 10 CFR 52 of the United States Nuclear Regulatory Commission (NRC) and presented vendors' approaches to coping with the licensing barrier. Finally, we have surveyed the technical issues related to developing and licensing high integrity software for digital I and C systems. (Author)

  5. Why nuclear power generation must be developed? A many-faceted verification of its irreplaceable role

    International Nuclear Information System (INIS)

    Kawai, Yuichi; Oda, Toshiyuki

    1998-01-01

    Given the poor public acceptance right now, the future of nuclear power development is not necessarily bright. Yet, from the energy security aspect, the role of nuclear power, already responsible for about 30% of Japan's generated output, is never negligible. Also, Japan could hardly meet the GHG reduction target under the Kyoto Protocol without carbon-free nuclear power generation. While Japan is required to deal with both energy security and global warming from now on, to satisfy the two concurrently without nuclear power development is nearly impossible in practical terms. We have to consider calmly how nuclear power generation should be understood and treated in our effort to ensure energy supply and mitigate global warming. With this study, the need for nuclear power development was verified anew by reevaluating nuclear power generation from many facets, which are energy (electricity) supply and demand, environmental measures, energy security, and cost. Verification results showed: On supply and demand, the absence of nuclear power causes an electricity shortage during peak hours; On environment, no GHG-free power sources but nuclear currently have a sufficient supply capacity; On energy security, nuclear fuel procurement sources are diverse and located in relatively stable areas; On cost, the strong yen and cheap oil favors fossil fuels, and the weak yen and dear oil does nuclear power, though depending on unpredictable elements to send their cost up, typically waste disposal cost incurred in nuclear power, and CO 2 reduction cost in fossil fuels. With all these factors taken into consideration, the best mix of power sources should be figured out. From the verification results, we can conclude that nuclear power is one of irreplaceable energy sources for Japan. To prepare for growing electricity demand and care the environment better, Japan has few choices but to increase the installed capacity of nuclear power generation in the years to come. (author)

  6. Development and verification of a space-dependent dynamic model of a natural circulation steam generator

    International Nuclear Information System (INIS)

    Mewdell, C.G.; Harrison, W.C.; Hawley, E.H.

    1980-01-01

    This paper describes the development and verification of a Non-Linear Space-Dependent Dynamic Model of a Natural Circulation Steam Generator typical of boilers used in CANDU nuclear power stations. The model contains a detailed one-dimensional dynamic description of both the primary and secondary sides of an integral pre-heater natural circulation boiler. Two-phase flow effects on the primary side are included. The secondary side uses a drift-flux model in the boiling sections and a detailed non-equilibrium point model for the steam drum. The paper presents the essential features of the final model called BOILER-2, its solution scheme, the RD-12 loop and test boiler, the boiler steady-state and transient experiments, and the comparison of the model predictions with experimental results. (author)
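
    The boiling-section drift-flux treatment mentioned above can be illustrated with the Zuber-Findlay void-fraction relation; a minimal Python sketch (the C0, Vgj, and property values are illustrative, not the BOILER-2 correlations):

        def drift_flux_void_fraction(x, G, rho_g, rho_f, C0=1.13, Vgj=0.2):
            """Zuber-Findlay drift-flux relation:
            alpha = j_g / (C0 * (j_g + j_f) + Vgj), with superficial velocities
            computed from flow quality x and mass flux G [kg/m^2 s]."""
            j_g = x * G / rho_g
            j_f = (1.0 - x) * G / rho_f
            return j_g / (C0 * (j_g + j_f) + Vgj)

        # Illustrative steam/water conditions
        print(round(drift_flux_void_fraction(x=0.1, G=500.0, rho_g=22.5, rho_f=790.0), 3))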

  7. Development of a Compton camera for online ion beam range verification via prompt γ detection

    Energy Technology Data Exchange (ETDEWEB)

    Aldawood, S. [LMU Munich, Garching (Germany); King Saud University, Riyadh (Saudi Arabia); Liprandi, S.; Marinsek, T.; Bortfeldt, J.; Lang, C.; Lutter, R.; Dedes, G.; Parodi, K.; Thirolf, P.G. [LMU Munich, Garching (Germany); Maier, L.; Gernhaeuser, R. [TU Munich, Garching (Germany); Kolff, H. van der [LMU Munich, Garching (Germany); TU Delft (Netherlands); Castelhano, I. [LMU Munich, Garching (Germany); University of Lisbon, Lisbon (Portugal); Schaart, D.R. [TU Delft (Netherlands)

    2015-07-01

    Precise and preferably online ion beam range verification is a mandatory prerequisite to fully exploit the advantages of hadron therapy in cancer treatment. An imaging system is being developed in Garching aiming to detect promptγ rays induced by nuclear reactions between the ion beam and biological tissue. The Compton camera prototype consists of a stack of six customized double-sided Si-strip detectors (DSSSD, 50 x 50 mm{sup 2}, 0.5 mm thick, 128 strips/side) acting as scatterer, while the absorber is formed by a monolithic LaBr{sub 3}:Ce scintillator crystal (50 x 50 x 30 mm{sup 3}) read out by a position-sensitive multi-anode photomultiplier (Hamamatsu H9500). The on going characterization of the Compton camera properties and its individual components both offline in the laboratory as well as online using proton beam are presented.

  8. The medline UK filter: development and validation of a geographic search filter to retrieve research about the UK from OVID medline.

    Science.gov (United States)

    Ayiku, Lynda; Levay, Paul; Hudson, Tom; Craven, Jenny; Barrett, Elizabeth; Finnegan, Amy; Adams, Rachel

    2017-07-13

    A validated geographic search filter for the retrieval of research about the United Kingdom (UK) from bibliographic databases had not previously been published. To develop and validate a geographic search filter to retrieve research about the UK from OVID medline with high recall and precision. Three gold standard sets of references were generated using the relative recall method. The sets contained references to studies about the UK which had informed National Institute for Health and Care Excellence (NICE) guidance. The first and second sets were used to develop and refine the medline UK filter. The third set was used to validate the filter. Recall, precision and number-needed-to-read (NNR) were calculated using a case study. The validated medline UK filter demonstrated 87.6% relative recall against the third gold standard set. In the case study, the medline UK filter demonstrated 100% recall, 11.4% precision and an NNR of nine. A validated geographic search filter to retrieve research about the UK with high recall and precision has been developed. The medline UK filter can be applied to systematic literature searches in OVID medline for topics with a UK focus.
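
    The relative recall and number-needed-to-read (NNR) figures quoted above can be reproduced from raw counts as follows; a minimal Python sketch (the counts shown are hypothetical placeholders, not the study's data):

        def relative_recall(retrieved_gold, total_gold):
            """Relative recall: gold-standard records retrieved by the filter
            divided by all gold-standard records, as a percentage."""
            return 100.0 * retrieved_gold / total_gold

        def number_needed_to_read(relevant_retrieved, total_retrieved):
            """NNR = 1 / precision: records screened per relevant record found."""
            precision = relevant_retrieved / total_retrieved
            return 1.0 / precision

        # Hypothetical counts
        print(relative_recall(retrieved_gold=589, total_gold=672))
        print(number_needed_to_read(relevant_retrieved=57, total_retrieved=500))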

  9. Design, Development, and Automated Verification of an Integrity-Protected Hypervisor

    Science.gov (United States)

    2012-07-16

    ... also require considerable manual effort. For example, the verification of the seL4 operating system [45] required several man-years of effort. [45] ... Winwood. seL4: formal verification of an OS kernel. In Proc. of SOSP, 2009.

  10. Rectifier Filters

    Directory of Open Access Journals (Sweden)

    Y. A. Bladyko

    2010-01-01

    The paper contains a definition of a smoothing factor which is suitable for any rectifier filter. Formulae for complex smoothing factors have been developed for simple and complex passive filters. The paper shows the conditions for application of the calculation formulae and filters.

  11. Development of anti-debris filter for WWER-440 working fuel assembly

    International Nuclear Information System (INIS)

    Kolosovsky, V.; Aksyonov, P.; Kukushkin, Y.; Molchanov, V.; Kolobaev, A.

    2006-01-01

    Mechanical damaging of the fuel rod claddings caused by debris is one of the main reasons for fuel assembly failures. The paper focuses on the program and results of experimental and design activities carried out by Russian organizations relating to the development and investigation of operational characteristics of anti-debris filters for WWER-440 working fuel assemblies. Lead working fuel assemblies equipped with anti-debris filters have been loaded in the core of Kola-2 NPP. The results obtained can be used for making the decision concerning the application of anti-debris filter for WWER-440 working fuel assemblies with the purpose of enhancing their debris-resistance properties. (authors)

  12. Development of the quickmix injector for in-situ filter testing

    International Nuclear Information System (INIS)

    Costigan, G.; Loughborough, D.

    1993-01-01

    In-situ filter testing is routinely carried out on nuclear ventilation plant to assess the effectiveness of installed filter systems. Ideally the system is tested by introducing a sub-micron aerosol upstream of the filter, in such a way as to present a uniform challenge to the whole of the upstream filter face. Samples are withdrawn from upstream and downstream of the filter, and the respective concentrations are used to calculate the system (or filter) efficiency. These requirements are documented in the Atomic Energy Code of Practice, AECP 1054. The Filter Development Section at Harwell Laboratory has been investigating methods of improving the accuracy and reliability of the in-situ filter test over the past ten years. The programme has included the evaluation of devices used to mix the aerosol and multi-point samplers to obtain representative aerosol samples. This paper reports the results of laboratory trials on the "QUICKMIX" injector developed and patented by Harwell. The Quickmix injector is designed to mix the test aerosol with the air stream and thereby reduce the duct length required to produce uniform concentrations. The injector has been tested in ducts ranging from 150 mm diameter to 610 mm square, at air velocities up to 26 m/s. Upstream mixing lengths required to achieve a ± 10% concentration variation on the mean were reduced to between 2 and 5 duct diameters, with a very small pressure drop. This simple, compact device is being installed in new and existing plants in the UK to improve the accuracy and reliability of in-situ filter testing. Some examples of plant applications are given, together with some of the first results from operating plants.

  13. Physics Verification Overview

    Energy Technology Data Exchange (ETDEWEB)

    Doebling, Scott William [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-09-12

    The purpose of the verification project is to: establish, through rigorous convergence analysis, that each ASC computational physics code correctly implements a set of physics models and algorithms (code verification); evaluate and analyze the uncertainties of code outputs associated with the choice of temporal and spatial discretization (solution or calculation verification); and develop and maintain the capability to expand and update these analyses on demand. This presentation describes project milestones.

  14. ENVIRONMENTAL TECHNOLOGY VERIFICATION: TEST REPORT OF MOBILE SOURCE EMISSION CONTROL DEVICES--PUREM NORTH AMERICA LLC, PMF GREENTEC 1004205.00.0 DIESEL PARTICULATE FILTER

    Science.gov (United States)

    The U.S. EPA has created the Environmental Technology Verification (ETV) program to provide high quality, peer reviewed data on technology performance to those involved in the design, distribution, financing, permitting, purchase, and use of environmental technologies. The Air Po...

  15. Battery algorithm verification and development using hardware-in-the-loop testing

    Energy Technology Data Exchange (ETDEWEB)

    He, Yongsheng [General Motors Global Research and Development, 30500 Mound Road, MC 480-106-252, Warren, MI 48090 (United States); Liu, Wei; Koch, Brain J. [General Motors Global Vehicle Engineering, Warren, MI 48090 (United States)

    2010-05-01

    Battery algorithms play a vital role in hybrid electric vehicles (HEVs), plug-in hybrid electric vehicles (PHEVs), extended-range electric vehicles (EREVs), and electric vehicles (EVs). The energy management of hybrid and electric propulsion systems needs to rely on accurate information on the state of the battery in order to determine the optimal electric drive without abusing the battery. In this study, a cell-level hardware-in-the-loop (HIL) system is used to verify and develop state of charge (SOC) and power capability predictions of embedded battery algorithms for various vehicle applications. Two different batteries were selected as representative examples to illustrate the battery algorithm verification and development procedure. One is a lithium-ion battery with a conventional metal oxide cathode, which is a power battery for HEV applications. The other is a lithium-ion battery with an iron phosphate (LiFePO{sub 4}) cathode, which is an energy battery for applications in PHEVs, EREVs, and EVs. The battery cell HIL testing provided valuable data and critical guidance to evaluate the accuracy of the developed battery algorithms, to accelerate battery algorithm future development and improvement, and to reduce hybrid/electric vehicle system development time and costs. (author)
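
    HIL verification of SOC algorithms needs a reference SOC trace to compare against; a minimal Python sketch of coulomb counting over a measured current profile (the capacity, sign convention, and efficiency are assumptions, not the embedded algorithm under test):

        def coulomb_count_soc(current_a, dt_s, capacity_ah, soc0=1.0, eta=1.0):
            """Reference SOC trace by coulomb counting.
            current_a: sequence of cell currents [A], positive = discharge."""
            soc = soc0
            trace = []
            for i in current_a:
                soc -= eta * i * dt_s / (capacity_ah * 3600.0)
                trace.append(min(max(soc, 0.0), 1.0))
            return trace

        # Hypothetical 1C discharge pulse on a 2.3 Ah cell, 1 s time steps
        profile = [2.3] * 600 + [0.0] * 300
        print(round(coulomb_count_soc(profile, 1.0, 2.3)[-1], 3))  # about 0.833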

  16. Development of An Automatic Verification Program for Thermal-hydraulic System Codes

    Energy Technology Data Exchange (ETDEWEB)

    Lee, J. Y.; Ahn, K. T.; Ko, S. H.; Kim, Y. S.; Kim, D. W. [Pusan National University, Busan (Korea, Republic of); Suh, J. S.; Cho, Y. S.; Jeong, J. J. [System Engineering and Technology Co., Daejeon (Korea, Republic of)

    2012-05-15

    As a project activity of the capstone design competitive exhibition, supported by the Education Center for Green Industry-friendly Fusion Technology (GIFT), we have developed a computer program which can automatically perform non-regression test, which is needed repeatedly during a developmental process of a thermal-hydraulic system code, such as the SPACE code. A non-regression test (NRT) is an approach to software testing. The purpose of the non-regression testing is to verify whether, after updating a given software application (in this case, the code), previous software functions have not been compromised. The goal is to prevent software regression, whereby adding new features results in software bugs. As the NRT is performed repeatedly, a lot of time and human resources will be needed during the development period of a code. It may cause development period delay. To reduce the cost and the human resources and to prevent wasting time, non-regression tests need to be automatized. As a tool to develop an automatic verification program, we have used Visual Basic for Application (VBA). VBA is an implementation of Microsoft's event-driven programming language Visual Basic 6 and its associated integrated development environment, which are built into most Microsoft Office applications (In this case, Excel)
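
    The non-regression comparison itself, implemented by the authors in VBA within Excel, amounts to checking new results against reference results within tolerances; for illustration, the same logic as a Python sketch with assumed CSV result files and tolerance values:

        import csv

        def non_regression_check(ref_csv, new_csv, rel_tol=1e-6, abs_tol=1e-9):
            """Compare a candidate code run against reference results column by
            column; return a list of (row, column, ref, new) mismatches."""
            mismatches = []
            with open(ref_csv) as fr, open(new_csv) as fn:
                ref_rows, new_rows = list(csv.reader(fr)), list(csv.reader(fn))
                for r, (ref_row, new_row) in enumerate(zip(ref_rows, new_rows)):
                    for c, (a, b) in enumerate(zip(ref_row, new_row)):
                        try:
                            x, y = float(a), float(b)
                        except ValueError:
                            continue  # skip non-numeric fields (headers, labels)
                        if abs(x - y) > max(abs_tol, rel_tol * abs(x)):
                            mismatches.append((r, c, x, y))
            return mismatches

        # passed = not non_regression_check("reference_run.csv", "candidate_run.csv")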

  17. Battery algorithm verification and development using hardware-in-the-loop testing

    Science.gov (United States)

    He, Yongsheng; Liu, Wei; Koch, Brain J.

    Battery algorithms play a vital role in hybrid electric vehicles (HEVs), plug-in hybrid electric vehicles (PHEVs), extended-range electric vehicles (EREVs), and electric vehicles (EVs). The energy management of hybrid and electric propulsion systems needs to rely on accurate information on the state of the battery in order to determine the optimal electric drive without abusing the battery. In this study, a cell-level hardware-in-the-loop (HIL) system is used to verify and develop state of charge (SOC) and power capability predictions of embedded battery algorithms for various vehicle applications. Two different batteries were selected as representative examples to illustrate the battery algorithm verification and development procedure. One is a lithium-ion battery with a conventional metal oxide cathode, which is a power battery for HEV applications. The other is a lithium-ion battery with an iron phosphate (LiFePO 4) cathode, which is an energy battery for applications in PHEVs, EREVs, and EVs. The battery cell HIL testing provided valuable data and critical guidance to evaluate the accuracy of the developed battery algorithms, to accelerate battery algorithm future development and improvement, and to reduce hybrid/electric vehicle system development time and costs.

  18. Development of An Automatic Verification Program for Thermal-hydraulic System Codes

    International Nuclear Information System (INIS)

    Lee, J. Y.; Ahn, K. T.; Ko, S. H.; Kim, Y. S.; Kim, D. W.; Suh, J. S.; Cho, Y. S.; Jeong, J. J.

    2012-01-01

    As a project activity of the capstone design competitive exhibition, supported by the Education Center for Green Industry-friendly Fusion Technology (GIFT), we have developed a computer program which can automatically perform the non-regression tests that are needed repeatedly during the development of a thermal-hydraulic system code, such as the SPACE code. A non-regression test (NRT) is an approach to software testing. The purpose of non-regression testing is to verify that, after updating a given software application (in this case, the code), previously working functions have not been compromised. The goal is to prevent software regression, whereby adding new features results in software bugs. Because the NRT is performed repeatedly, it consumes considerable time and human resources during the development period of a code and can delay the development schedule. To reduce cost, conserve human resources, and avoid wasted time, non-regression tests need to be automated. As the tool for developing the automatic verification program, we used Visual Basic for Applications (VBA). VBA is an implementation of Microsoft's event-driven programming language Visual Basic 6 and its associated integrated development environment, which are built into most Microsoft Office applications (in this case, Excel).

  19. Development of Out-pile Test Technology for Fuel Assembly Performance Verification

    Energy Technology Data Exchange (ETDEWEB)

    Chun, Tae Hyun; In, W. K.; Oh, D. S. [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)] (and others)

    2007-03-15

    Out-pile tests with a full-scale fuel assembly are performed to verify the design and to evaluate the performance of the final products. HTL for the hydraulic tests and FAMeCT for mechanical/structural tests were constructed in this project. The maximum operating conditions of HTL are 30 bar, 320 °C, and 500 m³/h. This facility can perform the pressure drop test, fuel assembly uplift test, and flow-induced vibration test. FAMeCT can perform the bending and vibration tests. The verification of the developed facilities was carried out by comparison with reference data for the fuel assembly obtained at the Westinghouse Co. The compared data showed good agreement within uncertainties. FRETONUS, a high-temperature, high-pressure fretting wear simulator, was also developed, and a performance test was conducted for 500 hours to check the integrity, endurance, and data acquisition capability of the simulator. Computational technologies for turbulent flow analysis and finite element analysis were also developed. With the establishment of out-pile test facilities for full-scale fuel assemblies, the domestic infrastructure for PWR fuel development has been greatly upgraded.

  20. Visualization of Instrumental Verification Information Details (VIVID) : code development, description, and usage.

    Energy Technology Data Exchange (ETDEWEB)

    Roy, Christopher John; Bainbridge, Bruce L.; Potter, Donald L.; Blottner, Frederick G.; Black, Amalia Rebecca

    2005-03-01

    The formulation, implementation and usage of a numerical solution verification code is described. This code uses the Richardson extrapolation procedure to estimate the order of accuracy and error of a computational program solution. It evaluates multiple solutions performed in numerical grid convergence studies to verify a numerical algorithm implementation. Analyses are performed on both structured and unstructured grid codes. Finite volume and finite element discretization programs are examined. Two and three-dimensional solutions are evaluated. Steady state and transient solution analysis capabilities are present in the verification code. Multiple input data bases are accepted. Benchmark options are included to allow for minimal solution validation capability as well as verification.
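    A minimal sketch of the Richardson extrapolation calculation at the core of such a verification code is given below: from solutions on three systematically refined grids it estimates the observed order of accuracy and a discretization-error estimate for the fine-grid solution. The grid values are illustrative, not output of the VIVID code:

```python
# Richardson-extrapolation sketch for grid-convergence verification
# (illustrative values; not the VIVID implementation or its data).
import math

def observed_order(f_fine, f_med, f_coarse, r):
    """Observed order of accuracy p for a constant grid refinement ratio r."""
    return math.log(abs((f_coarse - f_med) / (f_med - f_fine))) / math.log(r)

def richardson_estimate(f_fine, f_med, p, r):
    """Extrapolated value and estimated discretization error of the fine grid."""
    f_exact = f_fine + (f_fine - f_med) / (r**p - 1.0)
    return f_exact, abs(f_fine - f_exact)

# Hypothetical solution values on coarse/medium/fine grids with r = 2
f_coarse, f_med, f_fine = 1.1200, 1.0500, 1.0325
p = observed_order(f_fine, f_med, f_coarse, r=2.0)
f_exact, err = richardson_estimate(f_fine, f_med, p, r=2.0)
print(f"observed order p = {p:.2f}, extrapolated value = {f_exact:.4f}, "
      f"fine-grid error estimate = {err:.4f}")
```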

  1. Development and Verification of Tritium Analyses Code for a Very High Temperature Reactor

    International Nuclear Information System (INIS)

    Oh, Chang H.; Kim, Eung S.

    2009-01-01

    A tritium permeation analyses code (TPAC) has been developed by Idaho National Laboratory for the purpose of analyzing tritium distributions in VHTR systems, including integrated hydrogen production systems. The MATLAB SIMULINK software package was used for development of the code. The TPAC is based on the mass balance equations of tritium-containing species and various forms of hydrogen (i.e., HT, H2, HTO, HTSO4, and TI), coupled with a variety of tritium source, sink, and permeation models. In the TPAC, ternary fission and neutron reactions with 6Li, 7Li, 10B, and 3He were taken into consideration as tritium sources. Purification and leakage models were implemented as the main tritium sinks. Permeation of HT and H2 through pipes, vessels, and heat exchangers was considered as the main tritium transport path. In addition, electrolyzer and isotope exchange models were developed for analyzing hydrogen production systems, including both high-temperature electrolysis and the sulfur-iodine process. The TPAC has unlimited flexibility for the system configurations and provides easy drag-and-drop model building through a graphical user interface. Verification of the code has been performed by comparisons with analytical solutions and with experimental data based on the Peach Bottom reactor design. Preliminary results calculated with a former tritium analyses code, THYTAN, which was developed in Japan and adopted by the Japan Atomic Energy Agency, were also compared with the TPAC solutions. This report contains descriptions of the basic tritium pathways, theory, a simple user guide, verifications, sensitivity studies, sample cases, and code tutorials. Tritium behaviors in a very high temperature reactor/high temperature steam electrolysis system have been analyzed by the TPAC based on the reference indirect parallel configuration proposed by Oh et al. (2007). This analysis showed that only 0.4% of tritium released from the core is transferred to the product hydrogen
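    To illustrate the mass-balance form such a code integrates, the sketch below solves a single-compartment tritium balance with a constant source, purification and permeation sinks, and radioactive decay; the rate constants are placeholder assumptions, not TPAC models or data:

```python
# Single-compartment tritium mass-balance sketch:
#   dN/dt = S - (lambda_purification + lambda_permeation + lambda_decay) * N
# All coefficients are placeholder assumptions, not TPAC models or data.
import math

S = 1.0e12           # tritium source (birth) rate, atoms/s  (assumed)
lam_purif = 1.0e-5   # purification removal constant, 1/s    (assumed)
lam_perm = 2.0e-6    # permeation loss constant, 1/s         (assumed)
lam_decay = math.log(2) / (12.3 * 365.25 * 24 * 3600)  # tritium decay, 1/s

lam = lam_purif + lam_perm + lam_decay

def inventory(t_s, n0=0.0):
    """Analytic solution of the linear balance equation at time t_s (seconds)."""
    return S / lam * (1.0 - math.exp(-lam * t_s)) + n0 * math.exp(-lam * t_s)

for days in (1, 30, 365):
    print(f"{days:4d} d: N = {inventory(days * 86400):.3e} atoms")
```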

  2. The Development of a Microbial Challenge Test with Acholeplasma laidlawii To Rate Mycoplasma-Retentive Filters by Filter Manufacturers.

    Science.gov (United States)

    Folmsbee, Martha; Lentine, Kerry Roche; Wright, Christine; Haake, Gerhard; Mcburnie, Leesa; Ashtekar, Dilip; Beck, Brian; Hutchison, Nick; Okhio-Seaman, Laura; Potts, Barbara; Pawar, Vinayak; Windsor, Helena

    2014-01-01

    Mycoplasma are bacteria that can penetrate 0.2 and 0.22 μm rated sterilizing-grade filters and even some 0.1 μm rated filters. Primary applications for mycoplasma filtration include large scale mammalian and bacterial cell culture media and serum filtration. The Parenteral Drug Association recognized the absence of standard industry test parameters for testing and classifying 0.1 μm rated filters for mycoplasma clearance and formed a task force to formulate consensus test parameters. The task force established some test parameters by common agreement, based upon general industry practices, without the need for additional testing. However, the culture medium and incubation conditions for generating test mycoplasma cells varied from filter company to filter company and were recognized as a serious gap by the task force. Standardization of the culture medium and incubation conditions required collaborative testing in both commercial filter company laboratories and in an independent laboratory (Table I). The use of consensus test parameters will facilitate the ultimate cross-industry goal of standardization of 0.1 μm filter claims for mycoplasma clearance. However, it is still important to recognize that filter performance will depend on the actual conditions of use. Therefore end users should consider, using a risk-based approach, whether process-specific evaluation of filter performance may be warranted for their application. Mycoplasma are small bacteria that have the ability to penetrate sterilizing-grade filters. Filtration of large-scale mammalian and bacterial cell culture media is an example of an industry process where effective filtration of mycoplasma is required. The Parenteral Drug Association recognized the absence of industry standard test parameters for evaluating mycoplasma clearance filters by filter manufacturers and formed a task force to formulate such a consensus among manufacturers. The use of standardized test parameters by filter manufacturers

  3. Development of nuclear safety class filter elements with long life and high quality

    International Nuclear Information System (INIS)

    Zhang Jinghua

    2009-04-01

    This paper describes the development of nuclear safety class filter elements with long life and high quality, used for collecting radioactive contaminants, resin fragments, and impurities in the primary systems of NPPs. The filter elements made of glass fibre are used for PWRs, and those made of paper are used for PHWRs. During the research, a series of optimization tests was performed for the selection of the filter material and the improvement of the binder. The flow rate and comprehensive performance were measured under simulated conditions. The results show that the application requirements for operating NPPs can be met, and the reliability and safety of the frame were also verified. The comprehensive performance of the filter elements is equivalent to that of similar overseas products. The products have been used in NPPs in operation. (authors)

  4. Octennial History of the Development and Quality of High-Efficiency Filters for the US Atomic Energy Program

    Energy Technology Data Exchange (ETDEWEB)

    Gilbert, H. [United States Atomic Energy Commission, Washington, DC (United States)

    1968-12-15

    Two facilities are operated for the US Atomic Energy Commission in its program to ensure the uniform quality of commercially manufactured high-efficiency particulate filters. The filter-testing program was started in January 1960 after it was realized that the commercial fire-resistant product incorporated deficiencies of manufacture. The record of testing for quality assurance by the two facilities and an analysis of factors governing the quality of filters are presented. The analysis is complemented with a description of efforts, made in the course of the filter testing, to improve the design of the filter for efficiency and reliability. The fire-resistant (HEPA) filter of 1959 was inadequate. The inadequacy of the filter, now judged in retrospect, was brought about by the intensive accelerated efforts to replace and preclude, wherever possible in the US atomic energy program, use of filters made of combustible materials. This desire for fire resistance of filters has proliferated widely among other members of the international atomic energy family. The intensive AEC effort caused US industry to produce filters of fire-resistant design but without the opportunity for development of manufacturing technology adequate for ensuring reliability of the filter. The state of the fire-resistant filter today is in sharp contrast to the 1959 filter. The testing program, coupled with a program for continuing improvement of the filter, has resulted in the effective removal of radioactive aerosols at atomic energy installations on a consistent and dependable basis. (author)

  5. DEVELOPING VERIFICATION SYSTEMS FOR BUILDING INFORMATION MODELS OF HERITAGE BUILDINGS WITH HETEROGENEOUS DATASETS

    Directory of Open Access Journals (Sweden)

    L. Chow

    2017-08-01

    Full Text Available The digitization and abstraction of existing buildings into building information models requires the translation of heterogeneous datasets that may include CAD, technical reports, historic texts, archival drawings, terrestrial laser scanning, and photogrammetry into model elements. In this paper, we discuss a project undertaken by the Carleton Immersive Media Studio (CIMS) that explored the synthesis of heterogeneous datasets for the development of a building information model (BIM) for one of Canada’s most significant heritage assets – the Centre Block of the Parliament Hill National Historic Site. The scope of the project included the development of an as-found model of the century-old, six-story building in anticipation of specific model uses for an extensive rehabilitation program. The as-found Centre Block model was developed in Revit using primarily point cloud data from terrestrial laser scanning. The data was captured by CIMS in partnership with Heritage Conservation Services (HCS), Public Services and Procurement Canada (PSPC), using a Leica C10 and P40 (exterior and large interior spaces) and a Faro Focus (small to mid-sized interior spaces). Secondary sources such as archival drawings, photographs, and technical reports were referenced in cases where point cloud data was not available. As a result of working with heterogeneous data sets, a verification system was introduced in order to communicate to model users/viewers the source of information for each building element within the model.

  6. Metal fuel development and verification for prototype generation- IV Sodium- Cooled Fast Reactor

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Chan Bock; Cheon, Jin Sik; Kim, Sung Ho; Park, Jeong Yong; Joo, Hyung Kook [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2016-10-15

    Metal fuel is being developed for the prototype generation-IV sodium-cooled fast reactor (PGSFR) to be built by 2028. U-Zr fuel is a driver for the initial core of the PGSFR, and U-transuranics (TRU)-Zr fuel will gradually replace U-Zr fuel through its qualification in the PGSFR. Based on the vast worldwide experiences of U-Zr fuel, work on U-Zr fuel is focused on fuel design, fabrication of fuel components, and fuel verification tests. U-TRU-Zr fuel uses TRU recovered through pyroelectrochemical processing of spent PWR (pressurized water reactor) fuels, which contains highly radioactive minor actinides and chemically active lanthanide or rare earth elements as carryover impurities. An advanced fuel slug casting system, which can prevent vaporization of volatile elements through a control of the atmospheric pressure of the casting chamber and also deal with chemically active lanthanide elements using protective coatings in the casting crucible, was developed. Fuel cladding of the ferritic-martensitic steel FC92, which has higher mechanical strength at a high temperature than conventional HT9 cladding, was developed and fabricated, and is being irradiated in the fast reactor.

  7. Metal Fuel Development and Verification for Prototype Generation IV Sodium-Cooled Fast Reactor

    Directory of Open Access Journals (Sweden)

    Chan Bock Lee

    2016-10-01

    Full Text Available Metal fuel is being developed for the prototype generation-IV sodium-cooled fast reactor (PGSFR) to be built by 2028. U–Zr fuel is a driver for the initial core of the PGSFR, and U–transuranics (TRU)–Zr fuel will gradually replace U–Zr fuel through its qualification in the PGSFR. Based on the vast worldwide experiences of U–Zr fuel, work on U–Zr fuel is focused on fuel design, fabrication of fuel components, and fuel verification tests. U–TRU–Zr fuel uses TRU recovered through pyroelectrochemical processing of spent PWR (pressurized water reactor) fuels, which contains highly radioactive minor actinides and chemically active lanthanide or rare earth elements as carryover impurities. An advanced fuel slug casting system, which can prevent vaporization of volatile elements through a control of the atmospheric pressure of the casting chamber and also deal with chemically active lanthanide elements using protective coatings in the casting crucible, was developed. Fuel cladding of the ferritic–martensitic steel FC92, which has higher mechanical strength at a high temperature than conventional HT9 cladding, was developed and fabricated, and is being irradiated in the fast reactor.

  8. Developing Verification Systems for Building Information Models of Heritage Buildings with Heterogeneous Datasets

    Science.gov (United States)

    Chow, L.; Fai, S.

    2017-08-01

    The digitization and abstraction of existing buildings into building information models requires the translation of heterogeneous datasets that may include CAD, technical reports, historic texts, archival drawings, terrestrial laser scanning, and photogrammetry into model elements. In this paper, we discuss a project undertaken by the Carleton Immersive Media Studio (CIMS) that explored the synthesis of heterogeneous datasets for the development of a building information model (BIM) for one of Canada's most significant heritage assets - the Centre Block of the Parliament Hill National Historic Site. The scope of the project included the development of an as-found model of the century-old, six-story building in anticipation of specific model uses for an extensive rehabilitation program. The as-found Centre Block model was developed in Revit using primarily point cloud data from terrestrial laser scanning. The data was captured by CIMS in partnership with Heritage Conservation Services (HCS), Public Services and Procurement Canada (PSPC), using a Leica C10 and P40 (exterior and large interior spaces) and a Faro Focus (small to mid-sized interior spaces). Secondary sources such as archival drawings, photographs, and technical reports were referenced in cases where point cloud data was not available. As a result of working with heterogeneous data sets, a verification system was introduced in order to communicate to model users/viewers the source of information for each building element within the model.

  9. REDD+ readiness: early insights on monitoring, reporting and verification systems of project developers

    International Nuclear Information System (INIS)

    Joseph, Shijo; Sunderlin, William D; Verchot, Louis V; Herold, Martin

    2013-01-01

    A functional measuring, monitoring, reporting and verification (MRV) system is essential to assess the additionality and impact on forest carbon in REDD+ (reducing emissions from deforestation and degradation) projects. This study assesses the MRV capacity and readiness of project developers at 20 REDD+ projects in Brazil, Peru, Cameroon, Tanzania, Indonesia and Vietnam, using a questionnaire survey and field visits. Nineteen performance criteria with 76 indicators were formulated in three categories, and capacity was measured with respect to each category. Of the 20 projects, 11 were found to have very high or high overall MRV capacity and readiness. At the regional level, capacity and readiness tended to be highest in the projects in Brazil and Peru and somewhat lower in Cameroon, Tanzania, Indonesia and Vietnam. Although the MRV capacities of half the projects are high, there are capacity deficiencies in other projects that are a source of concern. These are not only due to limitations in technical expertise, but can also be attributed to the slowness of international REDD+ policy formulation and the unclear path of development of the forest carbon market. Based on the study results, priorities for MRV development and increased investment in readiness are proposed. (letter)

  10. Development of optical ground verification method for μm to sub-mm reflectors

    Science.gov (United States)

    Stockman, Y.; Thizy, C.; Lemaire, P.; Georges, M.; Mazy, E.; Mazzoli, A.; Houbrechts, Y.; Rochus, P.; Roose, S.; Doyle, D.; Ulbrich, G.

    2017-11-01

    develop and realise suitable verification tools based on infrared interferometry and other optical techniques for testing large reflector structures, telescope configurations and their performances under simulated space conditions. Two methods and techniques are developed at CSL. The first one is an IR phase-shifting interferometer with high spatial resolution. This interferometer shall be used specifically for the verification of high-precision IR, FIR and sub-mm reflector surfaces and telescopes under both ambient and thermal vacuum conditions. The second one presented hereafter is a holographic method for relative shape measurement. The holographic solution proposed makes use of a home-built, vacuum-compatible holographic camera that allows displacement measurements from typically 20 nanometres to 25 microns in one shot. An iterative process allows the measurement of a total of up to several mm of deformation. Uniquely, the system is designed to measure both specular and diffuse surfaces.

  11. Development of the advanced PHWR technology -Verification tests for CANDU advanced fuel-

    Energy Technology Data Exchange (ETDEWEB)

    Jung, Jang Hwan; Suk, Hoh Chun; Jung, Moon Kee; Oh, Duk Joo; Park, Joo Hwan; Shim, Kee Sub; Jang, Suk Kyoo; Jung, Heung Joon; Park, Jin Suk; Jung, Seung Hoh; Jun, Ji Soo; Lee, Yung Wook; Jung, Chang Joon; Byun, Taek Sang; Park, Kwang Suk; Kim, Bok Deuk; Min, Kyung Hoh [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)

    1995-07-01

    This is the '94 annual report of the CANDU advanced fuel verification test project. This report describes the out-of-pile hydraulic tests at the CANDU hot test loop for verification of the CANFLEX fuel bundle. It also describes the reactor thermal-hydraulic analysis for thermal margin and flow stability. The contents of this report are as follows: (1) Out-of-pile hydraulic tests for verification of the CANFLEX fuel bundle: (a) pressure drop tests at reactor operation condition, (b) strength test during reload at static condition, (c) impact test during reload at impact load condition, (d) endurance test for verification of fuel integrity during lifetime; (2) Reactor thermal-hydraulic analysis with the CANFLEX fuel bundle: (a) critical channel power sensitivity analysis, (b) CANDU-6 channel flow analysis, (c) flow instability analysis. 61 figs, 29 tabs, 21 refs. (Author).

  12. Development of the advanced PHWR technology -Verification tests for CANDU advanced fuel-

    International Nuclear Information System (INIS)

    Jung, Jang Hwan; Suk, Hoh Chun; Jung, Moon Kee; Oh, Duk Joo; Park, Joo Hwan; Shim, Kee Sub; Jang, Suk Kyoo; Jung, Heung Joon; Park, Jin Suk; Jung, Seung Hoh; Jun, Ji Soo; Lee, Yung Wook; Jung, Chang Joon; Byun, Taek Sang; Park, Kwang Suk; Kim, Bok Deuk; Min, Kyung Hoh

    1995-07-01

    This is the '94 annual report of the CANDU advanced fuel verification test project. This report describes the out-of-pile hydraulic tests at the CANDU hot test loop for verification of the CANFLEX fuel bundle. It also describes the reactor thermal-hydraulic analysis for thermal margin and flow stability. The contents of this report are as follows: (1) Out-of-pile hydraulic tests for verification of the CANFLEX fuel bundle: (a) pressure drop tests at reactor operation condition, (b) strength test during reload at static condition, (c) impact test during reload at impact load condition, (d) endurance test for verification of fuel integrity during lifetime; (2) Reactor thermal-hydraulic analysis with the CANFLEX fuel bundle: (a) critical channel power sensitivity analysis, (b) CANDU-6 channel flow analysis, (c) flow instability analysis. 61 figs, 29 tabs, 21 refs. (Author)

  13. Cost-Effective CNC Part Program Verification Development for Laboratory Instruction.

    Science.gov (United States)

    Chen, Joseph C.; Chang, Ted C.

    2000-01-01

    Describes a computer numerical control program verification system that checks a part program before its execution. The system includes character recognition, word recognition, a fuzzy-nets system, and a tool path viewer. (SK)

  14. Characterizing proton-activated materials to develop PET-mediated proton range verification markers

    Science.gov (United States)

    Cho, Jongmin; Ibbott, Geoffrey S.; Kerr, Matthew D.; Amos, Richard A.; Stingo, Francesco C.; Marom, Edith M.; Truong, Mylene T.; Palacio, Diana M.; Betancourt, Sonia L.; Erasmus, Jeremy J.; DeGroot, Patricia M.; Carter, Brett W.; Gladish, Gregory W.; Sabloff, Bradley S.; Benveniste, Marcelo F.; Godoy, Myrna C.; Patil, Shekhar; Sorensen, James; Mawlawi, Osama R.

    2016-06-01

    Conventional proton beam range verification using positron emission tomography (PET) relies on tissue activation alone and therefore requires particle therapy PET whose installation can represent a large financial burden for many centers. Previously, we showed the feasibility of developing patient implantable markers using high proton cross-section materials (18O, Cu, and 68Zn) for in vivo proton range verification using conventional PET scanners. In this technical note, we characterize those materials to test their usability in more clinically relevant conditions. Two phantoms made of low-density balsa wood (~0.1 g cm-3) and beef (~1.0 g cm-3) were embedded with Cu or 68Zn foils of several volumes (10-50 mm3). The metal foils were positioned at several depths in the dose fall-off region, which had been determined from our previous study. The phantoms were then irradiated with different proton doses (1-5 Gy). After irradiation, the phantoms with the embedded foils were moved to a diagnostic PET scanner and imaged. The acquired data were reconstructed with 20-40 min of scan time using various delay times (30-150 min) to determine the maximum contrast-to-noise ratio. The resultant PET/computed tomography (CT) fusion images of the activated foils were then examined and the foils’ PET signal strength/visibility was scored on a 5 point scale by 13 radiologists experienced in nuclear medicine. For both phantoms, the visibility of activated foils increased in proportion to the foil volume, dose, and PET scan time. A linear model was constructed with visibility scores as the response variable and all other factors (marker material, phantom material, dose, and PET scan time) as covariates. Using the linear model, volumes of foils that provided adequate visibility (score 3) were determined for each dose and PET scan time. The foil volumes that were determined will be used as a guideline in developing practical implantable markers.
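    As a sketch of the linear-model step described above, the code below fits visibility score against marker volume, dose, and scan time by ordinary least squares and inverts the fit for the volume needed to reach a target score; the data rows are invented placeholders, not the study's measurements:

```python
# Ordinary-least-squares sketch of a visibility-score model
# (invented placeholder data; not the study's measurements).
import numpy as np

# columns: foil volume (mm^3), dose (Gy), PET scan time (min), visibility score
data = np.array([
    [10, 1, 20, 1.2],
    [20, 1, 20, 1.8],
    [30, 2, 30, 2.9],
    [40, 3, 30, 3.6],
    [50, 5, 40, 4.7],
    [30, 5, 40, 3.9],
])
X = np.column_stack([np.ones(len(data)), data[:, :3]])  # intercept + covariates
y = data[:, 3]

coef, *_ = np.linalg.lstsq(X, y, rcond=None)
print("intercept, volume, dose, scan-time coefficients:", np.round(coef, 3))

# Invert the fit: volume needed for score 3 at 2 Gy and a 30 min scan
b0, b_vol, b_dose, b_time = coef
required_volume = (3.0 - b0 - b_dose * 2 - b_time * 30) / b_vol
print(f"volume for score 3 at 2 Gy, 30 min scan: {required_volume:.1f} mm^3")
```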

  15. Survey of Verification and Validation Techniques for Small Satellite Software Development

    Science.gov (United States)

    Jacklin, Stephen A.

    2015-01-01

    The purpose of this paper is to provide an overview of the current trends and practices in small-satellite software verification and validation. This document is not intended to promote a specific software assurance method. Rather, it seeks to present an unbiased survey of software assurance methods used to verify and validate small satellite software and to make mention of the benefits and value of each approach. These methods include simulation and testing, verification and validation with model-based design, formal methods, and fault-tolerant software design with run-time monitoring. Although the literature reveals that simulation and testing has by far the longest legacy, model-based design methods are proving to be useful for software verification and validation. Some work in formal methods, though not widely used for any satellites, may offer new ways to improve small satellite software verification and validation. These methods need to be further advanced to deal with the state explosion problem and to make them more usable by small-satellite software engineers to be regularly applied to software verification. Last, it is explained how run-time monitoring, combined with fault-tolerant software design methods, provides an important means to detect and correct software errors that escape the verification process or those errors that are produced after launch through the effects of ionizing radiation.

  16. Development and Verification of the Computer Codes for the Fast Reactors Nuclear Safety Justification

    International Nuclear Information System (INIS)

    Kisselev, A.E.; Mosunova, N.A.; Strizhov, V.F.

    2015-01-01

    The information on the status of the work on the development of a system of nuclear safety codes for fast liquid metal reactors is presented in this paper. The purpose of the work is to create an instrument for NPP neutronic, thermal-hydraulic and strength justification, including radiation safety of humans and the environment. The main task to be solved by the developed system of codes is the analysis of the broad spectrum of phenomena taking place at the NPP (including the reactor itself, NPP components, containment rooms, the industrial site and the surrounding area) and the analysis of the impact of regular and accidental releases on the environment. The code system is oriented toward fully integrated, coupled modeling of NPP behavior, accounting for the wide range of significant phenomena taking place at the NPP under normal and accident conditions. It is based on models that reflect the state-of-the-art knowledge level. The codes incorporate advanced numerical methods and modern programming technologies oriented toward high-performance computing systems. The information on the status of the work on verification of the individual codes of the system is also presented. (author)

  17. Development of the Simbol-X science verification model and its contribution for the IXO Mission

    Science.gov (United States)

    Maier, Daniel; Aschauer, Florian; Dick, Jürgen; Distratis, Giuseppe; Gebhardt, Henry; Herrmann, Sven; Kendziorra, Eckhard; Lauf, Thomas; Lechner, Peter; Santangelo, Andrea; Schanz, Thomas; Strüder, Lothar; Tenzer, Chris; Treis, Johannes

    2010-07-01

    Like the International X-ray Observatory (IXO) mission, the Simbol-X mission is a projected X-ray space telescope with spectral and imaging capabilities covering the energy range from 500 eV up to 80 keV. To detect photons within this wide range of energies, a silicon-based "Depleted P-channel Field Effect Transistor" (DePFET) matrix is used as the Low Energy Detector (LED) on top of an array of CdTe-Caliste modules, which act as the High Energy Detector (HED). A Science Verification Model (SVM) consisting of one LED quadrant in front of one Caliste module will be set up at our institute (IAAT) and operated under laboratory conditions that approximate the expected environment in space. As a first step, we use the SVM to test and optimize the performance of the LED operation and data acquisition chain, consisting of an ADC, an event-preprocessor, a sequencer, and an interface controller. All these components have been developed at our institute with the objective of handling the high readout rate of approximately 8000 frames per second. The second step is to study the behaviour and the interactions of the LED and HED operating as a combined detector system. We report on the development status of the SVM and its associated electronics and present first results of the currently achieved spectral performance.

  18. Development of a consensus standard for verification and validation of nuclear system thermal-fluids software

    International Nuclear Information System (INIS)

    Harvego, Edwin A.; Schultz, Richard R.; Crane, Ryan L.

    2011-01-01

    With the resurgence of nuclear power and increased interest in advanced nuclear reactors as an option to supply abundant energy without the associated greenhouse gas emissions of the more conventional fossil fuel energy sources, there is a need to establish internationally recognized standards for the verification and validation (V and V) of software used to calculate the thermal–hydraulic behavior of advanced reactor designs for both normal operation and hypothetical accident conditions. To address this need, ASME (American Society of Mechanical Engineers) Standards and Certification has established the V and V 30 Committee, under the jurisdiction of the V and V Standards Committee, to develop a consensus standard for verification and validation of software used for design and analysis of advanced reactor systems. The initial focus of this committee will be on the V and V of system analysis and computational fluid dynamics (CFD) software for nuclear applications. To limit the scope of the effort, the committee will further limit its focus to software to be used in the licensing of High-Temperature Gas-Cooled Reactors. Although software verification will be an important and necessary part of the standard, much of the initial effort of the committee will be focused on the validation of existing software and new models that could be used in the licensing process. In this framework, the Standard should conform to Nuclear Regulatory Commission (NRC) and other regulatory practices, procedures and methods for licensing of nuclear power plants as embodied in the United States (U.S.) Code of Federal Regulations and other pertinent documents such as Regulatory Guide 1.203, “Transient and Accident Analysis Methods” and NUREG-0800, “NRC Standard Review Plan”. In addition, the Standard should be consistent with applicable sections of ASME NQA-1-2008 “Quality Assurance Requirements for Nuclear Facility Applications (QA)”. This paper describes the general

  19. Development of filters for exhaust air or off-gas cleaning

    International Nuclear Information System (INIS)

    Wilhelm, J.

    1988-01-01

    The activities of the 'Laboratorium fuer Aerosolphysik und Filtertechnik II' of the 'Kernforschungszentrum Karlsruhe' concentrate on the development of filters to be used for cleaning nuclear and conventional exhaust air and off-gas. Originally, these techniques were intended to be applied in nuclear facilities only. Their application to conventional gas purification, however, has led to a reorientation of research and development projects. By way of example, the use of the multi-way sorption filter is reported for radioiodine removal in nuclear power plants, for downstream flue-gas purification in heating power plants, and for off-gas cleaning in the chemical industry. The improvement of HEPA filters and the development of metal fibre filters have led to components which can be used under high humidity and moisture as well as at high temperatures and increased differential pressure. The experience obtained in the field of high-efficiency filtration of nuclear airborne particles is applied in the investigations concerning the removal of conventional pollutant particles in the submicron range. A technique for radioiodine removal and an improved removal of airborne particles have been developed for use in the future reprocessing plant. Thus, a maximum removal efficiency can be achieved and optimum waste management is made possible. The components obtained as a result of these activities and their use for off-gas cleaning in the Wackersdorf reprocessing plant (WAW) are reported. (orig.) [de]

  20. The development of a HEPA filter with improved dust holding characteristics

    International Nuclear Information System (INIS)

    Dyment, J.; Hamblin, C.

    1995-01-01

    A limitation of the HEPA filters used in the extract systems of nuclear facilities is their relatively low capacity for captured dust. The costs associated with the disposal of a typical filter mean that there are clear incentives to extend filter life. The work described in this report represents the initial stages in the development of a filter which incorporates a medium that enhances its dust holding capacity. Experimental equipment was installed to enable the dust loading characteristics of candidate media to be compared with those of the glass fibre based papers currently used in filter construction. These tests involved challenging representative samples of the media with an air stream containing a controlled concentration of thermally generated sodium chloride particles. The dust loading characteristics of the media were then compared in terms of the rate of increase in pressure differential. A number of "graded density" papers were subsequently identified which appeared to offer significant improvements in dust holding. In the second phase of the programme, deep-pleat filters (1,700 m³ h⁻¹) incorporating graded density papers were manufactured and tested. Improvements of up to 50% were observed in their capacity for the sub-micron sodium chloride test dust. Smaller differences (15%) were measured when a coarser, carbon black, challenge was used. This is attributed to the differences in the particle sizes of the two dusts.
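    A small sketch of how such media comparisons can be summarized numerically: fit the rate of pressure-differential rise against loaded dust mass for each medium and compare the slopes. The loading data below are illustrative, not the report's measurements:

```python
# Dust-loading comparison sketch: rate of pressure-drop rise per unit
# challenge mass for two hypothetical media (illustrative numbers only).
import numpy as np

def dp_rise_rate(mass_mg, dp_pa):
    """Least-squares slope of differential pressure vs. loaded mass (Pa/mg)."""
    slope, _ = np.polyfit(mass_mg, dp_pa, 1)
    return slope

mass = np.array([0, 20, 40, 60, 80, 100])                 # loaded NaCl mass, mg
dp_standard = np.array([250, 320, 410, 520, 660, 830])    # Pa, standard paper
dp_graded = np.array([250, 300, 360, 430, 510, 600])      # Pa, graded-density paper

r_std = dp_rise_rate(mass, dp_standard)
r_grd = dp_rise_rate(mass, dp_graded)
print(f"standard paper: {r_std:.2f} Pa/mg, graded density: {r_grd:.2f} Pa/mg")
print(f"relative improvement in loading rate: {100 * (1 - r_grd / r_std):.0f}%")
```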

  1. The development of a HEPA filter with improved dust holding characteristics

    Energy Technology Data Exchange (ETDEWEB)

    Dyment, J.; Hamblin, C.

    1995-02-01

    A limitation of the HEPA filters used in the extract systems of nuclear facilities is their relatively low capacity for captured dust. The costs associated with the disposal of a typical filter mean that there are clear incentives to extend filter life. The work described in this report represents the initial stages in the development of a filter which incorporates a medium that enhances its dust holding capacity. Experimental equipment was installed to enable the dust loading characteristics of candidate media to be compared with those of the glass fibre based papers currently used in filter construction. These tests involved challenging representative samples of the media with an air stream containing a controlled concentration of thermally generated sodium chloride particles. The dust loading characteristics of the media were then compared in terms of the rate of increase in pressure differential. A number of {open_quotes}graded density{close_quotes} papers were subsequently identified which appeared to offer significant improvements in dust holding. In the second phase of the programme, deep-pleat filters (1,700 M{sup 3}h{sup {minus}1}) incorporating graded density papers were manufactured and tested. Improvements of up to 50% were observed in their capacity for the sub-micron sodium chloride test dust. Smaller differences (15%) were measured when a coarser, carbon black, challenge was used. This is attributed to the differences in the particle sizes of the two dusts.

  2. Development of fast charge exchange recombination spectroscopy by using interference filter method in JT-60U

    International Nuclear Information System (INIS)

    Kobayashi, Shinji; Sakasai, Akira; Koide, Yoshihiko; Sakamoto, Yoshiteru; Kamada, Yutaka; Hatae, Takaki; Oyama, Naoyuki; Miura, Yukitoshi

    2003-01-01

    Recent developments and results of fast charge exchange recombination spectroscopy (CXRS) using the interference filter method are reported. In order to measure rapid changes of the ion temperature and rotation velocity during collapse or transition phenomena with high time resolution, two types of interference filter systems were applied to the CXRS diagnostics on the JT-60U Tokamak. One can determine the Doppler broadening and Doppler shift of the CXR emission using three interference filters having slightly different center wavelengths. A rapid estimation method for the temperature and rotation velocity without non-linear least-squares fitting is presented. The modification of the three-filter system improves the minimum time resolution to 0.8 ms, which is better than the 16.7 ms of the conventional CXRS system using the CCD detector in JT-60U. The other system, having seven wavelength channels, was newly fabricated to cross-check the results obtained by the three-filter assembly, that is, to verify that the CXR emission forms a Gaussian profile under collapse phenomena. In an H-mode discharge having giant edge localized modes, the results obtained by the two systems are compared. The applicability of the three-filter system to the measurement of rapid changes in temperature and rotation velocity is demonstrated. (author)
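    The three-filter idea can be sketched compactly: if the line is Gaussian, the logarithm of the three filter signals is quadratic in wavelength, so the line shift and width follow from a 3x3 linear solve rather than a non-linear fit. The sketch below illustrates this under that assumption; the filter wavelengths and signal values are made up, and real data would also need filter bandpass and calibration corrections:

```python
# Three-filter sketch: recover the centre and width of a Gaussian emission
# line from three narrow-band signals without non-linear fitting.
# Filter wavelengths and signals are invented; real data would also need
# bandpass and sensitivity corrections.
import numpy as np

def gaussian_from_three(lams, signals):
    """Fit ln S = a*x^2 + b*x + c exactly through three points (x measured
    from the mean filter wavelength for conditioning) and return the line
    centre and Gaussian sigma."""
    ref = lams.mean()
    x = lams - ref
    A = np.column_stack([x**2, x, np.ones(3)])
    a, b, _ = np.linalg.solve(A, np.log(signals))
    sigma = np.sqrt(-1.0 / (2.0 * a))   # requires a < 0, i.e. a peaked line
    centre = ref - b / (2.0 * a)
    return centre, sigma

# Three hypothetical filter centre wavelengths (nm) around a 529 nm line
lams = np.array([528.8, 529.1, 529.4])
true_centre, true_sigma, amplitude = 529.05, 0.25, 1.0e3
signals = amplitude * np.exp(-(lams - true_centre) ** 2 / (2 * true_sigma ** 2))

centre, sigma = gaussian_from_three(lams, signals)
print(f"recovered centre = {centre:.3f} nm, sigma = {sigma:.3f} nm")
```

    In the diagnostic itself, the recovered shift corresponds to the rotation velocity and the recovered width to the Doppler broadening, and hence the ion temperature.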

  3. DEVELOPMENT OF AG-1 SECTION FI ON METAL MEDIA FILTERS - 9061

    International Nuclear Information System (INIS)

    Adamson, D.; Waggoner, C.A.

    2008-01-01

    Development of a metal media standard (FI) for ASME AG-1 (Code on Nuclear Air and Gas Treatment) has been under way for almost ten years. This paper will provide a brief history of the development process of this section and a detailed overview of its current content/status. There have been at least two points when dramatic changes have been made in the scope of the document due to feedback from the full Committee on Nuclear Air and Gas Treatment (CONAGT). Development of the proposed section has required resolving several difficult issues associated with scope; namely, filtering efficiency, operating conditions (media velocity, pressure drop, etc.), qualification testing, and quality control/acceptance testing. A proposed version of Section FI is currently undergoing final revisions prior to being submitted for balloting. The section covers metal media filters with filtering efficiencies ranging from medium (less than 99.97%) to high (99.97% and greater). Two different types of high efficiency filters are addressed: those units intended to be a direct replacement of Section FC fibrous glass HEPA filters and those that will be placed into newly designed systems capable of supporting greater static pressures and differential pressures across the filter elements. Direct replacements of FC HEPA filters in existing systems will be required to meet qualification and testing requirements equivalent to those contained in Section FC. A series of qualification and quality assurance test methods have been identified for the range of filtering efficiencies covered by this proposed standard. Performance characteristics of sintered metal powder vs. sintered metal fiber media are dramatically different with respect to parameters like differential pressures and rigidity of the media. Wide latitude will be allowed for owner specification of performance criteria for filtration units that will be placed into newly designed systems. Such allowances will permit use of the most

  4. Development and verification of a software tool for the acoustic location of partial discharge in a power transformer

    Directory of Open Access Journals (Sweden)

    Polužanski Vladimir

    2014-01-01

    Full Text Available This paper discusses the development and verification of a software tool for determining the location of partial discharge in a power transformer with the acoustic method. The classification and systematization of physical principles, detection methods, and tests of partial discharge in power transformers are presented at the beginning of this paper. The most important mathematical models, features, algorithms, and real problems that affect measurement accuracy are highlighted. This paper describes the development and implementation of a software tool for determining the location of partial discharge in a power transformer based on a non-iterative mathematical algorithm. The verification and accuracy of the measurement are demonstrated both by computer simulation and by experimental results available in the literature.

  5. Environmental Technology Verification: Test Report of Mobile Source Emission Control Devices--Johnson Matthey PCRT2 1000, Version 2, Filter + Diesel Oxidation Catalyst

    Science.gov (United States)

    The Johnson Matthey PCRT2 1000, v.2 system is a partial continuously regenerating technology (PCRT) system that consists of a flow-through partial filter combined with a DOC. The system is designed for low temperature exhaust resulting from intermittent loads from medium and heav...

  6. Development Concept of Guaranteed Verification Electric Power System Simulation Tools and Its Realization

    Directory of Open Access Journals (Sweden)

    Gusev Alexander

    2015-01-01

    Full Text Available The analysis of existing problems in the reliability and verification of widespread electric power system (EPS) simulation tools is presented in this article. All such simulation tools are based on the use of numerical methods for ordinary differential equations. The concept of guaranteed verification of EPS simulation tools and the structure of its realization are described; they are based on a Simulator capable of continuous, non-decomposed, three-phase EPS simulation in real time, over an unlimited range, and with guaranteed accuracy. The information from the Simulator can be verified using only quasi-steady-state regime data received from SCADA, and such a Simulator can be applied as the standard model for the verification of any EPS simulation tool.

  7. Development of regeneration technique for diesel particulate filter made of porous metal; Kinzoku takotai DPF no saisei gijutsu no kaihatsu

    Energy Technology Data Exchange (ETDEWEB)

    Yoro, K; Ban, S; Ooka, T; Saito, H; Oji, M; Nakajima, S; Okamoto, S [Sumitomo Electric Industries, Ltd., Osaka (Japan)

    1997-10-01

    We have developed a diesel particulate filter (DPF) in which porous metal is used as the filter material because of its high thermal conductivity, and a radiation heater is used as the regeneration device because of its uniform thermal distribution. When high trapping efficiency is required, the filter must be thick; however, a thicker filter is more difficult to regenerate because of the thermal distribution through its thickness. In order to improve regeneration efficiency, we used computer simulation to design a filter-heater construction that achieves a uniform thermal distribution, and we confirmed good regeneration efficiency experimentally. 4 refs., 14 figs., 1 tab.

  8. Development of an advanced real time simulation tool, ARTIST and its verification

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Hee Cheol; Moon, S. K.; Yoon, B. J.; Sim, S. K.; Lee, W. J. [Korea Atomic Energy Research Institute, Taejeon (Korea)

    1999-10-01

    A real-time reactor system analysis code, ARTIST, based on the drift flux model, has been developed to investigate transient system behavior under low-pressure, low-flow and low-power conditions with noncondensable gas present in the system. The governing equations of the ARTIST code consist of three mass continuity equations (steam, liquid and noncondensables), two energy equations (steam and mixture) and one mixture equation formulated with the drift flux model. The drift flux model of ARTIST has been validated against the THETIS experimental data by comparing the void distribution in the system. In particular, the void fraction calculated with the Chexal-Lellouche void fraction correlation at low pressure and low flow is better than the results of both the homogeneous model of the TASS code and the two-fluid model of the RELAP5/MOD3 code. For cases where noncondensable gas is present, a thermal-hydraulic state solution scheme and methods for calculating the partial derivatives were developed. Numerical consistency and convergence were tested with one-volume problems, and the manometric oscillation was assessed to examine the calculation methods for the partial derivatives. The calculated thermal-hydraulic state for each test shows consistent and expected behaviour. In order to evaluate the ARTIST code's capability in predicting the two-phase thermal-hydraulic phenomena of the loss-of-RHR accident during midloop operation, BETHSY test 6.9d was simulated. From the results, it is judged that a reflux condensation model and a critical flow model for the noncondensable gas are necessary to correctly predict the thermal-hydraulic behaviour. Finally, verification runs were performed without the drift flux model and the noncondensable gas model for postulated accidents of real plants. The ARTIST code reproduces well the parametric trends calculated by the TASS code. Therefore, the integrity of the ARTIST code was verified. 35 refs., 70 figs., 3 tabs. (Author)
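    For orientation, the sketch below evaluates the generic drift-flux void-fraction relation that correlations such as Chexal-Lellouche specialize; the distribution parameter and drift velocity used here are simple placeholder constants, not the Chexal-Lellouche correlation itself:

```python
# Generic drift-flux void-fraction relation (placeholder constants; this is
# not the Chexal-Lellouche correlation, which makes C0 and Vgj functions of
# pressure, flow and geometry).

def void_fraction(j_g, j_f, c0=1.2, v_gj=0.25):
    """alpha = j_g / (C0*(j_g + j_f) + Vgj), superficial velocities in m/s."""
    return j_g / (c0 * (j_g + j_f) + v_gj)

# Low-pressure, low-flow example: 0.05 m/s of vapour with 0.5 m/s of liquid
print(f"void fraction = {void_fraction(0.05, 0.5):.3f}")
```

    In a code such as ARTIST, the distribution parameter and drift velocity would come from the chosen correlation rather than from the constants assumed here.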

  9. Iron oxide impregnated filter paper (Pi test): a review of its development and methodological research

    NARCIS (Netherlands)

    Chardon, W.J.; Menon, R.G.; Chien, S.H.

    1996-01-01

    Iron oxide impregnated filter paper (FeO paper) has been used to study the availability of phosphorus (P) to plants and algae, P desorption kinetics and P dynamics in the field. Since its initial development a number of differences in the method of preparation of the paper and its application have

  10. Knowing How Good Our Searches Are: An Approach Derived from Search Filter Development Methodology

    Directory of Open Access Journals (Sweden)

    Sarah Hayman

    2015-12-01

    Full Text Available Objective – Effective literature searching is of paramount importance in supporting evidence based practice, research, and policy. Missed references can have adverse effects on outcomes. This paper reports on the development and evaluation of an online learning resource, designed for librarians and other interested searchers, presenting an evidence based approach to enhancing and testing literature searches. Methods – We developed and evaluated the set of free online learning modules for librarians called Smart Searching, suggesting the use of techniques derived from search filter development undertaken by the CareSearch Palliative Care Knowledge Network and its associated project Flinders Filters. The searching module content has been informed by the processes and principles used in search filter development. The self-paced modules are intended to help librarians and other interested searchers test the effectiveness of their literature searches, provide evidence of search performance that can be used to improve searches, as well as to evaluate and promote searching expertise. Each module covers one of four techniques, or core principles, employed in search filter development: (1) collaboration with subject experts; (2) use of a reference sample set; (3) term identification through frequency analysis; and (4) iterative testing. Evaluation of the resource comprised ongoing monitoring of web analytics to determine factors such as numbers of users and geographic origin; a user survey conducted online elicited qualitative information about the usefulness of the resource. Results – The resource was launched in May 2014. Web analytics show over 6,000 unique users from 101 countries (at 9 August 2015). Responses to the survey (n=50) indicated that 80% would recommend the resource to a colleague. Conclusions – An evidence based approach to searching, derived from search filter development methodology, has been shown to have value as an online learning
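    A minimal sketch of the frequency-analysis step (core principle 3): count candidate terms across a reference sample set of records so that the most frequent, potentially discriminating terms can be considered for the search strategy. The reference records below are invented examples:

```python
# Term-frequency sketch for search-strategy development (core principle 3);
# the reference records below are invented examples.
import re
from collections import Counter

reference_set = [
    "Palliative care needs of patients with advanced heart failure",
    "Symptom management in terminal illness: a palliative approach",
    "End of life decision making in intensive care units",
]
stopwords = {"of", "in", "with", "a", "the", "and"}

counts = Counter()
for record in reference_set:
    terms = re.findall(r"[a-z]+", record.lower())
    counts.update(t for t in terms if t not in stopwords)

# Candidate search terms, most frequent first
for term, n in counts.most_common(5):
    print(f"{term}: {n}")
```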

  11. Development and performance validation of a cryogenic linear stage for SPICA-SAFARI verification

    Science.gov (United States)

    Ferrari, Lorenza; Smit, H. P.; Eggens, M.; Keizer, G.; de Jonge, A. W.; Detrain, A.; de Jonge, C.; Laauwen, W. M.; Dieleman, P.

    2014-07-01

    In the context of the SAFARI instrument (SpicA FAR-infrared Instrument), SRON is developing a test environment to verify the SAFARI performance. The characterization of the detector focal plane will be performed with a back-illuminated pinhole over a reimaged SAFARI focal plane by an XYZ scanning mechanism that consists of three linear stages stacked together. In order to reduce background radiation that can couple into the high-sensitivity cryogenic detectors (goal NEP of 2·10⁻¹⁹ W/√Hz and saturation power of a few femtowatts), the scanner is mounted inside the cryostat in the 4 K environment. The required readout accuracy is 3 μm, with a reproducibility of 1 μm, along the total travel of 32 mm. The stage will be operated in "on the fly" mode to prevent vibrations of the scanner mechanism and will move with a constant speed varying from 60 μm/s to 400 μm/s. In order to meet the requirements of large stroke, low dissipation (low friction) and high accuracy, a DC motor plus spindle stage solution has been chosen. In this paper we present the stage design and stage characterization, and also describe the measurement setup. The room temperature performance has been measured with a 3D measuring machine cross-calibrated with a laser interferometer and a 2-axis tilt sensor. The low temperature verification has been performed in a wet 4 K cryostat using a laser interferometer for measuring the linear displacements and a theodolite for measuring the angular displacements. The angular displacements can be calibrated with a precision of 4 arcsec and the position could be determined with high accuracy. The presence of friction caused higher values of torque than predicted and consequently higher dissipation. The thermal model of the stage has also been verified at 4 K.

  12. Development and Verification of a Pilot Code based on Two-fluid Three-field Model

    Energy Technology Data Exchange (ETDEWEB)

    Hwang, Moon Kyu; Bae, S. W.; Lee, Y. J.; Chung, B. D.; Jeong, J. J.; Ha, K. S.; Kang, D. H

    2006-09-15

    In this study, a semi-implicit pilot code is developed for one-dimensional channel flow with three fields. The three fields comprise gas, continuous liquid, and entrained liquid. All three fields are allowed to have their own velocities. The temperatures of the continuous liquid and the entrained liquid are, however, assumed to be in equilibrium. The interphase phenomena include heat and mass transfer as well as momentum transfer. Fluid/structure interaction generally includes both heat and momentum transfer. Assuming an adiabatic system, only momentum transfer is considered in this study, leaving wall heat transfer for a future study. Using 10 conceptual problems, the basic pilot code has been verified. The results of the verification are summarized below: It was confirmed that the basic pilot code can simulate various flow conditions (such as single-phase liquid flow, bubbly flow, slug/churn turbulent flow, annular-mist flow, and single-phase vapor flow) and transitions between these flow conditions. The pilot code was programmed so that the source terms of the governing equations and the numerical solution schemes can be easily tested. Mass and energy conservation was confirmed for single-phase liquid and single-phase vapor flows. It was confirmed that the inlet pressure and velocity boundary conditions work properly. It was confirmed that, for single- and two-phase flows, the velocity and temperature of a non-existing phase are calculated as intended. Complete phase depletion, which might occur during a phase change, was found to adversely affect code stability. A further study would be required to enhance the code capability in this regard.

  13. Development of discrete-time H∞ filtering method for time-delay compensation of rhodium incore detectors

    International Nuclear Information System (INIS)

    Park, Moon Kyu; Kim, Yong Hee; Cha, Kune Ho; Kim, Myung Ki

    1998-01-01

    A method is described to develop an H∞ filtering method for the dynamic compensation of self-powered neutron detectors normally used as fixed incore instruments. An H∞ norm of the filter transfer matrix is used as the optimization criterion in the worst-case estimation error sense. Filter modeling is performed for a discrete-time model. The filter gains are optimized in terms of the noise attenuation level of the H∞ setting. By introducing the Bounded Real Lemma, the conventional algebraic Riccati inequalities are converted into linear matrix inequalities (LMIs). Finally, the filter design problem is solved via a convex optimization framework using LMIs. The simulation results show that remarkable improvements are achieved in the filter response time and the filter design efficiency
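    The sketch below is not the LMI-based H∞ design itself, but it illustrates the underlying compensation problem and the worst-case (H∞-style) gain check: a slow rhodium-like detector is modeled as a first-order lag, compensated by a discrete-time inverse with a roll-off that limits noise amplification, and the compensator's worst-case gain is read off its frequency response. The time constants are illustrative assumptions:

```python
# Detector-compensation sketch: a first-order-lag detector model, an
# inverse-dynamics compensator with a first-order roll-off, and a numerical
# worst-case (H-infinity style) gain check. This is NOT the LMI-based
# H-infinity design of the paper; time constants are illustrative assumptions.
import numpy as np
from scipy import signal

dt = 1.0          # sampling period, s
tau_det = 60.0    # assumed detector time constant, s
tau_filt = 5.0    # roll-off time constant that limits noise amplification, s

a_d = np.exp(-dt / tau_det)
a_f = np.exp(-dt / tau_filt)
det_b, det_a = [1.0 - a_d], [1.0, -a_d]                         # detector G(z)
comp_b = np.convolve([1.0, -a_d], [(1.0 - a_f) / (1.0 - a_d)])  # C(z) = F(z)/G(z)
comp_a = [1.0, -a_f]

# Worst-case gain of the compensator over all frequencies (noise amplification)
w, h = signal.freqz(comp_b, comp_a, worN=4096)
print(f"worst-case noise amplification of the compensator: {np.max(np.abs(h)):.1f}")

# Step-response speed: detector alone vs. detector plus compensator
t = np.arange(0, 300, dt)
_, y_det = signal.dstep((det_b, det_a, dt), t=t)
_, y_cmp = signal.dstep((np.convolve(det_b, comp_b), np.convolve(det_a, comp_a), dt), t=t)
y_det, y_cmp = np.squeeze(y_det), np.squeeze(y_cmp)
print(f"time to 90% of final value: detector {t[np.argmax(y_det >= 0.9)]:.0f} s, "
      f"compensated {t[np.argmax(y_cmp >= 0.9)]:.0f} s")
```

    The trade-off visible here, faster response at the cost of a bounded worst-case noise gain, is the quantity the H∞ formulation optimizes systematically.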

  14. Program Verification with Monadic Second-Order Logic & Languages for Web Service Development

    DEFF Research Database (Denmark)

    Møller, Anders

    applications, this implementation forms the basis of a verification technique for imperative programs that perform data-type operations using pointers. To achieve this, the basic logic is extended with layers of language abstractions. Also, a language for expressing data structures and operations along...

  15. Development of prompt gamma measurement system for in vivo proton beam range verification

    International Nuclear Information System (INIS)

    Min, Chul Hee

    2011-02-01

    In radiation therapy, most research has focused on reducing unnecessary radiation dose to normal tissues and critical organs around the target tumor volume. Proton therapy is considered to be one of the most promising radiation therapy methods with its physical characteristics in the dose distribution, delivering most of the dose just before protons come to rest at the so-called Bragg peak; that is, proton therapy allows for a very high radiation dose to the tumor volume, effectively sparing adjacent critical organs. However, the uncertainty in the location of the Bragg peak, coming from not only the uncertainty in the beam delivery system and the treatment planning method but also anatomical changes and organ motions of a patient, could be a critical problem in proton therapy. In spite of the importance of in vivo dose verification to prevent the misapplication of the Bragg peak and to guarantee both successful treatment and patient safety, there is no practical methodology to monitor the in vivo dose distribution; only a few attempts have been made so far. The present dissertation suggests the prompt gamma measurement method for monitoring of the in vivo proton dose distribution during treatment. As a key part of the process of establishing the utility of this method, the verification of the clear relationship between the prompt gamma distribution and the proton dose distribution was accomplished by means of Monte Carlo simulations and experimental measurements. First, the physical properties of prompt gammas were investigated on the basis of cross-section data and Monte Carlo simulations. Prompt gammas are generated mainly from proton-induced nuclear interactions, and then emitted isotropically in less than 10⁻⁹ s at energies up to 10 MeV. Simulation results for the prompt gamma yield of the major elements of a human body show that within the optimal energy range of 4-10 MeV the highest number of prompt gammas is generated from oxygen, whereas over the
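    As a small illustration of how a measured prompt-gamma depth profile can be reduced to a range estimate, the sketch below locates the distal 50% falloff position of a noisy profile; the profile is synthetic, not measurement data from this work:

```python
# Distal-falloff sketch for a prompt-gamma depth profile (synthetic data;
# not measurements from the work described above).
import numpy as np

def distal_falloff_depth(depth_mm, counts, level=0.5):
    """Depth (mm) where the profile first drops to `level` of its maximum,
    searching from the maximum toward deeper positions, with interpolation."""
    counts = np.asarray(counts, dtype=float)
    i_max = int(np.argmax(counts))
    threshold = level * counts[i_max]
    for i in range(i_max, len(counts) - 1):
        if counts[i] >= threshold > counts[i + 1]:
            frac = (counts[i] - threshold) / (counts[i] - counts[i + 1])
            return depth_mm[i] + frac * (depth_mm[i + 1] - depth_mm[i])
    return float("nan")

# Synthetic profile: plateau, fall-off near 150 mm, plus measurement noise
rng = np.random.default_rng(0)
depth = np.arange(0.0, 200.0, 2.0)                       # mm
profile = 1.0 / (1.0 + np.exp((depth - 150.0) / 3.0))    # sigmoid fall-off
profile += 0.02 * rng.standard_normal(depth.size)
print(f"distal 50% falloff at about {distal_falloff_depth(depth, profile):.1f} mm")
```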

  16. Development of an elution device for ViroCap virus filters.

    Science.gov (United States)

    Fagnant, Christine Susan; Toles, Matthew; Zhou, Nicolette Angela; Powell, Jacob; Adolphsen, John; Guan, Yifei; Ockerman, Byron; Shirai, Jeffry Hiroshi; Boyle, David S; Novosselov, Igor; Meschke, John Scott

    2017-10-19

    Environmental surveillance of waterborne pathogens is vital for monitoring the spread of diseases, and electropositive filters are frequently used for sampling wastewater and wastewater-impacted surface water. Viruses adsorbed to electropositive filters require elution prior to detection or quantification. Elution is typically facilitated by a peristaltic pump, although this requires a significant startup cost and does not include biosafety or cross-contamination considerations. These factors may pose a barrier for low-resource laboratories that aim to conduct environmental surveillance of viruses. The objective of this study was to develop a biologically enclosed, manually powered, low-cost device for effectively eluting from electropositive ViroCap™ virus filters. The elution device described here utilizes a non-electric bilge pump, instead of an electric peristaltic pump or a positive pressure vessel. The elution device also fully encloses liquids and aerosols that could contain biological organisms, thereby increasing biosafety. Moreover, all elution device components that are used in the biosafety cabinet are autoclavable, reducing cross-contamination potential. This device reduces costs of materials while maintaining convenience in terms of size and weight. With this new device, there is little sample volume loss due to device inefficiency, similar virus yields were demonstrated during seeded studies with poliovirus type 1, and the time to elute filters is similar to that required with the peristaltic pump. The efforts described here resulted in a novel, low-cost, manually powered elution device that can facilitate environmental surveillance of pathogens through effective virus recovery from ViroCap filters while maintaining the potential for adaptability to other cartridge filters.

  17. Environmental technology verification methods

    CSIR Research Space (South Africa)

    Szewczuk, S

    2016-03-01

    Full Text Available Environmental Technology Verification (ETV) is a tool that has been developed in the United States of America, Europe and many other countries around the world to help innovative environmental technologies reach the market. Claims about...

  18. The development of advanced instrumentation and control technology -The development of verification and validation technology for instrumentation and control in NPPs-

    International Nuclear Information System (INIS)

    Kwon, Kee Choon; Ham, Chang Sik; Lee, Byung Sun; Kim, Jung Taek; Park, Won Man; Park, Jae Chang; Lee, Jang Soo; Um, Heung Sub; Kim, Jang Yul; Ryoo, Chan Hoh; Joo, Jae Yoon; Song, Soon Ja

    1995-07-01

    We collected and analyzed the domestic and international codes, standards and guidelines in order to develop a highly reliable software verification and validation methodology suited to our actual situation. Three major parts of the work were performed: construction of the framework for a highly reliable software development environment, establishment of a highly reliable software development methodology, and study of the basic technology related to safety-critical software. These three parts are tightly coupled with each other to achieve self-reliant software verification and validation technology for digital I and C in NPPs. The hardware and software configuration was partly completed using the requirements developed in the first stage for the I and C test facility. In the hardware part, an expanded interface using the VXI bus and its driving software were completed. The main program for mathematics and modelling and the supervisor program for instructions were developed. 27 figs, 22 tabs, 69 refs. (Author)

  19. The development of advanced instrumentation and control technology -The development of verification and validation technology for instrumentation and control in NPPs-

    Energy Technology Data Exchange (ETDEWEB)

    Kwon, Kee Choon; Ham, Chang Sik; Lee, Byung Sun; Kim, Jung Taek; Park, Won Man; Park, Jae Chang; Lee, Jang Soo; Um, Heung Sub; Kim, Jang Yul; Ryoo, Chan Hoh; Joo, Jae Yoon; Song, Soon Ja [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)

    1995-07-01

    We collected and analyzed the domestic and international codes, standards and guidelines in order to develop a highly reliable software verification and validation methodology suited to our actual situation. Three major parts of the work were performed: construction of the framework for a highly reliable software development environment, establishment of a highly reliable software development methodology, and study of the basic technology related to safety-critical software. These three parts are tightly coupled with each other to achieve self-reliant software verification and validation technology for digital I and C in NPPs. The hardware and software configuration was partly completed using the requirements developed in the first stage for the I and C test facility. In the hardware part, an expanded interface using the VXI bus and its driving software were completed. The main program for mathematics and modelling and the supervisor program for instructions were developed. 27 figs, 22 tabs, 69 refs. (Author).

  20. RELAP5/SCDAPSIM model development for AP1000 and verification for large break LOCA

    Energy Technology Data Exchange (ETDEWEB)

    Trivedi, A.K. [Nuclear Engineering and Technology Program, Indian Institute of Technology, Kanpur 208016 (India); Allison, C. [Innovative Systems Software, Idaho Falls, ID 83406 (United States); Khanna, A., E-mail: akhanna@iitk.ac.in [Nuclear Engineering and Technology Program, Indian Institute of Technology, Kanpur 208016 (India); Munshi, P. [Nuclear Engineering and Technology Program, Indian Institute of Technology, Kanpur 208016 (India)

    2016-08-15

    Highlights: • RELAP5/SCDAPSIM model of AP1000 has been developed. • Analysis involves a LBLOCA (double ended guillotine break) study in cold leg. • Results are compared with those of WCOBRA–TRAC and TRACE. • Concluded that PCT does not violate the safety criteria of 1477 K. - Abstract: The AP1000 is a Westinghouse 2-loop pressurized water reactor (PWR) with all emergency core cooling systems based on natural circulation. Its core design is very similar to a 3-loop PWR with 157 fuel assemblies. Westinghouse has reported their results of the safety analysis in its design control document (DCD) for a large break loss of coolant accident (LOCA) using WCOBRA/TRAC and for a small break LOCA using NOTRUMP. The current study involves the development of a representative RELAP5/SCDAPSIM model for AP1000 based on publicly available data and its verification for a double ended cold leg (DECL) break in one of the cold legs in the loop containing core makeup tanks (CMT). The calculated RELAP5/SCDAPSIM results have been compared to publicly available WCOBRA–TRAC and TRACE results of the DECL break in AP1000. The objective of this study is to benchmark the thermal hydraulic model for later severe accident analyses using the 2D SCDAP fuel rod component in place of the RELAP5 heat structures which currently represent the fuel rods. Results from this comparison provide sufficient confidence in the model, which will be used for further studies such as a station blackout. The primary circuit pumps, pressurizer and steam generators (including the necessary secondary side) are modeled using RELAP5 components following all the necessary recommendations for nodalization. The core has been divided into 6 radial rings and 10 axial nodes. For the RELAP5 thermal hydraulic calculation, the six groups of fuel assemblies have been modeled as pipe components with equivalent flow areas. The fuel including the gap and cladding is modeled as a 1-D heat structure. The final input deck achieved

  1. Development of built-in debris-filter bottom nozzle for PWR fuel assemblies

    International Nuclear Information System (INIS)

    Juntaro Shimizu; Kazuki Monaka; Masaji Mori; Kazuo Ikeda

    2005-01-01

    Mitsubishi Heavy Industries, Ltd. (MHI) has worked to improve the anti-debris capability of the bottom nozzle for a PWR fuel assembly. The current debris filter bottom nozzle (DFBN), which has 4 mm diameter flow holes, can only capture debris larger than the flow hole inner diameter. MHI has completed the development of the built-in debris filter bottom nozzle, a new debris-filter concept for high burnup fuel (55 GWd/t assembly average burnup). The built-in debris filter bottom nozzle consists of blades and a nozzle body. The blades, made from Inconel strip, are embedded and welded on the grooved top surface of the bottom nozzle adapter plate. Each flow hole is divided by a blade, and the trap size for debris is reduced. Because the blades partially block the coolant flow, an increase in the pressure loss of the nozzle was anticipated; however, by adjusting the relation between the blade and the taper shape of the flow hole, the pressure loss has been successfully maintained at a satisfactory level. Grooves are cut in the nozzle plate; nevertheless, additional skirts on the four sides of the nozzle compensate for the structural strength. (authors)

  2. Development of aerosol decontamination factor evaluation method for filtered containment venting system

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Jae bong; Kim, Sung Il; Jung, Jaehoon; Ha, Kwang Soon; Kim, Hwan Yeol [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2016-10-15

    Fission products would be released from molten corium pools which are relocated into the lower plenum of the reactor pressure vessel, onto the concrete pit and into the core catcher. In addition, steam, hydrogen and noncondensable gases such as CO and CO2 are generated during the core damage progression due to the loss of coolant and the molten core-concrete interaction. Consequently, the pressure inside the containment could increase continuously. Filtered containment venting is one action to prevent an uncontrolled release of radioactive fission products caused by an overpressure failure of the containment. After the Fukushima-Daiichi accident, which demonstrated containment failure, many countries are considering the implementation of a filtered containment venting system (FCVS) at nuclear power plants where these are not currently applied. In general, the evaluation of an FCVS is conducted to determine the decontamination factor under several conditions (aerosol diameter, submergence depth, water temperature, gas flow rate, steam flow rate, pressure, operating time, ...). It is essential to quantify the mass concentration before and after the FCVS to determine the decontamination factor. This paper presents the development of the evaluation facility for the filtered containment venting system at KAERI and an experimental investigation of aerosol removal performance. The decontamination factor for the FCVS is determined by filter measurement. The result of the aerosol size distribution measurement shows the aerosol removal performance as a function of aerosol size.
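
    As a simple illustration of how the decontamination factor follows from the quantified mass concentrations, the sketch below (an assumed example with hypothetical numbers, not the facility's analysis code) computes DF as the ratio of upstream to downstream aerosol mass concentration obtained from filter samples.

    # Minimal sketch (assumed example): decontamination factor from filter samples
    # taken upstream and downstream of the scrubbing stage.

    def mass_concentration(aerosol_mass_mg: float, sampled_volume_m3: float) -> float:
        """Aerosol mass concentration in mg/m3 from a filter sample."""
        return aerosol_mass_mg / sampled_volume_m3

    def decontamination_factor(c_upstream: float, c_downstream: float) -> float:
        """DF = inlet concentration / outlet concentration."""
        return c_upstream / c_downstream

    # Hypothetical numbers for illustration only
    c_in = mass_concentration(aerosol_mass_mg=120.0, sampled_volume_m3=0.5)   # 240 mg/m3
    c_out = mass_concentration(aerosol_mass_mg=0.06, sampled_volume_m3=0.5)   # 0.12 mg/m3
    print("DF =", decontamination_factor(c_in, c_out))                        # 2000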

  3. Development and verification of the neutron diffusion solver for the GeN-Foam multi-physics platform

    International Nuclear Information System (INIS)

    Fiorina, Carlo; Kerkar, Nordine; Mikityuk, Konstantin; Rubiolo, Pablo; Pautz, Andreas

    2016-01-01

    Highlights: • Development and verification of a neutron diffusion solver based on OpenFOAM. • Integration in the GeN-Foam multi-physics platform. • Implementation and verification of acceleration techniques. • Implementation of isotropic discontinuity factors. • Automatic adjustment of discontinuity factors. - Abstract: The Laboratory for Reactor Physics and Systems Behaviour at the PSI and the EPFL has been developing in recent years a new code system for reactor analysis based on OpenFOAM®. The objective is to supplement available legacy codes with a modern tool featuring state-of-the-art characteristics in terms of scalability, programming approach and flexibility. As part of this project, a new solver has been developed for the eigenvalue and transient solution of multi-group diffusion equations. Several features distinguish the developed solver from other available codes, in particular: object oriented programming to ease code modification and maintenance; modern parallel computing capabilities; use of general unstructured meshes; possibility of mesh deformation; cell-wise parametrization of cross-sections; and arbitrary energy group structure. In addition, the solver is integrated into the GeN-Foam multi-physics solver. The general features of the solver and its integration with GeN-Foam have already been presented in previous publications. The present paper describes the diffusion solver in more details and provides an overview of new features recently implemented, including the use of acceleration techniques and discontinuity factors. In addition, a code verification is performed through a comparison with Monte Carlo results for both a thermal and a fast reactor system.
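
    To make the k-eigenvalue formulation concrete, the toy sketch below solves a one-group, one-dimensional slab version of the diffusion equation with finite differences and power iteration. It is purely illustrative, with made-up cross sections; the GeN-Foam solver itself is multi-group, three-dimensional, unstructured and built on OpenFOAM.

    # Toy sketch: one-group, 1-D slab diffusion k-eigenvalue problem solved by
    # finite differences and power iteration (illustrative only).
    import numpy as np

    # Hypothetical homogeneous slab data
    D, sig_a, nu_sig_f = 1.2, 0.03, 0.033   # cm, 1/cm, 1/cm
    width, n = 100.0, 200                   # slab width (cm), number of cells
    h = width / n

    # Loss operator: -D d2/dx2 + sig_a, with zero-flux boundaries
    main = np.full(n, 2.0 * D / h**2 + sig_a)
    off = np.full(n - 1, -D / h**2)
    A = np.diag(main) + np.diag(off, 1) + np.diag(off, -1)

    phi = np.ones(n)
    k = 1.0
    for _ in range(200):                    # power iteration on the fission source
        src = nu_sig_f * phi
        phi_new = np.linalg.solve(A, src / k)
        k_new = k * (nu_sig_f * phi_new).sum() / (nu_sig_f * phi).sum()
        converged = abs(k_new - k) < 1e-8
        k, phi = k_new, phi_new
        if converged:
            break

    phi /= phi.max()                        # normalized flux shape
    print("k-effective ~", round(k, 5))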

  4. Development of activated charcoal impregnated air sampling filter media : their characteristics and use

    International Nuclear Information System (INIS)

    Khan, A.A.; Ramarathinam, K.; Gupta, S.K.; Deshingkar, D.S.; Kishore, A.G.

    1975-01-01

    Because of its low maximum permissible concentration in air, air-borne radioiodine must be accurately monitored in contaminated air streams, in the working environment and in handling facilities, before release to the environment from nuclear facilities. Activated charcoal impregnated air sampling filter media are found to be the most suitable for monitoring airborne iodine-131. Because this approach is simple and gives reproducible assessments of air-borne radioactive iodine, work on the development of such media was undertaken in order to find a suitable substitute for imported activated charcoal impregnated air sampling filter media. Eight different media of this type were developed, evaluated and compared with two imported media. The most suitable medium, which was found to perform even better than the imported media, is recommended for use in air-borne iodine sampling. (author)

  5. The development of search filters for adverse effects of surgical interventions in medline and Embase.

    Science.gov (United States)

    Golder, Su; Wright, Kath; Loke, Yoon Kong

    2018-03-31

    Search filter development for adverse effects has tended to focus on retrieving studies of drug interventions. However, a different approach is required for surgical interventions. To develop and validate search filters for medline and Embase for the adverse effects of surgical interventions. Systematic reviews of surgical interventions where the primary focus was to evaluate adverse effect(s) were sought. The included studies within these reviews were divided randomly into a development set, evaluation set and validation set. Using word frequency analysis we constructed a sensitivity maximising search strategy and this was tested in the evaluation and validation set. Three hundred and fifty eight papers were included from 19 surgical intervention reviews. Three hundred and fifty two papers were available on medline and 348 were available on Embase. Generic adverse effects search strategies in medline and Embase could achieve approximately 90% relative recall. Recall could be further improved with the addition of specific adverse effects terms to the search strategies. We have derived and validated a novel search filter that has reasonable performance for identifying adverse effects of surgical interventions in medline and Embase. However, we appreciate the limitations of our methods, and recommend further research on larger sample sizes and prospective systematic reviews. © 2018 The Authors Health Information and Libraries Journal published by John Wiley & Sons Ltd on behalf of Health Libraries Group.
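
    The word-frequency step used to build a sensitivity-maximising strategy can be sketched generically as follows: count, for each candidate term, how many development-set records it appears in, then consider the high-frequency adverse-effects terms for the filter. The snippet below is an assumed illustration with placeholder records, not the authors' software.

    # Rough sketch (assumed example): rank candidate search terms by how many
    # development-set records they appear in, as a starting point for a
    # sensitivity-maximising search filter.
    import re
    from collections import Counter

    development_set = [
        "Complications after laparoscopic cholecystectomy: a cohort study",
        "Adverse events and harms of hip arthroplasty revision",
        "Postoperative infection rates following spinal fusion surgery",
        # ... titles/abstracts of included studies would go here ...
    ]

    STOPWORDS = {"a", "of", "and", "the", "after", "following"}

    counts = Counter()
    for record in development_set:
        tokens = set(re.findall(r"[a-z]+", record.lower())) - STOPWORDS
        counts.update(tokens)          # document frequency, not raw term frequency

    n = len(development_set)
    for term, df in counts.most_common(10):
        print(f"{term:15s} appears in {df}/{n} records ({100 * df / n:.0f}%)")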

  6. Development, verification and validation of an FPGA-based core heat removal protection system for a PWR

    Energy Technology Data Exchange (ETDEWEB)

    Wu, Yichun, E-mail: ycwu@xmu.edu.cn [College of Energy, Xiamen University, Xiamen 361102 (China); Shui, Xuanxuan, E-mail: 807001564@qq.com [College of Energy, Xiamen University, Xiamen 361102 (China); Cai, Yuanfeng, E-mail: 1056303902@qq.com [College of Energy, Xiamen University, Xiamen 361102 (China); Zhou, Junyi, E-mail: 1032133755@qq.com [College of Energy, Xiamen University, Xiamen 361102 (China); Wu, Zhiqiang, E-mail: npic_wu@126.com [State Key Laboratory of Reactor System Design Technology, Nuclear Power Institute of China, Chengdu 610041 (China); Zheng, Jianxiang, E-mail: zwu@xmu.edu.cn [College of Energy, Xiamen University, Xiamen 361102 (China)

    2016-05-15

    Highlights: • An example on life cycle development process and V&V on FPGA-based I&C is presented. • Software standards and guidelines are used in FPGA-based NPP I&C system logic V&V. • Diversified FPGA design and verification languages and tools are utilized. • An NPP operation principle simulator is used to simulate operation scenarios. - Abstract: To reach high confidence and ensure reliability of nuclear FPGA-based safety system, life cycle processes of discipline specification and implementation of design as well as regulations verification and validation (V&V) are needed. A specific example on how to conduct life cycle development process and V&V on FPGA-based core heat removal (CHR) protection system for CPR1000 pressure water reactor (PWR) is presented in this paper. Using the existing standards and guidelines for life cycle development and V&V, a simplified FPGA-based CHR protection system for PWR has been designed, implemented, verified and validated. Diversified verification and simulation languages and tools are used by the independent design team and the V&V team. In the system acceptance testing V&V phase, a CPR1000 NPP operation principle simulator (OPS) model is utilized to simulate normal and abnormal operation scenarios, and provide input data to the under-test FPGA-based CHR protection system and a verified C code CHR function module. The evaluation results are applied to validate the under-test FPGA-based CHR protection system. The OPS model operation outputs also provide reasonable references for the tests. Using an OPS model in the system acceptance testing V&V is cost-effective and high-efficient. A dedicated OPS, as a commercial-off-the-shelf (COTS) item, would contribute as an important tool in the V&V process of NPP I&C systems, including FPGA-based and microprocessor-based systems.

  7. Development of an Advanced Recycle Filter Tank Assembly for the ISS Urine Processor Assembly

    Science.gov (United States)

    Link, Dwight E., Jr.; Carter, Donald Layne; Higbie, Scott

    2010-01-01

    Recovering water from urine is a process that is critical to supporting larger crews for extended missions aboard the International Space Station. Urine is collected, preserved, and stored for processing into water and a concentrated brine solution that is highly toxic and must be contained to avoid exposure to the crew. The brine solution is collected in an accumulator tank, called a Recycle Filter Tank Assembly (RFTA) that must be replaced monthly and disposed in order to continue urine processing operations. In order to reduce resupply requirements, a new accumulator tank is being developed that can be emptied on orbit into existing ISS waste tanks. The new tank, called the Advanced Recycle Filter Tank Assembly (ARFTA) is a metal bellows tank that is designed to collect concentrated brine solution and empty by applying pressure to the bellows. This paper discusses the requirements and design of the ARFTA as well as integration into the urine processor assembly.

  8. Development of DC active filter for high magnetic field stable power supply

    International Nuclear Information System (INIS)

    Wang Lei; Liu Xiaoning

    2008-01-01

    A DC active filter (DAF) with very low current ripple has been developed for the stable power supply system of a high magnetic field device, using PWM and the parallel active power filter technique. With the PWM control technique, the required DAF current can be obtained and the current ripple can be compensated by monitoring the load voltage; the current ripple becomes very low when the load voltage is adjusted. Simulation and analysis show that the system responds quickly to the reference and is effective in suppressing harmonics, especially the low-order harmonics. The feasibility of the proposed scheme was demonstrated on equipment built in the laboratory. (authors)

  9. Dynamic simulator for nuclear power plants (DSNP): development, verification, and expansion of modules

    International Nuclear Information System (INIS)

    Larson, H.A.; Dean, E.M.; Koenig, J.F.; Gale, J.G.; Lehto, W.K.

    1984-01-01

    The DSNP Simulation Language facilitates whole reactor plant simulation and design. Verification includes DSNP dynamic modeling of Experimental Breeder Reactor No. 2 (EBR-II) plant experiments as well as comparisons with verified simulation programs. Great flexibility is allowed in expanding the DSNP language and in accommodating other computer languages. The component modules of DSNP, contained in libraries, are continually updated with new, improved, and verified modules. The modules are used to simulate the dynamic response of LMFBR reactor systems to upset and transient conditions, with special emphasis on investigations of inherent shutdown mechanisms.

  10. Development of advanced-RCCA in PWR (2). Design of advanced-RCCA and verification test

    Energy Technology Data Exchange (ETDEWEB)

    Kitagawa, T.; Naitou, T.; Suzuki, S.; Kawahara, H. [Mitsubishi Heavy Industries Ltd., Kobe (Japan); Tanaka, T. [Kansai Electric Power Co., Inc. (Japan); Kuriyama, H. [Hokkaido Electric Power Co., Inc., Sapporo (Japan); Fujii, S. [Shikoku Electric Power Co., Inc., Takamatsu (Japan); Murakami, S. [Kyusyu Electric Power Co., Inc. (Japan); Murota, M. [Japan Atomic Power Co., Tokyo (Japan)

    2001-07-01

    The advanced-RCCA enhances control rod worth by adopting boron carbide (B₄C) with enriched ¹⁰B (a hybrid B₄C/Ag-In-Cd structure). In the APWR, the advanced-RCCA results in a reduction of the number of RCCAs. In conventional PWRs, large MOX or high burn-up fuel loadings could be introduced without additional RCCAs. The duplex cladding structure with Cr plating on each outside surface increases the reliability against RCCA wear and results in reduced inspection costs (inspection equipment and inspection interval). The design of the advanced-RCCA and its verification are also discussed. (author)

  11. Further Development of Verification Check-Cases for Six- Degree-of-Freedom Flight Vehicle Simulations

    Science.gov (United States)

    Jackson, E. Bruce; Madden, Michael M.; Shelton, Robert; Jackson, A. A.; Castro, Manuel P.; Noble, Deleena M.; Zimmerman, Curtis J.; Shidner, Jeremy D.; White, Joseph P.; Dutta, Doumyo; hide

    2015-01-01

    This follow-on paper describes the principal methods of implementing, and documents the results of exercising, a set of six-degree-of-freedom rigid-body equations of motion and planetary geodetic, gravitation and atmospheric models for simple vehicles in a variety of endo- and exo-atmospheric conditions with various NASA, and one popular open-source, engineering simulation tools. This effort is intended to provide an additional means of verification of flight simulations. The models used in this comparison, as well as the resulting time-history trajectory data, are available electronically for persons and organizations wishing to compare their flight simulation implementations of the same models.
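
    A typical check-case in this spirit is torque-free rigid-body rotation, where conservation of the angular-momentum magnitude and of the rotational kinetic energy gives an independent check on the attitude-dynamics integrator. The sketch below is a generic illustration with made-up inertia values; it is not one of the published check-cases or simulation tools named above.

    # Generic sketch: torque-free rigid-body rotational dynamics as a simple
    # verification check-case (|H| and kinetic energy should stay constant).
    import numpy as np

    I = np.diag([10.0, 20.0, 30.0])          # body-axis inertia tensor, kg*m^2
    I_inv = np.linalg.inv(I)
    w = np.array([0.1, 0.5, 0.2])            # initial body rates, rad/s

    def wdot(w):
        # Euler's equations with zero external torque: I*wdot = -w x (I*w)
        return I_inv @ (-np.cross(w, I @ w))

    dt, steps = 0.001, 20000
    for _ in range(steps):                   # classical RK4 integration
        k1 = wdot(w)
        k2 = wdot(w + 0.5 * dt * k1)
        k3 = wdot(w + 0.5 * dt * k2)
        k4 = wdot(w + dt * k3)
        w = w + (dt / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4)

    H = I @ w                                # body-frame angular momentum
    T = 0.5 * w @ I @ w                      # rotational kinetic energy
    print("|H| =", np.linalg.norm(H), " T =", T)   # compare against initial values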

  12. Development of an optimal filter substrate for the identification of small microplastic particles in food by micro-Raman spectroscopy.

    Science.gov (United States)

    Oßmann, Barbara E; Sarau, George; Schmitt, Sebastian W; Holtmannspötter, Heinrich; Christiansen, Silke H; Dicke, Wilhelm

    2017-06-01

    When analysing microplastics in food, for toxicological reasons it is important to achieve clear identification of particles down to a size of at least 1 μm. One reliable optical analytical technique allowing this is micro-Raman spectroscopy. After isolation of particles via filtration, analysis is typically performed directly on the filter surface. In order to obtain high-quality Raman spectra, the material of the membrane filters should not show any interference in terms of background and Raman signals during spectrum acquisition. To facilitate the use of automatic particle detection, membrane filters should also show specific optical properties. In this work, besides eight different, commercially available membrane filters, three newly designed metal-coated polycarbonate membrane filters were tested to fulfil these requirements. We found that aluminium-coated polycarbonate membrane filters had ideal characteristics as a substrate for micro-Raman spectroscopy. Their spectrum shows no or minimal interference with particle spectra, depending on the laser wavelength. Furthermore, automatic particle detection can be applied when analysing the filter surface under dark-field illumination. With this new membrane filter, interference-free analysis of microplastics down to a size of 1 μm becomes possible. Thus, an important size class of these contaminants can now be visualized and spectrally identified. Graphical abstract: A newly developed aluminium-coated polycarbonate membrane filter enables automatic particle detection and the generation of high-quality Raman spectra, allowing identification of small microplastics.

  13. Development of Advanced Verification and Validation Procedures and Tools for the Certification of Learning Systems in Aerospace Applications

    Science.gov (United States)

    Jacklin, Stephen; Schumann, Johann; Gupta, Pramod; Richard, Michael; Guenther, Kurt; Soares, Fola

    2005-01-01

    Adaptive control technologies that incorporate learning algorithms have been proposed to enable automatic flight control and vehicle recovery, autonomous flight, and to maintain vehicle performance in the face of unknown, changing, or poorly defined operating environments. In order for adaptive control systems to be used in safety-critical aerospace applications, they must be proven to be highly safe and reliable. Rigorous methods for adaptive software verification and validation must be developed to ensure that control system software failures will not occur. Of central importance in this regard is the need to establish reliable methods that guarantee convergent learning, rapid convergence (learning) rate, and algorithm stability. This paper presents the major problems of adaptive control systems that use learning to improve performance. The paper then presents the major procedures and tools presently developed or currently being developed to enable the verification, validation, and ultimate certification of these adaptive control systems. These technologies include the application of automated program analysis methods, techniques to improve the learning process, analytical methods to verify stability, methods to automatically synthesize code, simulation and test methods, and tools to provide on-line software assurance.

  14. Development and testing the modular fireproof fine filters on the basis of glass paper

    International Nuclear Information System (INIS)

    Rovnyj, S.I.; Glagolenko, Yu.V.; Pyatin, N.P.; Tranchuk, O.A.; Maksimov, V.E.; Afanas'eva, E.V.

    2006-01-01

    The paper describes a procedure for fabricating modified modular glass-paper fine filters (14 models) for trapping radioactive substances. The filters are made of glass paper, which ensures their fire resistance. The paper also describes the service-life test procedure for the designed filters and an efficient procedure for extracting valuable components from the spent filters. (original in Russian)

  15. Further development of the cleanable steel HEPA filter, cost/benefit analysis, and comparison with competing technologies

    Energy Technology Data Exchange (ETDEWEB)

    Bergman, W.; Lopez, R.; Wilson, K. [Lawrence Livermore National Lab., CA (United States)] [and others

    1997-08-01

    We have made further progress in developing a cleanable steel fiber HEPA filter. We fabricated a pleated cylindrical cartridge using commercially available steel fiber media that is made with 1 μm stainless steel fibers and sintered into a sheet form. Test results at the Department of Energy (DOE) Filter Test Station at Oak Ridge show the prototype filter cartridge has 99.99% efficiency for 0.3 μm dioctyl phthalate (DOP) aerosols and a pressure drop of 1.5 inches. Filter loading and cleaning tests using AC Fine dust showed the filter could be repeatedly cleaned using reverse air pulses. Our analysis of commercially optimized filters suggests that cleanable steel HEPA filters need to be made from steel fibers less than 1 μm, and preferably 0.5 μm, to meet the standard HEPA filter requirements in production units. We have demonstrated that 0.5 μm steel fibers can be produced using the fiber bundling and drawing process. The 0.5 μm steel fibers are then sintered into small filter samples and tested for efficiency and pressure drop. Test results on the sample showed a penetration of 0.0015% at 0.3 μm and a pressure drop of 1.15 inches at 6.9 ft/min (3.5 cm/s) velocity. Based on these results, steel fiber media can easily meet the requirements of 0.03% penetration and 1.0 inch of pressure drop by using fewer fibers in the media. A cost analysis of the cleanable steel HEPA filter shows that, although the steel HEPA filter costs much more than the standard glass fiber HEPA filter, it has the potential to be very cost effective because of the high disposal costs of contaminated HEPA filters. We estimate that the steel HEPA filter will save an average of $16,000 over its 30 year life. The additional savings from the clean-up costs resulting from ruptured glass HEPA filters during accidents were not included but make the steel HEPA filter even more cost effective. 33 refs., 28 figs., 1 tab.

  16. DEVELOPMENT OF ENRICHMENT VERIFICATION ASSAY BASED ON THE AGE AND 235U AND 238U ACTIVITIES OF THE SAMPLES

    International Nuclear Information System (INIS)

    AL-YAMAHI, H.; EL-MONGY, S.A.

    2008-01-01

    Development of enrichment verification methods is the backbone of nuclear materials safeguards. In this study, the 235U percentage of depleted, natural and very slightly enriched uranium samples was estimated based on the sample age and the measured activities of 235U and 238U. HPGe and NaI spectrometry were used for sample assay. An equation was derived to correlate the sample age and the 235U and 238U activities with the enrichment percentage (E%). The E% values calculated by the derived equation and the target E% values were found to be similar, within a 0.58-1.75% bias in the case of the HPGe measurements, and the correlation between them was very sharp. The activity was also calculated from the measured sample count rate and the efficiency at the gamma energies of interest. The correlation between E% and the 235U activity was found to be sharply linear. The results obtained with NaI were less accurate than those obtained with HPGe; the bias in the case of the NaI assay ranged from 6.398% to 22.8% for E% verification.
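
    The basic relationship behind such an assay can be sketched as follows: the measured activities are converted to masses through the specific activities of 235U and 238U, and the enrichment is the 235U mass fraction. The code below is a simplified, assumed illustration; it neglects 234U and is not the age-dependent equation derived in the study.

    # Simplified sketch (assumed example): estimate enrichment (E%) from measured
    # 235U and 238U activities via their specific activities. 234U is neglected.
    import math

    N_A = 6.02214076e23           # Avogadro constant, 1/mol
    LN2 = math.log(2.0)
    SECONDS_PER_YEAR = 3.1557e7

    def specific_activity(half_life_years: float, molar_mass_g: float) -> float:
        """Specific activity in Bq per gram."""
        lam = LN2 / (half_life_years * SECONDS_PER_YEAR)
        return lam * N_A / molar_mass_g

    SA_U235 = specific_activity(7.04e8, 235.04)   # ~8.0e4 Bq/g
    SA_U238 = specific_activity(4.468e9, 238.05)  # ~1.24e4 Bq/g

    def enrichment_percent(a_u235_bq: float, a_u238_bq: float) -> float:
        m235 = a_u235_bq / SA_U235
        m238 = a_u238_bq / SA_U238
        return 100.0 * m235 / (m235 + m238)

    # Hypothetical measured activities (Bq) for a natural-uranium-like sample
    print(round(enrichment_percent(a_u235_bq=576.0, a_u238_bq=12350.0), 3), "%")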

  17. Development of off-gas filters for reprocessing plants. Development and construction of an off-gas filter system for large reprocessing plants. Off-gas section of the resolver test stand of the IHCh

    International Nuclear Information System (INIS)

    Furrer, J.; Kaempffer, R.; Wilhelm, J.G.; Pfauter, C.; Jannakos, K.; Apenberg, W.; Lange, W.; Mendel, W.; Potgeter, G.; Zabel, G.

    1976-01-01

    Testing of the highly impregnated iodine sorption material AC 6120 was continued in the laboratory under simulated conditions of a 1,500 t/a uranium reprocessing plant. The influence of NO in nitrogen as the carrier gas on the removal efficiency of the sorption material was examined in particular. Several experiments on the iodine removal efficiency of the material AC 6120 were carried out in the actual off-gas of the French reprocessing plant SAP Marcoule, with the filter system installed in one case directly behind the dissolver and in the other case behind the iodine desorption column. The first iodine filter developed at LAF II was installed in the off-gas line of the dissolver in the Karlsruhe reprocessing plant. The filter system for the dissolver off-gas handling test rig of the IHCh was specified and ordered from an engineering firm. The concept for the prototype off-gas filter system was selected, and a lock and transport system allowing filters to be replaced was designed and submitted for testing. Five alternative solutions were drawn up in order to find the appropriate filter concept; the selection was based on the evaluation of performance criteria. Following the selected solution, a filter drum was designed and constructed. The lock of the filter system has been designed and realized, and preliminary tests have been made. (orig.) (original in German)

  18. Development of biomass in a drinking water granular active carbon (GAC) filter.

    Science.gov (United States)

    Velten, Silvana; Boller, Markus; Köster, Oliver; Helbing, Jakob; Weilenmann, Hans-Ulrich; Hammes, Frederik

    2011-12-01

    Indigenous bacteria are essential for the performance of drinking water biofilters, yet this biological component remains poorly characterized. In the present study we followed biofilm formation and development in a granular activated carbon (GAC) filter on pilot-scale during the first six months of operation. GAC particles were sampled from four different depths (10, 45, 80 and 115 cm) and attached biomass was measured with adenosine tri-phosphate (ATP) analysis. The attached biomass accumulated rapidly on the GAC particles throughout all levels in the filter during the first 90 days of operation and maintained a steady state afterward. Vertical gradients of biomass density and growth rates were observed during start-up and also in steady state. During steady state, biomass concentrations ranged between 0.8-1.83 × 10⁻⁶ g ATP/g GAC in the filter, and 22% of the influent dissolved organic carbon (DOC) was removed. Concomitant biomass production was about 1.8 × 10¹² cells/m²·h, which represents a yield of 1.26 × 10⁶ cells/μg. The bacteria assimilated only about 3% of the removed carbon as biomass. At one point during the operational period, a natural 5-fold increase in the influent phytoplankton concentration occurred. As a result, influent assimilable organic carbon concentrations increased and suspended bacteria in the filter effluent increased 3-fold as the direct consequence of increased growth in the biofilter. This study shows that the combination of different analytical methods allows detailed quantification of the microbiological activity in drinking water biofilters. Copyright © 2011 Elsevier Ltd. All rights reserved.
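
    The reported ~3% carbon assimilation can be cross-checked with a back-of-the-envelope calculation: combining the observed yield of about 1.26 × 10⁶ cells per μg of DOC removed with a typical literature-style value of roughly 20-30 fg of carbon per bacterial cell (an assumption introduced here, not a figure from the study) gives a few percent, consistent with the abstract:

    # Back-of-the-envelope check of the ~3% carbon assimilation figure.
    # The carbon-per-cell value is a generic assumption for illustration,
    # not a number taken from the study above.
    yield_cells_per_ug_doc = 1.26e6      # from the abstract
    carbon_per_cell_fg = 25.0            # assumed typical value, fg C per cell

    # fg -> g (1e-15), then g -> ug (1e6), per 1 ug of DOC removed
    assimilated_c_ug = yield_cells_per_ug_doc * carbon_per_cell_fg * 1e-15 * 1e6
    fraction = assimilated_c_ug / 1.0
    print(f"~{100 * fraction:.1f}% of removed DOC assimilated as biomass carbon")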

  19. SU-E-T-339: Dosimetric Verification of Acuros XB Dose Calculation Algorithm On An Air Cavity for 6-MV Flattening Filter-Free Beam

    International Nuclear Information System (INIS)

    Kang, S; Suh, T; Chung, J

    2015-01-01

    Purpose: This study was to verify the accuracy of the Acuros XB (AXB) dose calculation algorithm in an air cavity for a single radiation field using a 6-MV flattening filter-free (FFF) beam. Methods: A rectangular slab phantom containing an air cavity was made for this study. The CT images of the phantom for dose calculation were scanned with and without film at the measurement depths (4.5, 5.5, 6.5 and 7.5 cm). The central axis doses (CADs) and the off-axis doses (OADs) were measured by film and calculated with the Analytical Anisotropic Algorithm (AAA) and AXB for field sizes ranging from 2 × 2 to 5 × 5 cm² of 6-MV FFF beams. For both algorithms, calculations that included the film in the phantom are denoted AXB-w and AAA-w, and calculations without the film AXB-w/o and AAA-w/o. The calculated OADs for both algorithms were compared with the measured OADs, and the differences were quantified using the root mean square error (RMSE) and gamma evaluation. Results: The percentage differences (%Diffs) between the measured and calculated CAD showed the best agreement for AXB-w. For both algorithms, the %Diffs with the film were smaller than those without it. The %Diffs for both algorithms decreased with increasing field size and increased with depth. The RMSEs of the CAD for AXB-w were within 10.32% for both the inner profile and the penumbra, while the corresponding values for AAA-w reached 96.50%. Conclusion: This study demonstrated that dose calculation with AXB within an air cavity is more accurate than with AAA when compared to the measured dose. Furthermore, we found that AXB-w was superior to AXB-w/o in this region when compared against the measurements.
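
    The comparison metrics mentioned above (%Diff, RMSE and the gamma evaluation) can be illustrated for 1-D dose profiles with the generic sketch below; it is an assumed example with placeholder profiles and 3%/3 mm criteria, not the analysis code used in the study.

    # Generic sketch (assumed example): %Diff, RMSE and a simple 1-D global gamma
    # index between measured and calculated off-axis dose profiles.
    import numpy as np

    x = np.linspace(-30.0, 30.0, 121)                    # off-axis position, mm
    measured = np.exp(-(x / 18.0) ** 6)                  # placeholder profiles
    calculated = 1.02 * np.exp(-((x - 0.5) / 18.0) ** 6)

    pct_diff = 100.0 * (calculated - measured) / measured.max()
    rmse = np.sqrt(np.mean((calculated - measured) ** 2))

    def gamma_1d(x, d_meas, d_calc, dd=0.03, dta=3.0):
        """Global gamma: dose criterion dd (fraction of max), distance criterion dta (mm)."""
        d_norm = d_meas.max()
        g = np.empty_like(d_meas)
        for i, (xi, di) in enumerate(zip(x, d_meas)):
            dose_term = ((d_calc - di) / (dd * d_norm)) ** 2
            dist_term = ((x - xi) / dta) ** 2
            g[i] = np.sqrt(np.min(dose_term + dist_term))
        return g

    g = gamma_1d(x, measured, calculated)
    print("max |%Diff| =", round(np.abs(pct_diff).max(), 2))
    print("RMSE =", round(rmse, 4))
    print("gamma pass rate =", round(100.0 * np.mean(g <= 1.0), 1), "%")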

  20. SU-E-T-435: Development and Commissioning of a Complete System for In-Vivo Dosimetry and Range Verification in Proton Therapy

    Energy Technology Data Exchange (ETDEWEB)

    Samuel, D [Universite catholique de Louvain, Louvain-la-neuve, BW (Belgium); Testa, M; Park, Y [Massachusetts General Hospital, Boston, MA (United States); Schneider, R; Moteabbed, M [General Hospital, Boston, MA (United States); Janssens, G; Prieels, D [Ion Beam Applications, Louvain-la-neuve, Brabant Wallon (Belgium); Orban de Xivry, J [Universite catholique de Louvain, Louvain-la-neuve, BW (Belgium); Lu, H [Massachusetts General Hospital and Harvard Medical School, Boston, MA (United States); Bentefour, E

    2014-06-01

    Purpose: In-vivo dose and beam range verification in proton therapy could play significant roles in proton treatment validation and improvement. In-vivo beam range verification, in particular, could enable new treatment techniques, one of which, for example, could be the use of anterior fields for prostate treatment instead of opposed lateral fields as in current practice. We have developed and commissioned an integrated system with hardware, software and workflow protocols to provide a complete solution, simultaneously, for both in-vivo dosimetry and range verification in proton therapy. Methods: The system uses a matrix of diodes, up to 12 in total, separable into three groups for flexibility in application. A special amplifier was developed to capture extremely small signals from very low proton beam currents. The software was developed within iMagX, a general platform for image processing in radiation therapy applications. The range determination exploits the inherent relationship between the internal range modulation clock of the proton therapy system and the radiological depth at the point of measurement. The commissioning of the system, for in-vivo dosimetry and for range verification, was conducted separately using an anthropomorphic phantom. EBT films and TLDs were used for dose comparisons, and a range scan of the beam distal fall-off was used as ground truth for range verification. Results: For in-vivo dose measurement, the results were in agreement with TLD and EBT films and were within 3% of the treatment planning calculations. For range verification, a precision of 0.5 mm is achieved in homogeneous phantoms, and a precision of 2 mm for the anthropomorphic pelvic phantom, except at points with significant range mixing. Conclusion: We completed the commissioning of our system for in-vivo dosimetry and range verification in proton therapy. The results suggest that the system is ready for clinical trials on patients.

  1. SU-E-T-435: Development and Commissioning of a Complete System for In-Vivo Dosimetry and Range Verification in Proton Therapy

    International Nuclear Information System (INIS)

    Samuel, D; Testa, M; Park, Y; Schneider, R; Moteabbed, M; Janssens, G; Prieels, D; Orban de Xivry, J; Lu, H; Bentefour, E

    2014-01-01

    Purpose: In-vivo dose and beam range verification in proton therapy could play significant roles in proton treatment validation and improvement. In-vivo beam range verification, in particular, could enable new treatment techniques, one of which, for example, could be the use of anterior fields for prostate treatment instead of opposed lateral fields as in current practice. We have developed and commissioned an integrated system with hardware, software and workflow protocols to provide a complete solution, simultaneously, for both in-vivo dosimetry and range verification in proton therapy. Methods: The system uses a matrix of diodes, up to 12 in total, separable into three groups for flexibility in application. A special amplifier was developed to capture extremely small signals from very low proton beam currents. The software was developed within iMagX, a general platform for image processing in radiation therapy applications. The range determination exploits the inherent relationship between the internal range modulation clock of the proton therapy system and the radiological depth at the point of measurement. The commissioning of the system, for in-vivo dosimetry and for range verification, was conducted separately using an anthropomorphic phantom. EBT films and TLDs were used for dose comparisons, and a range scan of the beam distal fall-off was used as ground truth for range verification. Results: For in-vivo dose measurement, the results were in agreement with TLD and EBT films and were within 3% of the treatment planning calculations. For range verification, a precision of 0.5 mm is achieved in homogeneous phantoms, and a precision of 2 mm for the anthropomorphic pelvic phantom, except at points with significant range mixing. Conclusion: We completed the commissioning of our system for in-vivo dosimetry and range verification in proton therapy. The results suggest that the system is ready for clinical trials on patients.

  2. Development and testing of a two stage granular filter to improve collection efficiency

    Energy Technology Data Exchange (ETDEWEB)

    Rangan, R.S.; Prakash, S.G.; Chakravarti, S.; Rao, S.R.

    1999-07-01

    A circulating bed granular filter (CBGF) with a single filtration stage was tested with a PFB combustor in the Coal Research Facility of BHEL R and D in Hyderabad during the years 1993-95. Filter outlet dust loading varied between 20-50 mg/Nm³ for an inlet dust loading of 5-8 g/Nm³. The results were reported in Fluidized Bed Combustion-Volume 2, ASME 1995. Though the outlet consists of predominantly fine particulates below 2 microns, it is still beyond present day gas turbine specifications for particulate concentration. In order to enhance the collection efficiency, a two-stage granular filtration concept was evolved, wherein the filter depth is divided between two stages, accommodated in two separate vertically mounted units. The design also incorporates BHEL's scale-up concept of multiple parallel stages. The two-stage concept minimizes reentrainment of captured dust by providing clean granules in the upper stage, from where gases finally exit the filter. The design ensures that dusty gases come in contact with granules having a higher dust concentration at the bottom of the two-stage unit, where most of the cleaning is completed. A second filtration stage of cleaned granules is provided in the top unit (where the granules are returned to the system after dedusting) minimizing reentrainment. Tests were conducted to determine the optimum granule to dust ratio (G/D ratio) which decides the granule circulation rate required for the desired collection efficiency. The data brings out the importance of pre-separation and the limitation on inlet dust loading for any continuous system of granular filtration. Collection efficiencies obtained were much higher (outlet dust being 3-9 mg/Nm³) than in the single stage filter tested earlier for similar dust loading at the inlet. The results indicate that two-stage granular filtration has a high potential for HTHT application with fewer risks as compared to other systems under development.

  3. Scalable Techniques for Formal Verification

    CERN Document Server

    Ray, Sandip

    2010-01-01

    This book presents state-of-the-art approaches to formal verification techniques to seamlessly integrate different formal verification methods within a single logical foundation. It should benefit researchers and practitioners looking to get a broad overview of the spectrum of formal verification techniques, as well as approaches to combining such techniques within a single framework. Coverage includes a range of case studies showing how such combination is fruitful in developing a scalable verification methodology for industrial designs. This book outlines both theoretical and practical issue

  4. Development of the automatic control rod operation system for JOYO. Verification of automatic control rod operation guide system

    International Nuclear Information System (INIS)

    Terakado, Tsuguo; Suzuki, Shinya; Kawai, Masashi; Aoki, Hiroshi; Ohkubo, Toshiyuki

    1999-10-01

    The automatic control rod operation system was developed to control the JOYO reactor power automatically in all operation modes (critical approach, cooling system heat-up, power ascent, power descent); development began in 1989. Prior to applying the system, verification tests of the automatic control rod operation guide system were conducted during the 32nd duty cycle of JOYO, from Dec. 1997 to Feb. 1998. The automatic control rod operation guide system consists of the control rod operation guide function and the plant operation guide function. The control rod operation guide function provides information on control rod movement and position, while the plant operation guide function provides guidance for plant operations corresponding to reactor power changes (power ascent or power descent). Control rod insertion and withdrawal are predicted by fuzzy algorithms. (J.P.N.)
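
    The fuzzy prediction of rod movement mentioned above can be pictured with a toy rule base: the sketch below maps the deviation between demanded and actual reactor power to a recommended rod action using triangular membership functions and centroid defuzzification. It is a generic illustration only and does not reproduce the JOYO guide system's actual rules.

    # Toy sketch (illustrative only): fuzzy mapping from power deviation to a
    # recommended control-rod action, in the spirit of the guide system above.
    import numpy as np

    def tri(x, a, b, c):
        """Triangular membership function with feet at a, c and peak at b."""
        return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

    def rod_action(power_error_pct: float) -> float:
        """Negative = insert, positive = withdraw (arbitrary units)."""
        e = power_error_pct          # demanded minus actual power, in % of rated
        # Rule strengths from input membership
        low  = tri(e, -10.0, -5.0,  0.0)   # power too high  -> insert
        ok   = tri(e,  -2.0,  0.0,  2.0)   # on target       -> hold
        high = tri(e,   0.0,  5.0, 10.0)   # power too low   -> withdraw

        u = np.linspace(-1.0, 1.0, 201)    # candidate rod actions
        # Output fuzzy sets, clipped by rule strengths (Mamdani inference)
        agg = np.maximum.reduce([
            np.minimum(low,  tri(u, -1.0, -0.5, 0.0)),
            np.minimum(ok,   tri(u, -0.2,  0.0, 0.2)),
            np.minimum(high, tri(u,  0.0,  0.5, 1.0)),
        ])
        if agg.sum() == 0.0:
            return 0.0
        return float((u * agg).sum() / agg.sum())   # centroid defuzzification

    for err in (-6.0, -1.0, 0.0, 3.0):
        print(f"power error {err:+.1f}% -> rod action {rod_action(err):+.2f}")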

  5. Development and evaluation of a HEPA filter for increased strength and resistance to elevated temperature

    International Nuclear Information System (INIS)

    Gilbert, H.; Bergman, W.; Fretthold, J.K.

    1993-01-01

    We have completed a preliminary study of an improved HEPA filter with increased strength and resistance to elevated temperature, intended to improve the reliability of the standard deep-pleated HEPA filter under accident conditions. The improvements to the HEPA filter consist of a silicone rubber sealant and a new HEPA medium reinforced with a glass cloth. Three prototype filters were built and evaluated for temperature and pressure resistance and resistance to rough handling. The temperature resistance test consisted of exposing the HEPA filter to 1,000 scfm (1,700 m³/hr) at 700 degrees F (371 degrees C) for five minutes. The pressure resistance test consisted of exposing the HEPA filter to a differential pressure of 10 in. w.g. (2.5 kPa) using a water-saturated air flow at 95 degrees F (35 degrees C). For the rough handling test, we used a vibrating machine designated the Q110. DOP filter efficiency tests were performed before and after each of the environmental tests. In addition to following the standard practice of using a separate new filter for each environmental test, we also subjected the same filter to the elevated temperature test followed by the pressure resistance test. The efficiency test results show that the improved HEPA filter is significantly better than the standard HEPA filter. Further studies are recommended to evaluate the improved HEPA filter and to assess its performance under more severe accident conditions.

  6. Development of Aerosol Scrubbing Test Loop for Containment Filtered Venting System

    International Nuclear Information System (INIS)

    Lee, Doo Yong; Jung, Woo Young; Lee, Hyun Chul; Lee, Jong Chan; Kim, Gyu Tae

    2016-01-01

    The scrubber tank is filled with scrubbing water with the chemical additives. The droplet separator based on a cyclone is installed above the scrubbing water pool to remove the large droplets that may clog a metal fiber filter installed at the upper section of the scrubber tank. The outlet piping is connected from the scrubber tank to the molecular sieve to chemically remove the gaseous iodine. The aerosol as a particle is physically captured in the scrubbing water pool passing through the scrubbing nozzle as well as the metal fiber filter. The gaseous iodine such as molecular iodine as well as organic iodide is chemically removed in the scrubbing water pool and molecular sieve. The thermal-hydraulic as well as scrubbing performance for the CFVS should be verified with the experiments. The experiment can be divided into the filtration component based experiment and whole system based one. In this paper, the aerosol scrubbing test loop developed to test the thermal-hydraulic and aerosol scrubbing performance of the scrubbing nozzle with the scrubbing water pool is introduced. The aerosol scrubbing test loop has been developed as a part of the Korean CFVS project. In this loop, the filtration components such as the scrubbing nozzle submerged in the scrubbing water pool as well as the cyclone as droplet separator can be tested under the CFVS operating conditions. The aerosol scrubbing performance of the filtration components including pool scrubbing behavior can be tested with the aerosol generation and feeding system and aerosol measurement system.

  7. Development of Aerosol Scrubbing Test Loop for Containment Filtered Venting System

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Doo Yong; Jung, Woo Young; Lee, Hyun Chul; Lee, Jong Chan; Kim, Gyu Tae [FNC Technology, Yongin (Korea, Republic of)

    2016-05-15

    The scrubber tank is filled with scrubbing water with the chemical additives. The droplet separator based on a cyclone is installed above the scrubbing water pool to remove the large droplets that may clog a metal fiber filter installed at the upper section of the scrubber tank. The outlet piping is connected from the scrubber tank to the molecular sieve to chemically remove the gaseous iodine. The aerosol as a particle is physically captured in the scrubbing water pool passing through the scrubbing nozzle as well as the metal fiber filter. The gaseous iodine such as molecular iodine as well as organic iodide is chemically removed in the scrubbing water pool and molecular sieve. The thermal-hydraulic as well as scrubbing performance for the CFVS should be verified with the experiments. The experiment can be divided into the filtration component based experiment and whole system based one. In this paper, the aerosol scrubbing test loop developed to test the thermal-hydraulic and aerosol scrubbing performance of the scrubbing nozzle with the scrubbing water pool is introduced. The aerosol scrubbing test loop has been developed as a part of the Korean CFVS project. In this loop, the filtration components such as the scrubbing nozzle submerged in the scrubbing water pool as well as the cyclone as droplet separator can be tested under the CFVS operating conditions. The aerosol scrubbing performance of the filtration components including pool scrubbing behavior can be tested with the aerosol generation and feeding system and aerosol measurement system.

  8. Development and testing of the detector for monitoring radon double-filter method

    International Nuclear Information System (INIS)

    Sevcik, P.

    2008-01-01

    Applications of physics in the study of radon transport processes in the atmosphere and in the testing of atmospheric transport models require sensitive detection devices with low maintenance requirements. The most precise devices involved in the worldwide atmosphere monitoring programme (GAW) determine the volume activity of radon from the daughter products of ²²²Rn formed in the working volume of the detector (double-filter method). The purpose of this work was to explore, theoretically and experimentally, the possibilities and limits of a particularly simple implementation of this procedure. The tested apparatus consists of a 200 dm³ chamber (a metal drum), in which the radon transformation products develop, and a surface-barrier semiconductor detector, which registers α particles from the decay of ²²²Rn daughter products collected on a filter at the outlet of the chamber. Testing of the apparatus took place in an atmosphere with elevated radon concentrations. The measured variations of the ²²²Rn volume activity have the same character as the variations of the radon concentration in the laboratory air. The minimum detectable activity at the 95% significance level is 16.0 Bq·m⁻³ at an air pumping rate of 20 dm³·min⁻¹ and 13.0 Bq·m⁻³ at a pumping rate of 24 dm³·min⁻¹. These values are still too high for using the apparatus for measurements in the outdoor atmosphere. The main limitation of the apparatus is the capture of transformation products on the inner walls of the chamber (plate-out effect). The effectiveness of collecting ²¹⁸Po from the chamber on the filter in our measurements was only 2.8%, but we managed to increase it to about 20% by adding an aerosol delivery system to the chamber in which the radon transformation products are produced. It turns out that a sensitive, continuously working radon monitor can be built on this principle. (author)

  9. General-Purpose Heat Source development: Safety Verification Test Program. Bullet/fragment test series

    Energy Technology Data Exchange (ETDEWEB)

    George, T.G.; Tate, R.E.; Axler, K.M.

    1985-05-01

    The radioisotope thermoelectric generator (RTG) that will provide power for space missions contains 18 General-Purpose Heat Source (GPHS) modules. Each module contains four ²³⁸PuO₂-fueled clads and generates 250 W(t). Because a launch-pad or post-launch explosion is always possible, we need to determine the ability of GPHS fueled clads within a module to survive fragment impact. The bullet/fragment test series, part of the Safety Verification Test Plan, was designed to provide information on clad response to impact by a compact, high-energy, aluminum-alloy fragment and to establish a threshold value of fragment energy required to breach the iridium cladding. Test results show that a velocity of 555 m/s (1820 ft/s) with an 18-g bullet is at or near the threshold value of fragment velocity that will cause a clad breach. Results also show that an exothermic Ir/Al reaction occurs if aluminum and hot iridium are in contact, a contact that is possible and most damaging to the clad within a narrow velocity range. The observed reactions between the iridium and the aluminum were studied in the laboratory and are reported in the Appendix.

  10. Developing reliable safeguards seals for application verification and removal by State operators

    Energy Technology Data Exchange (ETDEWEB)

    Finch, Robert J. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Smartt, Heidi A. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Haddal, Risa [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2017-10-01

    Once a geological repository has begun operations, the encapsulation and disposal of spent fuel will be performed as a continuous, industrial-scale series of processes, during which time safeguards seals will be applied to transportation casks before shipment from an encapsulation plant, and then verified and removed following receipt at the repository. These operations will occur approximately daily during several decades of Sweden's repository operation; however, requiring safeguards inspectors to perform the application, verification, and removal of every seal would be an onerous burden on the International Atomic Energy Agency's (IAEA's) resources. Current IAEA practice allows operators either to apply seals or to remove them, but not both, so the daily task of either applying or verifying and removing seals would still require the continuous presence of IAEA inspectors at at least one site. Of special importance is the inability to re-verify casks or canisters from which seals have been removed once the canisters have been emplaced underground. Successfully designing seals that can be applied, verified and removed by an operator with IAEA approval could benefit not only repository shipments but other applications as well, potentially reducing inspector burdens for a wide range of such duties.

  11. Explosion overpressure test series: General-Purpose Heat Source development: Safety Verification Test program

    International Nuclear Information System (INIS)

    Cull, T.A.; George, T.G.; Pavone, D.

    1986-09-01

    The General-Purpose Heat Source (GPHS) is a modular, radioisotope heat source that will be used in radioisotope thermoelectric generators (RTGs) to supply electric power for space missions. The first two uses will be the NASA Galileo and the ESA Ulysses missions. The RTG for these missions will contain 18 GPHS modules, each of which contains four 238 PuO 2 -fueled clads and generates 250 W(t). A series of Safety Verification Tests (SVTs) was conducted to assess the ability of the GPHS modules to contain the plutonia in accident environments. Because a launch pad or postlaunch explosion of the Space Transportation System vehicle (space shuttle) is a conceivable accident, the SVT plan included a series of tests that simulated the overpressure exposure the RTG and GPHS modules could experience in such an event. Results of these tests, in which we used depleted UO 2 as a fuel simulant, suggest that exposure to overpressures as high as 15.2 MPa (2200 psi), without subsequent impact, does not result in a release of fuel.

  12. KAERI software verification and validation guideline for developing safety-critical software in digital I and C system of NPP

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Jang Yeol; Lee, Jang Soo; Eom, Heung Seop

    1997-07-01

    This technical report presents a V and V guideline development methodology for safety-critical software in NPP safety systems. It presents the planning-phase V and V guideline for the NPP safety system together with critical safety items, for example the independence philosophy, the software safety analysis concept, commercial off-the-shelf (COTS) software evaluation criteria, and inter-relationships with other safety assurance organizations, drawing on the concepts of the existing industrial standards IEEE Std-1012 and IEEE Std-1059. The report covers the scope of the V and V guideline, the guideline framework as part of the acceptance criteria, V and V activities with task entrance and exit criteria, review and audit, testing and QA records of V and V material and configuration management, production of the software verification and validation plan, etc., and the safety-critical software V and V methodology. (author). 11 refs.

  13. KAERI software verification and validation guideline for developing safety-critical software in digital I and C system of NPP

    International Nuclear Information System (INIS)

    Kim, Jang Yeol; Lee, Jang Soo; Eom, Heung Seop.

    1997-07-01

    This technical report presents a V and V guideline development methodology for safety-critical software in NPP safety systems. It presents the planning-phase V and V guideline for the NPP safety system together with critical safety items, for example the independence philosophy, the software safety analysis concept, commercial off-the-shelf (COTS) software evaluation criteria, and inter-relationships with other safety assurance organizations, drawing on the concepts of the existing industrial standards IEEE Std-1012 and IEEE Std-1059. The report covers the scope of the V and V guideline, the guideline framework as part of the acceptance criteria, V and V activities with task entrance and exit criteria, review and audit, testing and QA records of V and V material and configuration management, production of the software verification and validation plan, etc., and the safety-critical software V and V methodology. (author). 11 refs

  14. Selection vector filter framework

    Science.gov (United States)

    Lukac, Rastislav; Plataniotis, Konstantinos N.; Smolka, Bogdan; Venetsanopoulos, Anastasios N.

    2003-10-01

    We provide a unified framework of nonlinear vector techniques outputting the lowest-ranked vector. The proposed framework constitutes a generalized filter class for multichannel signal processing. A new class of nonlinear selection filters is based on robust order-statistic theory and the minimization of a weighted distance function to the other input samples. The proposed method can be designed to perform a variety of filtering operations, including previously developed filtering techniques such as the vector median, the basic vector directional filter, the directional distance filter, weighted vector median filters and weighted directional filters. A wide range of filtering operations is guaranteed by the filter structure with two independent weight vectors for the angular and distance domains of the vector space. In order to adapt the filter parameters to varying signal and noise statistics, we also provide generalized optimization algorithms that take advantage of weighted median filters and the relationship between the standard median filter and the vector median filter. Thus, we can deal with both statistical and deterministic aspects of the filter design process. It will be shown that the proposed method has the required properties, such as the capability of modelling the underlying system in the application at hand, robustness with respect to errors in the model of the underlying system, the availability of a training procedure and, finally, the simplicity of filter representation, analysis, design and implementation. Simulation studies also indicate that the new filters are computationally attractive and have excellent performance in environments corrupted by bit errors and impulsive noise.
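    As a minimal illustration of the selection principle underlying this filter class, the sketch below implements only its best-known special case, the vector median filter, whose output is the window sample minimizing the aggregate Euclidean distance to all other samples. The toy pixel data and uniform weighting are assumptions and do not reproduce the authors' generalized weighted framework.

```python
import numpy as np

def vector_median(window):
    """Return the sample in `window` that minimises the aggregate L2 distance
    to all other samples (the classical vector median filter output).

    window : array-like of shape (n_samples, n_channels), e.g. RGB pixels.
    """
    window = np.asarray(window, dtype=float)
    # Pairwise Euclidean distances between all samples in the window
    diff = window[:, None, :] - window[None, :, :]
    dist = np.linalg.norm(diff, axis=-1)
    # Aggregate distance of each sample to the rest of the window
    scores = dist.sum(axis=1)
    return window[np.argmin(scores)]

# Toy 3x3 window of RGB vectors with one impulsive outlier
pixels = [[10, 12, 11], [11, 13, 10], [250, 5, 3],
          [ 9, 11, 12], [10, 12, 10], [ 12, 14, 11],
          [11, 12, 13], [10, 13, 12], [  9, 12, 11]]
print(vector_median(pixels))   # the outlier [250, 5, 3] is rejected
```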

  15. Hybrid Adaptive Filter development for the minimisation of transient fluctuations superimposed on electrotelluric field recordings mainly by magnetic storms

    Directory of Open Access Journals (Sweden)

    A. Konstantaras

    2006-01-01

    The method of Hybrid Adaptive Filtering (HAF) aims to recover the recorded electric field signals from anomalies of magnetotelluric origin induced mainly by magnetic storms. An adaptive filter incorporating neuro-fuzzy technology has been developed to remove any significant distortions from the equivalent magnetic field signal, as retrieved from the original electric field signal by reversing the magnetotelluric method. Testing with further unseen data verifies the reliability of the model and demonstrates the effectiveness of the HAF method.

  16. Particularities of Verification Processes for Distributed Informatics Applications

    Directory of Open Access Journals (Sweden)

    Ion IVAN

    2013-01-01

    This paper presents distributed informatics applications and the characteristics of their development cycle. It defines the concept of verification and identifies the differences from software testing. Particularities of the software testing and software verification processes are described. The verification steps and necessary conditions are presented, and the factors influencing verification quality are established. Software optimality verification is analyzed and some metrics are defined for the verification process.

  17. Development of an expert system for success path generation and operator's action guides in NPP: Verification and validation of COSMOS

    International Nuclear Information System (INIS)

    Yang, Jun Un; Jung, Kwang Sup; Park, Chang Gyu

    1992-08-01

    For the support of emergency operation, an expert system named COSMOS (COmputerized Success-path MOnitoring System) is being developed at the Korea Atomic Energy Research Institute (KAERI). COSMOS identifies the status of the critical safety functions (CSFs) and suggests an overall response strategy with a set of success paths which restore the challenged CSFs. The status of a CSF is identified by rule-based reasoning. The overall response strategy is inferred according to the identified CSF status. The success paths are generated from the given structure descriptions of the systems and a general generation algorithm. For an efficient man-machine interface, a color graphic display is utilized. COSMOS is being built on a workstation. The major tasks in building an expert system such as COSMOS are the construction of the knowledge base and the inference engine. In COSMOS, the knowledge is derived from the Emergency Operating Procedures (EOPs), and forward chaining is adopted as the inference strategy. While the knowledge base and inference engine are the most common and essential elements of an expert system, they are not the only ones. The evaluation of expert systems can not only lessen the risk of using faulty software, but also enhance the acceptability of the expert systems by both users and regulators. The evaluation of expert systems consists of system verification, validation and user acceptance testing. Among them, in this report, we have focused our attention on the verification and validation (V&V) of expert systems. We have assessed the general V&V procedures and tried to develop a specific V&V procedure for COSMOS. (Author)

  18. Development and Implementation of Dynamic Scripts to Support Local Model Verification at National Weather Service Weather Forecast Offices

    Science.gov (United States)

    Zavodsky, Bradley; Case, Jonathan L.; Gotway, John H.; White, Kristopher; Medlin, Jeffrey; Wood, Lance; Radell, Dave

    2014-01-01

    Local modeling with a customized configuration is conducted at National Weather Service (NWS) Weather Forecast Offices (WFOs) to produce high-resolution numerical forecasts that can better simulate local weather phenomena and complement larger scale global and regional models. The advent of the Environmental Modeling System (EMS), which provides a pre-compiled version of the Weather Research and Forecasting (WRF) model and wrapper Perl scripts, has enabled forecasters to easily configure and execute the WRF model on local workstations. NWS WFOs often use EMS output to help in forecasting highly localized, mesoscale features such as convective initiation, the timing and inland extent of lake effect snow bands, lake and sea breezes, and topographically-modified winds. However, quantitatively evaluating model performance to determine errors and biases still proves to be one of the challenges in running a local model. Developed at the National Center for Atmospheric Research (NCAR), the Model Evaluation Tools (MET) verification software makes performing these types of quantitative analyses easier, but operational forecasters do not generally have time to familiarize themselves with navigating the sometimes complex configurations associated with the MET tools. To assist forecasters in running a subset of MET programs and capabilities, the Short-term Prediction Research and Transition (SPoRT) Center has developed and transitioned a set of dynamic, easily configurable Perl scripts to collaborating NWS WFOs. The objective of these scripts is to provide SPoRT collaborating partners in the NWS with the ability to evaluate the skill of their local EMS model runs in near real time with little prior knowledge of the MET package. The ultimate goal is to make these verification scripts available to the broader NWS community in a future version of the EMS software. This paper provides an overview of the SPoRT MET scripts, instructions for how the scripts are run, and example use
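    The quantitative evaluation these scripts automate boils down to comparing model output with observations. The sketch below computes a few standard continuous verification statistics (bias, MAE, RMSE) for forecast/observation pairs; it is purely illustrative and does not reproduce the SPoRT Perl scripts or the internals of the MET point_stat tool, and the temperature values are made up.

```python
import numpy as np

def continuous_verification(forecast, observed):
    """Basic continuous verification statistics of the kind a point-verification
    tool reports (bias, MAE, RMSE).  Illustrative only."""
    f = np.asarray(forecast, dtype=float)
    o = np.asarray(observed, dtype=float)
    err = f - o
    return {"bias": err.mean(),
            "mae": np.abs(err).mean(),
            "rmse": np.sqrt((err ** 2).mean())}

# Hypothetical 2-m temperature forecast/observation pairs (deg C)
print(continuous_verification([21.3, 19.8, 24.1, 18.0],
                              [20.5, 20.2, 23.0, 18.4]))
```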

  19. Development of the neutron filters for JET gamma-ray cameras

    International Nuclear Information System (INIS)

    Soare, S.; Curuia, M.; Anghel, M.; Constantin, M.; David, E.; Kiptily, V.; Prior, P.; Edlington, T.; Griph, S.; Krivchenkov, Y.; Popovichev, S.; Riccardo, V.; Syme, B; Thompson, V.; Murari, A.; Zoita, V.; Bonheure, G.; Le Guern

    2007-01-01

    The JET gamma-ray camera diagnostics have already provided valuable information on the gamma-ray imaging of fast ion evaluation in JET plasmas. The JET Gamma-Ray Cameras (GRC) upgrade project deals with the design of appropriate neutron/gamma-ray filters ('neutron attenuators'). The main design parameter was the neutron attenuation factor. The two design solutions that were finally chosen and developed to the level of a scheme design consist of: a) one quasi-crescent shaped neutron attenuator (for the horizontal camera) and b) two quasi-trapezoid shaped neutron attenuators (for the vertical one). Various neutron-attenuating materials have been considered (lithium hydride with natural isotopic composition and 6 Li enriched, light and heavy water, polyethylene). Pure light water was finally chosen as the attenuating material for the JET gamma-ray cameras. FEA methods used to evaluate the behaviour of the filter casings under the loadings (internal hydrostatic pressure, torques) have proven the stability of the structure. (authors)

  20. Development of a hotspot detector with an acrylic filter and dose rate survey meters

    International Nuclear Information System (INIS)

    Shirakawa, Yoshiyuki; Yamano, Toshiya; Kobayashi, Yusuke; Hara, Masaki

    2013-01-01

    Fukushima and the adjacent regions still have a large number of high dose rate areas called hotspots. It is necessary to locate these hotspots for efficient decontamination of radioactive substances such as 137 Cs and for the reassurance of residents returning home. To find the hotspots rapidly, we have to specify the direction of the area where the dose rate is at least 1 μSv/h higher than that of the surroundings. We have developed a detector that consists of an acrylic filter and three NaI(Tl) scintillation survey meters, and the detector can be expected to indicate the direction of a hotspot in a short time. The basic performance of the detector was examined using acrylic filters of 10, 15, 20 and 25 cm diameter and a tiny sealed 137 Cs source of 3 MBq as a substitute for a hotspot. It demonstrated the possibility of identifying the direction of γ-rays emitted from the source within 90 seconds. (author)

  1. Development of high efficiency filtered containment venting system by using AgX

    International Nuclear Information System (INIS)

    Narabayashi, Tadashi; Fujii, Yasuhiro; Chiba, Go; Tsuji, Masashi; Ishii, Tasuku

    2014-01-01

    The Fukushima Daiichi NPP accident could have been terminated if sufficient accident countermeasures, such as waterproof doors and mobile power supplies, had been in place. In Europe, heat removal systems and filtered containment venting systems (FCVS) had already been installed, following the lessons of the TMI and Chernobyl accidents. Decay heat removal and CV spray cooling with FCVS are ensured by using mobile generators and heat exchangers so that the ultimate heat sink is maintained even in a severe natural disaster such as a large earthquake, a big tsunami, or sudden flooding. In this paper we introduce a high decontamination factor FCVS that uses the silver zeolite AgX, developed by Rasa Industries, Ltd. Hokkaido University has tested a wet-type FCVS using a venturi scrubber in a water pool and a dry-type FCVS using a metallic filter for the first stage and AgX for the second stage. Since AgX requires superheated steam, the steam can be superheated by a heat exchanger; this was confirmed by TRAC analysis. (author)

  2. Development, Verification and Validation of Parallel, Scalable Volume of Fluid CFD Program for Propulsion Applications

    Science.gov (United States)

    West, Jeff; Yang, H. Q.

    2014-01-01

    There are many instances involving liquid/gas interfaces and their dynamics in the design of liquid engine powered rockets such as the Space Launch System (SLS). Some examples of these applications are: propellant tank draining and slosh, subcritical condition injector analysis for gas generators, preburners and thrust chambers, water deluge mitigation for launch induced environments and even solid rocket motor liquid slag dynamics. Commercially available CFD programs simulating gas/liquid interfaces using the Volume of Fluid approach are currently limited in their parallel scalability. In 2010, for instance, an internal NASA/MSFC review of three commercial tools revealed that parallel scalability was seriously compromised at 8 cpus and no additional speedup was possible after 32 cpus. Other non-interface CFD applications at the time were demonstrating useful parallel scalability up to 4,096 processors or more. Based on this review, NASA/MSFC initiated an effort to implement a Volume of Fluid capability within the unstructured mesh, pressure-based algorithm CFD program, Loci-STREAM. After verification was achieved by comparing results to the commercial CFD program CFD-Ace+, and validation by direct comparison with data, Loci-STREAM-VoF is now the production CFD tool for propellant slosh force and slosh damping rate simulations at NASA/MSFC. On these applications, good parallel scalability has been demonstrated for problem sizes of tens of millions of cells and thousands of cpu cores. Ongoing efforts are focused on the application of Loci-STREAM-VoF to predict the transient flow patterns of water on the SLS Mobile Launch Platform in order to support the phasing of water for launch environment mitigation so that detrimental effects on the vehicle are not realized.

  3. Development of decommissioning management system. 9. Remodeling to PC system and system verification by evaluation of real work

    International Nuclear Information System (INIS)

    Kondo, Hitoshi; Fukuda, Seiji; Okubo, Toshiyuki

    2004-03-01

    When the decommissioning of facilities such as nuclear fuel cycle facilities and small-scale research reactors is planned, it is necessary to select the technology and the sequence of work procedures, and to optimize the indices concerning dismantling the facility (such as the radiation dose, the cost, the amount of waste, the number of workers, and the duration of the work). In our waste management section, the development of a decommissioning management system called 'DECMAN', which supports the preparation of decommissioning plans, is under way. DECMAN automatically calculates the indices by using the facility data and the dismantling method. This paper describes the porting of the program to a personal computer and the verification of the system by evaluating real work (dismantling of the liquor dissolver in the old JOYO Waste Treatment Facility (the old JWTF), the glove boxes in the Deuterium Critical Assembly (DCA), and the incinerator in the Waste Dismantling Facility (WDF)). The outline of the remodeling and verification is as follows. (1) Additional functions: 1) equipment arrangement mapping, 2) evaluation of the radiation dose by using the air dose rate, 3) data I/O using EXCEL. (2) Comparison of the amount of work between calculated and actual values: the calculated value was 222.67 man·hours against the actual value of 249.40 man·hours in the old JWTF evaluation. (3) Accompanying work can be forecast by multiplying the calculated value by a certain coefficient. (4) A new approach for estimating the amount of work was constructed by using the calculated values of DECMAN. (author)

  4. Procedure generation and verification

    International Nuclear Information System (INIS)

    Sheely, W.F.

    1986-01-01

    The Department of Energy has used Artificial Intelligence or ''AI'' concepts to develop two powerful new computer-based techniques to enhance safety in nuclear applications. The Procedure Generation System and the Procedure Verification System can be adapted to other commercial applications, such as a manufacturing plant. The Procedure Generation System can create a procedure to deal with an off-normal condition. The operator can then take correct actions on the system in minimal time. The Verification System evaluates the logic of the Procedure Generator's conclusions. This evaluation uses logic techniques totally independent of the Procedure Generator. The rapid, accurate generation and verification of corrective procedures can greatly reduce the human error possible in a complex, high-stress situation.

  5. Development of dose delivery verification by PET imaging of photonuclear reactions following high energy photon therapy

    International Nuclear Information System (INIS)

    Janek, S; Svensson, R; Jonsson, C; Brahme, A

    2006-01-01

    A method for dose delivery monitoring after high energy photon therapy has been investigated based on positron emission tomography (PET). The technique is based on the activation of body tissues by high energy bremsstrahlung beams, preferably with energies well above 20 MeV, resulting primarily in 11 C and 15 O but also 13 N, all positron-emitting radionuclides produced by photoneutron reactions in the nuclei of 12 C, 16 O and 14 N. A PMMA phantom and animal tissue, a frozen hind leg of a pig, were irradiated to 10 Gy and the induced positron activity distributions were measured off-line in a PET camera a couple of minutes after irradiation. The accelerator used was a Racetrack Microtron at the Karolinska University Hospital using 50 MV scanned photon beams. From photonuclear cross-section data integrated over the 50 MV photon fluence spectrum the predicted PET signal was calculated and compared with experimental measurements. Since measured PET images change with time post irradiation, as a result of the different decay times of the radionuclides, the signals from activated 12 C, 16 O and 14 N within the irradiated volume could be separated from each other. Most information is obtained from the carbon and oxygen radionuclides which are the most abundant elements in soft tissue. The predicted and measured overall positron activities are almost equal (-3%) while the predicted activity originating from nitrogen is overestimated by almost a factor of two, possibly due to experimental noise. Based on the results obtained in this first feasibility study the great value of a combined radiotherapy-PET-CT unit is indicated in order to fully exploit the high activity signal from oxygen immediately after treatment and to avoid patient repositioning. With an RT-PET-CT unit a high signal could be collected even at a dose level of 2 Gy and the acquisition time for the PET could be reduced considerably. Real patient dose delivery verification by means of PET imaging seems to be
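    The prediction step described above, folding photonuclear cross sections over the photon fluence spectrum, can be illustrated with a minimal sketch. The Gaussian cross-section shape, flat fluence tail and nuclide density below are placeholders rather than values from the study, and depth dependence and decay during irradiation are ignored.

```python
import numpy as np

def positron_emitter_yield(energies_mev, fluence_per_mev, cross_section_barn,
                           nuclei_per_cm3):
    """Fold a photonuclear cross section with a photon fluence spectrum to get
    the number of activations per cm^3 (very simplified: no depth dependence,
    no decay during irradiation)."""
    sigma_cm2 = np.asarray(cross_section_barn) * 1e-24         # barn -> cm^2
    integrand = sigma_cm2 * np.asarray(fluence_per_mev)        # cm^2 * photons/(cm^2 MeV)
    reactions_per_nucleus = np.trapz(integrand, energies_mev)  # integrate over energy
    return reactions_per_nucleus * nuclei_per_cm3

# Hypothetical giant-resonance-like cross section (peak ~8 mb) and a flat
# fluence tail above threshold -- illustrative numbers only.
e = np.linspace(16, 50, 100)                        # MeV
sigma = 0.008 * np.exp(-((e - 23.0) / 4.0) ** 2)    # barn, assumed shape
phi = np.full_like(e, 1.0e9)                        # photons / (cm^2 MeV), assumed
print(positron_emitter_yield(e, phi, sigma, nuclei_per_cm3=3.3e22))
```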

  6. Development and Verification of the Tire/Road Friction Estimation Algorithm for Antilock Braking System

    Directory of Open Access Journals (Sweden)

    Jian Zhao

    2014-01-01

    Road friction information is very important for vehicle active braking control systems such as ABS, ASR, or ESP. It is not easy to estimate the tire/road friction forces and coefficient accurately because of the nonlinear system, parameter uncertainties, and signal noise. In this paper, a robust and effective tire/road friction estimation algorithm for ABS is proposed, and its performance is further discussed by simulation and experiment. The tire forces were observed by a discrete Kalman filter, and the road friction coefficient was subsequently estimated by the recursive least squares method. The proposed algorithm was then analysed and verified by simulation and road tests. A sliding-mode-based ABS with smooth wheel slip ratio control and a threshold-based ABS with pulse pressure control and significant fluctuations were used for the simulation. Finally, road tests were carried out in both winter and summer with a car equipped with the same threshold-based ABS, and the algorithm was evaluated on different road surfaces. The results show that the proposed algorithm can identify the variation of road conditions with considerable accuracy and response speed.
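    The recursive least squares step mentioned above can be sketched for the simplest possible model, a single friction coefficient in the linear relation Fx ≈ mu*Fz. The observer structure, forgetting factor and noisy force sequence below are illustrative assumptions rather than the authors' actual formulation.

```python
import numpy as np

def rls_update(theta, P, x, y, lam=0.98):
    """One recursive-least-squares step for the scalar model y = theta * x.
    theta : current estimate of the road friction coefficient
    P     : covariance (scalar here)
    x     : regressor, e.g. normal force Fz
    y     : measurement, e.g. longitudinal tire force Fx from the observer
    lam   : forgetting factor so the estimate can track changing surfaces
    """
    k = P * x / (lam + x * P * x)           # gain
    theta = theta + k * (y - theta * x)     # parameter update
    P = (P - k * x * P) / lam               # covariance update
    return theta, P

# Hypothetical force sequence on a mu ~ 0.45 surface with measurement noise
rng = np.random.default_rng(0)
theta, P = 0.8, 1.0
for _ in range(200):
    fz = 4000.0 + rng.normal(0, 50)         # N, normal force
    fx = 0.45 * fz + rng.normal(0, 100)     # N, tire force
    theta, P = rls_update(theta, P, fz, fx)
print(round(theta, 3))   # converges toward ~0.45
```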

  7. Face identification with frequency domain matched filtering in mobile environments

    Science.gov (United States)

    Lee, Dong-Su; Woo, Yong-Hyun; Yeom, Seokwon; Kim, Shin-Hwan

    2012-06-01

    Face identification at a distance is very challenging since captured images are often degraded by blur and noise. Furthermore, the computational resources and memory are often limited in mobile environments. Thus, it is very challenging to develop a real-time face identification system on a mobile device. This paper discusses face identification based on frequency domain matched filtering in mobile environments. Face identification is performed by a linear or phase-only matched filter and sequential verification stages. The candidate window regions are decided by the major peaks of the linear or phase-only matched filtering outputs. The sequential stages comprise a skin-color test and an edge mask filtering test, which verify the color and shape information of the candidate regions in order to remove false alarms. All algorithms are built on the mobile device using the Android platform. The preliminary results show that face identification of East Asian people can be performed successfully in mobile environments.
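    A minimal sketch of frequency-domain matched filtering with a phase-only filter is given below; it locates a template in a scene by taking the peak of the correlation surface, using synthetic image data, and does not attempt to reproduce the paper's skin-color or edge-mask verification stages.

```python
import numpy as np

def phase_only_correlation(scene, template):
    """Correlate a scene with a template using a phase-only matched filter.
    Returns the correlation surface; its peak marks the best match location."""
    rows = scene.shape[0] + template.shape[0] - 1
    cols = scene.shape[1] + template.shape[1] - 1
    S = np.fft.fft2(scene, s=(rows, cols))
    T = np.fft.fft2(template, s=(rows, cols))
    # Phase-only filter: keep only the spectral phase of the template
    H = np.conj(T) / (np.abs(T) + 1e-12)
    return np.real(np.fft.ifft2(S * H))

# Toy example: find a small bright patch embedded in a larger image
scene = np.zeros((64, 64))
scene[20:28, 30:38] = 1.0
template = np.ones((8, 8))
corr = phase_only_correlation(scene, template)
print(np.unravel_index(np.argmax(corr), corr.shape))   # peak near (20, 30)
```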

  8. Verification and validation benchmarks.

    Energy Technology Data Exchange (ETDEWEB)

    Oberkampf, William Louis; Trucano, Timothy Guy

    2007-02-01

    Verification and validation (V&V) are the primary means to assess the accuracy and reliability of computational simulations. V&V methods and procedures have fundamentally improved the credibility of simulations in several high-consequence fields, such as nuclear reactor safety, underground nuclear waste storage, and nuclear weapon safety. Although the terminology is not uniform across engineering disciplines, code verification deals with assessing the reliability of the software coding, and solution verification deals with assessing the numerical accuracy of the solution to a computational model. Validation addresses the physics modeling accuracy of a computational simulation by comparing the computational results with experimental data. Code verification benchmarks and validation benchmarks have been constructed for a number of years in every field of computational simulation. However, no comprehensive guidelines have been proposed for the construction and use of V&V benchmarks. For example, the field of nuclear reactor safety has not focused on code verification benchmarks, but it has placed great emphasis on developing validation benchmarks. Many of these validation benchmarks are closely related to the operations of actual reactors at near-safety-critical conditions, as opposed to being more fundamental-physics benchmarks. This paper presents recommendations for the effective design and use of code verification benchmarks based on manufactured solutions, classical analytical solutions, and highly accurate numerical solutions. In addition, this paper presents recommendations for the design and use of validation benchmarks, highlighting the careful design of building-block experiments, the estimation of experimental measurement uncertainty for both inputs and outputs to the code, validation metrics, and the role of model calibration in validation. It is argued that the understanding of predictive capability of a computational model is built on the level of

  9. Verification and validation benchmarks

    International Nuclear Information System (INIS)

    Oberkampf, William Louis; Trucano, Timothy Guy

    2007-01-01

    Verification and validation (V and V) are the primary means to assess the accuracy and reliability of computational simulations. V and V methods and procedures have fundamentally improved the credibility of simulations in several high-consequence fields, such as nuclear reactor safety, underground nuclear waste storage, and nuclear weapon safety. Although the terminology is not uniform across engineering disciplines, code verification deals with assessing the reliability of the software coding, and solution verification deals with assessing the numerical accuracy of the solution to a computational model. Validation addresses the physics modeling accuracy of a computational simulation by comparing the computational results with experimental data. Code verification benchmarks and validation benchmarks have been constructed for a number of years in every field of computational simulation. However, no comprehensive guidelines have been proposed for the construction and use of V and V benchmarks. For example, the field of nuclear reactor safety has not focused on code verification benchmarks, but it has placed great emphasis on developing validation benchmarks. Many of these validation benchmarks are closely related to the operations of actual reactors at near-safety-critical conditions, as opposed to being more fundamental-physics benchmarks. This paper presents recommendations for the effective design and use of code verification benchmarks based on manufactured solutions, classical analytical solutions, and highly accurate numerical solutions. In addition, this paper presents recommendations for the design and use of validation benchmarks, highlighting the careful design of building-block experiments, the estimation of experimental measurement uncertainty for both inputs and outputs to the code, validation metrics, and the role of model calibration in validation. It is argued that the understanding of predictive capability of a computational model is built on the

  10. Verification and validation benchmarks

    International Nuclear Information System (INIS)

    Oberkampf, William L.; Trucano, Timothy G.

    2008-01-01

    Verification and validation (V and V) are the primary means to assess the accuracy and reliability of computational simulations. V and V methods and procedures have fundamentally improved the credibility of simulations in several high-consequence fields, such as nuclear reactor safety, underground nuclear waste storage, and nuclear weapon safety. Although the terminology is not uniform across engineering disciplines, code verification deals with assessing the reliability of the software coding, and solution verification deals with assessing the numerical accuracy of the solution to a computational model. Validation addresses the physics modeling accuracy of a computational simulation by comparing the computational results with experimental data. Code verification benchmarks and validation benchmarks have been constructed for a number of years in every field of computational simulation. However, no comprehensive guidelines have been proposed for the construction and use of V and V benchmarks. For example, the field of nuclear reactor safety has not focused on code verification benchmarks, but it has placed great emphasis on developing validation benchmarks. Many of these validation benchmarks are closely related to the operations of actual reactors at near-safety-critical conditions, as opposed to being more fundamental-physics benchmarks. This paper presents recommendations for the effective design and use of code verification benchmarks based on manufactured solutions, classical analytical solutions, and highly accurate numerical solutions. In addition, this paper presents recommendations for the design and use of validation benchmarks, highlighting the careful design of building-block experiments, the estimation of experimental measurement uncertainty for both inputs and outputs to the code, validation metrics, and the role of model calibration in validation. It is argued that the understanding of predictive capability of a computational model is built on the

  11. Development of filter element from nanocomposites of ultra high molar mass polyethylene having silver nanoparticles

    International Nuclear Information System (INIS)

    Bizzo, Maurizio A.; Wang, S. Hui

    2015-01-01

    The production of polymer-based filter elements for water is widespread in the market, but these elements have an undesirable characteristic: they are not always efficient at retaining or eliminating microorganisms. This paper proposes the production of filters with biocidal activity, composed of nanocomposites of ultra-high molar mass polyethylene (UHMMPE) containing silver nanoparticles. The polymer is responsible for the uniform porous structure of the filter element and the Ag nanoparticles for its biocidal action. The filter elements were produced from two kinds of UHMMPE particles with different particle size distributions, one in the range of 150 to 200 μm and the other 300 to 400 μm. Samples were collected from the obtained filter elements and characterized by X-ray diffractometry, scanning electron microscopy and microanalysis. The results indicated the formation of a nanocomposite containing silver nanoparticles. (author)

  12. Formal Development and Verification of Railway Control Systems - In the context of ERTMS/ETCS Level 2

    DEFF Research Database (Denmark)

    Vu, Linh Hong

    This dissertation presents a holistic, formal method for efficient modelling and verification of safety-critical railway control systems that have product line characteristics, i.e., each individual system is constructed by instantiating common generic applications with concrete configuration data... standardized railway control systems ERTMS/ETCS Level 2. Experiments showed that the method can be used for specification, verification and validation of systems of industrial size.

  13. Development and testing of a medline search filter for identifying patient and public involvement in health research.

    Science.gov (United States)

    Rogers, Morwenna; Bethel, Alison; Boddy, Kate

    2017-06-01

    Research involving the public as partners often proves difficult to locate due to the variations in terms used to describe public involvement and the inability of medical databases to index this concept effectively. The aim was to design a search filter to identify literature where patient and public involvement (PPI) was used in health research. A reference standard of 172 PPI papers was formed. The references were divided into a development set and a test set. Search terms were identified from common words, phrases and synonyms in the development set. These terms were combined as a search strategy for medline via OvidSP, which was then tested for sensitivity against the test set. The resultant search filter was then assessed for sensitivity, specificity and precision using a previously published systematic review. The search filter was found to be highly sensitive (98.5%) in initial testing. When tested against results generated by a 'real-life' systematic review, the filter had a specificity of 81%. However, sensitivity dropped to 58%. Adjustments to the population group of terms increased the sensitivity to 73%. The PPI filter designed for medline via OvidSP could aid information specialists and researchers trying to find literature specific to PPI. © 2016 Health Libraries Group.
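    The retrieval metrics reported above can be computed directly from a two-by-two contingency table. The sketch below uses hypothetical counts chosen only to mirror the quoted percentages, since the record does not give the raw numbers.

```python
def filter_performance(true_positives, false_negatives, false_positives, true_negatives):
    """Standard retrieval metrics used to assess a bibliographic search filter."""
    sensitivity = true_positives / (true_positives + false_negatives)  # recall
    specificity = true_negatives / (true_negatives + false_positives)
    precision = true_positives / (true_positives + false_positives)
    return sensitivity, specificity, precision

# Hypothetical counts only -- the record reports percentages, not raw counts.
print(filter_performance(true_positives=58, false_negatives=42,
                         false_positives=190, true_negatives=810))
```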

  14. Enhanced Bank of Kalman Filters Developed and Demonstrated for In-Flight Aircraft Engine Sensor Fault Diagnostics

    Science.gov (United States)

    Kobayashi, Takahisa; Simon, Donald L.

    2005-01-01

    In-flight sensor fault detection and isolation (FDI) is critical to maintaining reliable engine operation during flight. The aircraft engine control system, which computes control commands on the basis of sensor measurements, operates the propulsion systems at the demanded conditions. Any undetected sensor faults, therefore, may cause the control system to drive the engine into an undesirable operating condition. It is critical to detect and isolate failed sensors as soon as possible so that such scenarios can be avoided. A challenging issue in developing reliable sensor FDI systems is to make them robust to changes in engine operating characteristics due to degradation with usage and other faults that can occur during flight. A sensor FDI system that cannot appropriately account for such scenarios may result in false alarms, missed detections, or misclassifications when such faults do occur. To address this issue, an enhanced bank of Kalman filters was developed, and its performance and robustness were demonstrated in a simulation environment. The bank of filters is composed of m + 1 Kalman filters, where m is the number of sensors being used by the control system and, thus, in need of monitoring. Each Kalman filter is designed on the basis of a unique fault hypothesis so that it will be able to maintain its performance if a particular fault scenario, hypothesized by that particular filter, takes place.
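    The isolation logic behind such a bank of m + 1 hypothesis filters can be illustrated with a toy example: the filter built on the correct fault hypothesis keeps small residuals while the others do not. In the sketch below each hypothesis "filter" is reduced to a plain average of the sensors it trusts, so this is only a caricature of the idea, not the enhanced Kalman filter bank itself; the sensor values and noise variance are invented.

```python
import numpy as np

def wssr(residuals, noise_var):
    """Weighted sum of squared residuals used as the fault-detection statistic."""
    return float(np.sum(residuals ** 2 / noise_var))

def isolate_faulty_sensor(measurements, noise_var):
    """Toy isolation logic for a bank of m + 1 hypothesis filters.

    Hypothesis i (i < m): 'sensor i is faulty' -- its filter ignores sensor i.
    Hypothesis m:         'no sensor fault'    -- its filter uses all sensors.
    Here each 'filter' is reduced to averaging the sensors it trusts; a real
    implementation would run a full Kalman filter per hypothesis.
    """
    m = len(measurements)
    scores = []
    for i in range(m + 1):
        trusted = [j for j in range(m) if j != i]   # i == m trusts every sensor
        est = np.mean(measurements[trusted])
        res = measurements[trusted] - est
        scores.append(wssr(res, noise_var))
    best = int(np.argmin(scores))
    return None if best == m else best              # index of the suspect sensor

rng = np.random.default_rng(1)
z = 500.0 + rng.normal(0, 2.0, size=4)   # four redundant sensors around 500
z[2] += 40.0                             # inject a bias fault on sensor 2
print(isolate_faulty_sensor(z, noise_var=4.0))      # -> 2
```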

  15. Development and verification testing of automation and robotics for assembly of space structures

    Science.gov (United States)

    Rhodes, Marvin D.; Will, Ralph W.; Quach, Cuong C.

    1993-01-01

    A program was initiated within the past several years to develop operational procedures for automated assembly of truss structures suitable for large-aperture antennas. The assembly operations require the use of a robotic manipulator and are based on the principle of supervised autonomy to minimize crew resources. A hardware testbed was established to support development and evaluation testing. A brute-force automation approach was used to develop the baseline assembly hardware and software techniques. As the system matured and an operation was proven, upgrades were incorporated and assessed against the baseline test results. This paper summarizes the developmental phases of the program, the results of several assembly tests, the current status, and a series of proposed developments for additional hardware and software control capability. No problems that would preclude automated in-space assembly of truss structures have been encountered. The current system was developed at a breadboard level, and continued development at an enhanced level is warranted.

  16. Laboratory for filter testing

    Energy Technology Data Exchange (ETDEWEB)

    Paluch, W.

    1987-07-01

    Filters used for mine drainage in brown coal surface mines are tested by the Mine Draining Department of Poltegor. Laboratory tests of new types of filters developed by Poltegor are analyzed. Two types of tests are used: tests of scale filter models and tests of experimental units of new filters. The design and operation of the test stands used for testing the mechanical and hydraulic properties of filters for coal mines are described: dimensions, pressure fluctuations, hydraulic equipment. Examples of testing large-diameter filters for brown coal mines are discussed.

  17. Formal Development of a Tool for Automated Modelling and Verification of Relay Interlocking Systems

    DEFF Research Database (Denmark)

    Haxthausen, Anne Elisabeth; Kjær, Andreas A.; Le Bliguet, Marie

    2011-01-01

    This paper describes a tool for formal modelling of relay interlocking systems and explains how it has been stepwise, formally developed using the RAISE method. The developed tool takes the circuit diagrams of a relay interlocking system as input and gives as a result a state transition system modelling

  18. Development and stability studies of sunscreen cream formulations containing three photo-protective filters

    Directory of Open Access Journals (Sweden)

    Slim Smaoui

    2017-02-01

    The present study aimed to formulate and subsequently evaluate a sunscreen cream (W/O/W emulsion) containing three photo-protective filters: benzophenone-3, ethylhexyl methoxycinnamate and titanium dioxide at different percentages. Formulations were stored at 8, 25 and 40 °C for four weeks to investigate their stability. Color, centrifugation, liquefaction, phase separation, pH and Sun Protection Factor (SPF) of the sunscreen cream formulations were determined. The microbiological stability of the creams was also evaluated and the organoleptic quality was monitored for 28 days. Interestingly, the combination of 7% benzophenone-3, 7% ethylhexyl methoxycinnamate and 6% titanium dioxide preserved the physicochemical properties of the product and was efficient against the development of different spoilage microorganisms, as assessed by aerobic plate counts, Pseudomonas aeruginosa, Staphylococcus aureus, and yeast and mold counts. Furthermore, good stability was observed for all formulations throughout the experimental period. The newly formulated sunscreen cream was shown to exhibit a number of promising properties and attributes that might open new opportunities for the development of more efficient, safe, and cost-effective skin-care, cosmetic, and pharmaceutical products.

  19. Development of simple band-spectral pyranometer and quantum meter using photovoltaic cells and bandpass filters

    Energy Technology Data Exchange (ETDEWEB)

    Bilguun, Amarsaikhan, E-mail: bilguun@pes.ee.tut.ac.jp; Nakaso, Tetsushi; Harigai, Toru; Suda, Yoshiyuki; Takikawa, Hirofumi, E-mail: takikawa@ee.tut.ac.jp [Toyohashi University of Technology, 1-1 Habarigaoka, Tempaku, Toyohashi 441-8580 (Japan); Tanoue, Hideto [Kitakyushu National College of Technology, 5-20-1, Kokuraminami, Kitakyushu, Fukuoka 802-0985 (Japan)

    2016-02-01

    In recent years, greenhouse automatic-control, based on the measurement of solar irradiance, has been attracting attention. This control is an effective method for improving crop production. In the agricultural field, it is necessary to measure Photon Flux Density (PFD), which is an important parameter in the promotion of plant growth. In particular, the PFD of Photosynthetically Active Radiation (PAR, 400-700 nm) and Plant Biologically Active Radiation (PBAR, 300-800 nm) have been discussed in agricultural plant science. The commercial quantum meter (QM, PAR meter) can only measure Photosynthetically Photon Flux Density (PPFD) which is the integrated PFD quantity on the PAR wavelength. In this research, a band-spectral pyranometer or quantum meter using PVs with optical bandpass filters for dividing the PBAR wavelength into 100 nm bands (five independent channels) was developed. Before field testing, calibration of the instruments was carried out using a solar simulator. Next, a field test was conducted in three differing weather conditions such as clear, partly cloudy and cloudy skies. As a result, it was found that the response rate of the developed pyranometer was faster by four seconds compared with the response rate of the commercial pyranometer. Moreover, the outputs of each channel in the developed pyranometer were very similar to the integrated outputs of the commercial spectroradiometer. It was confirmed that the solar irradiance could be measured in each band separately using the developed band-spectral pyranometer. It was indicated that the developed band-spectral pyranometer could also be used as a PV band-spectral quantum meter which is obtained by converting the band irradiance into band PFD.
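    The conversion from a band irradiance to a photon flux density mentioned above follows from the mean photon energy of the band. The sketch below applies the standard band-centre approximation; the channel reading is a made-up number and the instrument's actual per-channel calibration factors are not reproduced.

```python
H = 6.62607015e-34   # Planck constant, J s
C = 2.99792458e8     # speed of light, m/s
NA = 6.02214076e23   # Avogadro constant, 1/mol

def band_irradiance_to_pfd(irradiance_w_m2, band_nm):
    """Convert the irradiance of a spectral band (W/m^2) to photon flux density
    (umol m^-2 s^-1), using the band-centre wavelength as the mean photon energy."""
    lam = 0.5 * (band_nm[0] + band_nm[1]) * 1e-9      # band centre in metres
    photons_per_joule = lam / (H * C)
    return irradiance_w_m2 * photons_per_joule / NA * 1e6

# Example: 120 W/m^2 measured in the 500-600 nm channel (hypothetical reading)
print(round(band_irradiance_to_pfd(120.0, (500, 600)), 1))   # umol m^-2 s^-1
```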

  20. Development of simple band-spectral pyranometer and quantum meter using photovoltaic cells and bandpass filters

    Science.gov (United States)

    Bilguun, Amarsaikhan; Nakaso, Tetsushi; Harigai, Toru; Suda, Yoshiyuki; Takikawa, Hirofumi; Tanoue, Hideto

    2016-02-01

    In recent years, greenhouse automatic-control, based on the measurement of solar irradiance, has been attracting attention. This control is an effective method for improving crop production. In the agricultural field, it is necessary to measure Photon Flux Density (PFD), which is an important parameter in the promotion of plant growth. In particular, the PFD of Photosynthetically Active Radiation (PAR, 400-700 nm) and Plant Biologically Active Radiation (PBAR, 300-800 nm) have been discussed in agricultural plant science. The commercial quantum meter (QM, PAR meter) can only measure Photosynthetically Photon Flux Density (PPFD) which is the integrated PFD quantity on the PAR wavelength. In this research, a band-spectral pyranometer or quantum meter using PVs with optical bandpass filters for dividing the PBAR wavelength into 100 nm bands (five independent channels) was developed. Before field testing, calibration of the instruments was carried out using a solar simulator. Next, a field test was conducted in three differing weather conditions such as clear, partly cloudy and cloudy skies. As a result, it was found that the response rate of the developed pyranometer was faster by four seconds compared with the response rate of the commercial pyranometer. Moreover, the outputs of each channel in the developed pyranometer were very similar to the integrated outputs of the commercial spectroradiometer. It was confirmed that the solar irradiance could be measured in each band separately using the developed band-spectral pyranometer. It was indicated that the developed band-spectral pyranometer could also be used as a PV band-spectral quantum meter which is obtained by converting the band irradiance into band PFD.

  1. Development of simple band-spectral pyranometer and quantum meter using photovoltaic cells and bandpass filters

    International Nuclear Information System (INIS)

    Bilguun, Amarsaikhan; Nakaso, Tetsushi; Harigai, Toru; Suda, Yoshiyuki; Takikawa, Hirofumi; Tanoue, Hideto

    2016-01-01

    In recent years, greenhouse automatic-control, based on the measurement of solar irradiance, has been attracting attention. This control is an effective method for improving crop production. In the agricultural field, it is necessary to measure Photon Flux Density (PFD), which is an important parameter in the promotion of plant growth. In particular, the PFD of Photosynthetically Active Radiation (PAR, 400-700 nm) and Plant Biologically Active Radiation (PBAR, 300-800 nm) have been discussed in agricultural plant science. The commercial quantum meter (QM, PAR meter) can only measure Photosynthetically Photon Flux Density (PPFD) which is the integrated PFD quantity on the PAR wavelength. In this research, a band-spectral pyranometer or quantum meter using PVs with optical bandpass filters for dividing the PBAR wavelength into 100 nm bands (five independent channels) was developed. Before field testing, calibration of the instruments was carried out using a solar simulator. Next, a field test was conducted in three differing weather conditions such as clear, partly cloudy and cloudy skies. As a result, it was found that the response rate of the developed pyranometer was faster by four seconds compared with the response rate of the commercial pyranometer. Moreover, the outputs of each channel in the developed pyranometer were very similar to the integrated outputs of the commercial spectroradiometer. It was confirmed that the solar irradiance could be measured in each band separately using the developed band-spectral pyranometer. It was indicated that the developed band-spectral pyranometer could also be used as a PV band-spectral quantum meter which is obtained by converting the band irradiance into band PFD

  2. Development, verification and validation of the fuel channel behaviour computer code FACTAR

    Energy Technology Data Exchange (ETDEWEB)

    Westbye, C J; Brito, A C; MacKinnon, J C; Sills, H E; Langman, V J [Ontario Hydro, Toronto, ON (Canada)

    1996-12-31

    FACTAR (Fuel And Channel Temperature And Response) is a computer code developed to simulate the transient thermal and mechanical behaviour of 37-element or 28-element fuel bundles within a single CANDU fuel channel for moderate loss of coolant accident conditions, including transition and large break LOCAs (loss of coolant accidents) with emergency coolant injection assumed available. FACTAR's predictions of fuel temperature and sheath failure times are used in subsequent assessment of fission product releases and fuel string expansion. This paper discusses the origin and development history of FACTAR, presents the mathematical models and solution technique and the detailed quality assurance procedures that are followed during development, and reports the future development of the code. (author). 27 refs., 3 figs.

  3. Development and Verification of a Fully Coupled Simulator for Offshore Wind Turbines: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Jonkman, J. M.; Buhl, M. L. Jr.

    2007-01-01

    This report outlines the development of an analysis tool capable of analyzing a variety of wind turbine, support platform, and mooring system configurations. The simulation capability was tested by model-to-model comparisons to ensure its correctness.

  4. SU-D-BRC-03: Development and Validation of an Online 2D Dose Verification System for Daily Patient Plan Delivery Accuracy Check

    International Nuclear Information System (INIS)

    Zhao, J; Hu, W; Xing, Y; Wu, X; Li, Y

    2016-01-01

    Purpose: All plan verification systems for particle therapy are designed to do plan verification before treatment. However, the actual dose distributions during patient treatment are not known. This study develops an online 2D dose verification tool to check the daily dose delivery accuracy. Methods: A Siemens particle treatment system with a modulated scanning spot beam is used in our center. In order to do online dose verification, we made a program to reconstruct the delivered 2D dose distributions based on the daily treatment log files and depth dose distributions. In the log files we can get the focus size, position and particle number for each spot. A gamma analysis is used to compare the reconstructed dose distributions with the dose distributions from the TPS to assess the daily dose delivery accuracy. To verify the dose reconstruction algorithm, we compared the reconstructed dose distributions to dose distributions measured using PTW 729XDR ion chamber matrix for 13 real patient plans. Then we analyzed 100 treatment beams (58 carbon and 42 proton) for prostate, lung, ACC, NPC and chordoma patients. Results: For algorithm verification, the gamma passing rate was 97.95% for the 3%/3mm and 92.36% for the 2%/2mm criteria. For patient treatment analysis, the results were 97.7%±1.1% and 91.7%±2.5% for carbon and 89.9%±4.8% and 79.7%±7.7% for proton using 3%/3mm and 2%/2mm criteria, respectively. The reason for the lower passing rate for the proton beam is that the focus size deviations were larger than for the carbon beam. The average focus size deviations were −14.27% and −6.73% for proton and −5.26% and −0.93% for carbon in the x and y direction respectively. Conclusion: The verification software meets our requirements to check for daily dose delivery discrepancies. Such tools can enhance the current treatment plan and delivery verification processes and improve safety of clinical treatments.
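    The gamma analysis used here to compare reconstructed and planned dose distributions can be illustrated with a brute-force 2D implementation of a global 3%/3 mm criterion. The synthetic dose grids below are placeholders, and the code makes no attempt to match the clinical software's search strategy or interpolation.

```python
import numpy as np

def gamma_passing_rate(ref, eval_, spacing_mm, dose_crit=0.03, dist_crit_mm=3.0):
    """Brute-force global 2D gamma analysis (e.g. 3%/3 mm).  `ref` and `eval_`
    are dose grids on the same spacing; the dose criterion is taken relative to
    the reference maximum.  Slow but explicit -- illustrative only."""
    ref = np.asarray(ref, float)
    eval_ = np.asarray(eval_, float)
    dd = dose_crit * ref.max()
    ny, nx = ref.shape
    yy, xx = np.mgrid[0:ny, 0:nx]
    passed = 0
    for iy in range(ny):
        for ix in range(nx):
            dist2 = ((yy - iy) ** 2 + (xx - ix) ** 2) * spacing_mm ** 2
            dose2 = (eval_ - ref[iy, ix]) ** 2
            gamma2 = dist2 / dist_crit_mm ** 2 + dose2 / dd ** 2
            passed += gamma2.min() <= 1.0
    return 100.0 * passed / ref.size

# Tiny synthetic example: evaluated dose shifted by one pixel on a 1 mm grid
ref = np.zeros((20, 20))
ref[8:12, 8:12] = 2.0
ev = np.roll(ref, 1, axis=1)
print(gamma_passing_rate(ref, ev, spacing_mm=1.0))   # 100.0 for a 1 mm shift
```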

  5. SU-D-BRC-03: Development and Validation of an Online 2D Dose Verification System for Daily Patient Plan Delivery Accuracy Check

    Energy Technology Data Exchange (ETDEWEB)

    Zhao, J; Hu, W [Fudan University Shanghai Cancer Center, Shanghai, Shanghai (China); Xing, Y [Fudan univercity shanghai proton and heavy ion center, Shanghai (China); Wu, X [Fudan university shanghai proton and heavy ion center, Shanghai, shagnhai (China); Li, Y [Department of Medical physics at Shanghai Proton and Heavy Ion Center, Shanghai, Shanghai (China)

    2016-06-15

    Purpose: All plan verification systems for particle therapy are designed to do plan verification before treatment. However, the actual dose distributions during patient treatment are not known. This study develops an online 2D dose verification tool to check the daily dose delivery accuracy. Methods: A Siemens particle treatment system with a modulated scanning spot beam is used in our center. In order to do online dose verification, we made a program to reconstruct the delivered 2D dose distributions based on the daily treatment log files and depth dose distributions. In the log files we can get the focus size, position and particle number for each spot. A gamma analysis is used to compare the reconstructed dose distributions with the dose distributions from the TPS to assess the daily dose delivery accuracy. To verify the dose reconstruction algorithm, we compared the reconstructed dose distributions to dose distributions measured using PTW 729XDR ion chamber matrix for 13 real patient plans. Then we analyzed 100 treatment beams (58 carbon and 42 proton) for prostate, lung, ACC, NPC and chordoma patients. Results: For algorithm verification, the gamma passing rate was 97.95% for the 3%/3mm and 92.36% for the 2%/2mm criteria. For patient treatment analysis, the results were 97.7%±1.1% and 91.7%±2.5% for carbon and 89.9%±4.8% and 79.7%±7.7% for proton using 3%/3mm and 2%/2mm criteria, respectively. The reason for the lower passing rate for the proton beam is that the focus size deviations were larger than for the carbon beam. The average focus size deviations were −14.27% and −6.73% for proton and −5.26% and −0.93% for carbon in the x and y direction respectively. Conclusion: The verification software meets our requirements to check for daily dose delivery discrepancies. Such tools can enhance the current treatment plan and delivery verification processes and improve safety of clinical treatments.

  6. SU-E-J-145: Validation of An Analytical Model for in Vivo Range Verification Using GATE Monte Carlo Simulation in Proton Therapy

    International Nuclear Information System (INIS)

    Lee, C; Lin, H; Chao, T; Hsiao, I; Chuang, K

    2015-01-01

    Purpose: The prediction of PET images on the basis of an analytical filtering approach for proton range verification has been successfully developed and validated using FLUKA Monte Carlo (MC) codes and phantom measurements. The purpose of this study is to validate the effectiveness of the analytical filtering model for proton range verification with the GATE/GEANT4 Monte Carlo simulation codes. Methods: In this study, we performed two experiments to validate the β+-isotope distributions predicted by the analytical model against GATE/GEANT4 simulations. The first experiment evaluates the accuracy of predicting β+-yields as a function of irradiated proton energies. In the second experiment, we simulate homogeneous phantoms of different materials irradiated by a mono-energetic pencil-like proton beam. The filtered β+-yield distributions from the analytical model are compared with the MC-simulated β+-yields in the proximal and distal fall-off ranges. Results: The distributions of the filtered and MC-simulated β+-yields were compared under different conditions. First, we found that the analytical filtering can be applied over the whole range of therapeutic energies. Second, the range differences between the filtered and MC-simulated β+-yields in the distal fall-off region are within 1.5 mm for all materials used. The findings validate the usefulness of the analytical filtering model for range verification of proton therapy with GATE Monte Carlo simulations. In addition, there is a larger discrepancy between the filtered prediction and the MC-simulated β+-yields using the GATE code, especially in the proximal region. This discrepancy might result from the absence of well-established theoretical models for predicting the nuclear interactions. Conclusion: Despite the fact that large discrepancies between the MC-simulated and predicted β+-yield distributions were observed, the study proves the effectiveness of the analytical filtering model for proton range verification using

  7. Development of an optimal automatic control law and filter algorithm for steep glideslope capture and glideslope tracking

    Science.gov (United States)

    Halyo, N.

    1976-01-01

    A digital automatic control law to capture a steep glideslope and track the glideslope to a specified altitude is developed for the longitudinal/vertical dynamics of a CTOL aircraft using modern estimation and control techniques. The control law uses a constant gain Kalman filter to process guidance information from the microwave landing system, and acceleration from body mounted accelerometer data. The filter outputs navigation data and wind velocity estimates which are used in controlling the aircraft. Results from a digital simulation of the aircraft dynamics and the control law are presented for various wind conditions.
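    A constant-gain Kalman filter of the kind referred to above simply applies a fixed, precomputed gain in the measurement update. The sketch below shows the mechanics on a made-up two-state altitude/altitude-rate model; the state vector, gain values and measurements are assumptions and do not represent the actual control-law design in the report.

```python
import numpy as np

def constant_gain_filter(x, z, F, H, K):
    """One step of a constant-gain (steady-state) Kalman filter:
    predict with the state-transition model, then correct with a fixed gain."""
    x_pred = F @ x                          # time update
    return x_pred + K @ (z - H @ x_pred)    # measurement update with fixed K

# Minimal 1-D altitude/sink-rate example (dt = 0.1 s); the gain would normally
# come from an offline steady-state Kalman design.
dt = 0.1
F = np.array([[1.0, dt], [0.0, 1.0]])   # state: [altitude, altitude rate]
H = np.array([[1.0, 0.0]])              # measurement: altitude only
K = np.array([[0.6], [1.5]])            # fixed gain (assumed values)
x = np.array([300.0, -3.0])             # initial estimate
for z_alt in (299.5, 299.1, 298.4, 298.0):
    x = constant_gain_filter(x, np.array([z_alt]), F, H, K)
print(x)
```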

  8. Polonium evaporation and adhesion experiments for the development of polonium filter in lead-bismuth cooled reactors

    International Nuclear Information System (INIS)

    Obara, Toru; Koga, Takeru; Miura, Terumitsu; Sekimoto, Hiroshi

    2008-01-01

    Fundamental experiments were performed to determine the adhesion characteristics of polonium on different metals and to develop a filter for polonium evaporated from neutron-irradiated LBE. The results of the first experiments suggested that the adhesion characteristics are almost the same for stainless steel and nickel metal. The results of the preliminary experiments for a polonium filter suggested that a stainless steel mesh with thin wires could effectively collect polonium evaporated from neutron-irradiated LBE. In the experiments, stainless steel wire mesh was used, but from the results of the adhesion experiments, it is expected that the same effect can be obtained with wire mesh made of other kinds of metal. (author)

  9. User input verification and test driven development in the NJOY21 nuclear data processing code

    Energy Technology Data Exchange (ETDEWEB)

    Trainer, Amelia Jo [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Conlin, Jeremy Lloyd [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); McCartney, Austin Paul [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-08-21

    Before physically meaningful data can be used in nuclear simulation codes, the data must be interpreted and manipulated by a nuclear data processing code so as to extract the relevant quantities (e.g., cross sections and angular distributions). Perhaps the most popular and widely trusted of these processing codes is NJOY, which has been developed and improved over the course of 10 major releases since its creation at Los Alamos National Laboratory in the mid-1970s. The current phase of NJOY development is the creation of NJOY21, which will be a vast improvement over its predecessor, NJOY2016. Designed to be fast, intuitive, accessible, and capable of handling both established and modern formats of nuclear data, NJOY21 will address many issues that NJOY users face, while remaining functional for those who prefer the existing format. Although early in its development, NJOY21 already validates user input. By giving rapid and helpful responses to users while they write input files, NJOY21 will prove more intuitive and easier to use than any of its predecessors. Furthermore, during its development, NJOY21 is subject to regular testing, such that its test coverage must strictly increase with the addition of any production code. This thorough testing will allow developers and NJOY users to establish confidence in NJOY21 as it gains functionality. This document discusses the current state of the input checking and testing practices of NJOY21.
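
    The two practices described above, immediate validation of user input and tests written alongside production code, can be sketched as follows. The card name, fields, and allowed ranges are hypothetical examples and do not reflect actual NJOY21 input syntax.

```python
# Minimal sketch: validate a hypothetical input card and test the validator.

def validate_card(card: dict) -> list:
    """Return human-readable error messages for a hypothetical processing-input card."""
    errors = []
    if "material" not in card:
        errors.append("'material' is required (e.g., an ENDF MAT number).")
    elif not isinstance(card["material"], int) or card["material"] <= 0:
        errors.append("'material' must be a positive integer.")
    if any(t <= 0 for t in card.get("temperatures", [])):
        errors.append("all 'temperatures' must be positive (kelvin).")
    return errors


def test_missing_material_is_reported():
    assert any("material" in msg for msg in validate_card({"temperatures": [293.6]}))


def test_valid_card_passes():
    assert validate_card({"material": 9228, "temperatures": [293.6, 600.0]}) == []


if __name__ == "__main__":
    test_missing_material_is_reported()
    test_valid_card_passes()
    print("all input-validation tests passed")
```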

  10. Verification of WIMS-ANL to be used as supporting code for WIMS-CANDU development

    Energy Technology Data Exchange (ETDEWEB)

    Chung, Dai Hai; Kim, Won Young; Park, Joo Hwan

    2007-08-15

    The lattice code WIMS-ANL has been tested in order to assess it for the qualification to be used as a supporting code to aide the WIMS-CANDU development. A series of calculations have been performed to determine lattice physics parameters such as multiplication factors, isotopic number densities and coolant void reactivity. The WIMS-ANL results are compared with the predictions of WIMS-AECL/D4/D5 and PPV (POWDERPUFS-V), and the comparisons indicate that WIMS-ANL can be used not only as a supporting code to aide the WIMS-CANDU development, but also as a starting source for the study of developing detailed model that could delineate the realistic situations as it might occur during LOCA such as the asymmetric flux distribution across lattice cell.

  11. Development and verification of a compact TDC-based data acquisition system for space applications

    Energy Technology Data Exchange (ETDEWEB)

    Losekamm, Martin [Physics Department E18, Technische Universitaet Muenchen (Germany); Institute of Astronautics, Technische Universitaet Muenchen (Germany); Gaisbauer, Dominic; Konorov, Igor; Paul, Stephan; Poeschl, Thomas [Physics Department E18, Technische Universitaet Muenchen (Germany)

    2015-07-01

    The advances of solid-state detectors and in particular those for the detection of photons have made their application in space systems increasingly attractive in recent years. The use of, for example, silicon photomultipliers (SiPM) paired with a suitable scintillating material allows the development of compact and lightweight particle detectors. The Antiproton Flux in Space experiment (AFIS) intends to measure the flux of antiprotons trapped in Earth's magnetosphere aboard a nanosatellite using an active target tracking detector, consisting of plastic scintillating fibers read out by SiPMs. In order to implement a large number of detector channels while adhering to the given space, mass and power constraints, the development of a compact TDC-based data acquisition system was proposed. This talk presents a current prototype featuring 900 channels, real-time multi-channel temperature measurement and bias regulation. Possible alternative applications as well as the next steps in the development are also discussed.

  12. Development, Validation, and Verification of a Self-Assessment Tool to Estimate Agnibala (Digestive Strength).

    Science.gov (United States)

    Singh, Aparna; Singh, Girish; Patwardhan, Kishor; Gehlot, Sangeeta

    2017-01-01

    According to Ayurveda, the traditional system of healthcare of Indian origin, Agni is the factor responsible for digestion and metabolism. Four functional states (Agnibala) of Agni have been recognized: regular, irregular, intense, and weak. The objective of the present study was to develop and validate a self-assessment tool to estimate Agnibala. The developed tool was evaluated for its reliability and validity by administering it to 300 healthy volunteers of either gender belonging to the 18 to 40-year age group. Besides confirming the statistical validity and reliability, the practical utility of the newly developed tool was also evaluated by recording serum lipid parameters of all the volunteers. The results show that the lipid parameters vary significantly according to the status of Agni. The tool, therefore, may be used to screen the normal population to look for possible susceptibility to certain health conditions. © The Author(s) 2016.

  13. Development, implementation, and verification of multicycle depletion perturbation theory for reactor burnup analysis

    Energy Technology Data Exchange (ETDEWEB)

    White, J.R.

    1980-08-01

    A generalized depletion perturbation formulation based on the quasi-static method for solving realistic multicycle reactor depletion problems is developed and implemented within the VENTURE/BURNER modular code system. The present development extends the original formulation derived by M.L. Williams to include nuclide discontinuities such as fuel shuffling and discharge. This theory is first described in detail with particular emphasis given to the similarity of the forward and adjoint quasi-static burnup equations. The specific algorithm and computational methods utilized to solve the adjoint problem within the newly developed DEPTH (Depletion Perturbation Theory) module are then briefly discussed. Finally, the main features and computational accuracy of this new method are illustrated through its application to several representative reactor depletion problems.

  14. Development and verification of ground-based tele-robotics operations concept for Dextre

    Science.gov (United States)

    Aziz, Sarmad

    2013-05-01

    The Special Purpose Dexterous Manipulator (Dextre) is the latest addition to the on-orbit segment of the Mobile Servicing System (MSS), Canada's contribution to the International Space Station (ISS). Launched in March 2008, the advanced two-armed robot is designed to perform various ISS maintenance tasks on robotically compatible elements and on-orbit replaceable units using a wide variety of tools and interfaces. The addition of Dextre has increased the capabilities of the MSS, and has introduced significant complexity to ISS robotics operations. While the initial operations concept for Dextre was based on human-in-the-loop control by the on-orbit astronauts, the complexities of robotic maintenance and the associated costs of training and maintaining the operator skills required for Dextre operations demanded a reexamination of the old concepts. A new approach to ISS robotic maintenance was developed in order to utilize the capabilities of Dextre safely and efficiently, while at the same time reducing the costs of on-orbit operations. This paper will describe the development, validation, and on-orbit demonstration of the operations concept for ground-based tele-robotics control of Dextre. It will describe the evolution of the new concepts from the experience gained from the development and implementation of the ground control capability for the Space Station Remote Manipulator System, Canadarm 2. It will discuss the various technical challenges faced during the development effort, such as requirements for high positioning accuracy, force/moment sensing and accommodation, failure tolerance, complex tool operations, and the novel operational tools and techniques developed to overcome them. The paper will also describe the work performed to validate the new concepts on orbit and will discuss the results and lessons learned from the on-orbit checkout and commissioning of Dextre using the newly developed tele-robotics techniques and capabilities.

  15. Preliminary Study for Development of Welds Integrity Verification Equipment for the Small Bore Piping

    International Nuclear Information System (INIS)

    Choi, Geun Suk; Lee, Jong Eun; Ryu, Jung Hoon; Cho, Kyoung Youn; Sohn, Myoung Sung; Lee, Sanghoon; Sung, Gi Ho; Cho, Hong Seok

    2016-01-01

    Leakage accidents involving small-bore piping have been reported in Korea, and such accidents are expected to increase as nuclear power plants age. Repairing a leaking pipe with a clamping device when an accident occurs is economically beneficial; a clamping device is a fastening device used to hold or secure objects tightly together to prevent movement or separation through the application of inward pressure. However, when an accident occurs, an immediate response is not possible because the maintenance and repair technology is not institutionalized in KEPIC, which results in economic losses. Technology to address this is necessary for the safe operation of nuclear power plants. The purpose of this research is to develop an on-line repair technology for socket-welded pipes and a vibration monitoring system for small-bore piping in nuclear power plants, and it is necessary to institutionalize this technology. Specifically, the detailed studies are as follows: • Development of a weld overlay method for safety-class socket welded connections • Development of mechanical clamping devices for Safety Class 2 and 3 small-bore piping. Fatigue crack testing of socket weld overlays will be performed and a fatigue life evaluation method will be developed in the second year; prototype fabrication of the mechanical clamping device will also be completed. Based on the final goal, the intent is to propose practical evaluation tools and design and fabrication methods for socket welded connection integrity. The result of this study will be the development of a KEPIC code-case-approved technology for an on-line repair system for socket welded connections and the fabrication of a mechanical clamping device

  16. Development of filter module for passive filtration and accident gas release confinement system for NPP

    International Nuclear Information System (INIS)

    Yelizarov, P.G.; Efanov, A.D.; Martynov, P.N.; Masalov, D.P.; Osipov, V.P.; Yagodkin, I.V.

    2005-01-01

    Full text of publication follows: One of the urgent problems of safe NPP operation is cleaning air of radioactive aerosols and volatile iodine compounds under accident conditions. A fundamentally new passive accident gas release confinement system is used as the basis of the designs of new-generation reactor power blocks under beyond-design-basis accident conditions with total loss of power. The basic structural component of the passive filtration system (PFS) is the filter-sorber, heated up to 300 deg. C. The filter-sorber consists of 150 two-step filtering modules connected in parallel. The first step is intended to clean air of radioactive aerosols, the second one to clean air of radioactive iodine and its volatile compounds. The filter-sorber is located at the highest point of the exterior protection shell. Due to natural convection, it provides confinement of radioactive impurities and controlled steam-gas release from the inter-shell space into the atmosphere. The basic design feature is the two-section construction of the PFS filter module, consisting of a coarse-cleaning section and a fine-cleaning section filled with a combination of layered filtering materials based on glass fiber and metal fiber. Tests of the pilot PFS filter module specimen, run under conditions modeling an accident situation, indicated that at a filtration rate of 0.3 cm/s the aerodynamic resistance of the module does not exceed 12 Pa; the filtration effectiveness equals 99.99% for aerosols, no less than 99.9% for radioactive 131I and no less than 99.0% for organic compounds of iodine (CH3 131I); and the dust capacity exceeds 50 g/m². The obtained test results comply with the design requirements imposed on the PFS filter-sorber module. (authors)
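
    The benefit of the two-section design can be shown with simple bookkeeping: the overall penetration is the product of the per-section penetrations, so the overall efficiency is 1 - (1 - e1)(1 - e2). The per-section values below are hypothetical splits chosen only to be consistent with the overall 99.99% figure quoted above, not measured data.

```python
# Minimal sketch: combined efficiency and decontamination factor of a two-stage filter.

def overall_efficiency(stage_efficiencies):
    penetration = 1.0
    for eta in stage_efficiencies:
        penetration *= (1.0 - eta)      # each stage passes a fraction (1 - eta)
    return 1.0 - penetration

coarse, fine = 0.95, 0.998              # hypothetical per-section aerosol efficiencies
eta_total = overall_efficiency([coarse, fine])
print("combined aerosol efficiency: %.4f" % eta_total)          # 0.9999
print("decontamination factor: %.0f" % (1.0 / (1.0 - eta_total)))  # 10000
```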

  17. Development of an adaptive bilateral filter for evaluating color image difference

    Science.gov (United States)

    Wang, Zhaohui; Hardeberg, Jon Yngve

    2012-04-01

    Spatial filtering, which aims to mimic the contrast sensitivity function (CSF) of the human visual system (HVS), has previously been combined with color difference formulae for measuring color image reproduction errors. These spatial filters attenuate imperceptible information in images, unfortunately including high frequency edges, which are believed to be crucial in the process of scene analysis by the HVS. The adaptive bilateral filter represents a novel approach, which avoids the undesirable loss of edge information introduced by CSF-based filtering. The bilateral filter employs two Gaussian smoothing filters in different domains, i.e., spatial domain and intensity domain. We propose a method to decide the parameters, which are designed to be adaptive to the corresponding viewing conditions, and the quantity and homogeneity of information contained in an image. Experiments and discussions are given to support the proposal. A series of perceptual experiments were conducted to evaluate the performance of our approach. The experimental sample images were reproduced with variations in six image attributes: lightness, chroma, hue, compression, noise, and sharpness/blurriness. The Pearson's correlation values between the model-predicted image difference and the observed difference were employed to evaluate the performance, and compare it with that of spatial CIELAB and image appearance model.
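
    The weighting scheme described above can be sketched for a single-channel image: each output pixel is a normalized sum of its neighbours, weighted by a spatial Gaussian and an intensity-difference Gaussian, so smoothing is suppressed across edges. The parameter adaptation to viewing conditions proposed in the record is not reproduced here; sigma_s and sigma_r are simply fixed.

```python
import numpy as np

def bilateral_filter(img, sigma_s=2.0, sigma_r=0.1, radius=4):
    """Naive bilateral filter for a float image in [0, 1] (illustrative, not optimized)."""
    h, w = img.shape
    ys, xs = np.mgrid[-radius:radius + 1, -radius:radius + 1]
    spatial = np.exp(-(xs ** 2 + ys ** 2) / (2.0 * sigma_s ** 2))   # spatial-domain weights
    padded = np.pad(img, radius, mode="edge")
    out = np.empty_like(img)
    for i in range(h):
        for j in range(w):
            window = padded[i:i + 2 * radius + 1, j:j + 2 * radius + 1]
            range_w = np.exp(-((window - img[i, j]) ** 2) / (2.0 * sigma_r ** 2))  # intensity weights
            weights = spatial * range_w
            out[i, j] = np.sum(weights * window) / np.sum(weights)
    return out

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    ramp = np.tile(np.linspace(0.0, 1.0, 64), (64, 1))
    noisy = np.clip(ramp + 0.05 * rng.standard_normal((64, 64)), 0.0, 1.0)
    smoothed = bilateral_filter(noisy)
    print("std before/after:", noisy.std(), smoothed.std())
```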

  18. Development and verification of a high performance multi-group SP3 transport capability in the ARTEMIS core simulator

    International Nuclear Information System (INIS)

    Van Geemert, Rene

    2008-01-01

    For satisfaction of future global customer needs, dedicated efforts are being coordinated internationally and pursued continuously at AREVA NP. The currently ongoing CONVERGENCE project is committed to the development of the ARCADIA® next generation core simulation software package. ARCADIA® will be put to global use by all AREVA NP business regions, for the entire spectrum of core design processes, licensing computations and safety studies. As part of the currently ongoing trend towards more sophisticated neutronics methodologies, an SP3 nodal transport concept has been developed for ARTEMIS, which is the steady-state and transient core simulation part of ARCADIA®. To enable a high computational performance, the SPN calculations are accelerated by applying multi-level coarse mesh re-balancing. In the current implementation, SP3 is about 1.4 times as expensive computationally as SP1 (diffusion). The developed SP3 solution concept is foreseen as the future computational workhorse for many-group 3D pin-by-pin full core computations by ARCADIA®. With the entire numerical workload being highly parallelizable through domain decomposition techniques, associated CPU-time requirements that adhere to the efficiency needs in the nuclear industry can be expected to become feasible in the near future. The accuracy enhancement obtainable by using SP3 instead of SP1 has been verified by a detailed comparison of ARTEMIS 16-group pin-by-pin SPN results with KAERI's DeCart reference results for the 2D pin-by-pin Purdue UO2/MOX benchmark. This article presents the accuracy enhancement verification and quantifies the achieved ARTEMIS-SP3 computational performance for a number of 2D and 3D multi-group and multi-box (up to pin-by-pin) core computations. (authors)

  19. Static and Completion Analysis for Planning Knowledge Base Development and Verification

    Science.gov (United States)

    Chien, Steve A.

    1996-01-01

    A key obstacle hampering fielding of AI planning applications is the considerable expense of developing, verifying, updating, and maintaining the planning knowledge base (KB). Planning systems must be able to compare favorably in terms of software lifecycle costs to other means of automation such as scripts or rule-based expert systems.

  20. Design, Development and Delivery of Active Learning Tools in Software Verification & Validation Education

    Science.gov (United States)

    Acharya, Sushil; Manohar, Priyadarshan Anant; Wu, Peter; Maxim, Bruce; Hansen, Mary

    2018-01-01

    Active learning tools are critical in imparting real world experiences to the students within a classroom environment. This is important because graduates are expected to develop software that meets rigorous quality standards in functional and application domains with little to no training. However, there is a well-recognized need for the…

  1. Development and verification of unstructured adaptive mesh technique with edge compatibility

    International Nuclear Information System (INIS)

    Ito, Kei; Ohshima, Hiroyuki; Kunugi, Tomoaki

    2010-01-01

    In the design study of the large-sized sodium-cooled fast reactor (JSFR), one key issue is suppression of gas entrainment (GE) phenomena at the gas-liquid interface. Therefore, the authors have developed a high-precision CFD algorithm to evaluate the GE phenomena accurately. The CFD algorithm has been developed on unstructured meshes to establish an accurate model of the JSFR system. For two-phase interfacial flow simulations, a high-precision volume-of-fluid algorithm is employed. It was confirmed that the developed CFD algorithm could reproduce the GE phenomena in a simple GE experiment. Recently, the authors have developed an important technique for the simulation of the GE phenomena in JSFR, namely an unstructured adaptive mesh technique which can dynamically apply fine cells to the region where GE occurs in JSFR. In this paper, as a part of that development, a two-dimensional unstructured adaptive mesh technique is discussed. In the two-dimensional adaptive mesh technique, each cell is refined isotropically to reduce distortions of the mesh. In addition, connection cells are formed to eliminate the edge incompatibility between refined and non-refined cells. The two-dimensional unstructured adaptive mesh technique is verified by solving the well-known lid-driven cavity flow problem. As a result, the two-dimensional unstructured adaptive mesh technique succeeds in providing a high-precision solution, even though a poor-quality distorted initial mesh is employed. In addition, the simulation error on the two-dimensional unstructured adaptive mesh is much less than the error on a structured mesh with a larger number of cells. (author)

  2. Developing a software for tracking the memory states of the machines in the LHCb Filter Farm

    CERN Document Server

    Jain, Harshit

    2017-01-01

    The LHCb Event Filter Farm consists of more than 1500 server nodes with a total of roughly 65 TB of operating memory. The memory is crucial for the success of the LHCb experiment, since the proton-proton collision data are temporarily stored on these memory modules. Unfortunately, the aging nodes of the server farm occasionally suffer losses of their memory modules, and the lower the available memory, the lower the achievable performance. Relying on users or administrators to pay attention to this matter is inefficient, so an automated approach is needed. The aim of this project was to develop software to monitor a set of test machines. The software stores the data of the memory sticks in advance in a database, which is used for future reference. It then checks the memory sticks at a later time to find any failures. In the case of such losses, the software looks up the database to find out which memory sticks have been lost and displays all information about those sticks in a log fi...
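
    The monitoring idea can be sketched as a reference inventory kept in a database and diffed against a later readout. The node names, slot labels, and sizes below are hypothetical; a real readout would come from something like dmidecode or the farm's monitoring interface rather than the hard-coded lists used here.

```python
import sqlite3

def store_reference(conn, inventory):
    """Store the known-good (node, slot, size_gb) inventory of memory modules."""
    conn.execute("CREATE TABLE IF NOT EXISTS dimms (node TEXT, slot TEXT, size_gb INTEGER, "
                 "PRIMARY KEY (node, slot))")
    conn.executemany("INSERT OR REPLACE INTO dimms VALUES (?, ?, ?)", inventory)
    conn.commit()

def find_lost_modules(conn, current):
    """Return reference modules that are absent from the current readout."""
    seen = {(node, slot) for (node, slot, _) in current}
    rows = conn.execute("SELECT node, slot, size_gb FROM dimms").fetchall()
    return [row for row in rows if (row[0], row[1]) not in seen]

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    store_reference(conn, [("hlt-node-001", "DIMM_A1", 8), ("hlt-node-001", "DIMM_A2", 8)])
    current = [("hlt-node-001", "DIMM_A1", 8)]          # DIMM_A2 no longer reported
    for node, slot, size in find_lost_modules(conn, current):
        print(f"lost module: {node} {slot} ({size} GB)")
```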

  3. Potential of resource recovery in UASB/trickling filter systems treating domestic sewage in developing countries.

    Science.gov (United States)

    Bressani-Ribeiro, T; Brandt, E M F; Gutierrez, K G; Díaz, C A; Garcia, G B; Chernicharo, C A L

    2017-04-01

    This paper aims to present perspectives for energy (thermal and electric) and nutrient (N and S) recovery in domestic sewage treatment systems comprised of upflow anaerobic sludge blanket (UASB) reactors followed by sponge-bed trickling filters (SBTF) in developing countries. The resource recovery potential was characterized, taking into account 114 countries and a corresponding population of 968.9 million inhabitants living in the tropical world, which were grouped into three desired ranges in terms of cities' size. For each of these clusters, a technological arrangement flow-sheet was proposed, depending on their technical and economic viability from our best experience. Considering the population living in cities over 100, 000 inhabitants, the potential of energy and nutrient recovery via the sewage treatment scheme would be sufficient to generate electricity for approximately 3.2 million residents, as well as thermal energy for drying purposes that could result in a 24% volume reduction of sludge to be transported and disposed of in landfills. The results show that UASB/SBTF systems can play a very important role in the sanitation and environmental sector towards more sustainable sewage treatment plants.

  4. Critical parameters in the production of ceramic pot filters for household water treatment in developing countries.

    Science.gov (United States)

    Soppe, A I A; Heijman, S G J; Gensburger, I; Shantz, A; van Halem, D; Kroesbergen, J; Wubbels, G H; Smeets, P W M H

    2015-06-01

    The need to improve the access to safe water is generally recognized for the benefit of public health in developing countries. This study's objective was to identify critical parameters which are essential for improving the performance of ceramic pot filters (CPFs) as a point-of-use water treatment system. Defining critical production parameters was also relevant to confirm that CPFs with high-flow rates may have the same disinfection capacity as pots with normal flow rates. A pilot unit was built in Cambodia to produce CPFs under controlled and constant conditions. Pots were manufactured from a mixture of clay, laterite and rice husk in a small-scale, gas-fired, temperature-controlled kiln and tested for flow rate, removal efficiency of bacteria and material strength. Flow rate can be increased by increasing pore sizes and by increasing porosity. Pore sizes were increased by using larger rice husk particles and porosity was increased with larger proportions of rice husk in the clay mixture. The main conclusions: larger pore size decreases the removal efficiency of bacteria; higher porosity does not affect the removal efficiency of bacteria, but does influence the strength of pots; flow rates of CPFs can be raised to 10-20 L/hour without a significant decrease in bacterial removal efficiency.

  5. Development of GUI Temperature Monitoring System based on Thin-Film Optical Filter

    Directory of Open Access Journals (Sweden)

    Hilal Adnan Fadhil

    2017-08-01

    Fiber optic sensors have progressed rapidly in recent years because they have many advantages over other types of sensors: freedom from electromagnetic interference, wide bandwidth, economy, and the ability to withstand high temperatures and harsh environments. For these reasons, a fiber-optic temperature sensor utilizing a thin-film optical band-pass filter has been developed. The proposed system has advantages over the fiber Bragg grating sensor, in that it can observe the temperature in a small area and has low transmission loss. Simulation software is used to design a Graphical User Interface (GUI), which allows the user to monitor the condition and status of the current temperature. The monitoring system presented in this paper is divided into three basic sub-systems: retrieval of the real-time data, display of the data, and warning. The GUI system collects and processes the data, displays the current values, and keeps a history for further checking. The temperatures measured by the sensor range from 30°C to 330°C and the corresponding wavelengths lie between 1552.93 nm and 1557.25 nm
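
    The three sub-systems described above (retrieve, display, warn) can be sketched with a simple polling GUI. The read_temperature() function is a random stand-in for the actual thin-film filter interrogation hardware, and the warning threshold is an arbitrary example value.

```python
import random
import tkinter as tk

WARN_LIMIT_C = 300.0
history = []

def read_temperature():
    return random.uniform(30.0, 330.0)   # placeholder for the real sensor readout

def refresh():
    t = read_temperature()
    history.append(t)                     # keep a history for later checking
    status = "WARNING" if t > WARN_LIMIT_C else "OK"
    label.config(text=f"Temperature: {t:6.1f} C   [{status}]",
                 fg="red" if status == "WARNING" else "black")
    root.after(1000, refresh)             # poll again in one second

root = tk.Tk()
root.title("Thin-film filter temperature monitor (sketch)")
label = tk.Label(root, font=("Courier", 14))
label.pack(padx=20, pady=20)
refresh()
root.mainloop()
```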

  6. Simulation of an electrostatic soot-filter with continuous electrochemical conversion during the stages of development

    International Nuclear Information System (INIS)

    Muri, M.

    1996-04-01

    The dissertation describes the simulation of an electrostatic Diesel-Soot-Converter during its stages of development. This simulation is not only necessary for the interpretation of the experimental results, it also provides results for conditions that cannot be obtained experimentally. The Diesel-Soot-Converter consists of a charging electrode, which charges the particles by means of a high voltage, and a ceramic monolith, where the particles are precipitated in the open channels by an electric field that is likewise created by a high voltage. Afterwards the particles are burned by a plasma. The filter function of the Diesel-Soot-Converter was formulated and the efficiency for a vehicle was calculated. In the first part of the calculation the mass flow of a BMW 318tds and a BMW 325tds was determined for a US-FTP75 test cycle and for full load. In the second part the efficiency of different Diesel-Soot-Converter types was calculated for the US-FTP75 test cycle and for full load. The use of the program with other test cycles is possible. The results of the calculations show the best configuration of the Diesel-Soot-Converter for the corresponding vehicle; therefore, with the help of this program, time and money for the production of the ceramic can be saved. (author)

  7. Development of Vision Control Scheme of Extended Kalman filtering for Robot's Position Control

    International Nuclear Information System (INIS)

    Jang, W. S.; Kim, K. S.; Park, S. I.; Kim, K. Y.

    2003-01-01

    It is very important to reduce the computational time for estimating the parameters of a vision control algorithm used for robot position control in real time. Unfortunately, the commonly used batch estimation requires too much computational time because it is an iterative method, which makes it difficult to apply to real-time robot position control. On the other hand, Extended Kalman Filtering (EKF) has many advantages for calculating the parameters of a vision system, in that it is a simple and efficient recursive procedure. Thus, this study develops an EKF algorithm for real-time robot vision control. The vision system model used in this study involves six parameters to account for the inner (orientation, focal length, etc.) and outer (the relative location between robot and camera) parameters of the camera. The EKF is first applied to estimate these parameters, and then, with these estimated parameters, to estimate the robot's joint angles used for the robot's operation. Finally, the practicality of the vision control scheme based on the EKF has been experimentally verified by performing robot position control
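
    The recursive character described above is what the following sketch illustrates: the unknown parameters are treated as a (nearly constant) state, and each new measurement corrects the estimate instead of re-running a batch fit. The measurement model h() below is an illustrative nonlinear stand-in, not the six-parameter camera model of the paper.

```python
import numpy as np

def h(x):
    """Hypothetical nonlinear measurement model mapping parameters to an observation."""
    return np.array([x[0] * np.cos(x[1]), x[0] * np.sin(x[1])])

def jacobian(f, x, eps=1e-6):
    """Numerical Jacobian of f at x (finite differences)."""
    fx = f(x)
    J = np.zeros((fx.size, x.size))
    for i in range(x.size):
        dx = np.zeros_like(x)
        dx[i] = eps
        J[:, i] = (f(x + dx) - fx) / eps
    return J

def ekf_update(x, P, z, R, Q):
    P = P + Q                                   # predict: parameters modeled as constant
    H = jacobian(h, x)                          # linearize the measurement model
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)              # Kalman gain
    x = x + K @ (z - h(x))                      # correct with the new measurement
    P = (np.eye(x.size) - K @ H) @ P
    return x, P

x, P = np.array([1.0, 0.1]), np.eye(2)
R, Q = 0.01 * np.eye(2), 1e-6 * np.eye(2)
true = np.array([2.0, 0.6])
rng = np.random.default_rng(1)
for _ in range(50):
    z = h(true) + 0.1 * rng.standard_normal(2)
    x, P = ekf_update(x, P, z, R, Q)
print("estimated parameters:", x)
```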

  8. Development and benchmark verification of a parallelized Monte Carlo burnup calculation program MCBMPI

    International Nuclear Information System (INIS)

    Yang Wankui; Liu Yaoguang; Ma Jimin; Yang Xin; Wang Guanbo

    2014-01-01

    MCBMPI, a parallelized burnup calculation program, was developed. The program is modularized: the neutron transport calculation module employs the parallelized MCNP5 program MCNP5MPI, and the burnup calculation module employs ORIGEN2, with an MPI parallel zone decomposition strategy. The program system consists only of MCNP5MPI and an interface subroutine. The interface subroutine performs three main functions, i.e., zone decomposition, nuclide transfer and decay, and data exchange with MCNP5MPI. The program was verified with the Pressurized Water Reactor (PWR) cell burnup benchmark; the results showed that the program can be applied to burnup calculations of multiple zones and that the computational efficiency can be improved significantly with the development of computer hardware. (authors)

  9. Development and verification of an efficient spatial neutron kinetics method for reactivity-initiated event analyses

    International Nuclear Information System (INIS)

    Ikeda, Hideaki; Takeda, Toshikazu

    2001-01-01

    A space/time nodal diffusion code based on the nodal expansion method (NEM), EPISODE, was developed in order to evaluate transient neutron behavior in light water reactor cores. The code employs the improved quasi-static (IQS) method for spatial neutron kinetics, and the neutron flux distribution is obtained numerically by solving the neutron diffusion equation with a nonlinear iteration scheme to achieve fast computation. A predictor-corrector (PC) method developed in the present study enables a coarser time mesh to be applied to the transient spatial neutron calculation than is applicable in the conventional IQS model, which further improves computational efficiency. Its computational advantage was demonstrated by applying it to numerical benchmark problems that simulate reactivity-initiated events, showing reductions in computational time of up to a factor of three compared with the conventional IQS. A thermal-hydraulics model was also incorporated in EPISODE, and the capability for realistic reactivity event analyses was verified using the SPERT-III/E-Core experimental data. (author)

  10. Development and verification of remote research environment based on 'Fusion research grid'

    International Nuclear Information System (INIS)

    Iba, Katsuyuki; Ozeki, Takahisa; Totsuka, Toshiyuki; Suzuki, Yoshio; Oshima, Takayuki; Sakata, Shinya; Sato, Minoru; Suzuki, Mitsuhiro; Hamamatsu, Kiyotaka; Kiyono, Kimihiro

    2008-01-01

    'Fusion research grid' is a concept that unites scientists and lets them collaborate effectively despite differences in time zone and location in nuclear fusion research. Fundamental technologies of the 'Fusion research grid' have been developed at JAEA in the VizGrid project under the e-Japan project at the Ministry of Education, Culture, Sports, Science and Technology (MEXT). We are conscious of the need to create new systems that assist researchers with their research activities, because remote collaborations have been increasing in international projects. Therefore we have developed prototype remote research environments for experiments, diagnostics, analyses and communications based on the 'Fusion research grid'. All users can access these environments from anywhere because the 'Fusion research grid' does not require a closed network like Super SINET to maintain security. The prototype systems were verified in experiments at JT-60U and their availability was confirmed

  11. Chemical/Biological Agent Resistance Test (CBART) Test Fixture System Verification and Analytical Monitioring System Development

    Science.gov (United States)

    2011-03-15

    Progress was made towards proportional integral derivative (PID) tuning. The CBART NRT analytical system was developed, moved, replumbed, and ... Abbreviations: MFC, mass flow controller; MS, mass spectrometer; MSD, mass selective detector; NRT, near real-time; PID, proportional integral derivative
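
    The PID tuning mentioned in the fragment above can be illustrated with a minimal discrete controller. The gains and the first-order plant below are purely illustrative and unrelated to the actual CBART hardware.

```python
# Minimal sketch of a discrete proportional-integral-derivative (PID) controller.

def make_pid(kp, ki, kd, dt):
    state = {"integral": 0.0, "prev_error": 0.0}
    def step(setpoint, measurement):
        error = setpoint - measurement
        state["integral"] += error * dt
        derivative = (error - state["prev_error"]) / dt
        state["prev_error"] = error
        return kp * error + ki * state["integral"] + kd * derivative
    return step

if __name__ == "__main__":
    dt, temp = 0.1, 20.0
    pid = make_pid(kp=2.0, ki=0.5, kd=0.1, dt=dt)
    for _ in range(200):
        heater = pid(setpoint=50.0, measurement=temp)
        temp += dt * (heater - 0.1 * (temp - 20.0))   # crude first-order plant response
    print("temperature after 20 s: %.2f" % temp)
```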

  12. New development in nondestructive measurement and verification of irradiated LWR fuels

    International Nuclear Information System (INIS)

    Lee, D.M.; Phillips, J.R.; Halbig, J.K.; Hsue, S.T.; Lindquist, L.O.; Ortega, E.M.; Caine, J.C.; Swansen, J.; Kaieda, K.; Dermendjiev, E.

    1979-01-01

    Nondestructive techniques for characterizing irradiated LWR fuel assemblies are discussed. This includes detection systems that measure the axial activity profile, neutron yield and gamma yield. A multi-element profile monitor has been developed that offers a significant improvement in speed and complexity over existing mechanical scanning systems. New portable detectors and electronics, applicable to safeguard inspection, are presented and results of gamma-ray and neutron measurements at commercial reactor facilities are given

  13. Development and verification for review plan of emergency action level (EAL)

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2013-08-15

    Emergency action levels (EALs) are used as the trigger in order to implement the precautionary protective actions at the nuclear emergency. In this study the framework for applying the EAL in Japan and the process for developing the review plan, such as procedures to review the basis of EAL submitted by the licensee, have been investigated based on the survey for EAL review executed in the United States. In addition, issues to reflect the EAL framework in enhancement of the local government emergency planning and emergency response support system have been investigated. (author)

  14. Development and verification of a coupled code system RETRAN-MASTER-TORC

    International Nuclear Information System (INIS)

    Cho, J.Y.; Song, J.S.; Joo, H.G.; Zee, S.Q.

    2004-01-01

    Recently, coupled thermal-hydraulics (T-H) and three-dimensional kinetics codes have been widely used for best-estimate simulations such as main steam line break (MSLB) and locked rotor problems. This work develops and verifies one such code by coupling the system T-H code RETRAN, the 3-D kinetics code MASTER and the sub-channel analysis code TORC. The MASTER code has already been applied to such simulations after coupling with the MARS or RETRAN-3D multi-dimensional system T-H codes. The MASTER code contains the sub-channel analysis code COBRA-III C/P, and the coupled systems MARS-MASTER-COBRA and RETRAN-MASTER-COBRA have already been developed and verified. Building on these previous studies, a new coupled system, RETRAN-MASTER-TORC, is developed and verified as the standard best-estimate simulation code package in Korea. The TORC code has already been applied to the thermal-hydraulic design of several ABB/CE type plants and the Korean Standard Nuclear Power Plants (KSNP), which justifies the choice of TORC rather than COBRA. Because the coupling between the RETRAN and MASTER codes is already established and verified, this work reduces to coupling the TORC sub-channel T-H code with the MASTER neutronics code. The TORC code is a standalone code that solves the T-H equations for a given core problem by reading an input file and finally printing the converged solutions. In the coupled system, however, because TORC receives the pin power distributions from the neutronics code MASTER and transfers the T-H results back to MASTER iteratively, TORC needs to be controlled by the MASTER code and does not need to solve the given problem completely at each iteration step. For this reason, the coupling of the TORC code with the MASTER code requires several modifications in the I/O treatment, flow iteration and calculation logic. The next section of this paper describes the modifications in the TORC code. The TORC control logic of the MASTER code is then followed. The
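
    The iterative coupling logic described above can be sketched as a simple fixed-point loop: the neutronics side hands a pin power distribution to the thermal-hydraulics side, receives feedback, and iterates until the exchanged field stops changing. The two "solvers" below are crude stand-ins with made-up feedback coefficients, not MASTER or TORC.

```python
import numpy as np

N = 10  # number of axial nodes in this toy problem

def neutronics_solve(fuel_temp):
    """Stand-in neutronics step: power tilts away from hotter nodes (Doppler-like feedback)."""
    power = np.exp(-1e-3 * (fuel_temp - fuel_temp.mean()))
    return power / power.mean()

def thermal_hydraulics_solve(power):
    """Stand-in sub-channel step: fuel temperature rises with local power."""
    return 600.0 + 300.0 * power

power = np.ones(N)
for it in range(50):
    fuel_temp = thermal_hydraulics_solve(power)      # T-H step driven by current pin powers
    new_power = neutronics_solve(fuel_temp)          # neutronics step with updated feedback
    if np.max(np.abs(new_power - power)) < 1e-6:     # convergence check on the exchanged field
        power = new_power
        break
    power = new_power
print(f"converged after {it + 1} outer iterations")
```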

  15. Recent developments in the design and verification of crystalline polarization scramblers for space applications

    Science.gov (United States)

    Dubroca, Guilhem; Richert, Michaël.; Loiseaux, Didier; Caron, Jérôme; Bézy, Jean-Loup

    2015-09-01

    To increase the accuracy of earth-observation spectro-imagers, it is necessary to achieve high levels of depolarization of the incoming beam. The preferred device in space instrument is the so-called polarization scrambler. It is made of birefringent crystal wedges arranged in a single or dual Babinet. Today, with required radiometric accuracies of the order of 0.1%, it is necessary to develop tools to find optimal and low sensitivity solutions quickly and to measure the performances with a high level of accuracy.

  16. Swarm Verification

    Science.gov (United States)

    Holzmann, Gerard J.; Joshi, Rajeev; Groce, Alex

    2008-01-01

    Reportedly, supercomputer designer Seymour Cray once said that he would sooner use two strong oxen to plow a field than a thousand chickens. Although this is undoubtedly wise when it comes to plowing a field, it is not so clear for other types of tasks. Model checking problems are of the proverbial "search the needle in a haystack" type. Such problems can often be parallelized easily. Alas, none of the usual divide and conquer methods can be used to parallelize the working of a model checker. Given that it has become easier than ever to gain access to large numbers of computers to perform even routine tasks it is becoming more and more attractive to find alternate ways to use these resources to speed up model checking tasks. This paper describes one such method, called swarm verification.

  17. Verification tests for CANDU advanced fuel -Development of the advanced CANDU technology-

    International Nuclear Information System (INIS)

    Chung, Jang Hwan; Suk, Ho Cheon; Jeong, Moon Ki; Park, Joo Hwan; Jeong, Heung Joon; Jeon, Ji Soo; Kim, Bok Deuk

    1994-07-01

    This project is underway in cooperation with AECL to develop the CANDU advanced fuel bundle (so-called CANFLEX) which can enhance reactor safety and fuel economy in comparison with the current CANDU fuel and which can be used with natural uranium, slightly enriched uranium and other advanced fuel cycles. According to the final schedule, the advanced fuel will be verified by carrying out a large-scale demonstration of bundle irradiation in a commercial CANDU reactor, and consequently will be used in the existing and future CANDU reactors in Korea. The research activities during this year were as follows. Out-of-pile hydraulic tests for the prototype CANFLEX bundle were conducted in the CANDU hot test loop at KAERI. Thermal-hydraulic analysis with the assumption of CANFLEX-NU fuel loaded in Wolsong-1 was performed using a thermal-hydraulic code, and the thermal margin and T/H compatibility of the CANFLEX bundle with the existing fuel for the CANDU-6 reactor have been evaluated. (Author)

  18. The development and verification of a highly accurate collision prediction model for automated noncoplanar plan delivery

    International Nuclear Information System (INIS)

    Yu, Victoria Y.; Tran, Angelia; Nguyen, Dan; Cao, Minsong; Ruan, Dan; Low, Daniel A.; Sheng, Ke

    2015-01-01

    Purpose: Significant dosimetric benefits had been previously demonstrated in highly noncoplanar treatment plans. In this study, the authors developed and verified an individualized collision model for the purpose of delivering highly noncoplanar radiotherapy and tested the feasibility of total delivery automation with Varian TrueBeam developer mode. Methods: A hand-held 3D scanner was used to capture the surfaces of an anthropomorphic phantom and a human subject, which were positioned with a computer-aided design model of a TrueBeam machine to create a detailed virtual geometrical collision model. The collision model included gantry, collimator, and couch motion degrees of freedom. The accuracy of the 3D scanner was validated by scanning a rigid cubical phantom with known dimensions. The collision model was then validated by generating 300 linear accelerator orientations corresponding to 300 gantry-to-couch and gantry-to-phantom distances, and comparing the corresponding distance measurements to their corresponding models. The linear accelerator orientations reflected uniformly sampled noncoplanar beam angles to the head, lung, and prostate. The distance discrepancies between measurements on the physical and virtual systems were used to estimate treatment-site-specific safety buffer distances with 0.1%, 0.01%, and 0.001% probability of collision between the gantry and couch or phantom. Plans containing 20 noncoplanar beams to the brain, lung, and prostate optimized via an in-house noncoplanar radiotherapy platform were converted into XML script for automated delivery and the entire delivery was recorded and timed to demonstrate the feasibility of automated delivery. Results: The 3D scanner measured the dimension of the 14 cm cubic phantom within 0.5 mm. The maximal absolute discrepancy between machine and model measurements for gantry-to-couch and gantry-to-phantom was 0.95 and 2.97 cm, respectively. The reduced accuracy of gantry-to-phantom measurements was

  19. Development and Performance Verification of Fiber Optic Temperature Sensors in High Temperature Engine Environments

    Science.gov (United States)

    Adamovsky, Grigory; Mackey, Jeffrey R.; Kren, Lawrence A.; Floyd, Bertram M.; Elam, Kristie A.; Martinez, Martel

    2014-01-01

    A High Temperature Fiber Optic Sensor (HTFOS) has been developed at NASA Glenn Research Center for aircraft engine applications. After fabrication and preliminary in-house performance evaluation, the HTFOS was tested in an engine environment at NASA Armstrong Flight Research Center. The engine tests enabled the performance of the HTFOS in real engine environments to be evaluated, along with the ability of the sensor to respond to changes in the engine's operating condition. Data were collected prior to, during, and after each test in order to observe the change in temperature from ambient to each of the various test point levels. An adequate amount of data was collected and analyzed to satisfy the research team that the HTFOS operated properly while the engine was running. Temperature measurements made by the HTFOS while the engine was running agreed with those anticipated.

  20. The development and verification of a highly accurate collision prediction model for automated noncoplanar plan delivery

    Energy Technology Data Exchange (ETDEWEB)

    Yu, Victoria Y.; Tran, Angelia; Nguyen, Dan; Cao, Minsong; Ruan, Dan; Low, Daniel A.; Sheng, Ke, E-mail: ksheng@mednet.ucla.edu [Department of Radiation Oncology, David Geffen School of Medicine, University of California Los Angeles, Los Angeles, California 90024 (United States)

    2015-11-15

    Purpose: Significant dosimetric benefits had been previously demonstrated in highly noncoplanar treatment plans. In this study, the authors developed and verified an individualized collision model for the purpose of delivering highly noncoplanar radiotherapy and tested the feasibility of total delivery automation with Varian TrueBeam developer mode. Methods: A hand-held 3D scanner was used to capture the surfaces of an anthropomorphic phantom and a human subject, which were positioned with a computer-aided design model of a TrueBeam machine to create a detailed virtual geometrical collision model. The collision model included gantry, collimator, and couch motion degrees of freedom. The accuracy of the 3D scanner was validated by scanning a rigid cubical phantom with known dimensions. The collision model was then validated by generating 300 linear accelerator orientations corresponding to 300 gantry-to-couch and gantry-to-phantom distances, and comparing the corresponding distance measurements to their corresponding models. The linear accelerator orientations reflected uniformly sampled noncoplanar beam angles to the head, lung, and prostate. The distance discrepancies between measurements on the physical and virtual systems were used to estimate treatment-site-specific safety buffer distances with 0.1%, 0.01%, and 0.001% probability of collision between the gantry and couch or phantom. Plans containing 20 noncoplanar beams to the brain, lung, and prostate optimized via an in-house noncoplanar radiotherapy platform were converted into XML script for automated delivery and the entire delivery was recorded and timed to demonstrate the feasibility of automated delivery. Results: The 3D scanner measured the dimension of the 14 cm cubic phantom within 0.5 mm. The maximal absolute discrepancy between machine and model measurements for gantry-to-couch and gantry-to-phantom was 0.95 and 2.97 cm, respectively. The reduced accuracy of gantry-to-phantom measurements was

  1. Development of novel EMAT-ECT multi-sensor and verification of its feasibility

    International Nuclear Information System (INIS)

    Suzuki, Kenichiro; Uchimoto, Tetsuya; Takagi, Toshiyuki; Sato, Takeshi; Guy, Philippe; Casse, Amelie

    2006-01-01

    In this study, we propose a novel EMAT-ECT multi-sensor aimed at advanced structural health monitoring. For this purpose, a prototype EMAT-ECT multi-sensor was developed and its functions as both an ECT and an EMAT probe were evaluated. Experimental results of pulsed ECT using the EMAT-ECT multi-sensor showed that the proposed sensor is capable of detecting and sizing flaws. Experimental results of EMAT evaluation using the EMAT-ECT multi-sensor showed that ultrasonic waves were transmitted by the multi-sensor and flaw echoes were observed. These results imply that the EMAT-ECT multi-sensor can be used for both pulsed ECT and EMAT. (author)

  2. The development and verification of a highly accurate collision prediction model for automated noncoplanar plan delivery.

    Science.gov (United States)

    Yu, Victoria Y; Tran, Angelia; Nguyen, Dan; Cao, Minsong; Ruan, Dan; Low, Daniel A; Sheng, Ke

    2015-11-01

    Significant dosimetric benefits had been previously demonstrated in highly noncoplanar treatment plans. In this study, the authors developed and verified an individualized collision model for the purpose of delivering highly noncoplanar radiotherapy and tested the feasibility of total delivery automation with Varian TrueBeam developer mode. A hand-held 3D scanner was used to capture the surfaces of an anthropomorphic phantom and a human subject, which were positioned with a computer-aided design model of a TrueBeam machine to create a detailed virtual geometrical collision model. The collision model included gantry, collimator, and couch motion degrees of freedom. The accuracy of the 3D scanner was validated by scanning a rigid cubical phantom with known dimensions. The collision model was then validated by generating 300 linear accelerator orientations corresponding to 300 gantry-to-couch and gantry-to-phantom distances, and comparing the corresponding distance measurements to their corresponding models. The linear accelerator orientations reflected uniformly sampled noncoplanar beam angles to the head, lung, and prostate. The distance discrepancies between measurements on the physical and virtual systems were used to estimate treatment-site-specific safety buffer distances with 0.1%, 0.01%, and 0.001% probability of collision between the gantry and couch or phantom. Plans containing 20 noncoplanar beams to the brain, lung, and prostate optimized via an in-house noncoplanar radiotherapy platform were converted into XML script for automated delivery and the entire delivery was recorded and timed to demonstrate the feasibility of automated delivery. The 3D scanner measured the dimension of the 14 cm cubic phantom within 0.5 mm. The maximal absolute discrepancy between machine and model measurements for gantry-to-couch and gantry-to-phantom was 0.95 and 2.97 cm, respectively. The reduced accuracy of gantry-to-phantom measurements was attributed to phantom setup
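
    The buffer-distance test underlying the collision model described in the records above can be sketched as a clearance check: a candidate machine orientation is kept only if the minimum distance between gantry-head surface points and couch/phantom surface points exceeds a site-specific safety buffer. The random point clouds and the buffer value below are placeholders, not the scanned geometry or the published buffer distances.

```python
import numpy as np

def min_clearance(gantry_points, obstacle_points):
    """Smallest pairwise distance (cm) between two 3D point clouds."""
    diff = gantry_points[:, None, :] - obstacle_points[None, :, :]
    return np.sqrt((diff ** 2).sum(axis=-1)).min()

def beam_is_deliverable(gantry_points, obstacle_points, buffer_cm=3.0):
    """Accept the orientation only if clearance exceeds the safety buffer."""
    return min_clearance(gantry_points, obstacle_points) > buffer_cm

rng = np.random.default_rng(0)
gantry = rng.uniform(40.0, 60.0, size=(200, 3))      # stand-in gantry-head surface points
phantom = rng.uniform(-10.0, 10.0, size=(500, 3))    # stand-in couch/phantom surface points
print("deliverable:", beam_is_deliverable(gantry, phantom))
```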

  3. Development and verification of child observation sheet for 5-year-old children.

    Science.gov (United States)

    Fujimoto, Keiko; Nagai, Toshisaburo; Okazaki, Shin; Kawajiri, Mie; Tomiwa, Kiyotaka

    2014-02-01

    The aim of the study was to develop a newly devised child observation sheet (COS-5) as a scoring sheet, based on the Childhood Autism Rating Scale (CARS), for use in the developmental evaluation of 5-year-old children, especially focusing on children with autistic features, and to verify its validity. Seventy-six children were studied. The children were recruited among participants of the Japan Children's Cohort Study, a research program implemented by the Research Institute of Science and Technology for Society (RISTEX) from 2004 to 2009. The developmental evaluation procedure was performed by doctors, clinical psychologists, and public health nurses. The COS-5 was also partly based on the Kyoto Scale of Psychological Development 2001 (Kyoto Scale 2001). Further, the Developmental Disorders Screening Questionnaire for 5-Year-Olds, the PDD-Autism Society Japan Rating Scale (PARS), doctor interview questions and neurological examination for 5-year-old children, and the Draw-a-Man Test (DAM) were used as evaluation scales. Eighteen (25.4%) children were rated as Suspected, including Suspected PDD, Suspected ADHD and Suspected MR. The COS-5 was suggested to be valid, with favorable reliability (α=0.89) and correlation with other evaluation scales. The COS-5 may be useful, with the following advantages: it can be performed within a shorter time frame; it facilitates the maintenance of observation quality; it facilitates sharing information with other professions; and it reliably identifies the autistic features of 5-year-old children. In order to verify its wider applications, including the screening of infants (18 months to 3 years old) by adjusting the items for younger ages, additional study is needed. Copyright © 2013 The Japanese Society of Child Neurology. Published by Elsevier B.V. All rights reserved.

  4. Development of the SEAtrace{trademark} barrier verification and validation technology. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Dunn, S.D.; Lowry, W.; Walsh, R.; Rao, D.V. [Science and Engineering Associates, Santa Fe, NM (United States); Williams, C. [Sandia National Labs., Albuquerque, NM (United States). Underground Storage Technology Dept.

    1998-08-01

    In-situ barrier emplacement techniques and materials for the containment of high-risk contaminants in soils are currently being developed by the Department of Energy (DOE). Because of their relatively high cost, the barriers are intended to be used in cases where the risk is too great to remove the contaminants, the contaminants are too difficult to remove with current technologies, or the potential movement of the contaminants to the water table is so high that immediate action needs to be taken to reduce health risks. Assessing the integrity of the barrier once it is emplaced, and during its anticipated life, is a very difficult but necessary requirement. Science and Engineering Associates, Inc., (SEA) and Sandia National Laboratories (SNL) have developed a quantitative subsurface barrier assessment system using gaseous tracers in support of the Subsurface Contaminants Focus Area barrier technology program. Called SEAtrace{trademark}, this system integrates an autonomous, multi-point soil vapor sampling and analysis system with a global optimization modeling methodology to locate and size barrier breaches in real time. The methodology for the global optimization code was completed and a prototype code written using simplifying assumptions. Preliminary modeling work to validate the code assumptions were performed using the T2VOC numerical code. A multi-point field sampling system was built to take soil gas samples and analyze for tracer gas concentration. The tracer concentration histories were used in the global optimization code to locate and size barrier breaches. SEAtrace{trademark} was consistently able to detect and locate leaks, even under very adverse conditions. The system was able to locate the leak to within 0.75 m of the actual value, and was able to determine the size of the leak to within 0.15 m.
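
    The inverse problem described above can be sketched as a global search: given tracer concentrations observed at known sampling points, candidate leak locations and strengths are scanned for the combination that best reproduces the observations. The simple 1/r spreading law used as a forward model below is illustrative only, not the SEAtrace transport model, and the sensor layout and leak values are made up.

```python
import numpy as np

def forward_model(leak_xy, strength, sensors):
    """Hypothetical forward model: concentration falls off with distance from the leak."""
    r = np.linalg.norm(sensors - leak_xy, axis=1)
    return strength / np.maximum(r, 0.1)

sensors = np.array([[0.0, 0.0], [4.0, 0.0], [0.0, 4.0], [4.0, 4.0], [2.0, 5.0]])
true_leak, true_strength = np.array([2.5, 1.5]), 3.0
observed = forward_model(true_leak, true_strength, sensors)   # synthetic "measurements"

# Brute-force global search over leak position and strength (coarse grid).
best = (np.inf, None, None)
for x in np.linspace(0.0, 4.0, 41):
    for y in np.linspace(0.0, 4.0, 41):
        for s in np.linspace(0.5, 5.0, 46):
            misfit = np.sum((forward_model(np.array([x, y]), s, sensors) - observed) ** 2)
            if misfit < best[0]:
                best = (misfit, (x, y), s)
print("estimated leak location:", best[1], "strength:", round(best[2], 2))
```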

  5. Development of the SEAtrace trademark barrier verification and validation technology. Final report

    International Nuclear Information System (INIS)

    Dunn, S.D.; Lowry, W.; Walsh, R.; Rao, D.V.; Williams, C.

    1998-08-01

    In-situ barrier emplacement techniques and materials for the containment of high-risk contaminants in soils are currently being developed by the Department of Energy (DOE). Because of their relatively high cost, the barriers are intended to be used in cases where the risk is too great to remove the contaminants, the contaminants are too difficult to remove with current technologies, or the potential movement of the contaminants to the water table is so high that immediate action needs to be taken to reduce health risks. Assessing the integrity of the barrier once it is emplaced, and during its anticipated life, is a very difficult but necessary requirement. Science and Engineering Associates, Inc., (SEA) and Sandia National Laboratories (SNL) have developed a quantitative subsurface barrier assessment system using gaseous tracers in support of the Subsurface Contaminants Focus Area barrier technology program. Called SEAtrace trademark, this system integrates an autonomous, multi-point soil vapor sampling and analysis system with a global optimization modeling methodology to locate and size barrier breaches in real time. The methodology for the global optimization code was completed and a prototype code written using simplifying assumptions. Preliminary modeling work to validate the code assumptions were performed using the T2VOC numerical code. A multi-point field sampling system was built to take soil gas samples and analyze for tracer gas concentration. The tracer concentration histories were used in the global optimization code to locate and size barrier breaches. SEAtrace trademark was consistently able to detect and locate leaks, even under very adverse conditions. The system was able to locate the leak to within 0.75 m of the actual value, and was able to determine the size of the leak to within 0.15 m

  6. Simulating the Water Use of Thermoelectric Power Plants in the United States: Model Development and Verification

    Science.gov (United States)

    Betrie, G.; Yan, E.; Clark, C.

    2016-12-01

    Thermoelectric power plants are the second-largest users of freshwater after the agriculture sector. However, there is a scarcity of information characterizing the freshwater use of these plants in the United States, which can be attributed to the lack of models and data required to conduct analyses and gain insights. Competition for freshwater among sectors will increase in the future as the amount of available freshwater becomes limited due to climate change and population growth. A model that makes use of less data is urgently needed to conduct analyses and identify adaptation strategies. The objectives of this study are to develop a model and simulate the water use of thermoelectric power plants in the United States. The developed model has heat-balance, climate, cooling-system, and optimization modules. It computes the amount of heat rejected to the environment, estimates the quantity of heat exchanged with the environment through latent and sensible heat, and computes the amount of water required per unit generation of electricity. To verify the model, we simulated a total of 876 fossil-fired, nuclear and gas-turbine power plants with different cooling systems (CS) using 2010-2014 data obtained from the Energy Information Administration. The CS include once-through with cooling pond, once-through without cooling pond, recirculating with induced draft, and recirculating with natural draft. The results show that the model reproduced the observed water use per unit generation of electricity for most of the power plants. It is also noted that the model slightly overestimates the water use during the summer period, when the input water temperatures are higher. We are investigating the possible reasons for the overestimation and will address them in future work. The model could be used individually or coupled to regional models to analyze various adaptation strategies and improve the water use efficiency of thermoelectric power plants.
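
    The heat-balance bookkeeping described above can be sketched in a few lines: the heat rejected to the cooling system is the fuel heat input minus in-house losses and the electrical output, a fraction of the rejected heat leaves as latent heat (evaporation), and the evaporated mass gives the consumptive water use per MWh. All numerical values below (heat rate, latent fraction, loss fraction) are illustrative, not parameters of the published model.

```python
LATENT_HEAT_MJ_PER_KG = 2.26          # latent heat of vaporization of water
WATER_DENSITY_KG_PER_M3 = 1000.0

def water_consumption_m3_per_mwh(heat_rate_mj_per_mwh, latent_fraction=0.85,
                                 inhouse_loss_fraction=0.12):
    """Consumptive water use (m3/MWh) for a recirculating (wet-tower) plant, illustrative only."""
    electric_mj = 3600.0                                               # 1 MWh expressed in MJ
    heat_rejected = heat_rate_mj_per_mwh * (1.0 - inhouse_loss_fraction) - electric_mj
    evaporated_kg = latent_fraction * heat_rejected / LATENT_HEAT_MJ_PER_KG
    return evaporated_kg / WATER_DENSITY_KG_PER_M3

# Example: a fossil plant with roughly 34% net efficiency (heat rate ~10.6 GJ/MWh).
print("water use: %.2f m3/MWh" % water_consumption_m3_per_mwh(10600.0))
```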

  7. Improved verification methods for safeguards verifications at enrichment plants

    International Nuclear Information System (INIS)

    Lebrun, A.; Kane, S. C.; Bourva, L.; Poirier, S.; Loghin, N. E.; Langlands, D.

    2009-01-01

    The International Atomic Energy Agency (IAEA) has initiated a coordinated research and development programme to improve its verification methods and equipment applicable to enrichment plants. The programme entails several individual projects to meet the objectives of the IAEA Safeguards Model Approach for Gas Centrifuge Enrichment Plants updated in 2006. Upgrades of verification methods to confirm the absence of HEU (highly enriched uranium) production have been initiated and, in particular, the Cascade Header Enrichment Monitor (CHEM) has been redesigned to reduce its weight and incorporate an electrically cooled germanium detector. Such detectors are also introduced to improve the attended verification of UF6 cylinders for the verification of the material balance. Data sharing of authenticated operator weighing systems such as accountancy scales and process load cells is also investigated as a cost-efficient and effective safeguards measure combined with unannounced inspections, surveillance and non-destructive assay (NDA) measurement. (authors)

  8. Improved verification methods for safeguards verifications at enrichment plants

    Energy Technology Data Exchange (ETDEWEB)

    Lebrun, A.; Kane, S. C.; Bourva, L.; Poirier, S.; Loghin, N. E.; Langlands, D. [Department of Safeguards, International Atomic Energy Agency, Wagramer Strasse 5, A1400 Vienna (Austria)

    2009-07-01

    The International Atomic Energy Agency (IAEA) has initiated a coordinated research and development programme to improve its verification methods and equipment applicable to enrichment plants. The programme entails several individual projects to meet the objectives of the IAEA Safeguards Model Approach for Gas Centrifuge Enrichment Plants updated in 2006. Upgrades of verification methods to confirm the absence of HEU (highly enriched uranium) production have been initiated and, in particular, the Cascade Header Enrichment Monitor (CHEM) has been redesigned to reduce its weight and incorporate an electrically cooled germanium detector. Such detectors are also introduced to improve the attended verification of UF6 cylinders for the verification of the material balance. Data sharing of authenticated operator weighing systems such as accountancy scales and process load cells is also investigated as a cost-efficient and effective safeguards measure combined with unannounced inspections, surveillance and non-destructive assay (NDA) measurement. (authors)

  9. HEPA Filter Vulnerability Assessment

    International Nuclear Information System (INIS)

    GUSTAVSON, R.D.

    2000-01-01

    This assessment of High Efficiency Particulate Air (HEPA) filter vulnerability was requested by the USDOE Office of River Protection (ORP) to satisfy a DOE-HQ directive to evaluate the effect of filter degradation on the facility authorization basis assumptions. Within the scope of this assessment are ventilation system HEPA filters that are classified as Safety-Class (SC) or Safety-Significant (SS) components that perform an accident mitigation function. The objective of the assessment is to verify whether HEPA filters that perform a safety function during an accident are likely to perform as intended to limit release of hazardous or radioactive materials, considering factors that could degrade the filters. Filter degradation factors considered include aging, wetting of filters, exposure to high temperature, exposure to corrosive or reactive chemicals, and exposure to radiation. Screening and evaluation criteria were developed by a site-wide group of HVAC engineers and HEPA filter experts from published empirical data. For River Protection Project (RPP) filters, the only degradation factor that exceeded the screening threshold was for filter aging. Subsequent evaluation of the effect of filter aging on the filter strength was conducted, and the results were compared with required performance to meet the conditions assumed in the RPP Authorization Basis (AB). It was found that the reduction in filter strength due to aging does not affect the filter performance requirements as specified in the AB. A portion of the HEPA filter vulnerability assessment is being conducted by the ORP and is not part of the scope of this study. The ORP is conducting an assessment of the existing policies and programs relating to maintenance, testing, and change-out of HEPA filters used for SC/SS service. This document presents the results of a HEPA filter vulnerability assessment conducted for the River protection project as requested by the DOE Office of River Protection

  10. Development of particle filters for ships; Udvikling af partikelfiltre til skibe

    Energy Technology Data Exchange (ETDEWEB)

    Jakobsen, O.; Norre Holm, J.; Koecks, M. [Teknologisk Institut, Aarhus (Denmark)

    2013-04-01

    The project has resulted in a solution with a well-functioning maritime particle filter which reduces particle emissions significantly. The visible smoke from the vessel's funnel, which is typically seen while manoeuvring in the harbour, is also reduced to a minimum. The system is constructed in such a way that the exhaust gases can be bypassed around the filter unit, in order to ensure the engine's operation in case of filter clogging. The system has been provided with safety functions to prevent an excessive exhaust gas back-pressure, and remote-controlled exhaust valves are fitted. Some of the challenges in the project have been the engine manufacturer's requirement of keeping a low turbocharger back-pressure, besides the space conditions aboard the test vessel and the achievement of sufficient temperatures for regeneration of the particle filter. To meet the requirement of low exhaust gas back-pressure, the filter housing was designed with space for twice as many monoliths as originally planned. In the funnel casing the original installations were removed to make space for the filter housing, and the system was extended with electrically controlled exhaust valves to ease the daily operation by the crew. The regeneration issue was solved by mounting electric, automatically controlled heating elements in the filter housing and by an ash exhaust system. Regeneration is carried out by the crew when the vessel lies in harbour in the evening after the last tour of the day. Before mounting the particle filter, measurements were carried out aboard, showing a particle emission profile with an expectedly high NOx level of 8.33 g/kWh, whereas the other emissions were lower than expected at first. Especially HC and CO were very low, but the particle mass (PM) also had a relatively low value of 0.22 g/kWh. After commissioning the particle filter, a significant reduction of 93% in the particle number (N) was observed. A reduction in N was

  11. Development of a HPLC method for determination of four UV filters in sunscreen and its application to skin penetration studies.

    Science.gov (United States)

    Souza, Carla; Maia Campos, Patrícia M B G

    2017-12-01

    This study describes the development, validation and application of a high-performance liquid chromatography (HPLC) method for the simultaneous determination of the in vitro skin penetration profile of four UV filters on porcine skin. Experiments were carried out on a gel-cream formulation containing the following UV filters: diethylamino hydroxybenzoyl hexyl benzoate (DHHB), bis-ethylhexyloxyphenol methoxyphenyl triazine (BEMT), methylene bis-benzotriazolyl tetramethylbutylphenol (MBBT) and ethylhexyl triazone (EHT). The HPLC method demonstrated suitable selectivity, linearity (10.0-50.0 μg/mL), precision, accuracy and recovery from porcine skin and sunscreen formulation. The in vitro skin penetration profile was evaluated using Franz vertical diffusion cells for 24 h after application on porcine ear skin. None of the UV filters penetrated the porcine skin. Most of them stayed on the skin surface (>90%) and only BEMT, EHT and DHHB reached the dermis plus epidermis layer. These results are in agreement with previous results in the literature. Therefore, the analytical method was useful to evaluate the in vitro skin penetration of the UV filters and may help the development of safer and effective sunscreen products. Copyright © 2017 John Wiley & Sons, Ltd.

  12. Development of the VESUVIUS module. Molten jet breakup modeling and model verification

    Energy Technology Data Exchange (ETDEWEB)

    Vierow, K. [Nuclear Power Engineering Corp., Tokyo (Japan); Nagano, Katsuhiro; Araki, Kazuhiro

    1998-01-01

    With the in-vessel vapor explosion issue (α-mode failure) now considered to pose an acceptably small risk to the safety of a light water reactor, ex-vessel vapor explosions are being given considerable attention. Attempts are being made to analytically model breakup of continuous-phase jets; however, uncertainty exists regarding the basic phenomena. In addition, the conditions upon reactor vessel failure, which determine the starting point of the ex-vessel vapor explosion process, are difficult to quantify. Herein, molten jet ejection from the reactor pressure vessel is characterized. Next, the expected mode of jet breakup is determined and the current state of analytical modeling is reviewed. A jet breakup model for ex-vessel scenarios, with the primary breakup mechanism being the Kelvin-Helmholtz instability, is described. The model has been incorporated into the VESUVIUS module and comparisons of VESUVIUS calculations against FARO L-06 experimental data show differences, particularly in the pressure curve and amount of jet breakup. The need for additional development to resolve these differences is discussed. (author)

  13. The Development and Verification of a Novel ECMS of Hybrid Electric Bus

    Directory of Open Access Journals (Sweden)

    Jun Wang

    2014-01-01

    This paper presents the system modeling, control strategy design, and hardware-in-the-loop testing of a series-parallel hybrid electric bus. First, the powertrain mathematical models and the system architecture are proposed. Then an adaptive ECMS is developed for the real-time control of the hybrid electric bus, which is investigated and verified in a hardware-in-the-loop simulation system. Through driving-cycle recognition, the ECMS updates the equivalent charge and discharge coefficients and extracts optimized rules for real-time control. This method not only reduces frequent mode transitions and improves fuel economy, but also simplifies the design of the control strategy and provides new design ideas for the energy management strategy and the gear-shifting rules. Finally, the simulation results show that the proposed real-time A-ECMS can coordinate the overall hybrid electric powertrain to optimize fuel economy and sustain the battery SOC level.
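
    The core ECMS step can be illustrated with a minimal sketch: at each control instant, the demanded power is split between engine and battery by minimizing an equivalent consumption (fuel power plus an equivalence factor times battery power), with the factor adapted to the battery SOC. The fuel-rate curve, limits and adaptation law below are illustrative assumptions, not the controller described in the paper.

```python
# Minimal sketch of equivalent consumption minimization with a SOC-adaptive
# equivalence factor. All curves and limits are assumptions for this sketch.
import numpy as np

def fuel_power(p_engine_kw):
    """Crude engine fuel-power model in kW of fuel (assumption for this sketch)."""
    return np.where(p_engine_kw > 0, 0.15 * p_engine_kw ** 1.1 + 5.0, 0.0)

def ecms_split(p_demand_kw, soc, s0=2.5, soc_ref=0.6, k_soc=3.0,
               p_batt_limits=(-30.0, 30.0)):
    s = s0 + k_soc * (soc_ref - soc)                  # adaptive equivalence factor
    p_batt = np.linspace(*p_batt_limits, 601)         # candidate battery powers
    p_eng = np.clip(p_demand_kw - p_batt, 0.0, None)  # engine covers the rest
    cost = fuel_power(p_eng) + s * p_batt             # equivalent consumption
    i = int(np.argmin(cost))
    return p_eng[i], p_batt[i]

print(ecms_split(p_demand_kw=40.0, soc=0.55))         # (engine kW, battery kW)
```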

  14. Modelling coupled microbial processes in the subsurface: Model development, verification, evaluation and application

    Science.gov (United States)

    Masum, Shakil A.; Thomas, Hywel R.

    2018-06-01

    To study subsurface microbial processes, a coupled model which has been developed within a Thermal-Hydraulic-Chemical-Mechanical (THCM) framework is presented. The work presented here, focuses on microbial transport, growth and decay mechanisms under the influence of multiphase flow and bio-geochemical reactions. In this paper, theoretical formulations and numerical implementations of the microbial model are presented. The model has been verified and also evaluated against relevant experimental results. Simulated results show that the microbial processes have been accurately implemented and their impacts on porous media properties can be predicted either qualitatively or quantitatively or both. The model has been applied to investigate biofilm growth in a sandstone core that is subjected to a two-phase flow and variable pH conditions. The results indicate that biofilm growth (if not limited by substrates) in a multiphase system largely depends on the hydraulic properties of the medium. When the change in porewater pH which occurred due to dissolution of carbon dioxide gas is considered, growth processes are affected. For the given parameter regime, it has been shown that the net biofilm growth is favoured by higher pH; whilst the processes are considerably retarded at lower pH values. The capabilities of the model to predict microbial respiration in a fully coupled multiphase flow condition and microbial fermentation leading to production of a gas phase are also demonstrated.

  15. Development and preliminary verification of 2-D transport module of radiation shielding code ARES

    International Nuclear Information System (INIS)

    Zhang Penghe; Chen Yixue; Zhang Bin; Zang Qiyong; Yuan Longjun; Chen Mengteng

    2013-01-01

    The 2-D transport module of the radiation shielding code ARES is a two-dimensional neutron transport and radiation shielding code. The theoretical model is based on the first-order steady-state neutron transport equation, adopting the discrete ordinates method to discretize the angular variables. A set of differential equations is then obtained and solved with the source iteration method. The 2-D transport module of ARES is capable of calculating k_eff and fixed-source problems with isotropic or anisotropic scattering in x-y geometry. The theoretical model is briefly introduced and a series of benchmark problems is verified in this paper. Compared with the results given by the benchmark, the maximum relative deviation of k_eff is 0.09% and the average relative deviation of the flux density is about 0.60% in the BWR cells benchmark problem. As for the fixed-source problems with isotropic and anisotropic scattering, the results of the 2-D transport module of ARES agree very well with DORT. These numerical results for the benchmark problems preliminarily demonstrate that the development of the 2-D transport module of ARES is correct and that it is able to provide high-precision results. (authors)
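
    The source iteration scheme mentioned above can be illustrated with a textbook toy problem: a one-group, 1-D slab, isotropic-scattering fixed-source calculation with an S8 quadrature, diamond differencing and vacuum boundaries. This is only a schematic of the numerical approach, not the ARES module; all cross sections and the source are made up.

```python
# Toy source iteration for discrete ordinates: 1-D slab, one group, isotropic
# scattering, diamond difference, vacuum boundaries. Illustrative only.
import numpy as np

nx, L = 50, 10.0                  # mesh cells, slab width (cm)
dx = L / nx
sig_t, sig_s = 1.0, 0.5           # total / scattering cross sections (1/cm)
q_ext = np.ones(nx)               # uniform external source
mu, w = np.polynomial.legendre.leggauss(8)   # S8 angular quadrature on [-1, 1]

phi = np.zeros(nx)
for iteration in range(200):                 # source iteration loop
    q = 0.5 * (sig_s * phi + q_ext)          # isotropic source per direction
    phi_new = np.zeros(nx)
    for m, wm in zip(mu, w):
        psi_in = 0.0                         # vacuum boundary condition
        cells = range(nx) if m > 0 else range(nx - 1, -1, -1)
        a = abs(m) / dx
        for i in cells:                      # transport sweep (diamond difference)
            psi_out = (q[i] + (a - 0.5 * sig_t) * psi_in) / (a + 0.5 * sig_t)
            phi_new[i] += wm * 0.5 * (psi_in + psi_out)
            psi_in = psi_out
    converged = np.max(np.abs(phi_new - phi)) < 1e-6 * np.max(np.abs(phi_new))
    phi = phi_new
    if converged:
        break

print("converged after", iteration + 1, "iterations; midplane flux:", phi[nx // 2])
```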

  16. Development and verification of local/global analysis techniques for laminated composites

    Science.gov (United States)

    Griffin, O. Hayden, Jr.

    1989-01-01

    Analysis and design methods for laminated composite materials have been the subject of considerable research over the past 20 years, and are currently well developed. In performing the detailed three-dimensional analyses which are often required in proximity to discontinuities, however, analysts often encounter difficulties due to large models. Even with the current availability of powerful computers, models which are too large to run, either from a resource or time standpoint, are often required. There are several approaches which can permit such analyses, including substructuring, use of superelements or transition elements, and the global/local approach. This effort is based on the so-called zoom technique to global/local analysis, where a global analysis is run, with the results of that analysis applied to a smaller region as boundary conditions, in as many iterations as is required to attain an analysis of the desired region. Before beginning the global/local analyses, it was necessary to evaluate the accuracy of the three-dimensional elements currently implemented in the Computational Structural Mechanics (CSM) Testbed. It was also desired to install, using the Experimental Element Capability, a number of displacement formulation elements which have well known behavior when used for analysis of laminated composites.

  17. The developments and verifications of trace model for IIST LOCA experiments

    Energy Technology Data Exchange (ETDEWEB)

    Zhuang, W. X. [Inst. of Nuclear Engineering and Science, National Tsing-Hua Univ., Taiwan, No. 101, Kuang-Fu Road, Hsinchu 30013, Taiwan (China); Wang, J. R.; Lin, H. T. [Inst. of Nuclear Energy Research, Taiwan, No. 1000, Wenhua Rd., Longtan Township, Taoyuan County 32546, Taiwan (China); Shih, C.; Huang, K. C. [Inst. of Nuclear Engineering and Science, National Tsing-Hua Univ., Taiwan, No. 101, Kuang-Fu Road, Hsinchu 30013, Taiwan (China); Dept. of Engineering and System Science, National Tsing-Hua Univ., Taiwan, No. 101, Kuang-Fu Road, Hsinchu 30013, Taiwan (China)

    2012-07-01

    The test facility IIST (INER Integral System Test) is a Reduced-Height and Reduced-Pressure (RHRP) integral test loop, which was constructed for the purpose of conducting thermal-hydraulic and safety analyses of Westinghouse three-loop PWR nuclear power plants. The main purpose of this study is to develop and verify TRACE models of IIST through the IIST small break loss of coolant accident (SBLOCA) experiments. First, two different IIST TRACE models, a pipe-vessel model and a 3-D vessel component model, have been built. The steady-state and transient calculation results show that both TRACE models are able to simulate the related IIST experiments. Compared with IIST SBLOCA experiment data, the 3-D vessel component model has shown better simulation capabilities and has therefore been chosen for all further thermal-hydraulic studies. The second step consists of sensitivity studies of the two-phase multiplier and the subcooled liquid multiplier in the choked flow model, and of two correlation constants in the CCFL model. As a result, an appropriate set of multipliers and constants can be determined. In summary, a verified IIST TRACE model with a 3D vessel component and fine-tuned choked flow and CCFL models is established for further studies on IIST experiments in the future. (authors)

  18. Development of a three-dimensionally movable phantom system for dosimetric verifications

    International Nuclear Information System (INIS)

    Nakayama, Hiroshi; Mizowaki, Takashi; Narita, Yuichiro; Kawada, Noriyuki; Takahashi, Kunio; Mihara, Kazumasa; Hiraoka, Masahiro

    2008-01-01

    The authors developed a three-dimensionally movable phantom system (3D movable phantom system) which can reproduce three-dimensional movements to experimentally verify the impact of radiotherapy treatment-related movements on dose distribution. The phantom system consists of three integrated components: a three-dimensional driving mechanism (3D driving mechanism), a computer control system, and phantoms for film dosimetry. The 3D driving mechanism is the quintessential part of this system. It is composed of three linear-motion tables (single-axis robots) which are joined orthogonally to each other. This mechanism has a motion range of 100 mm, with a maximum velocity of 200 mm/s in each dimension, and 3D motion capability for arbitrary patterns. These attributes are sufficient to reproduce almost all organ movements. The positional accuracy of this 3D movable phantom system in a stationary state is less than 0.1 mm. The maximum error in terms of the absolute position during movement was 0.56 mm. The positional reproducibility error during movement was up to 0.23 mm. The observed timing fluctuation was 0.012 s over a 4.5 s oscillation cycle. These results suggested that the 3D movable phantom system exhibits a sufficient level of accuracy in terms of geometry and timing to reproduce interfractional organ movement or setup errors in order to assess the influence of these errors on high-precision radiotherapy such as stereotactic irradiation and intensity-modulated radiotherapy. In addition, the authors' 3D movable phantom system will also be useful in evaluating the adequacy and efficacy of new treatment techniques such as gating or tracking radiotherapy

  19. AXAF-I Low Intensity-Low Temperature (LILT) Testing of the Development Verification Test (DVT) Solar Panel

    Science.gov (United States)

    Alexander, Doug; Edge, Ted; Willowby, Doug

    1998-01-01

    The planned orbit of the AXAF-I spacecraft will subject the spacecraft to both short, less than 30 minutes for solar and less than 2 hours for lunar, and long earth eclipses and lunar eclipses with combined conjunctive duration of up to 3 to 4 hours. Lack of proper Electrical Power System (EPS) conditioning prior to eclipse may cause loss of mission. To avoid this problem, for short eclipses, it is necessary to off-point the solar array prior to or at the beginning of the eclipse to reduce the battery state of charge (SOC). This yields less overcharge during the high charge currents at sun entry. For long lunar eclipses, solar array pointing and load scheduling must be tailored for the profile of the eclipse. The battery SOC, loads, and solar array current-voltage (I-V) must be known or predictable to maintain the bus voltage within acceptable range. To address engineering concerns about the electrical performance of the AXAF-I solar array under Low Intensity and Low Temperature (LILT) conditions, Marshall Space Flight Center (MSFC) engineers undertook special testing of the AXAF-I Development Verification Test (DVT) solar panel in September-November 1997. In the test the DVT test panel was installed in a thermal vacuum chamber with a large view window with a mechanical "flapper door". The DVT test panel was "flash" tested with a Large Area Pulse Solar Simulator (LAPSS) at various fractional sun intensities and panel (solar cell) temperatures. The testing was unique with regards to the large size of the test article and type of testing performed. The test setup, results, and lessons learned from the testing will be presented.

  20. Synthesis of reference compounds related to Chemical Weapons Convention for verification and drug development purposes – a Brazilian endeavour

    Science.gov (United States)

    Cavalcante, S. F. A.; de Paula, R. L.; Kitagawa, D. A. S.; Barcellos, M. C.; Simas, A. B. C.; Granjeiro, J. M.

    2018-03-01

    This paper deals with the challenges that the Brazilian Army Organic Synthesis Laboratory has been going through to access reference compounds related to the Chemical Weapons Convention in order to support verification analysis and the research of novel antidotes. Some synthetic procedures to produce the chemicals, as well as Quality Assurance issues and a brief introduction to the international agreements banning chemical weapons, are also presented.

  1. Distorted Fingerprint Verification System

    Directory of Open Access Journals (Sweden)

    Divya KARTHIKAESHWARAN

    2011-01-01

    Fingerprint verification is one of the most reliable personal identification methods. Fingerprint matching is affected by the non-linear distortion introduced in the fingerprint impression during the image acquisition process. This non-linear deformation changes both the position and orientation of minutiae. The proposed system operates in three stages: alignment-based fingerprint matching, fuzzy clustering, and a classifier framework. First, an enhanced input fingerprint image is aligned with the template fingerprint image and a matching score is computed. To improve the performance of the system, a fuzzy clustering based on distance and density has been used to cluster the feature set obtained from the fingerprint matcher. Finally, a classifier framework has been developed, and it was found that a cost-sensitive classifier produces better results. The system has been evaluated on a fingerprint database and the experimental results show that it produces a verification rate of 96%. This system plays an important role in forensic and civilian applications.

  2. Development of gel-filter method for high enrichment of low-molecular weight proteins from serum.

    Directory of Open Access Journals (Sweden)

    Lingsheng Chen

    The human serum proteome has been extensively screened for biomarkers. However, the large dynamic range of protein concentrations in serum and the presence of highly abundant and large-molecular-weight proteins make the identification and detection of changes in the amount of low-molecular-weight proteins (LMW, molecular weight ≤ 30 kDa) difficult. Here, we developed a gel-filter method including four layers of tricine SDS-PAGE-based gels of different concentrations to block high-molecular-weight proteins and enrich LMW proteins. By utilizing this method, we identified 1,576 proteins (n = 2) from 10 μL of serum. Among them, 559 (n = 2) proteins belonged to LMW proteins. Furthermore, this gel-filter method could identify 67.4% and 39.8% more LMW proteins than the representative methods of glycine SDS-PAGE and optimized-DS, respectively. By utilizing a SILAC-AQUA approach with a labeled recombinant protein as internal standard, the recovery rate for GST spiked into serum during treatment with gel-filter, optimized-DS, and ProteoMiner was 33.1 ± 0.01%, 18.7 ± 0.01% and 9.6 ± 0.03%, respectively. These results demonstrate that the gel-filter method offers a rapid, highly reproducible and efficient approach for screening biomarkers from serum through proteomic analyses.

  3. Filter arrays

    Science.gov (United States)

    Page, Ralph H.; Doty, Patrick F.

    2017-08-01

    The various technologies presented herein relate to a tiled filter array that can be used in connection with performance of spatial sampling of optical signals. The filter array comprises filter tiles, wherein a first plurality of filter tiles are formed from a first material, the first material being configured such that only photons having wavelengths in a first wavelength band pass therethrough. A second plurality of filter tiles is formed from a second material, the second material being configured such that only photons having wavelengths in a second wavelength band pass therethrough. The first plurality of filter tiles and the second plurality of filter tiles can be interspersed to form the filter array comprising an alternating arrangement of first filter tiles and second filter tiles.

  4. Developments of DPF systems with mesh laminated structures. Performances of DPF systems which consist of the metal-mesh laminated filter combustion with the alumina-fiber mesh, and the combustion device of trapped diesel particles; Mesh taso kozo no DPF no kaihatsu. Kinzokusen to arumina sen`i mesh ni yoru fukugo filter to filter heiyo heater ni yoru DPF no seino

    Energy Technology Data Exchange (ETDEWEB)

    Kojima, T; Tange, A; Matsuda, K [NHK Spring Co. Ltd., Yokohama (Japan)

    1997-10-01

    For the purpose of continuous operation without any maintenance, new DPF (diesel particulate filter) systems, laminated alternately with metal-wire mesh and alumina-fiber mesh, are under development. Complete combustion of the trapped diesel particulates can be achieved by a pair of resistance heating devices inserted into the filter. 5 refs., 7 figs., 3 tabs.

  5. Development and evaluation of a plant-based air filter system for ...

    African Journals Online (AJOL)

    We investigated a novel plant-based air filter system for bacterial growth control. The volatile components released from the experimental plant (Cupressus macrocarpa) were used as the basis of the bacterial growth control and inhibition. We monitored the effect of light on the gas exhausted from the system, and we found ...

  6. Development and evaluation of a plant-based air filter system for ...

    African Journals Online (AJOL)

    Y. Choi

    2013-04-17

    Apr 17, 2013 ... plant based filter system on bacterial growth in aqueous media, the compressed air was fed to the system at a rate of 200 mL/min, and the exhaust gas from the system was supplied to a bacterial culture. In this experiment, we attempted to verify the inhibition activity of the gas on bacterial growth in aqueous ...

  7. Development of a filtered neutron field in KUR. In behalf of biological irradiation experiments

    International Nuclear Information System (INIS)

    Sato, Takashi; Utsuro, Masahiko; Utsumi, Hiroshi

    1995-07-01

    Very few direct measurements have been made of the biological effects of neutrons below 100 keV. Recently, an iron-filtered 24 keV neutron beam at the Harwell Materials Testing Reactor, PLUTO, was reported to be highly efficient in inducing chromosome aberrations, the efficiency being comparable to that of fission neutrons. This result could have serious repercussions for radiation protection standards, as the ICRP assumes a decrease in neutron RBE below 100 keV. The investigations reported here have as their primary purpose the production of neutron beams at the 24 keV iron window energy, using the B-1 experimental facility of the Kyoto University Research Reactor (KUR) at the Research Reactor Institute, Kyoto University (KURRI). The filtered neutron field for biomedical applications is designed to minimize the contributions of neutrons of other energies and of gamma-rays. The characteristics of the radiation field were obtained by simple transmission calculations for the Fe (45 cm) and Al (35 cm) filters, by using the Monte Carlo code MCNP, and by the measurement of nuclear heating for Fe and Al filter pieces. The 24 keV neutron flux and gamma-ray dose rate were measured using a proton recoil counter and TLDs, respectively. The measured findings are as follows: the 24 keV neutron flux at the irradiation field was approximately 1×10⁶ n/cm²/s, and the gamma-ray dose rate was 1.0 Gy/h at the surface of the B-1 plug. Nuclear heating of the filter materials was at most 5.2 mW/g for Fe and 4 mW/g for Al. (author)

  8. Development of sealed radioactive sources immobilized in epoxy resin for verification of detectors used in nuclear medicine

    International Nuclear Information System (INIS)

    Tiezzi, Rodrigo

    2016-01-01

    Sealed radioactive sources are used in the verification of ionization chamber detectors, which measure the activity of radioisotopes used in several areas, such as nuclear medicine. The measurement of the activity of radioisotopes must be made with accuracy, because the material is administered to a patient. To ensure the proper functioning of the ionization chamber detectors, standardized tests are set by the International Atomic Energy Agency (IAEA) and the National Nuclear Energy Commission using sealed radioactive sources of Barium-133, Cesium-137 and Cobalt-57. The tests assess the accuracy, precision, reproducibility and linearity of response of the equipment. The focus of this work was the study and development of these radioactive sources with standard Barium-133 and Cesium-137, using a polymer, in this case a commercial epoxy resin of diglycidyl ether of bisphenol A (DGEBA) and a curing agent based on the modified polyamine diethylenetriamine (DETA), to immobilize the radioactive material. The main function of the polymeric matrix is to fix and immobilize the radioactive contents, not allowing them to leak beyond the technical limits required by the standards of radiological protection for a sealed source, and additionally to retain the emanation of any gases that may be formed during the manufacturing process and the useful life of this artifact. The manufacturing process of a standard sealed source consists of potting, into a bottle of standardized geometry, a fixed volume of the polymeric matrix, to which the exact amount of activity of the standard radioactive materials is added and dispersed homogeneously. Accordingly, a study was conducted for the choice of the epoxy resin, analyzing its characteristics and properties. Studies and tests were performed, examining the maximum miscibility of the resin with water (an acidic solution, simulating the conditions of the radioactive solution), loss of mechanical and

  9. Development of sealed radioactive sources immobilized in epoxy resin for verification of detectors used in nuclear medicine

    Energy Technology Data Exchange (ETDEWEB)

    Tiezzi, Rodrigo; Rostelato, Maria Elisa C.M.; Nagatomi, Helio R.; Zeituni, Calos A.; Benega, Marcos A.G.; Souza, Daiane B. de; Costa, Osvaldo L. da; Souza, Carla D.; Rodrigues, Bruna T.; Souza, Anderson S. de; Peleias Junior, Fernando S.; Santos, Rafael Melo dos; Melo, Emerson Ronaldo de, E-mail: rktiezzi@gmail.com [Instituto de Pesquisas Energeticas e Nucleares (IPEN/CNEN-SP), Sao Paulo, SP (Brazil); Karan Junior, Dib [Universidade de Sao Paulo (USP), Sao Paulo, SP (Brazil)

    2015-07-01

    Sealed radioactive sources are used in the verification of ionization chamber detectors, which measure the activity of radioisotopes used in several areas, such as nuclear medicine. The measurement of the activity of radioisotopes must be made with accuracy, because the material is administered to a patient. To ensure the proper functioning of the ionization chamber detectors, standardized tests are set by the International Atomic Energy Agency (IAEA) and the National Nuclear Energy Commission using sealed radioactive sources of Barium-133, Cesium-137 and Cobalt-57. The tests assess the accuracy, precision, reproducibility and linearity of response of the equipment. The focus of this work was the study and development of these radioactive sources with standard Barium-133, Cesium-137 and Cobalt-57, using a polymer, in this case a commercial epoxy resin of diglycidyl ether of bisphenol A (DGEBA) and a curing agent based on the modified polyamine diethylenetriamine (DETA), to immobilize the radioactive material. The main function of the polymeric matrix is to fix and immobilize the radioactive contents, not allowing them to leak beyond the technical limits required by the standards of radiological protection for a sealed source, and additionally to retain the emanation of any gases that may be formed during the manufacturing process and the useful life of this artifact. The manufacturing process of a standard sealed source consists of potting, into a bottle of standardized geometry, a fixed volume of the polymeric matrix, to which the exact amount of activity of the standard radioactive materials is added and dispersed homogeneously. Accordingly, a study was conducted for the choice of the epoxy resin, analyzing its characteristics and properties. Studies and tests were performed, examining the maximum solubility of the resin in water (an acidic solution, simulating the conditions of the radioactive solution), loss of mechanical

  10. Development of GPS Receiver Kalman Filter Algorithms for Stationary, Low-Dynamics, and High-Dynamics Applications

    Science.gov (United States)

    2016-06-01

    Filter Algorithms for Stationary, Low-Dynamics, and High-Dynamics Applications. Executive Summary: The Global Positioning System (GPS) is the primary... software that may need to be developed for performance prediction of current or future systems that incorporate GPS. The ultimate aim is to help inform... Defence Science and Technology Organisation in 1986. His major areas of work were adaptive tracking, signal processing, and radar systems engineering
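
    For readers unfamiliar with the filtering approach referred to in the title, the following is a generic, textbook constant-velocity Kalman filter for a single position/velocity channel; it is not the algorithm developed in the report, and the noise levels and measurement stream are illustrative.

```python
# Generic constant-velocity Kalman filter sketch: state = [position, velocity]
# in one axis, with noisy position measurements. Illustrative values only.
import numpy as np

dt = 1.0
F = np.array([[1.0, dt], [0.0, 1.0]])      # state transition (constant velocity)
H = np.array([[1.0, 0.0]])                 # we observe position only
Q = 0.01 * np.array([[dt**3 / 3, dt**2 / 2], [dt**2 / 2, dt]])   # process noise
R = np.array([[25.0]])                     # measurement noise variance (m^2)

x = np.array([[0.0], [0.0]])               # initial state estimate
P = np.eye(2) * 100.0                      # initial covariance

rng = np.random.default_rng(0)
truth = 3.0 * np.arange(20)                # target moving at 3 m/s
for z in truth + rng.normal(0.0, 5.0, truth.size):
    x, P = F @ x, F @ P @ F.T + Q                   # predict
    y = np.array([[z]]) - H @ x                     # innovation
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)                  # Kalman gain
    x, P = x + K @ y, (np.eye(2) - K @ H) @ P       # update

print("final position/velocity estimate:", x.ravel())
```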

  11. Methods of Software Verification

    Directory of Open Access Journals (Sweden)

    R. E. Gurin

    2015-01-01

    This article is devoted to the problem of software (SW) verification. Methods of software verification are designed to check the software for compliance with the stated requirements, such as correctness, system security, adaptability to small changes in the environment, portability, compatibility, etc. These methods vary both in how they operate and in how they achieve their result. The article describes the static and dynamic methods of software verification and pays particular attention to the method of symbolic execution. In its review of static analysis, the deductive method and model-checking methods are discussed and described. The pros and cons of each particular method are emphasized, and a classification of test techniques for each method is considered. The paper also presents and analyzes the characteristics and mechanisms of static dependency analysis, as well as its variants, which can reduce the number of false positives in situations where the current state of the program combines two or more states obtained either on different execution paths or when working with multiple object values. Dependencies connect various types of software objects: single variables, the elements of composite variables (structure fields, array elements), the size of heap areas, the length of strings, and the number of initialized array elements in the code being verified with static methods. The article pays attention to the identification of dependencies within the framework of abstract interpretation, and gives an overview and analysis of the inference tools. Methods of dynamic analysis such as testing, monitoring and profiling are presented and analyzed, and some kinds of tools that can be applied to the software when using dynamic analysis methods are considered. Based on this work a conclusion is drawn, which describes the most relevant problems of analysis techniques, methods of their solutions and

  12. Material integrity verification radar

    International Nuclear Information System (INIS)

    Koppenjan, S.K.

    1999-01-01

    The International Atomic Energy Agency (IAEA) has the need for verification of 'as-built' spent-fuel dry storage containers and other concrete structures. The IAEA has tasked the Special Technologies Laboratory (STL) to fabricate, test, and deploy a stepped-frequency Material Integrity Verification Radar (MIVR) system to nondestructively verify the internal construction of these containers. The MIVR system is based on previously deployed high-frequency, ground penetrating radar (GPR) systems that have been developed by STL for the U.S. Department of Energy (DOE). Whereas GPR technology utilizes microwave radio frequency energy to create subsurface images, MIVR is a variation for which the medium is concrete instead of soil. The purpose is to nondestructively verify the placement of concrete-reinforcing materials, pipes, inner liners, and other attributes of the internal construction. The MIVR system underwent an initial field test on CANDU reactor spent fuel storage canisters at Atomic Energy of Canada Limited (AECL), Chalk River Laboratories, Ontario, Canada, in October 1995. A second field test at the Embalse Nuclear Power Plant in Embalse, Argentina, was completed in May 1996. The DOE GPR was also demonstrated at the site. Data collection and analysis were performed for the Argentine National Board of Nuclear Regulation (ENREN). IAEA and Brazilian-Argentine Agency for the Control and Accounting of Nuclear Material (ABACC) personnel were present as observers during the test. Reinforcing materials were evident in the color, two-dimensional images produced by the MIVR system. A continuous pattern of reinforcing bars was evident and accurate estimates of the spacing, depth, and size were made. The potential uses for safeguards applications were jointly discussed. The MIVR system, as successfully demonstrated in the two field tests, can be used as a design verification tool for IAEA safeguards. A deployment of MIVR for Design Information Questionnaire (DIQ
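
    The stepped-frequency principle behind such a radar can be sketched as follows: the complex response collected at N discrete frequencies is inverse-Fourier-transformed into a range (depth) profile whose peaks correspond to reflectors such as rebar or liners. The frequency plan, the assumed concrete permittivity and the reflector depths below are hypothetical, not MIVR parameters.

```python
# Sketch of turning a stepped-frequency response into a depth profile.
# Frequencies, permittivity and reflector depths are illustrative assumptions.
import numpy as np
from scipy.signal import find_peaks

c0 = 3.0e8
v = c0 / np.sqrt(6.0)                          # assumed wave speed in concrete
freqs = np.linspace(0.5e9, 3.0e9, 201)         # stepped frequencies (Hz)
depths = [0.05, 0.12]                          # hypothetical reflector depths (m)

# ideal two-way returns: phase delay 2*pi*f*(2d/v) per reflector
response = sum(np.exp(-1j * 2 * np.pi * freqs * (2 * d / v)) for d in depths)

n_fft = 2048
profile = np.abs(np.fft.ifft(response * np.hanning(freqs.size), n=n_fft))
dt = 1.0 / (n_fft * (freqs[1] - freqs[0]))     # time per bin of the profile
depth_axis = 0.5 * v * dt * np.arange(n_fft)   # two-way travel time -> depth

peaks, _ = find_peaks(profile, height=0.3 * profile.max())
print("estimated reflector depths (m):", np.round(depth_axis[peaks], 3))
```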

  13. Exploring Middle School Students' Representational Competence in Science: Development and Verification of a Framework for Learning with Visual Representations

    Science.gov (United States)

    Tippett, Christine Diane

    Scientific knowledge is constructed and communicated through a range of forms in addition to verbal language. Maps, graphs, charts, diagrams, formulae, models, and drawings are just some of the ways in which science concepts can be represented. Representational competence---an aspect of visual literacy that focuses on the ability to interpret, transform, and produce visual representations---is a key component of science literacy and an essential part of science reading and writing. To date, however, most research has examined learning from representations rather than learning with representations. This dissertation consisted of three distinct projects that were related by a common focus on learning from visual representations as an important aspect of scientific literacy. The first project was the development of an exploratory framework that is proposed for use in investigations of students constructing and interpreting multimedia texts. The exploratory framework, which integrates cognition, metacognition, semiotics, and systemic functional linguistics, could eventually result in a model that might be used to guide classroom practice, leading to improved visual literacy, better comprehension of science concepts, and enhanced science literacy because it emphasizes distinct aspects of learning with representations that can be addressed though explicit instruction. The second project was a metasynthesis of the research that was previously conducted as part of the Explicit Literacy Instruction Embedded in Middle School Science project (Pacific CRYSTAL, http://www.educ.uvic.ca/pacificcrystal). Five overarching themes emerged from this case-to-case synthesis: the engaging and effective nature of multimedia genres, opportunities for differentiated instruction using multimodal strategies, opportunities for assessment, an emphasis on visual representations, and the robustness of some multimodal literacy strategies across content areas. The third project was a mixed

  14. Development of improved low-cost ceramic water filters for viral removal in the Haitian context

    OpenAIRE

    Guerrero Latorre, Laura; Rusiñol Arantegui, Marta; Hundesa Gonfa, Ayalkibet; Garcia Vallès, Maite; Martínez Manent, Salvador; Joseph, Osnick; Bofill Mas, Silvia; Gironès Llop, Rosina

    2015-01-01

    Household-based water treatment (HWT) is increasingly being promoted to improve water quality and, therefore, health status in low-income countries. Ceramic water filters (CWFs) are used in many regions as sustainable HWT and have been proven to meet World Health Organization (WHO) microbiological performance targets for bacterial removal (2-4 log); however, the described viral removal efficiencies are insufficient to significantly reduce the associated risk of viral infection. With the object...

  15. Development and stability studies of sunscreen cream formulations containing three photo-protective filters

    OpenAIRE

    Smaoui, Slim; Ben Hlima, Hajer; Ben Chobba, Ines; Kadri, Adel

    2013-01-01

    The present study aimed to formulate and subsequently evaluate sunscreen cream (W/O/W emulsion) containing three photo-protective filters: benzophenone-3, ethylhexyl methoxycinnamate and titanium dioxide at different percentages. Formulations were stored at 8, 25 and 40 °C for four weeks to investigate their stability. Color, centrifugation, liquefaction, phase separation, pH and Sun Protection Factor (SPF) of sunscreen cream formulations were determined. The microbiological stability of the ...

  16. Preliminary study on filamentous particle distribution in septic tank effluent and their impact on filter cake development.

    Science.gov (United States)

    Spychała, Marcin; Nieć, Jakub; Pawlak, Maciej

    2013-01-01

    In this paper, a preliminary study on the impact of filamentous particles (FPs) in septic tank effluent (STE) on filter cake (FC) development is presented. The number, length and diameter of FPs (on average 30 particles/cm³, 451 μm and 121 μm, respectively) were measured using microscope image analysis of STE samples condensed with a vacuum evaporation set. The results of this study showed that 0.73% of the volatile suspended solids (VSS) mass from the STE occurs in the form of FPs. No correlation between FP total mass and VSS was found. An experiment with a layer of FPs simulated by ground toilet paper (4.89 mg/cm²) was conducted and showed the impact of this layer on wastewater hydraulic conductivity (for an FC with FPs (FC-FP), hydraulic conductivity was seven times lower than for the FC without the FP layer) and on outflow quality (a lower concentration of organic matter, expressed as chemical oxygen demand (COD), in the effluent from the FC-FP filter than in the effluent from the FC filter: 618 and 732 g O2/m³, respectively). Despite the relatively small amount of FPs in STE solids (as a volume fraction), they play an important role in FC development due to their relatively high length and low degradability. The relatively small pores of the FC containing FPs (FC-FP) probably caused small-particle blocking and a decrease in permeability.

  17. Development of Tremor Suppression Control System Using Adaptive Filter and Its Application to Meal-assist Robot

    Science.gov (United States)

    Yano, Ken'ichi; Ohara, Eiichi; Horihata, Satoshi; Aoki, Takaaki; Nishimoto, Yutaka

    A robot that supports independent living by assisting with eating and other activities which use the operator's own hand would be helpful for people suffering from tremors of the hand or any other body part. The proposed system uses an adaptive filter to estimate, online, tremor frequencies that are time-varying and subject to individual differences. In this study, the estimated frequency is used to adjust the tremor suppression filter, which isolates the voluntary motion signal from the sensor signal containing tremor components. These subsystems are integrated into the control system of the Meal-Assist Robot. As a result, the developed system makes it possible for a person with a tremor to manipulate the supporting robot without degrading operability and without hazards due to improper operation.
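
    A minimal sketch of the suppression step is shown below: once an estimated tremor frequency is available (here simply assumed), a notch filter removes that component from the sensor signal so that the slow voluntary motion passes through. The sampling rate, signals and filter settings are illustrative, and the zero-phase filtering used here is an offline stand-in for the real-time filter of the paper.

```python
# Notch-based tremor suppression sketch with an assumed frequency estimate.
# Offline (zero-phase) filtering is used here purely for illustration.
import numpy as np
from scipy.signal import iirnotch, filtfilt

fs = 100.0                                             # sampling rate (Hz)
t = np.arange(0, 5, 1 / fs)
voluntary = 0.5 * np.sin(2 * np.pi * 0.4 * t)          # slow intended motion
tremor_freq = 6.0                                      # Hz, assumed online estimate
sensor = voluntary + 0.2 * np.sin(2 * np.pi * tremor_freq * t)

def suppress_tremor(signal, f_est, fs, q=5.0):
    """Notch out the estimated tremor frequency; q sets the notch width."""
    b, a = iirnotch(f_est, q, fs=fs)
    return filtfilt(b, a, signal)

cleaned = suppress_tremor(sensor, tremor_freq, fs)
print("max residual tremor amplitude ~", np.abs(cleaned - voluntary).max())
```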

  18. SU-F-J-28: Development of a New Imaging Filter to Remove the Shadows From the Carbon Fiber Grid Table Top

    Energy Technology Data Exchange (ETDEWEB)

    Maehana, W [Kanagawa Cancer Center, Yokohama, Kanagawa (Japan); Yokohama National University, Yokohama, kanagawa (Japan); Nagao, T [Yokohama National University, Yokohama, kanagawa (Japan)

    2016-06-15

    Purpose: In image-guided radiation therapy (IGRT), the shadows caused by the construction of the treatment couch top adversely affect visual evaluation. Therefore, we developed a new imaging filter in order to remove these shadows. The performance of the new filter was evaluated using clinical images. Methods: The new filter is composed of a band-pass filter (BPF) weighted by a k factor and a low-pass filter (LPF). In the frequency domain, the stop bandwidths were 8.3×10³ mm⁻¹ in the u direction and 11.1×10³ mm⁻¹ in the v direction for the BPF, and the pass bandwidths were 8.3×10³ mm⁻¹ in the u direction and 11.1×10³ mm⁻¹ in the v direction for the LPF. After applying the filter, the shadows from the carbon fiber grid table top (CFGTT, Varian) on the kV-image were removed. To check the filter effect, we compared clinical images of the thorax and thoracoabdominal region with and without the filter. A subjective evaluation test was performed using a three-point scale (agree, neither agree nor disagree, disagree) with 15 persons in the department of radiation oncology. Results: We succeeded in removing all shadows of the CFGTT using the new filter. The filter is very useful, as shown by the results of the subjective evaluation, in which 23/30 responses agreed with the filtered clinical images. Conclusion: We concluded that the proposed method is a useful tool for IGRT and that the new filter leads to an improvement in the accuracy of radiation therapy.
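
    The general idea of suppressing a periodic grid shadow in the spatial-frequency domain can be sketched as follows; the synthetic image, grid period and band edges are assumptions for illustration and do not reproduce the authors' filter design.

```python
# Suppress a narrow spatial-frequency band (a periodic grid shadow) in the
# 2-D Fourier domain of a synthetic image. Illustrative values only.
import numpy as np

ny, nx, pitch_px = 256, 256, 16
yy, xx = np.mgrid[0:ny, 0:nx]
anatomy = 100.0 + 20.0 * np.exp(-((xx - 128) ** 2 + (yy - 128) ** 2) / 2000.0)
shadow = 10.0 * np.sin(2 * np.pi * xx / pitch_px)      # periodic grid shadow
image = anatomy + shadow

F = np.fft.fft2(image)
fx = np.fft.fftfreq(nx)                                # cycles/pixel along x (u)
stop = np.abs(np.abs(fx) - 1.0 / pitch_px) < 0.01      # narrow stop band at the
F[:, stop] = 0.0                                       # grid frequency
filtered = np.fft.ifft2(F).real

print("rms deviation from the shadow-free image:",
      np.sqrt(np.mean((filtered - anatomy) ** 2)))
```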

  19. Development and Verification of a Mobile Shelter Assessment System "Rapid Assessment System of Evacuation Center Condition Featuring Gonryo and Miyagi (RASECC-GM)" for Major Disasters.

    Science.gov (United States)

    Ishii, Tadashi; Nakayama, Masaharu; Abe, Michiaki; Takayama, Shin; Kamei, Takashi; Abe, Yoshiko; Yamadera, Jun; Amito, Koichiro; Morino, Kazuma

    2016-10-01

    Introduction There were 5,385 deceased and 710 missing in the Ishinomaki medical zone following the Great East Japan Earthquake that occurred in Japan on March 11, 2011. The Ishinomaki Zone Joint Relief Team (IZJRT) was formed to unify the relief teams of all organizations joining in support of the Ishinomaki area. The IZJRT expanded relief activity as they continued to manually collect and analyze assessments of essential information for maintaining health in all 328 shelters using a paper-type survey. However, the IZJRT spent an enormous amount of time and effort entering and analyzing these data because the work was vastly complex. Therefore, an assessment system must be developed that can tabulate shelter assessment data correctly and efficiently. The objective of this report was to describe the development and verification of a system to rapidly assess evacuation centers in preparation for the next major disaster. Report Based on experiences with the complex work during the disaster, software called the "Rapid Assessment System of Evacuation Center Condition featuring Gonryo and Miyagi" (RASECC-GM) was developed to enter, tabulate, and manage the shelter assessment data. Further, a verification test was conducted during a large-scale Self-Defense Force (SDF) training exercise to confirm its feasibility, usability, and accuracy. The RASECC-GM comprises three screens: (1) the "Data Entry screen," allowing for quick entry on tablet devices of 19 assessment items, including shelter administrator, living and sanitary conditions, and a tally of the injured and sick; (2) the "Relief Team/Shelter Management screen," for registering information on relief teams and shelters; and (3) the "Data Tabulation screen," which allows tabulation of the data entered for each shelter, as well as viewing and sorting from a disaster headquarters' computer. During the verification test, data of mock shelters entered online were tabulated quickly and accurately on a mock disaster

  20. Verification Games: Crowd-Sourced Formal Verification

    Science.gov (United States)

    2016-03-01

    additional paintbrushes. Additionally, in Paradox, human players are never given small optimization problems (for example, toggling the values of 50... were developed by the Center for Game Science: Pipe Jam, Traffic Jam, Flow Jam and Paradox. Verification tools and games were integrated to verify...

  1. Hailstorms over Switzerland: Verification of Crowd-sourced Data

    Science.gov (United States)

    Noti, Pascal-Andreas; Martynov, Andrey; Hering, Alessandro; Martius, Olivia

    2016-04-01

    The reports of smartphone users witnessing hailstorms can be used as a source of independent, ground-based observation data on ground-reaching hailstorms with high temporal and spatial resolution. The presented work focuses on the verification of crowd-sourced data collected over Switzerland with the help of a smartphone application recently developed by MeteoSwiss. The precise location, the time of hail precipitation and the hailstone size are included in the crowd-sourced data and are assessed on the basis of the MeteoSwiss weather radar data. Two radar-based hail detection algorithms in use at MeteoSwiss, POH (Probability of Hail) and MESHS (Maximum Expected Severe Hail Size), are confronted with the crowd-sourced data. The available data and the investigation period span June to August 2015. Filter criteria have been applied in order to remove false reports from the crowd-sourced data. Neighborhood methods have been introduced to reduce the uncertainties which result from spatial and temporal biases. The crowd-sourced and radar data are converted into binary sequences according to previously set thresholds, allowing a categorical verification to be used. Verification scores (e.g. hit rate) are then calculated from a 2x2 contingency table. The hail reporting activity and the patterns corresponding to "hail" and "no hail" reports sent from smartphones have been analyzed. The relationship between the reported hailstone sizes and both radar-based hail detection algorithms has been investigated.
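
    The categorical verification step is straightforward to reproduce in outline: the binary radar and crowd-sourced sequences are cross-tabulated into a 2x2 contingency table from which scores such as the hit rate are computed. The sequences below are invented for illustration.

```python
# 2x2 contingency table and standard categorical verification scores
# computed from made-up binary radar and crowd-sourced hail sequences.
import numpy as np

radar   = np.array([1, 1, 0, 1, 0, 0, 1, 0, 1, 0], dtype=bool)   # e.g. POH above threshold
reports = np.array([1, 0, 0, 1, 0, 1, 1, 0, 1, 0], dtype=bool)   # crowd-sourced hail

hits         = np.sum(radar & reports)
false_alarms = np.sum(radar & ~reports)
misses       = np.sum(~radar & reports)
correct_negs = np.sum(~radar & ~reports)

hit_rate = hits / (hits + misses)                       # probability of detection
far      = false_alarms / (hits + false_alarms)         # false alarm ratio
csi      = hits / (hits + misses + false_alarms)        # critical success index
print(f"POD={hit_rate:.2f}  FAR={far:.2f}  CSI={csi:.2f}  CN={correct_negs}")
```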

  2. Software verification for nuclear industry

    International Nuclear Information System (INIS)

    Wilburn, N.P.

    1985-08-01

    Why verification of software products throughout the software life cycle is necessary is considered. Concepts of verification, software verification planning, and some verification methodologies for products generated throughout the software life cycle are then discussed

  3. Verification and disarmament

    Energy Technology Data Exchange (ETDEWEB)

    Blix, H. [IAEA, Vienna (Austria)

    1998-07-01

    The main features of the IAEA safeguards verification system that non-nuclear-weapon states parties to the NPT are obliged to accept are described. Verification activities/problems in Iraq and North Korea are discussed.

  4. Verification and disarmament

    International Nuclear Information System (INIS)

    Blix, H.

    1998-01-01

    The main features of the IAEA safeguards verification system that non-nuclear-weapon states parties to the NPT are obliged to accept are described. Verification activities/problems in Iraq and North Korea are discussed.

  5. Development of an intense negative hydrogen ion source with a wide-range of external magnetic filter field

    International Nuclear Information System (INIS)

    Takeiri, Y.; Ando, A.; Kaneko, O.

    1994-09-01

    An intense negative hydrogen ion source has been developed which has a strong external magnetic filter field over the wide area of 35 cm x 62 cm, produced by a pair of permanent magnet rows located with a 35.4 cm separation. The filter strength is 70 G in the center and the line-integrated filter strength is 850 G cm, which keeps the electron temperature low in the extraction region. A strong cusp magnetic field, 1.8 kG on the chamber surface, is generated to improve the plasma confinement. These features resulted in a high arc efficiency at low operational gas pressure. An H⁻ ion current of 16.2 A with an energy of 47 keV was obtained at an arc efficiency of 0.1 A/kW and a gas pressure of 3.8 mTorr in cesium-mode operation. The magnetic field in the extraction gap is also strong, 450 G, for electron suppression. The ratio of the extraction current to the negative ion current was less than 2.2 at a gas pressure of 3 mTorr. Two-stage acceleration was tried, and a 13.6 A H⁻ ion beam was accelerated to 125 keV. (author)

  6. Development of a new linearly variable edge filter (LVEF)-based compact slit-less mini-spectrometer

    Science.gov (United States)

    Mahmoud, Khaled; Park, Seongchong; Lee, Dong-Hoon

    2018-02-01

    This paper presents the development of a compact charge-coupled device (CCD) spectrometer. We describe the design, concept and characterization of a VNIR linear variable edge filter (LVEF)-based mini-spectrometer. The new instrument has been realized for operation in the 300 nm to 850 nm wavelength range. The instrument consists of a linear variable edge filter in front of a CCD array. Small size, light weight and low cost can be achieved using linearly variable filters, with no need for any moving parts for wavelength selection as in the case of commercial spectrometers available on the market. This overview discusses the characteristics of the main components and the main concept, together with the main advantages and limitations reported. Experimental characteristics of the LVEFs are described. The mathematical approach used to obtain the position-dependent slit function of the presented prototype spectrometer, and its numerical de-convolution solution for spectrum reconstruction, are described. The performance of our prototype instrument is demonstrated by measuring the spectrum of a reference light source.
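
    The reconstruction idea can be sketched numerically: each CCD pixel sees the true spectrum convolved with a position-dependent slit function, so the spectrum is recovered by a regularized least-squares de-convolution. The slit model (a linearly widening Gaussian), the synthetic spectrum and the regularization weight are assumptions for illustration, not the authors' mathematical approach.

```python
# Regularized least-squares de-convolution with a position-dependent slit
# function. The slit widths, spectrum and noise are illustrative assumptions.
import numpy as np

npix = 200
wl = np.linspace(300.0, 850.0, npix)                   # nm, pixel -> wavelength
true = np.exp(-0.5 * ((wl - 500.0) / 8.0) ** 2)        # synthetic emission line

width = 3.0 + 12.0 * (wl - wl[0]) / (wl[-1] - wl[0])   # slit FWHM grows along the filter
sigma = width / 2.355
A = np.exp(-0.5 * ((wl[None, :] - wl[:, None]) / sigma[:, None]) ** 2)
A /= A.sum(axis=1, keepdims=True)                      # row-normalized response matrix

measured = A @ true + np.random.default_rng(1).normal(0.0, 1e-3, npix)

lam = 1e-2                                             # Tikhonov regularization weight
recovered = np.linalg.solve(A.T @ A + lam * np.eye(npix), A.T @ measured)
print("peak wavelength, true vs recovered:",
      wl[true.argmax()], wl[recovered.argmax()])
```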

  7. Development of moving alternating magnetic filter using permanent magnet for removal of radioactive corrosion product from nuclear power plant

    International Nuclear Information System (INIS)

    Song, M. C.; Kim, S. I.; Lee, K. J.

    2002-01-01

    Radioactive corrosion products (CRUD), which are generated by the neutron activation of general corrosion products at a nuclear power plant, are the major source of occupational radiation exposure. Most of the CRUD is characterized by strong ferrimagnetism. Along with the development and production of new permanent magnets (rare earth magnets), which generate much stronger magnetic fields than conventional magnets, a new type of magnetic filter is suggested that can separate CRUD efficiently and eventually reduce the radiation exposure of personnel at nuclear power plants. This separator consists of inner and outer magnet assemblies, a coolant channel, and a container surrounding the outer magnet assembly. The rotational motion of the inner and outer permanent magnet assemblies surrounding the coolant channel, driven by a motor system, produces moving alternating magnetic fields in the coolant channel. The CRUD can be separated from the coolant by the moving alternating magnetic field. This study describes the results of a preliminary experiment performed with different coolant flow rates and rotation velocities of the magnet assemblies. The new magnetic filter shows good performance in filtering magnetite from the coolant (water). Flow rates, rotation velocities of the magnet assemblies and particle sizes turn out to be very important design parameters

  8. Development of Filtered Bispectrum for EEG Signal Feature Extraction in Automatic Emotion Recognition Using Artificial Neural Networks

    Directory of Open Access Journals (Sweden)

    Prima Dewi Purnamasari

    2017-05-01

    Full Text Available The development of automatic emotion detection systems has recently gained significant attention due to the growing possibility of their implementation in several applications, including affective computing and various fields within biomedical engineering. Use of the electroencephalograph (EEG signal is preferred over facial expression, as people cannot control the EEG signal generated by their brain; the EEG ensures a stronger reliability in the psychological signal. However, because of its uniqueness between individuals and its vulnerability to noise, use of EEG signals can be rather complicated. In this paper, we propose a methodology to conduct EEG-based emotion recognition by using a filtered bispectrum as the feature extraction subsystem and an artificial neural network (ANN as the classifier. The bispectrum is theoretically superior to the power spectrum because it can identify phase coupling between the nonlinear process components of the EEG signal. In the feature extraction process, to extract the information contained in the bispectrum matrices, a 3D pyramid filter is used for sampling and quantifying the bispectrum value. Experiment results show that the mean percentage of the bispectrum value from 5 × 5 non-overlapped 3D pyramid filters produces the highest recognition rate. We found that reducing the number of EEG channels down to only eight in the frontal area of the brain does not significantly affect the recognition rate, and the number of data samples used in the training process is then increased to improve the recognition rate of the system. We have also utilized a probabilistic neural network (PNN as another classifier and compared its recognition rate with that of the back-propagation neural network (BPNN, and the results show that the PNN produces a comparable recognition rate and lower computational costs. Our research shows that the extracted bispectrum values of an EEG signal using 3D filtering as a feature extraction
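    As background to the feature-extraction step, the bispectrum itself can be estimated directly from FFTs of overlapping signal segments. The snippet below is a minimal, hedged sketch of such a direct estimator for one EEG channel; it does not reproduce the paper's 3D pyramid filtering of the bispectrum matrices or the ANN/PNN classifiers, and the segment length, overlap and window choice are arbitrary assumptions.

        import numpy as np

        def bispectrum(x, seg_len=256, overlap=128):
            # Direct (FFT-based) bispectrum estimate, averaged over segments.
            # Returns a (seg_len, seg_len) complex matrix B with
            # B[k1, k2] ~ E[ X[k1] * X[k2] * conj(X[k1 + k2]) ].
            x = np.asarray(x, dtype=float)
            step = seg_len - overlap
            n_seg = (len(x) - overlap) // step
            B = np.zeros((seg_len, seg_len), dtype=complex)
            # index matrix for the wrapped frequency bin k1 + k2
            idx = (np.arange(seg_len)[:, None] + np.arange(seg_len)[None, :]) % seg_len
            for i in range(n_seg):
                seg = x[i * step: i * step + seg_len]
                seg = (seg - seg.mean()) * np.hanning(seg_len)
                X = np.fft.fft(seg)
                B += np.outer(X, X) * np.conj(X[idx])
            return B / max(n_seg, 1)

        # Usage on a synthetic 4-second, 128 Hz trace standing in for one channel:
        fs = 128
        t = np.arange(0, 4, 1.0 / fs)
        x = np.sin(2 * np.pi * 10 * t) + 0.5 * np.sin(2 * np.pi * 22 * t) + 0.1 * np.random.randn(t.size)
        B = bispectrum(x, seg_len=128, overlap=64)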

  9. MO-AB-BRA-03: Development of Novel Real Time in Vivo EPID Treatment Verification for Brachytherapy

    Energy Technology Data Exchange (ETDEWEB)

    Fonseca, G; Podesta, M [Department of Radiation Oncology (MAASTRO), GROW School for Oncology and Developmental Biology, Maastricht University Medical Center, Maastricht 6201 BN (Netherlands); Reniers, B [Department of Radiation Oncology (MAASTRO), GROW School for Oncology and Developmental Biology, Maastricht University Medical Center, Maastricht 6201 BN (Netherlands); Research Group NuTeC, CMK, Hasselt University, Agoralaan Gebouw H, Diepenbeek B-3590 (Belgium); Verhaegen, F [Department of Radiation Oncology (MAASTRO), GROW School for Oncology and Developmental Biology, Maastricht University Medical Center, Maastricht 6201 BN (Netherlands); Medical Physics Unit, Department of Oncology, McGill University, Montreal, Quebec H3G 1A4 (Canada)

    2016-06-15

    Purpose: High Dose Rate (HDR) brachytherapy treatments are employed worldwide to treat a wide variety of cancers. However, in vivo dose verification remains a challenge, with no commercial dosimetry system available to verify the treatment dose delivered to the patient. We propose a novel dosimetry system that couples an independent Monte Carlo (MC) simulation platform and an amorphous silicon Electronic Portal Imaging Device (EPID) to provide real time treatment verification. Methods: MC calculations predict the EPID response to the photon fluence emitted by the HDR source by simulating the patient, the source dwell positions and times, and treatment complexities such as tissue compositions/densities and different applicators. Simulated results are then compared against EPID measurements acquired with ∼0.14 s time resolution, which allows dose measurements for each dwell position. The EPID has been calibrated using an Ir-192 HDR source, and experiments were performed using different phantoms, including tissue equivalent materials (PMMA, lung and bone). A source positioning accuracy of 0.2 mm, without including the afterloader uncertainty, was ensured using a robotic arm moving the source. Results: An EPID can acquire 3D Cartesian source positions, and its response varies significantly due to differences in the material composition/density of the irradiated object, allowing detection of changes in patient geometry. The panel time resolution allows dose rate and dwell time measurements. Moreover, predicted EPID images obtained from clinical treatment plans provide anatomical information that can be related to the patient anatomy, mostly bone and air cavities, localizing the source inside the patient using its anatomy as reference. Conclusion: Results obtained show the feasibility of the proposed dose verification system, which is capable of verifying all the brachytherapy treatment steps in real time, providing data about treatment delivery quality and also applicator

  10. Development and Tests of the Event Filter for the ATLAS Experiment

    CERN Document Server

    Bosman, M; Negri, A; Segura, E; Sushkov, S; Touchard, F; Wheeler, S J; 14th IEEE - NPSS Real Time Conference 2005 Nuclear Plasma Sciences Society

    2005-01-01

    The Trigger and Data Acquisition (TDAQ) System of the ATLAS Experiment comprises three stages of event selection. The Event Filter (EF) is the third-level trigger and is implemented in software. Its primary goal is the final selection of interesting events, reducing the event rate down to ~200 Hz, which is acceptable for mass storage. The EF System will be implemented as a set of independent sub-farms built from commodity components, each connected on one side to the Event Builder subsystem to receive full events and on the other side to the Sub-Farm Output nodes, from which the selected events are forwarded to mass storage. A distinctive feature of the Event Filter is its ability to use the full event data for selection, based directly on the offline reconstruction and analysis algorithms. Besides its main duties of event triggering and data transport, the EF also provides additional functionality, such as monitoring of the selected events and online calibration of the ATLAS detectors. Significant design improvements are cur...

  11. Monte Carlo filters for identification of nonlinear structural dynamical ...

    Indian Academy of Sciences (India)

    The theory of Kalman filtering provides one of … expansion (appendix B contains a reasonably self-contained account of how such expansions … Shinozuka M, Ghanem R 1995 Structural system identification II: experimental verification.

  12. Development and verification of a three-dimensional core model for WWR type reactors and its coupling with the accident code ATHLET. Final report

    International Nuclear Information System (INIS)

    Grundmann, U.; Lucas, D.; Mittag, S.; Rohde, U.

    1995-04-01

    The main goal of the project was the coupling of the 3D core model DYN3D for Russian VVER-type reactors, which has been developed in the RCR, with the thermohydraulic code ATHLET. The coupling has been realized in two basically different ways: - the implementation of only the neutron kinetics model of DYN3D into ATHLET (internal coupling), - the connection of the complete DYN3D core model, including neutron kinetics, thermohydraulics and the fuel rod model, via data interfaces at the core top and bottom (external coupling). To test the coupling, comparative calculations between the internal and external coupling versions have been carried out for a LOCA and a reactivity transient. Complementary goals of the project were: - the development of a DYN3D version for burn-up calculations, - the verification of DYN3D on benchmark tasks and experimental data on fuel rod behaviour, - a study on the extension of the neutron-physical data base. The project contributed to the development of advanced tools for the safety analysis of VVER-type reactors. Future work is aimed at the verification of the coupled code complex DYN3D-ATHLET. (orig.) [de

  13. HTGR analytical methods and design verification

    International Nuclear Information System (INIS)

    Neylan, A.J.; Northup, T.E.

    1982-05-01

    Analytical methods for the high-temperature gas-cooled reactor (HTGR) include development, update, verification, documentation, and maintenance of all computer codes for HTGR design and analysis. This paper presents selected nuclear, structural mechanics, seismic, and systems analytical methods related to the HTGR core. This paper also reviews design verification tests in the reactor core, reactor internals, steam generator, and thermal barrier

  14. Nanofiber Filters Eliminate Contaminants

    Science.gov (United States)

    2009-01-01

    With support from Phase I and II SBIR funding from Johnson Space Center, Argonide Corporation of Sanford, Florida tested and developed its proprietary nanofiber water filter media. Capable of removing more than 99.99 percent of dangerous particles like bacteria, viruses, and parasites, the media was incorporated into the company's commercial NanoCeram water filter, an inductee into the Space Foundation's Space Technology Hall of Fame. In addition to its drinking water filters, Argonide now produces large-scale nanofiber filters used as part of the reverse osmosis process for industrial water purification.

  15. Filters in nuclear facilities

    International Nuclear Information System (INIS)

    Berg, K.H.; Wilhelm, J.G.

    1985-01-01

    The topics of the nine papers given include the behavior of HEPA filters during exposure to air flows of high humidity as well as of high differential pressure, the development of steel-fiber filters suitable for extreme operating conditions, and the occurrence of various radioactive iodine species in the exhaust air from boiling water reactors. In an introductory presentation the German view of the performance requirements to be met by filters in nuclear facilities as well as the present status of filter quality assurance are discussed. (orig.) [de

  16. The Identification of Filters and Interdependencies for Effective Resource Allocation: Coupling the Mitigation of Natural Hazards to Economic Development.

    Science.gov (United States)

    Agar, S. M.; Kunreuther, H.

    2005-12-01

    Policy formulation for the mitigation and management of risks posed by natural hazards requires that governments confront difficult decisions for resource allocation and be able to justify their spending. Governments also need to recognize when spending offers little improvement and the circumstances in which relatively small amounts of spending can make substantial differences. Because natural hazards can have detrimental impacts on local and regional economies, patterns of economic development can also be affected by spending decisions for disaster mitigation. This paper argues that by mapping interdependencies among physical, social and economic factors, governments can improve resource allocation to mitigate the risks of natural hazards while improving economic development on local and regional scales. Case studies of natural hazards in Turkey have been used to explore specific "filters" that act to modify short- and long-term outcomes. Pre-event filters can prevent an event from becoming a natural disaster or change a routine event into a disaster. Post-event filters affect both short and long-term recovery and development. Some filters cannot be easily modified by spending (e.g., rural-urban migration) but others (e.g., land-use practices) provide realistic spending targets. Net social benefits derived from spending, however, will also depend on the ways by which filters are linked, or so-called "interdependencies". A single weak link in an interdependent system, such as a power grid, can trigger a cascade of failures. Similarly, weak links in social and commercial networks can send waves of disruption through communities. Conversely, by understanding the positive impacts of interdependencies, spending can be targeted to maximize net social benefits while mitigating risks and improving economic development. Detailed information on public spending was not available for this study but case studies illustrate how networks of interdependent filters can modify

  17. Performance of ceramic disk filter coated with nano ZnO for removing Escherichia coli from water in small rural and remote communities of developing regions.

    Science.gov (United States)

    Huang, Jing; Huang, Guohe; An, Chunjiang; He, Yuan; Yao, Yao; Zhang, Peng; Shen, Jian

    2018-03-12

    Global water safety is facing great challenges due to increased population and demand. There is an urgent need to develop suitable water treatment strategies for small rural and remote communities in low-income developing countries. In order to find a low-cost solution, the reduction of E. coli using ceramic water disks coated with nano ZnO was investigated in this study. The performance of the modified ceramic disk filters was influenced by several factors in the filter production process. Based on factorial analysis, the pore size of the disk filters was the most significant factor influencing E. coli removal efficiency, and the clay content was the most significant factor influencing the flow rate of the modified disk filters. The nano ZnO coating changed the disk filter surface and porosity. The reduction of E. coli could be attributed to both filter retention and the photocatalytic antibacterial activity of nano ZnO. The effects of filter operation factors, including initial E. coli concentration, illumination time and lamp power, on E. coli removal effectiveness were also revealed. The results can help find a safe and cost-effective approach to solving drinking water problems in small rural and remote communities of developing regions. Copyright © 2018 Elsevier Ltd. All rights reserved.

  18. Development and validation of a Kalman filter-based model for vehicle slip angle estimation

    Science.gov (United States)

    Gadola, M.; Chindamo, D.; Romano, M.; Padula, F.

    2014-01-01

    It is well known that vehicle slip angle is one of the most difficult parameters to measure on a vehicle during testing or racing activities. Moreover, the appropriate sensor is very expensive and it is often difficult to fit to a car, especially on race cars. We propose here a strategy to eliminate the need for this sensor by using a mathematical tool which gives a good estimation of the vehicle slip angle. A single-track car model, coupled with an extended Kalman filter, was used in order to achieve the result. Moreover, a tuning procedure is proposed that takes into consideration both nonlinear and saturation characteristics typical of vehicle lateral dynamics. The effectiveness of the proposed algorithm has been proven by both simulation results and real-world data.
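    The estimator described above couples a single-track (bicycle) car model with an extended Kalman filter. The snippet below is a minimal, hedged sketch of the idea rather than the authors' implementation: it runs a plain Kalman filter on a linear single-track model with state [sideslip angle, yaw rate], steering angle as input and yaw rate as the only measurement. All vehicle parameters, noise levels and names are illustrative assumptions, and the published estimator additionally handles the tyre nonlinearity and saturation mentioned in the abstract.

        import numpy as np

        # Illustrative vehicle parameters (placeholders, not values from the paper)
        m, Iz = 1500.0, 2500.0      # mass [kg], yaw inertia [kg m^2]
        lf, lr = 1.2, 1.4           # CoG-to-axle distances [m]
        Cf, Cr = 8.0e4, 9.0e4       # front/rear cornering stiffness [N/rad]
        v, dt = 20.0, 0.01          # vehicle speed [m/s], sample time [s]

        # Continuous-time linear single-track dynamics, state x = [beta, r]
        A = np.array([[-(Cf + Cr) / (m * v), (Cr * lr - Cf * lf) / (m * v**2) - 1.0],
                      [(Cr * lr - Cf * lf) / Iz, -(Cf * lf**2 + Cr * lr**2) / (Iz * v)]])
        B = np.array([Cf / (m * v), Cf * lf / Iz])
        F = np.eye(2) + A * dt      # Euler discretisation
        G = B * dt
        H = np.array([[0.0, 1.0]])  # only the yaw rate r is measured
        Q = np.diag([1e-6, 1e-5])   # process noise covariance
        R = np.array([[1e-4]])      # yaw-rate sensor noise covariance

        def kf_step(x, P, delta, r_meas):
            """One predict/update cycle; x[0] is the estimated sideslip angle."""
            x = F @ x + G * delta                   # predict state
            P = F @ P @ F.T + Q                     # predict covariance
            y = np.array([r_meas]) - H @ x          # innovation
            S = H @ P @ H.T + R
            K = P @ H.T @ np.linalg.inv(S)          # Kalman gain
            x = x + K @ y                           # update state
            P = (np.eye(2) - K @ H) @ P             # update covariance
            return x, P

        # Usage: filter a short synthetic steering manoeuvre
        x, P = np.zeros(2), np.eye(2) * 1e-3
        for k in range(500):
            delta = 0.05 * np.sin(2 * np.pi * 0.5 * k * dt)   # steering input [rad]
            r_meas = x[1] + 1e-2 * np.random.randn()          # stand-in measurement
            x, P = kf_step(x, P, delta, r_meas)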

  19. Development of multi-filter spectroradiometry; Filter hoshiki ni yoru bunka hosharyo no keisoku hoho to sono supekutoru no hyogen hoho ni tsuite

    Energy Technology Data Exchange (ETDEWEB)

    Miyake, Y; Aoshima, T; Minoda, T; Kato, T; Kondo, S [Eiko Instruments Trading Co. Ltd., Tokyo (Japan)

    1996-10-27

    This paper describes a solar radiation spectroradiometry technique that adds high-resolution wavelength computation to a multi-filter method. On entering the atmosphere, the solar spectrum is scattered and absorbed by constituents such as gases, aerosols and cloud particles, and its spectral shape is deformed in a complicated, wavelength-dependent way. Taking advantage of the fact that the scattering and absorption characteristics of some of these constituents are constant with respect to wavelength, a simple equation was constructed to enable high-resolution spectrum measurement over wavelength, which compensates for the limitation in measurable wavelengths from which the conventional multi-filter method suffers. Because it uses filters, the new method is less expensive than the grating method, is capable of determining spectral radiation quantities with a precision of ±5%, and requires less memory capacity for data storage. The new method enables data collection under the various atmospheric conditions presented by the four seasons, for which the difficult-to-apply and expensive spectroradiometer is unsuitable. It is expected that this method will find use in collecting basic data for the design of photovoltaic power generation systems, in the study of photochemical reactions in agriculture, and in collecting basic data for daylight lighting. 1 ref., 6 figs., 2 tabs.

  20. CASL Verification and Validation Plan

    Energy Technology Data Exchange (ETDEWEB)

    Mousseau, Vincent Andrew [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Dinh, Nam [North Carolina State Univ., Raleigh, NC (United States)

    2016-06-30

    This report documents the Consortium for Advanced Simulation of LWRs (CASL) verification and validation plan. The document builds upon input from CASL subject matter experts, most notably the CASL Challenge Problem Product Integrators, CASL Focus Area leaders, and CASL code development and assessment teams. It will be a living document that tracks CASL's progress on verification and validation both for the CASL codes (including MPACT, CTF, BISON, MAMBA) and for the CASL challenge problems (CIPS, PCI, DNB). The CASL codes and the CASL challenge problems are at differing levels of maturity with respect to validation and verification. The gap analysis will summarize additional work that needs to be done. Additional VVUQ work will be done as resources permit. This report is prepared for the Department of Energy's (DOE's) CASL program in support of milestone CASL.P13.02.

  1. IMRT plan verification in radiotherapy

    International Nuclear Information System (INIS)

    Vlk, P.

    2006-01-01

    This article describes the procedure for verification of IMRT (intensity-modulated radiation therapy) plans used at the Oncological Institute of St. Elisabeth in Bratislava. It contains a basic description of IMRT technology and of the deployment of the IMRT planning system CORVUS 6.0 and the MIMiC device (multilamellar intensity-modulating collimator), as well as the overall process of verifying the plan created. The aim of verification is, in particular, good control of the functions of the MIMiC and evaluation of the overall reliability of IMRT planning. (author)

  2. District-level hospital trauma care audit filters: Delphi technique for defining context-appropriate indicators for quality improvement initiative evaluation in developing countries.

    Science.gov (United States)

    Stewart, Barclay T; Gyedu, Adam; Quansah, Robert; Addo, Wilfred Larbi; Afoko, Akis; Agbenorku, Pius; Amponsah-Manu, Forster; Ankomah, James; Appiah-Denkyira, Ebenezer; Baffoe, Peter; Debrah, Sam; Donkor, Peter; Dorvlo, Theodor; Japiong, Kennedy; Kushner, Adam L; Morna, Martin; Ofosu, Anthony; Oppong-Nketia, Victor; Tabiri, Stephen; Mock, Charles

    2016-01-01

    Prospective clinical audit of trauma care improves outcomes for the injured in high-income countries (HICs). However, equivalent, context-appropriate audit filters for use in low- and middle-income country (LMIC) district-level hospitals have not been well established. We aimed to develop context-appropriate trauma care audit filters for district-level hospitals in Ghana, as well as other LMICs more broadly. Consensus on trauma care audit filters was built among twenty panellists using a Delphi technique with four anonymous, iterative surveys designed to elicit: (i) trauma care processes to be measured; (ii) important features of audit filters for the district-level hospital setting; and (iii) potentially useful filters. Filters were ranked on a scale from 0 to 10 (10 being very useful). Consensus was measured with the average percent majority opinion (APMO) cut-off rate. Target consensus was defined a priori as a median rank of ≥9 for each filter and an APMO cut-off rate of ≥0.8. Panellists agreed on trauma care processes to target (e.g. triage, phases of trauma assessment, early referral if needed) and specific features of filters for district-level hospital use (e.g. simplicity, unassuming of resource capacity). The APMO cut-off rate increased successively: Round 1, 0.58; Round 2, 0.66; Round 3, 0.76; and Round 4, 0.82. After Round 4, target consensus on 22 trauma care and referral-specific filters was reached. Example filters include: for triage, vital signs are recorded within 15 min of arrival (must include breathing assessment, heart rate, blood pressure, and oxygen saturation if available); for circulation, a large-bore IV was placed within 15 min of patient arrival; and for referral, if referral is activated, the referring clinician and receiving facility communicate by phone or radio prior to transfer. This study proposes trauma care audit filters appropriate for LMIC district-level hospitals. Given the successes of similar filters in HICs and obstetric care filters in LMICs
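    The consensus rule reported above (median rank ≥ 9 and an APMO cut-off rate ≥ 0.8) can be checked programmatically. The sketch below is a hedged illustration only: it assumes ratings are held as an items-by-panellists array and treats the "majority opinion" as the modal rating category per item, since the abstract does not spell out the exact APMO formula; all names and thresholds other than those quoted in the abstract are assumptions.

        import numpy as np

        def consensus_summary(ratings, median_target=9, apmo_target=0.8):
            """ratings: 2-D array, rows = candidate audit filters,
            cols = panellists, entries = 0-10 usefulness ranks.

            Returns per-item medians, an APMO-style agreement rate (share of
            ratings falling in their item's modal category, averaged over
            items), and a per-item consensus flag using the stated thresholds.
            """
            ratings = np.asarray(ratings)
            medians = np.median(ratings, axis=1)
            agree = []
            for row in ratings:
                _, counts = np.unique(row, return_counts=True)
                agree.append(counts.max() / row.size)  # share agreeing with the mode
            apmo = float(np.mean(agree))
            consensus = (medians >= median_target) & (apmo >= apmo_target)
            return medians, apmo, consensus

        # Usage with a toy 3-filter x 5-panellist rating matrix
        toy = np.array([[9, 10, 9, 9, 8],
                        [7,  6, 9, 8, 7],
                        [10, 9, 9, 10, 9]])
        medians, apmo, flags = consensus_summary(toy)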

  3. 24 CFR 5.512 - Verification of eligible immigration status.

    Science.gov (United States)

    2010-04-01

    ... immigration status. 5.512 Section 5.512 Housing and Urban Development Office of the Secretary, Department of... Noncitizens § 5.512 Verification of eligible immigration status. (a) General. Except as described in paragraph...) Primary verification—(1) Automated verification system. Primary verification of the immigration status of...

  4. Verification of safety critical software

    International Nuclear Information System (INIS)

    Son, Ki Chang; Chun, Chong Son; Lee, Byeong Joo; Lee, Soon Sung; Lee, Byung Chai

    1996-01-01

    To assure the quality of safety-critical software, the software should be developed in accordance with software development procedures, and rigorous software verification and validation should be performed. Software verification is the formal act of reviewing, testing or checking, and documenting whether software components comply with the specified requirements for a particular stage of the development phase [1]. A new software verification methodology was developed and applied to Shutdown Systems No. 1 and 2 (SDS1, 2) for the Wolsung 2, 3 and 4 nuclear power plants by the Korea Atomic Energy Research Institute (KAERI) and Atomic Energy of Canada Limited (AECL) in order to satisfy the new regulatory requirements of the Atomic Energy Control Board (AECB). The software verification methodology applied to SDS1 for the Wolsung 2, 3 and 4 project is described in this paper. Some errors were found by this methodology during the software development for SDS1 and were corrected by the software designers. Outputs from the Wolsung 2, 3 and 4 project have demonstrated that the use of this methodology results in a high-quality, cost-effective product. 15 refs., 6 figs. (author)

  5. Challenges for effective WMD verification

    International Nuclear Information System (INIS)

    Andemicael, B.

    2006-01-01

    already awash in fissile material and is increasingly threatened by the possible consequences of illicit trafficking in such material. The chemical field poses fewer problems. The ban on chemical weapons is a virtually complete post-Cold War regime, with state-of-the-art concepts and procedures of verification resulting from decades of negotiation. The detection of prohibited materials and activities is the common goal of the nuclear and chemical regimes for which the most intrusive and intensive procedures are activated by the three organizations. Accounting for the strictly peaceful application of dual-use items constitutes the bulk of the work of the inspectorates at the IAEA and the OPCW. A common challenge in both fields is the advance of science and technology in the vast nuclear and chemical industries and the ingenuity of some determined proliferators to deceive by concealing illicit activities under legitimate ones. Inspection procedures and technologies need to keep up with the requirement for flexibility and adaptation to change. The common objective of the three organizations is to assemble and analyze all relevant information in order to conclude reliably whether a State is or is not complying with its treaty obligations. The positive lessons learned from the IAEA's verification experience today are valuable in advancing concepts and technologies that might also benefit the other areas of WMD verification. Together with the emerging, more comprehensive verification practice of the OPCW, they may provide a useful basis for developing common standards, which may in turn help in evaluating the cost-effectiveness of verification methods for the Biological and Toxin Weapons Convention and other components of a WMD control regime

  6. Hierarchical Representation Learning for Kinship Verification.

    Science.gov (United States)

    Kohli, Naman; Vatsa, Mayank; Singh, Richa; Noore, Afzel; Majumdar, Angshul

    2017-01-01

    Kinship verification has a number of applications such as organizing large collections of images and recognizing resemblances among humans. In this paper, first, a human study is conducted to understand the capabilities of the human mind and to identify the discriminatory areas of a face that facilitate kinship cues. The visual stimuli presented to the participants determine their ability to recognize kin relationships using the whole face as well as specific facial regions. The effect of participant gender and age and of the kin-relation pair of the stimulus is analyzed using quantitative measures such as accuracy, discriminability index d', and perceptual information entropy. Utilizing the information obtained from the human study, a hierarchical kinship verification via representation learning (KVRL) framework is utilized to learn the representation of different face regions in an unsupervised manner. We propose a novel approach for feature representation termed filtered contractive deep belief networks (fcDBN). The proposed feature representation encodes relational information present in images using filters and a contractive regularization penalty. A compact representation of facial images of kin is extracted as an output from the learned model, and a multi-layer neural network is utilized to verify the kin accurately. A new WVU kinship database is created, which consists of multiple images per subject to facilitate kinship verification. The results show that the proposed deep learning framework (KVRL-fcDBN) yields state-of-the-art kinship verification accuracy on the WVU kinship database and on four existing benchmark data sets. Furthermore, kinship information is used as a soft biometric modality to boost the performance of face verification via product-of-likelihood-ratio and support vector machine based approaches. Using the proposed KVRL-fcDBN framework, an improvement of over 20% is observed in the performance of face verification.

  7. Hybrid Filter Membrane

    Science.gov (United States)

    Laicer, Castro; Rasimick, Brian; Green, Zachary

    2012-01-01

    Cabin environmental control is an important issue for a successful Moon mission. Due to the unique environment of the Moon, lunar dust control is one of the main problems that significantly diminish the air quality inside spacecraft cabins. Therefore, this innovation was motivated by NASA's need to minimize the negative health impact that air-suspended lunar dust particles have on astronauts in spacecraft cabins. It is based on fabrication of a hybrid filter comprising nanofiber nonwoven layers coated on porous polymer membranes with uniform cylindrical pores. This design results in a high-efficiency gas particulate filter with low pressure drop and the ability to be easily regenerated to restore filtration performance. A hybrid filter was developed consisting of a porous membrane with uniform, micron-sized, cylindrical pore channels coated with a thin nanofiber layer. Compared to conventional filter media such as a high-efficiency particulate air (HEPA) filter, this filter is designed to provide high particle efficiency, low pressure drop, and the ability to be regenerated. These membranes have well-defined micron-sized pores and can be used independently as air filters with a discrete particle size cut-off, or coated with nanofiber layers for filtration of ultrafine nanoscale particles. The filter has a thin design intended to facilitate filter regeneration by localized air pulsing. A key feature of this invention is the combination of a micro-engineered straight-pore membrane with nanofibers. The micro-engineered straight-pore membrane can be prepared with extremely high precision. Because the resulting membrane pores are straight and not tortuous like those found in conventional filters, the pressure drop across the filter is significantly reduced. The nanofiber layer is applied as a very thin coating to enhance filtration efficiency for fine nanoscale particles. Additionally, the thin nanofiber coating is designed to promote capture of

  8. Hot cell verification facility update

    International Nuclear Information System (INIS)

    Titzler, P.A.; Moffett, S.D.; Lerch, R.E.

    1985-01-01

    The Hot Cell Verification Facility (HCVF) provides a prototypic hot cell mockup to check equipment for functional and remote operation, and provides actual hands-on training for operators. The facility arrangement is flexible and assists in solving potential problems in a nonradioactive environment. HCVF has been in operation for six years, and the facility is a part of the Hanford Engineering Development Laboratory

  9. Software development for efficient description of FIR filters of wireless LAN systems, in VHDL

    OpenAIRE

    Καινούργιος, Σωτήριος

    2005-01-01

    Implementation of parameterized FIR filters using the hardware description language VHDL, and a study of the results for the chip area occupied by each design. We discuss the area and delay results for each FIR filter design.
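    Although the thesis targets a VHDL hardware description, the underlying direct-form FIR structure is easy to prototype in software first. The sketch below (Python, purely as an illustration and not part of the cited work) mirrors the multiply-accumulate datapath such a core implements; the tap values and test signal are placeholders.

        import numpy as np

        def fir_direct_form(x, taps):
            """Direct-form FIR filter: y[n] = sum_k taps[k] * x[n-k].

            This mirrors the shift-register / multiply-accumulate structure
            a parameterised VHDL FIR core would implement in hardware.
            """
            x = np.asarray(x, dtype=float)
            y = np.zeros_like(x)
            for k, h in enumerate(taps):
                y[k:] += h * x[:len(x) - k]
            return y

        # Usage: a simple 5-tap moving-average filter on a noisy sine wave
        taps = np.ones(5) / 5.0
        signal = np.sin(np.linspace(0, 6 * np.pi, 200)) + 0.2 * np.random.randn(200)
        smoothed = fir_direct_form(signal, taps)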

  10. Development of a predictive model for 6 month survival in patients with venous thromboembolism and solid malignancy requiring IVC filter placement.

    Science.gov (United States)

    Huang, Steven Y; Odisio, Bruno C; Sabir, Sharjeel H; Ensor, Joe E; Niekamp, Andrew S; Huynh, Tam T; Kroll, Michael; Gupta, Sanjay

    2017-07-01

    Our purpose was to develop a predictive model for short-term survival (i.e. filter placement in patients with venous thromboembolism (VTE) and solid malignancy. Clinical and laboratory parameters were retrospectively reviewed for patients with solid malignancy who received a filter between January 2009 and December 2011 at a tertiary care cancer center. Multivariate Cox proportional hazards modeling was used to assess variables associated with 6 month survival following filter placement in patients with VTE and solid malignancy. Significant variables were used to generate a predictive model. 397 patients with solid malignancy received a filter during the study period. Three variables were associated with 6 month survival: (1) serum albumin [hazard ratio (HR) 0.496, P filter placement can be predicted from three patient variables. Our predictive model could be used to help physicians decide whether a permanent or retrievable filter may be more appropriate as well as to assess the risks and benefits for filter retrieval within the context of survival longevity in patients with cancer.
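    The survival modelling described above is multivariate Cox proportional hazards regression. As a hedged sketch of how such a model could be fitted (the authors' software is not stated; the lifelines package, column names and data below are assumptions for illustration, not the study's variables or values), one could write:

        import pandas as pd
        from lifelines import CoxPHFitter

        # Placeholder data frame: one row per patient, with follow-up time,
        # a death-within-6-months event indicator, and candidate covariates.
        df = pd.DataFrame({
            "months":       [2.0, 7.5, 1.1, 12.0, 4.3, 9.0],
            "died":         [1,   0,   1,   0,    1,   0],
            "albumin_g_dl": [2.1, 3.8, 2.5, 4.0,  2.9, 3.6],
            # further covariates from the chart review would be added here
        })

        cph = CoxPHFitter()
        cph.fit(df, duration_col="months", event_col="died")
        cph.print_summary()   # hazard ratios with confidence intervals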

  11. Development of advanced earthquake resistant performance verification on reinforced concrete underground structure. Pt. 2. Verification of the ground modeling methods applied to non-linear soil-structure interaction analysis

    International Nuclear Information System (INIS)

    Kawai, Tadashi; Kanatani, Mamoru; Ohtomo, Keizo; Matsui, Jun; Matsuo, Toyofumi

    2003-01-01

    In order to develop an advanced verification method for the earthquake-resistant performance of reinforced concrete underground structures, the applicability of two different soil modeling methods in numerical analysis was verified through non-linear dynamic numerical simulations of large shaking table tests conducted using a model comprising free-field ground (soil) and a reinforced concrete two-box culvert structure. In these simulations, the structure was modeled by beam-type elements having a tri-linear relation between curvature and flexural moment. The soil was modeled by the Ramberg-Osgood model as well as by an elasto-plastic constitutive model. The former model only accounts for the dependence of shear-modulus non-linearity on strain and initial stress conditions, whereas the latter can express the non-linearity of shear modulus caused by changes of mean effective stress during ground excitation and by dilatancy of the soil. Therefore the elasto-plastic constitutive model could precisely simulate the vertical acceleration and displacement response at the ground surface, which were produced by soil dilation during shaking with a horizontal base input in the model tests. In addition, this model can explain the distinctive dynamic earth pressure acting on the vertical walls of the structure, which was also confirmed to be related to soil dilation. However, since both modeling methods could express the shear force on the upper slab surface of the model structure, which plays the predominant role in structural deformation, both were equally applicable to the evaluation of the seismic performance of structures similar to the model structure of this study. (author)

  12. Toward the Development of a Coupled COAMPS-ROMS Ensemble Kalman Filter and Adjoint with a focus on the Indian Ocean and the Intraseasonal Oscillation

    Science.gov (United States)

    2015-09-30

    Toward the Development of a Coupled COAMPS-ROMS Ensemble Kalman Filter and Adjoint … system at NCAR. (2) Compare the performance of the Ensemble Kalman Filter (EnKF) using the Data Assimilation Research Testbed (DART) and 4… undercurrent is clearly visible. Figure 2 of the report shows the horizontal temperature structure and circulation at a depth of 50 m within the surface mixed layer.

  13. Fingerprint verification prediction model in hand dermatitis.

    Science.gov (United States)

    Lee, Chew K; Chang, Choong C; Johor, Asmah; Othman, Puwira; Baba, Roshidah

    2015-07-01

    Hand dermatitis-associated fingerprint changes are a significant problem and affect fingerprint verification processes. This study was done to develop a clinically useful prediction model for fingerprint verification in patients with hand dermatitis. A case-control study involving 100 patients with hand dermatitis was conducted. All patients verified their thumbprints against their identity card. Registered fingerprints were randomized into a model derivation and a model validation group. The predictive model was derived using multiple logistic regression. Validation was done using the goodness-of-fit test. The fingerprint verification prediction model consists of a major criterion (fingerprint dystrophy area of ≥ 25%) and two minor criteria (long horizontal lines and long vertical lines). The presence of the major criterion predicts that verification will almost always fail, while the presence of both minor criteria or of one minor criterion predicts a high or low risk of fingerprint verification failure, respectively. When none of the criteria are met, the fingerprint almost always passes verification. The area under the receiver operating characteristic curve was 0.937, and the goodness-of-fit test showed agreement between the observed and expected numbers (P = 0.26). The derived fingerprint verification failure prediction model is validated and highly discriminatory in predicting the risk of fingerprint verification failure in patients with hand dermatitis. © 2014 The International Society of Dermatology.
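    The decision rule reported in the abstract can be expressed directly as code. The sketch below encodes the major/minor criteria as stated; the category wording is paraphrased and no probability estimates from the validated model are reproduced.

        def fingerprint_verification_risk(dystrophy_area_pct,
                                          long_horizontal_lines,
                                          long_vertical_lines):
            """Rule-based risk category following the criteria in the abstract."""
            if dystrophy_area_pct >= 25:
                return "failure almost certain (major criterion met)"
            if long_horizontal_lines and long_vertical_lines:
                return "high risk of failure (both minor criteria)"
            if long_horizontal_lines or long_vertical_lines:
                return "low risk of failure (one minor criterion)"
            return "verification almost always passes (no criteria met)"

        # Usage
        print(fingerprint_verification_risk(30, False, False))
        print(fingerprint_verification_risk(10, True, True))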

  14. Filter apparatus

    International Nuclear Information System (INIS)

    Butterworth, D.J.

    1980-01-01

    This invention relates to liquid filters, precoated by replaceable powders, which are used in the production of ultra pure water required for steam generation of electricity. The filter elements are capable of being installed and removed by remote control so that they can be used in nuclear power reactors. (UK)

  15. Development of 2.8 V Ketjen black supercapacitors with high rate capabilities for AC line filtering

    Science.gov (United States)

    Yoo, Yongju; Park, Jinwoo; Kim, Min-Seop; Kim, Woong

    2017-08-01

    Supercapacitors are generally more compact than conventional bulky aluminum electrolytic capacitors (AECs). Replacement of AECs with supercapacitors can lead to miniaturization of electronic devices. However, even state-of-the-art supercapacitors developed in laboratories are superior to or competitive with AECs only in low voltage applications (<∼40 V). In order to improve the voltage limits of current supercapacitors, we have incorporated Ketjen black (KB) as an electrode material. Utilizing the open pore structure and the graphitic nature of KB, we demonstrate that the voltage limit can be extended to 53 V. The KB supercapacitor exhibits excellent areal capacitance, cell voltage, and phase angle values of ∼574 μF cm-2, 2.8 V, and ∼-80°, respectively. In addition, we demonstrate that an AC line filtering circuit with three supercapacitors connected in series can extend the application voltage without significant sacrifice in rate capability (ϕ ∼ -77° at 120 Hz). On the other hand, KBs are much less expensive than carbon materials previously demonstrated for AC line filtering and hence are very attractive for practical applications. We believe that this demonstration of high-performance supercapacitors made from low-cost carbon materials is both scientifically interesting and important for practical applications.

  16. Enteric virus removal from water by coal-based sorbents: development of low-cost water filters

    Energy Technology Data Exchange (ETDEWEB)

    Chaudhuri, M.; Sattar, S.A.

    1986-01-01

    Using poliovirus type 1 (Sabin) and dechlorinated tap water, several coal-based sorbents were tested for their capacity to remove viruses from water. The sorbents included bituminous coal from Giridih, India, pretreated/impregnated with either alum, ferric hydroxide, lime or manganese dioxide. Filtrasorb-400, a commercially available activated carbon, was used as a reference. In batch tests, with an input virus concentration of 2.34-2.83x10⁶ PFU/l and a sorbent concentration of 10 g/l, alum-pretreated coal removed about 96% of the virus when the pH of the water was between 6.3 and 8.9. Virus sorption was rapid and a plateau was reached in 30 min. Compared with the activated carbon, alum-pretreated coal exhibited greater sorption energy and about one log higher limiting poliovirus sorption capacity. A downflow column study indicated the potential of alum-pretreated coal as a filter medium for removing enteric viruses from water. A previous study showed this sorbent to be capable of removing enteric bacteria as well. Water filters prepared from such low-cost material may prove useful for domestic use in rural areas of India and other developing countries. 19 refs.

  17. Inspector measurement verification activities

    International Nuclear Information System (INIS)

    George, R.S.; Crouch, R.

    The most difficult and complex activity facing a safeguards inspector involves the verification of measurements and of the performance of the measurement system. Remeasurement is the key to measurement verification activities. Remeasurements using the facility's measurement system provide the bulk of the data needed for determining the performance of the measurement system. Remeasurements by reference laboratories are also important for evaluation of the measurement system and determination of systematic errors. The use of these measurement verification activities in conjunction with accepted inventory verification practices provides a better basis for accepting or rejecting an inventory. (U.S.)

  18. Development of procedures for calculating stiffness and damping properties of elastomers in engineering applications. Part 1: Verification of basic methods

    Science.gov (United States)

    Chiang, T.; Tessarzik, J. M.; Badgley, R. H.

    1972-01-01

    The primary aim of this investigation was verification of basic methods which are to be used in cataloging elastomer dynamic properties (stiffness and damping) in terms of viscoelastic model constants. These constants may then be used to predict dynamic properties for general elastomer shapes and operating conditions, thereby permitting optimum application of elastomers as energy absorption and/or energy storage devices in the control of vibrations in a broad variety of applications. The efforts reported involved: (1) literature search; (2) the design, fabrication and use of a test rig for obtaining elastomer dynamic test data over a wide range of frequencies, amplitudes, and preloads; and (3) the reduction of the test data, by means of a selected three-element elastomer model and specialized curve fitting techniques, to material properties. Material constants thus obtained have been used to calculate stiffness and damping for comparison with measured test data. These comparisons are excellent for a number of test conditions and only fair to poor for others. The results confirm the validity of the basic approach of the overall program and the mechanics of the cataloging procedure, and at the same time suggest areas in which refinements should be made.
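    The cataloging approach described above reduces measured dynamic stiffness and damping to the constants of a three-element viscoelastic model via curve fitting. As a hedged illustration (the report's exact model form and fitting procedure may differ), the snippet below fits one common three-element form, a spring in parallel with a Maxwell arm, to complex-stiffness data using scipy; all parameter values and names are synthetic placeholders.

        import numpy as np
        from scipy.optimize import curve_fit

        def three_element_stiffness(omega, k0, k1, c1):
            """Complex stiffness of a spring k0 in parallel with a Maxwell arm
            (spring k1 in series with dashpot c1); storage and loss parts are
            stacked into one vector so curve_fit can handle complex data."""
            jw = 1j * omega
            k_star = k0 + (k1 * c1 * jw) / (k1 + c1 * jw)
            return np.concatenate([k_star.real, k_star.imag])

        # omega and the measured storage/loss stiffness would come from the
        # test rig described in the report; synthetic data are used here.
        omega = 2 * np.pi * np.logspace(1, 3, 30)
        true = three_element_stiffness(omega, 2.0e5, 8.0e5, 300.0)
        meas = true * (1 + 0.02 * np.random.randn(true.size))

        popt, _ = curve_fit(three_element_stiffness, omega, meas,
                            p0=[1e5, 1e5, 100.0])
        k0_fit, k1_fit, c1_fit = popt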

  19. Development of a Compton camera for online ion beam range verification via prompt γ detection. Session: HK 12.6 Mo 18:30

    Energy Technology Data Exchange (ETDEWEB)

    Aldawood, S. [LMU Munich, Garching (Germany); King Saud University, Riyadh (Saudi Arabia); Liprandi, S.; Marinsek, T.; Bortfeldt, J.; Lang, C.; Lutter, R.; Dedes, G.; Parodi, K.; Thirolf, P.G. [LMU Munich, Garching (Germany); Maier, L.; Gernhaeuser, R. [TU Munich, Garching (Germany); Kolff, H. van der; Schaart, D. [TU Delft (Netherlands); Castelhano, I. [University of Lisbon, Lisbon (Portugal)

    2015-07-01

    Real-time ion beam range verification in hadron therapy plays a major role in cancer treatment evaluation, as it makes interruption of the treatment possible if the planned and actual ion ranges do not match. An imaging system is being developed in Garching aiming to detect prompt γ rays induced by nuclear reactions between the ion beam and biological tissue. The Compton camera prototype consists of a stack of six customized double-sided Si-strip detectors (DSSSD, 50 x 50 mm², 128 strips/side) acting as scatterer, while the absorber is formed by a monolithic LaBr₃:Ce scintillator crystal (50 x 50 x 30 mm³) read out by a position-sensitive multi-anode photomultiplier (Hamamatsu H9500). Studies of the Compton camera properties and of its individual components are in progress, both in the laboratory and at online facilities.
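    Event reconstruction in such a Compton camera conventionally relies on the standard Compton-scattering kinematics, which constrains the incidence direction of each prompt γ ray to a cone; the relation is quoted here as background and is not stated in the abstract:

        \cos\theta = 1 - m_e c^2 \left( \frac{1}{E_2} - \frac{1}{E_1 + E_2} \right)

    where E_1 is the energy deposited in the DSSSD scatterer stack, E_2 the energy absorbed in the LaBr₃:Ce crystal, θ the scattering angle defining the cone opening, and m_e c² = 511 keV the electron rest energy.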

  20. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT - BAGHOUSE FILTRATION PRODUCTS - TETRATEC PTFE TECHNOLOGIES TETRATEX 8005

    Science.gov (United States)

    Baghouse filtration products (BFPs) were evaluated by the Air Pollution Control Technology (APCT) pilot of the Environmental Technology Verification (ETV) Program. The performance factor verified was the mean outlet particle concentration for the filter fabric as a function of th...

  1. Verification of Simulation Tools

    International Nuclear Information System (INIS)

    Richard, Thierry

    2015-01-01

    Before qualifying a simulation tool, the requirements shall first be clearly identified, i.e.: What type of study needs to be carried out? What phenomena need to be modeled? This phase involves writing a precise technical specification. Once the requirements are defined, the most suitable product shall be selected from the various software options available on the market. Before a particular version of a simulation tool is used to support the demonstration of nuclear safety studies, the following requirements shall be met: an auditable quality assurance process complying with international development standards shall be developed and maintained, and a process of verification and validation (V and V) shall be implemented. This approach requires writing a report and/or executive summary of the V and V activities and defining a validated domain (the domain in which the difference between the results of the tool and those of another qualified reference is considered satisfactory for its intended use). Sufficient documentation shall be available, together with a detailed and formal description of the product (software version number, user configuration, other settings and parameters) in the targeted computing environment. Source codes corresponding to the software shall be archived appropriately. When these requirements are fulfilled, the version of the simulation tool shall be considered qualified for a defined domain of validity, in a given computing environment. The functional verification shall ensure that the computer architecture of the tool does not include errors, that the numerical solver correctly represents the physical mathematical model, and that the equations are solved correctly. The functional verification can be demonstrated through certification or a quality assurance report. The functional validation shall allow the user to ensure that the equations correctly represent the physical phenomena within the perimeter of intended use. The functional validation can

  2. SU-E-T-256: Development of a Monte Carlo-Based Dose-Calculation System in a Cloud Environment for IMRT and VMAT Dosimetric Verification

    Energy Technology Data Exchange (ETDEWEB)

    Fujita, Y [Tokai University School of Medicine, Isehara, Kanagawa (Japan)

    2015-06-15

    Purpose: Intensity-modulated radiation therapy (IMRT) and volumetric-modulated arc therapy (VMAT) are techniques that are widely used for treating cancer due to better target coverage and critical structure sparing. The increasing complexity of IMRT and VMAT plans leads to decreases in dose calculation accuracy. Monte Carlo simulations are the most accurate method for the determination of dose distributions in patients. However, the simulation settings for modeling an accurate treatment head are very complex and time consuming. The purpose of this work is to report our implementation of a simple Monte Carlo simulation system in a cloud-computing environment for dosimetric verification of IMRT and VMAT plans. Methods: Monte Carlo simulations of a Varian Clinac linear accelerator were performed using the BEAMnrc code, and dose distributions were calculated using the DOSXYZnrc code. Input files for the simulations were automatically generated from DICOM RT files by the developed web application. We therefore must only upload the DICOM RT files through the web interface, and the simulations are run in the cloud. The calculated dose distributions were exported to RT Dose files that can be downloaded through the web interface. The accuracy of the calculated dose distribution was verified by dose measurements. Results: IMRT and VMAT simulations were performed and good agreement results were observed for measured and MC dose comparison. Gamma analysis with a 3% dose and 3 mm DTA criteria shows a mean gamma index value of 95% for the studied cases. Conclusion: A Monte Carlo-based dose calculation system has been successfully implemented in a cloud environment. The developed system can be used for independent dose verification of IMRT and VMAT plans in routine clinical practice. The system will also be helpful for improving accuracy in beam modeling and dose calculation in treatment planning systems. This work was supported by JSPS KAKENHI Grant Number 25861057.
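    The 3%/3 mm gamma analysis quoted in the results can be illustrated with a simplified one-dimensional global gamma computation. The sketch below is not the clinical 3D implementation used for IMRT/VMAT verification; the function and variable names are invented for illustration.

        import numpy as np

        def gamma_index_1d(dose_ref, dose_eval, positions, dose_tol=0.03, dist_tol=3.0):
            """Simplified 1D global gamma analysis (e.g. 3% / 3 mm).

            dose_ref, dose_eval : reference and evaluated dose on the same grid
            positions           : spatial coordinates in mm
            Returns the gamma value at each reference point; the pass rate is
            the fraction of points with gamma <= 1.
            """
            d_max = float(np.max(dose_ref))
            gammas = np.empty(len(dose_ref))
            for i, (x_r, d_r) in enumerate(zip(positions, dose_ref)):
                dd = (dose_eval - d_r) / (dose_tol * d_max)  # dose-difference term
                dx = (positions - x_r) / dist_tol            # distance-to-agreement term
                gammas[i] = np.sqrt(dd**2 + dx**2).min()
            return gammas

        # Usage: pass rate for a 3%/3 mm criterion on toy profiles
        x_mm = np.arange(0.0, 100.0, 1.0)
        measured = np.exp(-((x_mm - 50.0) / 20.0) ** 2)
        calculated = np.exp(-((x_mm - 50.5) / 20.0) ** 2)
        gammas = gamma_index_1d(measured, calculated, x_mm)
        pass_rate = np.mean(gammas <= 1.0)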

  3. SU-E-T-256: Development of a Monte Carlo-Based Dose-Calculation System in a Cloud Environment for IMRT and VMAT Dosimetric Verification

    International Nuclear Information System (INIS)

    Fujita, Y

    2015-01-01

    Purpose: Intensity-modulated radiation therapy (IMRT) and volumetric-modulated arc therapy (VMAT) are techniques that are widely used for treating cancer due to better target coverage and critical structure sparing. The increasing complexity of IMRT and VMAT plans leads to decreases in dose calculation accuracy. Monte Carlo simulations are the most accurate method for the determination of dose distributions in patients. However, the simulation settings for modeling an accurate treatment head are very complex and time consuming. The purpose of this work is to report our implementation of a simple Monte Carlo simulation system in a cloud-computing environment for dosimetric verification of IMRT and VMAT plans. Methods: Monte Carlo simulations of a Varian Clinac linear accelerator were performed using the BEAMnrc code, and dose distributions were calculated using the DOSXYZnrc code. Input files for the simulations were automatically generated from DICOM RT files by the developed web application. We therefore must only upload the DICOM RT files through the web interface, and the simulations are run in the cloud. The calculated dose distributions were exported to RT Dose files that can be downloaded through the web interface. The accuracy of the calculated dose distribution was verified by dose measurements. Results: IMRT and VMAT simulations were performed and good agreement results were observed for measured and MC dose comparison. Gamma analysis with a 3% dose and 3 mm DTA criteria shows a mean gamma index value of 95% for the studied cases. Conclusion: A Monte Carlo-based dose calculation system has been successfully implemented in a cloud environment. The developed system can be used for independent dose verification of IMRT and VMAT plans in routine clinical practice. The system will also be helpful for improving accuracy in beam modeling and dose calculation in treatment planning systems. This work was supported by JSPS KAKENHI Grant Number 25861057

  4. Verification and the safeguards legacy

    International Nuclear Information System (INIS)

    Perricos, Demetrius

    2001-01-01

    A number of inspection or monitoring systems throughout the world over the last decades have been structured drawing upon the IAEA's experience of setting up and operating its safeguards system. The first global verification system was born with the creation of the IAEA safeguards system, about 35 years ago. With the conclusion of the NPT in 1968, inspections were to be performed under safeguards agreements concluded directly between the IAEA and non-nuclear-weapon states parties to the Treaty. The IAEA developed the safeguards system within the limitations reflected in the Blue Book (INFCIRC 153), such as limitations of routine access by the inspectors to 'strategic points', including 'key measurement points', and the focusing of verification on declared nuclear material in declared installations. The system was based on nuclear material accountancy. It was expected to detect a diversion of nuclear material with high probability and within a given time, and therefore also to determine that there had been no diversion of nuclear material from peaceful purposes. The most vital element of any verification system is the inspector. Technology can assist but cannot replace the inspector in the field. Their experience, knowledge, intuition and initiative are invaluable factors contributing to the success of any inspection regime. The IAEA inspectors are, however, not part of an international police force that will intervene to prevent a violation from taking place. To be credible they should be technically qualified, with substantial experience in industry or in research and development before they are recruited. An extensive training program has to make sure that the inspectors retain their professional capabilities and that it provides them with new skills. Over the years, the inspectors, and through them the safeguards verification system, gained experience in: organization and management of large teams; examination of records and evaluation of material balances

  5. Development of filtered containment venting system and application for Kashiwazaki-Kariwa Nuclear Power Station Unit 6, 7

    International Nuclear Information System (INIS)

    Murai, Soutarou; Hiranuma, Naoki; Kimura, Takeo; Omori, Shuichi; Watanabe, Fumitoshi; Sasa, Daisuke

    2014-01-01

    The Fukushima Dai-ichi Nuclear Power Station (1F) of the Tokyo Electric Power Company (TEPCO) experienced a severe radioactive release to the environment following the Tohoku Region Pacific Coast Earthquake (the Great East Japan Earthquake) in 2011. Under the station black-out (SBO) conditions caused by the tsunami accompanying the earthquake, the 1F operators tried to vent the gases in the primary containment vessels (PCVs) of units 1, 2 and 3 to the environment through the water pools in the suppression chambers of the PCVs. This venting, however, was imperfect and, as a result, a major direct radioactive release to the environment occurred. After this disaster, TEPCO launched a project to develop a Filtered Containment Venting System (FCVS) in which these bitter experiences from the 1F accident are reflected. One of the main purposes of the FCVS development is to enhance the operability of venting under severe plant conditions, such as SBO during the progression of severe core damage; another is to enhance the removal of radionuclides with newly added filtering equipment installed in the venting line from the PCV to the outside. The Kashiwazaki-Kariwa NPS units 6 and 7 will be the first reactors to which the FCVS is applied. In this paper, we show the design concept of TEPCO's FCVS, a brief overview of the system design and a summary of the experiments performed to obtain FCVS performance data, such as the decontamination factor, under various conditions. (author)
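    For reference, the decontamination factor (DF) quoted as the key performance figure for such filtered venting systems is conventionally defined as follows (a standard definition, not a detail taken from this paper):

        \mathrm{DF} = \frac{\text{activity (or aerosol mass) entering the filter}}{\text{activity (or aerosol mass) released downstream}}

    so that, for example, a DF of 1000 means that only 0.1% of the incoming activity passes the filter.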

  6. The history of ceramic filters.

    Science.gov (United States)

    Fujishima, S

    2000-01-01

    The history of ceramic filters is surveyed. Included is the history of piezoelectric ceramics. Ceramic filters were developed using technology similar to that of quartz crystal and electro-mechanical filters. However, the key to this development involved the theoretical analysis of vibration modes and material improvements of piezoelectric ceramics. The primary application of ceramic filters has been for consumer-market use. Accordingly, a major emphasis has involved mass production technology, leading to low-priced devices. A typical ceramic filter includes monolithic resonators and capacitors packaged in unique configurations.

  7. Remotely operated top loading filter housing

    International Nuclear Information System (INIS)

    Ross, M.J.; Carter, J.A.

    1989-01-01

    A high-efficiency particulate air (HEPA) filter system was developed for the Fuel Processing Facility at the Idaho Chemical Processing Plant. The system utilizes commercially available HEPA filters and allows in-cell filters to be maintained using operator-controlled remote handling equipment. The remote handling tasks include transport of filters before and after replacement, removal and replacement of the filter from the housing, and filter containment

  8. Filter systems

    International Nuclear Information System (INIS)

    Vanin, V.R.

    1990-01-01

    The multidetector systems for high-resolution gamma spectroscopy are presented. The observable parameters for identifying nuclides produced simultaneously in the reaction are analysed, and the efficiency of filter systems is discussed. (M.C.K.)

  9. Analog filters in nanometer CMOS

    CERN Document Server

    Uhrmann, Heimo; Zimmermann, Horst

    2014-01-01

    Starting from the basics of analog filters and the poor transistor characteristics in nanometer CMOS, 10 high-performance analog filters developed by the authors in 120 nm and 65 nm CMOS are described extensively. Among them are gm-C filters, current-mode filters, and active filters for system-on-chip realization for Bluetooth, WCDMA, UWB, DVB-H, and LTE applications. For the active filters, several operational amplifier designs are described. The book furthermore contains a review of the newest state of research on low-voltage, low-power analog filters. To cover the topic comprehensively, linearization issues and measurement methods for the characterization of advanced analog filters are introduced in addition. Numerous elaborate illustrations promote easy comprehension. This book will be of value to engineers and researchers in industry as well as scientists and Ph.D. students at universities. The book is also recommended to graduate students specializing in nanoelectronics, microelectronics ...

  10. Verification Account Management System (VAMS)

    Data.gov (United States)

    Social Security Administration — The Verification Account Management System (VAMS) is the centralized location for maintaining SSA's verification and data exchange accounts. VAMS account management...

  11. Development of Filter-Blower Unit for use in the Advanced Nuclear Biological Chemical Protection System (ANBCPS) Helicopter/Transport-aircraft version

    NARCIS (Netherlands)

    Sabel, R.; Reffeltrath, P.A.; Jonkman, A.; Post, T.

    2006-01-01

    As a participant in the three-nation partnership for development of the ANBCP-S for use in Helicopters, Transport Aircraft and Fast Jet, the Royal Netherlands Airforce (RNLAF) picked up the challenge to design a Filter- Blower-Unit (FBU). Major Command (MajCom) of the RNLAF set priority to develop a

  12. As-Built Verification Plan Spent Nuclear Fuel Canister Storage Building MCO Handling Machine

    International Nuclear Information System (INIS)

    SWENSON, C.E.

    2000-01-01

This as-built verification plan outlines the methodology and responsibilities that will be implemented during the as-built field verification activity for the Canister Storage Building (CSB) MCO Handling Machine (MHM). This as-built verification plan covers the electrical portion of the construction performed by Power City under contract to Mowat. The as-built verifications will be performed in accordance with Administrative Procedure AP 6-012-00, Spent Nuclear Fuel Project As-Built Verification Plan Development Process, revision I. The results of the verification walkdown will be documented in a verification walkdown completion package, approved by the Design Authority (DA), and maintained in the CSB project files

  13. Development of polarized {sup 3}He filter for polarized neutron experiment

    Energy Technology Data Exchange (ETDEWEB)

    Sakai, K.; Sato, H.; Yoshimi, A.; Asahi, K. [Tokyo Inst. of Tech. (Japan). Faculty of Science; Masuda, Y.; Muto, S.; Ishimoto, S.; Morimoto, K.

    1996-08-01

A high-pressure polarized {sup 3}He gas cell, pumped with two diode lasers, has been developed at KEK for use as a polarizer and a spin analyzer for low energy neutrons. The attained {sup 3}He polarization was determined by measuring the transmission of unpolarized neutrons through the {sup 3}He cell. So far we have obtained P{sub He}=18% at 10 atm and P{sub He}=12% at 20 atm. (author)

  14. Development of a prototype real-time automated filter for operational deep space navigation

    Science.gov (United States)

    Masters, W. C.; Pollmeier, V. M.

    1994-01-01

Operational deep space navigation has been, and is currently, performed using systems whose architectures require constant human supervision and intervention. A prototype has been developed for a system that allows relatively automated processing of radio metric data received in near real time from NASA's Deep Space Network (DSN) without any redesign of the existing operational data flow. This system allows more rapid response as well as much reduced staffing to support mission navigation operations.

  15. Development of Web GIS-Based VFSMOD System with Three Modules for Effective Vegetative Filter Strip Design

    Directory of Open Access Journals (Sweden)

    Dong Soo Kong

    2013-08-01

Full Text Available In recent years, non-point source pollution has been rising as a significant environmental issue. The sediment-laden water problem is causing serious impacts on river ecosystems not only in South Korea but also in most countries. The vegetative filter strip (VFS) is considered one of the most effective methods to reduce the transport of sediment to down-gradient areas. However, the effective width of the VFS first needs to be determined before VFS installation in the field. To provide an easy-to-use interface with a scientific VFS modeling engine, the Web GIS-based VFSMOD system was developed in this study. The Web GIS-based VFSMOD uses the UH and VFSM executable programs from the VFSMOD-w model as core engines to simulate rainfall-runoff and sediment trapping. To provide soil information for a point of interest, the Google Map interface to the MapServer soil database system was developed using the Google Map API, Javascript, Perl/CGI, and Oracle DB programming. Three modules of the Web GIS-based VFSMOD system were developed for various VFS designs under single storm, multiple storm, and long-term period scenarios. These modules were applied to the study watershed in South Korea and proved to be efficient tools for VFS design for various purposes.

  16. Impact of Chloramination on the Development of Laboratory-Grown Biofilms Fed with Filter-Pretreated Groundwater

    KAUST Repository

    Ling, Fangqiong

    2013-01-01

    This study evaluated the continuous impact of monochloramine disinfection on laboratory-grown biofilms through the characterization of biofilm architecture and microbial community structure. Biofilm development and disinfection were achieved using CDC (Centers for Disease Control and Prevention) biofilm reactor systems with polyvinyl chloride (PVC) coupons as the substratum and sand filter-pretreated groundwater as the source of microbial seeding and growth nutrient. After 2 weeks of growth, the biofilms were subjected to chloramination for 8 more weeks at concentrations of 7.5±1.4 to 9.1±0.4 mg Cl2 L-1. Control reactors received no disinfection during the development of biofilms. Confocal laser scanning microscopy and image analysis indicated that chloramination could lead to 81.4-83.5% and 86.3-95.6% reduction in biofilm biomass and thickness, respectively, but could not eliminate biofilm growth. 16S rRNA gene terminal restriction fragment length polymorphism analysis indicated that microbial community structures between chloraminated and non-chloraminated biofilms exhibited different successional trends. 16S rRNA gene pyrosequencing analysis further revealed that chloramination could select members of Actinobacteria and Acidobacteria as the dominant populations, whereas natural development leads to the selection of members of Nitrospira and Bacteroidetes as dominant biofilm populations. Overall, chloramination treatment could alter the growth of multi-species biofilms on the PVC surface, shape the biofilm architecture, and select a certain microbial community that can survive or proliferate under chloramination.

  17. Verification and Validation of Embedded Knowledge-Based Software Systems

    National Research Council Canada - National Science Library

    Santos, Eugene

    1999-01-01

    .... We pursued this by carefully examining the nature of uncertainty and information semantics and developing intelligent tools for verification and validation that provides assistance to the subject...

  18. Verification and Validation in Systems Engineering

    CERN Document Server

    Debbabi, Mourad; Jarraya, Yosr; Soeanu, Andrei; Alawneh, Luay

    2010-01-01

    "Verification and validation" represents an important process used for the quality assessment of engineered systems and their compliance with the requirements established at the beginning of or during the development cycle. Debbabi and his coauthors investigate methodologies and techniques that can be employed for the automatic verification and validation of systems engineering design models expressed in standardized modeling languages. Their presentation includes a bird's eye view of the most prominent modeling languages for software and systems engineering, namely the Unified Model

  19. Verification and validation for waste disposal models

    International Nuclear Information System (INIS)

    1987-07-01

A set of evaluation criteria has been developed to assess the suitability of current verification and validation techniques for waste disposal methods. A survey of current practices and techniques was undertaken and evaluated using these criteria, and the items most relevant to waste disposal models were identified. Recommendations regarding the most suitable verification and validation practices for nuclear waste disposal modelling software have been made

  20. FMCT verification: Case studies

    International Nuclear Information System (INIS)

    Hui Zhang

    2001-01-01

    Full text: How to manage the trade-off between the need for transparency and the concern about the disclosure of sensitive information would be a key issue during the negotiations of FMCT verification provision. This paper will explore the general concerns on FMCT verification; and demonstrate what verification measures might be applied to those reprocessing and enrichment plants. A primary goal of an FMCT will be to have the five declared nuclear weapon states and the three that operate unsafeguarded nuclear facilities become parties. One focus in negotiating the FMCT will be verification. Appropriate verification measures should be applied in each case. Most importantly, FMCT verification would focus, in the first instance, on these states' fissile material production facilities. After the FMCT enters into force, all these facilities should be declared. Some would continue operating to produce civil nuclear power or to produce fissile material for non- explosive military uses. The verification measures necessary for these operating facilities would be essentially IAEA safeguards, as currently being applied to non-nuclear weapon states under the NPT. However, some production facilities would be declared and shut down. Thus, one important task of the FMCT verifications will be to confirm the status of these closed facilities. As case studies, this paper will focus on the verification of those shutdown facilities. The FMCT verification system for former military facilities would have to differ in some ways from traditional IAEA safeguards. For example, there could be concerns about the potential loss of sensitive information at these facilities or at collocated facilities. Eventually, some safeguards measures such as environmental sampling might be seen as too intrusive. Thus, effective but less intrusive verification measures may be needed. Some sensitive nuclear facilities would be subject for the first time to international inspections, which could raise concerns

  1. Driving Force Filtering and Driving Mechanism Analysis of Urban Agricultural Development in Weifang County, China

    Directory of Open Access Journals (Sweden)

    SUI Fei-fei

    2016-03-01

Full Text Available As an agricultural nation, China's agricultural landscape is its basic appearance and a common presence, but this common presence is often neglected and held in low regard. As a new type of design and ideology, the development of the urban agricultural landscape will greatly affect the texture and structure of urban space. Based on the urban agricultural production data and the socio-economic data of Weifang County, an evaluation index system was built that can quantitatively analyze the driving forces of urban agricultural production changes and their internal driving mechanism. Original driving-force indicators covering economy, society, resources and environment were chosen from the time series, and 15 driving forces were then selected from them by correlation analysis and principal component analysis. The degree of influence was analyzed and a driving-force model was built by means of partial least squares (PLS) regression. The results demonstrated that the factors that greatly influenced the increase of urban agricultural output value in Weifang County were per capita net income of rural residents, agricultural machinery total power, effective irrigation area, and centralized treatment rate of urban sewage, with driving exponents of 0.2509, 0.1019, 0.1655, and 0.1332, respectively. The negative influence factor was the amount of agricultural plastic film used, with a driving exponent of -0.2146. The research provides a reference for the development of urban agriculture, as well as for related studies.
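    The screening-then-regression workflow described above (correlation and principal component screening of indicators, followed by a partial least squares model whose coefficients play the role of driving exponents) can be sketched as follows. This is a minimal illustration, not the authors' code: the file name, indicator columns, correlation cut-off and number of PLS components are all assumptions.

```python
# Minimal sketch of the correlation -> screening -> PLS workflow described above.
# All variable names and the CSV layout are hypothetical; the paper's actual
# indicator set and data are not reproduced here.
import numpy as np
import pandas as pd
from sklearn.cross_decomposition import PLSRegression
from sklearn.preprocessing import StandardScaler

# Hypothetical table: one row per year, columns = driving-force indicators
# plus the response (urban agricultural output value).
df = pd.read_csv("weifang_indicators.csv")          # assumed file
y = df.pop("agri_output_value").to_numpy()

# Keep only indicators reasonably correlated with the response (screening step).
corr = df.corrwith(pd.Series(y, index=df.index)).abs()
X = df.loc[:, corr > 0.3]                            # 0.3 is an arbitrary cut-off

# Standardise and fit a PLS model; the signed coefficients indicate the
# direction and relative strength of each driving force.
Xs = StandardScaler().fit_transform(X)
ys = StandardScaler().fit_transform(y.reshape(-1, 1)).ravel()
pls = PLSRegression(n_components=2).fit(Xs, ys)

for name, coef in zip(X.columns, pls.coef_.ravel()):
    print(f"{name:35s} {coef:+.4f}")
```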

  2. Advanced verification topics

    CERN Document Server

    Bhattacharya, Bishnupriya; Hall, Gary; Heaton, Nick; Kashai, Yaron; Khan Neyaz; Kirshenbaum, Zeev; Shneydor, Efrat

    2011-01-01

    The Accellera Universal Verification Methodology (UVM) standard is architected to scale, but verification is growing and in more than just the digital design dimension. It is growing in the SoC dimension to include low-power and mixed-signal and the system integration dimension to include multi-language support and acceleration. These items and others all contribute to the quality of the SOC so the Metric-Driven Verification (MDV) methodology is needed to unify it all into a coherent verification plan. This book is for verification engineers and managers familiar with the UVM and the benefits it brings to digital verification but who also need to tackle specialized tasks. It is also written for the SoC project manager that is tasked with building an efficient worldwide team. While the task continues to become more complex, Advanced Verification Topics describes methodologies outside of the Accellera UVM standard, but that build on it, to provide a way for SoC teams to stay productive and profitable.

  3. Development of an electrometer/amplifier and filter set for analysis of reactor noise

    International Nuclear Information System (INIS)

    Strohl, Claude Emile

    1996-01-01

In nuclear power reactors, the neutron detector signal depends on the number of fissions and the reactor power level. The detector signal can be divided into two components: a DC component, proportional to the average value, and an AC component, which is the fluctuating part superimposed on the DC component. The analysis of the fluctuating part of the signal is called noise analysis and allows us to investigate phenomena occurring within the reactor vessel, such as vibrations of fuel elements and changes in coolant density, temperature, pressure and flow. On the other hand, the measurement of the static DC part allows us to determine the local power density. This work describes the development of a personal-computer-based signal conditioning card that, together with a commercial data acquisition card, can be used for noise analysis and reactivity measurements of signals coming from ionization chambers or SPDs. (author)
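    The decomposition described above (a static DC part proportional to power and a fluctuating AC part used for noise analysis) can be illustrated with a short sketch. The synthetic signal, sampling rate and spectral settings below are assumptions for illustration, not the card's actual acquisition parameters.

```python
# Minimal sketch of the DC/AC decomposition described above: the mean of the
# detector signal gives the static (power-related) part, while the power
# spectral density of the fluctuating part is the quantity used in noise
# analysis. The synthetic signal below is purely illustrative.
import numpy as np
from scipy.signal import welch

fs = 1000.0                                  # sampling rate, Hz (assumed)
t = np.arange(0, 60, 1 / fs)
rng = np.random.default_rng(0)
signal = 5.0 + 0.01 * np.sin(2 * np.pi * 8 * t) + 0.005 * rng.standard_normal(t.size)

dc_component = signal.mean()                 # proportional to the average power level
ac_component = signal - dc_component         # fluctuating ("noise") part

# Welch estimate of the noise spectrum; peaks would reveal e.g. vibration lines.
freqs, psd = welch(ac_component, fs=fs, nperseg=4096)
print(f"DC level: {dc_component:.3f}")
print(f"PSD peak at {freqs[np.argmax(psd)]:.1f} Hz")
```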

  4. Development of a Novel Bone Conduction Verification Tool Using a Surface Microphone: Validation With Percutaneous Bone Conduction Users.

    Science.gov (United States)

    Hodgetts, William; Scott, Dylan; Maas, Patrick; Westover, Lindsey

    2018-03-23

There were 90 planned comparisons of interest, three at each frequency (3 × 10) for the three input levels (30 × 3). Therefore, to minimize the type 1 error associated with multiple comparisons, we adjusted alpha using the Holm-Bonferroni method. There were five comparisons that yielded significant differences between the skull simulator and surface microphone (test and retest) in the estimation of audibility. However, the mean difference in these effects was small at 3.3 dB. Both sensors yielded equivalent results for the majority of comparisons. Models of bone conduction devices that have intact skin cannot be measured with the skull simulator. This study is the first to present and evaluate a new tool for bone conduction verification. The surface microphone is capable of yielding audibility measurements equivalent to those of the skull simulator for percutaneous bone conduction users at multiple input levels. This device holds potential for measuring other bone conduction devices (Sentio, BoneBridge, Attract, soft headband devices) that do not have a percutaneous implant.
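    The Holm-Bonferroni adjustment mentioned above is a standard step-down procedure; a generic sketch is given below. The p-values and alpha level are placeholders, not the study's data.

```python
# Generic Holm-Bonferroni step-down procedure, as referenced in the abstract
# for the 90 planned comparisons (alpha = 0.05 assumed). The p-values below
# are placeholders, not the study's data.
import numpy as np

def holm_bonferroni(p_values, alpha=0.05):
    """Return a boolean array: True where the null hypothesis is rejected."""
    p = np.asarray(p_values, dtype=float)
    m = p.size
    order = np.argsort(p)                       # ascending p-values
    reject = np.zeros(m, dtype=bool)
    for rank, idx in enumerate(order):          # rank = 0 .. m-1
        if p[idx] <= alpha / (m - rank):        # step-down threshold
            reject[idx] = True
        else:
            break                               # stop at the first non-rejection
    return reject

p_vals = np.array([0.0003, 0.004, 0.020, 0.041, 0.30])   # illustrative only
print(holm_bonferroni(p_vals))                 # -> [ True  True False False False]
```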

  5. SU-E-T-800: Verification of Acurose XB Dose Calculation Algorithm at Air Cavity-Tissue Interface Using Film Measurement for Small Fields of 6-MV Flattening Filter-Free Beams

    International Nuclear Information System (INIS)

    Kang, S; Suh, T; Chung, J

    2015-01-01

Purpose: To verify the dose accuracy of the Acuros XB (AXB) dose calculation algorithm at the air-tissue interface using an inhomogeneous phantom for 6-MV flattening filter-free (FFF) beams. Methods: An inhomogeneous phantom including an air cavity was manufactured for verifying dose accuracy at the air-tissue interface. The phantom was constructed with air cavity thicknesses of 1 and 3 cm. To evaluate the central axis doses (CAD) and dose profiles at the interface, dose calculations were performed for 3 × 3 and 4 × 4 cm2 fields of 6 MV FFF beams with AAA and AXB in the Eclipse treatment planning system. Measurements in this region were performed with Gafchromic film. The root mean square errors (RMSE) between calculated and measured dose profiles were analyzed. Dose profiles were divided into inner-profile (>80%) and penumbra (20% to 80%) regions for evaluating RMSE. To quantify the distribution difference, gamma evaluation was used to determine agreement with 3%/3 mm criteria. Results: For the percentage differences (%Diffs) between measured and calculated CAD at the interface, AXB showed better agreement than AAA. The %Diffs increased with increasing air cavity thickness, and this trend was similar for both algorithms. In the RMSEs of the inner-profile, AXB was more accurate than AAA; the difference was up to 6 times, due to overestimation by AAA. The RMSEs of the penumbra showed larger differences with increasing measurement depth. Gamma evaluation also showed that the passing rates decreased in the penumbra. Conclusion: This study demonstrated that the dose calculation with AXB is more accurate than with AAA at the air-tissue interface. The 2D dose distributions with AXB, for both the inner-profile and the penumbra, showed better agreement than with AAA with respect to variation of the measurement depths and air cavity sizes
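    The RMSE comparison described above (splitting each profile into an inner region above 80% of the maximum dose and a penumbra between 20% and 80%) can be sketched as follows; the profiles and thresholds used here are illustrative assumptions, not the study's film data.

```python
# Minimal sketch of the profile comparison described above: the dose profile is
# split into an inner region (>80% of the maximum dose) and a penumbra region
# (20-80%), and the RMSE between calculation and measurement is evaluated
# separately in each. The arrays are placeholders, not the study's data.
import numpy as np

def region_rmse(measured, calculated):
    """RMSE of (calculated - measured) in the inner and penumbra regions."""
    measured = np.asarray(measured, float)
    calculated = np.asarray(calculated, float)
    rel = measured / measured.max() * 100.0          # normalise to % of maximum
    inner = rel > 80.0
    penumbra = (rel >= 20.0) & (rel <= 80.0)
    rmse = lambda m: np.sqrt(np.mean((calculated[m] - measured[m]) ** 2))
    return rmse(inner), rmse(penumbra)

# Illustrative 1-D profiles (arbitrary units), not film measurements.
x = np.linspace(-30, 30, 121)
meas = 100.0 / (1.0 + np.exp((np.abs(x) - 20.0) / 2.0))
calc = meas * (1.0 + 0.02 * np.sin(x / 5.0))
print("RMSE inner / penumbra:", region_rmse(meas, calc))
```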

  6. Updating the OMERACT filter

    DEFF Research Database (Denmark)

    Tugwell, Peter; Boers, Maarten; D'Agostino, Maria-Antonietta

    2014-01-01

    OBJECTIVE: The Outcome Measures in Rheumatology (OMERACT) Filter provides guidelines for the development and validation of outcome measures for use in clinical research. The "Truth" section of the OMERACT Filter requires that criteria be met to demonstrate that the outcome instrument meets...... the criteria for content, face, and construct validity. METHODS: Discussion groups critically reviewed a variety of ways in which case studies of current OMERACT Working Groups complied with the Truth component of the Filter and what issues remained to be resolved. RESULTS: The case studies showed...... that there is broad agreement on criteria for meeting the Truth criteria through demonstration of content, face, and construct validity; however, several issues were identified that the Filter Working Group will need to address. CONCLUSION: These issues will require resolution to reach consensus on how Truth...

  7. Updating the OMERACT filter

    DEFF Research Database (Denmark)

    Kirwan, John R; Boers, Maarten; Hewlett, Sarah

    2014-01-01

    OBJECTIVE: The Outcome Measures in Rheumatology (OMERACT) Filter provides guidelines for the development and validation of outcome measures for use in clinical research. The "Truth" section of the OMERACT Filter presupposes an explicit framework for identifying the relevant core outcomes...... for defining core areas of measurement ("Filter 2.0 Core Areas of Measurement") was presented at OMERACT 11 to explore areas of consensus and to consider whether already endorsed core outcome sets fit into this newly proposed framework. METHODS: Discussion groups critically reviewed the extent to which case......, presentation, and clarity of the framework were questioned. The discussion groups and subsequent feedback highlighted 20 such issues. CONCLUSION: These issues will require resolution to reach consensus on accepting the proposed Filter 2.0 framework of Core Areas as the basis for the selection of Core Outcome...

  8. Inorganic UV filters

    Directory of Open Access Journals (Sweden)

    Eloísa Berbel Manaia

    2013-06-01

Full Text Available Nowadays, concern over skin cancer has been growing more and more, especially in tropical countries where the incidence of UVA/B radiation is higher. The correct use of sunscreen is the most efficient way to prevent the development of this disease. The ingredients of sunscreen can be organic and/or inorganic sun filters. Inorganic filters present some advantages over organic filters, such as photostability, non-irritability and broad spectrum protection. Nevertheless, inorganic filters have a whitening effect in sunscreen formulations owing to their high refractive index, decreasing their esthetic appeal. Many techniques have been developed to overcome this problem and among them, the use of nanotechnology stands out. The estimated amount of nanomaterial in use is expected to increase from 2000 tons in 2004 to a projected 58000 tons in 2020. In this context, this article aims to analyze critically both the different features of the production of inorganic filters (synthesis routes proposed in recent years) and the permeability, safety and other characteristics of the new generation of inorganic filters.

  9. HDM/PASCAL Verification System User's Manual

    Science.gov (United States)

    Hare, D.

    1983-01-01

    The HDM/Pascal verification system is a tool for proving the correctness of programs written in PASCAL and specified in the Hierarchical Development Methodology (HDM). This document assumes an understanding of PASCAL, HDM, program verification, and the STP system. The steps toward verification which this tool provides are parsing programs and specifications, checking the static semantics, and generating verification conditions. Some support functions are provided such as maintaining a data base, status management, and editing. The system runs under the TOPS-20 and TENEX operating systems and is written in INTERLISP. However, no knowledge is assumed of these operating systems or of INTERLISP. The system requires three executable files, HDMVCG, PARSE, and STP. Optionally, the editor EMACS should be on the system in order for the editor to work. The file HDMVCG is invoked to run the system. The files PARSE and STP are used as lower forks to perform the functions of parsing and proving.

  10. Development of nuclear thermal hydraulic verification test and evaluation technology - Development of fundamental technique for experiment of natural circulation phenomena in PWR systems

    Energy Technology Data Exchange (ETDEWEB)

    Park, Goon Cherl; Lee, Tae Ho; Kim, Moon Oh; Kim, Hak Joon [Seoul National University, Seoul (Korea)

    2000-04-01

Dimensional analyses applying the two-fluid model of CFX-4.2 were performed. For verification of the analysis results, experimental measurement data of two-phase flow parameters in subcooled boiling flow were produced for vertical (0 deg) and inclined (60 deg) orientations. Through comparison of the analyses and experiments, the applicability of various two-phase flow models and the analysis capability of the code were evaluated. A measurement technique for bubble velocity in two-phase flow using a standard backscattering LDV was investigated from the slug to the bubbly flow regime. The range of velocities measured is from 0.2 to 1.5 m/s and that of bubble sizes is from 2 to 20 mm. For local temperature measurements in boiling flow, microthermocouples were manufactured and local liquid and vapor temperatures were measured in pool boiling and boiling flow. 66 refs., 74 figs., 4 tabs. (Author)

  11. Development of digital device based work verification system for cooperation between main control room operators and field workers in nuclear power plants

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Seung Min, E-mail: jewellee@kaeri.re.kr [Korea Atomic Energy Research Institute, 305-353, 989-111 Daedeok-daero, Yuseong-gu, Daejeon (Korea, Republic of); Department of Nuclear and Quantum Engineering, Korea Advanced Institute of Science and Technology, 373-1, Guseong-dong, Yuseong-gu, Daejeon 305-701 (Korea, Republic of); Lee, Hyun Chul, E-mail: leehc@kaeri.re.kr [Korea Atomic Energy Research Institute, 305-353, 989-111 Daedeok-daero, Yuseong-gu, Daejeon (Korea, Republic of); Ha, Jun Su, E-mail: junsu.ha@kustar.ac.ae [Department of Nuclear Engineering, Khalifa University of Science Technology and Research, Abu Dhabi P.O. Box 127788 (United Arab Emirates); Seong, Poong Hyun, E-mail: phseong@kaist.ac.kr [Department of Nuclear and Quantum Engineering, Korea Advanced Institute of Science and Technology, 373-1, Guseong-dong, Yuseong-gu, Daejeon 305-701 (Korea, Republic of)

    2016-10-15

    Highlights: • A digital device-based work verification and cooperation support system was developed. • Requirements were derived by interviewing field operators having experiences with mobile-based work support systems. • The usability of the proposed system was validated by conducting questionnaire surveys. • The proposed system will be useful if the manual or the set of guidelines is well constructed. - Abstract: Digital technologies have been applied in the nuclear field to check task results, monitor events and accidents, and transmit/receive data. The results of using digital devices have proven that these devices can provide high accuracy and convenience for workers, allowing them to obtain obvious positive effects by reducing their workloads. In this study, as one step forward, a digital device-based cooperation support system, the nuclear cooperation support and mobile documentation system (Nu-COSMOS), is proposed to support communication between main control room (MCR) operators and field workers by verifying field workers’ work results in nuclear power plants (NPPs). The proposed system consists of a mobile based information storage system to support field workers by providing various functions to make workers more trusted by MCR operators; also to improve the efficiency of meeting, and a large screen based information sharing system supports meetings by allowing both sides to share one medium. The usability of this system was estimated by interviewing field operators working in nuclear power plants and experts who have experience working as operators. A survey to estimate the usability of the suggested system and the suitability of the functions of the system for field working was conducted for 35 subjects who have experience in field works or with support system development-related research. The usability test was conducted using the system usability scale (SUS), which is widely used in industrial usability evaluation. Using questionnaires

  12. Development of digital device based work verification system for cooperation between main control room operators and field workers in nuclear power plants

    International Nuclear Information System (INIS)

    Lee, Seung Min; Lee, Hyun Chul; Ha, Jun Su; Seong, Poong Hyun

    2016-01-01

    Highlights: • A digital device-based work verification and cooperation support system was developed. • Requirements were derived by interviewing field operators having experiences with mobile-based work support systems. • The usability of the proposed system was validated by conducting questionnaire surveys. • The proposed system will be useful if the manual or the set of guidelines is well constructed. - Abstract: Digital technologies have been applied in the nuclear field to check task results, monitor events and accidents, and transmit/receive data. The results of using digital devices have proven that these devices can provide high accuracy and convenience for workers, allowing them to obtain obvious positive effects by reducing their workloads. In this study, as one step forward, a digital device-based cooperation support system, the nuclear cooperation support and mobile documentation system (Nu-COSMOS), is proposed to support communication between main control room (MCR) operators and field workers by verifying field workers’ work results in nuclear power plants (NPPs). The proposed system consists of a mobile based information storage system to support field workers by providing various functions to make workers more trusted by MCR operators; also to improve the efficiency of meeting, and a large screen based information sharing system supports meetings by allowing both sides to share one medium. The usability of this system was estimated by interviewing field operators working in nuclear power plants and experts who have experience working as operators. A survey to estimate the usability of the suggested system and the suitability of the functions of the system for field working was conducted for 35 subjects who have experience in field works or with support system development-related research. The usability test was conducted using the system usability scale (SUS), which is widely used in industrial usability evaluation. Using questionnaires
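    The system usability scale (SUS) scoring used in the surveys above follows a fixed rule: odd-numbered (positively worded) items contribute (score - 1), even-numbered items contribute (5 - score), and the raw sum is multiplied by 2.5. A minimal sketch with hypothetical responses:

```python
# Standard System Usability Scale (SUS) scoring, as used in the usability
# survey described above. The ten responses below (1-5 Likert scale) are
# purely illustrative, not data from the study.
def sus_score(responses):
    """responses: list of 10 Likert answers (1..5) in questionnaire order."""
    if len(responses) != 10:
        raise ValueError("SUS requires exactly 10 item responses")
    total = 0
    for i, r in enumerate(responses, start=1):
        # Odd-numbered items are positively worded, even-numbered negatively.
        total += (r - 1) if i % 2 == 1 else (5 - r)
    return total * 2.5          # scale the 0-40 raw score to 0-100

print(sus_score([4, 2, 5, 1, 4, 2, 4, 2, 5, 1]))   # -> 85.0
```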

  13. Development of a FBR fuel bundle-duct interaction analysis code-BAMBOO. Analysis model and verification by Phenix high burn-up fuel subassemblies

    International Nuclear Information System (INIS)

    Uwaba, Tomoyuki; Ito, Masahiro; Ukai, Shigeharu

    2005-01-01

The bundle-duct interaction analysis code ''BAMBOO'' has been developed for the purpose of predicting the deformation of a wire-wrapped fuel pin bundle of a fast breeder reactor (FBR). The BAMBOO code calculates helical bowing and oval-distortion of all the fuel pins in a fuel subassembly. We developed deformation models in order to precisely analyze the irradiation-induced deformation with the code: a model to analyze fuel pin self-bowing induced by the circumferential gradient of void swelling as well as thermal expansion, and a model to analyze dispersion of the orderly arrangement of a fuel pin bundle. We made deformation analyses of high burn-up fuel subassemblies in the Phenix reactor and compared the calculated results with the post-irradiation examination (PIE) data of these subassemblies for the verification of these models. From the comparison we confirmed that the calculated values of the oval-distortion and bowing agreed reasonably with the PIE results when these models were used in the analysis of the code. (author)

  14. On the possibility of developing incoherent fibre-optic data transmission systems based on signal spectral coding with matched acousto-optical filters

    International Nuclear Information System (INIS)

    Proklov, Valerii V; Byshevski-Konopko, O A; Grigorievski, V I

    2013-01-01

A scheme is suggested for developing an optical communication line based on the principle of code-division multiple access with matched acousto-optical filters and a 16-bit-long Walsh sequence. Results of modelling show that such a line can operate if adjacent spectral lines are separated by at least double the Rayleigh criterion. (optical information transmission)
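    The 16-bit Walsh sequences referred to above are rows of a sequency-ordered Hadamard matrix; a minimal sketch of how such a code family can be generated is shown below. The acousto-optic matched filtering itself is not modelled here.

```python
# Minimal sketch of generating the 16-bit Walsh sequences mentioned above
# (rows of a sequency-ordered 16x16 Hadamard matrix). This only illustrates
# the code family, not the paper's acousto-optic model.
import numpy as np

def hadamard(n):
    """Sylvester construction; n must be a power of two."""
    H = np.array([[1]])
    while H.shape[0] < n:
        H = np.block([[H, H], [H, -H]])
    return H

H16 = hadamard(16)                      # rows are mutually orthogonal +/-1 codes
# Order rows by sequency (number of sign changes), the usual Walsh ordering.
sign_changes = np.count_nonzero(np.diff(H16, axis=1) != 0, axis=1)
walsh = H16[np.argsort(sign_changes)]

# Orthogonality check: cross-correlation of distinct codes is zero.
print(walsh[3])
print(walsh @ walsh.T)                  # -> 16 * identity matrix
```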

  15. Land surface Verification Toolkit (LVT)

    Science.gov (United States)

    Kumar, Sujay V.

    2017-01-01

LVT is a framework developed to provide an automated, consolidated environment for systematic land surface model evaluation. It includes support for a range of in-situ, remote-sensing, and other model and reanalysis products, and supports the analysis of outputs from various LIS subsystems, including LIS-DA, LIS-OPT, and LIS-UE. Note: The Land Information System Verification Toolkit (LVT) is a NASA software tool designed to enable the evaluation, analysis and comparison of outputs generated by the Land Information System (LIS). The LVT software is released under the terms and conditions of the NASA Open Source Agreement (NOSA) Version 1.1 or later. Land Information System Verification Toolkit (LVT) NOSA.

  16. ENVIRONMENTAL TECHNOLOGY VERIFICATION: TEST/QA PLAN FOR THE VERIFICATION TESTING OF SELECTIVE CATALYTIC REDUCTION CONTROL TECHNOLOGIES FOR HIGHWAY, NONROAD, AND STATIONARY USE DIESEL ENGINES

    Science.gov (United States)

    The U.S. Environmental Protection Agency established the Environmental Technology Verification Program to accelerate the development and commercialization of improved environmental technology through third party verification and reporting of product performance. Research Triangl...

  17. Self-packed filter plates: a good alternative for pre-packed filter plates for developing purification processes of therapeutic proteins

    NARCIS (Netherlands)

    Li, X.; Roo, de G.; Burgers, K.; Ottens, M.; Eppink, M.H.M.

    2012-01-01

High throughput screening (HTS) has been successfully applied in the past years in the downstream process development of therapeutic proteins. Different HTS applications were introduced to speed up the purification process development of these proteins. In the light of these findings, studies

  18. Nuclear test ban verification

    International Nuclear Information System (INIS)

    Chun, Kin-Yip

    1991-07-01

This report describes verification and its rationale, the basic tasks of seismic verification, the physical basis for earthquake/explosion source discrimination and explosion yield determination, the technical problems pertaining to seismic monitoring of underground nuclear tests, the basic problem-solving strategy deployed by the forensic seismology research team at the University of Toronto, and the scientific significance of the team's research. The research carried out at the University of Toronto has two components: teleseismic verification using P wave recordings from the Yellowknife Seismic Array (YKA), and regional (close-in) verification using high-frequency L g and P n recordings from the Eastern Canada Telemetered Network. Major differences have been found in P wave attenuation among the propagation paths connecting the YKA listening post with seven active nuclear explosion testing areas in the world. Significant revisions have been made to previously published P wave attenuation results, leading to more interpretable nuclear explosion source functions. (11 refs., 12 figs.)

  19. Standard Verification System (SVS)

    Data.gov (United States)

    Social Security Administration — SVS is a mainframe program that accesses the NUMIDENT to perform SSN verifications. This program is called by SSA Internal applications to verify SSNs. There is also...

  20. Formal Verification -26 ...

    Indian Academy of Sciences (India)

    by testing of the components and successful testing leads to the software being ... Formal verification is based on formal methods which are mathematically based ..... scenario under which a similar error could occur. There are various other ...

  1. SSN Verification Service

    Data.gov (United States)

    Social Security Administration — The SSN Verification Service is used by Java applications to execute the GUVERF02 service using the WebSphere/CICS Interface. It accepts several input data fields...

  2. Verification of RADTRAN

    International Nuclear Information System (INIS)

    Kanipe, F.L.; Neuhauser, K.S.

    1995-01-01

    This document presents details of the verification process of the RADTRAN computer code which was established for the calculation of risk estimates for radioactive materials transportation by highway, rail, air, and waterborne modes

  3. Generalised Filtering

    Directory of Open Access Journals (Sweden)

    Karl Friston

    2010-01-01

Full Text Available We describe a Bayesian filtering scheme for nonlinear state-space models in continuous time. This scheme is called Generalised Filtering and furnishes posterior (conditional) densities on hidden states and unknown parameters generating observed data. Crucially, the scheme operates online, assimilating data to optimize the conditional density on time-varying states and time-invariant parameters. In contrast to Kalman and Particle smoothing, Generalised Filtering does not require a backwards pass. In contrast to variational schemes, it does not assume conditional independence between the states and parameters. Generalised Filtering optimises the conditional density with respect to a free-energy bound on the model's log-evidence. This optimisation uses the generalised motion of hidden states and parameters, under the prior assumption that the motion of the parameters is small. We describe the scheme, present comparative evaluations with a fixed-form variational version, and conclude with an illustrative application to a nonlinear state-space model of brain imaging time-series.

  4. Filter This

    Directory of Open Access Journals (Sweden)

    Audrey Barbakoff

    2011-03-01

    Full Text Available In the Library with the Lead Pipe welcomes Audrey Barbakoff, a librarian at the Milwaukee Public Library, and Ahniwa Ferrari, Virtual Experience Manager at the Pierce County Library System in Washington, for a point-counterpoint piece on filtering in libraries. The opinions expressed here are those of the authors, and are not endorsed by their employers. [...

  5. Development of Shunt-Type Three-Phase Active Power Filter with Novel Adaptive Control for Wind Generators

    Directory of Open Access Journals (Sweden)

    Ming-Hung Chen

    2015-01-01

Full Text Available This paper proposes a new adaptive filter for wind generators that combines instantaneous reactive power compensation technology and a current prediction controller; the system is therefore characterized by low harmonic distortion, high power factor, and small DC-link voltage variations during load disturbances. The performance of the system was first simulated using MATLAB/Simulink, and the ability of an adaptive digital low-pass filter to eliminate current harmonics was confirmed in steady and transient states. Subsequently, a digital signal processor was used to implement an active power filter. The experimental results indicate that, for the rated operation of 2 kVA, the system has a total harmonic distortion of current less than 5.0% and a power factor of 1.0 on the utility side. Thus, the transient performance of the adaptive filter is superior to that of the traditional digital low-pass filter, and it is more economical because of its short computation time compared with other types of adaptive filters.

  6. Development of Shunt-Type Three-Phase Active Power Filter with Novel Adaptive Control for Wind Generators.

    Science.gov (United States)

    Chen, Ming-Hung

    2015-01-01

This paper proposes a new adaptive filter for wind generators that combines instantaneous reactive power compensation technology and a current prediction controller; the system is therefore characterized by low harmonic distortion, high power factor, and small DC-link voltage variations during load disturbances. The performance of the system was first simulated using MATLAB/Simulink, and the ability of an adaptive digital low-pass filter to eliminate current harmonics was confirmed in steady and transient states. Subsequently, a digital signal processor was used to implement an active power filter. The experimental results indicate that, for the rated operation of 2 kVA, the system has a total harmonic distortion of current less than 5.0% and a power factor of 1.0 on the utility side. Thus, the transient performance of the adaptive filter is superior to that of the traditional digital low-pass filter, and it is more economical because of its short computation time compared with other types of adaptive filters.
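    As a generic illustration of the low-pass filtering step these papers build on (extracting the average part of the instantaneous power so that the remainder becomes the harmonic/reactive compensation reference), a first-order digital low-pass filter can be sketched as follows. This is not the adaptive controller proposed by the author; the sampling rate, cut-off frequency and test signal are assumptions.

```python
# Generic first-order digital low-pass filter of the kind used in shunt active
# power filters to extract the average part of the instantaneous power; the
# remainder becomes the harmonic/reactive compensation reference. This is a
# simplified sketch, not the adaptive controller proposed in the paper.
import numpy as np

def lowpass(x, fc, fs):
    """First-order IIR low-pass (cut-off fc, sample rate fs): y[n] = y[n-1] + a*(x[n] - y[n-1])."""
    a = 1.0 - np.exp(-2.0 * np.pi * fc / fs)
    y = np.empty_like(x)
    acc = 0.0
    for n, xn in enumerate(x):
        acc += a * (xn - acc)
        y[n] = acc
    return y

fs = 20_000.0                                   # sampling rate, Hz (assumed)
t = np.arange(0, 0.1, 1 / fs)
# Instantaneous power with a DC part plus 300 Hz ripple (illustrative only).
p = 2000.0 + 400.0 * np.sin(2 * np.pi * 300 * t)

p_avg = lowpass(p, fc=20.0, fs=fs)              # average (fundamental) power
p_comp = p - p_avg                              # oscillating part -> compensation reference
print(f"extracted average power ~ {p_avg[-1]:.0f} W")
```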

  7. Development and modelling of a steel slag filter effluent neutralization process with CO2-enriched air from an upstream bioprocess.

    Science.gov (United States)

    Bove, Patricia; Claveau-Mallet, Dominique; Boutet, Étienne; Lida, Félix; Comeau, Yves

    2018-02-01

The main objective of this project was to develop a steel slag filter effluent neutralization process by acidification with CO 2 -enriched air coming from a bioprocess. Sub-objectives were to evaluate the neutralization capacity of different configurations of neutralization units under lab-scale conditions and to propose a design model of steel slag effluent neutralization. Two lab-scale column neutralization units fed with two different types of influent were operated at a hydraulic retention time of 10 h. Tested variables were mode of flow (saturated or percolating), type of media (none, gravel, Bionest and AnoxKaldnes K3), type of air (ambient or CO 2 -enriched) and airflow rate. One neutralization field test (saturated and no media, 2000-5000 ppm CO 2 , sequential feeding, hydraulic retention time of 7.8 h) was conducted for 7 days. Lab-scale and field-scale tests resulted in an effluent pH of 7.5-9.5 when the aeration rate was sufficiently high. A model was implemented in the PHREEQC software; it was based on the carbonate system, CO 2 transfer and calcite precipitation, and was calibrated on the ambient air lab tests. The model was validated with CO 2 -enriched air lab and field tests, providing satisfactory validation results over a wide range of CO 2 concentrations. The flow mode had a major impact on CO 2 transfer and hydraulic efficiency, while the type of media had little influence. The flow mode also had a major impact on the calcite surface concentration in the reactor: it was constant in saturated mode and increasing in percolating mode. Predictions could be made for different steel slag effluent pH values and different operating conditions (hydraulic retention time, CO 2 concentration, media and mode of flow). The pH of the steel slag filter effluent and the CO 2 concentration of the enriched air were the factors that most influenced the effluent pH of the neutralization process. An increased concentration of CO 2 in the enriched air reduced calcite precipitation

  8. Development of a LiF-filter for measuring plasma fluctuations in the far ultraviolet radiation spectral range

    International Nuclear Information System (INIS)

    Schittenhelm, M.

    1991-06-01

The investigation of fluctuations and anomalous transport lies at the heart of the tokamak research program, especially in the shear zone close to and beyond the last closed flux surface. Until now, fluctuation measurements using plasma radiation have only been made at the edge of the plasma, since they rely on the H α emission. In order to measure electron density fluctuations with good spatial and temporal resolution in the shear zone, the OVI doublet (2s-2p) can be observed. These are very strong impurity emission lines in the VUV region (103.2 nm and 103.8 nm) emitted from a narrow layer close to the separatrix. To get an image of this layer and to achieve enough intensity for a good temporal resolution, it is necessary to develop a filter with high transmission. A possible candidate is lithium fluoride (LiF), which transmits light at shorter wavelengths than other materials. By cooling LiF crystals from 300 K to 220 K the cutoff wavelength decreases from 105 nm to about 103 nm. This master thesis presents a detailed investigation of the transmission of LiF near the cutoff wavelength. Crystal sheets produced by different manufacturers were tested and the temperature dependence of the cutoff edge was investigated. (orig./AH)

  9. OPTIMIZATION OF ADVANCED FILTER SYSTEMS

    Energy Technology Data Exchange (ETDEWEB)

    R.A. Newby; G.J. Bruck; M.A. Alvin; T.E. Lippert

    1998-04-30

Reliable, maintainable and cost effective hot gas particulate filter technology is critical to the successful commercialization of advanced, coal-fired power generation technologies, such as IGCC and PFBC. In pilot plant testing, the operating reliability of hot gas particulate filters has been periodically compromised by process issues, such as process upsets and difficult ash cake behavior (ash bridging and sintering), and by design issues, such as cantilevered filter elements damaged by ash bridging, or excessively close packing of filtering surfaces resulting in unacceptable pressure drop or filtering surface plugging. This test experience has focused the issues and has helped to define advanced hot gas filter design concepts that offer higher reliability. Westinghouse has identified two advanced ceramic barrier filter concepts that are configured to minimize the possibility of ash bridge formation and to be robust against ash bridges should they occur. The ''inverted candle filter system'' uses arrays of thin-walled, ceramic candle-type filter elements with inside-surface filtering, and contains the filter elements in metal enclosures for complete separation from ash bridges. The ''sheet filter system'' uses ceramic, flat plate filter elements supported from vertical pipe-header arrays that provide a geometry that avoids the buildup of ash bridges and allows free fall of the back-pulse released filter cake. The Optimization of Advanced Filter Systems program is being conducted to evaluate these two advanced designs and to ultimately demonstrate one of the concepts at pilot scale. In the Base Contract program, the subject of this report, Westinghouse has developed conceptual designs of the two advanced ceramic barrier filter systems to assess their performance, availability and cost potential, and to identify technical issues that may hinder the commercialization of the technologies. A plan for the Option I, bench

  10. Multilateral disarmament verification

    International Nuclear Information System (INIS)

    Persbo, A.

    2013-01-01

Non-governmental organisations, such as VERTIC (Verification Research, Training and Information Centre), can play an important role in the promotion of multilateral verification. Parties involved in negotiating nuclear arms accords are for the most part keen that such agreements include suitable and robust provisions for monitoring and verification. Generally, progress in multilateral arms control verification is often painstakingly slow, but from time to time 'windows of opportunity' - that is, moments where ideas, technical feasibility and political interests are aligned at both domestic and international levels - may occur, and we have to be ready, so the preparatory work is very important. In the context of nuclear disarmament, verification (whether bilateral or multilateral) entails an array of challenges, hurdles and potential pitfalls relating to national security, health, safety and even non-proliferation, so the preparatory work is complex and time-consuming. A UK-Norway Initiative was established in order to investigate the role that a non-nuclear-weapon state such as Norway could potentially play in the field of nuclear arms control verification. (A.C.)

  11. Development of a method for bacteria and virus recovery from heating, ventilation, and air conditioning (HVAC) filters.

    Science.gov (United States)

    Farnsworth, James E; Goyal, Sagar M; Kim, Seung Won; Kuehn, Thomas H; Raynor, Peter C; Ramakrishnan, M A; Anantharaman, Senthilvelan; Tang, Weihua

    2006-10-01

The aim of the work presented here is to study the effectiveness of building air handling units (AHUs) in serving as high-volume sampling devices for airborne bacteria and viruses. An HVAC test facility constructed according to ASHRAE Standard 52.2-1999 was used for the controlled loading of HVAC filter media with aerosolized bacteria and viruses. Nonpathogenic Bacillus subtilis var. niger was chosen as a surrogate for Bacillus anthracis. Three animal viruses (transmissible gastroenteritis virus (TGEV), avian pneumovirus (APV), and fowlpox virus) were chosen as surrogates for three human viruses (SARS coronavirus, respiratory syncytial virus, and smallpox virus), respectively. These bacteria and viruses were nebulized in separate tests and injected into the test duct of the test facility upstream of a MERV 14 filter. SKC Biosamplers upstream and downstream of the test filter served as reference samplers. The collection efficiency of the filter media was calculated to be 96.5 +/- 1.5% for B. subtilis; however, no collection efficiency could be measured for the viruses, as no live virus was ever recovered from the downstream samplers. Filter samples were cut from the test filter and eluted by hand-shaking. An extraction efficiency of 105 +/- 19% was calculated for B. subtilis. The viruses were extracted at much lower efficiencies (0.7-20%). Our results indicate that the airborne concentration of spore-forming bacteria in building AHUs may be determined by analyzing the material collected on HVAC filter media; however, culture-based analytical techniques are impractical for virus recovery. Molecular-based identification techniques such as PCR could be used.
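    The reported efficiencies follow directly from the sampler data: collection efficiency compares the downstream with the upstream reference-sampler concentrations, and extraction efficiency compares what is recovered from the filter with what is expected on it. A minimal sketch with placeholder numbers (not the study's measurements):

```python
# How the reported efficiencies are typically computed: collection efficiency
# from the upstream/downstream reference-sampler concentrations, and extraction
# efficiency from counts recovered off the filter versus counts expected on it.
# The numbers below are placeholders, not the study's measurements.

def collection_efficiency(c_upstream, c_downstream):
    """Fraction of organisms removed by the filter (0..1)."""
    return 1.0 - c_downstream / c_upstream

def extraction_efficiency(cfu_recovered, cfu_expected):
    """Fraction of collected organisms recovered by elution (can exceed 1)."""
    return cfu_recovered / cfu_expected

print(f"collection: {collection_efficiency(2.0e5, 7.0e3):.1%}")   # -> 96.5%
print(f"extraction: {extraction_efficiency(1.05e4, 1.0e4):.1%}")  # -> 105.0%
```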

  12. Development of High-Reflectivity Optical Coatings for the Vacuum Ultraviolet and Verification on a Sounding Rocket Flight

    Data.gov (United States)

    National Aeronautics and Space Administration — We desire to develop new thin film coatings of fluorides to utilize the high intrinsic reflectivity of aluminum. Highly controllable thickness of fluorides can be...

  13. Multi-Axis Independent Electromechanical Load Control for Docking System Actuation Development and Verification Using dSPACE

    Science.gov (United States)

    Oesch, Christopher; Dick, Brandon; Rupp, Timothy

    2015-01-01

    The development of highly complex and advanced actuation systems to meet customer demands has accelerated as the use of real-time testing technology expands into multiple markets at Moog. Systems developed for the autonomous docking of human rated spacecraft to the International Space Station (ISS), envelope multi-operational characteristics which place unique constraints on an actuation system. Real-time testing hardware has been used as a platform for incremental testing and development for the linear actuation system which controls initial capture and docking for vehicles visiting the ISS. This presentation will outline the role of dSPACE hardware as a platform for rapid control-algorithm prototyping as well as an Electromechanical Actuator (EMA) system dynamic loading simulator, both conducted at Moog to develop the safety critical Linear Actuator System (LAS) of the NASA Docking System (NDS).

  14. Cleaning verification: A five parameter study of a Total Organic Carbon method development and validation for the cleaning assessment of residual detergents in manufacturing equipment.

    Science.gov (United States)

    Li, Xue; Ahmad, Imad A Haidar; Tam, James; Wang, Yan; Dao, Gina; Blasko, Andrei

    2018-02-05

A Total Organic Carbon (TOC) based analytical method to quantitate trace residues of clean-in-place (CIP) detergents CIP100 ® and CIP200 ® on the surfaces of pharmaceutical manufacturing equipment was developed and validated. Five factors affecting the development and validation of the method were identified: diluent composition, diluent volume, extraction method, location for TOC sample preparation, and oxidant flow rate. Key experimental parameters were optimized to minimize contamination and to improve the sensitivity, recovery, and reliability of the method. The optimized concentration of the phosphoric acid in the swabbing solution was 0.05 M, and the optimal volume of the sample solution was 30 mL. The swab extraction method was 1 min sonication. The use of a clean room, as compared to an isolated lab environment, was not required for method validation. The method was demonstrated to be linear with a correlation coefficient (R) of 0.9999. The average recoveries from stainless steel surfaces at multiple spike levels were >90%. The repeatability and intermediate precision results were ≤5% across the 2.2-6.6 ppm range (50-150% of the target maximum carryover (MACO) limit). The method was also shown to be sensitive, with a detection limit (DL) of 38 ppb and a quantitation limit (QL) of 114 ppb. The method validation demonstrated that the developed method is suitable for its intended use. The methodology developed in this study is generally applicable to the cleaning verification of any organic detergents used for the cleaning of pharmaceutical manufacturing equipment made of electropolished stainless steel material. Copyright © 2017 Elsevier B.V. All rights reserved.
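    The abstract does not state how the DL and QL were derived; a common ICH Q2(R1)-style estimate uses the calibration curve's residual standard deviation and slope (DL = 3.3·σ/S, QL = 10·σ/S). The sketch below illustrates that approach with invented calibration data.

```python
# One common way to estimate DL and QL from a calibration curve (ICH Q2(R1)
# style): DL = 3.3*sigma/S and QL = 10*sigma/S, where sigma is the residual
# standard deviation and S the slope. The abstract does not state which
# approach was used, and the data below are illustrative only.
import numpy as np

conc = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])          # ppm standards (assumed)
toc = np.array([0.41, 0.79, 1.22, 1.61, 2.02, 2.40])     # instrument response (assumed)

slope, intercept = np.polyfit(conc, toc, 1)
residuals = toc - (slope * conc + intercept)
sigma = residuals.std(ddof=2)                            # two fitted parameters

dl = 3.3 * sigma / slope
ql = 10.0 * sigma / slope
print(f"R = {np.corrcoef(conc, toc)[0, 1]:.4f}, DL = {dl*1000:.0f} ppb, QL = {ql*1000:.0f} ppb")
```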

  15. Embedded software verification and debugging

    CERN Document Server

    Winterholer, Markus

    2017-01-01

    This book provides comprehensive coverage of verification and debugging techniques for embedded software, which is frequently used in safety critical applications (e.g., automotive), where failures are unacceptable. Since the verification of complex systems needs to encompass the verification of both hardware and embedded software modules, this book focuses on verification and debugging approaches for embedded software with hardware dependencies. Coverage includes the entire flow of design, verification and debugging of embedded software and all key approaches to debugging, dynamic, static, and hybrid verification. This book discusses the current, industrial embedded software verification flow, as well as emerging trends with focus on formal and hybrid verification and debugging approaches. Includes in a single source the entire flow of design, verification and debugging of embedded software; Addresses the main techniques that are currently being used in the industry for assuring the quality of embedded softw...

  16. Improvement on post-OPC verification efficiency for contact/via coverage check by final CD biasing of metal lines and considering their location on the metal layout

    Science.gov (United States)

    Kim, Youngmi; Choi, Jae-Young; Choi, Kwangseon; Choi, Jung-Hoe; Lee, Sooryong

    2011-04-01

As IC design complexity keeps increasing, it is more and more difficult to ensure pattern transfer after optical proximity correction (OPC) due to the continuous reduction of layout dimensions and the lithographic limitation set by the k1 factor. To guarantee imaging fidelity, resolution enhancement technologies (RET) such as off-axis illumination (OAI), different types of phase shift masks, and OPC techniques have been developed. In the case of model-based OPC, to cross-check the contour image against the target layout, post-OPC verification solutions keep being developed: methods for contour generation and matching to the target structure, and methods for filtering and sorting patterns to eliminate false errors and duplicate patterns. Detecting only real errors by excluding false errors is the most important requirement for an accurate and fast verification process, to save not only reviewing time and engineering resources but also whole-wafer process time. Otherwise, an OPC engineer may miss a real defect, which can, at the least, delay time to market. In the general case of post-OPC verification for metal-contact/via coverage (CC) checks, the verification solution outputs a huge number of errors due to the borderless design, so it is too difficult to review and correct all of them. In this paper, we studied a method for increasing the efficiency of post-OPC verification, especially for the case of the CC check. For metal layers, the final CD after the etch process shows various CD biases, which depend on the distance to neighboring patterns, so it is more reasonable to consider the final metal shape when confirming the contact/via coverage. Through the optimization of biasing rules for different pitches and shapes of metal lines, we could get more accurate and efficient verification results and decrease the review time needed to find real errors. In this paper, the suggestion to increase the efficiency of the OPC verification process by using a simple biasing rule applied to the metal layout instead of an etch model

  17. Algorithm development and verification of UASCM for multi-dimension and multi-group neutron kinetics model

    International Nuclear Information System (INIS)

    Si, S.

    2012-01-01

    The Universal Algorithm of Stiffness Confinement Method (UASCM) for neutron kinetics model of multi-dimensional and multi-group transport equations or diffusion equations has been developed. The numerical experiments based on transport theory code MGSNM and diffusion theory code MGNEM have demonstrated that the algorithm has sufficient accuracy and stability. (authors)

  18. Development of regulation technologies for software verification and validation of I and C systems important to safety in NPPs

    International Nuclear Information System (INIS)

    Kim, Bok Ryul; Oh, S. H.; Zhu, O. P.; Jeong, C. H.; Hwang, H. S.; Goo, C. S.; Chung, Y. H.

    2000-12-01

The project has provided draft regulatory policies and guides regarding the quality assurance of software used in I and C systems important to safety in nuclear power plants, differentiated V and V activities by safety class, which is an important element in ensuring software quality assurance, and suggested V and V techniques to be applied, as well as regulatory guides and checklists for reviewing software important to safety. The project introduced classification concepts for software quality assurance. The I and C systems important to safety are classified into IC-1, IC-2, IC-3, and Non-IC based on safety classifications, and the software used in these I and C systems is classified into 3 categories, namely safety-critical software, safety-related software, and non-safety software, in light of the safety importance of the functions to be performed. Based upon these safety classifications, the extent of software V and V activities has been differentiated for each class. In addition, the project has divided software important to safety into newly-developed software and previously-developed software in terms of design and implementation, and provided draft regulatory guides for each type of software, namely newly-developed software, previously-developed software, and software tools

  19. Box-particle intensity filter

    OpenAIRE

    Schikora, Marek; Gning, Amadou; Mihaylova, Lyudmila; Cremers, Daniel; Koch, Wofgang; Streit, Roy

    2012-01-01

    This paper develops a novel approach for multi-target tracking, called box-particle intensity filter (box-iFilter). The approach is able to cope with unknown clutter, false alarms and estimates the unknown number of targets. Furthermore, it is capable of dealing with three sources of uncertainty: stochastic, set-theoretic and data association uncertainty. The box-iFilter reduces the number of particles significantly, which improves the runtime considerably. The low particle number enables thi...

  20. Biotrickling filter modeling for styrene abatement. Part 1: Model development, calibration and validation on an industrial scale.

    Science.gov (United States)

    San-Valero, Pau; Dorado, Antonio D; Martínez-Soria, Vicente; Gabaldón, Carmen

    2018-01-01

    A three-phase dynamic mathematical model based on mass balances describing the main processes in biotrickling filtration - convection, mass transfer, diffusion, and biodegradation - was calibrated and validated for the simulation of an industrial styrene-degrading biotrickling filter. The model considers the key features of the industrial operation of biotrickling filters, variable loading conditions and intermittent irrigation, by switching between the mathematical descriptions of the periods with and without irrigation. The model was calibrated with steady-state data from a laboratory biotrickling filter treating inlet loads of 13-74 g C m⁻³ h⁻¹ at empty bed residence times of 30-15 s. The model predicted the dynamic emission at the outlet of the biotrickling filter, reproducing the small concentration peaks that occur during irrigation. The model was validated using data from a pilot on-site biotrickling filter treating styrene installed in a fiber-reinforced facility. The model predicted the performance of the biotrickling filter working under highly oscillating emissions at inlet loads in the range of 5-23 g C m⁻³ h⁻¹ and an empty bed residence time of 31 s for more than 50 days, with a goodness of fit of 0.84. Copyright © 2017 Elsevier Ltd. All rights reserved.
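
    As a hedged sketch of this model class (generic textbook form, not the paper's exact equations), the gas-phase balance in a biotrickling filter couples convection with gas-liquid mass transfer, while the biofilm balance couples diffusion with Monod-type biodegradation:

        \frac{\partial C_g}{\partial t} = -v\,\frac{\partial C_g}{\partial z} - k_L a\left(\frac{C_g}{H} - C_l\right),
        \qquad
        \frac{\partial C_f}{\partial t} = D_f\,\frac{\partial^2 C_f}{\partial x^2} - \frac{\mu_{\max} X_v}{Y}\,\frac{C_f}{K_s + C_f}

    where C_g, C_l and C_f are the pollutant concentrations in the gas, liquid and biofilm phases, v the gas velocity, k_L a the mass-transfer coefficient, H the Henry coefficient, D_f the diffusivity in the biofilm, and the last term a Monod degradation rate.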

  1. Scaling effects on H2-deflagration in containment-geometries - BASSIM. Further development and verification. Final report

    International Nuclear Information System (INIS)

    Rastogi, A.K.; Wennerberg, D.; Fischer, K.

    1998-01-01

    A multidimensional mechanistic calculation procedure for simulating H2-deflagration in multiroom geometries was developed at Battelle in a previous project. This calculation method was verified against a number of experiments performed in the BMC (Battelle Model Containment) and HDR (Heissdampf Reaktor) facilities. It turned out that the above-mentioned procedure overpredicted the H2 burn rates in experiments in smaller facilities and was therefore unable to predict the important 'scaling influences'. The purpose of the present work is to develop the above-mentioned calculation procedure BASSIM-H2 (mod 2.3) further in order to predict the scaling influences correctly. In the present work the combustion model was developed further such that the important phenomena, e.g. the ignition phase, the quasi-laminar initial phase, and the turbulent phase of a premixed H2 flame, are modelled realistically. The model developed has been verified against 16 very different experiments from 9 different facilities. The computed cases varied in volume from 0.022 m³ up to 2100 m³. These cases have also been computed with the older model verified in [15]. Based on the comparison of the computed results obtained with the new model and with the old model, as well as with the experimental data, the model put forward in this work is evaluated. The present model computes the scaling effects on H2-deflagration satisfactorily with a single set of empirical constants. Flame propagation in the horizontal as well as the vertical (both upward and downward) directions can be computed satisfactorily. The influence of flow obstructions and heat loss at walls is considered as well. (orig.) [de]

  2. Development and verification of signal processing system of avalanche photo diode for the active shields onboard ASTRO-H

    Energy Technology Data Exchange (ETDEWEB)

    Ohno, M., E-mail: ohno@hep01.hepl.hiroshima-u.ac.jp [Department of Physical Sciences, Hiroshima University, Hiroshima 739-8526 (Japan); Kawano, T.; Edahiro, I.; Shirakawa, H.; Ohashi, N.; Okada, C.; Habata, S.; Katsuta, J.; Tanaka, Y.; Takahashi, H.; Mizuno, T.; Fukazawa, Y. [Department of Physical Sciences, Hiroshima University, Hiroshima 739-8526 (Japan); Murakami, H.; Kobayashi, S.; Miyake, K.; Ono, K.; Kato, Y.; Furuta, Y.; Murota, Y.; Okuda, K. [Department of Physics, University of Tokyo, Tokyo 113-0033 (Japan); and others

    2016-09-21

    The Hard X-ray Imager and Soft Gamma-ray Detector onboard ASTRO-H demonstrate high sensitivity to hard X-rays (5–80 keV) and soft gamma-rays (60–600 keV), respectively. To reduce the background, both instruments are actively shielded by large, thick Bismuth Germanate (BGO) scintillators. We have developed the signal processing system of the avalanche photodiodes in the BGO active shields and have demonstrated its effectiveness after assembly in the flight model of the HXI/SGD sensor and after integration into the satellite. The energy threshold achieved is about 150 keV and the anti-coincidence efficiency for cosmic-ray events is almost 100%. Installed in the BGO active shield, the developed signal processing system successfully reduces the room background level of the main detector. - Highlights: • A detailed account of the development of the signal processing system for ASTRO-H is presented. • A digital filter implemented in an FPGA is applied instead of a discrete analog circuit. • The expected performance is verified after integration into the satellite.
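
    A minimal sketch of the anti-coincidence logic such an active shield supports (the window length, threshold handling and names are assumptions for illustration, not the flight firmware):

        # Hypothetical anti-coincidence veto: reject a main-detector event if any
        # BGO shield channel fired above threshold within a short coincidence window.
        SHIELD_THRESHOLD_KEV = 150.0   # roughly the threshold quoted in the abstract
        COINCIDENCE_WINDOW_US = 5.0    # assumed window length, illustrative only

        def is_vetoed(event_time_us, shield_hits):
            """shield_hits: list of (time_us, energy_keV) tuples from the BGO shield."""
            for hit_time, energy in shield_hits:
                if energy >= SHIELD_THRESHOLD_KEV and abs(hit_time - event_time_us) <= COINCIDENCE_WINDOW_US:
                    return True      # likely a cosmic-ray or background event, discard
            return False             # keep the event

        # Example: an event at t = 100.0 us with one shield hit at 102.3 us, 300 keV
        print(is_vetoed(100.0, [(102.3, 300.0)]))   # True -> rejected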

  3. Whole-body isometric force/torque measurements for functional assessment in neuro-rehabilitation: platform design, development and verification

    Directory of Open Access Journals (Sweden)

    Cavallo Giuseppe

    2009-10-01

    Background: One of the main scientific and technological challenges of rehabilitation bioengineering is the development of innovative methodologies, based on the use of appropriate technological devices, for the objective assessment of patients undergoing a rehabilitation treatment. Such tools should be as fast and cheap to use as clinical scales, which are currently the instruments most widely used in routine clinical practice. Methods: A human-centered approach was used in the design and development of a mechanical structure equipped with eight force/torque sensors that record quantitative data during the initiation of a predefined set of Activities of Daily Living (ADL) tasks under isometric conditions. Results: Preliminary results validated the appropriateness, acceptability and functionality of the proposed platform, which has now become a tool used for clinical research in three clinical centres. Conclusion: This paper presented the design and development of an innovative platform for whole-body force and torque measurements on human subjects. The platform has been designed to perform accurate quantitative measurements in isometric conditions with the specific aim of addressing the need for functional assessment tests of patients undergoing rehabilitation treatment as a consequence of a stroke. The versatility of the system also points to several other interesting possible areas of application: therapy in neurorehabilitation, research in basic neuroscience, and more.

  4. The Development of Target-Specific Pose Filter Ensembles To Boost Ligand Enrichment for Structure-Based Virtual Screening.

    Science.gov (United States)

    Xia, Jie; Hsieh, Jui-Hua; Hu, Huabin; Wu, Song; Wang, Xiang Simon

    2017-06-26

    Structure-based virtual screening (SBVS) has become an indispensable technique for hit identification at the early stage of drug discovery. However, the accuracy of current scoring functions is not high enough to confer success to every target and thus remains to be improved. Previously, we had developed binary pose filters (PFs) using knowledge derived from the protein-ligand interface of a single X-ray structure of a specific target. This novel approach had been validated as an effective way to improve ligand enrichment. Continuing from it, in the present work we attempted to incorporate knowledge collected from diverse protein-ligand interfaces of multiple crystal structures of the same target to build PF ensembles (PFEs). Toward this end, we first constructed a comprehensive data set to meet the requirements of ensemble modeling and validation. This set contains 10 diverse targets, 118 well-prepared X-ray structures of protein-ligand complexes, and large benchmarking actives/decoys sets. Notably, we designed a unique workflow of two-layer classifiers based on the concept of ensemble learning and applied it to the construction of PFEs for all of the targets. Through extensive benchmarking studies, we demonstrated that (1) coupling PFE with Chemgauss4 significantly improves the early enrichment of Chemgauss4 itself and (2) PFEs show greater consistency in boosting early enrichment and larger overall enrichment than our prior PFs. In addition, we analyzed the pairwise topological similarities among cognate ligands used to construct PFEs and found that it is the higher chemical diversity of the cognate ligands that leads to the improved performance of PFEs. Taken together, the results so far prove that the incorporation of knowledge from diverse protein-ligand interfaces by ensemble modeling is able to enhance the screening competence of SBVS scoring functions.
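
    A hedged sketch of the two-layer ensemble idea (the voting scheme, pose representation and names are illustrative; the paper's base filters are built from knowledge of protein-ligand interfaces):

        # Hypothetical pose filter ensemble: each base filter was derived from one
        # X-ray structure and returns True if a docking pose looks native-like.
        # The ensemble keeps a pose when a minimum fraction of filters accept it.
        def pose_filter_ensemble(pose, base_filters, min_fraction=0.5):
            votes = sum(1 for f in base_filters if f(pose))
            return votes >= min_fraction * len(base_filters)

        def rescore(poses, score_fn, base_filters):
            """Keep only ensemble-accepted poses, then rank them by the scoring function."""
            kept = [p for p in poses if pose_filter_ensemble(p, base_filters)]
            return sorted(kept, key=score_fn)

        # Example with two toy filters acting on a pose represented as a dict
        filters = [lambda pose: pose["hbonds"] >= 2, lambda pose: pose["clashes"] == 0]
        print(pose_filter_ensemble({"hbonds": 3, "clashes": 0}, filters))   # True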

  5. Development and Implementation of Optimal Filtering in a Virtex FPGA for the Upgrade of the ATLAS LAr Calorimeter Readout

    CERN Document Server

    Stärz, S; The ATLAS collaboration

    2012-01-01

    In the context of upgraded read-out systems for the Liquid-Argon Calorimeters of the ATLAS detector, modified front-end, back-end and trigger electronics are foreseen for operation in the high-luminosity phase of the LHC. Accuracy and efficiency of the energy measurement and reliability of pile-up suppression are substantial when processing the detector raw-data in real-time. Several digital filter algorithms are investigated for their performance to extract energies from incoming detector signals and for the needs of the future trigger system. The implementation of fast, resource economizing, parameter driven filter algorithms in a modern Virtex FPGA is presented.
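
    As a hedged illustration of the optimal filtering technique referred to above (textbook form; the coefficients and sample values below are made up, not ATLAS constants), the energy estimate is a weighted sum of the digitized, pedestal-subtracted samples:

        # Textbook optimal filtering: the amplitude (energy) is a weighted sum of
        # pedestal-subtracted ADC samples.  The coefficients a_i are normally derived
        # from the known pulse shape and the noise autocorrelation; these are dummies.
        A_COEFFS = [0.10, 0.35, 0.45, 0.25, -0.15]   # illustrative only
        PEDESTAL = 1000.0

        def of_amplitude(samples):
            """Estimate the pulse amplitude from five consecutive ADC samples."""
            return sum(a * (s - PEDESTAL) for a, s in zip(A_COEFFS, samples))

        print(of_amplitude([1002.0, 1050.0, 1120.0, 1080.0, 1030.0]))

    A fixed set of multiply-accumulate operations like this maps naturally onto FPGA resources, which is why such filters are attractive for real-time processing.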

  6. Development of Work Verification System for Cooperation between MCR Operators and Field Workers in Nuclear Power Plants

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Seung Min; Seong, Poong Hyun [Korea Advanced Institute of Science and Technology, Daejeon (Korea, Republic of); Lee, Hyun Chul [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2014-05-15

    In this work, as an application of digital devices to NPPs, a cooperation support system to aid communication between MCR operators and field workers in Nuclear Power Plants (NPPs), the NUclear COoperation Support and MObile document System (Nu-COSMOS), is suggested. It is not easy for MCR operators to judge whether field workers conduct their work correctly, because MCR operators cannot monitor field workers in real time, and the records written by field workers on paper procedures do not contain detailed information about the work process and results. Thus, for safe operation without events induced by misunderstanding and miscommunication between MCR operators and field workers, Nu-COSMOS has been developed, and it is expected to be useful from the viewpoint of supporting cooperation. To improve the usability and applicability of the suggested system, experiences with existing digital-device-based support systems were analyzed. Through this analysis, the disincentives of using digital-device-based developments and recommendations for developing a new mobile-based system were derived. Based on these recommendations, two sub-systems were suggested: a mobile-device-based information storing system and a large-screen-based information sharing system. The usability of the suggested system will be evaluated by a questionnaire survey in which field workers, operators, nuclear-related personnel with experience as operators, and graduate students in nuclear engineering departments will use and test the functions of the suggested system. It is expected that the mobile-based information storing system can reduce the field workers' workload and enhance MCR operators' understanding of the field workers' work process by allowing them to monitor all work results and processes stored in the devices.

  7. Development of Work Verification System for Cooperation between MCR Operators and Field Workers in Nuclear Power Plants

    International Nuclear Information System (INIS)

    Lee, Seung Min; Seong, Poong Hyun; Lee, Hyun Chul

    2014-01-01

    In this work, as an application of digital devices to NPPs, a cooperation support system to aid communication between MCR operators and field workers in Nuclear Power Plants (NPPs), the NUclear COoperation Support and MObile document System (Nu-COSMOS), is suggested. It is not easy for MCR operators to judge whether field workers conduct their work correctly, because MCR operators cannot monitor field workers in real time, and the records written by field workers on paper procedures do not contain detailed information about the work process and results. Thus, for safe operation without events induced by misunderstanding and miscommunication between MCR operators and field workers, Nu-COSMOS has been developed, and it is expected to be useful from the viewpoint of supporting cooperation. To improve the usability and applicability of the suggested system, experiences with existing digital-device-based support systems were analyzed. Through this analysis, the disincentives of using digital-device-based developments and recommendations for developing a new mobile-based system were derived. Based on these recommendations, two sub-systems were suggested: a mobile-device-based information storing system and a large-screen-based information sharing system. The usability of the suggested system will be evaluated by a questionnaire survey in which field workers, operators, nuclear-related personnel with experience as operators, and graduate students in nuclear engineering departments will use and test the functions of the suggested system. It is expected that the mobile-based information storing system can reduce the field workers' workload and enhance MCR operators' understanding of the field workers' work process by allowing them to monitor all work results and processes stored in the devices

  8. Development of a one-stop beam verification system using electronic portal imaging devices for routine quality assurance

    International Nuclear Information System (INIS)

    Lim, Sangwook; Ma, Sun Young; Jeung, Tae Sig; Yi, Byong Yong; Lee, Sang Hoon; Lee, Suk; Cho, Sam Ju; Choi, Jinho

    2012-01-01

    In this study, a computer-based system for routine quality assurance (QA) of a linear accelerator (linac) was developed by using the dosimetric properties of an amorphous silicon electronic portal imaging device (EPID). An acrylic template phantom was designed such that it could be placed on the EPID and be aligned with the light field of the collimator. After irradiation, portal images obtained from the EPID were transferred in DICOM format to a computer and analyzed using a program we developed. The symmetry, flatness, field size, and congruence of the light and radiation fields of the photon beams from the linac were verified simultaneously. To validate the QA system, the ion chamber and film (X-Omat V2; Kodak, New York, NY) measurements were compared with the EPID measurements obtained in this study. The EPID measurements agreed with the film measurements. Parameters for beams with energies of 6 MV and 15 MV were obtained daily for 1 month using this system. It was found that our QA tool using EPID could substitute for the film test, which is a time-consuming method for routine QA assessment.
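
    A minimal sketch (standard textbook definitions, not necessarily the exact formulas implemented in the authors' program) of how flatness and symmetry can be computed from a beam profile extracted from a portal image:

        # Flatness and symmetry of a 1-D dose profile over the central 80% of the
        # field, using common definitions; the authors' program may differ in detail.
        def flatness_symmetry(profile):
            """profile: list of dose values sampled along one axis of the field."""
            n = len(profile)
            lo, hi = int(0.1 * n), int(0.9 * n)          # central 80% of the field
            region = profile[lo:hi]
            d_max, d_min = max(region), min(region)
            flatness = 100.0 * (d_max - d_min) / (d_max + d_min)
            symmetry = 100.0 * max(2.0 * abs(a - b) / (a + b)
                                   for a, b in zip(region, reversed(region)))
            return flatness, symmetry

        print(flatness_symmetry([95, 98, 100, 101, 100, 99, 100, 99, 97, 94]))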

  9. Development of a Wearable Instrumented Vest for Posture Monitoring and System Usability Verification Based on the Technology Acceptance Model

    Science.gov (United States)

    Lin, Wen-Yen; Chou, Wen-Cheng; Tsai, Tsai-Hsuan; Lin, Chung-Chih; Lee, Ming-Yih

    2016-01-01

    Body posture and activity are important indices for assessing health and quality of life, especially for elderly people. Therefore, an easily wearable device or instrumented garment would be valuable for monitoring elderly people’s postures and activities to facilitate healthy aging. In particular, such devices should be accepted by elderly people so that they are willing to wear it all the time. This paper presents the design and development of a novel, textile-based, intelligent wearable vest for real-time posture monitoring and emergency warnings. The vest provides a highly portable and low-cost solution that can be used both indoors and outdoors in order to provide long-term care at home, including health promotion, healthy aging assessments, and health abnormality alerts. The usability of the system was verified using a technology acceptance model-based study of 50 elderly people. The results indicated that although elderly people are anxious about some newly developed wearable technologies, they look forward to wearing this instrumented posture-monitoring vest in the future. PMID:27999324

  10. Development of core design/analysis technology for integral reactor; verification of SMART nuclear design by Monte Carlo method

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Chang Hyo; Hong, In Seob; Han, Beom Seok; Jeong, Jong Seong [Seoul National University, Seoul (Korea)

    2002-03-01

    The objective of this project is to verify the neutronics characteristics of the SMART core design by comparing computational results of the MCNAP code with those of the MASTER code. To achieve this goal, we analyze the neutronics characteristics of the SMART core using the MCNAP code and compare the results with those of the MASTER code. We improved the parallel computing module and developed an error analysis module of the MCNAP code. We analyzed the mechanism of error propagation through depletion computations and developed a calculation module for quantifying these errors. We performed depletion analyses for fuel pins and assemblies of the SMART core. We modeled the 3-D structure of the SMART core, considered the variation of material compositions caused by control rod operation, and performed a depletion analysis of the SMART core. We computed control-rod worths of assemblies and of the reactor core for operation of individual control-rod groups. We computed the core reactivity coefficients (MTC, FTC) and compared these results with computational results of the MASTER code. To verify the error analysis module of the MCNAP code, we analyzed error propagation through depletion of the SMART B-type assembly. 18 refs., 102 figs., 36 tabs. (Author)

  11. Development of a one-stop beam verification system using electronic portal imaging devices for routine quality assurance

    Energy Technology Data Exchange (ETDEWEB)

    Lim, Sangwook, E-mail: medicalphysics@hotmail.com [Department of Radiation Oncology, Kosin University College of Medicine, Seo-gu, Busan (Korea, Republic of); Ma, Sun Young; Jeung, Tae Sig [Department of Radiation Oncology, Kosin University College of Medicine, Seo-gu, Busan (Korea, Republic of); Yi, Byong Yong [Department of Radiation Oncology, University of Maryland School of Medicine, Baltimore, MD (United States); Lee, Sang Hoon [Department of Radiation Oncology, Cheil General Hospital and Women's Healthcare Center, Kwandong University College of Medicine, Jung-gu, Seoul (Korea, Republic of); Lee, Suk [Department of Radiation Oncology, College of Medicine, Korea University, Seongbuk-gu, Seoul (Korea, Republic of); Cho, Sam Ju [Department of Radiation Oncology, Eulji University School of Medicine, Eulji General Hospital, Nowon-gu, Seoul (Korea, Republic of); Choi, Jinho [Department of Radiation Oncology, Gachon University of Medicine and Science, Namdong-gu, Incheon (Korea, Republic of)

    2012-10-01

    In this study, a computer-based system for routine quality assurance (QA) of a linear accelerator (linac) was developed by using the dosimetric properties of an amorphous silicon electronic portal imaging device (EPID). An acrylic template phantom was designed such that it could be placed on the EPID and be aligned with the light field of the collimator. After irradiation, portal images obtained from the EPID were transferred in DICOM format to a computer and analyzed using a program we developed. The symmetry, flatness, field size, and congruence of the light and radiation fields of the photon beams from the linac were verified simultaneously. To validate the QA system, the ion chamber and film (X-Omat V2; Kodak, New York, NY) measurements were compared with the EPID measurements obtained in this study. The EPID measurements agreed with the film measurements. Parameters for beams with energies of 6 MV and 15 MV were obtained daily for 1 month using this system. It was found that our QA tool using EPID could substitute for the film test, which is a time-consuming method for routine QA assessment.

  12. Development of a Wearable Instrumented Vest for Posture Monitoring and System Usability Verification Based on the Technology Acceptance Model.

    Science.gov (United States)

    Lin, Wen-Yen; Chou, Wen-Cheng; Tsai, Tsai-Hsuan; Lin, Chung-Chih; Lee, Ming-Yih

    2016-12-17

    Body posture and activity are important indices for assessing health and quality of life, especially for elderly people. Therefore, an easily wearable device or instrumented garment would be valuable for monitoring elderly people's postures and activities to facilitate healthy aging. In particular, such devices should be accepted by elderly people so that they are willing to wear it all the time. This paper presents the design and development of a novel, textile-based, intelligent wearable vest for real-time posture monitoring and emergency warnings. The vest provides a highly portable and low-cost solution that can be used both indoors and outdoors in order to provide long-term care at home, including health promotion, healthy aging assessments, and health abnormality alerts. The usability of the system was verified using a technology acceptance model-based study of 50 elderly people. The results indicated that although elderly people are anxious about some newly developed wearable technologies, they look forward to wearing this instrumented posture-monitoring vest in the future.

  13. Development and verification of a 281-group WIMS-D library based on ENDF/B-VII.1

    International Nuclear Information System (INIS)

    Dong, Zhengyun; Wu, Jun; Ma, Xubo; Yu, Hui; Chen, Yixue

    2016-01-01

    Highlights: • A new WIMS-D library based on the SHEM 281-group energy structure is developed. • The method for calculating the lambda factor is illustrated and the relevant parameters are discussed. • The results show improvements of this library compared with other libraries. - Abstract: WIMS-D libraries based on the WIMS 69-group or XMAS 172-group energy structures are widely used in thermal reactor research. However, the resonance overlap effect is not taken into account in these two group structures, which limits the accuracy of the resonance treatment. The SHEM 281-group structure was designed in France to avoid the resonance overlap effect. In this study, a new WIMS-D library with the SHEM 281 mesh is developed using the NJOY nuclear data processing system based on the latest evaluated nuclear data library, ENDF/B-VII.1. Parameters that depend on the group structure, such as the thermal cut-off energy and the lambda factor, are discussed. The lambda factor is calculated by the Neutron Resonance Spectrum Calculation System and the effect of this factor is analyzed. The new library is verified through the analysis of various criticality benchmarks using the DRAGON code. The calculated multiplication factors are consistent with the experimental data and the results are also improved in comparison with other WIMS libraries.

  14. Development of a multi-dimensional realistic thermal-hydraulic system analysis code, MARS 1.3 and its verification

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Won Jae; Chung, Bub Dong; Jeong, Jae Jun; Ha, Kwi Seok [Korea Atomic Energy Research Institute, Taejon (Korea)

    1998-06-01

    A multi-dimensional realistic thermal-hydraulic system analysis code, MARS version 1.3, has been developed. The main purpose of the MARS 1.3 development is to provide a realistic analysis capability for the transient two-phase thermal-hydraulics of Pressurized Water Reactors (PWRs), especially during Large Break Loss of Coolant Accidents (LBLOCAs), where multi-dimensional phenomena dominate the transients. The MARS code is a unified version of the USNRC-developed COBRA-TF, a three-dimensional (3D) reactor vessel analysis code, and RELAP5/MOD3.2.1.2, a one-dimensional (1D) reactor system analysis code. The developmental requirements for MARS were chosen not only to best utilize the existing capability of the codes but also to enhance code maintenance, user accessibility, user friendliness, code portability, code readability, and code flexibility. To preserve the existing capability of the codes and to enhance code maintenance, user accessibility and user friendliness, MARS has been unified into a single code consisting of a 1D module (RELAP5) and a 3D module (COBRA-TF). This is realized by implicitly integrating the system pressure matrix equations of the hydrodynamic models and solving them simultaneously, by modifying the 1D/3D calculation sequence to run under a single Central Processor Unit (CPU), and by unifying the input structure and the light water property routines of both modules. In addition, the code structure of the 1D module has been completely restructured using the modular data structure of standard FORTRAN 90, which greatly improves code maintenance capability, readability and portability. For code flexibility, a dynamic memory management scheme is applied in both modules. MARS 1.3 now runs on PC/Windows and HP/UNIX platforms having a single CPU, and users have the option to select the 3D module to model the 3D thermal-hydraulics in the reactor vessel or other

  15. SU-E-T-641: Development and Verification of Automatic Reading Dose of Interest From Eclipse's DVH

    International Nuclear Information System (INIS)

    Wu, Q

    2014-01-01

    Purpose: To meet clinical and research requirements, we developed a function for automatically reading doses of interest from the dose volume histogram (DVH), replacing the traditional method of pointing with a mouse one point at a time, and we verified it. Methods: The DVH automatic reading function was developed in an in-house radiotherapy information management system (RTIMS), which is based on Apache+PHP+MySQL. A DVH ASCII file is exported from Varian Eclipse V8.6, which includes the following contents: 1. basic information of the patient; 2. dose information of the plan; 3. dose information of the structures, including basic information and dose volume data of target volumes and organs at risk. The default exported dose volume data include relative doses in 1% steps, the corresponding absolute doses, and cumulative relative volumes given to 4 decimal places. Clinically, we often need to read the doses at integer percent volumes, such as D50 and D30. These cannot be read directly from the above data, but they can be obtained by linear interpolation between the neighboring volumes and doses: Dx = D2 − (V2 − Vx)*(D2 − D1)/(V2 − V1). A function was programmed to search, read, and calculate the corresponding data, so that the doses at all preset volumes of interest of all structures can be read automatically patient by patient and saved as a CSV file. To verify it, we selected 24 IMRT plans for prostate cancer; the doses of interest were PTV D98/D95/D5/D2, bladder D30/D50, and rectum D25/D50. The two groups of data, obtained with the automatic reading method (ARM) and the pointed dose method (PDM), were analyzed with SPSS 16: absolute difference = D-ARM − D-PDM, relative difference = absolute difference*100%/prescription dose (7600 cGy). Results: The differences were as follows: PTV D98/D95/D5/D2: −0.04%/−0.04%/0.13%/0.19%; bladder D30/D50: −0.02%/0.01%; and rectum D25/D50: 0.03%/0.01%. Conclusion: Using this function, the error is very small and can be neglected. It could greatly improve the
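
    A minimal sketch of the interpolation step described in the Methods (the DVH parsing and RTIMS integration are omitted, and the example DVH values are made up; only the published formula is reproduced):

        # Linear interpolation of the dose at an arbitrary volume Vx from a cumulative
        # DVH given as (relative volume %, dose cGy) pairs sorted by decreasing volume.
        def dose_at_volume(dvh_points, vx):
            for (v1, d1), (v2, d2) in zip(dvh_points, dvh_points[1:]):
                if v1 >= vx >= v2:
                    # Dx = D2 - (V2 - Vx)*(D2 - D1)/(V2 - V1), as given in the abstract
                    return d2 - (v2 - vx) * (d2 - d1) / (v2 - v1)
            return None   # Vx outside the tabulated range

        dvh = [(100.0, 0.0), (99.2, 6800.0), (50.3, 7590.0), (49.8, 7605.0), (2.0, 7790.0)]
        print(dose_at_volume(dvh, 50.0))   # D50 of this toy structure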

  16. SU-E-T-641: Development and Verification of Automatic Reading Dose of Interest From Eclipse's DVH

    Energy Technology Data Exchange (ETDEWEB)

    Wu, Q [Department of Radiation Oncology, Beijing Hospital, Ministry of Health, Beijing (China)

    2014-06-15

    Purpose: To meet clinical and research requirements, we developed a function for automatically reading doses of interest from the dose volume histogram (DVH), replacing the traditional method of pointing with a mouse one point at a time, and we verified it. Methods: The DVH automatic reading function was developed in an in-house radiotherapy information management system (RTIMS), which is based on Apache+PHP+MySQL. A DVH ASCII file is exported from Varian Eclipse V8.6, which includes the following contents: 1. basic information of the patient; 2. dose information of the plan; 3. dose information of the structures, including basic information and dose volume data of target volumes and organs at risk. The default exported dose volume data include relative doses in 1% steps, the corresponding absolute doses, and cumulative relative volumes given to 4 decimal places. Clinically, we often need to read the doses at integer percent volumes, such as D50 and D30. These cannot be read directly from the above data, but they can be obtained by linear interpolation between the neighboring volumes and doses: Dx = D2 − (V2 − Vx)*(D2 − D1)/(V2 − V1). A function was programmed to search, read, and calculate the corresponding data, so that the doses at all preset volumes of interest of all structures can be read automatically patient by patient and saved as a CSV file. To verify it, we selected 24 IMRT plans for prostate cancer; the doses of interest were PTV D98/D95/D5/D2, bladder D30/D50, and rectum D25/D50. The two groups of data, obtained with the automatic reading method (ARM) and the pointed dose method (PDM), were analyzed with SPSS 16: absolute difference = D-ARM − D-PDM, relative difference = absolute difference*100%/prescription dose (7600 cGy). Results: The differences were as follows: PTV D98/D95/D5/D2: −0.04%/−0.04%/0.13%/0.19%; bladder D30/D50: −0.02%/0.01%; and rectum D25/D50: 0.03%/0.01%. Conclusion: Using this function, the error is very small and can be neglected. It could greatly improve the

  17. Development of a computer program for the simulation of ice-bank system operation, part II: Verification

    Energy Technology Data Exchange (ETDEWEB)

    Grozdek, Marino; Halasz, Boris; Curko, Tonko [University of Zagreb, Faculty of Mechanical Engineering and Naval Architecture, Ivana Lucica 5, 10 000 Zagreb (Croatia)

    2010-12-15

    In order to verify the mathematical model of an ice bank system developed for predicting system performance, experimental measurements on the ice bank system were performed. A static, indirect, cool thermal storage system with external ice-on-coil building/melting was considered. Cooling energy stored in the form of ice by night is used for the rapid cooling of milk after pasteurization by day. The ice bank system was tested under real operating conditions to determine parameters such as the time-varying heat load imposed by the consumer, the refrigeration unit load, the storage capacity and the supply water temperature to the load, and to find the charging and discharging characteristics of the storage. The experimentally obtained results were then compared with the computed ones. It was found that the calculated and experimentally obtained results are in good agreement as long as ice is present in the silo. (author)

  18. Development and verification of an alcohol craving-induction tool using virtual reality: craving characteristics in social pressure situation.

    Science.gov (United States)

    Cho, Sangwoo; Ku, Jeonghun; Park, Jinsick; Han, Kiwan; Lee, Hyeongrae; Choi, You Kyong; Jung, Young-Chul; Namkoong, Kee; Kim, Jae-Jin; Kim, In Young; Kim, Sun I; Shen, Dong Fan

    2008-06-01

    Alcoholism is a disease that affects parts of the brain that control emotion, decisions, and behavior. Therapy for people with alcoholism must address coping skills for facing high-risk situations, so it is important to develop tools that mimic such conditions. Cue exposure therapy (CET) provides high-risk situations during treatment, which raises the individual's ability to recognize that alcohol craving is being induced. With CET, however, it is hard to simulate situations that induce alcohol craving, whereas virtual reality (VR) approaches can present realistic situations that cannot be experienced directly in CET. We therefore hypothesized that it is possible to model social pressure situations using VR. We developed a VR system for inducing alcohol craving under social pressure situations and measured both the induced alcohol craving and the head gaze of participants. A 2 x 2 experimental model (alcohol-related locality vs. social pressure) was designed. In situations without an avatar (no social pressure), more alcohol craving was induced if alcohol was present than if it was not, and more alcohol craving was induced in situations with an avatar (social pressure) than in situations without an avatar (no social pressure). The angle between the head gaze direction and the direction of the alcohol or avatar was smaller in situations with an avatar alone (social pressure) than in situations with alcohol alone. In situations with both alcohol and an avatar, the angle between the head gaze direction and the direction of the avatar was smaller than that between the head gaze direction and the direction of the alcohol. Considering these results, this VR system induces alcohol craving using an avatar that can express various social pressure situations.

  19. Nuclear disarmament verification

    International Nuclear Information System (INIS)

    DeVolpi, A.

    1993-01-01

    Arms control treaties, unilateral actions, and cooperative activities -- reflecting the defusing of East-West tensions -- are causing nuclear weapons to be disarmed and dismantled worldwide. In order to provide for future reductions and to build confidence in the permanency of this disarmament, verification procedures and technologies would play an important role. This paper outlines arms-control objectives, treaty organization, and actions that could be undertaken. For the purposes of this Workshop on Verification, nuclear disarmament has been divided into five topical subareas: Converting nuclear-weapons production complexes, Eliminating and monitoring nuclear-weapons delivery systems, Disabling and destroying nuclear warheads, Demilitarizing or non-military utilization of special nuclear materials, and Inhibiting nuclear arms in non-nuclear-weapons states. This paper concludes with an overview of potential methods for verification

  20. BAGHOUSE FILTRATION PRODUCTS VERIFICATION TESTING, HOW IT BENEFITS THE BOILER BAGHOUSE OPERATOR

    Science.gov (United States)

    The paper describes the Environmental Technology Verification (ETV) Program for baghouse filtration products developed by the Air Pollution Control Technology Verification Center, one of six Centers under the ETV Program, and discusses how it benefits boiler baghouse operators. A...

  1. Verification Testing of Air Pollution Control Technology Quality Management Plan Revision 2.3

    Science.gov (United States)

    The Air Pollution Control Technology Verification Center was established in 1995 as part of the EPA's Environmental Technology Verification Program to accelerate the development and commercialization of improved environmental technologies by verifying their performance.

  2. Development of a Kalman Filter in the Gauss-Helmert Model for Reliability Analysis in Orientation Determination with Smartphone Sensors.

    Science.gov (United States)

    Ettlinger, Andreas; Neuner, Hans; Burgess, Thomas

    2018-01-31

    The topic of indoor positioning and indoor navigation by using observations from smartphone sensors is very challenging as the determined trajectories can be subject to significant deviations compared to the route travelled in reality. Especially the calculation of the direction of movement is the critical part of pedestrian positioning approaches such as Pedestrian Dead Reckoning ("PDR"). Due to distinct systematic effects in filtered trajectories, it can be assumed that there are systematic deviations present in the observations from smartphone sensors. This article has two aims: one is to enable the estimation of partial redundancies for each observation as well as for observation groups. Partial redundancies are a measure for the reliability indicating how well systematic deviations can be detected in single observations used in PDR. The second aim is to analyze the behavior of partial redundancy by modifying the stochastic and functional model of the Kalman filter. The equations relating the observations to the orientation are condition equations, which do not exhibit the typical structure of the Gauss-Markov model ("GMM"), wherein the observations are linear and can be formulated as functions of the states. To calculate and analyze the partial redundancy of the observations from smartphone-sensors used in PDR, the system equation and the measurement equation of a Kalman filter as well as the redundancy matrix need to be derived in the Gauss-Helmert model ("GHM"). These derivations are introduced in this article and lead to a novel Kalman filter structure based on condition equations, enabling reliability assessment of each observation.
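
    For orientation, in the familiar Gauss-Markov case the redundancy matrix and the partial redundancies take the standard form below; the article derives the corresponding quantities for the Gauss-Helmert model, where the observations enter through condition equations:

        R = I - A\left(A^{\mathsf{T}} P A\right)^{-1} A^{\mathsf{T}} P,
        \qquad r_i = R_{ii},
        \qquad \sum_{i=1}^{n} r_i = n - u

    where A is the design matrix, P the weight matrix of the n observations and u the number of unknowns; a partial redundancy r_i close to 1 means a systematic deviation in observation i is well detectable, while r_i close to 0 means it is not.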

  3. Verification of Ceramic Structures

    Science.gov (United States)

    Behar-Lafenetre, Stephanie; Cornillon, Laurence; Rancurel, Michael; De Graaf, Dennis; Hartmann, Peter; Coe, Graham; Laine, Benoit

    2012-07-01

    In the framework of the “Mechanical Design and Verification Methodologies for Ceramic Structures” contract [1] awarded by ESA, Thales Alenia Space has investigated literature and practices in affiliated industries to propose a methodological guideline for verification of ceramic spacecraft and instrument structures. It has been written in order to be applicable to most types of ceramic or glass-ceramic materials - typically Cesic®, HBCesic®, Silicon Nitride, Silicon Carbide and ZERODUR®. The proposed guideline describes the activities to be performed at material level in order to cover all the specific aspects of ceramics (Weibull distribution, brittle behaviour, sub-critical crack growth). Elementary tests and their post-processing methods are described, and recommendations for optimization of the test plan are given in order to have a consistent database. The application of this method is shown on an example in a dedicated article [7]. Then the verification activities to be performed at system level are described. This includes classical verification activities based on relevant standard (ECSS Verification [4]), plus specific analytical, testing and inspection features. The analysis methodology takes into account the specific behaviour of ceramic materials, especially the statistical distribution of failures (Weibull) and the method to transfer it from elementary data to a full-scale structure. The demonstration of the efficiency of this method is described in a dedicated article [8]. The verification is completed by classical full-scale testing activities. Indications about proof testing, case of use and implementation are given and specific inspection and protection measures are described. These additional activities are necessary to ensure the required reliability. The aim of the guideline is to describe how to reach the same reliability level as for structures made of more classical materials (metals, composites).
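
    For reference, the statistical failure model mentioned above is usually expressed with the two-parameter Weibull distribution (a standard form, not quoted from the guideline itself):

        P_f(\sigma) = 1 - \exp\!\left[-\frac{V}{V_0}\left(\frac{\sigma}{\sigma_0}\right)^{m}\right]

    where P_f is the failure probability of a component of stressed volume V under stress sigma, sigma_0 the characteristic strength referred to the reference volume V_0, and m the Weibull modulus; the volume scaling is what allows strength data from elementary specimens to be transferred to a full-scale structure.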

  4. The intractable cigarette 'filter problem'.

    Science.gov (United States)

    Harris, Bradford

    2011-05-01

    When lung cancer fears emerged in the 1950s, cigarette companies initiated a shift in cigarette design from unfiltered to filtered cigarettes. Both the ineffectiveness of cigarette filters and the tobacco industry's misleading marketing of the benefits of filtered cigarettes have been well documented. However, during the 1950s and 1960s, American cigarette companies spent millions of dollars to solve what the industry identified as the 'filter problem'. These extensive filter research and development efforts suggest a phase of genuine optimism among cigarette designers that cigarette filters could be engineered to mitigate the health hazards of smoking. This paper explores the early history of cigarette filter research and development in order to elucidate why and when seemingly sincere filter engineering efforts devolved into manipulations in cigarette design to sustain cigarette marketing and mitigate consumers' concerns about the health consequences of smoking. Relevant word and phrase searches were conducted in the Legacy Tobacco Documents Library online database, Google Patents, and media and medical databases including ProQuest, JSTOR, Medline and PubMed. 13 tobacco industry documents were identified that track prominent developments involved in what the industry referred to as the 'filter problem'. These reveal a period of intense focus on the 'filter problem' that persisted from the mid-1950s to the mid-1960s, featuring collaborations between cigarette producers and large American chemical and textile companies to develop effective filters. In addition, the documents reveal how cigarette filter researchers' growing scientific knowledge of smoke chemistry led to increasing recognition that filters were unlikely to offer significant health protection. One of the primary concerns of cigarette producers was to design cigarette filters that could be economically incorporated into the massive scale of cigarette production. The synthetic plastic cellulose acetate

  5. Development and experimental verification of a finite element method for accurate analysis of a surface acoustic wave device

    Science.gov (United States)

    Mohibul Kabir, K. M.; Matthews, Glenn I.; Sabri, Ylias M.; Russo, Salvy P.; Ippolito, Samuel J.; Bhargava, Suresh K.

    2016-03-01

    Accurate analysis of surface acoustic wave (SAW) devices is highly important due to their use in ever-growing applications in electronics, telecommunication and chemical sensing. In this study, a novel, experimentally verified approach for analyzing SAW devices was developed based on a series of two-dimensional finite element method (FEM) simulations. It was found that the frequency response of two SAW device structures, each having slightly different bandwidth and center lobe characteristics, can be successfully obtained from the current density of the electrodes via FEM simulations. The two SAW structures were based on XY Lithium Niobate (LiNbO3) substrates and had two and four electrode finger pairs in their interdigital transducers, respectively. SAW devices were then fabricated in accordance with the simulated models, and their measured frequency responses were found to correlate well with the simulation results. The results indicated that a better match between the calculated and measured frequency responses can be obtained when one of the input electrode finger pairs is set to zero volts and all current density components are taken into account when calculating the frequency response of the simulated SAW device structures.

  6. Mechanistic Physiologically Based Pharmacokinetic (PBPK) Model of the Heart Accounting for Inter-Individual Variability: Development and Performance Verification.

    Science.gov (United States)

    Tylutki, Zofia; Mendyk, Aleksander; Polak, Sebastian

    2018-04-01

    Modern model-based approaches to cardiac safety and efficacy assessment require accurate drug concentration-effect relationship establishment. Thus, knowledge of the active concentration of drugs in heart tissue is desirable along with inter-subject variability influence estimation. To that end, we developed a mechanistic physiologically based pharmacokinetic model of the heart. The models were described with literature-derived parameters and written in R, v.3.4.0. Five parameters were estimated. The model was fitted to amitriptyline and nortriptyline concentrations after an intravenous infusion of amitriptyline. The cardiac model consisted of 5 compartments representing the pericardial fluid, heart extracellular water, and epicardial intracellular, midmyocardial intracellular, and endocardial intracellular fluids. Drug cardiac metabolism, passive diffusion, active efflux, and uptake were included in the model as mechanisms involved in the drug disposition within the heart. The model accounted for inter-individual variability. The estimates of optimized parameters were within physiological ranges. The model performance was verified by simulating 5 clinical studies of amitriptyline intravenous infusion, and the simulated pharmacokinetic profiles agreed with clinical data. The results support the model feasibility. The proposed structure can be tested with the goal of improving the patient-specific model-based cardiac safety assessment and offers a framework for predicting cardiac concentrations of various xenobiotics. Copyright © 2018 American Pharmacists Association®. Published by Elsevier Inc. All rights reserved.
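
    A hedged sketch of the compartmental bookkeeping such a model relies on (the rate constants, the single heart compartment and the explicit Euler integration below are invented for illustration; the published model has five cardiac compartments plus mechanistic transport and metabolism terms):

        # Toy PBPK-style mass balance: drug exchange between plasma and one heart
        # compartment, integrated with an explicit Euler step.  Parameters are
        # illustrative only and do not correspond to amitriptyline.
        def simulate(c_plasma, k_in=0.8, k_out=0.5, k_met=0.05, v_heart=0.3,
                     dt=0.01, t_end=12.0):
            """c_plasma(t): plasma concentration [mg/L]; returns (times, heart conc.)."""
            t, a_heart = 0.0, 0.0          # amount of drug in the heart compartment [mg]
            times, conc = [], []
            while t < t_end:
                c_heart = a_heart / v_heart
                a_heart += (k_in * c_plasma(t) - (k_out + k_met) * c_heart) * dt
                times.append(t)
                conc.append(c_heart)
                t += dt
            return times, conc

        ts, cs = simulate(lambda t: 1.0 if t < 2.0 else 0.2)   # crude infusion profile
        print(max(cs))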

  7. Development and verification of an analytical algorithm to predict absorbed dose distributions in ocular proton therapy using Monte Carlo simulations

    International Nuclear Information System (INIS)

    Koch, Nicholas C; Newhauser, Wayne D

    2010-01-01

    Proton beam radiotherapy is an effective and non-invasive treatment for uveal melanoma. Recent research efforts have focused on improving the dosimetric accuracy of treatment planning and overcoming the present limitation of relative analytical dose calculations. Monte Carlo algorithms have been shown to accurately predict dose per monitor unit (D/MU) values, but this has yet to be shown for analytical algorithms dedicated to ocular proton therapy, which are typically less computationally expensive than Monte Carlo algorithms. The objective of this study was to determine if an analytical method could predict absolute dose distributions and D/MU values for a variety of treatment fields like those used in ocular proton therapy. To accomplish this objective, we used a previously validated Monte Carlo model of an ocular nozzle to develop an analytical algorithm to predict three-dimensional distributions of D/MU values from pristine Bragg peaks and therapeutically useful spread-out Bragg peaks (SOBPs). Results demonstrated generally good agreement between the analytical and Monte Carlo absolute dose calculations. While agreement in the proximal region decreased for beams with less penetrating Bragg peaks compared with the open-beam condition, the difference was shown to be largely attributable to edge-scattered protons. A method for including this effect in any future analytical algorithm was proposed. Comparisons of D/MU values showed typical agreement to within 0.5%. We conclude that analytical algorithms can be employed to accurately predict absolute proton dose distributions delivered by an ocular nozzle.
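
    For context (a generic relation commonly used in proton therapy, not the authors' specific algorithm), the depth dose per monitor unit of a spread-out Bragg peak can be written as a weighted superposition of pristine-peak contributions:

        \frac{D_{\mathrm{SOBP}}(z)}{\mathrm{MU}} = \sum_{i} w_i\,\frac{D_{\mathrm{BP},i}(z)}{\mathrm{MU}}

    where D_BP,i(z) are the depth-dose curves of the individual pristine Bragg peaks produced by the range modulator and w_i their relative weights; predicting these contributions in absolute terms for arbitrary field configurations is, roughly speaking, the task of such an analytical algorithm.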

  8. Reload core safety verification

    International Nuclear Information System (INIS)

    Svetlik, M.; Minarcin, M.

    2003-01-01

    This paper presents a brief look at the process of reload core safety evaluation and verification in the Slovak Republic. It gives an overview of the experimental verification of selected nuclear parameters in the course of physics testing during reactor start-up. A comparison of IAEA recommendations and testing procedures at Slovak and European nuclear power plants of similar design is included. The introduction of two-level criteria for the evaluation of tests represents an effort to formulate the relation between safety evaluation and measured values. (Authors)

  9. Development of methodology for characterization of cartridge filters from the IEA-R1 using the Monte Carlo method

    International Nuclear Information System (INIS)

    Costa, Priscila

    2014-01-01

    The Cuno filter is part of the water processing circuit of the IEA-R1 reactor and, when saturated, it is replaced and becomes radioactive waste, which must be managed. In this work, the primary characterization of the Cuno filter of the IEA-R1 nuclear reactor at IPEN was carried out using gamma spectrometry associated with the Monte Carlo method. The gamma spectrometry was performed using a hyperpure germanium (HPGe) detector. The germanium crystal represents the active detection volume of the HPGe detector, which has a region called the dead layer or inactive layer. A difference between theoretical and experimental values when obtaining the efficiency curve of these detectors has been reported in the literature. In this study we used the MCNP-4C code to obtain the detector calibration efficiency for the geometry of the Cuno filter, and the influence of the dead layer and the effect of cascade summing in the HPGe detector were studied. The dead layer values were corrected by varying the thickness and the radius of the germanium crystal. The detector has 75.83 cm³ of active detection volume, according to information provided by the manufacturer. Nevertheless, the results showed that the actual active volume is smaller than the one specified, with the dead layer representing 16% of the total volume of the crystal. The analysis of the Cuno filter by gamma spectrometry enabled the identification of energy peaks. Using these peaks, three radionuclides were identified in the filter: 108mAg, 110mAg and 60Co. From the calibration efficiency obtained by the Monte Carlo method, the activity estimated for these radionuclides is of the order of MBq. (author)
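
    For context, once the full-energy-peak efficiency for the filter geometry has been obtained from the Monte Carlo model, the activity follows from the standard gamma-spectrometry relation (general formula, not quoted from the work itself):

        A = \frac{N_p}{\varepsilon(E)\, I_{\gamma}(E)\, t_c}

    where N_p is the net count in the photopeak at energy E, ε(E) the simulated detection efficiency for the Cuno filter geometry, I_γ(E) the gamma emission probability and t_c the counting live time.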

  10. Bag filters

    Energy Technology Data Exchange (ETDEWEB)

    Yoshida, M; Komeda, I; Takizaki, K

    1982-01-01

    Bag filters are widely used throughout the cement industry for recovering raw materials and products and for improving the environment. Their general mechanism, performance and advantages are shown in a classification table, and there are comparisons and explanations. The outer and inner sectional construction of the Shinto ultra-jet collector for pulverized coal is illustrated and there are detailed descriptions of dust cloud prevention, of measures used against possible sources of ignition, of oxygen supply and of other topics. Finally, explanations are given of matters that require careful and comprehensive study when selecting equipment.

  11. Digital filters

    CERN Document Server

    Hamming, Richard W

    1997-01-01

    Digital signals occur in an increasing number of applications: in telephone communications; in radio, television, and stereo sound systems; and in spacecraft transmissions, to name just a few. This introductory text examines digital filtering, the processes of smoothing, predicting, differentiating, integrating, and separating signals, as well as the removal of noise from a signal. The processes bear particular relevance to computer applications, one of the focuses of this book.Readers will find Hamming's analysis accessible and engaging, in recognition of the fact that many people with the s
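
    As a small illustration of the smoothing operation discussed in the book (a plain moving-average filter, the simplest possible choice of coefficients rather than one of Hamming's designed filters):

        # Simple non-recursive (FIR) smoothing filter: each output sample is the
        # average of the surrounding 2k+1 input samples.
        def moving_average(x, k=2):
            n = len(x)
            out = []
            for i in range(n):
                window = x[max(0, i - k):min(n, i + k + 1)]
                out.append(sum(window) / len(window))
            return out

        noisy = [0.0, 1.2, 0.8, 1.1, 3.9, 1.0, 0.9, 1.2, 1.1, 0.8]
        print(moving_average(noisy))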

  12. Development and verification of coupled fluid-structural dynamic codes for stress analysis of reactor vessel internals under blowdown loading

    International Nuclear Information System (INIS)

    Krieg, R.; Schlechtendahl, E.G.

    1977-01-01

    YAQUIR has been applied to large PWR blowdown problems and compared with LECK results. The structural model of CYLDY2 and the fluid model of YAQUIR have been coupled in the code STRUYA. First tests with the fluid dynamic systems code FLUST have been successful. The incompressible fluid version of the 3D coupled code FLUX for the HDR geometry was checked against some analytical test cases and was used for evaluation of the eigenfrequencies of the coupled system. Several test cases were run with the two-phase flow code SOLA-DF with satisfactory results. Remarkable agreement was found between YAQUIR results and experimental data obtained from shallow water analogy experiments. A test for the investigation of non-equilibrium two-phase flow dynamics has been specified in some detail; the test is to be performed in early 1978 in the water loop of the IRB. Good agreement was found between the natural frequency predictions for the core barrel obtained from CYLDY2 and STRUDL/DYNAL. Work started on improvement of the beam mode treatment in CYLDY2; the name of this modified version will be CYLDY3. The fluid dynamic code SING1, based on an advanced singularity method and applicable to a broad class of highly transient, incompressible 3D problems with negligible viscosity, has been developed and tested. It will be used in connection with the planned laboratory experiments in order to investigate the effect of the core structure on the blowdown process. Coupling of SING1 with structural dynamics is under way. (orig./RW) [de]

  13. Development and verification of an excel program for calculation of monitor units for tangential breast irradiation with external photon beams

    International Nuclear Information System (INIS)

    Woldemariyam, M.G.

    2015-07-01

    The accuracy of MU calculations performed with the Prowess Panther TPS (for Co-60) and Oncentra (for 6 MV and 15 MV x-rays) for tangential breast irradiation was evaluated with measurements made in an anthropomorphic phantom using calibrated Gafchromic EBT2 films. An Excel programme was developed which takes into account the external body surface irregularity of an intact breast or chest wall (and hence the absence of full scatter conditions) using Clarkson's sector summation technique. A single surface contour of the patient, obtained in a transverse plane containing the MU calculation point, is required for effective implementation of the programme. The outputs of the Excel programme were validated against the respective outputs from the 3D treatment planning systems. The variations between the measured point doses and their counterparts calculated by the TPSs were within the range of -4.74% to 4.52% (mean -1.33%, SD 2.69) for the Prowess Panther TPS and -4.42% to 3.14% (mean -1.47%, SD -3.95) for the Oncentra TPS. The observed deviations may be attributed to limitations of the dose calculation algorithm within the TPSs, setup inaccuracies of the phantom during irradiation, and inherent uncertainties associated with radiochromic film dosimetry. The percentage deviations between MUs calculated with the two TPSs and the Excel program were within the range of -3.45% to 3.82% (mean 0.83%, SD 2.25). The observed percentage deviations are within the 4% action level recommended by TG-114. This indicates that the Excel program can be confidently employed for calculating MUs for 2D-planned tangential breast irradiations or for independently verifying MUs calculated with other methods. (au)
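
    A hedged sketch of the Clarkson sector summation mentioned above (the sector count and the scatter function are placeholders; the published programme works from measured scatter data and the patient contour):

        import math

        # Clarkson-style sector integration: average a radius-dependent scatter
        # contribution over equal angular sectors defined by the surface contour.
        def clarkson_scatter(radii_cm, scatter_of_radius):
            """radii_cm: distance from the calculation point to the contour in each sector;
               scatter_of_radius: function radius -> scatter contribution (placeholder)."""
            return sum(scatter_of_radius(r) for r in radii_cm) / len(radii_cm)

        # Placeholder scatter function; a real programme would interpolate measured
        # scatter data for the beam energy and depth of interest.
        toy_scatter = lambda r: 0.12 * (1.0 - math.exp(-0.25 * r))

        sector_radii = [4.0, 5.5, 7.0, 8.0, 7.5, 6.0, 4.5, 3.0]   # e.g. 8 sectors of 45 degrees
        print(clarkson_scatter(sector_radii, toy_scatter))

    Because each sector uses its own contour radius, a missing-scatter situation such as a tangential breast field is handled naturally: shorter radii simply contribute less scatter.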

  14. Development and application of emission models for verification and evaluation of user requirements in the sector of environmental planning

    International Nuclear Information System (INIS)

    Fister, G.

    2001-04-01

    In chapter I, two basic emission models for the calculation of emission inventories are presented: the bottom-up and the top-down approach. Their characteristics, typical international and national fields of application and the requirements for these approaches are discussed. A separate chapter describes a detailed comparison between two different emission balances, which characterize the same regional area but are based on different emission models (top-down and bottom-up approach). The structures of these approaches are analyzed, and emission sectors are adjusted to allow a comparison of detailed emission data. Differences are pointed out and reasons for discrepancies are discussed. Based on the results of this investigation, limits for the fields of application of the two approaches are set and substantiated. An application of the results of this comparison is shown in the following part. In line with the Kyoto Protocol commitment and the Lower Austria Climate Protection Program, the current and future emission situation of Lower Austria is discussed. Other types of emission inventories are included in the discussion, and a top-down-based approach for locally splitting the Austrian reduction potentials of greenhouse gases is developed. Another step in the Lower Austria Climate Protection Program is the investigation of all funding in Lower Austria with respect to its ozone and climate relevance. The survey and evaluation of this funding are described in detail. Further analyses, including quantitative aspects, are made for housing grants. Taking all aspects into consideration, the current situation regarding ozone- and climate-related emissions is shown. Changes in the requirements for emission inventories over the last decade are discussed, and experiences in applying the emission approaches are mentioned. Concluding this work, an outlook on calculating emission inventories is given. (author)

  15. Alarm filtering and presentation

    International Nuclear Information System (INIS)

    Bray, M.A.

    1989-01-01

    This paper discusses alarm filtering and presentation in the control room of nuclear and other process control plants. Alarm generation and presentation is widely recognized as a general process control problem. Alarm systems often fail to provide meaningful alarms to operators. Alarm generation and presentation is an area in which computer aiding is feasible and provides clear benefits. Therefore, researchers have developed several computerized alarm filtering and presentation approaches. This paper discusses problems associated with alarm generation and presentation. Approaches to improving the alarm situation and installation issues of alarm system improvements are discussed. The impact of artificial intelligence (AI) technology on alarm system improvements is assessed. (orig.)

  16. Development of a new rapid isolation device for circulating tumor cells (CTCs) using a 3D palladium filter and its application for genetic analysis.

    Directory of Open Access Journals (Sweden)

    Akiko Yusa

    Full Text Available Circulating tumor cells (CTCs) in the blood of patients with epithelial malignancies provide a promising and minimally invasive source for early detection of metastasis, monitoring of therapeutic effects and basic research addressing the mechanism of metastasis. In this study, we developed a new filtration-based, sensitive CTC isolation device. This device consists of a 3-dimensional (3D) palladium (Pd) filter with an 8 µm-sized pore in the lower layer and a 30 µm-sized pocket in the upper layer to trap CTCs on a filter micro-fabricated by a precise lithography plus electroforming process. This is a simple pump-less device driven by gravity flow and can enrich CTCs from whole blood within 20 min. After on-device staining of CTCs for 30 min, the filter cassette was removed from the device, fixed in a cassette holder and set up on the upright fluorescence microscope. Enumeration and isolation of CTCs for subsequent genetic analysis were completed within 1.5 hr and 2 hr from the start, respectively. Cell spike experiments demonstrated that the recovery rate of tumor cells from blood by this Pd filter device was more than 85%. Single living tumor cells were efficiently isolated from these spiked tumor cells by a micromanipulator, and KRAS mutation, HER2 gene amplification and overexpression, for example, were successfully detected from such isolated single tumor cells. Sequential analysis of blood from mice bearing metastasis revealed that CTC counts increased with progression of metastasis. Furthermore, a significant increase in the number of CTCs from the blood of patients with metastatic breast cancer was observed compared with patients without metastasis and healthy volunteers. These results suggest that this new 3D Pd filter-based device would be a useful tool for the rapid, cost-effective and sensitive detection, enumeration, isolation and genetic analysis of CTCs from peripheral blood in both preclinical and clinical settings.

  17. Stack filter classifiers

    Energy Technology Data Exchange (ETDEWEB)

    Porter, Reid B [Los Alamos National Laboratory; Hush, Don [Los Alamos National Laboratory

    2009-01-01

    Just as linear models generalize the sample mean and weighted average, weighted order statistic models generalize the sample median and weighted median. This analogy can be continued informally to generalized additive models in the case of the mean, and Stack Filters in the case of the median. Both of these model classes have been extensively studied for signal and image processing, but it is surprising to find that for pattern classification, their treatment has been significantly one-sided. Generalized additive models are now a major tool in pattern classification and many different learning algorithms have been developed to fit model parameters to finite data. However, Stack Filters remain largely confined to signal and image processing and learning algorithms for classification are yet to be seen. This paper is a step towards Stack Filter Classifiers and it shows that the approach is interesting from both a theoretical and a practical perspective.
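
    For readers unfamiliar with the building block the analogy rests on, the weighted median that weighted order statistic models (and hence stack filters) generalize can be sketched in a few lines of Python; this is textbook material, not code from the paper.

```python
def weighted_median(values, weights):
    """Smallest value v such that the samples <= v carry at least half the total weight."""
    pairs = sorted(zip(values, weights))
    total = sum(weights)
    cumulative = 0.0
    for value, weight in pairs:
        cumulative += weight
        if 2 * cumulative >= total:
            return value
    return pairs[-1][0]  # only reachable with degenerate weights

# Equal weights recover the ordinary median, mirroring the mean/weighted-average analogy:
print(weighted_median([5, 1, 3], [1, 1, 1]))    # 3
print(weighted_median([5, 1, 3], [1, 4, 1]))    # 1, the heavily weighted sample
```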

  18. Integrated knowledge base tool for acquisition and verification of NPP alarm systems

    International Nuclear Information System (INIS)

    Park, Joo Hyun; Seong, Poong Hyun

    1998-01-01

    Knowledge acquisition and knowledge base verification are important activities in developing knowledge-based systems such as alarm processing systems. In this work, we developed an integrated tool for knowledge acquisition and verification of NPP alarm processing systems using the G2 tool. The tool integrates the document analysis method and the ECPN matrix analysis method for knowledge acquisition and knowledge verification, respectively. This tool enables knowledge engineers to perform their tasks from knowledge acquisition to knowledge verification consistently.

  19. Verification in Referral-Based Crowdsourcing

    Science.gov (United States)

    Naroditskiy, Victor; Rahwan, Iyad; Cebrian, Manuel; Jennings, Nicholas R.

    2012-01-01

    Online social networks offer unprecedented potential for rallying a large number of people to accomplish a given task. Here we focus on information gathering tasks where rare information is sought through “referral-based crowdsourcing”: the information request is propagated recursively through invitations among members of a social network. Whereas previous work analyzed incentives for the referral process in a setting with only correct reports, misreporting is known to be both pervasive in crowdsourcing applications, and difficult/costly to filter out. A motivating example for our work is the DARPA Red Balloon Challenge where the level of misreporting was very high. In order to undertake a formal study of verification, we introduce a model where agents can exert costly effort to perform verification and false reports can be penalized. This is the first model of verification and it provides many directions for future research, which we point out. Our main theoretical result is the compensation scheme that minimizes the cost of retrieving the correct answer. Notably, this optimal compensation scheme coincides with the winning strategy of the Red Balloon Challenge. PMID:23071530
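
    The abstract notes that the cost-minimizing compensation scheme coincides with the winning strategy of the Red Balloon Challenge, in which rewards halve at each step up the referral chain. A small sketch of that geometric split follows; the concrete dollar amounts are taken from the publicly reported MIT scheme, not from this paper.

```python
def referral_rewards(chain_length, finder_reward=2000.0, ratio=0.5):
    """Reward the finder and each ancestor on the referral chain, scaling by `ratio`
    at every step up the chain (geometric recursive incentive)."""
    return [finder_reward * ratio ** level for level in range(chain_length)]

print(referral_rewards(4))          # [2000.0, 1000.0, 500.0, 250.0]
print(sum(referral_rewards(50)))    # total payout stays below 2 * finder_reward
```

    The geometric series bounds the total payout at twice the finder's reward regardless of how long the chain is, which is what makes the scheme budget-safe.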

  20. Temperature Modeling of Lost Creek Lake Using CE-QUAL-W2: A Report on the Development, Calibration, Verification, and Application of the Model

    Science.gov (United States)

    2017-05-01

    Daily Loads and Rogue Spring Chinook Conservation Plan require the Corps to review the Rogue Basin Project operations to determine whether improvements … [The rest of this record consists of report fragments: table-of-contents entries for flow (cfs) for CY03 verification and temperature statistics (deg C), and staff acknowledgements.]

  1. Verification and nuclear material security

    International Nuclear Information System (INIS)

    ElBaradei, M.

    2001-01-01

    Full text: The Director General will open the symposium by presenting a series of challenges facing the international safeguards community: the need to ensure a robust system, with strong verification tools and a sound research and development programme; the importance of securing the necessary support for the system, in terms of resources; the effort to achieve universal participation in the non-proliferation regime; and the necessity of re-energizing disarmament efforts. Special focus will be given to the challenge underscored by recent events, of strengthening international efforts to combat nuclear terrorism. (author)

  2. Mobile filters in nuclear engineering

    International Nuclear Information System (INIS)

    Meuter, R.

    1979-01-01

    The need for filters with high efficiencies which may be used at any place originated in nuclear power plants. Filters of this type, called Filtermobil, have been developed by Sulzer. They have been used successfully in nuclear plants for several years. (orig.) [de

  3. Neutron absorbers and detector types for spent fuel verification using the self-interrogation neutron resonance densitometry

    International Nuclear Information System (INIS)

    Rossa, Riccardo; Borella, Alessandro; Labeau, Pierre-Etienne; Pauly, Nicolas; Meer, Klaas van der

    2015-01-01

    The Self-Interrogation Neutron Resonance Densitometry (SINRD) is a passive non-destructive assay (NDA) technique that is proposed for the direct measurement of 239Pu in a spent fuel assembly. The insertion of neutron detectors wrapped with different neutron absorbing materials, or neutron filters, in the central guide tube of a PWR fuel assembly is envisaged to measure the neutron flux in the energy region close to the 0.3 eV resonance of 239Pu. In addition, the measurement of the fast neutron flux is foreseen. This paper is focused on the determination of the Gd and Cd neutron filter thicknesses that maximize the detection of neutrons within the resonance region. Moreover, several detector types are compared to identify the optimal condition and to assess the expected total neutron counts that can be obtained with the SINRD measurements. Results from Monte Carlo simulations showed that ranges between 0.1–0.3 mm and 0.5–1.0 mm ensure the optimal conditions for the Gd and Cd filters, respectively. Moreover, a 239Pu fission chamber is better suited to measure neutrons close to the 0.3 eV resonance and it has the highest sensitivity to 239Pu, in comparison with a 235U fission chamber, with a 3He proportional counter, and with a 10B proportional counter. The use of a thin Gd filter and a thick Cd filter is suggested for the 239Pu and 235U fission chambers to increase the total counts achieved in a measurement, while a thick Gd filter and a thin Cd filter are envisaged for the 3He and 10B proportional counters to increase the sensitivity to 239Pu. We concluded that an optimization process that takes into account measurement time, filter thickness, and detector size is needed to develop a SINRD detector that can meet the requirement for an efficient verification of spent fuel assemblies.
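
    The filter-thickness trade-off above is, at its core, exponential attenuation: a thicker absorber suppresses more of the strongly absorbed part of the spectrum but also eats into the flux near the 0.3 eV window. A minimal sketch of that arithmetic follows, with order-of-magnitude cross-section values chosen for illustration only (they are not taken from the paper).

```python
import math

AVOGADRO = 6.022e23  # atoms per mole

def transmission(thickness_cm, density_g_cm3, molar_mass_g_mol, sigma_barn):
    """Neutron transmission through a slab: T = exp(-N * sigma * t)."""
    number_density = density_g_cm3 * AVOGADRO / molar_mass_g_mol   # atoms / cm^3
    return math.exp(-number_density * sigma_barn * 1e-24 * thickness_cm)

# 0.5 mm of cadmium, comparing an energy range where the cross section is large
# (strongly absorbed) with one where it is small (mostly transmitted):
print(transmission(0.05, 8.65, 112.4, 2500.0))   # ~0.003 -> effectively removed
print(transmission(0.05, 8.65, 112.4, 50.0))     # ~0.89  -> mostly passes
```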

  4. Verification of a CT scanner using a miniature step gauge

    DEFF Research Database (Denmark)

    Cantatore, Angela; Andreasen, J.L.; Carmignato, S.

    2011-01-01

    The work deals with performance verification of a CT scanner using a 42 mm miniature replica step gauge developed for optical scanner verification. Error quantification and optimization of the CT system set-up in terms of resolution and measurement accuracy are fundamental for use of CT scanning...

  5. Portable system for periodical verification of area monitors for neutrons

    International Nuclear Information System (INIS)

    Souza, Luciane de R.; Leite, Sandro Passos; Lopes, Ricardo Tadeu; Patrao, Karla C. de Souza; Fonseca, Evaldo S. da; Pereira, Walsan W.

    2009-01-01

    The Neutrons Laboratory is developing a project aimed at the construction of a portable test system for verification of the functioning conditions of neutron area monitors. This device will allow users to verify, at the installations where the instruments are used, that their calibration is maintained, avoiding the use of equipment with an inadequate response to the neutron beam.

  6. Neutron spectrometric methods for core inventory verification in research reactors

    International Nuclear Information System (INIS)

    Ellinger, A.; Filges, U.; Hansen, W.; Knorr, J.; Schneider, R.

    2002-01-01

    In consequence of the Non-Proliferation Treaty safeguards, inspections are periodically made in nuclear facilities by the IAEA and the EURATOM Safeguards Directorate. The inspection methods are permanently improved. Therefore, the Core Inventory Verification method is being developed as an indirect method for the verification of the core inventory and to check the declared operation of research reactors

  7. Is flow verification necessary

    International Nuclear Information System (INIS)

    Beetle, T.M.

    1986-01-01

    Safeguards test statistics are used in an attempt to detect diversion of special nuclear material. Under assumptions concerning possible manipulation (falsification) of safeguards accounting data, the effects on the statistics due to diversion and data manipulation are described algebraically. A comprehensive set of statistics that is capable of detecting any diversion of material is defined in terms of the algebraic properties of the effects. When the assumptions exclude collusion between persons in two material balance areas, then three sets of accounting statistics are shown to be comprehensive. Two of the sets contain widely known accountancy statistics. One of them does not require physical flow verification - comparisons of operator and inspector data for receipts and shipments. The third set contains a single statistic which does not require physical flow verification. In addition to not requiring technically difficult and expensive flow verification, this single statistic has several advantages over other comprehensive sets of statistics. This algebraic approach as an alternative to flow verification for safeguards accountancy is discussed in this paper
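
    The accountancy statistics referred to above are built on the material balance for each material balance area. A minimal sketch of that bookkeeping and of a generic alarm test follows; the D-statistics and collusion analysis of the paper are not reproduced, and the 3-sigma threshold is an illustrative choice.

```python
def muf(beginning_inventory, receipts, shipments, ending_inventory):
    """Material Unaccounted For over one balance period:
    MUF = beginning inventory + receipts - shipments - ending inventory."""
    return beginning_inventory + receipts - shipments - ending_inventory

def alarm(muf_value, sigma_muf, k=3.0):
    """Flag the balance if |MUF| exceeds k times its measurement uncertainty."""
    return abs(muf_value) > k * sigma_muf

balance = muf(100.0, 40.0, 35.0, 103.0)   # kg of nuclear material (illustrative numbers)
print(balance, alarm(balance, sigma_muf=1.5))
```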

  8. Integrated Java Bytecode Verification

    DEFF Research Database (Denmark)

    Gal, Andreas; Probst, Christian; Franz, Michael

    2005-01-01

    Existing Java verifiers perform an iterative data-flow analysis to discover the unambiguous type of values stored on the stack or in registers. Our novel verification algorithm uses abstract interpretation to obtain definition/use information for each register and stack location in the program...

  9. Simulation Environment Based on the Universal Verification Methodology

    CERN Document Server

    AUTHOR|(SzGeCERN)697338

    2017-01-01

    Universal Verification Methodology (UVM) is a standardized approach to verifying integrated circuit designs, targeting Coverage-Driven Verification (CDV). It combines automatic test generation, self-checking testbenches, and coverage metrics to indicate progress in the design verification. The flow of CDV differs from the traditional directed-testing approach. With CDV, a testbench developer starts with a structured plan that sets the verification goals. Those goals are then targeted by the developed testbench, which generates legal stimuli and sends them to a device under test (DUT). The progress is measured by coverage monitors added to the simulation environment. In this way, the non-exercised functionality can be identified. Moreover, additional scoreboards indicate undesired DUT behaviour. Such verification environments were developed for three recent ASIC and FPGA projects which have successfully implemented the new work-flow: (1) the CLICpix2 65 nm CMOS hybrid pixel readout ASIC desi...
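
    The CDV loop described above, constrained-random stimulus, a self-checking scoreboard, and functional coverage that tells you when to stop, can be illustrated without any UVM machinery. The sketch below is plain Python with a stand-in adder as the DUT; it mirrors the structure of such an environment but is not tied to UVM or to the ASICs mentioned here.

```python
import random

def dut_adder(a, b):
    """Stand-in device under test: an 8-bit adder model."""
    return (a + b) & 0xFF

def reference_model(a, b):
    """Golden model used by the scoreboard."""
    return (a + b) % 256

random.seed(0)
carry_cases_covered = set()     # functional coverage: carry-out seen / not seen
mismatches = 0

for _ in range(1000):
    a, b = random.randrange(256), random.randrange(256)    # random legal stimulus
    carry_cases_covered.add(a + b > 0xFF)                   # coverage monitor
    if dut_adder(a, b) != reference_model(a, b):            # scoreboard check
        mismatches += 1

print("mismatches:", mismatches)
print("coverage:", len(carry_cases_covered), "of 2 carry cases exercised")
```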

  10. ETV TEST REPORT OF CONTROL OF BIOAEROSOLS IN HVAC SYSTEMS GLASFLOSS INDUSTRIES EXCEL FILTER, MODEL SBG24242898

    Science.gov (United States)

    The Environmental Technology Verification report discusses the technology and performance of the Excel Filter, Model SBG24242898 air filter for dust and bioaerosol filtration manufactured by Glasfloss Industries, Inc. The pressure drop across the filter was 82 Pa clean and 348 Pa...

  11. Development of a broadband reflective T-filter for voltage biasing high-Q superconducting microwave cavities

    International Nuclear Information System (INIS)

    Hao, Yu; Rouxinol, Francisco; LaHaye, M. D.

    2014-01-01

    We present the design of a reflective stop-band filter based on quasi-lumped elements that can be utilized to introduce large dc and low-frequency voltage biases into a low-loss superconducting coplanar waveguide (CPW) cavity. Transmission measurements of the filter are seen to be in good agreement with simulations and demonstrate insertion losses greater than 20 dB in the range of 3–10 GHz. Moreover, transmission measurements of the CPW's fundamental mode demonstrate that loaded quality factors exceeding 10^5 can be achieved with this design for dc voltages as large as 20 V and for the cavity operated in the single-photon regime. This makes the design suitable for use in a number of applications including qubit-coupled mechanical systems and circuit QED

  12. KfK Laboratory for Aerosol Physics and Filter Technology. Progress report and development activities in 1990

    International Nuclear Information System (INIS)

    1991-03-01

    The activities undertaken by the laboratory for aerosol physics and filter technology (LAF) in 1990 under the following projects are described: (1) nuclear safety research (safety and material problems of fast breeders, IWR-oriented safety research); (2) pollutant control in the environment (communal waste management, emission-reducing processes, climate research - pollutants' behaviour in the atmosphere), and (3) radioactive waste management (basic work on reprocessing technologies). The annex lists the publications by the LAF staff. (BBR) [de

  13. Tunable Multiband Microwave Photonic Filters

    Directory of Open Access Journals (Sweden)

    Mable P. Fok

    2017-11-01

    Full Text Available The increasing demand for multifunctional devices, the use of cognitive wireless technology to solve the frequency resource shortage problem, and the capabilities and operational flexibility necessary to meet an ever-changing environment result in an urgent need for multiband wireless communications. Spectral filters are an essential part of any communication system, and in the case of multiband wireless communications, tunable multiband RF filters are required for channel selection, noise/interference removal, and RF signal processing. Unfortunately, it is difficult for RF electronics to achieve both tunable and multiband spectral filtering. Microwave photonics has recently proven itself to be a promising candidate for solving various challenges in RF electronics, including spectral filtering; however, the development of multiband microwave photonic filtering still faces many difficulties, due to the limited scalability and tunability of existing microwave photonic schemes. In this review paper, we first discuss the challenges faced by multiband microwave photonic filters, then we review recent techniques that have been developed to tackle these challenges and lead to promising developments of tunable microwave photonic multiband filters. The successful design and implementation of tunable microwave photonic multiband filters facilitates the vision of dynamic multiband wireless communications and radio frequency signal processing for commercial, defense, and civilian applications.

  14. The OSIRIS diffractometer and polarisation analysis spectrometer at ISIS. New developments and 3He spin-filter polarisation analysis

    International Nuclear Information System (INIS)

    Andersen, Ken H.; Marero, David Martin y; Barlow, Michael J.

    2001-01-01

    OSIRIS combines a long-wavelength powder diffractometer with a polarisation analysis backscattering spectrometer. The diffractometer can access wavelengths up to 70 Å with a resolution of better than 1% Δd/d. The very high counting rate at shorter wavelengths is ideal for in-situ, real-time and parametric experiments. The spectroscopy section incorporates an array of graphite crystals arranged in near-backscattering to give a high counting rate with 25 μeV energy resolution. The incident beam is polarised using a supermirror bender and the scattered beam is polarisation-analysed by a 3He spin-filter, which is in the process of being constructed. The spin-filter system consists of a fibre laser, a peristaltic pump and a wide-angle banana-shaped quartz cell in a continuous-flow setup. The scattered beam passes twice through the spin-filter cell, thus doubling the optical path length in the cell. The aim is to achieve 70% nuclear polarisation with no variation in time. (author)

  15. Advanced Filtering Techniques Applied to Spaceflight, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — IST-Rolla developed two nonlinear filters for spacecraft orbit determination during the Phase I contract. The theta-D filter and the cost based filter, CBF, were...

  16. The design of verification regimes

    International Nuclear Information System (INIS)

    Gallagher, N.W.

    1991-01-01

    Verification of a nuclear agreement requires more than knowledge of relevant technologies and institutional arrangements. It also demands thorough understanding of the nature of verification and the politics of verification design. Arms control efforts have been stymied in the past because key players agreed to verification in principle, only to disagree radically over verification in practice. In this chapter, it is shown that the success and stability of arms control endeavors can be undermined by verification designs which promote unilateral rather than cooperative approaches to security, and which may reduce, rather than enhance, the security of both sides. Drawing on logical analysis and practical lessons from previous superpower verification experience, this chapter summarizes the logic and politics of verification and suggests implications for South Asia. The discussion begins by determining what properties all forms of verification have in common, regardless of the participants or the substance and form of their agreement. Viewing verification as the political process of making decisions regarding the occurrence of cooperation points to four critical components: (1) determination of principles, (2) information gathering, (3) analysis and (4) projection. It is shown that verification arrangements differ primarily in regards to how effectively and by whom these four stages are carried out

  17. Heavy water physical verification in power plants

    International Nuclear Information System (INIS)

    Morsy, S.; Schuricht, V.; Beetle, T.; Szabo, E.

    1986-01-01

    This paper is a report on the Agency's experience in verifying heavy water inventories in power plants. The safeguards objectives and goals for such activities are defined in the paper. The heavy water is stratified according to the flow within the power plant, including upgraders. A safeguards scheme based on a combination of records auditing, comparing records and reports, and physical verification has been developed. This scheme has elevated the status of heavy water safeguards to a level comparable to nuclear material safeguards in bulk facilities. It leads to attribute and variable verification of the heavy water inventory in the different system components and in the store. The verification methods include volume and weight determination, sampling and analysis, non-destructive assay (NDA), and criticality check. The different measurement methods and their limits of accuracy are discussed in the paper.

  18. Packaged low-level waste verification system

    International Nuclear Information System (INIS)

    Tuite, K.T.; Winberg, M.; Flores, A.Y.; Killian, E.W.; McIsaac, C.V.

    1996-01-01

    Currently, states and low-level radioactive waste (LLW) disposal site operators have no method of independently verifying the radionuclide content of packaged LLW that arrives at disposal sites for disposal. At this time, disposal sites rely on LLW generator shipping manifests and accompanying records to ensure that LLW received meets the waste acceptance criteria. An independent verification system would provide a method of checking generator LLW characterization methods and help ensure that LLW disposed of at disposal facilities meets requirements. The Mobile Low-Level Waste Verification System (MLLWVS) provides the equipment, software, and methods to enable the independent verification of LLW shipping records to ensure that disposal site waste acceptance criteria are being met. The MLLWVS system was developed under a cost share subcontract between WMG, Inc., and Lockheed Martin Idaho Technologies through the Department of Energy's National Low-Level Waste Management Program at the Idaho National Engineering Laboratory (INEL)

  19. Comparing formal verification approaches of interlocking systems

    DEFF Research Database (Denmark)

    Haxthausen, Anne Elisabeth; Nguyen, Hoang Nga; Roggenbach, Markus

    2016-01-01

    The verification of railway interlocking systems is a challenging task, and therefore several research groups have suggested to improve this task by using formal methods, but they use different modelling and verification approaches. To advance this research, there is a need to compare these approaches. As a first step towards this, in this paper we suggest a way to compare different formal approaches for verifying designs of route-based interlocking systems and we demonstrate it on modelling and verification approaches developed within the research groups at DTU/Bremen and at Surrey/Swansea. The focus is on designs that are specified by so-called control tables. The paper can serve as a starting point for further comparative studies. The DTU/Bremen research has been funded by the RobustRailS project granted by Innovation Fund Denmark. The Surrey/Swansea research has been funded by the Safe...

  20. Verification test report on a solar heating and hot water system

    Science.gov (United States)

    1978-01-01

    Information is provided on the development, qualification and acceptance verification of commercial solar heating and hot water systems and components. The verification includes the performances, the efficiencies and the various methods used, such as similarity, analysis, inspection, test, etc., that are applicable to satisfying the verification requirements.

  1. Comparison between In-house developed and Diamond commercial software for patient specific independent monitor unit calculation and verification with heterogeneity corrections.

    Science.gov (United States)

    Kuppusamy, Vijayalakshmi; Nagarajan, Vivekanandan; Jeevanandam, Prakash; Murugan, Lavanya

    2016-02-01

    The study aimed to compare two different monitor unit (MU) or dose verification software packages in volumetric modulated arc therapy (VMAT) using a modified Clarkson's integration technique for 6 MV photon beams. An in-house Excel spreadsheet-based monitor unit verification calculation (MUVC) program and PTW's DIAMOND secondary check software (SCS), version 6, were used as a secondary check to verify the monitor units (MU) or dose calculated by the treatment planning system (TPS). In this study 180 patients were grouped into 61 head and neck, 39 thorax and 80 pelvic sites. Verification plans were created using the PTW OCTAVIUS-4D phantom and also measured using the 729 detector chamber and array, with the isocentre as the point of measurement for each field. In the analysis of 154 clinically approved VMAT plans with isocentre at a region above -350 HU, using heterogeneity corrections, the in-house spreadsheet-based MUVC program and Diamond SCS showed good agreement with the TPS. The overall percentage average deviations for all sites were (-0.93% ± 1.59%) and (1.37% ± 2.72%) for the in-house Excel spreadsheet-based MUVC program and Diamond SCS, respectively. For 26 clinically approved VMAT plans with isocentre at a region below -350 HU, both the in-house spreadsheet-based MUVC program and Diamond SCS showed higher variations. It can be concluded that for patient-specific quality assurance (QA), the in-house Excel spreadsheet-based MUVC program and Diamond SCS can be used as a simple and fast accompaniment to measurement-based verification for plans with isocentre at a region above -350 HU. Copyright © 2016 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
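
    The pass/fail decision in such a secondary check reduces to a percentage deviation between the TPS monitor units and the independent calculation, compared against the 4% action level cited from TG-114. A trivial sketch follows, with the sign convention as an assumption.

```python
def percent_deviation(mu_tps, mu_check):
    """Deviation of the independent check from the TPS value, as a percentage of the TPS value."""
    return (mu_tps - mu_check) / mu_tps * 100.0

def within_action_level(mu_tps, mu_check, action_level_pct=4.0):
    return abs(percent_deviation(mu_tps, mu_check)) <= action_level_pct

print(percent_deviation(250.0, 243.0))      # 2.8 -> inside a 4% action level
print(within_action_level(250.0, 243.0))    # True
```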

  2. Key Nuclear Verification Priorities: Safeguards and Beyond

    International Nuclear Information System (INIS)

    Carlson, J.

    2010-01-01

    In addressing nuclear verification priorities, we should look beyond the current safeguards system. Non-proliferation, which the safeguards system underpins, is not an end in itself, but an essential condition for achieving and maintaining nuclear disarmament. Effective safeguards are essential for advancing disarmament, and safeguards issues, approaches and techniques are directly relevant to the development of future verification missions. The extent to which safeguards challenges are successfully addressed - or otherwise - will impact not only on confidence in the safeguards system, but on the effectiveness of, and confidence in, disarmament verification. To identify the key nuclear verification priorities, we need to consider the objectives of verification, and the challenges to achieving these. The strategic objective of IAEA safeguards might be expressed as: To support the global nuclear non-proliferation regime by: - Providing credible assurance that states are honouring their safeguards commitments - thereby removing a potential motivation to proliferate; and - Early detection of misuse of nuclear material and technology - thereby deterring proliferation by the risk of early detection, enabling timely intervention by the international community. Or to summarise - confidence-building, detection capability, and deterrence. These will also be essential objectives for future verification missions. The challenges to achieving these involve a mix of political, technical and institutional dimensions. Confidence is largely a political matter, reflecting the qualitative judgment of governments. Clearly assessments of detection capability and deterrence have a major impact on confidence. Detection capability is largely thought of as 'technical', but also involves issues of legal authority, as well as institutional issues. Deterrence has both political and institutional aspects - including judgments on risk of detection and risk of enforcement action being taken. The

  4. Operational flood-forecasting in the Piemonte region – development and verification of a fully distributed physically-oriented hydrological model

    Directory of Open Access Journals (Sweden)

    D. Rabuffetti

    2009-03-01

    Full Text Available A hydrological model for real-time flood forecasting for Civil Protection services requires reliability and rapidity. At present, computational capabilities satisfy the rapidity requirements even when a fully distributed hydrological model is adopted for a large river catchment such as the Upper Po river basin closed at Ponte Becca (nearly 40 000 km2). This approach allows simulating the whole domain and obtaining the responses of large as well as of medium and small sized sub-catchments. The FEST-WB hydrological model (Mancini, 1990; Montaldo et al., 2007; Rabuffetti et al., 2008) is implemented. The calibration and verification activities are based on more than 100 flood events that occurred along the main tributaries of the Po river in the period 2000–2003. More than 300 meteorological stations are used to obtain the forcing fields, 10 cross sections with continuous and reliable discharge time series are used for calibration, while verification is performed on about 40 monitored cross sections. Furthermore, meteorological forecasting models are used to force the hydrological model with Quantitative Precipitation Forecasts (QPFs) for a 36 h horizon in "operational setting" experiments. Particular care is devoted to understanding how the QPF affects the accuracy of the Quantitative Discharge Forecasts (QDFs) and to assessing the QDF uncertainty impact on the warning system reliability. Results are presented both in terms of QDF and of warning issues, highlighting the importance of an "operational based" verification approach.
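
    When discharge forecasts are verified against monitored cross sections, a skill score is computed from the observed and simulated hydrographs. The abstract does not state which score the authors used; the Nash–Sutcliffe efficiency sketched below is simply one common choice for this kind of verification.

```python
def nash_sutcliffe(observed, simulated):
    """NSE = 1 - sum((obs - sim)^2) / sum((obs - mean(obs))^2).
    1.0 is a perfect forecast, 0.0 is no better than the observed mean."""
    mean_obs = sum(observed) / len(observed)
    residual = sum((o - s) ** 2 for o, s in zip(observed, simulated))
    variance = sum((o - mean_obs) ** 2 for o in observed)
    return 1.0 - residual / variance

obs = [120.0, 340.0, 510.0, 280.0, 150.0]   # observed discharge (m^3/s), illustrative
sim = [110.0, 320.0, 560.0, 300.0, 140.0]   # simulated discharge for the same times
print(round(nash_sutcliffe(obs, sim), 3))   # ~0.965
```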

  5. Convergent Filter Bases

    Directory of Open Access Journals (Sweden)

    Coghetto Roland

    2015-09-01

    Full Text Available We are inspired by the work of Henri Cartan [16], Bourbaki [10] (TG. I Filtres) and Claude Wagschal [34]. We define the base of a filter, image filter, convergent filter bases, limit filter and the filter base of tails (fr: filtre des sections).
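
    For reference, the standard notions being formalized can be restated as follows (a plain restatement; the Mizar formalization may phrase them differently):

```latex
\paragraph{Filter base.} A family $\mathcal{B}$ of nonempty subsets of a set $X$ is a
filter base if for all $B_1, B_2 \in \mathcal{B}$ there exists $B_3 \in \mathcal{B}$ with
$B_3 \subseteq B_1 \cap B_2$. The filter it generates is
$\{\, A \subseteq X : \exists B \in \mathcal{B},\; B \subseteq A \,\}$.

\paragraph{Convergent filter base.} In a topological space, $\mathcal{B}$ converges to a
point $x$ if every neighbourhood of $x$ contains some member of $\mathcal{B}$
(equivalently, the generated filter refines the neighbourhood filter of $x$).
```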

  6. Verification of Timed-Arc Petri Nets

    DEFF Research Database (Denmark)

    Jacobsen, Lasse; Jacobsen, Morten; Møller, Mikael Harkjær

    2011-01-01

    of interesting theoretical properties distinguishing them from other time extensions of Petri nets. We shall give an overview of the recent theory developed in the verification of TAPN extended with features like read/transport arcs, timed inhibitor arcs and age invariants. We will examine in detail...

  7. RESRAD-BUILD verification

    International Nuclear Information System (INIS)

    Kamboj, S.; Yu, C.; Biwer, B. M.; Klett, T.

    2002-01-01

    The results generated by the RESRAD-BUILD code (version 3.0) were verified with hand or spreadsheet calculations using equations given in the RESRAD-BUILD manual for different pathways. For verification purposes, different radionuclides--H-3, C-14, Na-22, Al-26, Cl-36, Mn-54, Co-60, Au-195, Ra-226, Ra-228, Th-228, and U-238--were chosen to test all pathways and models. Tritium, Ra-226, and Th-228 were chosen because of the special tritium and radon models in the RESRAD-BUILD code. Other radionuclides were selected to represent a spectrum of radiation types and energies. Verification of the RESRAD-BUILD code was conducted with an initial check of all the input parameters for correctness against their original source documents. Verification of the calculations was performed external to the RESRAD-BUILD code with Microsoft Excel to verify all the major portions of the code. In some cases, RESRAD-BUILD results were compared with those of external codes, such as MCNP (Monte Carlo N-particle) and RESRAD. The verification was conducted on a step-by-step basis and used different test cases as templates. The following types of calculations were investigated: (1) source injection rate, (2) air concentration in the room, (3) air particulate deposition, (4) radon pathway model, (5) tritium model for volume source, (6) external exposure model, (7) different pathway doses, and (8) time dependence of dose. Some minor errors were identified in version 3.0; these errors have been corrected in later versions of the code. Some possible improvements in the code were also identified
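
    As an example of the kind of hand calculation such a verification relies on, the simplest indoor-air check is a single well-mixed room at steady state. The sketch below is a generic one-box balance, not the RESRAD-BUILD indoor air model itself; the removal terms and numbers are illustrative.

```python
def steady_state_air_concentration(source_rate_Bq_per_h, room_volume_m3,
                                   air_exchanges_per_h, deposition_per_h=0.0,
                                   decay_per_h=0.0):
    """Single well-mixed room at steady state:
    C = S / (V * (lambda_ventilation + lambda_deposition + lambda_decay))."""
    total_removal = air_exchanges_per_h + deposition_per_h + decay_per_h
    return source_rate_Bq_per_h / (room_volume_m3 * total_removal)

# 1000 Bq/h injected into a 50 m^3 room with one air change per hour:
print(steady_state_air_concentration(1000.0, 50.0, 1.0))   # 20 Bq/m^3
```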

  8. Ageing studies for the ATLAS MDT muonchambers and development of a gas filter to prevent drift tube ageing

    International Nuclear Information System (INIS)

    Koenig, S.

    2008-01-01

    The muon spectrometer of the ATLAS detector, which is currently being assembled at the LHC accelerator at CERN, uses drift tubes as basic detection elements over most of the solid angle. The performance of these monitored drift tubes (MDTs), in particular their spatial resolution of 80 μm, determines the precision of the spectrometer. If ageing effects occur, the precision of the drift tubes will be degraded. Hence ageing effects have to be minimized or avoided altogether if possible. Even with a gas mixture of Ar:CO2 = 93:7, which was selected for its good ageing properties, ageing effects were observed in test systems. They were caused by small amounts of impurities, in particular volatile silicon compounds. Systematic studies revealed the impurity levels required to deteriorate the drift tubes to be well below 1 ppm. Many components of the ATLAS MDT gas system are supplied by industry. In a newly designed ageing experiment in Freiburg these components were validated for their use in ATLAS. With a fully assembled ATLAS gas distribution rack as the test component, ageing effects were observed. It was therefore decided to install gas filters in the gas distribution lines to remove volatile silicon compounds efficiently from the gas mixture. Finally a filter was designed that can adsorb up to 5.5 g of volatile silicon compounds, thereby reducing the impurities in the outlet gas mixture to less than 30 ppb. (orig.)
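
    The quoted capacity and impurity levels translate directly into a filter lifetime once a gas flow rate is assumed. The sketch below shows that arithmetic; the flow rate, impurity concentration and molar mass are assumptions for illustration, not values from the thesis.

```python
def filter_lifetime_days(capacity_g, flow_l_per_h, impurity_ppm_vol,
                         molar_mass_g_mol, molar_volume_l=24.0):
    """Days until an adsorber of the given capacity saturates, for a constant
    volumetric impurity concentration in the supply gas (ideal-gas approximation)."""
    impurity_l_per_h = flow_l_per_h * impurity_ppm_vol * 1e-6
    impurity_g_per_h = impurity_l_per_h / molar_volume_l * molar_mass_g_mol
    return capacity_g / impurity_g_per_h / 24.0

# 5.5 g capacity, 100 l/h of drift gas carrying 0.1 ppm of a ~162 g/mol siloxane:
print(round(filter_lifetime_days(5.5, 100.0, 0.1, 162.0)))   # on the order of 3000 days
```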

  9. Ageing studies for the ATLAS MDT muonchambers and development of a gas filter to prevent drift tube ageing

    Energy Technology Data Exchange (ETDEWEB)

    Koenig, S.

    2008-01-15

    The muon spectrometer of the ATLAS detector, which is currently being assembled at the LHC accelerator at CERN, uses drift tubes as basic detection elements over most of the solid angle. The performance of these monitored drift tubes (MDTs), in particular their spatial resolution of 80 μm, determines the precision of the spectrometer. If ageing effects occur, the precision of the drift tubes will be degraded. Hence ageing effects have to be minimized or avoided altogether if possible. Even with a gas mixture of Ar:CO2 = 93:7, which was selected for its good ageing properties, ageing effects were observed in test systems. They were caused by small amounts of impurities, in particular volatile silicon compounds. Systematic studies revealed the impurity levels required to deteriorate the drift tubes to be well below 1 ppm. Many components of the ATLAS MDT gas system are supplied by industry. In a newly designed ageing experiment in Freiburg these components were validated for their use in ATLAS. With a fully assembled ATLAS gas distribution rack as the test component, ageing effects were observed. It was therefore decided to install gas filters in the gas distribution lines to remove volatile silicon compounds efficiently from the gas mixture. Finally a filter was designed that can adsorb up to 5.5 g of volatile silicon compounds, thereby reducing the impurities in the outlet gas mixture to less than 30 ppb. (orig.)

  10. Quantitative reactive modeling and verification.

    Science.gov (United States)

    Henzinger, Thomas A

    Formal verification aims to improve the quality of software by detecting errors before they do harm. At the basis of formal verification is the logical notion of correctness , which purports to capture whether or not a program behaves as desired. We suggest that the boolean partition of software into correct and incorrect programs falls short of the practical need to assess the behavior of software in a more nuanced fashion against multiple criteria. We therefore propose to introduce quantitative fitness measures for programs, specifically for measuring the function, performance, and robustness of reactive programs such as concurrent processes. This article describes the goals of the ERC Advanced Investigator Project QUAREM. The project aims to build and evaluate a theory of quantitative fitness measures for reactive models. Such a theory must strive to obtain quantitative generalizations of the paradigms that have been success stories in qualitative reactive modeling, such as compositionality, property-preserving abstraction and abstraction refinement, model checking, and synthesis. The theory will be evaluated not only in the context of software and hardware engineering, but also in the context of systems biology. In particular, we will use the quantitative reactive models and fitness measures developed in this project for testing hypotheses about the mechanisms behind data from biological experiments.

  11. Verification and quality control of routine hematology analyzers.

    Science.gov (United States)

    Vis, J Y; Huisman, A

    2016-05-01

    Verification of hematology analyzers (automated blood cell counters) is mandatory before new hematology analyzers may be used in routine clinical care. The verification process consists of several items which comprise among others: precision, accuracy, comparability, carryover, background and linearity throughout the expected range of results. Yet, which standard should be met or which verification limit be used is at the discretion of the laboratory specialist. This paper offers practical guidance on verification and quality control of automated hematology analyzers and provides an expert opinion on the performance standard that should be met by the contemporary generation of hematology analyzers. Therefore (i) the state-of-the-art performance of hematology analyzers for complete blood count parameters is summarized, (ii) considerations, challenges, and pitfalls concerning the development of a verification plan are discussed, (iii) guidance is given regarding the establishment of reference intervals, and (iv) different methods on quality control of hematology analyzers are reviewed. © 2016 John Wiley & Sons Ltd.
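
    Two of the verification items listed above reduce to simple calculations on replicate measurements. The sketch below shows a common way to express within-run precision (coefficient of variation) and carryover; the three-high/three-low carryover protocol is a widely used convention and the numbers are illustrative, not limits proposed by the paper.

```python
import statistics

def precision_cv_percent(replicates):
    """Within-run precision as a coefficient of variation (%), from repeated
    measurements of the same sample."""
    return statistics.stdev(replicates) / statistics.mean(replicates) * 100.0

def carryover_percent(high3, low1, low3):
    """Carryover estimated from a high sample measured three times followed by a
    low sample measured three times: (low1 - low3) / (high3 - low3) * 100."""
    return (low1 - low3) / (high3 - low3) * 100.0

wbc_replicates = [6.1, 6.3, 6.2, 6.2, 6.4]            # 10^9/L, illustrative
print(round(precision_cv_percent(wbc_replicates), 2)) # ~1.8 %
print(round(carryover_percent(249.0, 4.1, 3.8), 2))   # ~0.12 %
```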

  12. ENVIRONMENTAL TECHNOLOGY VERIFICATION, TEST REPORT OF CONTROL OF BIOAEROSOLS IN HVAC SYSTEMS, COLUMBUS INDUSTRIES SL-3 RING PANEL

    Science.gov (United States)

    The Environmental Technology Verification report discusses the technology and performance of the High Efficiency Mini Pleat air filter for dust and bioaerosol filtration manufactured by Columbus Industries. The pressure drop across the filter was 142 Pa clean and 283 Pa dust load...

  13. Spent fuel verification options for final repository safeguards in Finland. A study on verification methods, their feasibility and safety aspects

    International Nuclear Information System (INIS)

    Hautamaeki, J.; Tiitta, A.

    2000-12-01

    The verification possibilities for the spent fuel assemblies from the Olkiluoto and Loviisa NPPs and the fuel rods from the research reactor of VTT are contemplated in this report. The spent fuel assemblies have to be verified at the partial defect level before the final disposal into the geologic repository. The rods from the research reactor may be verified at the gross defect level. Developing a measurement system for partial defect verification is a complicated and time-consuming task. The Passive High Energy Gamma Emission Tomography and the Fork Detector combined with Gamma Spectrometry are the most promising measurement principles to be developed for this purpose. The whole verification process has to be planned to be as streamlined as possible. An early start in the planning of the verification and in developing the measurement devices is important in order to enable a smooth integration of the verification measurements into the conditioning and disposal process. The IAEA and Euratom have not yet concluded the safeguards criteria for the final disposal; e.g., criteria connected to the selection of the best place to perform the verification measurements have not yet been concluded. Options for the verification places have been considered in this report. One option for a verification measurement place is the intermediate storage. The other option is the encapsulation plant. Crucial viewpoints are which one offers the best practical possibilities to perform the measurements effectively and which would be the better place from the safeguards point of view. Verification measurements may be needed both in the intermediate storages and in the encapsulation plant. In this report the integrity of the fuel assemblies after the wet intermediate storage period is also assessed, because the assemblies have to withstand the handling operations of the verification measurements. (orig.)

  14. Miniaturized dielectric waveguide filters

    OpenAIRE

    Sandhu, MY; Hunter, IC

    2016-01-01

    Design techniques for a new class of integrated monolithic high-permittivity ceramic waveguide filters are presented. These filters enable a size reduction of 50% compared to air-filled transverse electromagnetic filters with the same unloaded Q-factor. Designs for Chebyshev and asymmetric generalised Chebyshev filter and a diplexer are presented with experimental results for an 1800 MHz Chebyshev filter and a 1700 MHz generalised Chebyshev filter showing excellent agreement with theory.

  15. New filter media development for effective control of trimethylsilanol (TMS) and related low molecular weight silicon-containing organic species in the photobay ambient

    Science.gov (United States)

    Grayfer, Anatoly; Belanger, Frank V.; Cate, Phillip; Ruede, David

    2007-03-01

    The authors present results of extensive studies on the chemical behavior of low molecular weight silicon-containing species (LMWS) and associated challenges of their analytical determination and control to prevent adverse influence on critical optical elements of exposure tools. In their paper the authors describe a non-traditional approach to the creation of a TMS gaseous source for filter media development and an engineering solution to the challenge of controlling LMWS - a solution that shows a significant advantage over currently existing approaches.

  16. Quantum money with classical verification

    Energy Technology Data Exchange (ETDEWEB)

    Gavinsky, Dmitry [NEC Laboratories America, Princeton, NJ (United States)

    2014-12-04

    We propose and construct a quantum money scheme that allows verification through classical communication with a bank. This is the first demonstration that a secure quantum money scheme exists that does not require quantum communication for coin verification. Our scheme is secure against adaptive adversaries - this property is not directly related to the possibility of classical verification, nevertheless none of the earlier quantum money constructions is known to possess it.

  17. Quantum money with classical verification

    International Nuclear Information System (INIS)

    Gavinsky, Dmitry

    2014-01-01

    We propose and construct a quantum money scheme that allows verification through classical communication with a bank. This is the first demonstration that a secure quantum money scheme exists that does not require quantum communication for coin verification. Our scheme is secure against adaptive adversaries - this property is not directly related to the possibility of classical verification, nevertheless none of the earlier quantum money constructions is known to possess it

  18. Development of an Intrinsic Continuum Robot and Attitude Estimation of Its End-effector Based on a Kalman Filter

    International Nuclear Information System (INIS)

    Kang, Chang Hyun; Bae, Ji Hwan; Kang, Bong Soo

    2015-01-01

    This paper presents the design concept of an intrinsic continuum robot for a safe man-machine interface and the characteristic behaviors of its end-effector based on real experiments. Since pneumatic artificial muscles, whose antagonistic actuation is similar to that of human muscles, are used both as the main backbones of the proposed robot and as its actuating devices, variable stiffness of the robotic joints is available in the actual environment. In order to address the inherent shortcoming of an intrinsic continuum robot caused by the bending motion of the backbone materials, a Kalman filter scheme based on a triaxial accelerometer and a triaxial gyroscope was proposed to estimate the attitude of the end-effector of the robot. The experimental results verified that the proposed method was effective in estimating the attitude of the end-effector of the intrinsic continuum robot.
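
    The fusion idea in the paper, a gyroscope integrated for short-term attitude and corrected by the gravity direction from an accelerometer through a Kalman filter, can be reduced to a one-axis sketch. The state, noise values and sample period below are illustrative assumptions, not the parameters of the robot controller.

```python
import numpy as np

dt = 0.01                        # sample period (s), assumed
F = np.array([[1.0, -dt],        # angle_k = angle_{k-1} + dt * (gyro_rate - bias)
              [0.0, 1.0]])
B = np.array([[dt], [0.0]])      # the gyro rate enters as a control input
H = np.array([[1.0, 0.0]])       # the accelerometer observes the angle only
Q = np.diag([1e-4, 1e-6])        # process noise (tuning guesses)
R = np.array([[2e-2]])           # accelerometer measurement noise (tuning guess)

def kalman_step(x, P, gyro_rate, accel_angle):
    # Predict using the gyroscope.
    x = F @ x + B * gyro_rate
    P = F @ P @ F.T + Q
    # Correct using the accelerometer-derived angle.
    y = np.array([[accel_angle]]) - H @ x
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    return x + K @ y, (np.eye(2) - K @ H) @ P

x, P = np.zeros((2, 1)), np.eye(2)          # state = [angle; gyro bias]
x, P = kalman_step(x, P, gyro_rate=0.10, accel_angle=0.002)
print(x.ravel())                             # fused angle estimate and estimated bias
```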

  19. Development of a compact in situ polarized ³He neutron spin filter at Oak Ridge National Laboratory.

    Science.gov (United States)

    Jiang, C Y; Tong, X; Brown, D R; Chi, S; Christianson, A D; Kadron, B J; Robertson, J L; Winn, B L

    2014-07-01

    We constructed a compact in situ polarized (3)He neutron spin filter based on spin-exchange optical pumping which is capable of continuous pumping of the (3)He gas while the system is in place in the neutron beam on an instrument. The compact size and light weight of the system simplifies its utilization on various neutron instruments. The system has been successfully tested as a neutron polarizer on the triple-axis spectrometer (HB3) and the hybrid spectrometer (HYSPEC) at Oak Ridge National Laboratory. Over 70% (3)He polarization was achieved and maintained during the test experiments. Over 90% neutron polarization and an average of 25% transmission for neutrons of 14.7 meV and 15 meV was also obtained.
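
    The quoted polarization and transmission figures follow the standard 3He spin-filter relations, reproduced below as a quick cross-check; the opacity and empty-cell transmission used in the example are assumed values, not parameters reported for the ORNL cell.

```python
import math

def spin_filter(opacity, he3_polarization, empty_cell_transmission=0.88):
    """Standard 3He neutron spin-filter relations:
       neutron polarization  P_n = tanh(O * P_He)
       transmission          T   = T_empty * exp(-O) * cosh(O * P_He)
    with O = n * sigma(lambda) * l the cell opacity at the neutron wavelength."""
    p_n = math.tanh(opacity * he3_polarization)
    t = empty_cell_transmission * math.exp(-opacity) * math.cosh(opacity * he3_polarization)
    return p_n, t

# With ~70% 3He polarization, an opacity around 2.3 gives roughly 92% neutron
# polarization at ~23% transmission -- the same order as the figures quoted above.
print(spin_filter(2.3, 0.70))
```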

  20. Development of a compact in situ polarized 3He neutron spin filter at Oak Ridge National Laboratory

    International Nuclear Information System (INIS)

    Jiang, C. Y.; Tong, X.; Brown, D. R.; Kadron, B. J.; Robertson, J. L.; Chi, S.; Christianson, A. D.; Winn, B. L.

    2014-01-01

    We constructed a compact in situ polarized 3He neutron spin filter based on spin-exchange optical pumping which is capable of continuous pumping of the 3He gas while the system is in place in the neutron beam on an instrument. The compact size and light weight of the system simplifies its utilization on various neutron instruments. The system has been successfully tested as a neutron polarizer on the triple-axis spectrometer (HB3) and the hybrid spectrometer (HYSPEC) at Oak Ridge National Laboratory. Over 70% 3He polarization was achieved and maintained during the test experiments. Over 90% neutron polarization and an average of 25% transmission for neutrons of 14.7 meV and 15 meV was also obtained.