WorldWideScience

Sample records for filters development verification

  1. Biometric verification with correlation filters

    Science.gov (United States)

    Vijaya Kumar, B. V. K.; Savvides, Marios; Xie, Chunyan; Venkataramani, Krithika; Thornton, Jason; Mahalanobis, Abhijit

    2004-01-01

    Using biometrics for subject verification can significantly improve security over that of approaches based on passwords and personal identification numbers, both of which people tend to lose or forget. In biometric verification the system tries to match an input biometric (such as a fingerprint, face image, or iris image) to a stored biometric template. Correlation filter techniques are attractive candidates for the precise matching needed in biometric verification. In particular, advanced correlation filters, such as synthetic discriminant function filters, can offer very good matching performance in the presence of variability in these biometric images (e.g., facial expressions, illumination changes, etc.). We investigate the performance of advanced correlation filters for face, fingerprint, and iris biometric verification.
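
    The record above describes correlation-filter matching in general terms; the following minimal Python sketch (our illustration, not the authors' code) shows the usual decision pipeline: correlate a probe image against a stored filter in the frequency domain and score the match with a peak-to-sidelobe ratio (PSR). The stored filter spectrum, the sidelobe radii, and the acceptance threshold are illustrative assumptions.

      import numpy as np

      def correlation_plane(filter_spectrum, image):
          # Correlate in the frequency domain: IFFT( FFT(image) * conj(H) )
          plane = np.fft.ifft2(np.fft.fft2(image) * np.conj(filter_spectrum)).real
          return np.fft.fftshift(plane)

      def psr(plane, exclude_radius=5, sidelobe_radius=20):
          # A sharp, high peak relative to the surrounding sidelobe region => authentic match
          py, px = np.unravel_index(np.argmax(plane), plane.shape)
          yy, xx = np.ogrid[:plane.shape[0], :plane.shape[1]]
          r = np.hypot(yy - py, xx - px)
          side = plane[(r > exclude_radius) & (r < sidelobe_radius)]
          return (plane[py, px] - side.mean()) / side.std()

      # accept = psr(correlation_plane(H, probe)) > threshold   # threshold tuned per modality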

  2. Palm Vein Verification Using Gabor Filter

    Directory of Open Access Journals (Sweden)

    Ali Mohsin Al-Juboori

    2013-01-01

    Full Text Available Palm vein authentication is one of the modern biometric techniques, which employs the vein pattern in the human palm to verify the person. The merits of palm vein over classical biometrics (e.g., fingerprint, iris, face) are a low risk of falsification, difficulty of duplication, and stability. In this research, a new method is proposed for personal verification based on palm vein features. In the proposed method, the palm vein images are first enhanced and then the features are extracted using a bank of Gabor filters. Fisher Discriminant Analysis (FDA) is then used to reduce the dimension of the feature vectors. For vein pattern verification, this work uses the Nearest Neighbor method. The EER of the proposed method is 0.2335%.
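
    A minimal sketch of the feature-extraction stage this abstract describes, with illustrative Gabor parameters (21-pixel kernels, eight orientations) that are our assumptions; the FDA reduction and nearest-neighbour matching mentioned above would then run on these vectors, e.g. with scikit-learn's LinearDiscriminantAnalysis and KNeighborsClassifier.

      import numpy as np
      from scipy.signal import fftconvolve

      def gabor_kernel(size, sigma, theta, wavelength):
          half = size // 2
          y, x = np.mgrid[-half:half + 1, -half:half + 1]
          xr = x * np.cos(theta) + y * np.sin(theta)
          # Isotropic Gaussian envelope modulated by an oriented cosine carrier
          return np.exp(-(x**2 + y**2) / (2 * sigma**2)) * np.cos(2 * np.pi * xr / wavelength)

      def gabor_features(img, n_orient=8):
          feats = []
          for k in range(n_orient):
              resp = fftconvolve(img, gabor_kernel(21, 4.0, k * np.pi / n_orient, 10.0), mode="same")
              feats += [np.mean(np.abs(resp)), np.std(resp)]   # simple per-orientation statistics
          return np.asarray(feats)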

  3. Fingerprint Verification based on Gabor Filter Enhancement

    CERN Document Server

    Lavanya, B N; Venugopal, K R

    2009-01-01

    Human fingerprints are reliable characteristics for personal identification, as they are unique and persistent. A fingerprint pattern consists of ridges, valleys and minutiae. In this paper we propose the Fingerprint Verification based on Gabor Filter Enhancement (FVGFE) algorithm for minutiae feature extraction and post-processing based on a 9-pixel neighborhood. Global feature extraction and fingerprint enhancement are based on the Hong enhancement method, which simultaneously extracts local ridge orientation and ridge frequency. It is observed that the Sensitivity and Specificity values are better than those of existing algorithms.
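
    Hong-style enhancement, as cited here, begins by estimating the local ridge orientation from smoothed gradient products; the sketch below shows that standard least-squares step only (block size and smoothing are our assumptions, not the paper's settings).

      import numpy as np
      from scipy.ndimage import sobel, uniform_filter

      def ridge_orientation(img, block=16):
          gx = sobel(img.astype(float), axis=1)
          gy = sobel(img.astype(float), axis=0)
          # Least-squares orientation from block-averaged gradient products
          gxx = uniform_filter(gx * gx, block)
          gyy = uniform_filter(gy * gy, block)
          gxy = uniform_filter(gx * gy, block)
          return 0.5 * np.arctan2(2 * gxy, gxx - gyy) + np.pi / 2  # ridge direction per pixel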

  4. Development of DWDM Filter Manufacture

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    DWDM technology is developing rapidly, and the thin-film narrow bandpass filter plays an important role in this field. This article presents some achievements in developing DWDM narrow bandpass filters and describes the results we have achieved.

  5. Verification of sub-grid filtered drag models for gas-particle fluidized beds with immersed cylinder arrays

    Energy Technology Data Exchange (ETDEWEB)

    Sarkar, Avik [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Sun, Xin [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Sundaresan, Sankaran [Princeton Univ., NJ (United States)]

    2014-04-23

    The accuracy of coarse-grid multiphase CFD simulations of fluidized beds may be improved via the inclusion of filtered constitutive models. In our previous study (Sarkar et al., Chem. Eng. Sci., 104, 399-412), we developed such a set of filtered drag relationships for beds with immersed arrays of cooling tubes. Verification of these filtered drag models is addressed in this work. Predictions from coarse-grid simulations with the sub-grid filtered corrections are compared against accurate, highly-resolved simulations of full-scale turbulent and bubbling fluidized beds. The filtered drag models offer a computationally efficient yet accurate alternative for obtaining macroscopic predictions, but the spatial resolution of meso-scale clustering heterogeneities is sacrificed.

  6. Software Verification and Validation Test Report for the HEPA filter Differential Pressure Fan Interlock System

    Energy Technology Data Exchange (ETDEWEB)

    ERMI, A.M.

    2000-09-05

    The HEPA Filter Differential Pressure Fan Interlock System PLC ladder logic software was tested using a Software Verification and Validation (V&V) Test Plan as required by the "Computer Software Quality Assurance Requirements". The purpose of this document is to report the results of the software qualification.

  7. ADVANCED HOT GAS FILTER DEVELOPMENT

    Energy Technology Data Exchange (ETDEWEB)

    E.S. Connolly; G.D. Forsythe

    2000-09-30

    DuPont Lanxide Composites, Inc. undertook a sixty-month program, under DOE Contract DEAC21-94MC31214, to develop hot gas candle filters from a patented material technology known as PRD-66. The goal of this program was to extend the development of this material as a filter element and fully assess the capability of this technology to meet the needs of Pressurized Fluidized Bed Combustion (PFBC) and Integrated Gasification Combined Cycle (IGCC) power generation systems at commercial scale. The principal objective of Task 3 was to build on the initial PRD-66 filter development, optimize its structure, and evaluate basic material properties relevant to the hot gas filter application. Initially, this consisted of an evaluation of an advanced filament-wound core structure that had been designed to produce an effective bulk filter underneath the barrier filter formed by the outer membrane. The basic material properties to be evaluated (as established by the DOE/METC materials working group) would include mechanical, thermal, and fracture toughness parameters for both new and used material, for the purpose of building a material database consistent with what is being done for the alternative candle filter systems. Task 3 was later expanded to include analysis of PRD-66 candle filters that had been exposed to actual PFBC conditions, development of an improved membrane, and installation of equipment necessary for the processing of a modified composition. Task 4 would address essential technical issues involving the scale-up of PRD-66 candle filter manufacturing from prototype production to commercial-scale manufacturing. The focus would be on capacity (as it affects the ability to deliver commercial order quantities), process specification (as it affects yields, quality, and costs), and manufacturing systems (e.g., QA/QC, materials handling, parts flow, and cost data acquisition). Any filters fabricated during this task would be used for product qualification tests

  8. VARTM Model Development and Verification

    Science.gov (United States)

    Cano, Roberto J. (Technical Monitor); Dowling, Norman E.

    2004-01-01

    In this investigation, a comprehensive Vacuum Assisted Resin Transfer Molding (VARTM) process simulation model was developed and verified. The model incorporates resin flow through the preform, compaction and relaxation of the preform, and viscosity and cure kinetics of the resin. The computer model can be used to analyze the resin flow details, track the thickness change of the preform, predict the total infiltration time and final fiber volume fraction of the parts, and determine whether the resin could completely infiltrate and uniformly wet out the preform.
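
    The resin-flow component of such a model reduces, in the simplest one-dimensional constant-pressure case, to Darcy's law; the sketch below gives only that textbook fill-time estimate and is not the paper's full VARTM model (which also couples preform compaction and cure kinetics). All parameter values are illustrative.

      def vartm_fill_time(porosity, viscosity_pa_s, flow_length_m, permeability_m2, delta_p_pa):
          # 1D Darcy infiltration under constant injection pressure:
          # t_fill = phi * mu * L^2 / (2 * K * dP)
          return porosity * viscosity_pa_s * flow_length_m**2 / (2 * permeability_m2 * delta_p_pa)

      # e.g. vartm_fill_time(0.5, 0.2, 0.6, 1e-10, 1.0e5) -> 1800 s (illustrative values)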

  9. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT - BAGHOUSE FILTRATION PRODUCTS - W.L. GORE & ASSOCIATES, INC. L4347 FILTER SAMPLE

    Science.gov (United States)

    Baghouse filtration products (BFPs) were evaluated by the Air Pollution Control Technology (APCT) pilot of the Environmental Technology Verification (ETV) Program. The performance factor verified was the mean outlet particle concentration for the filter fabric as a function of th...

  10. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT - BAGHOUSE FILTRATION PRODUCTS - BASF CORPORATION AX/BA-14/9-SAXP FILTER SAMPLE

    Science.gov (United States)

    Baghouse filtration products (BFPs) were evaluated by the Air Pollution Control Technology (APCT) pilot of the Environmental Technology Verification (ETV) Program. The performance factor verified was the mean outlet particle concentration for the filter fabric as a function of th...

  11. ADVANCED HOT GAS FILTER DEVELOPMENT

    Energy Technology Data Exchange (ETDEWEB)

    Matthew R. June; John L. Hurley; Mark W. Johnson

    1999-04-01

    Iron aluminide hot gas filters have been developed using powder metallurgy techniques to form seamless cylinders. Three alloys were short-term corrosion tested in simulated IGCC atmospheres at temperatures between 925 F and 1200 F with hydrogen sulfide concentrations ranging from 783 ppmv to 78,300 ppmv. Long-term testing was conducted for 1500 hours at 925 F with 78,300 ppmv. The FAS and FAL alloys were found to be corrosion resistant in the simulated environments. The FAS alloy has been commercialized.

  12. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT: BAGHOUSE FILTRATION PRODUCTS, BWF AMERICA, INC., GRADE 700 MPS POLYESTER FELT FILTER SAMPLE

    Science.gov (United States)

    EPA's National Risk Management Research Laboratory, through its Environmental Technology Verification Program, evaluated the performance of a baghouse filtration product for use in controlling PM2.5. The product was BWF America, Inc., filter fabric Grade 700 Polyester Felt. All tes...

  13. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT: BAGHOUSE FILTRATION PRODUCTS, W.L. GORE & ASSOCIATES, INC. LYSB3 FILTER SAMPLE

    Science.gov (United States)

    Baghouse filtration products (BFPs) were evaluated by the Air Pollution Control Technology (APCT) Verification Center. The performance factor verified was the mean outlet particle concentration for the filter fabric as a function of the size for particles equal to or smaller than...

  14. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT, BAGHOUSE FILTRATION PRODUCTS, W.L. GORE & ASSOCIATES, INC., L4427 FILTER SAMPLE

    Science.gov (United States)

    Baghouse filtration products (BFPs) were evaluated by the Air Pollution Control Technology (APCT) Verification Center. The performance factor verified was the mean outlet particle concentration for the filter fabric as a function of the size of those particles equal to and smalle...

  15. Solar Array Verification Analysis Tool (SAVANT) Developed

    Science.gov (United States)

    Bailey, Sheila G.; Long, KIenwyn J.; Curtis, Henry B.; Gardner, Barbara; Davis, Victoria; Messenger, Scott; Walters, Robert

    1999-01-01

    Modeling solar cell performance for a specific radiation environment to obtain the end-of-life photovoltaic array performance has become both increasingly important and, with the rapid advent of new types of cell technology, more difficult. For large constellations of satellites, a few percent difference in the lifetime prediction can have an enormous economic impact. The tool described here automates the assessment of solar array on-orbit end-of-life performance and assists in the development and design of ground test protocols for different solar cell designs. Once established, these protocols can be used to calculate on-orbit end-of-life performance from ground test results. The Solar Array Verification Analysis Tool (SAVANT) utilizes the radiation environment from the Environment WorkBench (EWB) model developed by the NASA Lewis Research Center's Photovoltaic and Space Environmental Effects Branch in conjunction with Maxwell Technologies. It then modifies and combines this information with the displacement damage model proposed by Summers et al. (ref. 1) of the Naval Research Laboratory to determine solar cell performance during the course of a given mission. The resulting predictions can then be compared with flight data. The Environment WorkBench (ref. 2) uses the NASA AE8 (electron) and AP8 (proton) models of the radiation belts to calculate the trapped radiation flux. These fluxes are integrated over the defined spacecraft orbit for the duration of the mission to obtain the total omnidirectional fluence spectra. Components such as the solar cell coverglass, adhesive, and antireflective coatings can slow and attenuate the particle fluence reaching the solar cell. In SAVANT, a continuous slowing down approximation is used to model this effect.
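
    A hedged sketch of the degradation bookkeeping described above: the fluence spectrum is collapsed to a displacement damage dose via NIEL weighting, and cell output is read off a semi-empirical curve in the form attributed to Summers et al.; the fit constants C and Dx are technology-specific inputs, and none of this is SAVANT source code.

      import numpy as np

      def displacement_damage_dose(energy_mev, fluence_spectrum, niel):
          # Dd = integral of NIEL(E) * dPhi/dE over energy
          return np.trapz(niel * fluence_spectrum, energy_mev)

      def remaining_factor(dd, c_fit, dx_fit):
          # Semi-empirical degradation curve: P/P0 = 1 - C * log10(1 + Dd / Dx)
          return 1.0 - c_fit * np.log10(1.0 + dd / dx_fit)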

  16. MACCS2 development and verification efforts

    Energy Technology Data Exchange (ETDEWEB)

    Young, M.; Chanin, D.

    1997-03-01

    MACCS2 represents a major enhancement of the capabilities of its predecessor MACCS, the MELCOR Accident Consequence Code System. MACCS, released in 1987, was developed to estimate the potential impacts to the surrounding public of severe accidents at nuclear power plants. The principal phenomena considered in MACCS/MACCS2 are atmospheric transport and deposition under time-variant meteorology, short-term and long-term mitigative actions and exposure pathways, deterministic and stochastic health effects, and economic costs. MACCS2 was developed as a general-purpose analytical tool applicable to diverse reactor and nonreactor facilities. The MACCS2 package includes three primary enhancements: (1) a more flexible emergency response model, (2) an expanded library of radionuclides, and (3) a semidynamic food-chain model. In addition, errors that had been identified in MACCS version 1.5.11.1 were corrected, including an error that prevented the code from providing intermediate-phase results. MACCS2 version 1.10 beta test was released to the beta-test group in May 1995. In addition, the University of New Mexico (UNM) has completed an independent verification study of the code package. Since the beta-test release of MACCS2 version 1.10, a number of minor errors have been identified and corrected, and a number of enhancements have been added to the code package. The code enhancements added since the beta-test release of version 1.10 include: (1) an option to allow the user to input the σy and σz plume expansion parameters in a table-lookup form for incremental downwind distances, (2) an option to define different initial dimensions for up to four segments of a release, (3) an enhancement to the COMIDA2 food-chain model preprocessor to allow the user to supply externally calculated tables of tritium food-chain dose per unit deposition on farmland to support analyses of tritium releases, and (4) the capability to calculate direction-dependent doses.
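
    Enhancement (1) above amounts to replacing analytic plume-width formulas with user-supplied tables. A minimal sketch of how such a table lookup feeds the standard ground-level Gaussian plume expression (a textbook form used here for illustration, not MACCS2 source code):

      import numpy as np

      def ground_level_concentration(q, u, x, y, x_tab, sigma_y_tab, sigma_z_tab):
          # sigma_y / sigma_z interpolated from incremental-downwind-distance tables
          sy = np.interp(x, x_tab, sigma_y_tab)
          sz = np.interp(x, x_tab, sigma_z_tab)
          # Ground-level concentration for a ground release, crosswind offset y
          return (q / (np.pi * u * sy * sz)) * np.exp(-0.5 * (y / sy) ** 2)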

  17. Summary of Martian Dust Filtering Challenges and Current Filter Development

    Science.gov (United States)

    O'Hara, William J., IV

    2017-01-01

    Traditional air particulate filtering in manned spaceflight (Apollo, Shuttle, ISS, etc.) has used cleanable or replaceable catch filters such as screens and High-Efficiency Particulate Arrestance (HEPA) filters. However, the human mission to Mars architecture will require a new approach. It is Martian dust that is the particulate of concern, but the need also applies to particulates generated by crew. The Mars Exploration Program Analysis Group (MEPAG) highlighted this concern in its Mars Science Goals, Objectives, Investigations and Priorities document [7], saying specifically that one high-priority investigation will be to "Test ISRU atmospheric processing systems to measure resilience with respect to dust and other environmental challenge performance parameters that are critical to the design of a full-scale system." By stating this as high priority, the MEPAG acknowledges that developing and adequately verifying this capability is critical to the success of a human mission to Mars. This architecture will require filtering capabilities that are highly reliable, will not restrict the flow path with clogging, and require little to no maintenance. This paper summarizes why this is the case, the general requirements for developing the technology, and the status of the progress made in this area.

  18. Optimal design and verification of temporal and spatial filters using second-order cone programming approach

    Institute of Scientific and Technical Information of China (English)

    2006-01-01

    Temporal filters and spatial filters are widely used in many areas of signal processing. A number of optimal design criteria for these problems are available in the literature, and various computational techniques have been presented to optimize the chosen criteria, but these methods have many drawbacks. In this paper, we introduce a unified framework for the optimal design of temporal and spatial filters. Most optimal design problems for FIR filters and beamformers are included in the framework. It is shown that all of these design problems can be reformulated in convex form as second-order cone programs (SOCP) and solved efficiently via well-established interior point methods. The main advantage of our SOCP approach over earlier approaches is that it includes most of the existing methods as special cases, which leads to more flexible designs. Furthermore, the SOCP approach can optimize multiple required performance measures simultaneously, which earlier approaches could not. The SOCP approach is also developed to optimally design temporal and spatial two-dimensional filters and spatial matrix filters. Numerical results demonstrate the effectiveness of the proposed approach.
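
    As a concrete instance of this framework, a linear-phase FIR lowpass design can be posed as a convex minimax problem and handed to a conic (SOCP-capable) solver. The sketch below uses the cvxpy library as an assumed tool (not the authors' software); the band edges, ripple bound and grid density are illustrative.

      import numpy as np
      import cvxpy as cp

      M = 15                                        # filter length 2*M + 1, even symmetry
      w = np.linspace(0, np.pi, 400)
      C = np.cos(np.outer(w, np.arange(M + 1)))
      C[:, 1:] *= 2.0                               # amplitude A(w) = h0 + 2*sum_k hk*cos(k*w)
      Cp, Cs = C[w <= 0.30 * np.pi], C[w >= 0.45 * np.pi]

      h, t = cp.Variable(M + 1), cp.Variable()
      prob = cp.Problem(cp.Minimize(t),
                        [cp.abs(Cp @ h - 1.0) <= 0.02,   # passband ripple bound
                         cp.abs(Cs @ h) <= t])           # minimize peak stopband level
      prob.solve()
      print("stopband peak:", t.value)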

  19. ENVIRONMENTAL TECHNOLOGY VERIFICATION, TEST REPORT OF CONTROL OF BIOAEROSOLS IN HVAC SYSTEMS: TRI-DIM FILTER CORP. PREDATOR II MODEL 8VADTP123C23

    Science.gov (United States)

    The Environmental Technology Verification report discusses the technology and performance of the Predator II, Model 8VADTP123C23CC000 air filter for dust and bioaerosol filtration manufactured by Tri-Dim Filter Corporation. The pressure drop across the filter was 138 Pa clean and...

  1. ENVIRONMENTAL TECHNOLOGY VERIFICATION, TEST REPORT OF CONTROL OF BIOAEROSOLS IN HVAC SYSTEMS, FILTRATION GROUP, AEROSTAR FP-98 MINIPLEAT V-BANK FILTER

    Science.gov (United States)

    The Environmental Technology Verification report discusses the technology and performance of the AeroStar FP-98 Minipleat V-Bank Filter air filter for dust and bioaerosol filtration manufactured by Filtration Group. The pressure drop across the filter was 137 Pa clean and 348 Pa ...

  2. ENVIRONMENTAL TECHNOLOGY VERIFICATION, TEST REPORT OF CONTROL OF BIOAEROSOLS IN HVAC SYSTEMS, FILTRATION GROUP, AEROSTAR "C-SERIES" POLYESTER PANEL FILTER

    Science.gov (United States)

    The Environmental Technology Verification report discusses the technology and performance of the AeroStar "C-Series" Polyester Panel Filter air filter for dust and bioaerosol filtration manufactured by Filtration Group. The pressure drop across the filter was 126 Pa clean and 267...

  3. ADVANCED HOT GAS FILTER DEVELOPMENT

    Energy Technology Data Exchange (ETDEWEB)

    E.S. Connolly; G.D. Forsythe

    1998-12-22

    Advanced, coal-based power plants will require durable and reliable hot gas filtration systems to remove particulate contaminants from the gas streams to protect downstream components such as turbine blades from erosion damage. It is expected that the filter elements in these systems will have to be made of ceramic materials to withstand goal service temperatures of 1600 F or higher. Recent demonstration projects and pilot plant tests have indicated that the current generation of ceramic hot gas filters (cross-flow and candle configurations) are failing prematurely. Two of the most promising materials that have been extensively evaluated are clay-bonded silicon carbide and alumina-mullite porous monoliths. These candidates, however, have been found to suffer progressive thermal shock fatigue damage, as a result of rapid cooling/heating cycles. Such temperature changes occur when the hot filters are back-pulsed with cooler gas to clean them, or in process upset conditions, where even larger gas temperature changes may occur quickly and unpredictably. In addition, the clay-bonded silicon carbide materials are susceptible to chemical attack of the glassy binder phase that holds the SiC particles together, resulting in softening, strength loss, creep, and eventual failure.

  4. Environmental Technology Verification: Baghouse Filtration Products--TDC Filter Manufacturing, Inc., SB025 Filtration Media

    Science.gov (United States)

    The U.S. Environmental Protection Agency (EPA) has created the Environmental Technology Verification (ETV) Program to facilitate the deployment of innovative or improved environmental technologies through performance verification and dissemination of information. ETV seeks to ach...

  5. Design and experimental verification of a dual-band metamaterial filter

    Science.gov (United States)

    Zhu, Hong-Yang; Yao, Ai-Qin; Zhong, Min

    2016-10-01

    In this paper, we present the design, simulation, and experimental verification of a dual-band free-standing metamaterial filter operating in a frequency range of 1 THz-30 THz. The proposed structure consists of periodically arranged composite air holes, and exhibits two broad and flat transmission bands. To clarify the effects of the structural parameters on both resonant transmission bands, three sets of experiments are performed. The first resonant transmission band shows a shift towards higher frequency when the side width w1 of the main air hole is increased. In contrast, the second resonant transmission band displays a shift towards lower frequency when the side width w2 of the sub-holes is increased, while the first resonant transmission band is unchanged. The measured results indicate that these resonant bands can be modulated individually by simply optimizing the relevant structural parameters (w1 or w2) for the required band. In addition, these resonant bands merge into a single resonant band with a bandwidth of 7.7 THz when w1 and w2 are optimized simultaneously. The structure proposed in this paper adopts different resonant mechanisms for transmission at different frequencies and thus offers a method to achieve a dual-band and low-loss filter. Project supported by the Doctorate Scientific Research Foundation of Hezhou University, China (Grant No. HZUBS201503), the Promotion of the Basic Ability of Young and Middle-aged Teachers in Universities Project of Guangxi Zhuang Autonomous Region, China (Grant No. KY2016YB453), the Guangxi Colleges and Universities Key Laboratory Symbolic Computation, China, Engineering Data Processing and Mathematical Support Autonomous Discipline Project of Hezhou University, China (Grant No. 2016HZXYSX01).

  6. The development of the spatially correlated adjustment wavelet filter for atomic force microscopy data.

    Science.gov (United States)

    Sikora, Andrzej; Rodak, Aleksander; Unold, Olgierd; Klapetek, Petr

    2016-12-01

    In this paper a novel approach for the practical utilization of the 2D wavelet filter for removing artifacts from atomic force microscopy measurement results is presented. Additional data, such as the summary photodiode signal map, are used to identify the areas requiring data processing, to optimize the filtering settings, and to verify the process performance. This approach allows the filtering parameters to be adjusted by an average user, whereas the straightforward method requires expertise in this field. The procedure was developed as a function of the Gwyddion software. Examples of filtering phase imaging and Electrostatic Force Microscopy measurement results are presented. As the wavelet filtering feature can remove local artifacts, its efficiency is superior to a similar approach based on a 2D Fast Fourier Transform (2D FFT) filter.
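
    A hedged sketch of the core idea, assuming the pywt library: soft-threshold the 2D wavelet detail coefficients, then keep the filtered result only where an auxiliary mask (standing in for the summary photodiode map) flags artifacts. The wavelet choice, decomposition level and threshold rule below are our assumptions, not the Gwyddion implementation.

      import numpy as np
      import pywt

      def masked_wavelet_filter(img, artifact_mask, wavelet="db4", level=3, k=3.0):
          coeffs = pywt.wavedec2(img, wavelet, level=level)
          out = [coeffs[0]]                       # keep the approximation band
          for details in coeffs[1:]:
              out.append(tuple(
                  # Robust noise estimate (median/0.6745) sets the soft threshold
                  pywt.threshold(c, k * np.median(np.abs(c)) / 0.6745, mode="soft")
                  for c in details))
          rec = pywt.waverec2(out, wavelet)[:img.shape[0], :img.shape[1]]
          return np.where(artifact_mask, rec, img)   # touch only the flagged regions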

  7. Development of a spatial filtering apparatus

    Science.gov (United States)

    Wilson, Nicolle

    This thesis contains a discussion of the theoretical background for Fourier spatial filtering and a description of the design and construction of a portable, in-class spatial filtering demonstration apparatus. The apparatus uses liquid crystal display (LCD) panels from two projectors as the object and filter masks. The blue LCD panel from the first projector serves as the object mask, and the red panel from the second projector serves as the filter mask. The panels were extracted from their projectors and mounted onto aluminum blocks which are held in place by optical component mounts. Images are written to the LCD panels via custom open-source software developed for this apparatus, which writes independent monochromatic images to the video signal. The software has two monochromatic image windows, basic image manipulation tools, and two video feed input display windows. Two complementary metal-oxide semiconductor (CMOS) sensors are positioned to record the reconstructed image of the object mask and the diffraction pattern created by the object mask. The object and filter masks can be digitally changed and the effects on the filtered image and diffraction pattern observed in real time. The entire apparatus is assembled onto a rolling cart which allows it to be easily taken into classrooms.
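
    The optics can be mimicked numerically, which is also how such an apparatus is usually explained: the object mask's Fourier transform lives in the filter plane, and masking it before the inverse transform yields the filtered image. A minimal numpy sketch (our illustration, not the thesis software):

      import numpy as np

      def fourier_filter(object_mask, filter_mask):
          spectrum = np.fft.fftshift(np.fft.fft2(object_mask))
          image = np.fft.ifft2(np.fft.ifftshift(spectrum * filter_mask))
          diffraction = np.abs(spectrum) ** 2        # what the diffraction-plane camera sees
          reconstruction = np.abs(image) ** 2        # what the image-plane camera sees
          return reconstruction, diffraction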

  8. Development of Palmprint Verification System Using Biometrics

    Institute of Scientific and Technical Information of China (English)

    G. Shobha; M. Krishna; S.C. Sharma

    2006-01-01

    Palmprint verification using biometrics is one of the emerging technologies, recognizing a person based on the principal lines, wrinkles and ridges on the surface of the palm. These line structures are stable and remain unchanged throughout the life of an individual. More importantly, no two palmprints from different individuals are the same, and normally people do not feel uneasy about having their palmprint images taken for testing. Therefore palmprint recognition offers a promising future for medium-security access control systems. In this paper, a new approach for personal authentication using hand images is discussed. Gray-scale palm images are captured using a digital camera at a resolution of 640×480. Each of these gray-scale images is aligned and then used to extract palmprint and hand geometry features. These features are then used for authenticating users. The image acquisition setup used here is inherently simple: it employs neither special illumination nor pegs that might cause inconvenience to users. Experimental results show that the designed system achieves an acceptable level of performance.

  9. Further developed filter systems for keeping the air clean

    Energy Technology Data Exchange (ETDEWEB)

    Hochstrat, W.B.

    1978-12-01

    The pulse-jet filter system is presented in this contribution. The development of improved fibers for producing high-quality needle felts was the basis for both surface filters and tube filters. The tube filter system with pressurized-air cleaning is described in greater detail. Besides good separation efficiency, its principal advantage is the low maintenance required.

  10. Efficient Development and Verification of Safe Railway Control Software

    DEFF Research Database (Denmark)

    Haxthausen, Anne Elisabeth; Peleska, Jan

    2013-01-01

    In this book, the authors present current research on the types, design and safety issues of railways. Topics discussed include the acoustic characteristics of noise in train stations; monitoring railway structure conditions and opportunities to use wireless sensor networks as tools to improve the monitoring process; hydraulic absorbers as dampers to dissipate the energy of oscillations in railway electric equipment; development of train fare calculation and adjustment systems using VDM++; efficient development and verification of safe railway control software; and evolution of the connectivity of the Portuguese broad gauge railway network (1948-2012).

  11. Optimal design and performance verification of a broadband waveguide filter using ANN-GA algorithm

    Directory of Open Access Journals (Sweden)

    Manidipa Nath

    2013-09-01

    Full Text Available In this work, design and optimization of an EBG structure having multiple dielectric posts uniformly placed inside a rectangular waveguide is carried out to extract filter responses. The frequency response of the BPF configuration, using a trained ANN model of the multipost rectangular waveguide, is studied and optimized using GA. The geometrical and positional dimensions of the post parameters are varied in accordance with the required reflectance and transmittance of the filter.

  12. Tunable n-path notch filters for blocker suppression: modeling and verification

    NARCIS (Netherlands)

    Ghaffari, A.; Klumperink, Eric A.M.; Nauta, Bram

    2013-01-01

    N-path switched-RC circuits can realize filters with very high linearity and compression point while they are tunable by a clock frequency. In this paper, both differential and single-ended N-path notch filters are modeled and analyzed. Closed-form equations provide design equations for the main
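
    An idealized behavioral model conveys the principle (this is our sketch with ideal switches and zero source impedance, not the paper's circuit analysis): N capacitors are commutated in series with the signal at the clock frequency, each charging toward its phase slice of the component at f_lo, so that component is cancelled at the output.

      import numpy as np

      def npath_notch(x, fs, f_lo, n_paths=4, r_load=50.0, cap=20e-9):
          vc = np.zeros(n_paths)                           # per-path capacitor voltages
          y = np.empty_like(x)
          for n, v_in in enumerate(x):
              k = int(n_paths * ((n * f_lo / fs) % 1.0))   # active path this sample
              y[n] = v_in - vc[k]                          # series cap subtracts its stored value
              vc[k] += y[n] / r_load / cap / fs            # charge from the load current
          return y

      # Sweeping a tone across f_lo shows the clock-tunable notch; component values are illustrative.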

  13. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT - PHYSICAL REMOVAL OF MICROBIOLOGICAL & PARTICULATE CONTAMINANTS IN DRINKING WATER: US FILTER 3M10C MICROFILTRATION MEMBRANE SYSTEM AT CHULA VISTA, CALIFORNIA

    Science.gov (United States)

    Verification testing of the US Filter 3M10C membrane system was conducted over a 44-day test period at the Aqua 2000 Research Center in Chula Vista, California. The test period extended from July 24, 2002 to September 5, 2002. The source water was a blend of Colorado River and ...

  14. Numerical simulation on trapping efficiency of steady filtration process in diesel particulate filter and its experimental verification

    Institute of Scientific and Technical Information of China (English)

    张桂菊; 鄂加强; 左青松; 龚金科; 左威; 袁文华

    2015-01-01

    Taking the wall-flow diesel particulate filter (DPF) as the research object and separately assuming its filtering wall to be composed of numerous spherical or cylindrical elements, two different mathematical models of steady filtration for the wall-flow diesel particulate filter were developed, verified by experiments, and numerically solved. Furthermore, the effects of the macro- and micro-structural parameters of the filtering wall and the exhaust-flow characteristic parameters on trapping efficiency were analyzed. The results show that: 1) the two developed models are consistent in predicting the variation with particulate size, and the influence of the various factors on steady trapping efficiency is exactly the same; compared to model 2, model 1 is more suitable for describing the steady filtration process of the wall-flow diesel particulate filter; 2) the major influencing factors on the steady trapping efficiency are the macro- and micro-structural parameters of the filtering wall, and the secondary influencing factors are the exhaust-flow characteristic parameters and the macro-structural parameters of the filter; 3) the steady trapping efficiency is improved by increasing filter body volume, pore density and wall thickness and by decreasing exhaust flow, but the effects weaken when the particulate size exceeds a certain critical value; 4) the steady trapping efficiency is significantly improved by increasing exhaust-flow temperature and filtering wall thickness, but these effects also weaken when the particulate size exceeds a certain critical value; 5) the steady trapping efficiency increases approximately linearly with decreasing porosity, micropore aperture and pore width.
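
    A hedged sketch of the spherical-element ("unit collector") structure of such models: a single-sphere collection efficiency is scaled through the wall thickness to an overall trapping efficiency. The Peclet-number diffusion term and its prefactor below are textbook-style assumptions for illustration, not the paper's coefficients.

      import numpy as np

      def brownian_collector_efficiency(peclet):
          # Diffusion-dominated single-sphere capture; ~Pe^(-2/3) scaling (assumed prefactor)
          return 3.5 * peclet ** (-2.0 / 3.0)

      def wall_trapping_efficiency(eta_single, porosity, wall_thickness, collector_diam):
          # E = 1 - exp( -3 * eta_s * (1 - eps) * w / (2 * eps * d_c) )
          exponent = -3.0 * eta_single * (1.0 - porosity) * wall_thickness \
                     / (2.0 * porosity * collector_diam)
          return 1.0 - np.exp(exponent)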

  15. Formal Development and Verification of Railway Control Systems

    DEFF Research Database (Denmark)

    Vu Hong, Linh; Haxthausen, Anne Elisabeth; Peleska, Jan

    This paper presents work package WP4.1 of the RobustRails research project. The work package aims at suggesting a methodology for efficient development and verification of safe and robust railway control systems. Project background and state of the art: over the next 10 years all Danish railway signalling systems are going to be completely replaced with modern, computer-based railway control systems based on the European standard ERTMS/ETCS [3, 4] by the Danish Signalling Programme [1]. The purpose of these systems is to control the railway traffic such that unsafe situations, like train collisions, are avoided. Central parts of these new systems consist of safety-critical software whose functional correctness is one of the key requisites for a reliable operation of the traffic and in particular for the safety of passengers. Until now the development of railway control software has typically been...

  16. Development of the nested fiber filter

    Science.gov (United States)

    Litt, R. D.; Conkle, H. N.; Raghavan, J. K.

    Battelle has tested the Nested Fiber Filter (NFF) as a particulate control device for high-temperature, high-pressure (HTHP) applications. Battelle funded initial bench-scale tests which were the basis for patents and a concept applying the NFF. Subsequent parametric tests in a 6-inch diameter reactor established excellent particulate capture performance, greater than 99 percent, for conditions up to 1600 F and 6 atmospheres. Effective cleaning/regeneration of the NFF was achieved at the 6-inch scale with acoustic and mechanical vibration. A pulse combustor was tested in an integrated NFF arrangement because of its compatibility with the HTHP conditions. This arrangement provided the basis for larger-scale tests under the subject contract. A 6-sq ft test module was designed and installed with an existing fluidized bed combustor for additional development and testing.

  17. In-Space Engine (ISE-100) Development - Design Verification Test

    Science.gov (United States)

    Trinh, Huu P.; Popp, Chris; Bullard, Brad

    2017-01-01

    In the past decade, NASA has formulated science mission concepts in anticipation of landing spacecraft on the lunar surface, meteoroids, and other planets. Advancing thruster technology for spacecraft propulsion systems has been considered for maximizing science payload. Starting in 2010, development of the In-Space Engine (designated ISE-100) has been carried out. The ISE-100 thruster is designed based on heritage Missile Defense Agency (MDA) technology aimed at a lightweight and efficient system in terms of volume and packaging. It runs on a hypergolic bi-propellant system: MON-25 (nitrogen tetroxide, N2O4, with 25% nitric oxide, NO) and MMH (monomethylhydrazine, CH6N2) for NASA spacecraft applications. This propellant system provides a propulsion system capable of operating over a wide range of temperatures, from 50 C (122 F) down to -30 C (-22 F), to drastically reduce heater power. The thruster is designed to deliver 100 lb(sub f) of thrust with pulse-mode capability for a wide range of mission duty cycles (MDCs). Two thrusters were fabricated. As part of the engine development, this test campaign is dedicated to the design verification of the thruster. This presentation reports the design verification hot-fire test program of the ISE-100 thruster, a collaboration between NASA Marshall Space Flight Center (MSFC) and Aerojet Rocketdyne (AR) test teams. The hot-fire tests were conducted at the Advance Mobile Propulsion Test (AMPT) facility in Durango, Colorado, from May 13 to June 10, 2016. This presentation also provides a summary of key points from the test results.

  18. Development of evaluation and performance verification technology for radiotherapy radiation

    Energy Technology Data Exchange (ETDEWEB)

    Kim, J. Y.; Jang, S. Y.; Kim, B. H. and others

    2005-02-15

    No matter how much its importance is emphasized, the exact assessment of the absorbed doses administered to patients treated for diseases such as the lately soaring malignant tumors is the most important factor in radiotherapy practice. In reality, cases of patients over-exposed during radiotherapy have become very serious social issues. In particular, the development of a technology to exactly assess the high doses and high energies generated by radiation generators and irradiation equipment is a pressing issue: doses administered in radiotherapy are, in general, very large, about three times higher than lethal doses. Over fifty medical centers in Korea operate radiation generators and irradiation equipment for radiotherapy practices. However, neither are the legal and regulatory systems to implement a quality assurance program sufficiently stipulated, nor are qualified personnel who could run a program to maintain the quality assurance and control of those generators and equipment sufficiently employed in the medical facilities. To overcome these deficiencies, a quality assurance program such as those developed in technically advanced countries should be developed to exactly assess the doses administered to patients and to establish the procedures needed to maintain the continuing performance of the radiotherapy machines and equipment. The QA program and procedures should support proper calibration of the machines and equipment and definitely establish the safety of patients in radiotherapy practices. In this study, a methodology for the verification and evaluation of radiotherapy doses is developed, and accurate measurements, evaluations of the doses delivered to patients and verification of the performance of the therapy machines and equipment are

  19. Development and Verification of Behavior of Tritium Analytic Code (BOTANIC)

    Energy Technology Data Exchange (ETDEWEB)

    Park, Min Young; Kim, Eung Soo [Seoul National University, Seoul (Korea, Republic of)

    2014-10-15

    VHTR, one of the Generation IV reactor concepts, has a relatively high operating temperature and is usually suggested as a heat source for many industrial processes, including hydrogen production. It is therefore vital to trace tritium behavior in the VHTR system and the potential permeation rate into the industrial process; tritium is a crucial safety issue in the fission reactor system, so understanding its behavior, and having a tool that enables this, is essential. In this study, the Behavior of Tritium Analytic Code (BOTANIC), an analytic tool capable of analyzing tritium behavior, is developed using a chemical process code called gPROMS. BOTANIC was then verified using analytic solutions and benchmark codes such as the Tritium Permeation Analysis Code (TPAC) and COMSOL. The code has several distinctive features, including a non-diluted assumption, flexible applications, and the adoption of a distributed permeation model. Owing to these features, BOTANIC can analyze a wide range of tritium-level systems and achieves higher accuracy, as it is able to solve distributed models. The verification results showed very good agreement with the analytical solutions and with the calculation results of TPAC and COMSOL. Future work will focus on total system verification.
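
    One of the core relations any such tritium-behavior code must cover is diffusion-limited permeation through metal walls. A hedged sketch in the standard Richardson form with Sieverts' square-root pressure dependence (the Arrhenius permeability constants are material-specific inputs; this is not BOTANIC/gPROMS code):

      import numpy as np

      def permeation_flux(phi0, activation_energy, temp_k, p_high, p_low, thickness):
          r_gas = 8.314                                   # J/(mol*K)
          phi = phi0 * np.exp(-activation_energy / (r_gas * temp_k))   # permeability Phi(T)
          # J = Phi(T) * (sqrt(p_high) - sqrt(p_low)) / wall thickness
          return phi * (np.sqrt(p_high) - np.sqrt(p_low)) / thickness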

  1. Development of a Scalable Testbed for Mobile Olfaction Verification.

    Science.gov (United States)

    Zakaria, Syed Muhammad Mamduh Syed; Visvanathan, Retnam; Kamarudin, Kamarulzaman; Yeon, Ahmad Shakaff Ali; Md Shakaff, Ali Yeon; Zakaria, Ammar; Kamarudin, Latifah Munirah

    2015-12-09

    The lack of ground-truth gas dispersion and experiment verification information has impeded the development of mobile olfaction systems, especially for real-world conditions. In this paper, an integrated testbed for mobile gas sensing experiments is presented. The integrated 3 m × 6 m testbed was built to provide real-time ground-truth information for mobile olfaction system development. The testbed consists of a 72-gas-sensor array, namely the Large Gas Sensor Array (LGSA), a camera-based localization system, and a wireless communication backbone for robot communication and integration into the testbed system. Furthermore, the data collected from the testbed may be streamed into a simulation environment to expedite development. Calibration results using ethanol have shown that using a large number of gas sensors in the LGSA is feasible and can produce coherent signals when exposed to the same concentrations. The results have shown that the testbed was able to capture the time-varying characteristics and the variability of a gas plume in a 2 h experiment, thus providing time-dependent ground-truth concentration maps. The authors have demonstrated the ability of the mobile olfaction testbed to monitor and verify, and thus provide insight into, gas distribution mapping experiments.

  2. Analog Video Authentication and Seal Verification Equipment Development

    Energy Technology Data Exchange (ETDEWEB)

    Gregory Lancaster

    2012-09-01

    Under contract to the US Department of Energy in support of arms control treaty verification activities, the Savannah River National Laboratory in conjunction with the Pacific Northwest National Laboratory, the Idaho National Laboratory and Milagro Consulting, LLC developed equipment for use within a chain of custody regime. This paper discussed two specific devices, the Authentication Through the Lens (ATL) analog video authentication system and a photographic multi-seal reader. Both of these devices have been demonstrated in a field trial, and the experience gained throughout will also be discussed. Typically, cryptographic methods are used to prove the authenticity of digital images and video used in arms control chain of custody applications. However, in some applications analog cameras are used. Since cryptographic authentication methods will not work on analog video streams, a simple method of authenticating analog video was developed and tested. A photographic multi-seal reader was developed to image different types of visual unique identifiers for use in chain of custody and authentication activities. This seal reader is unique in its ability to image various types of seals including the Cobra Seal, Reflective Particle Tags, and adhesive seals. Flicker comparison is used to compare before and after images collected with the seal reader in order to detect tampering and verify the integrity of the seal.

  3. Development of a Scalable Testbed for Mobile Olfaction Verification

    Directory of Open Access Journals (Sweden)

    Syed Muhammad Mamduh Syed Zakaria

    2015-12-01

    Full Text Available The lack of ground-truth gas dispersion and experiment verification information has impeded the development of mobile olfaction systems, especially for real-world conditions. In this paper, an integrated testbed for mobile gas sensing experiments is presented. The integrated 3 m × 6 m testbed was built to provide real-time ground-truth information for mobile olfaction system development. The testbed consists of a 72-gas-sensor array, namely the Large Gas Sensor Array (LGSA), a camera-based localization system, and a wireless communication backbone for robot communication and integration into the testbed system. Furthermore, the data collected from the testbed may be streamed into a simulation environment to expedite development. Calibration results using ethanol have shown that using a large number of gas sensors in the LGSA is feasible and can produce coherent signals when exposed to the same concentrations. The results have shown that the testbed was able to capture the time-varying characteristics and the variability of a gas plume in a 2 h experiment, thus providing time-dependent ground-truth concentration maps. The authors have demonstrated the ability of the mobile olfaction testbed to monitor and verify, and thus provide insight into, gas distribution mapping experiments.

  4. Simscape Modeling Verification in the Simulink Development Environment

    Science.gov (United States)

    Volle, Christopher E. E.

    2014-01-01

    The purpose of the Simulation Product Group of the Control and Data Systems division of the NASA Engineering branch at Kennedy Space Center is to provide a real-time model and simulation of the Ground Subsystems participating in vehicle launching activities. The simulation software is part of the Spaceport Command and Control System (SCCS) and is designed to support integrated launch operation software verification and console operator training. Using MathWorks Simulink tools, modeling engineers currently build models from custom-built blocks to accurately represent ground hardware. This is time-consuming and costly due to the rigorous testing and peer reviews required for each custom-built block. Using MathWorks Simscape tools, modeling time can be reduced since no custom code needs to be developed. After careful research, the group concluded that it is feasible to use Simscape's blocks in MATLAB's Simulink. My project this fall was to verify the accuracy of the Crew Access Arm model developed using Simscape tools running in the Simulink development environment.

  5. Multidimensional filter banks and wavelets research developments and applications

    CERN Document Server

    Levy, Bernard

    1997-01-01

    Multidimensional Filter Banks and Wavelets: Research Developments and Applications brings together in one place important contributions and up-to-date research results in this important area. It serves as an excellent reference, providing insight into some of the most important research issues in the field.

  6. Software Testing and Verification in Climate Model Development

    Science.gov (United States)

    Clune, Thomas L.; Rood, RIchard B.

    2011-01-01

    Over the past 30 years most climate models have grown from relatively simple representations of a few atmospheric processes to complex multi-disciplinary systems. Computer infrastructure over that period has gone from punch-card mainframes to modern parallel clusters. Model implementations have become complex, brittle, and increasingly difficult to extend and maintain. Existing verification processes for model implementations rely almost exclusively upon some combination of detailed analysis of output from full climate simulations and system-level regression tests. In addition to being quite costly in terms of developer time and computing resources, these testing methodologies are limited in the types of defects that can be detected, isolated and diagnosed. Mitigating these weaknesses of coarse-grained testing with finer-grained "unit" tests has been perceived as cumbersome and counter-productive. In the commercial software sector, recent advances in tools and methodology have led to a renaissance of systematic fine-grained testing. We discuss the availability of analogous tools for scientific software and examine the benefits that similar testing methodologies could bring to climate modeling software. We describe the unique challenges faced when testing complex numerical algorithms and suggest techniques to minimize and/or eliminate the difficulties.
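
    A hedged illustration of the fine-grained testing the article advocates, using a toy (hypothetical) saturation-vapor-pressure parameterization: each test pins down one property of one kernel and runs in milliseconds under pytest, in contrast to full-simulation regression tests.

      import numpy as np

      def saturation_vapor_pressure_hpa(temp_k):
          # Tetens/Bolton-style fit over liquid water, in hPa (illustrative kernel)
          t_c = temp_k - 273.15
          return 6.112 * np.exp(17.67 * t_c / (t_c + 243.5))

      def test_reference_point():
          # Known anchor value at 0 C
          assert np.isclose(saturation_vapor_pressure_hpa(273.15), 6.112)

      def test_monotonic_in_temperature():
          e = saturation_vapor_pressure_hpa(np.linspace(230.0, 310.0, 50))
          assert np.all(np.diff(e) > 0)    # Clausius-Clapeyron: e_s rises with T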

  7. Formal Development and Verification of a Distributed Railway Control System

    DEFF Research Database (Denmark)

    Haxthausen, Anne Elisabeth; Peleska, Jan

    1999-01-01

    In this article we introduce the concept for a distributed railway control system and present the specification and verification of the main algorithm used for safe distributed control. Our design and verification approach is based on the RAISE method, starting with highly abstract algebraic specifications which are transformed into directly implementable distributed control processes by applying a series of refinement and verification steps. Concrete safety requirements are derived from an abstract version that can be easily validated with respect to soundness and completeness. Complexity is further reduced by separating the system model into a domain model describing the physical system in absence of control and a controller model introducing the safety-related control mechanisms as a separate entity monitoring observables of the physical system to decide whether it is safe for a train...

  8. Formal development and verification of a distributed railway control system

    DEFF Research Database (Denmark)

    Haxthausen, Anne Elisabeth; Peleska, J.

    2000-01-01

    The authors introduce the concept for a distributed railway control system and present the specification and verification of the main algorithm used for safe distributed control. Our design and verification approach is based on the RAISE method, starting with highly abstract algebraic specifications which are transformed into directly implementable distributed control processes by applying a series of refinement and verification steps. Concrete safety requirements are derived from an abstract version that can be easily validated with respect to soundness and completeness. Complexity is further reduced by separating the system model into a domain model and a controller model. The domain model describes the physical system in absence of control and the controller model introduces the safety-related control mechanisms as a separate entity monitoring observables of the physical system...

  9. Formal Development and Verification of a Distributed Railway Control System

    DEFF Research Database (Denmark)

    Haxthausen, Anne Elisabeth; Peleska, Jan

    1998-01-01

    In this article we introduce the concept for a distributed railway control system and present the specification and verification of the main algorithm used for safe distributed control. Our design and verification approach is based on the RAISE method, starting with highly abstract algebraic specifications which are transformed into directly implementable distributed control processes by applying a series of refinement and verification steps. Concrete safety requirements are derived from an abstract version that can be easily validated with respect to soundness and completeness. Complexity is further reduced by separating the system model into a domain model describing the physical system in absence of control and a controller model introducing the safety-related control mechanisms as a separate entity monitoring observables of the physical system to decide whether it is safe for a train...

  10. Practical approach for pretreatment verification of IMRT with flattening filter-free (FFF) beams using Varian Portal Dosimetry

    Science.gov (United States)

    Min, Soonki; Choi, Young Eun; Kwak, Jungwon; Cho, Byungchul

    2014-01-08

    Patient-specific pretreatment verification of intensity-modulated radiation therapy (IMRT) or volumetric-modulated arc therapy (VMAT) is strongly recommended for all patients in order to detect any potential errors in the treatment planning process and machine deliverability, and is thus performed routinely in many clinics. Portal dosimetry is an effective method for this purpose because of its prompt setup, easy data acquisition, and high spatial resolution. However, portal dosimetry cannot be applied to IMRT or VMAT with flattening filter-free (FFF) beams because of the high-dose-rate saturation effect of the electronic portal imaging device (EPID). In this report, we suggest a practical QA method that extends conventional portal dosimetry to FFF beams with a QA plan generated by the following three steps: 1) replace the FFF beams with flattening-filtered (FF) beams of the same nominal energy; 2) reduce the dose rate to avoid the saturation effect of the EPID detector; and 3) adjust the total MU to match the gantry and MLC leaf motions. Two RapidArc plans with 6 and 10 MV FFF beams were selected, and QA plans were created by the aforementioned steps and delivered. The trajectory log files of TrueBeam obtained during the treatment and during the delivery of the QA plan were analyzed and compared. The maximum discrepancies in the expected trajectories between the treatment and QA plans were within 0.002 MU for the MU, 0.06° for the gantry rotation, and 0.006 mm for the positions of the MLC leaves, indicating much higher levels of accuracy compared to the mechanical specifications of the machine. For further validation of the method, direct comparisons of the delivered QA FF beam to the treatment FFF beam were performed using film dosimetry and show that gamma passing rates under 2%/2 mm criteria are 99.0%-100% for all four arc beams. This method can be used on RapidArc plans with FFF beams without any additional procedure or modifications on the
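
    The arithmetic behind step 3 is simple: scaling the total MU by the same factor as the dose-rate reduction preserves delivery time, and hence the gantry and MLC trajectories. A sketch with illustrative numbers (our reading of the method, not the authors' code):

      def qa_monitor_units(mu_fff, dose_rate_fff, dose_rate_qa):
          # Equal delivery time: MU_qa / rate_qa == MU_fff / rate_fff
          return mu_fff * (dose_rate_qa / dose_rate_fff)

      # e.g. a 2400 MU/min FFF arc mapped to a 600 MU/min FF QA beam:
      # qa_monitor_units(480.0, 2400.0, 600.0) -> 120.0 MU over the same delivery time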

  13. Development and Analysis of Compact Lowpass Filter for UWB Systems

    Directory of Open Access Journals (Sweden)

    Rangaswamy Nakkeeran

    2012-06-01

Full Text Available This paper presents a developed compact lowpass filter which is devised by concatenating 'T' and inverted 'T' open stubs on a microstrip line. They are combined to enhance the bandwidth and also to improve the outband performance. Insertion loss of the developed filter is less than -0.15 dB and the return loss is -30 dB at 2.85 GHz. Stopband performance of the filter is less than -20 dB from 5.4 GHz to 7.8 GHz, and the filter provides a -3 dB cut-off frequency at 4.5 GHz. The final dimension of the fabricated filter with the above features is only 16.17 mm (length) × 15.76 mm (width). The response of the developed filter is in good agreement with the simulated response.

  14. DEVELOPMENT OF AN ADHESIVE CANDLE FILTER SAFEGUARD DEVICE

    Energy Technology Data Exchange (ETDEWEB)

    John P. Hurley; Ann K. Henderson; Jan W. Nowok; Michael L. Swanson

    2002-01-01

    In order to reach the highest possible efficiencies in a coal-fired turbine-based power system, the turbine should be directly fired with the products of coal conversion. Two main types of systems employ these turbines: those based on pressurized fluidized-bed combustors and those based on integrated gasification combined cycles. In both systems, suspended particulates must be cleaned from the gas stream before it enters the turbine so as to prevent fouling and erosion of the turbine blades. To produce the cleanest gas, barrier filters are being developed and are in use in several facilities. Barrier filters are composed of porous, high-temperature materials that allow the hot gas to pass but collect the particulates on the surface. The three main configurations of the barrier filters are candle, cross-flow, and tube filters. Both candle and tube filters have been tested extensively. They are composed of coarsely porous ceramic that serves as a structural support, overlain with a thin, microporous ceramic layer on the dirty gas side that serves as the primary filter surface. They are highly efficient at removing particulate matter from the gas stream and, because of their ceramic construction, are resistant to gas and ash corrosion. However, ceramics are brittle and individual elements can fail, allowing particulates to pass through the hole left by the filter element and erode the turbine. Preventing all failure of individual ceramic filter elements is not possible at the present state of development of the technology. Therefore, safeguard devices (SGDs) must be employed to prevent the particulates streaming through occasional broken filters from reaching the turbine. However, the SGD must allow for the free passage of gas when it is not activated. Upon breaking of a filter, the SGD must either mechanically close or quickly plug with filter dust to prevent additional dust from reaching the turbine. Production of a dependable rapidly closing autonomous mechanical

  16. Development of requirements tracking and verification technology for the NPP software

    Energy Technology Data Exchange (ETDEWEB)

    Jung, Chul Hwan; Kim, Jang Yeol; Lee, Jang Soo; Song, Soon Ja; Lee, Dong Young; Kwon, Kee Choon

    1998-12-30

This project searched and analyzed requirements engineering technology in the aerospace and defense, medical, and nuclear industries; summarized the status of tools for software design and requirements management; analyzed the software design methodology for the safety software of NPPs; developed the design requirements for the requirements tracking and verification system; and developed the background technology to design the prototype tool for requirements tracking and verification.

  17. Development of requirements tracking and verification system for the software design of distributed control system

    Energy Technology Data Exchange (ETDEWEB)

Jung, Chul Hwan; Kim, Jang Yeol; Kim, Jung Tack; Lee, Jang Soo; Ham, Chang Shik [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)]

    1998-12-31

In this paper a prototype Requirements Tracking and Verification System (RTVS) for a Distributed Control System was implemented and tested. The RTVS is a software design and verification tool. The main functions required of the RTVS are the management, tracking and verification of the software requirements listed in the documentation of the DCS. An analysis of the DCS software design procedures and document interfaces was performed to define the users of the RTVS, and the design requirements for the RTVS were developed. 4 refs., 3 figs. (Author)
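
The core bookkeeping such a tool performs can be illustrated in a few lines: requirements are linked forward to design items and to verification evidence, and anything lacking either is reported. The sketch below is a minimal illustration with hypothetical identifiers, not the KAERI implementation.

```python
# Minimal sketch of requirements tracking/verification bookkeeping.
# All IDs and names are hypothetical.
from dataclasses import dataclass, field

@dataclass
class Requirement:
    rid: str
    text: str
    design_items: set = field(default_factory=set)   # forward traceability
    passing_tests: set = field(default_factory=set)  # verification evidence

class TraceDB:
    def __init__(self):
        self.reqs = {}
    def add(self, rid, text):
        self.reqs[rid] = Requirement(rid, text)
    def link_design(self, rid, item):
        self.reqs[rid].design_items.add(item)
    def record_test(self, rid, test_id):
        self.reqs[rid].passing_tests.add(test_id)
    def unverified(self):
        """Requirements with no design trace or no passing test."""
        return [r.rid for r in self.reqs.values()
                if not r.design_items or not r.passing_tests]

db = TraceDB()
db.add("SRS-001", "Trip signal shall be issued within 100 ms")
db.link_design("SRS-001", "SDD-017")
print(db.unverified())   # ['SRS-001'] until a passing test is recorded
```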

  18. Development of iron-aluminide hot-gas filters

    Energy Technology Data Exchange (ETDEWEB)

    Tortorelli, P.F.; Wright, I.G.; Judkins, R.R.

    1996-06-01

    Removal of particles from hot synthesis gas produced by coal gasification is vital to the success of these systems. In Integrated [Coal] Gasification Combined Cycle systems, the synthesis gas is the fuel for gas turbines. To avoid damage to turbine components, it is necessary that particles be removed from the fuel gas prior to combustion and introduction into the turbine. Reliability and durability of the hot-gas filtering devices used to remove the particles is, of course, of special importance. Hot-gas filter materials include both ceramics and metals. Numerous considerations must be made in selecting materials for these filters. Constituents in the hot gases may potentially degrade the properties and performance of the filters to the point that they are ineffective in removing the particles. Very significant efforts have been made by DOE and others to develop effective hot-particle filters and, although improvements have been made, alternative materials and structures are still needed.

  19. Development of Test Protocols for International Space Station Particulate Filters

    Science.gov (United States)

    Green, Robert D.; Vijayakumar, R.; Agui, Juan H.

    2014-01-01

Air quality control on the International Space Station (ISS) is a vital requirement for maintaining a clean environment for the crew and the hardware. This becomes a serious challenge in pressurized space compartments since no outside air ventilation is possible, and a larger particulate load is imposed on the filtration system due to lack of gravitational settling. The ISS Environmental Control and Life Support System (ECLSS) uses a filtration system that has been in use for over 14 years and has proven to meet this challenge. The heart of this system is a traditional High-Efficiency Particulate Air (HEPA) filter configured to interface with the rest of the life support elements and provide effective cabin filtration. Over the years, the service life of these filters has been re-evaluated based on limited post-flight tests of returned filters and risk factors. On Earth, a well designed and installed HEPA filter will last for several years, e.g. in industrial and research clean room applications. Test methods for evaluating these filters are being developed on the basis of established test protocols used by the industry and the military. This paper will discuss the test methods adopted and test results on prototypes of the ISS filters. The results will assist in establishing whether the service life can be extended for these filters. Results from unused filters that have been in storage will also be presented to ascertain the shelf life and performance deterioration, if any, and determine if the shelf life may be extended.

  20. Development and validation of MCNPX-based Monte Carlo treatment plan verification system

    OpenAIRE

    Iraj Jabbari; Shahram Monadi

    2015-01-01

    A Monte Carlo treatment plan verification (MCTPV) system was developed for clinical treatment plan verification (TPV), especially for the conformal and intensity-modulated radiotherapy (IMRT) plans. In the MCTPV, the MCNPX code was used for particle transport through the accelerator head and the patient body. MCTPV has an interface with TiGRT planning system and reads the information which is needed for Monte Carlo calculation transferred in digital image communications in medicine-radiation ...

  1. Developing a Verification and Training Phantom for Gynecological Brachytherapy System

    Directory of Open Access Journals (Sweden)

    Mahbobeh Nazarnejad

    2012-03-01

Full Text Available Introduction Dosimetric accuracy is a major issue in the quality assurance (QA) program for treatment planning systems (TPS). An important contribution to this process has been a proper dosimetry method to guarantee the accuracy of delivered dose to the tumor. In brachytherapy (BT) of gynecological (Gyn) cancer it is usual to insert a combination of tandem and ovoid applicators with a complicated geometry, which makes their dosimetry verification difficult and important. Therefore, evaluation and verification of dose distribution is necessary for accurate dose delivery to the patients. Materials and Methods The solid phantom was made from Perspex slabs as a tool for intracavitary brachytherapy dosimetric QA. Film dosimetry (EDR2) was done for a combination of ovoid and tandem applicators introduced by the Flexitron brachytherapy system. Treatment planning was also done with the Flexiplan 3D-TPS to irradiate films sandwiched between phantom slabs. Isodose curves obtained from the treatment planning system and the films were compared with each other in 2D and 3D manners. Results The brachytherapy solid phantom was constructed from slabs. It was possible to insert tandems and ovoids loaded with the radioactive source Ir-192 subsequently. Relative error was 3-8.6% and average relative error was 5.08% in the comparison of the film and TPS isodose curves. Conclusion Our results showed that the difference between TPS and the measurements is well within the acceptable boundaries and below the action level according to AAPM TG.45. Our findings showed that this phantom, after minor corrections, can be used as a method of choice for inter-comparison analysis of TPS and to fill the existing gap for an accurate QA program in intracavitary brachytherapy. The constructed phantom also showed that it can be a valuable tool for verification of accurate dose delivery to the patients as well as training for brachytherapy residents and physics students.
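
The comparison metric in this record is a plain point-wise relative error between TPS and film doses. A minimal sketch with made-up dose values (not the study's data), chosen so the result lands in the reported 3-8.6% band:

```python
import numpy as np

# Point-wise comparison of film and TPS dose (illustrative arrays, not real data).
d_tps  = np.array([105.0, 98.0, 76.0, 52.0])   # TPS dose at sample points (cGy)
d_film = np.array([101.2, 93.5, 71.8, 49.6])   # film-measured dose (cGy)

rel_err = np.abs(d_tps - d_film) / d_film * 100.0
print(rel_err.round(1), "mean = %.2f%%" % rel_err.mean())   # each point 3-6%, mean ~4.8%
```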

  2. Developments towards a filter wheel hyperspectral camera for planetary exploration

    Science.gov (United States)

    Gunn, M.; Langstaff, D. P.; Barnes, D.

    2011-10-01

The benefits of hyperspectral imaging in remote sensing applications are well established, and it is now routinely exploited in terrestrial applications. However, the restrictions imposed on mass and power consumption and the extreme operating conditions encountered in extra-terrestrial environments have limited its widespread use for planetary exploration. Instead, multispectral camera systems with typically 10-12 discrete filters are employed, providing only coarse spectral information. By exploiting the properties of interference filters off axis it is possible to obtain additional spectral information. Recent advances in filter technology have made it possible to develop a simple and lightweight wide-angle hyperspectral camera employing a filter wheel. The theory of operation and early test results from a prototype camera system are presented.

  3. Development of Genetic Markers for Triploid Verification of the Pacific Oyster,

    Directory of Open Access Journals (Sweden)

    Jung-Ha Kang

    2013-07-01

Full Text Available The triploid Pacific oyster, which is produced by mating tetraploid and diploid oysters, is favored by the aquaculture industry because of its better flavor and firmer texture, particularly during the summer. However, tetraploid oyster production is not feasible in all oysters; the development of tetraploid oysters is ongoing in some oyster species. Thus, a method for ploidy verification is necessary for this endeavor, in addition to ploidy verification in aquaculture farms and in the natural environment. In this study, a method for ploidy verification of triploid and diploid oysters was developed using multiplex polymerase chain reaction (PCR) panels containing primers for molecular microsatellite markers. Two microsatellite multiplex PCR panels consisting of three markers each were developed using previously developed microsatellite markers that were optimized for performance. Both panels were able to verify the ploidy levels of 30 triploid oysters with 100% accuracy, illustrating the utility of microsatellite markers as a tool for verifying the ploidy of individual oysters.
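
The underlying decision rule is simple: a diploid carries at most two distinct alleles per microsatellite locus, so observing three distinct alleles at any locus indicates triploidy (the converse is not guaranteed, since a triploid may carry duplicate alleles at a given locus). A sketch with hypothetical locus names and allele sizes, not the markers from the study:

```python
# Hedged sketch of ploidy calling from multiplex-PCR microsatellite genotypes.
def call_ploidy(genotypes):
    """genotypes: {locus: [allele sizes observed at that locus]}"""
    max_alleles = max(len(set(a)) for a in genotypes.values())
    return "triploid" if max_alleles >= 3 else "diploid (or uninformative)"

oyster = {"Cgi-1": [112, 118, 124],   # three distinct alleles -> triploid
          "Cgi-2": [200, 204],
          "Cgi-3": [150, 150, 156]}   # only two distinct alleles here
print(call_ploidy(oyster))            # 'triploid'
```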

  4. Fuel Efficient Diesel Particulate Filter (DPF) Modeling and Development

    Energy Technology Data Exchange (ETDEWEB)

    Stewart, Mark L.; Gallant, Thomas R.; Kim, Do Heui; Maupin, Gary D.; Zelenyuk, Alla

    2010-08-01

    The project described in this report seeks to promote effective diesel particulate filter technology with minimum fuel penalty by enhancing fundamental understanding of filtration mechanisms through targeted experiments and computer simulations. The overall backpressure of a filtration system depends upon complex interactions of particulate matter and ash with the microscopic pores in filter media. Better characterization of these phenomena is essential for exhaust system optimization. The acicular mullite (ACM) diesel particulate filter substrate is under continuing development by Dow Automotive. ACM is made up of long mullite crystals which intersect to form filter wall framework and protrude from the wall surface into the DPF channels. ACM filters have been demonstrated to effectively remove diesel exhaust particles while maintaining relatively low backpressure. Modeling approaches developed for more conventional ceramic filter materials, such as silicon carbide and cordierite, have been difficult to apply to ACM because of properties arising from its unique microstructure. Penetration of soot into the high-porosity region of projecting crystal structures leads to a somewhat extended depth filtration mode, but with less dramatic increases in pressure drop than are normally observed during depth filtration in cordierite or silicon carbide filters. Another consequence is greater contact between the soot and solid surfaces, which may enhance the action of some catalyst coatings in filter regeneration. The projecting crystals appear to provide a two-fold benefit for maintaining low backpressures during filter loading: they help prevent soot from being forced into the throats of pores in the lower porosity region of the filter wall, and they also tend to support the forming filter cake, resulting in lower average cake density and higher permeability. Other simulations suggest that soot deposits may also tend to form at the tips of projecting crystals due to the axial
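
For orientation, the textbook view of DPF backpressure treats the soot cake and the filter wall as Darcy layers in series; this is a deliberate simplification for illustration, not the ACM microstructure model discussed above. Values below are representative orders of magnitude only.

```python
# Textbook Darcy estimate of wall-flow DPF pressure drop (soot cake + wall in
# series): dP = mu * u * (w_wall/k_wall + w_cake/k_cake). SI units throughout.
def dpf_delta_p(u, mu, w_wall, k_wall, w_cake, k_cake):
    return mu * u * (w_wall / k_wall + w_cake / k_cake)

dp = dpf_delta_p(u=0.02, mu=3.0e-5,               # 2 cm/s wall velocity, hot-gas viscosity
                 w_wall=4.0e-4, k_wall=1.0e-12,   # 400 um wall, 1e-12 m^2 permeability
                 w_cake=5.0e-5, k_cake=1.0e-14)   # 50 um soot cake, far less permeable
print("%.0f Pa" % dp)   # ~3240 Pa; the thin cake dominates the backpressure
```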

  5. Development of low optical cross talk filters for VIIRS (JPSS)

    Science.gov (United States)

    Murgai, Vijay; Hendry, Derek; Downing, Kevin; Carbone, David; Potter, John

    2016-09-01

The Visible/Infrared Imaging Radiometer Suite (VIIRS) is a key sensor on the Suomi National Polar-orbiting Partnership (S-NPP) satellite launched on October 28, 2011 into a polar orbit of 824 km nominal altitude, and on the JPSS sensors currently being built and integrated. VIIRS collects radiometric and imagery data of the Earth's atmosphere, oceans, and land surfaces in 22 spectral bands spanning the visible and infrared spectrum from 0.4 to 12.5 μm. Interference filters assembled in 'butcher-block' arrays mounted adjacent to focal plane arrays provide spectral definition. Out-of-band signal and out-of-band optical cross-talk were observed for bands in the 0.4 to 1 μm range in testing of VIIRS for S-NPP. Optical cross-talk is in-band or out-of-band light incident on an adjacent filter or adjacent region of the same filter reaching the detector. Out-of-band optical cross-talk results in spectral and spatial 'impurities' in the signal and consequent errors in the calculated environmental parameters, such as ocean color, that rely on combinations of signals from more than one band. This paper presents results of characterization, specification, and coating process improvements that enabled production of filters with significantly reduced out-of-band light for Joint Polar Satellite System (JPSS) J1 and subsequent sensors. Total transmission and scatter measurements at a wavelength within the pass band can successfully characterize filter performance prior to dicing and assembling filters into butcher-block assemblies. Coating and process development demonstrated performance on test samples, followed by production of filters for J1 and J2. Results for J1 and J2 filters are presented.

  6. Does a Business Curriculum Develop or Filter Critical Thinking?

    Science.gov (United States)

    Coleman, B. Jay; Mason, Paul; Steagall, Jeffrey W.

    2012-01-01

    We investigate whether a business curriculum develops critical thinking ability or at least serves as a filter for critical thinking (i.e., students who cannot think critically tend not to progress toward graduation). We measure critical thinking by performance on the Watson-Glaser Critical Thinking Appraisal Short Form which was administered to a…

  7. Development of active porous medium filters based on plasma textiles

    Science.gov (United States)

    Kuznetsov, Ivan A.; Saveliev, Alexei V.; Rasipuram, Srinivasan; Kuznetsov, Andrey V.; Brown, Alan; Jasper, Warren

    2012-05-01

    Inexpensive, flexible, washable, and durable materials that serve as antimicrobial filters and self-decontaminating fabrics are needed to provide active protection to people in areas regularly exposed to various biohazards, such as hospitals and bio research labs working with pathogens. Airlines and cruise lines need such material to combat the spread of infections. In households these materials can be used in HVAC filters to fight indoor pollution, which is especially dangerous to people suffering from asthma. Efficient filtering materials are also required in areas contaminated by other types of hazardous dust particulates, such as nuclear dust. The primary idea that guided the undertaken study is that a microplasma-generating structure can be embedded in a textile fabric to generate a plasma sheath ("plasma shield") that kills bacterial agents coming in contact with the fabric. The research resulted in the development of a plasma textile that can be used for producing new types of self-decontaminating garments, fabrics, and filter materials, capable of activating a plasma sheath that would filter, capture, and destroy any bacteriological agent deposited on its surface. This new material relies on the unique antimicrobial and catalytic properties of cold (room temperature) plasma that is benign to people and does not cause thermal damage to many polymer textiles, such as Nomex and polypropylene. The uniqueness of cold plasma as a disinfecting agent lies in the inability of bacteria to develop resistance to plasma exposure, as they can for antibiotics. Plasma textiles could thus be utilized for microbial destruction in active antimicrobial filters (for continuous decontamination and disinfection of large amounts of air) as well as in self-decontaminating surfaces and antibacterial barriers (for example, for creating local antiseptic or sterile environments around wounds and burns).

  8. Development of active porous medium filters based on plasma textiles

    Energy Technology Data Exchange (ETDEWEB)

Kuznetsov, Ivan A.; Saveliev, Alexei V.; Rasipuram, Srinivasan; Kuznetsov, Andrey V.; Brown, Alan; Jasper, Warren [Mechanical and Aerospace Engineering, North Carolina State University, Raleigh, NC 27695 (United States); Textile Engineering Chemistry and Science, North Carolina State University, Raleigh, NC 27695 (United States)]

    2012-05-15

Inexpensive, flexible, washable, and durable materials that serve as antimicrobial filters and self-decontaminating fabrics are needed to provide active protection to people in areas regularly exposed to various biohazards, such as hospitals and bio research labs working with pathogens. Airlines and cruise lines need such material to combat the spread of infections. In households these materials can be used in HVAC filters to fight indoor pollution, which is especially dangerous to people suffering from asthma. Efficient filtering materials are also required in areas contaminated by other types of hazardous dust particulates, such as nuclear dust. The primary idea that guided the undertaken study is that a microplasma-generating structure can be embedded in a textile fabric to generate a plasma sheath ("plasma shield") that kills bacterial agents coming in contact with the fabric. The research resulted in the development of a plasma textile that can be used for producing new types of self-decontaminating garments, fabrics, and filter materials, capable of activating a plasma sheath that would filter, capture, and destroy any bacteriological agent deposited on its surface. This new material relies on the unique antimicrobial and catalytic properties of cold (room temperature) plasma that is benign to people and does not cause thermal damage to many polymer textiles, such as Nomex and polypropylene. The uniqueness of cold plasma as a disinfecting agent lies in the inability of bacteria to develop resistance to plasma exposure, as they can for antibiotics. Plasma textiles could thus be utilized for microbial destruction in active antimicrobial filters (for continuous decontamination and disinfection of large amounts of air) as well as in self-decontaminating surfaces and antibacterial barriers (for example, for creating local antiseptic or sterile environments around wounds and burns).

  9. Improving search filter development: a study of palliative care literature

    Directory of Open Access Journals (Sweden)

    Tieman Jennifer

    2007-06-01

Full Text Available Background It is difficult to systematically search for literature relevant to palliative care in general medical journals. A previously developed search filter for use on OVID Medline, validated using a gold standard set of references identified through hand searching, achieved an unacceptably low sensitivity (45.4%). Retrieving relevant literature is integral to supporting evidence based practice, and understanding the nature of the incorrectly excluded citations (false negatives) using the filter may lead to improvement in the filter's performance. Methods The objectives were to describe the nature of subjects reflected in the false negative citations and to empirically improve the sensitivity of the search filter. A thematic analysis of MeSH terms by three independent reviewers was used to describe the subject coverage of the missed records. Using a frequency analysis of MeSH terms, those headings which could individually contribute at least 2.5% to sensitivity (occurring 19 or more times) were added to the search filter. All previously run searches were rerun at the same time as the revised filter, and results compared. Results Thematic analysis of MeSH terms identified thirteen themes reflected in the missing records, none of them intrinsically palliative. The addition of six MeSH terms to the existing search filter (physician-patient relations, prognosis, quality of life, survival rate, treatment outcome and attitude to health) led to an increase in sensitivity from 46.3% to 64.7%, offset by a decrease in precision from 72.6% to 21.9%. Conclusion The filter's sensitivity was successfully increased using frequency analysis of MeSH terms, offset by a decrease in precision. A thematic analysis of MeSH terms for the false negative citations confirmed the absence of any intrinsically palliative theme or term, suggesting that future improvements to search filters for palliative care literature will first depend on better identifying how...
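
The frequency-analysis step lends itself to a short sketch: count MeSH terms across the false-negative records, keep terms frequent enough to add at least 2.5% sensitivity each (19+ occurrences in the study), then recompute the standard retrieval metrics. The records, counts and threshold below are stand-ins scaled down for illustration.

```python
from collections import Counter

# Toy false-negative records, each a set of its MeSH headings (not study data).
false_negatives = [
    {"Prognosis", "Quality of Life", "Aged"},
    {"Quality of Life", "Treatment Outcome"},
    {"Physician-Patient Relations", "Prognosis"},
]

counts = Counter(term for rec in false_negatives for term in rec)
min_count = 2   # the study used 19; scaled down to fit this toy set
candidates = [t for t, n in counts.items() if n >= min_count]
print(sorted(candidates))   # terms worth OR-ing into the revised search filter

def sensitivity(tp, fn): return tp / (tp + fn)
def precision(tp, fp):  return tp / (tp + fp)
# Illustrative counts reproducing the reported pre-revision sensitivity of 46.3%:
print(round(sensitivity(300, 348), 3))   # 0.463
```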

  10. Development of Test Protocols for International Space Station Particulate Filters

    Science.gov (United States)

    Vijayakumar, R.; Green, Robert D.; Agui, Juan H.

    2015-01-01

Air quality control on the International Space Station (ISS) is a vital requirement for maintaining a clean environment for the crew and the hardware. This becomes a serious challenge in pressurized space compartments since no outside air ventilation is possible, and a larger particulate load is imposed on the filtration system due to lack of gravitational settling. The ISS Environmental Control and Life Support System (ECLSS) uses a filtration system that has been in use for over 14 years and has proven to meet this challenge. The heart of this system is a traditional High-Efficiency Particulate Air (HEPA) filter configured to interface with the rest of the life support elements and provide effective cabin filtration. The filter element for this system has a non-standard cross-section with a length-to-width ratio (L/W) of 6.6. A filter test setup was designed and built to meet industry testing standards. A CFD analysis was performed to initially determine the optimal duct geometry and flow configuration. Both a screen and a flow straightener were added to the test duct design to improve flow uniformity, and face velocity profiles were subsequently measured to confirm this. Flow quality and aerosol mixing assessments show that the duct flow is satisfactory for the intended leak testing. Preliminary leak testing was performed on two different ISS filters, one with known perforations and one with limited use, and results confirmed that the testing methods and photometer instrument are sensitive enough to detect and locate compromised sections of an ISS BFE. Given the engineering constraints in designing spacecraft life support systems, it is anticipated that non-industry-standard filters will be required in future designs. This work is focused on developing test protocols for testing the ISS BFE filters, but the methodology is general enough to be extended to other present and future spacecraft filters. These techniques for characterizing the test duct and performing leak testing...
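
The leak-scan logic reduces to a penetration ratio: the downstream photometer reading divided by the upstream challenge concentration, flagged where it exceeds a scan-test limit. The 0.01% limit, probe positions and readings below are assumptions for illustration, not values from the NASA tests.

```python
# Sketch of photometer scan leak detection on a filter face.
upstream = 1.0e5                      # challenge aerosol concentration (arbitrary units)
scan = {"pos_01": 2.0, "pos_02": 1.5, "pos_03": 85.0, "pos_04": 1.1}

threshold = 1e-4 * upstream           # 0.01% penetration, a typical scan-test limit
leaks = {pos: c / upstream for pos, c in scan.items() if c > threshold}
print(leaks)                          # {'pos_03': 0.00085} -> compromised section
```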

  11. Development and Implementation of Cgcre Accreditation Program for Greenhouse Gas Verification Bodies

    Science.gov (United States)

    Kropf Santos Fermam, Ricardo; Barroso Melo Monteiro de Queiroz, Andrea

    2016-07-01

An organizational innovation is defined as the implementation of a new organizational method in a firm's business practices, the organization of its workplace, or its external relations. This work illustrates a Cgcre innovation by presenting the development process of the accreditation program for greenhouse gas verification bodies in Brazil under the Brazilian accreditation body, the General Coordination for Accreditation (Cgcre).

  12. REDD+ readiness: early insights on monitoring, reporting and verification systems of project developers

    NARCIS (Netherlands)

    Joseph, S.; Herold, M.; Sunderlin, W.D.; Verchot, L.V.

    2013-01-01

    A functional measuring, monitoring, reporting and verification (MRV) system is essential to assess the additionality and impact on forest carbon in REDD+ (reducing emissions from deforestation and degradation) projects. This study assesses the MRV capacity and readiness of project developers at 20 R

  13. Development and verification of a reciprocating test rig designed for investigation of piston ring tribology

    DEFF Research Database (Denmark)

    Pedersen, Michael Torben; Imran, Tajammal; Klit, Peder

    2009-01-01

This paper describes the development and verification of a reciprocating test rig designed to study piston ring tribology. A crank mechanism is used to generate a reciprocating motion for a moving plate, which acts as the liner. A stationary block acting as the ring package is loaded against the moving plate, which makes the rig suitable for the study of piston ring tribology.

  14. An Actual Design of AC Filter for Static Var Compensator and Verification Results from the Field Test

    Science.gov (United States)

    Tamura, Yuji; Takasaki, Shinji; Irokawa, Shoichi; Takeda, Hideo; Takagi, Kikuo; Noro, Yasuhiro; Ametani, Akihiro

The AC filter design method for SVC and HVDC is commonly known from the relevant CIGRE technical brochure and IEC technical report. However, the conventional method requires many iterative calculations of the harmonic voltages and currents, changing the filter parameters based on experience until the calculation results fall within the regulation levels. In this respect, a new improved design method is proposed, which enables efficient evaluation on the complex impedance plane to confirm whether the proposed filter impedance is in the permissible range. In an actual project, the Okuura SVC of Kyusyu Electric Power Co., Inc., the new method was applied to the AC filter design. This paper describes the actual procedure of the AC filter design with the new method, practical examples of the harmonic performance calculation, and the field test measurement results on the Okuura SVC. The calculation results and the field measurement results are consistent with each other; thus the validity of the new design method is verified with respect to its accuracy and effectiveness.
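
As a generic illustration of evaluating a filter on the complex impedance plane (illustrative component values, not the Okuura design data), a single-tuned branch Z(h) = R + j(h·ω1·L − 1/(h·ω1·C)) can be swept over the harmonic orders of interest:

```python
import math

R, L, C = 0.5, 10e-3, 20e-6          # ohms, henries, farads (illustrative)
f1 = 50.0                            # fundamental frequency, Hz

def z_filter(h):
    w = 2 * math.pi * f1 * h
    return complex(R, w * L - 1.0 / (w * C))

for h in (3, 5, 7, 11, 13):
    z = z_filter(h)
    ang = math.degrees(math.atan2(z.imag, z.real))
    print(f"h={h:2d}  |Z|={abs(z):7.1f} ohm  angle={ang:6.1f} deg")

# The tuned order is where the reactance crosses zero: h0 = 1/(2*pi*f1*sqrt(L*C)).
print("tuned near h =", 1.0 / (2 * math.pi * f1 * math.sqrt(L * C)))   # ~7.1
```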

  15. Design, development and verification of the HIFI Alignment Camera System

    NARCIS (Netherlands)

    Boslooper, E.C.; Zwan, B.A. van der; Kruizinga, B.; Lansbergen, R.

    2005-01-01

    This paper presents the TNO share of the development of the HIFI Alignment Camera System (HACS), covering the opto-mechanical and thermal design. The HACS is an Optical Ground Support Equipment (OGSE) that is specifically developed to verify proper alignment of different modules of the HIFI instrume

  16. Development and verification of printed circuit board toroidal transformer model

    DEFF Research Database (Denmark)

    Pejtersen, Jens; Mønster, Jakob Døllner; Knott, Arnold

    2013-01-01

An analytical model of an air core printed circuit board embedded toroidal transformer configuration is presented. The transformer has been developed for galvanic isolation of very high frequency switch-mode dc-dc power converter applications. The theoretical model is developed and verified by comparing calculated parameters with 3D finite element simulations and experimental measurement results. The developed transformer model shows good agreement with the simulated and measured results. The model can be used to predict the parameters of printed circuit board toroidal transformer configurations...
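
The air-core baseline behind such a model is the classical rectangular-cross-section toroid formula, L = μ0·N²·h·ln(ro/ri)/(2π). The paper's model adds PCB-specific effects beyond this, so the sketch below should be read as the textbook starting point only, with illustrative dimensions.

```python
import math

MU0 = 4 * math.pi * 1e-7   # vacuum permeability, H/m

def toroid_inductance(n_turns, r_inner, r_outer, height):
    """Air-core toroid with rectangular cross-section (textbook baseline)."""
    return MU0 * n_turns**2 * height * math.log(r_outer / r_inner) / (2 * math.pi)

L = toroid_inductance(n_turns=20, r_inner=5e-3, r_outer=10e-3, height=1.6e-3)
print("%.1f nH" % (L * 1e9))   # ~88.7 nH for a 1.6 mm board with 20 turns
```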

  17. Development and verification test of integral reactor major components

    Energy Technology Data Exchange (ETDEWEB)

    Kim, J. I.; Kim, Y. W.; Kim, J. H. and others

    1999-03-01

The conceptual designs for the SG, MCP and CEDM to be installed in the integral reactor SMART were developed. Three-dimensional CAD models for the major components were developed to visualize the design concepts. A once-through helical steam generator was conceptually designed for SMART. A canned motor pump was adopted in the conceptual design of the MCP. Linear pulse motor type and ballscrew type CEDMs, which have fine control capabilities, were studied for adoption in SMART. In parallel with the structural design, the electro-magnetic design was performed for sizing the motors and electro-magnets. Prototypes for the CEDM and MCP sub-assemblies were developed and tested to verify the performance. The impeller design procedure and the computer program to analyze the dynamic characteristics of the MCP rotor shaft were developed. The design concepts of the SG, MCP and CEDM were also investigated for fabricability.

  18. Dissolution Model Development: Formulation Effects and Filter Complications

    DEFF Research Database (Denmark)

    Berthelsen, Ragna; Holm, Rene; Jacobsen, Jette

    2016-01-01

This study describes various complications related to sample preparation (filtration) during development of a dissolution method intended to discriminate among different fenofibrate immediate-release formulations. Several dissolution apparatus and sample preparation techniques were tested. The flow-through cell apparatus (USP 4) was found unfit for dissolution testing of fenofibrate MeltDose formulations due to clogging of filters and varying flow rates. A mini paddle dissolution setup produced dissolution profiles of the tested formulations that correlated well with clinical data. The work towards the mini paddle dissolution method demonstrates that sample preparation influenced the results. The investigations show that excipients from the formulations directly affected the drug–filter interaction, thereby affecting the dissolution profiles and the ability to predict the in vivo data...

  19. Effective Development and Verification of Railway Control Software

    DEFF Research Database (Denmark)

    Haxthausen, Anne Elisabeth

    2011-01-01

This document presents a method for effective development of software for a product line of similar railway control systems. The software is constructed in three steps: first a specification in a domain-specific language is created, then a formal behavioural controller model is automatically...

  20. Early Development of UVM based Verification Environment of Image Signal Processing Designs using TLM Reference Model of RTL

    Directory of Open Access Journals (Sweden)

    Abhishek Jain

    2014-01-01

Full Text Available With the semiconductor industry trend of "smaller the better", taking an idea to a final product with more innovation in the product portfolio, while remaining competitive and profitable, creates pressure and a need for more and more innovation in CAD flow, process management and the project execution cycle. Project schedules are very tight, and achieving first-silicon success is key for projects. This necessitates quicker verification with a better coverage matrix. Quicker verification requires early development of the verification environment with wider test vectors, without waiting for RTL to be available. In this paper, we present a novel approach to the early development of a reusable multi-language verification flow by addressing four major activities of verification: 1. early creation of an executable specification; 2. early creation of the verification environment; 3. early development of test vectors; and 4. better and increased re-use of blocks. Although this paper focuses on early development of a UVM based verification environment of image signal processing designs using a TLM reference model of the RTL, the same concept can be extended to non-image signal processing designs.

  1. DEVELOPMENT OF OPTIMAL FILTERS OBTAINED THROUGH CONVOLUTION METHODS, USED FOR FINGERPRINT IMAGE ENHANCEMENT AND RESTORATION

    Directory of Open Access Journals (Sweden)

    Cătălin LUPU

    2014-12-01

Full Text Available This article presents the development of optimal filters through convolution methods, necessary for restoring, correcting and improving fingerprints acquired from a sensor, so as to provide the most ideal image at the output. After the image is binarized and equalized, a Canny filter is applied in order to eliminate the noise (by filtering the image with a Gaussian filter), perform non-maxima suppression and gradient-modulus adaptive binarization, and extend edge points by hysteresis. The image resulting from the Canny filter is not ideal: the result may be an image with very fragmented edges and many pores in the ridges. To the resulting image, a bank of convolution filters is applied one after another (Kirsch, Laplace, Roberts, Prewitt, Sobel, Frei-Chen, averaging convolution filter, circular convolution filter, Laplacian convolution filter, Gaussian convolution filter, LoG convolution filter, DoG, inverse filters, Wiener, the "power spectrum equalization" filter (an intermediate filter between the Wiener filter and the inverse filter), the geometric mean filter, etc.) with different features.
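
Several of the kernels named in this record are one-liners to apply with standard tools. The sketch below runs a small bank (Sobel, Laplacian, averaging) plus LoG and DoG on stand-in image data; it makes no claim about the article's actual kernel sizes or parameters.

```python
import numpy as np
from scipy.ndimage import convolve, gaussian_filter

img = np.random.rand(64, 64)   # stand-in grayscale fingerprint image

kernels = {
    "sobel_x": np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], float),
    "laplace": np.array([[0, 1, 0], [1, -4, 1], [0, 1, 0]], float),
    "average": np.full((3, 3), 1.0 / 9.0),
}

responses = {name: convolve(img, k, mode="nearest") for name, k in kernels.items()}
log = convolve(gaussian_filter(img, sigma=1.0), kernels["laplace"], mode="nearest")  # LoG
dog = gaussian_filter(img, 1.0) - gaussian_filter(img, 2.0)                          # DoG
print({n: r.shape for n, r in responses.items()}, log.shape, dog.shape)
```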

  2. Efficient Development and Verification of Safe Railway Control Software

    OpenAIRE

    Haxthausen, Anne Elisabeth; Peleska, Jan

    2013-01-01

    In this book, the authors present current research on the types, design and safety issues of railways. Topics discussed include the acoustic characteristics of noise in train stations; monitoring railway structure conditions and opportunities to use wireless sensor networks as tools to improve the monitoring process; hydraulic absorbers as dampers to dissipate the energy of oscillations in railway electric equipment; development of train fare calculation and adjustment systems using VDM++; ef...

  3. EVA Development and Verification Testing at NASA's Neutral Buoyancy Laboratory

    Science.gov (United States)

    Jairala, Juniper C.; Durkin, Robert; Marak, Ralph J.; Sipila, Stepahnie A.; Ney, Zane A.; Parazynski, Scott E.; Thomason, Arthur H.

    2012-01-01

    As an early step in the preparation for future Extravehicular Activities (EVAs), astronauts perform neutral buoyancy testing to develop and verify EVA hardware and operations. Neutral buoyancy demonstrations at NASA Johnson Space Center's Sonny Carter Training Facility to date have primarily evaluated assembly and maintenance tasks associated with several elements of the International Space Station (ISS). With the retirement of the Shuttle, completion of ISS assembly, and introduction of commercial players for human transportation to space, evaluations at the Neutral Buoyancy Laboratory (NBL) will take on a new focus. Test objectives are selected for their criticality, lack of previous testing, or design changes that justify retesting. Assembly tasks investigated are performed using procedures developed by the flight hardware providers and the Mission Operations Directorate (MOD). Orbital Replacement Unit (ORU) maintenance tasks are performed using a more systematic set of procedures, EVA Concept of Operations for the International Space Station (JSC-33408), also developed by the MOD. This paper describes the requirements and process for performing a neutral buoyancy test, including typical hardware and support equipment requirements, personnel and administrative resource requirements, examples of ISS systems and operations that are evaluated, and typical operational objectives that are evaluated.

  4. Verification and Validation in a Rapid Software Development Process

    Science.gov (United States)

    Callahan, John R.; Easterbrook, Steve M.

    1997-01-01

The high cost of software production is driving development organizations to adopt more automated design and analysis methods such as rapid prototyping, computer-aided software engineering (CASE) tools, and high-level code generators. Even developers of safety-critical software systems have adopted many of these new methods while striving to achieve high levels of quality and reliability. While these new methods may enhance productivity and quality in many cases, we examine some of the risks involved in the use of new methods in safety-critical contexts. We examine a case study involving the use of a CASE tool that automatically generates code from high-level system designs. We show that while high-level testing on the system structure is highly desirable, significant risks exist in the automatically generated code and in re-validating releases of the generated code after subsequent design changes. We identify these risks and suggest process improvements that retain the advantages of rapid, automated development methods within the quality and reliability contexts of safety-critical projects.

  5. Development and Verification of Smoothed Particle Hydrodynamics Code for Analysis of Tsunami near NPP

    Energy Technology Data Exchange (ETDEWEB)

    Jo, Young Beom; Kim, Eung Soo [Seoul National Univ., Seoul (Korea, Republic of)

    2014-10-15

Tsunami analysis becomes more complicated when considering the shape and phase of the ground below the seawater, so different approaches are required to precisely analyze the behavior of a tsunami. This paper introduces on-going code development activities at SNU based on an unconventional mesh-free fluid analysis method called Smoothed Particle Hydrodynamics (SPH) and its verification work with some practice simulations. The paper summarizes the on-going development and verification activities on the Lagrangian mesh-free SPH code in SNU. The newly developed code covers the equations of motion and the heat conduction equation so far, and verification of each model is completed. In addition, parallel computation using GPU is now possible, and a GUI is also prepared. If users change the input geometry or input values, they can simulate various conditions and geometries. The SPH method has large advantages and potential for modeling free surfaces, highly deformable geometry and multi-phase problems that traditional grid-based codes have difficulty analyzing. Therefore, by incorporating more complex physical models such as turbulent flow, phase change, two-phase flow, and even solid mechanics, the application of the current SPH code is expected to be much more extended, including molten fuel behaviors in severe accidents.
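
The two ingredients every SPH code shares are a smoothing kernel and summation density. A 1D cubic-spline illustration of both (not the SNU code itself):

```python
import numpy as np

def w_cubic(r, h):
    """1D cubic-spline (M4) SPH kernel, normalized so it integrates to 1."""
    q = np.abs(r) / h
    sigma = 2.0 / (3.0 * h)                  # 1D normalization constant
    return sigma * np.where(q < 1, 1 - 1.5*q**2 + 0.75*q**3,
                   np.where(q < 2, 0.25 * (2 - q)**3, 0.0))

x = np.linspace(0.0, 1.0, 51)                # particle positions
m, h = 1.0 / 50, 0.04                        # particle mass, smoothing length
rho = np.array([np.sum(m * w_cubic(x - xi, h)) for xi in x])  # summation density
print(rho[25])                               # ~1.0 in the interior, as expected
```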

  6. Development and Verification of 3000Rpm 48Inch Integral Shroud Blade for Steam Turbine

    Science.gov (United States)

    Kaneko, Yasutomo; Mori, Kazushi; Ohyama, Hiroharu

The 3000rpm 48inch blade for steam turbine was developed as one of the new standard series of LP end blades. The new LP end blades are characterized by the ISB (Integral Shroud Blade) structure. In the ISB structure, blades are continuously coupled by blade untwist due to centrifugal force when the blades rotate at high speed. Therefore, the number of resonant vibration modes can be reduced by virtue of the vibration characteristics of the circumferentially continuous blades, and the resonant stress can be decreased due to the additional friction damping generated at shrouds and stubs. In order to develop the 3000rpm 48inch blade, the latest analysis methods to predict the vibration characteristics of the ISB structure were applied to the blade design, after confirming their validity. Moreover, verification tests such as rotational vibration tests and model turbine tests were carried out in the shop to confirm the reliability of the developed blade. As the final verification test, a field test of the actual steam turbine was carried out at the site during trial operation, and the vibration stress of the 3000rpm 48inch blade was measured by use of a telemetry system. In the field test, the vibratory stress of the blade was measured under various operating conditions for more than one month. This paper first presents the up-to-date design technology applied to the design of the 3000rpm 48inch blade. In the second place, the results of the various verification tests carried out in the shop are presented, as well as their procedure. Lastly, the results of the final verification tests of the 3000rpm 48inch blade carried out at the site are presented.

  7. Performance verification of Surface Mapping Instrument developed at CGM

    DEFF Research Database (Denmark)

    Bariani, Paolo

The paper addresses the description of the stitching procedure, its validation, and a more comprehensive metrological evaluation of the AFM-CMM instrument performance. Experimental validation of the method was performed by the use of... Part of the development of the instrument was the development of stitching software. Successful stitching of AFM scans is demonstrated in this report. Single data files in the millimetre range can be obtained, which are entirely based on AFM probing. High definition of nanostructures can therefore be combined with a measuring range covering applications in micro-technology and in surface metrology.
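
One common basis for stitching overlapping scans is phase correlation: normalize the cross-power spectrum of the two images and locate its peak to recover the lateral offset. The sketch below runs on synthetic data and is an illustration of the general technique, not the CGM stitching software.

```python
import numpy as np

def phase_offset(a, b):
    """Estimate the (dy, dx) shift of image a relative to image b."""
    fa, fb = np.fft.fft2(a), np.fft.fft2(b)
    cross = fa * np.conj(fb)
    corr = np.fft.ifft2(cross / (np.abs(cross) + 1e-12)).real
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    # wrap offsets larger than half the image into negative shifts
    if dy > a.shape[0] // 2: dy -= a.shape[0]
    if dx > a.shape[1] // 2: dx -= a.shape[1]
    return dy, dx

scan = np.random.rand(128, 128)
shifted = np.roll(scan, (7, -12), axis=(0, 1))   # simulated neighbouring scan
print(phase_offset(shifted, scan))               # -> (7, -12)
```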

  8. Spaceport Command and Control System Automated Verification Software Development

    Science.gov (United States)

    Backus, Michael W.

    2017-01-01

For as long as we have walked the Earth, humans have always been explorers. We have visited our nearest celestial body and sent Voyager 1 beyond our solar system out into interstellar space. Now it is finally time for us to step beyond our home and onto another planet. The Spaceport Command and Control System (SCCS) is being developed along with the Space Launch System (SLS) to take us on a journey further than ever attempted. Within SCCS are separate subsystems and system-level software, each of which has to be tested and verified. Testing is a long and tedious process, so automating it is much more efficient and also helps to remove the possibility of human error from mission operations. I was part of a team of interns and full-time engineers who automated tests for the requirements on SCCS, and with that was able to help verify that the software systems are performing as expected.

  9. Development and Implementation of Radiation-Hydrodynamics Verification Test Problems

    Energy Technology Data Exchange (ETDEWEB)

Marcath, Matthew J. [Los Alamos National Laboratory]; Wang, Matthew Y. [Los Alamos National Laboratory]; Ramsey, Scott D. [Los Alamos National Laboratory]

    2012-08-22

Analytic solutions to the radiation-hydrodynamic equations are useful for verifying any large-scale numerical simulation software that solves the same set of equations. The one-dimensional, spherically symmetric Coggeshall No. 9 and No. 11 analytic solutions, cell-averaged over a uniform grid, have been developed to analyze the corresponding solutions from the Los Alamos National Laboratory Eulerian Applications Project radiation-hydrodynamics code xRAGE. These Coggeshall solutions have been shown to be independent of heat conduction, providing a unique opportunity for comparison with xRAGE solutions with and without the heat conduction module. Solution convergence was analyzed based on radial step size. Since no shocks are involved in either problem and the solutions are smooth, second-order convergence was expected for both cases. The global L1 errors were used to estimate the convergence rates with and without the heat conduction module implemented.
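
The convergence analysis mentioned here reduces to the standard observed-order formula, p = ln(E_coarse/E_fine) / ln(h_coarse/h_fine), applied to the global L1 errors at two grid resolutions. The error values below are made up to illustrate second-order behaviour; they are not the xRAGE results.

```python
import math

def observed_order(e_coarse, e_fine, h_coarse, h_fine):
    """Observed order of accuracy from errors on two grids."""
    return math.log(e_coarse / e_fine) / math.log(h_coarse / h_fine)

print(observed_order(e_coarse=4.1e-3, e_fine=1.0e-3, h_coarse=0.02, h_fine=0.01))
# ~2.04, consistent with the expected second-order convergence for smooth solutions
```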

  10. Bringing Automated Formal Verification to PLC Program Development

    CERN Document Server

    Fernández Adiego, Borja; Blanco Viñuela, Enrique

Automation is the field of engineering that deals with the development of control systems for operating systems such as industrial processes, railways, machinery or aircraft without human intervention. In most cases, a failure in these control systems can cause a disaster in terms of economic losses, environmental damage or human losses. For that reason, providing safe, reliable and robust control systems is a first-priority goal for control engineers. Ideally, control engineers should be able to guarantee that both software and hardware fulfill the design requirements. This is an enormous challenge on which industry and academia have been working and making progress in the last decades. This thesis focuses on one particular type of control system that operates industrial processes: PLC (Programmable Logic Controller) based control systems. Moreover, it targets one of the main challenges for these systems: guaranteeing that PLC programs are compliant with their specifications. Traditionally ...

  11. The IXV guidance, navigation and control subsystem: Development, verification and performances

    Science.gov (United States)

    Marco, Victor; Contreras, Rafael; Sanchez, Raul; Rodriguez, Guillermo; Serrano, Daniel; Kerr, Murray; Fernandez, Vicente; Haya-Ramos, Rodrigo; Peñin, Luis F.; Ospina, Jose A.; De Zaiacomo, Gabriale; Bejar-Romero, Juan Antonio; Yague, Ricardo; Zaccagnino, Elio; Preaud, Jean-Philippe

    2016-07-01

The Intermediate eXperimental Vehicle (IXV) [1] is an ESA re-entry lifting body demonstrator built to verify in-flight the performance of critical re-entry technologies. The IXV was launched on February the 11th, 2015, aboard Europe's Vega launcher. The IXV's flight and successful recovery represent a major step forward with respect to previous European re-entry experience with the Atmospheric Re-entry Demonstrator (ARD) [2], flown in October 1998. The increased in-flight manoeuvrability achieved from the lifting body solution permitted the verification of technologies over a wider re-entry corridor. Among other objectives, which included the characterisation of the re-entry environment through a variety of sensors, special attention was paid to Guidance, Navigation and Control (GNC) aspects, including the guidance algorithms for the lifting body, the use of the inertial measurement unit measurements with GPS updates for navigation, and the flight control by means of aerodynamic flaps and reaction control thrusters. This paper presents the overall Design, Development and Verification logic that has been successfully followed by the GNC and Flight Management (FM) subsystem of the IXV. It also focuses on the interactions between the GNC and the System, Avionics and OBSW development lifecycles and how an integrated and incremental verification process has been implemented by ensuring the maximum representativeness and reuse through all stages.

  12. Development and optimization of FJP tools and their practical verification

    Science.gov (United States)

    Messelink, Wilhelmus A. C. M.; Waeger, Reto; Meeder, Mark; Looser, Herbert; Wons, Torsten; Heiniger, Kurt C.; Faehnle, Oliver W.

    2005-09-01

    This article presents the recent achievements with Jules Verne, a sub-aperture polishing technique closely related to Fluid Jet Polishing [1]. Whereas FJP typically applies a nozzle stand-off distance of millimeters to centimeters, JV uses a stand-off distance down to 50 μm. The objective is to generate a non-directional fluid flow parallel to the surface, which is specifically suited to reduce the surface roughness [2, 3]. Different characteristic Jules Verne nozzle geometries have been designed and numerically simulated using Computational Fluid Dynamics (CFD). To verify these simulations, the flow of fluid and particles of these nozzles has been visualized in a measurement setup developed specifically for this purpose. A simplified JV nozzle geometry is positioned in a measurement setup and the gap between tool and surface has been observed by an ICCD camera. In order to be able to visualize the motion of the abrasives, the particles have been coated with fluorescence. Furthermore, these nozzles have been manufactured and tested in a practical environment using a modified polishing machine. The results of these laboratory and practical tests are presented and discussed, demonstrating that the CFD simulations are in good agreement with the experiments. It was possible to qualitatively predict the material removal on the processed glass surface, due to the implementation of appropriate erosion models [4, 5] in the CFD software.

  13. Development and Validation of Search Filters to Identify Articles on Family Medicine in Online Medical Databases.

    Science.gov (United States)

    Pols, David H J; Bramer, Wichor M; Bindels, Patrick J E; van de Laar, Floris A; Bohnen, Arthur M

    2015-01-01

    Physicians and researchers in the field of family medicine often need to find relevant articles in online medical databases for a variety of reasons. Because a search filter may help improve the efficiency and quality of such searches, we aimed to develop and validate search filters to identify research studies of relevance to family medicine. Using a new and objective method for search filter development, we developed and validated 2 search filters for family medicine. The sensitive filter had a sensitivity of 96.8% and a specificity of 74.9%. The specific filter had a specificity of 97.4% and a sensitivity of 90.3%. Our new filters should aid literature searches in the family medicine field. The sensitive filter may help researchers conducting systematic reviews, whereas the specific filter may help family physicians find answers to clinical questions at the point of care when time is limited.

  14. Development and verification of a dynamic underbalanced drilling simulator

    Energy Technology Data Exchange (ETDEWEB)

Wang, Z.; Vefring, E.H.; Rommetveit, R. [RF-Rogaland Research, Bergen (Norway)]; Bieseman, T. [Shell RTS, Rijswijk (Netherlands)]; Maglione, R. [Agip Spa, Milano (Italy)]; Lage, A.C.; Nakagawa, E. [Petrobras/CENPES, Rio de Janeiro (Brazil)]

    1997-07-01

A dynamic underbalanced drilling (UBD) simulator has been developed in a joint industry project. The simulator incorporates models for multiphase flow, well-reservoir interaction, gas/oil solubility and gas injection systems. The fluid components in the system include injected gases, mud, produced gas, produced oil and water, and drilled cuttings. Both coiled tubing and conventional jointed pipe can be simulated. The primary use of the simulator is in the planning phase of a UBD operation. A UBD operation is very dynamic due to the changes in flow conditions and other operations. The importance of the dynamic effects is illustrated by a field example. The dynamic simulator allows for the analysis of various operations that cannot be analyzed with a steady state simulator. Some of these operations include starting/stopping circulation; various gas injection techniques, e.g.: parasitic string, parasitic casing, through completion, and drill string injection; and drilling operations: drilling, tripping, pipe connections, and BHA deployment. To verify the simulator, two-phase flow tests in a near-horizontal annulus were performed in order to provide data for validation. Field data are actively collected for this purpose. In this paper, two field cases are presented. One is a coiled tubing drilling operation in the Dalen field in the Netherlands where a nitrogen lift test was performed in a through-completion configuration. The second case is a UBD operation in the Candeias field in Brazil. In this case, drillstring gas injection tests were performed in a cemented 9-5/8-in. casing at 1,800 m.

  15. Vacuum assisted resin transfer molding (VARTM): Model development and verification

    Science.gov (United States)

    Song, Xiaolan

    2003-06-01

In this investigation, a comprehensive Vacuum Assisted Resin Transfer Molding (VARTM) process simulation model was developed and verified. The model incorporates resin flow through the preform, compaction and relaxation of the preform, and viscosity and cure kinetics of the resin. The computer model can be used to analyze the resin flow details, track the thickness change of the preform, predict the total infiltration time and final fiber volume fraction of the parts, and determine whether the resin could completely infiltrate and uniformly wet out the preform. Flow of resin through the preform is modeled as flow through porous media. Darcy's law combined with the continuity equation for an incompressible Newtonian fluid forms the basis of the flow model. During the infiltration process, it is well accepted that the total pressure is shared by the resin pressure and the pressure supported by the fiber network. With the progression of the resin, the net pressure applied to the preform decreases as a result of increasing local resin pressure. This leads to the springback of the preform, and is called the springback mechanism. On the other hand, the lubrication effect of the resin causes the rearrangement of the fiber network and an increase in the preform compaction. This is called the wetting compaction mechanism. The thickness change of the preform is determined by the relative magnitude of the springback and wetting deformation mechanisms. In the compaction model, the transverse equilibrium equation is used to calculate the net compaction pressure applied to the preform, and the compaction test results are fitted to give the compressive constitutive law of the preform. The Finite Element/Control Volume (FE/CV) method is adopted to find the flow front location and the fluid pressure. The code features simultaneous integration of 1-D, 2-D and 3-D element types in a single simulation, and thus enables efficient modeling of the flow in complex mold...
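
In its simplest 1D, constant-pressure form, the Darcy flow model above integrates to a square-root front law, x_f(t) = sqrt(2·K·dP·t / (phi·mu)), which gives a closed-form fill time. A back-of-the-envelope sketch with illustrative material properties, not the dissertation's full FE/CV model:

```python
import math

# Time to fill a part of length Lp under constant vacuum pressure dP:
#   t_fill = phi * mu * Lp**2 / (2 * K * dP)
phi, mu = 0.5, 0.2             # preform porosity, resin viscosity (Pa.s)
K, dP, Lp = 1e-10, 9.0e4, 0.5  # permeability (m^2), driving pressure (Pa), length (m)

t_fill = phi * mu * Lp**2 / (2 * K * dP)
print("fill time: %.0f s (%.1f min)" % (t_fill, t_fill / 60))   # ~1389 s (~23 min)
```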

  16. Design of the software development and verification system (SWDVS) for shuttle NASA study task 35

    Science.gov (United States)

    Drane, L. W.; Mccoy, B. J.; Silver, L. W.

    1973-01-01

    An overview of the Software Development and Verification System (SWDVS) for the space shuttle is presented. The design considerations, goals, assumptions, and major features of the design are examined. A scenario that shows three persons involved in flight software development using the SWDVS in response to a program change request is developed. The SWDVS is described from the standpoint of different groups of people with different responsibilities in the shuttle program to show the functional requirements that influenced the SWDVS design. The software elements of the SWDVS that satisfy the requirements of the different groups are identified.

  17. Development and verification test of integral reactor major components - Development of MCP impeller design, performance prediction code and experimental verification

    Energy Technology Data Exchange (ETDEWEB)

    Chung, Myung Kyoon; Oh, Woo Hyoung; Song, Jae Wook [Korea Advanced Institute of Science and Technology, Taejon (Korea)

    1999-03-01

    The present study is aimed at developing a computational code for the design and performance prediction of an axial-flow pump. The proposed performance prediction method, based on the streamline curvature method, is tested against a model axial-flow pump. The preliminary design is made by using the ideal velocity triangles at inlet and exit, and the three-dimensional blade shape is calculated by employing the free vortex design method. The detailed blading design is then carried out by using an experimental database of double-circular-arc cambered hydrofoils. To computationally determine the design incidence, deviation, blade camber, solidity and stagger angle, a number of correlation equations are developed from the experimental database and a theoretical formula for the lift coefficient is adopted. A total of 8 equations are solved iteratively using an under-relaxation factor. An experimental measurement is conducted under a non-cavitating condition to obtain the off-design performance curve, and a cavitation test is also carried out by reducing the suction pressure. The experimental results compare very satisfactorily with the predictions of the streamline curvature method. 28 refs., 26 figs., 11 tabs. (Author)
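
    The coupled design iteration described above can be illustrated with a minimal sketch. The correlation functions and starting values below are hypothetical placeholders, not the equations fitted to the hydrofoil database in the report:

```python
# Minimal sketch of a fixed-point iteration with under-relaxation, in the
# spirit of the 8-equation blading design loop described above. The
# correlations f_incidence/f_deviation are hypothetical placeholders.

def f_incidence(stagger):      # hypothetical incidence correlation
    return 2.0 + 0.05 * stagger

def f_deviation(camber):       # hypothetical deviation correlation
    return 0.25 * camber

def solve_blading(alpha=0.5, tol=1e-8, max_iter=200):
    incidence, deviation, camber, stagger = 0.0, 0.0, 20.0, 30.0
    for _ in range(max_iter):
        # evaluate the mutually coupled relations with the current values
        new_incidence = f_incidence(stagger)
        new_deviation = f_deviation(camber)
        new_camber = 20.0 + new_deviation - new_incidence
        new_stagger = 30.0 - 0.5 * new_camber
        # under-relax each update (alpha < 1) to stabilise the iteration
        updates = (new_incidence, new_deviation, new_camber, new_stagger)
        olds = (incidence, deviation, camber, stagger)
        relaxed = [o + alpha * (n - o) for o, n in zip(olds, updates)]
        if max(abs(n - o) for o, n in zip(olds, relaxed)) < tol:
            return relaxed
        incidence, deviation, camber, stagger = relaxed
    raise RuntimeError("blading iteration did not converge")

print(solve_blading())
```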

  18. Working memory filtering continues to develop into late adolescence.

    Science.gov (United States)

    Peverill, Matthew; McLaughlin, Katie A; Finn, Amy S; Sheridan, Margaret A

    2016-04-01

    While most measures of working memory (WM) performance have been shown to plateau by mid-adolescence and developmental changes in fronto-parietal regions supporting WM encoding and maintenance have been well characterized, little is known about developmental variation in WM filtering. We investigated the possibility that the neural underpinnings of filtering in WM reach maturity later in life than WM function without filtering. Using a cued WM filtering task (McNab and Klingberg, 2008), we investigated neural activity during WM filtering in a sample of 64 adults and adolescents. Regardless of age, increases in WM activity with load were concentrated in the expected fronto-parietal network. For adults, but not adolescents, recruitment of the basal ganglia during presentation of a filtering cue was associated with neural and behavioral indices of successful filtering, suggesting that WM filtering and related basal ganglia function may still be maturing throughout adolescence and into adulthood.

  19. Development of Drabkin energy filters for J-PARC project

    CERN Document Server

    Yamazaki, D; Soyama, K; Tasaki, S

    2003-01-01

    In the J-PARC project, a high-intensity spallation neutron source has been developed. A very intensive pulsed neutron beam will be available from a coupled moderator installed at the spallation source. The wavelength of a neutron is generally determined by its time-of-flight (TOF) from the source to the detector, but the available precision is limited by the non-zero emission time-width of the moderator system. It follows that high-precision experiments cannot be performed with the intensive pulsed neutrons from the coupled moderator. We have been developing Drabkin energy filters, which effectively reduce the emission time-width by spatial neutron spin resonance. In this paper, we first describe the physics of the Drabkin spin flipper, which is the main part of the Drabkin energy filter, and derive the spin-flip probability of the flipper in a quantum-mechanical manner. Secondly, the properties of the resonance spin flipping are described. Thirdly, the sweep mode for application to pulsed neutrons is ...

  20. Ice classification algorithm development and verification for the Alaska SAR Facility using aircraft imagery

    Science.gov (United States)

    Holt, Benjamin; Kwok, Ronald; Rignot, Eric

    1989-01-01

    The Alaska SAR Facility (ASF) at the University of Alaska, Fairbanks is a NASA program designed to receive, process, and archive SAR data from ERS-1 and to support investigations that will use this regional data. As part of ASF, specialized subsystems and algorithms to produce certain geophysical products from the SAR data are under development. Of particular interest are ice motion, ice classification, and ice concentration. This work focuses on the algorithm under development for ice classification, and the verification of the algorithm using C-band aircraft SAR imagery recently acquired over the Alaskan arctic.

  1. Development, Verification and Validation of Enclosure Radiation Capabilities in the CHarring Ablator Response (CHAR) Code

    Science.gov (United States)

    Salazar, Giovanni; Droba, Justin C.; Oliver, Brandon; Amar, Adam J.

    2016-01-01

    With the recent development of multi-dimensional thermal protection system (TPS) material response codes, the capability to account for radiative heating has become a requirement. This paper presents the recent efforts to implement such capabilities in the CHarring Ablator Response (CHAR) code developed at NASA's Johnson Space Center. This work also describes the different numerical methods implemented in the code to compute view factors for radiation problems involving multiple surfaces. Furthermore, verification and validation of the code's radiation capabilities are demonstrated by comparing solutions to analytical results, to other codes, and to radiant test data.
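
    As an illustration of one common numerical approach to the view-factor computation mentioned above (not necessarily the method implemented in CHAR), a Monte Carlo estimate for two parallel, coaxial unit squares:

```python
import math, random

# Monte Carlo estimate of the view factor F_1->2 between two parallel,
# coaxial unit squares separated by distance h: sample point pairs on the
# two surfaces and average the kernel cos(t1)cos(t2)/(pi r^2) times A2.
def view_factor_mc(h=1.0, samples=200_000, seed=0):
    rng = random.Random(seed)
    total = 0.0
    for _ in range(samples):
        x1, y1 = rng.random(), rng.random()   # point on surface 1
        x2, y2 = rng.random(), rng.random()   # point on surface 2
        dx, dy = x2 - x1, y2 - y1
        r2 = dx * dx + dy * dy + h * h
        # both normals are along z, so cos(t1) = cos(t2) = h / r
        total += (h * h) / (math.pi * r2 * r2)
    area2 = 1.0                               # area of surface 2
    return total / samples * area2

print(view_factor_mc())   # ~0.1998 for unit squares at h = 1 (analytic value)
```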

  2. Development of Real Time Implementation of 5/5 Rule based Fuzzy Logic Controller Shunt Active Power Filter for Power Quality Improvement

    Science.gov (United States)

    Puhan, Pratap Sekhar; Ray, Pravat Kumar; Panda, Gayadhar

    2016-12-01

    This paper presents the effectiveness of 5/5 fuzzy rule implementation in a Fuzzy Logic Controller, in conjunction with an indirect control technique, to enhance the power quality in a single-phase system. An indirect current controller in conjunction with the Fuzzy Logic Controller is applied to the proposed shunt active power filter to estimate the peak reference current and capacitor voltage. Current-controller-based pulse width modulation (CCPWM) is used to generate the switching signals of the voltage source inverter. Various simulation results are presented to verify the good behaviour of the shunt active power filter (SAPF) with the proposed two-level hysteresis current controller (HCC). For verification of the shunt active power filter in real time, the proposed control algorithm has been implemented in a laboratory setup developed on the dSPACE platform.
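
    The two-level hysteresis current controller named above admits a compact illustration. The plant, band width, and reference below are illustrative assumptions, not values from the paper:

```python
import math

# Minimal sketch of a two-level hysteresis current controller: the inverter
# switch state flips only when the current error leaves the hysteresis band.
def hysteresis_step(i_ref, i_meas, state, band=0.5):
    err = i_ref - i_meas
    if err > band:        # current too low  -> switch to +Vdc
        return +1
    if err < -band:       # current too high -> switch to -Vdc
        return -1
    return state          # inside the band  -> keep the previous state

# toy simulation: an R-L load tracking a 50 Hz sinusoidal reference
R, L, Vdc, dt = 1.0, 5e-3, 100.0, 1e-5
i, state = 0.0, +1
for n in range(2000):
    t = n * dt
    i_ref = 10.0 * math.sin(2 * math.pi * 50 * t)
    state = hysteresis_step(i_ref, i, state)
    v = state * Vdc
    i += dt * (v - R * i) / L       # explicit Euler update of di/dt
print(f"final current {i:.2f} A, tracking the reference within the band")
```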

  3. Autonomic networking-on-chip bio-inspired specification, development, and verification

    CERN Document Server

    Cong-Vinh, Phan

    2011-01-01

    Despite the growing mainstream importance and unique advantages of autonomic networking-on-chip (ANoC) technology, Autonomic Networking-On-Chip: Bio-Inspired Specification, Development, and Verification is among the first books to evaluate research results on formalizing this emerging NoC paradigm, which was inspired by the human nervous system. The FIRST Book to Assess Research Results, Opportunities, & Trends in "BioChipNets" The third book in the Embedded Multi-Core Systems series from CRC Press, this is an advanced technical guide and reference composed of contributions from prominent researchers in industry and academia around the world.

  5. Development and preliminary verification of the 3D core neutronic code: COCO

    Energy Technology Data Exchange (ETDEWEB)

    Lu, H.; Mo, K.; Li, W.; Bai, N.; Li, J. [Reactor Design and Fuel Management Research Center, China Nuclear Power Technology Research Inst., 47F/A Jiangsu Bldg., Yitian Road, Futian District, Shenzhen (China)

    2012-07-01

    Amid the recent booming economic growth and the environmental concerns that follow it, China is proactively pushing forward nuclear power development and encouraging the tapping of clean energy. Under this situation, CGNPC, as one of the largest energy enterprises in China, is planning to develop its own nuclear-related technology in order to support the growing number of nuclear plants either under construction or in operation. This paper introduces the recent progress in software development for CGNPC. The focus is placed on the physical models and preliminary verification results from the recent development of the 3D core neutronic code COCO. In the COCO code, the non-linear Green's function method is employed to calculate the neutron flux. In order to use the discontinuity factor, the Neumann (second kind) boundary condition is utilized in the Green's function nodal method. Additionally, the COCO code also includes the necessary physical models, e.g. a single-channel thermal-hydraulic module, burnup module, pin power reconstruction module and cross-section interpolation module. The preliminary verification result shows that the COCO code is sufficient for reactor core design and analysis for pressurized water reactors (PWR). (authors)

  6. DEVELOPMENT OF OPTIMAL FILTERS OBTAINED THROUGH CONVOLUTION METHODS, USED FOR FINGERPRINT IMAGE ENHANCEMENT AND RESTORATION

    OpenAIRE

    Cătălin LUPU

    2014-01-01

    This article presents the development of optimal filters through convolution methods, necessary for restoring, correcting and improving fingerprints acquired from a sensor, capable of providing an image as close to ideal as possible at the output. After the image is binarized and equalized, the Canny filter is applied in order to eliminate noise (filtering the image with a Gaussian filter), perform non-maxima suppression, apply adaptive binarization of the gradient magnitude, and extend edge points by hysteresis. The resulting i...
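
    A minimal sketch of the enhancement chain described above, using OpenCV; the file name and threshold values are assumptions for illustration:

```python
import cv2

# Sketch of the described pipeline: equalization, Gaussian noise filtering,
# then Canny edge detection (which internally performs gradient computation,
# non-maxima suppression, and edge extension by hysteresis between the
# two thresholds). File name and parameters are illustrative.
img = cv2.imread("fingerprint.png", cv2.IMREAD_GRAYSCALE)
equalized = cv2.equalizeHist(img)                    # histogram equalization
blurred = cv2.GaussianBlur(equalized, (5, 5), 1.0)   # Gaussian noise filtering
edges = cv2.Canny(blurred, threshold1=50, threshold2=150)
cv2.imwrite("fingerprint_edges.png", edges)
```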

  8. Development and applications of an interactive digital filter design program.

    Science.gov (United States)

    Woo, H W; Kim, Y M; Tompkins, W J

    1985-10-01

    We have implemented an interactive digital filter design program in the HP 1000 computer at the Department of Electrical Engineering of the University of Washington. This program allows users to design different types of filters interactively with both amplitude and phase responses displayed on graphic devices. The performance of each designed filter can be evaluated conveniently before the best one is chosen and implemented for any particular application. This program can design recursive filters, e.g. Butterworth, Chebyshev and elliptic, or nonrecursive filters with one out of six different windows, i.e. rectangular, triangular, Hann, Hamming, Blackman and Kaiser. The main outputs from this program are coefficients of a transfer function of an analog filter, a digital filter, or both. Therefore, the design of both analog and digital filters is facilitated by using this program. The program is very simple to use and does not require background in analog or digital filter principles in order to run it. The program is written in standard FORTRAN and is about 30 kbytes in size excluding the graphics display routines. Since it uses standard FORTRAN, it can be easily transported to minicomputer and microcomputer systems that have a FORTRAN compiler and minimal graphics capabilities. This program is available for distribution to interested institutions and laboratories.
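
    A present-day equivalent of such a design session can be sketched in a few lines; this uses SciPy rather than the original FORTRAN program, and the sampling rate and cutoff are illustrative:

```python
import numpy as np
from scipy import signal

# Design a recursive (Butterworth) low-pass filter and inspect its
# amplitude response, mirroring the evaluate-before-implementing workflow
# described above. Sampling rate and cutoff are illustrative.
fs = 500.0                                                # Hz
b, a = signal.butter(N=4, Wn=40.0, btype="low", fs=fs)    # transfer-function coefficients
w, h = signal.freqz(b, a, worN=1024, fs=fs)               # frequency response
print("coefficients b:", np.round(b, 5))
print("coefficients a:", np.round(a, 5))
print("attenuation at 100 Hz: %.1f dB"
      % (20 * np.log10(abs(h[np.argmin(np.abs(w - 100.0))]))))
```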

  9. DEVELOPMENT OF SEPIOLITE TYPE FILTER TIPS OF CIGARETTE

    Institute of Scientific and Technical Information of China (English)

    2000-01-01

    The activation conditions of sepiolite are studied by the specific-surface determination method. Sepiolite is used for the first time in processing cigarette filter tips of the acetate-fibre and paper types. The tar of cigarettes with sepiolite filter tips is reduced to a lower level. The mechanism by which sepiolite lowers the tar content is analysed.

  10. Development and validation of MCNPX-based Monte Carlo treatment plan verification system.

    Science.gov (United States)

    Jabbari, Iraj; Monadi, Shahram

    2015-01-01

    A Monte Carlo treatment plan verification (MCTPV) system was developed for clinical treatment plan verification (TPV), especially for conformal and intensity-modulated radiotherapy (IMRT) plans. In the MCTPV, the MCNPX code was used for particle transport through the accelerator head and the patient body. MCTPV has an interface with the TiGRT planning system and reads the information needed for the Monte Carlo calculation, transferred in digital imaging and communications in medicine-radiation therapy (DICOM-RT) format. In MCTPV several methods were applied in order to reduce the simulation time. The relative dose distribution of a clinical prostate conformal plan calculated by the MCTPV was compared with that of the TiGRT planning system. The results showed that the beam configuration and patient information were implemented correctly in this system. For quantitative evaluation of MCTPV, a two-dimensional (2D) diode array (MapCHECK2) and gamma index analysis were used. The gamma passing rate (3%/3 mm) of an IMRT plan was found to be 98.5% for all beams. Also, comparison of the measured and Monte Carlo calculated doses at several points inside an inhomogeneous phantom for 6- and 18-MV photon beams showed a good agreement (within 1.5%). The accuracy and timing results of MCTPV showed that MCTPV could be used very efficiently for additional assessment of complicated plans such as IMRT plans.
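
    The gamma index analysis mentioned above can be illustrated in one dimension. A minimal sketch of the 3%/3 mm criterion with global normalization (not the MapCHECK2 implementation):

```python
import numpy as np

# Minimal 1-D gamma index (3%/3 mm): for each measured point, search the
# calculated distribution for the minimum combined dose/distance metric.
def gamma_1d(x, dose_meas, dose_calc, dd=0.03, dta=3.0):
    gammas = []
    d_ref = dose_meas.max()                       # global normalization dose
    for xi, dm in zip(x, dose_meas):
        dist = (x - xi) / dta                     # distance term, units of DTA
        ddiff = (dose_calc - dm) / (dd * d_ref)   # dose term, units of DD
        gammas.append(np.sqrt(dist**2 + ddiff**2).min())
    return np.array(gammas)

x = np.linspace(0.0, 100.0, 201)                  # positions in mm
measured = np.exp(-((x - 50.0) / 20.0) ** 2)      # toy dose profiles
calculated = np.exp(-((x - 50.5) / 20.0) ** 2)
g = gamma_1d(x, measured, calculated)
print("gamma passing rate: %.1f%%" % (100.0 * np.mean(g <= 1.0)))
```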

  12. Online fingerprint verification.

    Science.gov (United States)

    Upendra, K; Singh, S; Kumar, V; Verma, H K

    2007-01-01

    As organizations search for more secure authentication methods for user access, e-commerce, and other security applications, biometrics is gaining increasing attention. With an increasing emphasis on emerging automatic personal identification applications, fingerprint-based identification is becoming more popular. The most widely used fingerprint representation is the minutiae-based representation. The main drawback of this representation is that it does not utilize a significant component of the rich discriminatory information available in fingerprints. Local ridge structures cannot be completely characterized by minutiae. Also, it is difficult to quickly match two fingerprint images containing different numbers of unregistered minutiae points. In this study a filter-bank-based representation, which eliminates these weaknesses, is implemented and the overall performance of the developed system is tested. The results have shown that this system can be used effectively for secure online verification applications.
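
    A minimal sketch of a filter-bank representation in the spirit described above; the Gabor kernel parameters and the energy feature are illustrative, not those of the cited system:

```python
import numpy as np

# Convolve an image with Gabor kernels at several orientations and keep the
# response energies as a fixed-length feature vector.
def gabor_kernel(theta, freq=0.1, sigma=4.0, size=21):
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    xr = x * np.cos(theta) + y * np.sin(theta)       # rotated coordinate
    envelope = np.exp(-(x**2 + y**2) / (2 * sigma**2))
    return envelope * np.cos(2 * np.pi * freq * xr)

def filterbank_features(img, n_orient=8):
    feats = []
    for k in range(n_orient):
        kern = gabor_kernel(theta=np.pi * k / n_orient)
        # circular 2-D convolution via FFT, for brevity
        resp = np.fft.irfft2(np.fft.rfft2(img) * np.fft.rfft2(kern, img.shape))
        feats.append(np.sqrt((resp**2).mean()))      # response energy
    return np.array(feats)

img = np.random.rand(128, 128)                       # stand-in for a fingerprint
print(filterbank_features(img))
```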

  13. Verification of nuclear fuel plates by a developed non-destructive assay method

    Science.gov (United States)

    El-Gammal, W.; El-Nagdy, M.; Rizk, M.; Shawky, S.; Samei, M. A.

    2005-11-01

    Nuclear material (NM) verification is a main target of NM accounting and control. In this work a new relative non-destructive assay technique has been developed to verify the uranium mass content in nuclear fuel. The technique uses a planar high-resolution germanium gamma-ray spectrometer in combination with the MCNP-4B Monte Carlo transport code. A standard NM sample was used to simulate the assayed NM and to determine the average intrinsic full-energy peak efficiency of the detector for the assayed configuration. The developed technique was found to be capable of verifying the operator declarations with an average accuracy of about 2.8% and a precision of better than 4%.
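
    The relative assay relation implied above can be stated explicitly. A plausible form, with generic symbols rather than the paper's notation:

```latex
% Relative NDA: the unknown uranium mass m follows from the net peak count
% rate R of a characteristic gamma line, referenced to a standard of known
% mass m_s and count rate R_s, corrected by the Monte Carlo (MCNP) intrinsic
% full-energy peak efficiencies epsilon for the two configurations:
m \;=\; m_{s}\,\frac{R}{R_{s}}\cdot\frac{\varepsilon_{s}}{\varepsilon}
```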

  14. CANDU RU fuel manufacturing basic technology development and advanced fuel verification tests

    Energy Technology Data Exchange (ETDEWEB)

    Chung, Chang Hwan; Chang, S.K.; Hong, S.D. [and others

    1999-04-01

    A PHWR advanced fuel named the CANFLEX fuel has been developed through a KAERI/AECL joint program. The KAERI-made fuel bundle was tested at the KAERI Hot Test Loop for performance verification of the bundle design. The major test activities were the fuel bundle cross-flow test, the endurance fretting/vibration test, the freon CHF test, and the fuel bundle heat-up test. KAERI has also been developing a more advanced PHWR fuel, the CANFLEX-RU fuel, using recovered uranium to extend fuel burn-up in CANDU reactors. For the purpose of proving the safety of the RU handling techniques and appraising the feasibility of CANFLEX-RU fuel fabrication in the near future, a physical, chemical and radiological characterization of the RU powder and pellets was performed. (author). 54 refs., 46 tabs., 62 figs.

  15. Book Titled Autonomic Networking-on-Chip: Bio-Inspired Specification, Development, and Verification: An Introduction

    Directory of Open Access Journals (Sweden)

    Phan Cong Vinh

    2015-03-01

    Despite the growing mainstream importance and unique advantages of autonomic networking-on-chip (ANoC) technology, Autonomic Networking-On-Chip: Bio-Inspired Specification, Development, and Verification is among the first books to evaluate research results on formalizing this emerging NoC paradigm, which was inspired by the human nervous system. The third book in the Embedded Multi-Core Systems series from CRC Press, this is an advanced technical guide and reference composed of contributions from prominent researchers in industry and academia around the world. A response to the critical need for a global information exchange and dialogue, it is written for engineers, scientists, practitioners, and other researchers who have a basic understanding of NoC and are now ready to learn how to specify, develop, and verify ANoC using rigorous approaches.

  16. A Translator Verification Technique for FPGA Software Development in Nuclear Power Plants

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Jae Yeob; Kim, Eui Sub; Yoo, Jun Beom [Konkuk University, Seoul (Korea, Republic of)

    2014-10-15

    Although FPGAs give a higher performance than PLCs (Programmable Logic Controllers), the platform change from PLC to FPGA forces all PLC software engineers to give up the experience, knowledge and practices accumulated over decades and start a new FPGA-based hardware development from scratch. We have researched to find a solution to this problem that reduces the risk and preserves the experience and knowledge. One solution is to use the FBDtoVerilog translator, which translates FBD programs into behavior-preserving Verilog programs. In general, PLCs are usually designed with an FBD, while FPGAs are described with an HDL (Hardware Description Language) such as Verilog or VHDL. Once the PLC designer has designed the FBD programs, FBDtoVerilog translates the FBD into Verilog mechanically. The designers, therefore, need not consider the rest of the FPGA development process (e.g., synthesis and place-and-route) and can preserve the accumulated experience and knowledge. Even if we assure that the translation from FBD to Verilog is correct, it must be verified rigorously and thoroughly since it is used in nuclear power plants, which are among the most safety-critical systems. While the designer develops the FPGA software with the FBD program translated by the translator, there are other translation tools such as the synthesis tool and the place-and-route tool. This paper also focuses on verifying them rigorously and thoroughly. There are several verification techniques for the correctness of a translator, but they are hard to apply because of their prohibitive cost and execution time. Instead, this paper uses an indirect verification technique to demonstrate the correctness of the translator by means of co-simulation. We intend to prove correctness only against specific inputs which are under development for a target I and C system, not against all possible input cases.

  17. Development of a Compton camera for online ion beam range verification via prompt γ detection

    Energy Technology Data Exchange (ETDEWEB)

    Aldawood, Saad [Ludwig-Maximilians-Universitaet Muenchen (Germany); King Saud University, Riyadh (Saudi Arabia); Lang, Christian; Lutter, Rudolf; Bortfeldt, Jonathan; Parodi, Katia; Thirolf, Peter G. [Ludwig-Maximilians-Universitaet Muenchen (Germany); Kolff, Hugh van der [Ludwig-Maximilians-Universitaet Muenchen (Germany); Delft University of Technology (Netherlands); Maier, Ludwig [Technische Universitaet Muenchen (Germany)

    2014-07-01

    Precise and preferably online ion beam range verification is a mandatory prerequisite to fully exploit the advantages of hadron-therapy in cancer treatment. Our aim is to develop an imaging system based on a Compton camera designed to detect prompt γ rays induced by nuclear reactions between ion beam and biological tissue. The Compton camera prototype consists of a stack of double-sided Si-strip detectors (DSSSD) acting as scatterers, while the absorber is formed by a LaBr{sub 3} scintillator crystal read out by a position-sensitive multi-anode photomultiplier. The LaBr{sub 3} detector was characterized with both absorptive and reflective side-face wrapping materials. Comparative studies of energy and time resolution, photopeak detection efficiency and spatial resolution are presented together with first tests of the complete camera system.

  18. Development of a Compton camera for online ion beam range verification via prompt γ detection

    Energy Technology Data Exchange (ETDEWEB)

    Aldawood, S. [LMU Munich, Garching (Germany); King Saud University, Riyadh (Saudi Arabia); Liprandi, S.; Marinsek, T.; Bortfeldt, J.; Lang, C.; Lutter, R.; Dedes, G.; Parodi, K.; Thirolf, P.G. [LMU Munich, Garching (Germany); Maier, L.; Gernhaeuser, R. [TU Munich, Garching (Germany); Kolff, H. van der [LMU Munich, Garching (Germany); TU Delft (Netherlands); Castelhano, I. [LMU Munich, Garching (Germany); University of Lisbon, Lisbon (Portugal); Schaart, D.R. [TU Delft (Netherlands)

    2015-07-01

    Precise and preferably online ion beam range verification is a mandatory prerequisite to fully exploit the advantages of hadron therapy in cancer treatment. An imaging system is being developed in Garching aiming to detect prompt γ rays induced by nuclear reactions between the ion beam and biological tissue. The Compton camera prototype consists of a stack of six customized double-sided Si-strip detectors (DSSSD, 50 x 50 mm{sup 2}, 0.5 mm thick, 128 strips/side) acting as scatterer, while the absorber is formed by a monolithic LaBr{sub 3}:Ce scintillator crystal (50 x 50 x 30 mm{sup 3}) read out by a position-sensitive multi-anode photomultiplier (Hamamatsu H9500). The ongoing characterization of the Compton camera properties and its individual components, both offline in the laboratory and online using a proton beam, is presented.

  19. Testing and verification of granular-bed filters for the removal of particulate and alkalis. Tenth quarterly project report, January 1-March 31, 1983

    Energy Technology Data Exchange (ETDEWEB)

    Lippert, T.E.

    1983-01-01

    The Westinghouse Electric Corporation, with Ducon, Inc. and Burns and Roe, Inc., is conducting a test and evaluation program of a Granular-Bed Filter (GBF) for gas-cleaning applications in pressurized fluidized-bed combustion processes. This work is funded by the DOE PRDA for Exploratory Research, Development, Testing and Evaluation of Systems or Devices for Hot Gas Clean-up. This report describes the status of the testing of the subpilot-scale GBF unit under simulated pressurized fluidized-bed combustion (PFBC) conditions through Phase IV. 9 references, 21 figures, 4 tables.

  20. Formal Development and Verification of Railway Control Systems - In the context of ERTMS/ETCS Level 2

    DEFF Research Database (Denmark)

    Vu, Linh Hong

    This dissertation presents a holistic, formal method for efficient modelling and verification of safety-critical railway control systems that have product line characteristics, i.e., each individual system is constructed by instantiating common generic applications with concrete configuration data. The proposed method is based on a combination of formal methods and domain-specific approaches. While formal methods offer mathematically rigorous specification, verification and validation, domain-specific approaches encapsulate the use of formal methods with familiar concepts and notions of the domain, hence making the method easy for the railway engineers to use. Furthermore, the method features a 4-step verification and validation approach that can be integrated naturally into different phases of the software development process. This 4-step approach identifies possible errors in generic applications...

  1. The Filter Wheel and Filters development for the X-IFU instrument on-board Athena

    CERN Document Server

    Bozzo, E; Genolet, L; Paltani, S; Sordet, M; Branduardi-Raymont, G; Rauw, G; Sciortino, S; Barret, D; Herder, J W Den

    2016-01-01

    Athena is the large mission selected by ESA in 2013 to investigate the science theme "Hot and Energetic Universe", presently scheduled for launch in 2028. One of the two instruments located at the focus of the 12 m-long Athena telescope is the X-ray Integral Field Unit (X-IFU). This is an array of TES micro-calorimeters that will be operated at temperatures of 50 mK in order to perform high-resolution spectroscopy with an energy resolution down to 2.5 eV at energies < 7 keV. In order to cope with the large dynamic range of X-ray fluxes spanned by the celestial objects Athena will be observing, the X-IFU will be equipped with a filter wheel. This will allow the user to fine-tune the instrument set-up based on the nature of the target, thus optimizing the scientific outcome of the observation. A few positions of the filter wheel will also be used to host a calibration source and to allow the measurement of the instrument intrinsic background.

  2. Development of Real-Time Error Ellipses as an Indicator of Kalman Filter Performance.

    Science.gov (United States)

    1984-03-01

    ...often than 3 to 5 seconds. However, before the HP-86 can be considered feasible for real-time Kalman filter processing, more investigation is needed. (Master's Thesis, March 1984. Keywords: error ellipsoids; Kalman filter; extended Kalman filter.)
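
    The thesis topic admits a compact illustration: a minimal, generic sketch of deriving a confidence error ellipse from a 2 x 2 position covariance matrix (not the thesis code):

```python
import numpy as np

# Error ellipse from a 2x2 position covariance matrix P: the ellipse axes
# are the eigenvectors of P, and the semi-axis lengths scale with the square
# roots of the eigenvalues times a chi-square quantile for the chosen
# confidence level (5.991 for 95% with 2 degrees of freedom).
def error_ellipse(P, chi2=5.991):
    eigvals, eigvecs = np.linalg.eigh(P)
    order = eigvals.argsort()[::-1]                   # largest eigenvalue first
    eigvals, eigvecs = eigvals[order], eigvecs[:, order]
    semi_axes = np.sqrt(chi2 * eigvals)
    angle = np.arctan2(eigvecs[1, 0], eigvecs[0, 0])  # major-axis orientation
    return semi_axes, np.degrees(angle)

P = np.array([[4.0, 1.5],
              [1.5, 1.0]])                            # example filter covariance
axes, ang = error_ellipse(P)
print("semi-axes:", axes, "orientation (deg):", ang)
```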

  3. Developing particulate thin filter using coconut fiber for motor vehicle emission

    Science.gov (United States)

    Wardoyo, A. Y. P.; Juswono, U. P.; Riyanto, S.

    2016-03-01

    The number of motor vehicles in Indonesia has increased sharply from year to year, with the increment reaching 22% per annum. Meanwhile, motor vehicles produce particulate emissions of different sizes and high concentrations, depending on the type of vehicle, fuel, and engine capacity. Particle emissions from motor vehicles not only contribute significantly to atmospheric particles but are also adverse to human health. In order to reduce these particle emissions, a filter is needed. This study was aimed at developing a thin filter using coconut fiber to reduce particulate emissions from motor vehicles. The filter was made of coconut fibers that were ground into powder and mixed with glue. The filter was tested by measuring the particle concentrations coming directly out of the vehicle exhaust and the particle concentrations after passing through the filter. The efficiency of the filter was calculated from the particle concentrations upstream and downstream of the filter, as formalized below. The results showed that the efficiency of the filter exceeded 30%. The efficiency increases sharply when a number of filters are arranged in parallel.
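
    The collection efficiency consistent with the reported percentages is presumably the standard definition:

```latex
% Collection efficiency from the upstream (C_in) and downstream (C_out)
% particle concentrations measured at the exhaust:
\eta \;=\; \frac{C_{\mathrm{in}} - C_{\mathrm{out}}}{C_{\mathrm{in}}} \times 100\%
```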

  4. Development of the Verification and Validation Matrix for Safety Analysis Code SPACE

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Yo Han; Ha, Sang Jun; Yang, Chang Keun [Korea Electric Power Research Institute, Daejeon (Korea, Republic of)

    2009-10-15

    Korea Electric Power Research Institute (KEPRI) has developed the safety analysis code SPACE (Safety and Performance Analysis CodE for Nuclear Power Plant) for typical pressurized water reactors (PWR). Current safety analysis codes were procured from foreign vendors, such as Westinghouse Electric Corp., ABB Combustion Engineering Inc., Kraftwerk Union, etc. Considering the conservatism and inflexibility of the foreign code systems, it is difficult to expand the application areas and analysis scopes. To overcome these problems KEPRI launched a project to develop a native safety analysis code with Korea Power Engineering Co. (KOPEC), Korea Atomic Energy Research Inst. (KAERI), Korea Nuclear Fuel (KNF), and Korea Hydro and Nuclear Power Co. (KHNP) under the funding of the Ministry of Knowledge Economy (MKE). As a result of the project, the demo version of SPACE was released in July 2009. In preparation for the next step, KEPRI and colleagues have developed the verification and validation (V and V) matrix for SPACE. To develop the matrix, preceding studies and experiments were reviewed. After mature consideration, the V and V matrix was developed and experiment plans were designed for the next step to compensate for the lack of data.

  5. Development of An Automatic Verification Program for Thermal-hydraulic System Codes

    Energy Technology Data Exchange (ETDEWEB)

    Lee, J. Y.; Ahn, K. T.; Ko, S. H.; Kim, Y. S.; Kim, D. W. [Pusan National University, Busan (Korea, Republic of); Suh, J. S.; Cho, Y. S.; Jeong, J. J. [System Engineering and Technology Co., Daejeon (Korea, Republic of)

    2012-05-15

    As a project activity of the capstone design competitive exhibition, supported by the Education Center for Green Industry-friendly Fusion Technology (GIFT), we have developed a computer program which can automatically perform non-regression tests, which are needed repeatedly during the development process of a thermal-hydraulic system code such as the SPACE code. A non-regression test (NRT) is an approach to software testing. The purpose of non-regression testing is to verify that, after updating a given software application (in this case, the code), previous software functions have not been compromised. The goal is to prevent software regression, whereby adding new features results in software bugs. As the NRT is performed repeatedly, a lot of time and human resources are needed during the development period of a code, which may delay development. To reduce the cost and the human resources and to prevent wasting time, non-regression tests need to be automated. As the tool to develop the automatic verification program, we used Visual Basic for Applications (VBA). VBA is an implementation of Microsoft's event-driven programming language Visual Basic 6 and its associated integrated development environment, which are built into most Microsoft Office applications (in this case, Excel)

  6. Exploratory Research and Development of Microwave Filters in Silicon Technology

    Science.gov (United States)

    2013-09-25

  7. Development of Out-pile Test Technology for Fuel Assembly Performance Verification

    Energy Technology Data Exchange (ETDEWEB)

    Chun, Tae Hyun; In, W. K.; Oh, D. S. [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)] (and others)

    2007-03-15

    Out-pile tests with a full-scale fuel assembly are performed to verify the design and to evaluate the performance of the final products. HTL for the hydraulic tests and FAMeCT for mechanical/structural tests were constructed in this project. The maximum operating conditions of HTL are 30 bar, 320 °C, and 500 m3/hr. This facility can perform the pressure drop test, fuel assembly uplift test, and flow-induced vibration test. FAMeCT can perform the bending and vibration tests. The verification of the developed facilities was carried out by comparing with reference data for a fuel assembly obtained at the Westinghouse Co. The compared data showed good agreement within uncertainties. FRETONUS, a simulator for high-temperature, high-pressure fretting wear and performance testing, was also developed. A performance test was conducted for 500 hours to check the integrity, endurance, and data acquisition capability of the simulator. The technology of computational turbulent flow analysis and finite element analysis was developed. With the establishment of out-pile test facilities for full-scale fuel assemblies, the domestic infrastructure for PWR fuel development has been greatly upgraded.

  8. Visualization of Instrumental Verification Information Details (VIVID) : code development, description, and usage.

    Energy Technology Data Exchange (ETDEWEB)

    Roy, Christopher John; Bainbridge, Bruce L.; Potter, Donald L.; Blottner, Frederick G.; Black, Amalia Rebecca

    2005-03-01

    The formulation, implementation and usage of a numerical solution verification code is described. This code uses the Richardson extrapolation procedure to estimate the order of accuracy and error of a computational program solution. It evaluates multiple solutions performed in numerical grid convergence studies to verify a numerical algorithm implementation. Analyses are performed on both structured and unstructured grid codes. Finite volume and finite element discretization programs are examined. Two and three-dimensional solutions are evaluated. Steady state and transient solution analysis capabilities are present in the verification code. Multiple input data bases are accepted. Benchmark options are included to allow for minimal solution validation capability as well as verification.
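
    The core of the Richardson extrapolation procedure can be sketched compactly. A minimal version for three solutions on systematically refined grids with a constant refinement ratio r (illustrative, not the VIVID implementation):

```python
import math

# Richardson extrapolation on three grid solutions f1 (fine), f2 (medium),
# f3 (coarse) with constant refinement ratio r: estimate the observed order
# of accuracy p, the extrapolated "exact" value, and the fine-grid error.
def richardson(f1, f2, f3, r=2.0):
    p = math.log((f3 - f2) / (f2 - f1)) / math.log(r)   # observed order
    f_exact = f1 + (f1 - f2) / (r**p - 1.0)             # extrapolated value
    err_fine = abs(f1 - f_exact)                        # fine-grid error estimate
    return p, f_exact, err_fine

# toy grid convergence study: a quantity converging at 2nd order toward 1.0
p, f_exact, err = richardson(f1=1.0026, f2=1.0104, f3=1.0416)
print(f"observed order p = {p:.2f}, extrapolated value = {f_exact:.5f}, "
      f"fine-grid error = {err:.2e}")
```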

  9. Development and Verification of Tritium Analyses Code for a Very High Temperature Reactor

    Energy Technology Data Exchange (ETDEWEB)

    Chang H. Oh; Eung S. Kim

    2009-09-01

    A tritium permeation analysis code (TPAC) has been developed by Idaho National Laboratory for the purpose of analyzing tritium distributions in VHTR systems, including integrated hydrogen production systems. The MATLAB SIMULINK software package was used for development of the code. The TPAC is based on the mass balance equations of tritium-containing species and various forms of hydrogen (i.e., HT, H2, HTO, HTSO4, and TI) coupled with a variety of tritium source, sink, and permeation models. In the TPAC, ternary fission and neutron reactions with 6Li, 7Li, 10B, and 3He were taken into consideration as tritium sources. Purification and leakage models were implemented as the main tritium sinks. Permeation of HT and H2 through pipes, vessels, and heat exchangers was considered as the main tritium transport path. In addition, electrolyzer and isotope exchange models were developed for analyzing hydrogen production systems, including both high-temperature electrolysis and the sulfur-iodine process. The TPAC has unlimited flexibility in system configuration and provides easy drag-and-drop model building through a graphical user interface. Verification of the code has been performed by comparisons with analytical solutions and experimental data based on the Peach Bottom reactor design. The preliminary results calculated with a former tritium analysis code, THYTAN, which was developed in Japan and adopted by the Japan Atomic Energy Agency, were also compared with the TPAC solutions. This report contains descriptions of the basic tritium pathways, theory, a simple user guide, verifications, sensitivity studies, sample cases, and code tutorials. Tritium behaviors in a very high temperature reactor/high-temperature steam electrolysis system have been analyzed by the TPAC based on the reference indirect parallel configuration proposed by Oh et al. (2007). This analysis showed that only 0.4% of the tritium released from the core is transferred to the product hydrogen

  10. Verification and intercomparison of mesoscale ensemble prediction systems in the Beijing 2008 Olympics Research and Development Project

    Science.gov (United States)

    Kunii, Masaru; Saito, Kazuo; Seko, Hiromu; Hara, Masahiro; Hara, Tabito; Yamaguchi, Munehiko; Gong, Jiandong; Charron, Martin; Du, Jun; Wang, Yong; Chen, Dehui

    2011-05-01

    During the period around the Beijing 2008 Olympic Games, the Beijing 2008 Olympics Research and Development Project (B08RDP) was conducted as part of the World Weather Research Program short-range weather forecasting research project. Mesoscale ensemble prediction (MEP) experiments were carried out by six organizations in near-real time, in order to share their experiences in the development of MEP systems. The purpose of this study is to objectively verify these experiments and to clarify the problems associated with the current MEP systems through the same experiences. Verification was performed using the MEP outputs interpolated into a common verification domain with a horizontal resolution of 15 km. For all systems, the ensemble spreads grew as the forecast time increased, and the ensemble mean improved the forecast errors compared with individual control forecasts in the verification against the analysis fields. However, each system exhibited individual characteristics according to the MEP method. Some participants used physical perturbation methods. The significance of these methods was confirmed by the verification. However, the mean error (ME) of the ensemble forecast in some systems was worse than that of the individual control forecast. This result suggests that it is necessary to pay careful attention to physical perturbations.

  11. Development and Validation of Search Filters to Identify Articles on Family Medicine in Online Medical Databases

    NARCIS (Netherlands)

    Pols, D.H.; Bramer, W.M.; Bindels, P.J.; Laar, F.A. van de; Bohnen, A.M.

    2015-01-01

    Physicians and researchers in the field of family medicine often need to find relevant articles in online medical databases for a variety of reasons. Because a search filter may help improve the efficiency and quality of such searches, we aimed to develop and validate search filters to identify

  12. Development and validation of search filters to identify articles on family medicine in online medical databases

    NARCIS (Netherlands)

    D.H.J. Pols (David); W.M. Bramer (Wichor); P.J.E. Bindels (Patrick J.E.); F.A. van de Laar (Floris A.); A.M. Bohnen

    2015-01-01

    Physicians and researchers in the field of family medicine often need to find relevant articles in online medical databases for a variety of reasons. Because a search filter may help improve the efficiency and quality of such searches, we aimed to develop and validate search filters to

  15. REDD+ readiness: early insights on monitoring, reporting and verification systems of project developers

    Science.gov (United States)

    Joseph, Shijo; Herold, Martin; Sunderlin, William D.; Verchot, Louis V.

    2013-09-01

    A functional measuring, monitoring, reporting and verification (MRV) system is essential to assess the additionality and impact on forest carbon in REDD+ (reducing emissions from deforestation and degradation) projects. This study assesses the MRV capacity and readiness of project developers at 20 REDD+ projects in Brazil, Peru, Cameroon, Tanzania, Indonesia and Vietnam, using a questionnaire survey and field visits. Nineteen performance criteria with 76 indicators were formulated in three categories, and capacity was measured with respect to each category. Of the 20 projects, 11 were found to have very high or high overall MRV capacity and readiness. At the regional level, capacity and readiness tended to be highest in the projects in Brazil and Peru and somewhat lower in Cameroon, Tanzania, Indonesia and Vietnam. Although the MRV capacities of half the projects are high, there are capacity deficiencies in other projects that are a source of concern. These are not only due to limitations in technical expertise, but can also be attributed to the slowness of international REDD+ policy formulation and the unclear path of development of the forest carbon market. Based on the study results, priorities for MRV development and increased investment in readiness are proposed.

  16. Metal fuel development and verification for prototype generation-IV sodium-cooled fast reactor

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Chan Bock; Cheon, Jin Sik; Kim, Sung Ho; Park, Jeong Yong; Joo, Hyung Kook [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2016-10-15

    Metal fuel is being developed for the prototype generation-IV sodium-cooled fast reactor (PGSFR) to be built by 2028. U-Zr fuel is a driver for the initial core of the PGSFR, and U-transuranics (TRU)-Zr fuel will gradually replace U-Zr fuel through its qualification in the PGSFR. Based on the vast worldwide experiences of U-Zr fuel, work on U-Zr fuel is focused on fuel design, fabrication of fuel components, and fuel verification tests. U-TRU-Zr fuel uses TRU recovered through pyroelectrochemical processing of spent PWR (pressurized water reactor) fuels, which contains highly radioactive minor actinides and chemically active lanthanide or rare earth elements as carryover impurities. An advanced fuel slug casting system, which can prevent vaporization of volatile elements through a control of the atmospheric pressure of the casting chamber and also deal with chemically active lanthanide elements using protective coatings in the casting crucible, was developed. Fuel cladding of the ferritic-martensitic steel FC92, which has higher mechanical strength at a high temperature than conventional HT9 cladding, was developed and fabricated, and is being irradiated in the fast reactor.

  17. DEVELOPING VERIFICATION SYSTEMS FOR BUILDING INFORMATION MODELS OF HERITAGE BUILDINGS WITH HETEROGENEOUS DATASETS

    Directory of Open Access Journals (Sweden)

    L. Chow

    2017-08-01

    The digitization and abstraction of existing buildings into building information models requires the translation of heterogeneous datasets that may include CAD, technical reports, historic texts, archival drawings, terrestrial laser scanning, and photogrammetry into model elements. In this paper, we discuss a project undertaken by the Carleton Immersive Media Studio (CIMS) that explored the synthesis of heterogeneous datasets for the development of a building information model (BIM) for one of Canada's most significant heritage assets – the Centre Block of the Parliament Hill National Historic Site. The scope of the project included the development of an as-found model of the century-old, six-story building in anticipation of specific model uses for an extensive rehabilitation program. The as-found Centre Block model was developed in Revit using primarily point cloud data from terrestrial laser scanning. The data was captured by CIMS in partnership with Heritage Conservation Services (HCS), Public Services and Procurement Canada (PSPC), using a Leica C10 and P40 (exterior and large interior spaces) and a Faro Focus (small to mid-sized interior spaces). Secondary sources such as archival drawings, photographs, and technical reports were referenced in cases where point cloud data was not available. As a result of working with heterogeneous data sets, a verification system was introduced in order to communicate to model users/viewers the source of information for each building element within the model.

  18. Cost-Effective CNC Part Program Verification Development for Laboratory Instruction.

    Science.gov (United States)

    Chen, Joseph C.; Chang, Ted C.

    2000-01-01

    Describes a computer numerical control program verification system that checks a part program before its execution. The system includes character recognition, word recognition, a fuzzy-nets system, and a tool path viewer. (SK)

  19. Development of a tunable filter for coronal polarimetry

    Science.gov (United States)

    Tomczyk, S.; Mathew, S. K.; Gallagher, D.

    2016-07-01

    Measuring magnetic fields in the solar corona is crucial to understanding and predicting the Sun's generation of space weather that affects communications, GPS systems, space flight, and power transmission. The Coronal Solar Magnetism Observatory Large Coronagraph (COSMO LC) is a proposed 1.5 m aperture coronagraph designed to synoptically observe magnetic fields and plasma properties in the large-scale corona to improve our understanding of solar processes that cause space weather. The LC will observe coronal emission lines over the wavelength range from 500 to 1100 nm with a field of view of 1° and a spatial resolution of 2 arcsec. A spectral resolution greater than 8000 over the wavelength range is needed to resolve the polarization signatures of magnetic fields in the emission line profiles. The aperture and field of view of the LC set an étendue requirement of 1.39 m2 deg2 for the postfocus instrumentation. We find that a tunable wide-field birefringent filter using Lithium Niobate crystals can meet the étendue and spectral resolution requirements for the LC spectrometer. We have tested a number of commercially available crystals and verify that crystals of the required size and birefringence uniformity are available. We also evaluate electro-optical tuning of a Lithium Niobate birefringent filter by the application of high voltage. This tunable filter represents a key enabling technology for the COSMO LC.

  20. Advanced hot-gas filter development. Topical report, September 30, 1994--May 31, 1996

    Energy Technology Data Exchange (ETDEWEB)

    Lane, J.E.; LeCostaouec, J.F.; Painter, C.J.; Sue, W.A.; Radford, K.C.

    1996-12-31

    The application of high-performance, high-temperature particulate control devices is considered to be beneficial to advanced fossil fuel processing technology, to selected high-temperature industrial processes, and to waste incineration concepts. Ceramic rigid filters represent the most attractive technology for these applications due to their capability to withstand high-temperature corrosive environments. However, current-generation monolithic filters have demonstrated poor resistance to crack propagation and can experience catastrophic failure during use. To address this problem, ceramic fiber-reinforced ceramic matrix composite (CMC) filter materials are needed for reliable, damage-tolerant candle filters. This program is focused on the development of an oxide-fiber-reinforced, oxide-matrix composite filter material that is cost-competitive with prototype next-generation filters. This goal would be achieved through the development of a low-cost sol-gel fabrication process and a three-dimensional fiber architecture optimized for high-volume filter manufacturing. The 3D continuous fiber reinforcement provides a damage-tolerant structure which is not subject to delamination-type failures. This report documents the Phase 1, Filter Material Development and Evaluation, results. Section 2 provides a program summary. Technical results, including experimental procedures, are presented and discussed in Section 3. Sections 4 and 5 provide the Phase 1 conclusions and recommendations, respectively. The remaining sections cover acknowledgements and references.

  1. Glomerular disease search filters for Pubmed, Ovid Medline, and Embase: a development and validation study

    Directory of Open Access Journals (Sweden)

    Hildebrand Ainslie M

    2012-06-01

    Background: Tools to enhance physician searches of Medline and other bibliographic databases have potential to improve the application of new knowledge in patient care. This is particularly true for articles about glomerular disease, which are published across multiple disciplines and are often difficult to track down. Our objective was to develop and test search filters for PubMed, Ovid Medline, and Embase that allow physicians to search within a subset of the database to retrieve articles relevant to glomerular disease. Methods: We used a diagnostic test assessment framework with development and validation phases. We read a total of 22,992 full-text articles for relevance and assigned them to the development or validation set to define the reference standard. We then used combinations of search terms to develop 997,298 unique glomerular disease filters. Outcome measures for each filter included sensitivity, specificity, precision, and accuracy. We selected optimal sensitive and specific search filters for each database and applied them to the validation set to test performance. Results: High-performance filters achieved at least 93.8% sensitivity and specificity in the development set. Filters optimized for sensitivity reached at least 96.7% sensitivity and filters optimized for specificity reached at least 98.4% specificity. Performance of these filters was consistent in the validation set and similar among all three databases. Conclusions: PubMed, Ovid Medline, and Embase can be filtered for articles relevant to glomerular disease in a reliable manner. These filters can now be used to facilitate physician searching.
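
    The outcome measures listed in the Methods can be computed directly from the 2 x 2 classification counts. A minimal sketch with illustrative counts (not the study's data):

```python
# Outcome measures for a search filter evaluated against a hand-read
# reference standard; the counts below are illustrative, not the study's.
def filter_metrics(tp, fp, fn, tn):
    sensitivity = tp / (tp + fn)    # fraction of relevant articles retrieved
    specificity = tn / (tn + fp)    # fraction of irrelevant articles excluded
    precision   = tp / (tp + fp)    # fraction of retrieved articles relevant
    accuracy    = (tp + tn) / (tp + fp + fn + tn)
    return sensitivity, specificity, precision, accuracy

sens, spec, prec, acc = filter_metrics(tp=470, fp=310, fn=30, tn=22182)
print(f"sensitivity={sens:.3f} specificity={spec:.3f} "
      f"precision={prec:.3f} accuracy={acc:.3f}")
```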

  2. Characterizing proton-activated materials to develop PET-mediated proton range verification markers

    Science.gov (United States)

    Cho, Jongmin; Ibbott, Geoffrey S.; Kerr, Matthew D.; Amos, Richard A.; Stingo, Francesco C.; Marom, Edith M.; Truong, Mylene T.; Palacio, Diana M.; Betancourt, Sonia L.; Erasmus, Jeremy J.; DeGroot, Patricia M.; Carter, Brett W.; Gladish, Gregory W.; Sabloff, Bradley S.; Benveniste, Marcelo F.; Godoy, Myrna C.; Patil, Shekhar; Sorensen, James; Mawlawi, Osama R.

    2016-06-01

    Conventional proton beam range verification using positron emission tomography (PET) relies on tissue activation alone and therefore requires particle therapy PET whose installation can represent a large financial burden for many centers. Previously, we showed the feasibility of developing patient implantable markers using high proton cross-section materials (18O, Cu, and 68Zn) for in vivo proton range verification using conventional PET scanners. In this technical note, we characterize those materials to test their usability in more clinically relevant conditions. Two phantoms made of low-density balsa wood (~0.1 g cm-3) and beef (~1.0 g cm-3) were embedded with Cu or 68Zn foils of several volumes (10-50 mm3). The metal foils were positioned at several depths in the dose fall-off region, which had been determined from our previous study. The phantoms were then irradiated with different proton doses (1-5 Gy). After irradiation, the phantoms with the embedded foils were moved to a diagnostic PET scanner and imaged. The acquired data were reconstructed with 20-40 min of scan time using various delay times (30-150 min) to determine the maximum contrast-to-noise ratio. The resultant PET/computed tomography (CT) fusion images of the activated foils were then examined and the foils’ PET signal strength/visibility was scored on a 5 point scale by 13 radiologists experienced in nuclear medicine. For both phantoms, the visibility of activated foils increased in proportion to the foil volume, dose, and PET scan time. A linear model was constructed with visibility scores as the response variable and all other factors (marker material, phantom material, dose, and PET scan time) as covariates. Using the linear model, volumes of foils that provided adequate visibility (score 3) were determined for each dose and PET scan time. The foil volumes that were determined will be used as a guideline in developing practical implantable markers.

  4. Development of optical ground verification method for μm to sub-mm reflectors

    Science.gov (United States)

    Stockman, Y.; Thizy, C.; Lemaire, P.; Georges, M.; Mazy, E.; Mazzoli, A.; Houbrechts, Y.; Rochus, P.; Roose, S.; Doyle, D.; Ulbrich, G.

    2004-06-01

develop and realise suitable verification tools based on infrared interferometry and other optical techniques for testing large reflector structures, telescope configurations and their performances under simulated space conditions. The first is an IR phase-shifting interferometer with high spatial resolution. This interferometer shall be used specifically for the verification of high precision IR, FIR and sub-mm reflector surfaces and telescopes under both ambient and thermal vacuum conditions. The second, presented hereafter, is a holographic method for relative shape measurement. The holographic solution proposed makes use of a home-built, vacuum-compatible holographic camera that allows displacement measurements from typically 20 nanometres to 25 microns in one shot. An iterative process allows the measurement of a total of up to several mm of deformation. Uniquely, the system is designed to measure both specular and diffuse surfaces.

  5. Survey of Verification and Validation Techniques for Small Satellite Software Development

    Science.gov (United States)

    Jacklin, Stephen A.

    2015-01-01

    The purpose of this paper is to provide an overview of the current trends and practices in small-satellite software verification and validation. This document is not intended to promote a specific software assurance method. Rather, it seeks to present an unbiased survey of software assurance methods used to verify and validate small satellite software and to make mention of the benefits and value of each approach. These methods include simulation and testing, verification and validation with model-based design, formal methods, and fault-tolerant software design with run-time monitoring. Although the literature reveals that simulation and testing has by far the longest legacy, model-based design methods are proving to be useful for software verification and validation. Some work in formal methods, though not widely used for any satellites, may offer new ways to improve small satellite software verification and validation. These methods need to be further advanced to deal with the state explosion problem and to make them more usable by small-satellite software engineers to be regularly applied to software verification. Last, it is explained how run-time monitoring, combined with fault-tolerant software design methods, provides an important means to detect and correct software errors that escape the verification process or those errors that are produced after launch through the effects of ionizing radiation.

  6. Development of PIRT and Assessment Matrix for Verification and Validation of Sodium Fire Analysis Codes

    Science.gov (United States)

    Ohno, Shuji; Ohshima, Hiroyuki; Tajima, Yuji; Ohki, Hiroshi

The thermodynamic consequences of a liquid sodium leak and fire accident are among the important issues to be evaluated when considering the safety of a fast reactor plant building. The authors are therefore initiating a systematic verification and validation (V&V) activity to assure and demonstrate the reliability of numerical simulation tools for sodium fire analysis. The V&V activity is in progress, focusing on the already developed sodium fire analysis codes SPHINCS and AQUA-SF. The events to be evaluated are hypothetical sodium spray, pool, or combined fire accidents and the thermodynamic behaviors postulated to follow in a plant building. The present paper describes, first, the development of a 'Phenomena Identification and Ranking Table (PIRT)' for clarifying the important validation points in the sodium fire analysis codes, and second, a proposed 'Assessment Matrix' that summarizes both separate effect tests and integral effect tests for validating the computational models or the whole code for important phenomena. Furthermore, the paper shows a practical validation against a separate effect test, in which the spray droplet combustion model of SPHINCS and AQUA-SF predicts the burned amount of a falling sodium droplet with an error mostly less than 30%.

  7. SU-E-T-265: Development of Dose-To-Water Conversion Models for Pre-Treatment Verification with the New AS1200 Imager

    Energy Technology Data Exchange (ETDEWEB)

    Miri, N [University of Newcastle, Newcastle, NSW (Australia); Baltes, C; Keller, P [Varian Medical Systems Imaging, Baden-Dättwil (Switzerland); Greer, P [Newcastle Mater Hospital, Newcastle, NSW (Australia)

    2015-06-15

    Purpose: To develop and evaluate models for dose verification of flattened (FF) and flattening filter free (FFF) beams for the new Varian aS1200 backscatter-shielded electronic portal imaging device (EPID). Methods: The model converts EPID images to incident energy fluence using deconvolution of EPID scatter kernels and fluence to dose in water using convolution with dose-to-water kernels. Model parameters were optimized using non-transmission EPID images of varying jaw defined field sizes for energies of 6 and 10 MV FF and FFF beams. Energy fluence was obtained from the Acuros planning system and reference dose profiles and output factors were measured at depths of 5, 10, 15 and 20 cm in a water phantom. Images for 34 IMRT fields acquired at 6 and 10 MV FF energy were converted to dose at 10 cm depth in water and compared to treatment planning system dose plane calculations using gamma criteria. Results: Gamma evaluations for the IMRT fields had mean (1 standard deviation) pass rates of 99.4% (0.8%) and mean gamma scores of 0.32 (0.06) with 2%, 2 mm criteria and 10% of maximum dose threshold. Conclusion: The developed model has been shown to be highly accurate for pre-treatment verification with the new aS1200 imager which does not display support-arm backscatter artefact and has improved dosimetric properties. Further investigation of FFF modes is in progress. The model is currently being evaluated at sites for potential clinical release.
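The two-step conversion (deconvolve an EPID scatter kernel from the image to recover incident energy fluence, then convolve the fluence with a dose-to-water kernel) is naturally expressed in the Fourier domain. A minimal sketch of that general idea follows; the kernels and the regularization constant are placeholders, not the published aS1200 model:

```python
import numpy as np

def epid_to_dose(epid_image, epid_kernel, dose_kernel, eps=1e-3):
    """Kernel-based EPID-image-to-dose conversion (all arrays same 2D shape).

    epid_kernel: EPID scatter/response kernel, removed by deconvolution
    dose_kernel: dose-to-water deposition kernel, applied by convolution
    eps: Tikhonov-style constant stabilizing the deconvolution
    """
    img_ft = np.fft.fft2(epid_image)
    k_epid = np.fft.fft2(np.fft.ifftshift(epid_kernel))
    k_dose = np.fft.fft2(np.fft.ifftshift(dose_kernel))

    # Regularized deconvolution recovers the incident energy fluence.
    fluence_ft = img_ft * np.conj(k_epid) / (np.abs(k_epid) ** 2 + eps)
    # Convolution with the dose kernel yields dose at the reference depth.
    return np.real(np.fft.ifft2(fluence_ft * k_dose))
```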

  8. The development of a porous silicon nitride crossflow filter; Final report, September 1988--September 1992

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1992-09-01

This report summarizes the work performed in developing a permeable form of silicon nitride for application to ceramic crossflow filters for use in advanced coal-fired electric power plants. The program was sponsored by the Department of Energy Morgantown Energy Technology Center and consisted of a design analysis and material development phase and a filter manufacture and demonstration phase. The crossflow filter design and operating requirements were defined. A filter design meeting the requirements was developed and thermal and stress analyses were performed. Material development efforts focused initially on reaction-bonded silicon nitride material. This approach was not successful, and the materials effort was refocused on the development of a permeable form of sintered silicon nitride (SSN). This effort was successful. The SSN material was used for the second phase of the program, filter manufacture and evaluation. Four half-scale SSN filter modules were fabricated. Three of the modules were qualified for filter performance tests. Tests were performed on two of the three qualified modules in the High-Temperature, High-Pressure facility at the Westinghouse Science and Technology Center. The first module failed on test when it expanded into the clamping device, causing dust leakage through the filter. The second module performed well for a cumulative 150-hr test. It displayed excellent filtration capability during the test. The blowback pulse cleaning was highly effective, and the module apparently withstood the stresses induced by the periodic pulse cleaning. Testing of the module resumed, and when the flow of combustion gas through the filter was doubled, cracks developed and the test was concluded.

  9. ParFlow.RT: Development and Verification of a New Reactive Transport Model

    Science.gov (United States)

    Beisman, J. J., III

    2015-12-01

In natural subsurface systems, total elemental fluxes are often heavily influenced by areas of disproportionately high reaction rates. These pockets of high reaction rates tend to occur at interfaces, such as the hyporheic zone, where a hydrologic flowpath converges with either a chemically distinct hydrologic flowpath or a reactive substrate. Understanding the effects that these highly reactive zones have on the behavior of shallow subsurface systems is integral to the accurate quantification of nutrient fluxes and biogeochemical cycling. Numerical simulations of these systems may be able to offer some insight. To that end, we have developed a new reactive transport model, ParFlow.RT, by coupling the parallel flow and transport code ParFlow with the geochemical engines of both PFLOTRAN and CrunchFlow. The coupling was accomplished via the Alquimia biogeochemistry API, which provides a unified interface to several geochemical codes and allows a relatively simple implementation of advanced geochemical functionality in flow and transport codes. This model uses an operator-splitting approach, where the transport and reaction steps are solved separately. Here, we present the details of this new model, and the results of verification simulations and biogeochemical cycling simulations of the DOE's East River field site outside of Gothic, CO.
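Operator splitting means each time step advances transport and chemistry separately, so the geochemical engine can be treated as a black box applied cell by cell. A minimal 1D sketch, with explicit upwind advection for the transport step and simple first-order decay standing in for the PFLOTRAN/CrunchFlow chemistry that Alquimia would normally supply:

```python
import numpy as np

def step_operator_split(c, u, dx, dt, k_decay):
    """One operator-split step: advect, then react.

    c: 1D array of concentrations; u: pore velocity (> 0, with u*dt/dx < 1);
    k_decay: first-order rate constant standing in for the geochemical engine.
    """
    # 1) Transport sub-step: explicit first-order upwind advection.
    c_new = c.copy()
    c_new[1:] -= u * dt / dx * (c[1:] - c[:-1])
    # 2) Reaction sub-step: solve dc/dt = -k*c exactly over dt, cell by cell.
    return c_new * np.exp(-k_decay * dt)

c = np.zeros(100)
for _ in range(500):
    c[0] = 1.0                            # constant-concentration inlet
    c = step_operator_split(c, u=1e-3, dx=0.1, dt=10.0, k_decay=1e-4)
```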

  10. A consensus rating method for small virus-retentive filters. I. Method development.

    Science.gov (United States)

Lute, Scott; Riordan, William; Pease, Leonard F; Tsai, De-Hao; Levy, Richard; Haque, Mohammed; Martin, Jerold; Moroe, Ichiro; Sato, Terry; Morgan, Michael; Krishnan, Mani; Campbell, Jennifer; Genest, Paul; Dolan, Sherri; Tarrach, Klaus; Meyer, Anika; Zachariah, Michael R; Tarlov, Michael J; Etzel, Mark; Brorson, Kurt; Aranha, Hazel; Bailey, Mark; Bender, Jean; Carter, Jeff; Chen, Qi; Dowd, Chris; Jani, Raj; Jen, David; Kidd, Stanley; Meltzer, Ted; Remington, Kathryn; Rice, Iris; Romero, Cynthia; Sato, Terry; Jornitz, Maik; Sekura, Carol Marcus; Sofer, Gail; Specht, Rachel; Wojciechowski, Peter

    2008-01-01

Virus filters are membrane-based devices that remove large viruses (e.g., retroviruses) and/or small viruses (e.g., parvoviruses) from products by a size exclusion mechanism. In 2002, the Parenteral Drug Association (PDA) organized the PDA Virus Filter Task Force to develop a common nomenclature and a standardized test method for classifying and identifying virus-retentive filters. One goal of the task force was to develop a test method for small virus-retentive filters. Because small virus-retentive filters present unique technical challenges, the test method development process was guided by laboratory studies to determine critical variables such as choice of bacteriophage challenge, choice of model protein, filtration operating parameters, target log10 reduction value, and filtration endpoint definition. Based on filtration, dynamic light scattering (DLS), electrospray differential mobility analysis, and polymerase chain reaction studies, a final rating based on retention of bacteriophage PP7 was chosen by the PDA Virus Filter Task Force. The detailed final consensus filter test method was published in the 2008 update of PDA Technical Report 41, Virus Filtration.
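The rating itself rests on the log10 reduction value (LRV), the order-of-magnitude drop in infectious titer across the filter. A minimal sketch with hypothetical PP7 titers (the task force's rating threshold and assay details are in the technical report, not reproduced here):

```python
import math

def log_reduction_value(feed_titer, filtrate_titer):
    """LRV = log10(feed titer / filtrate titer), titers in PFU/mL,
    assuming equal volumes are assayed on both sides of the filter."""
    return math.log10(feed_titer / filtrate_titer)

# Hypothetical PP7 bacteriophage challenge: 1e7 PFU/mL in, 1e3 PFU/mL out.
print(log_reduction_value(1e7, 1e3))  # 4.0 -> a "4-log" retentive filter
```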

  11. Development and verification of a software tool for the acoustic location of partial discharge in a power transformer

    Directory of Open Access Journals (Sweden)

    Polužanski Vladimir

    2014-01-01

This paper discusses the development and verification of a software tool for determining the location of partial discharge in a power transformer with the acoustic method. A classification and systematization of the physical principles, detection methods, and tests of partial discharge in power transformers is given at the beginning of the paper. The most important mathematical models, features, algorithms, and practical problems that affect measurement accuracy are highlighted. The paper then describes the development and implementation of a software tool for determining the location of partial discharge in a power transformer based on a non-iterative mathematical algorithm. Verification and measurement accuracy are demonstrated both by computer simulation and by experimental results available in the literature.
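The abstract names a non-iterative algorithm without reproducing it. One common non-iterative approach to acoustic source location, sketched below, is linearized multilateration: subtracting the range equation of a reference sensor turns the arrival-time differences into a linear system in the source coordinates and the reference range. The sensor layout, sound speed, and timings are all hypothetical:

```python
import numpy as np

def locate_pd_source(sensors, toa, c=1400.0):
    """Non-iterative acoustic partial-discharge location (linearized multilateration).

    sensors: (n, 3) sensor positions on the tank [m], n >= 5
    toa: (n,) acoustic arrival times [s]; c: assumed sound speed in oil [m/s]
    Unknowns are (x, y, z, r0), r0 being the range to reference sensor 0.
    """
    s0 = sensors[0]
    d = c * (toa[1:] - toa[0])                      # range differences vs. sensor 0
    A = np.column_stack([2.0 * (sensors[1:] - s0), 2.0 * d])
    b = np.sum(sensors[1:] ** 2, axis=1) - np.sum(s0 ** 2) - d ** 2
    sol, *_ = np.linalg.lstsq(A, b, rcond=None)
    return sol[:3]

sensors = np.array([[0, 0, 0], [2, 0, 0], [0, 1.5, 0], [2, 1.5, 1], [0, 0, 2]], float)
true_src = np.array([1.2, 0.7, 0.9])
toa = np.linalg.norm(sensors - true_src, axis=1) / 1400.0
print(locate_pd_source(sensors, toa))               # ~ [1.2, 0.7, 0.9]
```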

  12. Feature Selection for Generator Excitation Neurocontroller Development Using Filter Technique

    Directory of Open Access Journals (Sweden)

    Abdul Ghani Abro

    2011-09-01

Essentially, the motive behind using a control system is to generate a suitable control signal for yielding the desired response of a physical process. Control of the synchronous generator has always remained very critical in power system operation and control. For certain well-known reasons, power generators are normally operated well below their steady-state stability limit. This raises the demand for efficient and fast controllers. Artificial intelligence has been reported to give revolutionary outcomes in the field of control engineering. The Artificial Neural Network (ANN), a branch of artificial intelligence, has been used for nonlinear and adaptive control, utilizing its inherent observability. The overall performance of a neurocontroller also depends on its input features. Selecting optimum features to train a neurocontroller optimally is therefore very critical. Both the quality and the size of the training data are of equal importance for better performance. In this work a filter technique is employed to select independent factors for ANN training.
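A filter technique scores each candidate input by a model-independent relevance measure before any network training, in contrast to wrapper methods that retrain the ANN per feature subset. A small sketch using absolute Pearson correlation with the target as the score (the paper's exact ranking criterion is not given in the abstract):

```python
import numpy as np

def filter_select(X, y, k):
    """Rank features by |Pearson correlation| with the target; return top-k indices.

    X: (n_samples, n_features) candidate input signals; y: (n_samples,) target.
    """
    Xc = X - X.mean(axis=0)
    yc = y - y.mean()
    scores = np.abs(Xc.T @ yc) / (np.linalg.norm(Xc, axis=0) * np.linalg.norm(yc))
    return np.argsort(scores)[::-1][:k]

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 8))                 # e.g. candidate machine signals
y = 2.0 * X[:, 1] - 0.5 * X[:, 4] + rng.normal(scale=0.1, size=500)
print(filter_select(X, y, k=2))               # recovers features 1 and 4
```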

  13. Development of multidye UV filters for OPVs using luminescent materials

    Science.gov (United States)

    Vignoto Fernandes, Ricardo; Bristow, Noel; Stoichkov, Vasil; Scapin Anizelli, Helder; Leonil Duarte, José; Laureto, Edson; Kettle, Jeff

    2017-01-01

Luminescence down-shifting (LDS) is used in several photovoltaic technologies to improve the photon conversion efficiency (PCE) of devices by increasing light harvesting in the regions of the electromagnetic spectrum where the EQE of the solar cells is poor. The aim of this work was to produce films of mixtures (blends) of two luminescent materials, dispersed in a poly(methyl methacrylate) (PMMA) matrix, to improve their properties both as an LDS layer and as a UV filter when applied to the clear, external surface of P3HT:PC61BM photovoltaic devices. The best results led to an increment of 7.4% in the PCE of the devices and a sixfold enhancement in their half-life (T50%). This study indicates that multidye LDS layers with optimized optical properties can lead to an effective improvement in the performance and operational stability of OPVs.

  14. Program Verification with Monadic Second-Order Logic & Languages for Web Service Development

    DEFF Research Database (Denmark)

    Møller, Anders

Domain-specific formal languages are an essential part of computer science, combining theory and practice. Such languages are characterized by being tailor-made for specific application domains, thereby providing expressiveness on high abstraction levels and allowing specialized analysis and verification techniques. This dissertation describes two projects, each exploring one particular instance of such languages: monadic second-order logic and its application to program verification, and programming languages for construction of interactive Web services. The Web service language design guarantees that only valid HTML documents are ever shown to the clients at runtime and that the documents are constructed consistently. In addition, it provides support for declarative form-field validation, caching of dynamic documents, and concurrency control based on temporal-logic specifications.

  15. Development Concept of Guaranteed Verification Electric Power System Simulation Tools and Its Realization

    Directory of Open Access Journals (Sweden)

    Gusev Alexander

    2015-01-01

An analysis of the existing problems of reliability and verification of widespread electric power system (EPS) simulation tools is presented in this article. All such simulation tools are based on the use of numerical methods for ordinary differential equations. The concept of guaranteed verification of EPS simulation tools is described, together with the structure of its realization, based on a Simulator having the properties of continuous, decomposition-free, three-phase EPS simulation in real time, over an unlimited range, and with guaranteed accuracy. The information from the Simulator can be verified using only quasi-steady-state regime data received from SCADA, and such a Simulator can be applied as the standard model for verifying any EPS simulation tool.

  16. Development of experimental verification techniques for non-linear deformation and fracture on the nanometer scale.

    Energy Technology Data Exchange (ETDEWEB)

    Moody, Neville Reid; Bahr, David F.

    2005-11-01

This work covers three distinct aspects of deformation and fracture during indentation. In particular, we develop an approach to the verification of nanoindentation-induced film fracture in hard film/soft substrate systems and examine the ability to perform these experiments in harsh environments; we investigate the methods by which the resulting deformation from indentation can be quantified and correlated to computational simulations; and we examine the onset of plasticity during indentation testing. First, nanoindentation was utilized to induce fracture of brittle thin oxide films on compliant substrates. During the indentation, a load is applied and the penetration depth is continuously measured. A sudden discontinuity, indicative of film fracture, was observed on the loading portion of the load-depth curve. The mechanical properties of thermally grown oxide films on various substrates were calculated using two different numerical methods. The first method utilized a plate bending approach by modeling the thin film as an axisymmetric circular plate on a compliant foundation. The second method measured the applied energy for fracture. The crack extension force and applied stress intensity at fracture were then determined from the energy measurements. Second, slip steps form on the free surface around indentations in most crystalline materials when dislocations reach the free surface. Analysis of these slip steps provides information about the deformation taking place in the material. Techniques have now been developed to allow for accurate and consistent measurement of slip steps, and the effects of crystal orientation and tip geometry have been characterized. These techniques are described and compared to results from dislocation dynamics simulations.

  17. Development of an advanced real time simulation tool, ARTIST and its verification

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Hee Cheol; Moon, S. K.; Yoon, B. J.; Sim, S. K.; Lee, W. J. [Korea Atomic Energy Research Institute, Taejeon (Korea)

    1999-10-01

A real-time reactor system analysis code, ARTIST, based on the drift flux model, has been developed to investigate transient system behavior under low pressure, low flow and low power conditions with noncondensable gas present in the system. The governing equations of the ARTIST code consist of three mass continuity equations (steam, liquid and noncondensables), two energy equations (steam and mixture) and one mixture momentum equation closed with the drift flux model. The drift flux model of ARTIST has been validated against the THETIS experimental data by comparing the void distribution in the system. In particular, the void fraction calculated by the Chexal-Lellouche correlation at low pressure and low flow is better than the results of both the homogeneous model of the TASS code and the two-fluid model of the RELAP5/MOD3 code. For the case when noncondensable gas exists, a thermal-hydraulic state solution scheme and calculation methods for the partial derivatives were developed. Numerical consistency and convergence were tested with one-volume problems, and a manometric oscillation was assessed to examine the calculation methods for the partial derivatives. The calculated thermal-hydraulic state for each test shows the consistent and expected behaviour. In order to evaluate the capability of the ARTIST code in predicting the two-phase thermal-hydraulic phenomena of a loss-of-RHR accident during midloop operation, BETHSY test 6.9d was simulated. From the results, it is judged that a reflux condensation model and a critical flow model for the noncondensable gas are necessary to correctly predict the thermal-hydraulic behaviour. Finally, verification runs were performed, without the drift flux model and the noncondensable gas model, for postulated accidents of real plants. The ARTIST code well reproduces the parametric trends calculated by the TASS code. Therefore, the integrity of the ARTIST code was verified. 35 refs., 70 figs., 3 tabs. (Author)
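The abstract does not restate the closure itself; drift-flux codes of this kind conventionally use the Zuber-Findlay form, in which a correlation such as Chexal-Lellouche supplies the distribution parameter and drift velocity (the notation below is the standard one, added here for context):

```latex
\langle\langle u_g \rangle\rangle = C_0 \,\langle j \rangle + V_{gj},
\qquad
\langle \alpha \rangle = \frac{\langle j_g \rangle}{C_0\,\langle j \rangle + V_{gj}},
```

where $\alpha$ is the void fraction, $j$ and $j_g$ are the mixture and gas volumetric fluxes, $C_0$ is the distribution parameter, and $V_{gj}$ is the drift velocity.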

  18. Development of a noise reduction filter algorithm for pediatric body images in multidetector CT.

    Science.gov (United States)

    Nishimaru, Eiji; Ichikawa, Katsuhiro; Okita, Izumi; Tomoshige, Yukihiro; Kurokawa, Takehiro; Nakamura, Yuko; Suzuki, Masayuki

    2010-12-01

Recently, several types of post-processing image filters designed to reduce noise, allowing a corresponding dose reduction in CT images, have been proposed, and these were reported to be useful for noise reduction in CT images of adult patients. However, their adaptation to pediatric patients has not been reported. Because they are not very effective with small (<20 cm) display fields of view, they could not be used for pediatric (e.g., premature babies and infants) body CT images. To overcome this restriction, we have developed a new noise reduction filter algorithm applicable to pediatric body CT images. The algorithm is based on three-dimensional post-processing, in which output pixel values are calculated by multi-directional, one-dimensional median filters on the original volumetric datasets. The processing directions exclude the in-plane (axial plane) direction, and consequently the in-plane spatial resolution is not affected by the filter. In the other directions, the spatial resolution, including slice thickness, is almost maintained owing to the non-linear filtering characteristic of the median filter. From the results of phantom studies, the proposed algorithm could reduce standard deviation values, as a noise index, by up to 30% without affecting the spatial resolution in any direction; contrast-to-noise ratio was therefore improved by up to 30%. This newly developed filter algorithm should be useful for the diagnosis and radiation dose reduction of pediatric body CT images.
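A minimal sketch of the non-axial one-dimensional median filtering idea using SciPy follows. The published algorithm processes several oblique directions; this sketch uses the slice direction plus two through-plane diagonals and averages the results, with both the direction set and the averaging rule being assumptions for illustration:

```python
import numpy as np
from scipy import ndimage

def noise_reduce_volume(vol, k=5):
    """Multi-directional 1D median filtering of a CT volume (axes: z, y, x).

    Only non-in-plane directions are filtered, so axial-plane resolution
    is untouched; the directional medians are combined by averaging.
    """
    line_z = np.zeros((k, 1, 1), bool); line_z[:, 0, 0] = True
    diag_zy = np.zeros((k, k, 1), bool); diag_zy[np.arange(k), np.arange(k), 0] = True
    diag_zx = np.zeros((k, 1, k), bool); diag_zx[np.arange(k), 0, np.arange(k)] = True

    meds = [ndimage.median_filter(vol, footprint=f) for f in (line_z, diag_zy, diag_zx)]
    return np.mean(meds, axis=0)

vol = np.random.normal(0.0, 20.0, size=(40, 64, 64))   # noise-only test volume
print(vol.std(), noise_reduce_volume(vol).std())        # noise index decreases
```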

  19. The medline UK filter: development and validation of a geographic search filter to retrieve research about the UK from OVID medline.

    Science.gov (United States)

    Ayiku, Lynda; Levay, Paul; Hudson, Tom; Craven, Jenny; Barrett, Elizabeth; Finnegan, Amy; Adams, Rachel

    2017-07-13

A validated geographic search filter for the retrieval of research about the United Kingdom (UK) from bibliographic databases had not previously been published. The objective was to develop and validate a geographic search filter to retrieve research about the UK from OVID medline with high recall and precision. Three gold standard sets of references were generated using the relative recall method. The sets contained references to studies about the UK which had informed National Institute for Health and Care Excellence (NICE) guidance. The first and second sets were used to develop and refine the medline UK filter. The third set was used to validate the filter. Recall, precision and number-needed-to-read (NNR) were calculated using a case study. The validated medline UK filter demonstrated 87.6% relative recall against the third gold standard set. In the case study, the medline UK filter demonstrated 100% recall, 11.4% precision and a NNR of nine (NNR is the reciprocal of precision: 1/0.114 is roughly nine records read per relevant record found). A validated geographic search filter to retrieve research about the UK with high recall and precision has been developed. The medline UK filter can be applied to systematic literature searches in OVID medline for topics with a UK focus. © 2017 Crown copyright. Health Information and Libraries Journal © 2017 Health Libraries Group. This article is published with the permission of the Controller of HMSO and the Queen's Printer for Scotland.

  20. Development and characterization of ultra lightweight, highly selective, filter media for oil-water mixtures

    Science.gov (United States)

    Baghernejad, Lida

Emulsions formed by oil-water mixtures can cause serious issues at different stages of crude oil production, produced water remediation, and oil spills. Efficient, cost-effective processes for the separation of oil-water emulsions or dispersions are critical and highly desirable. Filters are among the most common means employed to separate oil-water emulsions into their corresponding components. To conduct single-step gravity-based or centrifugal separation of oil-water mixtures into their pure phases, it is essential that the filter be hydrophilic and oleophobic both in air and in water. The filter medium should also have high surface porosity, which affects the rate of permeation of one phase. It should be stable at operating temperatures and pressures and be resistant to degradation by chemicals in the feed stream. Favorable oil rejection characteristics, resistance to fouling by organic and inorganic foulants, and a low cost of production are also important. The goal of this project is to develop ultra-lightweight filters that are durable, highly porous and able to selectively separate oil-water mixtures, made from non-woven cellulose-based materials by electrospinning. Since electrospinning is a cost-effective, scalable method that can be used to fabricate filters with very thin nanoscale fibers and nano-dimension pores, and cellulose is a very cheap and abundant ingredient, these filters may be considered novel tools for efficient, cost-effective separation of oil-water emulsions in industry. Currently there is a growing demand for highly porous selective filters for oil-water mixtures in the petroleum industry. These filters are beneficial to both manufacturers and consumers. This research focuses on the development and characterization of the new filter material and the evaluation of its suitability for oil-water separation in the field.

  1. Development and Verification of a Pilot Code based on Two-fluid Three-field Model

    Energy Technology Data Exchange (ETDEWEB)

    Hwang, Moon Kyu; Bae, S. W.; Lee, Y. J.; Chung, B. D.; Jeong, J. J.; Ha, K. S.; Kang, D. H

    2006-09-15

In this study, a semi-implicit pilot code is developed for one-dimensional channel flow with three fields. The three fields comprise a gas, a continuous liquid and an entrained liquid field. All three fields are allowed to have their own velocities. The temperatures of the continuous liquid and the entrained liquid are, however, assumed to be in equilibrium. The interphase phenomena include heat and mass transfer, as well as momentum transfer. Fluid/structure interaction generally includes both heat and momentum transfer; assuming an adiabatic system, only momentum transfer is considered in this study, leaving wall heat transfer for a future study. Using 10 conceptual problems, the basic pilot code has been verified. The results of the verification are summarized below: It was confirmed that the basic pilot code can simulate various flow conditions (such as single-phase liquid flow, bubbly flow, slug/churn turbulent flow, annular-mist flow, and single-phase vapor flow) and transitions between these flow conditions. The pilot code was programmed so that the source terms of the governing equations and the numerical solution schemes can be easily tested. Mass and energy conservation was confirmed for single-phase liquid and single-phase vapor flows. It was confirmed that the inlet pressure and velocity boundary conditions work properly. It was confirmed that, for single- and two-phase flows, the velocity and temperature of a non-existing phase are calculated as intended. Complete phase depletion, which might occur during a phase change, was found to adversely affect code stability. A further study would be required to enhance the code capability in this regard.
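As context for the field equations implied here, the three mass continuity equations take the usual multi-field form, one per field k in {gas, continuous liquid, entrained liquid}, with the interphase mass transfer terms summing to zero (a generic statement of the model class, not the code's exact discretized equations):

```latex
\frac{\partial (\alpha_k \rho_k)}{\partial t}
+ \frac{\partial (\alpha_k \rho_k u_k)}{\partial x} = \Gamma_k,
\qquad
\sum_k \alpha_k = 1, \qquad \sum_k \Gamma_k = 0,
```

where $\alpha_k$, $\rho_k$, $u_k$, and $\Gamma_k$ are the volume fraction, density, velocity, and net interphase mass source of field $k$.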

  2. Development and Experimental Verification of Key Techniques to Validate Remote Sensing Products

    Science.gov (United States)

    Li, X.; Wang, S. G.; Ge, Y.; Jin, R.; Liu, S. M.; Ma, M. G.; Shi, W. Z.; Li, R. X.; Liu, Q. H.

    2013-05-01

Validation of remote sensing land products is a fundamental issue for Earth observation. The Ministry of Science and Technology of the People's Republic of China (MOST) launched a high-tech R&D Program named 'Development and experimental verification of key techniques to validate remote sensing products' in 2011. This paper introduces the background, scientific objectives and research contents of this project, together with the research results already achieved. The objectives of this project are (1) to build a technical specification for the validation of remote sensing products; (2) to investigate its performance through a comprehensive satellite-aircraft-ground remote sensing experiment, modifying the specification from Step 1 until it reaches the predefined requirement; and (3) to establish a validation network of China for remote sensing products. In summer 2012, with the support of the Heihe Watershed Allied Telemetry Experimental Research (HiWATER), field observations were successfully conducted in the central stream of the Heihe River Basin, a typical inland river basin in northwest China. A flux observation matrix composed of eddy covariance (EC) and large aperture scintillometer (LAS) instruments, in addition to a densely distributed eco-hydrological wireless sensor network, was established to capture multi-scale heterogeneities of evapotranspiration (ET), leaf area index (LAI), soil moisture and temperature. Airborne missions were flown with payloads of an imaging spectrometer, light detection and ranging (LiDAR), an infrared thermal imager and a microwave radiometer, providing aerial remote sensing observations at various scales. Satellite images with high resolution, e.g. PROBA-CHRIS and TerraSAR-X, have been collected and pre-processed. Simultaneously, ground measurements have been conducted over specific sampling plots and transects to obtain validation data sets. With this setup, complex problems are addressed, e.g. heterogeneity, scaling, uncertainty, and eventually to

  3. The Development of Nanofibrous Media Filter Containing Nanoparticles for Removing Particles from Air Stream

    OpenAIRE

    S. Farhang Dehghan; B Maddah; F Golbabaei

    2016-01-01

Background and Objectives: The goal of the present study was to develop nanofibrous media filters containing MgO nanoparticles for future application in removing particles from a gas stream. Materials and Methods: Electrospun nanofibers were fabricated using an experimental design prepared by Response Surface Methodology. Optimization of the electrospinning parameters was conducted to achieve the desired filter properties, including fiber diameter, porosity, and bead number. ...

  5. Development of high-temperature superconducting filters operating at temperatures above 90 K

    Institute of Scientific and Technical Information of China (English)

    XIA HouHai; ZHOU ChunXia; ZUO Tao; HE Ming; JI Lu; ZHOU TieGe; ZHAO XinJie; FANG Lan; YAN ShaoLin

    2009-01-01

This paper presents the development of a high-temperature superconducting (HTS) filter whose highest operating temperature is up to 93 K. The filter is designed for S band with a 4% fractional bandwidth and fabricated using thallium-barium-calcium-copper oxide (Tl2Ba2CaCu2O8) thin films. At 93 K, measurements of the filter show that the insertion loss in the passband is less than 0.22 dB, the return loss is better than 20 dB, and the out-of-band rejection is more than 80 dB. Analysis of the characteristics of the filter operating at different temperatures shows that the filter can work well at temperatures around 90 K. The temperature of 93 K is the highest among previous reports for HTS filters. The result reported in this paper is significant for HTS filters to be used in the field of microwave communication requiring high sensitivity.

  6. The development of advanced instrumentation and control technology -The development of verification and validation technology for instrumentation and control in NPPs-

    Energy Technology Data Exchange (ETDEWEB)

    Kwon, Kee Choon; Ham, Chang Sik; Lee, Byung Sun; Kim, Jung Taek; Park, Won Man; Park, Jae Chang; Lee, Jang Soo; Um, Heung Sub; Kim, Jang Yul; Ryoo, Chan Hoh; Joo, Jae Yoon; Song, Soon Ja [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)

    1995-07-01

We collected and analyzed domestic and international codes, standards and guidelines to develop a highly reliable software verification and validation methodology suited to our actual situation. The work comprises three major parts: construction of a framework for a highly reliable software development environment, establishment of a highly reliable software development methodology, and study of the basic technology related to safety-critical software. These three parts are tightly coupled to each other to achieve self-reliant software verification and validation technology for digital I and C in NPPs. The configuration of hardware and software is partly performed using the requirements developed in the first stage for the development of an I and C test facility. In the hardware part, an expanded interface using the VXI bus and its driving software is completed. The main program for mathematics and modelling and the supervisor program for instructions are developed. 27 figs, 22 tabs, 69 refs. (Author).

  7. Spot scanning proton therapy plan assessment: design and development of a dose verification application for use in routine clinical practice

    Science.gov (United States)

    Augustine, Kurt E.; Walsh, Timothy J.; Beltran, Chris J.; Stoker, Joshua B.; Mundy, Daniel W.; Parry, Mark D.; Bues, Martin; Fatyga, Mirek

    2016-04-01

The use of radiation therapy for the treatment of cancer has been carried out clinically since the late 1800s. Early on, however, it was discovered that a radiation dose sufficient to destroy cancer cells can also cause severe injury to surrounding healthy tissue. Radiation oncologists continually strive to find the perfect balance between a dose high enough to destroy the cancer and one that avoids damage to healthy organs. Spot scanning or "pencil beam" proton radiotherapy offers another option to improve on this. Unlike traditional photon therapy, proton beams stop in the target tissue, thus better sparing all organs beyond the targeted tumor. In addition, the beams are far narrower and thus can be more precisely "painted" onto the tumor, avoiding exposure of surrounding healthy tissue. To safely treat patients with proton beam radiotherapy, dose verification should be carried out for each plan prior to treatment. Proton dose verification systems are not currently commercially available, so the Department of Radiation Oncology at the Mayo Clinic developed its own, called DOSeCHECK, which offers two distinct dose simulation methods: GPU-based Monte Carlo and CPU-based analytical. The three major components of the system include the web-based user interface, the Linux-based dose verification simulation engines, and the supporting services and components. The architecture integrates multiple applications, libraries, platforms, programming languages, and communication protocols and was successfully deployed in time for Mayo Clinic's first proton beam therapy patient. Having a simple, efficient application for dose verification greatly reduces staff workload and provides additional quality assurance, ultimately improving patient safety.

  8. Automation of microbial enumeration: development of a disposable hydrophobic grid-membrane filter unit.

    Science.gov (United States)

    Tsuji, K; Bussey, D M

    1986-10-01

    A disposable filter unit containing a hydrophobic grid-membrane filter (HGMF) was developed. The unit is liquid tight to serve as a specimen transport container and, by removal of the funnel extender (175- or 300-ml capacity), the unit becomes less than the height of two stacked petri plates to save space during in situ incubation. The polyethylene mesh which supports the HGMF facilitates rinse removal of any substance(s) that would interfere with microbial growth. The correlations between a pour plate, a conventional square HGMF, and a disposable filter unit on microbial enumeration were examined. Characteristics (e.g., clumping, spreading, etc.) of some microorganisms limit the linear counting range to less than 1,000 CFU per filter.

  9. DEVELOPMENT AND UTILIZATION OF TEST FACILITY FOR THE STUDY OF CANDLE FILTER SURFACE REGENERATION

    Energy Technology Data Exchange (ETDEWEB)

    Bruce S. Kang; Eric K. Johnson

    2003-07-14

Hot gas particulate filtration is a basic component in advanced power generation systems such as Integrated Gasification Combined Cycle (IGCC) and Pressurized Fluidized Bed Combustion (PFBC). These systems require effective particulate removal to protect the downstream gas turbine and also to meet environmental emission requirements. The ceramic barrier filter is one of the options for hot gas filtration. Hot gases flow through ceramic candle filters, leaving ash deposited on the outer surface of the filter. A process known as surface regeneration removes the deposited ash periodically by using a high pressure pulse of gas to back flush the filter. After this cleaning process has been completed, there may be some residual ash on the filter surface. This residual ash may grow, and this may then lead to mechanical failure of the filter. A Room Temperature Test Facility (RTTF) and a High Temperature Test Facility (HTTF) were built to investigate the ash characteristics during surface regeneration at room and selected high temperatures. The RTTF system was used to gain experience with the selected instrumentation and to develop an operating procedure to be used later at elevated temperatures. The HTTF system is capable of conducting surface regeneration tests of a single candle filter at temperatures up to 1500°F. In order to obtain sequential digital images of the ash particle distribution during the surface regeneration process, a high-resolution, high-speed image acquisition system was integrated into the HTTF system. The regeneration pressure and the transient pressure difference between the inside of the candle filter and the chamber during regeneration were measured using a high-speed PC data acquisition system. The control variables for the high temperature regeneration tests were (1) face velocity, (2) pressure of the back pulse, and (3) cyclic ash build-up time. Coal ash sample obtained from the Power System Development Facility (PSDF) at Wilsonville, AL was used at the

  10. DEVELOPMENT AND VERIFICATION OF NEW SOLID DENTAL FILLING TEMPORARY MATERIALS CONTAINING ZINC. FORMULA DEVELOPMENT STAGE.

    Science.gov (United States)

    Pytko-Polończyk, Jolanta; Antosik, Agata; Zajac, Magdalena; Szlósarczyk, Marek; Krywult, Agnieszka; Jachowicz, Renata; Opoka, Włodzimierz

    2016-01-01

Caries is the most common problem affecting teeth, which is why so many temporary dental filling materials are being developed. An example of such a filling is zinc oxide paste mixed with eugenol, Thymodentin, and Coltosol F®. Zinc oxide-eugenol is used in dentistry because of its multiple virtues: it improves healing of the pulp by dentine bridge formation, has antiseptic properties, and is hygroscopic. Because of these advantages, compounds of zinc oxide are used as temporary fillings, especially in deep caries lesions when treatment is oriented toward support of the vital pulp. Temporary dental fillings based on zinc oxide are prepared ex tempore by simply mixing powder (Thymodentin) and eugenol liquid together, or supplied as a ready-to-use paste (Coltosol F®). The quantitative composition depends mainly on the experience of the person preparing it; therefore, the exact composition of such dental fillings is not reproducible. The main goal of the study was to develop appropriate dental fillings in solid form containing a set amount of zinc oxide. Within the study, the influence of the preparation method on the properties of the solid dental fillings, such as mechanical properties and zinc ion release, was examined.

  11. The development of a HEPA filter with improved dust holding characteristics

    Energy Technology Data Exchange (ETDEWEB)

    Dyment, J.; Hamblin, C.

    1995-02-01

A limitation of the HEPA filters used in the extract systems of nuclear facilities is their relatively low capacity for captured dust. The costs associated with the disposal of a typical filter mean that there are clear incentives to extend filter life. The work described in this report represents the initial stages in the development of a filter which incorporates a medium that enhances its dust holding capacity. Experimental equipment was installed to enable the dust loading characteristics of candidate media to be compared with those of the glass fibre based papers currently used in filter construction. These tests involved challenging representative samples of the media with an air stream containing a controlled concentration of thermally generated sodium chloride particles. The dust loading characteristics of the media were then compared in terms of the rate of increase in pressure differential. A number of "graded density" papers were subsequently identified which appeared to offer significant improvements in dust holding. In the second phase of the programme, deep-pleat filters (1,700 m3 h-1) incorporating graded density papers were manufactured and tested. Improvements of up to 50% were observed in their capacity for the sub-micron sodium chloride test dust. Smaller differences (15%) were measured when a coarser, carbon black, challenge was used. This is attributed to the differences in the particle sizes of the two dusts.

  12. Filling schemes at submicron scale: Development of submicron sized plasmonic colour filters

    Science.gov (United States)

    Rajasekharan, Ranjith; Balaur, Eugeniu; Minovich, Alexander; Collins, Sean; James, Timothy D.; Djalalian-Assl, Amir; Ganesan, Kumaravelu; Tomljenovic-Hanic, Snjezana; Kandasamy, Sasikaran; Skafidas, Efstratios; Neshev, Dragomir N.; Mulvaney, Paul; Roberts, Ann; Prawer, Steven

    2014-09-01

The pixel size imposes a fundamental limit on the amount of information that can be displayed or recorded on a sensor. Thus, there is strong motivation to reduce the pixel size down to the nanometre scale. Nanometre colour pixels cannot be fabricated by simply downscaling current pixels due to colour cross-talk and diffraction caused by dyes or pigments used as colour filters. Colour filters based on plasmonic effects can overcome these difficulties. Although different plasmonic colour filters have been demonstrated at the micron scale, there have been no attempts so far to reduce the filter size to the submicron scale. Here, we present for the first time a submicron plasmonic colour filter design together with a new challenge - pixel boundary errors at the submicron scale. We present simple but powerful filling schemes to produce submicron colour filters, which are free from pixel boundary errors and colour cross-talk, are polarization independent and angle insensitive, and based on LCD-compatible aluminium technology. These results lay the basis for the development of submicron pixels in displays, RGB spatial light modulators, liquid crystal on silicon, Google Glass and pico-projectors.

  13. Design and development of high performance panel air filter with experimental evaluation and analysis of filter media pleats

    Directory of Open Access Journals (Sweden)

    Sagar R. Patil

    2015-11-01

In automobile vehicles, plastic moulded panel filters are mostly used for engine air filtration. Fibrous structured cellulose media are used, with different permeabilities according to the rated air flow required by the engine. To optimize the pleat design of an automotive panel air filter, it is important to study the correlation of pressure drop, dust holding capacity and efficiency. The main role of a filter is to provide the least pressure drop with high dust holding capacity and efficiency. A test channel was made for testing different pleat designs. This research comprises the experimental design and evaluation of filter elements with variable pleat depth and pleat density. The assessment allows selection of a pleat design according to the performance requirements.

14. Children's sense of reality: The development of orbitofrontal reality filtering.

    Science.gov (United States)

    Liverani, Maria Chiara; Manuel, Aurélie L; Nahum, Louis; Guardabassi, Veronica; Tomasetto, Carlo; Schnider, Armin

    2017-05-01

    Orbitofrontal reality filtering denotes a memory control mechanism necessary to keep thought and behavior in phase with reality. In adults, it is mediated by the orbitofrontal cortex and subcortical connections and its failure induces reality confusion, confabulations, and disorientation. Here we investigated for the first time the development of this mechanism in 83 children from ages 7 to 11 years and 20 adults. We used an adapted version of a continuous recognition task composed of two runs with the same picture set but arranged in different order. The first run measures storage and recognition capacity (item memory), the second run measures reality filtering. We found that accuracy and reaction times in response to all stimulus types of the task improved in parallel across ages. Importantly, at no age was there a notable performance drop in the second run. This means that reality filtering was already efficacious at age 7 and then steadily improved as item memory became stronger. At the age of 11 years, reality filtering dissociated from item memory, similar to the pattern observed in adults. However, performance in 11-year-olds was still inferior as compared to adults. The study shows that reality filtering develops early in childhood and becomes more efficacious as memory capacity increases. For the time being, it remains unresolved, however, whether this function already depends on the orbitofrontal cortex, as it does in adults, or on different brain structures in the developing brains of children.

  15. Development, verification and validation of an FPGA-based core heat removal protection system for a PWR

    Energy Technology Data Exchange (ETDEWEB)

    Wu, Yichun, E-mail: ycwu@xmu.edu.cn [College of Energy, Xiamen University, Xiamen 361102 (China); Shui, Xuanxuan, E-mail: 807001564@qq.com [College of Energy, Xiamen University, Xiamen 361102 (China); Cai, Yuanfeng, E-mail: 1056303902@qq.com [College of Energy, Xiamen University, Xiamen 361102 (China); Zhou, Junyi, E-mail: 1032133755@qq.com [College of Energy, Xiamen University, Xiamen 361102 (China); Wu, Zhiqiang, E-mail: npic_wu@126.com [State Key Laboratory of Reactor System Design Technology, Nuclear Power Institute of China, Chengdu 610041 (China); Zheng, Jianxiang, E-mail: zwu@xmu.edu.cn [College of Energy, Xiamen University, Xiamen 361102 (China)

    2016-05-15

Highlights: • An example of a life cycle development process and V&V for FPGA-based I&C is presented. • Software standards and guidelines are used in FPGA-based NPP I&C system logic V&V. • Diversified FPGA design and verification languages and tools are utilized. • An NPP operation principle simulator is used to simulate operation scenarios. - Abstract: To reach high confidence in and ensure the reliability of a nuclear FPGA-based safety system, disciplined life cycle processes for specification and design implementation, as well as verification and validation (V&V) against regulations, are needed. A specific example of how to conduct the life cycle development process and V&V for an FPGA-based core heat removal (CHR) protection system for the CPR1000 pressurized water reactor (PWR) is presented in this paper. Using the existing standards and guidelines for life cycle development and V&V, a simplified FPGA-based CHR protection system for the PWR has been designed, implemented, verified and validated. Diversified verification and simulation languages and tools are used by the independent design team and the V&V team. In the system acceptance testing V&V phase, a CPR1000 NPP operation principle simulator (OPS) model is utilized to simulate normal and abnormal operation scenarios and to provide input data to the FPGA-based CHR protection system under test and to a verified C-code CHR function module. The evaluation results are applied to validate the FPGA-based CHR protection system under test. The OPS model operation outputs also provide reasonable references for the tests. Using an OPS model in system acceptance testing V&V is cost-effective and highly efficient. A dedicated OPS, as a commercial-off-the-shelf (COTS) item, would contribute as an important tool in the V&V process of NPP I&C systems, including FPGA-based and microprocessor-based systems.

  16. SU-E-T-203: Development of a QA Software Tool for Automatic Verification of Plan Data Transfer and Delivery.

    Science.gov (United States)

    Chen, G; Li, X

    2012-06-01

Consistency verification between the data from the treatment planning system (TPS), the record and verification system (R&V), and the delivery records by visual inspection is time-consuming and subject to human error. The purpose of this work is to develop a software tool to automatically perform such verifications. Using Microsoft Visual C++, a quality assurance (QA) tool was developed to (1) read plan data including gantry/collimator/couch parameters, multi-leaf-collimator leaf positions, and monitor unit (MU) numbers from a TPS (Xio, CMS/Elekta, or RealART, Prowess) via RTP link or DICOM transfer, (2) retrieve imported (prior to delivery) and recorded (after delivery) data from an R&V system (Mosaiq, Elekta) with open database connectivity, (3) calculate MU independently based on the DICOM plan data using a modified Clarkson integration algorithm, and (4) compare all the extracted data to identify possible discrepancies between TPS and R&V, and between R&V and delivery. The tool was tested for 20 patients with 3DCRT and IMRT plans from regular and online adaptive radiotherapy treatments. It was capable of automatically detecting any inconsistency between the beam data from the TPS and the data stored in the R&V system, with an independent MU check, and any significant treatment delivery deviation from the plan, within a few seconds. With this tool being used prior to and after delivery as an essential QA step, our clinical online adaptive re-planning process can be sped up, saving a few minutes by eliminating the tedious visual inspection. A QA software tool has been developed to automatically verify treatment data consistency from delivery back to plan and to identify discrepancies in MU calculations between the TPS and the secondary MU check. This tool speeds up the clinical QA process and eliminates human errors from visual inspection, thus improving safety. © 2012 American Association of Physicists in Medicine.
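A minimal sketch of the cross-check idea using pydicom is shown below. The tool described here reads the R&V database directly via ODBC; the file-based comparison, attribute subset, and tolerances in this sketch are illustrative assumptions:

```python
import pydicom

def compare_plans(tps_path, rv_path, mu_tol=0.1, angle_tol=0.1):
    """Flag per-beam discrepancies between a TPS RT Plan and its R&V copy."""
    tps, rv = pydicom.dcmread(tps_path), pydicom.dcmread(rv_path)
    issues = []
    # Gantry angle from each beam's first control point.
    for bt, br in zip(tps.BeamSequence, rv.BeamSequence):
        if abs(float(bt.ControlPointSequence[0].GantryAngle)
               - float(br.ControlPointSequence[0].GantryAngle)) > angle_tol:
            issues.append(f"beam {bt.BeamName}: gantry angle mismatch")
    # Beam MUs live in the fraction group's referenced-beam sequence.
    for rt, rr in zip(tps.FractionGroupSequence[0].ReferencedBeamSequence,
                      rv.FractionGroupSequence[0].ReferencedBeamSequence):
        if abs(float(rt.BeamMeterset) - float(rr.BeamMeterset)) > mu_tol:
            issues.append(f"beam {rt.ReferencedBeamNumber}: MU mismatch")
    return issues
```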

  17. Feature-Aware Verification

    CERN Document Server

    Apel, Sven; Wendler, Philipp; von Rhein, Alexander; Beyer, Dirk

    2011-01-01

A software product line is a set of software products that are distinguished in terms of features (i.e., end-user-visible units of behavior). Feature interactions, situations in which the combination of features leads to emergent and possibly critical behavior, are a major source of failures in software product lines. We explore how feature-aware verification can improve the automatic detection of feature interactions in software product lines. Feature-aware verification uses product-line verification techniques and supports the specification of feature properties along with the features in separate and composable units. It integrates the technique of variability encoding to verify a product line without generating and checking a possibly exponential number of feature combinations. We developed the tool suite SPLverifier for feature-aware verification, which is based on standard model-checking technology. We applied it to an e-mail system that incorporates domain knowledge of AT&T. We found that feat...

  18. Development of Improved Iron-Aluminide Filter Tubes and Elements

    Energy Technology Data Exchange (ETDEWEB)

    Judkins, R.R.; Sutton, T.G.; Miller, C.J.; Tortorelli, P.F.

    2008-01-14

    The purpose of this Cooperative Research and Development Agreement (CRADA) was to explore and develop advanced manufacturing techniques to fabricate sintered iron-aluminide intermetallic porous bodies used for gas filtration so as to reduce production costs while maintaining or improving performance in advanced coal gasification and combustion systems. The use of a power turbine fired with coal-derived synthesis gas requires some form of gas cleaning in order to protect turbine and downstream components from degradation by erosion, corrosion, and/or deposition. Hot-gas filtration is one form of cleaning that offers the ability to remove particles from the gases produced by gasification processes without having to substantially cool and, possibly, reheat them before their introduction into the turbine. This technology depends critically on materials durability and reliability, which have been the subject of study for a number of years.

  19. DEVELOPMENT OF A CANDLE FILTER FAILURE SAFEGUARD DEVICE

    Energy Technology Data Exchange (ETDEWEB)

    G.J. Bruck; E.E. Smeltzer; Z.N. Sanjana

    2002-06-06

    Development, testing, and optimization of advanced metal and ceramic, barrier and fiber safeguard devices (SGDs) are described. Metal barrier devices were found to be prone to manufacturing defects and premature blinding. Fiber devices were found to be satisfactory provided fine fibers are used. Durable alloys were identified for both oxidation and gasification conditions. Ceramic honeycomb SGDs were found to perform as excellent barrier devices, and optimization showed such devices to be durable. Field testing of ceramic honeycomb SGDs from two different manufacturers is being pursued.

  20. Verification of Wegelin's design criteria for horizontal flow roughing ...

    African Journals Online (AJOL)

    Verification of Wegelin's design criteria for horizontal flow roughing filters (HRFs) with alternative filter material. ... This study aimed at verifying these criteria based on gravel as a filter medium and two other possible ...

  1. Development of a mixed pixel filter for improved dimension estimation using AMCW laser scanner

    Science.gov (United States)

    Wang, Qian; Sohn, Hoon; Cheng, Jack C. P.

    2016-09-01

    Accurate dimension estimation is desired in many fields, but the traditional dimension estimation methods are time-consuming and labor-intensive. In the recent decades, 3D laser scanners have become popular for dimension estimation due to their high measurement speed and accuracy. Nonetheless, scan data obtained by amplitude-modulated continuous-wave (AMCW) laser scanners suffer from erroneous data called mixed pixels, which can influence the accuracy of dimension estimation. This study develops a mixed pixel filter for improved dimension estimation using AMCW laser scanners. The distance measurement of mixed pixels is firstly formulated based on the working principle of laser scanners. Then, a mixed pixel filter that can minimize the classification errors between valid points and mixed pixels is developed. Validation experiments were conducted to verify the formulation of the distance measurement of mixed pixels and to examine the performance of the proposed mixed pixel filter. Experimental results show that, for a specimen with dimensions of 840 mm × 300 mm, the overall errors of the dimensions estimated after applying the proposed filter are 1.9 mm and 1.0 mm for two different scanning resolutions, respectively. These errors are much smaller than the errors (4.8 mm and 3.5 mm) obtained by the scanner's built-in filter.
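
    The paper derives its filter from the scanner's distance-measurement model; as a generic illustration of the underlying idea (not the authors' classifier), the sketch below flags points along a scan line whose range jumps with respect to both neighbors exceed a threshold, a common heuristic for mixed pixels at depth discontinuities. The threshold and data are made up.

    ```python
    import numpy as np

    def flag_mixed_pixels(ranges, jump=0.05):
        """Flag points whose range is far from both neighbors (meters).

        A mixed pixel at an edge typically lands between the foreground
        and background surfaces, so it disagrees with both neighbors."""
        r = np.asarray(ranges, dtype=float)
        mixed = np.zeros(r.shape, dtype=bool)
        d_prev = np.abs(r[1:-1] - r[:-2])  # jump to the previous neighbor
        d_next = np.abs(r[1:-1] - r[2:])   # jump to the next neighbor
        mixed[1:-1] = (d_prev > jump) & (d_next > jump)
        return mixed

    # Example: a 2 m target in front of a 5 m wall, one mixed pixel at 3.4 m.
    scanline = [2.00, 2.01, 1.99, 3.40, 5.02, 4.98, 5.01]
    print(flag_mixed_pixels(scanline))  # only the 3.4 m point is flagged
    ```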

  2. Development of a bifunctional filter for prion protein and leukoreduction of red blood cell components.

    Science.gov (United States)

    Yokomizo, Tomo; Kai, Takako; Miura, Morikazu; Ohto, Hitoshi

    2015-02-01

    Leukofiltration of blood components is currently implemented worldwide as a precautionary measure against white blood cell-associated adverse effects and the potential transmission of variant Creutzfeldt-Jakob disease (vCJD). A newly developed bifunctional filter (Sepacell Prima, Asahi Kasei Medical) was assessed for prion removal, leukoreduction (LR), and any significant effects on red blood cells (RBCs). Sepacell Prima's postfiltration effects on RBCs, including hemolysis, complement activation, and RBC chemistry, were compared with those of a conventional LR filter (Sepacell Pure RC). Prion removal was measured by Western blot after spiking RBCs with microsomal fractions derived from scrapie-infected hamster brain homogenate. Serially diluted exogenous prion solutions (0.05 mL), with or without filtration, were injected intracerebrally into Golden Syrian hamsters. The LR efficiency of 4.44 log with the Sepacell Prima was comparable to 4.11 log with the conventional LR filter. There were no significant differences between the two filters in hemoglobin loss, hemolysis, complement activation, or RBC biomarkers. In vitro reduction of exogenously spiked prions by the filter exceeded 3 log. The prefiltration infectivity titer of 6.63 log ID50/mL, as measured in hamsters, was reduced to 2.52 log ID50/mL after filtration, giving a calculated reduction factor of 4.20 log ID50. With confirmed removal efficacy for exogenous prion protein, this new bifunctional prion and LR filter should reduce the residual risk of vCJD transmission through blood transfusion without adding complexity to component processing. © 2014 AABB.

  3. Development of regeneration technique for diesel particulate filter made of porous metal; Kinzoku takotai DPF no saisei gijutsu no kaihatsu

    Energy Technology Data Exchange (ETDEWEB)

    Yoro, K.; Ban, S.; Ooka, T.; Saito, H.; Oji, M.; Nakajima, S.; Okamoto, S. [Sumitomo Electric Industries, Ltd., Osaka (Japan)

    1997-10-01

    We have developed a diesel particulate filter (DPF) that uses a porous metal as the filter medium, for its high thermal conductivity, and a radiation heater as the regeneration device, for its uniform thermal distribution. When high trapping efficiency is required the filter must be thick, but a thicker filter is harder to regenerate because of the thermal gradient across its thickness. To improve regeneration efficiency, we used computer simulation to design a filter-heater configuration that achieves a uniform thermal distribution, and we confirmed good regeneration efficiency experimentally. 4 refs., 14 figs., 1 tab.

  4. Development of miniature HTSC wide-band filter with open-loop resonators

    Institute of Scientific and Technical Information of China (English)

    ZHANG TianLiang; YANG Kai; NING JunSong; BU ShiRong; LIU JuanXiu; LUO ZhengXiang

    2008-01-01

    Novel HTSC (high-temperature superconductor) open-loop microstrip resonators with strong electric and magnetic coupling are studied in this report, and the traditional open-loop resonator structure is improved. A miniature wide-band HTSC bandpass filter was developed from the novel structure and fabricated on a YBCO/LaAlO3/YBCO substrate with dimensions of 14.8×9.6 mm2. The filter was tested at 77 K; the center frequency is 2230 MHz, the bandwidth is 455 MHz, and the best insertion loss in the passband is 0.14 dB.

  5. Improvements of low-detection-limit filter-free fluorescence sensor developed by charge accumulation operation

    Science.gov (United States)

    Tanaka, Kiyotsugu; Choi, Yong Joon; Moriwaki, Yu; Hizawa, Takeshi; Iwata, Tatsuya; Dasai, Fumihiro; Kimura, Yasuyuki; Takahashi, Kazuhiro; Sawada, Kazuaki

    2017-04-01

    We developed a low-detection-limit filter-free fluorescence sensor using a charge accumulation technique. For charge accumulation, a floating diffusion amplifier (FDA), comprising a floating diffusion capacitor, a transfer gate, and a source follower circuit, was used. To integrate CMOS circuits with the filter-free fluorescence sensor, we adopted a triple-well process to isolate the transistors from the sensor on a single chip. With 1.5 ms accumulation we detected 0.1 nW of fluorescence under excitation-light illumination, a sensitivity one order of magnitude better than that of a previous current-detection sensor.

  6. Iron oxide impregnated filter paper (Pi test): a review of its development and methodological research

    NARCIS (Netherlands)

    Chardon, W.J.; Menon, R.G.; Chien, S.H.

    1996-01-01

    Iron oxide impregnated filter paper (FeO paper) has been used to study the availability of phosphorus (P) to plants and algae, P desorption kinetics and P dynamics in the field. Since its initial development a number of differences in the method of preparation of the paper and its application have

  8. Development of Cryogenic Filter Wheels for the HERSCHEL Photodetector Array Camera & Spectrometer (PACS)

    Science.gov (United States)

    Koerner, Christian; Kampf, Dirk; Poglitsch, Albrecht; Schubert, Josef; Ruppert, U.; Schoele, M.

    2014-01-01

    This paper describes the two PACS Filter Wheels that are direct-drive rotational mechanisms operated at a temperature below 5K inside the PACS focal plane unit of the Herschel Satellite. The purpose of the mechanisms is to switch between filters. The rotation axis is pivoted to the support structure via a slightly preloaded pair of ball bearings and driven by a Cryotorquer. Position sensing is realized by a pair of Hall effect sensors. Powerless positioning at the filter positions is achieved by a magnetic ratchet system. The key technologies are the Cryotorquer design and the magnetic ratchet design in the low temperature range. Furthermore, we will report on lessons learned during the development and qualification of the mechanism and the paint.

  9. Design and Development of a High Efficiency CarbonGranular Bed Filter in Industrial Scale

    Institute of Scientific and Technical Information of China (English)

    张济宇; 旷戈; 林诚

    2004-01-01

    A new dust removal route using a carbon-granular bed filter, packed with carbon particles of an appropriate size grade taken from an in-process vibrating sieve, was developed successfully to replace the traditional bag filter for capturing the fine carbon dust produced in the pulverization of petroleum coke; a closed loop of carbon materials is thus completed in the combined pulverizing, classifying, and dust removal system. High dust removal efficiency (greater than 99%), low outlet dust concentration (less than 100 mg/m3), low pressure drop across the filtration chamber (less than 980 Pa), a simple design, and flexible, stable operation were achieved with the carbon-granular bed filter in both bench- and industrial-scale operation.

  10. Knowing How Good Our Searches Are: An Approach Derived from Search Filter Development Methodology

    Directory of Open Access Journals (Sweden)

    Sarah Hayman

    2015-12-01

    Objective – Effective literature searching is of paramount importance in supporting evidence based practice, research, and policy. Missed references can have adverse effects on outcomes. This paper reports on the development and evaluation of an online learning resource, designed for librarians and other interested searchers, presenting an evidence based approach to enhancing and testing literature searches. Methods – We developed and evaluated a set of free online learning modules for librarians called Smart Searching, suggesting the use of techniques derived from the search filter development undertaken by the CareSearch Palliative Care Knowledge Network and its associated project Flinders Filters. The searching module content has been informed by the processes and principles used in search filter development. The self-paced modules are intended to help librarians and other interested searchers test the effectiveness of their literature searches, provide evidence of search performance that can be used to improve searches, and evaluate and promote searching expertise. Each module covers one of four techniques, or core principles, employed in search filter development: (1) collaboration with subject experts; (2) use of a reference sample set; (3) term identification through frequency analysis; and (4) iterative testing. Evaluation of the resource comprised ongoing monitoring of web analytics to determine factors such as numbers of users and geographic origin; a user survey conducted online elicited qualitative information about the usefulness of the resource. Results – The resource was launched in May 2014. Web analytics show over 6,000 unique users from 101 countries (at 9 August 2015). Responses to the survey (n=50) indicated that 80% would recommend the resource to a colleague. Conclusions – An evidence based approach to searching, derived from search filter development methodology, has been shown to have value as an online learning resource.

  11. Development of discrete-time H∞ filtering method for time-delay compensation of rhodium incore detectors

    Energy Technology Data Exchange (ETDEWEB)

    Park, Moon Kyu; Kim, Yong Hee; Cha, Kune Ho; Kim, Myung Ki [KEPCO, KEPRI, Taejon (Korea, Republic of)

    1998-10-01

    A method is described for developing an H∞ filter for the dynamic compensation of the self-powered neutron detectors normally used as fixed incore instruments. The H∞ norm of the filter transfer matrix is used as the optimization criterion, in the worst-case estimation-error sense. Filter modeling is performed for a discrete-time model. The filter gains are optimized with respect to the H∞ noise-attenuation level. By introducing the Bounded Real Lemma, the conventional algebraic Riccati inequalities are converted into Linear Matrix Inequalities (LMIs), and the filter design problem is then solved within a convex optimization framework. The simulation results show that remarkable improvements are achieved in filter response time and filter design efficiency.
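
    Full H∞ filter synthesis requires a change of variables that would not fit a short example, but the core step described here (turning an H∞ norm bound into an LMI via the Bounded Real Lemma and solving it as a convex program) can be shown compactly. The sketch below computes the H∞ norm of an arbitrary stable discrete-time system this way; the system matrices are toy values, and the cvxpy package with the SCS solver is assumed to be available.

    ```python
    import cvxpy as cp
    import numpy as np

    # Toy stable discrete-time system: x+ = A x + B w, z = C x.
    A = np.array([[0.9, 0.1], [0.0, 0.8]])
    B = np.array([[0.0], [1.0]])
    C = np.array([[1.0, 0.0]])
    n, m = A.shape[0], B.shape[1]

    P = cp.Variable((n, n), symmetric=True)
    g2 = cp.Variable()  # gamma squared

    # Discrete Bounded Real Lemma: ||G||_inf < gamma iff P > 0 and
    # [[A'PA - P + C'C, A'PB], [B'PA, B'PB - g2*I]] < 0.
    M = cp.bmat([
        [A.T @ P @ A - P + C.T @ C, A.T @ P @ B],
        [B.T @ P @ A, B.T @ P @ B - g2 * np.eye(m)],
    ])
    Ms = (M + M.T) / 2  # symmetrize explicitly for the solver
    eps = 1e-7
    prob = cp.Problem(cp.Minimize(g2),
                      [P >> eps * np.eye(n), Ms << -eps * np.eye(n + m)])
    prob.solve(solver=cp.SCS)
    print("H-infinity norm ~", float(np.sqrt(g2.value)))
    ```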

  12. Design for sustainable development--household drinking water filter for arsenic and pathogen treatment in Nepal.

    Science.gov (United States)

    Ngai, Tommy K K; Shrestha, Roshan R; Dangol, Bipin; Maharjan, Makhan; Murcott, Susan E

    2007-10-01

    In the last 20 years, the widespread adoption of shallow tubewells in Nepal's Terai region enabled substantial improvement in access to water, but recent national water quality testing showed that 3% of these sources contain arsenic above the Nepali interim guideline of 50 microg/L, and up to 60% contain unsafe microbial contamination. To combat this crisis, MIT, ENPHO, and CAWST together researched, developed, and implemented a household water treatment technology by applying an iterative, learning development framework. A pilot study comparing three technologies against technical, social, and economic criteria showed that the Kanchan Arsenic Filter (KAF) is the most promising technology for Nepal. A two-year technical and social evaluation of over 1000 KAFs deployed in rural villages of Nepal determined that the KAF typically removes 85-90% arsenic, 90-95% iron, 80-95% turbidity, and 85-99% total coliforms. Eighty-three percent of the households continued to use the filter after one year, mainly motivated by the clean appearance, improved taste, and reduced odour of the filtered water compared with the original water source. Although over 5,000 filters had been implemented in Nepal by January 2007, further research rooted in sustainable development is necessary to understand the technology diffusion and scale-up process, in order to expand access to safe water in the country and beyond.

  13. Developing a Fundamental Model for an Integrated GPS/INS State Estimation System with Kalman Filtering

    Science.gov (United States)

    Canfield, Stephen

    1999-01-01

    This work will demonstrate the integration of sensor and system dynamic data and their appropriate models using an optimal filter to create a robust, adaptable, easily reconfigurable state (motion) estimation system. This state estimation system will clearly show the application of fundamental modeling and filtering techniques. These techniques are presented at a general, first principles level, that can easily be adapted to specific applications. An example of such an application is demonstrated through the development of an integrated GPS/INS navigation system. This system acquires both global position data and inertial body data, to provide optimal estimates of current position and attitude states. The optimal states are estimated using a Kalman filter. The state estimation system will include appropriate error models for the measurement hardware. The results of this work will lead to the development of a "black-box" state estimation system that supplies current motion information (position and attitude states) that can be used to carry out guidance and control strategies. This black-box state estimation system is developed independent of the vehicle dynamics and therefore is directly applicable to a variety of vehicles. Issues in system modeling and application of Kalman filtering techniques are investigated and presented. These issues include linearized models of equations of state, models of the measurement sensors, and appropriate application and parameter setting (tuning) of the Kalman filter. The general model and subsequent algorithm is developed in Matlab for numerical testing. The results of this system are demonstrated through application to data from the X-33 Michael's 9A8 mission and are presented in plots and simple animations.
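
    As a minimal concrete instance of the estimation machinery described here, the sketch below fuses noisy position fixes (GPS-like) with an acceleration-driven constant-velocity model (INS-like) in one dimension using a linear Kalman filter. The models and noise levels are illustrative, not those of the paper's GPS/INS system.

    ```python
    import numpy as np

    # 1-D constant-velocity model driven by a measured acceleration (INS-like),
    # corrected by noisy position fixes (GPS-like). All values are illustrative.
    dt = 0.1
    F = np.array([[1.0, dt], [0.0, 1.0]])   # state transition for [pos, vel]
    G = np.array([[0.5 * dt**2], [dt]])     # how acceleration enters the state
    H = np.array([[1.0, 0.0]])              # we measure position only
    Q = 0.05 * (G @ G.T)                    # process noise (accel uncertainty)
    R = np.array([[4.0]])                   # GPS position variance (m^2)

    def kalman_step(x, P, accel, z):
        # Predict with the inertial measurement...
        x = F @ x + G * accel
        P = F @ P @ F.T + Q
        # ...then correct with the position fix.
        S = H @ P @ H.T + R                 # innovation covariance
        K = P @ H.T @ np.linalg.inv(S)      # Kalman gain
        x = x + K @ (z - H @ x)
        P = (np.eye(2) - K @ H) @ P
        return x, P

    rng = np.random.default_rng(0)
    x, P = np.zeros((2, 1)), np.eye(2)
    true_pos, true_vel, accel = 0.0, 0.0, 0.2
    for k in range(100):
        true_pos += true_vel * dt + 0.5 * accel * dt**2
        true_vel += accel * dt
        z = np.array([[true_pos + rng.normal(0.0, 2.0)]])  # noisy GPS fix
        x, P = kalman_step(x, P, accel, z)
    print("estimated [pos, vel]:", x.ravel(), "true:", (true_pos, true_vel))
    ```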

  14. Scalable Techniques for Formal Verification

    CERN Document Server

    Ray, Sandip

    2010-01-01

    This book presents state-of-the-art approaches to formal verification techniques that seamlessly integrate different formal verification methods within a single logical foundation. It should benefit researchers and practitioners looking to get a broad overview of the spectrum of formal verification techniques, as well as approaches to combining such techniques within a single framework. Coverage includes a range of case studies showing how such combination is fruitful in developing a scalable verification methodology for industrial designs. The book outlines both theoretical and practical issues.

  15. SYSTEM-COGNITIVE MODEL OF FORECASTING THE DEVELOPMENT OF DIVERSIFIED AGRO-INDUSTRIAL CORPORATIONS. PART II. SYNTHESIS AND MODEL VERIFICATION

    Directory of Open Access Journals (Sweden)

    Lutsenko Y. V.

    2015-11-01

    In this article, in accordance with the methodology of Automated System-Cognitive analysis (ASC-analysis), we examine the implementation of its 3rd stage: synthesis and verification of forecasting models of the development of diversified agro-industrial corporations. In this step we synthesize and verify 3 statistical and 7 system-cognitive models: ABS – matrix of absolute frequencies; PRC1 and PRC2 – matrices of the conditional and unconditional distributions; INF1 and INF2 – private criterion: the amount of knowledge according to A. Kharkevich; INF3 – private criterion: the chi-square test, i.e., the difference between the actual and the theoretically expected absolute frequencies; INF4 and INF5 – private criterion: ROI (Return On Investment); INF6 and INF7 – private criterion: the difference between conditional and unconditional probability (coefficient of relationship). The reliability of the created models was assessed with a proposed metric similar to the classical F-test, but one that does not assume a normal distribution, linearity of the modeled object, or independence and additivity of the acting factors. The accuracy of the obtained models was high enough to address the subsequent problems of identification, forecasting and decision making, as well as the study of the modeled object through its model, scheduled for consideration in future articles.
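
    One of the private criteria above (INF3) is simply Pearson's chi-square discrepancy between the actual and the theoretically expected absolute frequencies; a minimal sketch with illustrative counts:

    ```python
    import numpy as np
    from scipy.stats import chisquare

    # Observed absolute frequencies vs. the frequencies the model expects
    # (toy numbers; the two sets must share the same total).
    observed = np.array([18, 25, 12, 45])
    expected = np.array([20, 20, 20, 40])

    stat, p = chisquare(f_obs=observed, f_exp=expected)
    print(f"chi-square = {stat:.2f}, p = {p:.3f}")
    # Equivalent to ((observed - expected)**2 / expected).sum().
    ```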

  16. Case study of verification, validation, and testing in the Automated Data Processing (ADP) system development life cycle

    Energy Technology Data Exchange (ETDEWEB)

    Riemer, C.A.

    1990-05-01

    Staff of the Environmental Assessment and Information Sciences Division of Argonne National Laboratory (ANL) studied the role played by the organizational participants in the Department of Veterans Affairs (VA) that conduct verification, validation, and testing (VV&T) activities at various stages in the automated data processing (ADP) system development life cycle (SDLC). A case-study methodology was used to assess the effectiveness of VV&T activities (tasks) and products (inputs and outputs). The case selected for the study was a project designed to interface the compensation and pension (C&P) benefits systems with the centralized accounts receivable system (CARS). Argonne developed an organizational SDLC VV&T model and checklists to help collect information from C&P/CARS participants on VV&T procedures and activities, and these were then evaluated against VV&T standards.

  17. DEVELOPMENT AND TESTING OF A CERIA-ZIRCONIA TOUGHENED ALUMINA PROTOTYPE FILTER ELEMENT MADE OF RETICULATED CERAMIC FOAM COATED WITH A CERAMIC MEMBRANE ACTING AS BARRIER FILTER FOR FLY ASH

    Energy Technology Data Exchange (ETDEWEB)

    Guilio A. Rossi; Kenneth R. Butcher; Stacia M. Wagner

    1999-02-19

    The objective of this work was to fabricate subscale candle filters using a Ce-ZTA reticulated foam material. Specifically, Selee fabricated 60 mm diameter cylinders with one closed end and one flanged end. Selee Corporation developed a small pore size (5-10 μm) filtration membrane which was applied to the reticulated foam surface to provide a barrier filter surface. The specific tasks to be performed were as follows: (Task 1) Filter Element Development--To fabricate subscale filter elements from zirconia toughened alumina using the reticulated foam manufacturing process. The filter elements were required to meet dimensional tolerances specified by an appropriate filter system supplier. The subscale filter elements were fabricated with integral flanges and end caps, that is, with no glued joints. (Task 2) Membrane Development--To develop a small pore filtration membrane that is to be applied to the reticulated foam material. This membrane was to provide filtration characteristics that meet gas turbine requirements and pressure drop or permeability requirements specified by the filter system supplier. (Task 3) Subscale Filter Element Fabrication--To fabricate six subscale filter elements with integral flanges and closed ends, as well as fine pore size filtration membranes. Three filters were to have a central clean gas channel, while three would have no central channel. The filters were to be provided to FETC for testing in laboratory systems or pilot scale exposure systems as appropriate. The candles were to meet dimensional tolerances as provided by filter system suppliers.

  18. KAERI software verification and validation guideline for developing safety-critical software in digital I and C system of NPP

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Jang Yeol; Lee, Jang Soo; Eom, Heung Seop

    1997-07-01

    This technical report presents a V&V guideline development methodology for safety-critical software in NPP safety systems. It presents the V&V guideline for the planning phase of the NPP safety system, together with critical safety items such as the independence philosophy, the software safety analysis concept, commercial off-the-shelf (COTS) software evaluation criteria, and the inter-relationships between the safety assurance organizations, drawing on the concepts of existing industrial standards, including IEEE Std-1012 and IEEE Std-1059. The report covers the scope of the V&V guideline; the guideline framework as part of the acceptance criteria; V&V activities and task entrance and exit criteria; review and audit; testing; QA records of V&V material and configuration management; production of the software verification and validation plan; and the safety-critical software V&V methodology. (author). 11 refs.

  19. Development of high-efficiency passive counters (HEPC) for the verification of large LEU samples

    Energy Technology Data Exchange (ETDEWEB)

    Peerani, P. [European Commission, DG-JRC, IPSC, Ispra (Italy)], E-mail: paolo.peerani@jrc.it; Canadell, V.; Garijo, J.; Jackson, K. [European Commission, DG-TREN/I, Nuclear Inspections (Luxembourg); Jaime, R.; Looman, M.; Ravazzani, A. [European Commission, DG-JRC, IPSC, Ispra (Italy); Schwalbach, P. [European Commission, DG-TREN/I, Nuclear Inspections (Luxembourg); Swinhoe, M. [Los Alamos National Laboratory, Los Alamos, NM (United States)

    2009-04-01

    A paper describing the conceptual idea of using passive neutron assay for the verification of large size uranium samples in fuel fabrication plants was first presented at the 2001 ESARDA conference. The advantages of this technique, as a replacement of active interrogation using the PHOto-Neutron Interrogation Device (PHONID) device, were evident provided that a suitable detector with higher efficiency than those commercially available would be realised. The previous paper also included a feasibility study based on the experimental data. To implement this technique, a high-efficiency passive counter (HEPC) has been designed by the JRC, Ispra. JRC has also built a first smaller-scale prototype. This paper will describe the tests made in the PERLA laboratory and report the performance of the prototype. In parallel, the design of the large HEPC has been finalised for Euratom safeguards. Two units for the fuel fabrication plants in Dessel (B) and Juzbado (E) have been produced by a commercial manufacturer under JRC specifications. The two detectors have been installed in the two sites in summer 2004 after an extensive test campaign in PERLA. Since then they are in use and some feedback on the experience gained is reported at the end of this paper.

  20. General-Purpose Heat Source development: Safety Verification Test Program. Bullet/fragment test series

    Energy Technology Data Exchange (ETDEWEB)

    George, T.G.; Tate, R.E.; Axler, K.M.

    1985-05-01

    The radioisotope thermoelectric generator (RTG) that will provide power for space missions contains 18 General-Purpose Heat Source (GPHS) modules. Each module contains four ²³⁸PuO₂-fueled clads and generates 250 W(t). Because a launch-pad or post-launch explosion is always possible, we need to determine the ability of GPHS fueled clads within a module to survive fragment impact. The bullet/fragment test series, part of the Safety Verification Test Plan, was designed to provide information on clad response to impact by a compact, high-energy, aluminum-alloy fragment and to establish a threshold value of fragment energy required to breach the iridium cladding. Test results show that a velocity of 555 m/s (1820 ft/s) with an 18-g bullet is at or near the threshold value of fragment velocity that will cause a clad breach. Results also show that an exothermic Ir/Al reaction occurs if aluminum and hot iridium are in contact, a contact that is possible and most damaging to the clad within a narrow velocity range. The observed reactions between the iridium and the aluminum were studied in the laboratory and are reported in the Appendix.

  1. Development and verification of a screening model for surface spreading of petroleum

    Science.gov (United States)

    Hussein, Maged; Jin, Minghui; Weaver, James W.

    2002-08-01

    Overflows and leakage from aboveground storage tanks and pipelines carrying crude oil and petroleum products occur frequently. The spilled hydrocarbons pose environmental threats by contaminating the surrounding soil and the underlying ground water. Predicting the fate and transport of these chemicals is required for environmental risk assessment and for remedial measure design. The present paper discusses the formulation and application of the Oil Surface Flow Screening Model (OILSFSM) for predicting the surface flow of oil by taking into account infiltration and evaporation. Surface flow is simulated using a semi-analytical model based on the lubrication theory approximation of viscous flow. Infiltration is simulated using a version of the Green and Ampt infiltration model, which is modified to account for oil properties. Evaporation of volatile compounds is simulated using a compositional model that accounts for the changes in the fraction of each compound in the spilled oil. The coupling between surface flow, infiltration and evaporation is achieved by incorporating the infiltration and evaporation fluxes into the global continuity equation of the spilled oil. The model was verified against numerical models for infiltration and analytical models for surface flow. The verification study demonstrates the applicability of the model.
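
    The infiltration component is based on the Green and Ampt model; as a generic illustration (the classical water form, not the paper's oil-modified version), the sketch below time-steps the Green-Ampt relation f = K(1 + psi*dtheta/F), where f is the infiltration rate and F the cumulative infiltration. All parameter values are illustrative.

    ```python
    # Classical Green-Ampt infiltration with explicit time stepping.
    K = 1.0e-6    # saturated hydraulic conductivity (m/s), illustrative
    psi = 0.11    # wetting-front suction head (m), illustrative
    dtheta = 0.3  # soil moisture deficit (-), illustrative

    dt = 10.0     # time step (s)
    F = 1e-4      # small initial cumulative depth (m) to avoid division by zero
    for step in range(360):                  # integrate over one hour
        f = K * (1.0 + psi * dtheta / F)     # infiltration rate (m/s)
        F += f * dt
    print(f"cumulative infiltration after 1 h: {F * 1000:.3f} mm")
    ```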

  2. Development and Implementation of Dynamic Scripts to Support Local Model Verification at National Weather Service Weather Forecast Offices

    Science.gov (United States)

    Zavodsky, Bradley; Case, Jonathan L.; Gotway, John H.; White, Kristopher; Medlin, Jeffrey; Wood, Lance; Radell, Dave

    2014-01-01

    Local modeling with a customized configuration is conducted at National Weather Service (NWS) Weather Forecast Offices (WFOs) to produce high-resolution numerical forecasts that can better simulate local weather phenomena and complement larger-scale global and regional models. The advent of the Environmental Modeling System (EMS), which provides a pre-compiled version of the Weather Research and Forecasting (WRF) model and wrapper Perl scripts, has enabled forecasters to easily configure and execute the WRF model on local workstations. NWS WFOs often use EMS output to help in forecasting highly localized, mesoscale features such as convective initiation, the timing and inland extent of lake effect snow bands, lake and sea breezes, and topographically-modified winds. However, quantitatively evaluating model performance to determine errors and biases still proves to be one of the challenges in running a local model. Developed at the National Center for Atmospheric Research (NCAR), the Model Evaluation Tools (MET) verification software makes performing these types of quantitative analyses easier, but operational forecasters do not generally have time to familiarize themselves with navigating the sometimes complex configurations associated with the MET tools. To assist forecasters in running a subset of MET programs and capabilities, the Short-term Prediction Research and Transition (SPoRT) Center has developed and transitioned a set of dynamic, easily configurable Perl scripts to collaborating NWS WFOs. The objective of these scripts is to provide SPoRT collaborating partners in the NWS with the ability to evaluate the skill of their local EMS model runs in near real time with little prior knowledge of the MET package. The ultimate goal is to make these verification scripts available to the broader NWS community in a future version of the EMS software. This paper provides an overview of the SPoRT MET scripts, instructions for how the scripts are run, and example use cases.

  3. Open verification methodology cookbook

    CERN Document Server

    Glasser, Mark

    2009-01-01

    Functional verification is an art as much as a science. It requires not only creativity and cunning, but also a clear methodology to approach the problem. The Open Verification Methodology (OVM) is a leading-edge methodology for verifying designs at multiple levels of abstraction. It brings together ideas from electrical, systems, and software engineering to provide a complete methodology for verifying large scale System-on-Chip (SoC) designs. OVM defines an approach for developing testbench architectures so they are modular, configurable, and reusable. This book is designed to help both novic

  4. Development and verification of a real-time stochastic precipitation nowcasting system for urban hydrology in Belgium

    Directory of Open Access Journals (Sweden)

    L. Foresti

    2015-07-01

    The Short-Term Ensemble Prediction System (STEPS) is implemented in real time at the Royal Meteorological Institute (RMI) of Belgium. The main idea behind STEPS is to quantify the forecast uncertainty by adding stochastic perturbations to the deterministic Lagrangian extrapolation of radar images. The stochastic perturbations are designed to account for the unpredictable precipitation growth and decay processes and to reproduce the dynamic scaling of precipitation fields, i.e. the observation that large-scale rainfall structures are more persistent and predictable than small-scale convective cells. This paper presents the development, adaptation and verification of the system STEPS for Belgium (STEPS-BE). STEPS-BE provides, in real time, 20-member ensemble precipitation nowcasts at 1 km and 5 min resolution up to 2 h lead time, using a composite of 4 C-band radars as input. In the context of the PLURISK project, STEPS forecasts were generated to be used as input in sewer-system hydraulic models for nowcasting urban inundations in the cities of Ghent and Leuven. Comprehensive forecast verification was performed in order to detect systematic biases over the given urban areas and to analyze the reliability of probabilistic forecasts for a set of case studies in 2013 and 2014. The forecast biases over the cities of Leuven and Ghent were found to be small, which is encouraging for future integration of STEPS nowcasts into the hydraulic models. Probabilistic forecasts of exceeding 0.5 mm h-1 are reliable up to 60-90 min lead time, while those of exceeding 5.0 mm h-1 are only reliable up to 30 min. The STEPS ensembles are slightly under-dispersive and represent only 80-90% of the forecast errors.
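
    Reliability of probabilistic forecasts of this kind is typically checked by binning the ensemble-derived exceedance probabilities and comparing each bin with the observed event frequency (a reliability diagram). A minimal sketch with synthetic data, not RMI's verification code:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    # Synthetic verification set: 20-member ensemble exceedance probabilities
    # and binary observations of exceeding some rain-rate threshold.
    n = 5000
    p_true = rng.beta(0.8, 2.0, n)                  # hidden event probabilities
    members = rng.random((n, 20)) < p_true[:, None]
    p_fcst = members.mean(axis=1)                   # forecast probability
    obs = rng.random(n) < p_true                    # what actually happened

    # Reliability: observed relative frequency per forecast-probability bin.
    bins = np.linspace(0.0, 1.0, 11)
    idx = np.clip(np.digitize(p_fcst, bins) - 1, 0, 9)
    for b in range(10):
        sel = idx == b
        if sel.any():
            print(f"fcst {bins[b]:.1f}-{bins[b + 1]:.1f}: "
                  f"obs freq {obs[sel].mean():.2f} (n={sel.sum()})")
    ```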

  5. DEVELOPMENT OF A NEW RECURSIVE MEDIAN FILTERING SCHEME FOR PROCESSING POTENTIOMETER SIGNAL IN CSRDM OF PFBR

    Directory of Open Access Journals (Sweden)

    M. Mohana

    2012-07-01

    The Prototype Fast Breeder Reactor (PFBR), which is at an advanced stage of construction at Kalpakkam, has two shutdown systems: Control & Safety Rod Drive Mechanisms (CSRDM) and Diverse Safety Rod Drive Mechanisms (DSRDM). Since the response time of the electromagnet (EM) in the CSRDM is an important safety parameter, it is measured and monitored periodically. In PFBR, the EM response time is measured by the Current Decay method. An alternative is the Displacement method, wherein it is proposed to utilise the existing potentiometer in the CSRDM, provided for position monitoring, to measure the EM response time. A study was therefore carried out to probe the feasibility of its implementation. To aid the study, a new recursive median filtering scheme, termed the 'Multipass recursive median filter with variable window', was developed to process the unconditioned potentiometer signal in the Displacement method. The EM response times obtained from the Displacement method and the Current Decay method were found to be in good agreement with each other. This paper details the new recursive median filtering scheme developed.
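
    For grounding, a recursive median filter differs from the standard one in that already-filtered samples re-enter the sliding window, which removes impulse noise more aggressively; the paper's scheme layers multiple passes and a variable window on top of this idea. A minimal single-pass, fixed-window sketch:

    ```python
    import numpy as np

    def recursive_median(x, window=5):
        """Recursive median filter: the window mixes already-filtered past
        samples with raw current/future samples, suppressing impulses more
        aggressively than a standard (non-recursive) median filter."""
        x = np.asarray(x, dtype=float)
        y = x.copy()
        half = window // 2
        for n in range(half, len(x) - half):
            past = y[n - half:n]         # filtered history (recursive part)
            future = x[n:n + half + 1]   # raw current and future samples
            y[n] = np.median(np.concatenate((past, future)))
        return y

    # Example: a ramp, as from a wiper potentiometer, with impulse artifacts.
    sig = np.linspace(0.0, 1.0, 50)
    sig[[10, 11, 30]] += 5.0
    residual = recursive_median(sig) - np.linspace(0.0, 1.0, 50)
    print(f"max deviation after filtering: {np.abs(residual).max():.3f}")
    ```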

  6. Slow-sand water filter: design, implementation, accessibility and sustainability in developing countries.

    Science.gov (United States)

    Clark, Peter A; Pinedo, Catalina Arango; Fadus, Matthew; Capuzzi, Stephen

    2012-07-01

    The need for clean water has risen exponentially over the globe. Millions of people are affected daily by a lack of clean water, especially women and children, as much of their day is dedicated to collecting water. The global water crisis not only has severe medical implications, but social, political, and economic consequences as well. The Institute of Catholic Bioethics at Saint Joseph's University has recognized this, and has designed a slow-sand water filter that is accessible, cost-effective, and sustainable. Through the implementation of the Institute's slow-sand water filter and the utilization of microfinancing services, developing countries will not only have access to clean, drinkable water, but will also have the opportunity to break out of a devastating cycle of poverty.

  7. Toward polarized antiprotons: Machine development for spin-filtering experiments at COSY

    CERN Document Server

    Weidemann, C; Stein, H J; Lorentz, B; Bagdasarian, Z; Barion, L; Barsov, S; Bechstedt, U; Bertelli, S; Chiladze, D; Ciullo, G; Contalbrigo, M; Dymov, S; Engels, R; Gaisser, M; Gebel, R; Goslawski, P; Grigoriev, K; Guidoboni, G; Kacharava, A; Kamerdzhiev, V; Khoukaz, A; Kulikov, A; Lehrach, A; Lenisa, P; Lomidze, N; Macharashvili, G; Maier, R; Martin, S; Mchedlishvili, D; Meyer, H O; Merzliakov, S; Mielke, M; Mikirtychiants, M; Mikirtychiants, S; Nass, A; Nikolaev, N N; Oellers, D; Papenbrock, M; Pesce, A; Prasuhn, D; Retzlaff, M; Schleichert, R; Schröer, D; Seyfarth, H; Soltner, H; Statera, M; Steffens, E; Stockhorst, H; Ströher, H; Tabidze, M; Tagliente, G; Engblom, P Thörngren; Trusov, S; Valdau, Yu; Vasiliev, A; Wüstner, P

    2014-01-01

    The paper describes the commissioning of the experimental equipment and the machine studies required for the first spin-filtering experiment with protons at a beam kinetic energy of $49.3\,$MeV in COSY. The implementation of a low-$\beta$ insertion made it possible to achieve beam lifetimes of $\tau_{\rm b}=8000\,$s in the presence of a dense polarized hydrogen storage-cell target of areal density $d_{\rm t}=(5.5\pm 0.2)\times 10^{13}\,\mathrm{atoms/cm^{2}}$. The developed techniques can be directly applied to antiproton machines and allow for the determination of the spin-dependent $\bar{p}p$ cross sections via spin filtering.

  8. Development of neck filters for reducing artifact in cervical bone SPECT

    Energy Technology Data Exchange (ETDEWEB)

    Kikkawa, Nobutada; Kimura, Shigeo [Kyoto Minami Hospital (Japan)]

    2001-01-01

    In cervical bone scintillation SPECT studies using the triple-energy-window (TEW) scatter correction and ordered-subsets expectation maximization (OS-EM) methods, we have observed an artifact that may interfere with evaluation of the image: higher apparent accumulation in the cervical vertebrae compared with the head and thoracic vertebrae. As the neck is smaller in diameter than the thorax and head, gamma-ray absorption is lower; in addition, as the distance between the neck and the detector is greater, scattered gamma rays increase, degrading the image and causing the artifact. To overcome these problems, we developed special absorbers (neck filters) to make the effective attenuation of the neck comparable to that of the head and thorax, and employed these neck filters in bone scintillation SPECT studies in combination with TEW scatter correction and the OS-EM method. Our results showed that the artifact was significantly reduced and satisfactory images were obtained. (author)
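
    For reference, the TEW method estimates the scatter counts inside the photopeak window from two narrow sub-windows flanking it and subtracts that estimate; a minimal per-pixel sketch of the standard trapezoidal TEW formula (the window widths are illustrative, not the paper's acquisition settings):

    ```python
    import numpy as np

    def tew_scatter_correct(main, lower, upper, w_main=20.0, w_sub=7.0):
        """Triple-energy-window scatter correction, per pixel: scatter in
        the main window is estimated trapezoidally from the two flanking
        sub-windows and subtracted from the photopeak counts."""
        scatter = (lower / w_sub + upper / w_sub) * w_main / 2.0
        return np.clip(main - scatter, 0.0, None)  # no negative counts

    # Toy projection pixels: photopeak, lower and upper sub-window counts.
    main = np.array([120.0, 80.0, 200.0])
    lower = np.array([14.0, 7.0, 21.0])
    upper = np.array([3.5, 2.1, 4.9])
    print(tew_scatter_correct(main, lower, upper))
    ```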

  9. Further development of the cleanable steel HEPA filter, cost/benefit analysis, and comparison with competing technologies

    Energy Technology Data Exchange (ETDEWEB)

    Bergman, W.; Lopez, R.; Wilson, K. [Lawrence Livermore National Lab., CA (United States)] [and others

    1997-08-01

    We have made further progress in developing a cleanable steel fiber HEPA filter. We fabricated a pleated cylindrical cartridge using commercially available steel fiber media that is made with 1 μm stainless steel fibers and sintered into sheet form. Test results at the Department of Energy (DOE) Filter Test Station at Oak Ridge show the prototype filter cartridge has 99.99% efficiency for 0.3 μm dioctyl phthalate (DOP) aerosols and a pressure drop of 1.5 inches. Filter loading and cleaning tests using AC Fine dust showed the filter could be repeatedly cleaned using reverse air pulses. Our analysis of commercially optimized filters suggests that cleanable steel HEPA filters need to be made from steel fibers less than 1 μm, and preferably 0.5 μm, to meet the standard HEPA filter requirements in production units. We have demonstrated that 0.5 μm steel fibers can be produced using the fiber bundling and drawing process. The 0.5 μm steel fibers are then sintered into small filter samples and tested for efficiency and pressure drop. Test results on the sample showed a penetration of 0.0015% at 0.3 μm and a pressure drop of 1.15 inches at 6.9 ft/min (3.5 cm/s) velocity. Based on these results, steel fiber media can easily meet the requirements of 0.03% penetration and 1.0 inch of pressure drop by using fewer fibers in the media. A cost analysis of the cleanable steel HEPA filter shows that, although the steel HEPA filter costs much more than the standard glass fiber HEPA filter, it has the potential to be very cost effective because of the high disposal costs of contaminated HEPA filters. We estimate that the steel HEPA filter will save an average of $16,000 over its 30-year life. The additional savings from avoided clean-up costs following glass HEPA filter ruptures during accidents was not included but makes the steel HEPA filter even more cost effective. 33 refs., 28 figs., 1 tab.

  10. Development of synthetic and natural mineral based adsorptive and filter media containing cyclodextrin moieties

    Science.gov (United States)

    Andersen, E.; Rácz, I.; Erös, A.; Bánhegyi, Gy; Fenyvesi, É.; Takács, E.

    2013-12-01

    Adsorptive filter media were developed based on UHMWPE (ultra-high-molecular-weight polyethylene), perlite mineral, and sol-gel synthesized silica gel as supports, and various cyclodextrin oligomers and polymers as active adsorbents. Adsorptive capacity was characterized by dye adsorption before and after Soxhlet extraction in water to check the hydrolytic stability of the structures obtained. Morphological and, in some cases, spectroscopic studies were made to understand the differences in behaviour. At the present stage the development of such structures hardly goes beyond a trial-and-error approach; nevertheless, some promising formulations were found.

  11. Verification and validation benchmarks.

    Energy Technology Data Exchange (ETDEWEB)

    Oberkampf, William Louis; Trucano, Timothy Guy

    2007-02-01

    Verification and validation (V&V) are the primary means to assess the accuracy and reliability of computational simulations. V&V methods and procedures have fundamentally improved the credibility of simulations in several high-consequence fields, such as nuclear reactor safety, underground nuclear waste storage, and nuclear weapon safety. Although the terminology is not uniform across engineering disciplines, code verification deals with assessing the reliability of the software coding, and solution verification deals with assessing the numerical accuracy of the solution to a computational model. Validation addresses the physics modeling accuracy of a computational simulation by comparing the computational results with experimental data. Code verification benchmarks and validation benchmarks have been constructed for a number of years in every field of computational simulation. However, no comprehensive guidelines have been proposed for the construction and use of V&V benchmarks. For example, the field of nuclear reactor safety has not focused on code verification benchmarks, but it has placed great emphasis on developing validation benchmarks. Many of these validation benchmarks are closely related to the operations of actual reactors at near-safety-critical conditions, as opposed to being more fundamental-physics benchmarks. This paper presents recommendations for the effective design and use of code verification benchmarks based on manufactured solutions, classical analytical solutions, and highly accurate numerical solutions. In addition, this paper presents recommendations for the design and use of validation benchmarks, highlighting the careful design of building-block experiments, the estimation of experimental measurement uncertainty for both inputs and outputs to the code, validation metrics, and the role of model calibration in validation. It is argued that the understanding of predictive capability of a computational model is built on the level of

  12. Vacuum-assisted resin transfer molding (VARTM) model development, verification, and process analysis

    Science.gov (United States)

    Sayre, Jay Randall

    2000-12-01

    Vacuum-Assisted Resin Transfer Molding (VARTM) processes are becoming promising technologies in the manufacturing of primary composite structures in the aircraft industry as well as infrastructure. A great deal of work still needs to be done on efforts to reduce the costly trial-and-error methods of VARTM processing that are currently in practice today. A computer simulation model of the VARTM process would provide a cost-effective tool in the manufacturing of composites utilizing this technique. Therefore, the objective of this research was to modify an existing three-dimensional, Resin Film Infusion (RFI)/Resin Transfer Molding (RTM) model to include VARTM simulation capabilities and to verify this model with the fabrication of aircraft structural composites. An additional objective was to use the VARTM model as a process analysis tool, where this tool would enable the user to configure the best process for manufacturing quality composites. Experimental verification of the model was performed by processing several flat composite panels. The parameters verified included flow front patterns and infiltration times. The flow front patterns were determined to be qualitatively accurate, while the simulated infiltration times overpredicted experimental times by 8 to 10%. Capillary and gravitational forces were incorporated into the existing RFI/RTM model in order to simulate VARTM processing physics more accurately. The theoretical capillary pressure showed the capability to reduce the simulated infiltration times by as much as 6%. Gravity, on the other hand, was found to be negligible for all cases. Finally, the VARTM model was used as a process analysis tool. This enabled the user to determine such important process constraints as the location and type of injection ports and the permeability and location of the high-permeable media. A process for a three-stiffener composite panel was proposed. This configuration evolved from the variation of the process

  13. Development of dose delivery verification by PET imaging of photonuclear reactions following high energy photon therapy

    Energy Technology Data Exchange (ETDEWEB)

    Janek, S [Medical Radiation Physics, Department of Oncology and Pathology, Karolinska Institutet and Stockholm University, Box 260, 171 76 Stockholm (Sweden); Svensson, R [Medical Radiation Physics, Department of Oncology and Pathology, Karolinska Institutet and Stockholm University, Box 260, 171 76 Stockholm (Sweden); Jonsson, C [Medical Radiation Physics, Department of Oncology and Pathology, Karolinska Institutet and Stockholm University, Box 260, 171 76 Stockholm (Sweden); Brahme, A [Medical Radiation Physics, Department of Oncology and Pathology, Karolinska Institutet and Stockholm University, Box 260, 171 76 Stockholm (Sweden)

    2006-11-21

    verification by means of PET imaging seems to be applicable provided that biological transport processes such as capillary blood flow containing mobile {sup 15}O and {sup 11}C in the activated tissue volume can be accounted for.

  14. Development of dose delivery verification by PET imaging of photonuclear reactions following high energy photon therapy

    Science.gov (United States)

    Janek, S.; Svensson, R.; Jonsson, C.; Brahme, A.

    2006-11-01

    A method for dose delivery monitoring after high energy photon therapy has been investigated based on positron emission tomography (PET). The technique is based on the activation of body tissues by high energy bremsstrahlung beams, preferably with energies well above 20 MeV, resulting primarily in 11C and 15O but also 13N, all positron-emitting radionuclides produced by photoneutron reactions in the nuclei of 12C, 16O and 14N. A PMMA phantom and animal tissue, a frozen hind leg of a pig, were irradiated to 10 Gy and the induced positron activity distributions were measured off-line in a PET camera a couple of minutes after irradiation. The accelerator used was a Racetrack Microtron at the Karolinska University Hospital using 50 MV scanned photon beams. From photonuclear cross-section data integrated over the 50 MV photon fluence spectrum the predicted PET signal was calculated and compared with experimental measurements. Since measured PET images change with time post irradiation, as a result of the different decay times of the radionuclides, the signals from activated 12C, 16O and 14N within the irradiated volume could be separated from each other. Most information is obtained from the carbon and oxygen radionuclides, which are the most abundant elements in soft tissue. The predicted and measured overall positron activities are almost equal (-3%) while the predicted activity originating from nitrogen is overestimated by almost a factor of two, possibly due to experimental noise. Based on the results obtained in this first feasibility study, the great value of a combined radiotherapy-PET-CT unit is indicated in order to fully exploit the high activity signal from oxygen immediately after treatment and to avoid patient repositioning. With an RT-PET-CT unit a high signal could be collected even at a dose level of 2 Gy and the acquisition time for the PET could be reduced considerably. Real patient dose delivery verification by means of PET imaging seems to be applicable provided that biological transport processes such as capillary blood flow containing mobile 15O and 11C in the activated tissue volume can be accounted for.
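
    The nuclide separation exploits the very different half-lives involved (15O about 2.0 min, 11C about 20.4 min, 13N about 10.0 min); the sketch below fits a two-component exponential decay to a synthetic post-irradiation count-rate curve to recover the oxygen and carbon contributions. This illustrates the principle only, not the paper's analysis.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    # Decay constants (1/min) from the half-lives of O-15 and C-11.
    LAM_O15 = np.log(2) / 2.04
    LAM_C11 = np.log(2) / 20.4

    def two_component(t, a_o15, a_c11):
        """Count rate from a mixture of O-15 and C-11 activity."""
        return a_o15 * np.exp(-LAM_O15 * t) + a_c11 * np.exp(-LAM_C11 * t)

    # Synthetic count-rate curve over 40 min post irradiation.
    rng = np.random.default_rng(2)
    t = np.linspace(0.5, 40.0, 80)
    counts = two_component(t, 900.0, 300.0) + rng.normal(0.0, 10.0, t.size)

    (a_o15, a_c11), _ = curve_fit(two_component, t, counts, p0=(500.0, 500.0))
    print(f"fitted initial rates: O-15 ~ {a_o15:.0f}, C-11 ~ {a_c11:.0f}")
    ```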

  15. Development of biomass in a drinking water granular active carbon (GAC) filter.

    Science.gov (United States)

    Velten, Silvana; Boller, Markus; Köster, Oliver; Helbing, Jakob; Weilenmann, Hans-Ulrich; Hammes, Frederik

    2011-12-01

    Indigenous bacteria are essential for the performance of drinking water biofilters, yet this biological component remains poorly characterized. In the present study we followed biofilm formation and development in a granular activated carbon (GAC) filter on pilot scale during the first six months of operation. GAC particles were sampled from four different depths (10, 45, 80 and 115 cm) and attached biomass was measured with adenosine tri-phosphate (ATP) analysis. The attached biomass accumulated rapidly on the GAC particles throughout all levels in the filter during the first 90 days of operation and maintained a steady state afterward. Vertical gradients of biomass density and growth rates were observed during start-up and also in steady state. During steady state, biomass concentrations ranged between 0.8-1.83 × 10⁻⁶ g ATP/g GAC in the filter, and 22% of the influent dissolved organic carbon (DOC) was removed. Concomitant biomass production was about 1.8 × 10¹² cells/m²·h, which represents a yield of 1.26 × 10⁶ cells/μg. The bacteria assimilated only about 3% of the removed carbon as biomass. At one point during the operational period, a natural 5-fold increase in the influent phytoplankton concentration occurred. As a result, influent assimilable organic carbon concentrations increased and suspended bacteria in the filter effluent increased 3-fold as the direct consequence of increased growth in the biofilter. This study shows that the combination of different analytical methods allows detailed quantification of the microbiological activity in drinking water biofilters.

  16. Genomic analyses with biofilter 2.0: knowledge driven filtering, annotation, and model development.

    Science.gov (United States)

    Pendergrass, Sarah A; Frase, Alex; Wallace, John; Wolfe, Daniel; Katiyar, Neerja; Moore, Carrie; Ritchie, Marylyn D

    2013-12-30

    tool that provides a flexible way to use the ever-expanding expert biological knowledge that exists to direct filtering, annotation, and complex predictive model development for elucidating the etiology of complex phenotypic outcomes.

  17. Development and Verification of the Tire/Road Friction Estimation Algorithm for Antilock Braking System

    Directory of Open Access Journals (Sweden)

    Jian Zhao

    2014-01-01

    Road friction information is very important for vehicle active braking control systems such as ABS, ASR, or ESP. It is difficult to estimate tire/road friction forces and the friction coefficient accurately because of system nonlinearity, parameter uncertainties, and signal noise. In this paper, a robust and effective tire/road friction estimation algorithm for ABS is proposed, and its performance is discussed through simulation and experiment. Tire forces were observed with a discrete Kalman filter, and the road friction coefficient was subsequently estimated by the recursive least squares method. The proposed algorithm was then analysed and verified by simulation and road tests. A sliding-mode-based ABS with smooth wheel-slip-ratio control, and a threshold-based ABS using pulse pressure control with significant fluctuations, were used for the simulation. Finally, road tests were carried out in both winter and summer with a car equipped with the same threshold-based ABS, and the algorithm was evaluated on different road surfaces. The results show that the proposed algorithm can identify variations in road conditions with considerable accuracy and response speed.
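
    The second estimation stage described here is a standard recursive least squares (RLS) update. As a minimal scalar illustration (not the paper's implementation), the sketch below estimates the friction coefficient mu from noisy friction-force samples F ≈ mu·Fz, with a forgetting factor so the estimate can track a sudden change of road surface:

    ```python
    import numpy as np

    rng = np.random.default_rng(3)

    lam = 0.98            # forgetting factor: lower values track changes faster
    mu_hat, P = 0.0, 1000.0

    mu_true = 0.85        # dry asphalt for the first half of the run...
    for k in range(400):
        if k == 200:
            mu_true = 0.3                          # ...then a wet/icy patch
        Fz = 4000.0 + 500.0 * np.sin(0.05 * k)     # varying normal load (N)
        F = mu_true * Fz + rng.normal(0.0, 50.0)   # observed friction force (N)
        # Scalar RLS update with regressor Fz.
        K = P * Fz / (lam + Fz * P * Fz)
        mu_hat += K * (F - Fz * mu_hat)
        P = (P - K * Fz * P) / lam
    print(f"estimated mu after the surface change: {mu_hat:.2f} (true {mu_true})")
    ```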

  18. A Runtime Verification System for Developing, Analyzing and Controlling Complex Safety-Critical Software Project

    Data.gov (United States)

    National Aeronautics and Space Administration — A comprehensive commercial-grade system for the development of safe parallel and serial programs is developed. The system has the ability to perform efficient...

  19. Practical mask inspection system with printability and pattern priority verification

    Science.gov (United States)

    Tsuchiya, Hideo; Ozaki, Fumio; Takahara, Kenichi; Inoue, Takafumi; Kikuiri, Nobutaka

    2011-05-01

    Through the four years of study in Association of Super-Advanced Electronics Technologies (ASET) on reducing mask manufacturing Turn Around Time (TAT) and cost, we have been able to establish a technology to improve the efficiency of the review process by applying a printability verification function that utilizes computational lithography simulations to analyze defects detected by a high-resolution mask inspection system. With the advent of Source-Mask Optimization (SMO) and other technologies that extend the life of existing optical lithography, it is becoming extremely difficult to judge a defect only by the shape of a mask pattern, while avoiding pseudo-defects. Thus, printability verification is indispensable for filtering out nuisance defects from high-resolution mask inspection results. When using computational lithography simulations to verify printability with high precision, the image captured by the inspection system must be prepared with extensive care. However, for practical applications, this preparation process needs to be simplified. In addition, utilizing Mask Data Rank (MDR) to vary the defect detection sensitivity according to the patterns is also useful for simultaneously inspecting minute patterns and avoiding pseudo-defects. Combining these two technologies, we believe practical mask inspection for next generation lithography is achievable. We have been improving the estimation accuracy of the printability verification function through discussion with several customers and evaluation of their masks. In this report, we will describe the progress of these practical mask verification functions developed through customers' evaluations.

  20. Qualifications of Candle Filters for Combined Cycle Combustion Applications

    Energy Technology Data Exchange (ETDEWEB)

    Tomasz Wiltowski

    2008-08-31

    The direct firing of coal produces particulate matter that has to be removed for environmental and process reasons. In order to increase the current advanced coal combustion processes, under the U.S. Department of Energy's auspices, Siemens Westinghouse Power Corporation (SWPC) has developed ceramic candle filters that can operate at high temperatures. The Coal Research Center of Southern Illinois University (SIUC), in collaboration with SWPC, developed a program for long-term filter testing at the SIUC Steam Plant followed by experiments using a single-filter reactor unit. The objectives of this program funded by the U.S. Department of Energy were to identify and demonstrate the stability of porous candle filter elements for use in high temperature atmospheric fluidized-bed combustion (AFBC) process applications. These verifications were accomplished through extended time slipstream testing of a candle filter array under AFBC conditions using SIUC's existing AFBC boiler. Temperature, mass flow rate, and differential pressure across the filter array were monitored for a duration of 45 days. After test exposure at SIUC, the filter elements were characterized using Scanning Electron Microscopy and BET surface area analyses. In addition, a single-filter reactor was built and utilized to study long term filter operation, the permeability exhibited by a filter element before and after the slipstream test, and the thermal shock resilience of a used filter by observing differential pressure changes upon rapid heating and cooling of the filter. The data acquired during the slipstream test and the post-test evaluations demonstrated the suitability of filter elements in advanced power generation applications.

  1. Water Usage and Availability in Bongo's Communities: Research Leading to the Development of an Indigenous Fluoride Filter

    Science.gov (United States)

    Friscia, J. M.; Epstein, B.; Cumberbatch, T.; Okuneff, A.

    2010-12-01

    Over the course of a six-week period in both 2009 and 2010, an investigation into the collection and usage of water was undertaken in the Bongo District of Ghana. The outcome of this research was to provide data for the design of a defluoridation filter for the groundwater that could be used either in the household or at the borehole. This filter would use laterite as the filter medium and would prevent the development of dental fluorosis, which is common in the District. In 2009, the focus was on denser, more centrally located communities, while the research in 2010 focused on communities in which people live further from water sources. The localities studied were in Namoo, Kuyelingo, Bongo Central, and Kadare. After an analysis of data collected in 2009 and a preliminary review of data collected in 2010, it has been determined that the different localities have different requirements for a filter design. Denser communities, including parts of Namoo, Kuyelingo, and Bongo Central, would benefit most from a filter installed directly at the borehole. This filter would not process all the water fetched, since less than half the water collected is ingested. In more remote communities, such as parts of Kuyelingo near the Vea Dam and Kadare, a household filter would be ideal. In these communities, when people live far from the borehole, they seek other sources, including river water, wells, rainwater, and dam water. Many of these sources are unsafe to drink without proper treatment. Therefore, a household filter that can filter the fluoride from borehole water (when the household does indeed fetch from the pump) and can filter bacteria and viruses from the other water sources would be most appropriate. The results from the water survey provide an overview of the water collection rate throughout the day, distance the water is carried to the individual households, and breakdown of water usage within the household - in both dense and remote communities.

  2. Reconfigurable system design and verification

    CERN Document Server

    Hsiung, Pao-Ann; Huang, Chun-Hsian

    2009-01-01

    Reconfigurable systems have pervaded nearly all fields of computation and will continue to do so for the foreseeable future. Reconfigurable System Design and Verification provides a compendium of design and verification techniques for reconfigurable systems, allowing you to quickly search for a technique and determine if it is appropriate to the task at hand. It bridges the gap between the need for reconfigurable computing education and the burgeoning development of numerous different techniques in the design and verification of reconfigurable systems in various application domains.

  3. Development of adaptive IIR filtered-e LMS algorithm for active noise control

    Institute of Scientific and Technical Information of China (English)

    SUN Xu; MENG Guang; TENG Pengxiao; CHEN Duanshi

    2003-01-01

    Compared to finite impulse response (FIR) filters, infinite impulse response (IIR) filters can match the system better with far fewer coefficients, so the computation load is reduced and the performance improves. Therefore, it is attractive to use IIR filters instead of FIR filters in active noise control (ANC). However, the filtered-U LMS (FULMS) algorithm, the IIR filter-based algorithm commonly used so far, cannot ensure global convergence. A new IIR filter-based adaptive algorithm, which can ensure global convergence with only a slight increase in computation load, is proposed in this paper. The new algorithm is called the filtered-e LMS algorithm, since its error signal needs to be filtered. Simulation results show that the FELMS algorithm performs better than the FULMS algorithm.
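
    For orientation, the sketch below shows the closely related classical filtered-x LMS loop with an FIR controller; the paper's FELMS instead filters the error signal and uses IIR weights, so this is an illustration of the algorithm family under assumed toy signal paths, not the authors' algorithm.

        import numpy as np

        rng = np.random.default_rng(0)
        n, taps, mu = 20000, 16, 0.005
        x = rng.normal(size=n)                 # reference noise signal
        p = np.array([0.9, 0.4, 0.2])          # assumed primary path P(z)
        s = np.array([0.8, 0.3])               # assumed secondary path S(z)
        d = np.convolve(x, p)[:n]              # disturbance at the error sensor

        w = np.zeros(taps)                     # adaptive FIR weights
        xbuf = np.zeros(taps)                  # reference history
        xfbuf = np.zeros(taps)                 # filtered-reference history
        ybuf = np.zeros(len(s))                # anti-noise output history
        err = np.zeros(n)
        for k in range(n):
            xbuf = np.roll(xbuf, 1); xbuf[0] = x[k]
            ybuf = np.roll(ybuf, 1); ybuf[0] = w @ xbuf   # controller output
            e = d[k] - s @ ybuf                # residual after the secondary path
            xfbuf = np.roll(xfbuf, 1); xfbuf[0] = s @ xbuf[:len(s)]
            w += mu * e * xfbuf                # filtered-x LMS weight update
            err[k] = e
        print(f"error power: start {np.mean(err[:500]**2):.3f}, end {np.mean(err[-500:]**2):.5f}")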

  4. Development and verification testing of automation and robotics for assembly of space structures

    Science.gov (United States)

    Rhodes, Marvin D.; Will, Ralph W.; Quach, Cuong C.

    1993-01-01

    A program was initiated within the past several years to develop operational procedures for automated assembly of truss structures suitable for large-aperture antennas. The assembly operations require the use of a robotic manipulator and are based on the principle of supervised autonomy to minimize crew resources. A hardware testbed was established to support development and evaluation testing. A brute-force automation approach was used to develop the baseline assembly hardware and software techniques. As the system matured and an operation was proven, upgrades were incorporated and assessed against the baseline test results. This paper summarizes the developmental phases of the program, the results of several assembly tests, the current status, and a series of proposed developments for additional hardware and software control capability. No problems that would preclude automated in-space assembly of truss structures have been encountered. The current system was developed at a breadboard level, and continued development at an enhanced level is warranted.

  5. [Influence of wearing long wavelength filter glasses on refractive development of children's hyperopia].

    Science.gov (United States)

    Huang, J; Yu, Z Q; Chu, R Y; Qian, Y S; Xu, Y; Wang, X Q

    2017-01-11

    Objective: To investigate the effect of wearing long wavelength filter glasses on the refractive development of children's hyperopia. Methods: Case control study. Seventeen 5- to 7-year-old children with high hyperopia from the optometry clinic of the Eye and ENT Hospital affiliated to Fudan University were enrolled. The experimental design was self-controlled between the right and left eye; 3 children were lost during the two-year observation period. All the children's hyperopic refraction was more than +6.00 D under cycloplegia with 1% atropine. The children were required to wear long wavelength filter glasses for 6 hours after waking up and conventional glasses for the rest of the time. Refraction, axial length, and red/green match point were tested before the intervention and 3, 6, 12, 18, and 24 months after the intervention. Results: After two years of intervention, hyperopia decreased, axial length increased, and the best corrected visual acuity increased in both the experimental and control eyes, but there was no statistically significant difference between the two groups at any time point. All children had normal color vision; compared to long-wavelength light, the hyperopic eyes were more sensitive to middle-wavelength light, with no significant difference between the two groups (red/green match points were 42.802±1.216 and 42.889±1.560, respectively). After wearing the long wavelength filter, the red/green match point was significantly decreased in the experimental group at the 6-month and 12-month time points (6 months: 0.995±0.543 vs. 0.104±0.143, t=3.04, P=0.005; 12 months: 1.096±0.392 vs. 0.17±0.248, t=2.725, P=0.008). The experimental eyes were more sensitive to long-wavelength light than the control eyes, but at later time points there was no significant difference between the two groups. Conclusion: Wearing long wavelength filter glasses for two years has no effect on the refractive development of children with high hyperopia, but it can cause short-term chromatic adaptation.

  6. A "Kane's Dynamics" Model for the Active Rack Isolation System Part Two: Nonlinear Model Development, Verification, and Simplification

    Science.gov (United States)

    Beech, G. S.; Hampton, R. D.; Rupert, J. K.

    2004-01-01

    Many microgravity space-science experiments require vibratory acceleration levels that are unachievable without active isolation. The Boeing Corporation's active rack isolation system (ARIS) employs a novel combination of magnetic actuation and mechanical linkages to address these isolation requirements on the International Space Station. Effective model-based vibration isolation requires: (1) an isolation device, (2) an adequate dynamic (i.e., mathematical) model of that isolator, and (3) a suitable, corresponding controller. This Technical Memorandum documents the validation of a high-fidelity dynamic model of ARIS. The verification of this dynamics model was achieved by utilizing two commercial off-the-shelf (COTS) software tools: Deneb's ENVISION(registered trademark) and Online Dynamics' Autolev(trademark). ENVISION is a robotics software package developed for the automotive industry that employs three-dimensional computer-aided design models to facilitate both forward and inverse kinematics analyses. Autolev is a DOS-based interpreter designed, in general, to solve vector-based mathematical problems and, specifically, to solve dynamics problems using Kane's method. The simplification of this model was achieved using the small-angle theorem for the joint angle of the ARIS actuators. This simplification has a profound effect on the overall complexity of the closed-form solution while yielding a closed-form solution easily employed using COTS control hardware.

  7. Development and performances of very long stripe-filters for a multispectral detector

    Science.gov (United States)

    Laubier, David; Gimenez, Thierry; Mercier-Ythier, Renaud

    2002-01-01

    The development in the past few years of all-mirror telescopes has opened a wide new range of possibilities to instrument designers, with features like high compactness and outstanding optical quality over wide fields of view. However, this design imposes specific constraints on the focal plane: it can no longer accommodate glass beamsplitters, and its size increases with the field of view. New CCD detectors with multiple long lines are well suited to this application, but require a new filter strategy. This paper will detail what ours was in the particular case of a 78-mm long, 4-channel CCD. The choice of the stripe-filter concept was made on the basis of a performance versus cost analysis. Two kinds of assemblies were retained at this stage. The components manufactured by SAGEM-REOSC PRODUCTS in an initial development phase showed good spectral performance with high rejection over a very wide range of wavelengths. Some topics like local defects and straylight needed specific work. The paper focuses on the impact of the defects on the performances and the way they have been dealt with, and on the straylight design strategy with the results obtained in the different cases. In particular, it shows how the detector's design can be partially driven by the straylight requirements.

  8. Development and Verification of a Fully Coupled Simulator for Offshore Wind Turbines: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Jonkman, J. M.; Buhl, M. L. Jr.

    2007-01-01

    This report outlines the development of an analysis tool capable of analyzing a variety of wind turbine, support platform, and mooring system configurations. The simulation capability was tested by model-to-model comparisons to ensure its correctness.

  9. Hybrid Adaptive Filter development for the minimisation of transient fluctuations superimposed on electrotelluric field recordings mainly by magnetic storms

    Directory of Open Access Journals (Sweden)

    A. Konstantaras

    2006-01-01

    Full Text Available The method of Hybrid Adaptive Filtering (HAF) aims to recover the recorded electric field signals from anomalies of magnetotelluric origin induced mainly by magnetic storms. An adaptive filter incorporating neuro-fuzzy technology has been developed to remove any significant distortions from the equivalent magnetic field signal, as retrieved from the original electric field signal by reversing the magnetotelluric method. Testing with further unseen data verifies the reliability of the model and demonstrates the effectiveness of the HAF method.

  10. CFD Modeling & Verification in an Aircraft Paint Hangar

    Science.gov (United States)

    2011-05-01

    Collaboration: the Navy Bureau of Medicine and Surgery (BUMED), IH Division, assists CNO with the health and safety of Navy aircraft artisans, including quarterly monitoring of ... levels and the handling of paint particulates and vapors. Verification pitfalls: artisans changed the process in the weeks between the baseline and ... verification; a fabric blanket was added in front of the filter to save the filter bank from blocking exhaust airflow during sanding; and, as a final lesson, learn how to go without sleep.

  11. User input verification and test driven development in the NJOY21 nuclear data processing code

    Energy Technology Data Exchange (ETDEWEB)

    Trainer, Amelia Jo [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Conlin, Jeremy Lloyd [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); McCartney, Austin Paul [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-08-21

    Before physically meaningful data can be used in nuclear simulation codes, the data must be interpreted and manipulated by a nuclear data processing code so as to extract the relevant quantities (e.g., cross sections and angular distributions). Perhaps the most popular and widely trusted of these processing codes is NJOY, which has been developed and improved over the course of 10 major releases since its creation at Los Alamos National Laboratory in the mid-1970s. The current phase of NJOY development is the creation of NJOY21, which will be a vast improvement over its predecessor, NJOY2016. Designed to be fast, intuitive, accessible, and capable of handling both established and modern formats of nuclear data, NJOY21 will address many issues that NJOY users face, while remaining functional for those who prefer the existing format. Although early in its development, NJOY21 already validates user input as it is read. By providing rapid and helpful responses to users while they write input files, NJOY21 will prove more intuitive and easier to use than any of its predecessors. Furthermore, during its development, NJOY21 is subject to regular testing, such that its test coverage must strictly increase with the addition of any production code. This thorough testing will allow developers and NJOY users to establish confidence in NJOY21 as it gains functionality. This document discusses the current state of input checking and testing practices in NJOY21.
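
    As a flavor of the input verification and test-driven style described above, a unit test might look like the following sketch; parse_card and its range check are hypothetical stand-ins, not actual NJOY21 interfaces.

        import pytest

        VALID_MAT = range(1, 10000)   # ENDF MAT numbers occupy the range 1-9999

        def parse_card(mat: int) -> int:
            """Reject an out-of-range material number with a helpful message."""
            if mat not in VALID_MAT:
                raise ValueError(f"MAT={mat} is outside the valid range 1-9999")
            return mat

        def test_valid_material_accepted():
            assert parse_card(9228) == 9228

        def test_invalid_material_rejected_with_message():
            with pytest.raises(ValueError, match="outside the valid range"):
                parse_card(-1)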

  12. Development, implementation, and verification of multicycle depletion perturbation theory for reactor burnup analysis

    Energy Technology Data Exchange (ETDEWEB)

    White, J.R.

    1980-08-01

    A generalized depletion perturbation formulation based on the quasi-static method for solving realistic multicycle reactor depletion problems is developed and implemented within the VENTURE/BURNER modular code system. The present development extends the original formulation derived by M.L. Williams to include nuclide discontinuities such as fuel shuffling and discharge. This theory is first described in detail with particular emphasis given to the similarity of the forward and adjoint quasi-static burnup equations. The specific algorithm and computational methods utilized to solve the adjoint problem within the newly developed DEPTH (Depletion Perturbation Theory) module are then briefly discussed. Finally, the main features and computational accuracy of this new method are illustrated through its application to several representative reactor depletion problems.
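
    The abstract does not reproduce the underlying mathematics; schematically, first-order depletion perturbation theory estimates the change in a response R due to a perturbation \delta M of the burnup operator by weighting it with an adjoint nuclide field N*, so that one adjoint solution serves many perturbations (a generic first-order form, not the paper's exact notation):

        \delta R \;\approx\; \int_{0}^{T} \left\langle N^{*}(t),\, \delta M(t)\, N(t) \right\rangle \,\mathrm{d}t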

  13. Development of an Innovative Laser Scanner for Geometrical Verification of Metallic and Plastic Parts

    DEFF Research Database (Denmark)

    Carmignato, Simone; De Chiffre, Leonardo; Fisker, Rune

    2008-01-01

    and plastic parts. A first prototype of the novel measuring system has been developed, using laser triangulation. The system, besides ensuring the automatic reconstruction of complete surface models, has been designed to guarantee user-friendliness, versatility, reliability and speed. The paper focuses mainly...... dimensional measurements with adequate accuracy for most industrial requirements....

  14. Development of Wien filter for small ion gun of surface analysis

    Energy Technology Data Exchange (ETDEWEB)

    Bahng, Jungbae [Department of Physics, Kyungpook National University, Daegu 702-701 (Korea, Republic of); Busan Center, Korea Basic Science Institute, Busan 609-735 (Korea, Republic of); Hong, Jonggi; Choi, Myoung Choul; Won, Mi-Sook; Lee, Byoung-Seob, E-mail: bslee@kbsi.re.kr [Busan Center, Korea Basic Science Institute, Busan 609-735 (Korea, Republic of)

    2016-02-15

    The gas cluster ion beam (GCIB) and liquid metal ion beam have been studied in the context of ion beam usage in analytical equipment for applications such as X-ray photoelectron spectroscopy and secondary ion mass spectroscopy (SIMS). In particular, small ion sources are used for secondary ion generation and ion etching. To set the context for this study, the SIMS project was launched to develop ion-gun-based analytical equipment for the Korea Basic Science Institute. The objective of the first stage of the project is the generation of argon beams with a GCIB system [A. Kirkpatrick, Nucl. Instrum. Methods Phys. Res., Sect. B 206, 830–837 (2003)] that consists of a nozzle, skimmer, ionizer, acceleration tube, separation system, transport system, and target. The Wien filter directs the selected cluster beam to the target system by exploiting the velocity differences of the particles generated by the GCIB. In this paper, we present the theoretical modeling and three-dimensional electromagnetic analysis of the Wien filter, which can separate Ar{sup +}{sub 2500} clusters from Ar{sup +}{sub 2400} to Ar{sup +}{sub 2600} clusters with a 1-mm collimator.
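
    As a rough numerical illustration of the selection principle (a crossed-field filter passes particles with v = E/B, while acceleration of a singly charged cluster through a potential V gives v = sqrt(2qV/m)), the sketch below computes the speeds of neighboring cluster sizes; the 30 kV and 0.1 T values are assumed for illustration, not taken from the paper.

        import math

        Q = 1.602176634e-19          # elementary charge, C
        AMU = 1.66053906660e-27      # atomic mass unit, kg
        V_ACC = 30e3                 # assumed acceleration voltage, V
        B = 0.1                      # assumed magnetic field, T

        for n in (2400, 2500, 2600):             # Ar_n+ cluster sizes
            m = n * 39.948 * AMU                 # cluster mass
            v = math.sqrt(2 * Q * V_ACC / m)     # speed after acceleration
            e_field = v * B                      # E field that lets this v pass
            print(f"Ar{n}+: v = {v / 1e3:6.2f} km/s, E = {e_field / 1e3:.2f} kV/m")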

  15. Temperature Modeling of Lost Creek Lake Using CE-QUAL-W2: A Report on the Development, Calibration, Verification, and Application of the Model

    Science.gov (United States)

    2017-05-01

    ERDC/EL TR-17-6. Temperature Modeling of Applegate Lake Using CE-QUAL-W2: A Report on the Development, Calibration, Verification, and Application of the Model. Environmental Laboratory. Tammy L. Threadgill, Daniel F. Turner, Laurie A. Nicholas, Barry W. Bunch, Dorothy H. Tillman, and David L. Smith. May 2017. Approved for public release; distribution is unlimited. U.S. Army Engineer Research and Development Center.

  16. Development and Verification of Unstructured Adaptive Mesh Technique with Edge Compatibility

    Science.gov (United States)

    Ito, Kei; Kunugi, Tomoaki; Ohshima, Hiroyuki

    In the design study of the large-sized sodium-cooled fast reactor (JSFR), one key issue is the suppression of gas entrainment (GE) phenomena at the gas-liquid interface. Therefore, the authors have developed a high-precision CFD algorithm to evaluate the GE phenomena accurately. The CFD algorithm has been developed on unstructured meshes to allow accurate modeling of the JSFR system. For two-phase interfacial flow simulations, a high-precision volume-of-fluid algorithm is employed. It was confirmed that the developed CFD algorithm could reproduce the GE phenomena in a simple GE experiment. Recently, the authors have developed an important technique for the simulation of the GE phenomena in JSFR: an unstructured adaptive mesh technique which can dynamically apply fine cells to the region where GE occurs. In this paper, as a part of this development, a two-dimensional unstructured adaptive mesh technique is discussed. In the two-dimensional technique, each cell is refined isotropically to reduce distortions of the mesh. In addition, connection cells are formed to eliminate the edge incompatibility between refined and non-refined cells. The two-dimensional unstructured adaptive mesh technique is verified by solving the well-known lid-driven cavity flow problem. As a result, the technique succeeds in providing a high-precision solution, even when a poor-quality, distorted initial mesh is employed. In addition, the simulation error on the two-dimensional unstructured adaptive mesh is much less than the error on a structured mesh with a larger number of cells.

  17. Security Policy Development: Towards a Life-Cycle and Logic-Based Verification Model

    Directory of Open Access Journals (Sweden)

    Luay A. Wahsheh

    2008-01-01

    Full Text Available Although security plays a major role in the design of software systems, security requirements and policies are usually added to an already existing system, not created in conjunction with the product. As a result, there are often numerous problems with the overall design. In this paper, we discuss the relationship between software engineering, security engineering, and policy engineering, and present a security policy life-cycle: an engineering methodology for policy development in high-assurance computer systems. The model provides system security managers with a procedural engineering process to develop security policies. We also present an executable Prolog-based model as a formal specification and knowledge representation method, using a theorem prover to verify system correctness with respect to security policies in their life-cycle stages.
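
    A minimal sketch of what an executable policy model checks, rendered here in Python rather than Prolog, with invented Bell-LaPadula-style confidentiality rules and labels:

        RULES = [
            # each rule maps (subject level, object level, operation) -> allowed?
            lambda s, o, op: op == "read" and s >= o,    # "no read up"
            lambda s, o, op: op == "write" and s <= o,   # "no write down"
        ]

        LEVELS = {"public": 0, "secret": 1, "top-secret": 2}

        def allowed(subject_level, object_level, op):
            s, o = LEVELS[subject_level], LEVELS[object_level]
            return any(rule(s, o, op) for rule in RULES)

        assert allowed("secret", "public", "read")          # read down: ok
        assert not allowed("secret", "top-secret", "read")  # read up: denied
        assert allowed("secret", "top-secret", "write")     # write up: ok
        assert not allowed("secret", "public", "write")     # write down: denied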

  18. Relative Navigation Light Detection and Ranging (LIDAR) Sensor Development Test Objective (DTO) Performance Verification

    Science.gov (United States)

    Dennehy, Cornelius J.

    2013-01-01

    The NASA Engineering and Safety Center (NESC) received a request from the NASA Associate Administrator (AA) for the Human Exploration and Operations Mission Directorate (HEOMD) to quantitatively evaluate the individual performance of three light detection and ranging (LIDAR) rendezvous sensors flown as orbiter development test objectives on Space Transportation System (STS)-127, STS-133, STS-134, and STS-135. This document contains the outcome of the NESC assessment.

  19. Simulink based behavioural modelling of a pulse oximeter for deployment in rapid development, prototyping and verification.

    Science.gov (United States)

    Shokouhian, M; Morling, R C S; Kale, I

    2012-01-01

    The pulse oximeter is a well-known device for measuring the level of oxygen in blood. Since their invention, pulse oximeters have been under constant development in both hardware and software; however, there are still unsolved problems that limit their performance [6], [7]. Many fresh algorithms and new design techniques are suggested every year by industry and academic researchers, who claim that they can improve measurement accuracy [8], [9]. With the lack of an accurate computer-based behavioural model for pulse oximeters, the only way to evaluate these newly developed systems and algorithms is through hardware implementation, which can be both expensive and time consuming. This paper presents an accurate Simulink-based behavioural model for a pulse oximeter that can be used by industry and academia alike as an exploration and productivity-enhancement tool during the research and development process. The aim of this paper is to introduce a new computer-based behavioural model which provides a simulation environment in which new ideas can be rapidly evaluated long before real implementation.
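
    A toy fragment of the kind of behaviour such a model captures: the standard "ratio of ratios" SpO2 estimate from the red and infrared photoplethysmogram channels. The linear calibration constants below are a commonly quoted textbook approximation, not the calibration of any particular device, and the waveforms are synthetic.

        import numpy as np

        def spo2_ratio_of_ratios(red, ir):
            """red, ir: 1-D PPG waveforms sampled at the same rate."""
            r = (np.ptp(red) / np.mean(red)) / (np.ptp(ir) / np.mean(ir))
            return 110.0 - 25.0 * r   # empirical linear calibration

        t = np.linspace(0, 10, 1000)
        pulse = 0.5 * (1 + np.sin(2 * np.pi * 1.2 * t))   # ~72 bpm pulsation
        red = 1.00 + 0.02 * pulse                          # AC/DC ~ 2% on red
        ir = 2.00 + 0.05 * pulse                           # AC/DC ~ 2.5% on infrared
        print(round(spo2_ratio_of_ratios(red, ir), 1))     # ~90 for this R ~ 0.8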

  1. The development of verification and validation technology for instrumentation and control in NPPs - A study on the software development methodology of a highly reliable software

    Energy Technology Data Exchange (ETDEWEB)

    Kwon, Yong Rae; Cha, Sung Deok; Lee, Woo Jin; Chae, Hong Seok; Yoon, Kwang Sik; Jeong, Ki Suk [Korea Advanced Institute of Science and Technology, Taejon (Korea, Republic of)

    1996-07-01

    Nuclear industries have tried to use digital I and C technology in developing advanced nuclear power plants. However, because the industries did not establish highly reliable software development methodologies and standards for developing highly reliable and safe software for digital I and C systems, they were confronted with difficulties in avoiding software common-mode failures. To mitigate these difficulties, highly reliable software development environments, methodologies, and validation and verification techniques should be the cornerstone of all digital implementation in nuclear power plants. The objective of this project is to establish a highly reliable software development methodology to support developing digital instrumentation and control systems in nuclear power plants. In this project, we have investigated business-oriented and real-time software development methods and techniques for ensuring the safety and reliability of the software. We have also studied standards related to licensing the software for digital I and C systems. 50 refs., 51 figs. (author)

  2. Swarm Verification

    Science.gov (United States)

    Holzmann, Gerard J.; Joshi, Rajeev; Groce, Alex

    2008-01-01

    Reportedly, supercomputer designer Seymour Cray once said that he would sooner use two strong oxen to plow a field than a thousand chickens. Although this is undoubtedly wise when it comes to plowing a field, it is not so clear for other types of tasks. Model checking problems are of the proverbial "search the needle in a haystack" type. Such problems can often be parallelized easily. Alas, none of the usual divide and conquer methods can be used to parallelize the working of a model checker. Given that it has become easier than ever to gain access to large numbers of computers to perform even routine tasks it is becoming more and more attractive to find alternate ways to use these resources to speed up model checking tasks. This paper describes one such method, called swarm verification.
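
    A minimal sketch of the swarm idea (not the authors' tool): launch many small, randomized searches of a state space in parallel, each with its own seed and budget, and stop as soon as any worker finds the target. The toy transition relation and target below are invented for illustration.

        import random
        from multiprocessing import Pool

        def successors(state):
            # Toy transition relation: three successors per state.
            return [(3 * state + 1) % 1_000_003,
                    (state + 7) % 1_000_003,
                    (2 * state) % 1_000_003]

        # Pick a target that is reachable by construction: 40 random steps.
        _rng = random.Random(123)
        GOAL = 0
        for _ in range(40):
            GOAL = _rng.choice(successors(GOAL))

        def worker(seed, budget=200_000):
            """Randomized DFS with a state budget; returns the seed and any hit."""
            rng = random.Random(seed)
            seen, stack = set(), [0]
            while stack and len(seen) < budget:
                state = stack.pop()
                if state in seen:
                    continue
                seen.add(state)
                if state == GOAL:
                    return seed, state
                succ = successors(state)
                rng.shuffle(succ)          # diversification: each worker differs
                stack.extend(succ)
            return seed, None

        if __name__ == "__main__":
            with Pool(8) as pool:
                for seed, hit in pool.imap_unordered(worker, range(64)):
                    if hit is not None:
                        print(f"worker {seed} reached target state {hit}")
                        pool.terminate()
                        break
                else:
                    print("no worker found the target within its budget")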

  3. Optics Design for the U.S. SKA Technology Development Project Design Verification Antenna

    Science.gov (United States)

    Imbriale, W. A.; Baker, L.; Cortes-Medellin, G.

    2012-01-01

    The U.S. design concept for the Square Kilometer Array (SKA) program is based on utilizing a large number of 15 meter dish antennas. The Technology Development Project (TDP) is planning to design and build the first of these antennas to provide a demonstration of the technology and a solid base on which to estimate costs. This paper describes the performance of the selected optics design. It is a dual-shaped offset Gregorian design with a feed indexer that can accommodate corrugated horns, wide band single pixel feeds or phased array feeds.

  4. Development and verification of the modified dynamic two-fluid model GOPS

    Science.gov (United States)

    Song, Chengyi; Li, Yuxing; Meng, Lan; Wang, Haiyan

    2013-07-01

    In the oil and gas industry, many software packages have been developed to calculate the flow parameters of multiphase flow. However, the existing software is not perfect. To improve accuracy, a new software package, GOPS, has been developed by the Daqing Oilfield Construction Design and Research Institute and the China University of Petroleum. GOPS modifies the general extended two-fluid model and considers a gas bubble phase in the liquid and a liquid droplet phase in the gas. The governing equations of GOPS comprise four continuity equations, two momentum equations, one mixture energy-conservation equation, and one pressure-conservation equation. These governing equations are combined with a flow pattern transition model and closure relationships for every flow pattern. In this way, GOPS can simulate the dynamic variation of multiphase flow. To verify GOPS, experiments were carried out at the Surface Engineering Pilot Test Center, CNPC. The experimental pressure gradients are compared with the results from GOPS, and the accuracy of GOPS is shown to be high.
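
    The abstract does not list the equations; for orientation, four-field models of this kind typically write one mass balance per field k (gas, liquid, droplets, bubbles) in the generic form below, with volume fractions summing to one and interfacial mass transfer summing to zero (an illustrative form, not GOPS's exact formulation):

        \frac{\partial (\alpha_k \rho_k)}{\partial t}
          + \nabla \cdot (\alpha_k \rho_k \mathbf{u}_k) = \Gamma_k,
        \qquad \sum_k \alpha_k = 1, \qquad \sum_k \Gamma_k = 0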

  5. Postures and Motions Library Development for Verification of Ground Crew Human Systems Integration Requirements

    Science.gov (United States)

    Jackson, Mariea Dunn; Dischinger, Charles; Stambolian, Damon; Henderson, Gena

    2012-01-01

    Spacecraft and launch vehicle ground processing activities require a variety of unique human activities. These activities are being documented in a primitive motion capture library. The library will be used by human factors engineers in the future to infuse real-to-life human activities into the CAD models to verify ground systems human factors requirements. As the primitive models are being developed for the library, the project has selected several current human factors issues to be addressed for the SLS and Orion launch systems. This paper explains how the motion capture of unique ground systems activities is being used to verify the human factors analysis requirements for ground systems used to process the SLS and Orion vehicles, and how the primitive models will be applied to future spacecraft and launch vehicle processing.

  6. Postures and Motions Library Development for Verification of Ground Crew Human Factors Requirements

    Science.gov (United States)

    Stambolian, Damon; Henderson, Gena; Jackson, Mariea Dunn; Dischinger, Charles

    2013-01-01

    Spacecraft and launch vehicle ground processing activities require a variety of unique human activities. These activities are being documented in a primitive motion capture library. The library will be used by human factors engineering analysts to infuse real-to-life human activities into the CAD models to verify ground systems human factors requirements. As the primitive models are being developed for the library, the project has selected several current human factors issues to be addressed for the Space Launch System (SLS) and Orion launch systems. This paper explains how the motion capture of unique ground systems activities is being used to verify the human factors engineering requirements for ground systems used to process the SLS and Orion vehicles, and how the primitive models will be applied to future spacecraft and launch vehicle processing.

  7. Development and Performance Verification of Fiber Optic Temperature Sensors in High Temperature Engine Environments

    Science.gov (United States)

    Adamovsky, Grigory; Mackey, Jeffrey R.; Kren, Lawrence A.; Floyd, Bertram M.; Elam, Kristie A.; Martinez, Martel

    2014-01-01

    A High Temperature Fiber Optic Sensor (HTFOS) has been developed at NASA Glenn Research Center for aircraft engine applications. After fabrication and preliminary in-house performance evaluation, the HTFOS was tested in an engine environment at NASA Armstrong Flight Research Center. The engine tests enabled the performance of the HTFOS in real engine environments to be evaluated, along with the ability of the sensor to respond to changes in the engine's operating condition. Data were collected prior to, during, and after each test in order to observe the change in temperature from ambient to each of the various test point levels. Sufficient data were collected and analyzed to satisfy the research team that the HTFOS operated properly while the engine was running. Temperature measurements made by the HTFOS while the engine was running agreed with those anticipated.

  8. Program Verification with Monadic Second-Order Logic & Languages for Web Service Development

    DEFF Research Database (Denmark)

    Møller, Anders

    development are areas of programming language research that have received increased attention during the last years. We first show how the logic Weak monadic Second-order Logic on Strings and Trees can be implemented efficiently despite an intractable theoretical worst-case complexity. Among several other......, such as maintaining session state and dynamically producing HTML or XML documents. By introducing explicit language-based mechanisms for those issues, we liberate the Web service programmer from the tedious and error-prone alternatives. Specialized program analyses aid the programmer by verifying at compile time......Domain-specific formal languages are an essential part of computer science, combining theory and practice. Such languages are characterized by being tailor-made for specific application domains and thereby providing expressiveness on high abstraction levels and allowing specialized analysis...

  9. The CoreGram project: theoretical linguistics, theory development and verification

    Directory of Open Access Journals (Sweden)

    Stefan Müller

    2015-06-01

    Full Text Available This paper describes the CoreGram project, a multilingual grammar engineering project that develops HPSG grammars for several typologically diverse languages that share a common core. The paper provides a general motivation for doing theoretical linguistics the way it is done in the CoreGram project and therefore is not targeted at computational linguists exclusively. I argue for a constraint-based approach to language rather than a generative-enumerative one and discuss issues of formalization. Recent advances in language acquisition research are mentioned and conclusions on how theories should be constructed are drawn. The paper discusses some of the highlights in the implemented grammars, gives a brief overview of central theoretical concepts and their implementation in TRALE and compares the CoreGram project with other multilingual grammar engineering projects.

  10. The development and verification of a highly accurate collision prediction model for automated noncoplanar plan delivery

    Energy Technology Data Exchange (ETDEWEB)

    Yu, Victoria Y.; Tran, Angelia; Nguyen, Dan; Cao, Minsong; Ruan, Dan; Low, Daniel A.; Sheng, Ke, E-mail: ksheng@mednet.ucla.edu [Department of Radiation Oncology, David Geffen School of Medicine, University of California Los Angeles, Los Angeles, California 90024 (United States)

    2015-11-15

    Purpose: Significant dosimetric benefits had been previously demonstrated in highly noncoplanar treatment plans. In this study, the authors developed and verified an individualized collision model for the purpose of delivering highly noncoplanar radiotherapy and tested the feasibility of total delivery automation with Varian TrueBeam developer mode. Methods: A hand-held 3D scanner was used to capture the surfaces of an anthropomorphic phantom and a human subject, which were positioned with a computer-aided design model of a TrueBeam machine to create a detailed virtual geometrical collision model. The collision model included gantry, collimator, and couch motion degrees of freedom. The accuracy of the 3D scanner was validated by scanning a rigid cubical phantom with known dimensions. The collision model was then validated by generating 300 linear accelerator orientations corresponding to 300 gantry-to-couch and gantry-to-phantom distances, and comparing the corresponding distance measurements to their corresponding models. The linear accelerator orientations reflected uniformly sampled noncoplanar beam angles to the head, lung, and prostate. The distance discrepancies between measurements on the physical and virtual systems were used to estimate treatment-site-specific safety buffer distances with 0.1%, 0.01%, and 0.001% probability of collision between the gantry and couch or phantom. Plans containing 20 noncoplanar beams to the brain, lung, and prostate optimized via an in-house noncoplanar radiotherapy platform were converted into XML script for automated delivery and the entire delivery was recorded and timed to demonstrate the feasibility of automated delivery. Results: The 3D scanner measured the dimension of the 14 cm cubic phantom within 0.5 mm. The maximal absolute discrepancy between machine and model measurements for gantry-to-couch and gantry-to-phantom was 0.95 and 2.97 cm, respectively.
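
    The core geometric check is easy to sketch: given point clouds for the machine and for the couch/patient surface, compute the minimum clearance and compare it with a site-specific safety buffer. The sketch below uses random stand-in point clouds and a hypothetical 3 cm buffer; it illustrates the check, not the authors' model.

        import numpy as np
        from scipy.spatial import cKDTree

        def min_clearance(gantry_pts, surface_pts):
            """Minimum distance (cm) between two 3-D point clouds."""
            tree = cKDTree(surface_pts)
            d, _ = tree.query(gantry_pts, k=1)
            return float(d.min())

        def orientation_is_safe(gantry_pts, surface_pts, buffer_cm=3.0):
            # buffer_cm: hypothetical safety margin covering model error
            return min_clearance(gantry_pts, surface_pts) > buffer_cm

        rng = np.random.default_rng(0)
        gantry = rng.uniform(-50, 50, size=(1000, 3))   # stand-in gantry surface
        couch = rng.uniform(60, 120, size=(1000, 3))    # stand-in couch/patient
        print(orientation_is_safe(gantry, couch))       # True: clearance > buffer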

  11. Development and Verification of MAAP5.0.3 Parameter file for APR1400

    Energy Technology Data Exchange (ETDEWEB)

    Seo, Mi Ro; Kim, Hyeong Taek [KHNP-CRI, Daejeon (Korea, Republic of)

    2015-05-15

    After the Fukushima accident, EPRI has continuously upgraded MAAP5 (Modular Accident Analysis Program version 5), which is expected to overcome the limitations of MAAP4. As a result of those efforts, MAAP5.0.2 (Build 5020000) was released officially in December 2013, and in August 2014 the newest version of MAAP5, MAAP5.0.3 (Build 5030000), was officially released. Parameter file development is essential for severe accident analysis using the MAAP code for a specific plant. In 2014, KHNP developed the first draft version of the MAAP5.0.2 parameter file for the APR1400 type NPP and tested it for some basic severe accident sequences, and has since continuously complemented that draft parameter file for MAAP5.0.2 and 5.0.3. In this study, we analyze the MCCI phenomena using MAAP5.0.3 with the 2nd draft version of the APR1400 parameter file developed by KHNP. The purpose of this study is to compare the major differences between the MAAP5.0.2 and 5.0.3 MCCI models and to verify the appropriateness of the 2nd draft version of the parameter file. The MCCI phenomena have been a controversial issue in severe accident progression, and there have been great efforts to resolve them. As part of these efforts, EPRI published MAAP5.0.3, in which the 'Lower head plenum model' and the 'MCCI model' are known to have been upgraded. KHNP plans to upgrade the old parameter files based on MAAP4 to ones based on MAAP5.0.2 or higher for all domestic nuclear power plants, and has therefore continuously developed the MAAP5.0.2 and 5.0.3 parameter files for the APR1400 type NPP. In this study, we analyzed the MCCI phenomena using MAAP5.0.3 and the 2nd draft version of the parameter file, and we found some insights, as follows: (1) the Melt Eruption Model can greatly affect the MCCI progression only in the case of limestone concrete in a wet cavity

  12. Recent Developments In Fast Neutron Detection And Multiplicity Counting With Verification With Liquid Scintillator

    Energy Technology Data Exchange (ETDEWEB)

    Nakae, L; Chapline, G; Glenn, A; Kerr, P; Kim, K; Ouedraogo, S; Prasad, M; Sheets, S; Snyderman, N; Verbeke, J; Wurtz, R

    2011-09-30

    For many years at LLNL, we have been developing time-correlated neutron detection techniques and algorithms for applications such as Arms Control, Threat Detection and Nuclear Material Assay. Many of our techniques have been developed specifically for the relatively low efficiency (a few percent) attainable by detector systems limited to man-portability. Historically, we used thermal neutron detectors (mainly {sup 3}He), taking advantage of the high thermal neutron interaction cross-sections. More recently, we have been investigating the use of fast neutron detection with liquid scintillators, inorganic crystals, and in the near future, pulse-shape discriminating plastics which respond over 1000 times faster (nanoseconds versus tens of microseconds) than thermal neutron detectors. Fast neutron detection offers considerable advantages, since the inherent nanosecond production time-scales of spontaneous fission and neutron-induced fission are preserved and measured instead of being lost by thermalization required for thermal neutron detectors. We are now applying fast neutron technology to the safeguards regime in the form of fast portable digital electronics as well as faster and less hazardous scintillator formulations. Faster detector response times and sensitivity to neutron momentum show promise for measuring, differentiating, and assaying samples that have modest to very high count rates, as well as mixed fission sources like Cm and Pu. We report on measured results with our existing liquid scintillator array, and progress on the design of a nuclear material assay system that incorporates fast neutron detection, including the surprising result that fast liquid scintillator detectors become competitive and even surpass the precision of {sup 3}He-based counters measuring correlated pairs in modest (kg) samples of plutonium.
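
    A minimal sketch of the time-correlation signature such systems exploit: histogram the time differences between each detection and the detections that follow it within a short window; correlated fission chains appear as an excess over the flat accidental background at short time differences. The data below are synthetic, not measured.

        import numpy as np

        def time_difference_histogram(t, window=1e-6, bins=100):
            """t: detection times in seconds; window: correlation window."""
            t = np.sort(np.asarray(t))
            diffs = []
            for i, ti in enumerate(t):
                j = i + 1
                while j < len(t) and t[j] - ti < window:
                    diffs.append(t[j] - ti)
                    j += 1
            return np.histogram(diffs, bins=bins, range=(0.0, window))

        # Illustrative data: Poisson background plus a few correlated bursts.
        rng = np.random.default_rng(1)
        bg = np.cumsum(rng.exponential(1e-5, 20000))            # ~100 kHz background
        bursts = np.concatenate([s + rng.exponential(5e-8, 4) for s in bg[::500]])
        counts, edges = time_difference_histogram(np.concatenate([bg, bursts]))
        print(counts[:10])   # excess in the earliest bins indicates correlation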

  13. Development and verification of child observation sheet for 5-year-old children.

    Science.gov (United States)

    Fujimoto, Keiko; Nagai, Toshisaburo; Okazaki, Shin; Kawajiri, Mie; Tomiwa, Kiyotaka

    2014-02-01

    The aim of the study was to develop a newly devised child observation sheet (COS-5) as a scoring sheet, based on the Childhood Autism Rating Scale (CARS), for use in the developmental evaluation of 5-year-old children, especially focusing on children with autistic features, and to verify its validity. Seventy-six children were studied. The children were recruited among participants of the Japan Children's Cohort Study, a research program implemented by the Research Institute of Science and Technology for Society (RISTEX) from 2004 to 2009. The developmental evaluation procedure was performed by doctors, clinical psychologists, and public health nurses. The COS-5 was also partly based on the Kyoto Scale of Psychological Development 2001 (Kyoto Scale 2001). Further, the Developmental Disorders Screening Questionnaire for 5-Year-Olds, the PDD-Autism Society Japan Rating Scale (PARS), doctor interview questions and neurological examination for 5-year-old children, and the Draw-a-Man Test (DAM) were used as evaluation scales. Eighteen (25.4%) children were rated as Suspected, including Suspected PDD, Suspected ADHD, and Suspected MR. The COS-5 was found to be valid, with favorable reliability (α=0.89) and correlation with the other evaluation scales. The COS-5 may be useful, with the following advantages: it can be performed within a shorter time frame; it facilitates maintaining observation quality; it facilitates sharing information with other professions; and it reliably identifies the autistic features of 5-year-old children. In order to verify its wider applications, including the screening of infants (18 months to 3 years old) by adjusting the items for younger ages, additional study is needed.
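
    For reference, the reliability statistic quoted above, Cronbach's alpha, can be computed from a subjects-by-items score matrix as in the sketch below (hypothetical data, not the study's):

        import numpy as np

        def cronbach_alpha(scores):
            """scores: 2-D array, rows = subjects, columns = items."""
            scores = np.asarray(scores, dtype=float)
            k = scores.shape[1]
            item_vars = scores.var(axis=0, ddof=1).sum()
            total_var = scores.sum(axis=1).var(ddof=1)
            return (k / (k - 1.0)) * (1.0 - item_vars / total_var)

        rng = np.random.default_rng(2)
        latent = rng.normal(size=(76, 1))                 # 76 subjects, one trait
        items = latent + 0.5 * rng.normal(size=(76, 10))  # 10 correlated items
        print(round(cronbach_alpha(items), 2))            # high alpha: consistent items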

  14. A mathematical model of the nickel converter: Part I. Model development and verification

    Science.gov (United States)

    Kyllo, A. K.; Richards, G. G.

    1991-04-01

    A mathematical model of the nickel converter has been developed. The primary assumption of the model is that the three phases in the converter are in thermal and chemical equilibrium. All matte, slag, and gas in the converter is brought to equilibrium at the end of each of a series of short time steps throughout an entire charge. An empirical model of both the matte and slag is used to characterize the activity coefficients in each phase. Two nickel sulfide species were used to allow for the modeling of sulfur-deficient mattes. A heat balance is carried out over each time step, considering the major heat flows in the converter. The model was validated by a detailed comparison with measured data from six industrial charges. The overall predicted mass balance was shown to be close to that seen in actual practice, and the heat balance gave a good fit of converter temperature up to the last two or three blows of a charge. At this point, reactions in the converter begin to deviate strongly from “equilibrium,” probably due to the converter reactions coming under liquid-phase mass-transfer control. While the equilibrium assumption does work, it is not strictly valid, and the majority of the charge is probably under gas-phase mass-transfer control.

  15. The Development and Verification of a Novel ECMS of Hybrid Electric Bus

    Directory of Open Access Journals (Sweden)

    Jun Wang

    2014-01-01

    Full Text Available This paper presents the system modeling, control strategy design, and hardware-in-the-loop test of a series-parallel hybrid electric bus. First, the powertrain mathematical models and the system architecture are proposed. Then an adaptive ECMS is developed for the real-time control of a hybrid electric bus, and it is investigated and verified in a hardware-in-the-loop simulation system. The ECMS uses driving cycle recognition to update the equivalent charge and discharge coefficients and to extract optimized rules for real-time control. This method not only solves the problem of frequent mode transitions and improves fuel economy, but also simplifies the complexity of control strategy design and provides new design ideas for energy management strategies and gear-shifting rule design. Finally, the simulation results show that the proposed real-time A-ECMS can coordinate the overall hybrid electric powertrain to optimize fuel economy and sustain the battery SOC level.
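
    The core of any ECMS is an instantaneous minimization of fuel use plus battery energy weighted by an equivalence factor s, the quantity the adaptive scheme updates online. A minimal sketch with an invented convex fuel-rate model and illustrative limits (none of the numbers come from the paper):

        import numpy as np

        def fuel_rate(p_engine_kw):
            # Hypothetical convex engine fuel-rate model, g/s.
            return 0.05 * p_engine_kw + 0.002 * p_engine_kw**2 if p_engine_kw > 0 else 0.0

        def ecms_split(p_demand_kw, s=8.0, p_batt_limits=(-40.0, 40.0), lhv=42.5e3):
            """Pick the battery power minimizing equivalent fuel consumption.
            s: equivalence factor (illustrative); lhv: fuel heating value, J/g."""
            best = None
            for p_batt in np.linspace(*p_batt_limits, 161):
                p_eng = p_demand_kw - p_batt
                if p_eng < 0:
                    continue
                # equivalent fuel rate: engine fuel + s-weighted battery power
                cost = fuel_rate(p_eng) + s * (p_batt * 1000.0) / lhv
                if best is None or cost < best[0]:
                    best = (cost, p_batt, p_eng)
            return best

        cost, p_batt, p_eng = ecms_split(60.0)
        print(f"battery {p_batt:.1f} kW, engine {p_eng:.1f} kW")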

  16. Development and verification of fuel burn-up calculation model in a reduced reactor geometry

    Energy Technology Data Exchange (ETDEWEB)

    Sembiring, Tagor Malem [Center for Reactor Technology and Nuclear Safety (PTKRN), National Nuclear Energy Agency (BATAN), Kawasan PUSPIPTEK Gd. No. 80, Serpong, Tangerang 15310 (Indonesia)], E-mail: tagorms@batan.go.id; Liem, Peng Hong [Research Laboratory for Nuclear Reactor (RLNR), Tokyo Institute of Technology (Tokyo Tech), O-okayama, Meguro-ku, Tokyo 152-8550 (Japan)

    2008-02-15

    A fuel burn-up model in a reduced reactor geometry (2-D) is successfully developed and implemented in the Batan in-core fuel management code, Batan-FUEL. Considering the bank mode operation of the control rods, several interpolation functions are investigated which best approximate the 3-D fuel assembly radial power distributions across the core as functions of the insertion depth of the control rods. Concerning the applicability of the interpolation functions, it can be concluded that the optimal coefficients of the interpolation functions are not very sensitive to the core configuration and core or fuel composition in the RSG GAS (MPR-30) reactor. Consequently, once the optimal interpolation function and its coefficients are derived, they can be used for routine 2-D operational in-core fuel management without repeating the expensive 3-D neutron diffusion calculations. At the selected fuel elements (at the H-9 and G-6 core grid positions), the discrepancies in the FECFs (fuel element channel power peaking factors) between the 2-D and 3-D models are within 3.637 x 10{sup -4}, 3.241 x 10{sup -4} and 7.556 x 10{sup -4} for the oxide core, the silicide core with 250 g {sup 235}U/FE, and the silicide core with 300 g {sup 235}U/FE, respectively.
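
    The reuse pattern described above can be illustrated in a few lines: fit an interpolation function to peaking factors computed by 3-D calculations at a few control-rod insertion depths, then evaluate the fit during fast 2-D calculations. The data points below are invented for illustration, not Batan-FUEL output.

        import numpy as np

        depths = np.array([0.0, 0.25, 0.5, 0.75, 1.0])       # fractional rod insertion
        peaking = np.array([1.32, 1.29, 1.24, 1.21, 1.19])   # hypothetical 3-D results

        coeffs = np.polyfit(depths, peaking, deg=2)          # optimal coefficients
        f = np.poly1d(coeffs)
        print(round(float(f(0.6)), 3))   # interpolated peaking factor at 60% insertion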

  17. Systems, methods and apparatus for pattern matching in procedure development and verification

    Science.gov (United States)

    Hinchey, Michael G. (Inventor); Rash, James L. (Inventor); Rouff, Christopher A. (Inventor)

    2011-01-01

    Systems, methods and apparatus are provided through which, in some embodiments, a formal specification is pattern-matched from scenarios, the formal specification is analyzed, and flaws in the formal specification are corrected. The systems, methods and apparatus may include pattern-matching an equivalent formal model from an informal specification. Such a model can be analyzed for contradictions, conflicts, use of resources before the resources are available, competition for resources, and so forth. From such a formal model, an implementation can be automatically generated in a variety of notations. The approach can improve the resulting implementation, which, in some embodiments, is provably equivalent to the procedures described at the outset, which in turn can improve confidence that the system reflects the requirements, and in turn reduces system development time and reduces the amount of testing required of a new system. Moreover, in some embodiments, two or more implementations can be "reversed" to appropriate formal models, the models can be combined, and the resulting combination checked for conflicts. Then, the combined, error-free model can be used to generate a new (single) implementation that combines the functionality of the original separate implementations, and may be more likely to be correct.

  18. Development of the VESUVIUS module. Molten jet breakup modeling and model verification

    Energy Technology Data Exchange (ETDEWEB)

    Vierow, K. [Nuclear Power Engineering Corp., Tokyo (Japan); Nagano, Katsuhiro; Araki, Kazuhiro

    1998-01-01

    With the in-vessel vapor explosion issue ({alpha}-mode failure) now considered to pose an acceptably small risk to the safety of a light water reactor, ex-vessel vapor explosions are being given considerable attention. Attempts are being made to analytically model the breakup of continuous-phase jets; however, uncertainty exists regarding the basic phenomena. In addition, the conditions upon reactor vessel failure, which determine the starting point of the ex-vessel vapor explosion process, are difficult to quantify. Herein, molten jet ejection from the reactor pressure vessel is characterized. Next, the expected mode of jet breakup is determined and the current state of analytical modeling is reviewed. A jet breakup model for ex-vessel scenarios, with the primary breakup mechanism being the Kelvin-Helmholtz instability, is described. The model has been incorporated into the VESUVIUS module, and comparisons of VESUVIUS calculations against FARO L-06 experimental data show differences, particularly in the pressure curve and the amount of jet breakup. The need for additional development to resolve these differences is discussed. (author)

  19. AXAF-I Low Intensity-Low Temperature (LILT) Testing of the Development Verification Test (DVT) Solar Panel

    Science.gov (United States)

    Alexander, Doug; Edge, Ted; Willowby, Doug

    1998-01-01

    The planned orbit of the AXAF-I spacecraft will subject the spacecraft to both short eclipses (less than 30 minutes for solar and less than 2 hours for lunar) and long earth and lunar eclipses with a combined conjunctive duration of up to 3 to 4 hours. Lack of proper Electrical Power System (EPS) conditioning prior to an eclipse may cause loss of mission. To avoid this problem for short eclipses, it is necessary to off-point the solar array prior to or at the beginning of the eclipse to reduce the battery state of charge (SOC); this yields less overcharge during the high charge currents at sun entry. For long lunar eclipses, solar array pointing and load scheduling must be tailored to the profile of the eclipse. The battery SOC, loads, and solar array current-voltage (I-V) characteristics must be known or predictable to maintain the bus voltage within an acceptable range. To address engineering concerns about the electrical performance of the AXAF-I solar array under Low Intensity and Low Temperature (LILT) conditions, Marshall Space Flight Center (MSFC) engineers undertook special testing of the AXAF-I Development Verification Test (DVT) solar panel in September-November 1997. In the test, the DVT panel was installed in a thermal vacuum chamber with a large view window and a mechanical "flapper door". The panel was "flash" tested with a Large Area Pulse Solar Simulator (LAPSS) at various fractional sun intensities and panel (solar cell) temperatures. The testing was unique with regard to the large size of the test article and the type of testing performed. The test setup, results, and lessons learned from the testing will be presented.
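
    Why LILT conditions need dedicated testing can be seen from even a crude single-diode cell model evaluated at fractional sun intensity and reduced temperature, as in the sketch below. All parameter values are illustrative, not AXAF-I panel data, and the saturation current is held fixed although in reality it falls steeply with temperature.

        import numpy as np

        K_B, Q = 1.380649e-23, 1.602176634e-19

        def iv_curve(intensity=1.0, temp_k=300.0, isc_1sun=2.5, i0=1e-9, n=1.3):
            vt = n * K_B * temp_k / Q            # thermal voltage times ideality
            iph = isc_1sun * intensity           # photocurrent scales with intensity
            v = np.linspace(0.0, 0.8, 200)
            i = iph - i0 * (np.exp(v / vt) - 1.0)   # diode equation, Rs neglected
            return v, np.clip(i, 0.0, None)

        for sun, t in [(1.0, 300.0), (0.2, 230.0)]:   # full sun vs. a LILT point
            v, i = iv_curve(sun, t)
            p = v * i
            print(f"{sun:>4} sun, {t:.0f} K: Pmax ~ {p.max():.2f} W")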

  1. Distorted Fingerprint Verification System

    Directory of Open Access Journals (Sweden)

    Divya KARTHIKAESHWARAN

    2011-01-01

    Full Text Available Fingerprint verification is one of the most reliable personal identification methods. Fingerprint matching is affected by the non-linear distortion introduced into the fingerprint impression during the image acquisition process. This non-linear deformation changes both the position and orientation of minutiae. The proposed system operates in three stages: alignment-based fingerprint matching, fuzzy clustering, and a classifier framework. First, an enhanced input fingerprint image is aligned with the template fingerprint image and a matching score is computed. To improve the performance of the system, fuzzy clustering based on distance and density is used to cluster the feature set obtained from the fingerprint matcher. Finally, a classifier framework is developed, and a cost-sensitive classifier is found to produce the best results. The system has been evaluated on a fingerprint database, and the experimental results show that it achieves a verification rate of 96%. This system plays an important role in forensic and civilian applications.

  2. Development and evaluation of antimicrobial activated carbon fiber filters using Sophora flavescens nanoparticles.

    Science.gov (United States)

    Sim, Kyoung Mi; Kim, Kyung Hwan; Hwang, Gi Byoung; Seo, SungChul; Bae, Gwi-Nam; Jung, Jae Hee

    2014-09-15

    Activated carbon fiber (ACF) filters have a wide range of applications, including air purification, dehumidification, and water purification, due to their large specific surface area, high adsorption capacity and rate, and specific surface reactivity. However, when airborne microorganisms such as bacteria and fungi adhere to the carbon substrate, ACF filters can become a source of microbial contamination, and their filter efficacy declines. Antimicrobial treatments are a promising means of preventing ACF bio-contamination. In this study, we demonstrate the use of Sophora flavescens in antimicrobial nanoparticles coated onto ACF filters. The particles were prepared using an aerosol process consisting of nebulization-thermal drying and particle deposition. The extract from S. flavescens is an effective, natural antimicrobial agent that exhibits antibacterial activity against various pathogens. The efficiency of Staphylococcus epidermidis inactivation increased with the concentration of S. flavescens nanoparticles in the ACF filter coating. The gas adsorption efficiency of the coated antimicrobial ACF filters was also evaluated using toluene. The toluene-removal capacity of the ACF filters remained unchanged while the antimicrobial activity was over 90% for some nanoparticle concentrations. Our results provide a scientific basis for controlling both bioaerosol and gaseous pollutants using antimicrobial ACF filters coated with S. flavescens nanoparticles.

  3. Methods of Software Verification

    Directory of Open Access Journals (Sweden)

    R. E. Gurin

    2015-01-01

    Full Text Available This article is devoted to the problem of software (SW) verification. Methods of software verification are designed to check software for compliance with stated requirements such as correctness, system security, adaptability to small changes in the environment, portability, compatibility, etc. These methods differ both in how they operate and in how they achieve their results. The article describes the static and dynamic methods of software verification and pays attention to the method of symbolic execution. In its review of static analysis, the deductive method and model-checking methods are discussed and described, the pros and cons of each method are emphasized, and a classification of testing techniques for each method is considered. The article presents and analyzes the characteristics and mechanisms of static dependence analysis, as well as its variants, which can reduce the number of false positives in situations where the current state of the program combines two or more states obtained either along different execution paths or while working with multiple object values. Dependences connect various types of software objects: single variables, elements of composite variables (structure fields, array elements), the sizes of heap areas, the lengths of strings, and the numbers of initialized array elements in the code being verified by static methods. The article pays attention to the identification of dependencies within the framework of abstract interpretation, and gives an overview and analysis of the inference tools. Methods of dynamic analysis, such as testing, monitoring, and profiling, are presented and analyzed, along with some kinds of tools that can be applied to software when using dynamic analysis methods. The article concludes by describing the most relevant problems of the analysis techniques and methods for their solution.

  4. Development of sealed radioactive sources immobilized in epoxy resin for verification of detectors used in nuclear medicine

    Energy Technology Data Exchange (ETDEWEB)

    Tiezzi, Rodrigo; Rostelato, Maria Elisa C.M.; Nagatomi, Helio R.; Zeituni, Calos A.; Benega, Marcos A.G.; Souza, Daiane B. de; Costa, Osvaldo L. da; Souza, Carla D.; Rodrigues, Bruna T.; Souza, Anderson S. de; Peleias Junior, Fernando S.; Santos, Rafael Melo dos; Melo, Emerson Ronaldo de, E-mail: rktiezzi@gmail.com [Instituto de Pesquisas Energeticas e Nucleares (IPEN/CNEN-SP), Sao Paulo, SP (Brazil); Karan Junior, Dib [Universidade de Sao Paulo (USP), Sao Paulo, SP (Brazil)

    2015-07-01

    Sealed radioactive sources are used in the verification of ionization chamber detectors, which measure the activity of radioisotopes used in several areas, such as nuclear medicine. The measurement of the activity of radioisotopes must be made with accuracy, because the radioisotope is administered to a patient. To ensure the proper functioning of the ionization chamber detectors, standardized tests are set by the International Atomic Energy Agency (IAEA) and the National Nuclear Energy Commission using sealed radioactive sources of Barium-133, Cesium-137 and Cobalt-57. The tests assess the accuracy, precision, reproducibility and linearity of response of the equipment. The focus of this work was the study and development of these standard radioactive sources of Barium-133, Cesium-137 and Cobalt-57, using a polymer, in this case a commercial epoxy resin of diglycidyl ether of bisphenol A (DGEBA) with a curing agent based on modified polyamine diethylenetriamine (DETA), to immobilize the radioactive material. The polymeric matrix has the main function of fixing and immobilizing the radioactive contents, not allowing them to leak beyond the technical limits required by the standards of radiological protection for a sealed source, and it additionally has the ability to retain any gases that may be formed during the manufacturing process and the useful life of this artifact. The manufacturing process of a standard sealed source consists of potting, into a bottle of standardized geometry, a fixed volume of the polymeric matrix, to which the exact required activity of the standard radioactive materials is added and dispersed homogeneously. Accordingly, a study was conducted for the choice of the epoxy resin, analyzing its characteristics and properties. Studies and tests were performed, examining the maximum solubility of the resin in water (an acidic solution, simulating the conditions of the radioactive solution), loss of mechanical

  5. Notch filter

    Science.gov (United States)

    Shelton, G. B. (Inventor)

    1977-01-01

    A notch filter for the selective attenuation of a narrow band of frequencies out of a larger band was developed. A helical resonator is connected to an input circuit and an output circuit through discrete and equal capacitors, and a resistor is connected between the input and the output circuits.
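
    The record above describes an analog helical-resonator circuit; as a loose digital analogue of the same idea (attenuating a narrow band of frequencies out of a larger band), the sketch below uses SciPy's standard IIR notch design. The sampling rate, notch frequency, and quality factor are hypothetical, not values from the patent.

```python
import numpy as np
from scipy import signal

# Hypothetical parameters: remove a narrow band around 60 Hz from a
# signal sampled at 1 kHz. A higher Q gives a narrower notch.
fs = 1000.0        # sampling rate, Hz
f_notch = 60.0     # center of the band to attenuate, Hz
Q = 30.0           # quality factor

b, a = signal.iirnotch(f_notch, Q, fs=fs)

# Inspect the attenuation at the notch frequency.
freqs, h = signal.freqz(b, a, fs=fs)
idx = np.argmin(np.abs(freqs - f_notch))
print(f"gain at {f_notch} Hz: {20 * np.log10(abs(h[idx])):.1f} dB")
```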

  6. Enhanced Bank of Kalman Filters Developed and Demonstrated for In-Flight Aircraft Engine Sensor Fault Diagnostics

    Science.gov (United States)

    Kobayashi, Takahisa; Simon, Donald L.

    2005-01-01

    In-flight sensor fault detection and isolation (FDI) is critical to maintaining reliable engine operation during flight. The aircraft engine control system, which computes control commands on the basis of sensor measurements, operates the propulsion systems at the demanded conditions. Any undetected sensor faults, therefore, may cause the control system to drive the engine into an undesirable operating condition. It is critical to detect and isolate failed sensors as soon as possible so that such scenarios can be avoided. A challenging issue in developing reliable sensor FDI systems is to make them robust to changes in engine operating characteristics due to degradation with usage and other faults that can occur during flight. A sensor FDI system that cannot appropriately account for such scenarios may result in false alarms, missed detections, or misclassifications when such faults do occur. To address this issue, an enhanced bank of Kalman filters was developed, and its performance and robustness were demonstrated in a simulation environment. The bank of filters is composed of m + 1 Kalman filters, where m is the number of sensors being used by the control system and, thus, in need of monitoring. Each Kalman filter is designed on the basis of a unique fault hypothesis so that it will be able to maintain its performance if a particular fault scenario, hypothesized by that particular filter, takes place.
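
    The m + 1 hypothesis-filter architecture lends itself to a compact illustration. The sketch below is not the NASA implementation: it is a minimal toy with a scalar random-constant state observed by m sensors, in which filter 0 hypothesizes no fault, filter i (i >= 1) hypothesizes that sensor i-1 is faulty and simply excludes it, and the hypothesis whose filter accumulates the smallest normalized innovation energy is selected. All dimensions and noise levels are invented.

```python
import numpy as np

rng = np.random.default_rng(0)
m, T, x_true = 3, 200, 5.0            # sensors, time steps, true state
R = 0.1 * np.eye(m)                   # measurement noise covariance
z = x_true + rng.normal(0.0, np.sqrt(0.1), size=(T, m))
z[50:, 1] += 2.0                      # inject a bias fault on sensor 1

def run_filter(use):                  # 'use' is a boolean mask of sensors
    x, P, score = 0.0, 1e3, 0.0       # scalar state estimate and variance
    H = np.ones(np.count_nonzero(use))
    Ru = R[np.ix_(use, use)]
    for k in range(T):
        y = z[k, use] - H * x                  # innovation
        S = np.outer(H, H) * P + Ru            # innovation covariance
        score += y @ np.linalg.solve(S, y)     # normalized innovation energy
        K = P * np.linalg.solve(S, H)          # Kalman gain (scalar state)
        x += K @ y
        P *= 1.0 - K @ H
    return score

masks = [np.ones(m, bool)] + [np.arange(m) != i for i in range(m)]
scores = [run_filter(u) for u in masks]
best = int(np.argmin(scores))
print("selected:", "no fault" if best == 0 else f"sensor {best - 1} faulty")
```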

  7. Development of a pilot-scale kinetic extruder feeder system and test program. Phase II. Verification testing. Final report

    Energy Technology Data Exchange (ETDEWEB)

    1984-01-12

    This report describes the work done under Phase II, the verification testing of the Kinetic Extruder. The main objective of the test program was to determine failure modes and wear rates. Only minor auxiliary equipment malfunctions were encountered. Wear rates indicate useful life expectancy of from 1 to 5 years for wear-exposed components. Recommendations are made for adapting the equipment for pilot plant and commercial applications. 3 references, 20 figures, 12 tables.

  8. Development of a digital method for neutron/gamma-ray discrimination based on matched filtering

    Science.gov (United States)

    Korolczuk, S.; Linczuk, M.; Romaniuk, R.; Zychor, I.

    2016-09-01

    Neutron/gamma-ray discrimination is crucial for measurements with detectors sensitive to both neutron and gamma-ray radiation. Different techniques to discriminate between neutrons and gamma-rays based on pulse shape analysis are widely used in many applications, e.g., homeland security, radiation dosimetry, environmental monitoring, fusion experiments, nuclear spectroscopy. A common requirement is to improve a radiation detection level with a high detection reliability. Modern electronic components, such as high speed analog to digital converters and powerful programmable digital circuits for signal processing, allow us to develop a fully digital measurement system. With this solution it is possible to optimize digital signal processing algorithms without changing any electronic components in an acquisition signal path. We report on results obtained with a digital acquisition system DNG@NCBJ designed at the National Centre for Nuclear Research. A 2'' × 2'' EJ309 liquid scintillator was used to register mixed neutron and gamma-ray radiation from PuBe sources. A dedicated algorithm for pulse shape discrimination, based on real-time filtering, was developed and implemented in hardware.
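
    The matched-filtering discrimination step can be sketched offline (the paper's version runs in real time in FPGA hardware on digitized EJ309 pulses). In the toy below, the pulse shapes are hypothetical double exponentials in which neutrons carry a larger slow component; a pulse is classified by which normalized template responds more strongly.

```python
import numpy as np

t = np.arange(200) * 1e-9                      # 200 samples at 1 ns

def pulse(slow_frac, rng=None):
    """Hypothetical scintillator pulse: fast + slow decay components."""
    fast, slow = np.exp(-t / 5e-9), np.exp(-t / 50e-9)
    p = (1.0 - slow_frac) * fast + slow_frac * slow
    if rng is not None:
        p = p + rng.normal(0.0, 0.02, t.size)  # add acquisition noise
    return p

gamma_tpl, neutron_tpl = pulse(0.05), pulse(0.30)   # mean templates
unknown = pulse(0.30, np.random.default_rng(1))     # a noisy "neutron"

def mf_score(x, tpl):
    tpl = tpl / np.linalg.norm(tpl)            # normalized matched filter
    return np.max(np.correlate(x, tpl, mode="full"))

scores = {"gamma": mf_score(unknown, gamma_tpl),
          "neutron": mf_score(unknown, neutron_tpl)}
print("classified as:", max(scores, key=scores.get))
```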

  9. Development of simple band-spectral pyranometer and quantum meter using photovoltaic cells and bandpass filters

    Energy Technology Data Exchange (ETDEWEB)

    Bilguun, Amarsaikhan, E-mail: bilguun@pes.ee.tut.ac.jp; Nakaso, Tetsushi; Harigai, Toru; Suda, Yoshiyuki; Takikawa, Hirofumi, E-mail: takikawa@ee.tut.ac.jp [Toyohashi University of Technology, 1-1 Habarigaoka, Tempaku, Toyohashi 441-8580 (Japan); Tanoue, Hideto [Kitakyushu National College of Technology, 5-20-1, Kokuraminami, Kitakyushu, Fukuoka 802-0985 (Japan)

    2016-02-01

    In recent years, automatic greenhouse control based on the measurement of solar irradiance has been attracting attention. This control is an effective method for improving crop production. In the agricultural field, it is necessary to measure the Photon Flux Density (PFD), an important parameter in the promotion of plant growth. In particular, the PFD of Photosynthetically Active Radiation (PAR, 400-700 nm) and of Plant Biologically Active Radiation (PBAR, 300-800 nm) have been discussed in agricultural plant science. A commercial quantum meter (QM, PAR meter) can only measure the Photosynthetic Photon Flux Density (PPFD), which is the PFD integrated over the PAR wavelengths. In this research, a band-spectral pyranometer, or quantum meter, using PVs with optical bandpass filters to divide the PBAR wavelength range into 100 nm bands (five independent channels) was developed. Before field testing, calibration of the instruments was carried out using a solar simulator. Next, a field test was conducted in three differing weather conditions: clear, partly cloudy and cloudy skies. As a result, it was found that the response of the developed pyranometer was faster by four seconds compared with that of the commercial pyranometer. Moreover, the outputs of each channel in the developed pyranometer were very similar to the integrated outputs of the commercial spectroradiometer. It was confirmed that the solar irradiance could be measured in each band separately using the developed band-spectral pyranometer. It was indicated that the developed band-spectral pyranometer could also be used as a PV band-spectral quantum meter, obtained by converting the band irradiance into band PFD.
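
    The band-irradiance-to-PFD conversion mentioned at the end of the record can be written down directly: each W m^-2 in a band corresponds to E / (hc/λ) photons per second, which the sketch below evaluates at the mid-wavelength of each 100 nm band. The band irradiance readings are hypothetical.

```python
# Convert band irradiance (W m^-2) to photon flux density
# (umol m^-2 s^-1) using the mid-band wavelength as an approximation.
h = 6.626e-34    # Planck constant, J s
c = 2.998e8      # speed of light, m s^-1
N_A = 6.022e23   # Avogadro constant, mol^-1

bands_nm = [(300, 400), (400, 500), (500, 600), (600, 700), (700, 800)]
irradiance_W_m2 = [30.0, 90.0, 100.0, 95.0, 80.0]   # hypothetical readings

for (lo, hi), E in zip(bands_nm, irradiance_W_m2):
    lam = 0.5 * (lo + hi) * 1e-9          # mid-band wavelength, m
    photons = E / (h * c / lam)           # photons m^-2 s^-1
    pfd = photons / N_A * 1e6             # umol m^-2 s^-1
    print(f"{lo}-{hi} nm: {pfd:7.1f} umol m^-2 s^-1")
```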

  10. Exploring Middle School Students' Representational Competence in Science: Development and Verification of a Framework for Learning with Visual Representations

    Science.gov (United States)

    Tippett, Christine Diane

    Scientific knowledge is constructed and communicated through a range of forms in addition to verbal language. Maps, graphs, charts, diagrams, formulae, models, and drawings are just some of the ways in which science concepts can be represented. Representational competence---an aspect of visual literacy that focuses on the ability to interpret, transform, and produce visual representations---is a key component of science literacy and an essential part of science reading and writing. To date, however, most research has examined learning from representations rather than learning with representations. This dissertation consisted of three distinct projects that were related by a common focus on learning from visual representations as an important aspect of scientific literacy. The first project was the development of an exploratory framework that is proposed for use in investigations of students constructing and interpreting multimedia texts. The exploratory framework, which integrates cognition, metacognition, semiotics, and systemic functional linguistics, could eventually result in a model that might be used to guide classroom practice, leading to improved visual literacy, better comprehension of science concepts, and enhanced science literacy because it emphasizes distinct aspects of learning with representations that can be addressed though explicit instruction. The second project was a metasynthesis of the research that was previously conducted as part of the Explicit Literacy Instruction Embedded in Middle School Science project (Pacific CRYSTAL, http://www.educ.uvic.ca/pacificcrystal). Five overarching themes emerged from this case-to-case synthesis: the engaging and effective nature of multimedia genres, opportunities for differentiated instruction using multimodal strategies, opportunities for assessment, an emphasis on visual representations, and the robustness of some multimodal literacy strategies across content areas. The third project was a mixed

  11. Secure Position Verification for Wireless Sensor Networks in Noisy Channels

    CERN Document Server

    Mandal, Partha Sarathi

    2011-01-01

    Position verification in wireless sensor networks (WSNs) is quite tricky in presence of attackers (malicious sensor nodes), who try to break the verification protocol by reporting their incorrect positions (locations) during the verification stage. In the literature of WSNs, most of the existing methods of position verification have used trusted verifiers, which are in fact vulnerable to attacks by malicious nodes. They also depend on some distance estimation techniques, which are not accurate in noisy channels (mediums). In this article, we propose a secure position verification scheme for WSNs in noisy channels without relying on any trusted entities. Our verification scheme detects and filters out all malicious nodes from the network with very high probability.

  12. Genomic analyses with biofilter 2.0: knowledge driven filtering, annotation, and model development

    National Research Council Canada - National Science Library

    Pendergrass, Sarah A; Frase, Alex; Wallace, John; Wolfe, Daniel; Katiyar, Neerja; Moore, Carrie; Ritchie, Marylyn D

    2013-01-01

    .... We have now extensively revised and updated the multi-purpose software tool Biofilter that allows researchers to annotate and/or filter data as well as generate gene-gene interaction models based...

  13. Developments in non-linear Kalman Ensemble and Particle Filtering techniques for hydrological data assimilation

    Science.gov (United States)

    Khaki, Mehdi; Forootan, Ehsan; Kuhn, Michael; Awange, Joseph; Pattiaratchi, Charitha

    2016-04-01

    Quantifying large-scale (basin/global) water storage changes is essential to understand the Earth's hydrological water cycle. Hydrological models have usually been used to simulate variations in storage compartments resulting from changes in water fluxes (i.e., precipitation, evapotranspiration and runoff) considering physical or conceptual frameworks. Models, however, have limited skill in accurately simulating the storage compartments, which can be the result of, e.g., the uncertainty of forcing parameters, model structure, etc. In this regard, data assimilation provides a great opportunity to combine observational data with a prior forecast state, improving both the accuracy of model parameters and the estimation of model states at the same time. Various methods exist that can be used to perform data assimilation into hydrological models. Among the more frequently used particle/ensemble-based algorithms suitable for non-linear, high-dimensional systems is Ensemble Kalman Filtering (EnKF). Despite their efficiency and simplicity (especially of EnKF), these methods have some drawbacks. To implement EnKF, one should use the sample covariance of observations and model state variables to update a priori estimates of the state variables. The sample covariance can be suboptimal as a result of small ensemble size, model errors, model nonlinearity, and other factors. A small ensemble can also lead to the development of spurious correlations between state components that are at a significant distance from one another and have no physical relation. To address the under-sampling issue raised by EnKF, a covariance inflation technique in conjunction with localization was implemented. In this study, a comparison between the latest methods used in the data assimilation framework to overcome the mentioned problem is performed. For this, in addition to implementing EnKF, we introduce and apply the Local Ensemble Kalman Filter (LEnKF) utilizing covariance localization to remove
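
    A minimal stochastic-EnKF analysis step with multiplicative covariance inflation, the combination discussed above, is sketched below. The state dimension, the linear observation operator, and all numbers are hypothetical; real hydrological applications add localization and nonlinear observation operators on top of this.

```python
import numpy as np

rng = np.random.default_rng(42)
n, m, N = 10, 4, 20            # state dim, obs dim, ensemble size
infl = 1.05                    # multiplicative inflation factor (> 1)

X = rng.normal(size=(n, N))    # forecast ensemble (columns = members)
H = np.eye(m, n)               # observe the first m state components
R = 0.5 * np.eye(m)            # observation error covariance
y = rng.normal(size=m)         # observation vector

# Inflate anomalies about the ensemble mean to counter under-sampling.
xm = X.mean(axis=1, keepdims=True)
A = infl * (X - xm)
X = xm + A

# Sample covariances built from the (inflated) anomalies.
HA = H @ A
Pyy = HA @ HA.T / (N - 1) + R
Pxy = A @ HA.T / (N - 1)
K = np.linalg.solve(Pyy, Pxy.T).T          # Kalman gain Pxy Pyy^-1

# Perturbed observations make this the stochastic EnKF variant.
Y = y[:, None] + rng.multivariate_normal(np.zeros(m), R, size=N).T
Xa = X + K @ (Y - H @ X)                   # analysis ensemble
print("analysis ensemble spread:", Xa.std(axis=1).mean())
```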

  14. Tunable transportable spectroradiometer based on an acousto-optical tunable filter: Development and optical performance

    Science.gov (United States)

    Kozlova, O.; Sadouni, A.; Truong, D.; Briaudeau, S.; Himbert, M.

    2016-12-01

    We describe a high-performance, transportable, versatile spectroradiometer based on an acousto-optical tunable filter (AOTF). The instrument was developed for temperature metrology, namely, to determine the thermodynamic temperature of black bodies above the Ag freezing point (961.78 °C). Its main design feature is the attenuation of the diffraction side lobes (and, thus, out-of-band stray light) thanks to the use of a double-pass configuration. The radiofrequency tuning of the AOTF allows continuous, fine, and rapid wavelength control over a wide spectral range (650 nm-1000 nm). The instrument tunability can be easily calibrated with an Ar spectral lamp with reproducibility within 10 pm over one week. The instrument was characterised in terms of relative signal stability (a few parts in 10^4) and wavelength stability (1 pm) over several hours. The spectral responsivity of the instrument was calibrated with two complementary methods: tuning of the wavelength of the optical source or tuning the radiofrequency of the AOTF. Besides the application for thermodynamic temperature determination at the lowest uncertainty level, this instrument can also be used for multispectral non-contact thermometry of processed materials of non-grey and non-unitary emissivity (in the glass or metallurgical industries).

  15. Critical parameters in the production of ceramic pot filters for household water treatment in developing countries.

    Science.gov (United States)

    Soppe, A I A; Heijman, S G J; Gensburger, I; Shantz, A; van Halem, D; Kroesbergen, J; Wubbels, G H; Smeets, P W M H

    2015-06-01

    The need to improve the access to safe water is generally recognized for the benefit of public health in developing countries. This study's objective was to identify critical parameters which are essential for improving the performance of ceramic pot filters (CPFs) as a point-of-use water treatment system. Defining critical production parameters was also relevant to confirm that CPFs with high-flow rates may have the same disinfection capacity as pots with normal flow rates. A pilot unit was built in Cambodia to produce CPFs under controlled and constant conditions. Pots were manufactured from a mixture of clay, laterite and rice husk in a small-scale, gas-fired, temperature-controlled kiln and tested for flow rate, removal efficiency of bacteria and material strength. Flow rate can be increased by increasing pore sizes and by increasing porosity. Pore sizes were increased by using larger rice husk particles and porosity was increased with larger proportions of rice husk in the clay mixture. The main conclusions: larger pore size decreases the removal efficiency of bacteria; higher porosity does not affect the removal efficiency of bacteria, but does influence the strength of pots; flow rates of CPFs can be raised to 10-20 L/hour without a significant decrease in bacterial removal efficiency.

  16. Developing a software for tracking the memory states of the machines in the LHCb Filter Farm

    CERN Document Server

    Jain, Harshit

    2017-01-01

    The LHCb Event Filter Farm consists of more than 1500 server nodes with a total of roughly 65 TB of operating memory. The memory is crucial for the success of the LHCb experiment, since the proton-proton collisions are temporarily stored on these memory modules. Unfortunately, the aging nodes of the server farm occasionally suffer losses of their memory modules, and the lower the available memory, the lower the performance we can get out of the farm. Relying on users or administrators to pay attention to this matter is inefficient; an automated approach is needed. The aim of this project was to develop software to monitor a set of test machines. The software stores the data of the memory sticks in advance in a database, which is used for future reference. It then checks the memory sticks at a future time instant to find any failures. In the case of any such losses, the software looks up the database to find out which memory sticks have been lost and displays all information about those sticks in a log fi...
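
    The baseline-then-compare idea described above can be sketched in a few lines (a hypothetical illustration, not the actual LHCb tool; schema and values are invented): record each node's memory modules in a database, then flag any module missing from a later scan.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE baseline
                (node TEXT, slot TEXT, serial TEXT, size_gb INTEGER,
                 PRIMARY KEY (node, slot))""")

# Hypothetical baseline scan of one farm node.
baseline = [("hlt01", "DIMM_A1", "SN123", 16),
            ("hlt01", "DIMM_A2", "SN124", 16)]
conn.executemany("INSERT INTO baseline VALUES (?, ?, ?, ?)", baseline)

# A later scan of the same node no longer sees DIMM_A2.
current = {("hlt01", "DIMM_A1")}
for node, slot, serial, size in conn.execute("SELECT * FROM baseline"):
    if (node, slot) not in current:
        print(f"LOST: {node} {slot} (serial {serial}, {size} GB)")
conn.close()
```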

  17. Verification Support for Object Database Design

    NARCIS (Netherlands)

    Spelt, David

    1999-01-01

    In this thesis we have developed a verification theory and a tool for the automated analysis of assertions about object-oriented database schemas. The approach is inspired by the work of [SS89] in which a theorem prover is used to automate the verification of invariants for transactions on a relatio

  18. DATA VERIFICATION IN ISSUE SUPERVISING SYSTEMS

    Directory of Open Access Journals (Sweden)

    R. S. Katerinenko

    2013-01-01

    Full Text Available The paper proposes a method of data verification in issue tracking systems by means of production rules. This model makes it possible to declaratively formulate the conditions that the stored information should comply with and to apply reasoning procedures. Practical application of the proposed verification system in a real software development project is described.
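
    In the production-rule style the paper describes, each rule is a declarative condition applied to every record; a toy version (rule contents and issue fields invented) might look like this:

```python
# Each rule is a predicate over an issue record plus a violation message.
rules = [
    (lambda i: i["status"] != "closed" or i.get("resolution"),
     "closed issues must carry a resolution"),
    (lambda i: i["priority"] in {"low", "medium", "high"},
     "priority must be low/medium/high"),
]

issues = [
    {"id": 1, "status": "closed", "priority": "high"},   # no resolution
    {"id": 2, "status": "open", "priority": "urgent"},   # bad priority
]

for issue in issues:
    for check, message in rules:
        if not check(issue):
            print(f"issue {issue['id']}: {message}")
```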

  19. SU-E-T-254: Development of a HDR-BT QA Tool for Verification of Source Position with Oncentra Applicator Modeling

    Energy Technology Data Exchange (ETDEWEB)

    Kumazaki, Y; Miyaura, K; Hirai, R; Miyazawa, K; Makino, S; Tamaki, T; Shikama, N; Kato, S [Saitama Medical University International Medical Center, Hidaka, Saitama (Japan)

    2015-06-15

    Purpose: To develop a High Dose Rate Brachytherapy (HDR-BT) quality assurance (QA) tool for verification of source position with Oncentra applicator modeling, and to report the results of radiation source positions with this tool. Methods: We developed a HDR-BT QA phantom and automated analysis software for verification of source position with Oncentra applicator modeling for the Fletcher applicator used in the MicroSelectron HDR system. This tool is intended for end-to-end tests that mimic the clinical 3D image-guided brachytherapy (3D-IGBT) workflow. The phantom is a 30x30x3 cm cuboid phantom with radiopaque markers, which are inserted into the phantom to evaluate applicator tips and reference source positions; positions are laterally shifted 10 mm from the applicator axis. The markers are lead-based and scatter radiation to expose the films. Gafchromic RTQA2 films are placed on the applicators. The phantom includes spaces to embed the applicators. The source position is determined as the distance between the exposed source position and center position of two pairs of the first radiopaque markers. We generated a 3D-IGBT plan with applicator modeling. The first source position was 6 mm from the applicator tips, and the second source position was 10 mm from the first source position. Results: All source positions were consistent with the exposed positions within 1 mm for all Fletcher applicators using in-house software. Moreover, the distance between source positions was in good agreement with the reference distance. Applicator offset, determined as the distance from the applicator tips at the first source position in the treatment planning system, was accurate. Conclusion: Source position accuracy of applicator modeling used in 3D-IGBT was acceptable. This phantom and software will be useful as a HDR-BT QA tool for verification of source position with Oncentra applicator modeling.

  20. Study on physico - chemical properties of Korean anthracite for utilization development - application to filtering materials for waste water treatment

    Energy Technology Data Exchange (ETDEWEB)

    Park, Hong Soo; Lee, Jae Ho; Park, Suk Whan [Korea Institute of Geology Mining and Materials, Taejon (Korea, Republic of)

    1996-12-01

    This research was initiated for the development of filtering materials that can be used at waste water treatment sites. The Jangseong coal selected as filtering material has a low Hardgrove Grindability Index (HGI: 38.38) and was crushed to two granule sizes: 1-2 mm (effective size: 0.77 mm, uniformity coefficient: 1.70) and 2-4 mm (2.04 mm, 1.37). First, an application test was performed to determine whether the 2-4 mm sample could be used as a water filtering material in place of silica sand in a Sandflo filter. The results were unsuitable in terms of treatment efficiency and micron-sized granules, but this could be solved by controlling the granule size and washing the coal. For the feasibility study, a small-scale filtration tester was built at the waste water treatment plant of the Lotte-chilsung beverage Co. to use the precipitated water from the purifying system during the filtration tests. Filtration rate, waste water temperature, Electric Conductivity (EC), pH, turbidity, Dissolved Oxygen (DO), Chemical Oxygen Demand (COD), Biochemical Oxygen Demand (BOD), Nitrate Nitrogen (NO{sub 3}-N), organophosphorus and trace element contents (Zn, Al, Fe, Mg, K) of the supplied and filtered water were measured to determine the filtration capacity of the coal. The results indicated decreases in turbidity (1-2 mm: 15.08 %, 2-4 mm: 11.58 %) and COD (1-2 mm: 5.76 %, 2-4 mm: 5.49 %) and increases in DO (1-2 mm: 11.25 %, 2-4 mm: 10 %). Trace element removal from the filtered waste water was about 30 % for Fe and 5 % for K. (author). 32 refs., tabs., figs.

  1. Development of particle filters for ships; Udvikling af partikelfiltre til skibe

    Energy Technology Data Exchange (ETDEWEB)

    Jakobsen, O.; Norre Holm, J.; Koecks, M. [Teknologisk Institut, Aarhus (Denmark)

    2013-04-01

    The project has resulted in a solution with a well-functioning maritime particle filter which reduces the particle emission significantly. The visible smoke from the vessel's funnel, which is typically seen while manoeuvring in the harbour, is also reduced to a minimum. The system is constructed in such a way that the exhaust gases can be bypassed around the filter unit, to ensure the engine's operation in case of filter clogging. The system has been provided with safety functions to prevent an excessive exhaust gas back-pressure, and remote controlled exhaust valves are fitted. Some of the challenges in the project were the engine manufacturer's requirement of keeping a low turbocharger back-pressure, the space constraints aboard the test vessel, and the achievement of temperatures sufficient for regeneration of the particle filter. To meet the requirement of low exhaust gas back-pressure, the filter housing was designed with space for twice as many monoliths as originally planned. In the funnel casing the original installations were removed to make space for the filter housing, and the system was extended with electrically controlled exhaust valves to ease the daily operation for the crew. The regeneration issue was solved by mounting electric, automatically controlled heating elements in the filter housing and by an ash exhaust system. Regeneration is carried out by the crew when the vessel lies in harbour in the evening after the last tour of the day. Before mounting the particle filter, measurements were carried out aboard, showing particle emissions with an expectedly high NO{sub x} level of 8.33 g/kW, whereas the other emissions were lower than initially expected. HC and CO in particular were very low, and the particle mass (PM) also had a relatively low value of 0.22 g/kWh. After commissioning the particle filter, a significant reduction of 93% in the particle number (N) was observed. A reduction in N was

  3. Results of the verification of the NIR MOS EMIR

    Science.gov (United States)

    Garzón, F.; Castro-Rodríguez, N.; Insausti, M.; López-Martín, L.; Hammersley, Peter; Barreto, M.; Fernández, P.; Joven, E.; López, P.; Mato, A.; Moreno, H.; Núñez, M.; Patrón, J.; Rasilla, J. L.; Redondo, P.; Rosich, J.; Pascual, S.; Grange, R.

    2014-07-01

    EMIR is one of the first common user instruments for the GTC, the 10 meter telescope operating at the Roque de los Muchachos Observatory (La Palma, Canary Islands, Spain). EMIR is being built by a Consortium of Spanish and French institutes led by the Instituto de Astrofísica de Canarias (IAC). EMIR is primarily designed to be operated as a MOS in the K band, but offers a wide range of observing modes, including imaging and spectroscopy, both long slit and multiobject, in the wavelength range 0.9 to 2.5 μm. This contribution reports on the results achieved so far during the verification phase at the IAC prior to the instrument's shipment to the GTC for commissioning, which is due by mid 2015. After a long period of design and fabrication, EMIR finally entered its integration phase by mid 2013. Soon after this, the verification phase at the IAC was initiated, aimed at configuring and tuning the EMIR functions, mostly the instrument control system, which includes a sophisticated online data reduction pipeline, and at demonstrating the fulfillment of the top level requirements. We have designed an ambitious verification plan structured around the three kinds of detectors at hand: the MUX and the engineering- and scientific-grade arrays. The EMIR subsystems are being integrated as they are needed for the purposes of the verification plan. In the first stage, using the MUX, the full optical system was mounted, but with a single dispersive element out of the three which form the EMIR suite, together with the two large wheels carrying the filters and the pseudo-grisms, plus the detector translation unit holding the MUX. This stage was mainly devoted to learning about the capabilities of the instrument, defining different settings for its basic operation modes and testing the accuracy, repeatability and reliability of the mechanisms. In the second stage, using the engineering Hawaii2 FPA, the full set of pseudo-grisms and band filters are mounted, which means that the instrument is fully assembled

  4. 24 CFR 5.512 - Verification of eligible immigration status.

    Science.gov (United States)

    2010-04-01

    ... immigration status. 5.512 Section 5.512 Housing and Urban Development Office of the Secretary, Department of... Noncitizens § 5.512 Verification of eligible immigration status. (a) General. Except as described in paragraph...) Primary verification—(1) Automated verification system. Primary verification of the immigration status...

  5. Development of gel-filter method for high enrichment of low-molecular weight proteins from serum.

    Directory of Open Access Journals (Sweden)

    Lingsheng Chen

    Full Text Available The human serum proteome has been extensively screened for biomarkers. However, the large dynamic range of protein concentrations in serum and the presence of highly abundant, large molecular weight proteins make the identification and detection of changes in the amount of low-molecular weight proteins (LMW, molecular weight ≤ 30 kDa) difficult. Here, we developed a gel-filter method including four layers of tricine SDS-PAGE-based gels of different concentrations to block high-molecular weight proteins and enrich LMW proteins. By utilizing this method, we identified 1,576 proteins (n = 2) from 10 μL serum. Among them, 559 (n = 2) proteins belonged to LMW proteins. Furthermore, this gel-filter method could identify 67.4% and 39.8% more LMW proteins than the representative methods of glycine SDS-PAGE and optimized-DS, respectively. By utilizing a SILAC-AQUA approach with labeled recombinant protein as internal standard, the recovery rate for GST spiked in serum during treatment with the gel-filter, optimized-DS, and ProteoMiner was 33.1 ± 0.01%, 18.7 ± 0.01% and 9.6 ± 0.03%, respectively. These results demonstrate that the gel-filter method offers a rapid, highly reproducible and efficient approach for screening biomarkers from serum through proteomic analyses.

  6. Optical secure image verification system based on ghost imaging

    Science.gov (United States)

    Wu, Jingjing; Haobogedewude, Buyinggaridi; Liu, Zhengjun; Liu, Shutian

    2017-09-01

    Ghost imaging can perform Fourier-space filtering by tailoring the configuration. We propose a novel optical secure image verification system based on this theory with the help of phase matched filtering. In the verification process, the system key and the ID card, which contain the information of the correct image and the information to be verified, are put in the reference and the test paths, respectively. We demonstrate that the ghost imaging configuration can perform an incoherent correlation between the system key and the ID card. A correct verification manifests itself as a correlation peak in the ghost image. The primary image and the image to be verified are encrypted and encoded into pure phase masks beforehand for security. Multi-image secure verification can also be implemented in the proposed system.
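
    A toy computational ghost-imaging correlation conveys the verification idea (this simplification ignores the optical phase masks and the physical reference/test paths of the actual system; all patterns and sizes are invented): the reconstruction correlates strongly with the stored key only for a genuine pattern.

```python
import numpy as np

rng = np.random.default_rng(3)
npix, nmeas = 16 * 16, 4000
key = rng.integers(0, 2, npix).astype(float)     # stored "correct image"
genuine = key.copy()                             # matching ID pattern
forged = rng.integers(0, 2, npix).astype(float)  # non-matching pattern

speckle = rng.normal(size=(nmeas, npix))         # random reference patterns

for name, obj in [("genuine", genuine), ("forged", forged)]:
    b = speckle @ obj                            # bucket-detector signals
    g = (speckle * (b - b.mean())[:, None]).mean(axis=0)  # ghost image
    corr = np.corrcoef(g, key)[0, 1]
    print(f"{name}: correlation with key = {corr:.2f}")
```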

  7. Developments of DPF systems with mesh laminated structures. Performances of DPF systems which consist of the metal-mesh laminated filter combustion with the alumina-fiber mesh, and the combustion device of trapped diesel particles; Mesh taso kozo no DPF no kaihatsu. Kinzokusen to arumina sen`i mesh ni yoru fukugo filter to filter heiyo heater ni yoru DPF no seino

    Energy Technology Data Exchange (ETDEWEB)

    Kojima, T.; Tange, A.; Matsuda, K. [NHK Spring Co. Ltd., Yokohama (Japan)

    1997-10-01

    For the purpose of continuous operation without any maintenance, new DPF (diesel particulate filter) systems, laminated alternately with metal-wire mesh and alumina-fiber mesh, are under development. Complete combustion of the trapped diesel particulates can be achieved by a pair of resistance heating devices inserted into the filter. 5 refs., 7 figs., 3 tabs.

  8. Hierarchical Representation Learning for Kinship Verification.

    Science.gov (United States)

    Kohli, Naman; Vatsa, Mayank; Singh, Richa; Noore, Afzel; Majumdar, Angshul

    2017-01-01

    Kinship verification has a number of applications such as organizing large collections of images and recognizing resemblances among humans. In this paper, first, a human study is conducted to understand the capabilities of human mind and to identify the discriminatory areas of a face that facilitate kinship-cues. The visual stimuli presented to the participants determine their ability to recognize kin relationship using the whole face as well as specific facial regions. The effect of participant gender and age and kin-relation pair of the stimulus is analyzed using quantitative measures such as accuracy, discriminability index d' , and perceptual information entropy. Utilizing the information obtained from the human study, a hierarchical kinship verification via representation learning (KVRL) framework is utilized to learn the representation of different face regions in an unsupervised manner. We propose a novel approach for feature representation termed as filtered contractive deep belief networks (fcDBN). The proposed feature representation encodes relational information present in images using filters and contractive regularization penalty. A compact representation of facial images of kin is extracted as an output from the learned model and a multi-layer neural network is utilized to verify the kin accurately. A new WVU kinship database is created, which consists of multiple images per subject to facilitate kinship verification. The results show that the proposed deep learning framework (KVRL-fcDBN) yields the state-of-the-art kinship verification accuracy on the WVU kinship database and on four existing benchmark data sets. Furthermore, kinship information is used as a soft biometric modality to boost the performance of face verification via product of likelihood ratio and support vector machine based approaches. Using the proposed KVRL-fcDBN framework, an improvement of over 20% is observed in the performance of face verification.

  9. A unified Kalman filter

    Science.gov (United States)

    Stubberud, Allen R.

    2017-01-01

    When considering problems of linear sequential estimation, two versions of the Kalman filter, the continuous-time version and the discrete-time version, are often used. (A hybrid filter also exists.) In many applications in which the Kalman filter is used, the system to which the filter is applied is a linear continuous-time system, but the Kalman filter is implemented on a digital computer, a discrete-time device. The two general approaches for developing a discrete-time filter for implementation on a digital computer are: (1) approximate the continuous-time system by a discrete-time system (called discretization of the continuous-time system) and develop a filter for the discrete-time approximation; and (2) develop a continuous-time filter for the system and then discretize the continuous-time filter. Generally, the two discrete-time filters will be different, that is, it can be said that discretization and filter generation are not, in general, commutative operations. As a result, any relationship between the discrete-time and continuous-time versions of the filter for the same continuous-time system is often obfuscated. This is particularly true when an attempt is made to generate the continuous-time version of the Kalman filter through a simple limiting process (the sample period going to zero) applied to the discrete-time version. The correct result is, generally, not obtained. In a 1961 research report, Kalman showed that the continuous-time Kalman filter can be obtained from the discrete-time Kalman filter by taking limits as the sample period goes to zero if the white noise process for the continuous-time version is appropriately defined. Using this basic concept, a discrete-time Kalman filter can be developed for a continuous-time system as follows: (1) discretize the continuous-time system using Kalman's technique; and (2) develop a discrete-time Kalman filter for that discrete-time system. Kalman's results show that the discrete-time filter generated in
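
    Kalman's discretization step described above can be made concrete with the standard Van Loan construction: for a continuous-time model dx/dt = Ax + w with noise intensity Qc, a single matrix exponential yields the discrete pair (Ad, Qd) for sample period T. The model matrices below are hypothetical.

```python
import numpy as np
from scipy.linalg import expm

A = np.array([[0.0, 1.0],
              [0.0, -0.5]])      # hypothetical position/velocity model
Qc = np.diag([0.0, 0.1])         # continuous-time noise intensity
T = 0.01                         # sample period, s

# Van Loan: expm([[-A, Qc], [0, A^T]] * T) packs Ad and Qd together.
n = A.shape[0]
M = np.block([[-A, Qc], [np.zeros((n, n)), A.T]]) * T
E = expm(M)
Ad = E[n:, n:].T                 # discrete state-transition matrix
Qd = Ad @ E[:n, n:]              # discrete process-noise covariance

print("Ad =\n", Ad)
print("Qd =\n", Qd)
# As T -> 0, Ad -> I + A*T and Qd -> Qc*T, consistent with recovering
# the continuous-time filter in the limit, as the record describes.
```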

  10. Development of research tool to evaluate the potential of using chlorella sorokiniana as bio-filter in recycled tilapia production

    OpenAIRE

    Latif, Muhammad Saqib

    2016-01-01

    The current study attempted to develop research tools in order to evaluate whether Chlorella sorokiniana has the potential to perform as a bio-filter in recycled-water tilapia production. The overall objective was to test the hypothesis that C. sorokiniana will effectively remove nitrogenous catabolites from the water and benefit the tilapia with oxygen and nutrients by photosynthesis. Removal of ammonia and nitrite from the water is improved by fertilization with phosphate, th...

  11. Ceramic fiber filter technology

    Energy Technology Data Exchange (ETDEWEB)

    Holmes, B.L.; Janney, M.A.

    1996-06-01

    Fibrous filters have been used for centuries to protect individuals from dust, disease, smoke, and other gases or particulates. In the 1970s and 1980s ceramic filters were developed for filtration of hot exhaust gases from diesel engines. Tubular, or candle, filters have been made to remove particles from gases in pressurized fluidized-bed combustion and gasification-combined-cycle power plants. Very efficient filtration is necessary in power plants to protect the turbine blades. The limited lifespan of ceramic candle filters has been a major obstacle in their development. The present work is focused on forming fibrous ceramic filters using a papermaking technique. These filters are highly porous and therefore very lightweight. The papermaking process consists of filtering a slurry of ceramic fibers through a steel screen to form paper. Papermaking and the selection of materials will be discussed, as well as preliminary results describing the geometry of papers and relative strengths.

  12. Verification Games: Crowd-Sourced Formal Verification

    Science.gov (United States)

    2016-03-01

    (No abstract is available for this record; the source text contains only report documentation boilerplate. Period covered: JUN 2012 - SEP 2015; contract number FA8750.)

  13. Filter arrays

    Energy Technology Data Exchange (ETDEWEB)

    Page, Ralph H.; Doty, Patrick F.

    2017-08-01

    The various technologies presented herein relate to a tiled filter array that can be used in connection with performance of spatial sampling of optical signals. The filter array comprises filter tiles, wherein a first plurality of filter tiles are formed from a first material, the first material being configured such that only photons having wavelengths in a first wavelength band pass therethrough. A second plurality of filter tiles is formed from a second material, the second material being configured such that only photons having wavelengths in a second wavelength band pass therethrough. The first plurality of filter tiles and the second plurality of filter tiles can be interspersed to form the filter array comprising an alternating arrangement of first filter tiles and second filter tiles.
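
    The alternating two-material layout is easy to picture as a checkerboard mask; in the sketch below, 0 marks tiles passing the first wavelength band and 1 marks tiles passing the second. Array size and the sample scene are arbitrary.

```python
import numpy as np

# Checkerboard interspersion of two filter-tile materials:
# 0 = tile passing wavelength band 1, 1 = tile passing band 2.
rows, cols = 4, 6
tiles = np.add.outer(np.arange(rows), np.arange(cols)) % 2
print(tiles)

# Spatial sampling: each band sees only its own subset of the scene.
scene = np.random.default_rng(0).random((rows, cols))
band1_samples = scene[tiles == 0]
band2_samples = scene[tiles == 1]
print(band1_samples.size, band2_samples.size)   # interleaved coverage
```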

  14. Generic Kalman Filter Software

    Science.gov (United States)

    Lisano, Michael E., II; Crues, Edwin Z.

    2005-01-01

    The Generic Kalman Filter (GKF) software provides a standard basis for the development of application-specific Kalman-filter programs. Historically, Kalman filters have been implemented by customized programs that must be written, coded, and debugged anew for each unique application, then tested and tuned with simulated or actual measurement data. Total development times for typical Kalman-filter application programs have ranged from weeks to months. The GKF software can simplify the development process and reduce the development time by eliminating the need to re-create the fundamental implementation of the Kalman filter for each new application. The GKF software is written in the ANSI C programming language. It contains a generic Kalman-filter-development directory that, in turn, contains code for a generic Kalman filter function; more specifically, it contains a generically designed and generically coded implementation of linear, linearized, and extended Kalman filtering algorithms, including algorithms for state- and covariance-update and -propagation functions. The mathematical theory that underlies the algorithms is well known and has been reported extensively in the open technical literature. Also contained in the directory are a header file that defines generic Kalman-filter data structures and prototype functions, and template versions of application-specific subfunction and calling navigation/estimation routine code and headers. Once the user has provided a calling routine and the required application-specific subfunctions, the application-specific Kalman-filter software can be compiled and executed immediately. During execution, the generic Kalman-filter function is called from a higher-level navigation or estimation routine that preprocesses measurement data and post-processes output data. The generic Kalman-filter function uses the aforementioned data structures and five implementation-specific subfunctions, which have been developed by the user on
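
    The "generic core plus user-supplied subfunctions" structure can be mimicked in a few lines (the real GKF is ANSI C; this Python analogue and its model callbacks are purely illustrative): the core implements predict/update once, and each application plugs in its own model through callbacks.

```python
import numpy as np

def kf_step(x, P, z, f_model, h_model, Q, R):
    """Generic predict/update; applications supply the model callbacks."""
    F = f_model(x)                       # user subfunction: transition matrix
    x_pred = F @ x
    P_pred = F @ P @ F.T + Q
    H = h_model(x_pred)                  # user subfunction: measurement matrix
    S = H @ P_pred @ H.T + R
    K = P_pred @ H.T @ np.linalg.inv(S)
    x_new = x_pred + K @ (z - H @ x_pred)
    P_new = (np.eye(len(x)) - K @ H) @ P_pred
    return x_new, P_new

# Application-specific plug-ins: a 1D constant-velocity tracker.
dt = 0.1
f_model = lambda x: np.array([[1.0, dt], [0.0, 1.0]])
h_model = lambda x: np.array([[1.0, 0.0]])        # observe position only

x, P = np.zeros(2), np.eye(2)
x, P = kf_step(x, P, np.array([1.2]), f_model, h_model,
               Q=0.01 * np.eye(2), R=np.array([[0.5]]))
print(x)
```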

  15. Development of improved low-cost ceramic water filters for viral removal in the Haitian context

    OpenAIRE

    Guerrero-Latorre, Laura; Rusiñol Arantegui, Marta; Hundesa Gonfa, Ayalkibet; Garcia Vallès, Maite; Martínez Manent,Salvador; Joseph, Osnick; Bofill Mas, Silvia; Gironès Llop, Rosina

    2015-01-01

    Household-based water treatment (HWT) is increasingly being promoted to improve water quality and, therefore, health status in low-income countries. Ceramic water filters (CWFs) are used in many regions as sustainable HWT and have been proven to meet World Health Organization (WHO) microbiological performance targets for bacterial removal (2-4 log); however, the described viral removal efficiencies are insufficient to significantly reduce the associated risk of viral infection. With the object...

  16. Development of 2.8 V Ketjen black supercapacitors with high rate capabilities for AC line filtering

    Science.gov (United States)

    Yoo, Yongju; Park, Jinwoo; Kim, Min-Seop; Kim, Woong

    2017-08-01

    Supercapacitors are generally more compact than conventional bulky aluminum electrolytic capacitors (AECs). Replacement of AECs with supercapacitors can lead to miniaturization of electronic devices. However, even state-of-the-art supercapacitors developed in laboratories are superior to or competitive with AECs only in low voltage applications. Here we develop Ketjen black (KB) supercapacitors whose areal capacitance, operating voltage, and impedance phase angle at 120 Hz reach values of ∼574 μF cm-2, 2.8 V, and ∼-80°, respectively. In addition, we demonstrate that an AC line filtering circuit with three supercapacitors connected in series can extend the application voltage without significant sacrifice in rate capability (ϕ ∼ -77° at 120 Hz). On the other hand, KBs are much less expensive than carbon materials previously demonstrated for AC line filtering and hence are very attractive for practical applications. We believe that this demonstration of high-performance supercapacitors made from low-cost carbon materials is both scientifically interesting and important for practical applications.

  17. Procedure Verification and Validation Toolset Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The proposed research is aimed at investigating a procedure verification and validation toolset, which will allow the engineers who are responsible for developing...

  18. The high-energy multi-group HEST1.0 library based on ENDF/B-VII.0: development, verification and preliminary application

    Science.gov (United States)

    Wu, Jun; Chen, Yi-Xue; Wang, Wei-Jin; Yin, Wen; Liang, Tian-Jiao; Jia, Xue-Jun

    2012-03-01

    ENDF/B-VII.0, which was released by the USA Cross Section Evaluation Working Group (CSEWG) in December 2006, was demonstrated to perform much better than previous ENDF evaluations over a broad range of benchmark experiments. A high-energy (up to 150 MeV) multi-group library set named HEST1.0 with 253-neutron and 48-photon groups has been developed based on ENDF/B-VII.0 using the NJOY code. This paper provides a summary of the procedure to produce the library set and a detailed description of the verification of the multi-group library set by several shielding benchmark devices, in particular for high-energy neutron data. In addition, the first application of HEST1.0 to the shielding design of the China Spallation Neutron Source (CSNS) is demonstrated.

  19. The high-energy multi-group HEST1.0 library based on ENDF/B-VII.0: development, verification and preliminary application

    Institute of Scientific and Technical Information of China (English)

    WU Jun; CHEN Yi-Xue; WANG Wei-Jin; YIN Wen; LIANG Tian-Jiao; JIA Xue-Jun

    2012-01-01

    ENDF/B-VII.0, which was released by the USA Cross Section Evaluation Working Group (CSEWG) in December 2006, was demonstrated to perform much better than previous ENDF evaluations over a broad range of benchmark experiments. A high-energy (up to 150 MeV) multi-group library set named HEST1.0 with 253-neutron and 48-photon groups has been developed based on ENDF/B-VII.0 using the NJOY code. This paper provides a summary of the procedure to produce the library set and a detailed description of the verification of the multi-group library set by several shielding benchmark devices, in particular for high-energy neutron data. In addition, the first application of HEST1.0 to the shielding design of the China Spallation Neutron Source (CSNS) is demonstrated.

  20. Development of a Compton camera for online ion beam range verification via prompt γ detection. Session: HK 12.6 Mo 18:30

    Energy Technology Data Exchange (ETDEWEB)

    Aldawood, S. [LMU Munich, Garching (Germany); King Saud University, Riyadh (Saudi Arabia); Liprandi, S.; Marinsek, T.; Bortfeldt, J.; Lang, C.; Lutter, R.; Dedes, G.; Parodi, K.; Thirolf, P.G. [LMU Munich, Garching (Germany); Maier, L.; Gernhaeuser, R. [TU Munich, Garching (Germany); Kolff, H. van der; Schaart, D. [TU Delft (Netherlands); Castelhano, I. [University of Lisbon, Lisbon (Portugal)

    2015-07-01

    Real-time ion beam range verification in hadron therapy plays a major role in cancer treatment evaluation, making treatment interruption possible if the planned and actual ion ranges are mismatched. An imaging system is being developed in Garching aiming to detect the prompt γ rays induced by nuclear reactions between the ion beam and biological tissue. The Compton camera prototype consists of a stack of six customized double-sided Si-strip detectors (DSSSD, 50 x 50 mm{sup 2}, 128 strips/side) acting as scatterer, while the absorber is formed by a monolithic LaBr{sub 3}:Ce scintillator crystal (50 x 50 x 30 mm{sup 3}) read out by a position-sensitive multi-anode photomultiplier (Hamamatsu H9500). Studies of the Compton camera properties and of its individual components are in progress both in the laboratory and at the online facilities.

  1. Development of Filtered Bispectrum for EEG Signal Feature Extraction in Automatic Emotion Recognition Using Artificial Neural Networks

    Directory of Open Access Journals (Sweden)

    Prima Dewi Purnamasari

    2017-05-01

    Full Text Available The development of automatic emotion detection systems has recently gained significant attention due to the growing possibility of their implementation in several applications, including affective computing and various fields within biomedical engineering. Use of the electroencephalograph (EEG signal is preferred over facial expression, as people cannot control the EEG signal generated by their brain; the EEG ensures a stronger reliability in the psychological signal. However, because of its uniqueness between individuals and its vulnerability to noise, use of EEG signals can be rather complicated. In this paper, we propose a methodology to conduct EEG-based emotion recognition by using a filtered bispectrum as the feature extraction subsystem and an artificial neural network (ANN as the classifier. The bispectrum is theoretically superior to the power spectrum because it can identify phase coupling between the nonlinear process components of the EEG signal. In the feature extraction process, to extract the information contained in the bispectrum matrices, a 3D pyramid filter is used for sampling and quantifying the bispectrum value. Experiment results show that the mean percentage of the bispectrum value from 5 × 5 non-overlapped 3D pyramid filters produces the highest recognition rate. We found that reducing the number of EEG channels down to only eight in the frontal area of the brain does not significantly affect the recognition rate, and the number of data samples used in the training process is then increased to improve the recognition rate of the system. We have also utilized a probabilistic neural network (PNN as another classifier and compared its recognition rate with that of the back-propagation neural network (BPNN, and the results show that the PNN produces a comparable recognition rate and lower computational costs. Our research shows that the extracted bispectrum values of an EEG signal using 3D filtering as a feature extraction
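
    The quantity the paper filters with its 3D pyramid kernels is the bispectrum, B(f1, f2) = E[X(f1) X(f2) X*(f1 + f2)], which is large where two frequency components are phase-coupled. A direct FFT-based estimate on a synthetic signal (all sizes and frequencies invented, not EEG data) is sketched below; the coupled component at f1 + f2 produces the dominant peak.

```python
import numpy as np

rng = np.random.default_rng(0)
fs, nseg, seglen = 256, 64, 256        # 1 Hz frequency bins
t = np.arange(seglen) / fs
B = np.zeros((seglen, seglen), complex)

for _ in range(nseg):
    ph1, ph2 = rng.uniform(0, 2 * np.pi, 2)
    x = (np.cos(2 * np.pi * 20 * t + ph1)
         + np.cos(2 * np.pi * 45 * t + ph2)
         + np.cos(2 * np.pi * 65 * t + ph1 + ph2)   # phase-coupled at 20+45
         + 0.5 * rng.normal(size=seglen))
    X = np.fft.fft(x * np.hanning(seglen))
    idx = np.arange(seglen)
    # Accumulate X(f1) X(f2) conj(X(f1+f2)) for all (f1, f2) pairs, mod N.
    B += np.outer(X, X) * np.conj(X[(idx[:, None] + idx[None, :]) % seglen])
B = np.abs(B) / nseg

q = seglen // 2
f1, f2 = np.unravel_index(np.argmax(B[:q, :q]), (q, q))
print(f"bispectrum peak near ({f1} Hz, {f2} Hz)")   # expect (20, 45) or (45, 20)
```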

  2. Vehicle Detection Based on Probability Hypothesis Density Filter

    Directory of Open Access Journals (Sweden)

    Feihu Zhang

    2016-04-01

    Full Text Available In the past decade, vehicle detection has improved significantly. By utilizing cameras, vehicles can be detected in Regions of Interest (ROI) in complex environments. However, vision techniques often suffer from false positives and a limited field of view. In this paper, a LiDAR-based vehicle detection approach is proposed using the Probability Hypothesis Density (PHD) filter. The proposed approach consists of two phases: the hypothesis generation phase to detect potential objects and the hypothesis verification phase to classify objects. The performance of the proposed approach is evaluated in complex scenarios and compared with the state of the art.

  3. Liquefied Natural Gas (LNG) dispenser verification device

    Science.gov (United States)

    Xiong, Maotao; Yang, Jie-bin; Zhao, Pu-jun; Yu, Bo; Deng, Wan-quan

    2013-01-01

    The working principle and calibration status of LNG (Liquefied Natural Gas) dispensers in China are introduced. Owing to the shortcomings of the weighing method in the calibration of LNG dispensers, an LNG dispenser verification device has been researched. The verification device is based on the master meter method to verify LNG dispensers in the field. Experimental results indicate that the device has stable performance, a high accuracy level and a flexible construction, reaching the international advanced level. The LNG dispenser verification device will thus promote the development of the LNG dispenser industry in China and improve the technical level of LNG dispenser manufacturing.

  4. Verification Account Management System (VAMS)

    Data.gov (United States)

    Social Security Administration — The Verification Account Management System (VAMS) is the centralized location for maintaining SSA's verification and data exchange accounts. VAMS account management...

  5. Verification of ceramic structures

    NARCIS (Netherlands)

    Behar-Lafenetre, S.; Cornillon, L.; Rancurel, M.; Graaf, D. de; Hartmann, P.; Coe, G.; Laine, B.

    2012-01-01

    In the framework of the "Mechanical Design and Verification Methodologies for Ceramic Structures" contract [1] awarded by ESA, Thales Alenia Space has investigated literature and practices in affiliated industries to propose a methodological guideline for verification of ceramic spacecraft and instr

  6. The Identification of Filters and Interdependencies for Effective Resource Allocation: Coupling the Mitigation of Natural Hazards to Economic Development.

    Science.gov (United States)

    Agar, S. M.; Kunreuther, H.

    2005-12-01

    Policy formulation for the mitigation and management of risks posed by natural hazards requires that governments confront difficult decisions for resource allocation and be able to justify their spending. Governments also need to recognize when spending offers little improvement and the circumstances in which relatively small amounts of spending can make substantial differences. Because natural hazards can have detrimental impacts on local and regional economies, patterns of economic development can also be affected by spending decisions for disaster mitigation. This paper argues that by mapping interdependencies among physical, social and economic factors, governments can improve resource allocation to mitigate the risks of natural hazards while improving economic development on local and regional scales. Case studies of natural hazards in Turkey have been used to explore specific "filters" that act to modify short- and long-term outcomes. Pre-event filters can prevent an event from becoming a natural disaster or change a routine event into a disaster. Post-event filters affect both short and long-term recovery and development. Some filters cannot be easily modified by spending (e.g., rural-urban migration) but others (e.g., land-use practices) provide realistic spending targets. Net social benefits derived from spending, however, will also depend on the ways by which filters are linked, or so-called "interdependencies". A single weak link in an interdependent system, such as a power grid, can trigger a cascade of failures. Similarly, weak links in social and commercial networks can send waves of disruption through communities. Conversely, by understanding the positive impacts of interdependencies, spending can be targeted to maximize net social benefits while mitigating risks and improving economic development. Detailed information on public spending was not available for this study but case studies illustrate how networks of interdependent filters can modify

  7. Demonstration of Design Verification Model of Rubidium Frequency Standard

    CERN Document Server

    Ghosal, Bikash; Nandanwar, Satish; Banik, Alak; Dasgupta, K S; Saxena, G M

    2011-01-01

    In this paper we report the development of the design verification model (DVM) of a Rb atomic frequency standard. The Rb atomic frequency standard, or clock, has two distinct parts. One is the Physics Package, where the hyperfine transitions produce the clock signal in the integrated filter cell configuration; the other is the electronic circuits, which include the generator of the resonant microwave hyperfine frequency, the phase modulator and the phase-sensitive detector. In this paper the details of the Rb Physics Package and the electronic circuits are given. The effect of putting the photo detector inside the microwave cavity is studied and reported, together with its effect on the resonance signal profile. The Rb clock frequency stability measurements are also discussed.

  8. Verification and Validation in Systems Engineering

    CERN Document Server

    Debbabi, Mourad; Jarraya, Yosr; Soeanu, Andrei; Alawneh, Luay

    2010-01-01

    "Verification and validation" represents an important process used for the quality assessment of engineered systems and their compliance with the requirements established at the beginning of or during the development cycle. Debbabi and his coauthors investigate methodologies and techniques that can be employed for the automatic verification and validation of systems engineering design models expressed in standardized modeling languages. Their presentation includes a bird's eye view of the most prominent modeling languages for software and systems engineering, namely the Unified Model

  9. An Assembler Driven Verification Methodology (ADVM)

    CERN Document Server

    Macbeth, John S; Gray, Ken

    2011-01-01

    This paper presents an overview of an assembler driven verification methodology (ADVM) that was created and implemented for a chip card project at Infineon Technologies AG. The primary advantage of this methodology is that it enables rapid porting of directed tests to new targets and derivatives, with only a minimum amount of code refactoring. As a consequence, considerable verification development time and effort was saved.

  10. Advanced hot gas filter development. Topical report, May 1995--December 1996

    Energy Technology Data Exchange (ETDEWEB)

    Hurley, J.L.; June, M.R.

    1997-12-31

    Porous iron aluminide was evaluated for use as a particulate filter in pressurized fluid-bed combustion (PFBC) and integrated gasification combined cycles (IGCC) in a short-term test. Three alloy compositions were tested: Fe{sub 3}Al 5% chromium (FAL), Fe{sub 3}Al 2% chromium (FAS) and FeAl 0% chromium. The test conditions simulated air-blown (Tampa Electric) and oxygen-blown (Sierra Pacific) gasifiers with one test gas composition. Four test conditions were used, with hydrogen sulfide levels varying from 783 ppm to 78,300 ppm at 1 atmosphere and temperatures ranging between 925 F and 1200 F. The iron aluminide was found capable of withstanding the proposed operating conditions and of giving years of service. The production method and preferred composition were established as seamless cylinders of Fe{sub 3}Al 2% chromium with a preoxidation of seven hours at 1472 F.

  11. Formal Verification of Continuous Systems

    DEFF Research Database (Denmark)

    Sloth, Christoffer

    2012-01-01

    The purpose of this thesis is to develop a method for verifying timed temporal properties of continuous dynamical systems, and to develop a method for verifying the safety of an interconnection of continuous systems. The methods must be scalable in the number of continuous variables, and the verification procedures should be algorithmically synthesizable. Autonomous control plays an important role in many safety-critical systems. This implies that a malfunction in the control system can have catastrophic consequences, e.g., in space applications where a design flaw can result in large economic losses. Furthermore, a malfunction in the control system of a surgical robot may cause death of patients. The previous examples involve complex systems that are required to operate according to complex specifications. The systems cannot be formally verified by modern verification techniques, due...

  12. Optimal filtering

    CERN Document Server

    Anderson, Brian D O

    2005-01-01

    This graduate-level text augments and extends beyond undergraduate studies of signal processing, particularly in regard to communication systems and digital filtering theory. Vital for students in the fields of control and communications, its contents are also relevant to students in such diverse areas as statistics, economics, bioengineering, and operations research.Topics include filtering, linear systems, and estimation; the discrete-time Kalman filter; time-invariant filters; properties of Kalman filters; computational aspects; and smoothing of discrete-time signals. Additional subjects e
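
    For orientation, the discrete-time Kalman filter at the heart of this text is a two-step predict/update recursion. A minimal sketch in Python/NumPy, using the conventional textbook matrix names F, H, Q, R rather than any notation specific to this book:

        import numpy as np

        def kalman_step(x, P, z, F, H, Q, R):
            """One predict/update cycle: x is the state estimate, P its covariance, z the new measurement."""
            # Predict through the linear dynamics
            x_pred = F @ x
            P_pred = F @ P @ F.T + Q
            # Update with the measurement
            S = H @ P_pred @ H.T + R              # innovation covariance
            K = P_pred @ H.T @ np.linalg.inv(S)   # Kalman gain
            x_new = x_pred + K @ (z - H @ x_pred)
            P_new = (np.eye(len(x)) - K @ H) @ P_pred
            return x_new, P_new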

  13. Program Verification of Numerical Computation

    OpenAIRE

    Pantelis, Garry

    2014-01-01

    These notes outline a formal method for program verification of numerical computation. It forms the basis of the software package VPC in its initial phase of development. Much of the style of presentation is in the form of notes that outline the definitions and rules upon which VPC is based. The initial motivation of this project was to address some practical issues of computation, especially of numerically intensive programs that are commonplace in computer models. The project evolved into a...

  14. Verification and validation plan for the SFR system analysis module

    Energy Technology Data Exchange (ETDEWEB)

    Hu, R. [Argonne National Lab. (ANL), Argonne, IL (United States)

    2014-12-18

    This report documents the Verification and Validation (V&V) Plan for software verification and validation of the SFR System Analysis Module (SAM), developed at Argonne National Laboratory for sodium fast reactor whole-plant transient analysis. SAM is developed under the DOE NEAMS program and is part of the Reactor Product Line toolkit. The SAM code, the phenomena and computational models of interest, the software quality assurance, and the verification and validation requirements and plans are discussed in this report.

  15. Advanced verification topics

    CERN Document Server

    Bhattacharya, Bishnupriya; Hall, Gary; Heaton, Nick; Kashai, Yaron; Khan Neyaz; Kirshenbaum, Zeev; Shneydor, Efrat

    2011-01-01

    The Accellera Universal Verification Methodology (UVM) standard is architected to scale, but verification is growing and in more than just the digital design dimension. It is growing in the SoC dimension to include low-power and mixed-signal and the system integration dimension to include multi-language support and acceleration. These items and others all contribute to the quality of the SOC so the Metric-Driven Verification (MDV) methodology is needed to unify it all into a coherent verification plan. This book is for verification engineers and managers familiar with the UVM and the benefits it brings to digital verification but who also need to tackle specialized tasks. It is also written for the SoC project manager that is tasked with building an efficient worldwide team. While the task continues to become more complex, Advanced Verification Topics describes methodologies outside of the Accellera UVM standard, but that build on it, to provide a way for SoC teams to stay productive and profitable.

  16. Development of multi-filter spectroradiometry; Filter hoshiki ni yoru bunka hosharyo no keisoku hoho to sono supekutoru no hyogen hoho ni tsuite

    Energy Technology Data Exchange (ETDEWEB)

    Miyake, Y.; Aoshima, T.; Minoda, T.; Kato, T.; Kondo, S. [Eiko Instruments Trading Co. Ltd., Tokyo (Japan)

    1996-10-27

    Described in this paper is a technique of solar radiation spectroradiometry in which high-resolution wavelength computation is added to a multi-filter method. Upon entering the atmosphere, the solar spectrum is scattered and absorbed by constituent elements such as gas, aerosol and cloud particles, and its spectral contour is deformed in a complicated manner as a function of wavelength. Taking advantage of the fact that the scattering and absorption characteristics of some of these elements are constant with respect to wavelength, a simple equation was constructed that enables high-resolution spectral measurement across wavelength, compensating for the limited measurable wavelengths of the conventional multi-filter method. The new method is less expensive than the grating method thanks to the use of filters, is capable of determining spectral radiation quantities with a precision of ±5%, and requires less memory capacity for data storage. It enables data collection under the various atmospheric conditions that the four seasons present, for which the difficult-to-apply and expensive grating spectroradiometer is poorly suited. It is expected that this method will find use in collecting basic data for the design of photovoltaic power generation systems, in the study of photochemical reactions in agriculture, and in collecting basic data for daylighting. 1 ref., 6 figs., 2 tabs.

  17. Current Situation and Development of the Natural Gas Metrological Verification Technology

    Institute of Scientific and Technical Information of China (English)

    赵士海; 刘博韬

    2015-01-01

    With the continually increasing demand for natural gas and the global development of natural gas trade, the accuracy of natural gas flow measurement is attracting more and more attention from both suppliers and purchasers. In this paper, the technology status and development trends of orifice-plate flowmeters, ultrasonic flowmeters and turbine flowmeters, which are widely applied in natural gas flow metering, are reviewed. The development of natural gas metrological verification stations used for flow measurement traceability at home and abroad is also analyzed.

  18. SU-E-T-105: Development of 3D Dose Verification System for Volumetric Modulated Arc Therapy Using Improved Polyacrylamide-Based Gel Dosimeter

    Energy Technology Data Exchange (ETDEWEB)

    Ono, K; Fujimoto, S; Akagi, Y; Hirokawa, Y [Hiroshima Heiwa Clinic, Hiroshima (Japan); Hayashi, S [Hiroshima International University, Hiroshima (Japan); Miyazawa, M [R-TECH.INC, Tokyo (Japan)

    2014-06-01

    Purpose: The aim of this dosimetric study was to develop a 3D dose verification system for volumetric modulated arc therapy (VMAT) using a polyacrylamide-based gel (PAGAT) dosimeter whose sensitivity is improved by magnesium chloride (MgCl{sub 2}). Methods: PAGAT gel containing MgCl{sub 2} as a sensitizer was prepared in this study. Methacrylic-acid-based gel (MAGAT) was also prepared to compare its dosimetric characteristics with PAGAT gel. Cylindrical glass vials (4 cm diameter, 12 cm length) filled with each polymer gel were irradiated with a 6 MV photon beam using a Novalis Tx linear accelerator (Varian/BrainLAB). The irradiated polymer gel dosimeters were scanned with a Signa 1.5 T MRI system (GE), and dose calibration curves were obtained using the T{sub 2} relaxation rate (R{sub 2} = 1/T{sub 2}). Dose rate (100-600 MU min{sup −1}) and fractionation (1-8 fractions) were varied. In addition, a cubic acrylic phantom (10 × 10 × 10 cm{sup 3}) filled with improved PAGAT gel and inserted into the IMRT phantom (IBA) was irradiated with VMAT (RapidArc). A C-shape structure was used for the VMAT planning with the Varian Eclipse treatment planning system (TPS). The dose comparison between the TPS and measurements with the polymer gel dosimeter was accomplished by gamma index analysis, overlaying the dose profiles for a set of data on selected planes using in-house developed software. Results: The dose rate and fractionation dependence of the improved PAGAT gel were smaller than those of MAGAT gel. A high similarity was found by overlaying the dose profiles measured with the improved PAGAT gel dosimeter and the TPS dose, and the mean pass rate of the gamma index analysis using 3%/3 mm criteria reached 90% on orthogonal planes for VMAT. Conclusion: The in-house developed 3D dose verification system using the improved polyacrylamide-based gel dosimeter has potential as an effective tool for VMAT QA.
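
    The gamma index analysis referred to above scores each measured point by combining a dose-difference criterion and a distance-to-agreement criterion; a point passes if gamma <= 1. A minimal 1D illustration of the standard global gamma definition in Python/NumPy (the textbook formula, not the in-house software used in the study):

        import numpy as np

        def gamma_1d(dose_eval, dose_ref, positions, dd=0.03, dta=3.0):
            """Per-point global gamma with dose criterion dd (fraction of max dose) and DTA in mm."""
            d_max = dose_ref.max()
            gamma = np.empty_like(dose_eval, dtype=float)
            for i, (xi, di) in enumerate(zip(positions, dose_eval)):
                dr = (positions - xi) / dta               # normalized distance term
                dD = (dose_ref - di) / (dd * d_max)       # normalized dose-difference term
                gamma[i] = np.sqrt(dr**2 + dD**2).min()   # search over all reference points
            return gamma

        # pass rate with 3%/3 mm criteria:
        # pass_rate = np.mean(gamma_1d(measured, planned, x_mm) <= 1.0)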

  19. Recent advances in polarized ³He based neutron spin filter development

    Science.gov (United States)

    Chen, Wangchun; Gentile, Thomas; Erwin, Ross; Watson, Shannon; Krycka, Kathryn; Ye, Qiang; NCNR NIST Team; University of Maryland Team

    2015-04-01

    Polarized ³He neutron spin filters (NSFs) are based on the strong spin-dependence of the neutron absorption cross section of ³He. NSFs can polarize large-area, widely divergent, and broadband neutron beams effectively and allow for combining a neutron polarizer and a spin flipper into a single polarizing device. The latter capability utilizes ³He spin inversion based on the adiabatic fast passage (AFP) nuclear magnetic resonance technique. Polarized ³He NSFs are significantly expanding the polarized neutron measurement capabilities at the NIST Center for Neutron Research (NCNR). Here we present an overview of ³He NSF applications to small-angle neutron scattering, thermal triple axis spectrometry, and wide-angle polarization analysis. We discuss a recent upgrade of our spin-exchange optical pumping (SEOP) systems that utilize chirped volume holographic gratings for spectral narrowing. The new capability allows us to polarize rubidium/potassium hybrid SEOP cells over a liter in volume within a day, with ³He polarizations up to 88%. Finally, we discuss how we can achieve nearly lossless ³He polarization inversion with AFP.
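
    For context, the spin filtering described here is commonly summarized by two standard relations (general textbook results, not taken from this record): a cell with neutron opacity O(λ) and ³He polarization P_He transmits an unpolarized beam with transmission and polarization

        T_n = e^{-O}\cosh(O\,P_{\mathrm{He}}), \qquad P_n = \tanh(O\,P_{\mathrm{He}})

    so inverting P_He by AFP flips the sign of P_n with no mechanical change - the combined polarizer/spin-flipper capability noted above.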

  20. SSN Verification Service

    Data.gov (United States)

    Social Security Administration — The SSN Verification Service is used by Java applications to execute the GUVERF02 service using the WebSphere/CICS Interface. It accepts several input data fields...

  1. Integrated Java Bytecode Verification

    DEFF Research Database (Denmark)

    Gal, Andreas; Probst, Christian; Franz, Michael

    2005-01-01

    Existing Java verifiers perform an iterative data-flow analysis to discover the unambiguous type of values stored on the stack or in registers. Our novel verification algorithm uses abstract interpretation to obtain definition/use information for each register and stack location in the program, which in turn is used to transform the program into Static Single Assignment form. In SSA, verification is reduced to simple type compatibility checking between the definition type of each SSA variable and the type of each of its uses. Inter-adjacent transitions of a value through stack and registers are no longer verified explicitly. This integrated approach is more efficient than traditional bytecode verification but still as safe as strict verification, as overall program correctness can be induced once the data flow from each definition to all associated uses is known to be type-safe.
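
    A toy illustration of the reduced checking problem in Python (hypothetical type lattice and variable names, not the authors' verifier): once the program is in SSA form, each value has exactly one definition, so verification becomes a single pass over definition/use pairs.

        # Pairs (t, u) such that a value of type t may be used where u is expected.
        SUBTYPE_OK = {("Integer", "Object"), ("String", "Object"),
                      ("Integer", "Integer"), ("String", "String"), ("Object", "Object")}

        def verify(ssa_defs, ssa_uses):
            """ssa_defs: variable -> definition type; ssa_uses: variable -> expected use types."""
            for var, def_type in ssa_defs.items():
                for use_type in ssa_uses.get(var, []):
                    if (def_type, use_type) not in SUBTYPE_OK:
                        raise TypeError(f"{var}: {def_type} not usable as {use_type}")

        verify({"v1": "Integer", "v2": "String"},
               {"v1": ["Object", "Integer"], "v2": ["Object"]})  # passes silently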

  2. Voltage verification unit

    Science.gov (United States)

    Martin, Edward J.

    2008-01-15

    A voltage verification unit and method for determining the absence of potentially dangerous potentials within a power supply enclosure without Mode 2 work is disclosed. With this device and method, a qualified worker follows a relatively simple protocol that involves a function test (hot, cold, hot) of the voltage verification unit before Lock Out/Tag Out; once the Lock Out/Tag Out is completed, testing or "trying" by simply reading a display on the voltage verification unit can be accomplished without exposure of the operator to the interior of the voltage supply enclosure. According to a preferred embodiment, the voltage verification unit includes test leads to allow diagnostics with other meters, without the necessity of accessing potentially dangerous bus bars or the like.

  3. Standard Verification System (SVS)

    Data.gov (United States)

    Social Security Administration — SVS is a mainframe program that accesses the NUMIDENT to perform SSN verifications. This program is called by SSA Internal applications to verify SSNs. There is also...

  4. Development of digital device based work verification system for cooperation between main control room operators and field workers in nuclear power plants

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Seung Min, E-mail: jewellee@kaeri.re.kr [Korea Atomic Energy Research Institute, 305-353, 989-111 Daedeok-daero, Yuseong-gu, Daejeon (Korea, Republic of); Department of Nuclear and Quantum Engineering, Korea Advanced Institute of Science and Technology, 373-1, Guseong-dong, Yuseong-gu, Daejeon 305-701 (Korea, Republic of); Lee, Hyun Chul, E-mail: leehc@kaeri.re.kr [Korea Atomic Energy Research Institute, 305-353, 989-111 Daedeok-daero, Yuseong-gu, Daejeon (Korea, Republic of); Ha, Jun Su, E-mail: junsu.ha@kustar.ac.ae [Department of Nuclear Engineering, Khalifa University of Science Technology and Research, Abu Dhabi P.O. Box 127788 (United Arab Emirates); Seong, Poong Hyun, E-mail: phseong@kaist.ac.kr [Department of Nuclear and Quantum Engineering, Korea Advanced Institute of Science and Technology, 373-1, Guseong-dong, Yuseong-gu, Daejeon 305-701 (Korea, Republic of)

    2016-10-15

    Highlights: • A digital device-based work verification and cooperation support system was developed. • Requirements were derived by interviewing field operators with experience of mobile-based work support systems. • The usability of the proposed system was validated by conducting questionnaire surveys. • The proposed system will be useful if the manual or the set of guidelines is well constructed. - Abstract: Digital technologies have been applied in the nuclear field to check task results, monitor events and accidents, and transmit/receive data. The results of using digital devices have proven that these devices can provide high accuracy and convenience for workers, producing clear positive effects by reducing their workloads. In this study, as one step forward, a digital device-based cooperation support system, the nuclear cooperation support and mobile documentation system (Nu-COSMOS), is proposed to support communication between main control room (MCR) operators and field workers by verifying field workers' work results in nuclear power plants (NPPs). The proposed system consists of a mobile-based information storage system that supports field workers by providing various functions to make workers more trusted by MCR operators and to improve the efficiency of meetings, and a large-screen-based information sharing system that supports meetings by allowing both sides to share one medium. The usability of this system was estimated by interviewing field operators working in nuclear power plants and experts who have experience working as operators. A survey to estimate the usability of the suggested system and the suitability of its functions for field work was conducted with 35 subjects who have experience in field work or in support-system development research. The usability test was conducted using the system usability scale (SUS), which is widely used in industrial usability evaluation. Using questionnaires

  5. Hybrid Filter Membrane

    Science.gov (United States)

    Laicer, Castro; Rasimick, Brian; Green, Zachary

    2012-01-01

    Cabin environmental control is an important issue for a successful Moon mission. Due to the unique environment of the Moon, lunar dust control is one of the main problems that significantly diminishes the air quality inside spacecraft cabins. Therefore, this innovation was motivated by NASA's need to minimize the negative health impact that air-suspended lunar dust particles have on astronauts in spacecraft cabins. It is based on fabrication of a hybrid filter comprising nanofiber nonwoven layers coated on porous polymer membranes with uniform cylindrical pores. This design results in a high-efficiency gas particulate filter with low pressure drop and the ability to be easily regenerated to restore filtration performance. A hybrid filter was developed consisting of a porous membrane with uniform, micron-sized, cylindrical pore channels coated with a thin nanofiber layer. Compared to conventional filter media such as a high-efficiency particulate air (HEPA) filter, this filter is designed to provide high particle efficiency, low pressure drop, and the ability to be regenerated. These membranes have well-defined micron-sized pores and can be used independently as air filters with discrete particle size cut-off, or coated with nanofiber layers for filtration of ultrafine nanoscale particles. The filter consists of a thin design intended to facilitate filter regeneration by localized air pulsing. The main feature of this invention is the concept of combining a micro-engineered straight-pore membrane with nanofibers. The micro-engineered straight pore membrane can be prepared with extremely high precision. Because the resulting membrane pores are straight and not tortuous like those found in conventional filters, the pressure drop across the filter is significantly reduced. The nanofiber layer is applied as a very thin coating to enhance filtration efficiency for fine nanoscale particles. Additionally, the thin nanofiber coating is designed to promote capture of

  6. Development of a multi-layer radiochromic film dose verification system

    Institute of Scientific and Technical Information of China (English)

    张可; 谢玲灵; 张中柱; 戴建荣

    2014-01-01

    Objective: To verify the complex dose distributions produced by intensity-modulated radiation therapy (IMRT) quickly and accurately using radiochromic film (RCF), and to develop a multi-layer RCF dose verification system. Methods: With RCF as the carrier, measurement and analysis are performed by a dose verification system composed of a measurement phantom, RCF, a film scanner and verification software. The phantom is shaped like a human body and contains dedicated modules for ionization chamber, film and MOSFET dosimeters as well as an inhomogeneous module with lung and bone density inserts; the software provides 2D and 3D markerless registration and verification analysis functions. Results: The tongue-and-groove structure of the phantom, combined with the software's automatic markerless registration, allows rapid film fixing and registration with the planning data, and the RCF develops automatically without processing; this reduces the uncertainties in IMRT measurement and analysis and the physicist's workload. Patient IMRT verification used a 3% dose deviation and 3 mm distance-to-agreement as acceptance criteria, and the gamma analysis pass rates of six clinical cases were all >90%. Conclusion: The multi-layer RCF dose verification system is a multi-purpose tool for IMRT dose verification and routine quality assurance. It is convenient, accurate and realistic, carries a large amount of information, and can be used for quality verification of radiotherapy equipment such as linear accelerators, gamma knife, CyberKnife, afterloading units and seed implantation, as well as for 2D and 3D dose verification of IMRT patients.

  7. Advanced Filtering Techniques Applied to Spaceflight Project

    Data.gov (United States)

    National Aeronautics and Space Administration — IST-Rolla developed two nonlinear filters for spacecraft orbit determination during the Phase I contract. The theta-D filter and the cost based filter, CBF, were...

  8. [PIV: a computer-aided portal image verification system].

    Science.gov (United States)

    Fu, Weihua; Zhang, Hongzhi; Wu, Jing

    2002-12-01

    Portal image verification (PIV) is one of the key actions in the QA procedure for sophisticated accurate radiotherapy. The purpose of this study was to develop PIV software as a tool for improving the accuracy and visualization of portal field verification and for computing field placement errors. PIV was developed in the Visual C++ integrated environment under the Windows 95 operating system. It improves visualization by providing tools for image processing and multimode image display. Semi-automatic registration methods make verification more accurate than the view-box method. It can provide useful quantitative errors for regular fields. PIV is flexible and accurate. It is an effective tool for portal field verification.

  9. Development of nuclear thermal hydraulic verification test and evaluation technology - Development of fundamental technique for experiment of natural circulation phenomena in PWR systems

    Energy Technology Data Exchange (ETDEWEB)

    Park, Goon Cherl; Lee, Tae Ho; Kim, Moon Oh; Kim, Hak Joon [Seoul National University, Seoul (Korea)

    2000-04-01

    A dimensional analysis applying the two-fluid model of CFX-4.2 was performed. For verification of the analysis results, experimental measurement data for two-phase flow parameters in subcooled boiling flow were produced for vertical (0 deg) and inclined (60 deg) orientations. Through comparison of the analyses and experiments, the applicability of various two-phase flow models and the analysis capability of the code were evaluated. A technique for measuring bubble velocity in two-phase flow using a backscattering standard LDV was investigated from the slug to the bubbly flow regime. The range of velocities measured is from 0.2 to 1.5 m/s and that of bubble sizes is from 2 to 20 mm. For measurement of local temperature in boiling flow, microthermocouples were manufactured, and local liquid and vapor temperatures were measured in pool boiling and boiling flow. 66 refs., 74 figs., 4 tabs. (Author)

  10. USER CONTEXT MODELS : A FRAMEWORK TO EASE SOFTWARE FORMAL VERIFICATIONS

    OpenAIRE

    2010-01-01

    This article is accepted to appear in the ICEIS 2010 proceedings. Several works emphasize the difficulties of software verification applied to embedded systems. In past years, formal verification techniques and tools were widely developed and used by the research community. However, the use of formal verification at industrial scale remains difficult, expensive and requires a lot of time. This is due to the size and the complexity of the manipulated models, but also, to the impo...

  11. Region-specific growth effects in the developing rat prostate following fetal exposure to estrogenic ultraviolet filters.

    Science.gov (United States)

    Hofkamp, Luke; Bradley, Sarahann; Tresguerres, Jesus; Lichtensteiger, Walter; Schlumpf, Margret; Timms, Barry

    2008-07-01

    Exposure to environmental endocrine disruptors is a potential risk factor for humans. Many of these chemicals have been shown to exhibit disruption of normal cellular and developmental processes in animal models. Ultraviolet (UV) filters used as sunscreens in cosmetics have previously been shown to exhibit estrogenic activity in in vitro and in vivo assays. We examined the effects of two UV filters, 4-methylbenzylidene camphor (4-MBC) and 3-benzylidene camphor (3-BC), in the developing prostate of the fetal rat. Pregnant Long Evans rats were fed diets containing doses of 4-MBC and 3-BC that resulted in average daily intakes of these chemicals corresponding to the lowest observed adverse effects level (LOAEL) and the no observed adverse effects level (NOAEL) doses in prior developmental toxicity studies. Using digital photographs of serial sections from postnatal day 1 animals, we identified, contoured, and aligned the epithelial ducts from specific regions of the developing prostate, plus the accessory sex glands and calculated the total volume for each region from three-dimensional, surface-rendered models. Fetal exposure to 4-MBC (7.0 mg/kg body weight/day) resulted in a significant increase (p < 0.05) in tissue volume in the prostate and accessory sex glands. Treated males exhibited a 62% increase in the number of ducts in the caudal dorsal prostate. Increased distal branching morphogenesis appears to be a consequence of exposure in the ventral region, resulting in a 106% increase in ductal volume. 4-MBC exposure during development of the male reproductive accessory sex glands exhibited classical growth effects associated with estrogenic endocrine disruptors. The different regional responses suggest that the two developmental processes of ductal outgrowth and branching morphogenesis are affected independently by exposure to the environmental chemicals.

  12. Real-Time Flood Forecasting System Using Channel Flow Routing Model with Updating by Particle Filter

    Science.gov (United States)

    Kudo, R.; Chikamori, H.; Nagai, A.

    2008-12-01

    A real-time flood forecasting system using a channel flow routing model was developed for runoff forecasting at water-gauged and ungaged points along river channels. The system is based on a flood runoff model composed of upstream part models, tributary part models and downstream part models. The upstream part models and tributary part models are lumped rainfall-runoff models, and the downstream part models consist of a lumped rainfall-runoff model for hillslopes adjacent to a river channel and a kinematic flow routing model for a river channel. The flow forecast of this model is updated by Particle filtering of the downstream part model as well as by extended Kalman filtering of the upstream part model and the tributary part models. Particle filtering is a simple and powerful updating algorithm for non-linear and non-Gaussian systems, so it can be easily applied to the downstream part model without complicated linearization. The presented flood runoff model has an advantage over grid-based distributed models in the simplicity of its updating procedure, owing to its smaller number of state variables. This system was applied to the Gono-kawa River Basin in Japan, and the flood forecasting accuracy of the system with both Particle filtering and extended Kalman filtering was compared with that of the system with only extended Kalman filtering. In this study, water gauging stations in the objective basin were divided into two types of stations, that is, reference stations and verification stations. Reference stations were regarded as ordinary water gauging stations, and observed data at these stations were used for calibration and updating of the model. Verification stations were considered as ungaged or arbitrary points, and observed data at these stations were used not for calibration or updating but only for evaluation of forecasting accuracy. The result confirms that Particle filtering of the downstream part model improves forecasting accuracy of runoff at
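
    For readers unfamiliar with the updating algorithm, one bootstrap particle filter cycle is: propagate the particles through the model, reweight them by the likelihood of the new observation, and resample when the weights degenerate. A minimal sketch in Python/NumPy, where `transition` and `likelihood` stand in for the rainfall-runoff/routing model and the observation density and are assumptions of this sketch:

        import numpy as np

        def particle_filter_step(particles, weights, z, transition, likelihood, rng):
            """particles: (N, d) state array; weights: (N,) normalized weights; z: new observation."""
            particles = transition(particles, rng)        # sample the state transition model
            weights = weights * likelihood(z, particles)  # reweight by the observation density
            weights = weights / weights.sum()
            n = len(weights)
            if 1.0 / np.sum(weights**2) < 0.5 * n:        # effective sample size too small
                idx = rng.choice(n, size=n, p=weights)    # resample particles by weight
                particles, weights = particles[idx], np.full(n, 1.0 / n)
            return particles, weights

    Because no linearization is required, the same step applies unchanged to the non-linear kinematic routing model, which is the advantage over extended Kalman filtering noted above.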

  13. Software Development for Quality Verification of Remote Sensing Image Products

    Institute of Scientific and Technical Information of China (English)

    罗娇

    2013-01-01

    The quality verification of remote sensing image products is an essential step in the production of remote sensing satellite data. This research builds a common platform for routine quality testing of remote sensing image products; it supports remote sensing image products in multiple formats and implements metadata quality checks for TIF remote sensing image products. Three technologies are used in the system: the Java development language, the Eclipse RCP framework and WebSphere MQ. The system has the advantages of strong extensibility, strong reusability and easy maintenance.

  14. An approach for rapid development of nasal delivery of analgesics--identification of relevant features, in vitro screening and in vivo verification.

    Science.gov (United States)

    Wang, Shu; Chow, Moses S S; Zuo, Zhong

    2011-11-25

    Drug delivery via the nasal route is gaining increasing interest over the last two decades as an alternative to oral or parenteral drug administration. In the current study an approach for rapid identification of relevant features, screening and in vivo verification of potential therapeutic agents for nasal delivery was carried out using "analgesic agents" as an example. Four such drug candidates (rizatriptan, meloxicam, lornoxicam and nebivolol) were initially identified as potentially viable agents based on their therapeutic use and physicochemical characteristics. An in vitro screening was then carried out using the Calu-3 cell line model. Based on the in vitro screening results and the reported pharmacokinetic and the stability data, meloxicam was predicted to be the most promising drug candidate and was subsequently verified using an in vivo animal model. The in vivo results showed that nasal administration of meloxicam was comparable to its intravenous administration, with respect to plasma drug concentration and AUC(0-2h). In addition, nasal absorption of meloxicam was much more rapid with higher plasma drug concentration and AUC(0-2h) than that of oral administration. The current approach appears to be capable of developing "analgesic agents" suitable for nasal delivery. Further studies are needed to prove the clinical advantage of the specific selected agent, meloxicam, by nasal administration in patients.

  15. SMARD-REXUS-18: Development and Verification of an SMA Based CubeSat Solar Panel Deployment Mechanism

    Science.gov (United States)

    Grulich, M.; Koop, A.; Ludewig, P.; Gutsmiedl, J.; Kugele, J.; Ruck, T.; Mayer, I.; Schmid, A.; Dietmann, K.

    2015-09-01

    SMARD (Shape Memory Alloy Reusable Deployment Mechanism) is an experiment for a sounding rocket developed by students at Technische Universität München (TUM). It was launched in March 2015 on REXUS 18 (Rocket Experiments for University Students). The goal of SMARD was to develop a solar panel holddown and release mechanism (HDRM) for a CubeSat using shape memory alloys (SMA) for repeatable actuation and the ability to be quickly resettable. This paper describes the technical approach as well as the technological development and design of the experiment platform, which is capable of proving the functionality of the deployment mechanism. Furthermore, the realization of the experiment as well as the results of the flight campaign are presented. Finally, the future applications of the developed HDRM and its possible further developments are discussed.

  16. Embedded software verification and debugging

    CERN Document Server

    Winterholer, Markus

    2017-01-01

    This book provides comprehensive coverage of verification and debugging techniques for embedded software, which is frequently used in safety critical applications (e.g., automotive), where failures are unacceptable. Since the verification of complex systems needs to encompass the verification of both hardware and embedded software modules, this book focuses on verification and debugging approaches for embedded software with hardware dependencies. Coverage includes the entire flow of design, verification and debugging of embedded software and all key approaches to debugging, dynamic, static, and hybrid verification. This book discusses the current, industrial embedded software verification flow, as well as emerging trends with focus on formal and hybrid verification and debugging approaches. Includes in a single source the entire flow of design, verification and debugging of embedded software; Addresses the main techniques that are currently being used in the industry for assuring the quality of embedded softw...

  17. Improvement on post-OPC verification efficiency for contact/via coverage check by final CD biasing of metal lines and considering their location on the metal layout

    Science.gov (United States)

    Kim, Youngmi; Choi, Jae-Young; Choi, Kwangseon; Choi, Jung-Hoe; Lee, Sooryong

    2011-04-01

    As IC design complexity keeps increasing, it is more and more difficult to ensure pattern transfer after optical proximity correction (OPC) due to the continuous reduction of layout dimensions and the lithographic limit set by the k1 factor. To guarantee imaging fidelity, resolution enhancement technologies (RET) such as off-axis illumination (OAI), different types of phase shift masks and OPC techniques have been developed. For model-based OPC, post-OPC verification solutions that cross-check the contour image against the target layout continue to be developed - methods for generating contours and matching them to the target structure, and methods for filtering and sorting patterns to eliminate false errors and duplicates. Detecting only real errors while excluding false errors is the most important requirement for an accurate and fast verification process - it saves not only review time and engineering resources but also whole-wafer process time. In the general case of post-OPC verification for the metal-contact/via coverage (CC) check, the verification solution outputs a huge number of errors due to borderless design, so it is too difficult to review and correct all of them. This can cause the OPC engineer to miss real defects and, at the least, delay time to market. In this paper, we studied methods for increasing the efficiency of post-OPC verification, especially for the CC check. For metal layers, the final CD after the etch process shows various CD biases depending on the distance to neighboring patterns, so it is more reasonable to consider the final metal shape when confirming contact/via coverage. Through optimization of the biasing rule for different pitches and shapes of metal lines, we obtained more accurate and efficient verification results and decreased the review time needed to find real errors. In this paper, we suggest increasing the efficiency of the OPC verification process by applying a simple biasing rule to the metal layout instead of an etch model
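
    A toy sketch of the pitch-dependent biasing idea in Python (the bias table and margin are hypothetical; in practice they would be fitted to measured etch bias per pitch):

        # (maximum pitch in nm, per-edge bias in nm): hypothetical values for illustration.
        BIAS_BY_PITCH = [(100, -4.0), (200, -2.0), (400, 0.0)]

        def final_metal_width(drawn_width, pitch):
            """Apply the pitch-dependent etch bias to both edges of a drawn metal line."""
            for max_pitch, bias in BIAS_BY_PITCH:
                if pitch <= max_pitch:
                    return drawn_width + 2.0 * bias
            return drawn_width

        def contact_covered(contact_width, drawn_width, pitch, margin=2.0):
            """Flag a real coverage error only if the post-etch metal cannot enclose the contact."""
            return final_metal_width(drawn_width, pitch) >= contact_width + margin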

  18. THz Discrimination of Materials: Development of an Apparatus Based on Room Temperature Detection and Metasurfaces Selective Filters

    Science.gov (United States)

    Carelli, P.; Chiarello, F.; Torrioli, G.; Castellano, M. G.

    2017-03-01

    We present an apparatus for terahertz discrimination of materials designed to be fast, simple, compact, and economical in order to be suitable for preliminary on-field analysis. The system working principles, bio-inspired by the human vision of colors, are based on the use of an incoherent source, a room temperature detector, a series of microfabricated metamaterials selective filters, a very compact optics based on metallic ellipsoidal mirrors in air, and a treatment of the mirrors' surfaces that select the frequency band of interest. We experimentally demonstrate the operation of the apparatus in discriminating simple substances such as salt, staple foods, and grease. We present the system and the obtained results and discuss issues and possible developments.

  20. Multi-Axis Independent Electromechanical Load Control for Docking System Actuation Development and Verification Using dSPACE

    Science.gov (United States)

    Oesch, Christopher; Dick, Brandon; Rupp, Timothy

    2015-01-01

    The development of highly complex and advanced actuation systems to meet customer demands has accelerated as the use of real-time testing technology expands into multiple markets at Moog. Systems developed for the autonomous docking of human-rated spacecraft to the International Space Station (ISS) encompass multi-operational characteristics that place unique constraints on an actuation system. Real-time testing hardware has been used as a platform for incremental testing and development of the linear actuation system which controls initial capture and docking for vehicles visiting the ISS. This presentation will outline the role of dSPACE hardware as a platform for rapid control-algorithm prototyping as well as an Electromechanical Actuator (EMA) system dynamic loading simulator, both conducted at Moog to develop the safety-critical Linear Actuator System (LAS) of the NASA Docking System (NDS).

  1. Ultraviolet filters.

    Science.gov (United States)

    Shaath, Nadim A

    2010-04-01

    The chemistry, photostability and mechanism of action of ultraviolet filters are reviewed. The worldwide regulatory status of the 55 approved ultraviolet filters and their optical properties are documented. The photostabilty of butyl methoxydibenzoyl methane (avobenzone) is considered and methods to stabilize it in cosmetic formulations are presented.

  2. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT: STORMWATER MANAGEMENT STORMFILTER® TREATMENT SYSTEM USING PERLITE MEDIA

    Science.gov (United States)

    Verification testing of the Stormwater Management, Inc. StormFilter® Using Perlite Filter Media was conducted on a 0.7 acre drainage basin near downtown Griffin, Georgia. The system consists of an inlet bay, flow spreader, cartridge bay, overflow baffle, and outlet bay, housed in...

  3. ENVIRONMENTAL TECHNOLOGY VERIFICATION, TEST REPORT OF CONTROL OF BIOAEROSOLS IN HVAC SYSTEMS, AIRFLOW PRODUCTS AFP30

    Science.gov (United States)

    The Environmental Technology Verification report discusses the technology and performance of the AFP30 air filter for dust and bioaerosol filtration manufactured by Airflow Products. The pressure drop across the filter was 62 Pa clean and 247 Pa dust loaded. The filtration effici...

  4. Regression Verification Using Impact Summaries

    Science.gov (United States)

    Backes, John; Person, Suzette J.; Rungta, Neha; Thachuk, Oksana

    2013-01-01

    Regression verification techniques are used to prove equivalence of syntactically similar programs. Checking equivalence of large programs, however, can be computationally expensive. Existing regression verification techniques rely on abstraction and decomposition techniques to reduce the computational effort of checking equivalence of the entire program. These techniques are sound but not complete. In this work, we propose a novel approach to improve scalability of regression verification by classifying the program behaviors generated during symbolic execution as either impacted or unimpacted. Our technique uses a combination of static analysis and symbolic execution to generate summaries of impacted program behaviors. The impact summaries are then checked for equivalence using an off-the-shelf decision procedure. We prove that our approach is both sound and complete for sequential programs, with respect to the depth bound of symbolic execution. Our evaluation on a set of sequential C artifacts shows that reducing the size of the summaries can help reduce the cost of software equivalence checking. Various reduction, abstraction, and compositional techniques have been developed to help scale software verification techniques to industrial-sized systems. Although such techniques have greatly increased the size and complexity of systems that can be checked, analysis of large software systems remains costly. Regression analysis techniques, e.g., regression testing [16], regression model checking [22], and regression verification [19], restrict the scope of the analysis by leveraging the differences between program versions. These techniques are based on the idea that if code is checked early in development, then subsequent versions can be checked against a prior (checked) version, leveraging the results of the previous analysis to reduce analysis cost of the current version. Regression verification addresses the problem of proving equivalence of closely related program

  5. Assessment of ceramic membrane filters

    Energy Technology Data Exchange (ETDEWEB)

    Ahluwalia, R.K.; Geyer, H.K.; Im, K.H. [and others

    1995-08-01

    The objectives of this project include the development of analytical models for evaluating the fluid mechanics of membrane-coated, dead-end ceramic filters, and the determination of the effects of thermal and thermo-chemical aging on the material properties of emerging ceramic hot gas filters. A honeycomb cordierite monolith with a thin ceramic coating and a rigid candle filter were evaluated.

  6. GENERIC VERIFICATION PROTOCOL: DISTRIBUTED GENERATION AND COMBINED HEAT AND POWER FIELD TESTING PROTOCOL

    Science.gov (United States)

    This report is a generic verification protocol by which EPA’s Environmental Technology Verification program tests newly developed equipment for distributed generation of electric power, usually micro-turbine generators and internal combustion engine generators. The protocol will ...

  7. 75 FR 4101 - Enterprise Income Verification (EIV) System User Access Authorization Form and Rules of Behavior...

    Science.gov (United States)

    2010-01-26

    ... From the Federal Register Online via the Government Publishing Office DEPARTMENT OF HOUSING AND URBAN DEVELOPMENT Enterprise Income Verification (EIV) System User Access Authorization Form and Rules.... This notice also lists the following information: Title of Proposal: Enterprise Income Verification...

  8. Development and verification of a model for estimating the screening utility in the detection of PCBs in transformer oil.

    Science.gov (United States)

    Terakado, Shingo; Glass, Thomas R; Sasaki, Kazuhiro; Ohmura, Naoya

    2014-01-01

    A simple new model for estimating the screening performance (false positive and false negative rates) of a given test for a specific sample population is presented. The model is shown to give good results on a test population, and is used to estimate the performance on a sampled population. Using the model developed in conjunction with regulatory requirements and the relative costs of the confirmatory and screening tests allows evaluation of the screening test's utility in terms of cost savings. Testers can use the methods developed to estimate the utility of a screening program using available screening tests with their own sample populations.
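
    The utility evaluation reduces to comparing the expected cost of screen-then-confirm against confirming every sample. A minimal sketch in Python with purely hypothetical numbers (the FP/FN rates would come from the model in the paper, and the regulatory cost of a missed positive is deliberately omitted here):

        def screening_cost(n, prevalence, fp_rate, fn_rate, c_screen, c_confirm):
            """Expected cost when every sample is screened and only flagged samples are confirmed."""
            positives = n * prevalence
            negatives = n - positives
            flagged = positives * (1.0 - fn_rate) + negatives * fp_rate
            return n * c_screen + flagged * c_confirm

        confirm_all = 1000 * 50.0   # confirmatory-only testing of 1000 samples
        with_screen = screening_cost(n=1000, prevalence=0.05, fp_rate=0.10,
                                     fn_rate=0.02, c_screen=5.0, c_confirm=50.0)
        print(confirm_all, with_screen)   # 50000.0 vs 12200.0: screening saves cost here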

  9. 24 CFR 81.102 - Verification and enforcement to ensure GSE data integrity.

    Science.gov (United States)

    2010-04-01

    ... 24 Housing and Urban Development 1 2010-04-01 2010-04-01 false Verification and enforcement to ensure GSE data integrity. 81.102 Section 81.102 Housing and Urban Development Office of the Secretary... Provisions § 81.102 Verification and enforcement to ensure GSE data integrity. (a) Independent verification...

  10. Development of Filter-Blower Unit for use in the Advanced Nuclear Biological Chemical Protection System (ANBCPS) Helicopter/Transport-aircraft version

    NARCIS (Netherlands)

    Sabel, R.; Reffeltrath, P.A.; Jonkman, A.; Post, T.

    2006-01-01

    As a participant in the three-nation partnership for development of the ANBCP-S for use in Helicopters, Transport Aircraft and Fast Jet, the Royal Netherlands Airforce (RNLAF) picked up the challenge to design a Filter- Blower-Unit (FBU). Major Command (MajCom) of the RNLAF set priority to develop a

  12. Development and verification of signal processing system of avalanche photo diode for the active shields onboard ASTRO-H

    Science.gov (United States)

    Ohno, M.; Kawano, T.; Edahiro, I.; Shirakawa, H.; Ohashi, N.; Okada, C.; Habata, S.; Katsuta, J.; Tanaka, Y.; Takahashi, H.; Mizuno, T.; Fukazawa, Y.; Murakami, H.; Kobayashi, S.; Miyake, K.; Ono, K.; Kato, Y.; Furuta, Y.; Murota, Y.; Okuda, K.; Wada, Y.; Nakazawa, K.; Mimura, T.; Kataoka, J.; Ichinohe, Y.; Uchida, Y.; Katsuragawa, M.; Yoneda, H.; Sato, G.; Sato, R.; Kawaharada, M.; Harayama, A.; Odaka, H.; Hayashi, K.; Ohta, M.; Watanabe, S.; Kokubun, M.; Takahashi, T.; Takeda, S.; Kinoshita, M.; Yamaoka, K.; Tajima, H.; Yatsu, Y.; Uchiyama, H.; Saito, S.; Yuasa, T.; Makishima, K.

    2016-09-01

    The hard X-ray Imager and Soft Gamma-ray Detector onboard ASTRO-H demonstrate high sensitivity to hard X-ray (5-80 keV) and soft gamma-rays (60-600 keV), respectively. To reduce the background, both instruments are actively shielded by large, thick Bismuth Germanate scintillators. We have developed the signal processing system of the avalanche photodiode in the BGO active shields and have demonstrated its effectiveness after assembly in the flight model of the HXI/SGD sensor and after integration into the satellite. The energy threshold achieved is about 150 keV and anti-coincidence efficiency for cosmic-ray events is almost 100%. Installed in the BGO active shield, the developed signal processing system successfully reduces the room background level of the main detector.

  13. Development and verification of hardware for life science experiments in the Japanese Experiment Module "Kibo" on the International Space Station.

    Science.gov (United States)

    Ishioka, Noriaki; Suzuki, Hiromi; Asashima, Makoto; Kamisaka, Seiichiro; Mogami, Yoshihiro; Ochiai, Toshimasa; Aizawa-Yano, Sachiko; Higashibata, Akira; Ando, Noboru; Nagase, Mutsumu; Ogawa, Shigeyuki; Shimazu, Toru; Fukui, Keiji; Fujimoto, Nobuyoshi

    2004-03-01

    Japan Aerospace Exploration Agency (JAXA) has developed a cell biology experiment facility (CBEF) and a clean bench (CB) as common hardware in which life science experiments in the Japanese Experiment Module (JEM, known as "Kibo") of the International Space Station (ISS) can be performed. The CBEF, a CO2 incubator with a turntable that provides variable gravity levels, is the basic hardware required to carry out biological experiments using microorganisms, cells, tissues, small animals, plants, etc. The CB provides a closed aseptic operation area for life science and biotechnology experiments in Kibo. A phase contrast and fluorescence microscope is installed inside the CB. The biological experiment units (BEU) are designed to run individual experiments using the CBEF and the CB. A plant experiment unit (PEU) and two cell experiment units (CEU type 1 and type 2) for the BEU have been developed.

  14. Development and verification of signal processing system of avalanche photo diode for the active shields onboard ASTRO-H

    Energy Technology Data Exchange (ETDEWEB)

    Ohno, M., E-mail: ohno@hep01.hepl.hiroshima-u.ac.jp [Department of Physical Sciences, Hiroshima University, Hiroshima 739-8526 (Japan); Kawano, T.; Edahiro, I.; Shirakawa, H.; Ohashi, N.; Okada, C.; Habata, S.; Katsuta, J.; Tanaka, Y.; Takahashi, H.; Mizuno, T.; Fukazawa, Y. [Department of Physical Sciences, Hiroshima University, Hiroshima 739-8526 (Japan); Murakami, H.; Kobayashi, S.; Miyake, K.; Ono, K.; Kato, Y.; Furuta, Y.; Murota, Y.; Okuda, K. [Department of Physics, University of Tokyo, Tokyo 113-0033 (Japan); and others

    2016-09-21

    The hard X-ray Imager and Soft Gamma-ray Detector onboard ASTRO-H demonstrate high sensitivity to hard X-rays (5–80 keV) and soft gamma-rays (60–600 keV), respectively. To reduce the background, both instruments are actively shielded by large, thick Bismuth Germanate scintillators. We have developed the signal processing system of the avalanche photodiode in the BGO active shields and have demonstrated its effectiveness after assembly in the flight model of the HXI/SGD sensor and after integration into the satellite. The energy threshold achieved is about 150 keV and the anti-coincidence efficiency for cosmic-ray events is almost 100%. Installed in the BGO active shield, the developed signal processing system successfully reduces the room background level of the main detector. - Highlights: • Details of the development of the signal processing system for ASTRO-H are presented. • A digital filter in an FPGA is applied instead of a discrete analog circuit. • The expected performance is verified after integration into the satellite.

  15. Development and laboratory verification of control algorithms for formation flying configuration with a single-input control

    Science.gov (United States)

    Ovchinnikov, M.; Bindel, D.; Ivanov, D.; Smirnov, G.; Theil, S.; Zaramenskikh, I.

    2010-11-01

    Once orbited, the technological nanosatellite TNS-0 no. 1 is supposed to be used in one of the next missions to demonstrate an orbital maneuvering capability for eliminating the secular relative motion of two satellites due to the J2 harmonic of the Earth's gravitational field. It is assumed that the longitudinal axis of the satellite is stabilized along the induction vector of the geomagnetic field and that a thruster engine is installed along this axis. Continuous and impulsive thruster control algorithms eliminating the secular relative motion have been developed. Special equipment was developed at ZARM for demonstration and laboratory testing of the satellite motion identification and control algorithms. The facility consists of a horizontal smooth table and a mobile mock-up that can glide over the table surface on compressed air stored in on-board pressure tanks. Compressed air is used to control the translational and attitude motion of the mock-up, which is equipped with a number of pulse thrusters. In this work a dynamic model for the controlled motion of the mock-up over the table is developed. This allows us to simulate the relative motion of a pair of TNS-0 type nanosatellites in the plane of the orbit.

  16. Whole-body isometric force/torque measurements for functional assessment in neuro-rehabilitation: platform design, development and verification

    Directory of Open Access Journals (Sweden)

    Cavallo Giuseppe

    2009-10-01

    Abstract. Background: One of the main scientific and technological challenges of rehabilitation bioengineering is the development of innovative methodologies, based on the use of appropriate technological devices, for an objective assessment of patients undergoing a rehabilitation treatment. Such tools should be as fast and cheap to use as clinical scales, which are currently the daily instruments most widely used in routine clinical practice. Methods: A human-centered approach was used in the design and development of a mechanical structure equipped with eight force/torque sensors that record quantitative data during the initiation of a predefined set of Activities of Daily Living (ADL) tasks, in isometric conditions. Results: Preliminary results validated the appropriateness, acceptability and functionality of the proposed platform, which has now become a tool used for clinical research in three clinical centres. Conclusion: This paper presented the design and development of an innovative platform for whole-body force and torque measurements on human subjects. The platform has been designed to perform accurate quantitative measurements in isometric conditions with the specific aim of addressing the needs for functional assessment tests of patients undergoing a rehabilitation treatment as a consequence of a stroke. The versatility of the system also highlights several other interesting possible areas of application for therapy in neurorehabilitation, for research in basic neuroscience, and more.

  17. Updating the OMERACT filter

    DEFF Research Database (Denmark)

    Kirwan, John R; Boers, Maarten; Hewlett, Sarah

    2014-01-01

    OBJECTIVE: The Outcome Measures in Rheumatology (OMERACT) Filter provides guidelines for the development and validation of outcome measures for use in clinical research. The "Truth" section of the OMERACT Filter presupposes an explicit framework for identifying the relevant core outcomes that are universal to all studies of the effects of interventions. There is no published outline for instrument choice or development that is aimed at measuring outcome, was derived from broad consensus over its underlying philosophy, or includes a structured and documented critique. Therefore, a new proposal...

  18. Wind gust warning verification

    Science.gov (United States)

    Primo, Cristina

    2016-07-01

    Operational meteorological centres around the world increasingly include warnings as one of their regular forecast products. Warnings are issued to alert the public to extreme weather situations that might occur and lead to damage and losses. By forecasting these extreme events, meteorological centres help their potential users prevent the damage or losses they might suffer. However, verifying these warnings requires specific methods, not only because the events happen rarely, but also because a new temporal dimension is added when defining a warning, namely the time window of the forecasted event. This paper analyses the issues that might appear when dealing with warning verification. It also proposes some new verification approaches that can be applied to wind warnings. These new techniques are then applied to a real-life example, the verification of wind gust warnings at the German Meteorological Centre ("Deutscher Wetterdienst"). Finally, the results obtained are discussed.

  19. Nuclear disarmament verification

    Energy Technology Data Exchange (ETDEWEB)

    DeVolpi, A.

    1993-12-31

    Arms control treaties, unilateral actions, and cooperative activities -- reflecting the defusing of East-West tensions -- are causing nuclear weapons to be disarmed and dismantled worldwide. In order to provide for future reductions and to build confidence in the permanency of this disarmament, verification procedures and technologies would play an important role. This paper outlines arms-control objectives, treaty organization, and actions that could be undertaken. For the purposes of this Workshop on Verification, nuclear disarmament has been divided into five topical subareas: Converting nuclear-weapons production complexes, Eliminating and monitoring nuclear-weapons delivery systems, Disabling and destroying nuclear warheads, Demilitarizing or non-military utilization of special nuclear materials, and Inhibiting nuclear arms in non-nuclear-weapons states. This paper concludes with an overview of potential methods for verification.

  20. Development of Work Verification System for Cooperation between MCR Operators and Field Workers in Nuclear Power Plants

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Seung Min; Seong, Poong Hyun [Korea Advanced Institute of Science and Technology, Daejeon (Korea, Republic of); Lee, Hyun Chul [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2014-05-15

    In this work, as an application of digital devices to nuclear power plants (NPPs), a cooperation support system to aid communication between main control room (MCR) operators and field workers, the NUclear COoperation Support and MObile document System (Nu-COSMOS), is suggested. It is not easy for MCR operators to confirm that field workers conduct their work correctly, because MCR operators cannot monitor field workers in real time, and the records that field workers write on paper procedures do not contain detailed information about the work process and results. Thus, to support safe operation without events induced by misunderstanding and miscommunication between MCR operators and field workers, Nu-COSMOS was developed, and it is expected to be useful from the cooperation-support point of view. To improve the usability and applicability of the suggested system, the results of using existing digital-device-based support systems were analyzed. Through this analysis, the disincentives to using digital-device-based systems and recommendations for developing a new mobile-based system were derived. Based on the derived recommendations, two subsystems were suggested: a mobile-device-based information storing system and a large-screen-based information sharing system. The usability of the suggested system will be evaluated by a survey with questionnaires. Field workers, operators, nuclear-related personnel with experience as operators, and graduate students in nuclear engineering departments will use and test the functions of the suggested system. It is expected that the mobile-based information storing system can reduce field workers' workload and enhance MCR operators' understanding of field work processes by monitoring all work results and processes stored in the devices.

  1. Development of Automated Image Analysis Tools for Verification of Radiotherapy Field Accuracy with AN Electronic Portal Imaging Device.

    Science.gov (United States)

    Dong, Lei

    1995-01-01

    The successful management of cancer with radiation relies on the accurate deposition of a prescribed dose to a prescribed anatomical volume within the patient. Treatment set-up errors are inevitable because the alignment of field shaping devices with the patient must be repeated daily up to eighty times during the course of a fractionated radiotherapy treatment. With the invention of electronic portal imaging devices (EPIDs), patient's portal images can be visualized daily in real-time after only a small fraction of the radiation dose has been delivered to each treatment field. However, the accuracy of human visual evaluation of low-contrast portal images has been found to be inadequate. The goal of this research is to develop automated image analysis tools to detect both treatment field shape errors and patient anatomy placement errors with an EPID. A moments method has been developed to align treatment field images to compensate for lack of repositioning precision of the image detector. A figure of merit has also been established to verify the shape and rotation of the treatment fields. Following proper alignment of treatment field boundaries, a cross-correlation method has been developed to detect shifts of the patient's anatomy relative to the treatment field boundary. Phantom studies showed that the moments method aligned the radiation fields to within 0.5 mm of translation and 0.5° of rotation and that the cross-correlation method aligned anatomical structures inside the radiation field to within 1 mm of translation and 1° of rotation. A new procedure of generating and using digitally reconstructed radiographs (DRRs) at megavoltage energies as reference images was also investigated. The procedure allowed a direct comparison between a designed treatment portal and the actual patient setup positions detected by an EPID. Phantom studies confirmed the feasibility of the methodology. Both the moments method and the cross-correlation technique were
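
    The cross-correlation step described above can be illustrated with a short NumPy sketch (an illustrative reconstruction, not the dissertation's implementation) that recovers a whole-pixel translation from the peak of an FFT-based cross-correlation:

        import numpy as np

        def detect_shift(reference, portal):
            """Estimate the (row, col) shift such that rolling `portal` by it
            best reproduces `reference`, via the cross-correlation peak."""
            xcorr = np.fft.ifft2(np.fft.fft2(reference) * np.conj(np.fft.fft2(portal))).real
            peak = np.unravel_index(np.argmax(xcorr), xcorr.shape)
            # Wrap peak coordinates past the midpoint around to negative shifts.
            return tuple(int(p) if p <= s // 2 else int(p) - s
                         for p, s in zip(peak, xcorr.shape))

        rng = np.random.default_rng(0)
        ref = rng.random((128, 128))
        moved = np.roll(ref, shift=(3, -5), axis=(0, 1))
        print(detect_shift(moved, ref))  # -> (3, -5): rolling ref by (3, -5) gives moved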

  2. Analog filters in nanometer CMOS

    CERN Document Server

    Uhrmann, Heimo; Zimmermann, Horst

    2014-01-01

    Starting from the basics of analog filters and the poor transistor characteristics in nanometer CMOS, 10 high-performance analog filters developed by the authors in 120 nm and 65 nm CMOS are described extensively. Among them are gm-C filters, current-mode filters, and active filters for system-on-chip realization for Bluetooth, WCDMA, UWB, DVB-H, and LTE applications. For the active filters several operational amplifier designs are described. The book, furthermore, contains a review of the newest state of research on low-voltage low-power analog filters. To cover the topic of the book comprehensively, linearization issues and measurement methods for the characterization of advanced analog filters are introduced in addition. Numerous elaborate illustrations promote an easy comprehension. This book will be of value to engineers and researchers in industry as well as scientists and Ph.D. students at universities. The book is also recommendable to graduate students specializing in nanoelectronics, microelectronics ...

  3. Requirement Assurance: A Verification Process

    Science.gov (United States)

    Alexander, Michael G.

    2011-01-01

    Requirement Assurance is an act of requirement verification which assures the stakeholder or customer that a product requirement has produced its "as realized product" and has been verified with conclusive evidence. Product requirement verification answers the question, "did the product meet the stated specification, performance, or design documentation?". In order to ensure the system was built correctly, the practicing system engineer must verify each product requirement using the verification methods of inspection, analysis, demonstration, or test. The products of these methods are the "verification artifacts" or "closure artifacts", which are the objective evidence needed to prove the product requirements meet the verification success criteria. Institutional direction is given to the System Engineer in NPR 7123.1A, NASA Systems Engineering Processes and Requirements, with regard to the requirement verification process. In response, the verification methodology offered in this report meets both the institutional process and requirement verification best practices.

  4. Development of core design/analysis technology for integral reactor; verification of SMART nuclear design by Monte Carlo method

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Chang Hyo; Hong, In Seob; Han, Beom Seok; Jeong, Jong Seong [Seoul National University, Seoul (Korea)

    2002-03-01

    The objective of this project is to verify the neutronics characteristics of the SMART core design by comparing computational results of the MCNAP code with those of the MASTER code. To achieve this goal, we analyze the neutronics characteristics of the SMART core using the MCNAP code and compare these results with results of the MASTER code. We improved the parallel computing module and developed the error analysis module of the MCNAP code. We analyzed the mechanism of error propagation through depletion computation and developed a calculation module for quantifying these errors. We performed depletion analysis for fuel pins and assemblies of the SMART core. We modeled the 3-D structure of the SMART core, considered the variation of material compositions caused by control rod operation, and performed depletion analysis for the SMART core. We computed control-rod worths of assemblies and of the reactor core for operation of individual control-rod groups. We computed core reactivity coefficients (MTC, FTC) and compared these results with computational results of the MASTER code. To verify the error analysis module of the MCNAP code, we analyzed error propagation through depletion of the SMART B-type assembly. 18 refs., 102 figs., 36 tabs. (Author)

  5. Development and Kinematic Verification of a Finite Element Model for the Lumbar Spine: Application to Disc Degeneration

    Directory of Open Access Journals (Sweden)

    Elena Ibarz

    2013-01-01

    Full Text Available Knowledge of lumbar spine biomechanics is essential for clinical applications. Due to the difficulty of experimenting on living people and the irregularity of published results, simulation based on finite elements (FE) has been developed, making it possible to adequately reproduce the biomechanics of the lumbar spine. A 3D FE model of the complete lumbar spine (vertebrae, discs, and ligaments) has been developed. To verify the model, radiological images (X-rays) were taken of a group of 25 healthy male individuals with an average age of 27.4 and an average weight of 78.6 kg, with the corresponding informed consent. A maximum angle of 34.40° is achieved in flexion and of 35.58° in extension, with a flexion-extension angle of 69.98°. The radiological measurements were 33.94 ± 4.91°, 38.73 ± 4.29°, and 72.67°, respectively. In lateral bending, the maximum angles were 19.33° and 23.40 ± 2.39°, respectively. In rotation, a maximum angle of 9.96° was obtained. The model incorporates a precise geometrical characterization of several elements (vertebrae, discs, and ligaments), respecting anatomical features and being capable of reproducing a wide range of physiological movements. Application to disc degeneration (L5-S1) allows predicting the effect on the mobility of the different lumbar segments by means of parametric studies for different degrees of degeneration.

  6. Development of a Wearable Instrumented Vest for Posture Monitoring and System Usability Verification Based on the Technology Acceptance Model.

    Science.gov (United States)

    Lin, Wen-Yen; Chou, Wen-Cheng; Tsai, Tsai-Hsuan; Lin, Chung-Chih; Lee, Ming-Yih

    2016-12-17

    Body posture and activity are important indices for assessing health and quality of life, especially for elderly people. Therefore, an easily wearable device or instrumented garment would be valuable for monitoring elderly people's postures and activities to facilitate healthy aging. In particular, such devices should be accepted by elderly people so that they are willing to wear it all the time. This paper presents the design and development of a novel, textile-based, intelligent wearable vest for real-time posture monitoring and emergency warnings. The vest provides a highly portable and low-cost solution that can be used both indoors and outdoors in order to provide long-term care at home, including health promotion, healthy aging assessments, and health abnormality alerts. The usability of the system was verified using a technology acceptance model-based study of 50 elderly people. The results indicated that although elderly people are anxious about some newly developed wearable technologies, they look forward to wearing this instrumented posture-monitoring vest in the future.

  7. Development of a multi-dimensional realistic thermal-hydraulic system analysis code, MARS 1.3 and its verification

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Won Jae; Chung, Bub Dong; Jeong, Jae Jun; Ha, Kwi Seok [Korea Atomic Energy Research Institute, Taejon (Korea)

    1998-06-01

    A multi-dimensional realistic thermal-hydraulic system analysis code, MARS version 1.3, has been developed. The main purpose of MARS 1.3 development is to provide a realistic analysis capability for the transient two-phase thermal-hydraulics of Pressurized Water Reactors (PWRs), especially during Large Break Loss of Coolant Accidents (LBLOCAs) where multi-dimensional phenomena dominate the transients. The MARS code is a unified version of COBRA-TF, the USNRC-developed three-dimensional (3D) reactor vessel analysis code, and RELAP5/MOD3.2.1.2, the one-dimensional (1D) reactor system analysis code. Developmental requirements for MARS were chosen not only to best utilize the existing capability of the codes but also to enhance code maintenance, user accessibility, user friendliness, code portability, code readability, and code flexibility. For the maintenance of the existing codes' capability and the enhancement of code maintenance capability, user accessibility and user friendliness, MARS has been unified into a single code consisting of a 1D module (RELAP5) and a 3D module (COBRA-TF). This is realized by implicitly integrating the system pressure matrix equations of the hydrodynamic models and solving them simultaneously, by modifying the 1D/3D calculation sequence to operate under a single Central Processor Unit (CPU), and by unifying the input structure and the light water property routines of both modules. In addition, the code structure of the 1D module is completely restructured using the modular data structure of standard FORTRAN 90, which greatly improves the code maintenance capability, readability and portability. For code flexibility, a dynamic memory management scheme is applied in both modules. MARS 1.3 now runs on PC/Windows and HP/UNIX platforms having a single CPU, and users have the option to select the 3D module to model the 3D thermal-hydraulics in the reactor vessel or other

  8. Impact of Chloramination on the Development of Laboratory-Grown Biofilms Fed with Filter-Pretreated Groundwater

    KAUST Repository

    Ling, Fangqiong

    2013-01-01

    This study evaluated the continuous impact of monochloramine disinfection on laboratory-grown biofilms through the characterization of biofilm architecture and microbial community structure. Biofilm development and disinfection were achieved using CDC (Centers for Disease Control and Prevention) biofilm reactor systems with polyvinyl chloride (PVC) coupons as the substratum and sand filter-pretreated groundwater as the source of microbial seeding and growth nutrient. After 2 weeks of growth, the biofilms were subjected to chloramination for 8 more weeks at concentrations of 7.5±1.4 to 9.1±0.4 mg Cl2 L-1. Control reactors received no disinfection during the development of biofilms. Confocal laser scanning microscopy and image analysis indicated that chloramination could lead to 81.4-83.5% and 86.3-95.6% reduction in biofilm biomass and thickness, respectively, but could not eliminate biofilm growth. 16S rRNA gene terminal restriction fragment length polymorphism analysis indicated that microbial community structures between chloraminated and non-chloraminated biofilms exhibited different successional trends. 16S rRNA gene pyrosequencing analysis further revealed that chloramination could select members of Actinobacteria and Acidobacteria as the dominant populations, whereas natural development leads to the selection of members of Nitrospira and Bacteroidetes as dominant biofilm populations. Overall, chloramination treatment could alter the growth of multi-species biofilms on the PVC surface, shape the biofilm architecture, and select a certain microbial community that can survive or proliferate under chloramination.

  9. Development of Web GIS-Based VFSMOD System with Three Modules for Effective Vegetative Filter Strip Design

    Directory of Open Access Journals (Sweden)

    Dong Soo Kong

    2013-08-01

    Full Text Available In recent years, non-point source pollution has emerged as a significant environmental issue. The sediment-laden water problem is causing serious impacts on river ecosystems, not only in South Korea but also in many other countries. The vegetative filter strip (VFS) is thought to be one of the most effective methods to reduce the transport of sediment to down-gradient areas. However, the effective width of the VFS first needs to be determined before VFS installation in the field. To provide an easy-to-use interface with a scientific VFS modeling engine, the Web GIS-based VFSMOD system was developed in this study. The Web GIS-based VFSMOD uses the UH and VFSM executable programs from the VFSMOD-w model as core engines to simulate rainfall-runoff and sediment trapping. To provide soil information for a point of interest, the Google Map interface to the MapServer soil database system was developed using the Google Map API, Javascript, Perl/CGI, and Oracle DB programming. Three modules of the Web GIS-based VFSMOD system were developed for various VFS designs under single storm, multiple storm, and long-term period scenarios. These modules were applied to the study watershed in South Korea and proved to be efficient tools for VFS design for various purposes.

  10. SU-E-T-641: Development and Verification of Automatic Reading Dose of Interest From Eclipse's DVH

    Energy Technology Data Exchange (ETDEWEB)

    Wu, Q [Department of Radiation Oncology, Beijing Hospital, Ministry of Health, Beijing (China)

    2014-06-15

    Purpose: To meet clinical and research requirements, we developed a function for automatically reading doses of interest from a dose volume histogram (DVH), replacing the traditional method of picking points one by one with a mouse, and we verified it. Methods: The DVH automatic reading function was developed within an in-house developed radiotherapy information management system (RTIMS), which is based on Apache+PHP+MySQL. A DVH ASCII file is exported from Varian Eclipse V8.6, which includes the following contents: 1. basic information of the patient; 2. dose information of the plan; 3. dose information of structures, including basic information and dose volume data of target volumes and organs at risk. The default exported dose volume data include relative doses in 1% steps with the corresponding absolute doses and cumulative relative volumes, the volumes being given to four decimal places. Clinically, we often need to read the doses at certain integer percent volumes, such as D50 and D30. These cannot be obtained directly from the above data, but we can use linear interpolation between the neighboring volumes and doses: Dx = D2 − (V2 − Vx)*(D2 − D1)/(V2 − V1), and program a function to search, read, and calculate the corresponding data. The doses of all preset volumes of interest of all structures can then be automatically read patient by patient and saved as a CSV file. To verify it, we selected 24 IMRT plans for prostate cancer; the doses of interest were PTV D98/D95/D5/D2, bladder D30/D50, and rectum D25/D50. Two groups of data, obtained using the automatic reading method (ARM) and the pointed dose method (PDM), were analyzed with SPSS 16. The absolute difference = D_ARM − D_PDM; the relative difference = absolute difference*100%/prescription dose (7600 cGy). Results: The differences were as follows: PTV D98/D95/D5/D2: −0.04%/−0.04%/0.13%/0.19%; bladder D30/D50: −0.02%/0.01%; and rectum D25/D50: 0.03%/0.01%. Conclusion: Using this function, the error is very small and can be neglected. It could greatly improve the
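
    The interpolation at the heart of the function can be sketched in a few lines of Python; the DVH sample points below are hypothetical, and only the Dx formula is taken from the abstract:

        def dose_at_volume(dvh, v_target):
            """Linearly interpolate the dose at cumulative volume `v_target` (%)
            from (dose, volume) points sorted by increasing dose, using
            Dx = D2 - (V2 - Vx) * (D2 - D1) / (V2 - V1)."""
            for (d1, v1), (d2, v2) in zip(dvh, dvh[1:]):
                if v2 <= v_target <= v1:
                    if v1 == v2:
                        return d2
                    return d2 - (v2 - v_target) * (d2 - d1) / (v2 - v1)
            raise ValueError("target volume outside DVH range")

        # Hypothetical cumulative DVH samples (dose in cGy, volume in %):
        dvh = [(7200.0, 99.2), (7276.0, 97.1), (7352.0, 88.4), (7428.0, 41.0)]
        print(dose_at_volume(dvh, 95.0))  # D95, interpolated between the 2nd and 3rd points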

  11. Development of polarized {sup 3}He filter for polarized neutron experiment

    Energy Technology Data Exchange (ETDEWEB)

    Sakai, K.; Sato, H.; Yoshimi, A.; Asahi, K. [Tokyo Inst. of Tech. (Japan). Faculty of Science; Masuda, Y.; Muto, S.; Ishimoto, S.; Morimoto, K.

    1996-08-01

    A high-pressure polarized {sup 3}He gas cell, pumped with two diode lasers, has been developed at KEK for use as a polarizer and a spin analyzer for low energy neutrons. The attained {sup 3}He polarization was determined through measurement of the transmission of unpolarized neutrons through the {sup 3}He cell. So far we have obtained P{sub He}=18% at 10 atm and P{sub He}=12% at 20 atm. (author)

  12. Development and verification of NRC's single-rod fuel performance codes FRAPCON-3 and FRAPTRAN

    Energy Technology Data Exchange (ETDEWEB)

    Beyer, C.E.; Cunningham, M.E.; Lanning, D.D. [Pacific Northwest National Lab., Richland, WA (United States)

    1998-03-01

    The FRAPCON and FRAP-T code series, developed in the 1970s and early 1980s, are used by the US Nuclear Regulatory Commission (NRC) to predict fuel performance during steady-state and transient power conditions, respectively. Both code series are now being updated by Pacific Northwest National Laboratory to improve their predictive capabilities at high burnup levels. The newest versions of the codes are called FRAPCON-3 and FRAPTRAN. The updates to fuel property and behavior models are focusing on providing best estimate predictions under steady-state and fast transient power conditions up to extended fuel burnups (> 55 GWd/MTU). Both codes will be assessed against a data base independent of the data base used for code benchmarking and an estimate of code predictive uncertainties will be made based on comparisons to the benchmark and independent data bases.

  13. The development and verification of thermal-hydraulic code on passive residual heat removal system of Chinese advanced PWR

    Institute of Scientific and Technical Information of China (English)

    2006-01-01

    The technology of passive safety is the current trend among safety systems in nuclear power plants. The passive residual heat removal system (PRHRS), a major part of the passive safety systems of the Chinese advanced PWR, is a novel design with three-fold natural circulation. On the basis of reasonable physical and mathematical models, the MITAP-PRHRS code was developed to analyze the steady-state and transient characteristics of the PRHRS. The calculation and analysis show that the code simulates the steady-state characteristics of the PRHRS very well, and it is able to simulate the transient characteristics of all startup modes of the PRHRS. However, the quantitative description is poor during the initial stages of the transition process, when water hammer occurs.

  14. Neutron spectrometric methods for core inventory verification in research reactors

    CERN Document Server

    Ellinger, A; Hansen, W; Knorr, J; Schneider, R

    2002-01-01

    In consequence of the Non-Proliferation Treaty safeguards, inspections are periodically made in nuclear facilities by the IAEA and the EURATOM Safeguards Directorate. The inspection methods are permanently improved. Therefore, the Core Inventory Verification method is being developed as an indirect method for the verification of the core inventory and to check the declared operation of research reactors.

  15. Verification of a CT scanner using a miniature step gauge

    DEFF Research Database (Denmark)

    Cantatore, Angela; Andreasen, J.L.; Carmignato, S.;

    2011-01-01

    The work deals with performance verification of a CT scanner using a 42 mm miniature replica step gauge developed for optical scanner verification. Error quantification and optimization of CT system set-up in terms of resolution and measurement accuracy are fundamental for use of CT scanning...

  16. CPAchecker: A Tool for Configurable Software Verification

    CERN Document Server

    Beyer, Dirk

    2009-01-01

    Configurable software verification is a recent concept for expressing different program analysis and model checking approaches in one single formalism. This paper presents CPAchecker, a tool and framework that aims at easy integration of new verification components. Every abstract domain, together with the corresponding operations, is required to implement the interface of configurable program analysis (CPA). The main algorithm is configurable to perform a reachability analysis on arbitrary combinations of existing CPAs. The major design goal during the development was to provide a framework for developers that is flexible and easy to extend. We hope that researchers find it convenient and productive to implement new verification ideas and algorithms using this platform and that it advances the field by making it easier to perform practical experiments. The tool is implemented in Java and runs as command-line tool or as Eclipse plug-in. We evaluate the efficiency of our tool on benchmarks from the software mo...

  17. Simulation Environment Based on the Universal Verification Methodology

    CERN Document Server

    Fiergolski, Adrian

    2016-01-01

    Universal Verification Methodology (UVM) is a standardized approach to verifying integrated circuit designs, targeting Coverage-Driven Verification (CDV). It combines automatic test generation, self-checking testbenches, and coverage metrics to indicate progress in the design verification. The flow of the CDV differs from the traditional directed-testing approach. With the CDV, a testbench developer, by setting the verification goals, starts with a structured plan. Those goals are targeted further by a developed testbench, which generates legal stimuli and sends them to a device under test (DUT). The progress is measured by coverage monitors added to the simulation environment. In this way, the non-exercised functionality can be identified. Moreover, the additional scoreboards indicate undesired DUT behaviour. Such verification environments were developed for three recent ASIC and FPGA projects which have successfully implemented the new work-flow: (1) the CLICpix2 65 nm CMOS hybrid pixel readout ASIC desi...

  19. Food Filter

    Institute of Scientific and Technical Information of China (English)

    履之

    1995-01-01

    A typical food-processing plant produces about 500,000 gallons of waste water daily. Laden with organic compounds, this water usually is evaporated or discharged into sewers.A better solution is to filter the water through

  20. Formal Verification of UML Profil

    DEFF Research Database (Denmark)

    Bhutto, Arifa; Hussain, Dil Muhammad Akbar

    2011-01-01

    The Unified Modeling Language (UML) is based on the Model Driven Development (MDD) approach, which captures system functionality using the platform-independent model (PIM) and appropriate domain-specific languages. In UML-based system notation, the structural view is modelled by the class, component and object diagrams, and the behavioural view is modelled by the activity, use case, state, and sequence diagrams. However, UML does not provide formal syntax, therefore its semantics is not formally definable, so to assure correctness we need to incorporate semantic reasoning through verification, specification...

  1. Development of a Model for the Simulation of ROPS Tests on Agricultural Tractors Cabin: Numerical Models and Experimental Verification

    Directory of Open Access Journals (Sweden)

    Sergio Baragetti

    2015-09-01

    Full Text Available A methodology is proposed here for the simulation of ROPS tests (ROPS = Roll Over Protective Structure) of agricultural tractor cabins. The work is based on solving this problem with the finite element method. In order to limit the number of nodes of the model and thus speed up the solution, a two-dimensional finite element model has been chosen. The method presented here solves even very complex structures with relative ease. There are also simpler methods in the literature, where specially made software based on the finite element method simulates approval tests on ROPS structures. In that case, codes developed just for this purpose are available, and are therefore very simple to use and characterized by a high speed of model preparation following the definition of a small number of parameters. On the other hand, these are codes designed for structures having a specific geometric shape, in which the user is not free to set all the parameters available in commercial structural analysis software, and they are not very suitable for complex or unconventional structures. The methodology proposed by the authors instead, although not automated, allows simulating any type of structure in acceptable time. The results were validated by full-scale experimental tests. Through the interpretation of the results it is possible to identify which area is the most critical for the structure and to evaluate any design change, something which is not easy to do through expensive tests.

  2. Development, Verification and Use of Gust Modeling in the NASA Computational Fluid Dynamics Code FUN3D

    Science.gov (United States)

    Bartels, Robert E.

    2012-01-01

    This paper presents the implementation of a gust modeling capability in the CFD code FUN3D. The gust capability is verified by computing the response of an airfoil to a sharp-edged gust and comparing this result with the theoretical result, as well as with other CFD gust simulations. This paper also serves as a user's manual for FUN3D gust analyses using a variety of gust profiles. Finally, the development of an Auto-Regressive Moving-Average (ARMA) reduced-order gust model using a gust with a Gaussian profile in the FUN3D code is presented. ARMA-simulated results for a sequence of one-minus-cosine gusts are shown to compare well with the same gust profile computed with FUN3D. Proper Orthogonal Decomposition (POD) is combined with the ARMA modeling technique to predict the time-varying pressure coefficient increment distribution due to a novel gust profile. The aeroelastic response of a pitch/plunge airfoil to a gust environment is computed with a reduced-order model and compared with a direct simulation of the system in the FUN3D code. The two results are found to agree very well.
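
    The one-minus-cosine gust referred to above has a standard closed form; here is a minimal sketch of the profile (parameter names and values are illustrative, and this is not FUN3D code):

        import numpy as np

        def one_minus_cosine_gust(t, amplitude, gust_length, speed):
            """Gust velocity for the classic one-minus-cosine profile:
            w(t) = (A/2) * (1 - cos(2*pi*t/T)) for 0 <= t <= T = L/V, else 0."""
            duration = gust_length / speed
            w = 0.5 * amplitude * (1.0 - np.cos(2.0 * np.pi * t / duration))
            return np.where((t >= 0.0) & (t <= duration), w, 0.0)

        t = np.linspace(-0.05, 0.4, 200)                 # time, s
        gust = one_minus_cosine_gust(t, amplitude=10.0,  # m/s
                                     gust_length=30.0,   # m
                                     speed=100.0)        # m/s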

  3. Improvement of data transfer speed and development of an EB data verification system in a VSB mask writer

    Science.gov (United States)

    Wakimoto, Osamu; Manabe, Hironobu; Hoshi, Hiromichi; Samoto, Norihiko; Komagata, Tadashi; Nakagawa, Yasutoshi; Yamabe, Masaki

    2009-04-01

    To extend the effectiveness of photolithography, Optical Proximity Effect Correction (OPC) and Resolution Enhancement Techniques (RET) incorporate increasingly complicated process steps that handle large volumes of data. This poses a challenge for mask making with EB lithography in two areas: data transfer speed and the reliability of pattern data processed by hardware. Traditionally, JEOL's variable shaped beam mask writers used a single-board CPU to load per-field pattern data from a magnetic disk into buffer memory. We developed a new parallel transfer technique using a dual-board CPU to enhance the data transfer speed to buffer memory. This technique improved the data transfer speed from 40 MB/sec to 80 MB/sec or higher. To ensure the reliability of pattern data processed by hardware, we also devised a way to save to hard disk the shot position, size, and dose of patterns processed in the data transfer system. We verified that the system was able to record 250G shots of pattern data (size and positional data of figures to be exposed) in real time.

  4. Driving Force Filtering and Driving Mechanism Analysis of Urban Agricultural Development in Weifang County, China

    Directory of Open Access Journals (Sweden)

    SUI Fei-fei

    2016-03-01

    Full Text Available As an agricultural nation, China's agricultural landscape is its basic appearance and mode of existence, yet this common landscape is often neglected and held in contempt. As a new type of design and ideology, the development of the urban agricultural landscape will greatly affect the texture and structure of urban space. Based on the urban agricultural production data and socio-economic data of Weifang County, an evaluation index system was built that can quantitatively analyze the driving forces behind changes in urban agricultural production and the internal driving mechanism. Original driving-force indicators of economy, society, resources and environment were chosen from the time series, and 15 driving forces were then selected from the original set by correlation analysis and principal component analysis. The degree of influence was analyzed and a driving-force model was built by means of partial least squares (PLS). The results demonstrated that the factors that most influenced the increase of urban agricultural output value in Weifang County were per capita net income of rural residents, total agricultural machinery power, effective irrigation area, and centralized treatment rate of urban sewage, with driving exponents of 0.2509, 0.1019, 0.1655, and 0.1332, respectively. The negative influence factor was the amount of agricultural plastic film used, with a driving exponent of -0.2146. The research provides a reference for the development of urban agriculture, as well as for related studies.

  5. Development and verification of a software system for the probabilistic safety analysis of nuclear plants as part of the proryv project

    Directory of Open Access Journals (Sweden)

    L.V. Abramov

    2016-06-01

    The paper presents the results of verification of the CRISS 5.3 code, obtained by comparing the analysis results of the CRISS 5.3 system against analytical formulas and against the results of qualitative and quantitative analyses based on certified nuclear plant PSA software tools.

  6. Verification in referral-based crowdsourcing.

    Directory of Open Access Journals (Sweden)

    Victor Naroditskiy

    Full Text Available Online social networks offer unprecedented potential for rallying a large number of people to accomplish a given task. Here we focus on information gathering tasks where rare information is sought through "referral-based crowdsourcing": the information request is propagated recursively through invitations among members of a social network. Whereas previous work analyzed incentives for the referral process in a setting with only correct reports, misreporting is known to be both pervasive in crowdsourcing applications, and difficult/costly to filter out. A motivating example for our work is the DARPA Red Balloon Challenge where the level of misreporting was very high. In order to undertake a formal study of verification, we introduce a model where agents can exert costly effort to perform verification and false reports can be penalized. This is the first model of verification and it provides many directions for future research, which we point out. Our main theoretical result is the compensation scheme that minimizes the cost of retrieving the correct answer. Notably, this optimal compensation scheme coincides with the winning strategy of the Red Balloon Challenge.

  7. Holographic interference filters

    Science.gov (United States)

    Diehl, Damon W.

    Holographic mirrors have wavelength-selection properties and thus qualify as a class of interference filters. Two theoretical methods for analyzing such structures are developed. The first method uses Hill's matrix method to yield closed-form solutions in terms of the Floquet-Bloch waves within a periodic structure. A process is developed for implementing this solution method on a computer, using sparse-matrix memory allocation, numerical root-finding algorithms, and inverse-iteration techniques. It is demonstrated that Hill's matrix method is valid for the analysis of finite and multi-periodic problems. The second method of theoretical analysis is a transfer-matrix technique, which is herein termed thin-film decomposition. It is shown that the two methods of solution yield results that differ by, at worst, a fraction of a percent. Using both calculation techniques, a number of example problems are explored. Of key importance is the construction of a set of curves that are useful for the design and characterization of holographic interference filters. In addition to the theoretical development, methods are presented for the fabrication of holographic interference filters using DuPont HRF-800X001 photopolymer. Central to the exposure system is a frequency-stabilized, tunable dye laser. The types of filters fabricated include single-tone reflection filters, two types of multitone reflection filters, and reflection filters for infrared wavelengths. These filters feature index profiles that are not easily attainable through other fabrication methods. As a supplement to the body of the dissertation, the computer algorithms developed to implement Hill's matrix method and thin-film decomposition are also included as an appendix. Further appendices provide more information on Floquet's theorem and Hill's matrix method. A final appendix presents a design for an infrared laser spectrophotometer.
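
    The thin-film decomposition idea lends itself to a compact sketch: slice the mirror's sinusoidal index profile into thin homogeneous layers and multiply their characteristic matrices. The grating parameters below are illustrative, not taken from the dissertation:

        import numpy as np

        def multilayer_reflectance(n_layers, d_layers, n_inc, n_sub, wavelength):
            """Normal-incidence reflectance of a layer stack via the
            characteristic (transfer) matrix method."""
            m = np.eye(2, dtype=complex)
            for n, d in zip(n_layers, d_layers):
                delta = 2.0 * np.pi * n * d / wavelength
                m = m @ np.array([[np.cos(delta), 1j * np.sin(delta) / n],
                                  [1j * n * np.sin(delta), np.cos(delta)]])
            b, c = m @ np.array([1.0, n_sub])
            r = (n_inc * b - c) / (n_inc * b + c)
            return abs(r) ** 2

        # Approximate a holographic mirror: n(z) = 1.50 + 0.02*cos(2*pi*z/period),
        # sliced into 40 layers per period, 200 periods, Bragg-matched to 633 nm.
        lam0 = 633e-9
        period = lam0 / (2.0 * 1.50)
        z = (np.arange(40) + 0.5) / 40.0
        n_list = np.tile(1.50 + 0.02 * np.cos(2.0 * np.pi * z), 200)
        d_list = np.full(n_list.size, period / 40.0)
        print(multilayer_reflectance(n_list, d_list, 1.0, 1.50, lam0))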

  8. Quantitative Verification in Practice

    NARCIS (Netherlands)

    Haverkort, Boudewijn R.; Katoen, Joost-Pieter; Larsen, Kim G.

    2010-01-01

    Soon after the birth of model checking, the first theoretical achievements have been reported on the automated verification of quantitative system aspects such as discrete probabilities and continuous time. These theories have been extended in various dimensions, such as continuous probabilities

  9. Comparing formal verification approaches of interlocking systems

    DEFF Research Database (Denmark)

    Haxthausen, Anne Elisabeth; Nguyen, Hoang Nga; Roggenbach, Markus

    2016-01-01

    The verification of railway interlocking systems is a challenging task, and therefore several research groups have suggested improving this task by using formal methods, but they use different modelling and verification approaches. To advance this research, there is a need to compare these approaches. As a first step towards this, in this paper we suggest a way to compare different formal approaches for verifying designs of route-based interlocking systems, and we demonstrate it on modelling and verification approaches developed within the research groups at DTU/Bremen and at Surrey/Swansea. The focus is on designs that are specified by so-called control tables. The paper can serve as a starting point for further comparative studies. The DTU/Bremen research has been funded by the RobustRailS project granted by Innovation Fund Denmark. The Surrey/Swansea research has been funded by the Safe...

  10. VERIFICATION OF PARALLEL AUTOMATA-BASED PROGRAMS

    Directory of Open Access Journals (Sweden)

    M. A. Lukin

    2014-01-01

    Full Text Available The paper deals with an interactive method of automatic verification for parallel automata-based programs. The hierarchical state machines can be implemented in different threads and can interact with each other. Verification is done by means of the Spin tool and includes automatic Promela model construction, conversion of LTL formulas to Spin format, and counterexamples in terms of automata. Interactive verification makes it possible to decrease verification time and increase the maximum size of verifiable programs. The considered method supports verification of parallel systems of hierarchical automata that interact with each other through messages and shared variables. The feature of the automaton model is that each state machine is considered as a new data type and can have an arbitrary bounded number of instances. Each state machine in the system can run a different state machine in a new thread or have a nested state machine. This method was implemented in the developed Stater tool. Stater shows correct operation for all test cases.

  11. Development and Application of Ultrasonic Composite Fiber Fuel Filter Materials

    Institute of Scientific and Technical Information of China (English)

    胥绍华

    2012-01-01

    This paper introduces the development of ultrasonic composite fiber filter materials and their application in fuel oil filter paper. Practice has shown that using ultrasonic composite fiber as a fuel filtration material can substantially improve the filtration efficiency and dust-holding capacity of fuel filter media.

  12. Verification test report on a solar heating and hot water system

    Science.gov (United States)

    1978-01-01

    Information is provided on the development, qualification and acceptance verification of commercial solar heating and hot water systems and components. The verification covers the performance, the efficiencies and the various methods used, such as similarity, analysis, inspection, test, etc., that are applicable to satisfying the verification requirements.

  13. Graft copolymerization onto cellulose-based filter paper and its further development as silver nanoparticles loaded antibacterial food-packaging material.

    Science.gov (United States)

    Tankhiwale, Rasika; Bajpai, S K

    2009-03-01

    The present work describes ceric ammonium nitrate (CAN) initiated graft copolymerization of acrylamide onto cellulose-based filter paper followed by entrapment of silver nanoparticles. The copolymerization was carried out in aqueous solution, containing 2 M acrylamide monomer and 16 mM N,N'-methylene bisacrylamide (MB) crosslinker. The optimum initiation time and grafting reaction temperature were found to be 15 min and 30 degrees C, respectively. The silver nanoparticles were loaded into grafted filter paper by equilibration in silver nitrate solution followed by citrate reduction. The formation of silver nanoparticles has been confirmed by TEM and SAED analysis. The novel nano-silver loaded filter paper has been investigated for its antimicrobial properties against E. coli. This newly developed material shows strong antibacterial properties and thus offers its candidature for possible use as an antibacterial food-packaging material.

  14. The MODUS approach to formal verification

    DEFF Research Database (Denmark)

    Brewka, Lukasz Jerzy; Soler, José; Berger, Michael Stübert

    2014-01-01

    ...in the process of providing competitive products. Objectives: In relation to this, the MODUS (Method and supporting toolset advancing embedded systems quality) project aims to provide small and medium-sized businesses ways to improve their position in the embedded market through a pragmatic and viable solution. Methods/Approach: This paper describes the MODUS project with focus on the technical methodologies that can assist formal verification and formal model checking. Results: Based on automated analysis of the characteristics of the system and by controlling the choice of the existing open-source model verification engines, MODUS performs model verification, producing inputs to be fed into these engines. Conclusions: The MODUS approach is aligned with present market needs; the familiarity with tools, the ease of use and compatibility/interoperability remain among the most important criteria when selecting the development...

  15. Development and characterization of Textron continuous fiber ceramic composite hot gas filter materials. Final report, September 30, 1994--October 31, 1997

    Energy Technology Data Exchange (ETDEWEB)

    DiPietro, S.G.; Alvin, M.A.

    1997-12-31

    Uncertainties about the long-term ability of monolithic ceramics to survive in the IGCC or PFBC hot gas filter environment led DOE/METC to consider the merits of using continuous fiber reinforced ceramic composites (CFCCs) as potential next-generation high temperature filter elements. This seems to be a logical strategy to pursue in light of the fact that properly-engineered CFCC materials have shown much-improved damage tolerance and thermal shock behavior as compared to existing monolithic ceramic materials. Textron's Advanced Hot Gas Filter Development Program was intended to be a two year, two phase program which transitioned developmental materials R and D into prototype filter element fabrication. The first phase was to demonstrate the technical feasibility of fabricating CFCC hot gas filter elements which could meet the pressure drop specifications of less than ten inches of water (iwg) at a face velocity of ten feet per minute (fpm), while showing sufficient integrity to survive normal mechanical loads and adequate environmental resistance to steam/alkali corrosion conditions at a temperature of approximately 870 C (1600 F). The primary objective of the second phase of the program was to scale up fabrication methods developed in Phase 1 to produce full-scale CFCC candle filters for validation testing. Textron encountered significant process-related and technical difficulties in merely meeting the program permeability specifications, and much effort was expended in showing that this could indeed be achieved. Thus, by the time the Phase 1 program was completed, expenditure of program funds precluded continuing on with Phase 2, and Textron elected to terminate their program after Phase 1. This allowed Textron to be able to focus technical and commercialization efforts on their largely successful DOE CFCC Program.

  17. Pattern Programmable Kernel Filter for Bot Detection

    Directory of Open Access Journals (Sweden)

    Kritika Govind

    2012-05-01

    Full Text Available Bots earn their unique name as they perform a wide variety of automated tasks. These tasks include stealing sensitive user information. Detection of bots using solutions such as behavioral correlation of flow records, group activity in DNS traffic, observing the periodic repeatability in communication, etc., leads to monitoring the network traffic and then classifying it as bot or normal traffic. Other solutions for bot detection include kernel-level keystroke verification, system call initialization, IP blacklisting, etc. In the first two solutions there is no assurance that the packet carrying user information is prevented from being sent to the attacker, and the latter suffers from the problem of IP spoofing. This motivated us to think of a solution that would filter out the malicious packets before they are put onto the network. To come up with such a solution, a real-time bot attack was generated with the SpyEye exploit kit and the traffic characteristics were analyzed. The analysis revealed the existence of a unique repeated communication between the zombie machine and the botmaster. This motivated us to propose a Pattern Programmable Kernel Filter (PPKF) for filtering out the malicious packets generated by bots. PPKF was developed using the Windows Filtering Platform (WFP) filter engine. PPKF was programmed to filter out the packets with the unique patterns observed in the bot attack experiments. Further, PPKF was found to completely suppress the flow of packets having the programmed uniqueness in them, thus preventing the functioning of bots in terms of user information being sent to the botmaster.
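
    A toy sketch of the pattern-programmable idea, in Python rather than a Windows Filtering Platform kernel driver, with a made-up byte pattern standing in for the signature observed in the bot experiments:

        def make_packet_filter(patterns):
            """Return a predicate that rejects packets whose payload contains
            any of the programmed byte patterns."""
            def allow(payload: bytes) -> bool:
                return not any(p in payload for p in patterns)
            return allow

        bot_beacon = bytes.fromhex("deadbeef")        # hypothetical C&C marker
        allow = make_packet_filter([bot_beacon])
        print(allow(b"GET / HTTP/1.1"))               # True  -> forwarded
        print(allow(b"\xde\xad\xbe\xefstolen-data"))  # False -> suppressed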

  18. Updating the OMERACT filter

    DEFF Research Database (Denmark)

    Tugwell, Peter; Boers, Maarten; D'Agostino, Maria-Antonietta

    2014-01-01

    OBJECTIVE: The Outcome Measures in Rheumatology (OMERACT) Filter provides guidelines for the development and validation of outcome measures for use in clinical research. The "Truth" section of the OMERACT Filter requires that criteria be met to demonstrate that the outcome instrument meets the criteria for content, face, and construct validity. METHODS: Discussion groups critically reviewed a variety of ways in which case studies of current OMERACT Working Groups complied with the Truth component of the Filter and what issues remained to be resolved. RESULTS: The case studies showed that there is broad agreement on criteria for meeting the Truth criteria through demonstration of content, face, and construct validity; however, several issues were identified that the Filter Working Group will need to address. CONCLUSION: These issues will require resolution to reach consensus on how Truth...

  19. Inorganic UV filters

    Directory of Open Access Journals (Sweden)

    Eloísa Berbel Manaia

    2013-06-01

    Full Text Available Nowadays, concern over skin cancer has been growing more and more, especially in tropical countries where the incidence of UVA/B radiation is higher. The correct use of sunscreen is the most efficient way to prevent the development of this disease. The ingredients of sunscreen can be organic and/or inorganic sun filters. Inorganic filters present some advantages over organic filters, such as photostability, non-irritability and broad-spectrum protection. Nevertheless, inorganic filters have a whitening effect in sunscreen formulations owing to their high refractive index, decreasing their esthetic appeal. Many techniques have been developed to overcome this problem, and among them the use of nanotechnology stands out. The amount of nanomaterial in use is estimated to increase from 2000 tons in 2004 to a projected 58000 tons in 2020. In this context, this article aims to critically analyze both the different features of the production of inorganic filters (synthesis routes proposed in recent years) and the permeability, the safety and other characteristics of the new generation of inorganic filters.

  20. The COST IC0701 verification competition 2011

    NARCIS (Netherlands)

    Bormer, T.; Brockschmidt, M.; Distefano, D.; Ernst, G.; Filliatre, J.-C.; Grigore, R.; Huisman, M.; Klebanov, V.; Marche, C.; Monahan, R.; Mostowski, W.I.; Polikarpova, N.; Scheben, C.; Schellhorn, G.; Tofan, B.; Tschannen, J.; Ulbrich, M.; Beckert, B.; Damiani, F.; Gurov, D.

    2012-01-01

    This paper reports on the experiences with the program verification competition held during the FoVeOOS conference in October 2011. There were 6 teams participating in this competition. We discuss the three different challenges that were posed and the solutions developed by the teams. We conclude wi

  1. From prompt gamma distribution to dose: a novel approach combining an evolutionary algorithm and filtering based on Gaussian-powerlaw convolutions

    Science.gov (United States)

    Schumann, A.; Priegnitz, M.; Schoene, S.; Enghardt, W.; Rohling, H.; Fiedler, F.

    2016-10-01

    Range verification and dose monitoring in proton therapy are considered highly desirable. Different methods have been developed worldwide, like particle therapy positron emission tomography (PT-PET) and prompt gamma imaging (PGI). In general, these methods allow for a verification of the proton range. However, quantification of the dose from these measurements remains challenging. For the first time, we present an approach for estimating the dose from prompt γ-ray emission profiles. It combines a filtering procedure based on Gaussian-powerlaw convolution with an evolutionary algorithm. By means of convolving depth dose profiles with an appropriate filter kernel, prompt γ-ray depth profiles are obtained. In order to reverse this step, the evolutionary algorithm is applied. The feasibility of this approach is demonstrated for a spread-out Bragg peak in a water target.
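
    The forward (filtering) step can be sketched as a discrete convolution; here a plain Gaussian kernel stands in for the paper's Gaussian-powerlaw kernel, and the misfit function is the kind of objective an evolutionary algorithm could minimize when reversing the step:

        import numpy as np

        def dose_to_prompt_gamma(depth_dose, kernel):
            """Forward step: convolve a depth-dose profile with a filter
            kernel to obtain a prompt gamma depth profile."""
            return np.convolve(depth_dose, kernel, mode="same")

        def misfit(candidate_dose, measured_pg, kernel):
            """Squared-error objective (an assumption here) that an
            evolutionary algorithm could minimize to recover the dose."""
            return np.sum((dose_to_prompt_gamma(candidate_dose, kernel) - measured_pg) ** 2)

        x = np.linspace(-10.0, 10.0, 41)        # depth offset, mm (illustrative)
        kernel = np.exp(-0.5 * (x / 3.0) ** 2)  # illustrative Gaussian kernel
        kernel /= kernel.sum()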

  2. Development of a method for bacteria and virus recovery from heating, ventilation, and air conditioning (HVAC) filters.

    Science.gov (United States)

    Farnsworth, James E; Goyal, Sagar M; Kim, Seung Won; Kuehn, Thomas H; Raynor, Peter C; Ramakrishnan, M A; Anantharaman, Senthilvelan; Tang, Weihua

    2006-10-01

    The aim of the work presented here is to study the effectiveness of building air handling units (AHUs) in serving as high-volume sampling devices for airborne bacteria and viruses. An HVAC test facility constructed according to ASHRAE Standard 52.2-1999 was used for the controlled loading of HVAC filter media with aerosolized bacteria and virus. Nonpathogenic Bacillus subtilis var. niger was chosen as a surrogate for Bacillus anthracis. Three animal viruses (transmissible gastroenteritis virus (TGEV), avian pneumovirus (APV), and fowlpox virus) were chosen as surrogates for three human viruses (SARS coronavirus, respiratory syncytial virus, and smallpox virus), respectively. These bacteria and viruses were nebulized in separate tests and injected into the test duct of the test facility upstream of a MERV 14 filter. SKC Biosamplers upstream and downstream of the test filter served as reference samplers. The collection efficiency of the filter media was calculated to be 96.5 +/- 1.5% for B. subtilis; however, no collection efficiency could be measured for the viruses, as no live virus was ever recovered from the downstream samplers. Filter samples were cut from the test filter and eluted by hand-shaking. An extraction efficiency of 105 +/- 19% was calculated for B. subtilis. The viruses were extracted at much lower efficiencies (0.7-20%). Our results indicate that the airborne concentration of spore-forming bacteria in building AHUs may be determined by analyzing the material collected on HVAC filter media; however, culture-based analytical techniques are impractical for virus recovery. Molecular-based identification techniques such as PCR could be used instead.

  3. Comparison between In-house developed and Diamond commercial software for patient specific independent monitor unit calculation and verification with heterogeneity corrections.

    Science.gov (United States)

    Kuppusamy, Vijayalakshmi; Nagarajan, Vivekanandan; Jeevanandam, Prakash; Murugan, Lavanya

    2016-02-01

    The study aimed to compare two different monitor unit (MU) or dose verification software packages in volumetric modulated arc therapy (VMAT), both using a modified Clarkson's integration technique for 6 MV photon beams. An in-house Excel spreadsheet-based monitor unit verification calculation (MUVC) program and PTW's DIAMOND secondary check software (SCS), version 6, were used as a secondary check to verify the monitor units (MU) or dose calculated by the treatment planning system (TPS). In this study 180 patients were grouped into 61 head and neck, 39 thorax and 80 pelvic sites. Verification plans were created using the PTW OCTAVIUS-4D phantom and also measured using the 729 ion chamber array, with the isocentre as the point of measurement for each field. In the analysis of 154 clinically approved VMAT plans with isocentre at a region above -350 HU, using heterogeneity corrections, the in-house spreadsheet-based MUVC program and DIAMOND SCS showed good agreement with the TPS. The overall percentage average deviations for all sites were (-0.93% ± 1.59%) and (1.37% ± 2.72%) for the in-house spreadsheet-based MUVC program and DIAMOND SCS, respectively. For 26 clinically approved VMAT plans with isocentre at a region below -350 HU, both the in-house spreadsheet-based MUVC program and DIAMOND SCS showed higher variations. It can be concluded that for patient-specific quality assurance (QA), the in-house spreadsheet-based MUVC program and DIAMOND SCS can be used as a simple and fast accompaniment to measurement-based verification for plans with isocentre at a region above -350 HU.
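
    For reference, the core of a (modified) Clarkson calculation is a sector integration of tabulated scatter data over the irregular field; the sketch below assumes a machine-specific scatter-air ratio table is available, and the function and variable names are hypothetical.

```python
import numpy as np

def clarkson_sar(sector_radii, sar_radius, sar_value):
    """Sector-integrate the scatter-air ratio (SAR) over an irregular field.

    sector_radii: field radius seen along each equi-angular sector from the
    calculation point; (sar_radius, sar_value): tabulated SAR versus circular
    field radius (machine-specific, assumed given). Returns the effective SAR
    of the irregular field as the average over sectors.
    """
    sar = np.interp(sector_radii, sar_radius, sar_value)
    return float(sar.mean())
```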

  4. A Formal Verification Methodology for Checking Data Integrity

    CERN Document Server

    Umezawa, Yasushi

    2011-01-01

    Formal verification techniques have been playing an important role in pre-silicon validation processes. One of the most important points considered in performing formal verification is to define good verification scopes; we should define clearly what to be verified formally upon designs under tests. We considered the following three practical requirements when we defined the scope of formal verification. They are (a) hard to verify (b) small to handle, and (c) easy to understand. Our novel approach is to break down generic properties for system into stereotype properties in block level and to define requirements for Verifiable RTL. Consequently, each designer instead of verification experts can describe properties of the design easily, and formal model checking can be applied systematically and thoroughly to all the leaf modules. During the development of a component chip for server platforms, we focused on RAS (Reliability, Availability, and Serviceability) features and described more than 2000 properties in...

  5. Formal Verification of UML Profil

    DEFF Research Database (Denmark)

    Bhutto, Arifa; Hussain, Dil Muhammad Akbar

    2011-01-01

    The Unified Modeling Language (UML) is based on the Model Driven Development (MDD) approach, which captures system functionality using a platform-independent model (PIM) and appropriate domain-specific languages. In UML-based system notation, the structural view is modelled by the class, component and object diagrams, and the behavioral view is modelled by the activity, use case, state, and sequence diagrams. However, UML does not provide formal syntax, so its semantics is not formally definable; to assure correctness, we need to incorporate semantic reasoning through verification, specification and refinement, and to integrate it into the development process. Our motivation is to simplify the structural view and to suggest a formal technique/method that can best be applied to UML-based system development. We investigate the tools and methods which are broadly used for the formal ...

  6. Spherical coverage verification

    CERN Document Server

    Petkovic, Marko D; Latecki, Longin Jan

    2011-01-01

    We consider the problem of covering a hypersphere by a set of spherical hypercaps. This sort of problem has numerous practical applications, such as error-correcting codes and the reverse k-nearest neighbor problem. Using a reduction from non-degenerate concave quadratic programming (QP), we demonstrate that spherical coverage verification is NP-hard. We propose a recursive algorithm based on reducing the problem to several lower-dimensional subproblems. We test the performance of the proposed algorithm on a number of generated constellations. We demonstrate that the proposed algorithm, in spite of its exponential worst-case complexity, is applicable in practice. In contrast, our results indicate that spherical coverage verification using QP solvers that utilize heuristics may, due to numerical instability, produce false positives.
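
    Exact verification is NP-hard, but coverage can be cheaply falsified by sampling; the sketch below is such a Monte Carlo check (not the authors' recursive algorithm), so a True result only means no counterexample was found.

```python
import numpy as np

def is_probably_covered(centers, angles, n_samples=100_000, seed=0):
    """Monte Carlo falsification of spherical coverage.

    centers: (k, d) array of unit cap centers; angles: (k,) cap half-angles
    in radians. Returns False as soon as a sampled point lies in no cap.
    """
    rng = np.random.default_rng(seed)
    pts = rng.normal(size=(n_samples, centers.shape[1]))
    pts /= np.linalg.norm(pts, axis=1, keepdims=True)   # uniform on the sphere
    covered = (pts @ centers.T >= np.cos(angles)).any(axis=1)
    return bool(covered.all())
```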

  7. Robust verification analysis

    Energy Technology Data Exchange (ETDEWEB)

    Rider, William, E-mail: wjrider@sandia.gov [Sandia National Laboratories, Center for Computing Research, Albuquerque, NM 87185 (United States); Witkowski, Walt [Sandia National Laboratories, Verification and Validation, Uncertainty Quantification, Credibility Processes Department, Engineering Sciences Center, Albuquerque, NM 87185 (United States); Kamm, James R. [Los Alamos National Laboratory, Methods and Algorithms Group, Computational Physics Division, Los Alamos, NM 87545 (United States); Wildey, Tim [Sandia National Laboratories, Center for Computing Research, Albuquerque, NM 87185 (United States)

    2016-02-15

    We introduce a new methodology for inferring the accuracy of computational simulations through the practice of solution verification. We demonstrate this methodology on examples from computational heat transfer, fluid dynamics and radiation transport. Our methodology is suited to both well- and ill-behaved sequences of simulations. Our approach to the analysis of these sequences of simulations incorporates expert judgment into the process directly via a flexible optimization framework, and the application of robust statistics. The expert judgment is systematically applied as constraints to the analysis, and together with the robust statistics guards against over-emphasis on anomalous analysis results. We have named our methodology Robust Verification. Our methodology is based on utilizing multiple constrained optimization problems to solve the verification model in a manner that varies the analysis' underlying assumptions. Constraints applied in the analysis can include expert judgment regarding convergence rates (bounds and expectations) as well as bounding values for physical quantities (e.g., positivity of energy or density). This approach then produces a number of error models, which are then analyzed through robust statistical techniques (median instead of mean statistics). This provides self-contained, data and expert informed error estimation including uncertainties for both the solution itself and order of convergence. Our method produces high quality results for the well-behaved cases relatively consistent with existing practice. The methodology can also produce reliable results for ill-behaved circumstances predicated on appropriate expert judgment. We demonstrate the method and compare the results with standard approaches used for both code and solution verification on well-behaved and ill-behaved simulations.
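
    A toy illustration of the median-statistics flavor of this approach: fit the error ansatz E = C h^p on every small subset of the mesh sequence, discard fits that violate expert bounds on the rate, and report the median. This is a simplified stand-in for the paper's constrained-optimization framework; the bounds are assumptions.

```python
import numpy as np
from itertools import combinations

def robust_convergence_rate(h, err, p_bounds=(0.5, 4.0)):
    """Median-statistics estimate of the observed order of convergence.

    Fits log(err) = log(C) + p*log(h) on every 3-point subset of the
    (mesh size, error) data, keeps fits whose rate p respects the
    expert-judgment bounds, and returns the median rate.
    """
    h, err = np.asarray(h, float), np.asarray(err, float)
    rates = []
    for idx in combinations(range(len(h)), 3):
        A = np.column_stack([np.ones(3), np.log(h[list(idx)])])
        (_, p), *_ = np.linalg.lstsq(A, np.log(err[list(idx)]), rcond=None)
        if p_bounds[0] <= p <= p_bounds[1]:
            rates.append(p)
    return float(np.median(rates)) if rates else float("nan")
```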

  8. Spent fuel verification options for final repository safeguards in Finland. A study on verification methods, their feasibility and safety aspects

    Energy Technology Data Exchange (ETDEWEB)

    Hautamaeki, J.; Tiitta, A. [VTT Chemical Technology, Espoo (Finland)

    2000-12-01

    The verification possibilities of the spent fuel assemblies from the Olkiluoto and Loviisa NPPs and the fuel rods from the research reactor of VTT are contemplated in this report. The spent fuel assemblies have to be verified at the partial defect level before final disposal into the geologic repository. The rods from the research reactor may be verified at the gross defect level. Developing a measurement system for partial defect verification is a complicated and time-consuming task. Passive High Energy Gamma Emission Tomography and the Fork Detector combined with Gamma Spectrometry are the most promising measurement principles to be developed for this purpose. The whole verification process has to be planned to be as streamlined as possible. An early start in planning the verification and developing the measurement devices is important in order to enable a smooth integration of the verification measurements into the conditioning and disposal process. The IAEA and Euratom have not yet concluded the safeguards criteria for the final disposal, e.g. criteria connected to the selection of the best place to perform the verification measurements. Options for the verification places have been considered in this report. One option for a verification measurement place is the intermediate storage; the other is the encapsulation plant. Crucial viewpoints include which one offers the best practical possibilities to perform the measurements effectively and which is the better place from the safeguards point of view. Verification measurements may be needed both in the intermediate storages and in the encapsulation plant. This report also assesses the integrity of the fuel assemblies after the wet intermediate storage period, because the assemblies have to withstand the handling operations of the verification measurements. (orig.)

  9. Generalised Filtering

    Directory of Open Access Journals (Sweden)

    Karl Friston

    2010-01-01

    Full Text Available We describe a Bayesian filtering scheme for nonlinear state-space models in continuous time. This scheme is called Generalised Filtering and furnishes posterior (conditional) densities on hidden states and unknown parameters generating observed data. Crucially, the scheme operates online, assimilating data to optimize the conditional density on time-varying states and time-invariant parameters. In contrast to Kalman and particle smoothing, Generalised Filtering does not require a backwards pass. In contrast to variational schemes, it does not assume conditional independence between the states and parameters. Generalised Filtering optimises the conditional density with respect to a free-energy bound on the model's log-evidence. This optimisation uses the generalised motion of hidden states and parameters, under the prior assumption that the motion of the parameters is small. We describe the scheme, present comparative evaluations with a fixed-form variational version, and conclude with an illustrative application to a nonlinear state-space model of brain imaging time-series.

  10. Development of a Safeguards Verification Method and Instrument to Detect Pin Diversion from Pressurized Water Reactor (PWR) Spent Fuel Assemblies Phase I Study

    Energy Technology Data Exchange (ETDEWEB)

    Ham, Y S; Sitaraman, S

    2008-12-24

    A novel methodology to detect diversion of spent fuel from Pressurized Water Reactors (PWR) has been developed in order to address a long-unsolved safeguards verification problem for the international safeguards community, such as the International Atomic Energy Agency (IAEA) or the European Atomic Energy Community (EURATOM). The concept involves inserting tiny neutron and gamma detectors into the guide tubes of a spent fuel assembly and measuring the signals. The guide tubes form a quadrant-symmetric pattern in the various PWR fuel product lines, and the neutron and gamma signals from these various locations are processed to obtain a unique signature for an undisturbed fuel assembly. Signatures based on the neutron and gamma signals, individually or in combination, can be developed. Removal of fuel pins from the assembly will cause the signatures to be visibly perturbed, thus enabling the detection of diversion. All of the required signal processing to obtain signatures can be performed on standard laptop computers. Monte Carlo simulation studies and a set of controlled experiments with actual commercial PWR spent fuel assemblies validated this novel methodology. Based on the simulation studies and benchmarking measurements, the methodology promises to be a powerful and practical way to detect partial defects that constitute 10% or more of the total active fuel pins. This far exceeds the detection threshold of 50% missing pins from a spent fuel assembly, the threshold defined by the IAEA Safeguards Criteria. The methodology does not rely on any operator-provided data like burnup or cooling time and does not require movement of the fuel assembly from the storage rack in the spent fuel pool. A concept was developed to build a practical field device, the Partial Defect Detector (PDET), which will be completely portable and will use standard radiation measuring devices already in use at the IAEA. The use of the device will not require any information provided by the operator.

  11. Sensor Fusion and Model Verification for a Mobile Robot

    OpenAIRE

    Bisgaard, Morten; Vinther, Dennis; Østergaard, Kasper Zinck; Bendtsen, Jan Dimon; Izadi-Zamanabadi, Roozbeh

    2005-01-01

    This paper presents the results of modeling, sensor fusion and model verification for a four-wheel driven, four-wheel steered mobile robot moving in outdoor terrain. The model derived for the robot describes the actuator and wheel dynamics and the vehicle kinematics, and includes friction terms as well as slip. An Unscented Kalman Filter (UKF) based on the dynamic model is used for sensor fusion, feeding sensor measurements back to the robot controller in an intelligent manner. Through practi...
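
    For reference, the core of a UKF is the sigma-point predict/update cycle; a minimal scalar version is sketched below. The robot's actual state (kinematics with friction and slip terms) is multivariate, so this is only an illustration of the mechanism.

```python
import numpy as np

def ukf_step(x, P, z, f, h, Q, R, kappa=2.0):
    """One predict/update cycle of a scalar unscented Kalman filter.

    f, h: vectorized process and measurement models; Q, R: noise variances.
    """
    c = 1.0 + kappa                                   # n + kappa with n = 1
    sig = np.array([x, x + np.sqrt(c * P), x - np.sqrt(c * P)])
    w = np.array([kappa / c, 0.5 / c, 0.5 / c])       # sigma-point weights
    sig_f = f(sig)                                    # predict
    x_pred = w @ sig_f
    P_pred = w @ (sig_f - x_pred) ** 2 + Q
    sig_h = h(sig_f)                                  # update
    z_pred = w @ sig_h
    Pzz = w @ (sig_h - z_pred) ** 2 + R
    Pxz = w @ ((sig_f - x_pred) * (sig_h - z_pred))
    K = Pxz / Pzz
    return x_pred + K * (z - z_pred), P_pred - K * Pzz * K
```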

  12. A Flash-ADC data acquisition system developed for a drift chamber array and a digital filter algorithm for signal processing

    Science.gov (United States)

    Yi, Han; Lü, Li-Ming; Zhang, Zhao; Cheng, Wen-Jing; Ji, Wei; Huang, Yan; Zhang, Yan; Li, Hong-Jie; Cui, Yin-Ping; Lin, Ming; Wang, Yi-Jie; Duan, Li-Min; Hu, Rong-Jiang; Xiao, Zhi-Gang

    2016-11-01

    A Flash-ADC data acquisition (DAQ) system has been developed for the drift chamber array designed for the External-Target-Experiment at the Cooling Storage Ring at the Heavy Ion Research Facility, Lanzhou. The simplified readout electronics system has been developed using Flash-ADC modules; the whole waveform in the sampling window is recorded, from which the time and energy information can be deduced by offline processing. A digital filter algorithm has been developed to discriminate between noise and the useful signal. With the digital filtering process, the signal-to-noise ratio (SNR) is increased and a better time and energy resolution can be obtained. Supported by the National Basic Research Program of China (973) (2015CB856903 and 2014CB845405), partly by the National Science Foundation of China (U1332207 and 11375094), and by the Tsinghua University Initiative Scientific Research Program.
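
    The abstract does not spell out the filter, but a typical offline chain for such waveforms (smoothing, baseline subtraction, threshold discrimination) looks roughly like the sketch below; the window lengths and threshold are assumptions.

```python
import numpy as np

def analyze_waveform(wf, threshold, navg=8, baseline_len=32):
    """Toy offline digital filter for a Flash-ADC waveform.

    Smooths with a moving average to raise the SNR, subtracts the baseline
    estimated from the first samples, and returns (leading-edge index,
    pulse amplitude), or None if nothing exceeds the threshold.
    """
    sm = np.convolve(wf, np.ones(navg) / navg, mode="same")
    pulse = sm - sm[:baseline_len].mean()
    above = np.nonzero(pulse > threshold)[0]
    if above.size == 0:
        return None
    return int(above[0]), float(pulse.max())
```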

  13. Continuous verification using multimodal biometrics.

    Science.gov (United States)

    Sim, Terence; Zhang, Sheng; Janakiraman, Rajkumar; Kumar, Sandeep

    2007-04-01

    Conventional verification systems, such as those controlling access to a secure room, do not usually require the user to reauthenticate himself for continued access to the protected resource. This may not be sufficient for high-security environments in which the protected resource needs to be continuously monitored for unauthorized use. In such cases, continuous verification is needed. In this paper, we present the theory, architecture, implementation, and performance of a multimodal biometrics verification system that continuously verifies the presence of a logged-in user. Two modalities are currently used--face and fingerprint--but our theory can be readily extended to include more modalities. We show that continuous verification imposes additional requirements on multimodal fusion when compared to conventional verification systems. We also argue that the usual performance metrics of false accept and false reject rates are insufficient yardsticks for continuous verification and propose new metrics against which we benchmark our system.

  14. The Development Status and Characteristics of Commonly Used DPF Filter Materials

    Institute of Scientific and Technical Information of China (English)

    郭秀荣; 王雅慧

    2012-01-01

    The diesel particulate filter (DPF) is one of the most effective ways to reduce soot particle emissions from diesel vehicles, while the development of the filter material is the most critical factor constraining DPF technology. In this paper, commonly used DPF filter materials are classified into three classes: ceramic-based materials, metal-based materials and new materials. The structural characteristics, filtration performance and economic characteristics of the various materials are introduced in detail, and their advantages and disadvantages are compared and analyzed, offering a useful reference for the application of DPF filter materials.

  15. ETV TEST REPORT OF CONTROL OF BIOAEROSOLS IN HVAC SYSTEMS GLASFLOSS INDUSTRIES EXCEL FILTER, MODEL SBG24242898

    Science.gov (United States)

    The Environmental Technology Verification report discusses the technology and performance of the Excel Filter, Model SBG24242898 air filter for dust and bioaerosol filtration manufactured by Glasfloss Industries, Inc. The pressure drop across the filter was 82 Pa clean and 348 Pa...

  16. ENVIRONMENTAL TECHNOLOGY VERIFICATION, TEST REPORT OF CONTROL OF BIOAEROSOLS IN HVAC SYSTEMS, COLUMBUS INDUSTRIES SL-3 RING PANEL

    Science.gov (United States)

    The Environmental Technology Verification report discusses the technology and performance of the High Efficiency Mini Pleat air filter for dust and bioaerosol filtration manufactured by Columbus Industries. The pressure drop across the filter was 142 Pa clean and 283 Pa dust load...

  17. 40 CFR 1065.309 - Continuous gas analyzer system-response and updating-recording verification-for gas analyzers...

    Science.gov (United States)

    2010-07-01

    ... sampling system at the same time. If you use any analog or real-time digital filters during emission testing, you must operate those filters in the same manner during this verification. (2) Equipment setup... same time. In designing your experimental setup, avoid pressure pulsations due to stopping the...

  18. Whole genome sequencing of Streptococcus pneumoniae: development, evaluation and verification of targets for serogroup and serotype prediction using an automated pipeline

    Directory of Open Access Journals (Sweden)

    Georgia Kapatai

    2016-09-01

    Full Text Available Streptococcus pneumoniae typically express one of 92 serologically distinct capsule polysaccharide (cps) types (serotypes). Some of these serotypes are closely related to each other; using the commercially available typing antisera, these are assigned to common serogroups containing types that show cross-reactivity. In this serotyping scheme, factor antisera are used to allocate serotypes within a serogroup, based on patterns of reactions. This serotyping method is technically demanding, requires considerable experience and the reading of the results can be subjective. This study describes the analysis of the S. pneumoniae capsular operon genetic sequence to determine serotype distinguishing features and the development, evaluation and verification of an automated whole genome sequence (WGS)-based serotyping bioinformatics tool, PneumoCaT (Pneumococcal Capsule Typing). Initially, WGS data from 871 S. pneumoniae isolates were mapped to reference cps locus sequences for the 92 serotypes. Thirty-two of 92 serotypes could be unambiguously identified based on sequence similarities within the cps operon. The remaining 60 were allocated to one of 20 'genogroups' that broadly correspond to the immunologically defined serogroups. By comparing the cps reference sequences for each genogroup, unique molecular differences were determined for serotypes within 18 of the 20 genogroups and verified using the set of 871 isolates. This information was used to design a decision-tree style algorithm within the PneumoCaT bioinformatics tool to predict to serotype level for 89/94 (92 + 2) molecular types/subtypes from WGS data and to serogroup level for serogroups 24 and 32, which currently comprise 2.1% of UK referred, invasive isolates submitted to the National Reference Laboratory (NRL), Public Health England (June 2014-July 2015). PneumoCaT was evaluated with an internal validation set of 2065 UK isolates covering 72/92 serotypes, including 19 non-typeable isolates

  19. Development of a new rapid isolation device for circulating tumor cells (CTCs) using 3D palladium filter and its application for genetic analysis.

    Science.gov (United States)

    Yusa, Akiko; Toneri, Makoto; Masuda, Taisuke; Ito, Seiji; Yamamoto, Shuhei; Okochi, Mina; Kondo, Naoto; Iwata, Hiroji; Yatabe, Yasushi; Ichinosawa, Yoshiyuki; Kinuta, Seichin; Kondo, Eisaku; Honda, Hiroyuki; Arai, Fumihito; Nakanishi, Hayao

    2014-01-01

    Circulating tumor cells (CTCs) in the blood of patients with epithelial malignancies provide a promising and minimally invasive source for early detection of metastasis, monitoring of therapeutic effects and basic research addressing the mechanism of metastasis. In this study, we developed a new filtration-based, sensitive CTC isolation device. This device consists of a 3-dimensional (3D) palladium (Pd) filter with an 8 µm-sized pore in the lower layer and a 30 µm-sized pocket in the upper layer to trap CTCs on a filter micro-fabricated by precise lithography plus electroforming process. This is a simple pump-less device driven by gravity flow and can enrich CTCs from whole blood within 20 min. After on-device staining of CTCs for 30 min, the filter cassette was removed from the device, fixed in a cassette holder and set up on the upright fluorescence microscope. Enumeration and isolation of CTCs for subsequent genetic analysis from the beginning were completed within 1.5 hr and 2 hr, respectively. Cell spike experiments demonstrated that the recovery rate of tumor cells from blood by this Pd filter device was more than 85%. Single living tumor cells were efficiently isolated from these spiked tumor cells by a micromanipulator, and KRAS mutation, HER2 gene amplification and overexpression, for example, were successfully detected from such isolated single tumor cells. Sequential analysis of blood from mice bearing metastasis revealed that CTC increased with progression of metastasis. Furthermore, a significant increase in the number of CTCs from the blood of patients with metastatic breast cancer was observed compared with patients without metastasis and healthy volunteers. These results suggest that this new 3D Pd filter-based device would be a useful tool for the rapid, cost effective and sensitive detection, enumeration, isolation and genetic analysis of CTCs from peripheral blood in both preclinical and clinical settings.

  1. VERIFICATION AND RISK ASSESSMENT FOR LANDSLIDES IN THE SHIMEN RESERVOIR WATERSHED OF TAIWAN USING SPATIAL ANALYSIS AND DATA MINING

    Directory of Open Access Journals (Sweden)

    J. S. Lai

    2012-07-01

    Full Text Available Spatial information technologies and data can be used effectively to investigate and monitor natural disasters continuously and to support policy- and decision-making for hazard prevention, mitigation and reconstruction. However, in addition to the vastly growing data volume, spatial data usually come from different sources and with different formats and characteristics. Therefore, it is necessary to find useful and valuable information that may not be obvious in the original data sets from numerous collections. This paper presents the preliminary results of a study on the validation and risk assessment of landslide events induced by heavy torrential rains in the Shimen reservoir watershed of Taiwan using spatial analysis and data mining algorithms. In this study, eleven factors were considered, including elevation (Digital Elevation Model, DEM), slope, aspect, curvature, NDVI (Normalized Difference Vegetation Index), fault, geology, soil, land use, river and road. The experimental results indicate that the overall accuracy and kappa coefficient in verification can reach 98.1% and 0.8829, respectively. However, the trained decision tree (DT) model is overfitted and unreliable for prediction. To address this issue, a mechanism was developed to filter uncertain data by the standard deviation of the data distribution. Experimental results demonstrate that after filtering the uncertain data, the kappa coefficient in prediction increases substantially, by 29.5%. The results indicate that spatial analysis and data mining algorithms, combined with the mechanism developed in this study, can produce more reliable results for verification and forecasting of landslides in the study site.
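
    One plausible reading of the standard-deviation filtering mechanism is sketched below: samples with any factor value far from the mean of its distribution are treated as uncertain and dropped. The cut factor k is an assumption.

```python
import numpy as np

def filter_uncertain(X, k=2.0):
    """Drop samples whose features lie beyond k standard deviations.

    X: (n_samples, n_factors) array of landslide factors. Returns the
    retained samples and the boolean mask that selected them.
    """
    mu, sd = X.mean(axis=0), X.std(axis=0)
    mask = (np.abs(X - mu) <= k * sd).all(axis=1)
    return X[mask], mask
```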

  2. Survey on Offline Finger Print Verification System

    NARCIS (Netherlands)

    Suman, R.; Kaur, R.

    2012-01-01

    In fingerprint verification, "verification" implies that a user's fingerprint is matched against a single fingerprint associated with the identity that the user claims. Biometrics can be classified into two types: Behavioral (signature verification, keystroke dynamics, etc.) and Physiological (

  3. Verification of LHS distributions.

    Energy Technology Data Exchange (ETDEWEB)

    Swiler, Laura Painton

    2006-04-01

    This document provides verification test results for the normal, lognormal, and uniform distributions that are used in Sandia's Latin Hypercube Sampling (LHS) software. The purpose of this testing is to verify that the sample values being generated in LHS are distributed according to the desired distribution types. The testing of distribution correctness is done by examining summary statistics, graphical comparisons using quantile-quantile plots, and formal statistical tests such as the Chi-square test, the Kolmogorov-Smirnov test, and the Anderson-Darling test. The overall results from the testing indicate that the generation of normal, lognormal, and uniform distributions in LHS is acceptable.
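
    This kind of check is easy to reproduce; the sketch below draws a one-dimensional Latin hypercube sample of U(0, 1) and applies SciPy's Kolmogorov-Smirnov test. It is not Sandia's LHS code, just the same verification idea.

```python
import numpy as np
from scipy import stats

def lhs_uniform(n, seed=0):
    """One-dimensional Latin hypercube sample of U(0, 1):
    one point per stratum [i/n, (i+1)/n), in shuffled order."""
    rng = np.random.default_rng(seed)
    return ((np.arange(n) + rng.random(n)) / n)[rng.permutation(n)]

x = lhs_uniform(1000)
d, p = stats.kstest(x, "uniform")
print(f"KS statistic {d:.4f}, p-value {p:.3f}")  # large p: no evidence of mis-sampling
```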

  4. High-level verification

    CERN Document Server

    Lerner, Sorin; Kundu, Sudipta

    2011-01-01

    Given the growing size and heterogeneity of Systems on Chip (SOC), the design process from initial specification to chip fabrication has become increasingly complex. This growing complexity provides incentive for designers to use high-level languages such as C, SystemC, and SystemVerilog for system-level design. While a major goal of these high-level languages is to enable verification at a higher level of abstraction, allowing early exploration of system-level designs, the focus so far for validation purposes has been on traditional testing techniques such as random testing and scenario-based

  5. Studies on upflow anaerobic filter

    Science.gov (United States)

    Varandani, Nanik Sobhraj

    The thesis presents a critical review of the available literature on the various studies carried out on the Upflow Anaerobic Filter (UAF) throughout the world. Young and McCarty (1969) did the pioneering work in developing the UAF; since then several studies have been carried out by different researchers using different substrates under different operating conditions and a variety of supporting media. The most significant modification of the original reactor developed by Young and McCarty, however, has been the development and use of high-porosity media. The use of high-porosity media has in fact changed the character of the reactor from a basically fixed-film reactor to one in which the contribution of the suspended bio-solids, entrapped in the numerous media pores, to substrate removal is significant; that is to say, the reactor no longer remains a biological reactor which can be modeled and designed on the basis of biofilm kinetics alone. The thesis presents an attempt to validate the developed mathematical model(s) by using the laboratory-scale reactor performance data and the calculated values of reaction kinetic and bio-kinetic constants. To simplify the verification process, computer programmes were prepared using the "EXCEL" software and the C language. The results of the "EXCEL" computer program runs are tabulated in tables 7.1 to 7.5. The verification of the various mathematical models indicates that model III B, i.e. the non-ideal plug flow model assumed to consist of complete-mix reactors in series based on reaction kinetics, gives results with the least deviation from the real situation. An interesting observation is that the model shows the least deviation, or nearly satisfies the real situation, at a particular COD removal efficiency for a particular OLR; e.g., the least deviations are obtained at COD removal efficiencies of 89% for OLR 2, 81.5% for OLR 4, and 78.5% for OLR 6. However, the use of the

  6. The intractable cigarette 'filter problem'.

    Science.gov (United States)

    Harris, Bradford

    2011-05-01

    When lung cancer fears emerged in the 1950s, cigarette companies initiated a shift in cigarette design from unfiltered to filtered cigarettes. Both the ineffectiveness of cigarette filters and the tobacco industry's misleading marketing of the benefits of filtered cigarettes have been well documented. However, during the 1950s and 1960s, American cigarette companies spent millions of dollars to solve what the industry identified as the 'filter problem'. These extensive filter research and development efforts suggest a phase of genuine optimism among cigarette designers that cigarette filters could be engineered to mitigate the health hazards of smoking. This paper explores the early history of cigarette filter research and development in order to elucidate why and when seemingly sincere filter engineering efforts devolved into manipulations in cigarette design to sustain cigarette marketing and mitigate consumers' concerns about the health consequences of smoking. Relevant word and phrase searches were conducted in the Legacy Tobacco Documents Library online database, Google Patents, and media and medical databases including ProQuest, JSTOR, Medline and PubMed. 13 tobacco industry documents were identified that track prominent developments involved in what the industry referred to as the 'filter problem'. These reveal a period of intense focus on the 'filter problem' that persisted from the mid-1950s to the mid-1960s, featuring collaborations between cigarette producers and large American chemical and textile companies to develop effective filters. In addition, the documents reveal how cigarette filter researchers' growing scientific knowledge of smoke chemistry led to increasing recognition that filters were unlikely to offer significant health protection. One of the primary concerns of cigarette producers was to design cigarette filters that could be economically incorporated into the massive scale of cigarette production. The synthetic plastic cellulose acetate

  7. Retail applications of signature verification

    Science.gov (United States)

    Zimmerman, Thomas G.; Russell, Gregory F.; Heilper, Andre; Smith, Barton A.; Hu, Jianying; Markman, Dmitry; Graham, Jon E.; Drews, Clemens

    2004-08-01

    The dramatic rise in identity theft, the ever-pressing need to provide convenience in checkout services to attract and retain loyal customers, and the growing use of multi-function signature capture devices in the retail sector provide favorable conditions for the deployment of dynamic signature verification (DSV) in retail settings. We report on the development of a DSV system to meet the needs of the retail sector. We currently have a database of approximately 10,000 signatures collected from 600 subjects and forgers. Previous work at IBM on DSV has been merged and extended to achieve robust performance on pen position data available from commercial point-of-sale hardware, achieving equal error rates on skilled forgeries and authentic signatures of 1.5% to 4%.
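
    The abstract does not name the matching algorithm; dynamic time warping (DTW) is one common way to compare pen trajectories in DSV, sketched below for illustration.

```python
import numpy as np

def dtw_distance(a, b):
    """Dynamic time warping distance between two pen trajectories.

    a, b: arrays of shape (n, 2) and (m, 2) of (x, y) pen samples.
    A smaller distance suggests the same signer; thresholding this
    score sets the operating point of a verifier.
    """
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = np.linalg.norm(a[i - 1] - b[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return float(D[n, m])
```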

  8. MFTF sensor verification computer program

    Energy Technology Data Exchange (ETDEWEB)

    Chow, H.K.

    1984-11-09

    The design, requirements document and implementation of the MFE Sensor Verification System were accomplished by the Measurement Engineering Section (MES), a group which provides instrumentation for the MFTF magnet diagnostics. The sensors, installed on and around the magnets and solenoids housed in a vacuum chamber, will supply information about the temperature, strain, pressure, liquid helium level and magnet voltage to the facility operator for evaluation. As the sensors are installed, records must be maintained of their initial resistance values. Also, as the work progresses, monthly checks will be made to ensure continued sensor health. Finally, after the MFTF-B demonstration, yearly checks will be performed, as well as checks of individual sensors as problems develop. The software to acquire and store the data was written by Harry Chow, Computations Department. The acquired data will be transferred to the MFE database computer system.

  9. The development of advanced instrumentation and control technology. The development of verification and validation technology for instrumentation and control in NPPs

    Energy Technology Data Exchange (ETDEWEB)

    Kwon, Kee Choon; Ham, Chang Shik; Lee, Jang Soo; Kim, Jang Yeol; Song, Soon Ja; Kim, Jung Taek; Park, Won Man; Lee, Dong Young; Eom, Heung Seop; Park, Kee Yong

    1997-07-01

    We identified essential problems that follow from the digitalization of instrumentation and control, and divided the resolution scheme into hardware and software. We have analyzed hardware V and V methodologies concerning common mode failure, the commercial grade dedication process and electromagnetic compatibility. We have developed several guidelines: the software classification guideline, the quality assurance handbook for software in digital I and C, the software V and V planning guideline, and the software safety guideline. We have then established an integrated environment for safety-critical software based on Computer Aided Software Engineering (CASE) tools. We have also surveyed trends and application cases of test facilities for the establishment of functional requirements. The input/output interface, which connects the host computer with the developed target and panel, was designed and manufactured using the UXI bus. The developed functional test facility is used to test and validate the automatic start-up intelligent control system, the dynamic alarm system, the accident identification system using hidden Markov models, and the intelligent logic tracking system. The evaluation of the above systems shows that the performance of the functional test facility is sufficient in normal operating and transient conditions. (author). 24 tabs., 59 figs.

  10. Development of nuclear thermal hydraulic verification tests and evaluation technology - Development of the ultrasonic method for two-phase mixture level measurement in nuclear reactor

    Energy Technology Data Exchange (ETDEWEB)

    No, Hee Cheon; Kim, Sang Jae; Kim, Hyung Tae; Moon, Young Min [Korea Advanced Institute of Science and Technology, Taejon (Korea)

    2000-04-01

    An ultrasonic method is developed for the measurement of the two-phase mixture level in the reactor vessel or steam generator. The ultrasonic method was selected from among several non-nuclear two-phase mixture level measurement methods through a two-step selection procedure. A commercial ultrasonic level measurement method is modified for application to high-temperature, high-pressure and other conditions. The calculation of the ultrasonic velocity is modified to treat the medium as a homogeneous mixture of air and steam and to remain valid at high temperature and pressure. The cross-correlation technique is adopted as the detection method to reduce the effects of attenuation and of the diffuse reflection caused by surface fluctuation. Waveguides are developed to reduce the loss of echo and to remove the effects of obstructions. The present experimental study shows that the developed ultrasonic method measures the two-phase mixture level more accurately than conventional methods do. 21 refs., 60 figs., 13 tabs. (Author)
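
    The cross-correlation step amounts to locating the lag at which the received echo best matches the transmitted burst; a minimal sketch (the signals and sample rate are assumptions for illustration):

```python
import numpy as np

def echo_delay(tx, rx, fs):
    """Estimate the echo time-of-flight by cross-correlation.

    tx: transmitted burst, rx: received signal, fs: sample rate in Hz.
    The mixture level then follows from delay * c_mixture / 2, with the
    sound speed computed for the homogeneous air/steam mixture.
    """
    corr = np.correlate(rx, tx, mode="full")
    lag = int(corr.argmax()) - (len(tx) - 1)
    return lag / fs
```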

  11. Operational Simulation of LC Ladder Filter Using VDTA

    Directory of Open Access Journals (Sweden)

    Praveen Kumar

    2017-01-01

    Full Text Available In this paper, a systematic approach for implementing an operational simulation of an LC ladder filter using the voltage differencing transconductance amplifier (VDTA) is presented. The proposed filter structure uses only grounded capacitors and possesses electronic tunability. PSPICE simulation using 180 nm CMOS technology parameters is carried out to verify the functionality of the presented approach. Experimental verification is also performed with the commercially available IC LM13700/NS. Simulation and experimental results are found to be in close agreement with theoretical predictions.
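
    Operational simulation means realizing the ladder's branch equations with integrators; the sketch below integrates the signal-flow equations of a normalized doubly terminated third-order Butterworth ladder numerically. The element values are the standard prototype values, and the paper's VDTA circuit realizes these integrators electronically.

```python
import numpy as np

def ladder_lowpass(vin, fs, C1=1.0, L2=2.0, C3=1.0, Rs=1.0, RL=1.0):
    """Leapfrog simulation of a doubly terminated 3rd-order LC ladder.

    Forward-Euler integration of the ladder state equations; fs must be
    well above the (normalized) corner frequency for stability.
    """
    dt = 1.0 / fs
    v1 = i2 = v3 = 0.0
    out = np.empty(len(vin))
    for k, u in enumerate(vin):
        dv1 = ((u - v1) / Rs - i2) / C1   # shunt capacitor C1
        di2 = (v1 - v3) / L2              # series inductor L2
        dv3 = (i2 - v3 / RL) / C3         # shunt capacitor C3 with load
        v1, i2, v3 = v1 + dv1 * dt, i2 + di2 * dt, v3 + dv3 * dt
        out[k] = v3
    return out
```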

  12. Component Verification and Certification in NASA Missions

    Science.gov (United States)

    Giannakopoulou, Dimitra; Penix, John; Norvig, Peter (Technical Monitor)

    2001-01-01

    Software development for NASA missions is a particularly challenging task. Missions are extremely ambitious scientifically, have very strict time frames, and must be accomplished with a maximum degree of reliability. Verification technologies must therefore be pushed far beyond their current capabilities. Moreover, reuse and adaptation of software architectures and components must be incorporated in software development within and across missions. This paper discusses NASA applications that we are currently investigating from these perspectives.

  13. Development of a new ridge filter with honeycomb geometry for a pencil beam scanning system in particle radiotherapy

    Science.gov (United States)

    Tansho, R.; Furukawa, T.; Hara, Y.; Mizushima, K.; Saotome, N.; Saraya, Y.; Shirai, T.; Noda, K.

    2017-09-01

    A ridge filter (RGF), a beam energy modulation device, is usually used for particle radiotherapy with a pencil beam scanning system. The conventional RGF has a one-dimensional (1D) periodic laterally stepped structure in the plane orthogonal to the central beam direction. The energy of a beam passing through the different thicknesses of the stepped RGF is modulated. Although the lateral pencil beam size is required to cover several stepped RGF units to modulate its energy as designed, the current trend is to decrease the lateral beam size to improve the scanning system. As a result, the beam size becomes smaller than the size of an individual RGF unit. The aim of this study was to develop a new RGF with two-dimensional (2D) honeycomb geometry to simultaneously achieve both a decrease in lateral beam size and the desired energy modulation. The conventional 1D-RGF and the 2D-RGF with honeycomb geometry were both designed so that the Bragg peak size of a 79 MeV/u carbon ion pencil beam in water was 1 mm RMS in the beam direction. To validate the design of the 2D-RGF, we calculated depth dose distributions in water using a simplified Monte Carlo method. In the calculations, we decreased the lateral pencil beam size at the entrance of the RGF and investigated the threshold lateral beam size with which the pencil beam can reproduce the desired Bragg peak size for each type of RGF. In addition, we calculated lateral dose distributions in air downstream from the RGF and evaluated the inhomogeneity of the lateral dose distributions. Using the 2D-RGF, the threshold lateral beam size with which the pencil beam can reproduce the desired Bragg peak size was smaller than that using the 1D-RGF. Moreover, the distance from the RGF at which the lateral dose distribution becomes uniform was shorter using the 2D-RGF than using the 1D-RGF. These results indicate that when the periodic length of both RGFs is the same, the 2D-RGF allows use of a pencil beam with smaller lateral
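
    To first order, an RGF turns the pristine Bragg curve into a weighted sum of copies shifted shallower by each step's water-equivalent thickness; a toy sketch (the thicknesses and weights are assumptions, and a real design also models scattering in the filter):

```python
import numpy as np

def modulated_bragg(depth, pristine, step_thickness, step_weight):
    """Toy ridge-filter model: weighted sum of depth-shifted Bragg curves.

    depth: uniform depth grid; pristine: pristine Bragg curve on that grid;
    step_thickness/step_weight: water-equivalent thickness and area weight
    of each RGF step. The 1D-strip vs. 2D-honeycomb geometry changes only
    how a finite pencil beam samples these steps.
    """
    dz = depth[1] - depth[0]
    out = np.zeros_like(pristine)
    for t, w in zip(step_thickness, step_weight):
        s = int(round(t / dz))
        out[: len(out) - s] += w * pristine[s:]
    return out
```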

  14. FMEF Electrical single line diagram and panel schedule verification process

    Energy Technology Data Exchange (ETDEWEB)

    FONG, S.K.

    1998-11-11

    Since the FMEF did not have a mission, a formal drawing verification program was not developed; however, a verification process for essential electrical single line drawings and panel schedules was established to benefit the operations lock and tag program and to enhance the electrical safety culture of the facility. The purpose of this document is to provide a basis by which future landlords and cognizant personnel can understand the degree of verification performed on the electrical single lines and panel schedules. It is the intent that this document be revised or replaced by a more formal requirements document if a mission is identified for the FMEF.

  15. CRYSTAL FILTER TEST SET

    Science.gov (United States)

    CRYSTAL FILTERS, *HIGH FREQUENCY, *RADIOFREQUENCY FILTERS, AMPLIFIERS, ELECTRIC POTENTIAL, FREQUENCY, IMPEDANCE MATCHING, INSTRUMENTATION, RADIOFREQUENCY, RADIOFREQUENCY AMPLIFIERS, TEST EQUIPMENT, TEST METHODS

  16. Multilevel Mixture Kalman Filter

    Directory of Open Access Journals (Sweden)

    Xiaodong Wang

    2004-11-01

    Full Text Available The mixture Kalman filter is a general sequential Monte Carlo technique for conditional linear dynamic systems. It generates samples of some indicator variables recursively based on sequential importance sampling (SIS) and integrates out the linear and Gaussian state variables conditioned on these indicators. Due to the marginalization process, the complexity of the mixture Kalman filter is quite high if the dimension of the indicator sampling space is high. In this paper, we address this difficulty by developing a new Monte Carlo sampling scheme, namely, the multilevel mixture Kalman filter. The basic idea is to make use of the multilevel or hierarchical structure of the space from which the indicator variables take values. That is, we draw samples in a multilevel fashion, beginning with sampling from the highest-level sampling space and then drawing samples from the associated subspace of the newly drawn samples in a lower-level sampling space, until reaching the desired sampling space. Such a multilevel sampling scheme can be used in conjunction with a delayed estimation method, such as the delayed-sample method, resulting in a delayed multilevel mixture Kalman filter. Examples in wireless communication, specifically coherent and noncoherent 16-QAM over flat-fading channels, are provided to demonstrate the performance of the proposed multilevel mixture Kalman filter.

  17. HDL to verification logic translator

    Science.gov (United States)

    Gambles, J. W.; Windley, P. J.

    1992-01-01

    The increasing number of transistors possible in VLSI circuits compounds the difficulty of ensuring correct designs. As the number of test cases required to exhaustively simulate a circuit design explodes, a better method is required to confirm the absence of design faults. Formal verification methods provide a way to prove, using logic, that a circuit structure correctly implements its specification. Before verification is accepted by VLSI design engineers, the stand-alone verification tools in use in the research community must be integrated with the CAD tools used by the designers. One problem facing the acceptance of formal verification into circuit design methodology is that the structural circuit descriptions used by the designers are not appropriate for verification work, and those required for verification lack some of the features needed for design. We offer a solution to this dilemma: an automatic translation from the designers' HDL models into definitions for the higher-order logic (HOL) verification system. The translated definitions become the low-level basis of circuit verification, which in turn increases the designer's confidence in the correctness of higher-level behavioral models.

  19. Systems Approach to Arms Control Verification

    Energy Technology Data Exchange (ETDEWEB)

    Allen, K; Neimeyer, I; Listner, C; Stein, G; Chen, C; Dreicer, M

    2015-05-15

    Using the decades of experience of developing concepts and technologies for verifying bilateral and multilateral arms control agreements, a broad conceptual systems approach is being developed that takes into account varying levels of information and risk. The IAEA has already demonstrated the applicability of a systems approach by implementing safeguards at the State level, with acquisition path analysis as the key element. In order to test whether such an approach could also be implemented for arms control verification, an exercise was conducted in November 2014 at the JRC ITU Ispra. Based on the scenario of a hypothetical treaty between two model nuclear weapons states aimed at capping their nuclear arsenals at existing levels, the goal of this exercise was to explore how to use acquisition path analysis in an arms control context. Our contribution will present the scenario, objectives and results of this exercise, and attempt to define future workshops aimed at further developing verification measures that will deter or detect treaty violations.

  20. Stack filter classifiers

    Energy Technology Data Exchange (ETDEWEB)

    Porter, Reid B [Los Alamos National Laboratory; Hush, Don [Los Alamos National Laboratory

    2009-01-01

    Just as linear models generalize the sample mean and weighted average, weighted order statistic models generalize the sample median and weighted median. This analogy can be continued informally to generalized additive models in the case of the mean, and Stack Filters in the case of the median. Both of these model classes have been extensively studied for signal and image processing, but it is surprising to find that for pattern classification their treatment has been significantly one-sided. Generalized additive models are now a major tool in pattern classification, and many different learning algorithms have been developed to fit model parameters to finite data. However, Stack Filters remain largely confined to signal and image processing, and learning algorithms for classification are yet to be seen. This paper is a step towards Stack Filter Classifiers, and it shows that the approach is interesting from both a theoretical and a practical perspective.
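
    A stack filter applies a single positive Boolean function to every threshold slice of the signal and sums the results; the sketch below shows the classic fact that the majority function reproduces the running median. This is the signal-processing object only, not the classifiers the paper proposes.

```python
import numpy as np

def stack_filter(x, pbf, width=3):
    """Apply a stack filter defined by a positive Boolean function `pbf`.

    x: 1-D array of non-negative integers. Each threshold slice is
    filtered by `pbf` over a sliding window; summing the binary outputs
    reconstructs the multi-level result (the stacking property).
    """
    pad = width // 2
    xp = np.pad(x, pad, mode="edge")
    out = np.zeros_like(x)
    for t in range(1, int(x.max()) + 1):
        win = np.lib.stride_tricks.sliding_window_view((xp >= t).astype(int), width)
        out += pbf(win)
    return out

majority = lambda w: (w.sum(axis=1) > w.shape[1] // 2).astype(int)
x = np.array([1, 5, 2, 8, 2, 3, 9, 3])
print(stack_filter(x, majority))  # identical to a width-3 running median
```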

  1. Multimodal biometric fusion using multiple-input correlation filter classifiers

    Science.gov (United States)

    Hennings, Pablo; Savvides, Marios; Vijaya Kumar, B. V. K.

    2005-03-01

    In this work we apply a computationally efficient, closed-form design of a jointly optimized filter bank of correlation filter classifiers for biometric verification using multiple biometrics from individuals. Advanced correlation filters have been used successfully for biometric classification and have shown robustness in verifying faces, palmprints and fingerprints. In this study we address the issue of performing robust biometric verification when multiple biometrics from the same person are available at the moment of authentication; we implement biometric fusion by using a filter bank of correlation filter classifiers which are jointly optimized with each biometric, instead of designing separate independent correlation filter classifiers for each biometric and then fusing the resulting match scores. We present results using fingerprint and palmprint images from a data set of 40 people, showing a considerable advantage in verification performance and producing a large margin of separation between the impostor and authentic match scores. The method proposed in this paper is a robust and secure way of authenticating an individual.
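
    For reference, a minimal sketch of correlation-filter matching, scoring a probe image by the peak-to-sidelobe ratio (PSR) of its correlation output; averaging the training-image FFTs here is a crude stand-in for the jointly optimized filter-bank design of the paper.

```python
import numpy as np

def psr_score(template_ffts, probe, exclude=5):
    """Peak-to-sidelobe ratio of the correlation between probe and filter.

    template_ffts: iterable of 2-D FFTs of same-size training images for
    one subject. A higher PSR suggests an authentic match.
    """
    H = np.mean(template_ffts, axis=0)
    corr = np.real(np.fft.ifft2(np.fft.fft2(probe) * np.conj(H)))
    py, px = np.unravel_index(corr.argmax(), corr.shape)
    mask = np.ones(corr.shape, dtype=bool)
    mask[max(0, py - exclude):py + exclude + 1,
         max(0, px - exclude):px + exclude + 1] = False
    side = corr[mask]
    return float((corr.max() - side.mean()) / side.std())
```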

  2. Development of nuclear thermal hydraulic verification tests and evaluation technology - Development of a sensor for two-phase flow measurement using impedance method

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Moo Whan; Kang, Hie Chan; Kwon, Jung Tae; Huh, Deok; Yang, Hoon Cheul [Pohang University of Science and Technology, Pohang (Korea)

    2000-04-01

    Impedance method was carried out to design the electrode that can measure the void fraction of the bubbly flow in pool reservoir. To find out the optimum electrode shape, Styrofoam-Simulator tests were performed in a specially designed acrylic reservoir. Three kinds of electrodes were designed to compare the measuring characteristics of water-air flow. The resistance increased with the increase of the void fraction and the capacitance decreased with the increase of the void fraction. The resistance is a main parameter to express the nature of the water-air flow in impedance method. Almost of impedance values come out from the resistance. The degree of deviation from the mean-resistance values showed reasonable results. Electrode type-I expressed excellent results among the three electrode shapes. The sensor developed can simultaneously measure the void fraction and the water level. 7 refs., 51 figs., 4 tabs. (Author)

  3. Generic System Verilog Universal Verification Methodology Based Reusable Verification Environment for Efficient Verification of Image Signal Processing IPS/SOCS

    Directory of Open Access Journals (Sweden)

    Abhishek Jain

    2012-12-01

    Full Text Available In this paper, we present Generic System Verilog Universal Verification Methodology based Reusable Verification Environment for efficient verification of Image Signal Processing IP’s/SoC’s. With the tight schedules on all projects it is important to have a strong verification methodology which contributes to First Silicon Success. Deploy methodologies which enforce full functional coverage and verification of corner cases through pseudo random test scenarios is required. Also, standardization of verification flow is needed. Previously, inside imaging group of ST, Specman (e/Verilog based Verification Environment forIP/Subsystem level verification and C/C++/Verilog based Directed Verification Environment for SoC Level Verification was used for Functional Verification. Different Verification Environments were used at IP level and SoC level. Different Verification/Validation Methodologies were used for SoC Verification across multiple sites. Verification teams were also looking for the ways how to catch bugs early in the design cycle? Thus, Generic System Verilog Universal Verification Methodology (UVM based Reusable Verification Environment is required to avoid the problem of having so many methodologies and provides a standard unified solution which compiles on all tools.

  4. Digital filter synthesis computer program

    Science.gov (United States)

    Moyer, R. A.; Munoz, R. M.

    1968-01-01

    Digital filter synthesis computer program expresses any continuous function of a complex variable in approximate form as a computational algorithm or difference equation. Once the difference equation has been developed, digital filtering can be performed by the program on any input data list.
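
    The same idea survives in modern tooling: a continuous transfer function can be mapped to a difference equation with the bilinear transform. A minimal sketch using SciPy (the 5 Hz lowpass prototype is an invented example):

```python
import numpy as np
from scipy import signal

fs = 100.0            # sampling rate (Hz)
wc = 2 * np.pi * 5.0  # analog cutoff: 5 Hz lowpass, H(s) = wc / (s + wc)

# Map the continuous transfer function to a difference equation
# (digital b, a coefficients) via the bilinear transform.
b_d, a_d = signal.bilinear([wc], [1.0, wc], fs=fs)

# The resulting computational algorithm: y[n] = b0*x[n] + b1*x[n-1] - a1*y[n-1]
t = np.arange(0, 1, 1 / fs)
x = np.sin(2 * np.pi * 2 * t) + np.sin(2 * np.pi * 30 * t)  # 2 Hz + 30 Hz
y = signal.lfilter(b_d, a_d, x)  # the 30 Hz component is strongly attenuated
print(b_d, a_d)
```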

  5. Clinical Verification of Homeopathy

    Directory of Open Access Journals (Sweden)

    Michel Van Wassenhoven

    2011-07-01

    Full Text Available The world is changing! This is certainly true regarding homeopathic practice and access to homeopathic medicine. Therefore our first priority at the ECH-LMHI [1] has been to produce a yearly report on the scientific framework of homeopathy. In the 2010 version a new chapter about epidemic diseases has been added, including the Leptospirosis survey of the Cuban population. A second priority has been to review the definition of homeopathic medicines with respect to the new framework generated by the official registration procedure and the WHO report. We are now working on a documented (Materia Medica and provings) list of homeopathic remedies to facilitate the registration of our remedies. The new challenges are: first of all, more good research proposals and, as such, more funding (possible through ISCHI and the Blackie Foundation, for example [2]); international acceptance of new guidelines for proving and clinical verification of homeopathic symptoms (proposals are ready for discussion); and total reconsideration of the homeopathic repertories, including the results of the clinical verification of the symptoms. The world is changing, we are part of the world, and changes are needed also for homeopathy!

  6. Development and Acceptance Testing of the Dual Wheel Mechanism for the Tunable Filter Imager Cryogenic Instrument on the JWST

    Science.gov (United States)

    Leckie, Martin; Ahmad, Zakir

    2010-01-01

    The James Webb Space Telescope (JWST) will carry four scientific instruments, one of which is the Tunable Filter Imager (TFI), an instrument within the Fine Guidance Sensor. The Dual Wheel (DW) mechanism is being designed, built and tested by COM DEV Ltd. under contract from the Canadian Space Agency. The DW mechanism includes a pupil wheel (PW) holding seven coronagraphic masks and two calibration elements, and a filter wheel (FW) holding nine blocking filters. The DW mechanism must operate at both room temperature and 35 K. Successful operation at 35 K comprises positioning each optical element with the required repeatability, several thousand times over the five-year mission. The paper discusses the results of testing geared motors and bearings at cryogenic temperature, in particular bearing retainer design with PGM-HT material, the effects of temperature gradients across bearings, and the problems associated with cooling mechanisms down to cryogenic temperatures. The results of additional bearing tests, employed to investigate an abnormally high initial torque experienced at cryogenic temperatures, are also described. The finding of these tests was that the bearing retainer and the ball/race system could be adversely affected by the large temperature change from room temperature to cryogenic temperature, and also by the temperature gradient across the bearing. The DW mechanism is now performing successfully at both room temperature and cryogenic temperature. Life testing of the mechanism is expected to be completed in the first quarter of 2010.

  7. Development of an in-line filter to prevent intrusion of NO2 toxic vapors into A/C systems

    Science.gov (United States)

    Meneghelli, Barry; Mcnulty, R. J.; Springer, Mike; Lueck, Dale E.

    1995-01-01

    The hypergolic propellant nitrogen tetroxide (N2O4 or NTO) is routinely used in spacecraft launched at Kennedy Space Center (KSC) and Cape Canaveral Air Station (CCAS). In the case of a catastrophic failure of the spacecraft, the unspent propellant would be released in the form of a toxic cloud. Inhalation of this material at downwind concentrations, which may be as high as 20 parts per million (ppm) for 30 minutes in duration, may produce irritation of the eyes, nose and respiratory tract. Studies at both KSC and CCAS have shown that indoor concentrations of N2O4 during a toxic release may range from 1 to 15 ppm, depending on the air change rate (ACR) of the particular building and on whether the air conditioning (A/C) system has been shut down or left operating. This project was initiated to assess how current A/C systems could be easily modified to prevent personnel from being exposed to toxic vapors. A sample system was constructed to test the ability of several types of filter material to capture N2O4 vapors prior to their infiltration into the A/C system. Test results are presented which compare the efficiencies of standard A/C filters, water-wash systems, and chemically impregnated filter material in removing toxic vapors from the incoming air stream.

  8. Digital filters

    CERN Document Server

    Hamming, Richard W

    1997-01-01

    Digital signals occur in an increasing number of applications: in telephone communications; in radio, television, and stereo sound systems; and in spacecraft transmissions, to name just a few. This introductory text examines digital filtering, the processes of smoothing, predicting, differentiating, integrating, and separating signals, as well as the removal of noise from a signal. The processes bear particular relevance to computer applications, one of the focuses of this book. Readers will find Hamming's analysis accessible and engaging.
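
    Two of the processes the book treats, smoothing and differentiating, reduce to short convolution kernels; a minimal illustration in Python (kernel lengths and signals chosen arbitrarily):

```python
import numpy as np

rng = np.random.default_rng(1)
t = np.linspace(0, 1, 200)
x = np.sin(2 * np.pi * 3 * t) + 0.2 * rng.standard_normal(t.size)

# Smoothing: 5-point moving average (a basic nonrecursive filter)
smooth = np.convolve(x, np.ones(5) / 5, mode="same")

# Differentiating: central-difference filter scaled by the sample spacing
dt = t[1] - t[0]
deriv = np.convolve(smooth, np.array([1.0, 0.0, -1.0]) / (2 * dt), mode="same")

print(smooth[:3], deriv[100])
```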

  9. Development of radioisotope tracer technique; development of verification method for hydraulic model using radioisotope tracer techniques in municipal wastewater treatment plants

    Energy Technology Data Exchange (ETDEWEB)

    Kim, C. W.; Kim, S. H.; Kim, J. W.; Yun, J. S.; Wo, S. B. [Pusan National University, Pusan (Korea)

    2001-04-01

    This study focuses on the development of a computational fluid dynamics (CFD) model that can be used for secondary clarifiers in wastewater treatment plants. The model could describe the internal flow characteristics and predicted results similar to those of the isotopic tracer experiment; it was therefore demonstrated that the isotopic tracer method is a powerful tool, alongside a hydrodynamic model, for understanding the internal hydraulics. Generally, the secondary clarifier can be improved by special design, by changing coagulation characteristics through the addition of coagulation chemicals, and by good management by an experienced operator. Because coagulation chemicals are expensive and experienced operators are in limited supply, improving the design is the feasible way to upgrade the secondary clarifier. Though it is very complex and difficult to model the fluid dynamics, a CFD model can correctly describe density flow, short circuiting, turbulent dispersion and settling characteristics. There are few trustworthy methods for verifying the hydrodynamic model, and it is very difficult to probe the flow experimentally in a secondary sedimentation tank because the experimental equipment disturbs the flow. The isotope tracer experiment, however, is known as a useful tool for studying the hydraulic characteristics and floc movement in the sedimentation tank, because the isotope tracer does not disturb the internal flow and provides data quickly through an on-line system. Therefore, the CFD model was developed to make the isotope tracer experiment available as a model-verification method. The predicted results of the model simulation showed the same pattern over time as the experimental on-line data, and the two were compared with each other. The model also explained the detailed flow pattern in unmonitored areas of the sedimentation tank and visualized the internal flow and concentration distribution over time using graphics software.

  11. Development of safeguard technology for plutonium inventory verification in a glove box during facility maintenance. Development of holdup monitor system (HMOS)

    Energy Technology Data Exchange (ETDEWEB)

    Nakamura, Hironobu; Hosoma, Takashi; Tanaka, Izumi [Japan Nuclear Cycle Development Inst., Tokai, Ibaraki (Japan). Tokai Works

    2002-09-01

    In JNC's MOX facilities, Pu and U residue in the glove box (called 'holdup') has been measured and verified by an inspector, by setting large neutron detectors beside the glove box (GB). However, it is difficult to apply such methods during plant maintenance, because a contamination control area is set up around the GB. JNC developed the Holdup Monitor System (HMOS) to enable sensitive monitoring instead of measurement. It has ³He detectors above and below the GB and counts the total neutron rate. The detectors are shielded from background neutrons, and the sensitivity is 30 g (σ) for Pu in the middle of the GB. From 1998, this system was used for inspection for about one year. As a result, a linear increase in count rate, fluctuations (1-2%) due to humidity in the room, and sensitive changes in count rate caused by movement of Pu material in the GB were observed. (author)

  12. Vertically Coupled Microring Resonator Filter: Versatile Building Block for VLSI Filter Circuits

    Institute of Scientific and Technical Information of China (English)

    Yasuo Kokubun

    2003-01-01

    In this review, the recent progress in the development of vertically coupled micro-ring resonator filters is summarized and the potential applications of the filters leading to the development of VLSI photonics are described.

  14. An Efficient Automatic Attendance System using Fingerprint Verification Technique

    OpenAIRE

    Chitresh Saraswat,; Amit Kumar

    2010-01-01

    The main aim of this paper is to develop an accurate, fast and very efficient automatic attendance system using a fingerprint verification technique. We propose a system in which fingerprint verification is done using minutiae extraction, and which automates the whole process of taking attendance; taking attendance manually is laborious and troublesome work that wastes a lot of time, and managing and maintaining the records over a period of time is also a burdensome task.

  15. Verification and validation of diagnostic laboratory tests in clinical virology.

    Science.gov (United States)

    Rabenau, Holger F; Kessler, Harald H; Kortenbusch, Marhild; Steinhorst, Andreas; Raggam, Reinhard B; Berger, Annemarie

    2007-10-01

    This review summarizes major issues of verification and validation procedures and describes minimum requirements for verification and validation of diagnostic assays in clinical virology including instructions for CE/IVD-labeled as well as for self-developed ("home-brewed") tests or test systems. It covers techniques useful for detection of virus specific antibodies, for detection of viral antigens, for detection of viral nucleic acids, and for isolation of viruses on cell cultures in the routine virology laboratory.

  16. Efficient and Secure Fingerprint Verification for Embedded Devices

    OpenAIRE

    Sakiyama Kazuo; Verbauwhede Ingrid; Yang Shenglin

    2006-01-01

    This paper describes a secure and memory-efficient embedded fingerprint verification system. It shows how a fingerprint verification module originally developed to run on a workstation can be transformed and optimized in a systematic way to run in real time on an embedded device with limited memory and computation power. A complete fingerprint recognition module is a complex application that requires on the order of 1000 M unoptimized floating-point instruction cycles. The goal is to run both t...

  17. Formal Verification, Engineering and Business Value

    Directory of Open Access Journals (Sweden)

    Ralf Huuck

    2012-12-01

    Full Text Available How to apply automated verification technology such as model checking and static program analysis to millions of lines of embedded C/C++ code? How to package this technology in a way that it can be used by software developers and engineers who might have no background in formal verification? And how to convince business managers to actually pay for such software? This work addresses a number of those questions. Based on our own experience of developing and distributing the Goanna source code analyzer for detecting software bugs and security vulnerabilities in C/C++ code, we explain the underlying technology of model checking, static analysis and SMT solving, and the steps involved in creating industrial-proof tools.

  18. Sensor-fusion-based biometric identity verification

    Energy Technology Data Exchange (ETDEWEB)

    Carlson, J.J.; Bouchard, A.M.; Osbourn, G.C.; Martinez, R.F.; Bartholomew, J.W. [Sandia National Labs., Albuquerque, NM (United States); Jordan, J.B.; Flachs, G.M.; Bao, Z.; Zhu, L. [New Mexico State Univ., Las Cruces, NM (United States). Electronic Vision Research Lab.

    1998-02-01

    Future generation automated human biometric identification and verification will require multiple features/sensors together with internal and external information sources to achieve high performance, accuracy, and reliability in uncontrolled environments. The primary objective of the proposed research is to develop a theoretical and practical basis for identifying and verifying people using standoff biometric features that can be obtained with minimal inconvenience during the verification process. The basic problem involves selecting sensors and discovering features that provide sufficient information to reliably verify a person's identity under the uncertainties caused by measurement errors and tactics of uncooperative subjects. A system was developed for discovering hand, face, ear, and voice features and fusing them to verify the identity of people. The system obtains its robustness and reliability by fusing many coarse and easily measured features into a near-minimal probability-of-error decision algorithm.

  19. Design and Implementation of Wave Digital Filters

    Directory of Open Access Journals (Sweden)

    V. Davidek

    2001-09-01

    Full Text Available One possibility for Wave Digital Filter (WDF) design is to use classical LC-filter theory. The aim of this paper is to demonstrate the design of a WDF from an LC filter and the implementation of the WDF on a fixed-point digital signal processor. The theory of wave digital filters has been developed using the classical scattering-parameter theory. The theory of ladder filters is well known, and so our present problem reduces to how to replace the L and C elements of the filters by adaptors, delay elements, adders and multipliers.

  20. Indirect Kalman Filter in Mobile Robot Application

    Directory of Open Access Journals (Sweden)

    Surachai Panich

    2010-01-01

    Full Text Available Problem statement: The most successful applications of Kalman filtering are those that linearize about some nominal trajectory in state space that does not depend on the measurement data; the resulting filter is usually referred to simply as a linearized Kalman filter. Approach: This study introduces an indirect Kalman filter to estimate a robot's position. A differential encoder system integrated with an accelerometer was experimentally tested on a square path. Results: The experimental results confirmed that the indirect Kalman filter improves the accuracy and confidence of the position estimate. Conclusion: In summary, we conclude that the indirect Kalman filter has good potential to reduce the error of the measurement data.
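
    A minimal one-dimensional sketch of the indirect (error-state) formulation, in which the filter estimates the error of the encoder dead-reckoned position rather than the position itself; the matrices and noise levels are invented, and the aiding measurement z stands in for whatever external fix corrects the encoder drift:

```python
import numpy as np

dt = 0.1
F = np.array([[1, dt], [0, 1]])     # error-state transition (pos/vel error)
H = np.array([[1.0, 0.0]])          # we observe the position error only
Q = np.diag([1e-4, 1e-3])           # process noise (encoder drift growth)
R = np.array([[1e-2]])              # noise of the aiding measurement

x = np.zeros((2, 1))                # estimated error state
P = np.eye(2)

def indirect_kf_step(x, P, z):
    """One predict/update cycle on the *error* between the encoder
    dead-reckoned position and an aiding position measurement z."""
    # predict
    x = F @ x
    P = F @ P @ F.T + Q
    # update
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ (z - H @ x)
    P = (np.eye(2) - K @ H) @ P
    return x, P

# usage: z is (encoder position - aiding position); the corrected robot
# position is then (encoder position - estimated position error)
for z_meas in [0.05, 0.07, 0.06]:
    x, P = indirect_kf_step(x, P, np.array([[z_meas]]))
print(x.ravel())
```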

  1. Verification tests for CANDU advanced fuel

    Energy Technology Data Exchange (ETDEWEB)

    Chung, Chang Hwan; Chang, S.K.; Hong, S.D. [and others

    1997-07-01

    For the development of a CANDU advanced fuel, the CANFLEX-NU fuel bundles were tested under reactor operating conditions at the CANDU-Hot test loop. This report describes test results and test methods in the performance verification tests for the CANFLEX-NU bundle design. The main items described in the report are as follows. - Fuel bundle cross-flow test - Endurance fretting/vibration test - Freon CHF test - Production of technical document. (author). 25 refs., 45 tabs., 46 figs.

  2. Verification of concurrent programs with Chalice

    OpenAIRE

    K. Rustan M. Leino; Müller, Peter; Smans, Jan

    2009-01-01

    A program verifier is a tool that allows developers to prove that their code satisfies its specification for every possible input and every thread schedule. These lecture notes describe a verifier for concurrent programs called Chalice. Chalice's verification methodology centers around permissions and permission transfer. In particular, a memory location may be accessed by a thread only if that thread has permission to do so. Proper use of permissions allows Chalice to deduce upper bound...

  3. Simulation environment based on the Universal Verification Methodology

    Science.gov (United States)

    Fiergolski, A.

    2017-01-01

    Universal Verification Methodology (UVM) is a standardized approach to verifying integrated circuit designs, targeting Coverage-Driven Verification (CDV). It combines automatic test generation, self-checking testbenches, and coverage metrics to indicate progress in the design verification. The flow of CDV differs from the traditional directed-testing approach. With CDV, a testbench developer starts with a structured plan that sets the verification goals. Those goals are then targeted by the developed testbench, which generates legal stimuli and sends them to a device under test (DUT). Progress is measured by coverage monitors added to the simulation environment; in this way, the non-exercised functionality can be identified. Moreover, additional scoreboards indicate undesired DUT behaviour. Such verification environments were developed for three recent ASIC and FPGA projects which have successfully implemented the new work-flow: (1) the CLICpix2 65 nm CMOS hybrid pixel readout ASIC design; (2) the C3PD 180 nm HV-CMOS active sensor ASIC design; (3) the FPGA-based DAQ system of the CLICpix chip. This paper, based on the experience from the above projects, briefly introduces UVM and presents a set of tips and advice applicable at different stages of the verification process cycle.
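
    The CDV loop itself (constrained-random stimulus, a scoreboard against a reference model, and coverage bins that define "done") can be illustrated without an HDL simulator; below is a toy software stand-in in Python, not UVM code, with an invented saturating adder as the DUT:

```python
import random

def dut_saturating_add(a, b):          # stand-in "device under test"
    return min(a + b, 255)             # 8-bit saturating adder

def ref_model(a, b):                   # golden reference for the scoreboard
    return min(a + b, 255)

coverage = set()                       # functional coverage bins hit so far
GOALS = {"no_sat", "sat", "zero_operand", "max_operand"}

random.seed(2)
while coverage < GOALS:                # run until all verification goals hit
    a, b = random.randint(0, 255), random.randint(0, 255)  # random stimulus
    got, exp = dut_saturating_add(a, b), ref_model(a, b)
    assert got == exp, f"scoreboard mismatch: {a}+{b} -> {got}, expected {exp}"
    # coverage monitors: record which functional bins this transaction hit
    coverage.add("sat" if a + b > 255 else "no_sat")
    if 0 in (a, b):
        coverage.add("zero_operand")
    if 255 in (a, b):
        coverage.add("max_operand")

print("all coverage goals hit:", sorted(coverage))
```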

  4. Shift Verification and Validation

    Energy Technology Data Exchange (ETDEWEB)

    Pandya, Tara M. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Evans, Thomas M. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Davidson, Gregory G [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Johnson, Seth R. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Godfrey, Andrew T. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2016-09-07

    This documentation outlines the verification and validation of Shift for the Consortium for Advanced Simulation of Light Water Reactors (CASL). Five main types of problems were used for validation: small criticality benchmark problems; full-core reactor benchmarks for light water reactors; fixed-source coupled neutron-photon dosimetry benchmarks; depletion/burnup benchmarks; and full-core reactor performance benchmarks. We compared Shift results to measured data and other simulated Monte Carlo radiation transport code results, and found very good agreement in a variety of comparison measures. These include prediction of critical eigenvalue, radial and axial pin power distributions, rod worth, leakage spectra, and nuclide inventories over a burn cycle. Based on this validation of Shift, we are confident in Shift to provide reference results for CASL benchmarking.

  6. 40 CFR 1065.307 - Linearity verification.

    Science.gov (United States)

    2010-07-01

    ... linearity verification generally consists of introducing a series of at least 10 reference values to a... reference values of the linearity verification. For pressure, temperature, dewpoint, and GC-ECD linearity verifications, we recommend at least three reference values. For all other linearity verifications select...
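
    The computation behind such a linearity verification is a least-squares regression over the reference points; the sketch below shows the mechanics only, with invented readings, and does not reproduce the actual tolerances of 40 CFR 1065.307:

```python
import numpy as np

# Ten reference values spanning the range, per the 10-point recommendation,
# and the corresponding instrument readings (readings are invented).
ref = np.linspace(10.0, 100.0, 10)
meas = ref * 1.002 + 0.15 + np.random.default_rng(3).normal(0, 0.05, 10)

# Least-squares fit: a linear instrument should give slope ~1, intercept ~0,
# and a standard error of the estimate (SEE) within the allowed tolerance.
slope, intercept = np.polyfit(ref, meas, 1)
see = np.sqrt(np.mean((meas - (slope * ref + intercept)) ** 2))
r2 = np.corrcoef(ref, meas)[0, 1] ** 2

print(f"slope={slope:.4f} intercept={intercept:.3f} SEE={see:.4f} r2={r2:.6f}")
```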

  7. A Hybrid On-line Verification Method of Relay Setting

    Science.gov (United States)

    Gao, Wangyuan; Chen, Qing; Si, Ji; Huang, Xin

    2017-05-01

    Along with the rapid development of the power industry, grid structures are becoming more sophisticated. The validity and rationality of protective relaying are vital to the security of power systems, so to increase that security it is essential to verify the setting values of relays online. Traditional verification methods mainly include comparison of the protection range and comparison of the calculated setting value. To realize on-line verification, verification speed is the key. The result of comparing protection ranges is accurate, but the computational burden is heavy and the verification slow; comparing calculated setting values is much faster, but the result is conservative and inaccurate. Taking overcurrent protection as an example, this paper analyses the advantages and disadvantages of the two traditional methods above and proposes a hybrid on-line verification method which synthesizes the advantages of both. This hybrid method can meet the requirements of accurate on-line verification.
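
    For context, the "calculated setting value" style of check reduces to a few inequalities; the sketch below uses generic textbook overcurrent criteria with illustrative reliability and sensitivity coefficients, not the hybrid algorithm of the paper:

```python
def verify_overcurrent_setting(i_set, i_load_max, i_fault_min,
                               k_rel=1.2, k_sens=1.5):
    """Generic overcurrent-relay setting checks (textbook criteria):
    the pickup must ride through maximum load current but still see the
    smallest end-of-line fault with a sensitivity margin."""
    no_false_trip = i_set >= k_rel * i_load_max      # security against load
    sensitive     = i_fault_min >= k_sens * i_set    # protection-range check
    return no_false_trip and sensitive

# usage: 600 A pickup, 450 A maximum load, 1200 A minimum fault current
print(verify_overcurrent_setting(600.0, 450.0, 1200.0))  # True
```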

  8. International Space Station Requirement Verification for Commercial Visiting Vehicles

    Science.gov (United States)

    Garguilo, Dan

    2017-01-01

    The COTS program demonstrated NASA could rely on commercial providers for safe, reliable, and cost-effective cargo delivery to ISS. The ISS Program has developed a streamlined process to safely integrate commercial visiting vehicles and ensure requirements are met: levy a minimum requirement set (down from 1000s to 100s) focusing on the ISS interface and safety, reducing the level of NASA oversight/insight and the burden on the commercial partner. Partners provide a detailed verification and validation plan documenting how they will show they have met NASA requirements. NASA conducts process sampling to ensure that the established verification process is being followed. NASA participates in joint verification events and analysis for requirements that both parties must verify. Verification compliance is approved by NASA and launch readiness certified at mission readiness reviews.

  9. Formal verification of Simulink/Stateflow diagrams a deductive approach

    CERN Document Server

    Zhan, Naijun; Zhao, Hengjun

    2017-01-01

    This book presents a state-of-the-art technique for formal verification of continuous-time Simulink/Stateflow diagrams, featuring an expressive hybrid system modelling language, a powerful specification logic and deduction-based verification approach, and some impressive, realistic case studies. Readers will learn the HCSP/HHL-based deductive method and the use of corresponding tools for formal verification of Simulink/Stateflow diagrams. They will also gain some basic ideas about fundamental elements of formal methods such as formal syntax and semantics, and especially the common techniques applied in formal modelling and verification of hybrid systems. By investigating the successful case studies, readers will realize how to apply the pure theory and techniques to real applications, and hopefully will be inspired to start to use the proposed approach, or even develop their own formal methods in their future work.

  10. Ageing studies for the ATLAS MDT muonchambers and development of a gas filter to prevent drift tube ageing

    Energy Technology Data Exchange (ETDEWEB)

    Koenig, S.

    2008-01-15

    The muon spectrometer of the ATLAS detector, which is currently assembled at the LHC accelerator at CERN, uses drift tubes as basic detection elements over most of the solid angle. The performance of these monitored drift tubes (MDTs), in particular their spatial resolution of 80 µm, determines the precision of the spectrometer. If ageing effects occur, the precision of the drift tubes will be degraded. Hence ageing effects have to be minimized or avoided altogether if possible. Even with a gas mixture of Ar:CO₂ = 93:7, which was selected for its good ageing properties, ageing effects were observed in test systems. They were caused by small amounts of impurities, in particular volatile silicon compounds. Systematic studies revealed the impurity levels required to deteriorate the drift tubes to be well below 1 ppm. Many components of the ATLAS MDT gas system are supplied by industry. In a newly designed ageing experiment in Freiburg these components were validated for their use in ATLAS. With a fully assembled ATLAS gas distribution rack as test component, ageing effects were observed. It was therefore decided to install gas filters in the gas distribution lines to remove volatile silicon compounds efficiently from the gas mixture. Finally a filter was designed that can adsorb up to 5.5 g of volatile silicon compounds, thereby reducing the impurities in the outlet gas mixture to less than 30 ppb. (orig.)

  11. Speaker-dependent Dictionary-based Speech Enhancement for Text-Dependent Speaker Verification

    DEFF Research Database (Denmark)

    Thomsen, Nicolai Bæk; Thomsen, Dennis Alexander Lehmann; Tan, Zheng-Hua

    2016-01-01

    The problem of text-dependent speaker verification under noisy conditions is becoming ever more relevant, due to increased usage for authentication in real-world applications. Classical methods for noise reduction such as spectral subtraction and Wiener filtering introduce distortion and do...

  12. Extensible Technology-Agnostic Runtime Verification

    Directory of Open Access Journals (Sweden)

    Christian Colombo

    2013-02-01

    Full Text Available With numerous specialised technologies available to industry, it has become increasingly frequent for computer systems to be composed of heterogeneous components built over, and using, different technologies and languages. While this enables developers to use the appropriate technologies for specific contexts, it becomes more challenging to ensure the correctness of the overall system. In this paper we propose a framework to enable extensible, technology-agnostic runtime verification, and we present an extension of polyLarva, a runtime-verification tool able to handle the monitoring of heterogeneous-component systems. The approach is then applied to a case study of a component-based artefact using different technologies, namely C and Java.

  13. Spatial Verification Using Wavelet Transforms: A Review

    CERN Document Server

    Weniger, Michael; Friederichs, Petra

    2016-01-01

    Due to the emergence of new high resolution numerical weather prediction (NWP) models and the availability of new or more reliable remote sensing data, the importance of efficient spatial verification techniques is growing. Wavelet transforms offer an effective framework to decompose spatial data into separate (and possibly orthogonal) scales and directions. Most wavelet based spatial verification techniques have been developed or refined in the last decade and concentrate on assessing forecast performance (i.e. forecast skill or forecast error) on distinct physical scales. Particularly during the last five years, a significant growth in meteorological applications could be observed. However, a comparison with other scientific fields such as feature detection, image fusion, texture analysis, or facial and biometric recognition, shows that there is still a considerable, currently unused potential to derive useful diagnostic information. In order to tap the full potential of wavelet analysis, we revise the state-of-the-art in one- and two-dimensional wavelet analysis and its application with emphasis on spatial verification.

  14. Packaged low-level waste verification system

    Energy Technology Data Exchange (ETDEWEB)

    Tuite, K.; Winberg, M.R.; McIsaac, C.V. [Idaho National Engineering Lab., Idaho Falls, ID (United States)

    1995-12-31

    The Department of Energy through the National Low-Level Waste Management Program and WMG Inc. have entered into a joint development effort to design, build, and demonstrate the Packaged Low-Level Waste Verification System. Currently, states and low-level radioactive waste disposal site operators have no method to independently verify the radionuclide content of packaged low-level waste that arrives at disposal sites for disposition. At this time, the disposal site relies on the low-level waste generator shipping manifests and accompanying records to ensure that low-level waste received meets the site's waste acceptance criteria. The subject invention provides the equipment, software, and methods to enable the independent verification of low-level waste shipping records to ensure that the site's waste acceptance criteria are being met. The objective of the prototype system is to demonstrate a mobile system capable of independently verifying the content of packaged low-level waste.

  15. Formal verification of industrial control systems

    CERN Document Server

    CERN. Geneva

    2015-01-01

    Verification of critical software is a high priority but a challenging task for industrial control systems. For many kinds of problems, testing is not an efficient method. Formal methods, such as model checking, appear to be an appropriate complementary method. However, it is not yet common to use model checking in industry, as this method typically needs formal-methods expertise and huge computing power. In the EN-ICE-PLC section, we are working on a methodology [1] and a tool (PLCverif [2]) to overcome these challenges and to integrate formal verification into the development process of our PLC-based control systems. [1]: http://cern.ch/project-plc-formalmethods [2]: http://cern.ch/plcverif

  16. Safety Verification for Probabilistic Hybrid Systems

    DEFF Research Database (Denmark)

    Zhang, Lijun; She, Zhikun; Ratschan, Stefan;

    2010-01-01

    The interplay of random phenomena and continuous real-time control deserves increased attention, for instance in wireless sensing and control applications. Safety verification for such systems thus needs to consider probabilistic variations of systems with hybrid dynamics. In safety verification of classical hybrid systems we are interested in whether a certain set of unsafe system states can be reached from a set of initial states. In the probabilistic setting, we may ask instead whether the probability of reaching unsafe states is below some given threshold. In this paper, we consider probabilistic hybrid systems and develop a general abstraction technique for verifying probabilistic safety problems. This gives rise to the first mechanisable technique that can, in practice, formally verify safety properties of non-trivial continuous-time stochastic hybrid systems - without resorting to point...

  17. Safety Verification for Probabilistic Hybrid Systems

    DEFF Research Database (Denmark)

    Zhang, Lijun; She, Zhikun; Ratschan, Stefan;

    2012-01-01

    The interplay of random phenomena and continuous dynamics deserves increased attention, especially in the context of wireless sensing and control applications. Safety verification for such systems thus needs to consider probabilistic variants of systems with hybrid dynamics. In safety verification of classical hybrid systems, we are interested in whether a certain set of unsafe system states can be reached from a set of initial states. In the probabilistic setting, we may ask instead whether the probability of reaching unsafe states is below some given threshold. In this paper, we consider probabilistic hybrid systems and develop a general abstraction technique for verifying probabilistic safety problems. This gives rise to the first mechanisable technique that can, in practice, formally verify safety properties of non-trivial continuous-time stochastic hybrid systems. Moreover, being based...

  18. Identification Filtering with fuzzy estimations

    Directory of Open Access Journals (Sweden)

    J.J Medel J

    2012-10-01

    Full Text Available A digital identification filter interacts with the output signal of a reference model, treated as the output of a black-box system. The identification technique commonly needs the transition and gain matrices. In both estimation cases, a mean-square criterion is used, the minimum output error giving the best filtering estimate. The identification mechanism gains adaptive properties by including fuzzy-logic strategies that affect, in a probabilistic sense, the evolution of the identification filter. The fuzzy estimation filter describes the transition and gain matrices in two forms, applying actions that affect the identification structure. Basically, the adaptive criterion conforms the set of inference mechanisms and the knowledge and rule bases, selecting the optimal coefficients in distributed form. This paper describes the fuzzy strategies applied to the Kalman filter transition function and gain matrices. The simulation results were developed using Matlab.

  19. Convergent Filter Bases

    OpenAIRE

    Coghetto Roland

    2015-01-01

    We are inspired by the work of Henri Cartan [16], Bourbaki [10] (TG. I Filtres) and Claude Wagschal [34]. We define the base of filter, image filter, convergent filter bases, limit filter and the filter base of tails (fr: filtre des sections).

  1. Matched Spectral Filter Imager Project

    Data.gov (United States)

    National Aeronautics and Space Administration — OPTRA proposes the development of an imaging spectrometer for greenhouse gas and volcanic gas imaging based on matched spectral filtering and compressive imaging....

  2. Integrated Spatial Filter Array Project

    Data.gov (United States)

    National Aeronautics and Space Administration — To address the NASA Earth Science Division need for spatial filter arrays for amplitude and wavefront control, Luminit proposes to develop a novel Integrated Spatial...

  3. Ceramic HEPA Filter Program

    Energy Technology Data Exchange (ETDEWEB)

    Mitchell, M A; Bergman, W; Haslam, J; Brown, E P; Sawyer, S; Beaulieu, R; Althouse, P; Meike, A

    2012-04-30

    Potential benefits of ceramic filters in nuclear facilities: (1) Short-term benefit for DOE, NRC, and industry - (a) the CalPoly HTTU provides a unique testing capability to answer questions for DOE: high-temperature testing of materials, components, and filters; (b) several DNFSB correspondences and presentations by DNFSB members have highlighted the need for HEPA filter R&D - DNFSB Recommendation 2009-2 highlighted a nuclear facility response to an evaluation-basis earthquake followed by a fire (aka shake-n-bake), and CalPoly has the capability for a shake-n-bake test; (2) Intermediate-term benefit for DOE and industry - (a) filtration for specialty applications, e.g., explosive applications at Nevada; (b) spin-off technologies applicable to other commercial industries; and (3) Long-term benefit for DOE, NRC, and industry - (a) across industry, a strong desire for a better-performing filter; (b) an engineering solution to the safety problem will improve facility safety and decrease dependence on associated support systems; (c) large potential life-cycle cost savings; and (d) it facilitates development and deployment of LLNL process innovations to allow continuous ventilation system operation during a fire.

  4. Spot- Zombie Filtering System

    Directory of Open Access Journals (Sweden)

    Arathy Rajagopal

    2014-01-01

    Full Text Available A major security challenge on the Internet is the existence of a large number of compromised machines. Such machines have been increasingly used to launch various security attacks, including spamming and spreading malware, DDoS, and identity theft. These compromised machines are called "zombies". In general, e-mail applications and providers use spam filters to filter out spam messages; spam filtering is a technique for discriminating genuine messages from spam messages. Attackers send spam messages to targeted machines while evading these filters, which increases the false positives and false negatives. We develop an effective spam zombie detection system named SPOT by monitoring the outgoing messages of a network. SPOT focuses on the number of outgoing messages that are originated or forwarded by each computer on a network to identify the presence of zombies. SPOT is designed based on a powerful statistical tool called the Sequential Probability Ratio Test, which has bounded false positive and false negative error rates.
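
    The SPRT at the core of SPOT accumulates a log-likelihood ratio over the stream of per-message spam/non-spam verdicts and stops as soon as either threshold is crossed; a minimal sketch with invented spam probabilities and error bounds:

```python
import math

def sprt(observations, theta0=0.2, theta1=0.8, alpha=0.01, beta=0.01):
    """Wald's Sequential Probability Ratio Test over a stream of
    spam(1)/ham(0) verdicts for one machine's outgoing messages."""
    A = math.log((1 - beta) / alpha)    # accept H1 (compromised)
    B = math.log(beta / (1 - alpha))    # accept H0 (normal)
    llr = 0.0
    for n, x in enumerate(observations, 1):
        if x:  # message judged spam
            llr += math.log(theta1 / theta0)
        else:  # message judged non-spam
            llr += math.log((1 - theta1) / (1 - theta0))
        if llr >= A:
            return "zombie", n
        if llr <= B:
            return "normal", n
    return "undecided", len(observations)

print(sprt([1, 1, 0, 1, 1, 1]))   # mostly spam -> flagged quickly
print(sprt([0, 0, 1, 0, 0, 0]))   # mostly ham  -> cleared quickly
```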

  6. The MODUS Approach to Formal Verification

    Directory of Open Access Journals (Sweden)

    Brewka Lukasz

    2014-03-01

    Full Text Available Background: Software reliability is of great importance for the development of embedded systems that are often used in applications with safety requirements. Since the life cycle of embedded products is becoming shorter, productivity and quality are simultaneously required in the process of providing competitive products. Objectives: In relation to this, the MODUS (Method and supporting toolset advancing embedded systems quality) project aims to provide small and medium-sized businesses with ways to improve their position in the embedded market through a pragmatic and viable solution. Methods/Approach: This paper describes the MODUS project with a focus on the technical methodologies that can assist formal verification and formal model checking. Results: Based on automated analysis of the characteristics of the system, and by controlling the choice among the existing open-source model verification engines, the toolset performs model verification, producing the inputs to be fed into these engines. Conclusions: The MODUS approach is aligned with present market needs; familiarity with tools, ease of use and compatibility/interoperability remain among the most important criteria when selecting the development environment for a project.

  7. Spatial verification using wavelet transforms: a review

    Science.gov (United States)

    Weniger, Michael; Kapp, Florian; Friederichs, Petra

    2017-01-01

    Due to the emergence of new high resolution numerical weather prediction (NWP) models and the availability of new or more reliable remote sensing data, the importance of efficient spatial verification techniques is growing. Wavelet transforms offer an effective framework to decompose spatial data into separate (and possibly orthogonal) scales and directions. Most wavelet based spatial verification techniques have been developed or refined in the last decade and concentrate on assessing forecast performance (i.e. forecast skill or forecast error) on distinct physical scales. Particularly during the last five years, a significant growth in meteorological applications could be observed. However, a comparison with other scientific fields such as feature detection, image fusion, texture analysis, or facial and biometric recognition, shows that there is still a considerable, currently unused potential to derive useful diagnostic information. In order to tap the full potential of wavelet analysis, we revise the state-of-the-art in one- and two-dimensional wavelet analysis and its application with emphasis on spatial verification. We further use a technique developed for texture analysis in the context of high-resolution quantitative precipitation forecasts, which is able to assess structural characteristics of the precipitation fields and allows efficient clustering of ensemble data.
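
    As a small illustration of the scale separation underlying these techniques, the sketch below decomposes a 2-D field with PyWavelets (assumed available as pywt) and compares forecast and observation energy per scale; the fields are random stand-ins:

```python
import numpy as np
import pywt  # PyWavelets

rng = np.random.default_rng(4)
forecast = rng.random((64, 64))
observed = rng.random((64, 64))

def scale_energies(field, wavelet="haar", level=3):
    """Decompose a 2-D field and return the energy on each scale,
    coarsest first -- the per-scale signal a verification score can use."""
    coeffs = pywt.wavedec2(field, wavelet, level=level)
    energies = [np.sum(coeffs[0] ** 2)]                 # approximation part
    for detail in coeffs[1:]:                           # (cH, cV, cD) per level
        energies.append(sum(np.sum(d ** 2) for d in detail))
    return np.array(energies)

# compare forecast and observation scale by scale
print(scale_energies(forecast) - scale_energies(observed))
```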

  8. The Development of a Multi-dimensional Workbench for Solar Radiometer Verification

    Institute of Scientific and Technical Information of China (English)

    李志强; 苏拾; 王凌云; 张国玉; 陈曦

    2013-01-01

    To meet the verification needs of solar radiation measuring instruments, a multi-dimensional workbench with four degrees of freedom is proposed. Working in conjunction with a solar simulator, it can verify the parameter errors of two instruments under test simultaneously. Based on a study of the solar altitude and azimuth angles, the design of the table's rotation, lifting, pitching and rotating-arm mechanisms is elaborated, and the table's control system is described. Test results show that the rotation control errors of the multi-dimensional table at all stages are less than 0.05°, and that the level control of the tabletop, composed of an inclinometer, optical encoder and stepper motor, has an error within ±10′. These errors meet the design requirements.
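
    The solar altitude and azimuth angles that drive such a workbench follow from standard textbook formulas (declination via the Cooper approximation); a sketch, with an arbitrary mid-latitude site as the example:

```python
import math

def solar_position(lat_deg, day_of_year, solar_hour):
    """Solar altitude and azimuth from textbook formulas; declination by
    the Cooper approximation, azimuth measured clockwise from north."""
    phi = math.radians(lat_deg)
    delta = math.radians(
        23.45 * math.sin(math.radians(360.0 * (284 + day_of_year) / 365.0)))
    H = math.radians(15.0 * (solar_hour - 12.0))        # hour angle
    sin_h = (math.sin(phi) * math.sin(delta)
             + math.cos(phi) * math.cos(delta) * math.cos(H))
    h = math.asin(sin_h)                                # altitude
    cos_A = ((math.sin(delta) - sin_h * math.sin(phi))
             / (math.cos(h) * math.cos(phi)))
    A = math.degrees(math.acos(max(-1.0, min(1.0, cos_A))))
    if solar_hour > 12.0:                               # afternoon: sun in the west
        A = 360.0 - A
    return math.degrees(h), A

# e.g. latitude 43.8 N, day 172 (late June), 10:00 solar time
print(solar_position(43.8, 172, 10.0))   # roughly (58 deg, 120 deg)
```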

  9. Generic interpreters and microprocessor verification

    Science.gov (United States)

    Windley, Phillip J.

    1990-01-01

    The following topics are covered in viewgraph form: (1) generic interpreters; (2) Viper microprocessors; (3) microprocessor verification; (4) determining correctness; (5) hierarchical decomposition; (6) interpreter theory; (7) AVM-1; (8) phase-level specification; and future work.

  10. Numident Online Verification Utility (NOVU)

    Data.gov (United States)

    Social Security Administration — NOVU is a mainframe application that accesses the NUMIDENT to perform real-time SSN verifications. This program is called by other SSA online programs that serve as...

  11. An application of operational modal analysis in modal filtering

    Energy Technology Data Exchange (ETDEWEB)

    Kurowski, Piotr; Mendrok, Krzysztof; Uhl, Tadeusz, E-mail: mendrok@agh.edu.pl [AGH University of Science and Technology in Krakow, Al. Mickiewicza 30, 30-059, Krakow (Poland)

    2011-07-19

    Modal filtration in the field of damage detection has many advantages, including its autonomous operation (without the interaction of qualified staff), low computational cost and low sensitivity to changes in external conditions. However, the main drawback of this group of damage detection methods is its limited applicability to operational data. In this paper a method of modal filter formulation from in-operation data is described. The basis for this approach is FRF synthesis using knowledge of the operational modal model. For that purpose a method of operational mode shape scaling, based on the measurement of several FRFs of the object, is described. The method is then applied to the construction of modal filters and to modal filtration. Additionally, the study presents verification of the method using data obtained from simulation and laboratory experiments. Verification consisted of comparing the results of modal filtering based on classical experimental modal analysis with the results of the approach proposed in this work.
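
    The FRF synthesis step rests on the usual modal superposition formula, H_jk(w) = sum_r phi_j^(r) phi_k^(r) / (w_r^2 - w^2 + 2j zeta_r w_r w); a minimal sketch assuming mass-normalised (i.e., already scaled) mode shapes, with an invented two-mode model:

```python
import numpy as np

def synthesize_frf(omega, omegas_r, zetas_r, phi_j, phi_k):
    """Receptance FRF H_jk(w) from a scaled modal model:
    H_jk = sum_r phi_j[r]*phi_k[r] / (w_r^2 - w^2 + 2j*zeta_r*w_r*w),
    assuming mass-normalised mode shapes."""
    H = np.zeros_like(omega, dtype=complex)
    for w_r, z_r, pj, pk in zip(omegas_r, zetas_r, phi_j, phi_k):
        H += pj * pk / (w_r**2 - omega**2 + 2j * z_r * w_r * omega)
    return H

# toy 2-mode model (all values invented for illustration)
omega = np.linspace(1, 200, 2000)
H = synthesize_frf(omega,
                   omegas_r=[50.0, 120.0], zetas_r=[0.02, 0.03],
                   phi_j=[1.0, 0.8], phi_k=[0.9, -0.7])
print(np.abs(H).max())
```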

  13. Towards the formal verification of the requirements and design of a processor interface unit

    Science.gov (United States)

    Fura, David A.; Windley, Phillip J.; Cohen, Gerald C.

    1993-01-01

    The formal verification of the design and partial requirements for a Processor Interface Unit (PIU) using the Higher Order Logic (HOL) theorem-proving system is described. The processor interface unit is a single-chip subsystem within a fault-tolerant embedded system under development within the Boeing Defense and Space Group. It provides the opportunity to investigate the specification and verification of a real-world subsystem within a commercially developed fault-tolerant computer. An overview of the PIU verification effort is given. The actual HOL listings from the verification effort are documented in a companion NASA contractor report entitled 'Towards the Formal Verification of the Requirements and Design of a Processor Interface Unit - HOL Listings', which includes the general-purpose HOL theories and definitions that support the PIU verification as well as the tactics used in the proofs.

  14. Development of hybrid type pneumatic vibration isolation table by piezo-stack actuator and filtered-X LMS algorithm

    Science.gov (United States)

    Shin, Yun-ho; Jang, Dong-doo; Moon, Seok-jun; Jung, Hyung-Jo; Moon, Yeong-jong; Song, Chang-kyu

    2011-04-01

    Recently, vibration requirements have become stricter, as precision equipment needs an improved vibration environment to realize its full performance. Though passive pneumatic vibration isolation tables are frequently used to satisfy these rigorous vibration requirements, a specific vibration problem remains: continuous sinusoidal or periodic vibration induced by the rotor system of other precision equipment, a thermo-hygrostat or a ventilation system. In this research, a procedure for applying the filtered-X LMS algorithm to a pneumatic vibration isolation table with piezo-stack actuators is proposed to enhance the isolation performance for continuous sinusoidal or periodic vibration. In addition, experimental results showing the isolation performance of the proposed system are presented, together with the isolation performance of the passive pneumatic isolation table.
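
    A minimal single-channel filtered-X LMS loop, in which the reference is filtered through a model of the secondary path before driving the weight update; the path coefficients and step size below are invented, and the secondary-path model is taken as perfect:

```python
import numpy as np

N = 4000
x = np.sin(2 * np.pi * 0.05 * np.arange(N))   # reference: periodic disturbance
P = np.array([0.8, 0.3, -0.1])                # primary path (invented)
S = np.array([0.6, 0.2])                      # secondary path (invented)
S_hat = S.copy()                              # assumed-perfect path model

d = np.convolve(x, P)[:N]                     # disturbance at the error sensor
L, mu = 8, 0.01
w = np.zeros(L)                               # adaptive FIR control filter
xhist = np.zeros(L)                           # recent reference samples
yhist = np.zeros(len(S))                      # recent actuator outputs
xf = np.convolve(x, S_hat)[:N]                # filtered-x: reference through S_hat
xfhist = np.zeros(L)
err = np.zeros(N)

for n in range(N):
    xhist = np.roll(xhist, 1); xhist[0] = x[n]
    y = w @ xhist                             # control signal
    yhist = np.roll(yhist, 1); yhist[0] = y
    e = d[n] + S @ yhist                      # residual at the sensor
    xfhist = np.roll(xfhist, 1); xfhist[0] = xf[n]
    w -= mu * e * xfhist                      # FXLMS weight update
    err[n] = e

print(np.mean(err[:200]**2), np.mean(err[-200:]**2))  # MSE should drop
```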

  15. Development of an Intrinsic Continuum Robot and Attitude Estimation of Its End-effector Based on a Kalman Filter

    Energy Technology Data Exchange (ETDEWEB)

    Kang, Chang Hyun; Bae, Ji Hwan; Kang, Bong Soo [Hannam University, Daejeon (Korea, Republic of)

    2015-04-15

    This paper presents the design concept of an intrinsic continuum robot for a safe man-machine interface, and the characteristic behaviors of its end-effector based on real experiments. Since pneumatic artificial muscles, whose antagonistic actuation is similar to that of human muscles, are used both as the main backbones of the proposed robot and as its actuating devices, variable stiffness of the robotic joints is available in the actual environment. In order to overcome the inherent shortcoming of an intrinsic continuum robot, namely the bending of the backbone materials, a Kalman filter scheme based on a triaxial accelerometer and a triaxial gyroscope was proposed to estimate the attitude of the robot's end-effector. The experimental results verified that the proposed method is effective in estimating the attitude of the end-effector of the intrinsic continuum robot.
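
    A single-axis counterpart of such an attitude estimator: the gyroscope rate drives the prediction while the accelerometer-derived tilt angle corrects it, with a gyro-bias state absorbing drift; all matrices and noise levels are invented:

```python
import numpy as np

dt = 0.01
F = np.array([[1.0, -dt], [0.0, 1.0]])  # state: [tilt angle, gyro bias]
B = np.array([[dt], [0.0]])             # gyro rate drives the angle
H = np.array([[1.0, 0.0]])              # accelerometer observes the angle
Q = np.diag([1e-5, 1e-7])
R = np.array([[2e-2]])

x = np.zeros((2, 1)); P = np.eye(2)

def attitude_step(x, P, gyro_rate, accel_angle):
    """Fuse one gyro sample (prediction) with one accelerometer-derived
    tilt angle (correction); a 1-axis sketch of the triaxial filter."""
    x = F @ x + B * gyro_rate
    P = F @ P @ F.T + Q
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ (np.array([[accel_angle]]) - H @ x)
    P = (np.eye(2) - K @ H) @ P
    return x, P

for k in range(100):                     # e.g. constant 0.1 rad/s rotation
    x, P = attitude_step(x, P, 0.1, 0.1 * (k + 1) * dt)
print(x[0, 0])                           # ~0.1 rad after 1 s
```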

  16. Development of a sequential workflow based on LC-PRM for the verification of endometrial cancer protein biomarkers in uterine aspirate samples.

    Science.gov (United States)

    Martinez-Garcia, Elena; Lesur, Antoine; Devis, Laura; Campos, Alexandre; Cabrera, Silvia; van Oostrum, Jan; Matias-Guiu, Xavier; Gil-Moreno, Antonio; Reventos, Jaume; Colas, Eva; Domon, Bruno

    2016-08-16

    About 30% of endometrial cancer (EC) patients are diagnosed at an advanced stage of the disease, which is associated with a drastic decrease in the 5-year survival rate. The identification of biomarkers in uterine aspirate samples, which are collected by a minimally invasive procedure, would improve the early diagnosis of EC. We present a sequential workflow to select, from a list of potential EC biomarkers, those which are the most promising to enter a validation study. After the elimination of confounding contributions by residual blood proteins, 52 potential biomarkers were analyzed in uterine aspirates from 20 EC patients and 18 non-EC controls by a high-resolution accurate mass spectrometer operated in parallel reaction monitoring mode. The differential abundance of 26 biomarkers was observed, and among them ten proteins showed a high sensitivity and specificity (AUC > 0.9). The study demonstrates that uterine aspirates are valuable samples for EC protein biomarker screening. It also illustrates the importance of a biomarker verification phase to fill the gap between discovery and validation studies, and highlights the benefits of high resolution mass spectrometry for this purpose. The proteins verified in this study have an increased likelihood of becoming a clinical assay after a subsequent validation phase.
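
    The sensitivity/specificity summary quoted above is an area under the ROC curve; it can be computed directly as a Mann-Whitney rank statistic, sketched here on invented abundance values with the record's group sizes:

```python
import numpy as np

def auc(scores_cases, scores_controls):
    """Area under the ROC curve via the Mann-Whitney U statistic:
    the probability that a randomly chosen case scores higher than a
    randomly chosen control (ties count 1/2)."""
    cases = np.asarray(scores_cases)[:, None]
    controls = np.asarray(scores_controls)[None, :]
    wins = (cases > controls).sum() + 0.5 * (cases == controls).sum()
    return wins / (cases.size * controls.size)

# invented peptide abundances: 20 EC patients vs. 18 controls
rng = np.random.default_rng(6)
ec = rng.normal(12.0, 1.0, 20)       # higher abundance in cases
ctrl = rng.normal(10.0, 1.0, 18)
print(auc(ec, ctrl))                 # ~0.9 for well-separated groups
```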

  17. Optional inferior vena caval filters: where are we now?

    LENUS (Irish Health Repository)

    Keeling, A N

    2008-08-01

    With the advent of newer optional/retrievable inferior vena caval filters, there has been a rise in the number of filters inserted globally. This review article examines the currently available approved optional filter models, outlines the clinical indications for filter insertion and examines the expanding indications. Additionally, the available evidence behind the use of optional filters is reviewed, the issue of anticoagulation is discussed and possible future filter developments are considered.

  18. Verification strategies for fluid-based plasma simulation models

    Science.gov (United States)

    Mahadevan, Shankar

    2012-10-01

    Verification is an essential aspect of computational code development for models based on partial differential equations. However, verification of plasma models is often conducted internally by authors of these programs and not openly discussed. Several professional research bodies including the IEEE, AIAA, ASME and others have formulated standards for verification and validation (V&V) of computational software. This work focuses on verification, defined succinctly as determining whether the mathematical model is solved correctly. As plasma fluid models share several aspects with the Navier-Stokes equations used in Computational Fluid Dynamics (CFD), the CFD verification process is used as a guide. Steps in the verification process: consistency checks, examination of iterative, spatial and temporal convergence, and comparison with exact solutions, are described with examples from plasma modeling. The Method of Manufactured Solutions (MMS), which has been used to verify complex systems of PDEs in solid and fluid mechanics, is introduced. An example of the application of MMS to a self-consistent plasma fluid model using the local mean energy approximation is presented. The strengths and weaknesses of the techniques presented in this work are discussed.
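
    A minimal MMS example along the lines described: pick u = sin(pi x) as the manufactured solution of -u'' = f, derive f analytically, solve on two grids with second-order central differences, and check that the observed order of accuracy is about 2:

```python
import numpy as np

def solve_poisson(n):
    """Solve -u'' = f on (0,1), u(0)=u(1)=0, with 2nd-order central
    differences, using the manufactured solution u = sin(pi x)."""
    x = np.linspace(0.0, 1.0, n + 1)
    h = 1.0 / n
    f = np.pi**2 * np.sin(np.pi * x[1:-1])      # source derived from chosen u
    A = (np.diag(2.0 * np.ones(n - 1)) -
         np.diag(np.ones(n - 2), 1) - np.diag(np.ones(n - 2), -1)) / h**2
    u = np.linalg.solve(A, f)
    return np.max(np.abs(u - np.sin(np.pi * x[1:-1])))  # discretization error

e1, e2 = solve_poisson(32), solve_poisson(64)
order = np.log2(e1 / e2)
print(f"errors: {e1:.2e}, {e2:.2e}; observed order ~ {order:.2f}")  # ~2.0
```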

  19. NOVEL MICROWAVE FILTER DESIGN TECHNIQUES.

    Science.gov (United States)

    ELECTROMAGNETIC WAVE FILTERS, MICROWAVE FREQUENCY, PHASE SHIFT CIRCUITS, BANDPASS FILTERS, TUNED CIRCUITS, NETWORKS, IMPEDANCE MATCHING, LOW PASS FILTERS, MULTIPLEXING, MICROWAVE EQUIPMENT, WAVEGUIDE FILTERS, WAVEGUIDE COUPLERS.

  20. Woodward Effect Experimental Verifications

    Science.gov (United States)

    March, Paul

    2004-02-01

    The work of J. F. Woodward (1990; 1996a; 1996b; 1998; 2002a; 2002b; 2004) on the existence of "mass fluctuations" and their use in exotic propulsion schemes was examined for possible application in improving space flight propulsion and power generation. Woodward examined Einstein's General Relativity Theory (GRT) and assumed that if the strong Machian interpretation of GRT, as well as gravitational/inertia-like Wheeler-Feynman radiation reaction forces, hold, then when an elementary particle is accelerated through a potential gradient, its rest mass should fluctuate around its mean value during its acceleration. Woodward also used GRT to clarify the precise experimental conditions necessary for observing and exploiting these mass fluctuations, or the "Woodward effect" (W-E). Later, in collaboration with his ex-graduate student T. Mahood, they also pushed the experimental verification boundaries of these proposals. If these purported mass fluctuations occur as Woodward claims, and his assumption that gravity and inertia are both byproducts of the same GRT-based phenomenon per Mach's Principle is correct, then many innovative applications such as propellantless propulsion and gravitational exotic matter generators may be feasible. This paper examines the reality of mass fluctuations and the feasibility of using the W-E to design propellantless propulsion devices in the near- to mid-term future. The latest experimental results, utilizing MHD-like force rectification systems, will also be presented.