WorldWideScience

Sample records for traditional filtered back-projection

  1. A Wiener filtering based back projection algorithm for image reconstruction

    In computed tomography (CT), a key technique is the reconstruction of images from projection data, for which the filtered back projection (FBP) algorithm is commonly used. Based on an analysis of the causes of artifacts, we propose a new image reconstruction algorithm that combines a Wiener filter with the FBP algorithm. The conventional FBP reconstruction is improved by adopting the Wiener filter, and artifacts in the reconstructed images are clearly reduced. Experimental results for typical flow regimes show that the improved algorithm can effectively improve image quality. (authors)
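
    The combination described in this record can be illustrated with a minimal sketch: a parallel-beam FBP in Python (NumPy/SciPy) with an optional Wiener pre-filter applied to each projection before the ramp filter. The geometry, filter sizes and phantom below are illustrative assumptions, not the authors' method.

```python
import numpy as np
from scipy.ndimage import rotate
from scipy.signal import wiener

def radon(img, angles_deg):
    """Forward projection: rotate the image and sum along columns."""
    return np.array([rotate(img, -a, reshape=False, order=1).sum(axis=0)
                     for a in angles_deg])

def fbp(sino, angles_deg, size, wiener_size=None):
    """Filtered back projection; optionally Wiener-filter each projection first."""
    ramp = np.abs(np.fft.fftfreq(sino.shape[1]))   # ideal ramp filter
    recon = np.zeros((size, size))
    for proj, a in zip(sino, angles_deg):
        if wiener_size:                            # noise-suppression step, in the spirit of this record
            proj = wiener(proj, mysize=wiener_size)
        filt = np.real(np.fft.ifft(np.fft.fft(proj) * ramp))
        recon += rotate(np.tile(filt, (size, 1)), a, reshape=False, order=1)
    return recon * np.pi / (2 * len(angles_deg))
```

    With a disk phantom and 60 views over 180°, this reconstruction correlates strongly with the original image; the Wiener step matters mainly when the projections are noisy.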

  2. An adaptive filtered back-projection for photoacoustic image reconstruction

    Purpose: The purpose of this study is to develop an improved filtered back-projection (FBP) algorithm for photoacoustic tomography (PAT), which allows image reconstruction with higher quality than traditional algorithms. Methods: A rigorous expression for a weighting function has been derived directly from the photoacoustic wave equation and is used as a ramp filter in the Fourier domain. The authors’ new algorithm uses this weighting function to precisely calculate each photoacoustic signal’s contribution and then reconstructs the image from the retarded potential generated by the photoacoustic sources. In addition, an adaptive criterion has been derived for selecting the cutoff frequency of a low-pass filter. Two computational phantoms were created to test the algorithm. The first phantom contained five spheres, each with a different absorbance, and was used to test the capability of correctly representing both the geometry and the relative absorbed energy in a planar measurement system. The authors also used a second phantom containing absorbers of different sizes with overlapping geometry to evaluate the performance of the new method for complicated geometries. In addition, a random noise background was added to the simulated data, which were obtained using an arc-shaped array of 50 evenly distributed transducers that spanned 160° over a circle with a radius of 65 mm. A normalization factor between neighboring transducers was applied to correct the measured signals in the PAT simulations. The authors assumed that the scanned object was mounted on a holder that rotated over the full 360° and that the scans were acquired at a sampling rate of 20.48 MHz. Results: The authors have obtained reconstructed images of the computerized phantoms by utilizing the new FBP algorithm. From the reconstructed image of the first phantom, one can see that this new approach allows not only obtaining a sharp image but also showing

  3. Improvement of wavelet threshold filtered back-projection image reconstruction algorithm

    Ren, Zhong; Liu, Guodong; Huang, Zhen

    2014-11-01

    Image reconstruction techniques have been applied in many fields, including medical imaging such as X-ray computed tomography (X-CT), positron emission tomography (PET) and magnetic resonance imaging (MRI), but the reconstructed results are still unsatisfactory because the original projection data are inevitably polluted by noise during image reconstruction. Although some traditional filters, e.g. the Shepp-Logan (SL) and Ram-Lak (RL) filters, can suppress some of the noise, Gibbs oscillation phenomena are generated and the artifacts introduced by back-projection are not greatly improved. Wavelet threshold denoising can overcome the interference of noise with image reconstruction. Since the traditional soft and hard threshold functions have some inherent defects, an improved wavelet threshold function combined with the filtered back-projection (FBP) algorithm is proposed in this paper. Four different reconstruction algorithms were compared in simulation experiments. Experimental results demonstrated that the improved algorithm largely eliminates the discontinuity and large distortion of the traditional threshold functions as well as the Gibbs oscillation. Finally, the usefulness of the improved algorithm was verified by comparing two evaluation criteria, i.e. mean square error (MSE) and peak signal-to-noise ratio (PSNR), among the four algorithms, and the optimal dual threshold values of the improved wavelet threshold function were obtained.
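
    The defects of the classical threshold functions mentioned in this record can be seen directly from their definitions. The sketch below (plain NumPy, applied to arbitrary coefficients rather than an actual wavelet decomposition) shows the standard functions only; it is not the paper's improved function.

```python
import numpy as np

def hard_threshold(w, t):
    # Hard thresholding: keeps large coefficients unchanged,
    # but is discontinuous at |w| = t.
    w = np.asarray(w, dtype=float)
    return np.where(np.abs(w) > t, w, 0.0)

def soft_threshold(w, t):
    # Soft thresholding: continuous, but shrinks every surviving
    # coefficient by t, introducing a constant bias.
    w = np.asarray(w, dtype=float)
    return np.sign(w) * np.maximum(np.abs(w) - t, 0.0)
```

    Hard thresholding jumps from 0 to just above t (the discontinuity the record refers to), while soft thresholding distorts large coefficients by a constant offset; improved threshold functions typically interpolate between the two behaviours.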

  4. Filtered back-projection algorithm for Compton telescopes

    Gunter, Donald L.

    2008-03-18

    A method for the conversion of Compton camera data into a 2D image of the incident-radiation flux on the celestial sphere includes detecting coincident gamma radiation flux arriving from various directions of a 2-sphere. These events are mapped by back-projection onto the 2-sphere to produce a convolution integral that is subsequently stereographically projected onto a 2-plane to produce a second convolution integral which is deconvolved by the Fourier method to produce an image that is then projected onto the 2-sphere.
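
    The final step of the method described here, deconvolution by the Fourier method, is a standard spectral-division operation. A generic regularized version is sketched below (the damping term and 2-plane geometry are assumptions; this is not the exact kernel or projection used in this record).

```python
import numpy as np

def fourier_deconvolve(blurred, kernel, eps=1e-6):
    """Deconvolve a 2-D circular convolution by spectral division.

    eps is a small Tikhonov-style damping term (an assumed choice) that
    keeps the division stable where the kernel spectrum is near zero.
    """
    B = np.fft.fft2(blurred)
    K = np.fft.fft2(kernel, s=blurred.shape)
    X = B * np.conj(K) / (np.abs(K) ** 2 + eps)
    return np.real(np.fft.ifft2(X))
```

    For a convolution kernel whose spectrum stays away from zero, this recovers the original image almost exactly; in practice the damping term trades residual blur against noise amplification.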

  5. Generalized filtered back-projection for digital breast tomosynthesis reconstruction

    Erhard, Klaus; Grass, Michael; Hitziger, Sebastian; Iske, Armin; Nielsen, Tim

    2012-03-01

    Filtered backprojection (FBP) has been widely used as an efficient and robust reconstruction technique in tomographic X-ray imaging over the last decades. For standard geometries such as the circle or helix, it is known how to filter the data efficiently. For geometries with only a few projection views or a limited angular range, however, FBP algorithms generally give poor results. In digital breast tomosynthesis (DBT) these limitations give rise to image artifacts due to the limited angular range and the coarse angular sampling. In this work, a generalized FBP algorithm is presented, which uses the filtered projection data of all acquired views for backprojection along one direction. The proposed method yields a computationally efficient generalized FBP algorithm for DBT, which provides image quality similar to iterative reconstruction techniques while preserving the ability to perform region-of-interest reconstructions. To demonstrate the excellent performance of this method, examples are given with a simulated breast phantom and the hardware BR3D phantom.

  6. Superresolution of Hyperspectral Image Using Advanced Nonlocal Means Filter and Iterative Back Projection

    Jin Wang

    2015-01-01

    We introduce an efficient superresolution algorithm for hyperspectral images based on an advanced nonlocal means (NLM) filter and iterative back projection. The nonlocal means method estimates the to-be-interpolated pixel as a weighted average of all pixels within the image, and unrelated neighborhoods are automatically suppressed by their negligible weights. However, spatial distance is also important when reconstructing a missing pixel. We therefore propose an advanced NLM (ANLM) filter that considers both neighborhood similarity and patch distance. Whereas the search region in the conventional NLM method is the whole image, the proposed ANLM uses a limited search window to reduce complexity. Iterative back projection (IBP) is a well-known method for image restoration. In superresolution, IBP iteratively recovers the high-resolution image from a given low-resolution image, which is blurred and noisy, by minimizing the reconstruction error; however, because the back-projected reconstruction error of IBP is isotropic, conventional IBP suffers from jaggy and ringing artifacts. Introducing the ANLM method is therefore necessary to improve visual quality.
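
    The record does not give the exact functional form of the ANLM weight, so the sketch below is an assumed Gaussian combination of the classic NLM patch-similarity term with an extra spatial-distance penalty; the parameters h and sigma_s are illustrative, not the paper's.

```python
import numpy as np

def anlm_weight(patch_p, patch_q, pos_p, pos_q, h=0.5, sigma_s=3.0):
    """Weight of pixel q when reconstructing pixel p.

    Combines the classic NLM patch-similarity term with an additional
    spatial penalty on pixel distance (Gaussian form assumed here).
    """
    # neighborhood similarity: Gaussian of the mean squared patch difference
    d_patch = np.mean((np.asarray(patch_p, float) - np.asarray(patch_q, float)) ** 2)
    w_sim = np.exp(-d_patch / h ** 2)
    # spatial term: penalize pixels that are far away in the image plane
    d_pos = np.sum((np.asarray(pos_p, float) - np.asarray(pos_q, float)) ** 2)
    w_pos = np.exp(-d_pos / (2.0 * sigma_s ** 2))
    return w_sim * w_pos
```

    Restricting q to a limited search window around p bounds the number of weights computed per pixel, which is the complexity reduction the record mentions relative to whole-image search.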

  7. Monte Carlo evaluation of the Filtered Back Projection method for image reconstruction in proton computed tomography

    In this paper the use of the Filtered Back Projection (FBP) algorithm to reconstruct tomographic images from high-energy (200-250 MeV) proton beams is investigated. The algorithm has been studied in detail with a Monte Carlo approach, and image quality has been analysed and compared with the total absorbed dose. A proton Computed Tomography (pCT) apparatus developed by our group has been fully simulated, exploiting the power of the Geant4 Monte Carlo toolkit. From the simulation of the apparatus, a set of tomographic images of a test phantom has been reconstructed using FBP at different absorbed dose values. The images have been evaluated in terms of homogeneity, noise, contrast, and spatial and density resolution.

  8. Monte Carlo evaluation of the Filtered Back Projection method for image reconstruction in proton computed tomography

    Cirrone, G.A.P., E-mail: cirrone@lns.infn.it [Laboratori Nazionali del Sud - National Institute for Nuclear Physics INFN (INFN-LNS), Via S.Sofia 64, 95100 Catania (Italy); Bucciolini, M. [Department of 'Fisiopatologia Clinica', University of Florence, V.le Morgagni 85, I-50134 Florence (Italy); Bruzzi, M. [Energetic Department, University of Florence, Via S. Marta 3, I-50139 Florence (Italy); Candiano, G. [Laboratorio di Tecnologie Oncologiche HSR, Giglio Contrada, Pietrapollastra-Pisciotto, 90015 Cefalu, Palermo (Italy); Civinini, C. [National Institute for Nuclear Physics INFN, Section of Florence, Via G. Sansone 1, Sesto Fiorentino, I-50019 Florence (Italy); Cuttone, G. [Laboratori Nazionali del Sud - National Institute for Nuclear Physics INFN (INFN-LNS), Via S.Sofia 64, 95100 Catania (Italy); Guarino, P. [Nuclear Engineering Department, University of Palermo, Via... Palermo (Italy); Laboratori Nazionali del Sud - National Institute for Nuclear Physics INFN (INFN-LNS), Via S.Sofia 64, 95100 Catania (Italy); Lo Presti, D. [Physics Department, University of Catania, Via S. Sofia 64, I-95123, Catania (Italy); Mazzaglia, S.E. [Laboratori Nazionali del Sud - National Institute for Nuclear Physics INFN (INFN-LNS), Via S.Sofia 64, 95100 Catania (Italy); Pallotta, S. [Department of 'Fisiopatologia Clinica', University of Florence, V.le Morgagni 85, I-50134 Florence (Italy); Randazzo, N. [National Institute for Nuclear Physics INFN, Section of Catania, Via S.Sofia 64, 95123 Catania (Italy); Sipala, V. [National Institute for Nuclear Physics INFN, Section of Catania, Via S.Sofia 64, 95123 Catania (Italy); Physics Department, University of Catania, Via S. Sofia 64, I-95123, Catania (Italy); Stancampiano, C. [National Institute for Nuclear Physics INFN, Section of Catania, Via S.Sofia 64, 95123 Catania (Italy); and others

    2011-12-01

    In this paper the use of the Filtered Back Projection (FBP) algorithm to reconstruct tomographic images from high-energy (200-250 MeV) proton beams is investigated. The algorithm has been studied in detail with a Monte Carlo approach, and image quality has been analysed and compared with the total absorbed dose. A proton Computed Tomography (pCT) apparatus developed by our group has been fully simulated, exploiting the power of the Geant4 Monte Carlo toolkit. From the simulation of the apparatus, a set of tomographic images of a test phantom has been reconstructed using FBP at different absorbed dose values. The images have been evaluated in terms of homogeneity, noise, contrast, and spatial and density resolution.

  9. Comparison of adaptive statistical iterative and filtered back projection reconstruction techniques in brain CT

    Purpose: To compare image quality and the visualization of normal structures and lesions in brain computed tomography (CT) with adaptive statistical iterative reconstruction (ASIR) and filtered back projection (FBP) reconstruction techniques at different X-ray tube current–time products. Materials and methods: In this IRB-approved prospective study, forty patients (nineteen men, twenty-one women; mean age 69.5 ± 11.2 years) underwent brain scans at different tube current–time products (300 and 200 mAs) on a 64-section multi-detector CT (GE, Discovery CT750 HD). Images were reconstructed with FBP and four levels of ASIR-FBP blending. Two radiologists (our hospital is renowned for its geriatric medicine department, and both are more experienced in chronic cerebrovascular disease than in neoplastic disease; cerebral tumors were therefore not included in this study but are considered in the discussion) assessed all the reconstructed images for visibility of normal structures, lesion conspicuity, image contrast and diagnostic confidence in a blinded and randomized manner. The volume CT dose index (CTDIvol) and dose-length product (DLP) were recorded. All data were analyzed using SPSS 13.0 statistical analysis software. Results: There was no statistically significant difference between image quality at 200 mAs with 50% ASIR blending and at 300 mAs with FBP (p > .05), whereas a statistically significant difference (p < .05) was found between image quality at 200 mAs with FBP and at 300 mAs with FBP. Conclusion: ASIR provided the same image quality and diagnostic ability in brain imaging with a greater than 30% dose reduction compared with the FBP reconstruction technique.

  10. Comparison of adaptive statistical iterative and filtered back projection reconstruction techniques in brain CT

    Ren, Qingguo, E-mail: renqg83@163.com [Department of Radiology, Hua Dong Hospital of Fudan University, Shanghai 200040 (China); Dewan, Sheilesh Kumar, E-mail: sheilesh_d1@hotmail.com [Department of Geriatrics, Hua Dong Hospital of Fudan University, Shanghai 200040 (China); Li, Ming, E-mail: minli77@163.com [Department of Radiology, Hua Dong Hospital of Fudan University, Shanghai 200040 (China); Li, Jianying, E-mail: Jianying.Li@med.ge.com [CT Imaging Research Center, GE Healthcare China, Beijing (China); Mao, Dingbiao, E-mail: maodingbiao74@163.com [Department of Radiology, Hua Dong Hospital of Fudan University, Shanghai 200040 (China); Wang, Zhenglei, E-mail: Williswang_doc@yahoo.com.cn [Department of Radiology, Shanghai Electricity Hospital, Shanghai 200050 (China); Hua, Yanqing, E-mail: cjr.huayanqing@vip.163.com [Department of Radiology, Hua Dong Hospital of Fudan University, Shanghai 200040 (China)

    2012-10-15

    Purpose: To compare image quality and the visualization of normal structures and lesions in brain computed tomography (CT) with adaptive statistical iterative reconstruction (ASIR) and filtered back projection (FBP) reconstruction techniques at different X-ray tube current–time products. Materials and methods: In this IRB-approved prospective study, forty patients (nineteen men, twenty-one women; mean age 69.5 ± 11.2 years) underwent brain scans at different tube current–time products (300 and 200 mAs) on a 64-section multi-detector CT (GE, Discovery CT750 HD). Images were reconstructed with FBP and four levels of ASIR-FBP blending. Two radiologists (our hospital is renowned for its geriatric medicine department, and both are more experienced in chronic cerebrovascular disease than in neoplastic disease; cerebral tumors were therefore not included in this study but are considered in the discussion) assessed all the reconstructed images for visibility of normal structures, lesion conspicuity, image contrast and diagnostic confidence in a blinded and randomized manner. The volume CT dose index (CTDIvol) and dose-length product (DLP) were recorded. All data were analyzed using SPSS 13.0 statistical analysis software. Results: There was no statistically significant difference between image quality at 200 mAs with 50% ASIR blending and at 300 mAs with FBP (p > .05), whereas a statistically significant difference (p < .05) was found between image quality at 200 mAs with FBP and at 300 mAs with FBP. Conclusion: ASIR provided the same image quality and diagnostic ability in brain imaging with a greater than 30% dose reduction compared with the FBP reconstruction technique.

  11. Filtered back-projection reconstruction for attenuation proton CT along most likely paths

    Quiñones, C. T.; Létang, J. M.; Rit, S.

    2016-05-01

    This work investigates the attenuation of a proton beam in order to reconstruct the map of the linear attenuation coefficient of a material, an attenuation that is mainly caused by inelastic interactions of protons with matter. Attenuation proton computed tomography (pCT) suffers from poor spatial resolution due to multiple Coulomb scattering (MCS) of protons in matter, similarly to conventional energy-loss pCT. We therefore adapted a recent filtered back-projection algorithm along the most likely path (MLP) of protons, developed for energy-loss pCT (Rit et al 2013), to attenuation pCT, assuming a pCT scanner that can track the position and direction of protons before and after the scanned object. Monte Carlo simulations of pCT acquisitions of density and spatial-resolution phantoms were performed with Geant4 (via Gate) to characterize the new algorithm. The reconstruction assumes an energy-independent inelastic cross-section; the energy dependence of the inelastic cross-section below 100 MeV produced a capping artifact when the residual energy behind the object fell below 100 MeV. The statistical limitation was determined analytically, and the noise in attenuation pCT images was found to be 411 times and 278 times higher than the noise in energy-loss pCT images for the same imaging dose at 200 MeV and 300 MeV, respectively. Comparison of the spatial resolution of attenuation pCT images with a conventional straight-line-path binning showed that incorporating the MLP estimates during reconstruction improves the spatial resolution of attenuation pCT. Moreover, despite the significant noise in attenuation pCT images, the spatial resolution of attenuation pCT was better than that of conventional energy-loss pCT in some of the studied situations, thanks to the interplay of MCS and attenuation known as the West–Sherwood effect.
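
    As a reminder of the measurement model behind attenuation pCT (the standard Beer-Lambert counting relation, not the authors' MLP reconstruction itself), the projection value along each proton path is the log-ratio of incoming to surviving protons:

```python
import numpy as np

def attenuation_projection(n_in, n_out):
    """Line integral of the linear attenuation coefficient along a path,
    p = ln(N_in / N_out), from proton counts before/after the object."""
    return np.log(np.asarray(n_in, dtype=float) / np.asarray(n_out, dtype=float))
```

    These p values, binned along the protons' most likely paths, are what the adapted FBP algorithm reconstructs; Poisson counting statistics in the surviving-proton counts are the source of the much higher noise quoted in the record.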

  12. Gamma-ray momentum reconstruction from Compton electron trajectories by filtered back-projection

    Gamma-ray imaging utilizing Compton scattering has traditionally relied on measuring coincident gamma-ray interactions to map directional information of the source distribution. This coincidence requirement makes it an inherently inefficient process. We present an approach to gamma-ray reconstruction from Compton scattering that requires only a single electron-tracking detector, thus removing the coincidence requirement. From the Compton-scattered electron momentum distribution, our algorithm analytically computes the incident photon's correlated direction and energy distributions. Because this method maps the source energy and location, it is useful in applications where prior information about the source distribution is unknown. We demonstrate this method with electron tracks measured in a scientific Si charge-coupled device. While the method was demonstrated with electron tracks in a Si-based detector, it is applicable to any detector that can measure electron direction and energy, or equivalently the electron momentum. For example, it can increase the sensitivity for obtaining energy and direction in gas-based systems that suffer from limited efficiency.

  13. Gamma-ray momentum reconstruction from Compton electron trajectories by filtered back-projection

    Haefner, A., E-mail: ahaefner@berkeley.edu; Plimley, B.; Pavlovsky, R. [Department of Nuclear Engineering, University of California Berkeley, 4155 Etcheverry Hall, MC 1730, Berkeley, California 94720-1730 (United States); Gunter, D. [Applied Nuclear Physics, Lawrence Berkeley National Laboratory, 1 Cyclotron Road Berkeley, California 94720 (United States); Vetter, K. [Department of Nuclear Engineering, University of California Berkeley, 4155 Etcheverry Hall, MC 1730, Berkeley, California 94720-1730 (United States); Applied Nuclear Physics, Lawrence Berkeley National Laboratory, 1 Cyclotron Road Berkeley, California 94720 (United States)

    2014-11-03

    Gamma-ray imaging utilizing Compton scattering has traditionally relied on measuring coincident gamma-ray interactions to map directional information of the source distribution. This coincidence requirement makes it an inherently inefficient process. We present an approach to gamma-ray reconstruction from Compton scattering that requires only a single electron-tracking detector, thus removing the coincidence requirement. From the Compton-scattered electron momentum distribution, our algorithm analytically computes the incident photon's correlated direction and energy distributions. Because this method maps the source energy and location, it is useful in applications where prior information about the source distribution is unknown. We demonstrate this method with electron tracks measured in a scientific Si charge-coupled device. While the method was demonstrated with electron tracks in a Si-based detector, it is applicable to any detector that can measure electron direction and energy, or equivalently the electron momentum. For example, it can increase the sensitivity for obtaining energy and direction in gas-based systems that suffer from limited efficiency.

  14. Evaluation of dose reduction and image quality in CT colonography: Comparison of low-dose CT with iterative reconstruction and routine-dose CT with filtered back projection

    Nagata, Koichi [Kameda Medical Center, Department of Radiology, Kamogawa, Chiba (Japan); Jichi Medical University, Department of Radiology, Tochigi (Japan); National Cancer Center, Cancer Screening Technology Division, Research Center for Cancer Prevention and Screening, Tokyo (Japan); Fujiwara, Masanori; Mogi, Tomohiro; Iida, Nao [Kameda Medical Center Makuhari, Department of Radiology, Chiba (Japan); Kanazawa, Hidenori; Sugimoto, Hideharu [Jichi Medical University, Department of Radiology, Tochigi (Japan); Mitsushima, Toru [Kameda Medical Center Makuhari, Department of Gastroenterology, Chiba (Japan); Lefor, Alan T. [Jichi Medical University, Department of Surgery, Tochigi (Japan)

    2015-01-15

    To prospectively evaluate the radiation dose and image quality comparing low-dose CT colonography (CTC) reconstructed using different levels of iterative reconstruction techniques with routine-dose CTC reconstructed with filtered back projection. Following institutional ethics clearance and informed consent procedures, 210 patients underwent screening CTC using automatic tube current modulation for dual positions. Examinations were performed in the supine position with a routine-dose protocol and in the prone position, randomly applying four different low-dose protocols. Supine images were reconstructed with filtered back projection and prone images with iterative reconstruction. Two blinded observers assessed the image quality of endoluminal images. Image noise was quantitatively assessed by region-of-interest measurements. The mean effective dose in the supine series was 1.88 mSv using routine-dose CTC, compared to 0.92, 0.69, 0.57, and 0.46 mSv at four different low doses in the prone series (p < 0.01). Overall image quality and noise of low-dose CTC with iterative reconstruction were significantly improved compared to routine-dose CTC using filtered back projection. The lowest dose group had image quality comparable to routine-dose images. Low-dose CTC with iterative reconstruction reduces the radiation dose by 48.5 to 75.1 % without image quality degradation compared to routine-dose CTC with filtered back projection. (orig.)

  15. Chest computed tomography using iterative reconstruction vs filtered back projection (Part 1): evaluation of image noise reduction in 32 patients

    Pontana, Francois; Pagniez, Julien; Faivre, Jean-Baptiste; Remy, Jacques [Univ. Lille Nord de France, Department of Thoracic Imaging Hospital Calmette (EA 2694), Lille (France); Flohr, Thomas [Siemens HealthCare, Computed Tomography Division, Forchheim (Germany); Duhamel, Alain [Univ. Lille Nord de France, Department of Medical Statistics, Lille (France); Remy-Jardin, Martine [Univ. Lille Nord de France, Department of Thoracic Imaging Hospital Calmette (EA 2694), Lille (France); Hospital Calmette, Department of Thoracic Imaging, Lille cedex (France)

    2011-03-15

    To assess noise reduction achievable with an iterative reconstruction algorithm. 32 consecutive chest CT angiograms were reconstructed with regular filtered back projection (FBP) (Group 1) and an iterative reconstruction technique (IRIS) with 3 (Group 2a) and 5 (Group 2b) iterations. Objective image noise was significantly reduced in Group 2a and Group 2b compared with FBP (p < 0.0001). There was a significant reduction in the level of subjective image noise in Group 2a compared with Group 1 images (p < 0.003), further reinforced on Group 2b images (Group 2b vs Group 1; p < 0.0001) (Group 2b vs Group 2a; p = 0.0006). The overall image quality scores significantly improved on Group 2a images compared with Group 1 images (p = 0.0081) and on Group 2b images compared with Group 2a images (p < 0.0001). Comparative analysis of individual CT features of mild lung infiltration showed improved conspicuity of ground glass attenuation (p < 0.0001), ill-defined micronodules (p = 0.0351) and emphysematous lesions (p < 0.0001) on Group 2a images, further improved on Group 2b images for ground glass attenuation (p < 0.0001), and emphysematous lesions (p = 0.0087). Compared with regular FBP, iterative reconstructions enable significant reduction of image noise without loss of diagnostic information, thus having the potential to decrease radiation dose during chest CT examinations. (orig.)

  16. Why do commercial CT scanners still employ traditional, filtered back-projection for image reconstruction?

    Despite major advances in x-ray sources, detector arrays, gantry mechanical design and especially computer performance, one component of computed tomography (CT) scanners has remained virtually constant for the past 25 years—the reconstruction algorithm. Fundamental advances have been made in the solution of inverse problems, especially tomographic reconstruction, but these have not been translated into clinical and related practice. The reasons are not obvious and seldom discussed. This review seeks to examine the reasons for this discrepancy and provides recommendations on how it can be resolved. We take the example of the field of compressive sensing (CS), summarizing this new area of research from the perspective of practical medical physicists and explaining the disconnect between theoretical and application-oriented research. Using a few issues specific to CT, which engineers have addressed in very specific ways, we try to distill the mathematical problem underlying each of these issues, with the hope of demonstrating that interesting mathematical problems of general importance can result from the in-depth analysis of specific issues. We then sketch some unconventional CT-imaging designs that have the potential to influence CT applications, if the link between applied mathematicians and engineers/physicists were stronger. Finally, we close with some observations on how that link could be strengthened. There is, we believe, an important opportunity to rapidly improve the performance of CT and related tomographic imaging techniques by addressing these issues. (topical review)

  17. Results of clinical receiver operating characteristic study comparing ordered subset expectation maximization and filtered back projection images in FDG PET studies of hepatocellular carcinoma

    Jeon, Tae Joo; Lee, Jong Doo; Kim, Hee Joung; Kim, Myung Jin; Yoo, Hyung Sik [College of Medicine, Yonsei Univ., Seoul (Korea, Republic of)

    2000-07-01

    The aim of this study was to validate the usefulness of ordered subset expectation maximization (OSEM) compared with filtered back projection (FBP) in terms of diagnostic ability for hepatocellular carcinoma (HCC). The data of fifty-seven patients with HCC and 62 patients with normal livers were reconstructed using both OSEM and FBP. The mean age of the patient group was 54.4±1.5 years. All patients underwent whole-body and liver scans after the injection of 10 mCi of (F-18)FDG using a dedicated whole-body PET camera (GE, Advance). PET images were interpreted by 3 observers with random exposure to normal and diseased cases. A receiver operating characteristic (ROC) study was used to validate the results. The areas under the ROC curves, Az, revealed statistically significant differences (p<0.05). In PET studies of patients with HCC, OSEM showed better results than conventional FBP in terms of lesion detectability.
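
    The area under the ROC curve (Az) compared in this record can be estimated nonparametrically from reader ratings. A minimal sketch is below; the study's actual ROC analysis tooling is not specified in the record, so this is illustrative only.

```python
import numpy as np

def auc_from_ratings(ratings_diseased, ratings_normal):
    """Nonparametric AUC (Az) via the Mann-Whitney U statistic:
    the probability that a diseased case is rated above a normal one,
    counting ties as one half."""
    d = np.asarray(ratings_diseased, dtype=float)[:, None]
    n = np.asarray(ratings_normal, dtype=float)[None, :]
    wins = (d > n).sum() + 0.5 * (d == n).sum()
    return wins / (d.size * n.size)
```

    Comparing the Az computed from OSEM readings against the Az from FBP readings of the same cases is the kind of paired comparison the study reports.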

  18. Ultra low-dose chest CT using filtered back projection: Comparison of 80-, 100- and 120 kVp protocols in a prospective randomized study

    Khawaja, Ranish Deedar Ali, E-mail: rkhawaja@mgh.harvard.edu [Division of Thoracic Radiology, MGH Imaging, Massachusetts General Hospital and Harvard Medical School, Boston (United States); Singh, Sarabjeet [Division of Thoracic Radiology, MGH Imaging, Massachusetts General Hospital and Harvard Medical School, Boston (United States); Madan, Rachna [Division of Thoracic Radiology, Brigham and Women's Hospital and Harvard Medical School, Boston (United States); Sharma, Amita; Padole, Atul; Pourjabbar, Sarvenaz; Digumarthy, Subba; Shepard, Jo-Anne; Kalra, Mannudeep K. [Division of Thoracic Radiology, MGH Imaging, Massachusetts General Hospital and Harvard Medical School, Boston (United States)

    2014-10-15

    Highlights: • The filtered back projection technique enables acceptable image quality for chest CT examinations at 0.9 mGy (estimated effective dose of 0.5 mSv) for selected patient sizes. • Lesion detection (such as solid non-calcified lung nodules) in the lung parenchyma is optimal at 0.9 mGy, with limited visualization of thyroid nodules in FBP images. • Further dose reduction down to 0.4 mGy is possible for most patients undergoing follow-up chest CT for evaluation of larger lung nodules and GGOs. • Our results may help set the reference ALARA dose for chest CT examinations reconstructed with the filtered back projection technique, using the minimum possible radiation dose with acceptable image quality and lesion detection. - Abstract: Purpose: To assess lesion detection and diagnostic image quality of the filtered back projection (FBP) reconstruction technique in ultra low-dose chest CT examinations. Methods and materials: In this IRB-approved ongoing prospective clinical study, 116 CT image series at four different radiation doses were acquired for 29 patients (age, 57–87 years; F:M 15:12; BMI 16–32 kg/m²). All patients provided written informed consent for the acquisition of additional ultra low-dose (ULD) series on a 256-slice MDCT (iCT, Philips Healthcare). In addition to their clinical standard-dose chest CT (SD, 120 kV, mean CTDIvol 6 ± 1 mGy), ULD-CT was subsequently performed at three dose levels (0.9 mGy [120 kV]; 0.5 mGy [100 kV] and 0.2 mGy [80 kV]). Images were reconstructed with FBP (2.5 mm * 1.25 mm), resulting in four stacks: SD-FBP (reference standard), FBP0.9, FBP0.5, and FBP0.2. Four thoracic radiologists from two teaching hospitals independently evaluated the data for lesion detection and visibility of small structures. Friedman's non-parametric test with post hoc Dunn's test was used for data analysis. Results: Interobserver agreement was substantial between radiologists (k = 0.6–0.8). With

  19. Ultra low-dose chest CT using filtered back projection: Comparison of 80-, 100- and 120 kVp protocols in a prospective randomized study

    Highlights: • The filtered back projection technique enables acceptable image quality for chest CT examinations at 0.9 mGy (estimated effective dose of 0.5 mSv) for selected patient sizes. • Lesion detection (such as solid non-calcified lung nodules) in the lung parenchyma is optimal at 0.9 mGy, with limited visualization of thyroid nodules in FBP images. • Further dose reduction down to 0.4 mGy is possible for most patients undergoing follow-up chest CT for evaluation of larger lung nodules and GGOs. • Our results may help set the reference ALARA dose for chest CT examinations reconstructed with the filtered back projection technique, using the minimum possible radiation dose with acceptable image quality and lesion detection. - Abstract: Purpose: To assess lesion detection and diagnostic image quality of the filtered back projection (FBP) reconstruction technique in ultra low-dose chest CT examinations. Methods and materials: In this IRB-approved ongoing prospective clinical study, 116 CT image series at four different radiation doses were acquired for 29 patients (age, 57–87 years; F:M 15:12; BMI 16–32 kg/m²). All patients provided written informed consent for the acquisition of additional ultra low-dose (ULD) series on a 256-slice MDCT (iCT, Philips Healthcare). In addition to their clinical standard-dose chest CT (SD, 120 kV, mean CTDIvol 6 ± 1 mGy), ULD-CT was subsequently performed at three dose levels (0.9 mGy [120 kV]; 0.5 mGy [100 kV] and 0.2 mGy [80 kV]). Images were reconstructed with FBP (2.5 mm * 1.25 mm), resulting in four stacks: SD-FBP (reference standard), FBP0.9, FBP0.5, and FBP0.2. Four thoracic radiologists from two teaching hospitals independently evaluated the data for lesion detection and visibility of small structures. Friedman's non-parametric test with post hoc Dunn's test was used for data analysis. Results: Interobserver agreement was substantial between radiologists (k = 0.6–0.8). With pooled analysis, 146 pulmonary (27

  20. Comparison of applied dose and image quality in staging CT of neuroendocrine tumor patients using standard filtered back projection and adaptive statistical iterative reconstruction

    Böning, G., E-mail: georg.boening@charite.de [Department of Radiology, Charité, Humboldt-University Medical School, Charitéplatz 1, 10117 Berlin (Germany); Schäfer, M.; Grupp, U. [Department of Radiology, Charité, Humboldt-University Medical School, Charitéplatz 1, 10117 Berlin (Germany); Kaul, D. [Department of Radiation Oncology, Charité, Humboldt-University Medical School, Charitéplatz 1, 10117 Berlin (Germany); Kahn, J. [Department of Radiology, Charité, Humboldt-University Medical School, Charitéplatz 1, 10117 Berlin (Germany); Pavel, M. [Department of Gastroenterology, Charité, Humboldt-University Medical School, Charitéplatz 1, 10117 Berlin (Germany); Maurer, M.; Denecke, T.; Hamm, B.; Streitparth, F. [Department of Radiology, Charité, Humboldt-University Medical School, Charitéplatz 1, 10117 Berlin (Germany)

    2015-08-15

    Highlights: • Iterative reconstruction (IR) in staging CT provides equal objective image quality compared to filtered back projection (FBP). • IR delivers excellent subjective quality and reduces effective dose compared to FBP. • In patients with neuroendocrine tumor (NET) or many other hypervascular abdominal tumors, IR can be used without sacrificing diagnostic confidence. - Abstract: Objective: To investigate whether dose reduction via adaptive statistical iterative reconstruction (ASIR) affects image quality and diagnostic accuracy in neuroendocrine tumor (NET) staging. Methods: A total of 28 NET patients were enrolled in the study. Inclusion criteria were histologically proven NET and visible tumor in abdominal computed tomography (CT). In an intraindividual study design, the patients underwent a baseline CT (filtered back projection, FBP) and follow-up CT (ASIR 40%) using matched scan parameters. Image quality was assessed subjectively using a 5-grade scoring system and objectively by determining signal-to-noise ratio (SNR) and contrast-to-noise ratios (CNRs). The applied volume computed tomography dose index (CTDIvol) of each scan was taken from the dose report. Results: ASIR 40% significantly reduced CTDIvol (10.17 ± 3.06 mGy [FBP], 6.34 ± 2.25 mGy [ASIR]) (p < 0.001) by 37.6% and significantly increased CNRs (complete tumor-to-liver, 2.76 ± 1.87 [FBP], 3.2 ± 2.32 [ASIR]) (p < 0.05) (complete tumor-to-muscle, 2.74 ± 2.67 [FBP], 4.31 ± 4.61 [ASIR]) (p < 0.05) compared to FBP. Subjective scoring revealed no significant changes for diagnostic confidence (5.0 ± 0 [FBP], 5.0 ± 0 [ASIR]), visibility of suspicious lesion (4.8 ± 0.5 [FBP], 4.8 ± 0.5 [ASIR]) and artifacts (5.0 ± 0 [FBP], 5.0 ± 0 [ASIR]). ASIR 40% significantly decreased scores for noise (4.3 ± 0.6 [FBP], 4.0 ± 0.8 [ASIR]) (p < 0.05), contrast (4.4 ± 0.6 [FBP], 4.1 ± 0.8 [ASIR]) (p < 0.001) and visibility of small structures (4.5 ± 0.7 [FBP], 4.3 ± 0.8 [ASIR]) (p < 0

  1. Comparison of applied dose and image quality in staging CT of neuroendocrine tumor patients using standard filtered back projection and adaptive statistical iterative reconstruction

    Highlights: • Iterative reconstruction (IR) in staging CT provides equal objective image quality compared to filtered back projection (FBP). • IR delivers excellent subjective quality and reduces effective dose compared to FBP. • In patients with neuroendocrine tumor (NET) or many other hypervascular abdominal tumors, IR can be used without sacrificing diagnostic confidence. - Abstract: Objective: To investigate whether dose reduction via adaptive statistical iterative reconstruction (ASIR) affects image quality and diagnostic accuracy in neuroendocrine tumor (NET) staging. Methods: A total of 28 NET patients were enrolled in the study. Inclusion criteria were histologically proven NET and visible tumor in abdominal computed tomography (CT). In an intraindividual study design, the patients underwent a baseline CT (filtered back projection, FBP) and follow-up CT (ASIR 40%) using matched scan parameters. Image quality was assessed subjectively using a 5-grade scoring system and objectively by determining signal-to-noise ratio (SNR) and contrast-to-noise ratios (CNRs). The applied volume computed tomography dose index (CTDIvol) of each scan was taken from the dose report. Results: ASIR 40% significantly reduced CTDIvol (10.17 ± 3.06 mGy [FBP], 6.34 ± 2.25 mGy [ASIR]) (p < 0.001) by 37.6% and significantly increased CNRs (complete tumor-to-liver, 2.76 ± 1.87 [FBP], 3.2 ± 2.32 [ASIR]) (p < 0.05) (complete tumor-to-muscle, 2.74 ± 2.67 [FBP], 4.31 ± 4.61 [ASIR]) (p < 0.05) compared to FBP. Subjective scoring revealed no significant changes for diagnostic confidence (5.0 ± 0 [FBP], 5.0 ± 0 [ASIR]), visibility of suspicious lesion (4.8 ± 0.5 [FBP], 4.8 ± 0.5 [ASIR]) and artifacts (5.0 ± 0 [FBP], 5.0 ± 0 [ASIR]). ASIR 40% significantly decreased scores for noise (4.3 ± 0.6 [FBP], 4.0 ± 0.8 [ASIR]) (p < 0.05), contrast (4.4 ± 0.6 [FBP], 4.1 ± 0.8 [ASIR]) (p < 0.001) and visibility of small structures (4.5 ± 0.7 [FBP], 4.3 ± 0.8 [ASIR]) (p < 0.001). Conclusion

  2. FDG-PET standardized uptake values in normal anatomical structures using iterative reconstruction segmented attenuation correction and filtered back-projection

    Filtered back-projection (FBP) is the most commonly used reconstruction method for PET images, which are usually noisy. The iterative reconstruction segmented attenuation correction (IRSAC) algorithm improves image quality without reducing image resolution. The standardized uptake value (SUV) is the most clinically utilized quantitative parameter of [fluorine-18]fluoro-2-deoxy-d-glucose (FDG) accumulation. The objective of this study was to obtain a table of SUVs for several normal anatomical structures from both routinely used FBP and IRSAC reconstructed images and to compare the data obtained with both methods. Twenty whole-body PET scans performed in consecutive patients with proven or suspected non-small cell lung cancer were retrospectively analyzed. Images were processed using both IRSAC and FBP algorithms. Nonquantitative or gaussian filters were used to smooth the transmission scan when using FBP or IRSAC algorithms, respectively. A phantom study was performed to evaluate the effect of different filters on SUV. Maximum and average SUVs (SUVmax and SUVavg) were calculated in 28 normal anatomical structures and in one pathological site. The phantom study showed that the use of a nonquantitative smoothing filter in the transmission scan results in a less accurate quantification and in a 20% underestimation of the actual measurement. Most anatomical structures were identified in all patients using the IRSAC images. On average, SUVavg and SUVmax measured on IRSAC images using a gaussian filter in the transmission scan were respectively 20% and 8% higher than the SUVs calculated from conventional FBP images. Scatterplots of the data values showed an overall strong relationship between IRSAC and FBP SUVs. Individual scatterplots of each site demonstrated a weaker relationship for lower SUVs and for SUVmax than for higher SUVs and SUVavg. 
A set of reference values was obtained for SUVmax and SUVavg of normal anatomical structures, calculated with both IRSAC and FBP
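    The body-weight-normalized SUV tabulated above is defined as the measured tissue activity concentration divided by the injected dose per unit body mass. A minimal sketch with hypothetical numbers (a 70 kg patient, 370 MBq injected; tissue density assumed ≈ 1 g/ml so that ml and g are interchangeable):

    ```python
    def suv_bw(conc_kbq_per_ml, injected_mbq, weight_kg):
        """Body-weight SUV: tissue concentration / (injected activity / body mass).

        Assumes tissue density of ~1 g/ml, so kBq/ml ~= kBq/g.
        """
        conc_kbq_per_g = conc_kbq_per_ml                            # density ~1 g/ml
        dose_per_g = injected_mbq * 1000.0 / (weight_kg * 1000.0)   # kBq/g
        return conc_kbq_per_g / dose_per_g

    # hypothetical: 5 kBq/ml in tissue, 370 MBq injected, 70 kg patient
    print(round(suv_bw(5.0, 370.0, 70.0), 2))  # 0.95
    ```

    An SUV of 1 corresponds to the tracer being distributed uniformly throughout the body, which is why the normal-structure reference values discussed above cluster around small single-digit numbers.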

  3. Half-dose abdominal CT with sinogram-affirmed iterative reconstruction technique in children - comparison with full-dose CT with filtered back projection

    Iterative reconstruction can be helpful to reduce radiation dose while maintaining image quality. However, this technique has not been fully evaluated in children during abdominal CT. To compare objective and subjective image quality between half-dose images reconstructed with iterative reconstruction at iteration strength levels 1 to 5 (half-S1 to half-S5 studies) and full-dose images reconstructed with filtered back projection (full studies) in pediatric abdominal CT. Twenty-one children (M:F = 13:8; mean age 8.2 ± 5.7 years) underwent dual-source abdominal CT (mean effective dose 4.8 ± 2.1 mSv). The objective image quality was evaluated as noise. Subjective image quality analysis was performed comparing each half study to the full study for noise, sharpness, artifact and diagnostic acceptability. Both objective and subjective image noise decreased with increasing iteration strength. Half-S4 and -S5 studies showed objective image noise similar to or lower than that of full studies. The half-S2 and -S3 studies produced the greatest sharpness, and the half-S5 studies were the worst owing to a blocky appearance. Full and half studies did not differ in artifacts. Half-S3 studies showed the best diagnostic acceptability. Half-S4 and -S5 studies objectively, and half-S3 studies subjectively, showed image quality comparable to full studies in pediatric abdominal CT. (orig.)

  4. Chest computed tomography using iterative reconstruction vs filtered back projection (Part 2): image quality of low-dose CT examinations in 80 patients

    Pontana, Francois; Pagniez, Julien; Faivre, Jean-Baptiste; Hachulla, Anne-Lise; Remy, Jacques [University Lille Nord de France, Department of Thoracic Imaging, Hospital Calmette (EA 2694), Lille (France); Duhamel, Alain [University Lille Nord de France, Department of Medical Statistics, Lille (France); Flohr, Thomas [Computed Tomography Division, Siemens Healthcare, Forchheim (Germany); Remy-Jardin, Martine [University Lille Nord de France, Department of Thoracic Imaging, Hospital Calmette (EA 2694), Lille (France); Hospital Calmette, Department of Thoracic Imaging, Lille cedex (France)

    2011-03-15

    To evaluate the image quality of an iterative reconstruction algorithm (IRIS) in low-dose chest CT in comparison with standard-dose filtered back projection (FBP) CT. Eighty consecutive patients referred for a follow-up CT examination of the chest underwent a low-dose CT examination (Group 2) under technical conditions similar to those of the initial examination (Group 1), except for the milliamperage selection and the replacement of regular FBP reconstruction by iterative reconstructions using three (Group 2a) and five iterations (Group 2b). Despite a mean decrease of 35.5% in the dose-length product, there was no statistically significant difference between Group 2a and Group 1 in the objective noise, signal-to-noise (SNR) and contrast-to-noise (CNR) ratios and distribution of the overall image quality scores. Compared to Group 1, objective image noise in Group 2b was significantly reduced with increased SNR and CNR and a trend towards improved image quality. Iterative reconstructions using three iterations provide similar image quality compared with the conventionally used FBP reconstruction at 35% less dose, thus enabling dose reduction without loss of diagnostic information. According to our preliminary results, even higher dose reductions than 35% may be feasible by using more than three iterations. (orig.)

  5. Forward problem solution as the operator of filtered and back projection matrix to reconstruct the various method of data collection and the object element model in electrical impedance tomography

    Back projection reconstruction has been implemented to obtain dynamic images in electrical impedance tomography. However, the implementation is still limited to the adjacent method of data collection and a circular object element model. The study aims to develop back projection into a reconstruction method with high speed, accuracy, and flexibility, which can be used for various methods of data collection and models of the object element. The proposed method uses the forward problem solution as the operator of the filtered and back projection matrix. This is done through a simulation study on several methods of data collection and various models of the object element. The results indicate that the developed method is capable of producing images quickly and accurately for the various methods of data collection and models of the object element.
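    The matrix formulation sketched above can be illustrated numerically: if A is the forward operator mapping object elements to measurements, the data can be back-projected with Aᵀ and filtered with a regularized inverse derived from that same operator, for whatever data-collection geometry A encodes. A toy numpy sketch with a random hypothetical forward operator (not the authors' actual EIT model):

    ```python
    import numpy as np

    rng = np.random.default_rng(42)
    n_meas, n_elem = 40, 10                    # hypothetical geometry
    A = rng.normal(size=(n_meas, n_elem))      # forward operator (system matrix)
    x_true = rng.uniform(1.0, 2.0, n_elem)     # object element values
    y = A @ x_true                             # forward problem: simulated data

    # Reconstruction operator built from the forward solution:
    # back projection (A.T) combined with a regularized filter.
    lam = 1e-8
    B = np.linalg.inv(A.T @ A + lam * np.eye(n_elem)) @ A.T
    x_rec = B @ y
    ```

    Because B is precomputed once from the forward model, each new data frame is reconstructed with a single matrix-vector product, which is what makes this family of methods fast enough for dynamic imaging.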

  6. Comparison of iterative model, hybrid iterative, and filtered back projection reconstruction techniques in low-dose brain CT: impact of thin-slice imaging

    The purpose of this study was to evaluate the utility of iterative model reconstruction (IMR) in brain CT especially with thin-slice images. This prospective study received institutional review board approval, and prior informed consent to participate was obtained from all patients. We enrolled 34 patients who underwent brain CT and reconstructed axial images with filtered back projection (FBP), hybrid iterative reconstruction (HIR) and IMR with 1 and 5 mm slice thicknesses. The CT number, image noise, contrast, and contrast noise ratio (CNR) between the thalamus and internal capsule, and the rate of increase of image noise in 1 and 5 mm thickness images between the reconstruction methods, were assessed. Two independent radiologists assessed image contrast, image noise, image sharpness, and overall image quality on a 4-point scale. The CNRs in 1 and 5 mm slice thickness were significantly higher with IMR (1.2 ± 0.6 and 2.2 ± 0.8, respectively) than with FBP (0.4 ± 0.3 and 1.0 ± 0.4, respectively) and HIR (0.5 ± 0.3 and 1.2 ± 0.4, respectively) (p < 0.01). The mean rate of increasing noise from 5 to 1 mm thickness images was significantly lower with IMR (1.7 ± 0.3) than with FBP (2.3 ± 0.3) and HIR (2.3 ± 0.4) (p < 0.01). There were no significant differences in qualitative analysis of unfamiliar image texture between the reconstruction techniques. IMR offers significant noise reduction and higher contrast and CNR in brain CT, especially for thin-slice images, when compared to FBP and HIR. (orig.)
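    The roughly 2.3× noise increase from 5 mm to 1 mm FBP images reported above is close to what simple photon statistics predict: with FBP, image noise scales approximately with the inverse square root of the number of photons per voxel, and hence of the slice thickness. A back-of-the-envelope check (the √-scaling is a textbook approximation, not the authors' model):

    ```python
    import math

    def fbp_noise_ratio(thick_from_mm, thick_to_mm):
        """Predicted FBP noise multiplier when thinning slices: sqrt(T_from / T_to)."""
        return math.sqrt(thick_from_mm / thick_to_mm)

    ratio = fbp_noise_ratio(5.0, 1.0)
    print(round(ratio, 2))  # 2.24 -- close to the 2.3 +/- 0.3 reported for FBP
    ```

    That IMR's measured factor (1.7) falls well below this statistical bound is exactly the point of the study: the model-based regularization suppresses noise beyond what photon statistics alone allow.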

  7. Image quality of CT angiography with model-based iterative reconstruction in young children with congenital heart disease: comparison with filtered back projection and adaptive statistical iterative reconstruction.

    Son, Sung Sil; Choo, Ki Seok; Jeon, Ung Bae; Jeon, Gye Rok; Nam, Kyung Jin; Kim, Tae Un; Yeom, Jeong A; Hwang, Jae Yeon; Jeong, Dong Wook; Lim, Soo Jin

    2015-06-01

    To retrospectively evaluate the image quality of CT angiography (CTA) reconstructed by model-based iterative reconstruction (MBIR) and to compare this with images obtained by filtered back projection (FBP) and adaptive statistical iterative reconstruction (ASIR) in newborns and infants with congenital heart disease (CHD). Thirty-seven children (age 4.8 ± 3.7 months; weight 4.79 ± 0.47 kg) with suspected CHD underwent CTA on a 64-detector MDCT without ECG gating (80 kVp, 40 mA using tube current modulation). Total dose length product was recorded in all patients. Images were reconstructed using FBP, ASIR, and MBIR. Objective image qualities (density, noise) were measured in the great vessels and heart chambers. The contrast-to-noise ratio (CNR) was calculated by measuring the density and noise of myocardial walls. Two radiologists evaluated images for subjective noise, diagnostic confidence, and sharpness at the level prior to the first branch of the main pulmonary artery. Images were compared with respect to reconstruction method, and reconstruction times were measured. Images from all patients were diagnostic, and the effective dose was 0.22 mSv. The objective image noise of MBIR was significantly lower than those of FBP and ASIR in the great vessels and heart chambers (P < 0.05). Mean CNR values were 8.73 for FBP, 14.54 for ASIR, and 22.95 for MBIR. In addition, the subjective image noise of MBIR was significantly lower than those of the others (P < 0.05), with improved diagnostic confidence (P < 0.05), and mean reconstruction times were 5.1 ± 2.3 s for FBP and ASIR and 15.1 ± 2.4 min for MBIR. While CTA with MBIR in newborns and infants with CHD can reduce image noise and improve CNR more than other methods, it is more time-consuming than the other methods. PMID:25414055

  8. The comparison of ordered subset expectation maximization and filtered back projection technique for RBC blood pool SPECT in detection of liver hemangioma

    Jeon, Tae Joo; Kim, Hee Joung; Bong, Jung Kyun; Lee, Jong Doo [College of Medicine, Yonsei Univ., Seoul (Korea, Republic of)

    2000-07-01

    Ordered subset expectation maximization (OSEM) is a new iterative reconstruction technique for tomographic images that can reduce reconstruction time compared with the conventional iteration method. We applied this method to RBC blood pool SPECT and tried to validate the usefulness of OSEM for the detection of liver hemangioma compared with filtered back projection (FBP). A 64-projection SPECT study was acquired over 360° by dual-head cameras after the injection of 750 MBq of {sup 99m}Tc-RBC. OSEM was performed with various combinations of subsets (1, 2, 4, 8, 16 and 32) and iteration numbers (1, 2, 4, 8 and 16) to obtain the best set for lesion detection. OSEM was performed in 17 lesions of 15 patients with liver hemangioma and compared with FBP images. Two nuclear medicine physicians reviewed these results independently. The best set for images was 4 iterations and 16 subsets. In general, OSEM produced more homogeneous images than FBP. Eighty-eight percent (15/17) of OSEM images were superior or equal to FBP in anatomic resolution. According to the blind review of images, 70.5% (12/17) of OSEM images were better in contrast (4/17), anatomic detail (4/17) or both (2/17). Two small lesions were detected by OSEM only, and another 2 small lesions were not depicted by either method. The remaining 3 lesions revealed no difference in image quality. OSEM can provide better image quality as well as better results in the detection of liver hemangioma than the conventional FBP technique.
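    The subset/iteration trade-off explored above comes from the standard OSEM update, x ← x / (A_sᵀ1) · A_sᵀ(y_s / A_s x), cycling over subsets s of the projections; one pass over all subsets costs about one MLEM iteration but updates the image n_subsets times. A toy numpy sketch on a random hypothetical system matrix (not the study's SPECT geometry):

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    A = rng.uniform(0.1, 1.0, size=(32, 8))   # hypothetical system matrix
    x_true = rng.uniform(1.0, 2.0, size=8)
    y = A @ x_true                            # noiseless projection data

    def osem(A, y, n_subsets=4, n_iter=20):
        """Ordered-subset EM: each sub-iteration uses only one block of
        projections, so the image is updated n_subsets times per pass."""
        x = np.ones(A.shape[1])               # flat, strictly positive start
        subsets = np.array_split(np.arange(A.shape[0]), n_subsets)
        for _ in range(n_iter):
            for idx in subsets:
                A_s, y_s = A[idx], y[idx]
                x = x / A_s.sum(axis=0) * (A_s.T @ (y_s / (A_s @ x)))
        return x

    x_rec = osem(A, y)
    ```

    The multiplicative form keeps the estimate non-negative at every step, which is why OSEM images tend to look more homogeneous than FBP, whose ramp filter can produce negative overshoot.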

  9. Image quality and radiation dose of low dose coronary CT angiography in obese patients: Sinogram affirmed iterative reconstruction versus filtered back projection

    Purpose: To investigate the image quality and radiation dose of low radiation dose CT coronary angiography (CTCA) using sinogram affirmed iterative reconstruction (SAFIRE) compared with standard dose CTCA using filtered back-projection (FBP) in obese patients. Materials and methods: Seventy-eight consecutive obese patients were randomized into two groups and scanned using a prospectively ECG-triggered step-and-shoot (SAS) CTCA protocol on a dual-source CT scanner. Thirty-nine patients (protocol A) were examined using a routine radiation dose protocol at 120 kV and images were reconstructed with FBP (protocol A). Thirty-nine patients (protocol B) were examined using a low dose protocol at 100 kV and images were reconstructed with SAFIRE. Two blinded observers independently assessed the image quality of each coronary segment using a 4-point scale (1 = non-diagnostic, 4 = excellent) and measured the objective parameters image noise, signal-to-noise ratio (SNR), and contrast-to-noise ratio (CNR). Radiation dose was calculated. Results: The coronary artery image quality scores, image noise, SNR and CNR were not significantly different between protocols A and B (all p > 0.05), with image quality scores of 3.51 ± 0.70 versus 3.55 ± 0.47, respectively. The effective radiation dose was significantly lower in protocol B (4.41 ± 0.83 mSv) than that in protocol A (8.83 ± 1.74 mSv, p < 0.01). Conclusion: Compared with standard dose CTCA using FBP, low dose CTCA using SAFIRE can maintain diagnostic image quality with 50% reduction of radiation dose.
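    The objective parameters compared above have simple definitions: SNR is a region of interest's mean attenuation divided by image noise (the standard deviation in a homogeneous region), and CNR is the difference between vessel and reference-tissue means divided by the same noise. A sketch with hypothetical ROI values, not figures from the study:

    ```python
    def snr(roi_mean_hu, noise_sd_hu):
        """Signal-to-noise ratio of a region of interest."""
        return roi_mean_hu / noise_sd_hu

    def cnr(roi_mean_hu, ref_mean_hu, noise_sd_hu):
        """Contrast-to-noise ratio between a vessel ROI and reference tissue."""
        return (roi_mean_hu - ref_mean_hu) / noise_sd_hu

    # hypothetical: aortic root 400 HU, chest-wall muscle 60 HU, noise SD 20 HU
    print(snr(400, 20), cnr(400, 60, 20))  # 20.0 17.0
    ```

    Because both metrics share the noise term in the denominator, an iterative reconstruction that halves the noise roughly doubles both SNR and CNR even when the underlying attenuation values are unchanged.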

  10. Adaptive iterative dose reduction algorithm in CT: Effect on image quality compared with filtered back projection in body phantoms of different sizes

    Kim, Milim; Lee, Jeong Min; Son, Hyo Shin; Han, Joon Koo; Choi, Byung Ihn [College of Medicine, Seoul National University, Seoul (Korea, Republic of); Yoon, Jeong Hee; Choi, Jin Woo [Dept. of Radiology, Seoul National University Hospital, Seoul (Korea, Republic of)

    2014-04-15

    To evaluate the impact of the adaptive iterative dose reduction (AIDR) three-dimensional (3D) algorithm in CT on noise reduction and image quality compared to the filtered back projection (FBP) algorithm, and to compare the effectiveness of AIDR 3D on noise reduction according to body habitus using phantoms of different sizes. Three different-sized phantoms with diameters of 24 cm, 30 cm, and 40 cm were built up using the American College of Radiology CT accreditation phantom and layers of pork belly fat. Each phantom was scanned eight times using different mAs. Images were reconstructed using FBP and three different strengths of AIDR 3D. The image noise, the contrast-to-noise ratio (CNR) and the signal-to-noise ratio (SNR) of the phantom were assessed. Two radiologists assessed the image quality of the 4 image sets in consensus. The effectiveness of AIDR 3D on noise reduction compared with FBP was also compared according to the phantom sizes. Adaptive iterative dose reduction 3D significantly reduced the image noise compared with FBP and enhanced the SNR and CNR (p < 0.05) with improved image quality (p < 0.05). When a stronger reconstruction algorithm was used, a greater increase of SNR and CNR as well as noise reduction was achieved (p < 0.05). The noise reduction effect of AIDR 3D was significantly greater in the 40-cm phantom than in the 24-cm or 30-cm phantoms (p < 0.05). The AIDR 3D algorithm is effective in reducing image noise and improving image-quality parameters compared with the FBP algorithm, and its effectiveness may increase as the phantom size increases.

  12. Comparison of iterative model, hybrid iterative, and filtered back projection reconstruction techniques in low-dose brain CT: impact of thin-slice imaging

    Nakaura, Takeshi; Iyama, Yuji; Kidoh, Masafumi; Yokoyama, Koichi [Amakusa Medical Center, Diagnostic Radiology, Amakusa, Kumamoto (Japan); Kumamoto University, Department of Diagnostic Radiology, Graduate School of Medical Sciences, Kumamoto (Japan); Oda, Seitaro; Yamashita, Yasuyuki [Kumamoto University, Department of Diagnostic Radiology, Graduate School of Medical Sciences, Kumamoto (Japan); Tokuyasu, Shinichi [Philips Electronics, Kumamoto (Japan); Harada, Kazunori [Amakusa Medical Center, Department of Surgery, Kumamoto (Japan)

    2016-03-15

    The purpose of this study was to evaluate the utility of iterative model reconstruction (IMR) in brain CT especially with thin-slice images. This prospective study received institutional review board approval, and prior informed consent to participate was obtained from all patients. We enrolled 34 patients who underwent brain CT and reconstructed axial images with filtered back projection (FBP), hybrid iterative reconstruction (HIR) and IMR with 1 and 5 mm slice thicknesses. The CT number, image noise, contrast, and contrast noise ratio (CNR) between the thalamus and internal capsule, and the rate of increase of image noise in 1 and 5 mm thickness images between the reconstruction methods, were assessed. Two independent radiologists assessed image contrast, image noise, image sharpness, and overall image quality on a 4-point scale. The CNRs in 1 and 5 mm slice thickness were significantly higher with IMR (1.2 ± 0.6 and 2.2 ± 0.8, respectively) than with FBP (0.4 ± 0.3 and 1.0 ± 0.4, respectively) and HIR (0.5 ± 0.3 and 1.2 ± 0.4, respectively) (p < 0.01). The mean rate of increasing noise from 5 to 1 mm thickness images was significantly lower with IMR (1.7 ± 0.3) than with FBP (2.3 ± 0.3) and HIR (2.3 ± 0.4) (p < 0.01). There were no significant differences in qualitative analysis of unfamiliar image texture between the reconstruction techniques. IMR offers significant noise reduction and higher contrast and CNR in brain CT, especially for thin-slice images, when compared to FBP and HIR. (orig.)

  13. Impact of hybrid iterative reconstruction on Agatston coronary artery calcium scores in comparison to filtered back projection in native cardiac CT

    To investigate whether the effects of hybrid iterative reconstruction (HIR) on coronary artery calcium (CAC) measurements using the Agatston score lead to changes in assignment of patients to cardiovascular risk groups compared to filtered back projection (FBP). 68 patients (mean age 61.5 years; 48 male; 20 female) underwent prospectively ECG-gated, non-enhanced, cardiac 256-MSCT for coronary calcium scoring. Scanning parameters were as follows: tube voltage, 120 kV; mean tube current time-product, 63.67 mAs (50 - 150 mAs); collimation, 2 x 128 x 0.625 mm. Images were reconstructed with FBP and with HIR at all levels (L1 to L7). Two independent readers measured Agatston scores of all reconstructions and assigned patients to cardiovascular risk groups. Scores of HIR and FBP reconstructions were correlated (Spearman). Interobserver agreement and variability were assessed with k-statistics and Bland-Altman plots. Agatston scores of HIR reconstructions were closely correlated with FBP reconstructions (L1, R = 0.9996; L2, R = 0.9995; L3, R = 0.9991; L4, R = 0.986; L5, R = 0.9986; L6, R = 0.9987; and L7, R = 0.9986). In comparison to FBP, HIR reduced Agatston scores to between 97% (L1) and 87.4% (L7) of the FBP values. Using HIR iterations L1-L3, all patients were assigned to identical risk groups as after FBP reconstruction. In 5.4% of patients the risk group after HIR with the maximum iteration level was different from the group after FBP reconstruction. There was an excellent correlation of Agatston scores after HIR and FBP with identical risk group assignment at levels 1 - 3 for all patients. Hence it appears that the application of HIR in routine calcium scoring does not entail any disadvantages. Thus, future studies are needed to demonstrate whether HIR is a reliable method for reducing radiation dose in coronary calcium scoring.
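    The Agatston score that HIR shrinks here weights each calcified lesion's area by its peak attenuation: weight 1 for a maximum of 130-199 HU, 2 for 200-299, 3 for 300-399, and 4 for ≥400 HU, summed over lesions and slices. A simplified single-lesion, single-slice sketch (real scoring also applies a minimum lesion area and connected-component analysis, omitted here):

    ```python
    import numpy as np

    def agatston_lesion_score(hu_slice, pixel_area_mm2):
        """Simplified per-slice Agatston contribution for one lesion:
        area of pixels >= 130 HU times a weight set by the peak HU."""
        mask = hu_slice >= 130
        if not mask.any():
            return 0.0
        peak = hu_slice[mask].max()
        weight = 1 + min(int(peak // 100) - 1, 3)  # 130-199 -> 1 ... >=400 -> 4
        return mask.sum() * pixel_area_mm2 * weight

    # hypothetical 2x2 patch, 1 mm^2 pixels: 3 pixels >= 130 HU, peak 250 HU
    print(agatston_lesion_score(np.array([[250, 180], [90, 140]]), 1.0))  # 6.0
    ```

    The hard 130 HU threshold is what makes the score sensitive to reconstruction: iterative noise smoothing lowers peak voxel values, pushing borderline pixels below the threshold and shrinking the score, as the study's 97%-87.4% range shows.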

  14. Evaluation of iterative reconstruction (OSEM) versus filtered back-projection for the assessment of myocardial glucose uptake and myocardial perfusion using dynamic PET

    Iterative reconstruction methods based on ordered-subset expectation maximisation (OSEM) have replaced filtered backprojection (FBP) in many clinical settings owing to their superior image quality. Whether OSEM is as accurate as FBP in quantitative positron emission tomography (PET) is uncertain. We compared the accuracy of OSEM and FBP for regional myocardial 18F-FDG uptake and 13NH3 perfusion measurements in cardiac PET. Ten healthy volunteers were studied. Five underwent dynamic 18F-FDG PET during hyperinsulinaemic-euglycaemic clamp, and five underwent 13NH3 perfusion measurement during rest and adenosine-induced hyperaemia. Images were reconstructed using FBP and OSEM ± an 8-mm Gaussian post-reconstruction filter. Filtered and unfiltered images showed agreement between the reconstruction methods within ±2SD in Bland-Altman plots of Ki values. The use of a Gaussian filter resulted in a systematic underestimation of Ki in the filtered images of 11%. The mean deviation between the reconstruction methods for both unfiltered and filtered images was 1.3%. Agreement within ±2SD between the methods was demonstrated for perfusion rate constants up to 2.5 min⁻¹, corresponding to a perfusion of 3.4 ml g⁻¹ min⁻¹. The mean deviation between the two methods for unfiltered data was 2.7%, and for filtered data, 5.3%. The 18F-FDG uptake rate constants showed excellent agreement between the two reconstruction methods. In the perfusion range up to 3.4 ml g⁻¹ min⁻¹, agreement between 13NH3 perfusion obtained with OSEM and FBP was acceptable. The use of OSEM for measurement of perfusion values higher than 3.4 ml g⁻¹ min⁻¹ requires further evaluation. (orig.)
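    The ±2SD agreement criterion used above is the Bland-Altman method: pairwise differences are plotted against pairwise means, and agreement is judged by whether the bias ± 2SD limits are acceptably narrow. A minimal sketch with made-up paired Ki-like values, not the study's data:

    ```python
    import numpy as np

    def bland_altman(a, b):
        """Return (bias, lower limit, upper limit) for paired measurements,
        using mean difference +/- 2 SD of the differences."""
        a, b = np.asarray(a, float), np.asarray(b, float)
        diff = a - b
        bias, sd = diff.mean(), diff.std(ddof=1)
        return bias, bias - 2 * sd, bias + 2 * sd

    # hypothetical paired uptake rate constants from two reconstructions
    fbp  = [0.030, 0.042, 0.055, 0.061, 0.048]
    osem = [0.031, 0.041, 0.056, 0.060, 0.049]
    bias, lo, hi = bland_altman(osem, fbp)
    ```

    Unlike a correlation coefficient, this makes a systematic offset visible directly: the 11% Ki underestimation from the Gaussian filter reported above would appear as a non-zero bias even though the two methods remain highly correlated.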

  15. Iterative reconstruction technique vs filter back projection: utility for quantitative bronchial assessment on low-dose thin-section MDCT in patients with/without chronic obstructive pulmonary disease

    Koyama, Hisanobu; Seki, Shinichiro; Sugimura, Kazuro [Kobe University Graduate School of Medicine, Division of Radiology, Department of Radiology, Kobe, Hyogo (Japan); Ohno, Yoshiharu; Nishio, Mizuho; Matsumoto, Sumiaki; Yoshikawa, Takeshi [Kobe University Graduate School of Medicine, Advanced Biomedical Imaging Research Centre, Kobe (Japan); Kobe University Graduate School of Medicine, Division of Functional and Diagnostic Imaging Research, Department of Radiology, Kobe (Japan); Sugihara, Naoki [Toshiba Medical Systems Corporation, Ohtawara, Tochigi (Japan)

    2014-08-15

    The aim of this study was to evaluate the utility of the iterative reconstruction (IR) technique for quantitative bronchial assessment during low-dose computed tomography (CT) as a substitute for standard-dose CT in patients with/without chronic obstructive pulmonary disease. Fifty patients (mean age, 69.2; mean % predicted FEV1, 79.4) underwent standard-dose CT (150 mAs) and low-dose CT (25 mAs). Except for tube current, the imaging parameters were identical for both protocols. Standard-dose CT was reconstructed using filtered back-projection (FBP), and low-dose CT was reconstructed using IR and FBP. For quantitative bronchial assessment, the wall area percentage (WA%) of the sub-segmental bronchi and the airway luminal volume percentage (LV%) from the main bronchus to the peripheral bronchi were acquired in each dataset. The correlation and agreement of WA% and LV% between standard-dose CT and both low-dose CTs were statistically evaluated. WA% and LV% between standard-dose CT and both low-dose CTs were significantly correlated (r > 0.77, p < 0.00001); however, only the LV% agreement between SD-CT and low-dose CT reconstructed with IR was moderate (concordance correlation coefficient = 0.93); the other agreements were poor (concordance correlation coefficient <0.90). Quantitative bronchial assessment via low-dose CT has potential as a substitute for standard-dose CT by using IR and airway luminal volumetry techniques. • Quantitative bronchial assessment of COPD using low-dose CT is possible. (orig.)
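    The wall area percentage used for bronchial quantification above is the wall's cross-sectional area as a fraction of the total airway area, WA% = (total − lumen)/total × 100; the luminal volume percentage is the 3D analogue. A sketch for a hypothetical idealized circular airway (real measurements segment the wall and lumen from the image rather than assuming circularity):

    ```python
    import math

    def wall_area_percent(outer_diameter_mm, lumen_diameter_mm):
        """WA% for an idealized circular airway cross-section."""
        total = math.pi * (outer_diameter_mm / 2) ** 2
        lumen = math.pi * (lumen_diameter_mm / 2) ** 2
        return 100.0 * (total - lumen) / total

    # hypothetical sub-segmental bronchus: 6 mm outer, 4 mm lumen diameter
    print(round(wall_area_percent(6.0, 4.0), 1))  # 55.6
    ```

    Because the lumen boundary sits on a steep attenuation gradient, noise in low-dose images shifts the segmented lumen edge, which is why agreement with standard-dose CT depends so strongly on the reconstruction algorithm.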

  16. Image quality of low mA CT pulmonary angiography reconstructed with model based iterative reconstruction versus standard CT pulmonary angiography reconstructed with filtered back projection: an equivalency trial

    To determine whether CT pulmonary angiography (CTPA) using a low mA setting reconstructed with model-based iterative reconstruction (MBIR) is equivalent to routine CTPA reconstructed with filtered back projection (FBP). This prospective study was approved by the institutional review board and patients provided written informed consent. Eighty-two patients were examined with low mA MBIR-CTPA (100 kV, 20 mA) and 82 patients with standard FBP-CTPA (100 kV, 250 mA). Regions of interest were drawn in nine pulmonary vessels; signal-to-noise ratio (SNR) and contrast-to-noise ratio (CNR) were calculated. A five-point scale was used to subjectively evaluate the image quality of FBP-CTPA and low mA MBIR-CTPA. Compared to routine FBP-CTPA, low mA MBIR-CTPA showed no differences in the attenuation measured in nine pulmonary vessels, higher SNR (56 ± 19 vs 43 ± 20, p < 0.0001) and higher CNR (50 ± 17 vs 38 ± 18, p < 0.0001), despite a dose reduction of 93 % (p < 0.0001). The subjective image quality of low mA MBIR-CTPA was rated as diagnostic in 98 % of the cases for patients with a body mass index below 30 kg/m². Low mA MBIR-CTPA is equivalent to routine FBP-CTPA and allows a significant dose reduction while improving SNR and CNR in the pulmonary vessels compared with routine FBP-CTPA. (orig.)
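The SNR and CNR figures above are ROI statistics. The abstract does not give the exact formulas, so the common definitions below (vessel mean over vessel standard deviation, and vessel-background contrast over background noise) are an assumption, applied to invented HU samples:

```python
import numpy as np

def roi_snr(vessel_hu):
    # SNR of a region of interest: mean attenuation over its standard deviation
    vessel_hu = np.asarray(vessel_hu, dtype=float)
    return vessel_hu.mean() / vessel_hu.std()

def roi_cnr(vessel_hu, background_hu):
    # CNR: attenuation difference between vessel and background ROIs,
    # normalised by the background noise
    vessel_hu = np.asarray(vessel_hu, dtype=float)
    background_hu = np.asarray(background_hu, dtype=float)
    return (vessel_hu.mean() - background_hu.mean()) / background_hu.std()

# Invented HU samples for a pulmonary artery ROI and a chest-wall muscle ROI
vessel = np.array([380.0, 395.0, 372.0, 388.0, 401.0])
muscle = np.array([55.0, 48.0, 61.0, 52.0, 58.0])
snr = roi_snr(vessel)
cnr = roi_cnr(vessel, muscle)
```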

  17. Computed tomography of the cervical spine: comparison of image quality between a standard-dose and a low-dose protocol using filtered back-projection and iterative reconstruction

    To compare image quality of a standard-dose (SD) and a low-dose (LD) cervical spine CT protocol using filtered back-projection (FBP) and iterative reconstruction (IR). Forty patients investigated by cervical spine CT were prospectively randomised into two groups: SD (120 kVp, 275 mAs) and LD (120 kVp, 150 mAs), both applying automatic tube current modulation. Data were reconstructed using both FBP and sinogram-affirmed IR. Image noise, signal-to-noise (SNR) and contrast-to-noise (CNR) ratios were measured. Two radiologists independently and blindly assessed the following anatomical structures at C3-C4 and C6-C7 levels, using a four-point scale: intervertebral disc, content of neural foramina and dural sac, ligaments, soft tissues and vertebrae. They subsequently rated overall image quality using a ten-point scale. For both protocols and at each disc level, IR significantly decreased image noise and increased SNR and CNR, compared with FBP. SNR and CNR were statistically equivalent in LD-IR and SD-FBP protocols. Regardless of the dose and disc level, the qualitative scores with IR compared with FBP, and with LD-IR compared with SD-FBP, were significantly higher or not statistically different for intervertebral discs, neural foramina and ligaments, while significantly lower or not statistically different for soft tissues and vertebrae. The overall image quality scores were significantly higher with IR compared with FBP, and with LD-IR compared with SD-FBP. LD-IR cervical spine CT provides better image quality for intervertebral discs, neural foramina and ligaments, and worse image quality for soft tissues and vertebrae, compared with SD-FBP, while reducing radiation dose by approximately 40 %. (orig.)

  18. Iterative reconstruction technique vs filter back projection: utility for quantitative bronchial assessment on low-dose thin-section MDCT in patients with/without chronic obstructive pulmonary disease

    The aim of this study was to evaluate the utility of the iterative reconstruction (IR) technique for quantitative bronchial assessment during low-dose computed tomography (CT) as a substitute for standard-dose CT in patients with/without chronic obstructive pulmonary disease. Fifty patients (mean age, 69.2 years; mean % predicted FEV1, 79.4) underwent standard-dose CT (150 mAs) and low-dose CT (25 mAs). Except for tube current, the imaging parameters were identical for both protocols. Standard-dose CT was reconstructed using filtered back-projection (FBP), and low-dose CT was reconstructed using IR and FBP. For quantitative bronchial assessment, the wall area percentage (WA%) of the sub-segmental bronchi and the airway luminal volume percentage (LV%) from the main bronchus to the peripheral bronchi were acquired for each dataset. The correlation and agreement of WA% and LV% between standard-dose CT and both low-dose CTs were statistically evaluated. WA% and LV% between standard-dose CT and both low-dose CTs were significantly correlated (r > 0.77, p < 0.00001); however, only the LV% agreement between standard-dose CT and low-dose CT reconstructed with IR was moderate (concordance correlation coefficient = 0.93); the other agreements were poor (concordance correlation coefficient < 0.90). Quantitative bronchial assessment via low-dose CT has potential as a substitute for standard-dose CT when using IR and airway luminal volumetry techniques. • Quantitative bronchial assessment of COPD using low-dose CT is possible. (orig.)

  19. Clinical evaluation of image quality and radiation dose reduction in upper abdominal computed tomography using model-based iterative reconstruction; comparison with filtered back projection and adaptive statistical iterative reconstruction

    Highlights: • MBIR significantly improves objective image quality. • MBIR reduces the radiation dose by 87.5% without increasing objective image noise. • A half dose will be needed to maintain the subjective image quality. - Abstract: Purpose: To evaluate the image quality of upper abdominal CT images reconstructed with model-based iterative reconstruction (MBIR) in comparison with filtered back projection (FBP) and adaptive statistical iterative reconstruction (ASIR) on scans acquired with various radiation exposure dose protocols. Materials and methods: This prospective study was approved by our institutional review board, and informed consent was obtained from all 90 patients who underwent both control-dose (CD) and reduced-dose (RD) CT of the upper abdomen (unenhanced: n = 45, contrast-enhanced: n = 45). The RD scan protocol was randomly selected from three protocols; Protocol A: 12.5% dose, Protocol B: 25% dose, Protocol C: 50% dose. Objective image noise, signal-to-noise (SNR) ratio for the liver parenchyma, visual image score and lesion conspicuity were compared among CD images of FBP and RD images of FBP, ASIR and MBIR. Results: RD images of MBIR yielded significantly lower objective image noise and higher SNR compared with RD images of FBP and ASIR for all protocols (P < .01) and CD images of FBP for Protocol C (P < .05). Although the subjective image quality of RD images of MBIR was almost acceptable for Protocol C, it was inferior to that of CD images of FBP for Protocols A and B (P < .0083). The conspicuity of the small lesions in RD images of MBIR tended to be superior to that in RD images of FBP and ASIR and inferior to that in CD images for Protocols A and B, although the differences were not significant (P > .0083). Conclusion: Although 12.5%-dose MBIR images (mean size-specific dose estimates [SSDE] of 1.13 mGy) yielded objective image noise and SNR comparable to CD-FBP images, at least a 50% dose (mean SSDE of 4.63 mGy) would be needed to

  18. Impact of hybrid iterative reconstruction on Agatston coronary artery calcium scores in comparison to filtered back projection in native cardiac CT

    Obmann, V.C.; Heverhagen, J.T. [Inselspital - University Hospital Bern (Switzerland). University Inst. for Diagnostic, Interventional and Pediatric Radiology; Klink, T. [Wuerzburg Univ. (Germany). Inst. of Diagnostic and Interventional Radiology; Stork, A.; Begemann, P.G.C. [Roentgeninstitut Duesseldorf, Duesseldorf (Germany); Laqmani, A.; Adam, G. [University Medical Center Hamburg-Eppendorf, Hamburg (Germany). Dept. of Diagnostic and Interventional Radiology

    2015-05-15

    To investigate whether the effects of hybrid iterative reconstruction (HIR) on coronary artery calcium (CAC) measurements using the Agatston score lead to changes in the assignment of patients to cardiovascular risk groups compared to filtered back projection (FBP). 68 patients (mean age 61.5 years; 48 male, 20 female) underwent prospectively ECG-gated, non-enhanced cardiac 256-MSCT for coronary calcium scoring. Scanning parameters were as follows: tube voltage, 120 kV; mean tube current-time product, 63.67 mAs (50 - 150 mAs); collimation, 2 x 128 x 0.625 mm. Images were reconstructed with FBP and with HIR at all levels (L1 to L7). Two independent readers measured Agatston scores of all reconstructions and assigned patients to cardiovascular risk groups. Scores of HIR and FBP reconstructions were correlated (Spearman). Interobserver agreement and variability were assessed with kappa statistics and Bland-Altman plots. Agatston scores of HIR reconstructions were closely correlated with FBP reconstructions (L1, R = 0.9996; L2, R = 0.9995; L3, R = 0.9991; L4, R = 0.986; L5, R = 0.9986; L6, R = 0.9987; and L7, R = 0.9986). In comparison to FBP, HIR reduced Agatston scores to between 97 % (L1) and 87.4 % (L7) of the FBP values. Using HIR iteration levels L1 - L3, all patients were assigned to the same risk groups as after FBP reconstruction. In 5.4 % of patients, the risk group after HIR at the maximum iteration level differed from the group after FBP reconstruction. There was an excellent correlation of Agatston scores after HIR and FBP, with identical risk group assignment at levels 1 - 3 for all patients. Hence it appears that the application of HIR in routine calcium scoring does not entail any disadvantages. Future studies are needed to demonstrate whether HIR is a reliable method for reducing radiation dose in coronary calcium scoring.
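The risk-group assignment step can be sketched as a simple threshold mapping. The study does not state its cut-offs, so the widely quoted Agatston categories below (0, 1-10, 11-100, 101-400, >400) are an assumption for illustration:

```python
def agatston_risk_group(score):
    """Map an Agatston calcium score to a commonly used risk category.

    The cut-offs (0, 1-10, 11-100, 101-400, >400) are the widely quoted
    categories; the study above does not state which grouping it used,
    so treat these thresholds as an assumption.
    """
    if score < 0:
        raise ValueError("Agatston score cannot be negative")
    if score == 0:
        return "no identifiable calcium"
    if score <= 10:
        return "minimal"
    if score <= 100:
        return "mild"
    if score <= 400:
        return "moderate"
    return "severe"

# A ~10% HIR-induced score reduction only changes the assigned group
# when it pushes the score across a threshold:
fbp_score = 105.0
hir_score = fbp_score * 0.90   # cf. the 87.4%-97% range reported above
groups = (agatston_risk_group(fbp_score), agatston_risk_group(hir_score))
```

This illustrates why only a small fraction of patients (5.4 % at the strongest iteration level) changed groups: a systematic score reduction matters only near a category boundary.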

  1. Feasible Dose Reduction in Routine Chest Computed Tomography Maintaining Constant Image Quality Using the Last Three Scanner Generations: From Filtered Back Projection to Sinogram-affirmed Iterative Reconstruction and Impact of the Novel Fully Integrated Detector Design Minimizing Electronic Noise

    Lukas Ebner

    2014-01-01

    Objective: The aim of the present study was to evaluate dose reduction in contrast-enhanced chest computed tomography (CT) by comparing the three latest generations of Siemens CT scanners used in clinical practice. We analyzed the amount of radiation used with filtered back projection (FBP) and an iterative reconstruction (IR) algorithm to yield the same image quality. Furthermore, the influence on radiation dose of the most recent integrated circuit detector (ICD; Stellar detector, Siemens Healthcare, Erlangen, Germany) was investigated. Materials and Methods: 136 patients were included. Scan parameters were set to a routine thorax protocol: SOMATOM Sensation 64 (FBP), SOMATOM Definition Flash (IR), and SOMATOM Definition Edge (ICD and IR). Tube current was set constantly to the reference level of 100 mAs, with automated tube current modulation using reference milliamperes. CARE kV was used on the Flash and Edge scanners, while tube potential was individually selected between 100 and 140 kVp by the medical technologists at the SOMATOM Sensation. Quality assessment was performed on soft-tissue kernel reconstructions. Dose was represented by the dose-length product (DLP). Results: The DLP with FBP for the average chest CT was 308 ± 99.6 mGy·cm. In contrast, the DLP for chest CT with the IR algorithm was 196.8 ± 68.8 mGy·cm (P = 0.0001). A further decline in dose was noted with IR and the ICD: DLP 166.4 ± 54.5 mGy·cm (P = 0.033). The dose reduction compared to FBP was 36.1 % with IR and 45.6 % with IR/ICD. Signal-to-noise ratio (SNR) was favorable in the aorta, bone, and soft tissue for IR/ICD in combination compared to FBP (P values ranged from 0.003 to 0.048). Overall contrast-to-noise ratio (CNR) improved with declining DLP. Conclusion: The most recent technical developments, namely IR in combination with integrated circuit detectors, can significantly lower the radiation dose in chest CT examinations.

  2. Near-Field Tomography Imaging Based on a Confocal Circular Filtered Back-Projection Algorithm

    田八林; 杨恒; 罗永健; 李克昭; 陈瑶

    2014-01-01

    Tomographic synthetic aperture radar (SAR) can realize 3D imaging with a simple algorithm and a small computational load, but it is mainly suitable for far-field conditions; moreover, improving resolution by increasing the number of baselines greatly increases system complexity. To address these problems, a near-field tomographic imaging technique is proposed that uses confocal circular SAR on a partial-circle or curved orbit. The two techniques of synthetic aperture and confocal imaging are combined to achieve space-time confocal imaging: by modifying the mapping relation, the filtered back-projection algorithm for partial-circle scanning, usually used in two-dimensional imaging, is extended to confocal tomography, focusing at each target height layer in turn. Digital simulation and laboratory test results show that the proposed algorithm overcomes height ambiguity and can extract quasi-3D information from a target scene.

  3. Non-traditional Machining Techniques for Fabricating Metal Aerospace Filters

    Wang Wei; Zhu Di; D.M.Allen; H.J.A.Almondb

    2008-01-01

    Thanks to recent advances in manufacturing technology, aerospace system designers have many more options for fabricating high-quality, low-weight, high-capacity, cost-effective filters. Aside from traditional methods such as stamping, drilling and milling, many new approaches have been widely used in filter-manufacturing practice on account of their increased processing abilities. However, restrictions on cost, the need to operate under stricter conditions such as in aggressive fluids, the complexity of designs, the workability of materials, and other factors have made it difficult to choose a satisfactory method from among the newly developed processes, such as photochemical machining (PCM), photo electroforming (PEF) and laser beam machining (LBM), to produce small, inexpensive, lightweight aerospace filters. This article appraises the technical and economic viability of PCM, PEF, and LBM to help engineers choose the most suitable approach for producing aerospace filters.

  4. Reconstruction of CT images by the Bayes- back projection method

    Haruyama, M; Takase, M; Tobita, H

    2002-01-01

    In the course of research on the quantitative assay of non-destructive measurements of radioactive waste, the authors have developed a unique program based on Bayesian theory for the reconstruction of transmission computed tomography (TCT) images. The reconstruction of cross-section images in CT usually employs the Filtered Back Projection method. The new image reconstruction program reported here is based on the Bayesian Back Projection method, and it iteratively improves the image with every measurement step. Namely, this method is capable of promptly displaying a cross-section image corresponding to each angled projection acquired during measurement. Hence, it is possible to observe an improved cross-section view that reflects each projection in almost real time. From the basic theory of the Bayesian Back Projection method, it can be applied not only to CT of the 1st generation but also to the 2nd and 3rd generations. This report deals with a reconstruction program of cross-section images in the CT of ...

  5. Image reconstruction of simulated specimens using convolution back projection

    Mohd. Farhan Manzoor

    2001-04-01

    This paper reports on the reconstruction of cross-sections of composite structures. The convolution back projection (CBP) algorithm has been used to capture the attenuation field over the specimen. Five different test cases, representing varying degrees of complexity, have been taken up for evaluation. In addition, the role of filters in the nature of the reconstruction errors is also discussed. Numerical results obtained in the study reveal that the CBP algorithm is a useful tool for qualitative as well as quantitative assessment of composite regions encountered in engineering applications.
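As a concrete illustration of the convolution/filtered back-projection family discussed above, the following numpy-only sketch forward-projects a disc phantom, applies a Ram-Lak ramp filter, and back-projects. It is a didactic toy (nearest-neighbour sampling, parallel-beam geometry), not the implementation used in the paper:

```python
import numpy as np

def forward_project(img, angles):
    """Parallel-beam projections via nearest-neighbour rotation (sketch)."""
    n = img.shape[0]
    c = (n - 1) / 2.0
    ys, xs = np.mgrid[0:n, 0:n]
    xs, ys = xs - c, ys - c
    sino = np.zeros((len(angles), n))
    for i, a in enumerate(angles):
        # sample the image on a grid rotated by -a, then sum along rows
        xr = np.clip(np.round(xs * np.cos(a) + ys * np.sin(a) + c), 0, n - 1).astype(int)
        yr = np.clip(np.round(-xs * np.sin(a) + ys * np.cos(a) + c), 0, n - 1).astype(int)
        sino[i] = img[yr, xr].sum(axis=0)
    return sino

def filtered_back_projection(sino, angles):
    """Ram-Lak filtered back-projection matching forward_project's geometry."""
    n_ang, n = sino.shape
    c = (n - 1) / 2.0
    ramp = np.abs(np.fft.fftfreq(n))   # Ram-Lak filter in the frequency domain
    ys, xs = np.mgrid[0:n, 0:n]
    xs, ys = xs - c, ys - c
    recon = np.zeros((n, n))
    for i, a in enumerate(angles):
        proj = np.real(np.fft.ifft(np.fft.fft(sino[i]) * ramp))
        # detector coordinate of each pixel for this viewing angle
        t = np.clip(np.round(xs * np.cos(a) - ys * np.sin(a) + c), 0, n - 1).astype(int)
        recon += proj[t]
    return recon * np.pi / (2 * n_ang)

# Disc phantom reconstructed from 90 views over 180 degrees
n = 64
yy, xx = np.mgrid[0:n, 0:n] - (n - 1) / 2.0
phantom = ((xx ** 2 + yy ** 2) < (n / 4) ** 2).astype(float)
angles = np.linspace(0.0, np.pi, 90, endpoint=False)
recon = filtered_back_projection(forward_project(phantom, angles), angles)
```

The choice of filter (the plain Ram-Lak ramp here, or apodized variants such as Shepp-Logan) is exactly the "role of filters" the paper discusses: it trades noise amplification against sharpness of the reconstruction.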

  6. Traditional Medicine Through the Filter of Modernity: A brief historical analysis

    R. Rabarihoela Razafimandimby

    2014-12-01

    Traditional medicine still prevails in the current Malagasy context. A careful historical analysis shows, however, that Malagasy traditional medicine was screened through many filters before being accepted in a global context. Traditional medicine in its authentic form was more or less rejected with the advent of modern medicine, although not without reaction. This paper retraces the historical encounter between the modern and the traditional to determine the extent to which traditional medicine is acknowledged and used in the current prevailing modern, rational and scientific global context.

  7. Complete Localization of HVDC Back-to-Back Project Realized

    Yu Xinqiang; Liang Xuming; Wang Zuli; Ye Qing

    2006-01-01

    The first completely localized DC back-to-back project for the asynchronous interconnection between Northwest and Central China plays an important role in realizing national power grid interconnection, spurring indigenous manufacturing industries and promoting DC transmission equipment. Adhering to the principle of autonomous innovation, this project relied on domestic forces in every aspect, from engineering organization, system design, equipment completion, engineering design, equipment manufacturing and procurement to construction and debugging. Having passed strict quality control, intermediate supervision, and acceptance testing and assessment, the project has been proven to be at an advanced world level.

  8. Acceleration of iterative tomographic image reconstruction by reference-based back projection

    Cheng, Chang-Chieh; Li, Ping-Hui; Ching, Yu-Tai

    2016-03-01

    The purpose of this paper is to design and implement an efficient iterative reconstruction algorithm for computed tomography. We accelerate the reconstruction speed of the algebraic reconstruction technique (ART), an iterative reconstruction method, by using the result of filtered back-projection (FBP), a widely used analytical reconstruction algorithm, as the initial guess for the first iteration and as the reference for each back projection stage, respectively. Both improvements reduce the error between the forward projection of each iteration and the measurements. We use three quantitative metrics, root-mean-square error (RMSE), peak signal-to-noise ratio (PSNR), and structural content (SC), to show that our method reduces the number of iterations by more than half and that the quality of the result is better than that of the original ART.
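The acceleration idea, seeding an algebraic method with an analytically reconstructed image, can be sketched on a toy linear system. The "FBP seed" below is stood in for by a rescaled unmatched back-projection (a full FBP is beside the point here), and the system matrix is invented:

```python
import numpy as np

def art_sweep(A, b, x, relax=1.0, sweeps=5):
    """Kaczmarz/ART: cycle over rows, projecting x onto each row's hyperplane."""
    A = np.asarray(A, dtype=float)
    b = np.asarray(b, dtype=float)
    x = np.asarray(x, dtype=float).copy()
    row_norms = (A ** 2).sum(axis=1)
    for _ in range(sweeps):
        for i in range(A.shape[0]):
            if row_norms[i] > 0:
                x += relax * (b[i] - A[i] @ x) / row_norms[i] * A[i]
    return x

rng = np.random.default_rng(0)
A = rng.random((40, 25))          # toy projection matrix (40 rays, 25 pixels)
x_true = rng.random(25)
b = A @ x_true                    # consistent, noise-free measurements

x0 = np.zeros(25)                 # cold start
# crude stand-in for an FBP seed: unmatched back-projection, rescaled to
# minimise the residual along its own direction
seed = A.T @ b
seed *= (b @ (A @ seed)) / np.dot(A @ seed, A @ seed)
x_cold = art_sweep(A, b, x0)
x_warm = art_sweep(A, b, seed)
```

Because Kaczmarz projections never increase the distance to the solution of a consistent system, a seed that starts closer leaves less error for the sweeps to remove, which is the source of the reported iteration savings.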

  9. Camera calibration based on the back projection process

    Gu, Feifei; Zhao, Hong; Ma, Yueyang; Bu, Penghui

    2015-12-01

    Camera calibration plays a crucial role in 3D measurement tasks of machine vision. In typical calibration processes, camera parameters are iteratively optimized in the forward imaging process (FIP). However, the results can only guarantee the minimum of 2D projection errors on the image plane, but not the minimum of 3D reconstruction errors. In this paper, we propose a universal method for camera calibration, which uses the back projection process (BPP). In our method, a forward projection model is used to obtain initial intrinsic and extrinsic parameters with a popular planar checkerboard pattern. Then, the extracted image points are projected back into 3D space and compared with the ideal point coordinates. Finally, the estimation of the camera parameters is refined by a non-linear function minimization process. The proposed method can obtain a more accurate calibration result, which is more physically useful. Simulation and practical data are given to demonstrate the accuracy of the proposed method.

  10. Camera calibration based on the back projection process

    Camera calibration plays a crucial role in 3D measurement tasks of machine vision. In typical calibration processes, camera parameters are iteratively optimized in the forward imaging process (FIP). However, the results can only guarantee the minimum of 2D projection errors on the image plane, but not the minimum of 3D reconstruction errors. In this paper, we propose a universal method for camera calibration, which uses the back projection process (BPP). In our method, a forward projection model is used to obtain initial intrinsic and extrinsic parameters with a popular planar checkerboard pattern. Then, the extracted image points are projected back into 3D space and compared with the ideal point coordinates. Finally, the estimation of the camera parameters is refined by a non-linear function minimization process. The proposed method can obtain a more accurate calibration result, which is more physically useful. Simulation and practical data are given to demonstrate the accuracy of the proposed method. (paper)
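The back projection process described above can be illustrated with a toy pinhole model: project known planar points into the image, back-project the pixels as rays, intersect them with the calibration plane, and measure the 3D residual, which vanishes for exact parameters. The intrinsics and pose below are invented, and this is not the authors' calibration pipeline:

```python
import numpy as np

# Hypothetical pinhole intrinsics (fx, fy, cx, cy) and a camera pose
K = np.array([[800.0, 0.0, 320.0],
              [0.0, 800.0, 240.0],
              [0.0, 0.0, 1.0]])
R = np.eye(3)
t = np.array([0.0, 0.0, 5.0])      # camera 5 units in front of the plane

# Checkerboard-like points on the world plane z = 0
world = np.array([[x, y, 0.0] for x in range(4) for y in range(3)], dtype=float)

def project(points):
    """Forward imaging process: world -> pixel coordinates."""
    cam = points @ R.T + t
    uvw = cam @ K.T
    return uvw[:, :2] / uvw[:, 2:3]

def back_project_to_plane(pixels):
    """Back projection process: pixel -> ray -> intersection with z = 0."""
    rays_cam = np.hstack([pixels, np.ones((len(pixels), 1))]) @ np.linalg.inv(K).T
    origin = -R.T @ t                      # camera centre in world coordinates
    rays_world = rays_cam @ R              # rotate rays into the world frame
    s = -origin[2] / rays_world[:, 2]      # scale each ray so that z = 0
    return origin + s[:, None] * rays_world

recon = back_project_to_plane(project(world))
err3d = np.linalg.norm(recon - world, axis=1)
```

In the calibration refinement the paper describes, it is this 3D residual (rather than the 2D reprojection error of the forward process) that the non-linear minimisation drives toward zero.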

  11. Locations and focal mechanisms of deep long period events beneath Aleutian Arc volcanoes using back projection methods

    Lough, A. C.; Roman, D. C.; Haney, M. M.

    2015-12-01

    Deep long period (DLP) earthquakes are commonly observed in volcanic settings such as the Aleutian Arc in Alaska. DLPs are poorly understood but are thought to be associated with movements of fluids, such as magma or hydrothermal fluids, deep in the volcanic plumbing system. These events have been recognized for several decades, but few studies have gone beyond their identification and location. All long period events are more difficult to identify and locate than volcano-tectonic (VT) earthquakes because traditional detection schemes focus on high frequency (short period) energy. In addition, DLPs present analytical challenges because they tend to be emergent, making it difficult to accurately pick the onset of arriving body waves. We now expect to find DLPs at most volcanic centers; the challenge lies in identification and location. We aim to reduce the element of human error in location by applying back projection to better constrain the depth and horizontal position of these events. Power et al. (2004) provided the first compilation of DLP activity in the Aleutian Arc. This study focuses on the reanalysis of 162 cataloged DLPs beneath 11 volcanoes in the Aleutian arc (we expect to ultimately identify and reanalyze more DLPs). We are currently adapting the approach of Haney (2014) for volcanic tremor to use back projection over a 4D grid to determine the position and origin time of DLPs. This method holds great potential in that it will allow automated, high-accuracy picking of arrival times and could reduce the number of arrival time picks necessary for traditional location schemes to well constrain event origins. Back projection can also estimate a relative focal mechanism (difficult with traditional methods due to the emergent nature of DLPs), allowing the first in-depth analysis of source properties. Our event catalog, spanning over 25 years and 11 volcanoes, is one of the longest and largest, and enables us to investigate spatial and temporal variation in DLPs.

  12. Imaging Seismic Source Variations Using Back-Projection Methods at El Tatio Geyser Field, Northern Chile

    Kelly, C. L.; Lawrence, J. F.

    2014-12-01

    During October 2012, 51 geophones and 6 broadband seismometers were deployed in an ~50x50m region surrounding a periodically erupting columnar geyser in the El Tatio Geyser Field, Chile. The dense array served as the seismic framework for a collaborative project to study the mechanics of complex hydrothermal systems. Contemporaneously, complementary geophysical measurements (including down-hole temperature and pressure, discharge rates, thermal imaging, water chemistry, and video) were also collected. Located on the western flanks of the Andes Mountains at an elevation of 4200m, El Tatio is the third largest geyser field in the world. Its non-pristine condition makes it an ideal location to perform minimally invasive geophysical studies. The El Jefe Geyser was chosen for its easily accessible conduit and extremely periodic eruption cycle (~120s). During approximately 2 weeks of continuous recording, we recorded ~2500 nighttime eruptions which lack cultural noise from tourism. With ample data, we aim to study how the source varies spatially and temporally during each phase of the geyser's eruption cycle. We are developing a new back-projection processing technique to improve source imaging for diffuse signals. Our method was previously applied to the Sierra Negra Volcano system, which also exhibits repeating harmonic and diffuse seismic sources. We back-project correlated seismic signals from the receivers back to their sources, assuming linear source to receiver paths and a known velocity model (obtained from ambient noise tomography). We apply polarization filters to isolate individual and concurrent geyser energy associated with P and S phases. We generate 4D, time-lapsed images of the geyser source field that illustrate how the source distribution changes through the eruption cycle. We compare images for pre-eruption, co-eruption, post-eruption and quiescent periods. We use our images to assess eruption mechanics in the system (i.e. top-down vs. bottom-up) and

  13. SAR focusing of P-band ice sounding data using back-projection

    Kusk, Anders; Dall, Jørgen

    2010-01-01

    accommodated at the expense of computation time. The back-projection algorithm can be easily parallelized, however, and can advantageously be implemented on a graphics processing unit (GPU). Results from using the back-projection algorithm on POLARIS ice sounder data from North Greenland show that the quality...... of data is improved by the processing, and the performance of the GPU implementation allows for very fast focusing....

  14. Super-resolution Reconstruction Algorithm Based on Patch Similarity and Back-projection Modification

    Wei-long Chen; Li Guo; Wu He; Wei Wu; Xiao-min Yang

    2014-01-01

    We propose an effective super-resolution reconstruction algorithm based on patch similarity and back-projection modification. In the proposed algorithm, we assume patches to be self-similar in natural images and extract the high-frequency information from the best similar patch to add to the goal high-resolution image. In the process of reconstruction, the high-resolution patch is back-projected onto the low-resolution patch so as to gain detail modification. Experiments performed on simulated low-r...

  15. Image Resolution Enhancement by Using Interpolation Followed by Iterative Back Projection

    Rasti, Pejman; Hasan DEMIREL; Anbarjafari, Gholamreza

    2016-01-01

    In this paper, we propose a new super resolution technique based on interpolation followed by registration using iterative back projection (IBP). Low resolution images are interpolated, and the interpolated images are then registered in order to generate a sharper high resolution image. The proposed technique has been tested on Lena, Elaine, Pepper, and Baboon. The quantitative peak signal-to-noise ratio (PSNR) and structural similarity index (SSIM) results as well as the v...
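A minimal iterative back projection loop can be sketched under an assumed 2x2 block-average imaging model; the registration step from the paper is omitted and the images are synthetic:

```python
import numpy as np

def downsample(img, f=2):
    """Simulated imaging model: f x f block averaging (an assumed model)."""
    h, w = img.shape
    return img.reshape(h // f, f, w // f, f).mean(axis=(1, 3))

def upsample(img, f=2):
    """Nearest-neighbour upsampling used to back-project the LR error."""
    return np.repeat(np.repeat(img, f, axis=0), f, axis=1)

def iterative_back_projection(lr, f=2, iters=20, step=1.0):
    """IBP: refine an HR estimate so its simulated LR version matches the input."""
    hr = upsample(lr, f)                   # initial interpolation
    for _ in range(iters):
        err = lr - downsample(hr, f)       # residual in LR space
        hr = hr + step * upsample(err, f)  # back-project the residual
    return hr

rng = np.random.default_rng(1)
hr_true = rng.random((16, 16))
lr = downsample(hr_true)                   # observed low-resolution image
hr_est = iterative_back_projection(lr)
final_residual = np.abs(lr - downsample(hr_est)).max()
```

Each pass simulates the low-resolution image from the current estimate and back-projects the residual; with this simple model the simulated LR image matches the observation after the first pass, while real IBP pipelines iterate because blur, registration, and noise make the model imperfect.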

  16. Comparison of back projection methods of determining earthquake rupture process in time and frequency domains

    Wang, W.; Wen, L.

    2013-12-01

    Back projection is a method that projects the seismic energy recorded by a seismic array back to the earthquake source region to determine the rupture process of a large earthquake. The method takes advantage of the coherence of seismic energy across a seismic array and is quick in determining some important properties of the earthquake source. The method can be performed in both the time and frequency domains. In the time domain, the most conventional procedure is beam forming with some measure of noise suppression, such as Nth root stacking. In the frequency domain, the multiple signal classification (MUSIC) method estimates the directions of arrival of multiple waves propagating through an array using a subspace method. The advantage of this method is the ability to study rupture properties at various frequencies and to resolve simultaneous arrivals, making it suitable for detecting bilateral rupture of an earthquake source. We present a comparison of back projection results for some large earthquakes between the time-domain and frequency-domain methods. The time-domain procedure produces an image that is smeared and exhibits some artifacts, although enhanced stacking methods can to some extent alleviate the problem. On the other hand, the MUSIC method resolves clear multiple arrivals and provides higher resolution rupture imaging.
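The time-domain procedure (delay-and-sum beam forming with Nth-root stacking) can be sketched in one dimension. The stations, velocity, and waveforms below are synthetic, and the geometry (a homogeneous medium with straight rays) is an assumption:

```python
import numpy as np

# Synthetic setup: stations on a line, one impulsive source, constant velocity
v = 3.0                                               # km/s, assumed homogeneous
stations = np.array([0.0, 10.0, 25.0, 40.0, 60.0])    # station positions, km
src = 22.0                                            # true source position, km
fs = 50.0                                             # samples per second
t = np.arange(0, 40.0, 1.0 / fs)

def gauss_pulse(t, t0, width=0.3):
    return np.exp(-0.5 * ((t - t0) / width) ** 2)

# Each trace records a pulse delayed by its travel time from the source
traces = np.array([gauss_pulse(t, abs(s - src) / v) for s in stations])

def back_project(traces, grid, nth_root=1.0):
    """Delay-and-sum back projection with optional Nth-root stacking."""
    power = np.zeros(len(grid))
    for k, g in enumerate(grid):
        stack = np.zeros_like(t)
        for tr, s in zip(traces, stations):
            shift = int(round(abs(s - g) / v * fs))    # align by travel time
            aligned = np.roll(tr, -shift)
            # Nth-root stacking suppresses incoherent energy
            stack += np.sign(aligned) * np.abs(aligned) ** (1.0 / nth_root)
        stack = np.sign(stack) * np.abs(stack) ** nth_root
        power[k] = stack.max()
    return power

grid = np.arange(0.0, 60.0, 1.0)
est = grid[np.argmax(back_project(traces, grid, nth_root=2.0))]
```

Raising each trace to the 1/N power before stacking and restoring the power afterwards (N = 2 here) is one of the noise-suppression measures mentioned above; the MUSIC-style frequency-domain subspace processing is not shown.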

  17. Imaging spatial and temporal seismic source variations at Sierra Negra Volcano, Galapagos Islands using back-projection methods

    Kelly, C. L.; Lawrence, J. F.; Ebinger, C. J.

    2013-12-01

    Low-magnitude seismic signals generated by processes that characterize volcanic and hydrothermal systems and their plumbing networks are difficult to observe remotely. Seismic records from these systems tend to be extremely 'noisy', making it difficult to resolve 3D subsurface structures using traditional seismic methods. Easily identifiable high-amplitude bursts within the noise that might be suitable for use with traditional seismic methods (i.e. eruptions) tend to occur relatively infrequently compared to the length of an entire eruptive cycle. Furthermore, while these impulsive events might help constrain the dynamics of a particular eruption, they shed little insight into the mechanisms that occur throughout an entire eruption sequence. It has been shown, however, that the much more abundant low-amplitude seismic 'noise' in these records (i.e. volcanic or geyser 'tremor') actually represents a series of overlapping low-magnitude displacements that can be directly linked to magma, fluid, and volatile movement at depth. This 'noisy' data therefore likely contains valuable information about the processes occurring in the volcanic or hydrothermal system before, during and after eruption events. In this study, we present a new method to comprehensively study how the seismic source distribution of all events - including micro-events - evolves during different phases of the eruption sequence of Sierra Negra Volcano in the Galapagos Islands. We apply a back-projection search algorithm to image sources of seismic 'noise' at Sierra Negra Volcano during a proposed intrusion event.
By analyzing

  18. Tradition, tradition

    Rockman, Howard A.

    2012-01-01

    Starting with this issue, the Editorial duties for the JCI move to Duke University and the University of North Carolina at Chapel Hill. As we begin our five-year tenure at the helm of this prestigious journal, the tradition of excellence that these two schools typically display on the basketball court now enters the editorial boardroom. PMID:22378046

  19. External force back-projective composition and globally deformable optimization for 3-D coronary artery reconstruction

    The clinical value of the 3D reconstruction of a coronary artery is important for the diagnosis and intervention of cardiovascular diseases. This work proposes a method based on a deformable model for reconstructing coronary arteries from two monoplane angiographic images acquired from different angles. First, an external force back-projective composition model is developed to determine the external force, for which the force distributions in different views are back-projected to the 3D space and composited in the same coordinate system based on the perspective projection principle of x-ray imaging. The elasticity and bending forces are composited as an internal force to maintain the smoothness of the deformable curve. Second, the deformable curve evolves rapidly toward the true vascular centerlines in 3D space and angiographic images under the combination of internal and external forces. Third, densely matched correspondence among vessel centerlines is constructed using a curve alignment method. The bundle adjustment method is then utilized for the global optimization of the projection parameters and the 3D structures. The proposed method is validated on phantom data and routine angiographic images with consideration for space and re-projection image errors. Experimental results demonstrate the effectiveness and robustness of the proposed method for the reconstruction of coronary arteries from two monoplane angiographic images. The proposed method can achieve a mean space error of 0.564 mm and a mean re-projection error of 0.349 mm. (paper)

  20. A new linear back projection algorithm to electrical tomography based on measuring data decomposition

    As an advanced measurement technique that is non-radiant, non-intrusive, fast-responding, and low-cost, electrical tomography (ET) has developed rapidly in recent decades, and the imaging algorithm plays an important role in the ET imaging process. Linear back projection (LBP) is the most widely used ET algorithm owing to its dynamic imaging process, real-time response, and easy realization, but it suffers from low spatial resolution due to the inherent 'soft field' effect and the ill-posedness of the inverse problem, which greatly limits its applicable range. In this paper, an original data decomposition method is proposed: each ET measurement is decomposed into two independent new measurements based on the positive and negative sensing areas of that measurement. Consequently, the total number of measurements is doubled, which effectively reduces the ill-posedness. In addition, an index quantifying the 'soft field' effect is proposed; it shows that the decomposed data can distinguish the different contributions of individual units (pixels) to any ET measurement and can efficiently reduce the 'soft field' effect in the imaging process. Based on this data decomposition, a new linear back projection algorithm is proposed to improve the spatial resolution of the ET image. A series of simulations and experiments validate the proposed algorithm in terms of real-time performance and improved spatial resolution. (paper)
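
    The baseline LBP step that the abstract builds on can be illustrated with a minimal NumPy sketch (a hedged illustration, not the authors' implementation; the sensitivity matrix and measurement values below are invented for the toy example):

```python
import numpy as np

def linear_back_projection(S, lam):
    """Linear back projection (LBP) for electrical tomography.

    S   : (M, N) sensitivity matrix mapping N pixels to M measurements
    lam : (M,) normalized measurement vector

    Each pixel value is the sensitivity-weighted sum of the measurements,
    normalized by the total sensitivity of that pixel.
    """
    num = S.T @ lam                 # back-project measurements into the image
    den = S.T @ np.ones(len(lam))   # per-pixel sensitivity normalization
    return num / den

# Toy example: 3 measurements, 4 pixels (hypothetical values)
S = np.array([[1.0, 0.5, 0.2, 0.1],
              [0.2, 1.0, 0.5, 0.2],
              [0.1, 0.2, 0.5, 1.0]])
lam = np.array([0.9, 0.4, 0.1])
g = linear_back_projection(S, lam)
```

    Each pixel simply accumulates measurements weighted by its sensitivity, which is the source of both LBP's speed and its limited spatial resolution; the paper's decomposition doubles the rows of S before this step.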

  1. Imaging the ruptures of the 2009 Samoan and Sumatran earthquakes using broadband network back-projections: Results and limitations

    Hutko, A. R.; Lay, T.; Koper, K. D.

    2009-12-01

    Applications of teleseismic P-wave back-projection to image gross characteristics of large earthquake finite-source ruptures have been enabled by ready availability of large digital data sets. Imaging with short-period data from dense arrays or broadband data from global networks can place constraints on rupture attributes that otherwise have to be treated parametrically in conventional modeling and inversion procedures. Back-projection imaging may constrain choice of fault plane and rupture direction, velocity, duration and length for large (M>~8.0) earthquakes, and can robustly locate early aftershocks embedded in mainshock surface waves. Back-projection methods seek locations of coherent energy release from the source region, ideally associated with down-going P wave energy. For shallow events, depth phase arrivals can produce artifacts in back-projection images that appear as secondary or even prominent features with incorrect apparent source locations and times, and such effects need to be recognized. We apply broadband P-wave back-projection imaging to the 29 September 2009 Samoa (Mw8.2) and 30 September 2009 Sumatra (Mw7.6) earthquakes using data from globally distributed broadband stations and compare results to back-projections of synthetic seismograms from finite-source models for these events to evaluate the artifacts from depth phases. Back-projection images for the great normal-faulting Samoa event feature two prominent bright spots, which could be interpreted to correspond to two distinct slip patches, one near the epicenter in the outer trench slope and the other approximately 80 km to the west near the plate boundary megathrust where many aftershocks occurred. This interpretation is at odds with finite-fault modeling results, which indicate a predominantly bilateral rupture in the NW-SE direction on a steeply dipping trench slope fault, with rupture extending about 60 km in each direction. Back-projections of data and synthetic seismograms from the

  2. A comparative study between matched and mis-matched projection/back projection pairs used with ASIRT reconstruction method

    Algebraic reconstruction techniques require both forward projection and back projection operators. The ability to perform accurate reconstruction relies fundamentally on these two methods, which are usually the transpose of each other. Even though mis-matched pairs may introduce additional errors during the iterative process, the usefulness of mis-matched projector/back-projector pairs has been proved in image reconstruction. This work investigates the performance of matched and mis-matched reconstruction pairs using popular forward projectors and their transposes in reconstruction tasks with additive simultaneous iterative reconstruction techniques (ASIRT) in a parallel-beam approach. Simulated noiseless phantoms are used to compare the performance of the investigated pairs in terms of the root mean squared error (RMSE), calculated between reconstructed slices and the reference in different regions. Results show that mis-matched projection/back-projection pairs can yield more accurate reconstructed images than matched ones. The performance of the forward projection operator appears independent of the choice of the back projection operator, and vice versa.

  3. A fast GPU-based approach to branchless distance-driven projection and back-projection in cone beam CT

    Schlifske, Daniel; Medeiros, Henry

    2016-03-01

    Modern CT image reconstruction algorithms rely on projection and back-projection operations to refine an image estimate in iterative image reconstruction. A widely used state-of-the-art technique is distance-driven projection and back-projection. While the distance-driven technique yields superior image quality in iterative algorithms, it is computationally demanding, which limits the algorithms' relevance in clinical settings. A few methods have been proposed for accelerating the distance-driven technique on modern computer hardware. This paper explores a two-dimensional extension of the branchless method proposed by Samit Basu and Bruno De Man. The extension is named "pre-integration" because it achieves a significant performance boost by integrating the data before the projection and back-projection operations. It was written on Nvidia's CUDA platform and carefully designed for massively parallel GPUs. The performance and image quality of the pre-integration method were analyzed. Both projection and back-projection are significantly faster with pre-integration. Image quality was analyzed using cone-beam image reconstruction algorithms within Jeffrey Fessler's Image Reconstruction Toolbox; images produced by regularized, iterative reconstruction using the pre-integration method show no significant degradation in quality.
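
    The core "pre-integration" idea (integrate once, then take differences at cell boundaries instead of looping over overlapping pixel/detector pairs) can be sketched in 1-D; this is a hedged NumPy illustration of the branchless distance-driven principle, not the paper's GPU code, and the geometry values are invented:

```python
import numpy as np

def branchless_dd_project_1d(pixels, pix_edges, det_edges):
    """1-D branchless distance-driven projection via pre-integration.

    pixels    : (N,) pixel values of one image row
    pix_edges : (N+1,) pixel boundary positions mapped onto the detector axis
    det_edges : (M+1,) detector cell boundary positions

    The row is pre-integrated into a cumulative integral; each detector
    value is then a difference of that integral at the cell edges,
    with no per-overlap branching.
    """
    widths = np.diff(pix_edges)
    # cumulative integral I(x): piecewise linear, exact for piecewise-constant rows
    cumint = np.concatenate(([0.0], np.cumsum(pixels * widths)))
    samples = np.interp(det_edges, pix_edges, cumint)
    # detector value = mean of the row over each detector cell
    return np.diff(samples) / np.diff(det_edges)

row = np.array([1.0, 2.0, 3.0, 4.0])
pix_edges = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
det_edges = np.array([0.0, 2.0, 4.0])
p = branchless_dd_project_1d(row, pix_edges, det_edges)
# each detector cell spans two pixels, so p averages them pairwise
```

    The 2-D extension in the paper applies the same trick with a two-dimensional integral image before projection and back-projection.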

  4. Back-projection source reconstruction in the presence of point scatterers

    Solimene, Raffaele; Cuccaro, Antonio; Pierri, Rocco

    2016-06-01

    Inverse source and inverse scattering problems can benefit from multipath due to a scattering environment. At the same time, multipath can be a source of artefacts in the reconstructions. The aim of this paper is to understand when and how multipath manifests its positive or negative effects. To this end, a simple scenario is considered. The problem is cast within a 2D scalar setting where multipath is assumed due to known 'extra' point-like scatterers. To simplify the study, the inverse source problem is dealt with, since it involves a modelling operator with fewer terms than inverse scattering. A back-projection inversion method based on the adjoint of the radiation operator is exploited. This allows the model resolution kernel, i.e. the point spread function, to be computed easily; its dominant contributions are determined by stationary-phase arguments. The role played by the point scatterers and how they contribute to an improvement of the achievable resolution are highlighted.

  5. Sinogram bow-tie filtering in FBP PET reconstruction

    Abella, Mónica; Vaquero, Juan José; Soto-Montenegro, M. L.; Lage, E.; Desco, Manuel

    2009-01-01

    Low-pass filtering of sinograms in the radial direction is the most common practice to limit noise amplification in filtered back projection (FBP) reconstruction of positron emission tomography studies. Other filtering strategies have been proposed to prevent the loss in resolution due to low-pass radial filters, although results have been diverse. Using the well-known properties of the Fourier transform of a sinogram, the authors defined a binary mask that matches the expected shape of the sup...

  6. A rapid parallelization of cone-beam projection and back-projection operator based on texture fetching interpolation

    Xie, Lizhe; Hu, Yining; Chen, Yang; Shi, Luyao

    2015-03-01

    Projection and back-projection are the most computationally consuming parts of computed tomography (CT) reconstruction, and parallelization strategies using GPU computing techniques have been introduced. In this paper we present a new parallelization scheme for both projection and back-projection based on NVIDIA's CUDA technology. Instead of building a complex model, we aimed at optimizing the existing algorithm and making it suitable for CUDA implementation so as to gain fast computation speed. Besides using texture fetching, which provides faster interpolation, we fixed the number of samples in the computation of projection to ensure the synchronization of blocks and threads, thus preventing the latency caused by inconsistent computation complexity. Experimental results demonstrate the computational efficiency and imaging quality of the proposed method.

  7. Mitigating artifacts in back-projection source imaging with implications for frequency-dependent properties of the Tohoku-Oki earthquake

    Meng, Lingsen; Ampuero, Jean-Paul; Luo, Yingdi; Wu, Wenbo; Ni, Sidao

    2012-12-01

    Comparing teleseismic array back-projection source images of the 2011 Tohoku-Oki earthquake with results from static and kinematic finite source inversions has revealed little overlap between the regions of high- and low-frequency slip. Motivated by this interesting observation, back-projection studies extended to intermediate frequencies, down to about 0.1 Hz, have suggested that a progressive transition of rupture properties as a function of frequency is observable. Here, by adapting the concept of array response function to non-stationary signals, we demonstrate that the "swimming artifact", a systematic drift resulting from signal non-stationarity, induces significant bias on beamforming back-projection at low frequencies. We introduce a "reference window strategy" into the multitaper-MUSIC back-projection technique and significantly mitigate the "swimming artifact" at high frequencies (1 s to 4 s). At lower frequencies, this modification yields notable, but significantly smaller, artifacts than time-domain stacking. We perform extensive synthetic tests that include a 3D regional velocity model for Japan. We analyze the recordings of the Tohoku-Oki earthquake at the USArray and at the European array at periods from 1 s to 16 s. The migration of the source location as a function of period, regardless of the back-projection methods, has characteristics that are consistent with the expected effect of the "swimming artifact". In particular, the apparent up-dip migration as a function of frequency obtained with the USArray can be explained by the "swimming artifact". This indicates that the most substantial frequency-dependence of the Tohoku-Oki earthquake source occurs at periods longer than 16 s. Thus, low-frequency back-projection needs to be further tested and validated in order to contribute to the characterization of frequency-dependent rupture properties.

  8. Images of gravitational and magnetic phenomena derived from two-dimensional back-projection Doppler tomography of interacting binary stars

    Richards, Mercedes T.; Cocking, Alexander S.; Fisher, John G.; Conover, Marshall J., E-mail: mrichards@astro.psu.edu, E-mail: asc5097@psu.edu [Department of Astronomy and Astrophysics, Pennsylvania State University, University Park, PA 16802 (United States)

    2014-11-10

    We have used two-dimensional back-projection Doppler tomography as a tool to examine the influence of gravitational and magnetic phenomena in interacting binaries that undergo mass transfer from a magnetically active star onto a non-magnetic main-sequence star. This multitiered study of over 1300 time-resolved spectra of 13 Algol binaries involved calculations of the predicted dynamical behavior of the gravitational flow and the dynamics at the impact site, analysis of the velocity images constructed from tomography, and the influence on the tomograms of orbital inclination, systemic velocity, orbital coverage, and shadowing. The Hα tomograms revealed eight sources: chromospheric emission, a gas stream along the gravitational trajectory, a star-stream impact region, a bulge of absorption or emission around the mass-gaining star, a Keplerian accretion disk, an absorption zone associated with hotter gas, a disk-stream impact region, and a hot spot where the stream strikes the edge of a disk. We described several methods used to extract the physical properties of the emission sources directly from the velocity images, including S-wave analysis, the creation of simulated velocity tomograms from hydrodynamic simulations, and the use of synthetic spectra with tomography to sequentially extract the separate sources of emission from the velocity image. In summary, the tomography images have revealed results that cannot be explained solely by gravitational effects: chromospheric emission moving with the mass-losing star, a gas stream deflected from the gravitational trajectory, and alternating behavior between stream state and disk state. Our results demonstrate that magnetic effects cannot be ignored in these interacting binaries.

  10. Digital Filters for Low Frequency Equalization

    Tyril, Marni; Abildgaard, J.; Rubak, Per

    2001-01-01

    Digital filters with high resolution in the low-frequency range are studied. Specifically, for a given computational power, traditional IIR filters are compared with warped FIR filters, warped IIR filters, and modified warped FIR filters termed warped individual z FIR filters (WizFIR). The results...

  11. Implicit Kalman filtering

    Skliar, M.; Ramirez, W. F.

    1997-01-01

    For an implicitly defined discrete system, a new algorithm for Kalman filtering is developed and an efficient numerical implementation scheme is proposed. Unlike the traditional explicit approach, the implicit filter can be readily applied to ill-conditioned systems and allows for generalization to descriptor systems. The implementation of the implicit filter depends on the solution of the congruence matrix equation A1 Px A1^T = Py. We develop a general iterative method for the solution of this equation, and prove necessary and sufficient conditions for convergence. It is shown that when the system matrices of an implicit system are sparse, the implicit Kalman filter requires significantly less computer time and storage to implement as compared to the traditional explicit Kalman filter. Simulation results are presented to illustrate and substantiate the theoretical developments.

  12. Generalized Nonlinear Complementary Attitude Filter

    Jensen, Kenneth

    2011-01-01

    This work describes a family of attitude estimators that are based on a generalization of Mahony's nonlinear complementary filter. This generalization reveals the close mathematical relationship between the nonlinear complementary filter and the more traditional multiplicative extended Kalman filter. In fact, the bias-free and constant gain multiplicative continuous-time extended Kalman filters may be interpreted as special cases of the generalized attitude estimator. The correspondence provi...

  13. Using an ISAPI Filter to Implement Automatic Conversion of Web Pages between Simplified and Traditional Chinese Characters

    张震; 张曾科

    2001-01-01

    This paper studies the display and transmission of Chinese characters on the web and proposes a new method that converts between simplified and traditional Chinese character encodings directly on the web server, so that a Chinese homepage stored in a single encoding can automatically support browsers using different encodings, with no software installed on the client side. By using an ISAPI filter, homepages in Chinese characters on the web site are translated automatically to support browsers on different Chinese systems without any software or language packages installed on the clients. The idea is implemented with the ISAPI filter of Internet Information Server 4.0 under Windows NT.

  14. A back projection dosimetry method for diagnostic and orthovoltage x-ray from 40 to 140 kVp for patients and phantoms

    Afrashteh, Hossein

    2005-07-01

    In practice, patient dosimetry involves time-consuming, tedious calculations during the measurement process, and a straightforward, accurate method is needed. A back projection dosimetry method for patients and phantoms was developed using the Entrance Surface Dose (ESD) and its corresponding exit surface dose together with a mean effective attenuation coefficient (mu-bar). The method focused on low-energy x-ray units (40-140 kVp), primarily for conventional diagnostic radiography and low-energy radiation therapy procedures, on the assumption that it may also apply to similar modalities within the same energy range, e.g., fluoroscopy, where skin injuries have been common in the past, or mammography, where radiation carcinogenesis has been a matter of concern. A new Gafchromic film, XR-QA, was assessed as a precision dosimeter for use with this algorithm. Because the dose range typical of conventional radiography exams is in most cases not high enough to activate the film's sensitive layer sufficiently, the measured net optical density (OD) changes were not substantial; therefore, a conventional, relatively slow dental film, DF58 Ultra, was used. Various thicknesses of acrylic, a tissue-equivalent material, were used with the algorithm. The results from the developed mathematical algorithm are in reasonable agreement with other sources and reference data. The method is straightforward and within the acceptable accuracy range, and may be applied individually to the desired body parts or fetal areas, depending on the clinical practice and interests.
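
    The arithmetic behind such an ESD/exit-dose back projection can be sketched as follows, assuming simple exponential attenuation (a hedged illustration of the general idea, not the author's algorithm; the numerical values are hypothetical):

```python
import numpy as np

def depth_dose(esd, exit_dose, thickness, depth):
    """Back-projected dose at a given depth from entrance/exit surface doses.

    Assumes exponential attenuation with a mean effective attenuation
    coefficient recovered from the ESD/exit-dose ratio:
        mu_eff = ln(ESD / exit_dose) / thickness
        D(x)   = ESD * exp(-mu_eff * x)
    """
    mu_eff = np.log(esd / exit_dose) / thickness
    return esd * np.exp(-mu_eff * depth)

# Hypothetical 20 cm phantom: ESD 10 mGy at the entrance, 0.5 mGy at the exit
d_mid = depth_dose(10.0, 0.5, 20.0, 10.0)
```

    By construction the formula reproduces the ESD at depth 0 and the exit dose at the full thickness, interpolating exponentially in between.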

  15. Microwave Filters

    Zhou, Jiafeng

    2010-01-01

    The general theory of microwave filter design based on lumped-element circuits is described in this chapter. Lowpass prototype filters with Butterworth, Chebyshev, and quasi-elliptic characteristics are synthesized, and the prototype filters are then transformed to bandpass filters by lowpass-to-bandpass frequency mapping. By using immittance inverters (J- or K-inverters), the bandpass filters can be realized with the same type of resonators. One design example is given to verify the theory ...

  16. Water Filters

    1993-01-01

    The Aquaspace H2OME Guardian Water Filter, available through Western Water International, Inc., reduces lead in water supplies. The filter is mounted on the faucet and the filter cartridge is placed in the "dead space" between sink and wall. This filter is one of several new filtration devices using the Aquaspace compound filter media, which combines company developed and NASA technology. Aquaspace filters are used in industrial, commercial, residential, and recreational environments as well as by developing nations where water is highly contaminated.

  17. Improved resolution and reduced clutter in ultra-wideband microwave imaging using cross-correlated back projection: experimental and numerical results.

    Jacobsen, S; Birkelund, Y

    2010-01-01

    Microwave breast cancer detection is based on the dielectric contrast between healthy and malignant tissue. This radar-based imaging method involves illumination of the breast with an ultra-wideband pulse; detection of tumors within the breast is achieved by a selected focusing technique. Image formation algorithms are tailored to enhance tumor responses and reduce early-time and late-time clutter associated with skin reflections and heterogeneity of breast tissue. In this contribution, we evaluate the performance of the so-called cross-correlated back projection imaging scheme by using a scanning system in phantom experiments, supplemented by numerical modeling based on commercial software. The phantom is synthetically scanned with a broadband elliptical antenna in a mono-static configuration. The respective signals are pre-processed by a data-adaptive RLS algorithm in order to remove artifacts caused by antenna reverberations and signal clutter. Successful detection of a 7 mm diameter cylindrical tumor immersed in a low permittivity medium was achieved in all cases. Selecting the widely used delay-and-sum (DAS) beamforming algorithm as a benchmark, we show that correlation-based imaging methods improve the signal-to-clutter ratio by at least 10 dB and improve spatial resolution, reducing the full-width half maximum (FWHM) of the imaged peak by about 40-50%. PMID:21331362
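
    The DAS benchmark referenced above can be sketched for a single image pixel (a minimal NumPy illustration under simplifying assumptions: discrete sample delays, no interpolation or weighting; the signals and delays below are synthetic):

```python
import numpy as np

def delay_and_sum(signals, delays, fs, window):
    """Delay-and-sum (DAS) focusing for one image pixel.

    signals : (A, T) time traces from A antennas
    delays  : (A,) round-trip delays (seconds) from each antenna to the pixel
    fs      : sampling rate (Hz)
    window  : number of samples to integrate after alignment

    Each trace is shifted so the pixel's response aligns at the window start,
    the traces are summed coherently, and the energy of the sum is returned
    as the pixel intensity.
    """
    A, T = signals.shape
    aligned = np.zeros((A, window))
    for a in range(A):
        start = int(round(delays[a] * fs))
        seg = signals[a, start:start + window]
        aligned[a, :len(seg)] = seg
    summed = aligned.sum(axis=0)
    return np.sum(summed ** 2)

# Synthetic scatterer response: a unit pulse arriving at samples 10 and 15
fs = 1.0
sig = np.zeros((2, 50))
sig[0, 10] = 1.0
sig[1, 15] = 1.0
energy = delay_and_sum(sig, np.array([10.0, 15.0]) / fs, fs, window=3)
```

    With the correct delays the pulses add coherently (energy 4 for two unit pulses); with wrong delays they do not, which is what produces focusing contrast. The cross-correlated variant evaluated in the paper additionally multiplies pairwise-correlated traces to suppress incoherent clutter.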

  18. Portable Wideband Microwave Imaging System for Intracranial Hemorrhage Detection Using Improved Back-projection Algorithm with Model of Effective Head Permittivity

    Mobashsher, Ahmed Toaha; Mahmoud, A.; Abbosh, A. M.

    2016-02-01

    Intracranial hemorrhage is a medical emergency that requires rapid detection and treatment to keep brain damage to a minimum. Here, an effective wideband microwave head imaging system for on-the-spot detection of intracranial hemorrhage is presented. The operation of the system relies on the dielectric contrast between healthy brain tissues and a hemorrhage, which causes strong microwave scattering. The system uses a compact sensing antenna with ultra-wideband operation and directional radiation, and a portable, compact microwave transceiver for signal transmission and data acquisition. The collected data are processed to create a clear image of the brain using an improved back projection algorithm based on a novel effective head permittivity model. The system is verified in realistic simulation and experimental environments using anatomically and electrically realistic human head phantoms. Quantitative and qualitative comparisons between the images from the proposed and existing algorithms demonstrate significant improvements in detection and localization accuracy. The radiation and thermal safety of the system are examined and verified. Initial human tests are conducted on healthy subjects with different head sizes. The reconstructed images are statistically analyzed, and the absence of false positives indicates the efficacy of the proposed system for future preclinical trials.

  19. High security and robust optical image encryption approach based on computer-generated integral imaging pickup and iterative back-projection techniques

    Li, Xiao Wei; Cho, Sung Jin; Kim, Seok Tae

    2014-04-01

    In this paper, a novel optical image encryption algorithm combining the computer-generated integral imaging (CGII) pickup technique and the iterative back-projection (IBP) technique is proposed. In this scheme, the color image to be encrypted is first separated into three channels: red, green, and blue. Each channel is independently captured using a virtual pinhole array and computationally transformed into a sub-image array. Each of the three sub-image arrays is then scrambled by the Fibonacci transformation (FT) algorithm and encrypted by a hybrid cellular automaton (HCA), and the three encrypted images are finally combined to produce the color encrypted image. In the reconstruction process, because computational integral imaging reconstruction (CIIR) is a pixel-overlapping reconstruction technique, interference from adjacent pixels degrades the quality of the reconstructed image. To address this problem, an image super-resolution reconstruction technique is introduced: the image is computationally reconstructed by the IBP technique. Numerical simulations are presented to test the validity and capability of the proposed image encryption algorithm.
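
    The IBP super-resolution step invoked above follows a standard pattern: simulate the low-resolution observation from the current high-resolution estimate, back-project the residual, and repeat. A minimal 1-D NumPy sketch under a block-averaging forward model (an illustration of the generic IBP scheme, not the paper's implementation):

```python
import numpy as np

def iterative_back_projection(low_res, scale, n_iter=20, step=1.0):
    """Iterative back-projection (IBP) super-resolution, 1-D sketch.

    low_res : (M,) observed low-resolution signal
    scale   : integer downsampling factor (block averaging assumed)

    The high-resolution estimate is refined by back-projecting the residual
    between the observation and the downsampled current estimate.
    """
    M = len(low_res)
    hr = np.repeat(low_res, scale)                      # initial upsampled guess
    for _ in range(n_iter):
        simulated = hr.reshape(M, scale).mean(axis=1)   # forward (degradation) model
        residual = low_res - simulated
        hr += step * np.repeat(residual, scale)         # back-project the error
    return hr

hr = iterative_back_projection(np.array([1.0, 2.0, 4.0]), scale=2)
```

    The fixed point of the iteration is any high-resolution signal whose simulated observation matches the input exactly; with a realistic blur-plus-decimation forward model the same loop sharpens the pixel-overlapping CIIR output.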

  20. Cold Crystal Reflector Filter Concept

    Muhrer, G.

    2014-01-01

    In this paper the theoretical concept of a cold crystal reflector filter is presented. The aim of this concept is to overcome the shortcoming of the traditional cold polycrystalline reflector filter, namely the significant reduction of the neutron flux just above (in energy space) or just below (in wavelength space) the first Bragg edge.

  1. GMTI processing using back projection.

    Doerry, Armin Walter

    2013-07-01

    Backprojection has long been applied to SAR image formation. It has equal utility in forming the range-velocity maps for Ground Moving Target Indicator (GMTI) radar processing. In particular, it overcomes the problem of targets migrating through range resolution cells.

  2. Optimal filtering

    Anderson, Brian D O

    2005-01-01

    This graduate-level text augments and extends beyond undergraduate studies of signal processing, particularly in regard to communication systems and digital filtering theory. Vital for students in the fields of control and communications, its contents are also relevant to students in such diverse areas as statistics, economics, bioengineering, and operations research.Topics include filtering, linear systems, and estimation; the discrete-time Kalman filter; time-invariant filters; properties of Kalman filters; computational aspects; and smoothing of discrete-time signals. Additional subjects e

  3. Characterizing trends in HIV infection among men who have sex with men in Australia by birth cohorts: results from a modified back-projection method

    Wand Handan

    2009-09-01

    Full Text Available Abstract Background We set out to estimate historical trends in HIV incidence in Australian men who have sex with men with respect to age at infection and birth cohort. Methods A modified back-projection technique is applied to data from the HIV/AIDS Surveillance System in Australia, including "newly diagnosed HIV infections", "newly acquired HIV infections" and "AIDS diagnoses", to estimate trends in HIV incidence over both calendar time and age at infection. Results Our results demonstrate that since 2000, there has been an increase in new HIV infections in Australian men who have sex with men across all age groups. The estimated mean age at infection increased from ~35 years in 2000 to ~37 years in 2007. When the epidemic peaked in the mid 1980s, the majority of the infections (56%) occurred among men aged 30 years and younger; 30% occurred in ages 31 to 40 years; and only ~14% of them were attributed to the group who were older than 40 years of age. In 2007, the proportion of infections occurring in persons 40 years or older doubled to 31% compared to the mid 1980s, while the proportion of infections attributed to the group younger than 30 years of age decreased to 36%. Conclusion The distribution of HIV incidence for birth cohorts by infection year suggests that the HIV epidemic continues to affect older homosexual men as much as, if not more than, younger men. The results are useful for evaluating the impact of the epidemic across successive birth cohorts and study trends among the age groups most at risk.
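
    Epidemiological back-projection of this kind rests on a simple forward model: diagnoses in a given period are past infections convolved with a diagnosis-delay distribution, and incidence is recovered by inverting that convolution, often with an EM-type update. A hedged NumPy sketch of the generic scheme (not the modified method of the paper; the delay distribution and counts are invented, and the truncation correction at the series end is omitted):

```python
import numpy as np

def expected_diagnoses(incidence, delay_pmf):
    """Forward model: diagnoses at time t are past infections convolved
    with the diagnosis-delay probability mass function."""
    return np.convolve(incidence, delay_pmf)[:len(incidence)]

def em_backprojection_step(incidence, diagnoses, delay_pmf):
    """One simplified EM update of the back-projected incidence estimate."""
    expected = expected_diagnoses(incidence, delay_pmf)
    ratio = np.divide(diagnoses, expected,
                      out=np.zeros_like(diagnoses), where=expected > 0)
    # redistribute the observed/expected ratio back to infection times:
    # update[s] = sum_t ratio[t] * delay_pmf[t - s]
    update = np.convolve(ratio[::-1], delay_pmf)[:len(incidence)][::-1]
    return incidence * update / delay_pmf.sum()

# Hypothetical yearly data: delay pmf over 0-2 years, 4 infection years
f = np.array([0.2, 0.5, 0.3])
inc = np.array([10.0, 20.0, 30.0, 25.0])
diag = expected_diagnoses(inc, f)
```

    Iterating the EM step drives the expected diagnoses toward the observed series while keeping the incidence estimate non-negative.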

  4. Gaoling Back-to-Back Project Thyristor Valve Electrical Design

    王英洁; 李斌; 田方

    2011-01-01

    The main purpose of an HVDC back-to-back project is to interconnect two large AC power grids of different voltage grades or different frequencies. The thyristor valve is the key equipment in the converter station: rectifier valves convert AC power into DC power, and inverter valves convert DC power back into AC power. To meet the main-parameter requirements of the electrical design of the Gaoling ±125 kV HVDC back-to-back thyristor valve, and in accordance with the thyristor valve design technical specification, simulation and analysis calculations were used to optimize the main thyristor parameters and to calculate the number of series thyristor levels per valve, the damping circuit parameters (damping capacitance, damping resistance, and DC resistance), and the coordinated protective firing level. The calculation results verify the correctness of the electrical design.

  5. Rupture Processes of the Mw8.3 Sea of Okhotsk Earthquake and Aftershock Sequences from 3-D Back Projection Imaging

    Jian, P. R.; Hung, S. H.; Meng, L.

    2014-12-01

    On May 24, 2013, the largest deep earthquake ever recorded occurred beneath the southern tip of the Kamchatka Peninsula, where the Pacific Plate subducts underneath the Okhotsk Plate. Previous 2D beamforming back projection (BP) of P-coda waves suggested the mainshock ruptured bilaterally along a horizontal fault plane determined by the global centroid moment tensor solution. On the other hand, multiple-point-source inversion of P and SH waveforms argued that the earthquake comprises a sequence of 6 subevents not located on a single plane but distributed in a zone extending 64 km horizontally and 35 km in depth. We apply a three-dimensional MUSIC BP approach to resolve the rupture processes of the mainshock and two large aftershocks (M6.7) with no a priori setup of preferential orientations of the planar rupture. The maximum pseudo-spectrum of high-frequency P waves in a sequence of time windows recorded by the densely distributed stations of the US and EU Arrays is used to image the 3-D temporal and spatial rupture distribution. The resulting image confirms a nearly N-S-striking rupture consisting of two antiparallel stages. The first, subhorizontal rupture initially propagates toward the NNE; about 18 s later the rupture reverses toward the SSW and concurrently shifts about 35 km deeper, lasting for about 20 s. The rupture lengths of the first NNE-ward and second SSW-ward stages are about 30 km and 85 km, and the estimated rupture velocities are 3 km/s and 4.25 km/s, respectively. Synthetic experiments are undertaken to assess the capability of the 3D MUSIC BP to recover spatio-temporal rupture processes. In addition, high-frequency BP images based on the EU-Array data show that the two M6.7 aftershocks are more likely to have ruptured on vertical fault planes.

  6. Keeping Tradition

    Zenhong, C.; Buwalda, P.L.

    2011-01-01

    Chinese dumplings such as Jiao Zi and Bao Zi are two of the popular traditional foods in Asia. They are usually made from wheat flour dough (rice flour or starch is sometimes used) that contains fillings. They can be steamed, boiled and fried and are consumed either as a main meal or dessert. As the

  7. Extensions to polar formatting with spatially variant post-filtering

    Garber, Wendy L.; Hawley, Robert W.

    2011-06-01

    The polar format algorithm (PFA) is computationally faster than back projection for producing spotlight mode synthetic aperture radar (SAR). This is very important in applications such as video SAR for persistent surveillance, as images may need to be produced in real time. PFA's speed is largely due to making a planar wavefront assumption and forming the image onto a regular grid of pixels lying in a plane. Unfortunately, both assumptions cause loss of focus in airborne persistent surveillance applications. The planar wavefront assumption causes a loss of focus in the scene for pixels that are far from scene center. The planar grid of image pixels causes loss of the depth of focus for conic flight geometries. In this paper, we present a method to compensate for the loss of depth of focus while warping the image onto a terrain map to produce orthorectified imagery. This technique applies a spatially variant post-filter and resampling to correct the defocus while dewarping the image. This work builds on spatially variant post-filtering techniques previously developed at Sandia National Laboratories in that it incorporates corrections for terrain height and circular flight paths. This approach produces high quality SAR images many times faster than back projection.

  8. Application of circular filter inserts

    High efficiency particulate air (HEPA) filters are used in the ventilation of nuclear plants as passive clean-up devices. Traditionally, the work-horse of the industry has been the rectangular HEPA filter. An assessment of the problems associated with remote handling, changing, and disposal of these rectangular filters suggested that significant advantages to filtration systems could be obtained by the adoption of HEPA filters with circular geometry for both new and existing ventilation plants. This paper covers the development of circular geometry filters and highlights the advantages of this design over their rectangular counterparts. The work has resulted in a range of commercially available filters for flows from 45 m3/h up to 3400 m3/h. This paper also covers the development of a range of sizes and types of housings that employ simple change techniques which take advantage of the circular geometry. The systems considered here have been designed in response to the requirements for shielded (remote filter change) and for unshielded facilities (potentially for bag changing of filters). Additionally, the designs have allowed for the possibility of retrofitting circular geometry HEPA filters in place of the rectangular geometry filters.

  9. High Resolution Teleseismic P-wave Back-Projection Imaging Using Variable Travel Time Corrections: Characterizing Sub-Events of the Great April 11th 2012 Indian Ocean Intraplate Earthquakes

    Kwong, K. B.; Koper, K. D.; Yue, H.; Lay, T.

    2012-12-01

    Two of the largest strike-slip earthquakes ever recorded occurred off the coast of northern Sumatra on April 11th 2012. The Mw 8.7 mainshock and Mw 8.2 aftershock occurred east of the Ninetyeast Ridge in the Wharton Basin, a region of intraplate deformation with prominent fracture zones striking NNE-SSW. The relative lack of geodetic and local seismic data compared to other recent great earthquakes makes teleseismic data especially important for understanding the rupture properties of these events. We performed short-period P-wave back-projection imaging using independent networks of stations in Europe and Japan. Preliminary images from the two networks showed similarly complex multi-event sources for the mainshock that indicate rupture occurred along both nodal planes of the gCMT solution, consistent with the locations of early aftershocks. Back-projection images of the Mw 8.2 aftershock showed a single, compact, bilateral rupture corresponding to the NNE-SSW nodal plane of the CMT solution [Yue et al., 2012]. Here we improve upon the resolution and accuracy of our initial back-projection images by estimating station-specific travel time corrections that vary across the source region [e.g., Ishii et al., 2007]. These corrections are used to compensate for 3D variations in Earth structure that occur between the source region and the seismometers, and act to focus the array beams. We perform multi-channel cross-correlations of P waves recorded for 7 aftershocks that were (1) distributed broadly around the source region and (2) well-observed at seismometers in Europe. For each seismometer in the array, the 8 measured static corrections are smoothly interpolated over the entire source region with a Kriging method to form a travel time correction surface. These surfaces are then used with an otherwise conventional back-projection approach [Xu et al., 2009] to image the ruptures. Our new images are broadly consistent with our original results, indicating that the

  10. A Sub-Image Fast Factorized Back Projection Algorithm Based on Optimal Regional Partition

    林世斌; 李悦丽; 严少石; 周智敏

    2012-01-01

    The Back Projection (BP) algorithm is suitable for airborne Ultra Wide Band Synthetic Aperture Radar (UWB SAR) imaging for its advantages such as perfect focusing and motion compensation. However, its application is limited by its large computational load. The Sub-Image Fast Factorized Back Projection (SIFFBP) algorithm, which substantially reduces the computational load, improves the practicability of the BP algorithm. In this article, an improved SIFFBP algorithm based on optimal regional partition is proposed by analyzing the regional division constraints. It solves the acceleration-decline problem of the conventional SIFFBP algorithm when the SAR system has a small integration angle. The proposed algorithm further reduces the computational load when the integration angle is smaller than 60 degrees or when the length of the imaging region differs greatly from its width. The performance of the proposed algorithm is demonstrated using simulated data as well as real SAR data.

  11. Sinogram bow-tie filtering in FBP PET reconstruction.

    Abella, M; Vaquero, J J; Soto-Montenegro, M L; Lage, E; Desco, M

    2009-05-01

    Low-pass filtering of sinograms in the radial direction is the most common practice to limit noise amplification in filtered back projection (FBP) reconstruction of positron emission tomography studies. Other filtering strategies have been proposed to prevent the loss in resolution due to low-pass radial filters, although results have been diverse. Using the well-known properties of the Fourier transform of a sinogram, the authors defined a binary mask that matches the expected shape of the support region in the Fourier domain of the sinogram ("bow tie"). This mask was smoothed by a convolution with a ten-point Gaussian kernel, which not only avoids ringing but also introduces a pre-emphasis at low frequencies. A new filtering scheme for FBP is proposed, comprising this smoothed bow-tie filter combined with a standard radial filter and an axial filter. The authors compared the performance of the bow-tie filtering scheme with that of other previously reported methods: standard radial filtering, angular filtering, and stackgram-domain filtering. All the quantitative data in the comparisons refer to a baseline reconstruction using a ramp filter only. When using the smallest size of the Gaussian kernel in the stackgram domain, the authors achieved a noise reduction of 33% at the cost of degrading radial and tangential resolutions (14.5% and 16%, respectively, for cubic interpolation). To reduce the noise by 30%, the angular filter produced a larger degradation of contrast (3%) and tangential resolution (46% at 10 mm from the center of the field of view) and showed noticeable artifacts in the form of circular blurring dependent on the distance to the center of the field of view. For a similar noise reduction (33%), the proposed bow-tie filtering scheme yielded optimum results in resolution (gain in radial resolution of 10%) and contrast (1% increase) when compared with any of the other filters alone. Experiments with rodent images showed noticeable image quality
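The bow-tie construction above lends itself to a compact sketch: build a binary mask over the 2-D Fourier transform of the sinogram, then smooth it with a short Gaussian kernel. The numpy sketch below is illustrative only; the bow-tie slope and the Gaussian width are our assumptions, not the authors' calibrated values.

```python
import numpy as np

def bowtie_mask(n_rad, n_ang, slope=1.0, klen=10):
    """Smoothed bow-tie support mask in the sinogram Fourier domain.

    Sinogram energy concentrates where the angular frequency index |N|
    lies below a multiple of the radial frequency index |W| ("bow tie").
    """
    w = np.fft.fftfreq(n_rad) * n_rad        # radial frequency indices
    n = np.fft.fftfreq(n_ang) * n_ang        # angular frequency indices
    W, N = np.meshgrid(w, n, indexing="ij")
    mask = (np.abs(N) <= slope * np.abs(W)).astype(float)
    # Smooth with a short Gaussian kernel to avoid ringing; this also
    # introduces a mild pre-emphasis at low frequencies.
    x = np.arange(klen) - (klen - 1) / 2.0
    g = np.exp(-0.5 * (x / (klen / 4.0)) ** 2)
    g /= g.sum()
    mask = np.apply_along_axis(np.convolve, 0, mask, g, mode="same")
    mask = np.apply_along_axis(np.convolve, 1, mask, g, mode="same")
    return mask

def bowtie_filter(sino, slope=1.0):
    """Apply the smoothed bow-tie mask (rows = radial bins, cols = angles)."""
    S = np.fft.fft2(sino)
    return np.real(np.fft.ifft2(S * bowtie_mask(*sino.shape, slope=slope)))
```

In the paper this mask is combined with a standard radial filter and an axial filter; only the bow-tie stage is shown here.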

  12. Data assimilation the ensemble Kalman filter

    Evensen, Geir

    2006-01-01

    Covers data assimilation and inverse methods, including both traditional state estimation and parameter estimation. This text and reference focuses on various popular data assimilation methods, such as weak and strong constraint variational methods and ensemble filters and smoothers.

  13. Generalised Filtering

    Karl Friston

    2010-01-01

    We describe a Bayesian filtering scheme for nonlinear state-space models in continuous time. This scheme is called Generalised Filtering and furnishes posterior (conditional) densities on hidden states and unknown parameters generating observed data. Crucially, the scheme operates online, assimilating data to optimize the conditional density on time-varying states and time-invariant parameters. In contrast to Kalman and particle smoothing, Generalised Filtering does not require a backwards pass. In contrast to variational schemes, it does not assume conditional independence between the states and parameters. Generalised Filtering optimises the conditional density with respect to a free-energy bound on the model's log-evidence. This optimisation uses the generalised motion of hidden states and parameters, under the prior assumption that the motion of the parameters is small. We describe the scheme, present comparative evaluations with a fixed-form variational version, and conclude with an illustrative application to a nonlinear state-space model of brain imaging time-series.

  14. Simulation reconstruction analysis of gradient refractive index for functionally gradient materials based on filtered back projection

    周文静; 李海鹏; 韩冰

    2014-01-01

    We present a simulation analysis of filtered back-projection reconstruction of the gradient refractive index distribution in functionally gradient materials, including its minimum resolvable refractive-index difference. To improve the reconstructed refractive-index resolution, a filtered back-projection algorithm suited to multiple projection directions was selected, and the reconstruction of both axially varying and radially varying gradient refractive indices was analyzed. Under identical simulation conditions, the reconstruction error was about 1% in both cases. The minimum resolution of the two gradient types was also analyzed in simulation. The results show that, with the number of projections fixed at 90, the radially varying refractive index is reconstructed more accurately than the axially varying one, and that when the interval between adjacent gradients is larger than 0.003, the reconstructed gradient distributions of both types still resolve the gradient intervals well.

  15. Water Filter

    1982-01-01

    A compact, lightweight electrolytic water sterilizer available through Ambassador Marketing, generates silver ions in concentrations of 50 to 100 parts per billion in water flow system. The silver ions serve as an effective bactericide/deodorizer. Tap water passes through filtering element of silver that has been chemically plated onto activated carbon. The silver inhibits bacterial growth and the activated carbon removes objectionable tastes and odors caused by addition of chlorine and other chemicals in municipal water supply. The three models available are a kitchen unit, a "Tourister" unit for portable use while traveling and a refrigerator unit that attaches to the ice cube water line. A filter will treat 5,000 to 10,000 gallons of water.

  16. Robust filtering for uncertain systems a parameter-dependent approach

    Gao, Huijun

    2014-01-01

    This monograph provides the reader with a systematic treatment of robust filter design, a key issue in systems, control and signal processing, because the inevitable presence of uncertainty in system and signal models often degrades the filtering performance and may even cause instability. The methods described are therefore not subject to the rigorous assumptions of traditional Kalman filtering. The monograph is concerned with robust filtering for various dynamical systems with parametric uncertainties, and focuses on parameter-dependent approaches to filter design. Classical filtering schemes, like H2 filtering and H∞ filtering, are addressed, and emerging issues such as robust filtering with constraints on communication channels and signal frequency characteristics are discussed. The text features design approaches to robust filters arranged according to varying complexity level, emphasizing robust filtering in the parameter-dependent framework for the first time; ...

  17. Variable Span Filters for Speech Enhancement

    Jensen, Jesper Rindom; Benesty, Jacob; Christensen, Mads Græsbøll

    2016-01-01

    In this work, we consider enhancement of multichannel speech recordings. Linear filtering and subspace approaches have been considered previously for solving the problem. The current linear filtering methods, although many variants exist, have limited control of noise reduction and speech distortion. Subspace approaches, on the other hand, can potentially yield better control by filtering in the eigen-domain, but traditionally these approaches have not been optimized explicitly for traditional noise reduction and signal distortion measures. Herein, we combine these approaches by deriving optimal filters using a joint diagonalization as a basis. This gives excellent control over the performance, as we can optimize for noise reduction or signal distortion performance. Results from real data experiments show that the proposed variable span filters can achieve better performance than existing...

  18. Satisfactory Optimization Design of IIR Digital Filters

    Jin Weidong; Zhang Gexiang; Zhao Duo

    2005-01-01

    A new method called satisfactory optimization method is proposed to design IIR (Infinite Impulse Response) digital filters, and the satisfactory optimization model is presented. The detailed algorithm of designing IIR digital filters using satisfactory optimization method is described. By using quantum genetic algorithm characterized by rapid convergence and good global search capability, the satisfying solutions are achieved in the experiment of designing lowpass and bandpass IIR digital filters. Experimental results show that the performances of IIR filters designed by the introduced method are better than those by traditional methods.

  19. Digital filters

    Hamming, Richard W

    1997-01-01

    Digital signals occur in an increasing number of applications: in telephone communications; in radio, television, and stereo sound systems; and in spacecraft transmissions, to name just a few. This introductory text examines digital filtering, the processes of smoothing, predicting, differentiating, integrating, and separating signals, as well as the removal of noise from a signal. The processes bear particular relevance to computer applications, one of the focuses of this book.Readers will find Hamming's analysis accessible and engaging, in recognition of the fact that many people with the s

  20. Anti-Aliasing filter for reverse-time migration

    Zhan, Ge

    2012-01-01

    We develop an anti-aliasing filter for reverse-time migration (RTM). It is similar to the traditional anti-aliasing filter used for Kirchhoff migration in that it low-pass filters the migration operator so that the dominant wavelength in the operator is greater than two times the trace sampling interval, except it is applied to both primary and multiple reflection events. Instead of applying this filter to the data in the traditional RTM operation, we apply the anti-aliasing filter to the generalized diffraction-stack migration operator. This gives the same migration image as computed by anti-aliased RTM.
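The wavelength condition stated above reduces to a simple cutoff rule; a one-line sketch (function and variable names are ours, not from the paper):

```python
def aa_cutoff_hz(v_min, trace_dx):
    """Anti-aliasing cutoff frequency: keep only temporal frequencies
    whose dominant wavelength lambda = v/f exceeds twice the trace
    sampling interval, i.e. f < v_min / (2 * dx)."""
    return v_min / (2.0 * trace_dx)

# e.g. water velocity 1500 m/s and 12.5 m trace spacing -> 60 Hz cutoff
cutoff = aa_cutoff_hz(1500.0, 12.5)
```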

  1. Research on dynamic image reconstruction for MIT based on back-projection algorithm

    柯丽; 林筱; 杜强; 赵璐璐

    2013-01-01

    Magnetic induction tomography (MIT) is a contactless and non-invasive medical imaging technology, and the image reconstruction algorithm plays an important role in realizing MIT imaging quickly and accurately. An improved back-projection image reconstruction algorithm is presented in this paper. Firstly, the back-projection path is determined from the magnetic field lines according to the magnetic field distribution in the imaging field, which reduces the positioning error of magnetic imaging using the straight-line back-projection algorithm. Secondly, an edge detection data modification model is built based on the electromagnetic relations in MIT, and this model is used to modify the phase shifts detected by the detection coils, which further improves the reconstructed image positioning accuracy. Two groups of sequence images were reconstructed for the two cases of conductivity variation and position variation of the disturbance object in the imaging field. With the conjoint analysis of the image sequence, the longitudinal impedance variation information was obtained, which reflects the dynamic information that the imaging object varies with time. Experiment results show that the improved back-projection algorithm for MIT possesses the characteristics of high reconstruction speed and accurate positioning; the method can accurately reflect the conductivity variation in the imaging field. Combined with the conjoint analysis of sequence images, the method can realize dynamic imaging for MIT.

  2. SavGolFilterCov: Savitzky Golay filter for data with error covariance

    More, Surhud

    2016-01-01

    A Savitzky-Golay filter is often applied to data to smooth the data without greatly distorting the signal; however, almost all data inherently comes with noise, and the noise properties can differ from point to point. This python script improves upon the traditional Savitzky-Golay filter by accounting for error covariance in the data. The inputs and arguments are modeled after scipy.signal.savgol_filter.
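The underlying idea — replace the ordinary least-squares fit inside each Savitzky-Golay window with a weighted one — can be sketched as follows. This is a hypothetical simplification using a diagonal covariance (per-point 1-sigma errors) rather than the script's full error-covariance treatment, and the argument names only loosely mirror scipy.signal.savgol_filter.

```python
import numpy as np

def savgol_weighted(y, yerr, window, polyorder):
    """Savitzky-Golay smoothing with per-point errors: each window is
    fit by weighted least squares (weights 1/sigma^2) instead of
    ordinary least squares; edges are left as NaN for simplicity."""
    assert window % 2 == 1 and window > polyorder
    half = window // 2
    x = np.arange(-half, half + 1)
    A = np.vander(x, polyorder + 1, increasing=True)   # [1, x, x^2, ...]
    out = np.full_like(y, np.nan, dtype=float)
    for i in range(half, len(y) - half):
        w = 1.0 / yerr[i - half:i + half + 1] ** 2
        Aw = A * w[:, None]
        # solve the weighted normal equations (A^T W A) c = A^T W y
        coef = np.linalg.solve(A.T @ Aw, Aw.T @ y[i - half:i + half + 1])
        out[i] = coef[0]        # polynomial value at the window centre
    return out
```

For an exact polynomial the weighted fit reproduces it regardless of the weights, which makes the sketch easy to sanity-check.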

  3. Convergent Filter Bases

    Coghetto Roland

    2015-01-01

    We are inspired by the work of Henri Cartan [16], Bourbaki [10] (TG. I Filtres) and Claude Wagschal [34]. We define the base of filter, image filter, convergent filter bases, limit filter and the filter base of tails (fr: filtre des sections).

  4. Optimal Gaussian Filter for Effective Noise Filtering

    Kopparapu, Sunil; Satish, M

    2014-01-01

    In this paper we show that knowledge of the noise statistics contaminating a signal can be effectively used to choose an optimal Gaussian filter to eliminate noise. Very specifically, we show that the additive white Gaussian noise (AWGN) contaminating a signal can be filtered best by using a Gaussian filter of specific characteristics. The design of the Gaussian filter bears a relationship to the noise statistics and also to some basic information about the signal. We first derive a relationship...
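A brute-force illustration of the claim: for a signal corrupted by AWGN there is a specific Gaussian filter width that minimizes the reconstruction error. The sketch below cheats by scoring candidate widths against the known clean signal; the paper instead derives the optimal width from the noise statistics and basic signal information.

```python
import numpy as np

def gaussian_kernel(sigma, radius=None):
    """Normalized 1-D Gaussian kernel."""
    radius = radius or max(1, int(3 * sigma))
    x = np.arange(-radius, radius + 1)
    k = np.exp(-0.5 * (x / sigma) ** 2)
    return k / k.sum()

def best_sigma(clean, noisy, sigmas):
    """Pick the Gaussian width minimizing MSE against the clean signal
    (demonstration only: in practice the clean signal is unknown)."""
    mses = [np.mean((np.convolve(noisy, gaussian_kernel(s), "same") - clean) ** 2)
            for s in sigmas]
    return sigmas[int(np.argmin(mses))], min(mses)

rng = np.random.default_rng(0)
t = np.linspace(0, 1, 500)
clean = np.sin(2 * np.pi * 3 * t)
noisy = clean + rng.normal(0, 0.3, t.size)     # AWGN with sigma_n = 0.3
sig, mse = best_sigma(clean, noisy, [0.5, 1, 2, 4, 8, 16])
```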

  5. Filter Bank Design for Subband Adaptive Filtering

    Haan, Jan Mark de

    2001-01-01

    Adaptive filtering is an important subject in the field of signal processing and has numerous applications in fields such as speech processing and communications. Examples in speech processing include speech enhancement, echo- and interference- cancellation, and speech coding. Subband filter banks have been introduced in the area of adaptive filtering in order to improve the performance of time domain adaptive filters. The main improvements are faster convergence speed and the reduction of co...

  6. Fault Tolerant Parallel Filters Based On Bch Codes

    K.Mohana Krishna

    2015-04-01

    Digital filters are used in signal processing and communication systems. In some cases, the reliability of those systems is critical, and fault tolerant filter implementations are needed. Over the years, many techniques that exploit the filters' structure and properties to achieve fault tolerance have been proposed. As technology scales, it enables more complex systems that incorporate many filters. In those complex systems, it is common that some of the filters operate in parallel, for example, by applying the same filter to different input signals. Recently, a simple technique that exploits the presence of parallel filters to achieve multiple fault tolerance has been presented. In this brief, that idea is generalized to show that parallel filters can be protected using Bose–Chaudhuri–Hocquenghem (BCH) codes, in which each filter is the equivalent of a bit in a traditional ECC. This new scheme allows more efficient protection when the number of parallel filters is large.
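The core trick — treating each parallel filter like a bit protected by a code — rests on linearity: filtering the sum of the inputs must equal the sum of the filtered outputs. The sketch below shows the single-parity special case; the BCH scheme of the brief generalizes this check so multiple faulty filters can be located and corrected.

```python
import numpy as np

def fir(x, h):
    """Plain FIR filtering by full convolution."""
    return np.convolve(x, h)

h = np.array([0.25, 0.5, 0.25])                   # shared FIR filter
rng = np.random.default_rng(1)
inputs = [rng.normal(size=64) for _ in range(4)]  # four parallel filters

outputs = [fir(x, h) for x in inputs]
# Because filtering is linear, a redundant "parity" filter fed the sum
# of the inputs must reproduce the sum of the outputs; any mismatch
# flags a faulty filter.
parity = fir(np.sum(inputs, axis=0), h)
fault_detected = not np.allclose(parity, np.sum(outputs, axis=0))

outputs[2][10] += 1.0                             # inject a fault in filter 2
fault_after = not np.allclose(parity, np.sum(outputs, axis=0))
```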

  7. Filters in 2D and 3D Cardiac SPECT Image Processing

    Maria Lyra

    2014-01-01

    Nuclear cardiac imaging is a noninvasive, sensitive method providing information on cardiac structure and physiology. Single photon emission tomography (SPECT) evaluates myocardial perfusion, viability, and function and is widely used in clinical routine. The quality of the tomographic image is key for accurate diagnosis. Image filtering, a mathematical processing, compensates for loss of detail in an image while reducing image noise, and it can improve the image resolution and limit the degradation of the image. SPECT images are then reconstructed, either by the filtered back projection (FBP) analytical technique or iteratively, by algebraic methods. The aim of this study is to review filters in cardiac 2D, 3D, and 4D SPECT applications and how these affect the image quality, mirroring the diagnostic accuracy of SPECT images. Several filters, including the Hanning, Butterworth, and Parzen filters, were evaluated in combination with the two reconstruction methods as well as with a specified MatLab program. Results showed that for both 3D and 4D cardiac SPECT the Butterworth filter, for different critical frequencies and orders, produced the best results. Between the two reconstruction methods, the iterative one might be more appropriate for cardiac SPECT, since it improves lesion detectability due to the significant improvement of image contrast.
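In FBP, window functions such as the Butterworth multiply the ramp filter in the frequency domain before each projection is back-projected. A minimal numpy sketch of a ramp × Butterworth projection filter (the cutoff and order are free parameters, as in the study):

```python
import numpy as np

def fbp_window(n, cutoff=0.5, order=5):
    """Frequency response applied to each projection in FBP: the ramp
    |f| multiplied by a Butterworth low-pass window."""
    f = np.fft.fftfreq(n)                 # cycles/sample in [-0.5, 0.5)
    ramp = np.abs(f)
    butter = 1.0 / np.sqrt(1.0 + (f / cutoff) ** (2 * order))
    return ramp * butter

def filter_projection(proj, **kw):
    """Filter one projection (a row of the sinogram) in the Fourier domain."""
    return np.real(np.fft.ifft(np.fft.fft(proj) * fbp_window(len(proj), **kw)))
```

Lowering the cutoff suppresses high-frequency noise at the cost of resolution, which is exactly the trade-off the reviewed filters negotiate.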

  8. Traditional Agriculture and Permaculture.

    Pierce, Dick

    1997-01-01

    Discusses benefits of combining traditional agricultural techniques with the concepts of "permaculture," a framework for revitalizing traditions, culture, and spirituality. Describes school, college, and community projects that have assisted American Indian communities in revitalizing sustainable agricultural practices that incorporate cultural…

  9. Active Optical Lattice Filters

    Gary Evans; MacFarlane, Duncan L.; Govind Kannan; Jian Tong; Issa Panahi; Vishnupriya Govindan; L. Roberts Hunt

    2005-01-01

    Optical lattice filter structures including gains are introduced and analyzed. The photonic realization of the active, adaptive lattice filter is described. The algorithms which map between gains space and filter coefficients space are presented and studied. The sensitivities of filter parameters with respect to gains are derived and calculated. An example which is relevant to adaptive signal processing is also provided.

  10. Passive Power Filters

    Künzi, R

    2015-01-01

    Power converters require passive low-pass filters which are capable of reducing voltage ripples effectively. In contrast to signal filters, the components of power filters must carry large currents or withstand large voltages, respectively. In this paper, three different suitable filter structures for d.c./d.c. power converters with inductive load are introduced. The formulas needed to calculate the filter components are derived step by step and practical examples are given. The behaviour of the three discussed filters is compared by means of the examples. Practical aspects for the realization of power filters are also discussed.

  11. Intelligent Optimize Design of LCL Filter for Three-Phase Voltage-Source PWM Rectifier

    Sun, Wei; Chen, Zhe; Wu, Xiaojie

    2009-01-01

    Compared to a traditional L filter, an LCL filter is more effective at reducing harmonic distortion at the switching frequency, so it is important to choose the LCL filter parameters to achieve a good filtering effect. This paper introduces some traditional design methods. The design of an LCL filter by genetic algorithm (GA) and particle swarm optimization (PSO) is presented, along with a comparison of the two intelligent optimizations. Simulation results and calculated data are provided to show that the intelligent optimizations are more effective and simpler than traditional methods.
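The quoted advantage can be checked with the standard undamped, ideal-grid transfer functions of the two topologies; the component values below are hypothetical, not from the paper:

```python
import numpy as np

def tf_L(w, L):
    """|i/v| for a plain series inductor."""
    return 1.0 / (w * L)

def tf_LCL(w, L1, L2, Cf):
    """|i_grid/v_conv| for an undamped LCL filter into an ideal grid:
    1 / |s^3 L1 L2 Cf + s (L1 + L2)| evaluated at s = jw."""
    return 1.0 / np.abs(w ** 3 * L1 * L2 * Cf - w * (L1 + L2))

# Hypothetical example values
L1, L2, Cf = 2e-3, 1e-3, 10e-6          # H, H, F
fsw = 10e3                              # switching frequency, Hz
w_sw = 2 * np.pi * fsw
f_res = np.sqrt((L1 + L2) / (L1 * L2 * Cf)) / (2 * np.pi)

att_L = tf_L(w_sw, L1 + L2)             # same total inductance for fairness
att_LCL = tf_LCL(w_sw, L1, L2, Cf)
```

At the switching frequency the LCL filter attenuates the current harmonic far more than an L filter of the same total inductance, provided the resonance frequency f_res stays well below f_sw.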

  12. Filter synthesis using Genesys S/Filter

    Rhea, Randall W

    2014-01-01

    S/Filter includes tools beyond direct synthesis, including a wide variety of both exact and approximate equivalent network transforms, methods for selecting the most desirable out of potentially thousands of synthesized alternatives, and a transform history record that simplifies design attempts requiring iteration. Very few software programs are based on direct synthesis, and the additional features of S/Filter make it a uniquely effective tool for filter design.This resource presents a practical guide to using Genesys software for microwave and RF filter design and synthesis. The focus of th

  13. HEPA Filter Performance under Adverse Conditions

    This study involved challenging nuclear grade high-efficiency particulate air (HEPA) filters under a variety of conditions that can arise in Department of Energy (DOE) applications such as: low or high RH, controlled and uncontrolled challenge, and filters with physically damaged media or seals (i.e., leaks). Reported findings correlate filter function as measured by traditional differential pressure techniques in comparison with simultaneous instrumental determination of up and down stream PM concentrations. Additionally, emission rates and failure signatures will be discussed for filters that have either failed or exceeded their usable lifetime. Significant findings from this effort include the use of thermocouples up and down stream of the filter housing to detect the presence of moisture. Also demonstrated in the moisture challenge series of tests is the effect of repeated wetting of the filter. This produces a phenomenon referred to as transient failure before the tensile strength of the media weakens to the point of physical failure. An evaluation of the effect of particle size distribution of the challenge aerosol on loading capacity of filters is also included. Results for soot and two size distributions of KCl are reported. Loading capacities for filters ranged from approximately 70 g of soot to nearly 900 g for the larger particle size distribution of KCl. (authors)

  14. Adaptive Threshold Median Filter for Multiple-Impulse Noise

    JIANG Bo; HUANG Wei

    2007-01-01

    Attenuating noise plays an essential role in image processing. Almost all traditional median filters concern the removal of impulse noise having a single layer, whose noise gray level value is constant. In this paper, a new adaptive median filter is proposed to handle images corrupted by more than single-layer noise. The adaptive threshold median filter (ATMF) has been developed by combining the adaptive median filter (AMF) with two dynamic thresholds. Because of the dynamic thresholds being used, the ATMF is able to balance the removal of the multiple-impulse noise against the quality of the image. A comparison of the proposed method with traditional median filters is provided. Some visual examples are given to demonstrate the performance of the proposed filter.
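The idea of replacing a pixel only when it violates a data-driven threshold can be sketched as follows. This is a simplified stand-in: the deviation rule (k times the window's median absolute deviation) is our illustrative choice, not the paper's exact pair of dynamic thresholds.

```python
import numpy as np

def atmf(img, win=3, k=2.0):
    """Simplified adaptive-threshold median filter: a pixel is replaced
    by the window median only when it deviates from that median by more
    than k times the window's median absolute deviation (MAD), so
    noise-free detail such as edges is left untouched."""
    pad = win // 2
    padded = np.pad(img, pad, mode="edge")
    out = img.astype(float).copy()
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            w = padded[i:i + win, j:j + win]
            med = np.median(w)
            mad = np.median(np.abs(w - med)) + 1e-9   # avoid zero threshold
            if abs(img[i, j] - med) > k * mad:
                out[i, j] = med
    return out
```

On a flat patch an isolated impulse is removed, while a clean step edge passes through unchanged — the balance the dynamic thresholds are meant to strike.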

  15. FIRT: Filtered iterative reconstruction technique with information restoration.

    Chen, Yu; Zhang, Yan; Zhang, Kai; Deng, Yuchen; Wang, Shengliu; Zhang, Fa; Sun, Fei

    2016-07-01

    Electron tomography (ET) combined with subsequent sub-volume averaging has become a unique way to study the in situ 3D structures of macromolecular complexes. However, missing information due to limited angular sampling is still the bottleneck in high-resolution electron tomography applications. Here, based on the understanding of the smooth nature of biological specimens, we present a new iterative image reconstruction algorithm for electron tomography, FIRT (filtered iterative reconstruction technique), which combines the algebraic reconstruction technique (ART) with a nonlinear diffusion (ND) filter. Using both simulated and experimental data, in comparison to ART and the weighted back projection method, we show that FIRT generates a better reconstruction with reduced ray artifacts and significantly improved correlation with the ground truth, and partially restores the information in the non-sampled angular region, as verified by investigating the 90° re-projection and by cross-validation. This new algorithm will be useful in the future for both cellular and molecular ET with better quality and improved structural details. PMID:27134004

  16. A local particle filter for high dimensional geophysical systems

    S. G. Penny; Miyoshi, T.

    2015-01-01

    A local particle filter (LPF) is introduced that outperforms traditional ensemble Kalman filters in highly nonlinear/non-Gaussian scenarios, both in accuracy and computational cost. The standard Sampling Importance Resampling (SIR) particle filter is augmented with an observation-space localization approach, for which an independent analysis is computed locally at each gridpoint. The deterministic resampling approach of Kitagawa is adapted for application locally and combine...
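The two ingredients named in the abstract — localized importance weighting and Kitagawa's deterministic resampling — can be sketched at a single gridpoint. All values below are toy choices for illustration.

```python
import numpy as np

def systematic_resample(weights, rng):
    """Kitagawa-style deterministic (systematic) resampling."""
    n = len(weights)
    cs = np.cumsum(weights)
    cs[-1] = 1.0                           # guard against float round-off
    positions = (rng.random() + np.arange(n)) / n
    return np.searchsorted(cs, positions)

# Toy "local analysis" at one gridpoint: particles are weighted by the
# likelihood of a single nearby observation (the localization step:
# distant observations are simply excluded), then resampled.
rng = np.random.default_rng(2)
particles = rng.normal(0.0, 1.0, size=2000)    # prior ensemble, N(0, 1)
obs, obs_err = 1.5, 0.5
w = np.exp(-0.5 * ((obs - particles) / obs_err) ** 2)
w /= w.sum()
posterior = particles[systematic_resample(w, rng)]
```

For this Gaussian toy case the exact posterior mean is 1.5/(1 + 0.25) = 1.2, so the resampled ensemble can be checked against it.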

  17. Ceramic filters for bulk inoculation of nickel alloy castings

    F. Binczyk; J. Śleziona; P. Gradoń

    2011-01-01

    The work includes the results of research on the production technology of ceramic filters which, besides the traditional filtering function, also play the role of an inoculant modifying the macrostructure of cast nickel alloys. To play this additional role, filters should demonstrate sufficient compression strength and ensure a proper flow rate of liquid alloy. The role of the inoculant is played by cobalt aluminate introduced into the composition of the external coating in an amount from 5 to 10 wt.%. The ...

  18. Single-periodic-film optical bandpass filter

    Niraula, Manoj; Magnusson, Robert

    2015-01-01

    Resonant periodic surfaces and films enable new functionalities with wide applicability in practical optical systems. Their material sparsity, ease of fabrication, and minimal interface count provide environmental and thermal stability and robustness in applications. Here we report an experimental bandpass filter fashioned in a single patterned layer on a substrate. Its performance corresponds to bandpass filters requiring perhaps 30 traditional thin-film layers as shown by an example. We demonstrate an ultra-narrow, high-efficiency bandpass filter with extremely wide, flat, and low sidebands. This class of devices is designed with rigorous solutions of the Maxwell equations while engaging the physical principles of resonant waveguide gratings. The proposed technology is integration-friendly and opens doors for further development in various disciplines and spectral regions where thin-film solutions are traditionally applied.

  19. An Adaptive Approach to Mitigate Background Covariance Limitations in the Ensemble Kalman Filter

    Song, Hajoon

    2010-07-01

    A new approach is proposed to address the background covariance limitations arising from undersampled ensembles and unaccounted model errors in the ensemble Kalman filter (EnKF). The method enhances the representativeness of the EnKF ensemble by augmenting it with new members chosen adaptively to add missing information that prevents the EnKF from fully fitting the data to the ensemble. The vectors to be added are obtained by back projecting the residuals of the observation misfits from the EnKF analysis step onto the state space. The back projection is done using an optimal interpolation (OI) scheme based on an estimated covariance of the subspace missing from the ensemble. In the experiments reported here, the OI uses a preselected stationary background covariance matrix, as in the hybrid EnKF–three-dimensional variational data assimilation (3DVAR) approach, but the resulting correction is included as a new ensemble member instead of being added to all existing ensemble members. The adaptive approach is tested with the Lorenz-96 model. The hybrid EnKF–3DVAR is used as a benchmark to evaluate the performance of the adaptive approach. Assimilation experiments suggest that the new adaptive scheme significantly improves the EnKF behavior when it suffers from small size ensembles and neglected model errors. It was further found to be competitive with the hybrid EnKF–3DVAR approach, depending on ensemble size and data coverage.
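
    The adaptive step described above can be sketched as a stochastic EnKF analysis followed by an OI back projection of the residual, returned as an extra member. The toy state size, observation operator, and static covariance below are assumptions for illustration, not the paper's Lorenz-96 configuration:

```python
import numpy as np

rng = np.random.default_rng(1)

def enkf_analysis(E, y, H, R):
    """Stochastic (perturbed-observations) EnKF analysis; E is (n_state, n_ens)."""
    n = E.shape[1]
    A = E - E.mean(axis=1, keepdims=True)          # ensemble anomalies
    HE = H @ E
    HA = HE - HE.mean(axis=1, keepdims=True)
    K = (A @ HA.T) @ np.linalg.inv(HA @ HA.T + (n - 1) * R)
    Y = y[:, None] + rng.multivariate_normal(np.zeros(len(y)), R, n).T
    return E + K @ (Y - HE)

def adaptive_member(E_a, y, H, R, B):
    """Back-project the residual misfit onto state space with an OI step
    using a preselected static covariance B; the correction is returned
    as a new ensemble member rather than added to all members."""
    x_mean = E_a.mean(axis=1)
    resid = y - H @ x_mean                         # misfit the EnKF left
    K_oi = B @ H.T @ np.linalg.inv(H @ B @ H.T + R)
    return x_mean + K_oi @ resid

# Toy 4-variable state observed at two gridpoints.
n_state, n_ens = 4, 5
H = np.zeros((2, n_state)); H[0, 0] = H[1, 2] = 1.0
R = 0.1 * np.eye(2)
B = np.eye(n_state)                                # assumed static covariance
E = rng.normal(0.0, 1.0, (n_state, n_ens))
y = np.array([2.0, -1.0])
E_a = enkf_analysis(E, y, H, R)
E_aug = np.column_stack([E_a, adaptive_member(E_a, y, H, R, B)])
```

    The augmented member fits the observations better than the analysis mean, which is exactly the missing information the scheme is meant to restore.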

  20. An iterative ensemble Kalman filter for reservoir engineering applications

    Krymskaya, M.V.; Hanea, R.G.; Verlaan, M.

    2009-01-01

    The study has been focused on examining the usage and the applicability of ensemble Kalman filtering techniques to the history matching procedures. The ensemble Kalman filter (EnKF) is often applied nowadays to solving such a problem. Meanwhile, traditional EnKF requires assumption of the distributi...

  1. HEPA Filter Vulnerability Assessment

    GUSTAVSON, R.D.

    2000-05-11

    This assessment of High Efficiency Particulate Air (HEPA) filter vulnerability was requested by the USDOE Office of River Protection (ORP) to satisfy a DOE-HQ directive to evaluate the effect of filter degradation on the facility authorization basis assumptions. Within the scope of this assessment are ventilation system HEPA filters that are classified as Safety-Class (SC) or Safety-Significant (SS) components that perform an accident mitigation function. The objective of the assessment is to verify whether HEPA filters that perform a safety function during an accident are likely to perform as intended to limit release of hazardous or radioactive materials, considering factors that could degrade the filters. Filter degradation factors considered include aging, wetting of filters, exposure to high temperature, exposure to corrosive or reactive chemicals, and exposure to radiation. Screening and evaluation criteria were developed by a site-wide group of HVAC engineers and HEPA filter experts from published empirical data. For River Protection Project (RPP) filters, the only degradation factor that exceeded the screening threshold was for filter aging. Subsequent evaluation of the effect of filter aging on the filter strength was conducted, and the results were compared with required performance to meet the conditions assumed in the RPP Authorization Basis (AB). It was found that the reduction in filter strength due to aging does not affect the filter performance requirements as specified in the AB. A portion of the HEPA filter vulnerability assessment is being conducted by the ORP and is not part of the scope of this study. The ORP is conducting an assessment of the existing policies and programs relating to maintenance, testing, and change-out of HEPA filters used for SC/SS service. This document presents the results of a HEPA filter vulnerability assessment conducted for the River protection project as requested by the DOE Office of River Protection.

  4. Effect of different thickness of material filter on Tc-99m spectra and performance parameters of gamma camera

    Nazifah, A.; Norhanna, S.; Shah, S. I.; Zakaria, A.

    2014-11-01

    This study aimed to investigate the effects of the material filter technique on Tc-99m spectra and on performance parameters of a Philip ADAC forte dual head gamma camera. The thickness of the material filter was selected on the basis of the percentage attenuation of various gamma ray energies by different thicknesses of zinc. A cylindrical source tank of the NEMA single photon emission computed tomography (SPECT) Triple Line Source Phantom, filled with water and injected with Tc-99m radionuclide, was used for spectra, uniformity and sensitivity measurements. A vinyl plastic tube was used as a line source for spatial resolution. Images for uniformity were reconstructed by the filtered back projection method. A Butterworth filter of order 5 and cutoff frequency 0.35 cycles/cm was selected. Chang's attenuation correction method was applied with a linear attenuation coefficient of 0.13/cm. The material filter decreased the count rate in the Compton region of the Tc-99m energy spectrum, and also in the photopeak region. Spatial resolution was improved. However, the uniformity of the tomographic image was equivocal, and system volume sensitivity was reduced by the material filter. The material filter improved the system's spatial resolution; therefore, the technique may be used for phantom studies to improve image quality.
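
    The order-5, 0.35 cycles/cm Butterworth filter referred to above has a simple closed form in the frequency domain; a brief sketch using one common definition of the window (the frequency grid is illustrative):

```python
import numpy as np

def butterworth_window(freqs, cutoff=0.35, order=5):
    """One common form of the Butterworth low-pass window used to smooth
    SPECT projections (order 5, cutoff 0.35 cycles/cm, as in the study)."""
    return 1.0 / (1.0 + (freqs / cutoff) ** (2 * order))

f = np.linspace(0.0, 1.0, 101)      # spatial frequency, cycles/cm
w = butterworth_window(f)
```

    The window is 1 at zero frequency, 0.5 at the cutoff, and rolls off steeply above it, which is what trades a little resolution for noise suppression in the reconstructed slices.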

  5. HEPA filter monitoring program

    Kirchner, K. N.; Johnson, C. M.; Aiken, W. F.; Lucerna, J. J.; Barnett, R. L.; Jensen, R. T.

    1986-07-01

    The testing and replacement of HEPA filters, widely used in the nuclear industry to purify process air, are costly and labor-intensive. Current methods of testing filter performance, such as differential pressure measurement and scanning air monitoring, allow determination of overall filter performance but preclude detection of incipient filter failure such as small holes in the filters. Using current technology, a continual in-situ monitoring system was designed which provides three major improvements over current methods of filter testing and replacement. The improvements include: cost savings by reducing the number of intact filters which are currently being replaced unnecessarily; more accurate and quantitative measurement of filter performance; and reduced personnel exposure to a radioactive environment by automatically performing most testing operations.

  6. Novel Backup Filter Device for Candle Filters

    Bishop, B.; Goldsmith, R.; Dunham, G.; Henderson, A.

    2002-09-18

    The currently preferred means of particulate removal from process or combustion gas generated by advanced coal-based power production processes is filtration with candle filters. However, candle filters have not shown the requisite reliability to be commercially viable for hot gas clean up for either integrated gasifier combined cycle (IGCC) or pressurized fluid bed combustion (PFBC) processes. Even a single candle failure can lead to unacceptable ash breakthrough, which can result in (a) damage to highly sensitive and expensive downstream equipment, (b) unacceptably low system on-stream factor, and (c) unplanned outages. The U.S. Department of Energy (DOE) has recognized the need to have fail-safe devices installed within or downstream from candle filters. In addition to CeraMem, DOE has contracted with Siemens-Westinghouse, the Energy & Environmental Research Center (EERC) at the University of North Dakota, and the Southern Research Institute (SRI) to develop novel fail-safe devices. Siemens-Westinghouse is evaluating honeycomb-based filter devices on the clean-side of the candle filter that can operate up to 870 C. The EERC is developing a highly porous ceramic disk with a sticky yet temperature-stable coating that will trap dust in the event of filter failure. SRI is developing the Full-Flow Mechanical Safeguard Device that provides a positive seal for the candle filter. Operation of the SRI device is triggered by the higher-than-normal gas flow from a broken candle. The CeraMem approach is similar to that of Siemens-Westinghouse and involves the development of honeycomb-based filters that operate on the clean-side of a candle filter. The overall objective of this project is to fabricate and test silicon carbide-based honeycomb failsafe filters for protection of downstream equipment in advanced coal conversion processes. The fail-safe filter, installed directly downstream of a candle filter, should have the capability for stopping essentially all particulate

  7. MST Filterability Tests

    Poirier, M. R. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); Burket, P. R. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); Duignan, M. R. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL)

    2015-03-12

    The Savannah River Site (SRS) is currently treating radioactive liquid waste with the Actinide Removal Process (ARP) and the Modular Caustic Side Solvent Extraction Unit (MCU). The low filter flux through the ARP has limited the rate at which radioactive liquid waste can be treated. Recent filter flux has averaged approximately 5 gallons per minute (gpm). Salt Batch 6 has had a lower processing rate and required frequent filter cleaning. Savannah River Remediation (SRR) has a desire to understand the causes of the low filter flux and to increase ARP/MCU throughput. In addition, at the time the testing started, SRR was assessing the impact of replacing the 0.1 micron filter with a 0.5 micron filter. This report describes testing of MST filterability to investigate the impact of filter pore size and MST particle size on filter flux, and testing of filter enhancers to attempt to increase filter flux. The authors constructed a laboratory-scale crossflow filter apparatus with two crossflow filters operating in parallel. One filter was a 0.1 micron Mott sintered SS filter and the other was a 0.5 micron Mott sintered SS filter. The authors also constructed a dead-end filtration apparatus to conduct screening tests with potential filter aids and body feeds, referred to as filter enhancers. The original baseline for ARP was 5.6 M sodium salt solution with a free hydroxide concentration of approximately 1.7 M. ARP has been operating with a sodium concentration of approximately 6.4 M and a free hydroxide concentration of approximately 2.5 M. SRNL conducted tests varying the concentration of sodium and free hydroxide to determine whether those changes had a significant effect on filter flux. The feed slurries for the MST filterability tests were composed of simple salts (NaOH, NaNO2, and NaNO3) and MST (0.2 – 4.8 g/L). The feed slurry for the filter enhancer tests contained simulated salt batch 6 supernate, MST, and filter enhancers.

  8. Tradition og Modernisme

    Bay, Carl Erik

    ...the article "Tradition og Modernisme" sets out his view of conventional and modern forms in town planning, architecture and design. In this anthology, "Tradition og Modernisme" is reprinted and five researchers analyse it from different angles. It is put into perspective in relation to PH's philosophical point of departure in...

  9. Family Customs and Traditions.

    MacGregor, Cynthia

    Recognizing the importance of maintaining open communication with immediate and extended family members, this book provides a compilation of ideas for family traditions and customs that are grounded in compassion and human kindness. The traditions were gathered from families in the United States and Canada who responded to advertisements in…

  10. Oriented Fiber Filter Media

    Bharadwaj, R.; Patel, A.; Chokdeepanich, S.; Chase, G.G.

    2008-01-01

    Coalescing filters are widely used throughout industry, and improved performance will reduce droplet emissions and operating costs. Experimental observations show that the orientation of micro fibers in filter media affects the permeability and the separation efficiency of the media. In this work two methods are used to align the fibers and thereby alter the filter structure. The results show that axially aligned fiber media improve the quality factor on the order of 20%, and cutting media on an angle from a t...

  11. Introduction to Kalman Filtering

    Alazard, Daniel

    2005-01-01

    This document is an introduction to Kalman optimal filtering applied to linear systems. It is assumed that the reader is already familiar with linear servo-loop theory, frequency-domain filtering (continuous and discrete-time) and the state-space approach to representing linear systems. Generally, filtering consists in estimating useful information (a signal) from a measurement (of this information) perturbed by noise. Frequency-domain filtering assumes that a frequency-domain separation exists between...
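
    As a concrete instance of the estimation problem described, a scalar Kalman filter for a slowly varying state observed in noise takes only a few lines; the random-walk model and noise variances here are illustrative assumptions:

```python
import numpy as np

def kalman_1d(zs, q=1e-3, r=0.25, x0=0.0, p0=1.0):
    """Scalar Kalman filter for a random-walk state observed in noise.
    q: process-noise variance, r: measurement-noise variance."""
    x, p, est = x0, p0, []
    for z in zs:
        p = p + q                    # predict (random-walk model)
        k = p / (p + r)              # Kalman gain
        x = x + k * (z - x)          # update with the measurement
        p = (1.0 - k) * p
        est.append(x)
    return np.array(est)

rng = np.random.default_rng(0)
truth = 1.0
zs = truth + rng.normal(0.0, 0.5, size=200)   # noisy measurements
est = kalman_1d(zs)
```

    On average, the filtered estimates sit much closer to the truth than the raw measurements do, which is the signal/noise separation the text describes in state-space rather than frequency-domain terms.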

  12. Adaptive acoustooptic filter

    Psaltis, Demetri; Hong, John

    1984-01-01

    A new adaptive filter utilizing acoustooptic devices in a space integrating architecture is described. Two configurations are presented; one of them, suitable for signal estimation, is shown to approximate the Wiener filter, while the other, suitable for detection, is shown to approximate the matched filter.
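
    For intuition about the matched-filter mode of operation mentioned above: in discrete time, a matched filter is simply correlation with the known template. A toy sketch with an assumed template, delay, and noise level:

```python
import numpy as np

rng = np.random.default_rng(2)

# Known template buried at an unknown delay in noise; the matched filter
# (correlation with the template) peaks near that delay.
template = np.hanning(32)
signal = rng.normal(0.0, 0.1, size=256)
delay = 100
signal[delay:delay + len(template)] += template

score = np.correlate(signal, template, mode="valid")
detected = int(np.argmax(score))    # estimate of the delay
```

    The Wiener (estimation) configuration instead shapes the filter to the signal and noise spectra rather than to a known template.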

  13. Filtering in Finance.

    Lautier, Delphine; Javaheri, Alireza; Galli, Alain

    2003-01-01

    In this article we present an introduction to various filtering algorithms and some of their applications to the world of quantitative finance. We shall first mention the fundamental case of Gaussian noises, where we obtain the well-known Kalman filter. Because of common nonlinearities, we will then discuss the extended Kalman filter.

  14. HEPA filter encapsulation

    Gates-Anderson, Dianne D.; Kidd, Scott D.; Bowers, John S.; Attebery, Ronald W.

    2003-01-01

    A low viscosity resin is delivered into a spent HEPA filter or other waste. The resin is introduced into the filter or other waste using a vacuum to assist in the mass transfer of the resin through the filter media or other waste.

  15. Filter service system

    Sellers, Cheryl L.; Nordyke, Daniel S.; Crandell, Richard A.; Tomlins, Gregory; Fei, Dong; Panov, Alexander; Lane, William H.; Habeger, Craig F.

    2008-12-09

    According to an exemplary embodiment of the present disclosure, a system for removing matter from a filtering device includes a gas pressurization assembly. An element of the assembly is removably attachable to a first orifice of the filtering device. The system also includes a vacuum source fluidly connected to a second orifice of the filtering device.

  16. Modified Adaptive Weighted Averaging Filtering Algorithm for Noisy Image Sequences

    LI Weifeng; YU Daoyin; CHEN Xiaodong

    2007-01-01

    In order to avoid the influence of noise variance on filtering performance, a modified adaptive weighted averaging (MAWA) filtering algorithm is proposed for noisy image sequences. Based upon adaptive weighted averaging of pixel values in consecutive frames, the algorithm achieves its filtering goal by assigning smaller weights to pixels whose estimated motion trajectories are inappropriate for noise suppression. It utilizes only the intensity of pixels to suppress noise and is accordingly independent of noise variance. To evaluate the performance of the proposed filtering algorithm, its mean square error and percentage of preserved edge points were compared with those of the traditional adaptive weighted averaging and non-adaptive mean filtering algorithms under different noise variances. The results show that the MAWA filtering algorithm can preserve image structures and edges under motion while attenuating noise, and thus may be used in image sequence filtering.
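
    The weighting idea can be illustrated with a simplified temporal filter: each frame's weight at a pixel decays with its deviation from the current frame, so a frame that is inconsistent there (a bad motion trajectory) contributes little. The exponential weight and its scale are assumptions for illustration, not the MAWA weights themselves:

```python
import numpy as np

def weighted_temporal_filter(frames, scale=10.0):
    """Weighted averaging across consecutive frames: the weight of each
    frame's pixel decays with its deviation from the current frame, so
    outlier pixels are effectively excluded from the average."""
    frames = np.asarray(frames, dtype=float)
    ref = frames[-1]                      # current frame
    w = np.exp(-np.abs(frames - ref) / scale)
    return (w * frames).sum(axis=0) / w.sum(axis=0)

# Static scene plus one outlier frame (simulating a motion artifact).
good = np.full((8, 8), 100.0)
stack = np.stack([good + 1, good - 1, good + 200, good])
out = weighted_temporal_filter(stack)
```

    For the pixel shown, a plain temporal mean is dragged to 150 by the outlier frame, while the weighted average stays at about 100.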

  17. Improved Passive-Damped LCL Filter to Enhance Stability in Grid-Connected Voltage-Source Converters

    Beres, Remus Narcis; Wang, Xiongfei; Blaabjerg, Frede;

    2015-01-01

    This paper proposes an improved passive-damped LCL filter to be used as interface between the grid-connected voltage-source converters and the utility grid. The proposed filter replaces the LCL filter capacitor with a traditional C-type filter with the resonant circuit tuned in such a way that sw...

  19. Filter assessment applied to analytical reconstruction for industrial third-generation tomography

    Velo, Alexandre F.; Martins, Joao F.T.; Oliveira, Adriano S.; Carvalho, Diego V.S.; Faria, Fernando S.; Hamada, Margarida M.; Mesquita, Carlos H., E-mail: afvelo@usp.br [Instituto de Pesquisas Energeticas e Nucleares (IPEN/CNEN-SP), Sao Paulo, SP (Brazil)

    2015-07-01

    Multiphase systems are structures that contain a mixture of solids, liquids and gases inside a chemical reactor or pipes in a dynamic process. These systems are found in the chemical, food, pharmaceutical and petrochemical industries. The gamma ray computed tomography (CT) system has been applied to visualize the distribution of multiphase systems without interrupting production. CT systems have been used to improve the design, operation and troubleshooting of industrial processes. Computed tomography for multiphase processes is being developed at several laboratories. It is well known that scanning systems demand high processing time and offer a limited set of data projections and views to obtain an image. Because of this, image quality depends on the number of projections, the number of detectors, the acquisition time and the reconstruction time. A phantom containing air, iron and aluminum was used on the third-generation industrial tomograph with a 662 keV (137Cs) radioactive source. The Filtered Back Projection algorithm was applied to reconstruct the images. An efficient tomograph depends on image quality, so the objective of this research was to apply different types of filters in the analytical algorithm and compare them using the figure of merit known as the root mean squared error (RMSE); the filter presenting the lowest RMSE has the best quality. In this research, five types of filters were used: Ram-Lak, Shepp-Logan, Cosine, Hamming and Hann. All filters presented low RMSE values, meaning that they have low standard deviation compared to the mass absorption coefficient; however, the Hann filter presented better RMSE and CNR than the others. (author)
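
    The five filters compared above are the ramp |f| multiplied by different apodizing windows. A sketch of common textbook forms (frequencies normalized to Nyquist; exact window definitions can vary between implementations):

```python
import numpy as np

def ramp_window(f, kind="ram-lak"):
    """Apodized ramp filters used in FBP; f is in [0, 1] (Nyquist = 1)."""
    ramp = np.abs(f)
    if kind == "ram-lak":
        win = np.ones_like(f)                 # pure ramp, sharpest, noisiest
    elif kind == "shepp-logan":
        win = np.sinc(f / 2.0)                # sin(pi f/2) / (pi f/2)
    elif kind == "cosine":
        win = np.cos(np.pi * f / 2.0)
    elif kind == "hamming":
        win = 0.54 + 0.46 * np.cos(np.pi * f)
    elif kind == "hann":
        win = 0.5 * (1.0 + np.cos(np.pi * f))
    else:
        raise ValueError(kind)
    return ramp * win

f = np.linspace(0.0, 1.0, 101)
```

    The Hann window rolls off to zero at Nyquist, which suppresses the amplified high-frequency noise; that smoother roll-off is consistent with the lower RMSE reported above.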

  20. Bias aware Kalman filters

    Drecourt, J.-P.; Madsen, H.; Rosbjerg, Dan

    2006-01-01

    This paper reviews two different approaches that have been proposed to tackle the problem of model bias with the Kalman filter: the use of a colored noise model and the implementation of a separate bias filter. Both filters are implemented with and without feedback of the bias into the model state... illustrated on a simple one-dimensional groundwater problem. The results show that the presented filters outperform the standard Kalman filter, and that the implementations with bias feedback work in more general conditions than the implementations without feedback. © 2005 Elsevier Ltd. All rights reserved.

  1. Ceramic fiber filter technology

    Holmes, B.L.; Janney, M.A.

    1996-06-01

    Fibrous filters have been used for centuries to protect individuals from dust, disease, smoke, and other gases or particulates. In the 1970s and 1980s ceramic filters were developed for filtration of hot exhaust gases from diesel engines. Tubular, or candle, filters have been made to remove particles from gases in pressurized fluidized-bed combustion and gasification-combined-cycle power plants. Very efficient filtration is necessary in power plants to protect the turbine blades. The limited lifespan of ceramic candle filters has been a major obstacle in their development. The present work is focused on forming fibrous ceramic filters using a papermaking technique. These filters are highly porous and therefore very lightweight. The papermaking process consists of filtering a slurry of ceramic fibers through a steel screen to form paper. Papermaking and the selection of materials will be discussed, as well as preliminary results describing the geometry of papers and relative strengths.

  2. Changing ventilation filters

    A filter changing unit has a door which interlocks with the door of a filter chamber so as to prevent contamination of the outer surfaces of the doors by radioactive material collected on the filter element, and a movable support which enables a filter chamber stored thereon to be moved within the unit in such a way that the doors of the unit and the filter chamber can be replaced. The door pivots and interlocks with the other door by means of a bolt, and a seal around the periphery lip of the first door engages the periphery of the second door to seal the gap. A support pivots into a lower filter element storage position. Inspection windows and glove ports are provided. The unit is releasably connected to the filter chamber by bolts engaging in a flange provided around an opening. (author)

  3. Traditional Urban Aboriginal Religion

    Kristina Everett

    2009-01-01

    This paper represents a group of Aboriginal people who claim traditional Aboriginal ownership of a large Australian metropolis. They have struggled for at least the last 25 to 30 years to articulate and represent their contemporary group identity to the wider Australian society, which very often does not take their expressions seriously. This is largely because dominant discourses claim that 'authentic' Aboriginal culture only exists in remote, pristine areas far away from western society, and that urban Aboriginal traditions, especially urban religious traditions, are today defunct. This paper is an account of one occasion on which such traditional Aboriginal religious practice was performed before the eyes of a group of tourists.

  4. Compact planar microwave blocking filters

    U-Yen, Kongpop (Inventor); Wollack, Edward J. (Inventor)

    2012-01-01

    A compact planar microwave blocking filter includes a dielectric substrate and a plurality of filter unit elements disposed on the substrate. The filter unit elements are interconnected in a symmetrical series cascade with filter unit elements being organized in the series based on physical size. In the filter, a first filter unit element of the plurality of filter unit elements includes a low impedance open-ended line configured to reduce the shunt capacitance of the filter.

  5. Frequency weighting filter design for automotive ride comfort evaluation

    Du, Feng

    2016-04-01

    Few studies give guidance on designing weighting filters according to the frequency weighting factors, and the additional evaluation method of automotive ride comfort is not made good use of in some countries. Based on the regularities of the weighting factors, a method is proposed and the vertical and horizontal weighting filters are developed. The whole frequency range is divided several times into two parts, each with its own regularity. For each division, a parallel filter constituted by a low- and a high-pass filter with the same cutoff frequency and quality factor is utilized to achieve the section factors. Cascading these parallel filters yields the entire factors. These filters are of high order, but low-order filters are preferred in some applications. The bilinear transformation method and the least P-norm optimal infinite impulse response (IIR) filter design method are employed to develop low-order filters that approximate the weightings in the standard. In addition, with the window method, a linear-phase finite impulse response (FIR) filter is designed to keep the signal from distorting and to obtain the staircase weighting. For the same case, the traditional method produces a weighted root mean square (r.m.s.) acceleration of 0.3307 m·s-2 and the filtering method gives 0.3119 m·s-2; the fourth-order filter approximating the vertical weighting obtains 0.3139 m·s-2. Crest factors of the acceleration signal weighted by the weighting filter and the fourth-order filter are 3.0027 and 3.0111, respectively. This paper proposes several methods to design frequency weighting filters for automotive ride comfort evaluation, and the developed weighting filters are effective.
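
    The parallel low-/high-pass construction described above can be illustrated with first-order sections; the cutoff and plateau gains below are assumed values for illustration, not factors from the standard:

```python
import numpy as np

def lowpass(f, fc):
    """First-order low-pass magnitude response."""
    return 1.0 / np.sqrt(1.0 + (f / fc) ** 2)

def highpass(f, fc):
    """First-order high-pass magnitude response."""
    return (f / fc) / np.sqrt(1.0 + (f / fc) ** 2)

def section_weight(f, fc, a_low, a_high):
    """One parallel section: a low-pass and a high-pass share the cutoff
    fc, scaled so the magnitude approaches a_low well below fc and a_high
    well above it. Cascading such sections builds up the full weighting."""
    return a_low * lowpass(f, fc) + a_high * highpass(f, fc)

f = np.logspace(-1, 2, 400)                      # frequency grid, Hz
w = section_weight(f, fc=4.0, a_low=0.5, a_high=1.0)
```

    Each division of the frequency range contributes one such section, and the product of the sections approximates the piecewise weighting-factor curve.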

  6. KASTAMONU TRADITIONAL WOMEN CLOTHES

    Özus, E. Elhan; Erden, Filiz; Tufan, Melek

    2015-01-01

    Clothing is the distinctive dressing style of a community, a period or a profession. In clothing there is a principle of social status and differentiation rather than fashion. In this context, each society created a clothing style in line with its own customs, traditions and social structure. One of the features separating societies from each other and indicating their cultural and social classes is their clothing style. As is known, traditional Turkish clothes reflecting...

  7. Survey of Sparse Adaptive Filters for Acoustic Echo Cancellation

    Krishna Samalla

    2013-01-01

    This paper reviews developments over the last decade in adaptive methods for sparse adaptive filters used to identify sparse impulse responses in both network and acoustic echo cancellation. A variety of different architectures and novel training algorithms have been proposed in the literature. At present, most work in echo cancellation relies on using more than one method. Sparse adaptive filters take advantage of each method and show good improvement in sparseness-measure performance. This survey gives an overview of existing sparse adaptive filter mechanisms and discusses their advantages over the traditional adaptive filters developed for echo cancellation.
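
    The baseline that the sparse variants improve on is the normalized LMS (NLMS) adaptive filter; a minimal identification sketch with an assumed single-tap sparse echo path (noise-free for clarity):

```python
import numpy as np

rng = np.random.default_rng(4)

def nlms(x, d, n_taps=16, mu=0.5, eps=1e-6):
    """Normalized LMS adaptive filter: the classical baseline that sparse
    variants (e.g. proportionate NLMS) improve on for sparse echo paths."""
    w = np.zeros(n_taps)
    err = np.zeros(len(x))
    for n in range(n_taps - 1, len(x)):
        u = x[n - n_taps + 1:n + 1][::-1]   # [x[n], x[n-1], ..., x[n-15]]
        err[n] = d[n] - w @ u               # a-priori error
        w += mu * err[n] * u / (u @ u + eps)
    return w, err

# Sparse echo path: one dominant tap, typical of network echo channels.
h = np.zeros(16)
h[5] = 0.8
x = rng.normal(size=4000)                   # far-end signal stand-in
d = np.convolve(x, h)[:len(x)]              # echo picked up at the microphone
w, err = nlms(x, d)
```

    NLMS spreads adaptation energy evenly over all taps; proportionate schemes instead give the large (active) taps larger step sizes, which is where the convergence gain on sparse paths comes from.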

  8. Miniature wideband filter based on coupled-line sections and quasi-lumped element resonator

    Zhurbenko, Vitaliy; Krozer, Viktor; Meincke, Peter

    2007-01-01

    A new design of a wideband bandpass filter is proposed, based on coupled-line sections and a quasi-lumped element resonator, taking advantage of the latter to introduce two transmission zeros and suppress a spurious response. The proposed filter demonstrates significantly improved characteristics... in comparison with a traditional coupled-line filter and exhibits a very compact structure...

  9. Filter material charging apparatus for filter assembly for radioactive contaminants

    A filter charging apparatus for a filter assembly is described. The filter assembly includes a housing with at least one filter bed therein. The filter charging apparatus, for adding filter material to the filter assembly, includes a tank with an opening therein; the tank opening is disposed in flow communication with opposed first and second conduit means, the first conduit means being in flow communication with the filter assembly housing and the second conduit means being in flow communication with a blower means. Upon activation of the blower means, the blower means pneumatically conveys the filter material from the tank to the filter housing.

  10. Evaluation of median filtering after reconstruction with maximum likelihood expectation maximization (ML-EM) by real space and frequency space

    Matsumoto, Keiichi; Fujita, Toru; Oogari, Koji [Kyoto Univ. (Japan). Hospital

    2002-05-01

    Maximum likelihood expectation maximization (ML-EM) image quality is sensitive to the number of iterations, because a large number of iterations leads to images with checkerboard noise. The use of median filtering in the reconstruction process allows both noise reduction and edge preservation. We examined the value of median filtering after reconstruction with ML-EM by comparing it with filtered back projection (FBP) with a ramp filter and with ML-EM without filtering. SPECT images were obtained with a dual-head gamma camera. The acquisition time was changed from 10 to 200 seconds/frame to examine the effect of the count statistics on the quality of the reconstructed images. First, images were reconstructed with ML-EM while changing the number of iterations from 1 to 150 in each study. Additionally, median filtering was applied following reconstruction with ML-EM. The quality of the reconstructed images was evaluated in terms of normalized mean square error (NMSE) values and two-dimensional power spectrum analysis. Median filtering after reconstruction by the ML-EM method provided stable NMSE values even when the number of iterations was increased, and the signal component of the image remained close to the reference image regardless of the number of iterations. Median filtering after reconstruction with ML-EM was useful in reducing noise while achieving a resolution similar to that of reconstruction with FBP and a ramp filter. Especially for images with poor count statistics, median filtering after reconstruction with ML-EM is effective as a simple, widely available method. (author)
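The post-filtering step evaluated in this record can be sketched in a few lines. This is a minimal numpy-only illustration; the disc phantom, noise level, 3x3 kernel size and NMSE definition are assumptions for demonstration, not the authors' acquisition setup:

```python
import numpy as np

def median_filter_3x3(img):
    """3x3 median filter with edge replication: suppresses checkerboard-style
    noise while preserving edges better than linear smoothing."""
    padded = np.pad(img, 1, mode="edge")
    # Stack the nine shifted views and take the median along the new axis.
    stack = np.stack([padded[r:r + img.shape[0], c:c + img.shape[1]]
                      for r in range(3) for c in range(3)])
    return np.median(stack, axis=0)

def nmse(estimate, reference):
    """Normalized mean square error against a noise-free reference image."""
    return np.sum((estimate - reference) ** 2) / np.sum(reference ** 2)

# Disc phantom standing in for a reconstructed slice, plus additive noise.
y, x = np.mgrid[:64, :64]
reference = ((x - 32) ** 2 + (y - 32) ** 2 < 20 ** 2).astype(float)
noisy = reference + np.random.default_rng(0).normal(0.0, 0.3, reference.shape)

filtered = median_filter_3x3(noisy)
```

On this toy phantom the filtered image has a lower NMSE than the unfiltered one, mirroring the stabilising effect the authors report.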

  11. Generic Kalman Filter Software

    Lisano, Michael E., II; Crues, Edwin Z.

    2005-01-01

    The Generic Kalman Filter (GKF) software provides a standard basis for the development of application-specific Kalman-filter programs. Historically, Kalman filters have been implemented by customized programs that must be written, coded, and debugged anew for each unique application, then tested and tuned with simulated or actual measurement data. Total development times for typical Kalman-filter application programs have ranged from weeks to months. The GKF software can simplify the development process and reduce the development time by eliminating the need to re-create the fundamental implementation of the Kalman filter for each new application. The GKF software is written in the ANSI C programming language. It contains a generic Kalman-filter-development directory that, in turn, contains code for a generic Kalman-filter function: a generically designed and generically coded implementation of linear, linearized, and extended Kalman filtering algorithms, including algorithms for the state- and covariance-update and -propagation functions. The mathematical theory that underlies the algorithms is well known and has been reported extensively in the open technical literature. Also contained in the directory are a header file that defines generic Kalman-filter data structures and prototype functions, and template versions of the application-specific subfunctions and of the calling navigation/estimation routine code and headers. Once the user has provided a calling routine and the required application-specific subfunctions, the application-specific Kalman-filter software can be compiled and executed immediately. During execution, the generic Kalman-filter function is called from a higher-level navigation or estimation routine that preprocesses measurement data and post-processes output data. The generic Kalman-filter function uses the aforementioned data structures and five implementation-specific subfunctions, which have been developed by the user on
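The state- and covariance-update and -propagation functions mentioned above can be sketched as follows. The GKF package itself is ANSI C; this is a language-neutral numpy sketch of the same linear Kalman equations, with an illustrative constant-velocity tracking loop (the model matrices and noise values are assumptions for the demo, not part of GKF):

```python
import numpy as np

def kf_predict(x, P, F, Q):
    """State and covariance propagation: x' = F x, P' = F P F^T + Q."""
    return F @ x, F @ P @ F.T + Q

def kf_update(x, P, z, H, R):
    """Measurement update with Kalman gain K = P H^T (H P H^T + R)^-1."""
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    return x + K @ (z - H @ x), (np.eye(len(x)) - K @ H) @ P

# Illustrative application: 1-D constant-velocity tracking from noisy
# position measurements (all matrices below are assumed for the demo).
dt = 1.0
F = np.array([[1.0, dt], [0.0, 1.0]])   # state transition
H = np.array([[1.0, 0.0]])              # only position is measured
Q = 0.01 * np.eye(2)                    # process noise covariance
R = np.array([[0.25]])                  # measurement noise covariance

x, P = np.zeros(2), np.eye(2)
rng = np.random.default_rng(1)
for t in range(50):
    z = np.array([0.5 * t + rng.normal(0.0, 0.5)])  # true position is 0.5*t
    x, P = kf_predict(x, P, F, Q)
    x, P = kf_update(x, P, z, H, R)
```

After the loop, `x` holds the filtered position/velocity estimate; an application-specific program supplies `F`, `H`, `Q`, `R` and the measurement stream, which is exactly the role of the user-provided subfunctions in GKF.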

  12. Conservative Noise Filters

    Mona M.Jamjoom

    2016-05-01

    Full Text Available Noisy training data have a huge negative impact on machine learning algorithms. Noise-filtering algorithms have been proposed to eliminate such noisy instances. In this work, we empirically show that the most popular noise-filtering algorithms have a large False Positive (FP) error rate; in other words, these noise filters mistakenly identify genuine instances as outliers and eliminate them. Therefore, we propose more conservative outlier identification criteria that improve the FP error rate and, thus, the performance of the noise filters. With the new filter, an instance is eliminated if and only if it is misclassified by a mutual decision of a Naïve Bayes (NB) classifier and the original filtering criterion being used. The number of genuine instances that are incorrectly eliminated is reduced as a result, thereby improving the classification accuracy.
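The mutual-decision criterion can be sketched as follows. This is an illustrative numpy-only sketch in which the base filter is a simple edited-nearest-neighbour rule and the NB component is a minimal Gaussian Naive Bayes; both are stand-ins chosen for self-containment, not the exact components of the paper:

```python
import numpy as np

def gnb_predict(X_train, y_train, X):
    """Minimal Gaussian Naive Bayes: per-class means/variances + priors."""
    classes = np.unique(y_train)
    stats = []
    for c in classes:
        Xc = X_train[y_train == c]
        stats.append((Xc.mean(axis=0), Xc.var(axis=0) + 1e-9,
                      len(Xc) / len(y_train)))
    preds = []
    for row in X:
        scores = [np.log(p) - 0.5 * np.sum(np.log(2 * np.pi * v)
                                           + (row - m) ** 2 / v)
                  for m, v, p in stats]
        preds.append(classes[int(np.argmax(scores))])
    return np.array(preds)

def one_nn_flags(X, y):
    """Base ENN-style filter: flag instance i if its nearest neighbour
    carries a different label."""
    flags = np.zeros(len(y), dtype=bool)
    for i in range(len(y)):
        d = np.sum((X - X[i]) ** 2, axis=1)
        d[i] = np.inf
        flags[i] = y[int(np.argmin(d))] != y[i]
    return flags

def conservative_filter(X, y):
    """Mutual decision: remove instance i only if the base filter flags it
    AND a leave-one-out NB classifier also misclassifies it."""
    keep = np.ones(len(y), dtype=bool)
    for i in np.where(one_nn_flags(X, y))[0]:
        mask = np.arange(len(y)) != i
        if gnb_predict(X[mask], y[mask], X[i:i + 1])[0] != y[i]:
            keep[i] = False
    return keep

# Demo: two Gaussian classes with injected label noise.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.0, 1.0, (50, 2)), rng.normal(4.0, 1.0, (50, 2))])
y = np.array([0] * 50 + [1] * 50)
y[:5] = 1  # five mislabelled instances
keep = conservative_filter(X, y)
```

Every instance the conservative filter removes was also flagged by the base filter, but not vice versa, so fewer genuine instances are lost.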

  13. KASTAMONU TRADITIONAL WOMEN CLOTHES

    E.Elhan ÖZUS

    2015-08-01

    Full Text Available Clothing is a unique dressing style of a community, a period or a profession. In clothing there is a principle of social status and differentiation rather than fashion. In this context, each society created a clothing style in line with its own customs, traditions and social structure. One of the features separating societies from each other and indicating their cultural and social classes is their clothing style. As is known, traditional Turkish clothes reflecting the characteristics of Turkish society are our most beautiful heritage from past to present. From this heritage, several examples of women's clothes have been carried to the present. When these examples are examined, it is possible to see the taste, the understanding of art, the joy and the lifestyle of the period. These garments are also documents outlining the taste and grace of the Turkish people. In the present study, traditional Kastamonu women's clothing, which has an important place among the traditional cultural clothes of Anatolia, is investigated. The method of the present research is primarily defined as the examination of written sources. The study is completed with the observations and examinations made in Kastamonu. According to the findings of the study, traditional Kastamonu women's clothing is examined and adapted to today's clothing.

  14. Retrofitting fabric filters for clean stack emission

    The fly ash generated from New South Wales coals, which are predominantly low sulphur coals, has been difficult to collect in traditional electrostatic precipitators. During the early 1970s, development work was undertaken on the use of fabric filters at some of the Commission's older power stations. The satisfactory performance of the plant at those power stations led to the selection of fabric filters for flue gas cleaning at the next two new power stations constructed by the Electricity Commission of New South Wales. On-going pilot plant testing has continued to indicate the satisfactory performance of enhanced designs of fabric filters of varying types, and the Commission has recently retrofitted pulse cleaned fabric filters to 2 x 350 MW units at a further power station, with plans to retrofit similar plant to the remaining 2 x 350 MW units at that station. A contract has also been let for the retrofitting of pulse cleaned fabric filters to 4 x 500 MW units at another power station in the Commission's system. The paper reviews the performance of the 6000 MW of plant operating with fabric filters. Fabric selection and fabric life form an important aspect of this review

  15. Hybrid Filter Membrane

    Laicer, Castro; Rasimick, Brian; Green, Zachary

    2012-01-01

    Cabin environmental control is an important issue for a successful Moon mission. Due to the unique environment of the Moon, lunar dust control is one of the main problems that significantly diminishes the air quality inside spacecraft cabins. Therefore, this innovation was motivated by NASA's need to minimize the negative health impact that air-suspended lunar dust particles have on astronauts in spacecraft cabins. It is based on fabrication of a hybrid filter comprising nanofiber nonwoven layers coated on porous polymer membranes with uniform cylindrical pores. This design results in a high-efficiency gas particulate filter with low pressure drop and the ability to be easily regenerated to restore filtration performance. A hybrid filter was developed consisting of a porous membrane with uniform, micron-sized, cylindrical pore channels coated with a thin nanofiber layer. Compared to conventional filter media such as a high-efficiency particulate air (HEPA) filter, this filter is designed to provide high particle efficiency, low pressure drop, and the ability to be regenerated. These membranes have well-defined micron-sized pores and can be used independently as air filters with a discrete particle size cut-off, or coated with nanofiber layers for filtration of ultrafine nanoscale particles. The filter consists of a thin design intended to facilitate filter regeneration by localized air pulsing. The main feature of this invention is the combination of a micro-engineered straight-pore membrane with nanofibers. The micro-engineered straight-pore membrane can be prepared with extremely high precision. Because the resulting membrane pores are straight and not tortuous like those found in conventional filters, the pressure drop across the filter is significantly reduced. The nanofiber layer is applied as a very thin coating to enhance filtration efficiency for fine nanoscale particles. Additionally, the thin nanofiber coating is designed to promote capture of

  16. Spot- Zombie Filtering System

    Arathy Rajagopal; B. Geethanjali; Arulprakash P

    2014-01-01

    A major security challenge on the Internet is the existence of a large number of compromised machines. Such machines have been increasingly used to launch various security attacks including spamming, spreading malware, DDoS, and identity theft. These compromised machines are called "Zombies". In general, e-mail applications and providers use spam filters to filter spam messages. Spam filtering is a technique for discriminating genuine messages from spam messages. The attackers...

  17. Kalman Filter Neuron Training

    Murase, Haruhiko; KOYAMA, Shuhei; HONAMI, Nobuo; Kuwabara, Takao

    1991-01-01

    An attempt to implement the Kalman filter algorithm in the procedure for training a neural network was made and evaluated. The Kalman filter neuron training program (KNT) was coded. The performance of the Kalman filter in KNT was compared to commonly used neuron training algorithms. The study revealed that KNT requires much less calculation time to accomplish neuron training than other commonly used algorithms do. KNT also gave a much smaller final error than any other algorithm tested in this study.

  18. Morphing ensemble Kalman filters

    Beezley, Jonathan D.; Mandel, Jan

    2008-01-01

    A new type of ensemble filter is proposed, which combines an ensemble Kalman filter (EnKF) with the ideas of morphing and registration from image processing. This results in filters suitable for non-linear problems whose solutions exhibit moving coherent features, such as thin interfaces in wildfire modelling. The ensemble members are represented as the composition of one common state with a spatial transformation, called registration mapping, plus a residual. A fully automatic registration m...

  19. Morphing Ensemble Kalman Filters

    Beezley, Jonathan D.; Mandel, Jan

    2007-01-01

    A new type of ensemble filter is proposed, which combines an ensemble Kalman filter (EnKF) with the ideas of morphing and registration from image processing. This results in filters suitable for nonlinear problems whose solutions exhibit moving coherent features, such as thin interfaces in wildfire modeling. The ensemble members are represented as the composition of one common state with a spatial transformation, called registration mapping, plus a residual. A fully automatic registration met...

  20. Nanofiber Filters Eliminate Contaminants

    2009-01-01

    With support from Phase I and II SBIR funding from Johnson Space Center, Argonide Corporation of Sanford, Florida tested and developed its proprietary nanofiber water filter media. Capable of removing more than 99.99 percent of dangerous particles like bacteria, viruses, and parasites, the media was incorporated into the company's commercial NanoCeram water filter, an inductee into the Space Foundation's Space Technology Hall of Fame. In addition to its drinking water filters, Argonide now produces large-scale nanofiber filters used as part of the reverse osmosis process for industrial water purification.

  1. Filters in nuclear facilities

    The topics of the nine papers given include the behavior of HEPA filters during exposure to air flows of high humidity as well as of high differential pressure, the development of steel-fiber filters suitable for extreme operating conditions, and the occurrence of various radioactive iodine species in the exhaust air from boiling water reactors. In an introductory presentation the German view of the performance requirements to be met by filters in nuclear facilities as well as the present status of filter quality assurance are discussed. (orig.)

  2. Updating the OMERACT filter

    Wells, George; Beaton, Dorcas E; Tugwell, Peter; Boers, Maarten; Kirwan, John R; Bingham, Clifton O; Boonen, Annelies; Brooks, Peter; Conaghan, Philip G; D'Agostino, Maria-Antonietta; Dougados, Maxime; Furst, Daniel E; Gossec, Laure; Guillemin, Francis; Helliwell, Philip; Hewlett, Sarah; Kvien, Tore K; Landewé, Robert B; March, Lyn; Mease, Philip J; Østergaard, Mikkel; Simon, Lee; Singh, Jasvinder A; Strand, Vibeke; van der Heijde, Désirée M

    2014-01-01

    The "Discrimination" part of the OMERACT Filter asks whether a measure discriminates between situations that are of interest. "Feasibility" in the OMERACT Filter encompasses the practical considerations of using an instrument, including its ease of use, time to complete, monetary costs, and...... interpretability of the question(s) included in the instrument. Both the Discrimination and Reliability parts of the filter have been helpful but were agreed on primarily by consensus of OMERACT participants rather than through explicit evidence-based guidelines. In Filter 2.0 we wanted to improve this definition...

  3. Oriented Fiber Filter Media

    R. Bharadwaj

    2008-06-01

    Full Text Available Coalescing filters are widely used throughout industry, and improved performance will reduce droplet emissions and operating costs. Experimental observations show that the orientation of microfibers in filter media affects the permeability and the separation efficiency of the filter media. In this work two methods are used to align the fibers to alter the filter structure. The results show that axially aligned fiber media improve the quality factor on the order of 20%, and that cutting media on an angle from a thick layered medium can improve performance by about 40%. The results also show the improved performance is not monotonically correlated to the average fiber angle of the medium.

  4. Traditional Chinese Biotechnology

    Xu, Yan; Wang, Dong; Fan, Wen Lai; Mu, Xiao Qing; Chen, Jian

    The earliest industrial biotechnology originated in ancient China and developed into a vibrant industry in traditional Chinese liquor, rice wine, soy sauce, and vinegar. It is now a significant component of the Chinese economy valued annually at about 150 billion RMB. Although the production methods had existed and remained basically unchanged for centuries, modern developments in biotechnology and related fields in the last decades have greatly impacted on these industries and led to numerous technological innovations. In this chapter, the main biochemical processes and related technological innovations in traditional Chinese biotechnology are illustrated with recent advances in functional microbiology, microbial ecology, solid-state fermentation, enzymology, chemistry of impact flavor compounds, and improvements made to relevant traditional industrial facilities. Recent biotechnological advances in making Chinese liquor, rice wine, soy sauce, and vinegar are reviewed.

  5. Traditional Chinese biotechnology.

    Xu, Yan; Wang, Dong; Fan, Wen Lai; Mu, Xiao Qing; Chen, Jian

    2010-01-01

    The earliest industrial biotechnology originated in ancient China and developed into a vibrant industry in traditional Chinese liquor, rice wine, soy sauce, and vinegar. It is now a significant component of the Chinese economy valued annually at about 150 billion RMB. Although the production methods had existed and remained basically unchanged for centuries, modern developments in biotechnology and related fields in the last decades have greatly impacted on these industries and led to numerous technological innovations. In this chapter, the main biochemical processes and related technological innovations in traditional Chinese biotechnology are illustrated with recent advances in functional microbiology, microbial ecology, solid-state fermentation, enzymology, chemistry of impact flavor compounds, and improvements made to relevant traditional industrial facilities. Recent biotechnological advances in making Chinese liquor, rice wine, soy sauce, and vinegar are reviewed. PMID:19888561

  6. Family traditions and generations.

    Schneiderman, Gerald; Barrera, Maru

    2009-01-01

    Currently, traditional family values that have been passed down through generations appear to be at risk. This has significant implications for the stability and health of individuals, families, and communities. This article explores selected issues related to intergenerational transmission of family values and cultural beliefs, with particular reference to Western culture and values that are rooted in Jewish and Christian traditions. It also examines family values and parenting styles as they influence the developing perspective of children and the family's adaptation to a changing world. PMID:19752638

  7. FPGA Based Kalman Filter for Wireless Sensor Networks

    Vikrant Vij

    2011-01-01

    Full Text Available A Wireless Sensor Network (WSN) is a set of tiny and low-cost devices equipped with different kinds of sensors, a small microcontroller and a radio transceiver, typically powered by batteries. Target tracking is one of the most important applications of such a network system. Traditionally, KF (Kalman filtering) and its derivatives are used for tracking a random signal. The Kalman filter is a linear optimal filtering approach; to address the problem when the system dynamics become nonlinear, researchers developed sub-optimal extensions of the Kalman filter, two popular versions being the EKF (extended Kalman filter) and the UKF (unscented Kalman filter). The rapidly increasing popularity of WSNs has placed increased computational demands upon these systems, which can be met by FPGA-based design. FPGAs offer increased performance compared to microprocessors and increased flexibility compared to ASICs, while maintaining low power consumption
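The EKF extension mentioned above linearizes a nonlinear measurement model around the current estimate. Below is a minimal numpy sketch for range-based target tracking with two hypothetical anchor nodes; the anchor layout, motion model and noise levels are illustrative assumptions, and a real WSN/FPGA implementation would typically use fixed-point arithmetic:

```python
import numpy as np

anchors = np.array([[0.0, 0.0], [10.0, 0.0]])  # hypothetical anchor nodes

def h(state):
    """Nonlinear measurement model: ranges from the target to each anchor."""
    return np.linalg.norm(state[:2] - anchors, axis=1)

def h_jacobian(state):
    """Jacobian of h, evaluated at the current state estimate."""
    r = h(state)
    J = np.zeros((2, 4))
    J[:, 0] = (state[0] - anchors[:, 0]) / r
    J[:, 1] = (state[1] - anchors[:, 1]) / r
    return J

dt = 0.5
F = np.eye(4)
F[0, 2] = F[1, 3] = dt          # constant-velocity motion model
Q = 0.01 * np.eye(4)            # process noise covariance (assumed)
R = 0.04 * np.eye(2)            # range noise, sigma = 0.2 (assumed)

x = np.array([1.0, 1.0, 0.0, 0.0])  # rough initial guess
P = np.eye(4)
true_x = np.array([2.0, 3.0, 0.4, 0.1])
rng = np.random.default_rng(2)
for _ in range(60):
    true_x[:2] += dt * true_x[2:]
    z = np.linalg.norm(true_x[:2] - anchors, axis=1) + rng.normal(0.0, 0.2, 2)
    x, P = F @ x, F @ P @ F.T + Q             # EKF predict (linear here)
    Hk = h_jacobian(x)                        # linearise h at the prediction
    K = P @ Hk.T @ np.linalg.inv(Hk @ P @ Hk.T + R)
    x = x + K @ (z - h(x))
    P = (np.eye(4) - K @ Hk) @ P
```

The only difference from the linear KF is that `h` and its Jacobian replace a fixed measurement matrix; the UKF instead propagates a set of sigma points through `h` and so avoids computing Jacobians.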

  8. The Daala Directional Deringing Filter

    Valin, Jean-Marc

    2016-01-01

    This paper presents the deringing filter used in the Daala royalty-free video codec. The filter is based on a non-linear conditional replacement filter and is designed for vectorization efficiency. It takes into account the direction of edges and patterns being filtered. The filter works by identifying the direction of each block and then adaptively filtering along the identified direction. In a second pass, the blocks are also filtered in a different direction, with more conservative thresho...
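The conditional replacement idea can be sketched in one dimension: each filter tap contributes its neighbour's value only when that value lies within a threshold of the centre sample; otherwise the centre value is substituted, so smoothing never crosses a strong edge. A minimal Python sketch (the tap weights, threshold and test signal are illustrative, not Daala's actual taps or thresholds):

```python
import numpy as np

def conditional_replacement_filter(x, taps, threshold):
    """Non-linear conditional replacement filter (1-D sketch): a neighbour
    contributes only if it differs from the centre sample by less than
    `threshold`; otherwise the centre value is used in its place, so ringing
    near an edge is smoothed while the edge itself is preserved."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    weights = np.array(taps, dtype=float)
    offsets = np.arange(len(taps)) - len(taps) // 2
    out = np.empty_like(x)
    for i in range(n):
        acc = 0.0
        for w, o in zip(weights, offsets):
            j = min(max(i + o, 0), n - 1)      # clamp at the borders
            v = x[j] if abs(x[j] - x[i]) < threshold else x[i]
            acc += w * v
        out[i] = acc / weights.sum()
    return out

# Step edge with ringing: filtering damps the ripples, keeps the edge sharp.
signal = np.array([0, 0, 0.3, -0.2, 0, 10, 10.2, 9.8, 10, 10])
smoothed = conditional_replacement_filter(signal, taps=[1, 2, 1], threshold=1.0)
```

On this test signal the ripples on either side shrink while the 0-to-10 edge stays essentially untouched; Daala applies the same principle in 2-D, filtering along the detected block direction and then making a second, more conservatively thresholded pass.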

  9. Students’ Weakness Detective in Traditional Class

    Fatimah Altuhaifa

    2016-10-01

    Full Text Available In artificial intelligence in education, across learning contexts and domains, it is hard to detect students' weaknesses during a traditional classroom lecture because of the number of students and because the instructor is busy explaining the lesson. Accordingly, choosing a teaching style that can improve students' talents or skills so that they perform better in their classes or professional lives is not an easy task. This system detects the average of students' weaknesses and finds either a solution for them or a teaching style that can increase students' abilities and skills, by filtering the collected data and understanding the problem. After that, it provides a teaching style.

  10. Investigation of New Microstrip Bandpass Filter Based on Patch Resonator with Geometrical Fractal Slot

    Mezaal, Yaqeen S.; Eyyuboglu, Halil T.

    2016-01-01

    A compact dual-mode microstrip bandpass filter using a geometrical slot is presented in this paper. The adopted geometrical slot is based on the first iteration of the Cantor square fractal curve. This filter has the benefit of possessing narrower and sharper frequency responses as compared to microstrip filters that use single-mode resonators and traditional dual-mode square patch resonators. The filter has been modeled and demonstrated with the Microwave Office EM simulator, designed at a resonant frequenc...

  11. Traditional Cherokee Food.

    Hendrix, Janey B.

    A collection for children and teachers of traditional Cherokee recipes emphasizes the art, rather than the science, of cooking. The hand-printed, illustrated format is designed to communicate the feeling of Cherokee history and culture and to encourage readers to collect and add family recipes. The cookbook could be used as a starting point for…

  12. Major Traditional Festivals

    2006-01-01

    Spring Festival is the most important and most celebrated Chinese traditional festival, and it is the only indigenous celebration with legal holidays. People have different opinions on the origin of the event. Many say it can be dated back to 4,000 years ago, when people sacrificed to their

  13. Traditional healers formalised?

    Van Niekerk, Jp

    2012-03-01

    Traditional healers are the first to be called for help when illness strikes the majority of South Africans. Their communities have faith in their ability to cure or alleviate conditions managed by doctors, and much more. A visit to such practitioners' websites (they are up with the latest advertising technology!) shows that they promise help with providing more power, love, security or money, protection from evil people and spirits, enhancing one's sex life with penis enlargement and vagina tightening spells, etc. Contemplating such claims, it is easy to be dismissive of traditional healers. But in this issue of the SAMJ Nompumelelo Mbatha and colleagues1 argue that the traditional healers' regulatory council, promised by an Act of Parliament, should be established, followed by (or preferably preceded by) formal recognition by employers of sick certificates issued by traditional healers. Can matters be so simply resolved? What does this mean for doctors and other formally recognised healthcare professionals, and how to respond to such claims and social pressures? PMID:22380886

  14. Making Tradition Healthy

    2007-11-01

    In this podcast, a Latina nutrition educator shows how a community worked with local farmers to grow produce traditionally enjoyed by Hispanic/Latinos.  Created: 11/1/2007 by National Diabetes Education Program (NDEP), a joint program of the Centers for Disease Control and Prevention and the National Institutes of Health.   Date Released: 11/10/2007.

  15. Traditional versus shadow banking

    Bryan J. Noeth; Wolla, Scott A.

    2012-01-01

    Modern economies rely heavily on financial intermediaries to channel funds between borrowers and lenders. In the first edition of the Page One Economics Newsletter, the role of traditional banking is outlined and a parallel system—shadow banking—is explored.

  16. Traditional Chinese Medicine

    2010-01-01

    2010150 A prospective multicenter randomized double-blinded controlled clinical trial on effects of Tiantai No. 1 in treating mild cognitive impairment. WU Zhengzhi(吴正治),et al. Shenzhen Hosp,Southern Med Univ,Guangdong 518035.Chin J Integr Tradit & West Med 2010;30(3):255-258.

  17. 3.TRADITIONAL CHINESE MEDICINE

    1992-01-01

    920220 Studies on plasma cortisol concentration and blood leukocyte content of glucocorticoid receptors in patients with asthenia-cold and asthenia-heat syndrome. ZHANG Guangyu (张广宇), XIE Zhufan (谢竹藩). Tradit & West

  18. Tibetan traditional medicine

    2005-01-01

    Tibetan medicine companies in T.A.R can manufacture more than 360 Tibetan patent medicines. There are 18 Tibetan medicine factories in Tibet, and total out value exceeds 3 billion yuan. 24 kinds of Tibetan patent medicines have been incorporated into State Fundamental Medicine List, in which 14 Tibetan patent medicines are listed in national protected traditional medicine category.

  19. Kalman filtering technique for reactivity measurement

    Measurement of reactivity and its on-line display is of great help in calibration of reactivity control and safety devices and in the planning of suitable actions during the reactor operation. In traditional approaches the reactivity is estimated from reactor period or by solving the inverse point kinetic equation. In this paper, an entirely new approach based on the Kalman filtering technique has been presented. The theory and design of the reactivity measuring instrument based on the approach has been explained. Its performance has been compared with traditional approaches by estimation of transient reactivity from flux variation data recorded in a research reactor. It is demonstrated that the Kalman filtering approach is superior to other methods from the viewpoints of accuracy, noise suppression, and robustness against uncertainties in the reactor parameters. (author). 1 fig

  20. A Robust Gaussian Filter Corresponding to the Transmission Characteristic of the Gaussian Filter

    A surface roughness profile of an object can be measured by extracting the long-wavelength mean line from the primary profile and subtracting it from the primary profile. This mean line is usually computed by convolving the traditional Gaussian filter (GF) with the primary profile. However, if an outlier exists in the primary profile, the output of a Gaussian filter will be greatly affected by the outlier. To solve the outlier problem, several schemes of robust Gaussian filter have been proposed. However, these have a serious problem: the mean line determined for measurement data containing no outliers does not agree with the mean line output by the Gaussian filter. To solve this problem, this paper proposes a new robust Gaussian filter based on a fast M-estimation method (FMGF), and the performance of the new robust Gaussian filter was experimentally clarified. As a result, if an outlier exists, the proposed method exhibits robust performance; if no outlier exists, the output wave pattern, RMSE and transmission characteristic agree with those of the Gaussian filter
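The behaviour described in this record can be illustrated with an iteratively reweighted Gaussian profile filter. This numpy sketch uses a Tukey biweight M-estimator and a sine profile with one spike; the kernel follows the standard Gaussian profile-filter definition, but the profile, cutoff and weight function are illustrative assumptions, not the FMGF algorithm itself:

```python
import numpy as np

def gaussian_mean_line(x, z, cutoff, weights=None):
    """Weighted Gaussian profile filter (standard kernel with
    alpha = sqrt(ln 2 / pi)), evaluated point-wise so irregular
    point weights are handled."""
    alpha = np.sqrt(np.log(2) / np.pi)
    if weights is None:
        weights = np.ones_like(z)
    mean = np.empty_like(z)
    for i, xi in enumerate(x):
        s = np.exp(-np.pi * ((x - xi) / (alpha * cutoff)) ** 2)
        w = s * weights
        mean[i] = np.sum(w * z) / np.sum(w)
    return mean

def robust_gaussian_mean_line(x, z, cutoff, iterations=5):
    """Robust Gaussian filter: re-weight points by a Tukey biweight of
    their residuals so an outlier stops pulling the mean line
    (M-estimation by iterative reweighting)."""
    weights = np.ones_like(z)
    for _ in range(iterations):
        mean = gaussian_mean_line(x, z, cutoff, weights)
        r = z - mean
        c = 4.4478 * np.median(np.abs(r)) + 1e-12  # Tukey scale from MAD
        u = np.clip(np.abs(r) / c, 0.0, 1.0)
        weights = (1 - u ** 2) ** 2
    return gaussian_mean_line(x, z, cutoff, weights)

# Sine profile with a single spike outlier (hypothetical roughness data).
x = np.linspace(0, 8, 400)
z = np.sin(2 * np.pi * x / 4)
z_out = z.copy(); z_out[200] += 25.0
plain = gaussian_mean_line(x, z_out, cutoff=4.0)
robust = robust_gaussian_mean_line(x, z_out, cutoff=4.0)
```

With the spike present, the plain Gaussian mean line is pulled toward the outlier, while the robust line stays close to the mean line of the outlier-free data, which is exactly the agreement property this record targets.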

  1. Randomized Filtering Algorithms

    Katriel, Irit; Van Hentenryck, Pascal

    2008-01-01

    Filtering every global constraint of a CSP to arc consistency at every search step can be costly, and solvers often compromise on either the level of consistency or the frequency at which arc consistency is enforced. In this paper we propose two randomized filtering schemes for dense instances...

  2. Internet Filtering in China

    Zittrain, Jonathan L.

    2003-01-01

    We collected data on the methods, scope, and depth of selective barriers to Internet usage through networks in China. Tests conducted from May through November 2002 indicated at least four distinct and independently operable Internet filtering methods - Web server IP address, DNS server IP address, keyword, and DNS redirection with a quantifiable leap in filtering sophistication beginning in September 2002.

  3. Multilevel ensemble Kalman filtering

    Hoel, Håkon; Law, Kody J. H.; Tempone, Raul

    2015-01-01

    This work embeds a multilevel Monte Carlo (MLMC) sampling strategy into the Monte Carlo step of the ensemble Kalman filter (EnKF), thereby yielding a multilevel ensemble Kalman filter (MLEnKF) which has provably superior asymptotic cost to a given accuracy level. The theoretical results are illustrated numerically.

  4. Sub-micron filter

    Tepper, Frederick; Kaledin, Leonid

    2009-10-13

    Aluminum hydroxide fibers approximately 2 nanometers in diameter and with surface areas ranging from 200 to 650 m.sup.2/g have been found to be highly electropositive. When dispersed in water they are able to attach to and retain electronegative particles. When combined into a composite filter with other fibers or particles they can filter bacteria and nano size particulates such as viruses and colloidal particles at high flux through the filter. Such filters can be used for purification and sterilization of water, biological, medical and pharmaceutical fluids, and as a collector/concentrator for detection and assay of microbes and viruses. The alumina fibers are also capable of filtering sub-micron inorganic and metallic particles to produce ultra pure water. The fibers are suitable as a substrate for growth of cells. Macromolecules such as proteins may be separated from each other based on their electronegative charges.

  5. Filter Bank Fusion Frames

    Chebira, Amina; Mixon, Dustin G

    2010-01-01

    In this paper we characterize and construct novel oversampled filter banks implementing fusion frames. A fusion frame is a sequence of orthogonal projection operators whose sum can be inverted in a numerically stable way. When properly designed, fusion frames can provide redundant encodings of signals which are optimally robust against certain types of noise and erasures. However, up to this point, few implementable constructions of such frames were known; we show how to construct them using oversampled filter banks. In this work, we first provide polyphase domain characterizations of filter bank fusion frames. We then use these characterizations to construct filter bank fusion frame versions of discrete wavelet and Gabor transforms, emphasizing those specific finite impulse response filters whose frequency responses are well-behaved.

  6. Experimental validation of a single shaped filter approach for CT using variable source-to-filter distance for examination of arbitrary object diameters

    The purpose of this study was to validate the use of a single shaped filter (SF) for computed tomography (CT) with variable source-to-filter distance (SFD) for the examination of different object diameters. A SF was designed by performing simulations with the purpose of achieving noise homogeneity in the reconstructed volume and dose reduction for arbitrary phantom diameters. This was accomplished by using a filter design method whose target is to achieve homogeneous detector noise, but which also uses a correction factor for the filtered back projection process. According to simulation results, a single SF designed for one of the largest phantom diameters meets the requirements for all diameters when the SFD can be adjusted. To validate these results, a SF made of aluminium alloy was manufactured. Measurements were performed on a CT scanner with polymethyl methacrylate (PMMA) phantoms of diameters from 40–100 mm. The filter was positioned at SFDs ranging from 97–168 mm depending on the phantom diameter. Image quality was evaluated for the reconstructed volume by assessing CT value accuracy, noise homogeneity, contrast-to-noise ratio weighted by dose (CNRD) and spatial resolution. Furthermore, the scatter distribution was determined with the use of a beam-stop phantom. Dose was measured for a PMMA phantom with a diameter of 100 mm using a calibrated ionization chamber. The application of a single SF at variable SFD led to improved noise uniformity and dose reduction: noise inhomogeneity was reduced from 15% down to about 0%, and dose was reduced by about 37%. Furthermore, scatter dropped by about 32%, which led to reduced cupping artifacts and improved CT value accuracy. Spatial resolution and CNRD were not affected by the SF. By means of a single SF with variable SFD designed for CT, significant dose reduction can be achieved and image quality can be improved by reducing noise inhomogeneity as well as scatter-induced artifacts. (paper)

  7. PRESERVING A TRADITION

    2007-01-01

    COVER STORY The Chinese art of paper cutting has long been a popular pastime in the country’s rural areas. For more than 1,000 years, farming families have used it as a method for decorating their homes, but the tradition has struggled for survival in recent years. In Yanchuan County in China’s northwestern Shaanxi Province, however, the art form has experienced a revival thanks to the efforts of a local woman. Paper cutting master Gao Fenglian has invested her own money in establishing a paper cutting gallery in the region. The craft’s growing popularity has also fuelled a new wave of people wanting to learn how to cut. More than 10,000 of the county’s 200,000 residents are now skilled in the ancient craft, and its revival could serve as a model for the preservation of other Chinese traditions.

  8. Traditional Chinese Medicine

    2009-01-01

    2009013 Clinical observation on treatment of active rheumatoid arthritis with Chinese herbal medicine. SHENG Zhenghe (盛正和), et al. Dept TCM, 5th Affili Hosp, Guangxi Med Univ, Guangxi 545001. Chin J Integr Tradit West Med 2008;28(11):990-993. Objective To study the efficacy and safety of Chinese drugs for expelling evil-wind, removing dampness, promoting blood circulation and invigorating yin in treating active rheumatoid arthritis (RA).

  9. Distance and Traditional Education

    Liu, Yuliang

    2002-01-01

    This case study is designed to investigate how distance education technology affects the instructor’s simultaneously teaching the same course via instructional television (ITV) and traditional education (face-to-face) formats. This study involved random observations of the instructor in a graduate course in both instructional television and face-to-face classrooms. In addition, an interview with the instructor was conducted to collect more data. This study has suggested that the instructor wh...

  10. 3.TRADITIONAL CHINESE MEDICINE

    1993-01-01

    930625 Clinical study of rotundium in treating atrial fibrillation. WANG Dajin, et al. Cardiovasc Dis Instit, Tongji Med Univ, Wuhan, 430022. Chin J Integr Tradit & West Med 1993;13(8):455—457. L-tetrahydropalmatine (Rotundium) is an alkaloid of Corydalis turtschaninovii. Some animal experiments had demonstrated that Rotundium had a good antiarrhythmic effect on blocking the calcium channel and that it was a class Ⅳ antiarrhythmic agent, similar to

  11. The tyranny of tradition.

    Gulati, L

    1999-01-01

    This paper narrates the cruelty enforced by tradition on the lives of women in India. It begins with the life of the author's great-grandmother Ponnamma, whose family was rigidly patriarchal and governed by Brahmin values. Here, women had very little say in the decisions men made, were forced into arranged marriages before puberty, were not sent to school, and were considered unimportant. This tradition lived on in the author's grandmother Seetha and in the life of her mother Saras. However, in the story of Saras, following the death of her husband, they departed from rigid Brahmin tradition and orthodoxy. Her mother, unperturbed by the challenges she faced, consistently devised ways to cope and succeeded in a changing environment. Meaningless Brahmin rituals and prayers found no place in her life, which she approached with a cosmopolitan and humanitarian outlook. In essence, she shaped the lives of three daughters, a son, and all her grandchildren, making a success not only of her own life but of all the lives she touched. PMID:12322347

  12. Defueling filter test

    The Three Mile Island Unit 2 Reactor (TMI-2) has sustained core damage creating a significant quantity of fine debris, which can become suspended during the planned defueling operations and will have to be constantly removed to maintain water clarity and minimize radiation exposure. To accomplish these objectives, a Defueling Water Cleanup System (DWCS) has been designed. One of the primary components in the DWCS is a custom designed filter canister using an all stainless steel filter medium. The full scale filter canister is designed to remove suspended solids from 800 microns to 0.5 microns in size. Filter cartridges are fabricated into an element cluster to provide for a flowrate of greater than 100 gals/min. Babcock and Wilcox (B and W), under contract to GPU Nuclear Corporation, has evaluated two candidate DWCS filter concepts in a 1/100 scale proof-of-principle test program at B and W's Lynchburg Research Center. The filters were challenged with simulated solids suspensions of 1400 and 140 ppm in borated water (5000 ppm boron). Test data collected include solids loading, effluent turbidity, and differential pressure trends versus time. From the proof-of-principle test results, a full-scale filter canister design was generated

  13. Smoke and pollutant filtering device

    A smoke and pollutant filtering device comprising a mask having a filter composed of a series of contiguous, serial layers of filtering material. The filter consists of front and rear gas permeable covers, a first filter layer of pressed vegetable matter, a second filter layer comprising a layer of activated charcoal adjacent a layer of aqua filter floss, a third filter layer comprising a gas permeable cloth situated between layers of pressed vegetable matter, and a fourth filter layer comprising an aqua filter floss. The first through fourth filter layers are sandwiched between the front and rear gas permeable covers. The filtering device is stitched together and mounted within a fire-retardant hood shaped to fit over a human head. Elastic bands are included in the hood to maintain the hood snugly about the head when worn

  14. An area efficient low noise 100 Hz low-pass filter

    Ølgaard, Christian; Sassene, Haoues; Perch-Nielsen, Ivan R.

    1996-01-01

    A technique based on scaling a filter's capacitor currents to improve the noise performance of low frequency continuous-time filters is presented. Two 100 Hz low-pass filters have been implemented: a traditional low pass filter (as reference), and a filter utilizing the above mentioned current...... when a class A/B biasing scheme is used in the current divider. Obtaining identical noise performance from the reference filter would require a 3.6 times larger filter capacitor. This would increase the reference filter's die area by 100%. Therefore, the current scaling technique allows filters with...... improved noise performance/dynamic range, given a fixed silicon area and a fixed power supply...

  15. Improved multilevel filters to enhance infrared small target

    Xiaoping Wang; Tianxu Zhang; Luxin Yan; Man Wang; Jiawei Wu

    2011-01-01

    We propose improved multilevel filters (IMLFs) involving the absolute value operation into the algorithmic framework of traditional multilevel filters (MLFs) to improve the robustness of infrared small target enhancement techniques under a complex infrared cluttered background. Compared with the widely used small target enhancement methods which only deal with bright targets, the proposed technique can enhance the infrared small target, whether it is bright or dark. Experimental results verify that the proposed technique is efficient and practical.
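
    The core idea above — subtract a local background estimate, then keep the absolute residual so that bright and dark targets are enhanced alike — can be sketched in a few lines. The following is a simplified 1-D illustration with a plain mean-filter background standing in for the multilevel filters of the paper (which operate on 2-D infrared imagery); all names and values are illustrative assumptions.

```python
def enhance_small_targets(signal, w=5):
    """Estimate the local background with a mean filter, then take the
    absolute residual so both bright and dark small targets stand out
    (the role played by the absolute value operation in the IMLFs)."""
    h = w // 2
    out = []
    for i in range(len(signal)):
        lo, hi = max(0, i - h), min(len(signal), i + h + 1)
        background = sum(signal[lo:hi]) / (hi - lo)
        out.append(abs(signal[i] - background))
    return out

# Toy scene: flat background with one bright and one dark small target.
scene = [10.0] * 50
scene[12] = 14.0   # bright target
scene[37] = 6.0    # dark target
res = enhance_small_targets(scene)
```

    On this toy scene both the bright spike at index 12 and the dark dip at index 37 map to the same positive response, while the flat background maps to zero — which is exactly the robustness the absolute value operation buys over bright-only methods.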

  16. Particle Filtering: The Need for Speed

    Karlsson Rickard

    2010-01-01

    The particle filter (PF) has during the last decade been proposed for a wide range of localization and tracking applications. There is a general need in such embedded systems to have a platform for efficient and scalable implementation of the PF. One such platform is the graphics processing unit (GPU), originally aimed at fast rendering of graphics. To achieve this, GPUs are equipped with a parallel architecture which can be exploited for general-purpose computing on GPU (GPGPU) as a complement to the central processing unit (CPU). In this paper, GPGPU techniques are used to make a parallel recursive Bayesian estimation implementation using particle filters. The modifications made to obtain a parallel particle filter, especially for the resampling step, are discussed and the performance of the resulting GPU implementation is compared to the one achieved with a traditional CPU implementation. The comparison is made using a minimal sensor network with bearings-only sensors. The resulting GPU filter, which is the first complete GPU implementation of a PF published to this date, is faster than the CPU filter when many particles are used, while maintaining the same accuracy. The parallelization utilizes ideas that can be applicable to other applications.

  17. Particle Filtering: The Need for Speed

    Gustaf Hendeby

    2010-01-01

    The particle filter (PF) has during the last decade been proposed for a wide range of localization and tracking applications. There is a general need in such embedded systems to have a platform for efficient and scalable implementation of the PF. One such platform is the graphics processing unit (GPU), originally aimed at fast rendering of graphics. To achieve this, GPUs are equipped with a parallel architecture which can be exploited for general-purpose computing on GPU (GPGPU) as a complement to the central processing unit (CPU). In this paper, GPGPU techniques are used to make a parallel recursive Bayesian estimation implementation using particle filters. The modifications made to obtain a parallel particle filter, especially for the resampling step, are discussed and the performance of the resulting GPU implementation is compared to the one achieved with a traditional CPU implementation. The comparison is made using a minimal sensor network with bearings-only sensors. The resulting GPU filter, which is the first complete GPU implementation of a PF published to this date, is faster than the CPU filter when many particles are used, while maintaining the same accuracy. The parallelization utilizes ideas that can be applicable to other applications.
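
    The recursion this abstract refers to — propagate, weight, estimate, resample — is compact enough to sketch. Below is a minimal bootstrap particle filter in pure Python for a 1-D random-walk state observed in Gaussian noise; it is a generic sequential illustration, not the paper's GPU implementation, and the model, noise levels, and names are assumptions for the example only.

```python
import math
import random

def bootstrap_pf(observations, n=500, proc_std=1.0, obs_std=1.0):
    """Bootstrap particle filter for x_k = x_{k-1} + w_k, z_k = x_k + v_k."""
    particles = [random.gauss(0.0, 1.0) for _ in range(n)]
    estimates = []
    for z in observations:
        # 1. Propagate each particle through the motion model.
        particles = [x + random.gauss(0.0, proc_std) for x in particles]
        # 2. Weight by the Gaussian measurement likelihood.
        weights = [math.exp(-0.5 * ((z - x) / obs_std) ** 2) for x in particles]
        total = sum(weights)
        weights = [w / total for w in weights]
        # 3. Point estimate: weighted posterior mean.
        estimates.append(sum(w * x for w, x in zip(weights, particles)))
        # 4. Multinomial resampling -- the inherently sequential step whose
        #    parallelization is the focus of the paper.
        particles = random.choices(particles, weights=weights, k=n)
    return estimates

random.seed(0)
truth = [0.1 * k for k in range(60)]
obs = [x + random.gauss(0.0, 1.0) for x in truth]
est = bootstrap_pf(obs)
```

    The resampling in step 4 needs the normalized weights of all particles before any particle can be drawn, which is why it dominates the GPU parallelization effort discussed above.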

  18. Multilevel filtering elliptic preconditioners

    Kuo, C. C. Jay; Chan, Tony F.; Tong, Charles

    1989-01-01

    A class of preconditioners is presented for elliptic problems built on ideas borrowed from the digital filtering theory and implemented on a multilevel grid structure. They are designed to be both rapidly convergent and highly parallelizable. The digital filtering viewpoint allows the use of filter design techniques for constructing elliptic preconditioners and also provides an alternative framework for understanding several other recently proposed multilevel preconditioners. Numerical results are presented to assess the convergence behavior of the new methods and to compare them with other preconditioners of multilevel type, including the usual multigrid method as preconditioner, the hierarchical basis method and a recent method proposed by Bramble-Pasciak-Xu.

  19. Circuits and filters handbook

    Chen, Wai-Kai

    2003-01-01

    A bestseller in its first edition, The Circuits and Filters Handbook has been thoroughly updated to provide the most current, most comprehensive information available in both the classical and emerging fields of circuits and filters, both analog and digital. This edition contains 29 new chapters, with significant additions in the areas of computer-aided design, circuit simulation, VLSI circuits, design automation, and active and digital filters. It will undoubtedly take its place as the engineer's first choice in looking for solutions to problems encountered in the design, analysis, and behavi

  20. HEPA filter jointer

    Hill, D.; Martinez, H.E.

    1998-02-01

    A HEPA filter jointer system was created to remove nitrate contaminated wood from the wooden frames of HEPA filters that are stored at the Rocky Flats Plant. A commercial jointer was chosen to remove the nitrated wood. The chips from the wood removal process are in the right form for caustic washing. The jointer was automated for safety and ease of operation. The HEPA filters are prepared for jointing by countersinking the nails with a modified air hammer. The equipment, computer program, and tests are described in this report.

  1. Trajectory probability hypothesis density filter

    García-Fernández, Ángel F.; Svensson, Lennart

    2016-01-01

    This paper presents the probability hypothesis density (PHD) filter for sets of trajectories. The resulting filter, which is referred to as the trajectory probability hypothesis density (TPHD) filter, is capable of estimating trajectories in a principled way without requiring the evaluation of all measurement-to-target association hypotheses. Like the PHD filter, the TPHD filter is based on recursively obtaining the best Poisson approximation to the multitrajectory filtering density in the sense of minimising the K...

  2. Derivative free filtering using Kalmtool

    Bayramoglu, Enis; Hansen, Søren; Ravn, Ole;

    2010-01-01

    In this paper we present a toolbox enabling easy evaluation and comparison of different filtering algorithms. The toolbox is called Kalmtool 4 and is a set of MATLAB tools for state estimation of nonlinear systems. The toolbox contains functions for extended Kalman filtering as well as for the DD1 filter and the DD2 filter. It also contains functions for Unscented Kalman filters as well as several versions of particle filters. The toolbox requires MATLAB version 7, but no additional toolboxes are required.

  3. Spatial filtering efficiency of monostatic biaxial lidar: analysis and applications

    Agishev, Ravil R.; Comerón Tejero, Adolfo

    2002-01-01

    Results of lidar modeling based on spatial-angular filtering efficiency criteria are presented. Their analysis shows that the low spatial-angular filtering efficiency of traditional visible and near-infrared systems is an important cause of a low signal-to-background-radiation ratio (SBR) at the photodetector input. The low SBR may be responsible for considerable measurement errors and the ensuing low accuracy of the retrieval of atmospheric optical parameters. As shown, the most effec...

  4. Blind Source Parameters for Performance Evaluation of Despeckling Filters

    Nagashettappa Biradar; Dewal, M. L.; ManojKumar Rohit; Sanjaykumar Gowre; Yogesh Gundge

    2016-01-01

    The speckle noise is inherent to transthoracic echocardiographic images. A standard noise-free reference echocardiographic image does not exist. The evaluation of filters based on the traditional parameters such as peak signal-to-noise ratio, mean square error, and structural similarity index may not reflect the true filter performance on echocardiographic images. Therefore, the performance of despeckling can be evaluated using blind assessment metrics like the speckle suppression index, spec...

  5. Paul Rodgersi filter Kohilas

    2000-01-01

    On 28 January, a site-specific sculpture and the performance "Filter" took place at Kohila secondary school. The undertaking, marking the school's 130th anniversary, was led by sculptor Paul Rodgers together with two final-year students — Marko Heinmäe and Hendrik Karm.

  6. Updating the OMERACT filter

    Kirwan, John R; Boers, Maarten; Hewlett, Sarah; Beaton, Dorcas; Bingham, Clifton O; Choy, Ernest; Conaghan, Philip G; D'Agostino, Maria-Antonietta; Dougados, Maxime; Furst, Daniel E; Guillemin, Francis; Gossec, Laure; van der Heijde, Désirée M; Kloppenburg, Margreet; Kvien, Tore K; Landewé, Robert B M; Mackie, Sarah L; Matteson, Eric L; Mease, Philip J; Merkel, Peter A; Østergaard, Mikkel; Saketkoo, Lesley Ann; Simon, Lee; Singh, Jasvinder A; Strand, Vibeke; Tugwell, Peter

    2014-01-01

    OBJECTIVE: The Outcome Measures in Rheumatology (OMERACT) Filter provides guidelines for the development and validation of outcome measures for use in clinical research. The "Truth" section of the OMERACT Filter presupposes an explicit framework for identifying the relevant core outcomes that are...... for defining core areas of measurement ("Filter 2.0 Core Areas of Measurement") was presented at OMERACT 11 to explore areas of consensus and to consider whether already endorsed core outcome sets fit into this newly proposed framework. METHODS: Discussion groups critically reviewed the extent to...... construction, presentation, and clarity of the framework were questioned. The discussion groups and subsequent feedback highlighted 20 such issues. CONCLUSION: These issues will require resolution to reach consensus on accepting the proposed Filter 2.0 framework of Core Areas as the basis for the selection of...

  7. Updating the OMERACT filter

    Tugwell, Peter; Boers, Maarten; D'Agostino, Maria-Antonietta; Beaton, Dorcas; Boonen, Annelies; Bingham, Clifton O; Choy, Ernest; Conaghan, Philip G; Dougados, Maxime; Duarte, Catia; Furst, Daniel E; Guillemin, Francis; Gossec, Laure; Heiberg, Turid; van der Heijde, Désirée M; Hewlett, Sarah; Kirwan, John R; Kvien, Tore K; Landewé, Robert B; Mease, Philip J; Østergaard, Mikkel; Simon, Lee; Singh, Jasvinder A; Strand, Vibeke; Wells, George

    2014-01-01

    OBJECTIVE: The Outcome Measures in Rheumatology (OMERACT) Filter provides guidelines for the development and validation of outcome measures for use in clinical research. The "Truth" section of the OMERACT Filter requires that criteria be met to demonstrate that the outcome instrument meets the...... criteria for content, face, and construct validity. METHODS: Discussion groups critically reviewed a variety of ways in which case studies of current OMERACT Working Groups complied with the Truth component of the Filter and what issues remained to be resolved. RESULTS: The case studies showed that there...... is broad agreement on criteria for meeting the Truth criteria through demonstration of content, face, and construct validity; however, several issues were identified that the Filter Working Group will need to address. CONCLUSION: These issues will require resolution to reach consensus on how Truth...

  8. HEPA air filter (image)

    ... pet dander and other irritating allergens from the air. Along with other methods to reduce allergens, such ... controlling the amount of allergens circulating in the air. HEPA filters can be found in most air ...

  9. In the Dirac tradition

    It was Paul Dirac who cast quantum mechanics into the form we now use, and many generations of theoreticians openly acknowledge his influence on their thinking. When Dirac died in 1984, St. John's College, Cambridge, his base for most of his lifetime, instituted an annual lecture in his memory at Cambridge. The first lecture, in 1986, attracted two heavyweights - Richard Feynman and Steven Weinberg. Far from using the lectures as a platform for their own work, in the Dirac tradition they presented stimulating material on deep underlying questions

  10. Retina-inspired Filter

    Doutsi, Effrosyni; Fillatre, Lionel; Antonini, Marc; Gaulmin, Julien

    2016-01-01

    This paper introduces a novel filter which is inspired by the human retina. The human retina consists of three different layers: the Outer Plexiform Layer (OPL), the inner plexiform layer and the ganglionic layer. Our inspiration is the linear transform which takes place in the OPL and has been mathematically described by the neuroscientific model “virtual retina”. This model is the cornerstone to derive the non-separable spatiotemporal OPL retina-inspired filter, briefly renamed retina- insp...

  11. Filtering Solid Gabor Noise

    Lagae, Ares; Drettakis, George

    2011-01-01

    Solid noise is a fundamental tool in computer graphics. Surprisingly, no existing noise function supports both high-quality anti-aliasing and continuity across sharp edges. In this paper we show that a slicing approach is required to preserve continuity across sharp edges, and we present a new noise function that supports anisotropic filtering of sliced solid noise. This is made possible by individually filtering the slices of Gabor kernels, which requires the proper treatment of phase. This ...

  12. Kalman Filtering in R

    Fernando Tusell

    2011-03-01

    Support in R for state space estimation via Kalman filtering was limited to one package, until fairly recently. In the last five years, the situation has changed with no less than four additional packages offering general implementations of the Kalman filter, including in some cases smoothing, simulation smoothing and other functionality. This paper reviews some of the offerings in R to help the prospective user to make an informed choice.
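
    For readers new to the topic, the predict/update recursion that such packages implement reduces, in the scalar random-walk case, to a handful of lines. The sketch below is in Python rather than R, purely for illustration; the parameter names and noise variances are assumptions, not taken from any of the reviewed packages.

```python
import random

def kalman_1d(zs, q=1e-3, r=0.25, x0=0.0, p0=1.0):
    """Scalar Kalman filter for x_k = x_{k-1} + w_k (var q), z_k = x_k + v_k (var r)."""
    x, p = x0, p0
    estimates = []
    for z in zs:
        # Predict: the random-walk model leaves x unchanged but inflates p by q.
        p += q
        # Update: blend prediction and measurement via the Kalman gain.
        k = p / (p + r)
        x += k * (z - x)
        p *= 1.0 - k
        estimates.append(x)
    return estimates

# Noisy measurements of a constant level of 5.0.
random.seed(1)
zs = [5.0 + random.gauss(0.0, 0.5) for _ in range(300)]
xs = kalman_1d(zs)
```

    Once the gain has converged (here to roughly sqrt(q/r)), the filter behaves like a slow exponential smoother, so the estimate hovers much closer to the true level than the raw measurements do.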

  13. Feedback Particle Filter

    Yang, Tao; Mehta, Prashant G.; Meyn, Sean P.

    2013-01-01

    A new formulation of the particle filter for nonlinear filtering is presented, based on concepts from optimal control, and from the mean-field game theory. The optimal control is chosen so that the posterior distribution of a particle matches as closely as possible the posterior distribution of the true state given the observations. This is achieved by introducing a cost function, defined by the Kullback-Leibler (K-L) divergence between the actual posterior, and the posterior of any particle....

  14. Filtered Social Learning

    Paul Niehaus

    2011-01-01

    Knowledge sharing is economically important but also typically incomplete: we "filter" our communication. This paper analyzes the consequences of filtering. In the model, homogeneous agents share knowledge with their peers whenever the private benefits exceed communication costs. The welfare implications of this transmission mechanism hinge on whether units of knowledge complement, substitute for, or are independent of each other. Both substitutability and complementarity generate externaliti...

  15. The Band Pass Filter

    Christiano, Lawrence J.; Terry J. Fitzgerald

    1999-01-01

    The `ideal' band pass filter can be used to isolate the component of a time series that lies within a particular band of frequencies. However, applying this filter requires a dataset of infinite length. In practice, some sort of approximation is needed. Using projections, we derive approximations that are optimal when the time series representations underlying the raw data have a unit root, or are stationary about a trend. We identify one approximation which, though it is only optimal for one...
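
    As a rough intuition for what a finite-sample approximation to the ideal band pass filter does, the toy sketch below passes a mid-frequency band by differencing two moving averages: the short window strips high-frequency fluctuations and the long window removes the slow trend. This is only an illustrative stand-in, not the authors' optimal projection-based approximation; the window lengths and test series are arbitrary assumptions.

```python
import math

def moving_average(x, w):
    """Centered moving average with odd window w (windows truncated at edges)."""
    h = w // 2
    return [sum(x[max(0, i - h):i + h + 1]) / len(x[max(0, i - h):i + h + 1])
            for i in range(len(x))]

def crude_band_pass(x, short_w, long_w):
    """Keep fluctuations slower than short_w but faster than long_w periods
    by differencing two moving averages (a crude band pass)."""
    smooth_hi = moving_average(x, short_w)   # removes high-frequency wiggles
    smooth_lo = moving_average(x, long_w)    # keeps only the slow trend
    return [a - b for a, b in zip(smooth_hi, smooth_lo)]

# Trend + mid-band cycle (period 40) + high-frequency component (period 5).
n = 400
series = [math.sin(2 * math.pi * t / 40)
          + 0.5 * math.sin(2 * math.pi * t / 5)
          + 0.01 * t
          for t in range(n)]
cycle = crude_band_pass(series, 7, 81)
```

    In the interior of the sample the linear trend cancels exactly and the period-40 cycle passes nearly intact, while the period-5 component is strongly attenuated — the same qualitative behavior the ideal filter achieves exactly in the frequency domain.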

  16. Novel quaternion Kalman filter

    Choukroun, Daniel; Bar-Itzhack, Itzhack Y.; Oshman, Yaakov

    2006-01-01

    This paper presents a novel Kalman filter for estimating the attitude-quaternion as well as gyro random drifts from vector measurements. Employing a special manipulation on the measurement equation results in a linear pseudo-measurement equation whose error is state-dependent. Because the quaternion kinematics equation is linear, the combination of the two yields a linear Kalman filter that eliminates the usual linearization procedure and is less sensitive to initial estimation errors. Genera...

  17. Spatial filter issues

    Experiments and calculations indicate that the threshold pressure in spatial filters for distortion of a transmitted pulse scales approximately as I^0.2 and (F#)^2 over the intensity range from 10^14 to 2x10^15 W/cm^2. We also demonstrated an interferometric diagnostic that will be used to measure the scaling relationships governing pinhole closure in spatial filters

  18. Kalman Filtering in R

    Fernando Tusell

    2011-01-01

    Support in R for state space estimation via Kalman filtering was limited to one package, until fairly recently. In the last five years, the situation has changed with no less than four additional packages offering general implementations of the Kalman filter, including in some cases smoothing, simulation smoothing and other functionality. This paper reviews some of the offerings in R to help the prospective user to make an informed choice.

  19. Dip-separated structural filtering using seislet transform and adaptive empirical mode decomposition based dip filter

    Chen, Yangkang

    2016-04-01

    The seislet transform has been demonstrated to have a better compression performance for seismic data compared with other well-known sparsity promoting transforms, thus it can be used to remove random noise by simply applying a thresholding operator in the seislet domain. Since the seislet transform compresses the seismic data along the local structures, the seislet thresholding can be viewed as a simple structural filtering approach. Because of the dependence on a precise local slope estimation, the seislet transform usually suffers from low compression ratio and high reconstruction error for seismic profiles that have dip conflicts. In order to remove the limitation of seislet thresholding in dealing with conflicting-dip data, I propose a dip-separated filtering strategy. In this method, I first use an adaptive empirical mode decomposition based dip filter to separate the seismic data into several dip bands (5 or 6). Next, I apply seislet thresholding to each separated dip component to remove random noise. Then I combine all the denoised components to form the final denoised data. Compared with other dip filters, the empirical mode decomposition based dip filter is data-adaptive. One only needs to specify the number of dip components to be separated. Both complicated synthetic and field data examples show superior performance of my proposed approach over the traditional alternatives. The dip-separated structural filtering is not limited to seislet thresholding, and can also be extended to all those methods that require slope information.

  20. Dip-separated structural filtering using seislet transform and adaptive empirical mode decomposition based dip filter

    Chen, Yangkang

    2016-07-01

    The seislet transform has been demonstrated to have a better compression performance for seismic data compared with other well-known sparsity promoting transforms, thus it can be used to remove random noise by simply applying a thresholding operator in the seislet domain. Since the seislet transform compresses the seismic data along the local structures, the seislet thresholding can be viewed as a simple structural filtering approach. Because of the dependence on a precise local slope estimation, the seislet transform usually suffers from low compression ratio and high reconstruction error for seismic profiles that have dip conflicts. In order to remove the limitation of seislet thresholding in dealing with conflicting-dip data, I propose a dip-separated filtering strategy. In this method, I first use an adaptive empirical mode decomposition based dip filter to separate the seismic data into several dip bands (5 or 6). Next, I apply seislet thresholding to each separated dip component to remove random noise. Then I combine all the denoised components to form the final denoised data. Compared with other dip filters, the empirical mode decomposition based dip filter is data-adaptive. One only needs to specify the number of dip components to be separated. Both complicated synthetic and field data examples show superior performance of my proposed approach over the traditional alternatives. The dip-separated structural filtering is not limited to seislet thresholding, and can also be extended to all those methods that require slope information.
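
    The "transform, threshold, reconstruct" denoising pattern at the heart of seislet thresholding can be illustrated with a much simpler sparsifying transform. Below is a 1-D sketch using one level of the Haar transform in place of the seislet transform (which requires slope estimation and is far more involved); the signal, noise level, and threshold are arbitrary assumptions for the demonstration.

```python
import math
import random

def haar_1d(x):
    """One level of the Haar transform: coarse averages and fine details."""
    a = [(x[2 * i] + x[2 * i + 1]) / math.sqrt(2) for i in range(len(x) // 2)]
    d = [(x[2 * i] - x[2 * i + 1]) / math.sqrt(2) for i in range(len(x) // 2)]
    return a, d

def ihaar_1d(a, d):
    """Inverse of haar_1d."""
    out = []
    for ai, di in zip(a, d):
        out += [(ai + di) / math.sqrt(2), (ai - di) / math.sqrt(2)]
    return out

def soft(coeffs, t):
    """Soft thresholding: shrink coefficients toward zero, killing the small,
    noise-dominated ones -- the same operator applied in the seislet domain."""
    return [math.copysign(max(abs(c) - t, 0.0), c) for c in coeffs]

random.seed(2)
clean = [math.sin(2 * math.pi * t / 32) for t in range(256)]
noisy = [c + random.gauss(0.0, 0.3) for c in clean]
a, d = haar_1d(noisy)
denoised = ihaar_1d(a, soft(d, 0.5))  # threshold only the detail band
mse_noisy = sum((n - c) ** 2 for n, c in zip(noisy, clean)) / 256
mse_denoised = sum((n - c) ** 2 for n, c in zip(denoised, clean)) / 256
```

    Because the smooth signal compresses into few large coefficients while the noise spreads thinly across all of them, thresholding suppresses noise far more than signal; a full multiscale transform such as the seislet transform extends the same idea across all scales and along local structure.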

  1. Filter Paper: Solution to High Self-Attenuation Corrections in HEPA Filter Measurements

    Oberer, R.B.; Harold, N.B.; Gunn, C.A.; Brummett, M.; Chaing, L.G.

    2005-10-01

    An 8 by 8 by 6 inch High Efficiency Particulate Air (HEPA) filter was measured as part of a uranium holdup survey in June of 2005, as it has been routinely measured every two months since 1998. Although the survey relies on gross gamma count measurements, this was one of a few measurements that had been converted to a quantitative measurement in 1998. The measurement was analyzed using the traditional Generalized Geometry Holdup (GGH) approach, using HMS3 software, with an area calibration and self-attenuation corrected with an empirical correction factor of 1.06. A result of 172 grams of 235U was reported. The actual quantity of 235U in the filter was approximately 1700 g. Because of this unusually large discrepancy, the measurement of HEPA filters will be discussed. Various techniques for measuring HEPA filters will be described using the measurement of a 24 by 24 by 12 inch HEPA filter as an example. A new method to correct for self-attenuation will be proposed for this measurement. Following the discussion of the 24 by 24 by 12 inch HEPA filter, the measurement of the 8 by 8 by 6 inch filter will be discussed in detail.

  2. Filter Paper: Solution to High Self-Attenuation Corrections in HEPA Filter Measurements

    An 8 by 8 by 6 inch High Efficiency Particulate Air (HEPA) filter was measured as part of a uranium holdup survey in June of 2005, as it has been routinely measured every two months since 1998. Although the survey relies on gross gamma count measurements, this was one of a few measurements that had been converted to a quantitative measurement in 1998. The measurement was analyzed using the traditional Generalized Geometry Holdup (GGH) approach, using HMS3 software, with an area calibration and self-attenuation corrected with an empirical correction factor of 1.06. A result of 172 grams of 235U was reported. The actual quantity of 235U in the filter was approximately 1700 g. Because of this unusually large discrepancy, the measurement of HEPA filters will be discussed. Various techniques for measuring HEPA filters will be described using the measurement of a 24 by 24 by 12 inch HEPA filter as an example. A new method to correct for self-attenuation will be proposed for this measurement. Following the discussion of the 24 by 24 by 12 inch HEPA filter, the measurement of the 8 by 8 by 6 inch filter will be discussed in detail.

  3. Miniaturized superconducting microwave filters

    In this paper we present methods for the miniaturization of superconducting filters. We consider two designs of seventh-order bandpass Chebyshev filters based on lumped elements and a novel quasi-lumped element resonator. In both designs the area of the filters, with a central frequency of 2-5 GHz, is less than 1.2 mm2. Such small filters can be readily integrated on a single board for multi-channel microwave control of superconducting qubits. The filters have been experimentally tested and the results are compared with simulations. The miniaturization resulted in parasitic coupling between resonators and within each resonator, which affected primarily the stopband and increased the bandwidth. The severity of the error depends on the particular design and was reduced when a groundplane was used under the inductances of the resonators. The best performance was reached for the quasi-lumped filter with a central frequency of 4.45 GHz, a quality factor of 40 and a 50 dB stopband

  4. Circular filter bag change ladderack system video presentation

    A great deal of research and development at Harwell over the last few years has centered around the design of circular radial flow HEPA filters as alternatives to the traditional rectangular HEPA filter. With a circular insert there are inherent features which give this geometry certain advantages over its counterpart, such as ease of sealing and compatibility with remote handling and disposal routes; these have been well publicized in previous works. A mock-up of a bag change ladderack system for a 3400 m3/h circular filter is shown. It highlights the space requirements for bag changing and demonstrates the ease with which a filter may be replaced. The filter throat incorporates a silicone rubber lip seal which forms a flap seal against a tapered spigot feature built into the wall. The novelty of this filter design is that the bag is an integral part of the filter and is attached onto the filter flange. This enables the inside of the filter, where the contaminated particulate has collected, to be sealed/bagged off and hence the dust burden retained

  5. Ceramic filters for bulk inoculation of nickel alloy castings

    F. Binczyk

    2011-07-01

    Full Text Available The work includes the results of research on the production technology of ceramic filters which, besides their traditional filtering function, also play the role of an inoculant modifying the macrostructure of cast nickel alloys. To play this additional role, filters should demonstrate sufficient compression strength and ensure a proper flow rate of liquid alloy. The role of an inoculant is played by cobalt aluminate introduced into the composition of the external coating in an amount from 5 to 10 wt.%. The required compression strength (over 1 MPa) is provided by the supporting layers, deposited on the preform, which is a polyurethane foam. Based on a two-level fractional factorial experiment 2^(4-1), the significance of the impact of various technological parameters (independent variables) on selected functional parameters of the finished filters was determined. A significant effect of the number of supporting layers and of the sintering temperature of the filters after evaporation of the polyurethane foam was found.

  6. Distortion Parameters Analysis Method Based on Improved Filtering Algorithm

    ZHANG Shutuan

    2013-10-01

    Full Text Available In order to realize accurate distortion-parameter testing of aircraft power supply systems and satisfy the requirements of the corresponding airborne equipment, a novel power-parameter test system based on an improved filtering algorithm is introduced in this paper. The hardware of the test system is portable and supports high-speed data acquisition and processing, and the software uses LabWindows/CVI as the development environment, adopting a pre-processing technique together with the improved filtering algorithm. Compared with the traditional filtering algorithm, the improved filtering algorithm helps to increase the test accuracy. Applications show that the test system with the improved filtering algorithm produces accurate results and meets the design requirements.

  7. Robust Hammerstein Adaptive Filtering under Maximum Correntropy Criterion

    Zongze Wu

    2015-10-01

    Full Text Available The maximum correntropy criterion (MCC) has recently been successfully applied to adaptive filtering. Adaptive algorithms under MCC show strong robustness against large outliers. In this work, we apply MCC to develop a robust Hammerstein adaptive filter. Compared with traditional Hammerstein adaptive filters, which are usually derived from the well-known mean square error (MSE) criterion, the proposed algorithm can achieve better convergence performance, especially in the presence of impulsive non-Gaussian (e.g., α-stable) noise. Additionally, some theoretical results concerning the convergence behavior are obtained. Simulation examples are presented to confirm the superior performance of the new algorithm.
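As a rough illustration of why MCC-based adaptation resists outliers, the sketch below runs a plain linear MCC-LMS update (not the full Hammerstein structure of the paper) on a toy system-identification problem; the kernel width, step size, system coefficients and noise model are all assumed.

```python
import numpy as np

rng = np.random.default_rng(0)
w_true = np.array([0.5, -0.3, 0.2])   # unknown linear system (illustrative)
n, sigma, mu = 5000, 1.0, 0.05        # kernel width and step size are assumed

x = rng.standard_normal((n, 3))
noise = 0.01 * rng.standard_normal(n)
outliers = rng.random(n) < 0.05       # 5% impulsive outliers
noise[outliers] += 10 * rng.standard_normal(outliers.sum())
d = x @ w_true + noise

w = np.zeros(3)
for xi, di in zip(x, d):
    e = di - xi @ w
    # The Gaussian kernel of the error scales each update, so huge
    # outlier errors contribute almost nothing (unlike MSE/LMS).
    w += mu * np.exp(-e**2 / (2 * sigma**2)) * e * xi

print(np.abs(w - w_true).max() < 0.1)   # coefficients recovered despite outliers
```

Under MSE-based LMS the same outliers would inject large, unweighted corrections into `w`; the correntropy kernel is what suppresses them here.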

  8. Preparation and Application of New Porous Environmental Ceramics Filter Medium

    LI Meng; WU Jianfeng; JIN Jianhua; LIU Xinming

    2005-01-01

    A new kind of environmental ceramic filter medium, made from industrial solid wastes discharged by the Shandong Alum Corporation, has been used in the process of drinking water treatment. New techniques were introduced to ensure its remarkable advantages, such as high porosity and strength. The results of practical application show that this filter medium has a shorter filtration run, a shorter maturation period and a higher deposit-holding capability compared with traditional sand filter media. Moreover, up to 25%-30% of the daily running costs are expected to be reduced by using this ceramic medium.

  9. Axial 3D region of interest reconstruction using weighted cone beam BPF/DBPF algorithm cascaded with adequately oriented orthogonal butterfly filtering

    Tang, Shaojie; Tang, Xiangyang

    2016-03-01

    Axial cone beam (CB) computed tomography (CT) reconstruction remains the most desirable in clinical applications. As potential candidates with analytic form for the task, the backprojection-filtration (BPF) and the derivative backprojection filtered (DBPF) algorithms, which share Hilbert filtering as a common algorithmic feature, were originally derived for exact helical and axial reconstruction from CB and fan beam projection data, respectively. These two algorithms have been heuristically extended to axial CB reconstruction through the adoption of virtual PI-line segments. Unfortunately, streak artifacts are induced along the Hilbert filtering direction, since these algorithms are no longer exact on the virtual PI-line segments. We have previously proposed cascading the extended BPF/DBPF algorithm with orthogonal butterfly filtering for image reconstruction (namely, axial CB-BPF/DBPF cascaded with orthogonal butterfly filtering), in which the orientation-specific artifacts caused by the post-BP Hilbert transform can be eliminated, at the possible expense of losing the BPF/DBPF capability of handling projection data truncation. Our preliminary results have shown that this is not the case in practice. Hence, in this work, we carry out an algorithmic analysis and experimental study to investigate the performance of the axial CB-BPF/DBPF cascaded with adequately oriented orthogonal butterfly filtering for three-dimensional (3D) region-of-interest (ROI) reconstruction.
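For contrast with the BPF/DBPF family discussed above, here is a minimal parallel-beam filtered back-projection sketch; the phantom, geometry and ramp-filter discretization are illustrative choices, not the authors' cone-beam algorithm.

```python
import numpy as np
from scipy.ndimage import rotate

N, n_angles = 128, 180
angles = np.linspace(0.0, 180.0, n_angles, endpoint=False)

# Phantom: a single off-centre disc.
yy, xx = np.mgrid[-1:1:N * 1j, -1:1:N * 1j]
phantom = ((xx - 0.2) ** 2 + yy ** 2 < 0.3 ** 2).astype(float)

# Forward projection: rotate the image and sum along columns.
sinogram = np.stack([rotate(phantom, a, reshape=False, order=1).sum(axis=0)
                     for a in angles])

# The "filtered" step: ramp-filter each projection in the Fourier domain.
ramp = np.abs(np.fft.fftfreq(N))
filtered = np.real(np.fft.ifft(np.fft.fft(sinogram, axis=1) * ramp, axis=1))

# The "back-projection" step: smear each filtered projection across the
# image plane and rotate it back to its acquisition angle.
recon = np.zeros((N, N))
for a, p in zip(angles, filtered):
    recon += rotate(np.tile(p, (N, 1)), -a, reshape=False, order=1)
recon *= np.pi / n_angles

inside, outside = recon[N // 2, int(0.6 * N)], recon[N // 2, 5]
print(inside > outside)   # the disc is recovered brighter than the background
```

The BPF/DBPF algorithms of the abstract invert this order, backprojecting first and applying a (Hilbert) filter afterwards, which is precisely what makes their artifacts orientation-specific.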

  10. Decontamination of HEPA filters

    Mound Facility, during many years of plutonium-238 experience, has recovered over 150 kg of plutonium-238. Much of this material was recovered from HEPA filters or from solid wastes such as sludge and slag. The objective of this task was to modify and improve the existing nitric acid leaching process used at Mound so that filters from the nuclear fuel cycle could be decontaminated effectively. Various leaching agents were tested to determine their capability for dissolving PuO2, UO2, U3O8, AmO2, NpO2, CmO2, and ThO2 in mixtures of the following: HNO3-HF; HNO3-HF-H2SO4; and HNO3-(NH4)2Ce(NO3)6. Adsorption isotherms were obtained for two leaching systems. In some tests simulated contaminated HEPA filter material was used, while in others actual spent glovebox filters were used. The maximum decontamination factor of 833 was achieved in the recovery of plutonium-238 from actual filters. The dissolution was accomplished by using a six-stage process with 4N HNO3-0.23M (NH4)2Ce(NO3)6 as the leaching agent. Thorium oxide was also effectively dissolved from filter media using a mixture of nitric acid and ceric ammonium nitrate. Sodium carbonate and Na2CO3-KNO3 fusion tests were performed using simulated PuO2-contaminated filter media at various temperatures. Approximately 70 wt% of the PuO2 was soluble in a mixture composed of 70 wt% Na2CO3-30 wt% KNO3 (heated for 1 h at 950 °C). 23 figs., 14 tables

  11. Shape Preserving Filament Enhancement Filtering

    Wilkinson, Michael H.F.; Westenberg, Michel A.

    2001-01-01

    Morphological connected set filters for extraction of filamentous details from medical images are developed. The advantages of these filters are that they are shape preserving and do not amplify noise. Two approaches are compared: (i) multi-scale filtering (ii) single-step shape filtering using conn

  12. Design of a cavity filter

    A cavity filter was developed for the SSRF 0-mode beam feedback. The filter is used to pick up the 500 MHz signal from the storage ring beam. The Superfish was used to simulate the model of the cavity bandpass filter. The design method, parameters of the filter and results of beam measurements are described in this paper. (authors)

  13. Kalman Filtering for Manufacturing Processes

    Oakes, Thomas; Tang, Lie; Robert G. Landers; Balakrishnan, S.N.

    2009-01-01

    This chapter presented a methodology, based on stochastic process modeling and Kalman filtering, to filter manufacturing process measurements, which are known to be inherently noisy. Via simulation studies, the methodology was compared to low pass and Butterworth filters. The methodology was applied in a Friction Stir Welding (FSW) process to filter data
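A minimal sketch of the kind of scalar Kalman filter the chapter describes, applied to a noisy, slowly drifting process measurement; the random-walk model and the noise variances are assumed for illustration, not taken from the FSW application.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 400
truth = 5.0 + 0.002 * np.arange(n)            # slowly drifting process value
z = truth + 0.5 * rng.standard_normal(n)      # inherently noisy measurements

# Scalar random-walk model:  x_k = x_{k-1} + w_k,   z_k = x_k + v_k
Q, R = 1e-4, 0.25     # assumed process / measurement noise variances
x, P = z[0], 1.0
est = np.empty(n)
for k, zk in enumerate(z):
    P += Q                    # predict: variance grows by the process noise
    K = P / (P + R)           # Kalman gain
    x += K * (zk - x)         # update with the measurement innovation
    P *= 1.0 - K
    est[k] = x

print(np.std(est - truth) < np.std(z - truth))   # filtered estimate beats raw data
```

Unlike a fixed low-pass or Butterworth filter, the gain `K` here adapts from the modeled noise variances, which is the comparison the chapter draws.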

  14. TRADITIONAL CHINESE MEDICINE

    1993-01-01

    930433 A study on the relationship between hypothyroidism and deficiency of Kidney Yang. ZHA Lianglun (查良伦), et al. Instit Integr TCM & West Med, Shanghai Med Univ, Shanghai, 200040. Chin J Integr Tradit & West Med 1993;13(4):202-204. Thirty-two cases of hypothyroidism caused by various factors were treated for one year with the Chinese medicinal herb preparation "Shen Lu tablet" (SLT) to warm and reinforce the Kidney Yang. 34 normal persons were studied as a control group. After treatment with SLT, the clinical symptoms of hypothyroidism were markedly improved. Average serum concentrations of total T3 and T4 increased significantly from 67.06±4.81

  15. Traditional preventive treatment options

    Longbottom, C; Ekstrand, K; Zero, D

    2009-01-01

    Preventive treatment options can be divided into primary, secondary and tertiary prevention techniques, which can involve patient- or professionally applied methods. These include: oral hygiene (instruction), pit and fissure sealants ('temporary' or 'permanent'), fluoride applications (patient- or ...) ... prevention of caries in children, e.g. pit and fissure sealants and topically applied fluorides (including patient-applied fluoride toothpastes and professionally applied fluoride varnishes), but limited strong evidence for these techniques for secondary prevention, i.e. where early to established lesions ... conventional operative care; and since controlling the caries process prior to the first restoration is the key to breaking the repair cycle and improving care for patients, future research should address the shortcomings in the current level of supporting evidence for the various traditional preventive treatment ...

  16. Asymmetric Baxter-King filter

    Buss, Ginters

    2011-01-01

    The paper proposes an extension of the symmetric Baxter-King band-pass filter to an asymmetric Baxter-King filter. The optimal correction scheme for the ideal filter weights is the same as in the symmetric version, i.e., cut the ideal filter at the appropriate length and add a constant to all filter weights to ensure zero weight on zero frequency. Since the symmetric Baxter-King filter is unable to extract the desired signal at the very ends of the series, the extension to an asymmetric filter...
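The correction scheme quoted above (truncate the ideal band-pass weights, then shift them by a constant so the gain at zero frequency vanishes) can be written down directly. A sketch of the symmetric weights, with the usual 6-32 period business-cycle band and truncation K=12 as assumed inputs:

```python
import numpy as np

def bk_weights(low_period, high_period, K):
    """Symmetric Baxter-King band-pass weights of truncation K."""
    w1 = 2 * np.pi / high_period      # lower band edge (longest cycle kept)
    w2 = 2 * np.pi / low_period       # upper band edge (shortest cycle kept)
    j = np.arange(1, K + 1)
    b = np.empty(2 * K + 1)
    b[K] = (w2 - w1) / np.pi                              # ideal weight at lag 0
    b[K + 1:] = (np.sin(w2 * j) - np.sin(w1 * j)) / (np.pi * j)
    b[:K] = b[K + 1:][::-1]                               # symmetric in the lag
    b -= b.sum() / (2 * K + 1)   # constant shift => zero gain at frequency 0
    return b

w = bk_weights(6, 32, 12)   # standard 6-32 period business-cycle band
print(abs(w.sum()) < 1e-12)            # zero weight on zero frequency
print(bool(np.allclose(w, w[::-1])))   # symmetric, as in the original filter
```

The asymmetric extension the paper proposes applies the same cut-and-shift correction to one-sided windows near the sample ends, which this symmetric sketch does not attempt.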

  17. Choosing and using astronomical filters

    Griffiths, Martin

    2014-01-01

    As a casual read through any of the major amateur astronomical magazines will demonstrate, there are filters available for all aspects of optical astronomy. This book provides a ready resource on the use of the following filters, among others, for observational astronomy or for imaging: Light pollution filters Planetary filters Solar filters Neutral density filters for Moon observation Deep-sky filters, for such objects as galaxies, nebulae and more Deep-sky objects can be imaged in much greater detail than was possible many years ago. Amateur astronomers can take

  18. Anti-clogging filter system

    Brown, Erik P.

    2015-05-19

    An anti-clogging filter system for filtering a fluid containing large particles and small particles includes an enclosure with at least one individual elongated tubular filter element in the enclosure. The individual elongated tubular filter element has an internal passage, a closed end, an open end, and a filtering material in or on the individual elongated tubular filter element. The fluid travels through the open end of the elongated tubular element and through the internal passage and through the filtering material. An anti-clogging element is positioned on or adjacent the individual elongated tubular filter element and provides a fluid curtain that preferentially directs the larger particulates to one area of the filter material allowing the remainder of the filter material to remain more efficient.

  19. Multilevel Mixture Kalman Filter

    Xiaodong Wang

    2004-11-01

    Full Text Available The mixture Kalman filter is a general sequential Monte Carlo technique for conditional linear dynamic systems. It generates samples of some indicator variables recursively based on sequential importance sampling (SIS and integrates out the linear and Gaussian state variables conditioned on these indicators. Due to the marginalization process, the complexity of the mixture Kalman filter is quite high if the dimension of the indicator sampling space is high. In this paper, we address this difficulty by developing a new Monte Carlo sampling scheme, namely, the multilevel mixture Kalman filter. The basic idea is to make use of the multilevel or hierarchical structure of the space from which the indicator variables take values. That is, we draw samples in a multilevel fashion, beginning with sampling from the highest-level sampling space and then drawing samples from the associated subspace of the newly drawn samples in a lower-level sampling space, until reaching the desired sampling space. Such a multilevel sampling scheme can be used in conjunction with a delayed estimation method, such as the delayed-sample method, resulting in a delayed multilevel mixture Kalman filter. Examples in wireless communication, specifically coherent and noncoherent 16-QAM over flat-fading channels, are provided to demonstrate the performance of the proposed multilevel mixture Kalman filter.

  20. Boolean filters of distributive lattices

    M. Sambasiva Rao

    2013-07-01

    Full Text Available In this paper we introduce the notion of Boolean filters in a pseudo-complemented distributive lattice and characterize the class of all Boolean filters. Further a set of equivalent conditions are derived for a proper filter to become a prime Boolean filter. Also a set of equivalent conditions is derived for a pseudo-complemented distributive lattice to become a Boolean algebra. Finally, a Boolean filter is characterized in terms of congruences.

  1. DOE HEPA filter test program

    This standard establishes essential elements of a Department of Energy (DOE) program for testing HEPA filters to be installed in DOE nuclear facilities or used in DOE-contracted activities. A key element is the testing of HEPA filters for performance at a DOE Filter Test Facility (FTF) prior to installation. Other key elements are (1) providing for a DOE HEPA filter procurement program, and (2) verifying that HEPA filters to be installed in nuclear facilities appear on a Qualified Products List (QPL)

  2. DOE HEPA filter test program

    NONE

    1998-05-01

    This standard establishes essential elements of a Department of Energy (DOE) program for testing HEPA filters to be installed in DOE nuclear facilities or used in DOE-contracted activities. A key element is the testing of HEPA filters for performance at a DOE Filter Test Facility (FTF) prior to installation. Other key elements are (1) providing for a DOE HEPA filter procurement program, and (2) verifying that HEPA filters to be installed in nuclear facilities appear on a Qualified Products List (QPL).

  3. Efficiency of new Miswak, titanium dioxide and sand filters in reducing pollutants from wastewater

    Mohamed Ramadan

    2015-01-01

    In this work, I tried to improve the efficiency and performance of the sand filter with a simple, low-cost method. This work reports an alternative to the traditional biological sand filter, which is considered time-consuming. Miswak and TiO2 were added to the filter and their effects were observed. Three filters were designed (SF-1, SF-2, and SF-3). The results of this work indicated that chemical oxygen demand (COD) was reduced by 47.54%, 90.94%, and 95.47% in case...

  4. Fuzzy filters in BCI-algebras

    C. Lele; Wu, C; Mamadou, T.

    2002-01-01

    We introduce the notion of fuzzy filters and weak filters in BCI-algebras and discuss their properties. Then we establish some relations among filters, fuzzy filters, and weak filters in BCI-algebras.

  5. A new algorithm of inter-frame filtering in IR image based on threshold value

    Liu, Wei; Leng, Hanbing; Chen, Weining; Yang, Hongtao; Xie, Qingsheng; Yi, Bo; Zhang, Haifeng

    2013-09-01

    This paper proposes a new threshold-based algorithm for inter-frame filtering of IR images, intended to solve the image blur and smearing introduced by traditional inter-frame filtering algorithms. It first identifies the causes of blur and smearing by analyzing the general and dynamic inter-frame filtering algorithms, and then proposes a new kind of time-domain filter. To obtain the filter coefficients, it computes the difference image between the current and previous frames, and then derives a noise threshold from a probabilistic analysis of that difference image. The relationship between the difference image and the threshold yields the filter coefficients. Finally, inter-frame filtering is applied only to pixels corrupted by noise. The experimental results show that this algorithm successfully suppresses IR image blur and smearing: the NETD measured with the traditional inter-frame filtering algorithm and with the new algorithm is 78 mK and 70 mK respectively, demonstrating better noise reduction than traditional methods. The algorithm applies not only to still images but also to moving scenes. As a new algorithm of great practical value, it is easy to implement on an FPGA, offers excellent real-time performance, and effectively extends the application scope of time-domain filtering algorithms.
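A hypothetical NumPy sketch of the thresholded inter-frame idea: estimate a noise threshold from the difference image, temporally average only the pixels whose change falls below it, and pass moving content through untouched. The robust MAD-based threshold and the blending weight are assumptions for illustration, not the paper's exact method.

```python
import numpy as np

def interframe_filter(prev, curr, k=3.0, alpha=0.5):
    """Threshold-gated temporal filter: average quiet pixels, keep motion."""
    diff = curr.astype(float) - prev.astype(float)
    # Robust noise threshold from the difference-image statistics (assumed form).
    thresh = k * 1.4826 * np.median(np.abs(diff - np.median(diff)))
    noisy = np.abs(diff) <= thresh          # small changes: treated as noise
    out = curr.astype(float)
    out[noisy] = alpha * prev[noisy] + (1 - alpha) * curr[noisy]
    return out

rng = np.random.default_rng(2)
scene = np.full((64, 64), 100.0)
prev = scene + rng.standard_normal((64, 64))
curr = scene + rng.standard_normal((64, 64))
curr[10:20, 10:20] += 50                    # a genuinely moving hot target
out = interframe_filter(prev, curr)

print(out[15, 15] > 140)                               # moving target preserved
print(np.std(out[40:, 40:]) < np.std(curr[40:, 40:]))  # background noise reduced
```

Gating the temporal average on the threshold is what avoids the blur and smearing of unconditional inter-frame averaging: moving pixels never get mixed with the previous frame.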

  6. Generalized Filtering Decomposition

    Grigori, Laura

    2011-01-01

    This paper introduces a new preconditioning technique that is suitable for matrices arising from the discretization of a system of PDEs on unstructured grids. The preconditioner satisfies a so-called filtering property, which ensures that the input matrix is identical with the preconditioner on a given filtering vector. This vector is chosen to alleviate the effect of low-frequency modes on convergence and so decrease or eliminate the plateau which is often observed in the convergence of iterative methods. In particular, the paper presents a general approach that ensures the filtering condition is satisfied in a matrix decomposition. The input matrix can have an arbitrary sparse structure and can hence be reordered using nested dissection, allowing parallel computation of the preconditioner and of the iterative process.

  7. Glove-box filters

    Description is given of a device for simply and rapidly assembling and disassembling the filters used inside sealed enclosures, such as glove-boxes and shielded cells equipped with nippers or manipulators. The filters are of the type comprising a cylindrical casing containing a filtering member; the upper portion of the casing is open so as to allow the gases to be cleaned to flow in, whereas the casing bottom is centrally provided with a hole extended outwardly by a threaded collar on which is screwed a connecting-sleeve to be fixed to the mouth of a gas outlet pipe. To a yoke transverse bar is welded a pin which can be likened to a bent spring-blade, one arm of which, welded to said transverse bar, is rectilinear, whereas its other arm is provided with a boss cooperating with a cavity made in a protrusion of said pipe, right under the mouth thereof.

  8. Updating the OMERACT filter

    D'Agostino, Maria-Antonietta; Boers, Maarten; Kirwan, John; van der Heijde, Désirée; Østergaard, Mikkel; Schett, Georg; Landewé, Robert B; Maksymowych, Walter P; Naredo, Esperanza; Dougados, Maxime; Iagnocco, Annamaria; Bingham, Clifton O; Brooks, Peter M; Beaton, Dorcas E; Gandjbakhch, Frederique; Gossec, Laure; Guillemin, Francis; Hewlett, Sarah E; Kloppenburg, Margreet; March, Lyn; Mease, Philip J; Moller, Ingrid; Simon, Lee S; Singh, Jasvinder A; Strand, Vibeke; Wakefield, Richard J; Wells, George A; Tugwell, Peter; Conaghan, Philip G

    2014-01-01

    OBJECTIVE: The Outcome Measures in Rheumatology (OMERACT) Filter provides a framework for the validation of outcome measures for use in rheumatology clinical research. However, imaging and biochemical measures may face additional validation challenges because of their technical nature. The Imaging...... evaluated using the original OMERACT Filter and the newly proposed structure. Breakout groups critically reviewed the extent to which the candidate biomarkers complied with the proposed stepwise approach, as a way of examining the utility of the proposed 3-dimensional structure. RESULTS: Although there was...... was obtained for a proposed tri-axis structure to assess validation of imaging and soluble biomarkers; nevertheless, additional work is required to better evaluate its place within the OMERACT Filter 2.0....

  9. Ferroelectric electronically tunable filters

    A cylindrical cavity is loaded with a ferroelectric rod and is resonant at the dominant mode. The loaded cylindrical cavity is a band pass filter. As a bias voltage is applied across the ferroelectric rod, its permittivity changes resulting in a new resonant frequency for the loaded cylindrical cavity. The ferroelectric rod is operated at a temperature slightly above its Curie temperature. The loaded cylindrical cavity is kept at a constant designed temperature. The cylindrical cavity is made of conductors, a single crystal high Tc superconductor including YBCO and a single crystal dielectric, including sapphire and lanthanum aluminate, the interior conducting surfaces of which are deposited with a film of a single crystal high Tc superconductor. Embodiments also include waveguide single and multiple cavity type tunable filters. Embodiments also include tunable band reject filters. 10 figs

  10. Kalman filter modeling

    Brown, R. G.

    1984-01-01

    The formulation of appropriate state-space models for Kalman filtering applications is studied. The model is completely specified by four matrix parameters and the initial conditions of the recursive equations. Once these are determined, the die is cast, and the way in which the measurements are weighted is determined forever after. Thus, finding a model that fits the physical situation at hand is all-important. It is also often the most difficult aspect of designing a Kalman filter. Formulation of discrete state models from the spectral density and ARMA random process descriptions is discussed. Finally, it is pointed out that many common processes encountered in applied work (such as band-limited white noise) simply do not lend themselves very well to Kalman filter modeling.