WorldWideScience

Sample records for traditional filtered back-projection

  1. A Wiener filtering based back projection algorithm for image reconstruction

    In the context of computed tomography (CT), a key technique is image reconstruction from projection data. The filtered back projection (FBP) algorithm is commonly used for this purpose. Based on an analysis of the causes of artifacts, we propose a new image reconstruction algorithm combining the Wiener filter with the FBP algorithm. The conventional FBP image reconstruction algorithm is improved by adopting the Wiener filter, and artifacts in the reconstructed images are markedly reduced. Experimental results on typical flow regimes show that the improved algorithm can effectively improve image quality. (authors)
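
    A minimal sketch of how such a combination might look. The abstract does not specify the exact pipeline, so this assumes the adaptive Wiener filter from scipy.signal is applied to each projection before the usual FBP ramp filtering:

```python
import numpy as np
from scipy.signal import wiener

def wiener_fbp_projections(sinogram, window=5):
    """Denoise each projection with an adaptive Wiener filter, then ramp-filter.

    sinogram: 2D array, one row per view. The Wiener step estimates local
    mean/variance in a sliding window and attenuates noise-dominated regions
    before the ramp filter (which would otherwise amplify high-frequency noise).
    """
    denoised = np.array([wiener(row, mysize=window) for row in sinogram])
    ramp = np.abs(np.fft.fftfreq(sinogram.shape[1]))
    return np.real(np.fft.ifft(np.fft.fft(denoised, axis=1) * ramp, axis=1))
```

The filtered rows would then be back-projected as in standard FBP.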

  2. An adaptive filtered back-projection for photoacoustic image reconstruction

    Purpose: The purpose of this study is to develop an improved filtered-back-projection (FBP) algorithm for photoacoustic tomography (PAT), which allows image reconstruction with higher quality than traditional algorithms. Methods: A rigorous expression for a weighting function has been derived directly from the photoacoustic wave equation and used as a ramp filter in the Fourier domain. The authors’ new algorithm utilizes this weighting function to precisely calculate each photoacoustic signal’s contribution and then reconstructs the image based on the retarded potential generated from the photoacoustic sources. In addition, an adaptive criterion has been derived for selecting the cutoff frequency of a low-pass filter. Two computational phantoms were created to test the algorithm. The first phantom contained five spheres, each with a different absorbance. This phantom was used to test the capability of correctly representing both the geometry and the relative absorbed energy in a planar measurement system. The authors also used another phantom containing absorbers of different sizes with overlapping geometry to evaluate the performance of the new method for complicated geometry. In addition, a random noise background was added to the simulated data, which were obtained using an arc-shaped array of 50 evenly distributed transducers that spanned 160° over a circle with a radius of 65 mm. A normalization factor between neighboring transducers was applied to correct the measured signals in the PAT simulations. The authors assumed that the scanned object was mounted on a holder that rotated over the full 360° and that the scans used a sampling rate of 20.48 MHz. Results: The authors have obtained reconstructed images of the computerized phantoms by utilizing the new FBP algorithm. From the reconstructed image of the first phantom, one can see that this new approach allows not only obtaining a sharp image but also showing
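
    The two filtering ingredients described above can be sketched as follows. The cutoff criterion here (the highest frequency whose spectral amplitude still exceeds an estimated noise floor) is an illustrative stand-in, not the authors' derived rule:

```python
import numpy as np

def lowpass_ramp(n, cutoff):
    """Ramp filter |f| apodized by a cosine window, zeroed beyond `cutoff`
    (cutoff in cycles/sample, 0 < cutoff <= 0.5)."""
    f = np.fft.fftfreq(n)
    win = 0.5 * (1 + np.cos(np.pi * f / cutoff)) * (np.abs(f) <= cutoff)
    return np.abs(f) * win

def adaptive_cutoff(signal, noise_floor):
    """Illustrative adaptive criterion: the highest frequency at which the
    signal's amplitude spectrum still exceeds an estimated noise floor."""
    n = len(signal)
    spec = np.abs(np.fft.rfft(signal)) / n
    freqs = np.fft.rfftfreq(n)
    above = np.nonzero(spec >= noise_floor)[0]
    return freqs[above[-1]] if above.size else 0.0
```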

  3. Improvement of wavelet threshold filtered back-projection image reconstruction algorithm

    Ren, Zhong; Liu, Guodong; Huang, Zhen

    2014-11-01

    Image reconstruction techniques have been applied in many fields, including medical imaging such as X-ray computed tomography (X-CT), positron emission tomography (PET) and magnetic resonance imaging (MRI), but the reconstructed results are still not satisfactory because the original projection data are inevitably contaminated by noise. Although traditional filters such as the Shepp-Logan (SL) and Ram-Lak (RL) filters can suppress some noise, the Gibbs oscillation phenomenon is generated, and the artifacts introduced by back-projection are not greatly improved. Wavelet threshold denoising can overcome the interference of noise with image reconstruction. Since the traditional soft and hard threshold functions have inherent defects, an improved wavelet threshold function combined with the filtered back-projection (FBP) algorithm is proposed in this paper. Four different reconstruction algorithms were compared in simulated experiments. Experimental results demonstrated that the improved algorithm largely eliminates the discontinuity and large distortion of the traditional threshold functions, as well as the Gibbs oscillation. Finally, the effectiveness of the improved algorithm was verified by comparing two evaluation criteria, mean square error (MSE) and peak signal-to-noise ratio (PSNR), among the four algorithms, and the optimal dual threshold values of the improved wavelet threshold function were obtained.
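
    The trade-off between hard and soft thresholding that motivates such improved functions can be seen in a small sketch. The non-negative garrote below is a standard continuous compromise, shown for illustration; the paper's own dual-threshold function is not given in the abstract:

```python
import numpy as np

def hard_threshold(w, t):
    """Keep coefficients above t unchanged, zero the rest (discontinuous at t)."""
    return np.where(np.abs(w) > t, w, 0.0)

def soft_threshold(w, t):
    """Shrink all surviving coefficients toward zero by t (continuous, but
    biases large coefficients)."""
    return np.sign(w) * np.maximum(np.abs(w) - t, 0.0)

def garrote_threshold(w, t):
    """Non-negative garrote: continuous like soft thresholding, but its bias
    t**2/w vanishes for large coefficients, like hard thresholding."""
    out = np.zeros_like(w, dtype=float)
    mask = np.abs(w) > t
    out[mask] = w[mask] - t**2 / w[mask]
    return out
```

Applying any of these to the wavelet coefficients of the sinogram (or the reconstructed image) before FBP is the structure the paper describes.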

  4. Filtered back-projection algorithm for Compton telescopes

    Gunter, Donald L.

    2008-03-18

    A method for the conversion of Compton camera data into a 2D image of the incident-radiation flux on the celestial sphere includes detecting coincident gamma radiation flux arriving from various directions of a 2-sphere. These events are mapped by back-projection onto the 2-sphere to produce a convolution integral that is subsequently stereographically projected onto a 2-plane to produce a second convolution integral which is deconvolved by the Fourier method to produce an image that is then projected onto the 2-sphere.
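
    The stereographic step that moves the problem from the 2-sphere to a 2-plane (so the deconvolution can be done with planar Fourier transforms) can be sketched as follows, projecting from the north pole onto the equatorial plane. This is one common convention; the patent's exact chart may differ:

```python
import numpy as np

def stereographic(points):
    """Project unit vectors (x, y, z) from the north pole (0, 0, 1) onto the
    z = 0 plane: (X, Y) = (x, y) / (1 - z). Undefined at the pole itself."""
    x, y, z = points[..., 0], points[..., 1], points[..., 2]
    return np.stack([x / (1 - z), y / (1 - z)], axis=-1)

def inverse_stereographic(plane):
    """Map plane points back to the unit sphere (inverse of the chart above)."""
    X, Y = plane[..., 0], plane[..., 1]
    s = X**2 + Y**2
    return np.stack([2 * X, 2 * Y, s - 1], axis=-1) / (s + 1)[..., None]
```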

  5. Generalized filtered back-projection for digital breast tomosynthesis reconstruction

    Erhard, Klaus; Grass, Michael; Hitziger, Sebastian; Iske, Armin; Nielsen, Tim

    2012-03-01

    Filtered backprojection (FBP) has been commonly used as an efficient and robust reconstruction technique in tomographic X-ray imaging over the last decades. For standard geometries such as the circle or helix, it is known how to filter the data efficiently. However, for geometries with only a few projection views or a limited angular range, FBP algorithms generally give poor results. In digital breast tomosynthesis (DBT), these limitations give rise to image artifacts due to the limited angular range and the coarse angular sampling. In this work, a generalized FBP algorithm is presented, which uses the filtered projection data of all acquired views for backprojection along one direction. The proposed method yields a computationally efficient generalized FBP algorithm for DBT that provides image quality similar to that of iterative reconstruction techniques while preserving the ability to perform region-of-interest reconstructions. To demonstrate the excellent performance of this method, examples are given with a simulated breast phantom and the hardware BR3D phantom.

  6. Superresolution of Hyperspectral Image Using Advanced Nonlocal Means Filter and Iterative Back Projection

    Jin Wang

    2015-01-01

    We introduce an efficient superresolution algorithm for hyperspectral images based on an advanced nonlocal means (NLM) filter and iterative back projection. The nonlocal means method estimates the to-be-interpolated pixel as a weighted average of all pixels within the image, and unrelated neighborhoods are automatically suppressed by their near-zero weights. However, spatial distance is also important when reconstructing a missing pixel. We therefore propose an advanced NLM (ANLM) filter that considers both neighborhood similarity and patch distance. Whereas the search region in the conventional NLM method is the whole image, the proposed ANLM uses a limited search window to reduce complexity. Iterative back projection (IBP) is a well-known method for image restoration. In superresolution, IBP iteratively recovers the high-resolution image from a given low-resolution image, degraded by blur and noise, by minimizing the reconstruction error. However, because the IBP error feedback is an isotropic back projection, conventional IBP suffers from jaggy and ringing artifacts; introducing the ANLM method to improve visual quality is therefore necessary.
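
    The IBP idea described above can be sketched with a simple Gaussian-blur-plus-decimation degradation model. The blur, step size and bilinear upsampler are illustrative assumptions, not the paper's exact operators:

```python
import numpy as np
from scipy.ndimage import gaussian_filter, zoom

def degrade(hr, factor=2, sigma=1.0):
    """Assumed imaging model: Gaussian blur followed by decimation."""
    return gaussian_filter(hr, sigma)[::factor, ::factor]

def iterative_back_projection(lr, factor=2, sigma=1.0, iters=10, step=0.5):
    """Classic IBP: simulate the low-resolution image from the current estimate,
    back-project (upsample) the residual, and feed it into the estimate."""
    hr = zoom(lr, factor, order=1)
    for _ in range(iters):
        residual = lr - degrade(hr, factor, sigma)
        hr = hr + step * zoom(residual, factor, order=1)
    return hr
```

The paper's contribution replaces the plain upsampling of the residual with the ANLM filter to suppress the jaggy and ringing artifacts of this isotropic feedback.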

  7. Monte Carlo evaluation of the Filtered Back Projection method for image reconstruction in proton computed tomography

    Cirrone, G.A.P., E-mail: cirrone@lns.infn.it [Laboratori Nazionali del Sud - National Institute for Nuclear Physics INFN (INFN-LNS), Via S.Sofia 64, 95100 Catania (Italy); Bucciolini, M. [Department of 'Fisiopatologia Clinica', University of Florence, V.le Morgagni 85, I-50134 Florence (Italy); Bruzzi, M. [Energetic Department, University of Florence, Via S. Marta 3, I-50139 Florence (Italy); Candiano, G. [Laboratorio di Tecnologie Oncologiche HSR, Giglio Contrada, Pietrapollastra-Pisciotto, 90015 Cefalu, Palermo (Italy); Civinini, C. [National Institute for Nuclear Physics INFN, Section of Florence, Via G. Sansone 1, Sesto Fiorentino, I-50019 Florence (Italy); Cuttone, G. [Laboratori Nazionali del Sud - National Institute for Nuclear Physics INFN (INFN-LNS), Via S.Sofia 64, 95100 Catania (Italy); Guarino, P. [Nuclear Engineering Department, University of Palermo, Via... Palermo (Italy); Laboratori Nazionali del Sud - National Institute for Nuclear Physics INFN (INFN-LNS), Via S.Sofia 64, 95100 Catania (Italy); Lo Presti, D. [Physics Department, University of Catania, Via S. Sofia 64, I-95123, Catania (Italy); Mazzaglia, S.E. [Laboratori Nazionali del Sud - National Institute for Nuclear Physics INFN (INFN-LNS), Via S.Sofia 64, 95100 Catania (Italy); Pallotta, S. [Department of 'Fisiopatologia Clinica', University of Florence, V.le Morgagni 85, I-50134 Florence (Italy); Randazzo, N. [National Institute for Nuclear Physics INFN, Section of Catania, Via S.Sofia 64, 95123 Catania (Italy); Sipala, V. [National Institute for Nuclear Physics INFN, Section of Catania, Via S.Sofia 64, 95123 Catania (Italy); Physics Department, University of Catania, Via S. Sofia 64, I-95123, Catania (Italy); Stancampiano, C. [National Institute for Nuclear Physics INFN, Section of Catania, Via S.Sofia 64, 95123 Catania (Italy); and others

    2011-12-01

    In this paper the use of the Filtered Back Projection (FBP) algorithm to reconstruct tomographic images from high-energy (200-250 MeV) proton beams is investigated. The algorithm has been studied in detail with a Monte Carlo approach, and image quality has been analysed and compared with the total absorbed dose. A proton Computed Tomography (pCT) apparatus developed by our group has been fully simulated, exploiting the power of the Geant4 Monte Carlo toolkit. From the simulation of the apparatus, a set of tomographic images of a test phantom has been reconstructed using FBP at different absorbed dose values. The images have been evaluated in terms of homogeneity, noise, contrast, and spatial and density resolution.
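
    The FBP pipeline that this record (and most records on this page) builds on can be written compactly. The sketch below is a generic parallel-beam textbook version in NumPy/SciPy, not the pCT-specific implementation:

```python
import numpy as np
from scipy.ndimage import rotate

def radon(image, angles):
    """Parallel-beam forward projection: rotate the image, then sum columns."""
    return np.array([rotate(image, np.degrees(a), reshape=False, order=1).sum(axis=0)
                     for a in angles])

def fbp(sinogram, angles):
    """Textbook FBP: ramp-filter each view in the Fourier domain, then smear
    (back-project) each filtered view across the image plane."""
    n = sinogram.shape[1]
    ramp = np.abs(np.fft.fftfreq(n))  # |f| ramp filter
    filtered = np.real(np.fft.ifft(np.fft.fft(sinogram, axis=1) * ramp, axis=1))
    mid = n // 2
    xs = np.arange(n) - mid
    X, Y = np.meshgrid(xs, xs)
    recon = np.zeros((n, n))
    for view, a in zip(filtered, angles):
        # detector coordinate of every pixel for this view (nearest neighbour)
        t = np.clip(np.round(-X * np.sin(a) + Y * np.cos(a)).astype(int) + mid, 0, n - 1)
        recon += view[t]
    return recon * np.pi / len(angles)
```

Reconstructing a centered disc phantom from 60 views over 180° recovers its interior value to within the discretization error of this nearest-neighbour backprojector.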

  8. Monte Carlo evaluation of the Filtered Back Projection method for image reconstruction in proton computed tomography

    In this paper the use of the Filtered Back Projection (FBP) algorithm to reconstruct tomographic images from high-energy (200-250 MeV) proton beams is investigated. The algorithm has been studied in detail with a Monte Carlo approach, and image quality has been analysed and compared with the total absorbed dose. A proton Computed Tomography (pCT) apparatus developed by our group has been fully simulated, exploiting the power of the Geant4 Monte Carlo toolkit. From the simulation of the apparatus, a set of tomographic images of a test phantom has been reconstructed using FBP at different absorbed dose values. The images have been evaluated in terms of homogeneity, noise, contrast, and spatial and density resolution.

  9. Comparison of adaptive statistical iterative and filtered back projection reconstruction techniques in brain CT

    Purpose: To compare image quality and visualization of normal structures and lesions in brain computed tomography (CT) with adaptive statistical iterative reconstruction (ASIR) and filtered back projection (FBP) reconstruction techniques at different X-ray tube current–time products. Materials and methods: In this IRB-approved prospective study, forty patients (nineteen men, twenty-one women; mean age 69.5 ± 11.2 years) received brain scans at different tube current–time products (300 and 200 mAs) on a 64-section multi-detector CT (GE, Discovery CT750 HD). Images were reconstructed with FBP and four levels of ASIR-FBP blending. Two radiologists (note that our hospital is renowned for its geriatric medicine department, and these two radiologists are more experienced in chronic cerebrovascular disease than in neoplastic disease, so this study did not include cerebral tumors except as a point of discussion) assessed all the reconstructed images for visibility of normal structures, lesion conspicuity, image contrast and diagnostic confidence in a blinded and randomized manner. Volume CT dose index (CTDIvol) and dose-length product (DLP) were recorded. All data were analyzed using SPSS 13.0 statistical analysis software. Results: There was no statistically significant difference between image quality at 200 mAs with the 50% ASIR blending technique and at 300 mAs with FBP (p > .05), whereas a statistically significant difference (p < .05) was found between image quality at 200 mAs with FBP and at 300 mAs with FBP. Conclusion: ASIR provided the same image quality and diagnostic ability in brain imaging with a greater than 30% dose reduction compared with the FBP reconstruction technique.
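
    "Levels of ASIR-FBP blending" refers to a weighted average of the two reconstructions; the vendor's iterative step itself is proprietary, so in this sketch both inputs are simply images:

```python
import numpy as np

def blend_asir_fbp(fbp_img, ir_img, level):
    """level = 0.0 is pure FBP, level = 0.5 is '50% ASIR', level = 1.0 is pure
    iterative reconstruction."""
    return (1.0 - level) * fbp_img + level * ir_img
```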

  10. Comparison of adaptive statistical iterative and filtered back projection reconstruction techniques in brain CT

    Ren, Qingguo, E-mail: renqg83@163.com [Department of Radiology, Hua Dong Hospital of Fudan University, Shanghai 200040 (China); Dewan, Sheilesh Kumar, E-mail: sheilesh_d1@hotmail.com [Department of Geriatrics, Hua Dong Hospital of Fudan University, Shanghai 200040 (China); Li, Ming, E-mail: minli77@163.com [Department of Radiology, Hua Dong Hospital of Fudan University, Shanghai 200040 (China); Li, Jianying, E-mail: Jianying.Li@med.ge.com [CT Imaging Research Center, GE Healthcare China, Beijing (China); Mao, Dingbiao, E-mail: maodingbiao74@163.com [Department of Radiology, Hua Dong Hospital of Fudan University, Shanghai 200040 (China); Wang, Zhenglei, E-mail: Williswang_doc@yahoo.com.cn [Department of Radiology, Shanghai Electricity Hospital, Shanghai 200050 (China); Hua, Yanqing, E-mail: cjr.huayanqing@vip.163.com [Department of Radiology, Hua Dong Hospital of Fudan University, Shanghai 200040 (China)

    2012-10-15

    Purpose: To compare image quality and visualization of normal structures and lesions in brain computed tomography (CT) with adaptive statistical iterative reconstruction (ASIR) and filtered back projection (FBP) reconstruction techniques at different X-ray tube current–time products. Materials and methods: In this IRB-approved prospective study, forty patients (nineteen men, twenty-one women; mean age 69.5 ± 11.2 years) received brain scans at different tube current–time products (300 and 200 mAs) on a 64-section multi-detector CT (GE, Discovery CT750 HD). Images were reconstructed with FBP and four levels of ASIR-FBP blending. Two radiologists (note that our hospital is renowned for its geriatric medicine department, and these two radiologists are more experienced in chronic cerebrovascular disease than in neoplastic disease, so this study did not include cerebral tumors except as a point of discussion) assessed all the reconstructed images for visibility of normal structures, lesion conspicuity, image contrast and diagnostic confidence in a blinded and randomized manner. Volume CT dose index (CTDIvol) and dose-length product (DLP) were recorded. All data were analyzed using SPSS 13.0 statistical analysis software. Results: There was no statistically significant difference between image quality at 200 mAs with the 50% ASIR blending technique and at 300 mAs with FBP (p > .05), whereas a statistically significant difference (p < .05) was found between image quality at 200 mAs with FBP and at 300 mAs with FBP. Conclusion: ASIR provided the same image quality and diagnostic ability in brain imaging with a greater than 30% dose reduction compared with the FBP reconstruction technique.

  11. Filtered back-projection reconstruction for attenuation proton CT along most likely paths

    Quiñones, C. T.; Létang, J. M.; Rit, S.

    2016-05-01

    This work investigates the attenuation of a proton beam in order to reconstruct a map of the linear attenuation coefficient of a material, an attenuation mainly caused by inelastic interactions of protons with matter. Attenuation proton computed tomography (pCT) suffers from poor spatial resolution due to multiple Coulomb scattering (MCS) of protons in matter, similarly to conventional energy-loss pCT. We therefore adapted a recent filtered back-projection algorithm along the most likely path (MLP) of protons for energy-loss pCT (Rit et al 2013) to attenuation pCT, assuming a pCT scanner that can track the position and direction of protons before and after the scanned object. Monte Carlo simulations of pCT acquisitions of density and spatial-resolution phantoms were performed to characterize the new algorithm using Geant4 (via Gate). Attenuation pCT assumes an energy-independent inelastic cross-section, and the energy dependence of the inelastic cross-section below 100 MeV produced a capping artifact when the residual energy behind the object was below 100 MeV. The statistical limitation was determined analytically, and it was found that the noise in attenuation pCT images is 411 times and 278 times higher than the noise in energy-loss pCT images for the same imaging dose at 200 MeV and 300 MeV, respectively. Comparison of the spatial resolution of attenuation pCT images with a conventional straight-line-path binning showed that incorporating the MLP estimates during reconstruction improves the spatial resolution of attenuation pCT. Moreover, despite the significant noise in attenuation pCT images, the spatial resolution of attenuation pCT was better than that of conventional energy-loss pCT in some of the studied situations, thanks to the interplay of MCS and attenuation known as the West–Sherwood effect.
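
    The projection measured in attenuation pCT is the usual Beer-Lambert counting quantity, whose Poisson statistics explain the large noise penalty quoted above; as a sketch:

```python
import numpy as np

def attenuation_projection(n_in, n_out):
    """Line integral of the linear attenuation coefficient mu along the proton
    path, from counts entering (n_in) and surviving (n_out) the object."""
    return -np.log(n_out / n_in)

def projection_variance(n_out):
    """First-order Poisson error propagation: var[-ln(N/N0)] ~ 1/N, so the
    projection noise grows as fewer protons survive the object."""
    return 1.0 / n_out
```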

  12. Gamma-ray momentum reconstruction from Compton electron trajectories by filtered back-projection

    Gamma-ray imaging utilizing Compton scattering has traditionally relied on measuring coincident gamma-ray interactions to map directional information of the source distribution. This coincidence requirement makes it an inherently inefficient process. We present an approach to gamma-ray reconstruction from Compton scattering that requires only a single electron-tracking detector, thus removing the coincidence requirement. From the Compton-scattered electron momentum distribution, our algorithm analytically computes the incident photon's correlated direction and energy distributions. Because this method maps the source energy and location, it is useful in applications where prior information about the source distribution is unknown. We demonstrate this method with electron tracks measured in a scientific Si charge-coupled device. While the method was demonstrated with electron tracks in a Si-based detector, it is applicable to any detector that can measure electron direction and energy, or equivalently the electron momentum. For example, it can increase the sensitivity to energy and direction in gas-based systems that suffer from limited efficiency.

  13. Gamma-ray momentum reconstruction from Compton electron trajectories by filtered back-projection

    Haefner, A., E-mail: ahaefner@berkeley.edu; Plimley, B.; Pavlovsky, R. [Department of Nuclear Engineering, University of California Berkeley, 4155 Etcheverry Hall, MC 1730, Berkeley, California 94720-1730 (United States); Gunter, D. [Applied Nuclear Physics, Lawrence Berkeley National Laboratory, 1 Cyclotron Road Berkeley, California 94720 (United States); Vetter, K. [Department of Nuclear Engineering, University of California Berkeley, 4155 Etcheverry Hall, MC 1730, Berkeley, California 94720-1730 (United States); Applied Nuclear Physics, Lawrence Berkeley National Laboratory, 1 Cyclotron Road Berkeley, California 94720 (United States)

    2014-11-03

    Gamma-ray imaging utilizing Compton scattering has traditionally relied on measuring coincident gamma-ray interactions to map directional information of the source distribution. This coincidence requirement makes it an inherently inefficient process. We present an approach to gamma-ray reconstruction from Compton scattering that requires only a single electron-tracking detector, thus removing the coincidence requirement. From the Compton-scattered electron momentum distribution, our algorithm analytically computes the incident photon's correlated direction and energy distributions. Because this method maps the source energy and location, it is useful in applications where prior information about the source distribution is unknown. We demonstrate this method with electron tracks measured in a scientific Si charge-coupled device. While the method was demonstrated with electron tracks in a Si-based detector, it is applicable to any detector that can measure electron direction and energy, or equivalently the electron momentum. For example, it can increase the sensitivity to energy and direction in gas-based systems that suffer from limited efficiency.

  14. Evaluation of dose reduction and image quality in CT colonography: Comparison of low-dose CT with iterative reconstruction and routine-dose CT with filtered back projection

    Nagata, Koichi [Kameda Medical Center, Department of Radiology, Kamogawa, Chiba (Japan); Jichi Medical University, Department of Radiology, Tochigi (Japan); National Cancer Center, Cancer Screening Technology Division, Research Center for Cancer Prevention and Screening, Tokyo (Japan); Fujiwara, Masanori; Mogi, Tomohiro; Iida, Nao [Kameda Medical Center Makuhari, Department of Radiology, Chiba (Japan); Kanazawa, Hidenori; Sugimoto, Hideharu [Jichi Medical University, Department of Radiology, Tochigi (Japan); Mitsushima, Toru [Kameda Medical Center Makuhari, Department of Gastroenterology, Chiba (Japan); Lefor, Alan T. [Jichi Medical University, Department of Surgery, Tochigi (Japan)

    2015-01-15

    To prospectively evaluate the radiation dose and image quality comparing low-dose CT colonography (CTC) reconstructed using different levels of iterative reconstruction techniques with routine-dose CTC reconstructed with filtered back projection. Following institutional ethics clearance and informed consent procedures, 210 patients underwent screening CTC using automatic tube current modulation for dual positions. Examinations were performed in the supine position with a routine-dose protocol and in the prone position, randomly applying four different low-dose protocols. Supine images were reconstructed with filtered back projection and prone images with iterative reconstruction. Two blinded observers assessed the image quality of endoluminal images. Image noise was quantitatively assessed by region-of-interest measurements. The mean effective dose in the supine series was 1.88 mSv using routine-dose CTC, compared to 0.92, 0.69, 0.57, and 0.46 mSv at four different low doses in the prone series (p < 0.01). Overall image quality and noise of low-dose CTC with iterative reconstruction were significantly improved compared to routine-dose CTC using filtered back projection. The lowest dose group had image quality comparable to routine-dose images. Low-dose CTC with iterative reconstruction reduces the radiation dose by 48.5 to 75.1 % without image quality degradation compared to routine-dose CTC with filtered back projection. (orig.)

  15. Chest computed tomography using iterative reconstruction vs filtered back projection (Part 1): evaluation of image noise reduction in 32 patients

    Pontana, Francois; Pagniez, Julien; Faivre, Jean-Baptiste; Remy, Jacques [Univ. Lille Nord de France, Department of Thoracic Imaging Hospital Calmette (EA 2694), Lille (France); Flohr, Thomas [Siemens HealthCare, Computed Tomography Division, Forchheim (Germany); Duhamel, Alain [Univ. Lille Nord de France, Department of Medical Statistics, Lille (France); Remy-Jardin, Martine [Univ. Lille Nord de France, Department of Thoracic Imaging Hospital Calmette (EA 2694), Lille (France); Hospital Calmette, Department of Thoracic Imaging, Lille cedex (France)

    2011-03-15

    To assess noise reduction achievable with an iterative reconstruction algorithm. 32 consecutive chest CT angiograms were reconstructed with regular filtered back projection (FBP) (Group 1) and an iterative reconstruction technique (IRIS) with 3 (Group 2a) and 5 (Group 2b) iterations. Objective image noise was significantly reduced in Group 2a and Group 2b compared with FBP (p < 0.0001). There was a significant reduction in the level of subjective image noise in Group 2a compared with Group 1 images (p < 0.003), further reinforced on Group 2b images (Group 2b vs Group 1; p < 0.0001) (Group 2b vs Group 2a; p = 0.0006). The overall image quality scores significantly improved on Group 2a images compared with Group 1 images (p = 0.0081) and on Group 2b images compared with Group 2a images (p < 0.0001). Comparative analysis of individual CT features of mild lung infiltration showed improved conspicuity of ground glass attenuation (p < 0.0001), ill-defined micronodules (p = 0.0351) and emphysematous lesions (p < 0.0001) on Group 2a images, further improved on Group 2b images for ground glass attenuation (p < 0.0001), and emphysematous lesions (p = 0.0087). Compared with regular FBP, iterative reconstructions enable significant reduction of image noise without loss of diagnostic information, thus having the potential to decrease radiation dose during chest CT examinations. (orig.)

  16. Why do commercial CT scanners still employ traditional, filtered back-projection for image reconstruction?

    Despite major advances in x-ray sources, detector arrays, gantry mechanical design and especially computer performance, one component of computed tomography (CT) scanners has remained virtually constant for the past 25 years—the reconstruction algorithm. Fundamental advances have been made in the solution of inverse problems, especially tomographic reconstruction, but these works have not been translated into clinical and related practice. The reasons are not obvious and seldom discussed. This review seeks to examine the reasons for this discrepancy and provides recommendations on how it can be resolved. We take the example of the field of compressive sensing (CS), summarizing this new area of research from the eyes of practical medical physicists and explaining the disconnect between theoretical and application-oriented research. Using a few issues specific to CT, which engineers have addressed in very specific ways, we try to distill the mathematical problem underlying each of these issues, with the hope of demonstrating that there are interesting mathematical problems of general importance that can result from in-depth analysis of specific issues. We then sketch some unconventional CT-imaging designs that have the potential to have an impact on CT applications, if the link between applied mathematicians and engineers/physicists were stronger. Finally, we close with some observations on how the link could be strengthened. There is, we believe, an important opportunity to rapidly improve the performance of CT and related tomographic imaging techniques by addressing these issues. (topical review)

  17. Results of clinical receiver operating characteristic study comparing ordered subset expectation maximization and filtered back projection images in FDG PET studies of hepatocellular carcinoma

    Jeon, Tae Joo; Lee, Jong Doo; Kim, Hee Joung; Kim, Myung Jin; Yoo, Hyung Sik [College of Medicine, Yonsei Univ., Seoul (Korea, Republic of)

    2000-07-01

    The aim of this study was to validate the usefulness of ordered subset expectation maximization (OSEM) compared with filtered back projection (FBP) in terms of diagnostic ability for hepatocellular carcinoma (HCC). The data of fifty-seven patients with HCC and 62 patients with normal liver were reconstructed using both OSEM and FBP. The mean age of the patient group was 54.4 ± 1.5 years. All patients underwent whole-body and liver scans after injection of 10 mCi of (F-18)FDG using a dedicated whole-body PET camera (GE, Advance). PET images were interpreted by 3 observers with random exposure to normal and diseased cases. A receiver operating characteristic (ROC) study was used to validate the results. The areas under the ROC curves (Az) revealed statistically significant differences (p<0.05). In PET studies of patients with HCC, OSEM showed better results than conventional FBP in terms of lesion detectability.
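
    The ROC area Az used here is equivalent to the Mann-Whitney statistic, which can be computed directly from the observers' ratings; as a sketch:

```python
import numpy as np

def roc_auc(scores_pos, scores_neg):
    """Area under the ROC curve (Az) via the Mann-Whitney statistic: the
    probability that a randomly chosen diseased case scores higher than a
    randomly chosen normal case, with ties counting one half."""
    pos = np.asarray(scores_pos, float)[:, None]
    neg = np.asarray(scores_neg, float)[None, :]
    return np.mean((pos > neg) + 0.5 * (pos == neg))
```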

  18. Ultra low-dose chest CT using filtered back projection: Comparison of 80-, 100- and 120 kVp protocols in a prospective randomized study

    Khawaja, Ranish Deedar Ali, E-mail: rkhawaja@mgh.harvard.edu [Division of Thoracic Radiology, MGH Imaging, Massachusetts General Hospital and Harvard Medical School, Boston (United States); Singh, Sarabjeet [Division of Thoracic Radiology, MGH Imaging, Massachusetts General Hospital and Harvard Medical School, Boston (United States); Madan, Rachna [Division of Thoracic Radiology, Brigham and Women's Hospital and Harvard Medical School, Boston (United States); Sharma, Amita; Padole, Atul; Pourjabbar, Sarvenaz; Digumarthy, Subba; Shepard, Jo-Anne; Kalra, Mannudeep K. [Division of Thoracic Radiology, MGH Imaging, Massachusetts General Hospital and Harvard Medical School, Boston (United States)

    2014-10-15

    Highlights: • Filtered back projection technique enables acceptable image quality for chest CT examinations at 0.9 mGy (estimated effective dose of 0.5 mSv) for selected sizes of patients. • Lesion detection (such as solid non-calcified lung nodules) in lung parenchyma is optimal at 0.9 mGy, with limited visualization of thyroid nodules in FBP images. • Further dose reduction down to 0.4 mGy is possible for most patients undergoing follow-up chest CT for evaluation of larger lung nodules and GGOs. • Our results may help set the reference ALARA dose for chest CT examinations reconstructed with filtered back projection technique using the minimum possible radiation dose with acceptable image quality and lesion detection. - Abstract: Purpose: To assess lesion detection and diagnostic image quality of the filtered back projection (FBP) reconstruction technique in ultra low-dose chest CT examinations. Methods and materials: In this IRB-approved ongoing prospective clinical study, 116 CT image series at four different radiation doses were acquired for 29 patients (age, 57–87 years; F:M, 15:12; BMI 16–32 kg/m²). All patients provided written informed consent for the acquisition of additional ultra low-dose (ULD) series on a 256-slice MDCT (iCT, Philips Healthcare). In addition to their clinical standard-dose chest CT (SD, 120 kV, mean CTDIvol 6 ± 1 mGy), ULD-CT was subsequently performed at three dose levels (0.9 mGy [120 kV], 0.5 mGy [100 kV] and 0.2 mGy [80 kV]). Images were reconstructed with FBP (2.5 mm * 1.25 mm), resulting in four stacks: SD-FBP (reference standard), FBP0.9, FBP0.5, and FBP0.2. Four thoracic radiologists from two teaching hospitals independently evaluated the data for lesion detection and visibility of small structures. Friedman's non-parametric test with post hoc Dunn's test was used for data analysis. Results: Interobserver agreement was substantial between radiologists (k = 0.6–0.8). With

  19. Ultra low-dose chest CT using filtered back projection: Comparison of 80-, 100- and 120 kVp protocols in a prospective randomized study

    Highlights: • Filtered back projection technique enables acceptable image quality for chest CT examinations at 0.9 mGy (estimated effective dose of 0.5 mSv) for selected sizes of patients. • Lesion detection (such as solid non-calcified lung nodules) in lung parenchyma is optimal at 0.9 mGy, with limited visualization of thyroid nodules in FBP images. • Further dose reduction down to 0.4 mGy is possible for most patients undergoing follow-up chest CT for evaluation of larger lung nodules and GGOs. • Our results may help set the reference ALARA dose for chest CT examinations reconstructed with filtered back projection technique using the minimum possible radiation dose with acceptable image quality and lesion detection. - Abstract: Purpose: To assess lesion detection and diagnostic image quality of the filtered back projection (FBP) reconstruction technique in ultra low-dose chest CT examinations. Methods and materials: In this IRB-approved ongoing prospective clinical study, 116 CT image series at four different radiation doses were acquired for 29 patients (age, 57–87 years; F:M, 15:12; BMI 16–32 kg/m²). All patients provided written informed consent for the acquisition of additional ultra low-dose (ULD) series on a 256-slice MDCT (iCT, Philips Healthcare). In addition to their clinical standard-dose chest CT (SD, 120 kV, mean CTDIvol 6 ± 1 mGy), ULD-CT was subsequently performed at three dose levels (0.9 mGy [120 kV], 0.5 mGy [100 kV] and 0.2 mGy [80 kV]). Images were reconstructed with FBP (2.5 mm * 1.25 mm), resulting in four stacks: SD-FBP (reference standard), FBP0.9, FBP0.5, and FBP0.2. Four thoracic radiologists from two teaching hospitals independently evaluated the data for lesion detection and visibility of small structures. Friedman's non-parametric test with post hoc Dunn's test was used for data analysis. Results: Interobserver agreement was substantial between radiologists (k = 0.6–0.8). With pooled analysis, 146 pulmonary (27

  20. Comparison of applied dose and image quality in staging CT of neuroendocrine tumor patients using standard filtered back projection and adaptive statistical iterative reconstruction

    Böning, G., E-mail: georg.boening@charite.de [Department of Radiology, Charité, Humboldt-University Medical School, Charitéplatz 1, 10117 Berlin (Germany); Schäfer, M.; Grupp, U. [Department of Radiology, Charité, Humboldt-University Medical School, Charitéplatz 1, 10117 Berlin (Germany); Kaul, D. [Department of Radiation Oncology, Charité, Humboldt-University Medical School, Charitéplatz 1, 10117 Berlin (Germany); Kahn, J. [Department of Radiology, Charité, Humboldt-University Medical School, Charitéplatz 1, 10117 Berlin (Germany); Pavel, M. [Department of Gastroenterology, Charité, Humboldt-University Medical School, Charitéplatz 1, 10117 Berlin (Germany); Maurer, M.; Denecke, T.; Hamm, B.; Streitparth, F. [Department of Radiology, Charité, Humboldt-University Medical School, Charitéplatz 1, 10117 Berlin (Germany)

    2015-08-15

    Highlights: • Iterative reconstruction (IR) in staging CT provides objective image quality equal to filtered back projection (FBP). • IR delivers excellent subjective quality and reduces effective dose compared to FBP. • In patients with neuroendocrine tumor (NET) or other hypervascular abdominal tumors, IR can be used without sacrificing diagnostic confidence. - Abstract: Objective: To investigate whether dose reduction via adaptive statistical iterative reconstruction (ASIR) affects image quality and diagnostic accuracy in neuroendocrine tumor (NET) staging. Methods: A total of 28 NET patients were enrolled in the study. Inclusion criteria were histologically proven NET and visible tumor in abdominal computed tomography (CT). In an intraindividual study design, the patients underwent a baseline CT (filtered back projection, FBP) and follow-up CT (ASIR 40%) using matched scan parameters. Image quality was assessed subjectively using a 5-grade scoring system and objectively by determining signal-to-noise ratio (SNR) and contrast-to-noise ratios (CNRs). The applied volume computed tomography dose index (CTDIvol) of each scan was taken from the dose report. Results: ASIR 40% significantly reduced CTDIvol by 37.6% (10.17 ± 3.06 mGy [FBP], 6.34 ± 2.25 mGy [ASIR]; p < 0.001) and significantly increased CNRs (complete tumor-to-liver, 2.76 ± 1.87 [FBP], 3.2 ± 2.32 [ASIR], p < 0.05; complete tumor-to-muscle, 2.74 ± 2.67 [FBP], 4.31 ± 4.61 [ASIR], p < 0.05) compared to FBP. Subjective scoring revealed no significant changes for diagnostic confidence (5.0 ± 0 [FBP], 5.0 ± 0 [ASIR]), visibility of suspicious lesions (4.8 ± 0.5 [FBP], 4.8 ± 0.5 [ASIR]) and artifacts (5.0 ± 0 [FBP], 5.0 ± 0 [ASIR]). ASIR 40% significantly decreased scores for noise (4.3 ± 0.6 [FBP], 4.0 ± 0.8 [ASIR]) (p < 0.05), contrast (4.4 ± 0.6 [FBP], 4.1 ± 0.8 [ASIR]) (p < 0.001) and visibility of small structures (4.5 ± 0.7 [FBP], 4.3 ± 0.8 [ASIR]) (p < 0
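
The objective metrics used above (SNR and CNR) are simple region-of-interest statistics. A minimal sketch with hypothetical ROI values, not the study's data:

```python
import numpy as np

# Sketch of the objective image-quality metrics: SNR and CNR computed
# from region-of-interest (ROI) pixel statistics. ROI values here are
# illustrative, not taken from the study.

def snr(roi: np.ndarray) -> float:
    """Signal-to-noise ratio: mean ROI attenuation over its standard deviation."""
    return float(roi.mean() / roi.std())

def cnr(roi_lesion: np.ndarray, roi_background: np.ndarray) -> float:
    """Contrast-to-noise ratio: attenuation difference over background noise."""
    return float(abs(roi_lesion.mean() - roi_background.mean()) / roi_background.std())

rng = np.random.default_rng(0)
tumor = rng.normal(90.0, 10.0, 500)   # hypothetical tumor ROI (HU)
liver = rng.normal(60.0, 10.0, 500)   # hypothetical liver ROI (HU)
print(f"SNR {snr(liver):.1f}, CNR {cnr(tumor, liver):.1f}")
```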

  2. FDG-PET standardized uptake values in normal anatomical structures using iterative reconstruction segmented attenuation correction and filtered back-projection

    Filtered back-projection (FBP) is the most commonly used reconstruction method for PET images, which are usually noisy. The iterative reconstruction segmented attenuation correction (IRSAC) algorithm improves image quality without reducing image resolution. The standardized uptake value (SUV) is the most clinically utilized quantitative parameter of [fluorine-18]fluoro-2-deoxy-d-glucose (FDG) accumulation. The objective of this study was to obtain a table of SUVs for several normal anatomical structures from both routinely used FBP and IRSAC reconstructed images and to compare the data obtained with both methods. Twenty whole-body PET scans performed in consecutive patients with proven or suspected non-small cell lung cancer were retrospectively analyzed. Images were processed using both IRSAC and FBP algorithms. Nonquantitative or Gaussian filters were used to smooth the transmission scan when using FBP or IRSAC algorithms, respectively. A phantom study was performed to evaluate the effect of different filters on SUV. Maximum and average SUVs (SUVmax and SUVavg) were calculated in 28 normal anatomical structures and in one pathological site. The phantom study showed that the use of a nonquantitative smoothing filter in the transmission scan results in a less accurate quantification and in a 20% underestimation of the actual measurement. Most anatomical structures were identified in all patients using the IRSAC images. On average, SUVavg and SUVmax measured on IRSAC images using a Gaussian filter in the transmission scan were respectively 20% and 8% higher than the SUVs calculated from conventional FBP images. Scatterplots of the data values showed an overall strong relationship between IRSAC and FBP SUVs. Individual scatterplots of each site demonstrated a weaker relationship for lower SUVs and for SUVmax than for higher SUVs and SUVavg. A set of reference values was obtained for SUVmax and SUVavg of normal anatomical structures, calculated with both IRSAC and FBP.
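
The SUV referred to throughout is the body-weight-normalized uptake value. A minimal sketch of the standard definition, with illustrative numbers:

```python
# Sketch of the body-weight standardized uptake value (SUV) used to
# quantify FDG accumulation. Units follow the usual convention; the
# numbers below are illustrative, not the study's reference values.

def suv_bw(activity_kbq_per_ml: float, injected_mbq: float,
           body_weight_kg: float) -> float:
    """SUV = tissue activity concentration / (injected activity / body weight),
    with tissue density taken as 1 g/mL so the result is dimensionless."""
    return activity_kbq_per_ml * body_weight_kg / injected_mbq

# A liver ROI of 5 kBq/mL after injecting 370 MBq into a 74 kg patient:
print(suv_bw(5.0, 370.0, 74.0))  # → 1.0
```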

  3. Half-dose abdominal CT with sinogram-affirmed iterative reconstruction technique in children - comparison with full-dose CT with filtered back projection

    Iterative reconstruction can be helpful to reduce radiation dose while maintaining image quality. However, this technique has not been fully evaluated in children during abdominal CT. To compare objective and subjective image quality between half-dose images reconstructed with iterative reconstruction at iteration strength levels 1 to 5 (half-S1 to half-S5 studies) and full-dose images reconstructed with filtered back projection (full studies) in pediatric abdominal CT. Twenty-one children (M:F = 13:8; mean age 8.2 ± 5.7 years) underwent dual-source abdominal CT (mean effective dose 4.8 ± 2.1 mSv). Objective image quality was evaluated as image noise. Subjective image quality analysis was performed comparing each half study to the full study for noise, sharpness, artifact and diagnostic acceptability. Both objective and subjective image noise decreased with increasing iteration strength. Half-S4 and -S5 studies showed objective image noise similar to or lower than that of full studies. The half-S2 and -S3 studies produced the greatest sharpness, and the half-S5 studies were the worst owing to a blocky appearance. Full and half studies did not differ in artifacts. Half-S3 studies showed the best diagnostic acceptability. Half-S4 and -S5 studies objectively and half-S3 studies subjectively showed image quality comparable to full studies in pediatric abdominal CT. (orig.)

  4. Chest computed tomography using iterative reconstruction vs filtered back projection (Part 2): image quality of low-dose CT examinations in 80 patients

    Pontana, Francois; Pagniez, Julien; Faivre, Jean-Baptiste; Hachulla, Anne-Lise; Remy, Jacques [University Lille Nord de France, Department of Thoracic Imaging, Hospital Calmette (EA 2694), Lille (France); Duhamel, Alain [University Lille Nord de France, Department of Medical Statistics, Lille (France); Flohr, Thomas [Computed Tomography Division, Siemens Healthcare, Forchheim (Germany); Remy-Jardin, Martine [University Lille Nord de France, Department of Thoracic Imaging, Hospital Calmette (EA 2694), Lille (France); Hospital Calmette, Department of Thoracic Imaging, Lille cedex (France)

    2011-03-15

    To evaluate the image quality of an iterative reconstruction algorithm (IRIS) in low-dose chest CT in comparison with standard-dose filtered back projection (FBP) CT. Eighty consecutive patients referred for a follow-up CT examination of the chest underwent a low-dose CT examination (Group 2) under technical conditions similar to those of the initial examination (Group 1), except for the milliamperage selection and the replacement of regular FBP reconstruction by iterative reconstructions using three (Group 2a) and five (Group 2b) iterations. Despite a mean decrease of 35.5% in the dose-length product, there was no statistically significant difference between Group 2a and Group 1 in the objective noise, signal-to-noise (SNR) and contrast-to-noise (CNR) ratios and distribution of the overall image quality scores. Compared to Group 1, objective image noise in Group 2b was significantly reduced with increased SNR and CNR and a trend towards improved image quality. Iterative reconstructions using three iterations provide similar image quality compared with the conventionally used FBP reconstruction at 35% less dose, thus enabling dose reduction without loss of diagnostic information. According to our preliminary results, even higher dose reductions than 35% may be feasible by using more than three iterations. (orig.)

  5. Forward problem solution as the operator of filtered and back projection matrix to reconstruct the various method of data collection and the object element model in electrical impedance tomography

    Back projection reconstruction has been implemented to obtain dynamic images in electrical impedance tomography. However, implementations have so far been limited to the adjacent data-collection method and a circular object element model. The study aims to develop back projection into a reconstruction method with high speed, accuracy, and flexibility, which can be used with various methods of data collection and models of the object element. The proposed method uses the forward problem solution as the operator of the filtered and back projection matrix. This is done through a simulation study on several methods of data collection and various models of the object element. The results indicate that the developed method is capable of producing images quickly and accurately for reconstruction with the various methods of data collection and models of the object element.
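
The matrix form of this approach can be sketched as follows: the forward problem is linearized as v = A·x, and the back projection operator is built from the transpose of that forward (sensitivity) matrix. The tiny A below is an illustrative stand-in, not a real electrode/element model:

```python
import numpy as np

# Minimal sketch of matrix-form back projection for EIT: the forward
# problem is linearized as v = A @ x, where A (the sensitivity matrix
# from the forward solution) maps element conductivity changes to
# boundary voltage changes. Back projection applies A.T to the data.

A = np.array([[1.0, 0.2, 0.1],
              [0.2, 1.0, 0.2],
              [0.1, 0.2, 1.0],
              [0.3, 0.3, 0.3]])          # 4 measurements x 3 elements
x_true = np.array([0.0, 1.0, 0.0])       # conductivity change in element 1
v = A @ x_true                           # simulated boundary voltages

x_bp = A.T @ v                           # back-projected image
x_bp /= x_bp.max()                       # normalize for display
print(np.round(x_bp, 2))                 # element 1 dominates
```

Because A is derived from the forward solution rather than hard-coded for adjacent collection on a circle, the same machinery accepts any measurement pattern or element model that can be linearized this way.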

  6. Adaptive iterative dose reduction algorithm in CT: Effect on image quality compared with filtered back projection in body phantoms of different sizes

    Kim, Milim; Lee, Jeong Min; Son, Hyo Shin; Han, Joon Koo; Choi, Byung Ihn [College of Medicine, Seoul National University, Seoul (Korea, Republic of); Yoon, Jeong Hee; Choi, Jin Woo [Dept. of Radiology, Seoul National University Hospital, Seoul (Korea, Republic of)

    2014-04-15

    To evaluate the impact of the adaptive iterative dose reduction (AIDR) three-dimensional (3D) algorithm in CT on noise reduction and image quality compared to the filtered back projection (FBP) algorithm, and to compare the effectiveness of AIDR 3D on noise reduction according to the body habitus using phantoms of different sizes. Three different-sized phantoms with diameters of 24 cm, 30 cm, and 40 cm were built up using the American College of Radiology CT accreditation phantom and layers of pork belly fat. Each phantom was scanned eight times using different mAs. Images were reconstructed using FBP and three different strengths of AIDR 3D. The image noise, the contrast-to-noise ratio (CNR) and the signal-to-noise ratio (SNR) of the phantom were assessed. Two radiologists assessed the image quality of the 4 image sets in consensus. The effectiveness of AIDR 3D on noise reduction compared with FBP was also compared according to the phantom sizes. Adaptive iterative dose reduction 3D significantly reduced the image noise compared with FBP and enhanced the SNR and CNR (p < 0.05) with improved image quality (p < 0.05). When a stronger reconstruction algorithm was used, a greater increase of SNR and CNR as well as noise reduction was achieved (p < 0.05). The noise reduction effect of AIDR 3D was significantly greater in the 40-cm phantom than in the 24-cm or 30-cm phantoms (p < 0.05). The AIDR 3D algorithm effectively reduces image noise and improves image-quality parameters compared with the FBP algorithm, and its effectiveness may increase as the phantom size increases.

  7. Image quality of CT angiography with model-based iterative reconstruction in young children with congenital heart disease: comparison with filtered back projection and adaptive statistical iterative reconstruction.

    Son, Sung Sil; Choo, Ki Seok; Jeon, Ung Bae; Jeon, Gye Rok; Nam, Kyung Jin; Kim, Tae Un; Yeom, Jeong A; Hwang, Jae Yeon; Jeong, Dong Wook; Lim, Soo Jin

    2015-06-01

    To retrospectively evaluate the image quality of CT angiography (CTA) reconstructed by model-based iterative reconstruction (MBIR) and to compare this with images obtained by filtered back projection (FBP) and adaptive statistical iterative reconstruction (ASIR) in newborns and infants with congenital heart disease (CHD). Thirty-seven children (age 4.8 ± 3.7 months; weight 4.79 ± 0.47 kg) with suspected CHD underwent CTA on a 64-detector MDCT without ECG gating (80 kVp, 40 mA using tube current modulation). Total dose-length product was recorded in all patients. Images were reconstructed using FBP, ASIR, and MBIR. Objective image qualities (density, noise) were measured in the great vessels and heart chambers. The contrast-to-noise ratio (CNR) was calculated by measuring the density and noise of myocardial walls. Two radiologists evaluated images for subjective noise, diagnostic confidence, and sharpness at the level prior to the first branch of the main pulmonary artery. Images were compared with respect to reconstruction method, and reconstruction times were measured. Images from all patients were diagnostic, and the effective dose was 0.22 mSv. The objective image noise of MBIR was significantly lower than those of FBP and ASIR in the great vessels and heart chambers (P < 0.05). Mean CNR values were 8.73 for FBP, 14.54 for ASIR, and 22.95 for MBIR. In addition, the subjective image noise of MBIR was significantly lower than those of the other methods, with higher diagnostic confidence (P < 0.05), and mean reconstruction times were 5.1 ± 2.3 s for FBP and ASIR and 15.1 ± 2.4 min for MBIR. While CTA with MBIR in newborns and infants with CHD can reduce image noise and improve CNR more than the other methods, it is more time-consuming. PMID:25414055

  8. The comparison of ordered subset expectation maximization and filtered back projection technique for RBC blood pool SPECT in detection of liver hemangioma

    Jeon, Tae Joo; Kim, Hee Joung; Bong, Jung Kyun; Lee, Jong Doo [College of Medicine, Yonsei Univ., Seoul (Korea, Republic of)

    2000-07-01

    Ordered subset expectation maximization (OSEM) is a new iterative reconstruction technique for tomographic images that can reduce the reconstruction time compared with the conventional iteration method. We applied this method to RBC blood pool SPECT and tried to validate the usefulness of OSEM in the detection of liver hemangioma compared with filtered back projection (FBP). A 64-projection SPECT study was acquired over 360° by dual-head cameras after the injection of 750 MBq of 99mTc-RBC. OSEM was performed with various combinations of subsets (1, 2, 4, 8, 16 and 32) and iteration numbers (1, 2, 4, 8 and 16) to obtain the best set for lesion detection. OSEM was carried out in 17 lesions of 15 patients with liver hemangioma and compared with FBP images. Two nuclear medicine physicians reviewed these results independently. The best image set was obtained with 4 iterations and 16 subsets. In general, OSEM revealed more homogeneous images than FBP. Eighty-eight percent (15/17) of OSEM images were superior or equal to FBP in anatomic resolution. According to the blind review of images, 70.5% (12/17) of OSEM images were better in contrast (4/17), anatomic detail (4/17) or both (2/17). Two small lesions were detected by OSEM only, and another 2 small lesions were not depicted by either method. The remaining 3 lesions revealed no difference in image quality. OSEM can provide better image quality as well as better results in the detection of liver hemangioma than the conventional FBP technique.
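
The OSEM update that underlies this comparison can be sketched for a generic linear system model (the tiny system and subset split below are illustrative, not the SPECT geometry of the study):

```python
import numpy as np

# Sketch of the OSEM update, assuming a linear system model y = A @ x:
# each sub-iteration applies the multiplicative MLEM update using only
# one subset of projections, which is what gives OSEM its speed-up
# over plain EM. The tiny system below is illustrative.

def osem(A, y, n_subsets=2, n_iter=4):
    n_meas, n_vox = A.shape
    x = np.ones(n_vox)                              # non-negative start
    subsets = [np.arange(s, n_meas, n_subsets) for s in range(n_subsets)]
    for _ in range(n_iter):
        for rows in subsets:                        # one sub-iteration per subset
            As, ys = A[rows], y[rows]
            ratio = ys / np.maximum(As @ x, 1e-12)  # measured / estimated
            x *= (As.T @ ratio) / np.maximum(As.T @ np.ones(len(rows)), 1e-12)
    return x

A = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0], [1.0, 2.0]])
x_true = np.array([2.0, 3.0])
x = osem(A, A @ x_true, n_subsets=2, n_iter=20)
print(np.round(x, 2))  # converges toward [2. 3.]
```

With n_subsets = 1 this reduces to plain MLEM; using more subsets trades one full pass over the data for several cheaper updates per iteration, which is the reconstruction-time saving the abstract refers to.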

  9. Image quality and radiation dose of low dose coronary CT angiography in obese patients: Sinogram affirmed iterative reconstruction versus filtered back projection

    Purpose: To investigate the image quality and radiation dose of low radiation dose CT coronary angiography (CTCA) using sinogram affirmed iterative reconstruction (SAFIRE) compared with standard dose CTCA using filtered back-projection (FBP) in obese patients. Materials and methods: Seventy-eight consecutive obese patients were randomized into two groups and scanned using a prospectively ECG-triggered step-and-shoot (SAS) CTCA protocol on a dual-source CT scanner. Thirty-nine patients (protocol A) were examined using a routine radiation dose protocol at 120 kV and images were reconstructed with FBP (protocol A). Thirty-nine patients (protocol B) were examined using a low dose protocol at 100 kV and images were reconstructed with SAFIRE. Two blinded observers independently assessed the image quality of each coronary segment using a 4-point scale (1 = non-diagnostic, 4 = excellent) and measured the objective parameters image noise, signal-to-noise ratio (SNR), and contrast-to-noise ratio (CNR). Radiation dose was calculated. Results: The coronary artery image quality scores, image noise, SNR and CNR were not significantly different between protocols A and B (all p > 0.05), with image quality scores of 3.51 ± 0.70 versus 3.55 ± 0.47, respectively. The effective radiation dose was significantly lower in protocol B (4.41 ± 0.83 mSv) than in protocol A (8.83 ± 1.74 mSv, p < 0.01). Conclusion: Compared with standard dose CTCA using FBP, low dose CTCA using SAFIRE can maintain diagnostic image quality with a 50% reduction of radiation dose.

  10. Comparison of iterative model, hybrid iterative, and filtered back projection reconstruction techniques in low-dose brain CT: impact of thin-slice imaging

    Nakaura, Takeshi; Iyama, Yuji; Kidoh, Masafumi; Yokoyama, Koichi [Amakusa Medical Center, Diagnostic Radiology, Amakusa, Kumamoto (Japan); Kumamoto University, Department of Diagnostic Radiology, Graduate School of Medical Sciences, Kumamoto (Japan); Oda, Seitaro; Yamashita, Yasuyuki [Kumamoto University, Department of Diagnostic Radiology, Graduate School of Medical Sciences, Kumamoto (Japan); Tokuyasu, Shinichi [Philips Electronics, Kumamoto (Japan); Harada, Kazunori [Amakusa Medical Center, Department of Surgery, Kumamoto (Japan)

    2016-03-15

    The purpose of this study was to evaluate the utility of iterative model reconstruction (IMR) in brain CT especially with thin-slice images. This prospective study received institutional review board approval, and prior informed consent to participate was obtained from all patients. We enrolled 34 patients who underwent brain CT and reconstructed axial images with filtered back projection (FBP), hybrid iterative reconstruction (HIR) and IMR with 1 and 5 mm slice thicknesses. The CT number, image noise, contrast, and contrast noise ratio (CNR) between the thalamus and internal capsule, and the rate of increase of image noise in 1 and 5 mm thickness images between the reconstruction methods, were assessed. Two independent radiologists assessed image contrast, image noise, image sharpness, and overall image quality on a 4-point scale. The CNRs in 1 and 5 mm slice thickness were significantly higher with IMR (1.2 ± 0.6 and 2.2 ± 0.8, respectively) than with FBP (0.4 ± 0.3 and 1.0 ± 0.4, respectively) and HIR (0.5 ± 0.3 and 1.2 ± 0.4, respectively) (p < 0.01). The mean rate of increasing noise from 5 to 1 mm thickness images was significantly lower with IMR (1.7 ± 0.3) than with FBP (2.3 ± 0.3) and HIR (2.3 ± 0.4) (p < 0.01). There were no significant differences in qualitative analysis of unfamiliar image texture between the reconstruction techniques. IMR offers significant noise reduction and higher contrast and CNR in brain CT, especially for thin-slice images, when compared to FBP and HIR. (orig.)

  13. Impact of hybrid iterative reconstruction on Agatston coronary artery calcium scores in comparison to filtered back projection in native cardiac CT

    To investigate whether the effects of hybrid iterative reconstruction (HIR) on coronary artery calcium (CAC) measurements using the Agatston score lead to changes in assignment of patients to cardiovascular risk groups compared to filtered back projection (FBP). 68 patients (mean age 61.5 years; 48 male; 20 female) underwent prospectively ECG-gated, non-enhanced, cardiac 256-MSCT for coronary calcium scoring. Scanning parameters were as follows: tube voltage, 120 kV; mean tube current-time product, 63.67 mAs (50–150 mAs); collimation, 2 x 128 x 0.625 mm. Images were reconstructed with FBP and with HIR at all levels (L1 to L7). Two independent readers measured Agatston scores of all reconstructions and assigned patients to cardiovascular risk groups. Scores of HIR and FBP reconstructions were correlated (Spearman). Interobserver agreement and variability was assessed with k-statistics and Bland-Altman plots. Agatston scores of HIR reconstructions were closely correlated with FBP reconstructions (L1, R = 0.9996; L2, R = 0.9995; L3, R = 0.9991; L4, R = 0.986; L5, R = 0.9986; L6, R = 0.9987; and L7, R = 0.9986). In comparison to FBP, HIR reduced Agatston scores to between 97% (L1) and 87.4% (L7) of the FBP values. Using HIR iterations L1-L3, all patients were assigned to the same risk groups as after FBP reconstruction. In 5.4% of patients the risk group after HIR with the maximum iteration level differed from the group after FBP reconstruction. There was an excellent correlation of Agatston scores after HIR and FBP, with identical risk group assignment at levels 1-3 for all patients. Hence it appears that the application of HIR in routine calcium scoring does not entail any disadvantages. Thus, future studies are needed to demonstrate whether HIR is a reliable method for reducing radiation dose in coronary calcium scoring.
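
The Agatston score being compared here is, per lesion and slice, the calcified plaque area weighted by a peak-attenuation factor. A minimal sketch that omits the usual minimum-lesion-area criterion; the mask and pixel spacing are illustrative:

```python
import numpy as np

# Sketch of the Agatston score computed per calcified lesion and slice:
# plaque area (mm^2) weighted by a factor based on peak attenuation
# (1: 130-199 HU, 2: 200-299, 3: 300-399, 4: >= 400). The minimum
# lesion-area criterion of the clinical score is omitted here, and the
# lesion mask and pixel spacing are illustrative.

def density_factor(peak_hu: float) -> int:
    if peak_hu >= 400: return 4
    if peak_hu >= 300: return 3
    if peak_hu >= 200: return 2
    if peak_hu >= 130: return 1
    return 0

def agatston(lesions, pixel_area_mm2: float) -> float:
    """lesions: list of 2-D HU arrays, one per lesion and slice."""
    score = 0.0
    for roi in lesions:
        mask = roi >= 130                       # calcium threshold, HU
        area = mask.sum() * pixel_area_mm2
        score += area * density_factor(roi[mask].max()) if mask.any() else 0.0
    return score

lesion = np.array([[135.0, 210.0], [90.0, 410.0]])  # one hypothetical lesion
print(agatston([lesion], pixel_area_mm2=0.25))      # → 3.0
```

The step-function weighting explains why noise-reducing reconstructions such as HIR shift scores: pixels near the 130 HU threshold drop out of the mask, lowering the score relative to FBP.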

  14. Evaluation of iterative reconstruction (OSEM) versus filtered back-projection for the assessment of myocardial glucose uptake and myocardial perfusion using dynamic PET

    Iterative reconstruction methods based on ordered-subset expectation maximisation (OSEM) have replaced filtered backprojection (FBP) in many clinical settings owing to their superior image quality. Whether OSEM is as accurate as FBP in quantitative positron emission tomography (PET) is uncertain. We compared the accuracy of OSEM and FBP for regional myocardial 18F-FDG uptake and 13NH3 perfusion measurements in cardiac PET. Ten healthy volunteers were studied. Five underwent dynamic 18F-FDG PET during hyperinsulinaemic-euglycaemic clamp, and five underwent 13NH3 perfusion measurement during rest and adenosine-induced hyperaemia. Images were reconstructed using FBP and OSEM ± an 8-mm Gaussian post-reconstruction filter. Filtered and unfiltered images showed agreement between the reconstruction methods within ±2SD in Bland-Altman plots of Ki values. The use of a Gaussian filter resulted in a systematic 11% underestimation of Ki in the filtered images. The mean deviation between the reconstruction methods for both unfiltered and filtered images was 1.3%. Agreement within ±2SD between the methods was demonstrated for perfusion rate constants up to 2.5 min-1, corresponding to a perfusion of 3.4 ml g-1 min-1. The mean deviation between the two methods for unfiltered data was 2.7%, and for filtered data, 5.3%. The 18F-FDG uptake rate constants showed excellent agreement between the two reconstruction methods. In the perfusion range up to 3.4 ml g-1 min-1, agreement between 13NH3 perfusion obtained with OSEM and FBP was acceptable. The use of OSEM for measurement of perfusion values higher than 3.4 ml g-1 min-1 requires further evaluation. (orig.)
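
The Bland-Altman analysis used for the Ki comparison reduces to a mean difference (bias) and limits of agreement at bias ± 2 SD. A minimal sketch with hypothetical paired Ki values:

```python
import numpy as np

# Sketch of the Bland-Altman agreement analysis used to compare Ki
# values from the two reconstructions: bias is the mean difference and
# the limits of agreement are bias +/- 2 SD. Paired values below are
# illustrative, not the study's data.

def bland_altman(a: np.ndarray, b: np.ndarray):
    diff = a - b
    bias = diff.mean()
    sd = diff.std(ddof=1)                 # sample SD of the differences
    return bias, (bias - 2 * sd, bias + 2 * sd)

fbp  = np.array([0.030, 0.041, 0.052, 0.048, 0.037])   # hypothetical Ki, FBP
osem = np.array([0.031, 0.040, 0.051, 0.049, 0.036])   # hypothetical Ki, OSEM
bias, (lo, hi) = bland_altman(osem, fbp)
print(f"bias {bias:.4f}, limits of agreement ({lo:.4f}, {hi:.4f})")
```

"Agreement within ±2SD" in the abstract means that the paired differences fall inside these limits.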

  15. Computed tomography of the cervical spine: comparison of image quality between a standard-dose and a low-dose protocol using filtered back-projection and iterative reconstruction

    To compare image quality of a standard-dose (SD) and a low-dose (LD) cervical spine CT protocol using filtered back-projection (FBP) and iterative reconstruction (IR). Forty patients investigated by cervical spine CT were prospectively randomised into two groups: SD (120 kVp, 275 mAs) and LD (120 kVp, 150 mAs), both applying automatic tube current modulation. Data were reconstructed using both FBP and sinogram-affirmed IR. Image noise, signal-to-noise (SNR) and contrast-to-noise (CNR) ratios were measured. Two radiologists independently and blindly assessed the following anatomical structures at C3-C4 and C6-C7 levels, using a four-point scale: intervertebral disc, content of neural foramina and dural sac, ligaments, soft tissues and vertebrae. They subsequently rated overall image quality using a ten-point scale. For both protocols and at each disc level, IR significantly decreased image noise and increased SNR and CNR, compared with FBP. SNR and CNR were statistically equivalent in LD-IR and SD-FBP protocols. Regardless of the dose and disc level, the qualitative scores with IR compared with FBP, and with LD-IR compared with SD-FBP, were significantly higher or not statistically different for intervertebral discs, neural foramina and ligaments, while significantly lower or not statistically different for soft tissues and vertebrae. The overall image quality scores were significantly higher with IR compared with FBP, and with LD-IR compared with SD-FBP. LD-IR cervical spine CT provides better image quality for intervertebral discs, neural foramina and ligaments, and worse image quality for soft tissues and vertebrae, compared with SD-FBP, while reducing radiation dose by approximately 40 %. (orig.)

  16. Iterative reconstruction technique vs filtered back projection: utility for quantitative bronchial assessment on low-dose thin-section MDCT in patients with/without chronic obstructive pulmonary disease

    Koyama, Hisanobu; Seki, Shinichiro; Sugimura, Kazuro [Kobe University Graduate School of Medicine, Division of Radiology, Department of Radiology, Kobe, Hyogo (Japan); Ohno, Yoshiharu; Nishio, Mizuho; Matsumoto, Sumiaki; Yoshikawa, Takeshi [Kobe University Graduate School of Medicine, Advanced Biomedical Imaging Research Centre, Kobe (Japan); Kobe University Graduate School of Medicine, Division of Functional and Diagnostic Imaging Research, Department of Radiology, Kobe (Japan); Sugihara, Naoki [Toshiba Medical Systems Corporation, Ohtawara, Tochigi (Japan)

    2014-08-15

    The aim of this study was to evaluate the utility of the iterative reconstruction (IR) technique for quantitative bronchial assessment during low-dose computed tomography (CT) as a substitute for standard-dose CT in patients with/without chronic obstructive pulmonary disease. Fifty patients (mean age, 69.2; mean % predicted FEV1, 79.4) underwent standard-dose CT (150 mAs) and low-dose CT (25 mAs). Except for tube current, the imaging parameters were identical for both protocols. Standard-dose CT was reconstructed using filtered back-projection (FBP), and low-dose CT was reconstructed using IR and FBP. For quantitative bronchial assessment, the wall area percentage (WA%) of the sub-segmental bronchi and the airway luminal volume percentage (LV%) from the main bronchus to the peripheral bronchi were acquired in each dataset. The correlation and agreement of WA% and LV% between standard-dose CT and both low-dose CTs were statistically evaluated. WA% and LV% between standard-dose CT and both low-dose CTs were significantly correlated (r > 0.77, p < 0.00001); however, only the LV% agreement between standard-dose CT and low-dose CT reconstructed with IR was moderate (concordance correlation coefficient = 0.93); the other agreements were poor (concordance correlation coefficient < 0.90). Quantitative bronchial assessment via low-dose CT has potential as a substitute for standard-dose CT by using IR and airway luminal volumetry techniques. • Quantitative bronchial assessment of COPD using low-dose CT is possible. (orig.)
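
    The agreement statistic used above, the concordance correlation coefficient, combines precision and accuracy of paired measurements; a minimal numpy sketch of Lin's formulation (the function name is illustrative, and population variances are assumed):

```python
import numpy as np

def concordance_cc(x, y):
    """Lin's concordance correlation coefficient between paired measurements."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    mx, my = x.mean(), y.mean()
    vx, vy = x.var(), y.var()           # population variances
    cov = ((x - mx) * (y - my)).mean()  # population covariance
    return 2.0 * cov / (vx + vy + (mx - my) ** 2)
```

    Identical series give a coefficient of 1; any location or scale shift pulls the coefficient below the Pearson correlation.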

  17. Image quality of low mA CT pulmonary angiography reconstructed with model based iterative reconstruction versus standard CT pulmonary angiography reconstructed with filtered back projection: an equivalency trial

    To determine whether CT pulmonary angiography (CTPA) using a low mA setting reconstructed with model-based iterative reconstruction (MBIR) is equivalent to routine CTPA reconstructed with filtered back projection (FBP). This prospective study was approved by the institutional review board and patients provided written informed consent. Eighty-two patients were examined with low mA MBIR-CTPA (100 kV, 20 mA) and 82 patients with standard FBP-CTPA (100 kV, 250 mA). Regions of interest were drawn in nine pulmonary vessels; signal-to-noise ratio (SNR) and contrast-to-noise ratio (CNR) were calculated. A five-point scale was used to subjectively evaluate the image quality of FBP-CTPA and low mA MBIR-CTPA. Compared to routine FBP-CTPA, low mA MBIR-CTPA showed no differences in the attenuation measured in the nine pulmonary vessels, higher SNR (56 ± 19 vs 43 ± 20, p < 0.0001) and higher CNR (50 ± 17 vs 38 ± 18, p < 0.0001), despite a dose reduction of 93 % (p < 0.0001). The subjective image quality of low mA MBIR-CTPA was rated as diagnostic in 98 % of the cases for patients with a body mass index of less than 30 kg/m2. Low mA MBIR-CTPA is equivalent to routine FBP-CTPA and allows a significant dose reduction while improving SNR and CNR in the pulmonary vessels, as compared with routine FBP-CTPA. (orig.)
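
    The SNR and CNR figures reported in these protocols are conventionally derived from region-of-interest (ROI) statistics; a minimal sketch assuming the common definitions SNR = mean/SD and CNR = (mean_ROI - mean_background)/SD_background (the exact definitions used in the study are not stated):

```python
import numpy as np

def roi_snr(roi):
    """Signal-to-noise ratio of an ROI: mean attenuation over its standard deviation."""
    roi = np.asarray(roi, float)
    return roi.mean() / roi.std()

def roi_cnr(roi, background):
    """Contrast-to-noise ratio: ROI/background contrast over background noise."""
    roi, background = np.asarray(roi, float), np.asarray(background, float)
    return (roi.mean() - background.mean()) / background.std()
```

    In practice the ROI arrays would be pixel values sampled from the vessel lumen and adjacent tissue on the reconstructed images.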

  19. Clinical evaluation of image quality and radiation dose reduction in upper abdominal computed tomography using model-based iterative reconstruction; comparison with filtered back projection and adaptive statistical iterative reconstruction

    Highlights: • MBIR significantly improves objective image quality. • MBIR reduces the radiation dose by 87.5% without increasing objective image noise. • A half dose will be needed to maintain the subjective image quality. - Abstract: Purpose: To evaluate the image quality of upper abdominal CT images reconstructed with model-based iterative reconstruction (MBIR) in comparison with filtered back projection (FBP) and adaptive statistical iterative reconstruction (ASIR) on scans acquired with various radiation exposure dose protocols. Materials and methods: This prospective study was approved by our institutional review board, and informed consent was obtained from all 90 patients who underwent both control-dose (CD) and reduced-dose (RD) CT of the upper abdomen (unenhanced: n = 45, contrast-enhanced: n = 45). The RD scan protocol was randomly selected from three protocols; Protocol A: 12.5% dose, Protocol B: 25% dose, Protocol C: 50% dose. Objective image noise, signal-to-noise (SNR) ratio for the liver parenchyma, visual image score and lesion conspicuity were compared among CD images of FBP and RD images of FBP, ASIR and MBIR. Results: RD images of MBIR yielded significantly lower objective image noise and higher SNR compared with RD images of FBP and ASIR for all protocols (P < .01) and CD images of FBP for Protocol C (P < .05). Although the subjective image quality of RD images of MBIR was almost acceptable for Protocol C, it was inferior to that of CD images of FBP for Protocols A and B (P < .0083). The conspicuity of the small lesions in RD images of MBIR tended to be superior to that in RD images of FBP and ASIR and inferior to that in CD images for Protocols A and B, although the differences were not significant (P > .0083). Conclusion: Although 12.5%-dose MBIR images (mean size-specific dose estimates [SSDE] of 1.13 mGy) yielded objective image noise and SNR comparable to CD-FBP images, at least a 50% dose (mean SSDE of 4.63 mGy) would be needed to

  20. Impact of hybrid iterative reconstruction on Agatston coronary artery calcium scores in comparison to filtered back projection in native cardiac CT; Einfluss der hybriden iterativen Rekonstruktion bei der nativen CT des Herzens auf die Agatston-Kalziumscores der Koronararterien

    Obmann, V.C.; Heverhagen, J.T. [Inselspital - University Hospital Bern (Switzerland). University Inst. for Diagnostic, Interventional and Pediatric Radiology; Klink, T. [Wuerzburg Univ. (Germany). Inst. of Diagnostic and Interventional Radiology; Stork, A.; Begemann, P.G.C. [Roentgeninstitut Duesseldorf, Duesseldorf (Germany); Laqmani, A.; Adam, G. [University Medical Center Hamburg-Eppendorf, Hamburg (Germany). Dept. of Diagnostic and Interventional Radiology

    2015-05-15

    To investigate whether the effects of hybrid iterative reconstruction (HIR) on coronary artery calcium (CAC) measurements using the Agatston score lead to changes in the assignment of patients to cardiovascular risk groups compared to filtered back projection (FBP). 68 patients (mean age 61.5 years; 48 male; 20 female) underwent prospectively ECG-gated, non-enhanced, cardiac 256-MSCT for coronary calcium scoring. Scanning parameters were as follows: tube voltage, 120 kV; mean tube current time-product, 63.67 mAs (50 - 150 mAs); collimation, 2 x 128 x 0.625 mm. Images were reconstructed with FBP and with HIR at all levels (L1 to L7). Two independent readers measured Agatston scores of all reconstructions and assigned patients to cardiovascular risk groups. Scores of HIR and FBP reconstructions were correlated (Spearman). Interobserver agreement and variability were assessed with κ statistics and Bland-Altman plots. Agatston scores of HIR reconstructions were closely correlated with FBP reconstructions (L1, R = 0.9996; L2, R = 0.9995; L3, R = 0.9991; L4, R = 0.986; L5, R = 0.9986; L6, R = 0.9987; and L7, R = 0.9986). In comparison to FBP, HIR led to reduced Agatston scores, ranging between 97 % (L1) and 87.4 % (L7) of the FBP values. Using HIR iterations L1-L3, all patients were assigned to the same risk groups as after FBP reconstruction. In 5.4 % of patients the risk group after HIR with the maximum iteration level differed from the group after FBP reconstruction. There was an excellent correlation of Agatston scores after HIR and FBP, with identical risk group assignment at levels 1 - 3 for all patients. Hence it appears that the application of HIR in routine calcium scoring does not entail any disadvantages. Future studies are needed to demonstrate whether HIR is a reliable method for reducing radiation dose in coronary calcium scoring.
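
    Risk-group assignment from an Agatston score, as performed by the readers above, is a simple thresholding step; a sketch using the conventional 0 / 1-10 / 11-100 / 101-400 / >400 categories (the study's actual group boundaries are not stated, so these thresholds are an assumption):

```python
def agatston_risk_group(score):
    """Map an Agatston coronary calcium score to a conventional risk category."""
    if score == 0:
        return "no identifiable plaque"
    if score <= 10:
        return "minimal"
    if score <= 100:
        return "mild"
    if score <= 400:
        return "moderate"
    return "severe"
```

    Because HIR systematically lowers the measured score, a value near one of these thresholds can cross into a lower category, which is the reclassification effect the study quantifies.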

  1. Near-Field Tomographic Imaging Based on a Confocal Circular Filtered Back-Projection Algorithm

    田八林; 杨恒; 罗永健; 李克昭; 陈瑶

    2014-01-01

    Tomographic synthetic aperture radar (SAR) can achieve 3D imaging with a simple algorithm and a small computational load, but it is mainly suited to far-field conditions; moreover, improving resolution by increasing the number of baselines greatly increases system complexity. To address these problems, a near-field tomographic imaging technique using confocal circular SAR on a partial-circle or curved orbit is proposed. The techniques of synthetic aperture and confocal imaging are combined to achieve space-time confocal imaging: by modifying the mapping relation, the partial-circle filtered back-projection algorithm usually used in two-dimensional imaging is extended to confocal tomography, focusing at different target height layers. Digital simulation and laboratory test results show that the proposed algorithm overcomes altitude ambiguity and can extract quasi-3D imaging information from a three-dimensional target scene.

  2. Feasible Dose Reduction in Routine Chest Computed Tomography Maintaining Constant Image Quality Using the Last Three Scanner Generations: From Filtered Back Projection to Sinogram-affirmed Iterative Reconstruction and Impact of the Novel Fully Integrated Detector Design Minimizing Electronic Noise

    Lukas Ebner

    2014-01-01

    Full Text Available Objective: The aim of the present study was to evaluate dose reduction in contrast-enhanced chest computed tomography (CT) by comparing the three latest generations of Siemens CT scanners used in clinical practice. We analyzed the amount of radiation used with filtered back projection (FBP) and an iterative reconstruction (IR) algorithm to yield the same image quality. Furthermore, the influence on radiation dose of the most recent integrated circuit detector (ICD; Stellar detector, Siemens Healthcare, Erlangen, Germany) was investigated. Materials and Methods: 136 patients were included. Scan parameters were set to a thorax routine: SOMATOM Sensation 64 (FBP), SOMATOM Definition Flash (IR), and SOMATOM Definition Edge (ICD and IR). Tube current was set constantly to the reference level of 100 mAs with automated tube current modulation using reference milliamperes. CARE kV was used on the Flash and Edge scanners, while tube potential was individually selected between 100 and 140 kVp by the medical technologists at the SOMATOM Sensation. Quality assessment was performed on soft-tissue kernel reconstructions. Dose was represented by the dose-length product (DLP). Results: DLP with FBP for the average chest CT was 308 mGycm ± 99.6. In contrast, the DLP for chest CT with the IR algorithm was 196.8 mGycm ± 68.8 (P = 0.0001). A further decline in dose can be noted with IR and the ICD: DLP 166.4 mGycm ± 54.5 (P = 0.033). The dose reduction compared to FBP was 36.1% with IR and 45.6% with IR/ICD. Signal-to-noise ratio (SNR) was favorable in the aorta, bone, and soft tissue for IR/ICD in combination compared to FBP (P values ranged from 0.003 to 0.048). Overall contrast-to-noise ratio (CNR) improved with declining DLP. Conclusion: The most recent technical developments, namely IR in combination with integrated circuit detectors, can significantly lower radiation dose in chest CT examinations.

  3. Non-traditional Machining Techniques for Fabricating Metal Aerospace Filters

    Wang Wei; Zhu Di; D.M. Allen; H.J.A. Almond

    2008-01-01

    Thanks to recent advances in manufacturing technology, aerospace system designers have many more options to fabricate high-quality, low-weight, high-capacity, cost-effective filters. Aside from traditional methods such as stamping, drilling and milling, many new approaches have been widely used in filter-manufacturing practice on account of their increased processing capabilities. However, restrictions on cost, the need to operate under stricter conditions such as in aggressive fluids, complexity of design, the workability of materials, and other factors have made it difficult to choose a satisfactory method from among the newly developed processes, such as photochemical machining (PCM), photo electroforming (PEF) and laser beam machining (LBM), to produce small, inexpensive, lightweight aerospace filters. This article appraises the technical and economic viability of PCM, PEF, and LBM to help engineers choose the most suitable approach for producing aerospace filters.

  4. Reconstruction of CT images by the Bayes- back projection method

    Haruyama, M; Takase, M; Tobita, H

    2002-01-01

    In the course of research on quantitative assay for non-destructive measurement of radioactive waste, we have developed a unique program based on Bayesian theory for the reconstruction of transmission computed tomography (TCT) images. Reconstruction of cross-section images in CT technology usually employs the Filtered Back Projection method. The new image reconstruction program reported here is based on the Bayesian Back Projection method, and it has a function of iteratively improving the image at every step of the measurement. Namely, this method has the capability of promptly displaying a cross-section image corresponding to each angled projection datum from every measurement. Hence, it is possible to observe an improved cross-section view reflecting each projection datum in almost real time. From its basic theory, the Bayesian Back Projection method can be applied to CT systems of the 1st, 2nd, and 3rd generations. This report deals with a reconstruction program of cross-section images in the CT of ...

  5. Image reconstruction of simulated specimens using convolution back projection

    Mohd. Farhan Manzoor

    2001-04-01

    Full Text Available This paper reports on the reconstruction of cross-sections of composite structures. The convolution back projection (CBP) algorithm has been used to capture the attenuation field over the specimen. Five different test cases, representing varying degrees of complexity, have been taken up for evaluation. In addition, the role of filters in the nature of the reconstruction errors has also been discussed. Numerical results obtained in the study reveal that the CBP algorithm is a useful tool for qualitative as well as quantitative assessment of composite regions encountered in engineering applications.
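
    The convolution/filtered back projection procedure underlying several of the records above (ramp-filter each projection, then smear each filtered view back across the image grid) can be sketched in numpy; parallel-beam geometry and nearest-neighbour sampling are simplifying assumptions, and the toy forward projector is included only to make the sketch self-contained:

```python
import numpy as np

def radon(img, angles):
    """Toy parallel-beam forward projector: rotate the sampling grid, sum columns."""
    n = img.shape[0]
    c = (n - 1) / 2.0
    ys, xs = np.mgrid[0:n, 0:n]
    sino = np.empty((len(angles), n))
    for k, a in enumerate(angles):
        ca, sa = np.cos(a), np.sin(a)
        xr = ca * (xs - c) - sa * (ys - c) + c   # rotated sample coordinates
        yr = sa * (xs - c) + ca * (ys - c) + c
        xi = np.clip(np.round(xr).astype(int), 0, n - 1)
        yi = np.clip(np.round(yr).astype(int), 0, n - 1)
        sino[k] = img[yi, xi].sum(axis=0)        # line integrals for this view
    return sino

def fbp(sino, angles):
    """Filtered back projection: ramp-filter each view, then smear it back."""
    n = sino.shape[1]
    c = (n - 1) / 2.0
    ramp = np.abs(np.fft.fftfreq(n))             # ramp filter |f| in Fourier domain
    filtered = np.real(np.fft.ifft(np.fft.fft(sino, axis=1) * ramp, axis=1))
    ys, xs = np.mgrid[0:n, 0:n]
    recon = np.zeros((n, n))
    for a, proj in zip(angles, filtered):
        # detector coordinate t of each pixel for this view
        t = np.cos(a) * (xs - c) + np.sin(a) * (ys - c) + c
        ti = np.clip(np.round(t).astype(int), 0, n - 1)
        recon += proj[ti]
    return recon * np.pi / len(angles)
```

    The choice of filter window applied on top of the ramp is what controls the trade-off between sharpness and reconstruction error discussed in the abstract.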

  6. Traditional Medicine Through the Filter of Modernity: A brief historical analysis

    R. Rabarihoela Razafimandimby

    2014-12-01

    Full Text Available Traditional medicines still prevail in the current Malagasy context. A careful historical analysis shows, however, that Malagasy traditional medicine was screened through many filters before being accepted in a global context. Traditional medicine in its authentic form was more or less rejected with the advent of modern medicine, although not without reaction. This paper retraces the historical encounter between the modern and the traditional to determine the extent to which traditional medicine is acknowledged and used in the currently prevailing modern, rational and scientific global context.

  7. Complete Localization of HVDC Back-to-Back Project Realized

    Yu Xinqiang; Liang Xuming; Wang Zuli; Ye Qing

    2006-01-01

    The first completely localized DC back-to-back project for asynchronous interconnection between Northwest and Central China plays an important role in realizing national power grid interconnection, spurring indigenous manufacturing industries and promoting the localization of DC transmission equipment. Insisting on the principle of autonomous innovation, this project relied on domestic forces in every aspect, from engineering organization, system design, equipment completion, engineering design, equipment manufacturing and procurement to construction and commissioning. Through strict quality control, intermediate supervision, and acceptance testing and assessment, the project has been proven to be up to the world's advanced level.

  8. Acceleration of iterative tomographic image reconstruction by reference-based back projection

    Cheng, Chang-Chieh; Li, Ping-Hui; Ching, Yu-Tai

    2016-03-01

    The purpose of this paper is to design and implement an efficient iterative reconstruction algorithm for computed tomography. We accelerate the reconstruction speed of the algebraic reconstruction technique (ART), an iterative reconstruction method, by using the result of filtered backprojection (FBP), a widely used analytical reconstruction algorithm, as the initial guess for the first iteration and as the reference for each back projection stage, respectively. Both improvements reduce the error between the forward projection of each iteration and the measurements. We use three quantitative measures, root-mean-square error (RMSE), peak signal-to-noise ratio (PSNR), and structural content (SC), to show that our method can reduce the number of iterations by more than half and that the quality of the result is better than that of the original ART.
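
    The core of the ART method referenced above is the Kaczmarz row-action update, and the described speed-up from an FBP seed corresponds to passing a good initial guess. A toy sketch on a small linear system, where `x0` stands in for the FBP result (all names and the test system are illustrative):

```python
import numpy as np

def art(A, b, x0, sweeps=10, relax=1.0):
    """Kaczmarz / ART: sweep the rows of A, projecting x onto each hyperplane a_i.x = b_i."""
    x = x0.astype(float).copy()
    for _ in range(sweeps):
        for ai, bi in zip(A, b):
            # move x by the scaled residual of row i along the row direction
            x += relax * (bi - ai @ x) / (ai @ ai) * ai
    return x
```

    Starting closer to the solution, e.g. from an FBP reconstruction rather than zeros, lowers the residual reached within a fixed number of sweeps, which is the acceleration the abstract describes.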

  9. Camera calibration based on the back projection process

    Gu, Feifei; Zhao, Hong; Ma, Yueyang; Bu, Penghui

    2015-12-01

    Camera calibration plays a crucial role in 3D measurement tasks of machine vision. In typical calibration processes, camera parameters are iteratively optimized in the forward imaging process (FIP). However, the results can only guarantee the minimum of 2D projection errors on the image plane, but not the minimum of 3D reconstruction errors. In this paper, we propose a universal method for camera calibration, which uses the back projection process (BPP). In our method, a forward projection model is used to obtain initial intrinsic and extrinsic parameters with a popular planar checkerboard pattern. Then, the extracted image points are projected back into 3D space and compared with the ideal point coordinates. Finally, the estimation of the camera parameters is refined by a non-linear function minimization process. The proposed method can obtain a more accurate calibration result, which is more physically useful. Simulation and practical data are given to demonstrate the accuracy of the proposed method.

  11. Locations and focal mechanisms of deep long period events beneath Aleutian Arc volcanoes using back projection methods

    Lough, A. C.; Roman, D. C.; Haney, M. M.

    2015-12-01

    Deep long period (DLP) earthquakes are commonly observed in volcanic settings such as the Aleutian Arc in Alaska. DLPs are poorly understood but are thought to be associated with movements of fluids, such as magma or hydrothermal fluids, deep in the volcanic plumbing system. These events have been recognized for several decades, but few studies have gone beyond their identification and location. All long period events are more difficult to identify and locate than volcano-tectonic (VT) earthquakes because traditional detection schemes focus on high-frequency (short period) energy. In addition, DLPs present analytical challenges because they tend to be emergent, so it is difficult to accurately pick the onset of arriving body waves. We now expect to find DLPs at most volcanic centers; the challenge lies in identification and location. We aim to reduce the element of human error in location by applying back projection to better constrain the depth and horizontal position of these events. Power et al. (2004) provided the first compilation of DLP activity in the Aleutian Arc. This study focuses on the reanalysis of 162 cataloged DLPs beneath 11 volcanoes in the Aleutian arc (we expect to ultimately identify and reanalyze more DLPs). We are currently adapting the approach of Haney (2014) for volcanic tremor to use back projection over a 4D grid to determine the position and origin time of DLPs. This method holds great potential in that it will allow automated, high-accuracy picking of arrival times and could reduce the number of arrival-time picks necessary for traditional location schemes to well constrain event origins. Back projection can also calculate a relative focal mechanism (difficult with traditional methods due to the emergent nature of DLPs), allowing the first in-depth analysis of source properties. Our event catalog (spanning over 25 years and 11 volcanoes) is one of the longest and largest and enables us to investigate spatial and temporal variation in DLPs.

  12. Imaging Seismic Source Variations Using Back-Projection Methods at El Tatio Geyser Field, Northern Chile

    Kelly, C. L.; Lawrence, J. F.

    2014-12-01

    During October 2012, 51 geophones and 6 broadband seismometers were deployed in an ~50x50m region surrounding a periodically erupting columnar geyser in the El Tatio Geyser Field, Chile. The dense array served as the seismic framework for a collaborative project to study the mechanics of complex hydrothermal systems. Contemporaneously, complementary geophysical measurements (including down-hole temperature and pressure, discharge rates, thermal imaging, water chemistry, and video) were also collected. Located on the western flanks of the Andes Mountains at an elevation of 4200m, El Tatio is the third largest geyser field in the world. Its non-pristine condition makes it an ideal location to perform minimally invasive geophysical studies. The El Jefe Geyser was chosen for its easily accessible conduit and extremely periodic eruption cycle (~120s). During approximately 2 weeks of continuous recording, we recorded ~2500 nighttime eruptions which lack cultural noise from tourism. With ample data, we aim to study how the source varies spatially and temporally during each phase of the geyser's eruption cycle. We are developing a new back-projection processing technique to improve source imaging for diffuse signals. Our method was previously applied to the Sierra Negra Volcano system, which also exhibits repeating harmonic and diffuse seismic sources. We back-project correlated seismic signals from the receivers back to their sources, assuming linear source to receiver paths and a known velocity model (obtained from ambient noise tomography). We apply polarization filters to isolate individual and concurrent geyser energy associated with P and S phases. We generate 4D, time-lapse images of the geyser source field that illustrate how the source distribution changes through the eruption cycle. We compare images for pre-eruption, co-eruption, post-eruption and quiescent periods. We use our images to assess eruption mechanics in the system (i.e. top-down vs. bottom-up) and

  13. SAR focusing of P-band ice sounding data using back-projection

    Kusk, Anders; Dall, Jørgen

    2010-01-01

    accommodated at the expense of computation time. The back-projection algorithm can easily be parallelized, however, and can advantageously be implemented on a graphics processing unit (GPU). Results from using the back-projection algorithm on POLARIS ice sounder data from North Greenland show that the quality of the data is improved by the processing, and the performance of the GPU implementation allows for very fast focusing.

  14. Super-resolution Reconstruction Algorithm Based on Patch Similarity and Back-projection Modification

    Wei-long Chen; Li Guo; Wu He; Wei Wu; Xiao-min Yang

    2014-01-01

    We propose an effective super-resolution reconstruction algorithm based on patch similarity and back-projection modification. In the proposed algorithm, we assume patches to be self-similar in natural images and extract the high-frequency information from the best similar patch to add into the goal high-resolution image. In the process of reconstruction, the high-resolution patch is back-projected into the low-resolution patch so as to gain detailed modification. Experiments performed on simulated low-r...

  15. Image Resolution Enhancement by Using Interpolation Followed by Iterative Back Projection

    Rasti, Pejman; Hasan DEMIREL; Anbarjafari, Gholamreza

    2016-01-01

    In this paper, we propose a new super-resolution technique based on interpolation followed by registration using iterative back projection (IBP). Low-resolution images are interpolated, and the interpolated images are then registered in order to generate a sharper high-resolution image. The proposed technique has been tested on Lena, Elaine, Pepper, and Baboon. The quantitative peak signal-to-noise ratio (PSNR) and structural similarity index (SSIM) results as well as the v...
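
    The interpolation-plus-IBP pipeline described above can be sketched in one dimension: upsample the low-resolution input as an initial guess, then repeatedly back-project the residual between the observed and simulated low-resolution signals. Here 2x decimation by pair-averaging is an assumed acquisition model and registration is omitted; all names are illustrative:

```python
import numpy as np

def downsample(hr):
    # simulate LR acquisition: average each pair of samples (2x decimation)
    return hr.reshape(-1, 2).mean(axis=1)

def upsample(lr):
    # simple back-projection of an LR signal/residual: repeat each sample twice
    return np.repeat(lr, 2)

def iterative_back_projection(lr, iters=50, step=1.0):
    hr = upsample(lr).astype(float)    # initial guess: interpolated LR signal
    for _ in range(iters):
        err = lr - downsample(hr)      # residual in the LR domain
        hr += step * upsample(err)     # back-project residual into the HR estimate
    return hr
```

    At convergence the high-resolution estimate reproduces the observed low-resolution data exactly under the assumed decimation model; real IBP adds a blur kernel and sub-pixel registration between multiple LR frames.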

  16. Comparison of back projection methods of determining earthquake rupture process in time and frequency domains

    Wang, W.; Wen, L.

    2013-12-01

    Back projection is a method to project the seismic energy recorded by a seismic array back to the earthquake source region and determine the rupture process of a large earthquake. The method takes advantage of the coherence of seismic energy across a seismic array and is quick in determining some important properties of the earthquake source. The method can be performed in both the time and frequency domains. In the time domain, the most conventional procedure is beam forming with some measures of suppressing the noise, such as Nth-root stacking, etc. In the frequency domain, the multiple signal classification (MUSIC) method estimates the direction of arrival of multiple waves propagating through an array using a subspace method. The advantage of this method is the ability to study rupture properties at various frequencies and to resolve simultaneous arrivals, making it suitable for detecting bilateral rupture of an earthquake source. We present a comparison of back projection results for some large earthquakes between the methods in the time and frequency domains. The time-domain procedure produces an image that is smeared and exhibits some artifacts, although enhanced stacking methods can to some extent alleviate the problem. On the other hand, the MUSIC method resolves clear multiple arrivals and provides higher resolution of rupture imaging.
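
    The Nth-root stacking mentioned above as a noise-suppression measure in time-domain beam forming compresses trace amplitudes before averaging, so sign-incoherent noise cancels much more strongly than in a linear stack; a minimal sketch for traces already aligned on the candidate source point:

```python
import numpy as np

def nth_root_stack(traces, n=4):
    """Nth-root stack of aligned traces: compress amplitudes, average, re-expand."""
    traces = np.asarray(traces, float)
    compressed = np.sign(traces) * np.abs(traces) ** (1.0 / n)
    stack = compressed.mean(axis=0)
    return np.sign(stack) * np.abs(stack) ** n
```

    Coherent arrivals with consistent polarity survive the compress-average-expand cycle, while arrivals whose signs disagree across the array are driven toward zero.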

  17. Imaging spatial and temporal seismic source variations at Sierra Negra Volcano, Galapagos Islands using back-projection methods

    Kelly, C. L.; Lawrence, J. F.; Ebinger, C. J.

    2013-12-01

    Low-magnitude seismic signals generated by processes that characterize volcanic and hydrothermal systems and their plumbing networks are difficult to observe remotely. Seismic records from these systems tend to be extremely 'noisy', making it difficult to resolve 3D subsurface structures using traditional seismic methods. Easily identifiable high-amplitude bursts within the noise that might be suitable for use with traditional seismic methods (i.e. eruptions) tend to occur relatively infrequently compared to the length of an entire eruptive cycle. Furthermore, while these impulsive events might help constrain the dynamics of a particular eruption, they shed little insight into the mechanisms that occur throughout an entire eruption sequence. It has been shown, however, that the much more abundant low-amplitude seismic 'noise' in these records (i.e. volcanic or geyser 'tremor') actually represents a series of overlapping low-magnitude displacements that can be directly linked to magma, fluid, and volatile movement at depth. This 'noisy' data therefore likely contains valuable information about the processes occurring in the volcanic or hydrothermal system before, during and after eruption events. In this study, we present a new method to comprehensively study how the seismic source distribution of all events, including micro-events, evolves during different phases of the eruption sequence of Sierra Negra Volcano in the Galapagos Islands. We apply a back-projection search algorithm to image sources of seismic 'noise' at Sierra Negra Volcano during a proposed intrusion event. By analyzing

  18. Tradition, tradition

    Rockman, Howard A.

    2012-01-01

    Starting with this issue, the Editorial duties for the JCI move to Duke University and the University of North Carolina at Chapel Hill. As we begin our five-year tenure at the helm of this prestigious journal, the tradition of excellence that these two schools typically display on the basketball court now enters the editorial boardroom. PMID:22378046

  19. External force back-projective composition and globally deformable optimization for 3-D coronary artery reconstruction

    The clinical value of the 3D reconstruction of a coronary artery is important for the diagnosis and intervention of cardiovascular diseases. This work proposes a method based on a deformable model for reconstructing coronary arteries from two monoplane angiographic images acquired from different angles. First, an external force back-projective composition model is developed to determine the external force, for which the force distributions in different views are back-projected to the 3D space and composited in the same coordinate system based on the perspective projection principle of x-ray imaging. The elasticity and bending forces are composited as an internal force to maintain the smoothness of the deformable curve. Second, the deformable curve evolves rapidly toward the true vascular centerlines in 3D space and angiographic images under the combination of internal and external forces. Third, densely matched correspondence among vessel centerlines is constructed using a curve alignment method. The bundle adjustment method is then utilized for the global optimization of the projection parameters and the 3D structures. The proposed method is validated on phantom data and routine angiographic images with consideration for space and re-projection image errors. Experimental results demonstrate the effectiveness and robustness of the proposed method for the reconstruction of coronary arteries from two monoplane angiographic images. The proposed method can achieve a mean space error of 0.564 mm and a mean re-projection error of 0.349 mm. (paper)

  20. A new linear back projection algorithm to electrical tomography based on measuring data decomposition

    As an advanced measurement technique that is non-radiative, non-intrusive, fast-responding, and low-cost, electrical tomography (ET) has developed rapidly in recent decades, and the imaging algorithm plays an important role in the ET imaging process. Linear back projection (LBP) is the most widely used ET algorithm because of its dynamic imaging capability, real-time response, and easy implementation, but its spatial resolution is low owing to the inherent ‘soft field’ effect and the ill-posedness of the inverse problem, which greatly limits its range of application. In this paper, an original data decomposition method is proposed: every ET measurement is decomposed into two independent new data values based on the positive and negative sensing areas of that measurement. Consequently, the total number of measurements is doubled, which effectively reduces the ill-posedness. In addition, an index to quantify the ‘soft field’ effect is proposed. The index shows that the decomposed data can distinguish the different contributions of individual units (pixels) to any ET measurement and can efficiently reduce the ‘soft field’ effect in the imaging process. Based on the data decomposition method, a new linear back projection algorithm is proposed to improve the spatial resolution of the ET image. A series of simulations and experiments validate the proposed algorithm in terms of real-time performance and improved spatial resolution. (paper)
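
    The classic LBP step that this record builds on amounts to a normalized transpose operation on the sensitivity matrix. A minimal sketch (with a made-up 3x4 sensitivity matrix; this illustrates plain LBP, not the paper's decomposition scheme):

```python
import numpy as np

def linear_back_projection(sensitivity, measurements):
    """Classic normalized LBP: g = S^T m / S^T 1.

    sensitivity  : (M, N) array, row i = sensing map of measurement i
    measurements : (M,) array of normalized measurements
    """
    num = sensitivity.T @ measurements
    den = sensitivity.T @ np.ones(len(measurements))
    return num / np.maximum(den, 1e-12)  # avoid division by zero

# toy example: 3 measurements over 4 pixels
S = np.array([[1.0, 1.0, 0.0, 0.0],
              [0.0, 1.0, 1.0, 0.0],
              [0.0, 0.0, 1.0, 1.0]])
m = np.array([1.0, 0.0, 0.0])
g = linear_back_projection(S, m)   # smeared image, pixel 0 brightest
```

    The paper's decomposition would split each row of S into its positive and negative sensing parts, doubling M before this step.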

  1. Imaging the ruptures of the 2009 Samoan and Sumatran earthquakes using broadband network back-projections: Results and limitations

    Hutko, A. R.; Lay, T.; Koper, K. D.

    2009-12-01

    Applications of teleseismic P-wave back-projection to image gross characteristics of large earthquake finite-source ruptures have been enabled by ready availability of large digital data sets. Imaging with short-period data from dense arrays or broadband data from global networks can place constraints on rupture attributes that otherwise have to be treated parametrically in conventional modeling and inversion procedures. Back-projection imaging may constrain choice of fault plane and rupture direction, velocity, duration and length for large (M>~8.0) earthquakes, and can robustly locate early aftershocks embedded in mainshock surface waves. Back-projection methods seek locations of coherent energy release from the source region, ideally associated with down-going P wave energy. For shallow events, depth phase arrivals can produce artifacts in back-projection images that appear as secondary or even prominent features with incorrect apparent source locations and times, and such effects need to be recognized. We apply broadband P-wave back-projection imaging to the 29 September 2009 Samoa (Mw8.2) and 30 September 2009 Sumatra (Mw7.6) earthquakes using data from globally distributed broadband stations and compare results to back-projections of synthetic seismograms from finite-source models for these events to evaluate the artifacts from depth phases. Back-projection images for the great normal-faulting Samoa event feature two prominent bright spots, which could be interpreted to correspond to two distinct slip patches, one near the epicenter in the outer trench slope and the other approximately 80 km to the west near the plate boundary megathrust where many aftershocks occurred. This interpretation is at odds with finite-fault modeling results, which indicate a predominantly bilateral rupture in the NW-SE direction on a steeply dipping trench slope fault, with rupture extending about 60 km in each direction. 
Back-projections of data and synthetic seismograms from the
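
    The shift-and-stack operation at the heart of network back-projection can be sketched in one dimension (constant velocity, impulse-like waveforms, a hypothetical geometry; purely illustrative of the principle, not the authors' processing):

```python
import numpy as np

def back_project(waveforms, station_x, grid_x, velocity, dt):
    """Shift-and-stack back-projection on a 1-D grid of candidate sources.

    waveforms : (n_sta, n_t) station records
    station_x : (n_sta,) station positions
    grid_x    : candidate source positions
    Returns peak stacked energy per grid point; coherent alignment at the
    true source location produces the brightest spot.
    """
    n_sta, n_t = waveforms.shape
    energy = np.zeros(len(grid_x))
    for g, x in enumerate(grid_x):
        stack = np.zeros(n_t)
        for s in range(n_sta):
            # predicted travel time -> sample shift to undo propagation
            shift = int(round(abs(station_x[s] - x) / velocity / dt))
            if shift >= n_t:
                continue
            stack[:n_t - shift] += waveforms[s, shift:]
        energy[g] = np.max(stack ** 2)
    return energy
```

    Depth-phase artifacts of the kind discussed above arise because pP/sP arrivals also stack coherently, but at a shifted apparent location and time.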

  2. A comparative study between matched and mis-matched projection/back projection pairs used with ASIRT reconstruction method

    Algebraic reconstruction techniques require both forward and back projection operators, and the ability to perform accurate reconstruction relies fundamentally on these operators, which are usually the transpose of each other. Even though mismatched pairs may introduce additional errors during the iterative process, the usefulness of mismatched projector/back-projector pairs in image reconstruction has been demonstrated. This work investigates the performance of matched and mismatched reconstruction pairs, using popular forward projectors and their transposes, in reconstruction tasks with additive simultaneous iterative reconstruction techniques (ASIRT) in a parallel-beam geometry. Simulated noiseless phantoms are used to compare the investigated pairs in terms of the root mean squared error (RMSE), calculated between reconstructed slices and the reference in different regions. Results show that mismatched projection/back projection pairs can yield more accurate reconstructed images than matched ones, and the performance of the forward projector appears independent of the choice of back projector and vice versa.
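
    The iteration being compared can be sketched generically: passing B = A.T gives the matched pair, while any other back projector B gives a mismatched one (step size and iteration count here are illustrative, not taken from the paper):

```python
import numpy as np

def sirt(A, B, b, n_iter=150, lam=0.1):
    """Additive SIRT-style iteration  x <- x + lam * B (b - A x).

    A : forward projector (M, N);  B : back projector (N, M)
    b : measured projections (M,)
    With B = A.T this is the classical matched-pair update; substituting a
    different B models a mismatched projector/back-projector pair.
    """
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        x = x + lam * (B @ (b - A @ x))
    return x
```

    For convergence of the matched case, lam must be below 2 over the largest eigenvalue of A.T @ A.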

  3. A fast GPU-based approach to branchless distance-driven projection and back-projection in cone beam CT

    Schlifske, Daniel; Medeiros, Henry

    2016-03-01

    Modern CT image reconstruction algorithms rely on projection and back-projection operations to refine an image estimate in iterative image reconstruction. A widely used state-of-the-art technique is distance-driven projection and back-projection; while it yields superior image quality in iterative algorithms, it is computationally demanding, which limits the relevance of these algorithms in clinical settings. A few methods have been proposed to accelerate the distance-driven technique on modern computer hardware. This paper explores a two-dimensional extension of the branchless method proposed by Samit Basu and Bruno De Man. The extension is named "pre-integration" because it achieves a significant performance boost by integrating the data before the projection and back-projection operations. It was written with Nvidia's CUDA platform and carefully designed for massively parallel GPUs. The performance and the image quality of the pre-integration method were analyzed: both projection and back-projection are significantly faster with pre-integration, and images produced by regularized, iterative cone-beam reconstruction algorithms within Jeffrey Fessler's Image Reconstruction Toolbox show no significant degradation in quality.
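
    The pre-integration trick can be illustrated in one dimension: a prefix integral of the detector signal turns every boxcar footprint integral into two interpolated lookups, with no per-sample loop or data-dependent branching (a sketch of the idea only, not the authors' CUDA code):

```python
import numpy as np

def preintegrate(signal):
    """Prefix integral C with C[0] = 0, C[k] = sum(signal[:k])."""
    return np.concatenate(([0.0], np.cumsum(signal)))

def box_integral(C, a, b):
    """Integral of the piecewise-constant signal over [a, b], obtained as
    C(b) - C(a) with linear interpolation into the prefix integral.
    This replaces the overlap accumulation of distance-driven projection
    with two lookups per footprint edge -- the branchless core idea."""
    idx = np.arange(len(C))
    return np.interp(b, idx, C) - np.interp(a, idx, C)
```

    The 2-D extension applies the same idea along both detector axes before the projection/back-projection passes.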

  4. Back-projection source reconstruction in the presence of point scatterers

    Solimene, Raffaele; Cuccaro, Antonio; Pierri, Rocco

    2016-06-01

    Inverse source and inverse scattering problems can benefit from multipath due to a scattering environment; at the same time, multipath can be a source of artefacts in the reconstructions. In this paper the aim is to understand when and how multipath manifests its positive or negative effects. To this end, a simple scenario is considered. The problem is cast within a 2D scalar setting where multipath is assumed to be due to known ‘extra’ point-like scatterers. To simplify the study, the inverse source problem is addressed, since its modelling operator has fewer terms than that of inverse scattering. A back-projection inversion method based on the adjoint of the radiation operator is exploited. This allows the model resolution kernel, i.e. the point spread function, to be computed easily, with its dominant contributions determined by stationary phase arguments. The role played by the point scatterers, and how they contribute to an improvement of the achievable resolution, is highlighted.

  5. Sinogram bow-tie filtering in FBP PET reconstruction

    Abella, Mónica; Vaquero, Juan José; Soto-Montenegro, M. L.; Lage, E.; Desco, Manuel

    2009-01-01

    Low-pass filtering of sinograms in the radial direction is the most common way to limit noise amplification in filtered back projection (FBP) reconstruction of positron emission tomography studies. Other filtering strategies have been proposed to prevent the loss of resolution caused by low-pass radial filters, although results have been diverse. Using the well-known properties of the Fourier transform of a sinogram, the authors defined a binary mask that matches the expected shape of the sup...
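
    The bow-tie support in the 2-D Fourier domain of a sinogram that this record alludes to can be sketched as a binary mask. This is an illustrative construction with a tunable slope and margin; the authors' exact mask shape is not reproduced here:

```python
import numpy as np

def bowtie_mask(n_angles, n_bins, slope, margin=2):
    """Binary bow-tie mask for the 2-D DFT of a sinogram (angles x bins).

    Keeps coefficients with |n| <= slope * |w| + margin, where n is the
    angular-harmonic index and w the radial-frequency index: the region
    where the energy of a sinogram of a bounded object concentrates.
    The slope depends on the object radius relative to the field of view.
    """
    n = np.fft.fftfreq(n_angles) * n_angles   # angular harmonics
    w = np.fft.fftfreq(n_bins) * n_bins       # radial frequencies
    N, W = np.meshgrid(n, w, indexing="ij")
    return (np.abs(N) <= slope * np.abs(W) + margin).astype(float)
```

    The mask would be applied by multiplying it with np.fft.fft2(sinogram) and inverting, zeroing coefficients outside the bow-tie where (ideally) only noise lives.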

  6. A rapid parallelization of cone-beam projection and back-projection operator based on texture fetching interpolation

    Xie, Lizhe; Hu, Yining; Chen, Yang; Shi, Luyao

    2015-03-01

    Projection and back-projection are the most computationally expensive parts of Computed Tomography (CT) reconstruction, and parallelization strategies based on GPU computing have been introduced to accelerate them. In this paper we present a new parallelization scheme for both projection and back-projection, based on NVIDIA's CUDA technology. Instead of building a complex model, we aimed to optimize the existing algorithm and make it suitable for CUDA implementation so as to achieve high computation speed. Besides using texture fetching, which accelerates interpolation, we fixed the number of samples in the projection computation to keep blocks and threads synchronized, thus preventing the latency caused by inconsistent computational complexity. Experimental results demonstrate the computational efficiency and imaging quality of the proposed method.

  7. Mitigating artifacts in back-projection source imaging with implications for frequency-dependent properties of the Tohoku-Oki earthquake

    Meng, Lingsen; Ampuero, Jean-Paul; Luo, Yingdi; Wu, Wenbo; Ni, Sidao

    2012-12-01

    Comparing teleseismic array back-projection source images of the 2011 Tohoku-Oki earthquake with results from static and kinematic finite source inversions has revealed little overlap between the regions of high- and low-frequency slip. Motivated by this interesting observation, back-projection studies extended to intermediate frequencies, down to about 0.1 Hz, have suggested that a progressive transition of rupture properties as a function of frequency is observable. Here, by adapting the concept of array response function to non-stationary signals, we demonstrate that the "swimming artifact", a systematic drift resulting from signal non-stationarity, induces significant bias on beamforming back-projection at low frequencies. We introduce a "reference window strategy" into the multitaper-MUSIC back-projection technique and significantly mitigate the "swimming artifact" at high frequencies (1 s to 4 s). At lower frequencies, this modification yields notable, but significantly smaller, artifacts than time-domain stacking. We perform extensive synthetic tests that include a 3D regional velocity model for Japan. We analyze the recordings of the Tohoku-Oki earthquake at the USArray and at the European array at periods from 1 s to 16 s. The migration of the source location as a function of period, regardless of the back-projection methods, has characteristics that are consistent with the expected effect of the "swimming artifact". In particular, the apparent up-dip migration as a function of frequency obtained with the USArray can be explained by the "swimming artifact". This indicates that the most substantial frequency-dependence of the Tohoku-Oki earthquake source occurs at periods longer than 16 s. Thus, low-frequency back-projection needs to be further tested and validated in order to contribute to the characterization of frequency-dependent rupture properties.

  8. Images of gravitational and magnetic phenomena derived from two-dimensional back-projection Doppler tomography of interacting binary stars

    Richards, Mercedes T.; Cocking, Alexander S.; Fisher, John G.; Conover, Marshall J., E-mail: mrichards@astro.psu.edu, E-mail: asc5097@psu.edu [Department of Astronomy and Astrophysics, Pennsylvania State University, University Park, PA 16802 (United States)

    2014-11-10

    We have used two-dimensional back-projection Doppler tomography as a tool to examine the influence of gravitational and magnetic phenomena in interacting binaries that undergo mass transfer from a magnetically active star onto a non-magnetic main-sequence star. This multitiered study of over 1300 time-resolved spectra of 13 Algol binaries involved calculations of the predicted dynamical behavior of the gravitational flow and the dynamics at the impact site, analysis of the velocity images constructed from tomography, and the influence on the tomograms of orbital inclination, systemic velocity, orbital coverage, and shadowing. The Hα tomograms revealed eight sources: chromospheric emission, a gas stream along the gravitational trajectory, a star-stream impact region, a bulge of absorption or emission around the mass-gaining star, a Keplerian accretion disk, an absorption zone associated with hotter gas, a disk-stream impact region, and a hot spot where the stream strikes the edge of a disk. We described several methods used to extract the physical properties of the emission sources directly from the velocity images, including S-wave analysis, the creation of simulated velocity tomograms from hydrodynamic simulations, and the use of synthetic spectra with tomography to sequentially extract the separate sources of emission from the velocity image. In summary, the tomography images have revealed results that cannot be explained solely by gravitational effects: chromospheric emission moving with the mass-losing star, a gas stream deflected from the gravitational trajectory, and alternating behavior between stream state and disk state. Our results demonstrate that magnetic effects cannot be ignored in these interacting binaries.

  10. Digital Filters for Low Frequency Equalization

    Tyril, Marni; Abildgaard, J.; Rubak, Per

    2001-01-01

    Digital filters with high resolution in the low-frequency range are studied. Specifically, for a given computational power, traditional IIR filters are compared with warped FIR filters, warped IIR filters, and modified warped FIR filters termed warped individual z FIR filters (WizFIR). The results...

  11. Implicit Kalman filtering

    Skliar, M.; Ramirez, W. F.

    1997-01-01

    For an implicitly defined discrete system, a new algorithm for Kalman filtering is developed and an efficient numerical implementation scheme is proposed. Unlike the traditional explicit approach, the implicit filter can be readily applied to ill-conditioned systems and allows for generalization to descriptor systems. The implementation of the implicit filter depends on the solution of the congruence matrix equation A1 Px A1^T = Py. We develop a general iterative method for the solution of this equation, and prove necessary and sufficient conditions for convergence. It is shown that when the system matrices of an implicit system are sparse, the implicit Kalman filter requires significantly less computer time and storage to implement than the traditional explicit Kalman filter. Simulation results are presented to illustrate and substantiate the theoretical developments.
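
    For contrast with the implicit formulation, here is a minimal sketch of the traditional explicit discrete-time Kalman filter (one predict/update cycle) that the abstract uses as its baseline; all matrices are assumed given explicitly:

```python
import numpy as np

def kalman_step(x, P, A, Q, H, R, z):
    """One predict/update cycle of the explicit discrete Kalman filter.

    x, P : prior state estimate and covariance
    A, Q : state transition matrix and process noise covariance
    H, R : observation matrix and measurement noise covariance
    z    : measurement
    """
    # predict
    x_p = A @ x
    P_p = A @ P @ A.T + Q
    # update
    S = H @ P_p @ H.T + R                  # innovation covariance
    K = P_p @ H.T @ np.linalg.inv(S)       # Kalman gain
    x_n = x_p + K @ (z - H @ x_p)
    P_n = (np.eye(len(x)) - K @ H) @ P_p
    return x_n, P_n
```

    The implicit formulation avoids forming these explicit system matrices, which is where the sparsity savings claimed above come from.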

  12. Generalized Nonlinear Complementary Attitude Filter

    Jensen, Kenneth

    2011-01-01

    This work describes a family of attitude estimators that are based on a generalization of Mahony's nonlinear complementary filter. This generalization reveals the close mathematical relationship between the nonlinear complementary filter and the more traditional multiplicative extended Kalman filter. In fact, the bias-free and constant gain multiplicative continuous-time extended Kalman filters may be interpreted as special cases of the generalized attitude estimator. The correspondence provi...

  13. A back projection dosimetry method for diagnostic and orthovoltage x-ray from 40 to 140 kVp for patients and phantoms

    Afrashteh, Hossein

    2005-07-01

    Patient dosimetry in practice involves time-consuming, tedious calculations during the measurement process, so there is a need for a straightforward yet accurate method. A back projection dosimetry method for patients/phantoms was developed using the Entrance Surface Dose (ESD) and its corresponding Exit Surface Dose together with an average attenuation coefficient, the mean effective attenuation coefficient (mu'). The method focuses on low-energy x-ray units (40--140 kVp), primarily for conventional diagnostic radiography and low-energy radiation therapy procedures; the assumption is that it may also be used for similar modalities within the same energy range, e.g., fluoroscopy, where skin injuries have been common in the past, or mammography, where radiation carcinogenesis has been a matter of concern. A new Gafchromic film, XR-QA, was assessed as a precision dosimeter for use with this algorithm. Because the dose range seen in conventional radiography exams is in most cases not high enough to sufficiently activate the sensitive layer of this film, the measured net Optical Density (OD) changes were not substantial; therefore, a conventional, relatively low-speed dental film, DF58 Ultra, was used. Various thicknesses of acrylic, a tissue-equivalent material, were used with the algorithm. When compared with other sources and reference data, the results of the developed mathematical algorithm are in reasonable agreement. The developed method is straightforward and within the acceptable accuracy range; it is effective and may be applied to individual body parts or fetal regions, depending on the clinical practice and interests.
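
    The back-projection idea reduces to a single-exponential attenuation model (a hypothetical simplification of the kind the abstract describes; the paper's full algorithm is not reproduced): estimate mu' from the measured entrance and exit doses, then back-project the dose to any depth.

```python
import math

def effective_mu(entrance_dose, exit_dose, thickness_cm):
    """Mean effective attenuation coefficient mu' (1/cm) inferred from the
    Entrance Surface Dose and Exit Surface Dose across a known thickness,
    assuming D(x) = ESD * exp(-mu' x)."""
    return math.log(entrance_dose / exit_dose) / thickness_cm

def dose_at_depth(entrance_dose, mu_eff, depth_cm):
    """Back-projected dose at depth x under the same exponential model."""
    return entrance_dose * math.exp(-mu_eff * depth_cm)
```

    For example, an ESD of 10 units and an exit dose of 2.5 units across 10 cm give mu' = ln(4)/10 per cm, and a mid-thickness dose of exactly 5 units.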

  14. Using an ISAPI Filter to Implement Automatic Web-Page Conversion Between Simplified and Traditional Chinese Characters

    张震; 张曾科

    2001-01-01

    This paper studies the display and transmission of Chinese characters on the web and puts forward a new server-side method for converting between simplified and traditional Chinese character encodings. By using an ISAPI filter, homepages stored in a single Chinese encoding can be translated automatically to support browsers on different Chinese systems, without any software or language packages installed on the clients. The idea is implemented with the ISAPI filter of Internet Information Server 4.0 under Windows NT.

  15. Microwave Filters

    Zhou, Jiafeng

    2010-01-01

    The general theory of microwave filter design based on lumped-element circuit is described in this chapter. The lowpass prototype filters with Butterworth, Chebyshev and quasielliptic characteristics are synthesized, and the prototype filters are then transformed to bandpass filters by lowpass to bandpass frequency mapping. By using immitance inverters ( J - or K -inverters), the bandpass filters can be realized by the same type of resonators. One design example is given to verify the theory ...

  16. Improved resolution and reduced clutter in ultra-wideband microwave imaging using cross-correlated back projection: experimental and numerical results.

    Jacobsen, S; Birkelund, Y

    2010-01-01

    Microwave breast cancer detection is based on the dielectric contrast between healthy and malignant tissue. This radar-based imaging method involves illumination of the breast with an ultra-wideband pulse, with detection of tumors within the breast achieved by a selected focusing technique. Image formation algorithms are tailored to enhance tumor responses and to reduce early-time and late-time clutter associated with skin reflections and heterogeneity of breast tissue. In this contribution, we evaluate the performance of the so-called cross-correlated back projection imaging scheme by using a scanning system in phantom experiments; supplementary numerical modeling based on commercial software is also presented. The phantom is synthetically scanned with a broadband elliptical antenna in a mono-static configuration. The signals are pre-processed by a data-adaptive RLS algorithm to remove artifacts caused by antenna reverberations and signal clutter. Successful detection of a 7 mm diameter cylindrical tumor immersed in a low-permittivity medium was achieved in all cases. Selecting the widely used delay-and-sum (DAS) beamforming algorithm as a benchmark, we show that correlation-based imaging methods improve the signal-to-clutter ratio by at least 10 dB and improve spatial resolution through a reduction of the imaged peak full-width at half maximum (FWHM) of about 40-50%. PMID:21331362
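
    The DAS benchmark named above can be sketched for a mono-static configuration: each pixel value is the sum of every channel sampled at its round-trip delay (toy 2-D geometry, nearest-sample delays, no windowing; illustrative only):

```python
import numpy as np

def das_image(signals, antenna_pos, pixels, v, dt):
    """Mono-static delay-and-sum focusing.

    signals     : (n_ch, n_t) pre-processed channel recordings
    antenna_pos : (n_ch, 2) antenna positions
    pixels      : (n_px, 2) image points
    v, dt       : propagation speed and sample interval
    Pixel value = sum over channels of the sample at round-trip delay
    2 * |antenna - pixel| / v; scatterers add coherently.
    """
    img = np.zeros(len(pixels))
    for p, px in enumerate(pixels):
        acc = 0.0
        for c, ax in enumerate(antenna_pos):
            tau = 2 * np.linalg.norm(px - ax) / v
            k = int(round(tau / dt))
            if k < signals.shape[1]:
                acc += signals[c, k]
        img[p] = acc
    return img
```

    Cross-correlated back projection multiplies (rather than sums) delayed channel pairs, which is what suppresses incoherent clutter relative to this baseline.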

  17. Portable Wideband Microwave Imaging System for Intracranial Hemorrhage Detection Using Improved Back-projection Algorithm with Model of Effective Head Permittivity

    Mobashsher, Ahmed Toaha; Mahmoud, A.; Abbosh, A. M.

    2016-02-01

    Intracranial hemorrhage is a medical emergency that requires rapid detection and treatment to keep brain damage to a minimum. Here, an effective wideband microwave head imaging system for on-the-spot detection of intracranial hemorrhage is presented. The operation of the system relies on the dielectric contrast between healthy brain tissues and a hemorrhage, which causes strong microwave scattering. The system uses a compact sensing antenna with ultra-wideband operation and directional radiation, and a portable, compact microwave transceiver for signal transmission and data acquisition. The collected data are processed to create a clear image of the brain using an improved back projection algorithm based on a novel effective head permittivity model. The system is verified in realistic simulation and experimental environments using anatomically and electrically realistic human head phantoms. Quantitative and qualitative comparisons between images from the proposed and existing algorithms demonstrate significant improvements in detection and localization accuracy. The radiation and thermal safety of the system are examined and verified. Initial human tests were conducted on healthy subjects with different head sizes; the reconstructed images were statistically analyzed, and the absence of false positives indicates the efficacy of the proposed system for future preclinical trials.

  18. High security and robust optical image encryption approach based on computer-generated integral imaging pickup and iterative back-projection techniques

    Li, Xiao Wei; Cho, Sung Jin; Kim, Seok Tae

    2014-04-01

    In this paper, a novel optical image encryption algorithm is proposed that combines the computer-generated integral imaging (CGII) pickup technique and the iterative back-projection (IBP) technique. In this scheme, the color image to be encrypted is first separated into three channels: red, green, and blue. Each channel is independently captured by a virtual pinhole array and computationally transformed into a sub-image array. Each of the three sub-image arrays is then scrambled by the Fibonacci transformation (FT) algorithm and encrypted by hybrid cellular automata (HCA), and the three encrypted images are finally combined to produce the color encrypted image. In the reconstruction process, because computational integral imaging reconstruction (CIIR) is a pixel-overlapping technique, interference from adjacent pixels degrades the quality of the reconstructed image. To address this problem, an image super-resolution reconstruction technique is introduced: the image is computationally reconstructed by the IBP technique. Numerical simulations are performed to test the validity and capability of the proposed image encryption algorithm.
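
    The Fibonacci transformation used for scrambling is commonly realized as a coordinate permutation by the Fibonacci Q-matrix [[1, 1], [1, 0]] taken modulo the image size (an assumption here; the paper's exact variant may differ). Because the matrix has determinant -1, the map is a bijection on the pixel grid and therefore invertible:

```python
import numpy as np

def fibonacci_scramble(img, rounds=1):
    """Scramble an N x N image by the coordinate map
    (x, y) -> ((x + y) mod N, x mod N), i.e. the Fibonacci Q-matrix
    [[1, 1], [1, 0]] applied modulo N. det = -1, so pixels are permuted
    without loss and the transform can be undone."""
    n = img.shape[0]
    out = img
    for _ in range(rounds):
        x, y = np.meshgrid(np.arange(n), np.arange(n), indexing="ij")
        nx, ny = (x + y) % n, x % n
        scr = np.empty_like(out)
        scr[nx, ny] = out[x, y]
        out = scr
    return out
```

    Repeating the map for a secret number of rounds (the key) periodically returns the image to its original state, which is typical of such matrix scrambles.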

  19. Water Filters

    1993-01-01

    The Aquaspace H2OME Guardian Water Filter, available through Western Water International, Inc., reduces lead in water supplies. The filter is mounted on the faucet and the filter cartridge is placed in the "dead space" between sink and wall. This filter is one of several new filtration devices using the Aquaspace compound filter media, which combines company developed and NASA technology. Aquaspace filters are used in industrial, commercial, residential, and recreational environments as well as by developing nations where water is highly contaminated.

  20. GMTI processing using back projection.

    Doerry, Armin Walter

    2013-07-01

    Backprojection has long been applied to SAR image formation. It has equal utility in forming the range-velocity maps for Ground Moving Target Indicator (GMTI) radar processing. In particular, it overcomes the problem of targets migrating through range resolution cells.

  1. Cold Crystal Reflector Filter Concept

    Muhrer, G.

    2014-01-01

    In this paper the theoretical concept of a cold crystal reflector filter is presented. The aim of this concept is to overcome the shortcoming of the traditional cold polycrystalline reflector filter, namely the significant reduction of the neutron flux just above (in energy space) or just below (in wavelength space) the first Bragg edge.

  2. Optimal filtering

    Anderson, Brian D O

    2005-01-01

    This graduate-level text augments and extends beyond undergraduate studies of signal processing, particularly in regard to communication systems and digital filtering theory. Vital for students in the fields of control and communications, its contents are also relevant to students in such diverse areas as statistics, economics, bioengineering, and operations research.Topics include filtering, linear systems, and estimation; the discrete-time Kalman filter; time-invariant filters; properties of Kalman filters; computational aspects; and smoothing of discrete-time signals. Additional subjects e

  3. Characterizing trends in HIV infection among men who have sex with men in Australia by birth cohorts: results from a modified back-projection method

    Wand Handan

    2009-09-01

    Background: We set out to estimate historical trends in HIV incidence in Australian men who have sex with men with respect to age at infection and birth cohort. Methods: A modified back-projection technique is applied to data from the HIV/AIDS Surveillance System in Australia, including "newly diagnosed HIV infections", "newly acquired HIV infections" and "AIDS diagnoses", to estimate trends in HIV incidence over both calendar time and age at infection. Results: Our results demonstrate that since 2000, there has been an increase in new HIV infections in Australian men who have sex with men across all age groups. The estimated mean age at infection increased from ~35 years in 2000 to ~37 years in 2007. When the epidemic peaked in the mid 1980s, the majority of infections (56%) occurred among men aged 30 years and younger; 30% occurred at ages 31 to 40 years; and only ~14% were attributed to the group older than 40 years of age. In 2007, the proportion of infections occurring in persons 40 years or older doubled to 31% compared to the mid 1980s, while the proportion attributed to the group younger than 30 years of age decreased to 36%. Conclusion: The distribution of HIV incidence for birth cohorts by infection year suggests that the HIV epidemic continues to affect older homosexual men as much as, if not more than, younger men. The results are useful for evaluating the impact of the epidemic across successive birth cohorts and studying trends among the age groups most at risk.
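
    A toy version of epidemiological back-projection (back-calculation) illustrates the idea behind the method: diagnoses are modeled as a convolution of past infections with a reporting-delay distribution, and the infection curve is recovered by inverting that linear model. This sketch uses unregularized least squares and a made-up delay distribution; the modified method in the record adds smoothing, age structure, and multiple surveillance streams:

```python
import numpy as np

def back_project_incidence(diagnoses, delay_pmf):
    """Recover infection counts i from diagnosis counts d = F i, where
    F[t, s] = P(delay = t - s) is the convolution matrix of the
    infection-to-diagnosis delay distribution."""
    n = len(diagnoses)
    F = np.zeros((n, n))
    for s in range(n):
        for t in range(s, n):
            if t - s < len(delay_pmf):
                F[t, s] = delay_pmf[t - s]
    i, *_ = np.linalg.lstsq(F, diagnoses, rcond=None)
    return i
```

    In practice the inversion is ill-posed for long delays, which is why real back-projection methods constrain the incidence curve rather than solving exactly.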

  4. Rupture Processes of the Mw8.3 Sea of Okhotsk Earthquake and Aftershock Sequences from 3-D Back Projection Imaging

    Jian, P. R.; Hung, S. H.; Meng, L.

    2014-12-01

    On May 24, 2013, the largest deep earthquake ever recorded occurred beneath the southern tip of the Kamchatka Peninsula, where the Pacific Plate subducts underneath the Okhotsk Plate. Previous 2D beamforming back projection (BP) of P-coda waves suggests the mainshock ruptured bilaterally along a horizontal fault plane determined by the global centroid moment tensor solution. On the other hand, multiple point source inversion of P and SH waveforms argued that the earthquake comprises a sequence of 6 subevents that are not located on a single plane but distributed in a zone extending 64 km horizontally and 35 km in depth. We therefore apply a three-dimensional MUSIC BP approach to resolve the rupture processes of the mainshock and two large aftershocks (M6.7), with no a priori assumption of a preferential planar rupture orientation. The maximum pseudo-spectrum of high-frequency P waves in a sequence of time windows recorded by the densely distributed stations of the US and EU Arrays is used to image the 3-D temporal and spatial rupture distribution. The resulting image reveals a nearly N-S-striking rupture consisting of two antiparallel stages. The first, subhorizontal stage initially propagates toward the NNE; about 18 s later the rupture reverses toward the SSW and concurrently shifts about 35 km deeper, lasting about 20 s. The rupture lengths of the first (NNE-ward) and second (SSW-ward) stages are about 30 km and 85 km, and the estimated rupture velocities are 3 km/s and 4.25 km/s, respectively. Synthetic experiments are undertaken to assess the capability of 3D MUSIC BP to recover spatio-temporal rupture processes. In addition, high-frequency BP images based on the EU-Array data show that the two M6.7 aftershocks more likely ruptured on vertical fault planes.

  5. Gaoling Back-to-Back Project Thyristor Valve Electrical Design

    王英洁; 李斌; 田方

    2011-01-01

The main purpose of an HVDC back-to-back project is to interconnect two large AC power grids of different voltage levels or different frequencies. The thyristor valve is the key equipment in the converter station: rectifier valves convert AC power into DC power, and inverter valves convert DC power back into AC power. To meet the main-parameter requirements of the electrical design of the ±125 kV thyristor valves for the Gaoling back-to-back project, and in accordance with the thyristor valve design technical specification, simulation and analytical calculations were used to optimize the main thyristor parameters and to determine the number of series-connected thyristor levels per valve, the damping circuit parameters (damping capacitance, damping resistance and DC resistance), and the coordinated protective firing level. The calculation results verify the correctness of the electrical design.

  6. Keeping Tradition

    Zenhong, C.; Buwalda, P.L.

    2011-01-01

    Chinese dumplings such as Jiao Zi and Bao Zi are two of the popular traditional foods in Asia. They are usually made from wheat flour dough (rice flour or starch is sometimes used) that contains fillings. They can be steamed, boiled and fried and are consumed either as a main meal or dessert. As the

  7. Extensions to polar formatting with spatially variant post-filtering

    Garber, Wendy L.; Hawley, Robert W.

    2011-06-01

    The polar format algorithm (PFA) is computationally faster than back projection for producing spotlight mode synthetic aperture radar (SAR). This is very important in applications such as video SAR for persistent surveillance, as images may need to be produced in real time. PFA's speed is largely due to making a planar wavefront assumption and forming the image onto a regular grid of pixels lying in a plane. Unfortunately, both assumptions cause loss of focus in airborne persistent surveillance applications. The planar wavefront assumption causes a loss of focus in the scene for pixels that are far from scene center. The planar grid of image pixels causes loss of the depth of focus for conic flight geometries. In this paper, we present a method to compensate for the loss of depth of focus while warping the image onto a terrain map to produce orthorectified imagery. This technique applies a spatially variant post-filter and resampling to correct the defocus while dewarping the image. This work builds on spatially variant post-filtering techniques previously developed at Sandia National Laboratories in that it incorporates corrections for terrain height and circular flight paths. This approach produces high quality SAR images many times faster than back projection.

  8. Application of circular filter inserts

High efficiency particulate air (HEPA) filters are used in the ventilation of nuclear plant as passive clean-up devices. Traditionally, the work-horse of the industry has been the rectangular HEPA filter. An assessment of the problems associated with remote handling, changing, and disposal of these rectangular filters suggested that significant advantages could be obtained by adopting HEPA filters with circular geometry for both new and existing ventilation plants. This paper covers the development of circular geometry filters and highlights the advantages of this design over their rectangular counterparts. The work has resulted in a range of commercially available filters for flows from 45 m3/h up to 3400 m3/h. The paper also covers the development of a range of sizes and types of housings that employ simple change techniques taking advantage of the circular geometry. The systems considered here have been designed in response to the requirements of shielded facilities (remote filter change) and unshielded facilities (potentially bag changing of filters). Additionally, the designs allow for the possibility of retrofitting circular geometry HEPA filters in place of rectangular geometry filters.

  9. High Resolution Teleseismic P-wave Back-Projection Imaging Using Variable Travel Time Corrections: Characterizing Sub-Events of the Great April 11th 2012 Indian Ocean Intraplate Earthquakes

    Kwong, K. B.; Koper, K. D.; Yue, H.; Lay, T.

    2012-12-01

Two of the largest strike-slip earthquakes ever recorded occurred off the coast of northern Sumatra on April 11th 2012. The Mw 8.7 mainshock and Mw 8.2 aftershock occurred east of the Ninetyeast Ridge in the Wharton Basin, a region of intraplate deformation with prominent fracture zones striking NNE-SSW. The relative lack of geodetic and local seismic data compared to other recent great earthquakes makes teleseismic data especially important for understanding the rupture properties of these events. We performed short-period P-wave back-projection imaging using independent networks of stations in Europe and Japan. Preliminary images from the two networks showed similarly complex multi-event sources for the mainshock, indicating that rupture occurred along both nodal planes of the gCMT solution, consistent with the locations of early aftershocks. Back-projection images of the Mw 8.2 aftershock showed a single, compact, bilateral rupture corresponding to the NNE-SSW nodal plane of the gCMT solution [Yue et al., 2012]. Here we improve upon the resolution and accuracy of our initial back-projection images by estimating station-specific travel time corrections that vary across the source region [e.g., Ishii et al., 2007]. These corrections compensate for 3D variations in Earth structure between the source region and the seismometers, and act to focus the array beams. We perform multi-channel cross-correlations of P waves recorded for 7 aftershocks that were (1) distributed broadly around the source region and (2) well observed at seismometers in Europe. For each seismometer in the array, the measured static corrections are smoothly interpolated over the entire source region with a Kriging method to form a travel time correction surface. These surfaces are then used with an otherwise conventional back-projection approach [Xu et al., 2009] to image the ruptures. Our new images are broadly consistent with our original results, indicating that the
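
The static corrections described above come from cross-correlating P waves across the array. As a minimal illustration (synthetic traces and numpy only, not the authors' code), the lag of one trace relative to another can be read off the peak of their cross-correlation:

```python
import numpy as np

def measure_static(trace_a, trace_b, dt):
    """Estimate the time shift (s) of trace_b relative to trace_a
    from the peak of their full cross-correlation."""
    xc = np.correlate(trace_b, trace_a, mode="full")
    lag = np.argmax(xc) - (len(trace_a) - 1)   # lag in samples
    return lag * dt

# Synthetic example: a Ricker-like pulse and a copy delayed by 0.6 s.
dt = 0.05
t = np.arange(0, 20, dt)
pulse = (1 - 2 * (t - 5) ** 2) * np.exp(-((t - 5) ** 2))
delayed = np.interp(t - 0.6, t, pulse, left=0.0, right=0.0)

shift = measure_static(pulse, delayed, dt)   # ≈ +0.6 s
```

In the workflow described above, such per-station lags measured for several calibration aftershocks would then be interpolated (e.g., by Kriging) into a travel time correction surface.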

  10. A Sub-Image Fast Factorized Back Projection Algorithm Based on Optimal Regional Partition

    林世斌; 李悦丽; 严少石; 周智敏

    2012-01-01

The Back Projection (BP) algorithm is suitable for airborne Ultra Wide Band Synthetic Aperture Radar (UWB SAR) imaging because of its advantages such as perfect focusing and motion compensation; however, its large computational load limits its application. The Sub-Image Fast Factorized Back Projection (SIFFBP) algorithm, which substantially reduces the computational load, improves the practicability of the BP algorithm. In this article, an improved SIFFBP algorithm based on optimal regional partition is proposed by analyzing the constraints on regional division. It solves the problem of degraded acceleration in the conventional SIFFBP algorithm when the SAR system has a small integration angle. The proposed algorithm further reduces the computational load when the integration angle is smaller than 60 degrees or when the length of the imaging region differs greatly from its width. The performance of the proposal is demonstrated using simulated data as well as real SAR data.

  11. Sinogram bow-tie filtering in FBP PET reconstruction.

    Abella, M; Vaquero, J J; Soto-Montenegro, M L; Lage, E; Desco, M

    2009-05-01

Low-pass filtering of sinograms in the radial direction is the most common practice to limit noise amplification in filtered back projection (FBP) reconstruction of positron emission tomography studies. Other filtering strategies have been proposed to prevent the loss of resolution due to low-pass radial filters, although results have been diverse. Using the well-known properties of the Fourier transform of a sinogram, the authors defined a binary mask that matches the expected shape of the support region in the Fourier domain of the sinogram ("bow tie"). This mask was smoothed by convolution with a ten-point Gaussian kernel, which not only avoids ringing but also introduces a pre-emphasis at low frequencies. A new filtering scheme for FBP is proposed, comprising this smoothed bow-tie filter combined with a standard radial filter and an axial filter. The authors compared the performance of the bow-tie filtering scheme with that of other previously reported methods: standard radial filtering, angular filtering, and stackgram-domain filtering. All the quantitative data in the comparisons refer to a baseline reconstruction using a ramp filter only. When using the smallest size of the Gaussian kernel in the stackgram domain, the authors achieved a noise reduction of 33% at the cost of degrading radial and tangential resolutions (14.5% and 16%, respectively, for cubic interpolation). To reduce the noise by 30%, the angular filter produced a larger degradation of contrast (3%) and tangential resolution (46% at 10 mm from the center of the field of view) and showed noticeable artifacts in the form of circular blurring dependent on the distance to the center of the field of view. For a similar noise reduction (33%), the proposed bow-tie filtering scheme yielded optimum results in resolution (gain in radial resolution of 10%) and contrast (1% increase) when compared with any of the other filters alone. Experiments with rodent images showed noticeable image quality
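
The support-masking idea can be sketched in numpy, assuming the sinogram's 2-D Fourier transform is concentrated in a bow-tie region |k| <= r_max*|w| (the scaling r_max and the hard binary mask are simplifications; the published filter smooths the mask with a ten-point Gaussian kernel to avoid ringing):

```python
import numpy as np

def bowtie_filter(sino, r_max=1.0):
    """Mask the 2-D FFT of a sinogram (rows = radial bins, columns =
    angular samples) with a binary bow-tie support |k| <= r_max*|w|,
    then invert.  A sketch of the support region only; the published
    scheme smooths this mask and combines it with radial/axial filters."""
    n_s, n_th = sino.shape
    w = np.fft.fftfreq(n_s)[:, None]    # radial frequencies
    k = np.fft.fftfreq(n_th)[None, :]   # angular frequencies
    mask = (np.abs(k) <= r_max * np.abs(w)).astype(float)
    return np.real(np.fft.ifft2(np.fft.fft2(sino) * mask))

rng = np.random.default_rng(0)
sino = rng.normal(size=(128, 180))      # pure-noise "sinogram"
filtered = bowtie_filter(sino)          # energy outside the bow tie removed
```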

  12. Data assimilation the ensemble Kalman filter

    Evensen, Geir

    2006-01-01

    Covers data assimilation and inverse methods, including both traditional state estimation and parameter estimation. This text and reference focuses on various popular data assimilation methods, such as weak and strong constraint variational methods and ensemble filters and smoothers.

  13. Simulation reconstruction analysis of gradient refractive index for functionally gradient materials based on filtered back projection

    周文静; 李海鹏; 韩冰

    2014-01-01

Because the parameters of functionally gradient materials are difficult to characterize, a reconstruction method based on compressive-sensing digital holographic tomography was proposed. To improve the reconstructed resolution of the refractive index, this paper presents a simulation analysis of the filtered back-projection reconstruction of gradient refractive index distributions and of the minimum resolvable refractive index difference. A filtered back-projection algorithm suitable for multiple projection directions was selected, and the reconstruction of two classes of functionally gradient materials, with axially and with radially varying refractive index, was analyzed. Under identical simulation conditions, the reconstruction error is about 1% in both cases. The minimum resolution of the reconstruction was also analyzed in simulation: with the number of projections held at 90, the radially varying index is reconstructed more accurately than the axially varying one, and when the gradient interval between neighboring steps is larger than 0.003, the gradient intervals can still be clearly distinguished in the reconstructed refractive index distributions of both gradient trends.
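
The filtering step of filtered back projection used here can be sketched in a few lines of numpy: each projection is multiplied by the ramp filter |f| in the Fourier domain before being back projected (an idealized, unapodized ramp; not the specific implementation used in the paper):

```python
import numpy as np

def ramp_filter_rows(sino):
    """Apply the FBP ramp filter |f| to each projection (row) of a
    sinogram via the FFT; back projecting the filtered rows then
    yields the reconstruction."""
    freqs = np.fft.fftfreq(sino.shape[1])
    ramp = np.abs(freqs)                       # ideal ramp filter
    spec = np.fft.fft(sino, axis=1) * ramp
    return np.real(np.fft.ifft(spec, axis=1))

proj = np.ones((90, 64))       # 90 flat (constant) projections
filt = ramp_filter_rows(proj)  # the ramp zeroes the DC term, so result ≈ 0
```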

  14. Generalised Filtering

    Karl Friston

    2010-01-01

We describe a Bayesian filtering scheme for nonlinear state-space models in continuous time. This scheme is called Generalised Filtering and furnishes posterior (conditional) densities on the hidden states and unknown parameters generating observed data. Crucially, the scheme operates online, assimilating data to optimize the conditional density on time-varying states and time-invariant parameters. In contrast to Kalman and particle smoothing, Generalised Filtering does not require a backwards pass. In contrast to variational schemes, it does not assume conditional independence between the states and parameters. Generalised Filtering optimises the conditional density with respect to a free-energy bound on the model's log-evidence. This optimisation uses the generalised motion of hidden states and parameters, under the prior assumption that the motion of the parameters is small. We describe the scheme, present comparative evaluations with a fixed-form variational version, and conclude with an illustrative application to a nonlinear state-space model of brain imaging time-series.

  15. Water Filter

    1982-01-01

A compact, lightweight electrolytic water sterilizer, available through Ambassador Marketing, generates silver ions in concentrations of 50 to 100 parts per billion in a water flow system. The silver ions serve as an effective bactericide/deodorizer. Tap water passes through a filtering element of silver that has been chemically plated onto activated carbon. The silver inhibits bacterial growth and the activated carbon removes objectionable tastes and odors caused by the addition of chlorine and other chemicals to the municipal water supply. The three models available are a kitchen unit, a "Tourister" unit for portable use while traveling, and a refrigerator unit that attaches to the ice cube water line. A filter will treat 5,000 to 10,000 gallons of water.

  16. Robust filtering for uncertain systems a parameter-dependent approach

    Gao, Huijun

    2014-01-01

This monograph provides the reader with a systematic treatment of robust filter design, a key issue in systems, control and signal processing, because the inevitable presence of uncertainty in system and signal models often degrades the filtering performance and may even cause instability. The methods described are therefore not subject to the rigorous assumptions of traditional Kalman filtering. The monograph is concerned with robust filtering for various dynamical systems with parametric uncertainties, and focuses on parameter-dependent approaches to filter design. Classical filtering schemes, like H2 filtering and H∞ filtering, are addressed, and emerging issues such as robust filtering with constraints on communication channels and signal frequency characteristics are discussed. The text features: · design approaches to robust filters arranged according to varying complexity level, and emphasizing robust filtering in the parameter-dependent framework for the first time; ·...

  17. Digital filters

    Hamming, Richard W

    1997-01-01

    Digital signals occur in an increasing number of applications: in telephone communications; in radio, television, and stereo sound systems; and in spacecraft transmissions, to name just a few. This introductory text examines digital filtering, the processes of smoothing, predicting, differentiating, integrating, and separating signals, as well as the removal of noise from a signal. The processes bear particular relevance to computer applications, one of the focuses of this book.Readers will find Hamming's analysis accessible and engaging, in recognition of the fact that many people with the s

  18. Variable Span Filters for Speech Enhancement

    Jensen, Jesper Rindom; Benesty, Jacob; Christensen, Mads Græsbøll

    2016-01-01

In this work, we consider enhancement of multichannel speech recordings. Linear filtering and subspace approaches have been considered previously for solving the problem. The current linear filtering methods, although many variants exist, have limited control of noise reduction and speech distortion. Subspace approaches, on the other hand, can potentially yield better control by filtering in the eigen-domain, but traditionally these approaches have not been optimized explicitly for traditional noise reduction and signal distortion measures. Herein, we combine these approaches by deriving optimal filters using a joint diagonalization as a basis. This gives excellent control over the performance, as we can optimize for noise reduction or signal distortion performance. Results from real data experiments show that the proposed variable span filters can achieve better performance than existing...

  19. Satisfactory Optimization Design of IIR Digital Filters

    Jin Weidong; Zhang Gexiang; Zhao Duo

    2005-01-01

A new method called the satisfactory optimization method is proposed to design IIR (Infinite Impulse Response) digital filters, and the satisfactory optimization model is presented. The detailed algorithm for designing IIR digital filters using the satisfactory optimization method is described. By using a quantum genetic algorithm, characterized by rapid convergence and good global search capability, satisfactory solutions are achieved in experiments designing lowpass and bandpass IIR digital filters. Experimental results show that the performance of IIR filters designed by the introduced method is better than that of filters designed by traditional methods.

  20. Research on dynamic image reconstruction for MIT based on back-projection algorithm

    柯丽; 林筱; 杜强; 赵璐璐

    2013-01-01

Magnetic induction tomography (MIT) is a contactless and non-invasive medical imaging technology, and the image reconstruction algorithm plays an important role in realizing MIT imaging quickly and accurately. An improved back-projection image reconstruction algorithm is presented in this paper. Firstly, the back-projection path is determined from the magnetic field lines according to the magnetic field distribution in the imaging field, which reduces the positioning error of magnetic imaging relative to the straight-line back-projection algorithm. Secondly, an edge-detection data modification model is built based on the electromagnetic relations in MIT, and this model is used to modify the phase shifts detected by the detection coils, which further improves the positioning accuracy of the reconstructed image. Two groups of sequence images were reconstructed for the two cases of conductivity variation and position variation of the disturbance object in the imaging field. Through conjoint analysis of the image sequences, the longitudinal impedance variation information was obtained, which reflects the dynamic information of the imaging object varying with time. Experimental results show that the improved back-projection algorithm for MIT possesses the characteristics of high reconstruction speed and accurate positioning; the method can accurately reflect the conductivity variation in the imaging field and, combined with conjoint analysis of sequence images, can realize dynamic imaging for MIT.

  1. Anti-Aliasing filter for reverse-time migration

    Zhan, Ge

    2012-01-01

We develop an anti-aliasing filter for reverse-time migration (RTM). It is similar to the traditional anti-aliasing filter used for Kirchhoff migration in that it low-pass filters the migration operator so that the dominant wavelength in the operator is greater than two times the trace sampling interval, except it is applied to both primary and multiple reflection events. Instead of applying this filter to the data in the traditional RTM operation, we apply the anti-aliasing filter to the generalized diffraction-stack migration operator. This gives the same migration image as computed by anti-aliased RTM.
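
The low-pass step can be illustrated with a hard frequency cutoff in numpy (a hypothetical stand-in: the actual anti-aliasing filter would taper the cutoff and tie it to the dominant operator wavelength and trace spacing as described above):

```python
import numpy as np

def lowpass(trace, dt, f_cut):
    """Zero all Fourier components of a trace above f_cut (Hz)."""
    spec = np.fft.rfft(trace)
    f = np.fft.rfftfreq(trace.size, d=dt)
    spec[f > f_cut] = 0.0
    return np.fft.irfft(spec, n=trace.size)

dt = 0.002                                   # 500 Hz sampling
t = np.arange(0, 1, dt)
trace = np.sin(2 * np.pi * 10 * t) + np.sin(2 * np.pi * 200 * t)
out = lowpass(trace, dt, f_cut=60.0)         # 200 Hz component removed
```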

  2. SavGolFilterCov: Savitzky Golay filter for data with error covariance

    More, Surhud

    2016-01-01

    A Savitzky-Golay filter is often applied to data to smooth the data without greatly distorting the signal; however, almost all data inherently comes with noise, and the noise properties can differ from point to point. This python script improves upon the traditional Savitzky-Golay filter by accounting for error covariance in the data. The inputs and arguments are modeled after scipy.signal.savgol_filter.
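
The idea can be sketched with a plain, unweighted Savitzky-Golay filter in numpy (a hypothetical illustration; the script being described additionally weights the per-window fit by the data's error covariance):

```python
import numpy as np

def savgol(y, window, order):
    """Minimal Savitzky-Golay filter: least-squares fit a polynomial
    of the given order to each centered window and evaluate it at the
    window center.  Edges are left unfiltered for brevity."""
    half = window // 2
    out = np.array(y, dtype=float)
    x = np.arange(-half, half + 1)
    for i in range(half, len(y) - half):
        coeffs = np.polyfit(x, y[i - half:i + half + 1], order)
        out[i] = np.polyval(coeffs, 0.0)
    return out

rng = np.random.default_rng(1)
t = np.linspace(0, 1, 101)
noisy = t ** 2 + rng.normal(scale=0.05, size=t.size)
smooth = savgol(noisy, window=11, order=2)   # noise suppressed, trend kept
```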

  3. Convergent Filter Bases

    Coghetto Roland

    2015-01-01

    We are inspired by the work of Henri Cartan [16], Bourbaki [10] (TG. I Filtres) and Claude Wagschal [34]. We define the base of filter, image filter, convergent filter bases, limit filter and the filter base of tails (fr: filtre des sections).

  4. Optimal Gaussian Filter for Effective Noise Filtering

    Kopparapu, Sunil; Satish, M

    2014-01-01

In this paper we show that knowledge of the noise statistics contaminating a signal can be used effectively to choose an optimal Gaussian filter to eliminate noise. Specifically, we show that additive white Gaussian noise (AWGN) contaminating a signal is best filtered using a Gaussian filter of specific characteristics. The design of the Gaussian filter bears a relationship to the noise statistics and also some basic information about the signal. We first derive a relationship...
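
A minimal numpy sketch of the setting (a synthetic signal; the kernel width sigma=4 below is arbitrary, whereas the paper derives the optimal width from the noise statistics):

```python
import numpy as np

def gaussian_smooth(y, sigma):
    """Smooth a 1-D signal with a truncated, normalized Gaussian
    kernel of standard deviation sigma (in samples)."""
    radius = int(4 * sigma)
    x = np.arange(-radius, radius + 1)
    kernel = np.exp(-0.5 * (x / sigma) ** 2)
    kernel /= kernel.sum()
    return np.convolve(y, kernel, mode="same")

rng = np.random.default_rng(2)
t = np.linspace(0, 2 * np.pi, 400)
clean = np.sin(t)
noisy = clean + rng.normal(scale=0.3, size=t.size)   # AWGN
denoised = gaussian_smooth(noisy, sigma=4.0)
```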

  5. Filter Bank Design for Subband Adaptive Filtering

    Haan, Jan Mark de

    2001-01-01

Adaptive filtering is an important subject in the field of signal processing and has numerous applications in fields such as speech processing and communications. Examples in speech processing include speech enhancement, echo- and interference-cancellation, and speech coding. Subband filter banks have been introduced in the area of adaptive filtering in order to improve the performance of time domain adaptive filters. The main improvements are faster convergence speed and the reduction of co...

  6. Fault Tolerant Parallel Filters Based On Bch Codes

    K.Mohana Krishna

    2015-04-01

Digital filters are used in signal processing and communication systems. In some cases, the reliability of those systems is critical, and fault tolerant filter implementations are needed. Over the years, many techniques that exploit the filters' structure and properties to achieve fault tolerance have been proposed. As technology scales, it enables more complex systems that incorporate many filters. In those complex systems, it is common that some of the filters operate in parallel, for example, by applying the same filter to different input signals. Recently, a simple technique that exploits the presence of parallel filters to achieve multiple fault tolerance has been presented. In this brief, that idea is generalized to show that parallel filters can be protected using Bose-Chaudhuri-Hocquenghem (BCH) codes, in which each filter is the equivalent of a bit in a traditional ECC. This new scheme allows more efficient protection when the number of parallel filters is large.
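
The underlying idea, redundant filters acting as parity checks on linear parallel filters, can be sketched in numpy. This toy version uses a single sum-check filter and so only detects a fault; in the BCH generalization the brief describes, several parity filters driven by code-defined input combinations allow faults to be located and corrected as well:

```python
import numpy as np

def fir(x, h):
    """FIR filtering as full convolution."""
    return np.convolve(x, h)

# Four parallel filters share one impulse response (hypothetical data).
h = np.array([0.5, 1.0, 0.5])
rng = np.random.default_rng(3)
xs = [rng.normal(size=32) for _ in range(4)]
ys = [fir(x, h) for x in xs]

# Parity check: by linearity, filtering the sum of the inputs must
# equal the sum of the outputs.
check = fir(np.sum(xs, axis=0), h)
fault_free = np.allclose(check, np.sum(ys, axis=0))

ys[2] = ys[2] + 1e-3    # inject a fault into filter 2's output
fault_detected = not np.allclose(check, np.sum(ys, axis=0))
```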

  7. Filters in 2D and 3D Cardiac SPECT Image Processing

    Maria Lyra

    2014-01-01

Nuclear cardiac imaging is a noninvasive, sensitive method providing information on cardiac structure and physiology. Single photon emission tomography (SPECT) evaluates myocardial perfusion, viability, and function and is widely used in clinical routine. The quality of the tomographic image is key for accurate diagnosis. Image filtering, a mathematical processing, compensates for loss of detail in an image while reducing image noise, and it can improve the image resolution and limit the degradation of the image. SPECT images are then reconstructed either by the filtered back projection (FBP) analytical technique or iteratively, by algebraic methods. The aim of this study is to review filters in cardiac 2D, 3D, and 4D SPECT applications and how these affect the image quality, mirroring the diagnostic accuracy of SPECT images. Several filters, including the Hanning, Butterworth, and Parzen filters, were evaluated in combination with the two reconstruction methods as well as with a specified MATLAB program. Results showed that for both 3D and 4D cardiac SPECT the Butterworth filter, for different critical frequencies and orders, produced the best results. Between the two reconstruction methods, the iterative one might be more appropriate for cardiac SPECT, since it improves lesion detectability due to the significant improvement of image contrast.
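
A Butterworth low-pass of the kind evaluated here is straightforward to sketch in numpy (frequency-domain form; the cutoff and order values below are illustrative):

```python
import numpy as np

def butterworth_lowpass(img, cutoff, order):
    """Apply a 2-D Butterworth low-pass filter in the Fourier domain:
    H(f) = 1 / (1 + (f / cutoff)**(2 * order)), with f the radial
    spatial frequency in cycles/pixel."""
    fy = np.fft.fftfreq(img.shape[0])[:, None]
    fx = np.fft.fftfreq(img.shape[1])[None, :]
    f = np.hypot(fy, fx)
    H = 1.0 / (1.0 + (f / cutoff) ** (2 * order))
    return np.real(np.fft.ifft2(np.fft.fft2(img) * H))

rng = np.random.default_rng(4)
img = rng.normal(size=(64, 64))                 # noisy test image
smoothed = butterworth_lowpass(img, cutoff=0.15, order=4)
```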

  8. Traditional Agriculture and Permaculture.

    Pierce, Dick

    1997-01-01

    Discusses benefits of combining traditional agricultural techniques with the concepts of "permaculture," a framework for revitalizing traditions, culture, and spirituality. Describes school, college, and community projects that have assisted American Indian communities in revitalizing sustainable agricultural practices that incorporate cultural…

  9. Active Optical Lattice Filters

    Gary Evans; MacFarlane, Duncan L.; Govind Kannan; Jian Tong; Issa Panahi; Vishnupriya Govindan; L. Roberts Hunt

    2005-01-01

    Optical lattice filter structures including gains are introduced and analyzed. The photonic realization of the active, adaptive lattice filter is described. The algorithms which map between gains space and filter coefficients space are presented and studied. The sensitivities of filter parameters with respect to gains are derived and calculated. An example which is relevant to adaptive signal processing is also provided.

  10. Passive Power Filters

    Künzi, R

    2015-01-01

Power converters require passive low-pass filters which are capable of reducing voltage ripples effectively. In contrast to signal filters, the components of power filters must carry large currents or withstand large voltages, respectively. In this paper, three different suitable filter structures for d.c./d.c. power converters with inductive load are introduced. The formulas needed to calculate the filter components are derived step by step and practical examples are given. The behaviour of the three discussed filters is compared by means of the examples. Practical aspects for the realization of power filters are also discussed.

  11. Intelligent Optimize Design of LCL Filter for Three-Phase Voltage-Source PWM Rectifier

    Sun, Wei; Chen, Zhe; Wu, Xiaojie

    2009-01-01

Compared to the traditional L filter, an LCL filter is more effective at reducing harmonic distortion at the switching frequency, so it is important to choose the LCL filter parameters to achieve a good filtering effect. This paper introduces some traditional design methods. Design of an LCL filter by genetic algorithm (GA) and particle swarm optimization (PSO) is presented in this paper, along with a comparison of the two intelligent optimization methods. Simulation results and calculated data are provided to show that intelligent optimization is more effective and simpler than the traditional methods.
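
Whichever optimizer is used, a key constraint in LCL design is the resonance frequency implied by the chosen parameters. A small helper using the standard LCL resonance formula (the component values below are illustrative, not taken from the paper):

```python
import math

def lcl_resonance_hz(L1, L2, Cf):
    """Resonance frequency of an LCL filter (converter-side inductance
    L1, grid-side inductance L2, filter capacitance Cf):
    f_res = (1 / (2*pi)) * sqrt((L1 + L2) / (L1 * L2 * Cf))."""
    return math.sqrt((L1 + L2) / (L1 * L2 * Cf)) / (2 * math.pi)

# Hypothetical values: 2 mH, 1 mH, 10 uF -> roughly 1.95 kHz.
f_res = lcl_resonance_hz(2e-3, 1e-3, 10e-6)
# A common rule of thumb keeps 10*f_grid < f_res < 0.5*f_switch.
```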

  12. Filter synthesis using Genesys S/Filter

    Rhea, Randall W

    2014-01-01

    S/Filter includes tools beyond direct synthesis, including a wide variety of both exact and approximate equivalent network transforms, methods for selecting the most desirable out of potentially thousands of synthesized alternatives, and a transform history record that simplifies design attempts requiring iteration. Very few software programs are based on direct synthesis, and the additional features of S/Filter make it a uniquely effective tool for filter design.This resource presents a practical guide to using Genesys software for microwave and RF filter design and synthesis. The focus of th

  13. HEPA Filter Performance under Adverse Conditions

This study involved challenging nuclear grade high-efficiency particulate air (HEPA) filters under a variety of conditions that can arise in Department of Energy (DOE) applications, such as low or high relative humidity (RH), controlled and uncontrolled challenge, and filters with physically damaged media or seals (i.e., leaks). Reported findings correlate filter function as measured by traditional differential pressure techniques with simultaneous instrumental determination of upstream and downstream PM concentrations. Additionally, emission rates and failure signatures are discussed for filters that have either failed or exceeded their usable lifetime. Significant findings from this effort include the use of thermocouples upstream and downstream of the filter housing to detect the presence of moisture. Also demonstrated in the moisture challenge series of tests is the effect of repeated wetting of the filter, which produces a phenomenon referred to as transient failure before the tensile strength of the media weakens to the point of physical failure. An evaluation of the effect of the particle size distribution of the challenge aerosol on the loading capacity of filters is also included. Results for soot and two size distributions of KCl are reported. Loading capacities ranged from approximately 70 g for soot to nearly 900 g for the larger particle size distribution of KCl. (authors)

  14. Adaptive Threshold Median Filter for Multiple-Impulse Noise

    JIANG Bo; HUANG Wei

    2007-01-01

Attenuating noise plays an essential role in image processing. Almost all traditional median filters concern the removal of impulse noise having a single layer, whose noise gray level value is constant. In this paper, a new adaptive median filter is proposed to handle images corrupted by more than single-layer noise. The adaptive threshold median filter (ATMF) has been developed by combining the adaptive median filter (AMF) with two dynamic thresholds. Because of the dynamic thresholds, the ATMF is able to balance the removal of multiple-impulse noise against the quality of the image. A comparison of the proposed method with traditional median filters is provided, and some visual examples are given to demonstrate the performance of the proposed filter.
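
A simplified, fixed-threshold version of the idea can be sketched in numpy (the ATMF itself adapts two thresholds dynamically; this sketch uses one fixed threshold and a 3x3 window):

```python
import numpy as np

def threshold_median_filter(img, threshold):
    """Replace a pixel by the median of its 3x3 neighborhood only if it
    deviates from that median by more than the threshold, preserving
    uncorrupted detail while removing impulses."""
    out = np.array(img, dtype=float)
    padded = np.pad(out, 1, mode="edge")   # original values for windows
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            med = np.median(padded[i:i + 3, j:j + 3])
            if abs(out[i, j] - med) > threshold:
                out[i, j] = med
    return out

img = np.full((8, 8), 100.0)
img[3, 4] = 255.0                   # bright impulse
img[5, 1] = 0.0                     # dark impulse
restored = threshold_median_filter(img, threshold=50.0)
```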

  15. FIRT: Filtered iterative reconstruction technique with information restoration.

    Chen, Yu; Zhang, Yan; Zhang, Kai; Deng, Yuchen; Wang, Shengliu; Zhang, Fa; Sun, Fei

    2016-07-01

Electron tomography (ET) combined with subsequent sub-volume averaging has become a unique way to study the in situ 3D structures of macromolecular complexes. However, information missing from electron tomography due to limited angular sampling is still the bottleneck in high-resolution electron tomography applications. Here, based on the understanding of the smooth nature of biological specimens, we present a new iterative image reconstruction algorithm, FIRT (filtered iterative reconstruction technique), for electron tomography, combining the algebraic reconstruction technique (ART) with a nonlinear diffusion (ND) filter. Using both simulated and experimental data, in comparison to ART and the weighted back projection method, we show that FIRT generates a better reconstruction with reduced ray artifacts and significantly improved correlation with the ground truth, and partially restores the information in the non-sampled angular region, as verified by investigating the 90° re-projection and by cross-validation. This new algorithm will be useful for both cellular and molecular ET with better quality and improved structural details. PMID:27134004
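
The ART-plus-filtering alternation at the heart of FIRT can be sketched on a toy linear system (numpy only; a simple neighbor-averaging step stands in for the nonlinear diffusion filter, and a random matrix stands in for the tomographic projector):

```python
import numpy as np

def art_with_smoothing(A, b, iters=50, lam=0.1, smooth=0.1):
    """Alternate relaxed Kaczmarz (ART) sweeps over the rows of
    A x = b with a linear smoothing step (a stand-in for nonlinear
    diffusion), which regularizes the underdetermined problem."""
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        for a_i, b_i in zip(A, b):                      # ART sweep
            x += lam * (b_i - a_i @ x) / (a_i @ a_i) * a_i
        # smoothing: blend each sample with its neighbors' mean
        x = (1 - smooth) * x + smooth * (np.roll(x, 1) + np.roll(x, -1)) / 2
    return x

rng = np.random.default_rng(5)
x_true = np.sin(np.linspace(0, np.pi, 20))   # smooth "object"
A = rng.normal(size=(12, 20))                # underdetermined "projector"
b = A @ x_true
x_rec = art_with_smoothing(A, b)             # fits the data, stays smooth
```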

  16. An Adaptive Approach to Mitigate Background Covariance Limitations in the Ensemble Kalman Filter

    Song, Hajoon

    2010-07-01

    A new approach is proposed to address the background covariance limitations arising from undersampled ensembles and unaccounted model errors in the ensemble Kalman filter (EnKF). The method enhances the representativeness of the EnKF ensemble by augmenting it with new members chosen adaptively to add missing information that prevents the EnKF from fully fitting the data to the ensemble. The vectors to be added are obtained by back projecting the residuals of the observation misfits from the EnKF analysis step onto the state space. The back projection is done using an optimal interpolation (OI) scheme based on an estimated covariance of the subspace missing from the ensemble. In the experiments reported here, the OI uses a preselected stationary background covariance matrix, as in the hybrid EnKF–three-dimensional variational data assimilation (3DVAR) approach, but the resulting correction is included as a new ensemble member instead of being added to all existing ensemble members. The adaptive approach is tested with the Lorenz-96 model. The hybrid EnKF–3DVAR is used as a benchmark to evaluate the performance of the adaptive approach. Assimilation experiments suggest that the new adaptive scheme significantly improves the EnKF behavior when it suffers from small size ensembles and neglected model errors. It was further found to be competitive with the hybrid EnKF–3DVAR approach, depending on ensemble size and data coverage.
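
For context, a minimal stochastic EnKF analysis step in numpy (a synthetic toy problem; the paper's adaptive augmentation, which back-projects the residual via OI and appends it as a new ensemble member, is not reproduced here):

```python
import numpy as np

def enkf_analysis(X, y, H, R, rng):
    """Stochastic EnKF analysis: update each member with the Kalman
    gain built from the ensemble covariance and perturbed observations."""
    n, N = X.shape
    A = X - X.mean(axis=1, keepdims=True)
    Pf = A @ A.T / (N - 1)                              # ensemble covariance
    K = Pf @ H.T @ np.linalg.inv(H @ Pf @ H.T + R)      # Kalman gain
    Y = y[:, None] + rng.multivariate_normal(np.zeros(len(y)), R, size=N).T
    return X + K @ (Y - H @ X)

rng = np.random.default_rng(6)
n, N = 8, 50
x_true = rng.normal(size=n)
x_b = x_true + rng.normal(scale=2.0, size=n)            # biased background
X = x_b[:, None] + rng.normal(size=(n, N))              # prior ensemble
H = np.eye(n)[:4]                                       # observe 4 of 8 states
R = 0.01 * np.eye(4)
y = H @ x_true + rng.multivariate_normal(np.zeros(4), R)
Xa = enkf_analysis(X, y, H, R, rng)                     # pulled toward the data
```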

  17. A local particle filter for high dimensional geophysical systems

    S. G. Penny; Miyoshi, T.

    2015-01-01

    A local particle filter (LPF) is introduced that outperforms traditional ensemble Kalman filters in highly nonlinear/non-Gaussian scenarios, both in accuracy and computational cost. The standard Sampling Importance Resampling (SIR) particle filter is augmented with an observation-space localization approach, for which an independent analysis is computed locally at each gridpoint. The deterministic resampling approach of Kitagawa is adapted for application locally and combine...
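
    The deterministic resampling step can be sketched as follows; the scalar state and Gaussian likelihood are illustrative, not from the paper.

```python
import numpy as np

def systematic_resample(weights, rng):
    """Deterministic (Kitagawa-style systematic) resampling: one random
    offset, then evenly spaced points through the cumulative weights."""
    n = len(weights)
    positions = (rng.random() + np.arange(n)) / n
    cw = np.cumsum(weights)
    cw[-1] = 1.0                     # guard against floating-point round-off
    return np.searchsorted(cw, positions)

# one SIR step for a scalar state with a Gaussian likelihood
rng = np.random.default_rng(2)
particles = rng.normal(0.0, 2.0, size=500)      # prior ensemble
obs = 1.5
w = np.exp(-0.5 * (obs - particles) ** 2)       # likelihood weights (unit obs noise)
w /= w.sum()
resampled = particles[systematic_resample(w, rng)]
```

    Because only one uniform number is drawn, this resampler adds far less Monte Carlo noise than multinomial resampling, which matters when it is applied independently at every gridpoint.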

  18. Ceramic filters for bulk inoculation of nickel alloy castings

    F. Binczyk; J. Śleziona; P. Gradoń

    2011-01-01

    The work includes the results of research on the production technology of ceramic filters which, besides the traditional filtering function, also play the role of an inoculant modifying the macrostructure of cast nickel alloys. To play this additional role, filters should demonstrate sufficient compression strength and ensure a proper flow rate of the liquid alloy. The role of the inoculant is played by cobalt aluminate introduced into the composition of the external coating in an amount from 5 to 10 wt.%. The ...

  19. Single-periodic-film optical bandpass filter

    Niraula, Manoj; Magnusson, Robert

    2015-01-01

    Resonant periodic surfaces and films enable new functionalities with wide applicability in practical optical systems. Their material sparsity, ease of fabrication, and minimal interface count provide environmental and thermal stability and robustness in applications. Here we report an experimental bandpass filter fashioned in a single patterned layer on a substrate. Its performance corresponds to bandpass filters requiring perhaps 30 traditional thin-film layers as shown by an example. We demonstrate an ultra-narrow, high-efficiency bandpass filter with extremely wide, flat, and low sidebands. This class of devices is designed with rigorous solutions of the Maxwell equations while engaging the physical principles of resonant waveguide gratings. The proposed technology is integration-friendly and opens doors for further development in various disciplines and spectral regions where thin-film solutions are traditionally applied.

  20. An iterative ensemble Kalman filter for reservoir engineering applications

    Krymskaya, M.V.; Hanea, R.G.; Verlaan, M.

    2009-01-01

    The study has been focused on examining the usage and the applicability of ensemble Kalman filtering techniques to the history matching procedures. The ensemble Kalman filter (EnKF) is often applied nowadays to solving such a problem. Meanwhile, traditional EnKF requires assumption of the distributi

  2. Effect of different thickness of material filter on Tc-99m spectra and performance parameters of gamma camera

    Nazifah, A.; Norhanna, S.; Shah, S. I.; Zakaria, A.

    2014-11-01

    This study aimed to investigate the effects of the material filter technique on Tc-99m spectra and on the performance parameters of a Philips ADAC Forte dual-head gamma camera. The thickness of the material filter was selected on the basis of the percentage attenuation of various gamma-ray energies by different thicknesses of zinc. A cylindrical source tank of a NEMA single photon emission computed tomography (SPECT) Triple Line Source Phantom, filled with water into which the Tc-99m radionuclide was injected, was used for spectra, uniformity and sensitivity measurements. A vinyl plastic tube was used as a line source for spatial resolution. Images for uniformity were reconstructed by the filtered back projection method. A Butterworth filter of order 5 and cut-off frequency 0.35 cycles/cm was selected. Chang's attenuation correction method was applied with a linear attenuation coefficient of 0.13/cm. The material filter decreased the count rate in the Compton region of the Tc-99m energy spectrum, and also in the photopeak region. Spatial resolution was improved; however, the uniformity of the tomographic image was equivocal, and system volume sensitivity was reduced by the material filter. Since the material filter improved the system's spatial resolution, the technique may be used in phantom studies to improve image quality.

  3. HEPA Filter Vulnerability Assessment

    GUSTAVSON, R.D.

    2000-05-11

    This assessment of High Efficiency Particulate Air (HEPA) filter vulnerability was requested by the USDOE Office of River Protection (ORP) to satisfy a DOE-HQ directive to evaluate the effect of filter degradation on the facility authorization basis assumptions. Within the scope of this assessment are ventilation system HEPA filters that are classified as Safety-Class (SC) or Safety-Significant (SS) components that perform an accident mitigation function. The objective of the assessment is to verify whether HEPA filters that perform a safety function during an accident are likely to perform as intended to limit release of hazardous or radioactive materials, considering factors that could degrade the filters. Filter degradation factors considered include aging, wetting of filters, exposure to high temperature, exposure to corrosive or reactive chemicals, and exposure to radiation. Screening and evaluation criteria were developed by a site-wide group of HVAC engineers and HEPA filter experts from published empirical data. For River Protection Project (RPP) filters, the only degradation factor that exceeded the screening threshold was for filter aging. Subsequent evaluation of the effect of filter aging on the filter strength was conducted, and the results were compared with required performance to meet the conditions assumed in the RPP Authorization Basis (AB). It was found that the reduction in filter strength due to aging does not affect the filter performance requirements as specified in the AB. A portion of the HEPA filter vulnerability assessment is being conducted by the ORP and is not part of the scope of this study. The ORP is conducting an assessment of the existing policies and programs relating to maintenance, testing, and change-out of HEPA filters used for SC/SS service. This document presents the results of a HEPA filter vulnerability assessment conducted for the River Protection Project as requested by the DOE Office of River Protection.

  5. HEPA filter monitoring program

    Kirchner, K. N.; Johnson, C. M.; Aiken, W. F.; Lucerna, J. J.; Barnett, R. L.; Jensen, R. T.

    1986-07-01

    The testing and replacement of HEPA filters, widely used in the nuclear industry to purify process air, are costly and labor-intensive. Current methods of testing filter performance, such as differential pressure measurement and scanning air monitoring, allow determination of overall filter performance but preclude detection of incipient filter failure such as small holes in the filters. Using current technology, a continual in-situ monitoring system was designed which provides three major improvements over current methods of filter testing and replacement. The improvements include: cost savings by reducing the number of intact filters which are currently being replaced unnecessarily; more accurate and quantitative measurement of filter performance; and reduced personnel exposure to a radioactive environment by automatically performing most testing operations.

  6. Novel Backup Filter Device for Candle Filters

    Bishop, B.; Goldsmith, R.; Dunham, G.; Henderson, A.

    2002-09-18

    The currently preferred means of particulate removal from process or combustion gas generated by advanced coal-based power production processes is filtration with candle filters. However, candle filters have not shown the requisite reliability to be commercially viable for hot gas clean up for either integrated gasifier combined cycle (IGCC) or pressurized fluid bed combustion (PFBC) processes. Even a single candle failure can lead to unacceptable ash breakthrough, which can result in (a) damage to highly sensitive and expensive downstream equipment, (b) unacceptably low system on-stream factor, and (c) unplanned outages. The U.S. Department of Energy (DOE) has recognized the need to have fail-safe devices installed within or downstream from candle filters. In addition to CeraMem, DOE has contracted with Siemens-Westinghouse, the Energy & Environmental Research Center (EERC) at the University of North Dakota, and the Southern Research Institute (SRI) to develop novel fail-safe devices. Siemens-Westinghouse is evaluating honeycomb-based filter devices on the clean-side of the candle filter that can operate up to 870 C. The EERC is developing a highly porous ceramic disk with a sticky yet temperature-stable coating that will trap dust in the event of filter failure. SRI is developing the Full-Flow Mechanical Safeguard Device that provides a positive seal for the candle filter. Operation of the SRI device is triggered by the higher-than-normal gas flow from a broken candle. The CeraMem approach is similar to that of Siemens-Westinghouse and involves the development of honeycomb-based filters that operate on the clean-side of a candle filter. The overall objective of this project is to fabricate and test silicon carbide-based honeycomb failsafe filters for protection of downstream equipment in advanced coal conversion processes. The fail-safe filter, installed directly downstream of a candle filter, should have the capability for stopping essentially all particulate

  7. MST Filterability Tests

    Poirier, M. R. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); Burket, P. R. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); Duignan, M. R. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL)

    2015-03-12

    The Savannah River Site (SRS) is currently treating radioactive liquid waste with the Actinide Removal Process (ARP) and the Modular Caustic Side Solvent Extraction Unit (MCU). The low filter flux through the ARP has limited the rate at which radioactive liquid waste can be treated; recent filter flux has averaged approximately 5 gallons per minute (gpm). Salt Batch 6 has had a lower processing rate and required frequent filter cleaning. Savannah River Remediation (SRR) has a desire to understand the causes of the low filter flux and to increase ARP/MCU throughput. In addition, at the time the testing started, SRR was assessing the impact of replacing the 0.1 micron filter with a 0.5 micron filter. This report describes testing of MST filterability to investigate the impact of filter pore size and MST particle size on filter flux, and testing of filter enhancers in an attempt to increase filter flux. The authors constructed a laboratory-scale crossflow filter apparatus with two crossflow filters operating in parallel: one a 0.1 micron Mott sintered SS filter and the other a 0.5 micron Mott sintered SS filter. The authors also constructed a dead-end filtration apparatus to conduct screening tests with potential filter aids and body feeds, referred to as filter enhancers. The original baseline for ARP was 5.6 M sodium salt solution with a free hydroxide concentration of approximately 1.7 M. ARP has been operating with a sodium concentration of approximately 6.4 M and a free hydroxide concentration of approximately 2.5 M. SRNL conducted tests varying the concentration of sodium and free hydroxide to determine whether those changes had a significant effect on filter flux. The feed slurries for the MST filterability tests were composed of simple salts (NaOH, NaNO2, and NaNO3) and MST (0.2 – 4.8 g/L). The feed slurry for the filter enhancer tests contained simulated Salt Batch 6 supernate, MST, and filter enhancers.

  8. Family Customs and Traditions.

    MacGregor, Cynthia

    Recognizing the importance of maintaining open communication with immediate and extended family members, this book provides a compilation of ideas for family traditions and customs that are grounded in compassion and human kindness. The traditions were gathered from families in the United States and Canada who responded to advertisements in…

  9. Tradition og Modernisme

    Bay, Carl Erik

    ... the article "Tradition og Modernisme" [presents] his view of conventional and modern forms in town planning, architecture and design. In this anthology, "Tradition og Modernisme" is reprinted and five researchers analyse it from different angles. It is put into perspective in relation to PH's philosophical point of departure in...

  10. Oriented Fiber Filter Media

    Bharadwaj, R; A. Patel, S. Chokdeepanich, Ph.D.; G.G. Chase, Ph.D.

    2008-01-01

    Coalescing filters are widely used throughout industry, and improved performance will reduce droplet emissions and operating costs. Experimental observations show that the orientation of micro-fibers in filter media affects the permeability and the separation efficiency of the media. In this work two methods are used to align the fibers and thereby alter the filter structure. The results show that axially aligned fiber media improve the quality factor on the order of 20% and cutting media on an angle from a t...

  11. Introduction to Kalman Filtering

    Alazard, Daniel

    2005-01-01

    This document is an introduction to Kalman optimal filtering applied to linear systems. It is assumed that the reader is already familiar with linear servo-loop theory, frequency-domain filtering (continuous and discrete-time) and the state-space approach to representing linear systems. Generally, filtering consists in estimating useful information (a signal) from a measurement (of this information) perturbed by noise. Frequency-domain filtering assumes that a frequency-domain separation exists between...
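
    The estimation idea can be made concrete with a minimal scalar Kalman filter; the random-walk signal model and the noise variances below are illustrative, not taken from the course notes.

```python
import numpy as np

def kalman_1d(zs, x0=0.0, p0=1.0, q=1e-4, r=0.25):
    """Scalar Kalman filter for a (nearly) constant signal observed in noise.
    q: process-noise variance, r: measurement-noise variance."""
    x, p, est = x0, p0, []
    for z in zs:
        p += q                 # predict under a random-walk signal model
        k = p / (p + r)        # Kalman gain: balance of prior vs. measurement
        x += k * (z - x)       # update with the innovation z - x
        p *= 1.0 - k
        est.append(x)
    return np.array(est)

rng = np.random.default_rng(3)
zs = 1.0 + rng.normal(0.0, 0.5, size=200)   # noisy measurements of a constant
est = kalman_1d(zs)
```

    The gain k shrinks as the error variance p shrinks, so the estimate relies progressively more on its own prediction and less on each new noisy measurement.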

  12. Filtering in Finance.

    Lautier, Delphine; Javaheri, Alireza; Galli, Alain

    2003-01-01

    In this article we present an introduction to various filtering algorithms and some of their applications to the world of Quantitative Finance. We shall first mention the fundamental case of Gaussian noises, where we obtain the well-known Kalman Filter. Because of common nonlinearities, we will then discuss the Extended Kalman Filter.

  13. Adaptive acoustooptic filter

    Psaltis, Demetri; Hong, John

    1984-01-01

    A new adaptive filter utilizing acoustooptic devices in a space integrating architecture is described. Two configurations are presented; one of them, suitable for signal estimation, is shown to approximate the Wiener filter, while the other, suitable for detection, is shown to approximate the matched filter.

  14. HEPA filter encapsulation

    Gates-Anderson, Dianne D.; Kidd, Scott D.; Bowers, John S.; Attebery, Ronald W.

    2003-01-01

    A low viscosity resin is delivered into a spent HEPA filter or other waste. The resin is introduced into the filter or other waste using a vacuum to assist in the mass transfer of the resin through the filter media or other waste.

  15. Filter service system

    Sellers, Cheryl L.; Nordyke, Daniel S.; Crandell, Richard A.; Tomlins, Gregory; Fei, Dong; Panov, Alexander; Lane, William H.; Habeger, Craig F.

    2008-12-09

    According to an exemplary embodiment of the present disclosure, a system for removing matter from a filtering device includes a gas pressurization assembly. An element of the assembly is removably attachable to a first orifice of the filtering device. The system also includes a vacuum source fluidly connected to a second orifice of the filtering device.

  16. Modified Adaptive Weighted Averaging Filtering Algorithm for Noisy Image Sequences

    LI Weifeng; YU Daoyin; CHEN Xiaodong

    2007-01-01

    In order to avoid the influence of noise variance on the filtering performances, a modified adaptive weighted averaging (MAWA) filtering algorithm is proposed for noisy image sequences. Based upon adaptive weighted averaging pixel values in consecutive frames, this algorithm achieves the filtering goal by assigning smaller weights to the pixels with inappropriate estimated motion trajectory for noise. It only utilizes the intensity of pixels to suppress noise and accordingly is independent of noise variance. To evaluate the performance of the proposed filtering algorithm, its mean square error and percentage of preserved edge points were compared with those of traditional adaptive weighted averaging and non-adaptive mean filtering algorithms under different noise variances. Relevant results show that the MAWA filtering algorithm can preserve image structures and edges under motion after attenuating noise, and thus may be used in image sequence filtering.
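
    A minimal sketch of this kind of intensity-driven temporal weighting follows; the weight function and the parameter alpha are illustrative choices, not the paper's exact formulas.

```python
import numpy as np

def mawa_frame(frames, ref_idx, alpha=10.0):
    """Weighted temporal average of an image sequence: pixels whose intensity
    departs from the reference frame (a proxy for a wrong motion trajectory)
    get smaller weights. Only intensities are used, so the weights do not
    depend on an estimate of the noise variance."""
    stack = np.stack(frames).astype(float)
    diff = np.abs(stack - stack[ref_idx])     # per-pixel temporal deviation
    w = 1.0 / (1.0 + alpha * diff ** 2)       # larger deviation -> smaller weight
    return (w * stack).sum(axis=0) / w.sum(axis=0)

# static scene plus independent noise in 5 consecutive frames
rng = np.random.default_rng(4)
clean = np.tile(np.linspace(0.0, 1.0, 32), (32, 1))
frames = [clean + rng.normal(0.0, 0.1, clean.shape) for _ in range(5)]
denoised = mawa_frame(frames, ref_idx=2)
```

    Where the scene is static the weights stay near-uniform and noise averages out; where a pixel's trajectory is wrong its weight collapses, which is what preserves moving edges.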

  17. Improved Passive-Damped LCL Filter to Enhance Stability in Grid-Connected Voltage-Source Converters

    Beres, Remus Narcis; Wang, Xiongfei; Blaabjerg, Frede;

    2015-01-01

    This paper proposes an improved passive-damped LCL filter to be used as interface between the grid-connected voltage-source converters and the utility grid. The proposed filter replaces the LCL filter capacitor with a traditional C-type filter with the resonant circuit tuned in such a way that sw...

  19. Filter assessment applied to analytical reconstruction for industrial third-generation tomography

    Velo, Alexandre F.; Martins, Joao F.T.; Oliveira, Adriano S.; Carvalho, Diego V.S.; Faria, Fernando S.; Hamada, Margarida M.; Mesquita, Carlos H., E-mail: afvelo@usp.br [Instituto de Pesquisas Energeticas e Nucleares (IPEN/CNEN-SP), Sao Paulo, SP (Brazil)

    2015-07-01

    Multiphase systems are structures that contain a mixture of solids, liquids and gases inside a chemical reactor or pipes in a dynamic process. These systems are found in the chemical, food, pharmaceutical and petrochemical industries. The gamma-ray computed tomography (CT) system has been applied to visualize the distribution of multiphase systems without interrupting production, and CT systems have been used to improve the design, operation and troubleshooting of industrial processes. Computed tomography for multiphase processes is being developed at several laboratories. It is well known that scanning systems demand high processing time and offer a limited set of data projections and views to obtain an image; because of this, image quality depends on the number of projections, the number of detectors, the acquisition time and the reconstruction time. A phantom containing air, iron and aluminum was used in the third-generation industrial tomograph with a 662 keV (137Cs) radioactive source, and the filtered back projection algorithm was applied to reconstruct the images. Since an efficient tomograph depends on image quality, the objective of this research was to apply different types of filters in the analytical algorithm and compare them using the root mean squared error (RMSE) figure of merit; the filter presenting the lowest RMSE has the best quality. In this research five types of filters were used: Ram-Lak, Shepp-Logan, Cosine, Hamming and Hann. All filters presented low RMSE values, meaning a low standard deviation relative to the mass absorption coefficient; however, the Hann filter presented the best RMSE and CNR compared to the others. (author)
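
    The five filters compared here can be written down directly in the frequency domain as windowed ramps. The grid size and the window definitions below follow common conventions and are meant as a sketch, not as the authors' implementation.

```python
import numpy as np

def fbp_filter(n, name):
    """Frequency response (on n FFT samples) of a windowed ramp filter."""
    f = np.fft.fftfreq(n)            # digital frequency in cycles/sample
    ramp = np.abs(f)
    fmax = 0.5                       # Nyquist frequency
    window = {
        "ram-lak":     np.ones(n),
        "shepp-logan": np.sinc(f / (2 * fmax)),
        "cosine":      np.cos(np.pi * f / (2 * fmax)),
        "hamming":     0.54 + 0.46 * np.cos(np.pi * f / fmax),
        "hann":        0.5 * (1.0 + np.cos(np.pi * f / fmax)),
    }[name]
    return ramp * window

n = 256
names = ["ram-lak", "shepp-logan", "cosine", "hamming", "hann"]
responses = {name: fbp_filter(n, name) for name in names}
```

    The pure ramp (Ram-Lak) keeps full gain at the Nyquist frequency and therefore amplifies high-frequency noise the most; the Hann window rolls off to zero there, which is consistent with it giving the lowest RMSE on noisy projections.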

  20. Bias aware Kalman filters

    Drecourt, J.-P.; Madsen, H.; Rosbjerg, Dan

    2006-01-01

    This paper reviews two different approaches that have been proposed to tackle the problems of model bias with the Kalman filter: the use of a colored noise model and the implementation of a separate bias filter. Both filters are implemented with and without feedback of the bias into the model state … illustrated on a simple one-dimensional groundwater problem. The results show that the presented filters outperform the standard Kalman filter and that the implementations with bias feedback work in more general conditions than the implementations without feedback. © 2005 Elsevier Ltd. All rights reserved.

  1. Ceramic fiber filter technology

    Holmes, B.L.; Janney, M.A.

    1996-06-01

    Fibrous filters have been used for centuries to protect individuals from dust, disease, smoke, and other gases or particulates. In the 1970s and 1980s ceramic filters were developed for filtration of hot exhaust gases from diesel engines. Tubular, or candle, filters have been made to remove particles from gases in pressurized fluidized-bed combustion and gasification-combined-cycle power plants. Very efficient filtration is necessary in power plants to protect the turbine blades. The limited lifespan of ceramic candle filters has been a major obstacle in their development. The present work is focused on forming fibrous ceramic filters using a papermaking technique. These filters are highly porous and therefore very lightweight. The papermaking process consists of filtering a slurry of ceramic fibers through a steel screen to form paper. Papermaking and the selection of materials will be discussed, as well as preliminary results describing the geometry of papers and relative strengths.

  2. Changing ventilation filters

    A filter changing unit has a door which interlocks with the door of a filter chamber so as to prevent contamination of the outer surfaces of the doors by radioactive material collected on the filter element and a movable support which enables a filter chamber thereonto to be stored within the unit in such a way that the doors of the unit and the filter chamber can be replaced. The door pivots and interlocks with another door by means of a bolt, a seal around the periphery lip of the first door engages the periphery of the second door to seal the gap. A support pivots into a lower filter element storage position. Inspection windows and glove ports are provided. The unit is releasably connected to the filter chamber by bolts engaging in a flange provided around an opening. (author)

  3. Traditional Urban Aboriginal Religion

    Kristina Everett

    2009-01-01

    This paper represents a group of Aboriginal people who claim traditional Aboriginal ownership of a large Australian metropolis. They have struggled for at least the last 25 to 30 years to articulate and represent their contemporary group identity to the wider Australian society, which very often does not take their expressions seriously. This is largely because dominant discourses claim that 'authentic' Aboriginal culture only exists in remote, pristine areas far away from western society and that urban Aboriginal traditions, especially urban religious traditions, are today defunct. This paper is an account of one occasion on which such traditional Aboriginal religious practice was performed before the eyes of a group of tourists.

  4. Compact planar microwave blocking filters

    U-Yen, Kongpop (Inventor); Wollack, Edward J. (Inventor)

    2012-01-01

    A compact planar microwave blocking filter includes a dielectric substrate and a plurality of filter unit elements disposed on the substrate. The filter unit elements are interconnected in a symmetrical series cascade with filter unit elements being organized in the series based on physical size. In the filter, a first filter unit element of the plurality of filter unit elements includes a low impedance open-ended line configured to reduce the shunt capacitance of the filter.

  5. Frequency weighting filter design for automotive ride comfort evaluation

    Du, Feng

    2016-04-01

    Few studies give guidance on designing weighting filters according to the frequency weighting factors, and the additional evaluation method of automotive ride comfort is not put to good use in some countries. Based on the regularities of the weighting factors, a method is proposed and vertical and horizontal weighting filters are developed. The whole frequency range is divided several times into two parts, each with its own regularity. For each division, a parallel filter constituted by a low-pass and a high-pass filter with the same cutoff frequency and quality factor is utilized to realize the section factors; cascading these parallel filters yields the overall factors. These filters are of high order, but low-order filters are preferred in some applications. The bilinear transformation method and the least P-norm optimal infinite impulse response (IIR) filter design method are employed to develop low-order filters that approximate the weightings in the standard. In addition, with the window method, a linear-phase finite impulse response (FIR) filter is designed to keep the signal from distorting and to obtain the staircase weighting. For the same case, the traditional method produces a weighted root mean square (r.m.s.) acceleration of 0.3307 m·s⁻² and the filtering method gives 0.3119 m·s⁻² r.m.s. The fourth-order filter approximating the vertical weighting obtains 0.3139 m·s⁻² r.m.s. Crest factors of the acceleration signal weighted by the weighting filter and by the fourth-order filter are 3.0027 and 3.0111, respectively. This paper proposes several methods to design frequency weighting filters for automotive ride comfort evaluation, and the developed weighting filters are effective.
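
    The parallel low-pass/high-pass building block can be sketched with first-order digital sections obtained by the bilinear transform (the paper's sections, cutoffs and gains differ; the values below are illustrative). With equal gains the two branches sum exactly to the input, so it is the unequal section gains that shape the weighting.

```python
import numpy as np

def bilinear_first_order(wc, fs):
    """First-order analog low-/high-pass sections with common cutoff wc
    (rad/s), discretized via the bilinear transform s -> 2*fs*(1-z^-1)/(1+z^-1)."""
    k = 2.0 * fs
    a1 = (wc - k) / (wc + k)                   # shared denominator coefficient
    b_lp = np.array([wc, wc]) / (wc + k)
    b_hp = np.array([k, -k]) / (wc + k)
    return b_lp, b_hp, np.array([1.0, a1])

def iir_filter(b, a, x):
    # direct-form first-order IIR: y[n] = b0*x[n] + b1*x[n-1] - a1*y[n-1]
    y = np.zeros(len(x))
    for n in range(len(x)):
        y[n] = b[0] * x[n]
        if n > 0:
            y[n] += b[1] * x[n - 1] - a[1] * y[n - 1]
    return y

fs, fc = 500.0, 8.0                            # sample rate, cutoff (Hz)
b_lp, b_hp, a = bilinear_first_order(2 * np.pi * fc, fs)
t = np.arange(2048) / fs
low_tone = np.sin(2 * np.pi * 2.0 * t)         # below the cutoff
high_tone = np.sin(2 * np.pi * 60.0 * t)       # above the cutoff
mix = low_tone + high_tone
lp_out = iir_filter(b_lp, a, mix)
hp_out = iir_filter(b_hp, a, mix)
weighted = 1.0 * lp_out + 0.4 * hp_out         # illustrative section gains
```

    Because both sections share the same denominator, lp_out + hp_out reproduces the input exactly; scaling the two branches differently then boosts or attenuates the bands on either side of the common cutoff.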

  6. KASTAMONU TRADITIONAL WOMEN CLOTHES

    E. Elhan ÖZUS; ERDEN, Filiz; TUFAN, Melek

    2015-01-01

    Clothing is a unique dressing style of a community, a period or a profession. Clothing embodies social status and the principle of differentiation rather than fashion. In this context, each society has created a clothing style in line with its own customs, traditions and social structure. One of the features separating societies from each other and indicating their cultural and social classes is their clothing style. As is known, traditional Turkish clothes reflecting...

  7. Survey of Sparse Adaptive Filters for Acoustic Echo Cancellation

    Krishna Samalla

    2013-01-01

    This paper reviews the developments of the last decade in adaptive methods for sparse adaptive filters used to identify sparse impulse responses in both network and acoustic echo cancellation. A variety of different architectures and novel training algorithms have been proposed in the literature. At present, most work in echo cancellation relies on using more than one method. Sparse adaptive filters take advantage of each method and show good improvement in sparseness-measure performance. This survey gives an overview of existing sparse adaptive filter mechanisms and discusses their advantages over the traditional adaptive filters developed for echo cancellation.
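
    A representative sparse adaptive filter is the proportionate NLMS (PNLMS), which gives large-magnitude taps proportionately larger step sizes. The sketch below identifies a sparse echo path; the step size, activation floor and path length are illustrative parameters.

```python
import numpy as np

def pnlms(x, d, n_taps, mu=0.5, rho=0.01, delta=1e-2):
    """Proportionate NLMS: each tap's step size is scaled in proportion to
    its current magnitude, which speeds up convergence when the unknown
    impulse response is sparse (few large taps, many near-zero taps)."""
    w = np.zeros(n_taps)
    for n in range(n_taps, len(x)):
        u = x[n - n_taps + 1:n + 1][::-1]           # regressor, newest sample first
        e = d[n] - w @ u                             # a-priori error
        g = np.maximum(np.abs(w), rho * max(delta, np.abs(w).max()))
        g /= g.sum()                                 # proportionate gain per tap
        w = w + mu * e * (g * u) / (u @ (g * u) + delta)
    return w

# identify a sparse "echo path" from input/desired signal pairs
rng = np.random.default_rng(5)
h = np.zeros(64)
h[5], h[20] = 1.0, -0.5                              # two active taps
x = rng.standard_normal(4000)
d = np.convolve(x, h)[:len(x)] + rng.normal(0.0, 0.01, len(x))
w = pnlms(x, d, 64)
```

    With uniform gains (g constant) the update reduces to standard NLMS; the proportionate gains concentrate adaptation on the few active taps, which is where the convergence advantage on sparse paths comes from.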

  8. Miniature wideband filter based on coupled-line sections and quasi-lumped element resonator

    Zhurbenko, Vitaliy; Krozer, Viktor; Meincke, Peter

    2007-01-01

    A new design of a wideband bandpass filter is proposed, based on coupled-line sections and a quasi-lumped element resonator, the latter introducing two transmission zeros and suppressing a spurious response. The proposed filter demonstrates significantly improved characteristics in comparison with a traditional coupled-line filter and exhibits a very compact structure.

  9. Filter material charging apparatus for filter assembly for radioactive contaminants

    A filter charging apparatus for a filter assembly is described. The filter assembly includes a housing with at least one filter bed therein and the filter charging apparatus for adding filter material to the filter assembly includes a tank with an opening therein, the tank opening being disposed in flow communication with opposed first and second conduit means, the first conduit means being in flow communication with the filter assembly housing and the second conduit means being in flow communication with a blower means. Upon activation of the blower means, the blower means pneumatically conveys the filter material from the tank to the filter housing

  10. Evaluation of median filtering after reconstruction with maximum likelihood expectation maximization (ML-EM) by real space and frequency space

    Matsumoto, Keiichi; Fujita, Toru; Oogari, Koji [Kyoto Univ. (Japan). Hospital

    2002-05-01

    Maximum likelihood expectation maximization (ML-EM) image quality is sensitive to the number of iterations, because a large number of iterations leads to images with checkerboard noise. The use of median filtering in the reconstruction process allows both noise reduction and edge preservation. We examined the value of median filtering after reconstruction with ML-EM by comparing filtered back projection (FBP) with a ramp filter or ML-EM without filtering. SPECT images were obtained with a dual-head gamma camera. The acquisition time was changed from 10 to 200 (seconds/frame) to examine the effect of the count statistics on the quality of the reconstructed images. First, images were reconstructed with ML-EM by changing the number of iterations from 1 to 150 in each study. Additionally, median filtering was applied following reconstruction with ML-EM. The quality of the reconstructed images was evaluated in terms of normalized mean square error (NMSE) values and two-dimensional power spectrum analysis. Median filtering after reconstruction by the ML-EM method provided stable NMSE values even when the number of iterations was increased. The signal element of the image was close to the reference image for any repetition number of iterations. Median filtering after reconstruction with ML-EM was useful in reducing noise, with a similar resolution achieved by reconstruction with FBP and a ramp filter. Especially in images with poor count statistics, median filtering after reconstruction with ML-EM is effective as a simple, widely available method. (author)
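The evaluation pipeline above (reconstruct, median-filter, compare NMSE against a reference) can be sketched as follows; the synthetic phantom and impulse-noise model below are illustrative stand-ins for a SPECT reconstruction, and scipy's `median_filter` is assumed:

```python
import numpy as np
from scipy.ndimage import median_filter

def nmse(img, ref):
    """Normalized mean square error against a noise-free reference image."""
    return float(np.sum((img - ref) ** 2) / np.sum(ref ** 2))

rng = np.random.default_rng(0)
ref = np.zeros((64, 64))
ref[16:48, 16:48] = 1.0                    # simple hot-region phantom
noisy = ref.copy()
spikes = rng.random(ref.shape) < 0.05      # checkerboard-like impulse noise
noisy[spikes] = 2.0
smoothed = median_filter(noisy, size=3)    # post-reconstruction median filtering
```

The median filter removes the isolated spikes while leaving the edges of the hot region largely intact, which is why the NMSE stabilizes even as iteration count grows.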

  11. Generic Kalman Filter Software

    Lisano, Michael E., II; Crues, Edwin Z.

    2005-01-01

    The Generic Kalman Filter (GKF) software provides a standard basis for the development of application-specific Kalman-filter programs. Historically, Kalman filters have been implemented by customized programs that must be written, coded, and debugged anew for each unique application, then tested and tuned with simulated or actual measurement data. Total development times for typical Kalman-filter application programs have ranged from weeks to months. The GKF software can simplify the development process and reduce the development time by eliminating the need to re-create the fundamental implementation of the Kalman filter for each new application. The GKF software is written in the ANSI C programming language. It contains a generic Kalman-filter-development directory that, in turn, contains code for a generic Kalman-filter function; more specifically, it contains a generically designed and generically coded implementation of linear, linearized, and extended Kalman filtering algorithms, including algorithms for state- and covariance-update and -propagation functions. The mathematical theory that underlies the algorithms is well known and has been reported extensively in the open technical literature. Also contained in the directory are a header file that defines generic Kalman-filter data structures and prototype functions, and template versions of application-specific subfunction and calling navigation/estimation routine code and headers. Once the user has provided a calling routine and the required application-specific subfunctions, the application-specific Kalman-filter software can be compiled and executed immediately. During execution, the generic Kalman-filter function is called from a higher-level navigation or estimation routine that preprocesses measurement data and post-processes output data. The generic Kalman-filter function uses the aforementioned data structures and five implementation-specific subfunctions, which have been developed by the user on
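The linear predict/update cycle that such a generic Kalman-filter function implements can be sketched as follows (a minimal Python sketch of the standard equations, not the GKF C code; the constant-velocity model and noise covariances are illustrative):

```python
import numpy as np

def kf_predict(x, P, F, Q):
    """State and covariance propagation: x <- F x,  P <- F P F^T + Q."""
    return F @ x, F @ P @ F.T + Q

def kf_update(x, P, z, H, R):
    """Measurement update with gain K = P H^T (H P H^T + R)^-1."""
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ (z - H @ x)
    P = (np.eye(len(x)) - K @ H) @ P
    return x, P

# Example: 1-D constant-velocity tracking (position measured, velocity inferred)
F = np.array([[1.0, 1.0], [0.0, 1.0]])   # state transition (dt = 1)
H = np.array([[1.0, 0.0]])               # observe position only
Q = 0.01 * np.eye(2)                     # process noise covariance
R = np.array([[0.25]])                   # measurement noise covariance
x, P = np.zeros(2), 10.0 * np.eye(2)
for z in [1.0, 2.1, 2.9, 4.2]:           # noisy positions of a unit-velocity target
    x, P = kf_predict(x, P, F, Q)
    x, P = kf_update(x, P, np.array([z]), H, R)
```

A library like the one described wraps exactly these propagation and update steps behind generic data structures, leaving the model matrices to user-supplied subfunctions.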

  12. Conservative Noise Filters

    Mona M.Jamjoom

    2016-05-01

    Noisy training data have a huge negative impact on machine learning algorithms. Noise-filtering algorithms have been proposed to eliminate such noisy instances. In this work, we empirically show that the most popular noise-filtering algorithms have a large False Positive (FP) error rate: these noise filters mistakenly identify genuine instances as outliers and eliminate them. Therefore, we propose more conservative outlier identification criteria that improve the FP error rate and, thus, the performance of the noise filters. With the new filter, an instance is eliminated if and only if it is misclassified by a mutual decision of the Naïve Bayesian (NB) classifier and the original filtering criteria being used. The number of genuine instances that are incorrectly eliminated is reduced as a result, thereby improving the classification accuracy.
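The conservative criterion described above (eliminate an instance only if the base noise filter flags it AND a Naïve Bayes classifier trained on the remaining data also misclassifies it) can be sketched as follows; the hand-rolled Gaussian NB and the `base_flags` array are illustrative stand-ins for a real classifier and a real base filter:

```python
import numpy as np

def gaussian_nb_predict(X_tr, y_tr, X):
    """Minimal Gaussian Naive Bayes: one normal per class per feature."""
    classes = np.unique(y_tr)
    stats = [(c, X_tr[y_tr == c].mean(0), X_tr[y_tr == c].std(0) + 1e-9)
             for c in classes]
    preds = []
    for x in X:
        scores = [np.sum(-0.5 * ((x - m) / s) ** 2 - np.log(s))
                  for _, m, s in stats]
        preds.append(stats[int(np.argmax(scores))][0])
    return np.array(preds)

def conservative_filter(X, y, base_flags):
    """Keep an instance unless BOTH the base filter flags it AND the NB
    classifier (trained on the remaining data) misclassifies it."""
    keep = np.ones(len(y), dtype=bool)
    for i in np.where(base_flags)[0]:
        mask = np.arange(len(y)) != i
        pred = gaussian_nb_predict(X[mask], y[mask], X[i:i + 1])[0]
        if pred != y[i]:
            keep[i] = False
    return keep
```
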

  13. KASTAMONU TRADITIONAL WOMEN CLOTHES

    E.Elhan ÖZUS

    2015-08-01

    Clothing is a unique dressing style of a community, a period or a profession. In clothing there is a principle of social status and differentiation rather than fashion. In this context, the society created a clothing style in line with its own customs, traditions and social structure. One of the features separating societies from each other and indicating their cultural and social classes is the clothing style. As it is known, traditional Turkish clothes reflecting the characteristics of Turkish society are our most beautiful heritage from past to present. From this heritage there are several examples of women's clothes carried to the present. When these examples are examined, it is possible to see the taste, the way of understanding art, the joy and the lifestyle of the history. These garments are also documents outlining the taste and grace of Turkish people. In the present study, traditional Kastamonu women's clothing, which has an important place in the traditional cultural clothes of Anatolia, is investigated. The method of the present research is primarily defined as the examination of written sources. The study is completed with the observations and examinations made in Kastamonu. According to the findings of the study, traditional Kastamonu women's clothing is examined and adapted to today's clothing.

  14. Hybrid Filter Membrane

    Laicer, Castro; Rasimick, Brian; Green, Zachary

    2012-01-01

    Cabin environmental control is an important issue for a successful Moon mission. Due to the unique environment of the Moon, lunar dust control is one of the main problems that significantly diminishes the air quality inside spacecraft cabins. Therefore, this innovation was motivated by NASA's need to minimize the negative health impact that air-suspended lunar dust particles have on astronauts in spacecraft cabins. It is based on fabrication of a hybrid filter comprising nanofiber nonwoven layers coated on porous polymer membranes with uniform cylindrical pores. This design results in a high-efficiency gas particulate filter with low pressure drop and the ability to be easily regenerated to restore filtration performance. A hybrid filter was developed consisting of a porous membrane with uniform, micron-sized, cylindrical pore channels coated with a thin nanofiber layer. Compared to conventional filter media such as a high-efficiency particulate air (HEPA) filter, this filter is designed to provide high particle efficiency, low pressure drop, and the ability to be regenerated. These membranes have well-defined micron-sized pores and can be used independently as air filters with discreet particle size cut-off, or coated with nanofiber layers for filtration of ultrafine nanoscale particles. The filter consists of a thin design intended to facilitate filter regeneration by localized air pulsing. The main feature of this invention is the combination of a micro-engineered straight-pore membrane with nanofibers. The micro-engineered straight-pore membrane can be prepared with extremely high precision. Because the resulting membrane pores are straight and not tortuous like those found in conventional filters, the pressure drop across the filter is significantly reduced. The nanofiber layer is applied as a very thin coating to enhance filtration efficiency for fine nanoscale particles. Additionally, the thin nanofiber coating is designed to promote capture of

  15. Spot- Zombie Filtering System

    Arathy Rajagopal; B. Geethanjali; Arulprakash P

    2014-01-01

    A major security challenge on the Internet is the existence of a large number of compromised machines. Such machines have been increasingly used to launch various security attacks including spamming, spreading malware, DDoS, and identity theft. These compromised machines are called "Zombies". In general, e-mail applications and providers use spam filters to filter out spam messages. Spam filtering is a technique for discriminating genuine messages from spam messages. The attackers...

  16. Morphing ensemble Kalman filters

    Beezley, Jonathan D.; Mandel, Jan

    2008-01-01

    A new type of ensemble filter is proposed, which combines an ensemble Kalman filter (EnKF) with the ideas of morphing and registration from image processing. This results in filters suitable for non-linear problems whose solutions exhibit moving coherent features, such as thin interfaces in wildfire modelling. The ensemble members are represented as the composition of one common state with a spatial transformation, called registration mapping, plus a residual. A fully automatic registration m...

  17. Morphing Ensemble Kalman Filters

    Beezley, Jonathan D.; Mandel, Jan

    2007-01-01

    A new type of ensemble filter is proposed, which combines an ensemble Kalman filter (EnKF) with the ideas of morphing and registration from image processing. This results in filters suitable for nonlinear problems whose solutions exhibit moving coherent features, such as thin interfaces in wildfire modeling. The ensemble members are represented as the composition of one common state with a spatial transformation, called registration mapping, plus a residual. A fully automatic registration met...

  18. Kalman Filter Neuron Training

    Murase, Haruhiko; KOYAMA, Shuhei; HONAMI, Nobuo; Kuwabara, Takao

    1991-01-01

    An attempt at implementing the Kalman filter algorithm in the procedure for training a neural network was made and evaluated. The Kalman filter neuron training program (KNT) was coded, and the performance of the Kalman filter in KNT was compared to commonly used neuron training algorithms. The study revealed that KNT requires much less calculation time to accomplish neuron training than other commonly used algorithms do. KNT also gave a much smaller final error than any other algorithm tested in this study.

  19. Retrofitting fabric filters for clean stack emission

    The fly ash generated from New South Wales coals, which are predominantly low-sulphur coals, has been difficult to collect in traditional electrostatic precipitators. During the early 1970s, development work was undertaken on the use of fabric filters at some of the Commission's older power stations. The satisfactory performance of the plant at those power stations led to the selection of fabric filters for flue gas cleaning at the next two new power stations constructed by the Electricity Commission of New South Wales. On-going pilot plant testing has continued to indicate the satisfactory performance of enhanced designs of fabric filters of varying types, and the Commission has recently retrofitted pulse-cleaned fabric filters to 2 x 350 MW units at a further power station, with plans to retrofit similar plant to the remaining 2 x 350 MW units at that station. A contract has also been let for the retrofitting of pulse-cleaned fabric filters to 4 x 500 MW units at another power station in the Commission's system. The paper reviews the performance of the 6000 MW of plant operating with fabric filters. Fabric selection and fabric life form an important aspect of this review.

  20. Nanofiber Filters Eliminate Contaminants

    2009-01-01

    With support from Phase I and II SBIR funding from Johnson Space Center, Argonide Corporation of Sanford, Florida tested and developed its proprietary nanofiber water filter media. Capable of removing more than 99.99 percent of dangerous particles like bacteria, viruses, and parasites, the media was incorporated into the company's commercial NanoCeram water filter, an inductee into the Space Foundation's Space Technology Hall of Fame. In addition to its drinking water filters, Argonide now produces large-scale nanofiber filters used as part of the reverse osmosis process for industrial water purification.

  1. Filters in nuclear facilities

    The topics of the nine papers given include the behavior of HEPA filters during exposure to air flows of high humidity as well as of high differential pressure, the development of steel-fiber filters suitable for extreme operating conditions, and the occurrence of various radioactive iodine species in the exhaust air from boiling water reactors. In an introductory presentation the German view of the performance requirements to be met by filters in nuclear facilities as well as the present status of filter quality assurance are discussed. (orig.)

  2. Updating the OMERACT filter

    Wells, George; Beaton, Dorcas E; Tugwell, Peter; Boers, Maarten; Kirwan, John R; Bingham, Clifton O; Boonen, Annelies; Brooks, Peter; Conaghan, Philip G; D'Agostino, Maria-Antonietta; Dougados, Maxime; Furst, Daniel E; Gossec, Laure; Guillemin, Francis; Helliwell, Philip; Hewlett, Sarah; Kvien, Tore K; Landewé, Robert B; March, Lyn; Mease, Philip J; Østergaard, Mikkel; Simon, Lee; Singh, Jasvinder A; Strand, Vibeke; van der Heijde, Désirée M

    2014-01-01

    The "Discrimination" part of the OMERACT Filter asks whether a measure discriminates between situations that are of interest. "Feasibility" in the OMERACT Filter encompasses the practical considerations of using an instrument, including its ease of use, time to complete, monetary costs, and interpretability of the question(s) included in the instrument. Both the Discrimination and Feasibility parts of the filter have been helpful but were agreed on primarily by consensus of OMERACT participants rather than through explicit evidence-based guidelines. In Filter 2.0 we wanted to improve this definition...

  3. Oriented Fiber Filter Media

    R. Bharadwaj

    2008-06-01

    Coalescing filters are widely used throughout industry, and improved performance will reduce droplet emissions and operating costs. Experimental observations show that the orientation of microfibers in filter media affects the permeability and the separation efficiency of the media. In this work two methods are used to align the fibers and thereby alter the filter structure. The results show that axially aligned fiber media improve the quality factor on the order of 20%, and cutting media on an angle from a thick layered medium can improve performance by about 40%. The results also show that the improved performance is not monotonically correlated to the average fiber angle of the medium.

  4. Traditional Chinese Biotechnology

    Xu, Yan; Wang, Dong; Fan, Wen Lai; Mu, Xiao Qing; Chen, Jian

    The earliest industrial biotechnology originated in ancient China and developed into a vibrant industry in traditional Chinese liquor, rice wine, soy sauce, and vinegar. It is now a significant component of the Chinese economy valued annually at about 150 billion RMB. Although the production methods had existed and remained basically unchanged for centuries, modern developments in biotechnology and related fields in the last decades have greatly impacted on these industries and led to numerous technological innovations. In this chapter, the main biochemical processes and related technological innovations in traditional Chinese biotechnology are illustrated with recent advances in functional microbiology, microbial ecology, solid-state fermentation, enzymology, chemistry of impact flavor compounds, and improvements made to relevant traditional industrial facilities. Recent biotechnological advances in making Chinese liquor, rice wine, soy sauce, and vinegar are reviewed.

  5. Traditional Chinese biotechnology.

    Xu, Yan; Wang, Dong; Fan, Wen Lai; Mu, Xiao Qing; Chen, Jian

    2010-01-01

    The earliest industrial biotechnology originated in ancient China and developed into a vibrant industry in traditional Chinese liquor, rice wine, soy sauce, and vinegar. It is now a significant component of the Chinese economy valued annually at about 150 billion RMB. Although the production methods had existed and remained basically unchanged for centuries, modern developments in biotechnology and related fields in the last decades have greatly impacted on these industries and led to numerous technological innovations. In this chapter, the main biochemical processes and related technological innovations in traditional Chinese biotechnology are illustrated with recent advances in functional microbiology, microbial ecology, solid-state fermentation, enzymology, chemistry of impact flavor compounds, and improvements made to relevant traditional industrial facilities. Recent biotechnological advances in making Chinese liquor, rice wine, soy sauce, and vinegar are reviewed. PMID:19888561

  6. Family traditions and generations.

    Schneiderman, Gerald; Barrera, Maru

    2009-01-01

    Currently, traditional family values that have been passed down through generations appear to be at risk. This has significant implications for the stability and health of individuals, families, and communities. This article explores selected issues related to intergenerational transmission of family values and cultural beliefs, with particular reference to Western culture and values that are rooted in Jewish and Christian traditions. It also examines family values and parenting styles as they influence the developing perspective of children and the family's adaptation to a changing world. PMID:19752638

  7. FPGA Based Kalman Filter for Wireless Sensor Networks

    Vikrant Vij

    2011-01-01

    A Wireless Sensor Network (WSN) is a set of tiny and low-cost devices equipped with different kinds of sensors, a small microcontroller and a radio transceiver, typically powered by batteries. Target tracking is one of the most important applications of such a network. Traditionally, Kalman filtering (KF) and its derivatives are used for tracking a random signal. The Kalman filter is a linear optimal filtering approach; to address the problem when system dynamics become nonlinear, researchers developed sub-optimal extensions of the Kalman filter, two popular versions being the EKF (extended Kalman filter) and the UKF (unscented Kalman filter). The rapidly increasing popularity of WSNs has placed increased computational demands upon these systems, which can be met by FPGA-based design. FPGAs offer increased performance compared to microprocessors and increased flexibility compared to ASICs, while maintaining low power consumption.

  8. The Daala Directional Deringing Filter

    Valin, Jean-Marc

    2016-01-01

    This paper presents the deringing filter used in the Daala royalty-free video codec. The filter is based on a non-linear conditional replacement filter and is designed for vectorization efficiency. It takes into account the direction of edges and patterns being filtered. The filter works by identifying the direction of each block and then adaptively filtering along the identified direction. In a second pass, the blocks are also filtered in a different direction, with more conservative thresho...
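A one-dimensional conditional replacement filter, the building block named above, can be sketched as follows; this is a simplified illustration of the concept rather than the actual Daala deringing code, and the tap weights and threshold are illustrative:

```python
import numpy as np

def crf_1d(x, T, w=(1, 2, 3, 2, 1)):
    """1-D conditional replacement filter: a tap whose value differs from the
    center by more than T is replaced by the center value before averaging,
    so strong edges stay intact while small ripples (ringing) are smoothed."""
    x = np.asarray(x, float)
    w = np.asarray(w, float)
    half = len(w) // 2
    y = np.empty_like(x)
    for i in range(len(x)):
        acc = 0.0
        for k in range(-half, half + 1):
            j = min(max(i + k, 0), len(x) - 1)          # clamp at the borders
            v = x[j] if abs(x[j] - x[i]) <= T else x[i]  # conditional replacement
            acc += w[k + half] * v
        y[i] = acc / w.sum()
    return y
```

In the codec this filtering is applied along the detected dominant direction of each block, then more conservatively in a second direction.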

  9. Students’ Weakness Detective in Traditional Class

    Fatimah Altuhaifa

    2016-10-01

    In Artificial Intelligence in Education, across learning contexts and domains, it is difficult in a traditional classroom to find students' weaknesses during a lecture, owing to the number of students and because the instructor is busy explaining the lesson. Accordingly, choosing a teaching style that can improve students' talents or skills so that they perform better in their classes or professional life is not an easy task. This system detects the average of students' weaknesses and finds either a solution or a teaching style that can increase students' abilities and skills, by filtering the collected data and understanding the problem. After that, it provides a teaching style.

  10. Investigation of New Microstrip Bandpass Filter Based on Patch Resonator with Geometrical Fractal Slot

    Mezaal, Yaqeen S.; Eyyuboglu, Halil T.

    2016-01-01

    A compact dual-mode microstrip bandpass filter using a geometrical slot is presented in this paper. The adopted geometrical slot is based on the first iteration of the Cantor square fractal curve. This filter has the benefit of narrower and sharper frequency responses as compared to microstrip filters that use single-mode resonators and traditional dual-mode square patch resonators. The filter has been modeled and demonstrated using the Microwave Office EM simulator, designed at a resonant frequenc...

  11. Traditional versus shadow banking

    Bryan J. Noeth; Wolla, Scott A.

    2012-01-01

    Modern economies rely heavily on financial intermediaries to channel funds between borrowers and lenders. In the first edition of the Page One Economics Newsletter, the role of traditional banking is outlined and a parallel system—shadow banking—is explored.

  12. Traditional Chinese Medicine

    2010-01-01

    2010150 A prospective multicenter randomized double-blinded controlled clinical trial on effects of Tiantai No. 1 in treating mild cognitive impairment. WU Zhengzhi(吴正治),et al. Shenzhen Hosp,Southern Med Univ,Guangdong 518035.Chin J Integr Tradit & West Med 2010;30(3):255-258.

  13. Traditional Cherokee Food.

    Hendrix, Janey B.

    A collection for children and teachers of traditional Cherokee recipes emphasizes the art, rather than the science, of cooking. The hand-printed, illustrated format is designed to communicate the feeling of Cherokee history and culture and to encourage readers to collect and add family recipes. The cookbook could be used as a starting point for…

  14. Major Traditional Festivals

    2006-01-01

    Spring Festival is the most important and most celebrated Chinese traditional festival, and it is the only indigenous celebration with legal holidays. People have different opinions on the origin of the event. Many say it can be dated back to 4,000 years ago, when people sacrificed to their

  15. Traditional healers formalised?

    Van Niekerk, Jp

    2012-03-01

    Traditional healers are the first to be called for help when illness strikes the majority of South Africans. Their communities have faith in their ability to cure or alleviate conditions managed by doctors, and much more. A visit to such practitioners' websites (they are up with the latest advertising technology!) shows that they promise help with providing more power, love, security or money, protection from evil people and spirits, enhancing one's sex life with penis enlargement and vagina tightening spells, etc. Contemplating such claims, it is easy to be dismissive of traditional healers. But in this issue of the SAMJ Nompumelelo Mbatha and colleagues1 argue that the traditional healers' regulatory council, promised by an Act of Parliament, should be established, followed by (or preferably preceded by) formal recognition by employers of sick certificates issued by traditional healers. Can matters be so simply resolved? What does this mean for doctors and other formally recognised healthcare professionals, and how to respond to such claims and social pressures? PMID:22380886

  16. Tibetan traditional medicine

    2005-01-01

    Tibetan medicine companies in T.A.R can manufacture more than 360 Tibetan patent medicines. There are 18 Tibetan medicine factories in Tibet, and total out value exceeds 3 billion yuan. 24 kinds of Tibetan patent medicines have been incorporated into State Fundamental Medicine List, in which 14 Tibetan patent medicines are listed in national protected traditional medicine category.

  17. Making Tradition Healthy

    2007-11-01

    In this podcast, a Latina nutrition educator shows how a community worked with local farmers to grow produce traditionally enjoyed by Hispanic/Latinos.  Created: 11/1/2007 by National Diabetes Education Program (NDEP), a joint program of the Centers for Disease Control and Prevention and the National Institutes of Health.   Date Released: 11/10/2007.

  18. 3.TRADITIONAL CHINESE MEDICINE

    1992-01-01

    920220 Studies on plasma cortisol concentration and blood leukocyte content of glucocorticoid receptors in patients with asthenia-cold and asthenia-heat syndromes. ZHANG Guangyu (张广宇), XIE Zhufan (谢竹藩). Tradit & West

  19. Kalman filtering technique for reactivity measurement

    Measurement of reactivity and its on-line display is of great help in calibration of reactivity control and safety devices and in the planning of suitable actions during the reactor operation. In traditional approaches the reactivity is estimated from reactor period or by solving the inverse point kinetic equation. In this paper, an entirely new approach based on the Kalman filtering technique has been presented. The theory and design of the reactivity measuring instrument based on the approach has been explained. Its performance has been compared with traditional approaches by estimation of transient reactivity from flux variation data recorded in a research reactor. It is demonstrated that the Kalman filtering approach is superior to other methods from the viewpoints of accuracy, noise suppression, and robustness against uncertainties in the reactor parameters. (author). 1 fig

  20. A Robust Gaussian Filter Corresponding to the Transmission Characteristic of the Gaussian Filter

    A surface roughness profile of an object can be measured by extracting the mean line of the long-wavelength component from the primary profile and subtracting it from the primary profile. This mean line is usually computed by convolving the traditional Gaussian filter (GF) with the primary profile. However, if an outlier exists in the primary profile, the output of a Gaussian filter will be greatly affected by it. To solve the outlier problem, several robust Gaussian filter schemes have been proposed, but they suffer from a fatal problem: the mean line determined from measurement data containing no outliers does not agree with the mean line output by the Gaussian filter. To solve this, this paper proposes a new robust Gaussian filter based on a fast M-estimation method (FMGF), and its performance was experimentally clarified. As a result, if an outlier exists, the proposed method exhibits robust behavior; if no outlier exists, the output waveform, RMSE and transmission characteristic agree with those of the Gaussian filter.
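An M-estimation robust Gaussian profile filter of the kind discussed can be sketched as follows; the Tukey biweight, the MAD-based threshold, and the iteration count are illustrative choices, not necessarily those of the FMGF method:

```python
import numpy as np

def gaussian_mean_line(z, dx, lam):
    """Standard Gaussian profile filter: convolve the profile with the
    Gaussian weighting function of cutoff wavelength lam."""
    alpha = np.sqrt(np.log(2) / np.pi)
    x = (np.arange(len(z)) - len(z) // 2) * dx
    s = np.exp(-np.pi * (x / (alpha * lam)) ** 2) / (alpha * lam) * dx
    return np.convolve(z, s, mode="same")

def robust_gaussian_mean_line(z, dx, lam, iters=5):
    """M-estimation variant: iteratively down-weight large residuals
    (Tukey biweight), so an outlier barely deflects the mean line."""
    m = gaussian_mean_line(z, dx, lam)
    for _ in range(iters):
        r = z - m
        c = 4.4 * np.median(np.abs(r - np.median(r))) + 1e-12  # MAD-based scale
        w = np.where(np.abs(r) < c, (1 - (r / c) ** 2) ** 2, 0.0)
        # normalized (weighted) convolution: filter w*z and renormalize by filtered w
        m = gaussian_mean_line(w * z, dx, lam) / np.maximum(
            gaussian_mean_line(w, dx, lam), 1e-12)
    return m
```

With all weights equal to one the iteration reduces to the plain Gaussian filter, which is the agreement property the abstract asks for in the outlier-free case.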

  1. Multilevel ensemble Kalman filtering

    Hoel, Håkon; Law, Kody J. H.; Tempone, Raul

    2015-01-01

    This work embeds a multilevel Monte Carlo (MLMC) sampling strategy into the Monte Carlo step of the ensemble Kalman filter (EnKF), thereby yielding a multilevel ensemble Kalman filter (MLEnKF) which has provably superior asymptotic cost to a given accuracy level. The theoretical results are illustrated numerically.

  2. Randomized Filtering Algorithms

    Katriel, Irit; Van Hentenryck, Pascal

    2008-01-01

    Filtering every global constraint of a CSP to arc consistency at every search step can be costly, and solvers often compromise on either the level of consistency or the frequency at which arc consistency is enforced. In this paper we propose two randomized filtering schemes for dense instances...
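The baseline operation being optimized here, filtering a binary constraint network to arc consistency, is classically performed by AC-3; a minimal sketch (binary constraints only, one support-check function per directed arc):

```python
from collections import deque

def revise(domains, constraint, xi, xj):
    """Remove values of xi that have no supporting value in xj."""
    removed = False
    for a in list(domains[xi]):
        if not any(constraint(a, b) for b in domains[xj]):
            domains[xi].discard(a)
            removed = True
    return removed

def ac3(domains, arcs, cons):
    """Enforce arc consistency; returns False on a domain wipe-out."""
    queue = deque(arcs)
    while queue:
        xi, xj = queue.popleft()
        if revise(domains, cons[(xi, xj)], xi, xj):
            if not domains[xi]:
                return False
            for (xk, xl) in arcs:        # re-examine arcs pointing at xi
                if xl == xi and xk != xj:
                    queue.append((xk, xl))
    return True
```

The cost of repeating this at every search node is what motivates randomized or less frequent filtering.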

  3. Internet Filtering in China

    Zittrain, Jonathan L.

    2003-01-01

    We collected data on the methods, scope, and depth of selective barriers to Internet usage through networks in China. Tests conducted from May through November 2002 indicated at least four distinct and independently operable Internet filtering methods (Web server IP address, DNS server IP address, keyword, and DNS redirection), with a quantifiable leap in filtering sophistication beginning in September 2002.

  4. Filter Bank Fusion Frames

    Chebira, Amina; Mixon, Dustin G

    2010-01-01

    In this paper we characterize and construct novel oversampled filter banks implementing fusion frames. A fusion frame is a sequence of orthogonal projection operators whose sum can be inverted in a numerically stable way. When properly designed, fusion frames can provide redundant encodings of signals which are optimally robust against certain types of noise and erasures. However, up to this point, few implementable constructions of such frames were known; we show how to construct them using oversampled filter banks. In this work, we first provide polyphase domain characterizations of filter bank fusion frames. We then use these characterizations to construct filter bank fusion frame versions of discrete wavelet and Gabor transforms, emphasizing those specific finite impulse response filters whose frequency responses are well-behaved.

  5. Sub-micron filter

    Tepper, Frederick; Kaledin, Leonid

    2009-10-13

    Aluminum hydroxide fibers approximately 2 nanometers in diameter and with surface areas ranging from 200 to 650 m.sup.2/g have been found to be highly electropositive. When dispersed in water they are able to attach to and retain electronegative particles. When combined into a composite filter with other fibers or particles they can filter bacteria and nano size particulates such as viruses and colloidal particles at high flux through the filter. Such filters can be used for purification and sterilization of water, biological, medical and pharmaceutical fluids, and as a collector/concentrator for detection and assay of microbes and viruses. The alumina fibers are also capable of filtering sub-micron inorganic and metallic particles to produce ultra pure water. The fibers are suitable as a substrate for growth of cells. Macromolecules such as proteins may be separated from each other based on their electronegative charges.

  6. Experimental validation of a single shaped filter approach for CT using variable source-to-filter distance for examination of arbitrary object diameters

    The purpose of this study was to validate the use of a single shaped filter (SF) for computed tomography (CT) using variable source-to-filter distance (SFD) for the examination of different object diameters. A SF was designed by performing simulations with the purpose of achieving noise homogeneity in the reconstructed volume and dose reduction for arbitrary phantom diameters. This was accomplished by using a filter design method whose target is homogeneous detector noise and which also uses a correction factor for the filtered back projection process. According to simulation results, a single SF designed for one of the largest phantom diameters meets the requirements for all diameters when SFD can be adjusted. To validate these results, a SF made of aluminium alloy was manufactured. Measurements were performed on a CT scanner with polymethyl methacrylate (PMMA) phantoms of diameters from 40–100 mm. The filter was positioned at SFDs ranging from 97–168 mm depending on the phantom diameter. Image quality was evaluated for the reconstructed volume by assessing CT value accuracy, noise homogeneity, contrast-to-noise ratio weighted by dose (CNRD) and spatial resolution. Furthermore, scatter distribution was determined with the use of a beam-stop phantom. Dose was measured for a PMMA phantom with a diameter of 100 mm using a calibrated ionization chamber. The application of a single SF at variable SFD led to improved noise uniformity and dose reduction: noise inhomogeneity was reduced from 15% down to about 0%, and dose was reduced by about 37%. Furthermore, scatter dropped by about 32%, which led to reduced cupping artifacts and improved CT value accuracy. Spatial resolution and CNRD were not affected by the SF. By means of a single SF with variable SFD designed for CT, significant dose reduction can be achieved and image quality can be improved by reducing noise inhomogeneity as well as scatter-induced artifacts. (paper)

  7. PRESERVING A TRADITION

    2007-01-01

    COVER STORY The Chinese art of paper cutting has long been a popular pastime in the country's rural areas. For more than 1,000 years, farming families have used it as a method for decorating their homes, but the tradition has struggled for survival in recent years. In Yanchuan County in China's northwestern Shaanxi Province, however, the art form has experienced a revival thanks to the efforts of a local woman. Paper cutting master Gao Fenglian has invested her own money in establishing a paper cutting gallery in the region. The craft's growing popularity has also fuelled a new wave of people wanting to learn how to cut. More than 10,000 of the county's 200,000 residents are now skilled in the ancient craft, and its revival could serve as a model for the preservation of other Chinese traditions.

  8. Traditional Chinese Medicine

    2009-01-01

    2009013 Clinical observation on treatment of active rheumatoid arthritis with Chinese herbal medicine. SHENG Zhenghe (盛正和), et al. Dept TCM, 5th Affili Hosp, Guangxi Med Univ, Guangxi 545001. Chin J Integr Tradit West Med 2008;28(11):990-993. Objective To study the efficacy and safety of Chinese drugs for expelling evil-wind, removing dampness, promoting blood circulation and invigorating yin in treating active rheumatoid arthritis (RA).

  9. Distance and Traditional Education

    Liu, Yuliang

    2002-01-01

    This case study is designed to investigate how distance education technology affects an instructor's simultaneous teaching of the same course in instructional television (ITV) and traditional face-to-face formats. This study involved random observations of the instructor in a graduate course in both instructional television and face-to-face classrooms. In addition, an interview with the instructor was conducted to collect more data. This study has suggested that the instructor wh...

  10. 3.TRADITIONAL CHINESE MEDICINE

    1993-01-01

    930625 Clinical study of rotundium in treating atrial fibrillation. WANG Dajin, et al. Cardiovasc Dis Instit, Tongji Med Univ, Wuhan, 430022. Chin J Integr Tradit & West Med 1993;13(8):455–457. L-tetrahydropalmatine (Rotundium) is an alkaloid of Corydalis turtschaninovii. Some animal experiments had demonstrated that Rotundium had a good antiarrhythmic effect on blocking the calcium channel and that it was a class Ⅳ antiarrhythmic agent, similar to

  11. The tyranny of tradition.

    Gulati, L

    1999-01-01

    This paper narrates the cruelty enforced by tradition on the lives of women in India. It begins with the life of the author's great-grandmother Ponnamma, whose family was rigidly patriarchal and governed by Brahmin values. Here, women had very little say in the decisions men made, were forced into arranged marriages before puberty, were not sent to school, and were considered unimportant. This tradition lived on in the author's grandmother Seetha and in the life of her mother Saras. However, in the story of Saras, following the death of her husband, the family departed from rigid Brahmin tradition and orthodoxy. Her mother, unperturbed by the challenges she faced, consistently devised ways to cope and succeeded in a changing environment. Meaningless Brahminical rituals and prayers found no place in her life, which she approached with a cosmopolitan and humanitarian outlook. In essence, she shaped the lives of three daughters and a son, and all her grandchildren, making a success not only of her own life but of all the lives she touched. PMID:12322347

  12. Defueling filter test

    The Three Mile Island Unit 2 Reactor (TMI-2) has sustained core damage, creating a significant quantity of fine debris which can become suspended during the planned defueling operations and will have to be constantly removed to maintain water clarity and minimize radiation exposure. To accomplish these objectives, a Defueling Water Cleanup System (DWCS) has been designed. One of the primary components in the DWCS is a custom-designed filter canister using an all-stainless-steel filter medium. The full-scale filter canister is designed to remove suspended solids from 800 microns to 0.5 microns in size. Filter cartridges are fabricated into an element cluster to provide for a flow rate of greater than 100 gal/min. Babcock and Wilcox (B and W), under contract to GPU Nuclear Corporation, has evaluated two candidate DWCS filter concepts in a 1/100 scale proof-of-principle test program at B and W's Lynchburg Research Center. The filters were challenged with simulated solids suspensions of 1400 and 140 ppm in borated water (5000 ppm boron). Test data collected include solids loading, effluent turbidity, and differential pressure trends versus time. From the proof-of-principle test results, a full-scale filter canister design was generated

  13. Smoke and pollutant filtering device

    A smoke and pollutant filtering device comprising a mask having a filter composed of a series of contiguous, serial layers of filtering material. The filter consists of front and rear gas permeable covers, a first filter layer of pressed vegetable matter, a second filter layer comprising a layer of activated charcoal adjacent a layer of aqua filter floss, a third filter comprising a gas permeable cloth situated between layers of pressed vegetable matter, and a fourth filter layer comprising an aqua filter floss. The first through fourth filter layers are sandwiched between the front and rear gas permeable covers. The filtering device is stitched together and mounted within a fire-retardant hood shaped to fit over a human head. Elastic bands are included in the hood to maintain the hood snugly about the head when worn.

  14. An area efficient low noise 100 Hz low-pass filter

    Ølgaard, Christian; Sassene, Haoues; Perch-Nielsen, Ivan R.

    1996-01-01

    A technique based on scaling a filter's capacitor currents to improve the noise performance of low frequency continuous-time filters is presented. Two 100 Hz low-pass filters have been implemented: a traditional low pass filter (as reference), and a filter utilizing the above mentioned current...... when a class A/B biasing scheme is used in the current divider. Obtaining identical noise performance from the reference filter would require a 3.6 times larger filter capacitor. This would increase the reference filter's die area by 100%. Therefore, the current scaling technique allows filters with...... improved noise performance/dynamic range, given a fixed silicon area and a fixed power supply...
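    The 3.6-times-larger-capacitor figure quoted above follows directly from kT/C noise scaling: the RMS thermal noise voltage associated with a capacitor is sqrt(kT/C), so reducing the RMS noise by sqrt(3.6) ≈ 1.9 requires 3.6 times the capacitance. A minimal sketch of that arithmetic (the 100 pF reference value is hypothetical, not taken from the paper):

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def ktc_noise_vrms(c_farads, temp_k=300.0):
    """RMS thermal noise voltage associated with a capacitor: sqrt(kT/C)."""
    return math.sqrt(K_B * temp_k / c_farads)

c_ref = 100e-12               # hypothetical 100 pF reference capacitor
v_ref = ktc_noise_vrms(c_ref)
v_big = ktc_noise_vrms(3.6 * c_ref)

# A 3.6x larger capacitor lowers the RMS noise by sqrt(3.6) ~ 1.9x.
print(v_ref / v_big)  # ~1.897
```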

  15. Improved multilevel filters to enhance infrared small target

    Xiaoping Wang; Tianxu Zhang; Luxin Yan; Man Wang; Jiawei Wu

    2011-01-01

    We propose improved multilevel filters (IMLFs), which introduce the absolute value operation into the algorithmic framework of traditional multilevel filters (MLFs) to improve the robustness of infrared small target enhancement techniques under a complex infrared cluttered background. Compared with the widely used small target enhancement methods which only deal with bright targets, the proposed technique can enhance the infrared small target, whether it is bright or dark. Experimental results verify that the proposed technique is efficient and practical.
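    The absolute-value idea described above can be illustrated with a toy directional-mean filter. This is a hedged sketch: a max-mean style background estimate stands in for the actual MLF framework, only horizontal and vertical means are used, and all function names are my own:

```python
import numpy as np

def directional_means(img, L=5):
    """Mean along each row and each column at every pixel ('same' conv)."""
    k = np.ones(L) / L
    horiz = np.apply_along_axis(lambda r: np.convolve(r, k, 'same'), 1, img)
    vert = np.apply_along_axis(lambda c: np.convolve(c, k, 'same'), 0, img)
    return [horiz, vert]

def enhance_small_targets(img, L=5):
    """Max of directional means as the background estimate, then |residual|,
    so both bright and dark small targets are enhanced (the absolute-value
    step described in the abstract)."""
    bg = np.maximum.reduce(directional_means(img, L))
    return np.abs(img - bg)
```

On a flat background, a single bright or dark pixel produces the strongest response at the target location.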

  16. Particle Filtering: The Need for Speed

    Karlsson Rickard

    2010-01-01

    Full Text Available The particle filter (PF) has during the last decade been proposed for a wide range of localization and tracking applications. There is a general need in such embedded systems for a platform for efficient and scalable implementation of the PF. One such platform is the graphics processing unit (GPU), originally aimed at fast rendering of graphics. To achieve this, GPUs are equipped with a parallel architecture which can be exploited for general-purpose computing on GPU (GPGPU) as a complement to the central processing unit (CPU). In this paper, GPGPU techniques are used to make a parallel recursive Bayesian estimation implementation using particle filters. The modifications made to obtain a parallel particle filter, especially for the resampling step, are discussed, and the performance of the resulting GPU implementation is compared to the one achieved with a traditional CPU implementation. The comparison is made using a minimal sensor network with bearings-only sensors. The resulting GPU filter, which is the first complete GPU implementation of a PF published to this date, is faster than the CPU filter when many particles are used, maintaining the same accuracy. The parallelization utilizes ideas that can be applicable to other applications.
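    The resampling step singled out above is the main obstacle to parallelization because it is inherently sequential. A minimal sketch of systematic resampling, for reference only (this is the textbook CPU formulation, not the paper's GPU variant):

```python
import numpy as np

def systematic_resample(weights, rng):
    """Systematic resampling: one uniform offset, N evenly spaced pointers.
    Returns indices of the particles to keep (with repetition)."""
    n = len(weights)
    positions = (rng.random() + np.arange(n)) / n
    cumsum = np.cumsum(weights)
    cumsum[-1] = 1.0  # guard against floating-point round-off
    return np.searchsorted(cumsum, positions)

rng = np.random.default_rng(0)
w = np.array([0.05, 0.05, 0.8, 0.05, 0.05])
idx = systematic_resample(w, rng)
# most surviving particles come from the high-weight component (index 2)
```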

  17. Particle Filtering: The Need for Speed

    Gustaf Hendeby

    2010-01-01

    Full Text Available The particle filter (PF) has during the last decade been proposed for a wide range of localization and tracking applications. There is a general need in such embedded systems for a platform for efficient and scalable implementation of the PF. One such platform is the graphics processing unit (GPU), originally aimed at fast rendering of graphics. To achieve this, GPUs are equipped with a parallel architecture which can be exploited for general-purpose computing on GPU (GPGPU) as a complement to the central processing unit (CPU). In this paper, GPGPU techniques are used to make a parallel recursive Bayesian estimation implementation using particle filters. The modifications made to obtain a parallel particle filter, especially for the resampling step, are discussed, and the performance of the resulting GPU implementation is compared to the one achieved with a traditional CPU implementation. The comparison is made using a minimal sensor network with bearings-only sensors. The resulting GPU filter, which is the first complete GPU implementation of a PF published to this date, is faster than the CPU filter when many particles are used, maintaining the same accuracy. The parallelization utilizes ideas that can be applicable to other applications.

  18. Circuits and filters handbook

    Chen, Wai-Kai

    2003-01-01

    A bestseller in its first edition, The Circuits and Filters Handbook has been thoroughly updated to provide the most current, most comprehensive information available in both the classical and emerging fields of circuits and filters, both analog and digital. This edition contains 29 new chapters, with significant additions in the areas of computer-aided design, circuit simulation, VLSI circuits, design automation, and active and digital filters. It will undoubtedly take its place as the engineer's first choice in looking for solutions to problems encountered in the design, analysis, and behavi

  19. HEPA filter jointer

    Hill, D.; Martinez, H.E.

    1998-02-01

    A HEPA filter jointer system was created to remove nitrate contaminated wood from the wooden frames of HEPA filters that are stored at the Rocky Flats Plant. A commercial jointer was chosen to remove the nitrated wood. The chips from the wood removal process are in the right form for caustic washing. The jointer was automated for safety and ease of operation. The HEPA filters are prepared for jointing by countersinking the nails with a modified air hammer. The equipment, computer program, and tests are described in this report.

  20. Multilevel filtering elliptic preconditioners

    Kuo, C. C. Jay; Chan, Tony F.; Tong, Charles

    1989-01-01

    A class of preconditioners is presented for elliptic problems built on ideas borrowed from the digital filtering theory and implemented on a multilevel grid structure. They are designed to be both rapidly convergent and highly parallelizable. The digital filtering viewpoint allows the use of filter design techniques for constructing elliptic preconditioners and also provides an alternative framework for understanding several other recently proposed multilevel preconditioners. Numerical results are presented to assess the convergence behavior of the new methods and to compare them with other preconditioners of multilevel type, including the usual multigrid method as preconditioner, the hierarchical basis method and a recent method proposed by Bramble-Pasciak-Xu.

  1. Trajectory probability hypothesis density filter

    García-Fernández, Ángel F.; Svensson, Lennart

    2016-01-01

    This paper presents the probability hypothesis density (PHD) filter for sets of trajectories. The resulting filter, referred to as the trajectory PHD (TPHD) filter, is capable of estimating trajectories in a principled way without requiring the evaluation of all measurement-to-target association hypotheses. As the PHD filter, the TPHD filter is based on recursively obtaining the best Poisson approximation to the multitrajectory filtering density in the sense of minimising the K...

  2. Derivative free filtering using Kalmtool

    Bayramoglu, Enis; Hansen, Søren; Ravn, Ole;

    2010-01-01

    In this paper we present a toolbox enabling easy evaluation and comparison of different filtering algorithms. The toolbox is called Kalmtool 4 and is a set of MATLAB tools for state estimation of nonlinear systems. The toolbox contains functions for extended Kalman filtering as well as for the DD1...... filter and the DD2 filter. It also contains functions for unscented Kalman filters as well as several versions of particle filters. The toolbox requires MATLAB version 7; no additional toolboxes are required....

  3. Spatial filtering efficiency of monostatic biaxial lidar: analysis and applications

    Agishev, Ravil R.; Comerón Tejero, Adolfo

    2002-01-01

    Results of lidar modeling based on spatial-angular filtering efficiency criteria are presented. Their analysis shows that the low spatial-angular filtering efficiency of traditional visible and near-infrared systems is an important cause of a low signal-to-background-radiation ratio (SBR) at the photodetector input. The low SBR may be responsible for considerable measurement errors and the ensuing low accuracy of the retrieval of atmospheric optical parameters. As shown, the most effec...

  4. Blind Source Parameters for Performance Evaluation of Despeckling Filters

    Nagashettappa Biradar; Dewal, M. L.; ManojKumar Rohit; Sanjaykumar Gowre; Yogesh Gundge

    2016-01-01

    The speckle noise is inherent to transthoracic echocardiographic images. A standard noise-free reference echocardiographic image does not exist. The evaluation of filters based on the traditional parameters such as peak signal-to-noise ratio, mean square error, and structural similarity index may not reflect the true filter performance on echocardiographic images. Therefore, the performance of despeckling can be evaluated using blind assessment metrics like the speckle suppression index, spec...

  5. Paul Rodgersi filter Kohilas

    2000-01-01

    On 28 January, a site-specific sculpture and the performance "Filter" were presented at Kohila secondary school. The undertaking, marking the school's 130th anniversary, was led by sculptor Paul Rodgers together with two final-year students, Marko Heinmäe and Hendrik Karm.

  6. Updating the OMERACT filter

    Kirwan, John R; Boers, Maarten; Hewlett, Sarah; Beaton, Dorcas; Bingham, Clifton O; Choy, Ernest; Conaghan, Philip G; D'Agostino, Maria-Antonietta; Dougados, Maxime; Furst, Daniel E; Guillemin, Francis; Gossec, Laure; van der Heijde, Désirée M; Kloppenburg, Margreet; Kvien, Tore K; Landewé, Robert B M; Mackie, Sarah L; Matteson, Eric L; Mease, Philip J; Merkel, Peter A; Østergaard, Mikkel; Saketkoo, Lesley Ann; Simon, Lee; Singh, Jasvinder A; Strand, Vibeke; Tugwell, Peter

    2014-01-01

    OBJECTIVE: The Outcome Measures in Rheumatology (OMERACT) Filter provides guidelines for the development and validation of outcome measures for use in clinical research. The "Truth" section of the OMERACT Filter presupposes an explicit framework for identifying the relevant core outcomes that are...... for defining core areas of measurement ("Filter 2.0 Core Areas of Measurement") was presented at OMERACT 11 to explore areas of consensus and to consider whether already endorsed core outcome sets fit into this newly proposed framework. METHODS: Discussion groups critically reviewed the extent to...... construction, presentation, and clarity of the framework were questioned. The discussion groups and subsequent feedback highlighted 20 such issues. CONCLUSION: These issues will require resolution to reach consensus on accepting the proposed Filter 2.0 framework of Core Areas as the basis for the selection of...

  7. Updating the OMERACT filter

    Tugwell, Peter; Boers, Maarten; D'Agostino, Maria-Antonietta; Beaton, Dorcas; Boonen, Annelies; Bingham, Clifton O; Choy, Ernest; Conaghan, Philip G; Dougados, Maxime; Duarte, Catia; Furst, Daniel E; Guillemin, Francis; Gossec, Laure; Heiberg, Turid; van der Heijde, Désirée M; Hewlett, Sarah; Kirwan, John R; Kvien, Tore K; Landewé, Robert B; Mease, Philip J; Østergaard, Mikkel; Simon, Lee; Singh, Jasvinder A; Strand, Vibeke; Wells, George

    2014-01-01

    OBJECTIVE: The Outcome Measures in Rheumatology (OMERACT) Filter provides guidelines for the development and validation of outcome measures for use in clinical research. The "Truth" section of the OMERACT Filter requires that criteria be met to demonstrate that the outcome instrument meets the...... criteria for content, face, and construct validity. METHODS: Discussion groups critically reviewed a variety of ways in which case studies of current OMERACT Working Groups complied with the Truth component of the Filter and what issues remained to be resolved. RESULTS: The case studies showed that there...... is broad agreement on criteria for meeting the Truth criteria through demonstration of content, face, and construct validity; however, several issues were identified that the Filter Working Group will need to address. CONCLUSION: These issues will require resolution to reach consensus on how Truth...

  8. HEPA air filter (image)

    ... pet dander and other irritating allergens from the air. Along with other methods to reduce allergens, such ... controlling the amount of allergens circulating in the air. HEPA filters can be found in most air ...

  9. In the Dirac tradition

    It was Paul Dirac who cast quantum mechanics into the form we now use, and many generations of theoreticians openly acknowledge his influence on their thinking. When Dirac died in 1984, St. John's College, Cambridge, his base for most of his lifetime, instituted an annual lecture in his memory at Cambridge. The first lecture, in 1986, attracted two heavyweights - Richard Feynman and Steven Weinberg. Far from using the lectures as a platform for their own work, in the Dirac tradition they presented stimulating material on deep underlying questions

  10. The Band Pass Filter

    Christiano, Lawrence J.; Terry J. Fitzgerald

    1999-01-01

    The `ideal' band pass filter can be used to isolate the component of a time series that lies within a particular band of frequencies. However, applying this filter requires a dataset of infinite length. In practice, some sort of approximation is needed. Using projections, we derive approximations that are optimal when the time series representations underlying the raw data have a unit root, or are stationary about a trend. We identify one approximation which, though it is only optimal for one...
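    The `ideal' weights referred to above have a closed form: for pass-band periods between p_l and p_u, the lag-j weight is (sin(jb) − sin(ja))/(πj) with a = 2π/p_u and b = 2π/p_l. A hedged sketch of a plainly truncated version (the paper's optimal finite-sample adjustments are not implemented; the truncation length K is my choice):

```python
import numpy as np

def ideal_bandpass_weights(p_low, p_high, K):
    """Truncated weights of the ideal band-pass filter passing cycles with
    periods in [p_low, p_high] (e.g. 6-32 quarters for business cycles)."""
    a, b = 2 * np.pi / p_high, 2 * np.pi / p_low
    j = np.arange(1, K + 1)
    w = np.empty(K + 1)
    w[0] = (b - a) / np.pi
    w[1:] = (np.sin(j * b) - np.sin(j * a)) / (np.pi * j)
    return w  # symmetric: the full filter is w[K..1], w[0], w[1..K]

w = ideal_bandpass_weights(6, 32, K=200)
gain_at_zero = w[0] + 2 * w[1:].sum()  # the ideal filter removes the trend: ~0
```

With a long enough truncation, the gain is near zero at frequency zero and near one in the middle of the pass band, which is exactly the behaviour the approximations in the paper trade off against data length.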

  11. Kalman Filtering in R

    Fernando Tusell

    2011-03-01

    Full Text Available Support in R for state space estimation via Kalman filtering was limited to one package, until fairly recently. In the last five years, the situation has changed with no less than four additional packages offering general implementations of the Kalman filter, including in some cases smoothing, simulation smoothing and other functionality. This paper reviews some of the offerings in R to help the prospective user to make an informed choice.
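    All of the packages surveyed implement the same small recursion; a minimal scalar local-level sketch of it (plain Python, not the API of any particular R package):

```python
import numpy as np

def kalman_local_level(y, q, r, x0=0.0, p0=1e6):
    """Scalar local-level model: x_t = x_{t-1} + w_t, y_t = x_t + v_t,
    with Var(w)=q and Var(v)=r. Returns the filtered state means."""
    x, p = x0, p0  # diffuse-ish prior
    out = []
    for obs in y:
        p = p + q                  # predict
        k = p / (p + r)            # Kalman gain
        x = x + k * (obs - x)      # update with the innovation
        p = (1 - k) * p
        out.append(x)
    return np.array(out)
```

With q near zero the filter behaves like a running mean of the observations; larger q makes it track the latest observation more closely.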

  12. Feedback Particle Filter

    Yang, Tao; Mehta, Prashant G.; Meyn, Sean P.

    2013-01-01

    A new formulation of the particle filter for nonlinear filtering is presented, based on concepts from optimal control, and from the mean-field game theory. The optimal control is chosen so that the posterior distribution of a particle matches as closely as possible the posterior distribution of the true state given the observations. This is achieved by introducing a cost function, defined by the Kullback-Leibler (K-L) divergence between the actual posterior, and the posterior of any particle....

  13. Filtered Social Learning

    Paul Niehaus

    2011-01-01

    Knowledge sharing is economically important but also typically incomplete: we "filter" our communication. This paper analyzes the consequences of filtering. In the model, homogeneous agents share knowledge with their peers whenever the private benefits exceed communication costs. The welfare implications of this transmission mechanism hinge on whether units of knowledge complement, substitute for, or are independent of each other. Both substitutability and complementarity generate externaliti...

  14. Novel quaternion Kalman filter

    Choukroun, Daniel; Bar-Itzhack, Itzhack Y.; Oshman, Yaakov

    2006-01-01

    This paper presents a novel Kalman filter for estimating the attitude-quaternion as well as gyro random drifts from vector measurements. Employing a special manipulation on the measurement equation results in a linear pseudo-measurement equation whose error is state-dependent. Because the quaternion kinematics equation is linear, the combination of the two yields a linear Kalman filter that eliminates the usual linearization procedure and is less sensitive to initial estimation errors. Genera...

  15. Retina-inspired Filter

    Doutsi, Effrosyni; Fillatre, Lionel; Antonini, Marc; Gaulmin, Julien

    2016-01-01

    This paper introduces a novel filter which is inspired by the human retina. The human retina consists of three different layers: the Outer Plexiform Layer (OPL), the inner plexiform layer and the ganglionic layer. Our inspiration is the linear transform which takes place in the OPL and has been mathematically described by the neuroscientific model “virtual retina”. This model is the cornerstone to derive the non-separable spatiotemporal OPL retina-inspired filter, briefly renamed retina- insp...
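    The OPL stage mentioned above is commonly approximated in retina models by a center-surround difference-of-Gaussians. A minimal 1-D sketch of that spatial behaviour (the full non-separable spatiotemporal filter of the "virtual retina" model is not reproduced; parameter values are illustrative):

```python
import numpy as np

def gaussian_kernel(sigma, radius):
    x = np.arange(-radius, radius + 1, dtype=float)
    k = np.exp(-x**2 / (2 * sigma**2))
    return k / k.sum()  # normalized so flat inputs pass through unchanged

def dog_filter(signal, sigma_c=1.0, sigma_s=3.0, radius=10):
    """Center-surround response: narrow 'center' blur minus wide 'surround'
    blur. A constant signal yields ~0: the filter responds to contrast only."""
    center = np.convolve(signal, gaussian_kernel(sigma_c, radius), 'same')
    surround = np.convolve(signal, gaussian_kernel(sigma_s, radius), 'same')
    return center - surround
```

A flat input produces (up to boundary effects) no response, while a step edge produces a localized response at the edge.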

  16. Filtering Solid Gabor Noise

    Lagae, Ares; Drettakis, George

    2011-01-01

    Solid noise is a fundamental tool in computer graphics. Surprisingly, no existing noise function supports both high-quality anti-aliasing and continuity across sharp edges. In this paper we show that a slicing approach is required to preserve continuity across sharp edges, and we present a new noise function that supports anisotropic filtering of sliced solid noise. This is made possible by individually filtering the slices of Gabor kernels, which requires the proper treatment of phase. This ...

  17. Spatial filter issues

    Experiments and calculations indicate that the threshold pressure in spatial filters for distortion of a transmitted pulse scales approximately as I^0.2 and (F#)^2 over the intensity range from 10^14 to 2x10^15 W/cm^2. We also demonstrated an interferometric diagnostic that will be used to measure the scaling relationships governing pinhole closure in spatial filters

  18. Kalman Filtering in R

    Fernando Tusell

    2011-01-01

    Support in R for state space estimation via Kalman filtering was limited to one package, until fairly recently. In the last five years, the situation has changed with no less than four additional packages offering general implementations of the Kalman filter, including in some cases smoothing, simulation smoothing and other functionality. This paper reviews some of the offerings in R to help the prospective user to make an informed choice.

  19. Miniaturized superconducting microwave filters

    In this paper we present methods for the miniaturization of superconducting filters. We consider two designs of seventh-order bandpass Chebyshev filters based on lumped elements and a novel quasi-lumped element resonator. In both designs the area of the filters, with a central frequency of 2-5 GHz, is less than 1.2 mm^2. Such small filters can be readily integrated on a single board for multi-channel microwave control of superconducting qubits. The filters have been experimentally tested and the results are compared with simulations. The miniaturization resulted in parasitic coupling between resonators and within each resonator that affected primarily the stopband and increased the bandwidth. The severity of the error depends on the design in particular, and was less sensitive when a groundplane was used under the inductances of the resonators. The best performance was reached for the quasi-lumped filter with central frequency of 4.45 GHz, quality factor of 40 and 50 dB stopband

  20. Dip-separated structural filtering using seislet transform and adaptive empirical mode decomposition based dip filter

    Chen, Yangkang

    2016-07-01

    The seislet transform has been demonstrated to have a better compression performance for seismic data compared with other well-known sparsity promoting transforms, thus it can be used to remove random noise by simply applying a thresholding operator in the seislet domain. Since the seislet transform compresses the seismic data along the local structures, the seislet thresholding can be viewed as a simple structural filtering approach. Because of the dependence on a precise local slope estimation, the seislet transform usually suffers from low compression ratio and high reconstruction error for seismic profiles that have dip conflicts. In order to remove the limitation of seislet thresholding in dealing with conflicting-dip data, I propose a dip-separated filtering strategy. In this method, I first use an adaptive empirical mode decomposition based dip filter to separate the seismic data into several dip bands (5 or 6). Next, I apply seislet thresholding to each separated dip component to remove random noise. Then I combine all the denoised components to form the final denoised data. Compared with other dip filters, the empirical mode decomposition based dip filter is data-adaptive. One only needs to specify the number of dip components to be separated. Both complicated synthetic and field data examples show superior performance of my proposed approach than the traditional alternatives. The dip-separated structural filtering is not limited to seislet thresholding, and can also be extended to all those methods that require slope information.
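    The thresholding step at the core of the approach is generic: transform, shrink small coefficients, invert. A hedged sketch with the FFT standing in for the seislet transform (the seislet transform itself, the slope estimation, and the dip separation are not implemented here):

```python
import numpy as np

def soft_threshold_denoise(signal, thresh):
    """Transform -> soft-threshold coefficients -> inverse transform.
    The seislet transform would replace the FFT in the actual method."""
    coef = np.fft.rfft(signal)
    mag = np.abs(coef)
    shrunk = np.where(mag > thresh,
                      coef * (1 - thresh / np.maximum(mag, 1e-30)), 0)
    return np.fft.irfft(shrunk, n=len(signal))

rng = np.random.default_rng(0)
t = np.linspace(0, 1, 256, endpoint=False)
clean = np.sin(2 * np.pi * 8 * t)          # coherent "signal" component
noisy = clean + 0.5 * rng.normal(size=t.size)
denoised = soft_threshold_denoise(noisy, thresh=20.0)
```

Because the clean component compresses into a few large coefficients, thresholding suppresses the noise floor while keeping the signal, which is the same argument the abstract makes for compressive transforms.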

  1. Dip-separated structural filtering using seislet transform and adaptive empirical mode decomposition based dip filter

    Chen, Yangkang

    2016-04-01

    The seislet transform has been demonstrated to have a better compression performance for seismic data compared with other well-known sparsity promoting transforms, thus it can be used to remove random noise by simply applying a thresholding operator in the seislet domain. Since the seislet transform compresses the seismic data along the local structures, the seislet thresholding can be viewed as a simple structural filtering approach. Because of the dependence on a precise local slope estimation, the seislet transform usually suffers from low compression ratio and high reconstruction error for seismic profiles that have dip conflicts. In order to remove the limitation of seislet thresholding in dealing with conflicting-dip data, I propose a dip-separated filtering strategy. In this method, I first use an adaptive empirical mode decomposition based dip filter to separate the seismic data into several dip bands (5 or 6). Next, I apply seislet thresholding to each separated dip component to remove random noise. Then I combine all the denoised components to form the final denoised data. Compared with other dip filters, the empirical mode decomposition based dip filter is data-adaptive. One only needs to specify the number of dip components to be separated. Both complicated synthetic and field data examples show superior performance of my proposed approach than the traditional alternatives. The dip-separated structural filtering is not limited to seislet thresholding, and can also be extended to all those methods that require slope information.

  2. Filter Paper: Solution to High Self-Attenuation Corrections in HEPA Filter Measurements

    Oberer, R.B.; Harold, N.B.; Gunn, C.A.; Brummett, M.; Chaing, L.G.

    2005-10-01

    An 8 by 8 by 6 inch High Efficiency Particulate Air (HEPA) filter was measured as part of a uranium holdup survey in June of 2005 as it has been routinely measured every two months since 1998. Although the survey relies on gross gamma count measurements, this was one of a few measurements that had been converted to a quantitative measurement in 1998. The measurement was analyzed using the traditional Generalized Geometry Holdup (GGH) approach, using HMS3 software, with an area calibration and self-attenuation corrected with an empirical correction factor of 1.06. A result of 172 grams of 235U was reported. The actual quantity of 235U in the filter was approximately 1700g. Because of this unusually large discrepancy, the measurement of HEPA filters will be discussed. Various techniques for measuring HEPA filters will be described using the measurement of a 24 by 24 by 12 inch HEPA filter as an example. A new method to correct for self attenuation will be proposed for this measurement. Following the discussion of the 24 by 24 by 12 inch HEPA filter, the measurement of the 8 by 8 by 6 inch will be discussed in detail.
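    An empirical factor of 1.06 corresponds to a nearly transparent deposit, whereas a heavily loaded filter requires a far larger correction, which is consistent with the roughly tenfold discrepancy reported. A minimal sketch of the textbook slab self-attenuation correction (this is the generic formula, not the GGH/HMS3 implementation used in the survey):

```python
import math

def slab_self_attenuation_cf(mu_t):
    """Correction factor for a uniform slab emitter viewed face-on:
    CF = mu*T / (1 - exp(-mu*T)), where mu*T is the optical thickness.
    CF -> 1 for a thin deposit and grows without bound as mu*T increases."""
    if mu_t == 0:
        return 1.0
    return mu_t / (1.0 - math.exp(-mu_t))

# thin deposit: correction is tiny; heavily loaded filter: correction is large
print(slab_self_attenuation_cf(0.1))  # ~1.05
print(slab_self_attenuation_cf(5.0))  # ~5.03
```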

  3. Filter Paper: Solution to High Self-Attenuation Corrections in HEPA Filter Measurements

    An 8 by 8 by 6 inch High Efficiency Particulate Air (HEPA) filter was measured as part of a uranium holdup survey in June of 2005 as it has been routinely measured every two months since 1998. Although the survey relies on gross gamma count measurements, this was one of a few measurements that had been converted to a quantitative measurement in 1998. The measurement was analyzed using the traditional Generalized Geometry Holdup (GGH) approach, using HMS3 software, with an area calibration and self-attenuation corrected with an empirical correction factor of 1.06. A result of 172 grams of 235U was reported. The actual quantity of 235U in the filter was approximately 1700g. Because of this unusually large discrepancy, the measurement of HEPA filters will be discussed. Various techniques for measuring HEPA filters will be described using the measurement of a 24 by 24 by 12 inch HEPA filter as an example. A new method to correct for self attenuation will be proposed for this measurement. Following the discussion of the 24 by 24 by 12 inch HEPA filter, the measurement of the 8 by 8 by 6 inch will be discussed in detail

  4. Circular filter bag change ladderack system video presentation

    A great deal of research and development at Harwell over the last few years has centered around the design of circular radial flow HEPA filters as alternatives to the traditional rectangular HEPA filter. With a circular insert there are inherent features which give this geometry certain advantages over its counterpart, such as ease of sealing and compatibility with remote handling and disposal routes; these have been well publicized in previous works. A mock-up is shown of a bag change ladderack system for a 3400 m3/h circular filter. It highlights the space requirements for bag changing and demonstrates the ease with which a filter may be replaced. The filter throat incorporates a silicone rubber lip seal which forms a flap seal against a tapered spigot feature built into the wall. The novelty of this filter design is that the bag is an integral part of the filter and is attached onto the filter flange. This enables the inside of the filter, where the contamination particulate has collected, to be sealed/bagged off and hence the dust burden retained

  5. Ceramic filters for bulk inoculation of nickel alloy castings

    F. Binczyk

    2011-07-01

    Full Text Available The work includes the results of research on the production technology of ceramic filters which, besides the traditional filtering function, play also the role of an inoculant modifying the macrostructure of cast nickel alloys. To play this additional role, filters should demonstrate sufficient compression strength and ensure a proper flow rate of liquid alloy. The role of an inoculant is played by cobalt aluminate introduced into the composition of the external coating in an amount from 5 to 10 wt.%. The required compression strength (over 1 MPa) is provided by the supporting layers, deposited on the preform, which is a polyurethane foam. Based on a two-level fractional experiment 2^(4-1), the significance of the impact of various technological parameters (independent variables) on selected functional parameters of the ready filters was determined. A significant effect of the number of supporting layers and of the sintering temperature of the filters after evaporation of the polyurethane foam was found.

  6. Distortion Parameters Analysis Method Based on Improved Filtering Algorithm

    ZHANG Shutuan

    2013-10-01

    Full Text Available In order to realize accurate testing of the distortion parameters of an aircraft power supply system, and to satisfy the requirements of the corresponding equipment in the aircraft, a novel power parameters test system based on an improved filtering algorithm is introduced in this paper. The hardware of the test system is portable and provides high-speed data acquisition and processing; the software uses LabWindows/CVI as the development environment and adopts a pre-processing technique together with the added filtering algorithm. Compared with the traditional filtering algorithm, the improved filtering algorithm helps to increase the test accuracy. Application shows that the test system with the improved filtering algorithm achieves accurate test results and meets the design requirements.

  7. Robust Hammerstein Adaptive Filtering under Maximum Correntropy Criterion

    Zongze Wu

    2015-10-01

    Full Text Available The maximum correntropy criterion (MCC has recently been successfully applied to adaptive filtering. Adaptive algorithms under MCC show strong robustness against large outliers. In this work, we apply the MCC criterion to develop a robust Hammerstein adaptive filter. Compared with the traditional Hammerstein adaptive filters, which are usually derived based on the well-known mean square error (MSE criterion, the proposed algorithm can achieve better convergence performance especially in the presence of impulsive non-Gaussian (e.g., α-stable noises. Additionally, some theoretical results concerning the convergence behavior are also obtained. Simulation examples are presented to confirm the superior performance of the new algorithm.
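
    As a rough illustration of the criterion (a generic MCC-driven LMS sketch for a linear FIR filter, not the authors' Hammerstein algorithm; the function name, parameter values, and toy system below are assumptions for illustration only):

```python
import numpy as np

def mcc_lms(x, d, order=4, mu=0.05, sigma=1.0):
    """Adaptive FIR filter trained under the maximum correntropy
    criterion (MCC): the Gaussian kernel exp(-e^2 / (2*sigma^2))
    down-weights large errors, giving robustness to impulsive noise
    that a plain MSE/LMS update lacks."""
    w = np.zeros(order)
    for n in range(order - 1, len(x)):
        u = x[n - order + 1:n + 1][::-1]      # regressor, newest sample first
        e = d[n] - w @ u                      # prediction error
        kernel = np.exp(-e ** 2 / (2 * sigma ** 2))
        w += mu * kernel * e * u              # MCC stochastic-gradient step
    return w

rng = np.random.default_rng(0)
true_w = np.array([0.5, -0.3, 0.2, 0.1])      # unknown system to identify
x = rng.standard_normal(2000)
d = np.convolve(x, true_w)[:len(x)]           # noiseless desired signal ...
idx = rng.choice(len(d), 20, replace=False)   # ... plus 1% impulsive outliers
d[idx] += 50 * rng.standard_normal(20)
w = mcc_lms(x, d)
```

    Because the kernel factor is nearly zero for the outlier-corrupted samples, the update effectively skips them, whereas an MSE/LMS update would be dominated by them.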

  8. Preparation and Application of New Porous Environmental Ceramics Filter Medium

    LI Meng; WU Jianfeng; JIN Jianhua; LIU Xinming

    2005-01-01

    A new kind of environmental ceramic medium, made from industrial solid wastes discharged by Shandong Alum Corporation, has been used in the process of drinking water treatment. New techniques were introduced to ensure its remarkable advantages, such as high porosity and strength. The results of practical application show that this filter medium has a shorter filtration run, a shorter maturation period and a higher filter deposit capability compared with traditional sand filter medium. Moreover, up to 25% to 30% of the daily running costs are expected to be saved by using this ceramic medium.

  9. Axial 3D region of interest reconstruction using weighted cone beam BPF/DBPF algorithm cascaded with adequately oriented orthogonal butterfly filtering

    Tang, Shaojie; Tang, Xiangyang

    2016-03-01

    Axial cone beam (CB) computed tomography (CT) reconstruction is still the most desirable in clinical applications. As the potential candidates with analytic form for the task, the backprojection-filtration (BPF) and the derivative backprojection filtered (DBPF) algorithms, in which Hilbert filtering is the common algorithmic feature, are originally derived for exact helical and axial reconstruction from CB and fan beam projection data, respectively. These two algorithms have been heuristically extended for axial CB reconstruction via adoption of virtual PI-line segments. Unfortunately, however, streak artifacts are induced along the Hilbert filtering direction, since these algorithms are no longer accurate on the virtual PI-line segments. We have proposed to cascade the extended BPF/DBPF algorithm with orthogonal butterfly filtering for image reconstruction (namely axial CB-BPF/DBPF cascaded with orthogonal butterfly filtering), in which the orientation-specific artifacts caused by the post-BP Hilbert transform can be eliminated, at the possible expense of losing the BPF/DBPF's capability of dealing with projection data truncation. Our preliminary results have shown that this is not the case in practice. Hence, in this work, we carry out an algorithmic analysis and experimental study to investigate the performance of the axial CB-BPF/DBPF cascaded with adequately oriented orthogonal butterfly filtering for three-dimensional (3D) reconstruction in a region of interest (ROI).
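
    The common algorithmic ingredient, Hilbert filtering along (virtual) PI-line directions, can be sketched in one dimension with a generic FFT implementation (an illustration of the transform itself, not the authors' reconstruction code):

```python
import numpy as np

def hilbert_filter(f):
    """Discrete Hilbert transform via the FFT: multiply positive
    frequencies by -i and negative frequencies by +i.  This is the
    1-D filtering step that BPF/DBPF-type algorithms apply along
    PI-line (or virtual PI-line) directions."""
    F = np.fft.fft(f)
    k = np.fft.fftfreq(len(f))
    H = -1j * np.sign(k)              # transfer function of the Hilbert kernel
    return np.real(np.fft.ifft(F * H))

# sanity check: the Hilbert transform of cos is sin
t = np.linspace(0, 2 * np.pi, 256, endpoint=False)
g = hilbert_filter(np.cos(4 * t))
```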

  10. Decontamination of HEPA filters

    Mound Facility, during many years of plutonium-238 experience, has recovered over 150 kg of plutonium-238. Much of this material was recovered from HEPA filters or from solid wastes such as sludge and slag. The objective of this task was to modify and improve the existing nitric acid leaching process used at Mound so that filters from the nuclear fuel cycle could be decontaminated effectively. Various leaching agents were tested to determine their capability for dissolving PuO2, UO2, U3O8, AmO2, NpO2, CmO2, and ThO2 in mixtures of the following: HNO3-HF; HNO3-HF-H2SO4; and HNO3-(NH4)2Ce(NO3)6. Adsorption isotherms were obtained for two leaching systems. In some tests simulated contaminated HEPA filter material was used, while in others actual spent glovebox filters were used. The maximum decontamination factor of 833 was achieved in the recovery of plutonium-238 from actual filters. The dissolution was accomplished by using a six-stage process with 4N HNO3-0.23M (NH4)2Ce(NO3)6 as the leaching agent. Thorium oxide was also effectively dissolved from filter media using a mixture of nitric acid and ceric ammonium nitrate. Sodium carbonate and Na2CO3-KNO3 fusion tests were performed using simulated PuO2-contaminated filter media at various temperatures. Approximately 70 wt% of the PuO2 was soluble in a mixture composed of 70 wt% Na2CO3-30 wt% KNO3 (heated for 1 h at 950 °C). 23 figs., 14 tables

  11. Kalman Filtering for Manufacturing Processes

    Oakes, Thomas; Tang, Lie; Robert G. Landers; Balakrishnan, S.N.

    2009-01-01

    This chapter presented a methodology, based on stochastic process modeling and Kalman filtering, to filter manufacturing process measurements, which are known to be inherently noisy. Via simulation studies, the methodology was compared to low pass and Butterworth filters. The methodology was applied in a Friction Stir Welding (FSW) process to filter data
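
    A minimal sketch of the idea (a scalar random-walk Kalman filter with illustrative noise variances, not the chapter's process model; `q`, `r`, and the ramp signal are assumptions):

```python
import numpy as np

def kalman_filter_1d(z, q=1e-4, r=0.04):
    """One-dimensional Kalman filter for a slowly varying process
    measured with noise: x_k = x_{k-1} + w_k (variance q),
    z_k = x_k + v_k (variance r)."""
    xhat, p = z[0], 1.0
    out = [xhat]
    for zk in z[1:]:
        p = p + q                      # predict: uncertainty grows by q
        k = p / (p + r)                # Kalman gain
        xhat = xhat + k * (zk - xhat)  # update with the innovation
        p = (1 - k) * p
        out.append(xhat)
    return np.asarray(out)

rng = np.random.default_rng(1)
truth = np.linspace(1.0, 2.0, 500)        # slow ramp, e.g. a drifting force signal
meas = truth + rng.normal(0, 0.2, 500)    # noisy sensor readings
est = kalman_filter_1d(meas)
```

    The gain automatically balances trust in the model against trust in the sensor, which is the property that distinguishes it from a fixed low-pass or Butterworth filter.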

  12. Shape Preserving Filament Enhancement Filtering

    Wilkinson, Michael H.F.; Westenberg, Michel A.

    2001-01-01

    Morphological connected set filters for extraction of filamentous details from medical images are developed. The advantages of these filters are that they are shape preserving and do not amplify noise. Two approaches are compared: (i) multi-scale filtering (ii) single-step shape filtering using conn

  13. Design of a cavity filter

    A cavity filter was developed for the SSRF 0-mode beam feedback. The filter is used to pick up the 500 MHz signal from the storage ring beam. The Superfish was used to simulate the model of the cavity bandpass filter. The design method, parameters of the filter and results of beam measurements are described in this paper. (authors)

  14. TRADITIONAL CHINESE MEDICINE

    1993-01-01

    930433 A study on the relationship between hypothyroidism and deficiency of kidney YANG. ZHA Lianglun (查良伦), et al. Instit Integr TCM & West Med, Shanghai Med Univ, Shanghai, 200040. Chin J Integr Tradit & West Med 1993;13(4):202-204. Thirty-two cases of hypothyroidism caused by various factors were treated for one year with the Chinese medicinal herb preparation "Shen Lu tablet" (SLT) to warm and reinforce the Kidney Yang. 34 normal persons were studied as a control group. After treatment with SLT, the clinical symptoms of hypothyroidism were markedly improved. Average serum concentration of total T3, T4 increased significantly from 67.06±4.81

  15. Traditional preventive treatment options

    Longbottom, C; Ekstrand, K; Zero, D

    2009-01-01

    Preventive treatment options can be divided into primary, secondary and tertiary prevention techniques, which can involve patient- or professionally applied methods. These include: oral hygiene (instruction), pit and fissure sealants ('temporary' or 'permanent'), fluoride applications (patient- or... prevention of caries in children, e.g. pit and fissure sealants and topically applied fluorides (including patient-applied fluoride toothpastes and professionally applied fluoride varnishes), but limited strong evidence for these techniques for secondary prevention, i.e. where early to established lesions... conventional operative care, and since controlling the caries process prior to first restoration is the key to breaking the repair cycle and improving care for patients, future research should address the shortcomings in the current level of supporting evidence for the various traditional preventive treatment...

  16. Asymmetric Baxter-King filter

    Buss, Ginters

    2011-01-01

    The paper proposes an extension of the symmetric Baxter-King band-pass filter to an asymmetric Baxter-King filter. The optimal correction scheme of the ideal filter weights is the same as in the symmetric version, i.e., cut the ideal filter at the appropriate length and add a constant to all filter weights to ensure zero weight on zero frequency. Since the symmetric Baxter-King filter is unable to extract the desired signal at the very ends of the series, the extension to an asymmetric filter...
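
    The correction scheme described (truncate the ideal band-pass weights at lag k, then shift all weights by a constant so they sum to zero, giving zero gain at frequency zero) can be sketched for the symmetric case; the function name and the business-cycle band used as an example are illustrative assumptions:

```python
import numpy as np

def baxter_king_weights(low, high, k):
    """Symmetric Baxter-King band-pass weights b_{-k},...,b_k for
    periods between `low` and `high` samples: truncate the ideal
    band-pass filter at lag k, then subtract the mean weight so the
    weights sum to zero (zero gain at frequency zero)."""
    w1, w2 = 2 * np.pi / high, 2 * np.pi / low     # band edges in radians
    j = np.arange(1, k + 1)
    b = np.concatenate(([(w2 - w1) / np.pi],
                        (np.sin(w2 * j) - np.sin(w1 * j)) / (np.pi * j)))
    b = np.concatenate((b[::-1], b[1:]))           # two-sided symmetric weights
    return b - b.sum() / len(b)                    # enforce zero weight at frequency 0

# common business-cycle band: periods of 6 to 32 quarters, truncation k = 12
b = baxter_king_weights(6, 32, 12)
```

    The asymmetric extension applies the same cut-and-shift correction to one-sided truncations near the series ends.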

  17. Choosing and using astronomical filters

    Griffiths, Martin

    2014-01-01

    As a casual read through any of the major amateur astronomical magazines will demonstrate, there are filters available for all aspects of optical astronomy. This book provides a ready resource on the use of the following filters, among others, for observational astronomy or for imaging: light-pollution filters, planetary filters, solar filters, neutral-density filters for Moon observation, and deep-sky filters for such objects as galaxies, nebulae and more. Deep-sky objects can be imaged in much greater detail than was possible many years ago. Amateur astronomers can take

  18. Anti-clogging filter system

    Brown, Erik P.

    2015-05-19

    An anti-clogging filter system for filtering a fluid containing large particles and small particles includes an enclosure with at least one individual elongated tubular filter element in the enclosure. The individual elongated tubular filter element has an internal passage, a closed end, an open end, and a filtering material in or on the individual elongated tubular filter element. The fluid travels through the open end of the elongated tubular element and through the internal passage and through the filtering material. An anti-clogging element is positioned on or adjacent the individual elongated tubular filter element and provides a fluid curtain that preferentially directs the larger particulates to one area of the filter material allowing the remainder of the filter material to remain more efficient.

  19. Multilevel Mixture Kalman Filter

    Xiaodong Wang

    2004-11-01

    Full Text Available The mixture Kalman filter is a general sequential Monte Carlo technique for conditional linear dynamic systems. It generates samples of some indicator variables recursively based on sequential importance sampling (SIS) and integrates out the linear and Gaussian state variables conditioned on these indicators. Due to the marginalization process, the complexity of the mixture Kalman filter is quite high if the dimension of the indicator sampling space is high. In this paper, we address this difficulty by developing a new Monte Carlo sampling scheme, namely, the multilevel mixture Kalman filter. The basic idea is to make use of the multilevel or hierarchical structure of the space from which the indicator variables take values. That is, we draw samples in a multilevel fashion, beginning with sampling from the highest-level sampling space and then drawing samples from the associated subspace of the newly drawn samples in a lower-level sampling space, until reaching the desired sampling space. Such a multilevel sampling scheme can be used in conjunction with a delayed estimation method, such as the delayed-sample method, resulting in the delayed multilevel mixture Kalman filter. Examples in wireless communication, specifically coherent and noncoherent 16-QAM over flat-fading channels, are provided to demonstrate the performance of the proposed multilevel mixture Kalman filter.

  20. Boolean filters of distributive lattices

    M. Sambasiva Rao

    2013-07-01

    Full Text Available In this paper we introduce the notion of Boolean filters in a pseudo-complemented distributive lattice and characterize the class of all Boolean filters. Further a set of equivalent conditions are derived for a proper filter to become a prime Boolean filter. Also a set of equivalent conditions is derived for a pseudo-complemented distributive lattice to become a Boolean algebra. Finally, a Boolean filter is characterized in terms of congruences.

  1. DOE HEPA filter test program

    This standard establishes essential elements of a Department of Energy (DOE) program for testing HEPA filters to be installed in DOE nuclear facilities or used in DOE-contracted activities. A key element is the testing of HEPA filters for performance at a DOE Filter Test Facility (FTF) prior to installation. Other key elements are (1) providing for a DOE HEPA filter procurement program, and (2) verifying that HEPA filters to be installed in nuclear facilities appear on a Qualified Products List (QPL)

  2. DOE HEPA filter test program

    NONE

    1998-05-01

    This standard establishes essential elements of a Department of Energy (DOE) program for testing HEPA filters to be installed in DOE nuclear facilities or used in DOE-contracted activities. A key element is the testing of HEPA filters for performance at a DOE Filter Test Facility (FTF) prior to installation. Other key elements are (1) providing for a DOE HEPA filter procurement program, and (2) verifying that HEPA filters to be installed in nuclear facilities appear on a Qualified Products List (QPL).

  3. Efficiency of new Miswak, titanium dioxide and sand filters in reducing pollutants from wastewater

    Mohamed Ramadan

    2015-01-01

    In this work, I tried to improve the efficiency and performance of the sand filter with a simple low-cost method. This work successfully reports an alternative to the traditional biological sand filter, which was considered a time-consuming filter. Miswak and TiO2 were added to the filter and their effects were observed. Three filters were designed (SF-1, SF-2, and SF-3). The results of this work indicated that chemical oxygen demand (COD) was reduced by 47.54%, 90.94%, and 95.47% in case...

  4. Fuzzy filters in BCI-algebras

    C. Lele; Wu, C; Mamadou, T.

    2002-01-01

    We introduce the notion of fuzzy filters and weak filters in BCI-algebras and discuss their properties. Then we establish some relations among filters, fuzzy filters, and weak filters in BCI-algebras.

  5. A new algorithm of inter-frame filtering in IR image based on threshold value

    Liu, Wei; Leng, Hanbing; Chen, Weining; Yang, Hongtao; Xie, Qingsheng; Yi, Bo; Zhang, Haifeng

    2013-09-01

    This paper proposed a new algorithm of inter-frame filtering in IR image based on threshold value for the purpose of solving image blur and smear brought by traditional inter-frame filtering algorithm. At first, it finds out causes of image blur and smear by analyzing general inter-frame filtering algorithm and dynamic inter-frame filtering algorithm, hence to bring up a new kind of time-domain filter. In order to obtain coefficients of the filter, it firstly gets difference image of present image and previous image, and then, it gets noisy threshold value by analyzing difference image with probability analysis method. The relationship between difference image and threshold value helps obtaining the coefficients of filter. At last, inter-frame filtering method is adopted to process pixels interrupted by noise. The experimental result shows that this algorithm has successfully repressed IR image blur and smear, and NETD tested by traditional inter filtering algorithm and the new algorithm are respectively 78mK and 70mK, which shows it has a better noise reduction performance than traditional ones. The algorithm is not only applied to still image, but also to sports image. As a new algorithm with great practical value, it is easy to achieve on FPGA, of excellent real-time performance and it effectively extends application scope of time domain filtering algorithm.

  6. Ferroelectric electronically tunable filters

    A cylindrical cavity is loaded with a ferroelectric rod and is resonant at the dominant mode. The loaded cylindrical cavity is a band pass filter. As a bias voltage is applied across the ferroelectric rod, its permittivity changes resulting in a new resonant frequency for the loaded cylindrical cavity. The ferroelectric rod is operated at a temperature slightly above its Curie temperature. The loaded cylindrical cavity is kept at a constant designed temperature. The cylindrical cavity is made of conductors, a single crystal high Tc superconductor including YBCO and a single crystal dielectric, including sapphire and lanthanum aluminate, the interior conducting surfaces of which are deposited with a film of a single crystal high Tc superconductor. Embodiments also include waveguide single and multiple cavity type tunable filters. Embodiments also include tunable band reject filters. 10 figs

  7. Kalman filter modeling

    Brown, R. G.

    1984-01-01

    The formulation of appropriate state-space models for Kalman filtering applications is studied. The so-called model is completely specified by four matrix parameters and the initial conditions of the recursive equations. Once these are determined, the die is cast, and the way in which the measurements are weighted is determined forever after. Thus, finding a model that fits the physical situation at hand is all-important. It is also often the most difficult aspect of designing a Kalman filter. Formulation of discrete state models from the spectral density and ARMA random process descriptions is discussed. Finally, it is pointed out that many common processes encountered in applied work (such as band-limited white noise) simply do not lend themselves very well to Kalman filter modeling.

  8. Glove-box filters

    Description is given of a device for simply and rapidly assembling and disassembling the filters used inside sealed enclosures, such as glove-boxes and shielded cells equipped with nippers or manipulators. Said filters are of the type comprising a cylindrical casing containing a filtering member; the upper portion of the casing is open so as to allow the gases to be cleaned to flow in, whereas the casing bottom is centrally provided with a hole extended outwardly by a threaded collar on which is screwed a connecting-sleeve to be fixed to the mouth of a gas outlet pipe. To a yoke transverse bar is welded a pin which can be likened to a bent spring-blade, one arm of which, welded to said transverse bar, is rectilinear, whereas its other arm is provided with a boss cooperating with a cavity made in a protrusion of said pipe, right under the mouth thereof.

  9. Updating the OMERACT filter

    D'Agostino, Maria-Antonietta; Boers, Maarten; Kirwan, John; van der Heijde, Désirée; Østergaard, Mikkel; Schett, Georg; Landewé, Robert B; Maksymowych, Walter P; Naredo, Esperanza; Dougados, Maxime; Iagnocco, Annamaria; Bingham, Clifton O; Brooks, Peter M; Beaton, Dorcas E; Gandjbakhch, Frederique; Gossec, Laure; Guillemin, Francis; Hewlett, Sarah E; Kloppenburg, Margreet; March, Lyn; Mease, Philip J; Moller, Ingrid; Simon, Lee S; Singh, Jasvinder A; Strand, Vibeke; Wakefield, Richard J; Wells, George A; Tugwell, Peter; Conaghan, Philip G

    2014-01-01

    OBJECTIVE: The Outcome Measures in Rheumatology (OMERACT) Filter provides a framework for the validation of outcome measures for use in rheumatology clinical research. However, imaging and biochemical measures may face additional validation challenges because of their technical nature. The Imaging...... evaluated using the original OMERACT Filter and the newly proposed structure. Breakout groups critically reviewed the extent to which the candidate biomarkers complied with the proposed stepwise approach, as a way of examining the utility of the proposed 3-dimensional structure. RESULTS: Although there was...... was obtained for a proposed tri-axis structure to assess validation of imaging and soluble biomarkers; nevertheless, additional work is required to better evaluate its place within the OMERACT Filter 2.0....

  10. Generalized Filtering Decomposition

    Grigori, Laura

    2011-01-01

    This paper introduces a new preconditioning technique that is suitable for matrices arising from the discretization of a system of PDEs on unstructured grids. The preconditioner satisfies a so-called filtering property, which ensures that the input matrix is identical with the preconditioner on a given filtering vector. This vector is chosen to alleviate the effect of low-frequency modes on convergence and so decrease or eliminate the plateau that is often observed in the convergence of iterative methods. In particular, the paper presents a general approach that makes it possible to ensure that the filtering condition is satisfied in a matrix decomposition. The input matrix can have an arbitrary sparse structure and can hence be reordered using nested dissection, to allow a parallel computation of the preconditioner and of the iterative process.
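
    In generic notation (an illustration of the stated property, not the paper's exact formulation), the filtering condition requires the preconditioner M to agree with the input matrix A on the chosen filtering vector t:

```latex
M t = A t ,
```

    so that error components along t, typically the smooth low-frequency modes, are treated exactly by each preconditioned iteration.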

  11. Hybrid Data Assimilation without Ensemble Filtering

    Todling, Ricardo; Akkraoui, Amal El

    2014-01-01

    The Global Modeling and Assimilation Office is preparing to upgrade its three-dimensional variational system to a hybrid approach in which the ensemble is generated using a square-root ensemble Kalman filter (EnKF) and the variational problem is solved using the Grid-point Statistical Interpolation system. As in most EnKF applications, we found it necessary to employ a combination of multiplicative and additive inflations, to compensate for sampling and modeling errors, respectively, and to maintain the small-member ensemble solution close to the variational solution; we also found it necessary to re-center the members of the ensemble about the variational analysis. During tuning of the filter we found re-centering and additive inflation to play a considerably larger role than expected, particularly in a dual-resolution context when the variational analysis is run at larger resolution than the ensemble. This led us to consider a hybrid strategy in which the members of the ensemble are generated by simply converting the variational analysis to the resolution of the ensemble and applying additive inflation, thus bypassing the EnKF. Comparisons of this so-called filter-free hybrid procedure with an EnKF-based hybrid procedure and a control non-hybrid, traditional scheme show both hybrid strategies provide equally significant improvement over the control; more interestingly, the filter-free procedure was found to give qualitatively similar results to the EnKF-based procedure.

  12. Digital filters in spectrometry

    In this work the development and application of digital signal processing to different multichannel analysis spectra is presented. The use of classic smoothing methods in signal processing applications is illustrated by a discussion of filters: autoregressive, moving average and ARMA filters. Generally, simple linear smoothing routines do not provide appropriate smoothing of data that show local ruggedness such as strong discontinuities; however, the algorithms developed here have proven adequate for this task. Four algorithms were tested: autoregressive, moving average, ARMA and binomial methods for 5, 7, and 9 data points, all in the time domain and programmed in Matlab. (Author)
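
    As an example of one of the smoothing methods named, a binomial kernel (here width 5, one of the window sizes mentioned) can be built by repeated convolution with [1, 1]/2; this is a generic sketch, not the code from the work, and the photopeak test signal is an illustrative assumption:

```python
import numpy as np

def binomial_smooth(y, width=5):
    """Binomial smoothing of a spectrum: the kernel is a normalized
    row of Pascal's triangle (width 5 gives [1, 4, 6, 4, 1] / 16),
    a good approximation to a narrow Gaussian."""
    kernel = np.array([1.0])
    for _ in range(width - 1):
        kernel = np.convolve(kernel, [0.5, 0.5])  # build the binomial row
    return np.convolve(y, kernel, mode="same")

# a noisy Gaussian photopeak on a flat background
x = np.arange(200)
peak = 100 * np.exp(-0.5 * ((x - 100) / 5.0) ** 2) + 10
rng = np.random.default_rng(3)
noisy = peak + rng.normal(0, 5, 200)
smooth = binomial_smooth(noisy)
```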

  13. Die filosofiese filter

    L F. Schulze

    1971-06-01

    Full Text Available A filter is a catcher. It can serve as a purifier that traps and retains the impurities in moisture or water. It can, however, also serve to catch light rays of a particular wavelength and keep them away from the eyes or from the lens of a camera. In this way the image or surroundings that I see or photograph are coloured. We use the word filter here in the latter sense, although most people do not want to see or acknowledge it as such, but mean to use it merely as a purifier.

  14. Filters in topology optimization

    Bourdin, Blaise

    1999-01-01

    In this article, a modified (``filtered'') version of the minimum compliance topology optimization problem is studied. The direct dependence of the material properties on its pointwise density is replaced by a regularization of the density field using a convolution operator. In this setting it is possible to establish the existence of solutions. Moreover, convergence of an approximation by means of finite elements can be obtained. This is illustrated through some numerical experiments. The ``filtering'' technique is also shown to cope with two important numerical problems in topology optimization...

  15. Alarm filtering and presentation

    This paper discusses alarm filtering and presentation in the control room of nuclear and other process control plants. Alarm generation and presentation is widely recognized as a general process control problem. Alarm systems often fail to provide meaningful alarms to operators. Alarm generation and presentation is an area in which computer aiding is feasible and provides clear benefits. Therefore, researchers have developed several computerized alarm filtering and presentation approaches. This paper discusses problems associated with alarm generation and presentation. Approaches to improving the alarm situation and installation issues of alarm system improvements are discussed. The impact of artificial intelligence (AI) technology on alarm system improvements is assessed. (orig.)

  16. Computing a Comprehensible Model for Spam Filtering

    Ruiz-Sepúlveda, Amparo; Triviño-Rodriguez, José L.; Morales-Bueno, Rafael

    In this paper, we describe the application of the Decision Tree Boosting (DTB) learning model to spam email filtering. This classification task implies learning in a high-dimensional feature space, so it is an example of how the DTB algorithm performs on such feature space problems. In [1], it has been shown that hypotheses computed by the DTB model are more comprehensible than the ones computed by other ensemble methods. Hence, this paper tries to show that the DTB algorithm maintains the same comprehensibility of hypotheses in high-dimensional feature space problems while achieving the performance of other ensemble methods. Four traditional evaluation measures (precision, recall, F1 and accuracy) have been considered for performance comparison between DTB and other models usually applied to spam email filtering. The size of the hypothesis computed by DTB is smaller and more comprehensible than the hypotheses computed by Adaboost and Naïve Bayes.
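
    For reference, the four evaluation measures named above, computed from binary labels on a toy labeling (the data are illustrative only):

```python
def spam_metrics(y_true, y_pred):
    """Precision, recall, F1 and accuracy from binary labels
    (1 = spam, 0 = ham)."""
    tp = sum(t == 1 and p == 1 for t, p in zip(y_true, y_pred))
    fp = sum(t == 0 and p == 1 for t, p in zip(y_true, y_pred))
    fn = sum(t == 1 and p == 0 for t, p in zip(y_true, y_pred))
    tn = sum(t == 0 and p == 0 for t, p in zip(y_true, y_pred))
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    f1 = 2 * precision * recall / (precision + recall)
    accuracy = (tp + tn) / len(y_true)
    return precision, recall, f1, accuracy

# 6 spam, 4 ham; the classifier misses one spam and flags one ham
y_true = [1, 1, 1, 1, 1, 1, 0, 0, 0, 0]
y_pred = [1, 1, 1, 1, 1, 0, 1, 0, 0, 0]
p, r, f1, acc = spam_metrics(y_true, y_pred)
```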

  17. Survey of HEPA filter experience

    A survey of high efficiency particulate air (HEPA) filter applications and experience at Department of Energy (DOE) sites was conducted to provide an overview of the reasons and magnitude of HEPA filter changeouts and failures. Results indicated that approximately 58% of the filters surveyed were changed out in the three year study period, and some 18% of all filters were changed out more than once. Most changeouts (63%) were due to the existence of a high pressure drop across the filter, indicative of filter plugging. Other reasons for changeout included leak-test failure (15%), preventive maintenance service life limit (13%), suspected damage (5%) and radiation buildup (4%). Filter failures occurred with approximately 12% of all installed filters. Of these failures, most (64%) occurred for unknown or unreported reasons. Handling or installation damage accounted for an additional 19% of reported failures. Media ruptures, filter-frame failures and seal failures each accounted for approximately 5 to 6% of the reported failures

  18. Traditional Medicine in Developing Countries

    Thorsen, Rikke Stamp

    People use traditional medicine to meet their health care needs in developing countries, and medical pluralism persists worldwide despite increased access to allopathic medicine. Traditional medicine includes a variety of treatment opportunities, among others consultation with a traditional healer or spiritual healer and self-treatment with herbal medicine or medicinal plants. Reliance on traditional medicine varies between countries and between rural and urban areas, but is reported to be as high as 80% in some developing countries. Increased realization of the continued importance of traditional medicine has led to the formulation of policies on the integration of traditional medicine into public health care. Local level integration is already taking place as people use multiple treatments when experiencing illness. Research on local level use of traditional medicine for health care, in particular the use...

  19. Bayesian Filters in Practice

    Krejsa, Jiří; Věchet, S.

    Bratislava: Slovak University of Technology in Bratislava, 2010, s. 217-222. ISBN 978-80-227-3353-3. [Robotics in Education . Bratislava (SK), 16.09.2010-17.09.2010] Institutional research plan: CEZ:AV0Z20760514 Keywords : mobile robot localization * bearing only beacons * Bayesian filters Subject RIV: JD - Computer Applications, Robotics

  20. Spectral Ensemble Kalman Filters

    Mandel, Jan; Kasanický, Ivan; Vejmelka, Martin; Fuglík, Viktor; Turčičová, Marie; Eben, Kryštof; Resler, Jaroslav; Juruš, Pavel

    2014-01-01

    Roč. 11, - (2014), EMS2014-446. [EMS Annual Meeting /14./ & European Conference on Applied Climatology (ECAC) /10./. 06.10.2014-10.10.2014, Prague] R&D Projects: GA ČR GA13-34856S Grant ostatní: NSF DMS -1216481 Institutional support: RVO:67985807 Keywords : data assimilation * spectral filter Subject RIV: DG - Athmosphere Sciences, Meteorology

  1. Spot-Zombie Filtering System

    Arathy Rajagopal

    2015-10-01

    Full Text Available A major security challenge on the Internet is the existence of a large number of compromised machines. Such machines have been increasingly used to launch various security attacks, including spamming, spreading malware, DDoS, and identity theft. These compromised machines are called "Zombies". In general, e-mail applications and providers use spam filters to filter out spam messages. Spam filtering is a technique for discriminating genuine messages from spam messages. Attackers send spam messages to the targeted machine by evading the filters, which causes an increase in false positives and false negatives. We develop an effective spam zombie detection system named SPOT by monitoring the outgoing messages of a network. SPOT focuses on the number of outgoing messages that are originated or forwarded by each computer on a network to identify the presence of Zombies. SPOT is designed based on a powerful statistical tool called the Sequential Probability Ratio Test, which has bounded false positive and false negative error rates.
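
    The statistical core, Wald's Sequential Probability Ratio Test, can be sketched on a Bernoulli stream of spam/ham verdicts (the spam rates and error bounds below are illustrative assumptions, not SPOT's configuration):

```python
import math

def sprt_decide(xs, p0=0.2, p1=0.8, alpha=0.01, beta=0.01):
    """Wald's SPRT on a stream of 0/1 observations (1 = outgoing
    message flagged as spam).  H1: the machine is a spam zombie
    (spam rate p1); H0: it is clean (rate p0).  The thresholds bound
    the false positive rate by alpha and the false negative rate by
    beta, and the test stops as soon as either is crossed."""
    upper = math.log((1 - beta) / alpha)   # accept H1 (zombie) above this
    lower = math.log(beta / (1 - alpha))   # accept H0 (clean) below this
    llr = 0.0                              # running log-likelihood ratio
    for n, x in enumerate(xs, 1):
        llr += math.log(p1 / p0) if x else math.log((1 - p1) / (1 - p0))
        if llr >= upper:
            return "zombie", n
        if llr <= lower:
            return "clean", n
    return "undecided", len(xs)

verdict, n_obs = sprt_decide([1, 1, 1, 1, 1, 1])   # six spam messages in a row
```

    With these parameters a run of spam messages crosses the upper threshold after only a handful of observations, which is why the test needs to monitor just a few outgoing messages per machine.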

  2. Spot- Zombie Filtering System

    Arathy Rajagopal

    2014-01-01

    Full Text Available A major security challenge on the Internet is the existence of a large number of compromised machines. Such machines have been increasingly used to launch various security attacks including spamming and spreading malware, DDoS, and identity theft. These compromised machines are called “Zombies”. In general, e-mail applications and providers use spam filters to filter out spam messages. Spam filtering is a technique for discriminating genuine messages from spam messages. The attackers send spam messages to the targeted machine while evading the filters, which causes an increase in false positives and false negatives. We develop an effective spam zombie detection system named SPOT by monitoring outgoing messages of a network. SPOT focuses on the number of outgoing messages that are originated or forwarded by each computer on a network to identify the presence of Zombies. SPOT is designed based on a powerful statistical tool called the Sequential Probability Ratio Test, which has bounded false positive and false negative error rates.

  3. Ceramic HEPA Filter Program

    Mitchell, M A; Bergman, W; Haslam, J; Brown, E P; Sawyer, S; Beaulieu, R; Althouse, P; Meike, A

    2012-04-30

    Potential benefits of ceramic filters in nuclear facilities: (1) Short term benefit for DOE, NRC, and industry - (a) CalPoly HTTU provides unique testing capability to answer questions for DOE - High temperature testing of materials, components, filter, (b) Several DNFSB correspondences and presentations by DNFSB members have highlighted the need for HEPA filter R and D - DNFSB Recommendation 2009-2 highlighted a nuclear facility response to an evaluation basis earthquake followed by a fire (aka shake-n-bake) and CalPoly has capability for a shake-n-bake test; (2) Intermediate term benefit for DOE and industry - (a) Filtration for specialty applications, e.g., explosive applications at Nevada, (b) Spin-off technologies applicable to other commercial industries; and (3) Long term benefit for DOE, NRC, and industry - (a) Across industry, strong desire for better performance filter, (b) Engineering solution to safety problem will improve facility safety and decrease dependence on associated support systems, (c) Large potential life-cycle cost savings, and (d) Facilitates development and deployment of LLNL process innovations to allow continuous ventilation system operation during a fire.

  4. Compressive Bilateral Filtering.

    Sugimoto, Kenjiro; Kamata, Sei-Ichiro

    2015-11-01

    This paper presents an efficient constant-time bilateral filter that produces a near-optimal performance tradeoff between approximate accuracy and computational complexity without any complicated parameter adjustment, called a compressive bilateral filter (CBLF). Constant-time means that the computational complexity is independent of the filter window size. Although many existing constant-time bilateral filters have been proposed step-by-step to pursue a more efficient performance tradeoff, they have focused less on the optimal tradeoff for their own frameworks. It is important to discuss this question, because it can reveal whether or not a constant-time algorithm still has plenty of room for improvement in its performance tradeoff. This paper tackles the question from the viewpoint of compressibility and highlights the fact that state-of-the-art algorithms have not yet reached the optimal tradeoff. The CBLF achieves a near-optimal performance tradeoff by two key ideas: 1) an approximate Gaussian range kernel through Fourier analysis and 2) a period length optimization. Experiments demonstrate that the CBLF significantly outperforms state-of-the-art algorithms in terms of approximate accuracy, computational complexity, and usability. PMID:26068315
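For reference, the brute-force bilateral filter that constant-time methods such as CBLF approximate can be sketched as follows. This is the O(r²)-per-pixel baseline, not the CBLF algorithm itself, and the kernel parameters are arbitrary:

```python
import numpy as np

def bilateral_filter(img, radius=2, sigma_s=2.0, sigma_r=0.1):
    """Brute-force bilateral filter on a 2-D grayscale image.

    Each output pixel is a weighted average over a (2*radius+1)^2 window,
    weighted by a spatial Gaussian and a range (intensity) Gaussian, so the
    cost per pixel grows with the window size -- the dependence that
    constant-time formulations remove.
    """
    img = np.asarray(img, dtype=float)
    H, W = img.shape
    pad = np.pad(img, radius, mode='edge')
    out = np.zeros_like(img)
    norm = np.zeros_like(img)
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            shifted = pad[radius + dy:radius + dy + H,
                          radius + dx:radius + dx + W]
            # spatial kernel * range kernel
            w = (np.exp(-(dx * dx + dy * dy) / (2 * sigma_s ** 2))
                 * np.exp(-(shifted - img) ** 2 / (2 * sigma_r ** 2)))
            out += w * shifted
            norm += w
    return out / norm
```

Because the weights are normalized per pixel, a constant image passes through unchanged while edges are preserved by the range kernel.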

  5. Efficient Iterated Filtering

    Lindström, Erik; Ionides, Edward; Frydendall, Jan;

    2012-01-01

    …Cramér-Rao efficient. The proposed estimator is easy to implement as it only relies on non-linear filtering. This makes the framework flexible, as it is easy to tune the implementation to achieve computational efficiency. This is done by using the approximation of the score function derived from the theory on Iterative...

  6. Mirrors as power filters

    Multilayer mirrors offer advantages in power filtering compared to total reflection mirrors in both wiggler and undulator beams at third generation synchrotron radiation sources currently under construction. These advantages come at the expense of increased absorbed power in the mirror itself, and of added complexity of beamline optical design. This paper discusses these aspects

  7. Morphing and Ensemble Filtering

    Mandel, J.; Beezley, J.; Resler, Jaroslav; Juruš, Pavel; Eben, Kryštof

    Prague: Institute of Computer Science of the AS CR, v.v.i, 2010, s. 1-9. [Workshop on "GHG reduction using IT" /2./. Prague (CZ), 28.05.2010] Institutional research plan: CEZ:AV0Z10300504 Keywords : data assimilation * Kalman filter Subject RIV: JE - Non-nuclear Energetics, Energy Consumption ; Use

  8. Ceramic HEPA Filter Program

    Potential benefits of ceramic filters in nuclear facilities: (1) Short term benefit for DOE, NRC, and industry - (a) CalPoly HTTU provides unique testing capability to answer questions for DOE - High temperature testing of materials, components, filter, (b) Several DNFSB correspondences and presentations by DNFSB members have highlighted the need for HEPA filter R and D - DNFSB Recommendation 2009-2 highlighted a nuclear facility response to an evaluation basis earthquake followed by a fire (aka shake-n-bake) and CalPoly has capability for a shake-n-bake test; (2) Intermediate term benefit for DOE and industry - (a) Filtration for specialty applications, e.g., explosive applications at Nevada, (b) Spin-off technologies applicable to other commercial industries; and (3) Long term benefit for DOE, NRC, and industry - (a) Across industry, strong desire for better performance filter, (b) Engineering solution to safety problem will improve facility safety and decrease dependence on associated support systems, (c) Large potential life-cycle cost savings, and (d) Facilitates development and deployment of LLNL process innovations to allow continuous ventilation system operation during a fire.

  9. Rotating drum filter

    Anson, Donald

    1990-01-01

    A perforated drum (10) rotates in a coaxial cylindrical housing (18) having three circumferential ports (19,22,23), and an axial outlet (24) at one end. The axis (11) is horizontal. A fibrous filter medium (20) is fed through a port (19) on or near the top of the housing (18) by a distributing mechanism (36) which lays a uniform mat (26) of the desired thickness onto the rotating drum (10). This mat (26) is carried by the drum (10) to a second port (23) through which dirty fluid (13) enters. The fluid (13) passes through the filter (26) and the cleaned stream (16) exits through the open end (15) of the drum (10) and the axial port (24) in the housing (18). The dirty filter material (20) is carried on to a third port (22) near the bottom of the housing (18) and drops into a receiver (31) from which it is continuously removed, cleaned (30), and returned (32) to the charging port (36) at the top. To support the filter mat, the perforated cylinder may carry a series of tines (40), shaped blades (41), or pockets, so that the mat (26) will not fall from the drum (10) prematurely. To minimize risk of mat failure, the fluid inlet port (23) may be located above the horizontal centerline (11).

  10. Magnetic-Optical Filter

    Formicola, I; Pinto, C; Cerulo, P

    2007-01-01

    Magnetic-Optical Filter (MOF) is an instrument suited for high precision spectral measurements for its peculiar characteristics. It is employed in Astronomy and in the field of the telecommunications (it is called FADOF there). In this brief paper we summarize its fundamental structure and functioning.

  11. Parzen Particle Filters

    Lehn-Schiøler, Tue; Erdogmus, Deniz; Principe, Jose C.

    Using a Parzen density estimator any distribution can be approximated arbitrarily close by a sum of kernels. In particle filtering this fact is utilized to estimate a probability density function with Dirac delta kernels; when the distribution is discretized it becomes possible to solve an...
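The idea of replacing Dirac delta kernels with smooth Parzen kernels can be illustrated with a one-dimensional Gaussian kernel density estimate; a minimal sketch with an arbitrary bandwidth, not the particle-filter algorithm from the paper:

```python
import numpy as np

def parzen_density(samples, x, bandwidth=0.3):
    """Parzen (kernel) density estimate with Gaussian kernels.

    Each sample contributes a smooth Gaussian bump instead of a Dirac
    delta; the estimate is the average of the N kernels, evaluated on
    the query points x.
    """
    samples = np.asarray(samples, dtype=float)[:, None]  # shape (N, 1)
    x = np.asarray(x, dtype=float)[None, :]              # shape (1, M)
    k = np.exp(-(x - samples) ** 2 / (2 * bandwidth ** 2))
    k /= bandwidth * np.sqrt(2 * np.pi)                  # normalize each kernel
    return k.mean(axis=0)                                # average over samples
```

Since each Gaussian kernel integrates to one, the resulting estimate is itself a proper density, which is what makes the Parzen representation attractive compared with a discrete sum of deltas.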

  12. Decontamination of HEPA filters

    Koenst, J.W. Jr.; Lewis, E.L.; Luthy, D.F.

    1978-01-01

    Mound Facility, during many years of plutonium-238 experience, has recovered over 150 kg of plutonium-238. Much of this material was recovered from HEPA filters or from solid wastes such as sludge and slag. The objective of this task was to modify and improve the existing nitric acid leaching process used at Mound so that filters from the nuclear fuel cycle could be decontaminated effectively. Various leaching agents were tested to determine their capability for dissolving PuO2, UO2, U3O8, AmO2, NpO2, CmO2, and ThO2 in mixtures of the following: HNO3-HF; HNO3-HF-H2SO4; and HNO3-(NH4)2Ce(NO3)6. Adsorption isotherms were obtained for two leaching systems. In some tests simulated contaminated HEPA filter material was used, while in others actual spent glovebox filters were used. The maximum decontamination factor of 833 was achieved in the recovery of plutonium-238 from actual filters. The dissolution was accomplished by using a six-stage process with 4N HNO3-0.23M (NH4)2Ce(NO3)6 as the leaching agent. Thorium oxide was also effectively dissolved from filter media using a mixture of nitric acid and ceric ammonium nitrate. Sodium carbonate and Na2CO3-KNO3 fusion tests were performed using simulated PuO2-contaminated filter media at various temperatures. Approximately 70 wt% of the PuO2 was soluble in a mixture composed of 70 wt% Na2CO3-30 wt% KNO3 (heated for 1 h at 950 °C). 23 figs., 14 tables.

  13. Blind Source Parameters for Performance Evaluation of Despeckling Filters

    Nagashettappa Biradar

    2016-01-01

    Full Text Available The speckle noise is inherent to transthoracic echocardiographic images. A standard noise-free reference echocardiographic image does not exist. The evaluation of filters based on the traditional parameters such as peak signal-to-noise ratio, mean square error, and structural similarity index may not reflect the true filter performance on echocardiographic images. Therefore, the performance of despeckling can be evaluated using blind assessment metrics like the speckle suppression index, speckle suppression and mean preservation index (SMPI, and beta metric. The need for noise-free reference image is overcome using these three parameters. This paper presents a comprehensive analysis and evaluation of eleven types of despeckling filters for echocardiographic images in terms of blind and traditional performance parameters along with clinical validation. The noise is effectively suppressed using the logarithmic neighborhood shrinkage (NeighShrink embedded with Stein’s unbiased risk estimation (SURE. The SMPI is three times more effective compared to the wavelet based generalized likelihood estimation approach. The quantitative evaluation and clinical validation reveal that the filters such as the nonlocal mean, posterior sampling based Bayesian estimation, hybrid median, and probabilistic patch based filters are acceptable whereas median, anisotropic diffusion, fuzzy, and Ripplet nonlinear approximation filters have limited applications for echocardiographic images.

  14. Blind Source Parameters for Performance Evaluation of Despeckling Filters.

    Biradar, Nagashettappa; Dewal, M L; Rohit, ManojKumar; Gowre, Sanjaykumar; Gundge, Yogesh

    2016-01-01

    The speckle noise is inherent to transthoracic echocardiographic images. A standard noise-free reference echocardiographic image does not exist. The evaluation of filters based on the traditional parameters such as peak signal-to-noise ratio, mean square error, and structural similarity index may not reflect the true filter performance on echocardiographic images. Therefore, the performance of despeckling can be evaluated using blind assessment metrics like the speckle suppression index, speckle suppression and mean preservation index (SMPI), and beta metric. The need for noise-free reference image is overcome using these three parameters. This paper presents a comprehensive analysis and evaluation of eleven types of despeckling filters for echocardiographic images in terms of blind and traditional performance parameters along with clinical validation. The noise is effectively suppressed using the logarithmic neighborhood shrinkage (NeighShrink) embedded with Stein's unbiased risk estimation (SURE). The SMPI is three times more effective compared to the wavelet based generalized likelihood estimation approach. The quantitative evaluation and clinical validation reveal that the filters such as the nonlocal mean, posterior sampling based Bayesian estimation, hybrid median, and probabilistic patch based filters are acceptable whereas median, anisotropic diffusion, fuzzy, and Ripplet nonlinear approximation filters have limited applications for echocardiographic images. PMID:27298618

  15. Generic Hardware Architectures for Sampling and Resampling in Particle Filters

    Petar M. Djurić

    2005-10-01

    Full Text Available Particle filtering is a statistical signal processing methodology that has recently gained popularity in solving several problems in signal processing and communications. Particle filters (PFs have been shown to outperform traditional filters in important practical scenarios. However their computational complexity and lack of dedicated hardware for real-time processing have adversely affected their use in real-time applications. In this paper, we present generic architectures for the implementation of the most commonly used PF, namely, the sampling importance resampling filter (SIRF. These provide a generic framework for the hardware realization of the SIRF applied to any model. The proposed architectures significantly reduce the memory requirement of the filter in hardware as compared to a straightforward implementation based on the traditional algorithm. We propose two architectures each based on a different resampling mechanism. Further, modifications of these architectures for acceleration of resampling process are presented. We evaluate these schemes based on resource usage and latency. The platform used for the evaluations is the Xilinx Virtex II pro FPGA. The architectures presented here have led to the development of the first hardware (FPGA prototype for the particle filter applied to the bearings-only tracking problem.
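The resampling stage that these architectures accelerate is commonly implemented in software as systematic resampling; a minimal sketch of that standard scheme (the hardware mechanisms proposed in the paper differ in structure):

```python
import numpy as np

def systematic_resample(weights, rng=None):
    """Systematic resampling for a SIR particle filter.

    Draws one uniform offset and places n evenly spaced positions in [0, 1),
    then maps each position onto the cumulative weight distribution.
    Returns the indices of the particles that survive; low-weight particles
    are dropped and high-weight particles are duplicated.
    """
    rng = np.random.default_rng() if rng is None else rng
    n = len(weights)
    positions = (rng.random() + np.arange(n)) / n  # one random number, n strata
    cumsum = np.cumsum(weights)
    cumsum[-1] = 1.0                               # guard against round-off
    return np.searchsorted(cumsum, positions)
```

Using a single random draw per resampling step (rather than n independent draws) is what makes this scheme low-variance and cheap, which is also why it is popular in hardware realizations.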

  16. The impact of metallic filter media on HEPA filtration

    Traditional HEPA filter systems have limitations that often prevent them from solving many of the filtration problems in the nuclear industry; particularly in applications where long service or storage life, high levels of radioactivity, dangerous decomposition products, chemical aggression, organic solvents, elevated operating temperatures, fire resistance and resistance to moisture are issues. This paper addresses several of these matters of concern by considering the use of metallic filter media to solve HEPA filtration problems ranging from the long term storage of transuranic waste at the WIPP site, spent and damaged fuel assemblies, in glove box ventilation and tank venting to the venting of fumes at elevated temperatures from incinerators, vitrification processes and conversion and sintering furnaces as well as downstream of iodine absorbers in gas cooled reactors in the UK. The paper reviews the basic technology, development, performance characteristics and filtration efficiency, flow versus differential pressure, cleanability and costs of sintered metal fiber in comparison with traditional resin bonded glass fiber filter media and sintered metal powder filter media. Examples of typical filter element and system configurations and applications will be presented. The paper will also address the economic case for installing self cleaning pre-filtration, using metallic media, to recover the small volumes of dust that would otherwise blind large volumes of final disposable HEPA filters, thus presenting a route to reduce ultimate disposal volumes and secondary waste streams. (authors)

  17. The impact of metallic filter media on HEPA filtration

    Chadwick, Chris; Kaufman, Seth [Microfiltrex, Porvair Filtration Group Ltd Fareham Industrial Park, Fareham, Hampshire, PO16 8XG (United Kingdom)

    2006-07-01

    Traditional HEPA filter systems have limitations that often prevent them from solving many of the filtration problems in the nuclear industry; particularly in applications where long service or storage life, high levels of radioactivity, dangerous decomposition products, chemical aggression, organic solvents, elevated operating temperatures, fire resistance and resistance to moisture are issues. This paper addresses several of these matters of concern by considering the use of metallic filter media to solve HEPA filtration problems ranging from the long term storage of transuranic waste at the WIPP site, spent and damaged fuel assemblies, in glove box ventilation and tank venting to the venting of fumes at elevated temperatures from incinerators, vitrification processes and conversion and sintering furnaces as well as downstream of iodine absorbers in gas cooled reactors in the UK. The paper reviews the basic technology, development, performance characteristics and filtration efficiency, flow versus differential pressure, cleanability and costs of sintered metal fiber in comparison with traditional resin bonded glass fiber filter media and sintered metal powder filter media. Examples of typical filter element and system configurations and applications will be presented. The paper will also address the economic case for installing self cleaning pre-filtration, using metallic media, to recover the small volumes of dust that would otherwise blind large volumes of final disposable HEPA filters, thus presenting a route to reduce ultimate disposal volumes and secondary waste streams. (authors)

  18. Computer aided design of reentrant coaxial filters including coaxial excitation

    Boria, V.; Gerini, G.; Guglielmi, M.

    1999-01-01

    An advanced EM based CAD tool is used for the detailed characterisation of a family of reentrant coaxial waveguide filters. The EM analysis includes the effects of tuning screws and of the input/output coaxial excitation. The software is essentially used as an efficient replacement for the tradition

  19. CSA noise measurement used on oscilloscope and digital filter

    A new method for measuring the noise of a low-noise charge-sensitive preamplifier with a digital oscilloscope and a digital filter is discussed in this paper. Compared with the traditional measurement, this method has the advantages of flexible parameters and convenient use. The test result reasonably accords with the conventional result. (authors)

  20. Sensory Pollution from Bag Filters, Carbon Filters and Combinations

    Bekö, Gabriel; Clausen, Geo; Weschler, Charles J.

    2008-01-01

    Used ventilation filters are a major source of sensory pollutants in air handling systems. The objective of the present study was to evaluate the net effect that different combinations of filters had on perceived air quality after 5 months of continuous filtration of outdoor suburban air. A panel...... that contained AC and a synthetic fiber cartridge filter that contained AC. Air that had passed through used filters was most acceptable for those sets in which an AC filter was used downstream of the particle filter. Comparable air quality was achieved with the stand-alone bag filter that contained AC....... Furthermore, its pressure drop changed very little during the 5 months of service, and it had the added benefit of removing a large fraction of ozone from the airstream. If similar results are obtained over a wider variety of soiling conditions, such filters may be a viable solution to a long recognized...

  1. Digital Simulation of a Hybrid Active Filter - An Active Filter in Series with a Shunt Passive Filter

    Sitaram, Mahesh I; Padiyar, KR; Ramanarayanan, V

    1998-01-01

    Active filters have long been in use for the filtering of power system load harmonics. In this paper, the digital simulation results of a hybrid active power filter system for a rectifier load are presented. The active filter is used for filtering higher order harmonics as the dominant harmonics are filtered by the passive filter. This reduces the rating of the active filter significantly. The DC capacitor voltage of the active filter is controlled using a PI controller.

  2. Intraindividual evaluation of the influence of iterative reconstruction and filter kernel on subjective and objective image quality in computed tomography of the brain

    Objectives: To intraindividually evaluate the potential of 4th generation iterative reconstruction (IR) on brain CT with regard to subjective and objective image quality. Methods: 31 consecutive raw data sets of clinical routine native sequential brain CT scans were reconstructed with IR level 0 (= filtered back projection), 1, 3 and 4; 3 different brain filter kernels (smooth/standard/sharp) were applied respectively. Five independent radiologists with different levels of experience performed subjective image rating. Detailed ROI analysis of image contrast and noise was performed. Statistical analysis was carried out by applying a random intercept model. Results: Subjective scores for the smooth and the standard kernels were best at low IR levels, but both, in particular the smooth kernel, scored inferior with an increasing IR level. The sharp kernel scored lowest at IR 0, while the scores substantially increased at high IR levels, reaching significantly best scores at IR 4. Objective measurements revealed an overall increase in contrast-to-noise ratio at higher IR levels, which was highest when applying the soft filter kernel. The absolute grey-white contrast decreased with an increasing IR level and was highest when applying the sharp filter kernel. All subjective effects were independent of the raters' experience and the patients' age and sex. Conclusion: Different combinations of IR level and filter kernel substantially influence subjective and objective image quality of brain CT. (orig.)
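The IR level 0 baseline referred to above, filtered back projection, can be sketched for an idealized parallel-beam geometry. This is a toy illustration with a Ram-Lak ramp filter and nearest-neighbour back projection, not a clinical CT reconstruction pipeline:

```python
import numpy as np

def ramp_filter(sino):
    """Apply the ramp (Ram-Lak) filter to each projection row in Fourier space."""
    n = sino.shape[1]
    freqs = np.fft.fftfreq(n)
    return np.real(np.fft.ifft(np.fft.fft(sino, axis=1) * np.abs(freqs), axis=1))

def fbp(sino, thetas):
    """Filtered back projection for a parallel-beam sinogram.

    sino[i, :] is the projection measured at angle thetas[i] (radians).
    Each filtered projection is smeared back across the image along its
    projection direction, using nearest-neighbour detector lookup.
    """
    n = sino.shape[1]
    filtered = ramp_filter(sino)
    xs = np.arange(n) - n / 2
    X, Y = np.meshgrid(xs, xs)
    recon = np.zeros((n, n))
    for proj, th in zip(filtered, thetas):
        # detector coordinate of each pixel for this projection angle
        t = X * np.cos(th) + Y * np.sin(th) + n / 2
        idx = np.clip(np.round(t).astype(int), 0, n - 1)
        recon += proj[idx]
    return recon * np.pi / len(thetas)
```

A quick sanity check: a sinogram whose every projection is a delta at the central detector bin reconstructs to an image peaked at the image center.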

  3. Intraindividual evaluation of the influence of iterative reconstruction and filter kernel on subjective and objective image quality in computed tomography of the brain

    Buhk, J.H. [Univ. Medical Center, Hamburg-Eppendorf (Germany). Dept. of Neuroradiology; Laqmani, A.; Schultzendorff, H.C. von; Hammerle, D.; Adam, G.; Regier, M. [Univ. Medical Center, Hamburg-Eppendorf (Germany). Dept. of Diagnostic and Interventional Radiology; Sehner, S. [Univ. Medical Center, Hamburg-Eppendorf (Germany). Inst. of Medical Biometry and Epidemiology; Fiehler, J. [Univ. Medical Center, Hamburg-Eppendorf (Germany). Neuroradiology; Nagel, H.D. [Dr. HD Nagel, Science and Technology for Radiology, Buchholz (Germany)

    2013-08-15

    Objectives: To intraindividually evaluate the potential of 4th generation iterative reconstruction (IR) on brain CT with regard to subjective and objective image quality. Methods: 31 consecutive raw data sets of clinical routine native sequential brain CT scans were reconstructed with IR level 0 (= filtered back projection), 1, 3 and 4; 3 different brain filter kernels (smooth/standard/sharp) were applied respectively. Five independent radiologists with different levels of experience performed subjective image rating. Detailed ROI analysis of image contrast and noise was performed. Statistical analysis was carried out by applying a random intercept model. Results: Subjective scores for the smooth and the standard kernels were best at low IR levels, but both, in particular the smooth kernel, scored inferior with an increasing IR level. The sharp kernel scored lowest at IR 0, while the scores substantially increased at high IR levels, reaching significantly best scores at IR 4. Objective measurements revealed an overall increase in contrast-to-noise ratio at higher IR levels, which was highest when applying the soft filter kernel. The absolute grey-white contrast decreased with an increasing IR level and was highest when applying the sharp filter kernel. All subjective effects were independent of the raters' experience and the patients' age and sex. Conclusion: Different combinations of IR level and filter kernel substantially influence subjective and objective image quality of brain CT. (orig.)

  4. Collaborative recommendations with content-based filters for cultural activities via a scalable event distribution platform

    De Pessemier, Toon; Coppens, Sam; Geebelen, Kristof; Vleugels, Chris; Bannier, Stijn; Mannens, Erik; Vanhecke, Kris; Martens, Luc

    2012-01-01

    Nowadays, most people have limited leisure time and the offer of (cultural) activities to spend this time is enormous. Consequently, picking the most appropriate events becomes increasingly difficult for end-users. This complexity of choice reinforces the necessity of filtering systems that assist users in finding and selecting relevant events. Whereas traditional filtering tools enable e.g. the use of keyword-based or filtered searches, innovative recommender systems draw on user ratings, pr...

  5. Advanced Techniques in Harmonic Suppression via Active Power Filter (APF): A Review

    Ekhlas Mhawi; Hamdan Daniyal; Mohd Herwan Sulaiman

    2015-01-01

    This paper presents recent developments in artificial intelligence (AI) applications in active power filters (APF). As a result of developments in power electronic technology, the APF continues to attract ample attention. Compared with the traditional reactive LC filter, the active power filter is considered to be more effective in compensating harmonic currents generated by nonlinear loads. An APF can improve power quality and the reliability and stability of the power utility. ...

  6. Location Estimation for an Autonomously Guided Vehicle using an Augmented Kalman Filter to Autocalibrate the Odometry

    Larsen, Thomas Dall; Bak, Martin; Andersen, Nils Axel; Ravn, Ole

    1998-01-01

    A Kalman filter using encoder readings as inputs and vision measurements as observations is designed as a location estimator for an autonomously guided vehicle (AGV). To reduce the effect of modelling errors an augmented filter that estimates the true system parameters is designed. The traditional...
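The idea of augmenting the filter state with unknown system parameters can be illustrated on a 1-D odometry model. The scale-error state `b` and the noise values below are illustrative assumptions, not the AGV model from the paper:

```python
import numpy as np

def augmented_kf_step(x, P, u, z, q, r):
    """One step of a Kalman filter with an augmented state.

    The position evolves as p' = p + (1 + b) * u, where u is the odometry
    increment and b is an unknown, (almost) constant scale error that is
    estimated alongside p. State x = [p, b]; z is a direct position
    measurement (e.g. from vision); q and r are process/measurement
    noise variances.
    """
    # predict: the model is linear in (p, b) for a known input u
    F = np.array([[1.0, u],
                  [0.0, 1.0]])
    x_pred = np.array([x[0] + (1 + x[1]) * u, x[1]])
    P_pred = F @ P @ F.T + np.diag([q, 1e-8])
    # update with the position measurement z
    H = np.array([[1.0, 0.0]])
    S = H @ P_pred @ H.T + r            # innovation covariance (1x1)
    K = P_pred @ H.T / S                # Kalman gain (2x1)
    x_new = x_pred + (K * (z - x_pred[0])).ravel()
    P_new = (np.eye(2) - K @ H) @ P_pred
    return x_new, P_new
```

Because the scale error b shows up in every predicted position, repeated position measurements make it observable, so the filter autocalibrates the odometry while it localizes.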

  7. Experimental study of filter cake formation on different filter media

    Removal of particulate matter from gases generated in the process industry is important for product recovery as well as emission control. The dynamics of a filtration plant depend on operating conditions. The models that predict filter plant behaviour involve empirical resistance parameters, which are usually derived from limited experimental data and are characteristic of the filter medium and filter cake (dust deposited on the filter medium). Filter cake characteristics are affected by the nature of the filter medium, process parameters and the mode of filter regeneration. Removal of dust particles from air is studied in a pilot-scale jet-pulsed bag filter facility closely resembling industrial filters. Limestone dust and ambient air are used in this study with two widely different filter media. All important parameters, such as pressure drop, gas flow rate and dust settling, are recorded continuously at 1 s intervals. The data are processed for estimation of the resistance parameters. The pressure drop rise on the test filter media is compared. Results reveal that the surface of the filter medium has an influence on the pressure drop rise (concave pressure drop rise). A similar effect is produced by a partially jet-pulsed filter surface. Filter behaviour is also simulated using the estimated parameters and a simplified model and compared with the experimental results. Distribution of cake area load is therefore an important aspect of modelling jet-pulse-cleaned bag filters. The mean specific cake resistance remains nearly constant on thoroughly jet-pulse-cleaned membrane-coated filter bags. However, the trend cannot be confirmed without independent cake height and density measurements. Thus the results reveal the importance of independent measurements of cake resistance. (author)

  8. The Rao-Blackwellized Particle Filter: A Filter Bank Implementation

    Hendeby, Gustaf; Karlsson, Rickard; Gustafsson, Fredrik

    2010-01-01

    For computational efficiency, it is important to utilize model structure in particle filtering. One of the most important cases occurs when there exists a linear Gaussian substructure, which can be efficiently handled by Kalman filters. This is the standard formulation of the Rao-Blackwellized particle filter (RBPF). This contribution suggests an alternative formulation of this well-known result that facilitates reuse of standard filtering components and which is also suitable for object-ori...

  9. An Adjoint-Based Adaptive Ensemble Kalman Filter

    Song, Hajoon

    2013-10-01

    A new hybrid ensemble Kalman filter/four-dimensional variational data assimilation (EnKF/4D-VAR) approach is introduced to mitigate background covariance limitations in the EnKF. The work is based on the adaptive EnKF (AEnKF) method, which bears a strong resemblance to the hybrid EnKF/three-dimensional variational data assimilation (3D-VAR) method. In the AEnKF, the representativeness of the EnKF ensemble is regularly enhanced with new members generated after back projection of the EnKF analysis residuals to state space using a 3D-VAR [or optimal interpolation (OI)] scheme with a preselected background covariance matrix. The idea here is to reformulate the transformation of the residuals as a 4D-VAR problem, constraining the new member with model dynamics and the previous observations. This should provide more information for the estimation of the new member and reduce dependence of the AEnKF on the assumed stationary background covariance matrix. This is done by integrating the analysis residuals backward in time with the adjoint model. Numerical experiments are performed with the Lorenz-96 model under different scenarios to test the new approach and to evaluate its performance with respect to the EnKF and the hybrid EnKF/3D-VAR. The new method leads to the least root-mean-square estimation errors as long as the linear assumption guaranteeing the stability of the adjoint model holds. It is also found to be less sensitive to choices of the assimilation system inputs and parameters.
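The EnKF analysis step that the AEnKF builds on can be sketched in its stochastic (perturbed-observations) form; a minimal illustration of that standard step, not the AEnKF/4D-VAR scheme itself:

```python
import numpy as np

def enkf_analysis(X, y, H, R, rng):
    """Stochastic EnKF analysis step with perturbed observations.

    X: (n, N) ensemble of state vectors (n state dims, N members)
    y: (m,) observation vector
    H: (m, n) linear observation operator
    R: (m, m) observation error covariance
    Returns the analysis ensemble of the same shape as X.
    """
    n, N = X.shape
    Xm = X.mean(axis=1, keepdims=True)
    A = X - Xm
    B = A @ A.T / (N - 1)                 # sample background covariance
    S = H @ B @ H.T + R                   # innovation covariance
    K = B @ H.T @ np.linalg.inv(S)        # Kalman gain
    # perturb the observation independently for each member so that the
    # analysis ensemble has the correct posterior spread
    Y = y[:, None] + rng.multivariate_normal(np.zeros(len(y)), R, size=N).T
    return X + K @ (Y - H @ X)
```

The small-ensemble estimate of B is exactly what the AEnKF enriches: new members generated from the analysis residuals compensate for directions the sample covariance fails to represent.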

  10. Analog filters in nanometer CMOS

    Uhrmann, Heimo; Zimmermann, Horst

    2014-01-01

    Starting from the basics of analog filters and the poor transistor characteristics in nanometer CMOS 10 high-performance analog filters developed by the authors in 120 nm and 65 nm CMOS are described extensively. Among them are gm-C filters, current-mode filters, and active filters for system-on-chip realization for Bluetooth, WCDMA, UWB, DVB-H, and LTE applications. For the active filters several operational amplifier designs are described. The book, furthermore, contains a review of the newest state of research on low-voltage low-power analog filters. To cover the topic of the book comprehensively, linearization issues and measurement methods for the characterization of advanced analog filters are introduced in addition. Numerous elaborate illustrations promote an easy comprehension. This book will be of value to engineers and researchers in industry as well as scientists and Ph.D students at universities. The book is also recommendable to graduate students specializing on nanoelectronics, microelectronics ...

  11. Development of DWDM Filter Manufacture

    2001-01-01

    DWDM technology is developing rapidly, and thin-film narrow bandpass filters play an important role in this field. This article presents some achievements in developing DWDM narrow bandpass filters and describes the results we have achieved.

  12. Active resistance capacitance filter design

    Kerwin, W. J.

    1970-01-01

    Filters formed by combining distributed RC elements with positive-feedback voltage amplifiers provide transfer functions similar to those of the heavier LC filters ordinarily employed, and they also provide signal amplification.

  13. Hierarchical Bayes Ensemble Kalman Filtering

    Tsyrulnikov, Michael

    2015-01-01

    Ensemble Kalman filtering (EnKF), when applied to high-dimensional systems, suffers from an inevitably small affordable ensemble size, which results in poor estimates of the background error covariance matrix ${\bf B}$. The common remedy is a kind of regularization, usually an ad-hoc spatial covariance localization (tapering) combined with artificial covariance inflation. Instead of using an ad-hoc regularization, we adopt the idea by Myrseth and Omre (2010) and explicitly admit that the ${\bf B}$ matrix is unknown and random and estimate it along with the state (${\bf x}$) in an optimal hierarchical Bayes analysis scheme. We separate forecast errors into predictability errors (i.e. forecast errors due to uncertainties in the initial data) and model errors (forecast errors due to imperfections in the forecast model) and include the two respective components ${\bf P}$ and ${\bf Q}$ of the ${\bf B}$ matrix into the extended control vector $({\bf x},{\bf P},{\bf Q})$. Similarly, we break the traditional backgrou...

  14. Spatial filtering efficiency of monostatic biaxial lidar: analysis and applications.

    Agishev, Ravil R; Comeron, Adolfo

    2002-12-20

    Results of lidar modeling based on spatial-angular filtering efficiency criteria are presented. Their analysis shows that the low spatial-angular filtering efficiency of traditional visible and near-infrared systems is an important cause of a low signal/background-radiation ratio (SBR) at the photodetector input. The low SBR may be responsible for considerable measurement errors and the ensuing low accuracy of the retrieval of atmospheric optical parameters. As shown, the most effective protection against sky background radiation for ground-based biaxial lidars is to modify their angular field according to a spatial-angular filtering efficiency criterion. Some effective approaches to achieving high filtering efficiency in receiving-system optimization are discussed. PMID:12510915

  15. Reconfigurable Mixed Mode Universal Filter

    Neelofer Afzal; Devesh Singh

    2014-01-01

    This paper presents a novel mixed mode universal filter configuration capable of working in voltage and transimpedance mode. The proposed single filter configuration can be reconfigured digitally to realize all the five second order filter functions (types) at single output port. Other salient features of proposed configuration include independently programmable filter parameters, full cascadability, and low sensitivity figure. However, all these features are provided at the cost of quite lar...

  16. Traditional Knowledge and Human Rights

    Haugen, Hans Morten

    2005-01-01

    In the realm of intellectual property protection, traditional knowledge has been receiving increasing attention in the past few years. Presently, however, there is no international treaty regulating traditional knowledge. Nonetheless, the efforts of the Intergovernmental Committee on Intellectual Property and Genetic Resources, Traditional Knowledge and Folklore (GRTKF) within the World Intellectual Property Organization (WIPO) might just result in the creation of such an inte...

  17. Property rights and traditional knowledge

    JT Cross

    2010-01-01

    For the past several decades, there has been a push to provide some sort of right akin to an intellectual property right in traditional knowledge and traditional cultural expression. This push has encountered staunch resistance from a number of different quarters. Many of the objections are practical. However, underlying these practical concerns is a core philosophical concern. A system of traditional knowledge rights, this argument suggests, simply does not satisfy the basic rationale for gr...

  18. The Rao-Blackwellized Particle Filter: A Filter Bank Implementation

    Karlsson Rickard

    2010-01-01

    Full Text Available For computational efficiency, it is important to utilize model structure in particle filtering. One of the most important cases occurs when there exists a linear Gaussian substructure, which can be efficiently handled by Kalman filters. This is the standard formulation of the Rao-Blackwellized particle filter (RBPF). This contribution suggests an alternative formulation of this well-known result that facilitates reuse of standard filtering components and which is also suitable for object-oriented programming. Our RBPF formulation can be seen as a Kalman filter bank with stochastic branching and pruning.

  19. Condensate filtering device

    In a condensate filtering device of a nuclear power plant, a water collecting pipe is disposed over the entire length, and an end of each hollow thread is in communication with the water collecting pipe and secured. If the length of the water collecting pipe is extended, a filtering device of arbitrary length can be obtained irrespective of the length of the hollow threads. Therefore, since there is no need to connect units when constituting a module, the flow of cleaning gases is not restricted at connection portions. Accordingly, even if the volume of the device is increased by extension of the module, the working life of the module is not degraded. (T.M.)

  20. Dichroic ultraviolet light filters

    Kocher, Christoph; Weder, Christoph; Smith, Paul

    2003-10-01

    With the intention to produce dichroic filters for use in photoluminescent systems that rely on polarized UV light, we synthesized a number of linear, dichroic dyes, which absorb mainly in the near-UV range of the electromagnetic spectrum. These dyes were designed for compatibility with common thermoplastic polymers such as linear low-density poly(ethylene), poly(ethylene terephthalate), and polyamide-12. Films of these host polymers that consisted of 0.2% by weight of various dichroic UV dyes were produced by common melt-processing schemes. Uniaxial drawing of these films yielded highly dichroic UV filters with dichroic ratios in absorption that in some cases exceeded 100. The fact that these free-standing films display little or no coloration and are environmentally stable makes them useful for various applications that involve generation of polarized UV light.

  1. Privacy Preserving Spam Filtering

    Pathak, Manas A; Raj, Bhiksha

    2011-01-01

    We present an approach to training a binary logistic regression classifier in the setting where the training data needs to be kept private. We provide a theoretical analysis of the security of this procedure and experimental results for the problem of large scale spam detection. High performance spam filters often use character n-grams as features, which result in large sparse vectors to which applying our protocol directly is not feasible. We explore various dimensionality reduction and parallelization approaches and provide a detailed analysis of the speed and accuracy trade-off. Our results show that we can achieve the accuracy of state of the art spam filters at training and testing times comparable to the non-private version of logistic regression.

  2. Filter Bank Fusion frames

    Chebira, Amina; Fickus, Matthew; Mixon, Dustin G.

    2011-01-01

    In this paper we characterize and construct novel oversampled filter banks implementing fusion frames. A fusion frame is a sequence of orthogonal projection operators whose sum can be inverted in a numerically stable way. When properly designed, fusion frames can provide redundant encodings of signals which are optimally robust against certain types of noise and erasures. However, up to this point, few implementable constructions of such frames were known; we show how to construct them using ...

  3. A new nonlinear filter

    Elliott, Robert J.; Haykin, Simon

    2006-01-01

    A discrete time filter is constructed where both the observation and signal process have non-linear dynamics with additive white Gaussian noise. Using the reference probability framework, a convolution Zakai equation is obtained which updates the unnormalized conditional density. Our work obtains approximate solutions of this equation in terms of Gaussian sums when second-order expansions are introduced for the non-linear terms.

  4. Resampling in particle filters

    Hol, Jeroen D.

    2004-01-01

    In this report a comparison is made between four frequently encountered resampling algorithms for particle filters. A theoretical framework is introduced to be able to understand and explain the differences between the resampling algorithms. This facilitates a comparison of the algorithms based on resampling quality and on computational complexity. Using extensive Monte Carlo simulations the theoretical results are verified. It is found that systematic resampling is favourable, both in resamp...
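
    Of the four algorithms compared, systematic resampling (the one the report finds favourable) is also the simplest to implement. A minimal NumPy sketch, with an interface and names of our own choosing rather than the report's:

```python
import numpy as np

def systematic_resampling(weights, rng=None):
    """Systematic resampling for particle filters.

    A single uniform draw u0 ~ U[0, 1/N) generates the evenly spaced
    thresholds u_k = (u0 + k) / N, which are matched against the
    cumulative sum of the normalized weights.
    """
    rng = np.random.default_rng(rng)
    w = np.asarray(weights, dtype=float)
    n = len(w)
    w = w / w.sum()                          # normalize the weights
    positions = (rng.random() + np.arange(n)) / n
    cumsum = np.cumsum(w)
    cumsum[-1] = 1.0                         # guard against round-off
    return np.searchsorted(cumsum, positions)
```

    Because only one random number is drawn per resampling step, the scheme is cheap and low-variance, which is consistent with the report's conclusion.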

  5. Archimedes Mass Filter Vaporizer

    Putvinski, S.; Agnew, A. F.; Cluggish, B. P.; Ohkawa, T.; Sevier, L.; Umstadter, K. R.; Dresvin, S. V.; Kuteev, B. V.; Feygenson, O. N.; Ivanov, D. V.; Zverev, S. G.; Miroshnikov, I. V.; Egorov, S. M.; Kiesewetter, D. V.; Maliugin, V. I.

    2001-10-01

    Archimedes Technology Group, Inc., is developing a plasma mass separator called the Archimedes Filter that separates waste oxide mixtures ion by ion into two mass groups: light and heavy. Since high-level waste at Hanford has 99.9% of its radioactivity associated with heavy elements, the Archimedes Filter can effectively decontaminate over three-quarters of that waste. The Filter process involves some preprocessing followed by volatilization and separation by the magnetic and electric fields of the main plasma. This presentation describes the approach to volatilization of the waste oxy-hydroxide mixture by means of a very high heat flux (q > 10 MW/m2). Such a high heat flux is required to ensure congruent evaporation of the complex oxy-hydroxide mixture and is achieved by injection of small droplets of molten waste into an inductively coupled plasma (ICP) torch. This presentation further addresses different issues related to evaporation of the waste, including modeling of droplet evaporation, estimates of the parameters of the plasma torch, and 2D modeling of the plasma. The experimental test bed for oxide vaporization and results of the initial experiments on oxide evaporation in a 60 kW ICP torch will also be described.

  6. Controlling flow conditions of test filters in iodine filters

    Several different iodine filter and test filter designs and experience gained from their operation are presented. For the flow experiments, an iodine filter system equipped with flow regulating and measuring devices was built. In the experiments the influence of the packing method of the iodine sorption material and the influence of the flow regulating and measuring divices upon the flow conditions in the test filters was studied. On the basis of the experiments it has been shown that the flows through the test filters always can be adjusted to a correct value if there only is a high enough pressure difference available across the test filter ducting. As a result of the research, several different methods are presented with which the flows through the test filters in both operating and future iodine sorption system can easily be measured and adjusted to their correct values. (author)

  7. Morphological filters for functional assessment of roundness profiles

    Filtration techniques are useful tools for analysing roundness profiles. The 2RC filter and the Gaussian filter are commonly used to assess peripheral undulations of roundness data, but they cannot address every aspect of functional prediction. Morphological filters are employed here to characterize roundness profiles for functional assessment. Traditional computation methods for morphological filters are limited to planar surfaces and cannot be extended to roundness measurement. A novel method based on alpha shape theory is developed to overcome this limitation. The morphological closing and opening envelopes are obtained by rolling a disk upon the roundness profile from the air side and the material side of the component respectively. They can be used to identify significant peaks and valleys on the profile, which are vital to the functional performance of components, especially in contact phenomena. A case study is presented where various options of morphological filters and reference circles are applied to a roundness profile, delivering different functional meanings. An in-depth comparison of morphological filters and the Gaussian filter follows to derive their pros and cons. (paper)

  8. COBE experience with filter QUEST

    Filla, O.; Keat, J.; Chu, D.

    1991-10-01

    A gyro-based filter variation on the standard QUEST attitude determination algorithm is applied to the Cosmic Background Explorer (COBE). Filter QUEST is found to be three times as fast as the batch estimator and slightly more accurate than regular QUEST. Perhaps more important than its speed or accuracy is the fact that Filter QUEST can provide real-time attitude solutions when regular QUEST cannot, due to lack of observability. Filter QUEST is also easy to use and adjust for the proper memory length. Suitable applications for Filter QUEST include coarse and real-time attitude determination.

  9. A Short Note on t-filters, I-filters and Extended Filters on Residuated Lattices

    Víta, Martin

    2015-01-01

    Roč. 271, 15 July (2015), s. 168-171. ISSN 0165-0114 R&D Projects: GA ČR GAP202/10/1826 Institutional support: RVO:67985807 Keywords: t-filters * I-filters * extended filters * residuated lattices Subject RIV: BA - General Mathematics Impact factor: 1.986, year: 2014

  10. Cherokee Stickball: A Changing Tradition.

    Olson, Ted

    1993-01-01

    Discusses the history of Cherokee stickball, a ball game dating back at least to the 1500s that was once used (as an alternative to war) for resolving grievances between tribes and townships. Describes traditional aspects of Cherokee stickball and notes the steady decline of the game and its traditional rules and ceremonies. (LP)

  11. Modern calorimetry: going beyond tradition

    Jeong, Y. H.

    2001-01-01

    Calorimetry has been a traditional tool for obtaining invaluable thermodynamic information of matter, the free energy. We describe recent efforts to go beyond this traditional calorimetry: After introducing dynamic heat capacity, we present the various experimental methods to measure it. Applications and future prospects are also given.

  12. Filter for reactor emergency cooling system

    The invention describes the design of a filter for the emergency cooling system. The new type of filter can be rinsed by flushing water backwards through it. This arrangement prevents the filter from silting up.

  13. Collaborative Filtering Recommendation on Users' Interest Sequences.

    Cheng, Weijie; Yin, Guisheng; Dong, Yuxin; Dong, Hongbin; Zhang, Wansong

    2016-01-01

    As an important factor for improving recommendations, time information has been introduced to model users' dynamic preferences in many papers. However, the sequence of users' behaviour is rarely studied in recommender systems. Due to the users' unique behavior evolution patterns and personalized interest transitions among items, users' similarity in sequential dimension should be introduced to further distinguish users' preferences and interests. In this paper, we propose a new collaborative filtering recommendation method based on users' interest sequences (IS) that rank users' ratings or other online behaviors according to the timestamps when they occurred. This method extracts the semantics hidden in the interest sequences by the length of users' longest common sub-IS (LCSIS) and the count of users' total common sub-IS (ACSIS). Then, these semantics are utilized to obtain users' IS-based similarities and, further, to refine the similarities acquired from traditional collaborative filtering approaches. With these updated similarities, transition characteristics and dynamic evolution patterns of users' preferences are considered. Our new proposed method was compared with state-of-the-art time-aware collaborative filtering algorithms on datasets MovieLens, Flixster and Ciao. The experimental results validate that the proposed recommendation method is effective and outperforms several existing algorithms in the accuracy of rating prediction. PMID:27195787
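
    The LCSIS component described above reduces to the classic longest-common-subsequence computation. A hedged sketch (the normalization below is our simplification for illustration, not the paper's exact similarity measure):

```python
def lcs_length(seq_a, seq_b):
    """Length of the longest common subsequence of two item sequences."""
    m, n = len(seq_a), len(seq_b)
    dp = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            if seq_a[i - 1] == seq_b[j - 1]:
                dp[i][j] = dp[i - 1][j - 1] + 1
            else:
                dp[i][j] = max(dp[i - 1][j], dp[i][j - 1])
    return dp[m][n]

def interest_sequence_similarity(seq_a, seq_b):
    """Normalized sequence similarity in [0, 1]; a simplified proxy
    for an LCSIS-based user similarity."""
    if not seq_a or not seq_b:
        return 0.0
    return lcs_length(seq_a, seq_b) / max(len(seq_a), len(seq_b))
```

    Such a sequence-based score can then be blended with a traditional rating-based similarity, as the paper proposes.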

  14. OPTIMIZATION OF ADVANCED FILTER SYSTEMS

    R.A. Newby; G.J. Bruck; M.A. Alvin; T.E. Lippert

    1998-04-30

    Reliable, maintainable and cost effective hot gas particulate filter technology is critical to the successful commercialization of advanced, coal-fired power generation technologies, such as IGCC and PFBC. In pilot plant testing, the operating reliability of hot gas particulate filters has been periodically compromised by process issues, such as process upsets and difficult ash cake behavior (ash bridging and sintering), and by design issues, such as cantilevered filter elements damaged by ash bridging, or excessively close packing of filtering surfaces resulting in unacceptable pressure drop or filtering surface plugging. This test experience has focused the issues and has helped to define advanced hot gas filter design concepts that offer higher reliability. Westinghouse has identified two advanced ceramic barrier filter concepts that are configured to minimize the possibility of ash bridge formation and to be robust against ash bridges should they occur. The ''inverted candle filter system'' uses arrays of thin-walled, ceramic candle-type filter elements with inside-surface filtering, and contains the filter elements in metal enclosures for complete separation from ash bridges. The ''sheet filter system'' uses ceramic, flat plate filter elements supported from vertical pipe-header arrays that provide geometry that avoids the buildup of ash bridges and allows free fall of the back-pulse released filter cake. The Optimization of Advanced Filter Systems program is being conducted to evaluate these two advanced designs and to ultimately demonstrate one of the concepts in pilot scale. In the Base Contract program, the subject of this report, Westinghouse has developed conceptual designs of the two advanced ceramic barrier filter systems to assess their performance, availability and cost potential, and to identify technical issues that may hinder the commercialization of the technologies. A plan for the Option I, bench

  15. A comparison of error subspace Kalman filters

    Nerger, Lars; Hiller, Wolfgang; Schröter, Jens

    2005-01-01

    Three advanced filter algorithms based on the Kalman filter are reviewed and presented in a unified notation. They are the well-known ensemble Kalman filter (EnKF), the singular evolutive extended Kalman (SEEK) filter, and the less common singular evolutive interpolated Kalman (SEIK) filter. For comparison, the mathematical formulations of the filters are reviewed in relation to the extended Kalman filter as error subspace Kalman filters. The algorithms are presented in their original form an...

  16. Spindoktorer et politisk filter

    Talic, Elvedin; Bernt, Rune; Mortensen, Mass Holmegård

    2015-01-01

    Through the TV and other media, many rumors about what a spin-doctor does and the nature of his services have spread. We would like to clarify the function; role and influence the spin-doctor have on Danish politics. We will look specifically at the relationship of power between the Ministers and spin-doctors, and the spin-doctors and media, as some believe that the spin-doctor sometimes will rise to power. We will also take look at the filter function between spin-doctors and media, as it is...

  17. Multilevel ensemble Kalman filtering

    Hoel, Hakon

    2016-06-14

    This work embeds a multilevel Monte Carlo sampling strategy into the Monte Carlo step of the ensemble Kalman filter (EnKF) in the setting of finite dimensional signal evolution and noisy discrete-time observations. The signal dynamics is assumed to be governed by a stochastic differential equation (SDE), and a hierarchy of time grids is introduced for multilevel numerical integration of that SDE. The resulting multilevel EnKF is proved to asymptotically outperform EnKF in terms of computational cost versus approximation accuracy. The theoretical results are illustrated numerically.
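
    The multilevel idea rests on the telescoping identity E[X_L] = E[X_0] + Σ_l E[X_l − X_{l−1}]: most samples are drawn at cheap coarse levels and only a few at the fine level. A generic sketch under that identity (the sampler interface is hypothetical; the paper embeds this strategy inside the EnKF forecast step):

```python
import numpy as np

def mlmc_estimate(sampler, levels, samples_per_level, rng=None):
    """Multilevel Monte Carlo estimate of E[X_L] via the telescoping sum
    E[X_L] = E[X_0] + sum_l E[X_l - X_{l-1}].

    `sampler(level, n, rng)` must return n coupled draws of
    (X_level, X_{level-1}); at level 0 the coarse component is 0.
    """
    rng = np.random.default_rng(rng)
    total = 0.0
    for level, n in zip(range(levels + 1), samples_per_level):
        fine, coarse = sampler(level, n, rng)
        total += np.mean(fine - coarse)      # one telescoping correction term
    return total
```

    The cost saving comes from choosing `samples_per_level` to decrease as the level (and per-sample cost) grows, while the variance of each correction term shrinks.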

  18. Factorized Kalman Filtering

    Suzdaleva, Evgenia

    Praha: ÚTIA AV ČR, 2006 - (Přikryl, J.; Šmídl, V.). s. 51-52 [International PhD Workshop on Interplay of Societal and Technical Decision-Making, Young Generation Viewpoint /7./. 25.09.2006-30.09.2006, Hrubá Skála] R&D Projects: GA MŠk 1M0572; GA ČR GP201/06/P434 Institutional research plan: CEZ:AV0Z10750506 Keywords : state estimation * factorized filters * traffic control Subject RIV: BC - Control Systems Theory

  19. Factorized Kalman Filtering

    Suzdaleva, Evgenia

    Praha: ÚTIA AV ČR, 2006 - (Přikryl, J.; Andrýsek, J.; Šmídl, V.), s. 226-233 [International PhD Workshop on Interplay of Societal and Technical Decision-Making, Young Generation Viewpoint /7./. Hrubá Skála (CZ), 25.09.2006-30.09.2006] R&D Projects: GA MŠk 1M0572; GA ČR(CZ) GP201/06/P434 Grant ostatní: project TED ESF Institutional research plan: CEZ:AV0Z10750506 Keywords : state estimation * factorized filters * traffic control Subject RIV: BC - Control Systems Theory

  20. Robust Kriged Kalman Filtering

    Baingana, Brian; Dall' Anese, Emiliano; Mateos, Gonzalo; Giannakis, Georgios B.

    2015-11-11

    Although the kriged Kalman filter (KKF) has well-documented merits for prediction of spatial-temporal processes, its performance degrades in the presence of outliers due to anomalous events, or measurement equipment failures. This paper proposes a robust KKF model that explicitly accounts for presence of measurement outliers. Exploiting outlier sparsity, a novel l1-regularized estimator that jointly predicts the spatial-temporal process at unmonitored locations, while identifying measurement outliers is put forth. Numerical tests are conducted on a synthetic Internet protocol (IP) network, and real transformer load data. Test results corroborate the effectiveness of the novel estimator in joint spatial prediction and outlier identification.

  1. Charcoal filter testing

    Lyons, J. [Nuclear Regulatory Commission, Washington, DC (United States)

    1997-08-01

    In this very brief, informal presentation, a representative of the US Nuclear Regulatory Commission outlines some problems with charcoal filter testing procedures and actions being taken to correct the problems. Two primary concerns are addressed: (1) the process to find the test method is confusing, and (2) the requirements of the reference test procedures result in condensation on the charcoal and causes the test to fail. To address these problems, emergency technical specifications were processed for three nuclear plants. A generic or an administrative letter is proposed as a more permanent solution. 1 fig.

  2. Fast algorithm of the robust Gaussian regression filter for areal surface analysis

    In this paper, the general model of the Gaussian regression filter for areal surface analysis is explored. The intrinsic relationships between the linear Gaussian filter and the robust filter are addressed, and a general mathematical solution for this model is presented. Based on this technique, a fast algorithm is created. Both simulated and practical engineering data (stochastic and structured) have been used in testing the fast algorithm. Results show that, with the same accuracy, the processing time of the second-order nonlinear regression filters for a dataset of 1024 x 1024 points has been reduced from the several hours of traditional algorithms to several seconds.
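
    For intuition, a zeroth-order Gaussian regression filter for a 1D profile can be written directly from its definition: a Gaussian-weighted average whose weights are renormalized at every position, which removes the edge distortion of plain convolution. This is our sketch using the usual Gaussian profile-filter constant; the paper's areal, robust, second-order filter generalizes it:

```python
import numpy as np

def gaussian_regression_filter(z, dx, cutoff):
    """Zeroth-order Gaussian regression filter of a 1D profile z
    sampled with spacing dx; `cutoff` is the cutoff wavelength."""
    alpha = np.sqrt(np.log(2) / np.pi)       # standard Gaussian-filter constant
    x = np.arange(len(z)) * dx
    mean_line = np.empty(len(z))
    for k, xk in enumerate(x):
        s = (x - xk) / (alpha * cutoff)
        w = np.exp(-np.pi * s ** 2)          # Gaussian weights centred at x_k
        mean_line[k] = np.dot(w, z) / w.sum()  # local renormalization
    return mean_line
```

    This direct form costs O(n^2) per profile, which is exactly the kind of cost the paper's fast algorithm reduces.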

  3. Design of LLCL-filter for grid-connected converter to improve stability and robustness

    Huang, Min; Wang, Xiongfei; Loh, Poh Chiang;

    2015-01-01

    The LLCL-filter has recently emerged in grid-connected converters due to its improved filtering capability, which ensures a smaller physical size. An LLCL-based grid-connected converter has almost the same frequency-response characteristic as one with the traditional LCL-filter within half of the... example for the LLCL-filter is given. Both simulations and experimental results are provided through a 5 kW, 380 V/50 Hz grid-connected inverter model to validate the theoretical analysis in this paper.

  4. A New Stateless Packet Classification and Filter against DoS Attacks

    Guang Jin

    2014-02-01

    Capabilities is a typical scheme of stateless filtering. In order to classify and filter packets effectively, a novel scheme of packet classification and filtering based on capabilities is proposed in this paper. In our scheme, a new classifier module is added and a new filter structure is designed. We employ capabilities as verification and introduce a new authorization step in the communications. These innovations make the packet classification effective in attack scenarios. The experimental results, based on large-scale topology datasets and NS2, show that our scheme outperforms traditional packet classification algorithms, especially in complex cyber environments.

  5. A new Approach for Kalman filtering on Mobile Robots in the presence of uncertainties

    Larsen, Thomas Dall; Andersen, Nils Axel; Ravn, Ole

    1999-01-01

    In many practical Kalman filter applications, the quantity of most significance for the estimation error is the process noise matrix. When filters are being stabilized or their performance improved, tuning this matrix is the most common method. This tuning cannot be done before the filter is implemented, as it is primarily made necessary by modelling errors. In this paper, two different methods for modelling the process noise are described and evaluated: a traditional one based on Gaussian noise models and a new one based on propagating modelling uncertainties. We discuss which method to use and how to tune the filter to achieve the lowest estimation error.
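
    The role of the process noise as a tuning knob is visible even in a scalar filter. A generic random-walk sketch (not the paper's robot model), where a larger q makes the filter trust incoming measurements more:

```python
def kalman_1d(measurements, q, r, x0=0.0, p0=1.0):
    """Scalar random-walk Kalman filter; q is the process-noise
    variance that is typically tuned to absorb modelling errors."""
    x, p = x0, p0
    estimates = []
    for z in measurements:
        p = p + q                 # predict: random-walk model
        k = p / (p + r)           # Kalman gain
        x = x + k * (z - x)       # update with the innovation
        p = (1.0 - k) * p
        estimates.append(x)
    return estimates
```

    Raising q speeds up convergence toward the measurements but increases the steady-state estimate variance, which is exactly the trade-off that tuning addresses.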

  6. Location Estimation for an Autonomously Guided Vehicle using an Augmented Kalman Filter to Autocalibrate the Odometry

    Larsen, Thomas Dall; Bak, Martin; Andersen, Nils Axel;

    1998-01-01

    A Kalman filter using encoder readings as inputs and vision measurements as observations is designed as a location estimator for an autonomously guided vehicle (AGV). To reduce the effect of modelling errors, an augmented filter that estimates the true system parameters is designed. The traditional way of reducing these errors is by fictitious noise injection in the filter model. The main problem with that approach, however, is that the filter does not learn about its bad model; it just puts more confidence in incoming measurements and less in the model. As a result the estimates will drift...

  7. Low-power implementation of polyphase filters in Quadratic Residue Number System

    Cardarilli, Gian Carlo; Re, Andrea Del; Nannarelli, Alberto; Re, Marco

    The aim of this work is the reduction of the power dissipated in digital filters while keeping the timing unchanged. A polyphase filter bank in the Quadratic Residue Number System (QRNS) has been implemented and then compared, in terms of performance, area, and power dissipation, to the implementation of a polyphase filter bank in the traditional two's complement system (TCS). The resulting implementations, designed to have the same clock rates, show that the QRNS filter is smaller and consumes less power than the TCS one.

  8. Fantastic filters of lattice implication algebras

    Young Bae Jun

    2000-01-01

    The notion of a fantastic filter in a lattice implication algebra is introduced, and the relations among filter, positive implicative filter, and fantastic filter are given. We investigate an equivalent condition for a filter to be fantastic, and state an extension property for fantastic filter.

  9. Vertical media bed filter and method of cleaning filter panels

    A vertical media bed dust collector in which the media bed of a filter panel is rejuvenated when necessary by interrupting the gas flow through the panel, withdrawing the filter media from the panel, separating the agglomerated dust from the filter media, returning the filter media to the filter panel, and reestablishing the gas flow through the panel. The system further includes apparatus for removing collected dust from the separating and recirculating surfaces of the media handling apparatus and also from the remote face of the filter panels before the cleaned gas is allowed to pass out of the collector, so that the cleaned gas is not recontaminated by small amounts of dust adhering to those surfaces.

  10. Unveiling Cebuano Traditional Healing Practices

    ZachiaRaiza Joy S. Berdon

    2016-02-01

    This study aims to identify the features of Cebuano traditional healing practices. Specifically, it answers the following objectives: analyze traditional healing from the Cebuano perspective, explain the traditional healing process practiced in terms of the traditional healers' beliefs, and extrapolate the perceptions of medical practitioners toward traditional healing. The study used a qualitative approach among five traditional healers in the mountain barangays of Cebu City who had performed healing for not less than ten years. These healers served as the primary informants and were selected because of their popularity in healing. Open-ended interviews in the local dialect and naturalistic observation provided a free listing of their verbatim accounts, which were noted as primary narratives. Participation in the study was voluntary and participants were interviewed privately after obtaining their consent. The Cebuano traditional healing practices or “panambal” comprise the use of “himolso” (pulse-checking), “palakaw” (petition), “pasubay” (determining what causes the sickness and its possible means of healing), “pangalap” (searching for medicinal plants used in “palina” or fumigation), “tayhop” (gentle blowing), “tutho” (saliva blowing), “tuob” (boiling), “orasyon” (mystical prayers), “hilot” (massage), and “barang” (sorcery). Though disapproved of by medical science, traditional healing contributes to the mystical identity of Cebuano healers, as a manifestation of folk Catholicism, and leaves a good legacy to the community that needs help. Further studies may examine the curative effects of medicinal plants in Cebu, the psychological effects of pulse-checking on persons healed by the mananambal, and other features of traditional healing.

  11. Initial results of a new generation dual source CT system using only an in-plane comb filter for ultra-high resolution temporal bone imaging

    Meyer, Mathias; Haubenreisser, Holger; Schoenberg, Stefan O.; Henzler, Thomas [Heidelberg University, Institute of Clinical Radiology and Nuclear Medicine, University Medical Center Mannheim, Medical Faculty Mannheim, Mannheim (Germany); Raupach, Rainer; Schmidt, Bernhard; Leidecker, Christianne; Allmendinger, Thomas; Flohr, Thomas [Siemens Healthcare, Imaging and Therapy Division, Forchheim (Germany); Lietzmann, Florian; Schad, Lothar R. [Heidelberg University, Computer Assisted Clinical Medicine, University Medical Center Mannheim, Medical Faculty Mannheim, Mannheim (Germany)

    2015-01-15

    To prospectively evaluate radiation dose and image quality of a third generation dual-source CT (DSCT) without a z-axis filter behind the patient for temporal bone CT. Forty-five patients were examined on either a first, second, or third generation DSCT in an ultra-high-resolution (UHR) temporal bone-imaging mode. On the third generation DSCT system, the tighter focal spot of 0.2 mm{sup 2} removes the necessity for an additional z-axis filter, leading to improved z-axis radiation dose efficiency. Images of 0.4 mm were reconstructed using a standard filtered-back-projection or iterative reconstruction (IR) technique for previous generations of DSCT and a novel IR algorithm for the third generation DSCT. Radiation dose and image quality were compared between the three DSCT systems. The statistically significantly highest subjective and objective image quality was found for the third generation DSCT when compared to the first or second generation DSCT systems (all p < 0.05). Total effective dose was 63 %/39 % lower for the third generation examination as compared to the first and second generation DSCT. Temporal bone imaging without a z-axis UHR filter and with a novel third generation IR algorithm allows for significantly higher image quality while lowering effective dose when compared to the first two generations of DSCTs. (orig.)

  12. Aurorae in Australian Aboriginal Traditions

    Hamacher, Duane W

    2013-01-01

    Transient celestial phenomena feature prominently in the astronomical knowledge and traditions of Aboriginal Australians. In this paper, I collect accounts of the Aurora Australis from the literature regarding Aboriginal culture. Using previous studies of meteors, eclipses, and comets in Aboriginal traditions, I anticipate that the physical properties of aurora, such as their generally red colour as seen from southern Australia, will be associated with fire, death, blood, and evil spirits. The survey reveals this to be the case and also explores historical auroral events in Aboriginal cultures, aurorae in rock art, and briefly compares Aboriginal auroral traditions with other global indigenous groups, including the Maori of New Zealand.

  14. From Microwave Filter to Digital Filter and Back Again

    Dalby, Arne Brejning

    1989-01-01

    A new very simple state variable flow graph representation for interdigital transmission line bandpass filters is presented, which has led to two important results: 1) A new type of digital filter with properties, that surpass the properties of most other (all pole) digital filtertypes. 2) The study of the new digital filtertype has led to design formulas for interdigital transmission line filters that are very simple compared to the hitherto known formulas. The accuracy is the same or better.

  16. Direct analysis of air filter samples for alpha emitting isotopes

    The traditional method for determination of alpha emitting isotopes on air filters has been to process the samples by radiochemical methods. However, this method is too slow for incidents involving radioactive materials, where determination of the dose received by personnel is urgent. A method is developed to directly analyze the air filters taken from personal and area air monitors. Site knowledge is used in combination with alpha spectral information to identify isotopes. A mathematical function is developed to estimate the activity for each isotope. The strengths and weaknesses of the method are discussed

  17. Filter and Filter Bank Design for Image Texture Recognition

    Randen, Trygve

    1997-12-31

    The relevance of this thesis to energy and environment lies in its application to remote sensing such as for instance sea floor mapping and seismic pattern recognition. The focus is on the design of two-dimensional filters for feature extraction, segmentation, and classification of digital images with textural content. The features are extracted by filtering with a linear filter and estimating the local energy in the filter response. The thesis gives a review covering broadly most previous approaches to texture feature extraction and continues with proposals of some new techniques. 143 refs., 59 figs., 7 tabs.

  18. Archimedes Plasma Mass Filter

    The Archimedes' Plasma Mass Filter is a novel plasma-based mass separation device. The basic physics of the Filter concept and a description of its primary application for nuclear waste separation at Hanford will be presented along with initial experimental results from a Demo device. The Demo is a 3.89 m long cylindrical device with a plasma radius of 0.4 m and an axial magnetic field up to 1600 Gauss. The plasma is produced by helicon waves launched by two four-strap antennas placed symmetrically either side of a central source region. One strap of each antenna is powered by one of four phase controlled 1 MW transmitters operating in the frequency range from 3.9 - 26 MHz. Each end of the device has ten concentric ring electrodes used to apply an electric field to rotate the plasma. Application of a parabolic voltage profile results in a rigid body rotation. Heavy ions above the cut-off mass number are extracted radially and collected by a heavy ion collector surrounding the source injection region while light ions are collected at the ends of the cylinder. Initial experiments will use noble gas and trace metals to demonstrate separation before attempting to operate with complex waste characteristic of Hanford

  19. Chapter 1. Traditional marketing revisited

    Lambin, Jean-Jacques

    2013-01-01

    The objective of this chapter is to review the traditional marketing concept and to analyse its main ambiguities as presented in popular textbooks. The traditional marketing management model placing heavy emphasis of the marketing mix is in fact a supply-driven approach of the market, using the understanding of consumers’ needs to mould demand to the requirements of supply, instead of adapting supply to the expectations of demand. To clarify the true role of marketing, a distinction is made b...

  20. Karanga Traditional Medicine and Healing

    Shoko, Tabona

    2007-01-01

    In this paper we present the Karanga traditional system of therapy of illness and disease manifest in the treatments administered by the medical practitioners. In order to establish the traditional system of therapy of illness and disease, numerous interviews were carried out with healers, herbalists and elders in the field area. This enabled a systematic compilation of cases. There was also the pressing need to be present at rituals and instances where healing was effected and to observe the...

  1. Was the Monetarist Tradition Invented?

    George S. Tavlas

    1998-01-01

    In 1969, Harry Johnson charged that Milton Friedman 'invented' a Chicago oral quantity theory tradition, the idea being that in order to launch a monetarist counter-revolution, Friedman needed to establish a linkage with pre-Keynesian orthodoxy. This paper shows that there was a distinct pre-Keynesian Chicago quantity-theory tradition that advocated increased government expenditure during the Great Depression in order to put money directly into circulation. This policy stance distinguished th...

  2. Little Eyolf and dramatic tradition

    Roland Lysell

    2015-02-01

    Full Text Available The article criticises an Ibsen tradition that has read the last scene of Little Eyolf as a reconciliation. Instead, the article discusses the improbability of a happy marriage characterised by social engagement. The play is open, but it is hardly probable that Rita, with her erotic desire, and Allmers, whose desire has turned into metaphysics, can be happy together. The arguments refer to inner criteria and the constantly present dramatic tradition.

  3. Electronic commerce versus traditional commerce

    Dorin Vicentiu Popescu; Manoela Popescu

    2007-01-01

    The internet represents new opportunities for traditional companies, including the diversification of the services offered and the promotion of new, personalized and attractive ones, made possible by information and communication technologies. Accordingly, the impact of the Internet, which has allowed the development of a new form of commerce, commerce via the Internet (a component of electronic commerce), against the traditional global comme...

  4. Reconfigurable Mixed Mode Universal Filter

    Neelofer Afzal

    2014-01-01

    Full Text Available This paper presents a novel mixed mode universal filter configuration capable of working in voltage and transimpedance mode. The proposed single filter configuration can be reconfigured digitally to realize all five second order filter functions (types) at a single output port. Other salient features of the proposed configuration include independently programmable filter parameters, full cascadability, and a low sensitivity figure. However, all these features are provided at the cost of a quite large number of active elements: it needs three digitally programmable current feedback amplifiers and three digitally programmable current conveyors. The use of six active elements is justified by introducing three additional reduced-hardware mixed mode universal filter configurations and comparing them with reported filters.

  5. Wavelength Filters in Fibre Optics

    Venghaus, Herbert

    2006-01-01

    Wavelength filters constitute an essential element of fibre-optic networks. This book gives a comprehensive account of the principles and applications of such filters, including their technological realisation. After an introductory chapter on wavelength division multiplexing in current and future fibre optic networks follows a detailed treatment of the phase characteristics of wavelength filters, a factor frequently neglected but of significant importance at high bit rates. Subsequent chapters cover three-dimensional reflection of gratings, arrayed waveguide gratings, fibre Bragg gratings, Fabry-Perot filters, dielectric multilayer filters, ring filters, and interleavers. The book explains the relevant performance parameters, the particular advantages and shortcomings of the various concepts and components, and the preferred applications. It also includes in-depth information on the characteristics of both commercially available devices and those still at the R&D stage. All chapters are authored by inter...

  6. DSP Control of Line Hybrid Active Filter

    Dan, Stan George; Benjamin, Doniga Daniel; Magureanu, R.;

    2005-01-01

    Active Power Filters have been intensively explored in the past decade. Hybrid active filters inherit the efficiency of passive filters and the improved performance of active filters, and thus constitute a viable improved approach for harmonic compensation. In this paper a parallel hybrid filter is studied for current harmonic compensation. The hybrid filter is formed by a single tuned LC filter and a small-rated power active filter, which are directly connected in series without any matching transformer. Thus the required rating of the active filter is much smaller than that of a conventional standalone active filter. Simulation and experimental results obtained in the laboratory confirmed the validity and effectiveness of the control.

  7. Filters via Neutrosophic Crisp Sets

    A. A. Salama; Florentin Smarandache

    2013-01-01

    In this paper we introduce the notion of filter on the neutrosophic crisp set, then we consider a generalization of the filter’s studies. Afterwards, we present the important neutrosophic crisp filters. We also study several relations between different neutrosophic crisp filters and neutrosophic topologies. Possible applications to database systems are touched upon.

  8. The Marginalized Auxiliary Particle Filter

    Fritsche, Carsten; Schön, Thomas; Klein, Anja

    2010-01-01

    In this paper we are concerned with nonlinear systems subject to a conditionally linear, Gaussian sub-structure. This structure is often exploited in high-dimensional state estimation problems using the marginalized (aka Rao-Blackwellized) particle filter. The main contribution in the present work is to show how an efficient filter can be derived by exploiting this structure within the auxiliary particle filter. Based on a multisensor aircraft tracking example, the superior performance of the...

  9. In Situ Cleanable HEPA Filter

    Phillips, T.D.

    1999-11-18

    This paper describes a welded steel HEPA filter which uses liquid spray cleaning and vacuum drying. Development of the filter was initiated in order to eliminate personnel exposure, disposal cost, and short lifetime associated with systems commonly employed throughout the Department of Energy complex. In addition the design promises to resolve the issues of fire, elevated temperatures, wetting, filter strength, air leaks and aging documented in the May, 1999 DNFSB-TECH-23 report.

  10. Optimization of integrated polarization filters

    Gagnon, Denis; Déziel, Jean-Luc; Dubé, Louis J

    2014-01-01

    This study reports on the design of small footprint, integrated polarization filters based on engineered photonic lattices. Using a rods-in-air lattice as a basis for a TE filter and a holes-in-slab lattice for the analogous TM filter, we are able to maximize the degree of polarization of the output beams up to 98 % with a transmission efficiency greater than 75 %. The proposed designs allow not only for logical polarization filtering, but can also be tailored to output an arbitrary transverse beam profile. The lattice configurations are found using a recently proposed parallel tabu search algorithm for combinatorial optimization problems in integrated photonics.

  11. Optimal filters on the sphere

    McEwen, J D; Lasenby, A N

    2006-01-01

    We derive optimal filters on the sphere in the context of detecting compact objects embedded in a stochastic background process. The matched filter and the scale adaptive filter are derived on the sphere in the most general setting, allowing for directional template profiles and filters. The performance and relative merits of the two optimal filters are discussed. The application of optimal filter theory on the sphere to the detection of compact objects is demonstrated on simulated mock data. A naive detection strategy is adopted, with an initial aim of illustrating the application of the new optimal filters derived on the sphere. Nevertheless, this simple object detection strategy is demonstrated to perform well, even at a low signal-to-noise ratio. Code written to compute optimal filters on the sphere (S2FIL), to perform fast directional filtering on the sphere (FastCSWT) and to construct the simulated mock data (COMB) are all made publicly available. (Accompanying code will be made publicly available on publi...
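
    Although the paper derives its filters on the sphere, the core matched-filter idea is easy to illustrate in one dimension: for white noise, correlating the data with the time-reversed template maximizes the signal-to-noise ratio at the object's location. A minimal sketch, in which the Gaussian template, the amplitude, and the noise level are illustrative assumptions rather than anything taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(3)

# compact-object profile (Gaussian template) buried in white noise
template = np.exp(-0.5 * (np.arange(-8, 9) / 2.0) ** 2)
signal = np.zeros(256)
signal[120:137] += 3.0 * template          # object centred at sample 128
data = signal + rng.standard_normal(256)

# for white noise the matched filter is the time-reversed template,
# so matched filtering reduces to a cross-correlation
mf_output = np.convolve(data, template[::-1], mode="same")
peak = int(np.argmax(mf_output))
print(peak)
```

    The peak of the matched-filter output recovers the object's position to within a sample or two even though the object is barely visible in the raw data.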

  12. Adaptive filtering and change detection

    Gustafsson, Fredrik

    2003-01-01

    Adaptive filtering is a classical branch of digital signal processing (DSP). Industrial interest in adaptive filtering grows continuously with the increase in computer performance that allows ever more complex algorithms to be run in real-time. Change detection is a type of adaptive filtering for non-stationary signals and is also the basic tool in fault detection and diagnosis. Often considered as separate subjects, Adaptive Filtering and Change Detection bridges a gap in the literature with a unified treatment of these areas, emphasizing that change detection is a natural extensi
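
    Adaptive filtering in the sense surveyed here can be illustrated with the classic least-mean-squares (LMS) algorithm. The sketch below is a generic system-identification setup; the signal names, the unknown two-tap system, and the step size are illustrative assumptions, not taken from the book:

```python
import numpy as np

def lms_filter(x, d, n_taps=4, mu=0.01):
    """Adapt an FIR filter w so its output tracks the desired signal d."""
    w = np.zeros(n_taps)
    y = np.zeros(len(x))
    e = np.zeros(len(x))
    for n in range(n_taps - 1, len(x)):
        window = x[n - n_taps + 1:n + 1][::-1]  # newest sample first
        y[n] = w @ window                       # filter output
        e[n] = d[n] - y[n]                      # estimation error
        w += 2 * mu * e[n] * window             # LMS weight update
    return w, y, e

# identify a hypothetical unknown 2-tap system d[n] = 0.7 x[n] + 0.3 x[n-1]
rng = np.random.default_rng(0)
x = rng.standard_normal(5000)
d = 0.7 * x + 0.3 * np.concatenate(([0.0], x[:-1]))
w, _, _ = lms_filter(x, d, n_taps=2, mu=0.05)
print(np.round(w, 2))
```

    With a stationary white input and a small enough step size, the weights converge to the unknown system's coefficients; change detection extends this by monitoring the error signal for abrupt model changes.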

  13. Kalman filtering implementation with Matlab

    Kleinbauer, Rachel

    2004-01-01

    In 1960 and 1961, Rudolf Emil Kalman published his work on a recursive predictive filter based on the use of recursive algorithms, and in doing so revolutionized the field of estimation. Since then, the so-called Kalman filter has been the subject of extensive research and finds application in numerous fields to this day. The Kalman filter estimates the state of a dynamic system, even when the exact form of that system is unknown. The filter is very ...

  14. Tradition?! Traditional Cultural Institutions on Customary Practices in Uganda

    Joanna R. Quinn

    2014-01-01

    Full Text Available This contribution traces the importance of traditional institutions in rehabilitating societies in general terms and more particularly in post-independence Uganda. The current regime, partly by inventing “traditional” cultural institutions, partly by co-opting them for its own interests, contributed to a loss of legitimacy of those who claim responsibility for customary law. More recently, international prosecutions have complicated the use of customary mechanisms within such societies. This article shows that some traditional and cultural leaders continue to struggle to restore their original institutions, some having taken the initiative of inventing new forms of engaging with society. Uganda is presented as a test case for the International Criminal Court’s ability to work with traditional judicial institutions in Africa.

  15. Scalable filter banks

    Hur, Youngmi; Okoudjou, Kasso A.

    2015-08-01

    A finite frame is said to be scalable if its vectors can be rescaled so that the resulting set of vectors is a tight frame. The theory of scalable frame has been extended to the setting of Laplacian pyramids which are based on (rectangular) paraunitary matrices whose column vectors are Laurent polynomial vectors. This is equivalent to scaling the polyphase matrices of the associated filter banks. Consequently, tight wavelet frames can be constructed by appropriately scaling the columns of these paraunitary matrices by diagonal matrices whose diagonal entries are square magnitude of Laurent polynomials. In this paper we present examples of tight wavelet frames constructed in this manner and discuss some of their properties in comparison to the (non tight) wavelet frames they arise from.

  16. Advanced filters for nuclear facilities and filter conditioning for disposal

    This paper reports the advantages of the cylinder shape selected for the filter elements for aerosol and iodine removal from the offgas of nuclear facilities, above all in view of remote and manual operation and transport, conditioning and disposal. In order to test the conditioning of polygonal HEPA filter elements, several filter elements not exposed to radioactivity were crushed remotely and embedded in concrete in a 400 l waste drum. The waste drum was subsequently saw cut in order to verify the quality of concrete embedding. The result of concrete embedding is satisfactory. The design is presented of a filter element capable of accommodating gas flows up to 500 m3/h for wet aerosol removal with a high removal efficiency. Also the design of a filter element for gas flows up to 800 m3/h to be used in iodine removal from offgases with low iodine contents is described. In order to be able to use the cylindrical filter elements developed for remote handling in manual operation too, e.g., for cleaning low level offgases, a manually operated filter housing was developed. It is suited for working pressures up to 10 bar and working temperatures up to 160 degree C. The filter elements are replaced by the usual bagging technique

  17. A parallel Kalman filter via the square root Kalman filtering

    Romera, Rosario; Cipra, Tomas

    1993-01-01

    A parallel algorithm for Kalman filtering with contaminated observations is developed. The parallel implementation is based on the square root version of the Kalman filter (see [3]). This represents a great improvement over serial implementations, drastically reducing computational costs for each state update.
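
    The square root version referenced here propagates a Cholesky factor S of the covariance (P = S Sᵀ) through orthogonal triangularizations instead of updating P directly. The following is a serial one-step sketch of the standard QR array form, not the paper's parallel algorithm; the constant-velocity demo model is an invented example:

```python
import numpy as np

def srkf_step(x, S, y, A, Qc, H, Rc):
    """One square-root Kalman filter step.
    S, Qc, Rc are Cholesky factors: P = S S^T, Q = Qc Qc^T, R = Rc Rc^T."""
    n, m = S.shape[0], len(y)
    # time update: triangularize [(A S)^T; Qc^T] so that S S^T = A P A^T + Q
    S = np.linalg.qr(np.vstack([(A @ S).T, Qc.T]))[1].T
    x = A @ x
    # measurement update: QR on the transposed pre-array
    pre = np.block([[Rc.T, np.zeros((m, n))],
                    [(H @ S).T, S.T]])
    Rp = np.linalg.qr(pre)[1]
    Re_half = Rp[:m, :m].T          # factor of the innovation covariance
    Kbar = Rp[:m, m:].T             # gain in factored form (K = Kbar Re_half^-1)
    S = Rp[m:, m:].T                # updated covariance factor
    x = x + Kbar @ np.linalg.solve(Re_half, y - H @ x)
    return x, S

# sanity check against the conventional covariance recursion
A = np.array([[1.0, 1.0], [0.0, 1.0]])      # constant-velocity model
H = np.array([[1.0, 0.0]])                  # position measurement
Qc = np.linalg.cholesky(0.01 * np.eye(2))
Rc = np.array([[0.5]])
x, S, P = np.zeros(2), np.eye(2), np.eye(2)
for k in range(30):
    y = np.array([0.5 * k])
    x, S = srkf_step(x, S, y, A, Qc, H, Rc)
    P = A @ P @ A.T + Qc @ Qc.T
    K = P @ H.T @ np.linalg.inv(H @ P @ H.T + Rc @ Rc.T)
    P = (np.eye(2) - K @ H) @ P
print(np.allclose(S @ S.T, P))
```

    Because only orthogonal transformations touch S, the factored recursion is numerically better conditioned than propagating P itself, and the independent QR factorizations are what make per-update parallelization attractive.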

  18. T Source Inverter Based Shunt Active Filter with LCL Passive Filter for the 415V 50 Hz Distribution systems

    S. Sellakumar

    2015-06-01

    Full Text Available The inverter topology is being used as an active filter to reduce harmonics in the power system [1]. Traditional voltage source and current source inverters have the disadvantage of a limited output voltage range, so they may not be able to supply enough compensating current during heavy switching surges; they are vulnerable to EMI noise, their devices get damaged in either open- or short-circuit conditions, and the main switching devices of the VSI and CSI are not interchangeable. Active filters are DC-AC systems for which a wide voltage regulation range and the integration of energy storage are often required. This cannot be achieved with conventional inverters, and hence impedance source inverters have been suggested. T source inverters are basically impedance source inverters, which can be used as active filters in the power system. The MATLAB simulation is done and the results are discussed in this paper for both types. The proposed dampening system is fully characterized by LCL-based passive filters [6] and a T source inverter based shunt active filter. The disturbances in the supply voltage and load current due to nonlinear loads are observed in the simulation, and again after connecting the designed hybrid shunt active filter in the distribution system. The simulation results obtained from the proposed method prove that it gives a comparatively better THD value.

  19. AER image filtering

    Gómez-Rodríguez, F.; Linares-Barranco, A.; Paz, R.; Miró-Amarante, L.; Jiménez, G.; Civit, A.

    2007-05-01

    Address Event Representation (AER) is an emergent neuromorphic interchip communication protocol that allows real-time virtual massive connectivity among a huge number of neurons located on different chips [1]. By exploiting high speed digital communication circuits (with nanosecond timing), synaptic neural connections can be time multiplexed, while neural activity signals (with millisecond timings) are sampled at low frequencies. Neurons generate "events" according to their activity levels; that is, more active neurons generate more events per unit time and access the interchip communication channel more frequently than neurons with low activity. In neuromorphic system development, AER brings some advantages for developing real-time image processing systems: (1) AER represents the information as a time-continuous stream, not as frames; (2) AER sends the most important information first (although this depends on the sender); (3) AER allows information to be processed as soon as it is received. When AER is used in the artificial vision field, each pixel is considered a neuron, so a pixel's intensity is represented as a sequence of events; by modifying the number and the frequency of these events, it is possible to perform image filtering. In this paper we present four image filters using AER: (a) noise addition and suppression, (b) brightness modification, (c) single moving object tracking and (d) geometrical transformations (rotation, translation, reduction and magnification). For testing and debugging, we use the USB-AER board developed by the Robotic and Technology of Computers Applied to Rehabilitation (RTCAR) research group. This board is based on an FPGA devoted to managing the AER functionality, and also includes a micro-controller for USB communication, 2 Mbytes of RAM and 2 AER ports (one for input and one for output).

  20. Spatial Filter with Volume Gratings for High-peak-power Multistage Laser Amplifiers

    Tan, Yi-zhou; Zheng, Guang-wei; Shen, Ben-jian; Pan, Heng-yue; Li, Liu

    2012-01-01

    The regular spatial filters comprised of a lens and pinhole are essential components in high power laser systems, such as lasers for inertial confinement fusion, nonlinear optical technology and directed-energy weapons. On the other hand, the pinhole is treated as a bottleneck of high power lasers due to the harmful plasma created by the focusing beam. In this paper we present a spatial filter based on the angular selectivity of a Bragg diffraction grating, which avoids the harmful focusing effect of the traditional pinhole filter. A spatial filter consisting of volume phase gratings in a two-pass amplifier cavity is reported. A two-dimensional filter is proposed using a single π-phase-shifted Bragg grating; numerical simulation results show that its angular spectrum bandwidth can be less than 160 µrad. The angular selectivity of photo-thermo-refractive glass and RUGATE film filters, construction stability, thermal stability and the effects of misalignments of gratings on the diffraction efficiencies under high-pulse-energy laser...

  1. The Error-Subspace Transform Kalman Filter

    Nerger, Lars; Schröter, Jens; Hiller, Wolfgang

    2013-01-01

    Ensemble square-root Kalman filters are currently the computationally most efficient ensemble-based Kalman filter methods. In particular, the Ensemble Transform Kalman Filter (ETKF) is known to provide a minimum ensemble transformation in a very efficient way. A similar filter algorithm is the Singular Evolutive Interpolated Kalman (SEIK) filter. In contrast to the ETKF, the SEIK filter solves the estimation problem of the Kalman filter directly in the error-subspace that is represente...

  2. Kalman Filter Design, Smoothing and Analysis

    2001-01-01

    The thesis is based on three different aspects of Kalman filtering. >Kalman filters for navigation. Investigate the difference between an Extended Kalman Filter and a Linearized Kalman Filter with feedback, and show how different system models relate to these Kalman filters when implemented in a filter. >Smoothing. Investigate how much there is to be gained from smoothing. We will only look at the fixed-interval smoother, using the method of forward and backward filtering. ...

  3. ADAPTIVE TRILATERAL FILTER FOR IN-LOOP FILTERING

    Akitha Kesireddy

    2014-07-01

    Full Text Available High Efficiency Video Coding (HEVC) has achieved significant coding efficiency improvement beyond existing video coding standards by employing several new coding tools. The Deblocking Filter, Sample Adaptive Offset (SAO) and Adaptive Loop Filter (ALF) are currently introduced for in-loop filtering in the HEVC standard. However, these filters are implemented in the spatial domain despite the temporal correlation within video sequences. To reduce artifacts and better align object boundaries in video, an algorithm for in-loop filtering is proposed and implemented in the HM-11.0 software. The proposed algorithm allows an average bitrate reduction of about 0.7% and improves the PSNR of the decoded frame by 0.05%, 0.30% and 0.35% in the luminance and chroma components.

  4. Particle Kalman Filtering: A Nonlinear Framework for Ensemble Kalman Filters

    Hoteit, Ibrahim

    2010-09-19

    Optimal nonlinear filtering consists of sequentially determining the conditional probability distribution functions (pdf) of the system state, given the information of the dynamical and measurement processes and the previous measurements. Once the pdfs are obtained, one can determine different estimates, for instance, the minimum variance estimate, or the maximum a posteriori estimate, of the system state. It can be shown that, many filters, including the Kalman filter (KF) and the particle filter (PF), can be derived based on this sequential Bayesian estimation framework. In this contribution, we present a Gaussian mixture‐based framework, called the particle Kalman filter (PKF), and discuss how the different EnKF methods can be derived as simplified variants of the PKF. We also discuss approaches to reducing the computational burden of the PKF in order to make it suitable for complex geosciences applications. We use the strongly nonlinear Lorenz‐96 model to illustrate the performance of the PKF.
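
    The sequential Bayesian recursion described above is easiest to see in a bootstrap particle filter, where the conditional pdf is represented by weighted samples: propagate through the dynamics, weight by the measurement likelihood, resample. The toy scalar AR(1) model and noise levels below are invented for illustration; this is the generic PF, not the authors' PKF:

```python
import numpy as np

rng = np.random.default_rng(1)

def bootstrap_pf(ys, n_particles=1000, q=0.5, r=0.5):
    """Bootstrap particle filter for x_k = 0.9 x_{k-1} + N(0,q), y_k = x_k + N(0,r)."""
    particles = rng.standard_normal(n_particles)
    means = []
    for y in ys:
        # propagate samples through the dynamical model
        particles = 0.9 * particles + np.sqrt(q) * rng.standard_normal(n_particles)
        # weight by the measurement likelihood p(y | x)
        w = np.exp(-0.5 * (y - particles) ** 2 / r)
        w /= w.sum()
        means.append(w @ particles)  # minimum variance (posterior mean) estimate
        # resample to avoid weight degeneracy
        particles = rng.choice(particles, size=n_particles, p=w)
    return np.array(means)

# simulate a trajectory, then filter its noisy measurements
x, xs, ys = 0.0, [], []
for _ in range(200):
    x = 0.9 * x + np.sqrt(0.5) * rng.standard_normal()
    xs.append(x)
    ys.append(x + np.sqrt(0.5) * rng.standard_normal())
est = bootstrap_pf(np.array(ys))
rmse = np.sqrt(np.mean((est - np.array(xs)) ** 2))
print(round(rmse, 3))
```

    The filtered estimate beats the raw measurements, whose error standard deviation is about 0.71 here; EnKF-style methods replace the likelihood weighting with a Gaussian update to stay affordable in high dimensions.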

  5. Bridging the ensemble Kalman filter and particle filters: the adaptive Gaussian mixture filter

    Stordal, Andreas Størksen; Karlsen, Hans A.; Nævdal, Geir; Skaug, Hans J.; Vallès, Brice

    2010-01-01

    The nonlinear filtering problem occurs in many scientific areas. Sequential Monte Carlo solutions with the correct asymptotic behavior such as particle filters exist, but they are computationally too expensive when working with high-dimensional systems. The ensemble Kalman filter (EnKF) is a more robust method that has shown promising results with a small sample size, but the samples are not guaranteed to come from the true posterior distribution. By approximating the model error with a Gauss...

  6. Application of DFT Filter Banks and Cosine Modulated Filter Banks in Filtering

    Lin, Yuan-Pei; Vaidyanathan, P. P.

    1994-01-01

    None given. This is a proposal for a paper to be presented at APCCAS '94 in Taipei, Taiwan. (From outline): This work is organized as follows: Sec. II is devoted to the construction of the new 2m channel under-decimated DFT filter bank. Implementation and complexity of this DFT filter bank are discussed therein. In a similar manner, the new 2m channel cosine modulated filter bank is discussed in Sec. III. Design examples are given in Sec. IV.

  7. Testing Dual Rotary Filters - 12373

    The Savannah River National Laboratory (SRNL) installed and tested two hydraulically connected SpinTek® Rotary Micro-filter units to determine the behavior of a multiple filter system and develop a multi-filter automated control scheme. Developing and testing the control of multiple filters was the next step in the development of the rotary filter for deployment. The test stand was assembled using as much of the hardware planned for use in the field as possible, including instrumentation and valving. The control scheme developed will serve as the basis for the scheme used in deployment. The multi-filter setup was controlled via an Emerson DeltaV control system running version 10.3 software. Emerson model MD controllers were installed to run the control algorithms developed during this test. Savannah River Remediation (SRR) Process Control Engineering personnel developed the software used to operate the process test model. While a variety of control schemes were tested, two primary algorithms provided extremely stable control as well as significant resistance to process upsets that could lead to equipment interlock conditions. The control system was tuned to provide satisfactory response to changing conditions during the operation of the multi-filter system. Stability was maintained through the startup and shutdown of one of the filter units while the second was still in operation. The equipment selected for deployment, including the concentrate discharge control valve, the pressure transmitters, and flow meters, performed well. Automation of the valve control integrated well with the control scheme and, when used in concert with the other control variables, allowed automated control of the dual rotary filter system. Experience acquired on multi-filter system behavior and with the system layout during this test helped to identify areas where the current deployment rotary filter installation design could be improved. Completion of this testing provides the necessary information

  8. A cloud filtering method for microwave upper tropospheric humidity measurements

    S. A. Buehler

    2007-05-01

    Full Text Available The paper presents a cloud filtering method for upper tropospheric humidity (UTH) measurements at 183.31±1.00 GHz. The method uses two criteria: the difference between the brightness temperatures at 183.31±7.00 and 183.31±1.00 GHz, and a threshold for the brightness temperature at 183.31±1.00 GHz. The robustness of this cloud filter is demonstrated by a mid-latitude winter case study.

    The paper then studies different biases on UTH climatologies. Clouds are associated with high humidity, therefore the dry bias introduced by cloud filtering is discussed and compared to the wet biases introduced by the clouds radiative effect if no filtering is done. This is done by means of a case study, and by means of a stochastic cloud database with representative statistics for midlatitude conditions.

    The consistent result is that both the cloud wet bias (0.8% RH) and the cloud filtering dry bias (–2.4% RH) are modest for microwave data, where the numbers given are for the stochastic cloud dataset. This indicates that for microwave data, cloud-filtered UTH and unfiltered UTH can be taken as error bounds for errors due to clouds. This is not possible for the more traditional infrared data, since the radiative effect of clouds is much stronger there.

    The focus of the paper is on midlatitude data, since atmospheric data to test the filter for that case were readily available. The filter is expected to be applicable also to subtropical and tropical data, but should be further validated with case studies similar to the one presented here for those cases.
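    The two criteria above amount to a simple per-pixel mask. A minimal sketch follows; the function name and threshold values are illustrative assumptions, not the paper's calibrated settings:

```python
import numpy as np

# Hypothetical sketch of the two-criterion cloud filter: a pixel is kept as
# cloud-free only if (1) the 183.31+/-7.00 minus 183.31+/-1.00 GHz brightness
# temperature difference exceeds delta_min, and (2) the 183.31+/-1.00 GHz
# brightness temperature itself exceeds tb_min. Thresholds are placeholders.
def cloud_filtered(tb_183_07, tb_183_01, delta_min=0.0, tb_min=240.0):
    tb_07 = np.asarray(tb_183_07, dtype=float)
    tb_01 = np.asarray(tb_183_01, dtype=float)
    return ((tb_07 - tb_01) > delta_min) & (tb_01 > tb_min)

# First pixel passes both criteria; second fails the channel-difference test.
mask = cloud_filtered([260.0, 250.0], [255.0, 252.0])
```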

  9. Compressed sensing & sparse filtering

    Carmi, Avishy Y; Godsill, Simon J

    2013-01-01

    This book is aimed at presenting concepts, methods and algorithms able to cope with undersampled and limited data. One such trend that recently gained popularity and to some extent revolutionised signal processing is compressed sensing. Compressed sensing builds upon the observation that many signals in nature are nearly sparse (or compressible, as they are normally referred to) in some domain, and consequently they can be reconstructed to within high accuracy from far fewer observations than traditionally held to be necessary. Apart from compressed sensing, this book contains other related app

  10. Market Risk Beta Estimation using Adaptive Kalman Filter

    Atanu Das,

    2010-06-01

    Full Text Available Market risk of an asset or portfolio is recognized through beta in the Capital Asset Pricing Model (CAPM). Traditional estimation techniques yield poor results when beta in the CAPM is assumed to be dynamic and to follow an autoregressive model. The Kalman Filter (KF) can optimally estimate dynamic beta when the measurement noise covariance and state noise covariance are assumed known in a state-space framework. This paper applies the Adaptive Kalman Filter (AKF) to beta estimation when the above covariances are not known and must be estimated dynamically. The technique is first characterized through a simulation study and then applied to empirical data from the Indian security market. A modification of the AKF is also proposed to address the problems of applying the AKF to beta estimation, and simulations show that the modified method improves the performance of the filter as measured by RMSE.
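    For reference, the non-adaptive core of such a scheme can be sketched as a scalar Kalman filter with a random-walk beta. This is a generic textbook formulation, not the paper's AKF: the noise variances q and r are assumed known here, whereas the adaptive variant estimates them online.

```python
import numpy as np

# State: beta_t = beta_{t-1} + w_t (random walk, variance q).
# Observation: r_asset_t = beta_t * r_market_t + v_t (variance r).
def kalman_beta(r_asset, r_market, q=1e-4, r=1e-3, beta0=1.0, p0=1.0):
    beta, p = beta0, p0
    estimates = []
    for y, h in zip(r_asset, r_market):
        # Predict: random-walk state, so the mean is unchanged.
        p = p + q
        # Update with observation y = h * beta + v.
        s = h * p * h + r              # innovation variance
        k = p * h / s                  # Kalman gain
        beta = beta + k * (y - h * beta)
        p = (1.0 - k * h) * p
        estimates.append(beta)
    return estimates

# Synthetic check: returns generated with a constant true beta of 1.5.
rng = np.random.default_rng(0)
rm = rng.normal(0.0, 0.02, 500)
ra = 1.5 * rm + rng.normal(0.0, 0.001, 500)
est = kalman_beta(ra, rm)
```

With mis-specified variances the filter still converges toward the true beta, only more slowly, which is the situation the adaptive variant is designed to improve.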

  11. The evolution of traditional knowledge:

    Saslis Lagoudakis, C Haris; Hawkins, Julie A; Greenhill, Simon J; Pendry, Colin A; Watson, Mark F; Tuladhar-Douglas, Will; Baral, Sushim R; Savolainen, Vincent

    2014-01-01

    Traditional knowledge is influenced by ancestry, inter-cultural diffusion and interaction with the natural environment. It is problematic to assess the contributions of these influences independently because closely related ethnic groups may also be geographically close, exposed to similar environments and able to exchange knowledge readily. Medicinal plant use is one of the most important components of traditional knowledge, since plants provide healthcare for up to 80% of the world's population. Here, we assess the significance of ancestry, geographical proximity of cultures and … the effects of shared ancestry and geographical proximity. These findings demonstrate the importance of adaptation to local environments, even at small spatial scale, in shaping traditional knowledge during human cultural evolution.

  12. The double well mass filter

    Gueroult, Renaud; Fisch, Nathaniel J. [Princeton Plasma Physics Laboratory, Princeton University, Princeton, New Jersey 08543 (United States); Rax, Jean-Marcel [Laboratoire d' optique appliquée-LOA, Ecole Polytechnique, Chemin de la Hunière, 91761 Palaiseau Cedex (France)

    2014-02-15

    Various mass filter concepts based on rotating plasmas have been suggested with the specific purpose of nuclear waste remediation. We report on a new rotating mass filter combining radial separation with axial extraction. The radial separation of the masses is the result of a “double-well” in effective radial potential in rotating plasma with a sheared rotation profile.

  13. Chopped filter for nuclear spectroscopy

    Some of the theoretical and practical factors affecting the energy resolution of a spectrometry system are considered, especially those related to the signal-to-noise ratio, and a time-variant filter with the transfer function of the theoretical optimum filter during its active time is proposed. A prototype has been tested and experimental results are presented. (Author)

  14. Informativeness of Parallel Kalman Filters

    Hajiyev, Chingiz

    2004-01-01

    This article considers the informativeness of parallel Kalman filters. Expressions are derived for determination of the amount of information obtained by additional measurements with a reserved measurement channel during processing. The theorems asserting that there is an increase in the informativeness of Kalman filters when there is a failure-free reserved measurement channel are proved.

  15. Tracking speckle displacement by double Kalman filtering

    Donghui Li; Li Guo

    2006-01-01

    A tracking technique using two sequentially connected Kalman filters for tracking laser speckle displacement is presented. One Kalman filter tracks temporal speckle displacement, while the other tracks spatial speckle displacement. The temporal Kalman filter provides a prior for the spatial Kalman filter, and the spatial Kalman filter provides measurements for the temporal Kalman filter. The contribution of the prior to the estimates of the spatial Kalman filter is analyzed. An optical analysis system was set up to verify the double-Kalman-filter tracker's ability to track constant laser speckle displacement.

  16. A Note of Filters in Effect Algebras

    Biao Long Meng; Xiao Long Xin

    2013-01-01

    We investigate relations between the two classes of filters in effect algebras (resp., MV-algebras). We prove that a lattice filter in a lattice-ordered effect algebra (resp., MV-algebra) need not be an effect algebra filter (resp., MV-filter). In general, in MV-algebras, every MV-filter is also a lattice filter. Every lattice filter in a lattice-ordered effect algebra is an effect algebra filter if and only if it is an orthomodular lattice. Every lattice filter in an MV-algebra is an MV-filt...

  17. Approximately Linear Phase IIR Digital Filter Banks

    J. D. Ćertić

    2013-11-01

    Full Text Available In this paper, uniform and nonuniform digital filter banks based on approximately linear phase IIR filters and the frequency response masking (FRM) technique are presented. Both filter banks are realized as a connection of an interpolated half-band approximately linear phase IIR filter, as the first stage of the FRM design, and an appropriate number of masking filters. The masking filters are half-band IIR filters with an approximately linear phase. The resulting IIR filter banks are compared with linear-phase FIR filter banks exhibiting similar magnitude responses. The effects of coefficient quantization are analyzed.

  18. Filter wheel equalization for DSA

    This paper reports on the design of a practical system for radiographic equalization in digital subtraction angiography (DSA), using multiple filter wheels mounted near the x-ray tube. Using an unequalized scout image, multiple filter wheels were independently rotated under computer control in order to vary the spatial distribution of the attenuator material on the filter wheel surfaces intersecting the x-ray beam. Computer simulations using clinical DSA images were used to analyze the optimal configuration of attenuator material on the filter wheels; the resulting improvement in signal-to-noise ratio in some typical DSA images was quantified. Neural networks were evaluated as a mathematic technique for rapidly determining the optimal filter wheel positioning from the scout image data

  19. Joint MIMO radar waveform and receiving filter optimization

    Chen, Chun-Yang; Vaidyanathan, P.P.

    2009-01-01

    The concept of MIMO (multiple-input, multiple-output) radar allows each transmitting antenna element to transmit an arbitrary waveform. This provides extra degrees of freedom compared to the traditional transmit beamforming approach. It has been shown in the recent literature that MIMO radar systems have many advantages. In this paper, we consider the joint optimization of waveforms and receiving filters in MIMO radar when prior information about the target and clutter ...

  20. Implementational Aspects of the Contourlet Filter Bank and Application in Image Coding

    Truong T. Nguyen

    2009-02-01

    Full Text Available This paper analyzes the implementational aspects of the contourlet filter bank (or pyramidal directional filter bank, PDFB) and considers its application in image coding. First, details of the binary tree-structured directional filter bank (DFB) are presented, including a modification to minimize the phase delay factor and the necessary steps for handling rectangular images. The PDFB is viewed as an overcomplete filter bank, and the directional filters are expressed in terms of polyphase components of the pyramidal filter bank and the conventional DFB. The aliasing effect of the conventional DFB and the Laplacian pyramid on the directional filters is then considered, and conditions for reducing this effect are presented. The new filters obtained by redesigning the PDFBs to satisfy these requirements have much better frequency responses. A hybrid multiscale filter bank, consisting of the PDFB at higher scales and the traditional maximally decimated wavelet filter bank at lower scales, is constructed to provide a sparse image representation. A novel embedded image coding system based on this image decomposition and a morphological dilation algorithm is then presented. The coding algorithm efficiently clusters the significant coefficients using progressive morphological operations. Context models for arithmetic coding are designed to exploit the intraband dependency and the correlation existing among neighboring directional subbands. Experimental results show that the proposed coding algorithm outperforms current state-of-the-art wavelet-based coders, such as JPEG2000, for images with directional features.

  1. Traditional and regional food in Poland

    Gulbicka, Bożena

    2014-01-01

    Regional and traditional products in the European Union - basic legal regulations. Traditional and regional products in Polish legislation. National and regional food quality schemes. Quality and safety of traditional and regional food. Polish traditional and regional products registered with the European Union and their characteristics. Opportunities for and barriers to the development of the market for traditional and regional products in Poland.

  2. Investigation of New Microstrip Bandpass Filter Based on Patch Resonator with Geometrical Fractal Slot.

    Mezaal, Yaqeen S; Eyyuboglu, Halil T

    2016-01-01

    A compact dual-mode microstrip bandpass filter using a geometrical slot is presented in this paper. The adopted slot is based on the first iteration of the Cantor square fractal curve. This filter has the benefit of possessing narrower and sharper frequency responses compared to microstrip filters that use single-mode resonators and traditional dual-mode square patch resonators. The filter has been modeled and demonstrated with the Microwave Office EM simulator at a design resonant frequency of 2 GHz, using a substrate of εr = 10.8 and thickness h = 1.27 mm. The simulated results for the proposed filter exhibit 22 dB return loss, 0.1678 dB insertion loss and 12 MHz bandwidth in the passband region. In addition to the narrow band obtained, miniaturization as well as weakened spurious frequency responses and a blocked second harmonic in the out-of-band regions have been achieved. Filter parameters including insertion loss, return loss, bandwidth, coupling coefficient and external quality factor have been compared for different values of the perturbation dimension (d). A full comparative study of this filter against the traditional square patch filter has also been carried out. PMID:27054755

  3. Objects tracking with adaptive correlation filters and Kalman filtering

    Ontiveros-Gallardo, Sergio E.; Kober, Vitaly

    2015-09-01

    Object tracking is commonly used for applications such as video surveillance, motion based recognition, and vehicle navigation. In this work, a tracking system using adaptive correlation filters and robust Kalman prediction of target locations is proposed. Tracking is performed by means of multiple object detections in reduced frame areas. A bank of filters is designed from multiple views of a target using synthetic discriminant functions. An adaptive approach is used to improve discrimination capability of the synthesized filters adapting them to multiple types of backgrounds. With the help of computer simulation, the performance of the proposed algorithm is evaluated in terms of detection efficiency and accuracy of object tracking.

  4. Tracking of human head with particle filter

    GUO Chao

    2009-01-01

    To cope with the problem of tracking a human head in a complicated scene, we propose a method that combines human skin color and hair color with a particle filter known as the condensation algorithm. First, a novel method is presented to build the human head color model using skin color and hair color separately, based on region growing. Compared with traditional human face models, this method is more precise and works well when the person turns around and the face disappears from the image. A novel method is then presented to use the color model in the condensation algorithm more effectively: a combination of the edge detection result, the color segmentation result and the color edge detection result in an Omega window is used to measure the scale and position of the human head. Experiments show that this approach can track a human head in a complicated scene even when the person turns around or the tracking distance changes quickly.

  5. Individualizing in Traditional Classroom Settings.

    Thornell, John G.

    1980-01-01

    Effective individualized instruction depends primarily on the teacher possessing the skills to implement it. Individualization is therefore quite compatible with the traditional self-contained elementary classroom model, but not with its alternative, departmentalization, which allows teachers neither the time flexibility nor the familiarity with…

  6. A Grand Tradition of Struggle.

    West, Cornel

    2000-01-01

    Offers an "inspirational speech" delivered by Harvard professor Cornel West at the 1994 National Council of Teachers of English convention. Discusses ways in which English teachers can help to keep alive the tradition of struggle for decency, dignity, freedom, and democracy. Shares his belief in the significant role English teachers play in…

  7. Analysis of Traditional Historical Clothing

    Jensen, Karsten; Schmidt, A. L.; Petersen, A. H.

    2013-01-01

    A recurrent problem for scholars who investigate traditional and historical clothing is the measuring of items of clothing and subsequent pattern construction. The challenge is to produce exact data without damaging the item. The main focus of this paper is to present a new procedure for establis...

  8. Supplements to Traditional Vocabulary Teaching

    布亚男

    2012-01-01

    In a word, vocabulary plays an indispensable part in language proficiency and provides much of the basis for how well one learns a language, so it cannot be ignored. This paper discusses schools' viewpoints on vocabulary teaching, reasons for forgetting, the traditional approach to vocabulary teaching, and supplements to vocabulary teaching; the author hopes the above content can offer some hints for language learners.

  9. Goddess Traditions in Tantric Hinduism

    Hinduism cannot be understood without the Great Goddess and the goddess-orientated Śākta traditions. The Goddess pervades Hinduism at all levels, from aniconic village deities to high-caste pan-Hindu goddesses to esoteric, tantric goddesses. Nevertheless, the highly influential tantric forms of...

  10. Adolescent Obesity: Rethinking Traditional Approaches.

    Morrill, Correen M.; And Others

    1991-01-01

    Describes traditional approaches to working with obese students (weight loss programs, nutrition programs, self-esteem groups). Suggests system-based alternative. Suggests providing in-service workshops for staff; developing team to work with large students; providing individual counseling; assisting students in locating peer support groups; and…

  11. Traditional Literacy and Critical Thinking

    Dando, Priscille

    2016-01-01

    How school librarians focus on activating critical thinking through traditional literacy development can proactively set the stage for the deep thinking that occurs in all literacy development. The critical-thinking skills students build while becoming accomplished readers and writers provide the foundation for learning in a variety of…

  12. Active Learning versus Traditional Teaching

    L.A. Azzalis

    2009-05-01

    Full Text Available In traditional teaching, most of the class time is spent with the professor lecturing and the students watching and listening. The students work individually, and cooperation is discouraged. Active learning, on the other hand, shifts the focus of activity from the teacher to the learners: students solve problems, answer questions, formulate questions of their own, discuss, explain and debate during class; moreover, students work in teams on problems and projects under conditions that ensure positive interdependence and individual accountability. Although student-centered methods have repeatedly been shown to be superior to the traditional teacher-centered approach to instruction, the literature regarding the efficacy of various teaching methods is inconclusive. The purpose of this study was to compare student perceptions of course and instructor effectiveness, course difficulty, and amount learned between the active learning and lecture sections in Health Sciences courses, using statistical data from Anhembi Morumbi University. Results indicated a significant difference between active learning and traditional teaching. We conclude that adding strategies that promote active learning to traditional lectures could increase knowledge and understanding.

  13. Storytelling Figures: A Pueblo Tradition.

    Kraus, Nancy

    1997-01-01

    In a collaborative unit on pueblo storytelling figures involving art, music, language arts, and physical education, a teacher describes how she helped second graders understand the Pueblo pottery tradition by reading aloud literature covering the past and present. Lists folklore, fiction, poetry, nonfiction, professional resources, videos, CDs,…

  14. Innovating Traditional Nursing Administration Challenges.

    Joseph, M Lindell; Fowler, Debra

    2016-03-01

    The evolving and complex practice environment calls for new mindsets among nurse leaders, academics, and nurse innovators to envision innovative ways to manage and optimize traditional tasks and processes in nursing administration. The purpose of this article is to present 3 case studies that used linear programming and simulation to innovate staffing enterprises, financial management of healthcare systems, and curricula development. PMID:26906516

  15. Traditional Teacher Education Still Matters

    Jacobs, Nick

    2013-01-01

    Fresh from teaching his first full school year the author reflects on his traditional teacher preparation path into the classroom and finds he was instilled with a common sense of ethics, compassion, a demand for reflective practice, and a robust guiding philosophy. As a college student, he learned theory and was able to augment that with…

  16. On-line filtering

    Present day electronic detectors used in high energy physics make it possible to obtain high event rates, and it is likely that future experiments will face even higher data rates than at present. The complexity of the apparatus increases very rapidly with time, and the criteria for selecting desired events also become more and more complex. So complex, in fact, that the fast trigger system cannot be designed to cope with them fully. The interesting events thus become contaminated with multitudes of uninteresting ones. To distinguish the 'good' events from the often overwhelming background of other events, one has to resort to computing techniques. Normally this selection is made in the first part of the analysis of the events, an analysis normally performed on a powerful scientific computer. This implies, however, that many uninteresting or background events have to be recorded during the experiment for subsequent analysis. A number of undesired consequences result, and these constitute a sufficient reason for trying to perform the selection at an earlier stage, ideally before the events are recorded on magnetic tape. This early selection is called 'on-line filtering' and it is the topic of the present lectures. (Auth.)

  17. Filter for radioactive iodine

    Purpose: To prevent the reduction in the activity of radioactive iodine adsorbent material at high temperature. Constitution: Regenerated cellulose type fibrous activated carbon with a pore volume of 0.08 cc/g is reactivated by impregnating it with 10% by weight of magnesium acetate to obtain fibrous activated carbon with a pore volume of 0.40 cc/g for pores with diameters between 30 and 300 A. 60 parts of the activated carbon, 40 parts of bleached kraft pulp made from coniferous trees and 7 parts by weight of polyvinyl alcohol fibers are subjected to a paper-making process to obtain activated carbon paper. It is then molded into a single-faced corrugated sheet, which is immersed in an ethanol solution containing 20% by weight of triethylenediamine, then dried and molded into a honeycomb filter. It is necessary that the activated carbon material have a pore volume of more than 5 cc/g for pores with diameters between 30 and 300 A. (Horiuchi, T.)

  18. Collaborative Filtering Recommender Systems

    Mehrbakhsh Nilashi

    2013-04-01

    Full Text Available Recommender systems are software tools and techniques for suggesting items to users by considering their preferences in an automated fashion. The suggestions provided are aimed at supporting users in various decision-making processes. Technically, recommender systems have their origins in different fields such as Information Retrieval (IR), text classification, machine learning and Decision Support Systems (DSS). Recommender systems are used to address the Information Overload (IO) problem by recommending potentially interesting or useful items to users. They have proven to be worthy tools for online users dealing with information overload and have become one of the most popular and powerful tools in e-commerce. Many existing recommender systems rely on Collaborative Filtering (CF), which has been used extensively and very effectively in many well-known e-commerce companies. This study presents an overview of the field of recommender systems, surveys the current generation of recommendation methods and examines CF systems and their algorithms comprehensively.
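    As a concrete illustration of CF, a generic user-based scheme with cosine similarity (not an algorithm from any particular system in this survey; the rating matrix and function name are made up for the example):

```python
import numpy as np

# Predict a user's rating of an item as a similarity-weighted average of
# other users' ratings of that item. ratings: rows = users, cols = items,
# 0 marks an unrated item.
def predict_rating(ratings, user, item):
    r = np.asarray(ratings, dtype=float)
    target = r[user]
    scores, total = 0.0, 0.0
    for other in range(r.shape[0]):
        if other == user or r[other, item] == 0:
            continue
        # Cosine similarity over the items both users have rated.
        common = (target > 0) & (r[other] > 0)
        if not common.any():
            continue
        a, b = target[common], r[other][common]
        sim = a @ b / (np.linalg.norm(a) * np.linalg.norm(b))
        scores += sim * r[other, item]
        total += abs(sim)
    return scores / total if total else 0.0

R = [[5, 3, 0, 1],
     [4, 0, 0, 1],
     [1, 1, 0, 5],
     [1, 0, 0, 4]]
pred = predict_rating(R, user=1, item=1)  # predict user 1's rating of item 1
```

User 1 rates like user 0 (high similarity) and unlike user 2, so the prediction leans toward user 0's rating of 3 for that item.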

  19. T Source Inverter Based Shunt Active Filter with LCL Passive Filter for the 415V 50 Hz Distribution systems

    Mr.S.Sellakumar; Vijayakumar, M.

    2015-01-01

    The inverter topology is being used as an active filter to reduce harmonics in the power system [1]. Traditional voltage source and current source inverters have the disadvantage of a limited output voltage range, so they may not be able to supply enough compensating current during heavy switching surges; they are vulnerable to EMI noise; the devices are damaged by either open- or short-circuit conditions; and the main switching devices of the VSI and CSI are not interchangeable. The active ...

  20. Advanced Filtering Techniques Applied to Spaceflight Project

    National Aeronautics and Space Administration — IST-Rolla developed two nonlinear filters for spacecraft orbit determination during the Phase I contract. The theta-D filter and the cost based filter, CBF, were...

  1. Cryptosporidium: A Guide to Water Filters


  2. Factors Influencing HEPA Filter Performance

    Properly functioning HEPA air filtration systems depend on a variety of factors that start with the use of fully characterized challenge conditions for system design, followed by process control during operation. This paper addresses factors that should be considered during the design phase as well as operating parameters that can be monitored to ensure filter function and lifetime. HEPA filters used in nuclear applications are expected to meet the design, fabrication, and performance requirements set forth in the ASME AG-1 standard. The DOE publication Nuclear Air Cleaning Handbook (NACH) is an additional guidance document for the design and operation of HEPA filter systems in DOE facilities. These two guidelines establish basic maximum operating parameters for temperature, maximum aerosol particle size, maximum particulate matter mass concentration, acceptable differential pressure range, and filter media velocity. Each of these parameters is discussed along with data linking the variability of each parameter with filter function and lifetime. Temporal uncertainty in the gas composition, temperature, and absolute pressure of the air flow can have a direct impact on the volumetric flow rate of the system, with a corresponding impact on filter media velocity. Correlations between standard units of flow rate (standard cubic meters per minute or cubic feet per minute) and actual volumetric flow rate are shown for variations in relative humidity over a 70 deg. C to 200 deg. C temperature range, as an example of a gas-composition effect that, uncorrected, will influence media velocity. The AG-1 standard establishes a 2.5 cm/s (5 feet per minute) ceiling for the media velocity of nuclear grade HEPA filters. Data are presented that show the impact of media velocities from 2.0 to 4.0 cm/s (4 to 8 fpm) on differential pressure, filter efficiency, and filter lifetime. Data will also be presented correlating media velocity effects with two different particle size
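    The standard-versus-actual flow correction mentioned above can be sketched with an ideal-gas conversion. Humidity effects are omitted, and the reference conditions, duct conditions and media area below are illustrative assumptions, not values from the paper:

```python
# Convert standard volumetric flow to actual flow at duct conditions
# (ideal-gas correction only), then compute the filter media velocity.
def actual_flow(q_std_m3_min, t_actual_k, p_actual_kpa,
                t_std_k=294.15, p_std_kpa=101.325):
    """Actual flow scales with absolute temperature and inversely with pressure."""
    return q_std_m3_min * (t_actual_k / t_std_k) * (p_std_kpa / p_actual_kpa)

def media_velocity_cm_s(q_actual_m3_min, media_area_m2):
    """Media velocity = volumetric flow divided by total filter media area."""
    return q_actual_m3_min / 60.0 / media_area_m2 * 100.0

# Example: 28 m^3/min standard flow at 150 deg. C and 95 kPa through
# 20 m^2 of media (all assumed numbers).
q = actual_flow(28.0, t_actual_k=423.15, p_actual_kpa=95.0)
v = media_velocity_cm_s(q, media_area_m2=20.0)
```

In this example the uncorrected standard flow would suggest a media velocity of about 2.3 cm/s, under the AG-1 ceiling, while the temperature- and pressure-corrected value (about 3.6 cm/s) exceeds it.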

  3. Traditional Smallpox Vaccines and Atopic Dermatitis


  4. Passive Target Tracking in Non-cooperative Radar System Based on Particle Filtering

    LI Shuo; TAO Ran

    2006-01-01

    We propose a target tracking method based on particle filtering (PF) to solve the nonlinear, non-Gaussian target-tracking problem in bistatic radar systems using external radiation sources. The traditional nonlinear state estimation method is extended Kalman filtering (EKF), which performs a first-order Taylor series expansion. It will produce inaccurate, or even divergent, estimation results when there is either a highly nonlinear target model or a large noise variance. Besides, Kalman filtering is the optimal solution only under a Gaussian noise assumption and is not suitable for non-Gaussian conditions. PF is a form of statistical filtering based on Monte Carlo simulation that uses random samples (particles) to approximate the posterior probability density of the system's random variables. This method can be used in any nonlinear stochastic system. Simulations show that PF can achieve higher accuracy than the traditional EKF.
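    The PF recursion described above (propagate, weight by the measurement likelihood, resample) can be sketched with a generic scalar example whose measurement is deliberately non-invertible, a case where an EKF linearization is poor. This is an illustrative toy model, not the paper's bistatic-radar formulation:

```python
import numpy as np

# Sampling Importance Resampling (SIR) particle filter for a random-walk
# state observed through the nonlinear map y = x**2 / 20 + noise. The
# posterior is bimodal in +/- x, so we estimate |x|.
rng = np.random.default_rng(1)
n = 2000
x_true = 1.0
particles = rng.normal(0.0, 2.0, n)     # initial particle cloud
weights = np.full(n, 1.0 / n)

for _ in range(30):
    # Propagate: random-walk dynamics for both the truth and the particles.
    x_true += rng.normal(0.0, 0.1)
    particles += rng.normal(0.0, 0.1, n)
    # Weight each particle by the likelihood of the new measurement.
    y = x_true**2 / 20.0 + rng.normal(0.0, 0.05)
    lik = np.exp(-0.5 * ((y - particles**2 / 20.0) / 0.05) ** 2)
    weights = weights * lik
    weights /= weights.sum()
    # Resample (multinomial) to avoid degeneracy, then reset the weights.
    idx = rng.choice(n, n, p=weights)
    particles = particles[idx]
    weights = np.full(n, 1.0 / n)

estimate = np.abs(particles).mean()      # posterior mean of |x|
```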

  5. A local particle filter for high dimensional geophysical systems

    Penny, S. G.; Miyoshi, T.

    2015-12-01

    A local particle filter (LPF) is introduced that outperforms traditional ensemble Kalman filters in highly nonlinear/non-Gaussian scenarios, both in accuracy and computational cost. The standard Sampling Importance Resampling (SIR) particle filter is augmented with an observation-space localization approach, for which an independent analysis is computed locally at each gridpoint. The deterministic resampling approach of Kitagawa is adapted for application locally and combined with interpolation of the analysis weights to smooth the transition between neighboring points. Gaussian noise is applied with magnitude equal to the local analysis spread to prevent particle degeneracy while maintaining the estimate of the growing dynamical instabilities. The approach is validated against the Local Ensemble Transform Kalman Filter (LETKF) using the 40-variable Lorenz-96 model. The results show that: (1) the accuracy of LPF surpasses LETKF as the forecast length increases (thus increasing the degree of nonlinearity), (2) the cost of LPF is significantly lower than LETKF as the ensemble size increases, and (3) LPF prevents filter divergence experienced by LETKF in cases with non-Gaussian observation error distributions.
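    The deterministic resampling step attributed to Kitagawa is commonly implemented as systematic resampling, where a single stratified offset replaces n independent draws and thereby reduces resampling noise. A sketch, with the offset fixed rather than drawn at random so the output is reproducible (an assumption for illustration):

```python
import numpy as np

# Systematic (deterministic) resampling: place n evenly spaced points on [0,1)
# and map each onto the cumulative weight distribution.
def systematic_resample(weights, u0=0.5):
    """Return resampled particle indices; u0 in [0, 1) is the single offset."""
    w = np.asarray(weights, dtype=float)
    n = w.size
    positions = (u0 + np.arange(n)) / n          # evenly spaced sample points
    cumulative = np.cumsum(w / w.sum())
    return np.searchsorted(cumulative, positions)

idx = systematic_resample([0.1, 0.6, 0.2, 0.1])
```

Particle 1, carrying weight 0.6, is copied three times, while the lowest-weight particles are dropped.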

  6. Attitude Representations for Kalman Filtering

    Markley, F. Landis; Bauer, Frank H. (Technical Monitor)

    2001-01-01

    The four-component quaternion has the lowest dimensionality possible for a globally nonsingular attitude representation; it represents the attitude matrix as a homogeneous quadratic function, and its dynamic propagation equation is bilinear in the quaternion and the angular velocity. The quaternion is required to obey a unit norm constraint, though, so Kalman filters often employ a quaternion for the global attitude estimate and a three-component representation for small errors about the estimate. We consider these mixed attitude representations for both a first-order extended Kalman filter and a second-order filter, as well as for quaternion-norm-preserving attitude propagation.
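    Norm-preserving propagation can be sketched with the closed-form quaternion increment for a constant body rate over one step. This is a standard formulation, not code from the paper, and the vector-first component ordering is an assumption of the sketch:

```python
import numpy as np

# Advance a unit quaternion q = [vector; scalar] by body rate omega (rad/s)
# over dt seconds. Composing with an exactly unit incremental quaternion
# preserves the norm, unlike naive Euler integration of q-dot.
def propagate(q, omega, dt):
    w = np.asarray(omega, dtype=float)
    angle = np.linalg.norm(w) * dt
    if angle < 1e-12:
        return np.asarray(q, dtype=float)
    axis = w / np.linalg.norm(w)
    # Unit quaternion of the incremental rotation.
    dq = np.concatenate([axis * np.sin(angle / 2.0), [np.cos(angle / 2.0)]])
    # Quaternion product q_new = dq * q (vector-first convention).
    v1, s1 = dq[:3], dq[3]
    v2, s2 = np.asarray(q[:3], float), float(q[3])
    return np.concatenate([s1 * v2 + s2 * v1 + np.cross(v1, v2),
                           [s1 * s2 - v1 @ v2]])

q0 = np.array([0.0, 0.0, 0.0, 1.0])                  # identity attitude
q1 = propagate(q0, omega=[0.0, 0.0, np.pi], dt=1.0)  # 180 deg about z
```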

  7. Image Filtering via Generalized Scale

    De Souza, Andre; Udupa, Jayaram K.; Madabhushi, Anant

    2007-01-01

    In medical imaging, low signal-to-noise ratio (SNR) and/or contrast-to-noise ratio (CNR) often cause many image processing algorithms to perform poorly. Postacquisition image filtering is an important off-line image processing approach widely employed to enhance the SNR and CNR. A major drawback of many filtering techniques is image degradation by diffusing/blurring edges and/or fine structures. In this paper, we introduce a scale-based filtering method that employs scale-dependent diffusion ...

  8. Pragmatic circuits signals and filters

    Eccles, William

    2006-01-01

    Pragmatic Circuits: Signals and Filters is built around the processing of signals. Topics include spectra, a short introduction to the Fourier series, design of filters, and the properties of the Fourier transform. The focus is on signals rather than power. But the treatment is still pragmatic. For example, the author accepts the work of Butterworth and uses his results to design filters in a fairly methodical fashion. This third of three volumes finishes with a look at spectra by showing how to get a spectrum even if a signal is not periodic. The Fourier transform provides a way of dealing wi
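    The Butterworth results the blurb alludes to reduce, for the magnitude response, to one closed-form expression. A minimal sketch, assuming an analog low-pass prototype with angular frequencies:

```python
import math

def butterworth_mag(omega, cutoff, order):
    """|H(jw)| of an order-n Butterworth low-pass filter."""
    return 1.0 / math.sqrt(1.0 + (omega / cutoff) ** (2 * order))
```

    At the cutoff frequency the magnitude is 1/sqrt(2) (the -3 dB point) regardless of order, and increasing the order steepens the rolloff beyond the cutoff.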

  9. Sample-whitened matched filters

    Andersen, Ib

    1973-01-01

    A sample-whitened matched filter (SWMF) for a channel with intersymbol interference and additive white Gaussian noise is defined as a linear filter with the properties that its output samples are a sufficient statistic for MAP estimation of the transmitted sequence and have uncorrelated noise components. These filters are shown to exist for all realistic channels, and the complete set of SWMFs for any channel is determined. It is shown that for nonpathological channels there is a unique SWMF which minimizes the amount of intersymbol interference, defined as the discrete-time analog to the rms...

  10. Spatial filtering through elementary examples

    Gluskin, Emanuel

    2004-05-01

    The spatial filtering features of resistive grids have become important in microelectronics in the last two decades, in particular because of the current interest in the design of 'vision chips.' However, these features of the grids are unexpected for many who received a basic physics or electrical engineering education. The author's opinion is that the concept of spatial filtering is important in itself, and should be introduced and separately considered at an early educational stage. We thus discuss some simple examples, of both continuous and discrete systems in which spatial filtering may be observed, using only basic physics concepts.

  11. Properties of ceramic candle filters

    Pontius, D.H.

    1995-06-01

    The mechanical integrity of ceramic filter elements is a key issue for hot gas cleanup systems. To meet the demands of the advanced power systems, the filter components must sustain the thermal stresses of normal operations (pulse cleaning), of start-up and shut-down conditions, and of unanticipated process upsets such as excessive ash accumulation without catastrophic failure. They must also survive the various mechanical loads associated with handling and assembly, normal operation, and process upsets. For near-term filter systems, these elements must survive at operating temperatures of 1650 °F for three years.

  12. ADVANCED HOT GAS FILTER DEVELOPMENT

    E.S. Connolly; G.D. Forsythe

    2000-09-30

    DuPont Lanxide Composites, Inc. undertook a sixty-month program, under DOE Contract DEAC21-94MC31214, in order to develop hot gas candle filters from a patented material technology known as PRD-66. The goal of this program was to extend the development of this material as a filter element and fully assess the capability of this technology to meet the needs of Pressurized Fluidized Bed Combustion (PFBC) and Integrated Gasification Combined Cycle (IGCC) power generation systems at commercial scale. The principal objective of Task 3 was to build on the initial PRD-66 filter development, optimize its structure, and evaluate basic material properties relevant to the hot gas filter application. Initially, this consisted of an evaluation of an advanced filament-wound core structure that had been designed to produce an effective bulk filter underneath the barrier filter formed by the outer membrane. The basic material properties to be evaluated (as established by the DOE/METC materials working group) would include mechanical, thermal, and fracture toughness parameters for both new and used material, for the purpose of building a material database consistent with what is being done for the alternative candle filter systems. Task 3 was later expanded to include analysis of PRD-66 candle filters, which had been exposed to actual PFBC conditions, development of an improved membrane, and installation of equipment necessary for the processing of a modified composition. Task 4 would address essential technical issues involving the scale-up of PRD-66 candle filter manufacturing from prototype production to commercial scale manufacturing. The focus would be on capacity (as it affects the ability to deliver commercial order quantities), process specification (as it affects yields, quality, and costs), and manufacturing systems (e.g. QA/QC, materials handling, parts flow, and cost data acquisition). Any filters fabricated during this task would be used for product qualification tests.

  13. Face Recognition using Gabor Filters

    Sajjad MOHSIN

    2011-01-01

    An Elastic Bunch Graph Map (EBGM) algorithm is proposed in this research paper that implements face recognition using Gabor filters. The proposed system applies 40 different Gabor filters to an image; as a result, 40 images with different angles and orientations are obtained. Next, the maximum-intensity points in each filtered image are calculated and marked as fiducial points. The system reduces these points according to the distance between them. The next step is calculating the distances between the reduced points using the distance formula. Finally, the distances are compared with the database; if a match occurs, the image is recognized.
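    A Gabor filter bank of the kind described can be sketched as follows. The 8-orientation × 5-wavelength split and all parameter values are illustrative assumptions, since the abstract does not state how its 40 filters are parameterized.

```python
import math

def gabor_kernel(size, wavelength, theta, sigma, psi=0.0, gamma=0.5):
    """Real part of a Gabor kernel as a size x size grid (size odd)."""
    half = size // 2
    kernel = []
    for y in range(-half, half + 1):
        row = []
        for x in range(-half, half + 1):
            # Rotate coordinates by the filter orientation theta.
            xr = x * math.cos(theta) + y * math.sin(theta)
            yr = -x * math.sin(theta) + y * math.cos(theta)
            # Gaussian envelope times an oriented cosine carrier.
            g = math.exp(-(xr**2 + (gamma * yr)**2) / (2.0 * sigma**2))
            row.append(g * math.cos(2.0 * math.pi * xr / wavelength + psi))
        kernel.append(row)
    return kernel

# A bank of 40 filters: 5 wavelengths x 8 orientations (assumed split).
bank = [gabor_kernel(31, lam, k * math.pi / 8.0, sigma=lam / 2.0)
        for lam in (4, 6, 8, 12, 16) for k in range(8)]
```

    Convolving an image with each kernel in the bank yields the 40 filtered images from which fiducial points would be picked.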

  14. Simplified design of filter circuits

    Lenk, John

    1999-01-01

    Simplified Design of Filter Circuits, the eighth book in this popular series, is a step-by-step guide to designing filters using off-the-shelf ICs. The book starts with the basic operating principles of filters and common applications, then moves on to describe how to design circuits by using and modifying chips available on the market today. Lenk's emphasis is on practical, simplified approaches to solving design problems. The book contains practical designs using off-the-shelf ICs, takes a straightforward, no-nonsense approach, and is highly illustrated with manufacturers' data sheets.

  15. A Gaussian mixture ensemble transform filter

    Reich, Sebastian

    2011-01-01

    We generalize the popular ensemble Kalman filter to an ensemble transform filter where the prior distribution can take the form of a Gaussian mixture or a Gaussian kernel density estimator. The design of the filter is based on a continuous formulation of the Bayesian filter analysis step. We call the new filter algorithm the ensemble Gaussian mixture filter (EGMF). The EGMF is implemented for three simple test problems (Brownian dynamics in one dimension, Langevin dynamics in two dimensions, ...

  16. Performance data and limits of aerosol filters

    Under suitable conditions and with an optimum mode of operation, aerosol filters as high-quality mechanical air filters can retain submicron particles, which makes them nearly absolute filters at the present state of the art of air filter technology. This cautious statement shows, however, that the often-used term absolute filter is not correct, as an absolute retention - 100% - is never reached under the parameters valid for the various classes of filters. (orig.)

  17. Characterizing a Tune-all bandstop filter

    Musoll, Carles; Llamas Garro, Ignacio; Brito Brito, Zabdiel; Pradell i Cara, Lluís; Corona, Alfonso

    2009-01-01

    In this paper a reconfigurable bandstop filter able to reconfigure central frequency, bandwidth and selectivity for fine tuning applications is presented. The reconfigurable filter topology has four poles and a quasielliptic bandstop filter response. The filter is tuned by varactor diodes placed at different locations on the filter topology. The varactors are voltage controlled in pairs due to filter symmetry for central frequency and bandwidth control. An additional v...

  18. Filters for High Rate Pulse Processing

    Alpert, B. K.; Horansky, R. D.; Bennett, D.A.; Doriese, W. B.; Fowler, J. W.; Hoover, A. S.; Rabin, M. W.; Ullom, J. N.

    2012-01-01

    We introduce a filter-construction method for pulse processing that differs in two respects from that in standard optimal filtering, in which the average pulse shape and noise-power spectral density are combined to create a convolution filter for estimating pulse heights. First, the proposed filters are computed in the time domain, to avoid periodicity artifacts of the discrete Fourier transform, and second, orthogonality constraints are imposed on the filters, to reduce the filtering procedu...

  19. Application of Archimedes Filter for Reduction of Hanford HLW

    Archimedes Technology Group, Inc., is developing a plasma mass separator called the Archimedes Filter that separates waste oxide mixtures ion by ion into two mass groups: light and heavy. For the first time, it is feasible to separate large amounts of material atom by atom in a single-pass device. Although vacuum ion-based electromagnetic separations have been around for many decades, they have traditionally depended on ion beam manipulation. Neutral plasma devices, on the other hand, are much easier, less costly, and permit several orders of magnitude greater throughput. The Filter has many potential applications in areas where separation of species is otherwise difficult or expensive. In particular, radioactive waste sludges at Hanford have been a particularly difficult issue for pretreatment and immobilization. Over 75% of Hanford HLW oxide mass (excluding water, carbon, and nitrogen) has mass less than 59 g/mol. On the other hand, 99.9% of radionuclide activity has mass greater than 89 g/mol. Therefore, Filter mass separation tuned to this cutoff would have a dramatic effect on the amount of IHLW produced--in fact, IHLW would be reduced by a factor of at least four. The Archimedes Filter is a brand-new tool for the separations specialist's toolbox. In this paper, we show results that describe the extent to which the Filter separates ionized material. Such results provide estimates for the potential advantages of Filter tunability, both in cutoff mass (electric and magnetic fields) and in degree of ionization (plasma power). Archimedes is now engaged in design and fabrication of its Demonstration Filter separator and intends to perform a full-scale treatment of Hanford high-level waste surrogates. The status of the Demo project will be described.

  20. Performance Evaluation of a Loeb-Eiber Mass Filter at 1 Torr

    Hoffmann, William D.; Jin, Feng; Pedder, Randall E.; Taormina, Christopher; Jackson, Glen P.

    2015-02-01

    The Loeb-Eiber mass filter is best operated at relatively high pressures—such as 1 Torr—where collisional damping of ions entering the mass filter thermalizes the ions' kinetic energy, which is a requirement for effective filtering. The inter-electrode gaps of ~8 μm require rf amplitudes on the order of 0-5 V p-p at approximately 50 MHz to achieve mass filtering up to m/z 40. Mass filtering between the 25-μm diameter wires, therefore, takes place on time frames shorter than the inverse of the collision frequency at ~1 Torr. The low power and high pressure capabilities of the Loeb-Eiber mass filter make it ideally suited for miniaturization, where power and space are at a premium. In the present work, a Loeb-Eiber mass filter was constructed using commercial silicon-on-insulator (SOI) microfabrication techniques. Ions transmitted through the chip-based Loeb-Eiber mass filter were characterized in real time using a traditional linear quadrupole mass analyzer in series with the Loeb-Eiber mass filter. The new hybrid instrument has enabled us to verify several important claims regarding the operation of the Loeb-Eiber mass filter: (1) that ions can be effectively filtered at ~1 Torr, (2) that for ions of a fixed mass-to-charge ratio, the ion transmission current decreases linearly with increasing rf amplitude on the Loeb-Eiber mass filter, (3) that the cutoff voltage at which all ions of a particular m/z value are effectively blocked is linearly related to mass-to-charge ratio, and (4) that square waveforms can filter ions more effectively than sinusoidal waveforms for a given peak-to-peak rf amplitude.

  2. The Value of Rotational Venography Versus Anterior–Posterior Venography in 100 Consecutive IVC Filter Retrievals

    Kiefer, Ryan M., E-mail: rkiefer11@gmail.com; Pandey, Nirnimesh; Trerotola, Scott O.; Nadolski, Gregory J.; Stavropoulos, S. William, E-mail: stav@uphs.upenn.edu [Hospital of University of Pennsylvania Medical Center, Division of Interventional Radiology, Department of Radiology (United States)

    2016-03-15

    Purpose: Accurately detecting inferior vena cava (IVC) filter complications is important for safe and successful retrieval, as tip-embedded filters require removal with non-standard techniques. Venography prior to IVC filter retrieval has traditionally used a single anterior–posterior (AP) projection. This study compares the utility of rotational venography to AP venography prior to IVC filter removal. Materials and Methods: The rotational venograms from 100 consecutive IVC filter retrievals over a 35-month period were evaluated retrospectively. The AP view of the rotational venogram was examined separately from the full series by a radiologist blinded to alternative imaging and operative findings. The venograms were evaluated for tip embedding, filter fracture, filter thrombus, and IVC thrombus. Statistical analysis was performed. Results: Using operative findings and peri-procedural imaging as the reference standard, tip embedding occurred in 59 of the 100 filters (59 %). AP venography correctly identified 31 tip-embedded filters (53 % sensitivity) with two false positives (95 % specificity) for an accuracy of 70 %. Rotational venography correctly identified 58 tip-embedded filters (98 % sensitivity) with one false positive (98 % specificity) for an accuracy of 98 %. A significant difference was found in the sensitivities of the two diagnostic approaches (P < .01). Other findings of thrombus and filter fracture were not significantly different between the two groups. Conclusion: Rotational venograms allow for more accurate detection of tip-embedded IVC filters compared to AP views alone. As this determines the approach taken, rotational venograms are helpful if obtained prior to IVC filter retrieval.
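    The reported sensitivity, specificity, and accuracy figures can be reproduced from the stated counts: 59 embedded tips imply 41 non-embedded, and the true-negative counts are inferred as 41 minus the false positives. A quick arithmetic check:

```python
def diagnostic_stats(tp, fp, fn, tn):
    """Sensitivity, specificity, and accuracy from a 2x2 confusion table."""
    sens = tp / (tp + fn)
    spec = tn / (tn + fp)
    acc = (tp + tn) / (tp + fp + fn + tn)
    return sens, spec, acc

# AP venography: 31 of 59 embedded tips found, 2 false positives among 41.
ap = diagnostic_stats(tp=31, fp=2, fn=28, tn=39)
# Rotational venography: 58 of 59 found, 1 false positive among 41.
rot = diagnostic_stats(tp=58, fp=1, fn=1, tn=40)
```

    Rounding to whole percentages recovers the abstract's 53/95/70 figures for AP and 98/98/98 for rotational venography.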

  3. A trend-cycle(-season) filter

    Mohr, Matthias

    2005-01-01

    This paper proposes a new univariate method to decompose a time series into a trend, a cyclical and a seasonal component: the Trend-Cycle filter (TC filter) and its extension, the Trend-Cycle-Season filter (TCS filter). They can be regarded as extensions of the Hodrick-Prescott filter (HP filter). In particular, the stochastic model of the HP filter is extended by explicit models for the cyclical and the seasonal component. The introduction of a stochastic cycle improves the filter in three r...
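    The HP filter that the TC filter extends picks the trend minimizing a fit term plus a λ-weighted penalty on the trend's second differences, which amounts to solving the linear system (I + λDᵀD)τ = y. A minimal dense-solver sketch under that formulation (naive Gaussian elimination, adequate only for short series; λ = 1600 is the usual quarterly-data convention):

```python
def hp_filter(y, lam=1600.0):
    """Hodrick-Prescott trend: solve (I + lam * D'D) tau = y directly."""
    n = len(y)
    # A = I + lam * D'D, where D is the (n-2) x n second-difference operator.
    A = [[1.0 if i == j else 0.0 for j in range(n)] for i in range(n)]
    for r in range(n - 2):
        idx = (r, r + 1, r + 2)
        coef = (1.0, -2.0, 1.0)
        for i, ci in zip(idx, coef):
            for j, cj in zip(idx, coef):
                A[i][j] += lam * ci * cj
    # Naive Gaussian elimination with partial pivoting (short series only).
    b = list(y)
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, n):
            f = A[r][col] / A[col][col]
            for j in range(col, n):
                A[r][j] -= f * A[col][j]
            b[r] -= f * b[col]
    tau = [0.0] * n
    for r in range(n - 1, -1, -1):
        s = b[r] - sum(A[r][j] * tau[j] for j in range(r + 1, n))
        tau[r] = s / A[r][r]
    return tau
```

    A useful sanity check: a perfectly linear series has zero second differences, so the penalty vanishes and the HP trend reproduces the series exactly.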

  4. SEIK - the unknown ensemble Kalman filter

    Nerger, Lars; Janjic Pfander, Tijana; Hiller, Wolfgang; Schröter, Jens

    2009-01-01

    The SEIK filter (Singular "Evolutive" Interpolated Kalman filter) was introduced in 1998 by D.T. Pham as a variant of the SEEK filter, which is a reduced-rank approximation of the Extended Kalman Filter. In recent years, it has been shown that the SEIK filter is an ensemble-based Kalman filter that uses a factorization rather than a square root of the state error covariance matrix. Unfortunately, the existence of the SEIK filter as an ensemble-based Kalman filter with similar efficiency as the la...

  5. Three Revised Kalman Filtering Models for Short-Term Rail Transit Passenger Flow Prediction

    Pengpeng Jiao; Ruimin Li; Tuo Sun; Zenghao Hou; Amir Ibrahim

    2016-01-01

    Short-term prediction of passenger flow is very important for the operation and management of a rail transit system. Based on the traditional Kalman filtering method, this paper puts forward three revised models for real-time passenger flow forecasting. First, the paper introduces the historical prediction error into the measurement equation and formulates a revised Kalman filtering model based on error correction coefficient (KF-ECC). Second, this paper employs the deviation between real-tim...
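    As a baseline for the revised models, the traditional Kalman filtering step can be sketched for a scalar random-walk passenger-flow state. This is the plain filter, not the paper's KF-ECC revision, and the noise variances Q and R are illustrative placeholders.

```python
def kalman_step(x, P, z, Q=1.0, R=4.0):
    """One predict/update cycle of a scalar random-walk Kalman filter."""
    # Predict: random-walk state model x_k = x_{k-1} + w, w ~ N(0, Q).
    x_pred, P_pred = x, P + Q
    # Update with measurement z = x + v, v ~ N(0, R).
    K = P_pred / (P_pred + R)          # Kalman gain
    x_new = x_pred + K * (z - x_pred)  # blend prediction and measurement
    P_new = (1.0 - K) * P_pred
    return x_new, P_new
```

    Each call moves the estimate part-way toward the new observation (by the gain K) and shrinks the error variance, which is what the revised models then correct with historical prediction errors.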

  6. Tradition et écriture

    Boutry, Philippe

    2013-01-01

    The concept of tradition, as it is commonly used today by social scientists, is by no means foreign, in its history or its content, to the notion of tradition as theologians have elaborated it over two millennia; and the reflection of the latter can, in certain respects, contribute to the questions raised by the former. From the origins of Christianity to the definitions of the Council of Trent (1546) on "the two sources of revelation," which fix...

  7. Insomnia in Iranian Traditional Medicine

    Feyzabadi, Zohre; Jafari, Farhad; Feizabadi, Parvin Sadat; Ashayeri, Hassan; Esfahani, Mohammad Mahdi; Badiee Aval, Shapour

    2014-01-01

    Context: Insomnia is one of the most prevalent sleep disorders characterized by sleep difficulty that impairs daily functioning and reduces quality of life. The burden of medical, psychiatric, interpersonal, and societal consequences of insomnia expresses the importance of diagnosing and treatment of insomnia. The aim of study was to investigate causes of insomnia from the viewpoint of Iranian traditional medicine. Evidence Acquisition: In this review study, we searched insomnia in a few of t...

  8. Software Development: Agile vs. Traditional

    Marian STOICA; Marinela MIRCEA; Bogdan GHILIC-MICU

    2013-01-01

    Organizations face the need to adapt themselves to a complex business environment, in continuous change and transformation. Under these circumstances, organization agility is a key element in gaining strategic advantages and market success. Achieving and maintaining agility requires agile architectures, techniques, methods and tools, able to react in real time to change requirements. This paper proposes an incursion in the software development, from traditional to agile.

  9. Harry Potter: Tradition and Innovation

    2004-01-01

    The aim of this thesis is to show that the Harry Potter novels have much in common with traditional children's literature while also being innovative. Joanne Rowling's novels have been accused of being derivative. In my thesis, I try to show that there is an intertextual relationship between the Harry Potter novels and previously released works. Rowling uses the recognition effect and shows that she is well aware of what has been written before. She uses elements from previously relea...

  10. Matched Spectral Filter Imager Project

    National Aeronautics and Space Administration — OPTRA proposes the development of an imaging spectrometer for greenhouse gas and volcanic gas imaging based on matched spectral filtering and compressive imaging....

  11. Integrated Spatial Filter Array Project

    National Aeronautics and Space Administration — To address the NASA Earth Science Division need for spatial filter arrays for amplitude and wavefront control, Luminit proposes to develop a novel Integrated...

  12. Regeneration method for filter element

    The outer surface of a filter element used for treating exhaust gases from an incinerator is divided into a plurality of zones. Back-washing of the filter is conducted in a container by using pressurized air with one zone left unsealed. Then, the unsealed zone is shifted and back-washing is applied in the same manner. With this procedure, clogging materials that could not be removed by the existing method of simultaneously back-washing the entire filter element can be removed reliably. Further, according to the present invention, the clogging materials removed from the filter element are not discharged to the outside, but are prevented from flowing out of the system. (T.M.)

  13. The Kalman-Levy filter

    Sornette, D.; Ide, K.

    2000-01-01

    The Kalman filter combines forecasts and new observations to obtain an estimation which is optimal in the sense of a minimum average quadratic error. The Kalman filter has two main restrictions: (i) the dynamical system is assumed linear and (ii) forecasting errors and observational noises are taken Gaussian. Here, we offer an important generalization to the case where errors and noises have heavy-tail distributions such as power laws and Lévy laws. The main tool needed to solve this "Kalm...

  14. Narrow-Band Microwave Filters

    A.V. Strizhachenko

    2010-01-01

    An original design of narrow-band compact filters based on a high-quality waveguide-dielectric resonator with anisotropic materials is presented in this work. The designed filters satisfy contradictory requirements: they provide a narrow frequency band (0.05–0.1% of the main frequency f0) and low initial losses α0 ≤ 1 dB.

  15. Stochastic processes and filtering theory

    Jazwinski, Andrew H

    2007-01-01

    This unified treatment of linear and nonlinear filtering theory presents material previously available only in journals, and in terms accessible to engineering students. Its sole prerequisites are advanced calculus, the theory of ordinary differential equations, and matrix analysis. Although theory is emphasized, the text discusses numerous practical applications as well.Taking the state-space approach to filtering, this text models dynamical systems by finite-dimensional Markov processes, outputs of stochastic difference, and differential equations. Starting with background material on probab

  16. The Archimedes Plasma Mass Filter

    Miller, R. L.; Ohkawa, T.; Agnew, S. F.; Cluggish, B. P.; Freeman, R. L.; Gilleland, J.; Putvinski, S.; Sevier, L.; Umstadter, K. R.

    2001-10-01

    Archimedes Technology Group is developing a plasma technology, called the Archimedes Plasma Mass Filter, which can separate a waste mixture ion by ion into mass groups and as such represents a major advance in waste separations technology. The filter is a plasma device employing a magnetic and electric field configuration that acts as a low-mass-pass filter for ions. Ions with mass above a tunable “cutoff mass” are expelled from the plasma. The Archimedes Plasma Mass Filter satisfies all of the requirements of an economic mass separator system: good single-pass separation, acceptable energy cost per ion, and high material throughput. This technology could significantly reduce the volume of radioactive waste at the Hanford Site in Richland, Washington, which is storing sixty percent of the nation’s defense nuclear waste. The potential waste reduction is dramatic because 82 wt% of the waste presently scheduled to be vitrified (immobilized and stored in glass) at Hanford is below mass number 60, while 99.9% of the radioactivity comes from atoms above mass number 89. We will present the plasma physics basis for the filter effect, the fundamental parameter constraints, and modeling results of filter operation.

  17. A taxonomy fuzzy filtering approach

    Vrettos S.

    2003-01-01

    Our work proposes the use of topic taxonomies as part of a filtering language. Given a taxonomy, a classifier is trained for each one of its topics. The user is able to formulate logical rules combining the available topics, e.g. (Topic1 AND Topic2) OR Topic3, in order to filter related documents in a stream. Using the trained classifiers, every document in the stream is assigned a belief value of belonging to the topics of the filter. These belief values are then aggregated using logical operators to yield the belief for the filter. In our study, Support Vector Machines and Naïve Bayes classifiers were used to provide topic probabilities. Aggregation of topic probabilities based on fuzzy logic operators was found to improve filtering performance on the Reuters text corpus, as compared to the use of their Boolean counterparts. Finally, we deployed a filtering system on the web using a sample taxonomy of the Open Directory Project.
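    The aggregation step can be sketched with the common min/max fuzzy operators; the abstract does not specify which t-norm and t-conorm the authors used, so min and max are an assumption here.

```python
def fuzzy_and(a, b):
    """Fuzzy AND via the minimum t-norm."""
    return min(a, b)

def fuzzy_or(a, b):
    """Fuzzy OR via the maximum t-conorm."""
    return max(a, b)

def filter_belief(beliefs):
    """Belief for the example filter (Topic1 AND Topic2) OR Topic3."""
    return fuzzy_or(fuzzy_and(beliefs["Topic1"], beliefs["Topic2"]),
                    beliefs["Topic3"])
```

    The Boolean counterpart would threshold each belief (say at 0.5) before applying ordinary logic, discarding the graded information that the fuzzy aggregation preserves.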

  18. Q value analysis of microwave photonic filters

    Lina ZHOU; Xinliang ZHANG; Enming XU

    2009-01-01

    This paper first presents the fundamental principles of microwave photonic filters. As an example to explain how to implement a microwave photonic filter, a specific finite impulse response (FIR) filter is illustrated. Next, the Q value of microwave photonic filters is analyzed theoretically, and methods for obtaining a high Q value are discussed. Then, grouped into FIR filters, first-order infinite impulse response (IIR) filters, and multi-order IIR filters, several novel microwave photonic filters with high Q values are listed and compared. The technical difficulties of achieving a high Q value in first-order and multi-order IIR filters are analyzed concretely. Finally, in order to obtain a higher Q value, a multi-order IIR microwave photonic filter that easily extends its order is presented and discussed.

  19. Improvement of Stopband Performance in Parallel-Coupled Bandpass Filters Using Quasi-Lumped Elements

    Zhurbenko, Vitaliy; Krozer, Viktor; Meincke, Peter

    2008-01-01

    This paper is aimed at improving the rejection level as well as overcoming the bandwidth limitations of classical coupled-line filters. A planar microwave coupled-line filter employing a quasi-lumped element resonator that considerably improves the stopband characteristics is presented. The proposed bandpass filter has a compact footprint and exhibits good stopband rejection, with no repeated passband at twice the center frequency, in comparison with the traditional coupled-line filter. By introducing the quasi-lumped element resonator, two transmission zeros at the upper and lower stopbands are created, with adjustable locations of the transmission zeros for desired performance. The device is fabricated in standard thick-film manufacturing technology. Based on parallel-coupled line theory, the impedance inverter model for this device is developed. The passband filter centered at 5.5 GHz with a 3 d...

  20. A planar and tunable bandpass filter on a ferrite substrate with integrated windings

    Arabi, Eyad

    2015-05-01

    Tunable filters that are based on ferrite materials are often biased by external magnets or coils, which are large and bulky. In this work a completely planar, CPW-based bandpass filter with integrated windings is presented. Due to these windings, the size of the filter is only 26 mm × 34 mm × 0.38 mm, which is orders of magnitude smaller than traditional designs with external windings. The filter is realized by electroplating copper over seed layers of titanium and gold on a YIG substrate. The fabricated filter achieves a tunability of 3.4% without any external magnets or coils. A good insertion loss of 2.3 dB and a rejection greater than 50 dB have been obtained. To the best of the authors' knowledge, this design is the first ferrite-based design that is completely planar and self-biased.

  1. A robust strong tracking cubature Kalman filter for spacecraft attitude estimation with quaternion constraint

    Huang, Wei; Xie, Hongsheng; Shen, Chen; Li, Jinpeng

    2016-04-01

    This paper considers a robust strong tracking nonlinear filtering problem for the case where there are model uncertainties, including model mismatch, unknown disturbances, and state mutation, in a spacecraft attitude estimation system with a quaternion constraint. Two multiple fading factor matrices are employed to regulate the prediction error covariance matrix, which guarantees its symmetry. The spherical-radial cubature rule is adopted to deal with the multi-dimensional integrals. The quaternion constraint is maintained by utilizing the gain correction method. A robust strong tracking cubature Kalman filter (RSTCKF) is thereby formed for spacecraft attitude estimation with a quaternion constraint. Unlike the traditional strong tracking filter, which adopts a single fading factor, the presented filter uses two multiple fading factor matrices so that different channels have their own filter adjustment capability, which improves the tracking performance of the algorithm. Simulation results show the effectiveness of the proposed RSTCKF.
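    The spherical-radial cubature rule mentioned above approximates Gaussian-weighted integrals with 2n equal-weight points placed at ±√n along the columns of a covariance square root. A sketch for the simple diagonal-covariance case (the general case uses a Cholesky factor instead):

```python
import math

def cubature_points(mean, var_diag):
    """2n equal-weight spherical-radial cubature points, diagonal covariance."""
    n = len(mean)
    pts = []
    for i in range(n):
        # sqrt(n) scaling along each principal axis of the covariance.
        step = math.sqrt(n * var_diag[i])
        for sign in (1.0, -1.0):
            p = list(mean)
            p[i] += sign * step
            pts.append(p)
    return pts  # each point carries equal weight 1/(2n)
```

    A defining property of the rule is that the equal-weight point set reproduces the mean and covariance of the underlying Gaussian exactly.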

  2. Position USBL/DVL Sensor-based Navigation Filter in the presence of Unknown Ocean Currents

    Morgado, M; Oliveira, P; Silvestre, C

    2010-01-01

    This paper presents a novel approach to the design of globally asymptotically stable (GAS) position filters for Autonomous Underwater Vehicles (AUVs) based directly on the nonlinear sensor readings of an Ultra-short Baseline (USBL) and a Doppler Velocity Log (DVL). Central to the proposed solution is the derivation of a linear time-varying (LTV) system that fully captures the dynamics of the nonlinear system, allowing for the use of powerful linear system analysis and filtering design tools that yield GAS filter error dynamics. Simulation results reveal that the proposed filter is able to achieve the same level of performance of more traditional solutions, such as the Extended Kalman Filter (EKF), while providing, at the same time, GAS guarantees, which are absent for the EKF.

  3. Photonic Color Filters Integrated with Organic Solar Cells for Energy Harvesting

    Park, Hui Joon

    2011-09-27

    Color filters are indispensable in most color display applications. In most cases, they are chemical pigment-based filters, which produce a particular color by absorbing its complementary color, and the absorbed energy is totally wasted. If the absorbed and otherwise wasted energy could be utilized, e.g., to generate electricity, innovative energy-efficient electronic media could be envisioned. Here we show photonic nanostructures incorporated with photovoltaics that are capable of producing desirable colors in the visible band while utilizing the absorbed light to simultaneously generate electrical power. In contrast to traditional colorant-based filters, these devices offer great advantages for electro-optic applications. © 2011 American Chemical Society.

  4. Reduction of Data Sparsity in Collaborative Filtering based on Fuzzy Inference Rules

    Atisha Sachan

    2013-06-01

    Full Text Available Collaborative filtering recommender systems play a very demanding and significant role in this era of internet information and e-commerce. Collaborative filtering predicts user preferences from past user behaviour or user-item relationships. Although it has many advantages, it also has some limitations, such as sparsity, scalability, accuracy and the cold-start problem. In this paper we propose a method that helps to reduce sparsity and thereby enhance recommendation accuracy. We developed fuzzy inference rules, which are easy to implement and also give better results. A comparison experiment is also performed with two previous methods, Traditional Collaborative Filtering (TCF) and the Hybrid User Model technique (HUMCF).
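    As context for the TCF baseline mentioned above, a memory-based collaborative filtering step can be sketched in a few lines: a missing rating is predicted as a similarity-weighted average over the other users who rated that item. This is a generic illustration (the toy matrix and helper names are ours), not the paper's fuzzy-rule method:

```python
import numpy as np

# Toy user-item rating matrix (rows: users, cols: items; 0 = unrated).
R = np.array([
    [5, 3, 0, 1],
    [4, 0, 0, 1],
    [1, 1, 0, 5],
    [1, 0, 0, 4],
], dtype=float)

def predict(ratings, user, item):
    """Predict an unknown rating as a cosine-similarity weighted average
    over the other users who rated the item (memory-based CF baseline)."""
    others = np.where((ratings[:, item] > 0)
                      & (np.arange(ratings.shape[0]) != user))[0]
    sims, vals = [], []
    for u in others:
        mask = (ratings[user] > 0) & (ratings[u] > 0)   # co-rated items only
        if not mask.any():
            continue
        a, b = ratings[user, mask], ratings[u, mask]
        sims.append(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))
        vals.append(ratings[u, item])
    if not sims:
        return 0.0
    w, v = np.array(sims), np.array(vals)
    return float(w @ v / np.abs(w).sum())
```

    For example, predict(R, 1, 1) interpolates user 1's missing rating for item 1 from the ratings of users 0 and 2, weighted by how similarly they rated the items user 1 also rated. Data sparsity hurts exactly here: with few co-rated items, the similarity weights become unreliable.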

  5. A wide-angle metamaterial narrow-band-stop filter for 532 nm wavelength green light

    Yue, Liyang; Ji, Songkun; Yan, Bing; Tung, Nguyen Thanh; Lam, Vu Dinh; Wang, Zengbo

    2016-01-01

    Traditional optical interference narrow-band-stop filters do not possess a wide-angle property, because the peaks and troughs of the filter spectrum shift at non-normal angles of incidence (AOI), which can result in functional failure in particular cases, e.g. the blocking of laser light for a pilot in the cockpit during premeditated laser-pointer targeting. For this reason, we designed a wide-angle metamaterial narrow-band-stop filter assembled from cross-shaped units to block 532 nm green light, which is firs...

  6. Spatial mask filtering algorithm for partial discharge pulse extraction of large generators

    2006-01-01

    A spatial mask filter (SMF) algorithm for partial discharge (PD) pulse extraction is proposed in this paper. Direct multiplication of wavelet coefficients at two adjacent scales is used to detect the singularity points of the signal and obtain the final spatial mask filter. By multiplying the wavelet coefficients by the final mask filter and applying the wavelet reconstruction process, the partial discharge pulses are extracted. The results of digital simulation and practical experiment show that this method is superior to the traditional wavelet shrinkage method (TWS). The algorithm not only increases the signal-to-noise ratio (SNR) but also preserves the energy and pulse amplitude.
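    The adjacent-scale product idea can be illustrated without a full wavelet transform: a sharp pulse produces large detail coefficients of the same sign at both a fine and a coarser scale, so their product reinforces pulses while noise products tend to cancel. A toy sketch (moving-average details stand in for wavelet coefficients; the window sizes and threshold factor are illustrative, not the paper's):

```python
import numpy as np

def smooth(x, w):
    """Moving average; stands in for a wavelet approximation at scale w."""
    return np.convolve(x, np.ones(w) / w, mode="same")

def scale_product_mask(x, w1=3, w2=7, k=10.0):
    """Multiply detail signals at two adjacent scales; pulse-like
    singularities keep their sign across scales while noise does not,
    so thresholding the product yields a spatial pulse mask."""
    d1 = x - smooth(x, w1)                 # fine-scale detail
    d2 = x - smooth(x, w2)                 # coarser-scale detail
    p = d1 * d2
    return p > k * np.median(np.abs(p))    # crude adaptive threshold

rng = np.random.default_rng(0)
x = 0.1 * rng.standard_normal(200)
x[50] += 5.0                               # injected PD-like pulse
mask = scale_product_mask(x)               # True near the pulse only
```

    In the paper's scheme the mask would then gate the wavelet coefficients before reconstruction, so the extracted pulse keeps its original amplitude rather than being shrunk as in thresholding methods.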

  7. Modernism and tradition and the traditions of modernism

    Kros Džonatan

    2006-01-01

    Full Text Available Conventionally, the story of musical modernism has been told in terms of a catastrophic break with the (tonal) past and the search for entirely new techniques and modes of expression suitable to a new age. The resulting notion of a single, linear, modernist mainstream (predicated on a Schoenbergian model of musical progress) has served to conceal a more subtle relationship between past and present. Increasingly, it is being recognized that there exist many modernisms, and their various identities are forged from a continual renegotiation between past and present, between tradition(s) and the avant-garde. This is especially relevant when attempting to discuss the reception of modernism outside central Europe, where the adoption of (Germanic) avant-garde attitudes was often interpreted as being "unpatriotic". The case of Great Britain is examined in detail: Harrison Birtwistle’s opera The Mask of Orpheus (1973–83) forms the focus for a wider discussion of modernism within the context of late/post-modern thought.

  8. Particle filter Simulation and Analysis Enabling Non-Traditional Navigation Project

    National Aeronautics and Space Administration — Incorporate PF into GSFC’s Orbit Determination Toolbox (ODTBX). Augment PF with ODTBX’ unique ability to partition error sources into subspaces for...

  9. A New Adaptive Square-Root Unscented Kalman Filter for Nonlinear Systems with Additive Noise

    Yong Zhou

    2015-01-01

    Full Text Available The Kalman filter (KF), extended KF, and unscented KF all lack a self-adaptive capacity to deal with system noise. This paper describes a new adaptive filtering approach for nonlinear systems with additive noise. Based on the square-root unscented KF (SRUKF), the traditional Maybeck estimator is modified and extended to nonlinear systems. The square root of the process noise covariance matrix Q or of the measurement noise covariance matrix R is estimated straightforwardly. Because positive semidefiniteness of Q or R is guaranteed, several shortcomings of the traditional Maybeck algorithm are overcome, and the stability and accuracy of the filter are greatly improved. In addition, the new adaptive filtering technique is described in detail for three different nonlinear systems. Specifically, simulation results are presented in which the new filter was applied to a highly nonlinear model, the univariate nonstationary growth model (UNGM), and compared with the standard SRUKF to demonstrate its superior filtering performance. The adaptive SRUKF (ASRUKF) algorithm can perform direct recursion and calculate the square roots of the covariance matrices of the system state and noise, which ensures the symmetry and nonnegative definiteness of the matrices and greatly improves the accuracy, stability, and self-adaptability of the filter.

  10. Sensory pollution from bag-type fiberglass ventilation filters: Conventional filter compared with filters containing various amounts of activated carbon

    Bekö, Gabriel; Fadeyi, M.O.; Clausen, Geo;

    2009-01-01

    …to an equivalent filter without carbon. The aim of the present study was to examine how the amount of activated carbon (AC) used in combination filters affects their ability to remove both sensory offending pollutants and ozone. A panel evaluated the air downstream of four different filters after… quarter as much carbon (100 g/m(2)). Each filter was weighed at the beginning of the soiling period and after 3 and 6 months of service. Additionally, up- and down-stream ozone concentrations and filter pressure drops were measured monthly. Following 6 months of service, the air downstream of each of the… combination filters was judged to be significantly better than the air downstream of the 6-month-old F7 filter, and was comparable to that from an unused F7 filter. Additionally, the combination filters removed more ozone from the air than the F7 filter, with their respective fractional removal efficiencies…

  11. Hyperhidrosis in Iranian Traditional Medicine

    Shahroodi, Aniseh Saffar; Shirbeigi, Leila

    2016-01-01

    Background: Excessive sweating is a medical condition in which a person sweats much more than needed. The medical name of this disorder is hyperhidrosis, a common dermal problem that affects people of all ages and has a negative impact on the quality of life. During the last decades, several studies have shown that in many cases of hyperhidrosis there is no evidence of systemic disease; therefore, most treatments are temporary and symptomatic. According to Iranian traditional medicine (ITM), different approaches to hyperhidrosis are mentioned. Methods: This study reviewed ITM textbooks, such as the "Canon of Medicine" and "Exir-e-azam", as well as scientific references and databases of modern medicine (ISI, PubMed, etc.) with specific keywords. Contents and related concepts were classified and the results prepared. Results: In modern medicine, hyperhidrosis has been defined as abnormal excessive sweating, which is either primary (idiopathic) or secondary to other systemic diseases such as hyperthyroidism, neurological conditions or heart disease. Current treatment modalities are topical antiperspirants, iontophoresis, Botox injection (botulinum toxin type A) and, as the last therapeutic option, thoracic sympathectomy. From the viewpoint of Iranian traditional medicine as a holistic doctrine, hyperhidrosis etiologies include overfilling and repletion of the body due to the accumulation of humors, excessive intake of food, excessively dilated skin pores, and vigorous exercise or physical activity. Therefore, the therapeutic plan for hyperhidrosis was based on its cause, and includes reduction in the amount of food, increased physical activity, purging the body of excess humors and adjustment of temperament. Conclusion: Hyperhidrosis is not a dangerous disorder; however, given its negative impact on quality of life and the failure of modern medicine to achieve a complete response, it seems that the recommendations

  12. Adapting agriculture with traditional knowledge

    Swiderska, Krystyna; Reid, Hannah [IIED, London (United Kingdom); Song, Yiching; Li, Jingsong [Centre for Chinese Agriculutral Policy (China); Mutta, Doris [Kenya Forestry Research Institute (Kenya)

    2011-10-15

    Over the coming decades, climate change is likely to pose a major challenge to agriculture; temperatures are rising, rainfall is becoming more variable and extreme weather is becoming a more common event. Researchers and policymakers agree that adapting agriculture to these impacts is a priority for ensuring future food security. Strategies to achieve that in practice tend to focus on modern science. But evidence, both old and new, suggests that the traditional knowledge and crop varieties of indigenous peoples and local communities could prove even more important in adapting agriculture to climate change.

  13. Bulgarie. Musique de tradition pastorale

    Charles-Dominique, Luc

    2012-01-01

    The excellent record collection of the Archives internationales de musique populaire of the Musée d'ethnographie de Genève has recently been enriched by an album devoted to the "Musiques de tradition pastorale" of Bulgaria, thereby completing a European catalogue that is more Balkan (Bosnia, Serbia, Greece, Romania) and Eastern (Poland, Russia) than Western. This new publication is owed to Marie-Barbara Le Gonidec, an ethnomusicologist and ethno-organologist specializing in the ...

  14. Traditional Therapies for Severe Asthma.

    Wang, Eileen; Hoyte, Flavia C L

    2016-08-01

    Severe asthma is a complex and heterogeneous disease. The European Respiratory Society and American Thoracic Society guidelines define severe asthma for patients 6 years or older as "asthma which requires treatment with high-dose inhaled corticosteroids…plus a second controller or systemic corticosteroids to prevent it from becoming 'uncontrolled' or which remains 'uncontrolled' despite this therapy." This article reviews available traditional therapies, data behind their uses in severe asthma, and varying recommendations. As various asthma endotypes and phenotypes are better understood and characterized, targeted therapies should help improve disease outcomes, efficacy, and cost-effectiveness. PMID:27401628

  15. Pixelated filters for spatial imaging

    Mathieu, Karine; Lequime, Michel; Lumeau, Julien; Abel-Tiberini, Laetitia; Savin De Larclause, Isabelle; Berthon, Jacques

    2015-10-01

    Small satellites are often used by space agencies to meet scientific space-mission requirements. Their payloads are composed of various instruments collecting an increasing amount of data while respecting growing constraints on volume and mass, so small integrated cameras have taken a favored place among these instruments. To ensure scene-specific color information sensing, pixelated filters appear more attractive than filter wheels. The work presented here, in collaboration with Institut Fresnel, deals with the manufacturing of this kind of component, based on thin-film technologies and photolithography processes. CCD detectors with a pixel pitch of about 30 μm were considered. In the configuration where the matrix filters are positioned closest to the detector, they are composed of 2x2 macro-pixels (i.e., 4 filters). These 4 filters have a bandwidth of about 40 nm and are centered at 550, 700, 770 and 840 nm, respectively, with a specific rejection rate defined over the visible spectral range [500 - 900 nm]. After an intensive design step, 4 thin-film structures were elaborated with a maximum thickness of 5 μm. A run of tests allowed us to choose the optimal micro-structuring parameters. The 100x100 matrix filter prototypes have been successfully manufactured with lift-off and ion-assisted deposition processes. High spatial and spectral characterization with a dedicated metrology bench showed that the initial specifications and simulations were globally met. These excellent performances remove the technological barriers to high-end integrated multispectral imaging.

  16. Filter-adsorber aging assessment

    An aging assessment of high-efficiency particulate air (HEPA) filters and activated carbon gas adsorption units was performed by the Pacific Northwest Laboratory as part of the U.S. Nuclear Regulatory Commission's (USNRC) Nuclear Plant Aging Research (NPAR) Program. This evaluation of the general process in which the characteristics of these two components gradually change with time or use included the compilation of information concerning failure experience, stressors, aging mechanisms and effects, and inspection, surveillance, and monitoring methods (ISMM). Stressors, the agents or stimuli that can produce aging degradation, include heat, radiation, volatile contaminants, and even normal concentrations of aerosol particles and gases. In an experimental evaluation of degradation in terms of the tensile breaking strength of aged filter media specimens, over forty percent of the samples did not meet the specifications for new material. Chemical and physical reactions can gradually embrittle sealants and gaskets as well as filter media. Mechanisms that can lead to impaired adsorber performance are associated with the loss of potentially available active sites as a result of the exposure of the carbon to airborne moisture or volatile organic compounds. Inspection, surveillance, and monitoring methods have been established to observe filter pressure drop buildup, check HEPA filters and adsorbers for bypass, and determine the retention effectiveness of aged carbon. These evaluations of installed filters do not reveal degradation in terms of reduced media strength; rather, they indicate that under normal conditions aged media can continue to retain particles effectively. This degradation may nevertheless be important given the likelihood of moisture, steam, and higher particle loadings during severe accidents and the probability that the filters have been in use for an extended period

  17. Traditional Chinese culture in modern Product Design

    Wu, Qing

    2015-01-01

    This paper describes the sources and features of traditional Chinese culture and discusses its aesthetic thought. Traditional Chinese culture has had a broad and far-reaching impact on design since ancient times. In the context of globalization and the rapid development of science and technology, the difference between traditional design and modern product design should be explored. Traditional Chinese culture cannot be totally absorbed. It is im...

  18. CPHD filter derivation for extended targets

    Orguner, Umut

    2010-01-01

    This document derives the CPHD filter for extended targets. Only the update step is derived here. Target-generated measurements, false alarms and the prior are all assumed to be independent identically distributed cluster processes. We also prove that the derived CPHD filter for extended targets reduces to the PHD filter for extended targets and to the CPHD filter for standard targets under suitable assumptions.

  19. An ultra high frequency wideband filter

    Pan, V.M.; Tarasov, V. F.; Futimsky, S. I.

    2008-01-01

    An ultra-high-frequency wideband filter was developed and fabricated. A high-temperature superconducting film, sputtered onto a sapphire substrate, was used as the resonator material. The loss in the filter pass band is 0.7 dB, the pass-band width is 165 MHz, and the central frequency is 1877 MHz. The filter topology and amplitude-frequency responses are given.
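    The quoted figures imply a fractional bandwidth of roughly 8.8% and a loaded Q of roughly 11.4 (illustrative arithmetic only):

```python
# Fractional bandwidth and loaded Q from the quoted pass-band figures.
center_mhz = 1877.0
bandwidth_mhz = 165.0

fractional_bw = bandwidth_mhz / center_mhz   # ~0.088, i.e. ~8.8%
q_factor = center_mhz / bandwidth_mhz        # ~11.4
print(f"fractional BW = {fractional_bw:.1%}, Q = {q_factor:.1f}")
```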

  20. Water washable stainless steel HEPA filter

    Phillips, Terrance D.

    2001-01-01

    The invention is a high efficiency particulate (HEPA) filter apparatus and system, and method for assaying particulates. The HEPA filter provides for capture of 99.99% or greater of particulates from a gas stream, with collection of particulates on the surface of the filter media. The invention provides a filter system that can be cleaned and regenerated in situ.

  1. Digital notch filter based active damping for LCL filters

    Yao, Wenli; Yang, Yongheng; Zhang, Xiaobin;

    2015-01-01

    …In contrast, active damping does not require any dissipation elements, and has thus become of increasing interest. As a result, a vast number of active damping solutions have been reported, among which multi-loop control systems and additional sensors are necessary, leading to increased cost and complexity.… In this paper, a notch-filter-based active damping method without the requirement of additional sensors is proposed, where the inverter current is employed as the feedback variable. Firstly, a design method of the notch filter for active damping is presented. The entire system stability has then been investigated in the z-domain. Simulations and experiments are carried out to verify the proposed active damping method. Both results have confirmed that the notch-filter-based active damping can ensure the entire system stability in the case of resonances with a good system performance.…
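    A notch filter for active damping is typically a biquad with zeros on the unit circle at the LCL resonance frequency. A minimal design sketch (this is the generic textbook notch, not the paper's specific design; the frequencies and the pole radius r below are illustrative):

```python
import numpy as np

def notch_coeffs(f0, fs, r=0.98):
    """Biquad notch H(z) = (1 - 2cos(w0)z^-1 + z^-2)/(1 - 2r cos(w0)z^-1 + r^2 z^-2):
    zeros on the unit circle at f0, poles at radius r just inside it;
    r closer to 1 gives a narrower notch."""
    w0 = 2 * np.pi * f0 / fs
    b = np.array([1.0, -2 * np.cos(w0), 1.0])         # numerator (zeros)
    a = np.array([1.0, -2 * r * np.cos(w0), r ** 2])  # denominator (poles)
    return b, a

def magnitude(b, a, f, fs):
    """|H(e^{jw})| evaluated directly at frequency f."""
    zi = np.exp(-2j * np.pi * f / fs)                 # z^{-1}
    return abs((b[0] + b[1] * zi + b[2] * zi**2)
               / (a[0] + a[1] * zi + a[2] * zi**2))
```

    For a resonance at 1 kHz with fs = 10 kHz, the gain is essentially zero at 1 kHz and close to unity at DC, so the resonance peak is suppressed in the current loop without otherwise reshaping it.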

  2. Moisture reduction of filter cake by improved filter design

    Dahlstrom, D.A.; Davis, S.S.; Garlick, L.D.

    1983-10-01

    With the redesign of the Agidisc filter, several improvements have been achieved. Internal velocities of filtrate and air have been greatly reduced in the PIPPED sectors, ferrules, internal channels and filter valve; this maximizes the pressure drop across the cake, resulting in lower moistures and decreased wear. The PIPPED sector achieves better cake discharge because of its flexing action while increasing bag life. The moisture content reduction for the PIPPED sectors alone appears to have the potential to be about 1 1/2 percentage points lower than for redwood sectors. By indexing the two halves of the disc filter so that each half discharges separately, shock wear on the cake conveyor should be reduced. More air volume will also go to fewer sectors, resulting in better discharge. By using a sudden blow with a low-pressure blower for cake discharge, energy consumption is reduced. Overall maintenance costs for the new design should be reduced through less wear due to appreciably reduced velocities.

  3. Sharpening of the multistage modified comb filters

    Nikolić Marko

    2011-01-01

    Full Text Available This paper describes the application of the filter sharpening method to the modified comb filter (MCF) in the case where the decimation factor is a product of two or more positive integers. It is shown that in the case of multistage decimation with the MCF, the filters in each stage are also MCFs. Applying sharpening to the decimation filter in the last stage provides very good results, with savings in the number of operations compared to sharpening of the complete filter. A direct-form FIR polyphase filter structure is proposed for the filters in each stage.
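    Filter sharpening in the classic Kaiser-Hamming sense replaces a filter with amplitude response H by the polynomial 3H^2 - 2H^3, which flattens the passband and deepens the stopband at the cost of extra filter passes. A sketch on a plain moving-average comb (parameters illustrative; this shows the sharpening polynomial itself, not the paper's multistage MCF structure):

```python
import numpy as np

def comb_mag(w, M):
    """Magnitude response of an M-tap moving-average (comb) filter:
    |sin(Mw/2) / (M sin(w/2))|, with the w=0 limit equal to 1."""
    out = np.ones_like(w)
    nz = w != 0
    out[nz] = np.abs(np.sin(M * w[nz] / 2) / (M * np.sin(w[nz] / 2)))
    return out

M = 8
w = np.linspace(0, np.pi, 1024)
H = comb_mag(w, M)
Hs = 3 * H**2 - 2 * H**3       # sharpened response: less droop, more rejection
```

    At a passband frequency the sharpened response sits much closer to unity than the raw comb (reduced droop), while wherever H is already small, 3H^2 - 2H^3 is smaller still (improved stopband).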

  4. Progress towards the use of disposable filters

    Thermally degradable materials have been evaluated for service in HEPA filter units used to filter gases from active plants. The motivation was to reduce the bulk storage problems of contaminated filters by thermal decomposition to gaseous products and a solid residue substantially comprised of the filtered particulates. It is shown that while there are no commercially available alternatives to the glass fibre used in the filter medium, it would be feasible to manufacture the filter case and spacers from materials which could be incinerated. Operating temperatures, costs and the type of residues for disposal are discussed for filter case materials. (U.K.)

  5. Novel Simplex Unscented Transform and Filter

    Wan-Chun Li; Ping Wei; Xian-Ci Xiao

    2008-01-01

    In this paper, a new simplex unscented transform (UT) based on the Schmidt orthogonalization algorithm and a new filter method based on this transform are proposed. This filter has a lower computational cost than the UKF (unscented Kalman filter), SUKF (simplex unscented Kalman filter) and EKF (extended Kalman filter). Computer simulation shows that this filter has the same performance as the UKF and SUKF and, according to an analysis of the computational requirements of the EKF, UKF and SUKF, preferable practical value. Finally, the appendix shows the efficiency of this UT.
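    For reference, the standard symmetric unscented transform that the simplex variant economizes on can be sketched as follows; it uses 2n+1 sigma points, whereas a simplex UT needs only about n+2. (This is the generic textbook UT, not the paper's Schmidt-orthogonalization construction; kappa is the usual scaling parameter.)

```python
import numpy as np

def unscented_transform(f, mean, cov, kappa=1.0):
    """Symmetric UT with 2n+1 sigma points: propagate the points through
    f and recombine a weighted mean and covariance (exact for linear f)."""
    n = mean.size
    S = np.linalg.cholesky((n + kappa) * cov)        # S S^T = (n+kappa) cov
    sigma = np.vstack([mean, mean + S.T, mean - S.T])
    w = np.full(2 * n + 1, 1.0 / (2 * (n + kappa)))
    w[0] = kappa / (n + kappa)
    y = np.array([f(s) for s in sigma])
    my = w @ y                                       # transformed mean
    Py = (w[:, None] * (y - my)).T @ (y - my)        # transformed covariance
    return my, Py
```

    For a linear map f(x) = Ax the recovered mean and covariance are exactly Am and APA', which is a convenient correctness check for any UT implementation, simplex or symmetric.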

  6. A Study of Speckle Noise Reduction Filters

    Jyoti Jaybhay

    2015-06-01

    Full Text Available Ultrasound images and SAR (synthetic aperture radar) images are usually corrupted by speckle noise, also called granular noise. It is quite a tedious task to remove such noise and analyze the corrupted images. To date, many researchers have worked on removing speckle noise using frequency-domain methods, temporal methods, and adaptive methods. Different filters have been developed, such as the mean and median filters, the statistical Lee filter, the statistical Kuan filter, the Frost filter, and the SRAD filter. This paper reviews the filters used to remove speckle noise.
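    Of the adaptive filters listed, the Lee filter is the simplest to sketch: each sample is blended with its local mean according to how much the local variance exceeds the estimated noise variance. A 1-D illustrative version (the window size and the median-based noise estimate are ad hoc simplifications, not the original formulation):

```python
import numpy as np

def lee_filter(x, win=7, noise_var=None):
    """1-D Lee-style filter: y = m + W (x - m), W = clip(1 - nv/var, 0, 1).
    Homogeneous regions (var close to nv) collapse to the local mean,
    while strong features (var >> nv) pass through nearly untouched."""
    k = np.ones(win) / win
    m = np.convolve(x, k, mode="same")               # local mean
    m2 = np.convolve(x * x, k, mode="same")
    var = np.maximum(m2 - m * m, 1e-12)              # local variance
    if noise_var is None:
        noise_var = np.median(var)                   # crude noise estimate
    W = np.clip(1.0 - noise_var / var, 0.0, 1.0)
    return m + W * (x - m)

rng = np.random.default_rng(1)
speckled = 1.0 + 0.3 * rng.standard_normal(500)      # homogeneous region
filtered = lee_filter(speckled)                      # variance drops sharply
```

    On a homogeneous region the output variance falls well below the input variance while the mean level is preserved, which is exactly the behavior the review attributes to the adaptive class of speckle filters.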

  7. TRADITIONAL FERMENTED FOODS OF LESOTHO

    Tendekayi H. Gadaga

    2013-06-01

    Full Text Available This paper describes the traditional methods of preparing the fermented foods and beverages of Lesotho. Information on the preparation methods was obtained through a combination of literature review and face-to-face interviews with respondents from Roma in Lesotho. An unstructured questionnaire was used to capture information on the processes, raw materials and utensils used. Four products - motoho (a fermented porridge), Sesotho (a sorghum-based alcoholic beverage), hopose (sorghum beer fermented with added hops) and mafi (spontaneously fermented milk) - were found to be the main fermented foods prepared and consumed at household level in Lesotho. Motoho is a thin gruel, popular as a refreshing beverage as well as a weaning food. Sesotho is a sorghum-based alcoholic beverage prepared for household consumption as well as for sale; it is consumed in the actively fermenting state. Mafi is the name given to spontaneously fermented milk with a thick consistency. Little research has been done on the technological aspects, including the microbiological and biochemical characteristics, of fermented foods in Lesotho. Some of the traditional aspects of the preparation methods, such as the use of earthenware pots, are being replaced, and modern equipment, including plastic utensils, is being used. There is a need for further systematic studies on the microbiological and biochemical characteristics of these products.

  8. Elephant resource-use traditions.

    Fishlock, Victoria; Caldwell, Christine; Lee, Phyllis C

    2016-03-01

    African elephants (Loxodonta africana) use unusual and restricted habitats such as swampy clearings, montane outcrops and dry rivers for a variety of social and ecological reasons. Within these habitats, elephants focus on very specific areas for resource exploitation, resulting in deep caves, large forest clearings and sand pits as well as long-established and highly demarcated routes for moving between resources. We review evidence for specific habitat exploitation in elephants and suggest that this represents socially learned cultural behaviour. Although elephants show high fidelity to precise locations over the very long term, these location preferences are explained neither by resource quality nor by accessibility. Acquiring techniques for exploiting specific resource sites requires observing conspecifics and practice and is evidence for social learning. Elephants possess sophisticated cognitive capacities used to track relationships and resources over their long lifespans, and they have an extended period of juvenile dependency as a result of the need to acquire this considerable social and ecological knowledge. Thus, elephant fidelity to particular sites results in traditional behaviour over generations, with the potential to weaken relationships between resource quality and site preferences. Illustrating the evidence for such powerful traditions in a species such as elephants contributes to understanding animal cognition in natural contexts. PMID:26359083

  9. Traditional and Modern Morphometrics: Review

    Gökhan OCAKOĞLU

    2013-01-01

    Full Text Available Morphometrics, a branch of morphology, is the study of the size and shape components of biological forms and their variation in the population. In the biological and medical sciences, there is a long history of attempts to quantitatively express the diversity of the size and shape of biological forms. On the basis of historical developments in morphometry, we address several questions related to the shape of organs or organisms that are considered in biological and medical studies. In the field of morphometrics, multivariate statistical analysis is used to rigorously address such questions. Historically, these methods have involved the analysis of collections of distances or angles, but recent theoretical, computational, and other advances have shifted the focus of morphometric procedures to the Cartesian coordinates of anatomical points. In recent years, traditional morphometric studies that aim to analyze shape variation have been replaced by modern morphometric studies in biology and medicine. In the biological and medical sciences, morphometric methods are frequently preferred for examining the morphological structures of organs or organisms with regard to diseases or environmental factors, and for evaluating and classifying the variation of organs or organisms with respect to growth or allometry in a time-dependent manner. Geometric morphometric methods are more valid than traditional morphometric methods in that they preserve more morphological information and permit analysis of this information.

  10. Anti-aliasing Filter in Hybrid Filter Banks

    Poulton, Daniel

    2006-01-01

    Hybrid filter banks allow wide-band, high-frequency conversion. All existing design methods suppose that the input signal is band-limited and that each sub-band signal is sampled at 1/M times the effective Nyquist frequency 1/T of the input signal. To avoid aliasing in the sampling process, an analog anti-aliasing filter should be used in order to eliminate noise in frequency bands in which there is no signal (or little signal). In this paper, it is shown that this ...

  11. Constrained digital matched filter method for optimum filter synthesis

    We present a new method to directly calculate the optimum filter in the presence of any additive stationary noise, with arbitrary time-domain constraints (flat-top, zero-area, etc.). A more concise re-derivation of the digital penalized LMS method (DPLMS) is given. The method is fully developed, and synthesis results for a typical situation are given and compared with the DPLMS method. The optimum filter can be synthesized without a priori knowledge of the noise power spectral density, which makes the method suitable for use in adaptive, self-calibrating digital spectroscopy
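    The flat-top and zero-area constraints mentioned above are linear in the filter coefficients, so a constrained optimum has a closed form via Lagrange multipliers. A generic linearly-constrained minimum-variance sketch (not the DPLMS procedure itself; the matrices R, C and vector f below are illustrative stand-ins):

```python
import numpy as np

def constrained_optimum_filter(R, C, f):
    """Minimize output noise power h^T R h subject to C^T h = f.
    Lagrange closed form: h = R^{-1} C (C^T R^{-1} C)^{-1} f."""
    Ri_C = np.linalg.solve(R, C)                      # R^{-1} C
    return Ri_C @ np.linalg.solve(C.T @ Ri_C, f)

n = 32
R = np.eye(n)                                         # white-noise covariance (illustrative)
C = np.column_stack([np.ones(n),                      # unit-gain ("flat") constraint
                     np.arange(n, dtype=float)])      # first-moment constraint
f = np.array([1.0, 0.0])
h = constrained_optimum_filter(R, C, f)               # satisfies C^T h = f exactly
```

    With a measured or assumed noise covariance in place of the identity, the same closed form trades off noise suppression against however many linear shape constraints are stacked into C.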

  12. The use of filter media to determine filter cleanliness

    Van Staden, S. J.; Haarhoff, J.

    It is generally believed that a sand filter starts its life with new, perfectly clean media, which becomes gradually clogged with each filtration cycle, eventually reaching a point where either head loss or filtrate quality starts to deteriorate. At this point the backwash cycle is initiated and, through the combined action of air and water, returns the media to its original, perfectly clean state. Reality, however, dictates otherwise. Many treatment plants visited a decade or more after commissioning are found to have unacceptably dirty filter sand and backwash systems incapable of returning the filter media to a desired state of cleanliness. In some cases these are common problems encountered in filtration plants, but many reasons for media deterioration remain elusive, falling outside these common problems. The South African conditions of highly eutrophic surface waters at high temperatures, however, exacerbate the problems with dirty filter media. Such conditions often lead to the formation of biofilm in the filter media, which is shown to inhibit the effective backwashing of sand and carbon filters. A systematic investigation into filter media cleanliness was therefore carried out from 2002 to 2005 at the University of Johannesburg (then the Rand Afrikaans University). This involved media from eight South African water treatment plants, varying between sand and sand-anthracite combinations and raw water types ranging from eutrophic through turbid to low-turbidity waters. Five states of cleanliness and four fractions of specific deposit were identified, relating to in situ washing, column washing, cylinder inversion and acid-immersion techniques. These were measured and the results compared to acceptable limits for specific deposit, as determined in previous studies, expressed in kg/m³. These values were used to determine the state of the filters.
In order to gain greater insight into the composition of the specific deposits stripped from the media, a

  13. A biological oil adsorption filter

    Pasila, A. [University of Helsinki (Finland). Dept. of Agricultural Engineering and Household Technology

    2005-12-01

    A new oil adsorption method called adsorption filtration (AF) has been developed. It is a technology whereby oil residues can be cleaned from water by running it through a simple filter made from freeze-treated, dried, milled and then fragmented plant material. By choosing suitable plants and fragmentation sizes it is possible to produce filters which pass water but adsorb oil. The aim of this study was to investigate the possibilities of manufacturing oil-adsorbing filter materials from reed canary grass (Phalaris arundinacea), flax (Linum usitatissimum L.) or hemp fibre (Cannabis sativa L.). The oil (80 ml) was mixed with de-ionised water (200 ml) and this mixture was filtered through 10 or 20 g adsorption filters. Fine spring-harvested hemp fibre (diameter less than 1 mm) and reed canary grass fragments adsorb 2-4 g of oil per gram of adsorption material, compared to 1-3 g of water. Adsorption filtration is thus a novel way of gathering spilled oil in shallow coastal waters before the oil reaches the shore. (author)

  14. Properties of Ceramic Candle Filters

    Coal-fired Pressurized Fluidized Bed Combustion (PFBC) and Integrated Gasification Combined Cycle (IGCC) systems require ceramic candle filter elements which can withstand the mechanical, thermal, and chemical environment of hot gas cleanup applications. These systems demand filter elements to sustain the thermal stresses of normal operations (pulse cleaning), of start-up and shut-down conditions, and of unanticipated process upsets such as excessive ash accumulation without catastrophic failure. The filter elements must also survive the mechanical loads associated with handling and assembly, normal operation, and process upsets. Objectives of the test program at Southern Research are as follows: (1) Provide material characterization to develop an understanding of the physical, mechanical, and thermal behavior of hot gas filter materials. (2) Develop a material property data base from which the behavior of materials in the hot gas cleanup environment may be predicted. (3) Perform testing and analysis of filter elements after exposure to actual operating conditions to determine the effects of the thermal and chemical environments in hot gas filtration on material properties. (4) Explore the glass-like nature of the matrix material

  15. Iodine filters in nuclear installations

    The present report discusses the significance for environmental exposure of the iodine released with the gaseous effluents of nuclear power stations and reprocessing plants in relation to releases of other airborne radionuclides. Iodine filtration processes are described. The release pathways and the composition of airborne fission product iodine mixtures and their bearing on environmental exposure are discussed on the basis of measured fission product iodine emissions. The sorbents which can be used for iodine filtration, their removal efficiencies and range of applications are dealt with in detail. The particular conditions governing iodine removal, which are determined by the various gaseous iodine species, are illustrated on the basis of experimentally determined retention profiles. Particular attention is given to the limitations imposed by temperature, humidity, radiation and filter poisoning. The types of filter normally used are described, their advantages and drawbacks discussed, the principles underlying their design are outlined and the sources of error indicated. The methods normally applied to test the efficiency of various iodine sorbents are described and assessed. Operating experience with iodine filters, gathered from surveillance periods of many years, is supplemented by a large number of test results and the findings of extensive experiments. Possible ways of prolonging the permissible service lives of iodine filters are discussed and information is given on protective measures. The various iodine removal processes applied in reprocessing plants are described and compared with reference to efficiency and cost. The latest developments in filter technology in reprocessing plants are briefly outlined

  16. Modal Filters for Infrared Interferometry

    Ksendzov, Alexander; MacDonald, Daniel R.; Soibel, Alexander

    2009-01-01

    Modal filters in the approximately 10-micrometer spectral range have been implemented as planar dielectric waveguides in infrared interferometric applications such as searching for Earth-like planets. When looking for a small, dim object (the "Earth") in close proximity to a large, bright object (the "Sun"), the interferometric technique uses beams from two telescopes combined with a 180° phase shift in order to cancel the light from the brighter object. The interferometer baseline can be adjusted so that, at the same time, the light from the dimmer object arrives at the combiner in phase. This light can be detected and its infrared (IR) optical spectra can be studied. Cancellation of the light from the "Sun" by a factor of approximately 10^6 is required; this is not possible without special devices (modal filters) that equalize the wavefronts arriving from the two telescopes. Currently, modal filters in the approximately 10-micrometer spectral range are implemented as single-mode fibers. Using semiconductor technology, single-mode waveguides for use as modal filters were fabricated. Two designs were implemented: one using an InGaAs waveguide layer matched to an InP substrate, and one using InAlAs matched to an InP substrate. Photon Design software was used to design the waveguides, with the main feature of all designs being single-mode operation in the 10.5- to 17-micrometer spectral range. Preliminary results show that the filter's rejection ratio is 26 dB.

  18. A Level Set Filter for Speckle Reduction in SAR Images

    Huang Bo

    2010-01-01

    Despite much effort and significant progress in recent years, speckle removal for Synthetic Aperture Radar (SAR) images is still a challenging problem in image processing. Unlike traditional noise filters, which are mainly based on local-neighborhood statistical averaging or frequency transforms, in this paper we propose a speckle reduction method based on the theory of level sets, one form of curvature flow propagation. Firstly, starting from a partial differential equation, the Lee filter can be cast as an anisotropic diffusion function; we then further deduce a level set formulation from it. Bringing level set flow into the method allows the front interface to propagate naturally with topological changes, where the speed is proportional to the curvature of the intensity contours in an image. Hence, small speckle disappears quickly, while large-scale interfaces are slow to evolve. Secondly, to preserve finer detailed structures while smoothing the speckle, the evolution is switched between minimum and maximum curvature speed depending on the scale of the speckle. The proposed method is illustrated by experiments on a simulated image and on ERS-2 SAR images under different circumstances. Its advantages over traditional speckle reduction filter approaches are also demonstrated.
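
    The curvature-driven smoothing described above can be sketched compactly. The following is a minimal mean-curvature-flow smoother, not the authors' full Lee-filter-derived level set scheme: it evolves the image by I_t = kappa·|grad I|, so high-curvature speckle shrinks quickly while straight, large-scale edges barely move. The step size, iteration count, and test data are illustrative assumptions.

```python
import numpy as np

def curvature_flow(img, n_iter=20, dt=0.1, eps=1e-8):
    """Smooth an image by mean-curvature motion: I_t = kappa * |grad I|.

    Small speckle (high-curvature intensity contours) shrinks quickly,
    while large, low-curvature structures evolve slowly.
    """
    I = img.astype(float).copy()
    for _ in range(n_iter):
        Iy, Ix = np.gradient(I)
        Iyy, Ixy = np.gradient(Iy)
        _, Ixx = np.gradient(Ix)
        # kappa * |grad I| expressed through first and second derivatives
        num = Ixx * Iy**2 - 2.0 * Ix * Iy * Ixy + Iyy * Ix**2
        I += dt * num / (Ix**2 + Iy**2 + eps)
    return I
```

    A practical scheme would also implement the min/max curvature switching the abstract describes; this sketch uses plain curvature speed throughout.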

  19. Effect of a biological activated carbon filter on particle counts

    Su-hua WU; Bing-zhi DONG; Tie-jun QIAO; Jin-song ZHANG

    2008-01-01

    Due to the importance of biological safety in drinking water quality and the disadvantages of traditional methods of detecting typical microorganisms such as Cryptosporidium and Giardia, it is necessary to develop an alternative. Particle counting is a quantitative measurement of the amount of undissolved solids in water. The removal rate of particle counts was previously used as an indicator of the effectiveness of a biological activated carbon (BAC) filter in removing Cryptosporidium and Giardia. The particle counts in a BAC filter effluent over one operational period and the effects of BAC filter construction and operational parameters were investigated with a 10 m3/h pilot plant. The results indicated that the maximum particle count in backwash remnant water was as high as 1296 count/ml, and about 1.5 h was needed for it to fall from the maximum to less than 50 count/ml. During the standard filtration period, particle counts stayed constant at less than 50 count/ml for 5 d, except when influenced by sand filter backwash remnant water. The removal rates of particle counts in the BAC filter are related to the characteristics of the carbon. For example, a columned carbon and a sand bed removed 33.3% and 8.5% of particles, respectively, while the particle counts in effluent from a cracked BAC filter were higher than those of the influent. There was no significant difference among particle removal rates at different filtration rates. High post-ozone dosage (>2 mg/L) plays an important role in particle count removal; when the dosage was 3 mg/L, the removal rates by carbon layers and sand beds decreased by 17.5% and increased by 9.5%, respectively, compared with a 2 mg/L dosage.

  20. An Improved Morphological Algorithm for Filtering Airborne LiDAR Point Cloud Based on Multi-Level Kriging Interpolation

    Zhenyang Hui

    2016-01-01

    Filtering is one of the core post-processing steps for airborne LiDAR point clouds. In recent years, morphology-based filtering algorithms have proven to be a powerful and efficient tool for filtering airborne LiDAR point clouds. However, most traditional morphology-based algorithms have difficulty preserving abrupt terrain features, especially when using larger filtering windows. In order to suppress the omission error caused by protruding terrain features, this paper proposes an improved morphological algorithm based on multi-level kriging interpolation. The algorithm is essentially a combination of a progressive morphological filtering algorithm and a multi-level interpolation filtering algorithm. The morphological opening operation is performed with the filtering window gradually downsizing, while kriging interpolation is conducted at different levels according to the different filtering windows. This process iterates in a top-down fashion until the filtering window is no longer greater than the preset minimum filtering window. Fifteen samples provided by the ISPRS commission were chosen to test the performance of the proposed algorithm. Experimental results show that the proposed method achieves promising results not only in flat urban areas but also in rural areas. Compared with eight other classical filtering methods, the proposed method obtained the lowest omission error and preserved protruding terrain features better.
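
    As a rough illustration of the progressive-morphology idea the paper builds on, here is the classic coarse-to-fine variant with growing opening windows and matched height thresholds applied to a rasterised elevation surface. It omits the paper's top-down multi-level kriging interpolation, and the window sizes and thresholds are illustrative assumptions.

```python
import numpy as np
from scipy.ndimage import grey_opening

def progressive_morphological_filter(z, windows=(3, 5, 9), dh=(0.5, 1.0, 2.0)):
    """Classify raster cells of a lowest-return elevation grid as ground/non-ground.

    windows: growing structuring-element sizes; dh: matching height thresholds.
    Returns a boolean mask, True where the cell is kept as ground.
    """
    surface = z.astype(float).copy()
    ground = np.ones(z.shape, dtype=bool)
    for w, t in zip(windows, dh):
        opened = grey_opening(surface, size=(w, w))
        # cells standing more than t above the opened surface are objects
        ground &= (surface - opened) <= t
        surface = opened            # the next level works on the filtered surface
    return ground
```

    Large windows remove large objects (buildings) but would also shave off protruding terrain, which is exactly the omission error the paper's kriging levels are designed to suppress.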

  1. Different aspects of nonlinear stochastic filtering theory

    Kaddachi, Riadh

    2006-01-01

    This thesis studies different aspects of the linear and the nonlinear stochastic filtering problem. It consists of four chapters. In the first chapter we derive the Kalman and the extended Kalman filter algorithms and we study some of their qualitative properties. In the second chapter we present a unified general framework on particle filter methods. In particular, we show how the particle filter methods surmount the difficulties due to the Kalman approach to filtering and we compare differe...
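
    The linear Kalman filter discussed in the first chapter is short enough to state in full. A textbook sketch follows; the constant-velocity tracking model in the test is an illustrative assumption, not an example from the thesis.

```python
import numpy as np

def kalman_filter(zs, F, H, Q, R, x0, P0):
    """Standard linear Kalman filter; returns the filtered state estimates."""
    x, P = x0.copy(), P0.copy()
    I = np.eye(len(x0))
    out = []
    for z in zs:
        # predict
        x = F @ x
        P = F @ P @ F.T + Q
        # update
        y = z - H @ x                      # innovation
        S = H @ P @ H.T + R                # innovation covariance
        K = P @ H.T @ np.linalg.inv(S)     # Kalman gain
        x = x + K @ y
        P = (I - K @ H) @ P
        out.append(x.copy())
    return np.array(out)
```

    The extended Kalman filter replaces F and H by the Jacobians of the nonlinear state and measurement maps, evaluated at the current estimate.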

  2. Analytic Moment-based Gaussian Process Filtering

    Deisenroth, Marc P.; Huber, Marco F.; Hanebeck, Uwe D.

    2009-01-01

    We propose an analytic moment-based filter for nonlinear stochastic dynamic systems modeled by Gaussian processes. Exact expressions for the expected value and the covariance matrix are provided for both the prediction step and the filter step, where an additional Gaussian assumption is exploited in the latter case. Our filter does not require further approximations. In particular, it avoids finite-sample approximations. We compare the filter to a variety of Gaussian filters, that is, the EKF...

  3. Optimal Nonlinear Filter for INS Alignment

    赵瑞; 顾启泰

    2002-01-01

    All methods previously used to handle inertial navigation system (INS) alignment were sub-optimal. In this paper, particle filtering (PF) is used as an optimal method for solving the problem of INS alignment. A sub-optimal two-step filtering algorithm is presented to improve the real-time performance of PF. The approach combines particle filtering with Kalman filtering (KF). Simulation results illustrate the superior performance of these approaches when compared with extended Kalman filtering (EKF).
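
    For reference, the bootstrap particle filter underlying PF approaches like this one can be sketched in a few lines. The scalar random-walk model, particle count, and noise levels below are illustrative assumptions chosen so the result is checkable, not the paper's INS alignment model.

```python
import numpy as np

rng = np.random.default_rng(2)

def particle_filter(ys, n_particles=500, q=0.1, r=1.0):
    """Bootstrap particle filter for x_k = x_{k-1} + w,  y_k = x_k + v."""
    particles = rng.normal(ys[0], np.sqrt(r), n_particles)
    estimates = []
    for y in ys:
        # propagate each particle through the process model
        particles = particles + rng.normal(0.0, np.sqrt(q), n_particles)
        # weight by the measurement likelihood
        w = np.exp(-0.5 * (y - particles) ** 2 / r)
        w /= w.sum()
        estimates.append(np.dot(w, particles))
        # systematic resampling
        cs = np.cumsum(w)
        cs /= cs[-1]
        idx = np.searchsorted(cs, (rng.random() + np.arange(n_particles)) / n_particles)
        particles = particles[idx]
    return np.array(estimates)
```

    The two-step idea in the abstract would replace part of this sampling with a cheap KF update; the full strength of PF only shows on nonlinear, non-Gaussian models.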

  4. Multiplier-free filters for wideband SAR

    Dall, Jørgen; Christensen, Erik Lintz

    2001-01-01

    This paper derives a set of parameters to be optimized when designing filters for digital demodulation and range prefiltering in SAR systems. Aiming at an implementation in field programmable gate arrays (FPGAs), an approach for the design of multiplier-free filters is outlined. Design results are presented in terms of filter complexity and performance. One filter has been coded in VHDL, and preliminary results indicate that the filter can meet a 2 GHz input sample rate.
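
    A standard route to multiplier-free decimating filters on FPGAs, related in spirit to the designs discussed here though not taken from the paper, is the cascaded integrator-comb (CIC) structure, which uses only additions and subtractions. A behavioural sketch (rate change and stage count are illustrative assumptions):

```python
import numpy as np

def cic_decimate(x, R=4, stages=2):
    """Cascaded integrator-comb (CIC) decimator: adds/subtracts only, no multipliers.

    int64 accumulators give ample headroom for the integrator growth in this demo.
    """
    y = np.asarray(x, dtype=np.int64)
    for _ in range(stages):          # integrators at the high input rate
        y = np.cumsum(y)
    y = y[::R]                       # decimate by R
    for _ in range(stages):          # combs at the low rate (differential delay 1)
        y = np.diff(y, prepend=0)
    return y
```

    The DC gain is R**stages (16 here), which a hardware design compensates with a shift or a later scaling stage.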

  5. Accelerated graph-based spectral polynomial filters

    Knyazev, Andrew; Malyshev, Alexander

    2015-01-01

    Graph-based spectral denoising is a low-pass filtering using the eigendecomposition of the graph Laplacian matrix of a noisy signal. Polynomial filtering avoids costly computation of the eigendecomposition by projections onto suitable Krylov subspaces. Polynomial filters can be based, e.g., on the bilateral and guided filters. We propose constructing accelerated polynomial filters by running flexible Krylov subspace based linear and eigenvalue solvers such as the Block Locally Optimal Precond...
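
    The basic building block, applying a polynomial of the graph Laplacian using matrix-vector products only, with no eigendecomposition, can be sketched as follows. The simple damped-power polynomial (I - L/λ_bound)^k and the path-graph test are illustrative assumptions, not the accelerated Krylov construction of the paper.

```python
import numpy as np

def polynomial_graph_filter(L, x, order=10):
    """Low-pass filter a graph signal x by applying (I - L/lam_bound)^order.

    Only repeated (sparse) mat-vecs with the Laplacian L are needed; the
    eigendecomposition of L is never formed.
    """
    # cheap upper bound on the largest Laplacian eigenvalue: 2 * max degree
    lam_bound = 2.0 * L.diagonal().max()
    y = x.copy()
    for _ in range(order):
        y = y - (L @ y) / lam_bound        # y <- (I - L/lam_bound) y
    return y
```

    Low graph frequencies (smooth signals) have eigenvalues near 0 and pass almost unchanged, while high-frequency noise is strongly attenuated.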

  6. A uniformly convergent adaptive particle filter

    Papavasiliou, Anastasia

    2005-01-01

    Particle filters are Monte Carlo methods that aim to approximate the optimal filter of a partially observed Markov chain. In this paper, we study the case in which the transition kernel of the Markov chain depends on unknown parameters: we construct a particle filter for the simultaneous estimation of the parameter and the partially observed Markov chain (adaptive estimation) and we prove the convergence of this filter to the correct optimal filter, as time and the number...

  7. Multiresolution Bilateral Filtering for Image Denoising

    Zhang, Ming; Gunturk, Bahadir K.

    2008-01-01

    The bilateral filter is a nonlinear filter that does spatial averaging without smoothing edges; it has been shown to be an effective image denoising technique. An important issue with the application of the bilateral filter is the selection of the filter parameters, which affect the results significantly. There are two main contributions of this paper. The first contribution is an empirical study of the optimal bilateral filter parameter selection in image denoising applications. The second contri...
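
    A brute-force bilateral filter makes the two parameters the paper studies explicit: sigma_s controls the spatial extent of the averaging and sigma_r the intensity (range) tolerance that protects edges. A minimal sketch; the parameter values are illustrative assumptions, and real implementations use faster approximations.

```python
import numpy as np

def bilateral_filter(img, sigma_s=2.0, sigma_r=0.2, radius=4):
    """Brute-force bilateral filter: spatial Gaussian times range (intensity) Gaussian."""
    H, W = img.shape
    out = np.zeros_like(img, dtype=float)
    ys, xs = np.mgrid[-radius:radius + 1, -radius:radius + 1]
    spatial = np.exp(-(xs**2 + ys**2) / (2 * sigma_s**2))
    pad = np.pad(img, radius, mode='edge')
    for i in range(H):
        for j in range(W):
            patch = pad[i:i + 2 * radius + 1, j:j + 2 * radius + 1]
            # range weight vanishes across strong edges, so edges are preserved
            w = spatial * np.exp(-(patch - img[i, j])**2 / (2 * sigma_r**2))
            out[i, j] = (w * patch).sum() / w.sum()
    return out
```

    With noise well below sigma_r the filter behaves like Gaussian smoothing inside flat regions, while intensity jumps much larger than sigma_r are left intact.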

  8. A generalised linear and nonlinear spline filter

    Zeng, W.; Jiang, X.; Scott, P.

    2011-01-01

    In this paper, a generalised spline filter that has a unified description for both the linear spline filter and the nonlinear robust spline filter is proposed. Based on M-estimation theory, the general spline filter model can be solved using an iteratively reweighted least squares (IRLS) method, which is likewise general for both the linear and the nonlinear spline filter. The algorithm has been verified to be effective, efficient and fast.
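
    The IRLS solution of an M-estimation smoothing problem can be sketched in one dimension. This toy version uses a second-difference (Whittaker-type) penalty and Huber-style weights; it is not the paper's exact generalised spline model, and the values of lam and c are illustrative assumptions.

```python
import numpy as np

def robust_spline_filter(y, lam=10.0, c=0.5, n_iter=20):
    """Robust smoothing via IRLS: min_s sum rho(y_i - s_i) + lam * ||D2 s||^2.

    Each pass solves the weighted normal equations (W + lam * D2'D2) s = W y;
    Huber-type weights then downweight large residuals (outliers).
    """
    n = len(y)
    D = np.diff(np.eye(n), n=2, axis=0)      # second-difference operator
    P = lam * D.T @ D
    w = np.ones(n)
    s = y.copy()
    for _ in range(n_iter):
        s = np.linalg.solve(np.diag(w) + P, w * y)
        r = y - s
        # weight 1 for small residuals, c/|r| beyond the Huber threshold c
        w = np.where(np.abs(r) <= c, 1.0, c / np.abs(r))
    return s
```

    With all weights fixed at 1 the loop reduces to the linear spline filter in a single solve, which is the unification the abstract refers to.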

  9. Automated Integrated Analog Filter Design Issues

    Karolis Kiela; Romualdas Navickas

    2015-01-01

    An analysis of modern automated integrated analog circuit design methods and their use in integrated filter design is presented. Modern automated analog circuit design tools are based on optimization algorithms and/or new circuit generation methods. Most automated integrated filter design methods are suited only to gmC and switched-current filter topologies. Here, an algorithm for active RC integrated filter design is proposed that can be used in automated filter design. The algorithm is t...

  10. Evolving Information Filtering for Personalized Information Service

    田范江; 李丛蓉; 王鼎兴

    2001-01-01

    Information filtering (IF) systems are important for personalized information services. However, most current IF systems suffer from low quality and long training times. In this paper, a refined evolving information filtering method is presented. This method describes the user's information need from multiple aspects and improves filtering quality through a process akin to natural selection. Experimental results show that this method can shorten training time, improve filtering quality, and reduce the dependence of the filtering results on the training sequence.

  11. Inferior vena cava filters in cancer patients: to filter or not to filter

    Hikmat Abdel-Razeq

    2011-03-01

    Hikmat Abdel-Razeq (1), Asem Mansour (2), Yousef Ismael (1), Hazem Abdulelah (1); (1) Department of Internal Medicine, (2) Department of Radiology, King Hussein Cancer Center, Amman, Jordan. Purpose: Cancer and its treatment are recognized risk factors for venous thromboembolism (VTE); active cancer accounts for almost 20% of all newly diagnosed VTE. Inferior vena cava (IVC) filters are utilized to provide mechanical thromboprophylaxis to prevent pulmonary embolism (PE) or to avoid bleeding from systemic anticoagulation in high-risk situations. In this report, utilizing a case study, we address the appropriate utilization of such filters in cancer patients. Methods: The case of a 43-year-old female patient with rectal cancer, who developed deep vein thrombosis following a complicated medical course, is presented. The patient was anticoagulated with a low molecular weight heparin, but a few months later, following an episode of bleeding, an IVC filter was planned. Using the PubMed database, articles published in the English language addressing issues related to IVC filters in cancer patients were accessed and are presented. Results: Many recent studies have questioned the need to insert IVC filters in advanced-stage cancer patients, particularly those whose anticipated survival is short, for whom prevention of PE may be of little clinical benefit and a poor utilization of resources. Conclusion: Systemic anticoagulation can be safely offered to the majority of cancer patients. When the risk of bleeding or pulmonary embolism is high, IVC filters can be utilized. However, placement of such filters should take into consideration the stage of disease and the life expectancy of such patients. Keywords: anticoagulation, bleeding, chemotherapy

  12. Radiant zone heated particulate filter

    Gonze, Eugene V [Pinckney, MI

    2011-12-27

    A system includes a particulate matter (PM) filter including an upstream end for receiving exhaust gas and a downstream end. A radiant zoned heater includes N zones, where N is an integer greater than one, wherein each of the N zones includes M sub-zones, where M is an integer greater than or equal to one. A control module selectively activates at least a selected one of the N zones to initiate regeneration in downstream portions of the PM filter from the one of the N zones, restricts exhaust gas flow in a portion of the PM filter that corresponds to the selected one of the N zones, and deactivates non-selected ones of the N zones.

  13. HEPA filter concerns - an overview

    The U.S. Department of Energy (DOE) recently initiated a complete review of the DOE High Efficiency Particulate Air (HEPA) Filter Program to identify areas for improvement. Although this process is currently ongoing, various issues and problems have already been identified for action that not only impacts the DOE HEPA filter program, but potentially the national and international air cleaning community as well. This paper briefly reviews a few of those concerns that may be of interest, and discusses actions initiated by the DOE to address the associated issues and problems. Issues discussed include: guidance standards, in-place testing, specifications, Test Facilities, portable units, vacuum cleaners, substitute aerosols, filter efficiencies, aging/shelf life/service life, fire suppression, handbook, Quality Products List (QPL), QA testing, and evaluations

  14. Applications of nonwoven filter media

    1988-11-01

    The multi-client technical and marketing report, Nonwovens in Filtration (1987) World Wide, has been completed by Filter Media Consulting, Inc. According to this 450-page report, $818 million in sales worldwide in nonwoven filter media represents a substantial segment of the entire nonwoven market. This total is mainly roll goods with a few exceptions. Meltblown composites represent $108 million, 13% of the total, and is the fastest growing segment as compared to needled felts, dry formed, thermobonded, spunbonded, wet laid and other unique processes, all extensively covered in this report. Included are 20 filtration applications covered in 190 pages, such as baghouse and dust filtration, Torit-type cartridge filters, HEPA/ULPA filtration, and heating, ventilation and air conditioning. Major markets are addressed, and trends in different fields are highlighted throughout the report.

  15. Efficient Wiener filtering without preconditioning

    Elsner, Franz

    2012-01-01

    We present a new approach to calculate the Wiener filter solution of general data sets. It is trivial to implement, flexible, numerically absolutely stable, and guaranteed to converge. Most importantly, it does not require an ingenious choice of preconditioner to work well. The method is capable of taking into account inhomogeneous noise distributions and arbitrary mask geometries. It iteratively builds up the signal reconstruction by means of a messenger field, introduced to mediate between the different preferred bases in which signal and noise properties can be specified most conveniently. Using cosmic microwave background (CMB) radiation data as a showcase, we demonstrate the capabilities of our scheme by computing Wiener filtered WMAP7 temperature and polarization maps at full resolution for the first time. We show how the algorithm can be modified to synthesize fluctuation maps, which, combined with the Wiener filter solution, result in unbiased constrained signal realizations, consistent with the obser...
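
    A 1-D sketch of the messenger-field iteration described above: the messenger covariance T = tau*I is the only operator handled in both bases, so each step needs only FFTs and element-wise divisions, with no preconditioner or conjugate-gradient solver. The power spectrum and noise model below are illustrative assumptions.

```python
import numpy as np

def messenger_wiener(d, S, noise_var, n_iter=100):
    """Messenger-field Wiener filter for a 1-D data vector d.

    S: signal power spectrum on np.fft.rfftfreq modes (prior, diagonal in
    Fourier space); noise_var: per-pixel noise variance (diagonal in pixel
    space). Iterates to the Wiener solution s_WF = (S^-1 + N^-1)^-1 N^-1 d.
    """
    n = len(d)
    tau = 0.99 * noise_var.min()          # messenger covariance T = tau * I
    Nbar = noise_var - tau                # remaining noise, diagonal in pixels
    s = np.zeros(n)
    for _ in range(n_iter):
        # pixel space: blend data and current signal through the messenger field
        t = (d / Nbar + s / tau) / (1.0 / Nbar + 1.0 / tau)
        # Fourier space: Wiener-weigh the messenger field against the prior
        tF = np.fft.rfft(t, norm="ortho")
        sF = tF / (1.0 + tau / S)
        s = np.fft.irfft(sF, n=n, norm="ortho")
    return s
```

    For homogeneous noise the fixed point can be checked against the closed-form Wiener filter S/(S + N) applied mode by mode; the scheme's value is that it works unchanged for inhomogeneous noise and masks.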

  16. Image Recommendation Algorithm Using Feature-Based Collaborative Filtering

    Kim, Deok-Hwan

    As the multimedia contents market continues its rapid expansion, the amount of image contents used in mobile phone services, digital libraries, and catalog service is increasing remarkably. In spite of this rapid growth, users experience high levels of frustration when searching for the desired image. Even though new images are profitable to the service providers, traditional collaborative filtering methods cannot recommend them. To solve this problem, in this paper, we propose feature-based collaborative filtering (FBCF) method to reflect the user's most recent preference by representing his purchase sequence in the visual feature space. The proposed approach represents the images that have been purchased in the past as the feature clusters in the multi-dimensional feature space and then selects neighbors by using an inter-cluster distance function between their feature clusters. Various experiments using real image data demonstrate that the proposed approach provides a higher quality recommendation and better performance than do typical collaborative filtering and content-based filtering techniques.
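
    A much-simplified sketch of recommending in a visual feature space: rank catalog items by cosine similarity to the centroid of the user's purchased items. The paper clusters the purchase sequence and uses an inter-cluster distance; the single-centroid shortcut and the toy 2-D features here are illustrative assumptions.

```python
import numpy as np

def recommend(purchased, catalog, top_k=2):
    """Rank catalog items by cosine similarity to the centroid of the
    user's purchased items' feature vectors; return the top_k indices."""
    centroid = purchased.mean(axis=0)

    def cos(a, b):
        return a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12)

    scores = np.array([cos(centroid, c) for c in catalog])
    return np.argsort(scores)[::-1][:top_k]
```

    Because ranking happens in feature space rather than over co-purchase counts, brand-new images with no purchase history can still be recommended, which is the cold-start advantage the abstract highlights.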

  17. Local geometry variable conductance diffusion for post-reconstruction filtering

    Variable conductance diffusion (VCD) filtering can preserve edges while smoothing noise in an image. The threshold of the conductance function determines the degree to which a part of the image is smoothed. Traditionally, a constant threshold has been used. The use of a global threshold does not allow for adaptation to local variations within the image. The approach presented in this paper exploits the local geometry of the image and derives the threshold from the variations that are more likely caused by noise than by structural changes. The authors apply it to simulated noisy reconstructed single-photon emission computed tomographic (SPECT) image sets. For a particular voxel, if a consistent gradient direction is found within its neighborhood, then the variations on the plane perpendicular to the gradient direction are considered as noise and used to derive the threshold. The results show that, for the same average noise level in the liver, the image contrast from both local geometry and constant threshold VCD filters are higher than those from Butterworth filtering. The local geometry VCD filtering provides images with smoother boundaries than the constant threshold method. Moreover, the contrast loss is less sensitive to the tumor size for the local geometry method

  18. Modeling the filtration ability of stockpiled filtering facepiece

    Rottach, Dana R.

    2016-03-01

    Filtering facepiece respirators (FFR) are often stockpiled for use during public health emergencies such as an infectious disease outbreak or pandemic. While many stockpile administrators are aware of shelf life limitations, environmental conditions can lead to premature degradation. Filtration performance of a set of FFR retrieved from a storage room with failed environmental controls was measured. Though within the expected shelf life, the filtration ability of several respirators was degraded, allowing twice the penetration of fresh samples. The traditional picture of small particle capture by fibrous filter media qualitatively separates the effect of inertial impaction, interception from the streamline, diffusion, settling, and electrostatic attraction. Most of these mechanisms depend upon stable conformational properties. However, common FFR rely on electrets to achieve their high performance, and over time heat and humidity can cause the electrostatic media to degrade. An extension of the Langevin model with correlations to classical filtration concepts will be presented. The new computational model will be used to predict the change in filter effectiveness as the filter media changes with time.

  19. Trust and Traditions in Transitions

    McQuaid, Sara Dybris

    On New Year’s Eve 2013, months of talks on ‘Dealing with the past’, ‘Flags’ and ‘Parades’ ended without agreement on how to move towards a reconciliation of positions in Northern Ireland. The failure of the talks illustrates the importance of culture and (mis)trust in divided societies, where politics often pivot around whose culture shall be official and whose subordinated, whose history shall be remembered and whose forgotten (Jordan and Weedon 1995). These struggles are particularly intense in times of transition, where traditions, power relations and frames of relevant remembrance are ... recesses of memory in the current transition. And b) that patterns of ‘competitive commemoration’ in parades should be understood in relation to the increasing dissonance between vernacular languages of conflict and the official post-conflict discourses in Northern Ireland.

  20. Traditional Procurement is too Slow

    Ann Kong

    2012-11-01

    This paper reports on an exploratory interview survey of construction project participants aimed at identifying the reasons for the decrease in use of the traditional, lump-sum procurement system in Malaysia. The results show that most people believe it is too slow. This appears to be due in part to the contiguous nature of the various phases and stages of the process, and especially the separation of the design and construction phases. The delays caused by disputes between the various parties are also seen as a contributory factor, the most prominent cause being the frequency of variations, with design and scope changes a particular source of discontent. It is concluded that an up-scaling of the whole of the time-related reward/penalty system may be the most appropriate measure for the practice in future.