WorldWideScience

Sample records for traditional filtered back-projection

  1. Coronary CT angiography: image quality, diagnostic accuracy, and potential for radiation dose reduction using a novel iterative image reconstruction technique - comparison with traditional filtered back projection

    To compare image noise, image quality, and diagnostic accuracy of coronary CT angiography (cCTA) using a novel iterative reconstruction algorithm versus traditional filtered back projection (FBP), and to estimate the potential for radiation dose savings. Sixty-five consecutive patients (48 men; 59.3 ± 7.7 years) prospectively underwent cCTA and coronary catheter angiography (CCA). Full radiation dose data, using all projections, were reconstructed with FBP. To simulate image acquisition at half the radiation dose, 50% of the projections were discarded from the raw data. The resulting half-dose data were reconstructed with sinogram-affirmed iterative reconstruction (SAFIRE). Full-dose FBP and half-dose iterative reconstructions were compared with regard to image noise and image quality, and their respective accuracy for stenosis detection was compared against CCA. Compared with full-dose FBP, half-dose iterative reconstructions showed significantly (p = 0.001 to p = 0.025) lower image noise and slightly higher image quality. Iterative reconstruction improved the accuracy of stenosis detection compared with FBP (per-patient: accuracy 96.9% vs. 93.8%, sensitivity 100% vs. 100%, specificity 94.6% vs. 89.2%, NPV 100% vs. 100%, PPV 93.3% vs. 87.5%). Iterative reconstruction significantly reduces image noise without loss of diagnostic information and holds the potential for substantial radiation dose reduction in cCTA. (orig.)

  2. A Wiener filtering based back projection algorithm for image reconstruction

    In the context of computed tomography (CT), a key technique is image reconstruction from projection data. The filtered back projection (FBP) algorithm is commonly used for this purpose. Based on an analysis of the causes of artifacts, we propose a new image reconstruction algorithm combining a Wiener filter with the FBP algorithm. The conventional FBP reconstruction is improved by adopting the Wiener filter, and artifacts in the reconstructed images are markedly reduced. Experimental results for typical flow regimes show that the improved algorithm can effectively improve image quality. (authors)
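
    The record does not give the authors' filter design, so the following is only a minimal sketch of the general idea under stated assumptions: a generic Wiener denoiser (scipy.signal.wiener) is applied to a noisy sinogram before a standard FBP reconstruction, with scikit-image supplying the test phantom and the radon/iradon pair.

        import numpy as np
        from scipy.signal import wiener
        from skimage.data import shepp_logan_phantom
        from skimage.transform import radon, iradon

        rng = np.random.default_rng(0)
        image = shepp_logan_phantom()
        theta = np.linspace(0.0, 180.0, max(image.shape), endpoint=False)
        sinogram = radon(image, theta=theta)            # simulated projections
        noisy = sinogram + rng.normal(0.0, 2.0, sinogram.shape)

        # Wiener-denoise the sinogram, then reconstruct both versions with FBP
        # to compare artifact levels.
        denoised = wiener(noisy, mysize=5)
        recon_plain = iradon(noisy, theta=theta, filter_name='ramp')
        recon_wiener = iradon(denoised, theta=theta, filter_name='ramp')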

  3. An adaptive filtered back-projection for photoacoustic image reconstruction

    Huang, He; Bustamante, Gilbert; Peterson, Ralph; Ye, Jing Yong, E-mail: jingyong.ye@utsa.edu [Department of Biomedical Engineering, University of Texas at San Antonio, San Antonio, Texas 78249 (United States)

    2015-05-15

    Purpose: The purpose of this study is to develop an improved filtered-back-projection (FBP) algorithm for photoacoustic tomography (PAT), which allows image reconstruction with higher quality than images reconstructed with traditional algorithms. Methods: A rigorous expression for a weighting function has been derived directly from a photoacoustic wave equation and used as a ramp filter in the Fourier domain. The authors' new algorithm utilizes this weighting function to precisely calculate each photoacoustic signal's contribution and then reconstructs the image based on the retarded potential generated from the photoacoustic sources. In addition, an adaptive criterion has been derived for selecting the cutoff frequency of a low-pass filter. Two computational phantoms were created to test the algorithm. The first phantom contained five spheres, each with a different absorbance, and was used to test the capability to correctly represent both the geometry and the relative absorbed energy in a planar measurement system. The authors also used a second phantom containing absorbers of different sizes with overlapping geometry to evaluate the performance of the new method for complicated geometries. In addition, a random noise background was added to the simulated data, which were obtained using an arc-shaped array of 50 evenly distributed transducers that spanned 160° over a circle with a radius of 65 mm. A normalization factor between neighboring transducers was applied to correct the measurement signals in the PAT simulations. The authors assumed that the scanned object was mounted on a holder that rotated over the full 360° and that the scans were acquired at a sampling rate of 20.48 MHz. Results: The authors have obtained reconstructed images of the computerized phantoms by utilizing the new FBP algorithm. The reconstructed image of the first phantom shows that the new approach yields not only a sharp image but also the correct relative signal strengths of the absorbers. The reconstructed image of the second phantom further demonstrates the capability to form clear images of the spheres with sharp borders in the overlapping geometry. The smallest sphere is clearly visible and distinguishable, even though it is surrounded by two larger spheres. In addition, image reconstructions were conducted with randomized noise added to the observed signals to mimic realistic experimental conditions. Conclusions: The authors have developed a new FBP algorithm that is capable of reconstructing high-quality images with correct relative intensities and sharp borders for PAT. The results demonstrate that the weighting function serves as a precise ramp filter for processing the observed signals in the Fourier domain. In addition, the algorithm allows an adaptive determination of the cutoff frequency for the applied low-pass filter.
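
    The weighting function is derived in the paper from the photoacoustic wave equation and is not reproduced in this record, so the sketch below uses a plain ramp as a stand-in; the adaptive rule shown (cut off where the amplitude spectrum first falls to an estimated noise floor) is likewise an illustrative assumption, not the authors' derived criterion. With the record's 20.48 MHz sampling rate, dt would be 1/20.48e6 s.

        import numpy as np

        def adaptive_cutoff(signal, dt, noise_floor):
            # Choose the lowest frequency at which the amplitude spectrum
            # drops below the estimated noise floor; higher bins are noise.
            freqs = np.fft.rfftfreq(signal.size, d=dt)
            spectrum = np.abs(np.fft.rfft(signal))
            below = np.nonzero(spectrum < noise_floor)[0]
            return freqs[below[0]] if below.size else freqs[-1]

        def ramp_lowpass(signal, dt, cutoff):
            # Ramp-filter one transducer signal in the Fourier domain and
            # zero everything above the (adaptively chosen) cutoff.
            freqs = np.fft.rfftfreq(signal.size, d=dt)
            response = np.abs(freqs) * (freqs <= cutoff)
            return np.fft.irfft(np.fft.rfft(signal) * response, signal.size)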

  4. Filtered back-projection algorithm for Compton telescopes

    Gunter, Donald L.

    2008-03-18

    A method for the conversion of Compton camera data into a 2D image of the incident-radiation flux on the celestial sphere includes detecting coincident gamma radiation arriving from various directions on the 2-sphere. These events are mapped by back-projection onto the 2-sphere to produce a convolution integral, which is then stereographically projected onto a 2-plane to produce a second convolution integral. The latter is deconvolved by the Fourier method, and the resulting image is projected back onto the 2-sphere.
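
    As a rough illustration of the two planar steps described (this is not the patented method itself), the sketch below maps sphere points to the equatorial plane stereographically and deconvolves the planar convolution integral in the Fourier domain; the regularization constant eps is an added assumption to keep the division stable.

        import numpy as np

        def stereographic(theta, phi):
            # Map points on the unit 2-sphere (polar angle theta, azimuth
            # phi) to the equatorial plane, projecting from the pole
            # opposite the imaged hemisphere: r = tan(theta / 2).
            r = np.sin(theta) / (1.0 + np.cos(theta))
            return r * np.cos(phi), r * np.sin(phi)

        def fourier_deconvolve(blurred, kernel, eps=1e-3):
            # Wiener-style Fourier deconvolution of the second
            # convolution integral on the 2-plane.
            B = np.fft.fft2(blurred)
            K = np.fft.fft2(kernel, s=blurred.shape)
            return np.real(np.fft.ifft2(B * np.conj(K) / (np.abs(K) ** 2 + eps)))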

  5. A reconstruction algorithm for coherent scatter computed tomography based on filtered back-projection

    Coherent scatter computed tomography (CSCT) is a reconstructive x-ray imaging technique that yields the spatially resolved coherent-scatter form factor of the investigated object. Reconstruction from coherently scattered x-rays is commonly done using algebraic reconstruction techniques (ART). In this paper, we propose an alternative approach based on filtered back-projection. For the first time, a three-dimensional (3D) filtered back-projection technique using curved 3D back-projection lines is applied to two-dimensional coherent scatter projection data. The proposed algorithm is tested with simulated projection data as well as with projection data acquired with a demonstrator setup similar to a multi-line CT scanner geometry. While yielding image quality comparable to ART reconstruction, the modified 3D filtered back-projection algorithm is about two orders of magnitude faster. In contrast to iterative reconstruction schemes, it has the advantage that subfield-of-view reconstruction becomes feasible, allowing selective reconstruction of the coherent-scatter form factor for a region of interest. The proposed modified 3D filtered back-projection algorithm is a powerful reconstruction technique to be implemented in a CSCT scanning system, and it gives coherent scatter CT the potential of becoming a competitive modality for medical imaging or nondestructive testing.

  6. An investigation of filter choice for filtered back-projection reconstruction in PET

    A key parameter in the practical application of filtered back-projection (FBP), the standard clinical image reconstruction algorithm for positron emission tomography (PET), is the choice of a low-pass filter window function and its cut-off frequency. However, the filter windows and cut-off frequencies for clinical reconstruction are usually chosen empirically, based on a small sample of images and filters. By considering the features of the signal and noise spectra in a sinogram, the desired image resolution, and the signal-to-noise ratio (SNR) of the filtered sinogram, a methodology for informed selection of a filter function and cut-off frequency for FBP was investigated. Simulations of sinogram data similar to whole-body or cardiac studies provided information on the signal and noise frequency-domain spectra of noisy projection data. The improvements in SNR with different filter windows and cut-off frequencies were evaluated and compared. The projection-spectrum SNR measure did not prove to be an accurate indicator of subjective image quality or lesion detectability under variations in Poisson noise and image resolution.
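
    For concreteness, here is a small sketch of the kind of windowed ramp filters such a study compares. The Hann and Butterworth forms are standard textbook choices; the order-10 exponent and the default cutoff are arbitrary illustrative values, not the paper's. Each projection row would be filtered by multiplying its FFT with this response before back-projection.

        import numpy as np

        def windowed_ramp(n, dt, window='hann', cutoff=0.8):
            # Frequency response of a ramp filter apodized by a low-pass
            # window; cutoff is a fraction of the Nyquist frequency.
            f = np.fft.rfftfreq(n, d=dt)
            fn = f / f[-1]                      # normalized to Nyquist
            ramp = np.abs(f)
            if window == 'hann':
                w = np.where(fn < cutoff,
                             0.5 * (1.0 + np.cos(np.pi * fn / cutoff)), 0.0)
            elif window == 'butterworth':
                w = 1.0 / (1.0 + (fn / cutoff) ** 10)
            else:                               # plain ramp, hard cutoff
                w = (fn <= cutoff).astype(float)
            return ramp * w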

  7. PET reconstruction artifact can be minimized by using sinogram correction and filtered back-projection technique

    Jha, Ashish Kumar; Purandare, Nilendu C; Shah, Sneha; Agrawal, Archi; Puranik, Ameya D; Rangarajan, Venkatesh

    2014-01-01

    Filtered back-projection (FBP) has become an outdated image reconstruction technique in new-generation positron emission tomography (PET)/computed tomography (CT) scanners. Iterative reconstruction, used in all new-generation PET scanners, is a much improved reconstruction technique. Although only a well-calibrated PET system should normally be used for clinical imaging, in a few situations like ours, when a compromised PET scanner with one PET module bypassed was used for PET acquisition, FBP with sinogram correction proved to be the better reconstruction technique, minimizing the streak artifact present in images reconstructed with the iterative technique. PMID: 25024515
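
    The record does not spell out the correction algorithm; one simple stand-in, sketched below, is to fill the sinogram bins lost to the bypassed module by linear interpolation along each projection before running FBP, so the gap no longer back-projects as a streak.

        import numpy as np

        def correct_sinogram(sino, dead_mask):
            # sino: (radial bins, angles); dead_mask is True where the
            # bypassed module recorded no counts. Fill each projection's
            # gap by linear interpolation over the surviving bins.
            fixed = sino.astype(float).copy()
            idx = np.arange(sino.shape[0])
            for j in range(sino.shape[1]):
                bad = dead_mask[:, j]
                if bad.any() and not bad.all():
                    fixed[bad, j] = np.interp(idx[bad], idx[~bad],
                                              fixed[~bad, j])
            return fixed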

  8. Two-dimensional water temperature reconstruction by filtered back-projection method

    The reconstruction of water temperature in combustion is realized by the tunable diode absorption spectroscopy technique. The model for the H2O temperature distribution is assumed to be a Gaussian function, ranging from 300 K to 1300 K. The Radon transform is used to simulate the experimental measurements. The temperature distribution is reconstructed from two temperature-dependent line strengths using the filtered back-projection method, and the result agrees well with the original model. Moreover, the influences of the number of projections and of random errors in the projections on the reconstruction are also studied. The simulation results indicate that decreasing the projection number or increasing the noise raises the mean square error of the reconstructed temperature, deteriorating the reconstructed image. The temperature reconstruction cannot reveal the original temperature distribution when the projection number is reduced to four. (authors)
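
    A minimal sketch of the described pipeline, with illustrative Boltzmann-type line-strength functions standing in for real H2O line strengths (which would come from a spectroscopic database): both line-strength fields are reconstructed by FBP and the temperature is recovered from their ratio.

        import numpy as np
        from skimage.transform import radon, iradon

        # Gaussian temperature phantom, 300-1300 K, as in the simulation.
        n = 128
        y, x = np.mgrid[-1:1:n * 1j, -1:1:n * 1j]
        T = 300.0 + 1000.0 * np.exp(-(x ** 2 + y ** 2) / 0.18)
        inside = x ** 2 + y ** 2 < 1.0      # data only inside the scan circle

        # Hypothetical temperature-dependent line strengths S1(T), S2(T).
        S1 = np.where(inside, np.exp(-1000.0 / T), 0.0)
        S2 = np.where(inside, np.exp(-2500.0 / T), 0.0)

        theta = np.linspace(0.0, 180.0, 180, endpoint=False)
        rec1 = iradon(radon(S1, theta=theta), theta=theta, filter_name='ramp')
        rec2 = iradon(radon(S2, theta=theta), theta=theta, filter_name='ramp')

        # Two-line ratio R = S2/S1 = exp(-1500/T), hence T = -1500 / ln(R).
        R = np.clip(rec2, 1e-9, None) / np.clip(rec1, 1e-9, None)
        T_rec = -1500.0 / np.log(np.clip(R, 1e-9, 1.0 - 1e-9))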

  9. Comparison of adaptive statistical iterative and filtered back projection reconstruction techniques in brain CT

    Ren, Qingguo, E-mail: renqg83@163.com [Department of Radiology, Hua Dong Hospital of Fudan University, Shanghai 200040 (China); Dewan, Sheilesh Kumar, E-mail: sheilesh_d1@hotmail.com [Department of Geriatrics, Hua Dong Hospital of Fudan University, Shanghai 200040 (China); Li, Ming, E-mail: minli77@163.com [Department of Radiology, Hua Dong Hospital of Fudan University, Shanghai 200040 (China); Li, Jianying, E-mail: Jianying.Li@med.ge.com [CT Imaging Research Center, GE Healthcare China, Beijing (China); Mao, Dingbiao, E-mail: maodingbiao74@163.com [Department of Radiology, Hua Dong Hospital of Fudan University, Shanghai 200040 (China); Wang, Zhenglei, E-mail: Williswang_doc@yahoo.com.cn [Department of Radiology, Shanghai Electricity Hospital, Shanghai 200050 (China); Hua, Yanqing, E-mail: cjr.huayanqing@vip.163.com [Department of Radiology, Hua Dong Hospital of Fudan University, Shanghai 200040 (China)

    2012-10-15

    Purpose: To compare image quality and visualization of normal structures and lesions in brain computed tomography (CT) with adaptive statistical iterative reconstruction (ASIR) and filtered back projection (FBP) reconstruction techniques at different X-ray tube current–time products. Materials and methods: In this IRB-approved prospective study, forty patients (nineteen men, twenty-one women; mean age 69.5 ± 11.2 years) received brain scans at different tube current–time products (300 and 200 mAs) on a 64-section multi-detector CT (GE, Discovery CT750 HD). Images were reconstructed with FBP and four levels of ASIR-FBP blending. Two radiologists (our hospital is renowned for its geriatric medicine department, and these two radiologists are more experienced in chronic cerebrovascular disease than in neoplastic disease, so this study did not include cerebral tumors) assessed all the reconstructed images for visibility of normal structures, lesion conspicuity, image contrast, and diagnostic confidence in a blinded and randomized manner. Volume CT dose index (CTDIvol) and dose-length product (DLP) were recorded. All data were analyzed using SPSS 13.0 statistical analysis software. Results: There was no statistically significant difference between image quality at 200 mAs with the 50% ASIR blending technique and at 300 mAs with the FBP technique (p > .05), whereas a statistically significant difference (p < .05) was found between image quality at 200 mAs and at 300 mAs with the FBP technique. Conclusion: ASIR provided the same image quality and diagnostic ability in brain imaging with a greater than 30% dose reduction compared with the FBP reconstruction technique.

  10. CT coronary angiography: Image quality with sinogram-affirmed iterative reconstruction compared with filtered back-projection

    Aim: To investigate image quality and the potential for radiation dose reduction using sinogram-affirmed iterative reconstruction (SAFIRE) at computed tomography (CT) coronary angiography (CTCA) compared with filtered back-projection (FBP) reconstruction. Materials and methods: A water phantom and 49 consecutive patients were scanned using a retrospectively electrocardiography (ECG)-gated CTCA protocol on a dual-source CT system. Image reconstructions were performed with both conventional FBP and SAFIRE. The SAFIRE series were reconstructed from the data of only one tube, simulating a 50% radiation dose reduction. Two blinded observers independently assessed the image quality of each coronary segment using a four-point scale and measured image noise (the standard deviation of Hounsfield values, SD), signal-to-noise ratio (SNR), and contrast-to-noise ratio (CNR). Radiation dose estimates were calculated. Results: In the water phantom, image noise decreased at the same rate as the tube current increased for both reconstruction algorithms. Despite an estimated radiation dose reduction from 7.9 ± 2.8 to 4 ± 1.4 mSv, there was no significant difference in the SD and SNR within the aortic root and left ventricular chamber between the two reconstruction methods. There was also no significant difference in image quality between the FBP and SAFIRE series. Conclusion: Compared with traditional FBP, there is potential for substantial radiation dose reduction at CTCA with use of SAFIRE, while maintaining similar diagnostic image quality.
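
    The noise, SNR, and CNR figures quoted here follow the usual ROI definitions (noise as the SD of Hounsfield values in a signal ROI); a small sketch, assuming the ROI masks are already given:

        import numpy as np

        def roi_stats(img, signal_mask, background_mask):
            # Noise = SD of HU values in the signal ROI; SNR and CNR per
            # the standard definitions used in such studies.
            s = img[signal_mask]
            b = img[background_mask]
            noise = s.std()
            return noise, s.mean() / noise, (s.mean() - b.mean()) / noise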

  11. Filtered back-projection reconstruction for attenuation proton CT along most likely paths

    Quiñones, C. T.; Létang, J. M.; Rit, S.

    2016-05-01

    This work investigates the attenuation of a proton beam to reconstruct the map of the linear attenuation coefficient of a material, an attenuation that is mainly caused by the inelastic interactions of protons with matter. Attenuation proton computed tomography (pCT) suffers from poor spatial resolution due to multiple Coulomb scattering (MCS) of protons in matter, similarly to conventional energy-loss pCT. We therefore adapted a recent filtered back-projection algorithm along the most likely path (MLP) of protons, developed for energy-loss pCT (Rit et al 2013), to attenuation pCT, assuming a pCT scanner that can track the position and the direction of protons before and after the scanned object. Monte Carlo simulations of pCT acquisitions of density and spatial-resolution phantoms were performed to characterize the new algorithm using Geant4 (via Gate). Attenuation pCT assumes an energy-independent inelastic cross-section; the energy dependence of the inelastic cross-section below 100 MeV produced a capping artifact when the residual energy behind the object was below 100 MeV. The statistical limitation was determined analytically, and it was found that the noise in attenuation pCT images is 411 times and 278 times higher than the noise in energy-loss pCT images for the same imaging dose at 200 MeV and 300 MeV, respectively. Comparison of the spatial resolution of attenuation pCT images with a conventional straight-line path binning showed that incorporating the MLP estimates during reconstruction improves the spatial resolution of attenuation pCT. Moreover, regardless of the significant noise in attenuation pCT images, the spatial resolution of attenuation pCT was better than that of conventional energy-loss pCT in some of the studied situations, thanks to the interplay of MCS and attenuation known as the West–Sherwood effect.

  12. Single Image Super-Resolution via Iterative Back Projection Based Canny Edge Detection and a Gabor Filter Prior

    Rujul R Makwana

    2013-03-01

    Iterative back-projection (IBP) is a classical super-resolution method with low computational complexity that can be applied in real-time applications. This paper presents an effective novel single-image super-resolution approach to recover a high-resolution image from a single low-resolution input image. The approach is based on an iterative back-projection (IBP) method combined with Canny edge detection and a Gabor filter to recover high-frequency information. The method is applied to different natural gray images and compared with existing image super-resolution approaches. Simulation results show that the proposed algorithm can more accurately enlarge the low-resolution image than previous approaches: it increases MSSIM and PSNR, decreases MSE compared with other existing algorithms, and also improves the visual quality of the enlarged images.
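
    Setting the Canny edge detection and Gabor prior aside, the classical IBP core that the method builds on is compact. In the sketch below, the Gaussian blur width, iteration count, and cubic interpolation are illustrative assumptions, not the paper's settings.

        import numpy as np
        from scipy.ndimage import gaussian_filter, zoom

        def ibp_superresolve(lr, scale=2, iters=10, blur=1.0):
            # Classical iterative back-projection: refine the HR estimate
            # until its simulated LR version matches the observed image.
            hr = zoom(lr.astype(float), scale, order=3)   # initial upsample
            for _ in range(iters):
                simulated = zoom(gaussian_filter(hr, blur),
                                 1.0 / scale, order=3)
                hr += zoom(lr - simulated, scale, order=3)  # back-project error
            return hr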

  13. Comprehensive analysis of high-performance computing methods for filtered back-projection

    Mendl, Christian B.; Eliuk, Steven; Noga, Michelle; Boulanger, Pierre

    2013-01-01

    This paper provides an extensive runtime, accuracy, and noise analysis of Computed Tomography (CT) reconstruction algorithms using various High-Performance Computing (HPC) frameworks, such as: “conventional” multi-core, multi-threaded CPUs, Compute Unified Device Architecture (CUDA), and DirectX or OpenGL graphics pipeline programming. The proposed algorithms exploit various built-in hardwired features of GPUs such as rasterization and texture filtering. We compare implementations of the Filt...
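
    As context for what those GPU kernels compute, here is a plain NumPy/SciPy analogue of the back-projection inner loop (a sketch only): map_coordinates plays the role of the hardware texture filtering that the paper exploits, with each pixel sampling every filtered projection at its detector coordinate.

        import numpy as np
        from scipy.ndimage import map_coordinates

        def backproject(filtered_sino, theta_deg, size):
            # filtered_sino: (detector bins, angles), already ramp-filtered.
            recon = np.zeros((size, size))
            c = (size - 1) / 2.0
            y, x = np.mgrid[0:size, 0:size] - c
            center = (filtered_sino.shape[0] - 1) / 2.0
            for i, th in enumerate(np.deg2rad(theta_deg)):
                t = x * np.cos(th) + y * np.sin(th) + center
                recon += map_coordinates(filtered_sino[:, i], [t.ravel()],
                                         order=1).reshape(size, size)
            return recon * np.pi / (2.0 * len(theta_deg))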

  14. Evaluation of dose reduction and image quality in CT colonography: Comparison of low-dose CT with iterative reconstruction and routine-dose CT with filtered back projection

    Nagata, Koichi [Kameda Medical Center, Department of Radiology, Kamogawa, Chiba (Japan); Jichi Medical University, Department of Radiology, Tochigi (Japan); National Cancer Center, Cancer Screening Technology Division, Research Center for Cancer Prevention and Screening, Tokyo (Japan); Fujiwara, Masanori; Mogi, Tomohiro; Iida, Nao [Kameda Medical Center Makuhari, Department of Radiology, Chiba (Japan); Kanazawa, Hidenori; Sugimoto, Hideharu [Jichi Medical University, Department of Radiology, Tochigi (Japan); Mitsushima, Toru [Kameda Medical Center Makuhari, Department of Gastroenterology, Chiba (Japan); Lefor, Alan T. [Jichi Medical University, Department of Surgery, Tochigi (Japan)

    2015-01-15

    To prospectively evaluate the radiation dose and image quality of low-dose CT colonography (CTC) reconstructed using different levels of iterative reconstruction techniques, compared with routine-dose CTC reconstructed with filtered back projection. Following institutional ethics clearance and informed consent procedures, 210 patients underwent screening CTC using automatic tube current modulation for dual positions. Examinations were performed in the supine position with a routine-dose protocol and in the prone position, randomly applying four different low-dose protocols. Supine images were reconstructed with filtered back projection and prone images with iterative reconstruction. Two blinded observers assessed the image quality of endoluminal images. Image noise was quantitatively assessed by region-of-interest measurements. The mean effective dose in the supine series was 1.88 mSv using routine-dose CTC, compared to 0.92, 0.69, 0.57, and 0.46 mSv at four different low doses in the prone series (p < 0.01). Overall image quality and noise of low-dose CTC with iterative reconstruction were significantly improved compared to routine-dose CTC using filtered back projection. The lowest-dose group had image quality comparable to routine-dose images. Low-dose CTC with iterative reconstruction reduces the radiation dose by 48.5 to 75.1% without image quality degradation compared to routine-dose CTC with filtered back projection. (orig.)

  15. Comparison of pure and hybrid iterative reconstruction techniques with conventional filtered back projection: Image quality assessment in the cervicothoracic region

    Objectives: To evaluate the impact on image quality of three different image reconstruction techniques in the cervicothoracic region: model-based iterative reconstruction (MBIR), adaptive statistical iterative reconstruction (ASIR), and filtered back projection (FBP). Methods: Forty-four patients underwent unenhanced standard-of-care clinical computed tomography (CT) examinations that included the cervicothoracic region on a 64-row multidetector CT scanner. Images were reconstructed with FBP, 50% ASIR-FBP blending (ASIR50), and MBIR. Two radiologists assessed the cervicothoracic region in a blinded manner for streak artifacts, pixelated blotchy appearances, critical reproduction of visually sharp anatomical structures (thyroid gland, common carotid artery, and esophagus), and overall diagnostic acceptability. Objective image noise was measured in the internal jugular vein. Data were analyzed using the sign test and pair-wise Student's t-test. Results: MBIR images had significantly lower quantitative image noise (8.88 ± 1.32) than ASIR images (18.63 ± 4.19) and FBP images, and significantly reduced streak artifacts, whereas ASIR and FBP did not differ in this respect (P > 0.9 for ASIR vs. FBP for both readers). MBIR images were all diagnostically acceptable. Unique features of MBIR images included pixelated blotchy appearances, which did not adversely affect diagnostic acceptability. Conclusions: MBIR significantly improves image noise and streak artifacts of the cervicothoracic region over ASIR and FBP. MBIR is expected to enhance the value of CT examinations for areas where image noise and streak artifacts are problematic.

  16. Chest computed tomography using iterative reconstruction vs filtered back projection (Part 1): evaluation of image noise reduction in 32 patients

    To assess the noise reduction achievable with an iterative reconstruction algorithm. Thirty-two consecutive chest CT angiograms were reconstructed with regular filtered back projection (FBP) (Group 1) and with an iterative reconstruction technique (IRIS) using 3 (Group 2a) and 5 (Group 2b) iterations. Objective image noise was significantly reduced in Group 2a and Group 2b compared with FBP (p < 0.0001). There was a significant reduction in the level of subjective image noise in Group 2a compared with Group 1 images (p < 0.003), further reinforced in Group 2b images (Group 2b vs Group 1, p < 0.0001; Group 2b vs Group 2a, p = 0.0006). The overall image quality scores significantly improved in Group 2a images compared with Group 1 images (p = 0.0081) and in Group 2b images compared with Group 2a images (p < 0.0001). Comparative analysis of individual CT features of mild lung infiltration showed improved conspicuity of ground-glass attenuation (p < 0.0001), ill-defined micronodules (p = 0.0351), and emphysematous lesions (p < 0.0001) in Group 2a images, further improved in Group 2b images for ground-glass attenuation (p < 0.0001) and emphysematous lesions (p = 0.0087). Compared with regular FBP, iterative reconstruction enables a significant reduction of image noise without loss of diagnostic information, thus having the potential to decrease radiation dose during chest CT examinations. (orig.)

  17. Detection of Urothelial Carcinoma: Comparison of Reduced-Dose Iterative Reconstruction with Standard-Dose Filtered Back Projection.

    Bahn, Young Eun; Kim, See Hyung; Kim, Mi Jeong; Kim, Chan Sun; Kim, Young Hwan; Cho, Seung Hyun

    2016-05-01

    Purpose: To prospectively assess radiation dose, image quality, and diagnostic performance of computed tomography (CT) urography for detection of urothelial carcinomas by performing reduced-dose scanning with iterative reconstruction (IR) compared with standard-dose scanning with filtered back projection (FBP). Materials and methods: The institutional review board approved the study, with written informed patient consent. In total, 2163 patients at high risk for urothelial carcinomas randomly underwent standard-dose scanning with FBP (protocol A, 120 kVp for >80 kg body weight; protocol B, 100 kVp for 50-80 kg body weight) or reduced-dose scanning with IR (protocol C, 100 kVp for >80 kg body weight; protocol D, 80 kVp for 50-80 kg body weight). Objective image quality (signal-to-noise ratio and contrast-to-noise ratio) between the two groups with the same weight range was measured for various regions of interest. Subjective image quality (visual image noise, artifact, ureter depiction, and overall image quality) and diagnostic accuracy (per lesion and per patient) were assessed with three- and five-point scores, respectively. Results: The size-specific dose estimate (protocol A vs protocol C, 24.2 mGy vs 19.2 mGy, respectively; protocol B vs protocol D, 13.9 mGy vs 8.8 mGy, respectively) was significantly lower with reduced-dose scanning (P < .05), with no significant loss of image quality except for artifacts in protocols B and D (range, 4-5 vs 3-4). Reduced-dose CT urography with IR thus provided comparable image quality for the detection of urothelial carcinomas, except for some artifacts in 80-kVp scanning. © RSNA, 2015. PMID: 26566141

  18. Coronary CT angiography: Comparison of a novel iterative reconstruction with filtered back projection for reconstruction of low-dose CT—Initial experience

    Takx, Richard A.P. [Department of Radiology and Radiological Science, Medical University of South Carolina, Charleston, SC (United States); Department of Radiology, Maastricht University Medical Centre, Maastricht (Netherlands); Schoepf, U. Joseph, E-mail: schoepf@musc.edu [Department of Radiology and Radiological Science, Medical University of South Carolina, Charleston, SC (United States); Division of Cardiology, Department of Medicine, Medical University of South Carolina, Charleston, SC (United States); Moscariello, Antonio [Department of Radiology and Radiological Science, Medical University of South Carolina, Charleston, SC (United States); Department of Radiology, Policlinico Universitario Campus Bio-Medico, Rome (Italy); Das, Marco [Department of Radiology, Maastricht University Medical Centre, Maastricht (Netherlands); Rowe, Garrett [Department of Radiology and Radiological Science, Medical University of South Carolina, Charleston, SC (United States); Schoenberg, Stefan O.; Fink, Christian [Institute of Clinical Radiology and Nuclear Medicine, University Medical Center Mannheim, Medical Faculty Mannheim, Heidelberg University (Germany); Henzler, Thomas [Department of Radiology and Radiological Science, Medical University of South Carolina, Charleston, SC (United States); Institute of Clinical Radiology and Nuclear Medicine, University Medical Center Mannheim, Medical Faculty Mannheim, Heidelberg University (Germany)

    2013-02-15

    Objective: To prospectively compare subjective and objective image quality in 20% tube current coronary CT angiography (cCTA) datasets between an iterative reconstruction algorithm (SAFIRE) and traditional filtered back projection (FBP). Materials and methods: Twenty patients underwent a prospectively ECG-triggered dual-step cCTA protocol using 2nd generation dual-source CT (DSCT). CT raw data were reconstructed using standard FBP at full dose (Group 1a) and at 80% reduced tube current (Group 1b). The low-dose raw data were additionally reconstructed using iterative raw data reconstruction (Group 2). Attenuation and image noise were measured in three regions of interest, and signal-to-noise ratio (SNR) as well as contrast-to-noise ratio (CNR) were calculated. Subjective diagnostic image quality was evaluated using a 4-point Likert scale. Results: Mean image noise in Group 2 was lowered by 22% on average when compared to Group 1b (p < 0.0001–0.0033), while there were no significant differences in mean attenuation within the same anatomical regions. The lower image noise resulted in significantly higher SNR and CNR in Group 2 compared to Group 1b (p < 0.0001–0.0232). Subjective image quality of Group 2 (1.88 ± 0.63) was also rated significantly higher than that of Group 1b (1.58 ± 0.63, p = 0.004). Conclusions: Image quality of 80% tube current reduced, iteratively reconstructed cCTA raw data is significantly improved when compared to standard FBP and consequently may improve the diagnostic accuracy of cCTA.

  19. Ultra low-dose chest CT using filtered back projection: Comparison of 80-, 100- and 120 kVp protocols in a prospective randomized study

    Khawaja, Ranish Deedar Ali, E-mail: rkhawaja@mgh.harvard.edu [Division of Thoracic Radiology, MGH Imaging, Massachusetts General Hospital and Harvard Medical School, Boston (United States); Singh, Sarabjeet [Division of Thoracic Radiology, MGH Imaging, Massachusetts General Hospital and Harvard Medical School, Boston (United States); Madan, Rachna [Division of Thoracic Radiology, Brigham and Women's Hospital and Harvard Medical School, Boston (United States); Sharma, Amita; Padole, Atul; Pourjabbar, Sarvenaz; Digumarthy, Subba; Shepard, Jo-Anne; Kalra, Mannudeep K. [Division of Thoracic Radiology, MGH Imaging, Massachusetts General Hospital and Harvard Medical School, Boston (United States)

    2014-10-15

    Highlights: • The filtered back projection technique enables acceptable image quality for chest CT examinations at 0.9 mGy (estimated effective dose of 0.5 mSv) for selected sizes of patients. • Lesion detection (such as solid non-calcified lung nodules) in lung parenchyma is optimal at 0.9 mGy, with limited visualization of thyroid nodules in FBP images. • Further dose reduction down to 0.4 mGy is possible for most patients undergoing follow-up chest CT for evaluation of larger lung nodules and GGOs. • Our results may help set the reference ALARA dose for chest CT examinations reconstructed with the filtered back projection technique using the minimum possible radiation dose with acceptable image quality and lesion detection. - Abstract: Purpose: To assess lesion detection and diagnostic image quality of the filtered back projection (FBP) reconstruction technique in ultra low-dose chest CT examinations. Methods and materials: In this IRB-approved ongoing prospective clinical study, 116 CT image series at four different radiation doses were performed in 29 patients (age, 57–87 years; F:M, 15:12; BMI, 16–32 kg/m2). All patients provided written informed consent for the acquisition of additional ultra low-dose (ULD) series on a 256-slice MDCT (iCT, Philips Healthcare). In addition to their clinical standard-dose chest CT (SD, 120 kV, mean CTDIvol 6 ± 1 mGy), ULD-CT was subsequently performed at three dose levels (0.9 mGy [120 kV], 0.5 mGy [100 kV], and 0.2 mGy [80 kV]). Images were reconstructed with FBP (2.5 mm * 1.25 mm), resulting in four stacks: SD-FBP (reference standard), FBP0.9, FBP0.5, and FBP0.2. Four thoracic radiologists from two teaching hospitals independently evaluated the data for lesion detection and visibility of small structures. Friedman's non-parametric test with post hoc Dunn's test was used for data analysis. Results: Interobserver agreement was substantial between radiologists (k = 0.6–0.8). In the pooled analysis, 146 pulmonary lesions (27 ground-glass opacities, 64 solid lung nodules, 7 consolidations, 27 emphysema) and 347 mediastinal/soft tissue lesions (87 mediastinal, 46 hilar, and 62 axillary lymph nodes, and 11 mediastinal masses) were evaluated. Compared to SD-FBP, 100% of pulmonary lesions were seen with FBP0.9, up to 81% with FBP0.5 (missed: 4), and up to 30% with FBP0.2 images (missed: 16). Compared to SD-FBP, all enlarged mediastinal lymph nodes were seen with FBP0.9 images. All mediastinal masses (>2 cm, 11/11) were seen equivalently to SD-FBP images at 0.9 mGy. Across all patient sizes, FBP0.9 images provided optimal visualization of lung findings; they were optimal for mediastinal soft tissues only in non-obese patients. Conclusion: The filtered back projection technique allows optimal lesion detection and acceptable image quality for chest CT examinations at a CTDIvol of 0.9 mGy for lung and mediastinal findings in selected sizes of patients.

  20. Investigation of the quantitative accuracy of 3D iterative reconstruction algorithms in comparison to filtered back projection method: a phantom study

    Abuhadi, Nouf; Bradley, David; Katarey, Dev; Podolyak, Zsolt; Sassi, Salem

    2014-03-01

    Introduction: Single-photon emission computed tomography (SPECT) is used to measure and quantify radiopharmaceutical distribution within the body. The accuracy of quantification depends on acquisition parameters and reconstruction algorithms. Until recently, most SPECT images were reconstructed using filtered back projection techniques with no attenuation or scatter corrections. The introduction of 3-D iterative reconstruction algorithms, together with the availability of both computed tomography (CT)-based attenuation correction and scatter correction, may provide for more accurate measurement of radiotracer bio-distribution. The effect of attenuation and scatter corrections on the accuracy of SPECT measurements is well researched, and it has been suggested that the combination of CT-based attenuation correction and scatter correction can allow for more accurate quantification of radiopharmaceutical distribution in SPECT studies (Bushberg et al., 2012). However, the effect of respiratory-induced cardiac motion on SPECT images acquired using higher-resolution algorithms such as 3-D iterative reconstruction with attenuation and scatter corrections has not been investigated. Aims: To investigate the quantitative accuracy of 3D iterative reconstruction algorithms in comparison to filtered back projection (FBP) methods implemented on cardiac SPECT/CT imaging with and without CT-based attenuation and scatter corrections; to investigate the effects of respiratory-induced cardiac motion on myocardial perfusion quantification; and to present a comparison of spatial resolution for FBP and ordered subset expectation maximization (OSEM) Flash 3D, with and without respiratory-induced motion and with and without attenuation and scatter correction. Methods: This study was performed on a Siemens Symbia T16 SPECT/CT system using clinical acquisition protocols. Respiratory-induced cardiac motion was simulated by imaging a cardiac phantom insert while moving it with a respiratory motion motor inducing cyclical elliptical motion of the apex of the cardiac insert. Results: Our analyses revealed that the use of the Flash 3D reconstruction algorithm without scatter or attenuation correction improved spatial resolution by 30% relative to FBP. The reduction in spatial resolution due to respiratory-induced motion was 12% and 38% for FBP and Flash 3D, respectively. The implementation of scatter correction resulted in a reduction in resolution of up to 6%. The application of CT-based attenuation correction resulted in 13% and 26% reductions in spatial resolution for SPECT images reconstructed using the FBP and Flash 3D algorithms, respectively. Conclusion: We conclude that iterative reconstruction (Flash 3D) provides a significant improvement in image spatial resolution; however, as a result, the effects of respiratory-induced motion have become more evident, and correction for this is required before the full potential of these algorithms can be realised for myocardial perfusion imaging. Attenuation and scatter correction can improve image contrast, but may have a significant detrimental effect on spatial resolution.
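
    Flash 3D itself is Siemens' proprietary OSEM variant, so the sketch below shows only the textbook ML-EM update (OSEM with a single subset), using scikit-image's radon/iradon as a matched forward/back-projector pair; it conveys the iterative principle being compared against FBP, not the vendor implementation.

        import numpy as np
        from skimage.transform import radon, iradon

        def mlem(sino, theta, n_iter=20):
            # Textbook ML-EM: multiply the current estimate by the
            # back-projected ratio of measured to forward-projected data.
            n = sino.shape[0]
            yy, xx = np.mgrid[0:n, 0:n] - (n - 1) / 2.0
            recon = (xx ** 2 + yy ** 2 <= ((n - 1) / 2.0) ** 2).astype(float)
            sens = np.maximum(iradon(np.ones_like(sino), theta=theta,
                                     filter_name=None), 1e-9)
            for _ in range(n_iter):
                fp = np.maximum(radon(recon, theta=theta), 1e-9)
                recon *= iradon(sino / fp, theta=theta,
                                filter_name=None) / sens
            return recon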

  21. Comparison of applied dose and image quality in staging CT of neuroendocrine tumor patients using standard filtered back projection and adaptive statistical iterative reconstruction

    Böning, G., E-mail: georg.boening@charite.de [Department of Radiology, Charité, Humboldt-University Medical School, Charitéplatz 1, 10117 Berlin (Germany); Schäfer, M.; Grupp, U. [Department of Radiology, Charité, Humboldt-University Medical School, Charitéplatz 1, 10117 Berlin (Germany); Kaul, D. [Department of Radiation Oncology, Charité, Humboldt-University Medical School, Charitéplatz 1, 10117 Berlin (Germany); Kahn, J. [Department of Radiology, Charité, Humboldt-University Medical School, Charitéplatz 1, 10117 Berlin (Germany); Pavel, M. [Department of Gastroenterology, Charité, Humboldt-University Medical School, Charitéplatz 1, 10117 Berlin (Germany); Maurer, M.; Denecke, T.; Hamm, B.; Streitparth, F. [Department of Radiology, Charité, Humboldt-University Medical School, Charitéplatz 1, 10117 Berlin (Germany)

    2015-08-15

    Highlights: • Iterative reconstruction (IR) in staging CT provides objective image quality equal to filtered back projection (FBP). • IR delivers excellent subjective quality and reduces effective dose compared to FBP. • In patients with neuroendocrine tumor (NET) or many other hypervascular abdominal tumors, IR can be used without sacrificing diagnostic confidence. - Abstract: Objective: To investigate whether dose reduction via adaptive statistical iterative reconstruction (ASIR) affects image quality and diagnostic accuracy in neuroendocrine tumor (NET) staging. Methods: A total of 28 NET patients were enrolled in the study. Inclusion criteria were histologically proven NET and visible tumor on abdominal computed tomography (CT). In an intraindividual study design, the patients underwent a baseline CT (filtered back projection, FBP) and follow-up CT (ASIR 40%) using matched scan parameters. Image quality was assessed subjectively using a 5-grade scoring system and objectively by determining signal-to-noise ratio (SNR) and contrast-to-noise ratios (CNRs). The applied volume computed tomography dose index (CTDIvol) of each scan was taken from the dose report. Results: ASIR 40% significantly reduced CTDIvol (10.17 ± 3.06 mGy [FBP] vs 6.34 ± 2.25 mGy [ASIR]; p < 0.001), by 37.6%, and significantly increased CNRs (complete tumor-to-liver, 2.76 ± 1.87 [FBP] vs 3.2 ± 2.32 [ASIR]; p < 0.05) (complete tumor-to-muscle, 2.74 ± 2.67 [FBP] vs 4.31 ± 4.61 [ASIR]; p < 0.05) compared to FBP. Subjective scoring revealed no significant changes for diagnostic confidence (5.0 ± 0 [FBP], 5.0 ± 0 [ASIR]), visibility of suspicious lesions (4.8 ± 0.5 [FBP], 4.8 ± 0.5 [ASIR]), and artifacts (5.0 ± 0 [FBP], 5.0 ± 0 [ASIR]). ASIR 40% significantly decreased scores for noise (4.3 ± 0.6 [FBP], 4.0 ± 0.8 [ASIR]; p < 0.05), contrast (4.4 ± 0.6 [FBP], 4.1 ± 0.8 [ASIR]; p < 0.001), and visibility of small structures (4.5 ± 0.7 [FBP], 4.3 ± 0.8 [ASIR]; p < 0.001). Conclusion: In clinical practice, ASIR can be used to reduce radiation dose without sacrificing image quality and diagnostic confidence in staging CT of NET patients. This may be beneficial for patients with frequent follow-up and significant cumulative radiation exposure.

  22. Standard dose versus low-dose abdominal and pelvic CT: Comparison between filtered back projection versus adaptive iterative dose reduction 3D

    Purpose: To compare the dose and image quality of standard-dose abdominal and pelvic CT with filtered back projection (FBP) to low-dose CT with adaptive iterative dose reduction 3D (AIDR 3D). Materials and methods: We retrospectively examined the images of 21 patients in the portal phase of an abdominal and pelvic CT scan before and after implementation of AIDR 3D iterative reconstruction. Acquisition length, dose, and image-quality evaluations were compared between standard-dose FBP images and low-dose images reconstructed with AIDR 3D and FBP, using the Wilcoxon test. Results: The mean acquisition length was similar for both CT scans. There was a significant dose reduction of 49.5% with low-dose CT compared to standard-dose CT (mean DLP of 451 mGy.cm versus 892 mGy.cm, P < 0.001). There were no differences in image quality scores between standard-dose FBP and low-dose AIDR 3D images (4.6 ± 0.6 versus 4.4 ± 0.6, respectively, P = 0.147). Conclusion: AIDR 3D iterative reconstruction enables a significant dose reduction of 49.5% for abdominal and pelvic CT compared to FBP, whilst maintaining equivalent image quality. (authors)

  23. Local detection of prostate cancer by positron emission tomography with 2-fluorodeoxyglucose: comparison of filtered back projection and iterative reconstruction with segmented attenuation correction

    To compare filtered back projection (FBP) and iterative reconstruction with segmented attenuation correction (IRSAC) in the local imaging of prostate cancer by positron emission tomography with 2-fluorodeoxyglucose (FDG-PET). Thirteen patients with primary (n=7) or recurrent (n=6) prostate cancer, who had increased uptake in the prostate on FDG-PET performed without urinary catheterization and contemporaneous biopsy confirming the presence of active tumor in the prostate, were retrospectively identified. Two independent nuclear medicine physicians separately rated FBP and IRSAC images for visualization of prostatic activity on a 4-point scale. Results were compared using biopsy and cross-sectional imaging findings as the standard of reference. IRSAC images were significantly better than FBP images in terms of visualization of prostatic activity in 12 of 13 patients, and equivalent in 1 patient (p<0.001, Wilcoxon signed-ranks test). In particular, 2 foci of tumor activity in 2 different patients seen on IRSAC images were not visible on FBP images. In the 11 patients who had a gross tumor mass evident on cross-sectional imaging, there was good agreement between PET and cross-sectional anatomic imaging with respect to tumor localization. In selected patients, cancer can be imaged within the prostate using FDG-PET, and IRSAC is superior to FBP in image reconstruction for local tumor visualization.

  24. Quantitative analysis of emphysema and airway measurements according to iterative reconstruction algorithms: comparison of filtered back projection, adaptive statistical iterative reconstruction and model-based iterative reconstruction

    To evaluate filtered back projection (FBP) and two iterative reconstruction (IR) algorithms and their effects on the quantitative analysis of lung parenchyma and airway measurements on computed tomography (CT) images. Low-dose chest CT scans obtained in 281 adult patients were reconstructed using three algorithms: FBP, adaptive statistical IR (ASIR), and model-based IR (MBIR). Measurements from each dataset were compared: total lung volume, emphysema index (EI), and airway measurements of lumen and wall area as well as average wall thickness. The accuracy of the airway measurements for each algorithm was also evaluated using an airway phantom. The EI using a threshold of -950 HU was significantly different among the three algorithms, in decreasing order of FBP (2.30%), ASIR (1.49%), and MBIR (1.20%) (P < 0.01). Wall thickness was also significantly different among the three algorithms, with FBP (2.09 mm) demonstrating thicker walls than ASIR (2.00 mm) and MBIR (1.88 mm) (P < 0.01). Airway phantom analysis revealed that MBIR showed the most accurate airway measurements. The three algorithms presented different EIs and wall thicknesses, decreasing in the order of FBP, ASIR, and MBIR. Thus, care should be taken in selecting the appropriate IR algorithm for quantitative analysis of the lung. (orig.)
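
    The emphysema index used here is simply the percentage of lung voxels below the -950 HU threshold; a minimal sketch, assuming a lung segmentation mask is already available:

        import numpy as np

        def emphysema_index(hu_volume, lung_mask, threshold=-950.0):
            # EI (percent) = fraction of lung voxels below the threshold.
            lung = hu_volume[lung_mask]
            return 100.0 * np.count_nonzero(lung < threshold) / lung.size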

  11. Chest computed tomography using iterative reconstruction vs filtered back projection (Part 2): image quality of low-dose CT examinations in 80 patients

    To evaluate the image quality of an iterative reconstruction algorithm (IRIS) in low-dose chest CT in comparison with standard-dose filtered back projection (FBP) CT. Eighty consecutive patients referred for a follow-up chest CT examination underwent a low-dose examination (Group 2) under technical conditions similar to those of the initial examination (Group 1), except for the milliamperage selection and the replacement of regular FBP reconstruction by iterative reconstructions using three (Group 2a) and five (Group 2b) iterations. Despite a mean decrease of 35.5% in the dose-length product, there was no statistically significant difference between Group 2a and Group 1 in objective noise, signal-to-noise (SNR) and contrast-to-noise (CNR) ratios, or the distribution of overall image-quality scores. Compared to Group 1, objective image noise in Group 2b was significantly reduced, with increased SNR and CNR and a trend towards improved image quality. Iterative reconstruction using three iterations provides image quality similar to that of the conventionally used FBP reconstruction at 35% less dose, thus enabling dose reduction without loss of diagnostic information. According to our preliminary results, even higher dose reductions than 35% may be feasible by using more than three iterations. (orig.)

  12. Half-dose abdominal CT with sinogram-affirmed iterative reconstruction technique in children - comparison with full-dose CT with filtered back projection

    Iterative reconstruction can help reduce radiation dose while maintaining image quality. However, this technique has not been fully evaluated for abdominal CT in children. To compare objective and subjective image quality between half-dose images reconstructed with iterative reconstruction at iteration strength levels 1 to 5 (half-S1 to half-S5 studies) and full-dose images reconstructed with filtered back projection (full studies) in pediatric abdominal CT. Twenty-one children (M:F = 13:8; mean age 8.2 ± 5.7 years) underwent dual-source abdominal CT (mean effective dose 4.8 ± 2.1 mSv). Objective image quality was evaluated as noise. Subjective image-quality analysis compared each half study to the full study for noise, sharpness, artifact and diagnostic acceptability. Both objective and subjective image noise decreased with increasing iteration strength. Half-S4 and -S5 studies showed objective image noise similar to or lower than that of full studies. The half-S2 and -S3 studies produced the greatest sharpness, and the half-S5 studies were rated worst because of a blocky appearance. Full and half studies did not differ in artifacts. Half-S3 studies showed the best diagnostic acceptability. Half-S4 and -S5 studies objectively, and half-S3 studies subjectively, showed image quality comparable to full studies in pediatric abdominal CT. (orig.)

  13. Quantitative analysis of emphysema and airway measurements according to iterative reconstruction algorithms: comparison of filtered back projection, adaptive statistical iterative reconstruction and model-based iterative reconstruction

    Choo, Ji Yung [Seoul National University Medical Research Center, Department of Radiology, Seoul National University College of Medicine, and Institute of Radiation Medicine, Seoul (Korea, Republic of); Korea University Ansan Hospital, Ansan-si, Department of Radiology, Gyeonggi-do (Korea, Republic of); Goo, Jin Mo; Park, Chang Min; Park, Sang Joon [Seoul National University Medical Research Center, Department of Radiology, Seoul National University College of Medicine, and Institute of Radiation Medicine, Seoul (Korea, Republic of); Seoul National University, Cancer Research Institute, Seoul (Korea, Republic of); Lee, Chang Hyun; Shim, Mi-Suk [Seoul National University Medical Research Center, Department of Radiology, Seoul National University College of Medicine, and Institute of Radiation Medicine, Seoul (Korea, Republic of)

    2014-04-15

    To evaluate filtered back projection (FBP) and two iterative reconstruction (IR) algorithms and their effects on the quantitative analysis of lung parenchyma and airway measurements on computed tomography (CT) images. Low-dose chest CT scans obtained in 281 adult patients were reconstructed using three algorithms: FBP, adaptive statistical IR (ASIR) and model-based IR (MBIR). Measurements of each dataset were compared: total lung volume, emphysema index (EI), airway measurements of the lumen and wall area, and average wall thickness. The accuracy of airway measurements for each algorithm was also evaluated using an airway phantom. EI using a threshold of -950 HU differed significantly among the three algorithms, decreasing in the order of FBP (2.30 %), ASIR (1.49 %) and MBIR (1.20 %) (P < 0.01). Wall thickness also differed significantly among the three algorithms, with FBP (2.09 mm) demonstrating thicker walls than ASIR (2.00 mm) and MBIR (1.88 mm) (P < 0.01). Airway phantom analysis revealed that MBIR yielded the most accurate airway measurements. The three algorithms produced different EIs and wall thicknesses, decreasing in the order of FBP, ASIR and MBIR. Thus, care should be taken in selecting the appropriate IR algorithm for quantitative analysis of the lung. (orig.)

  14. Forward problem solution as the operator of filtered and back projection matrix to reconstruct the various method of data collection and the object element model in electrical impedance tomography

    Ain, Khusnul [Engineering Physics Program, ITB, Bandung - Indonesia (Indonesia); Physics Department - Airlangga University, Surabaya – Indonesia, khusnulainunair@yahoo.com (Indonesia); Kurniadi, Deddy; Suprijanto [Engineering Physics Program, ITB, Bandung - Indonesia (Indonesia); Santoso, Oerip [Informatics Program, ITB, Bandung - Indonesia (Indonesia); Wibowo, Arif [Physics Department - Airlangga University, Surabaya – Indonesia, khusnulainunair@yahoo.com (Indonesia)

    2015-04-16

    Back projection reconstruction has been implemented to obtain dynamical images in electrical impedance tomography. However, implementations have so far been limited to the adjacent data-collection method and a circular object-element model. This study aims to develop back projection into a reconstruction method with high speed, accuracy, and flexibility that can be used with various data-collection methods and object-element models. The proposed method uses the forward problem solution as the operator of the filtered and back projection matrix. This is demonstrated through a simulation study on several data-collection methods and various object-element models. The results indicate that the developed method is capable of producing images quickly and accurately for the various data-collection methods and object-element models.
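
    The abstract does not give the operator explicitly; as a hedged sketch of the general idea, a one-step back projection through a sensitivity (Jacobian) matrix obtained from a forward solver, with per-element normalisation (the function and variable names are illustrative, not the authors'):

      import numpy as np

      def backproject_eit(jacobian, dv):
          """Difference-EIT back projection through a forward-model Jacobian.

          jacobian : (n_measurements, n_elements) sensitivity matrix from the
                     forward solution for the chosen data-collection pattern
                     and object-element model.
          dv       : boundary-voltage differences between two frames.
          """
          bp = jacobian.T @ dv                     # plain back projection
          norm = np.sum(np.abs(jacobian), axis=0)  # per-element sensitivity
          return bp / np.maximum(norm, 1e-12)      # normalise, avoid /0

    Because the Jacobian is rebuilt from the forward solver for whatever electrode pattern and mesh are in use, the same reconstruction step applies unchanged to different data-collection methods and element models, which is the flexibility the study aims for.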

  16. Impact of adaptive iterative dose reduction (AIDR) 3D on low-dose abdominal CT - Comparison with routine-dose CT using filtered back projection

    Background: CT is widely used in medical practice but is a substantial source of radiation exposure, which is associated with an increased lifetime risk of cancer. Concern about dose reduction in CT examinations is therefore increasing, and iterative reconstruction algorithms, which allow dose reduction by compensating for image noise during image reconstruction, have been developed. Purpose: To investigate the performance of low-dose abdominal CT using adaptive iterative dose reduction 3D (AIDR 3D) compared to routine-dose CT using filtered back projection (FBP). Material and Methods: Fifty-eight patients underwent both routine-dose abdominal CT scans using FBP and low-dose scans using AIDR 3D. The image noise levels, signal-to-noise ratios (SNRs), and contrast-to-noise ratios (CNRs) of the aorta, portal vein, liver, and pancreas were measured and compared between both scans. Visual evaluations were performed. The volume CT dose index (CTDIvol) was measured. Results: Image noise levels on low-dose CT images using AIDR 3D were significantly lower than, or not significantly different from, those on routine-dose CT images using FBP, both for all patients and for the three BMI groups. SNRs and CNRs on low-dose CT images using AIDR 3D were significantly higher than, or not significantly different from, those on routine-dose CT images using FBP, both for all patients and for the three BMI groups. In visual evaluation of the images, there were no statistically significant differences between the scans in any organ, independently of BMI. The average CTDIvol at routine-dose and low-dose CT was 21.4 and 10.8 mGy, respectively. Conclusion: Low-dose abdominal CT using AIDR 3D allows an approximately 50% reduction in radiation dose without degradation of image quality compared to routine-dose CT using FBP, independently of BMI.

  17. Image quality and radiation dose of low dose coronary CT angiography in obese patients: Sinogram affirmed iterative reconstruction versus filtered back projection

    Purpose: To investigate the image quality and radiation dose of low-radiation-dose CT coronary angiography (CTCA) using sinogram-affirmed iterative reconstruction (SAFIRE) compared with standard-dose CTCA using filtered back projection (FBP) in obese patients. Materials and methods: Seventy-eight consecutive obese patients were randomized into two groups and scanned using a prospectively ECG-triggered step-and-shoot (SAS) CTCA protocol on a dual-source CT scanner. Thirty-nine patients (protocol A) were examined using a routine radiation dose protocol at 120 kV, with images reconstructed with FBP. Thirty-nine patients (protocol B) were examined using a low-dose protocol at 100 kV, with images reconstructed with SAFIRE. Two blinded observers independently assessed the image quality of each coronary segment using a 4-point scale (1 = non-diagnostic, 4 = excellent) and measured the objective parameters image noise, signal-to-noise ratio (SNR), and contrast-to-noise ratio (CNR). Radiation dose was calculated. Results: The coronary artery image-quality scores, image noise, SNR and CNR were not significantly different between protocols A and B (all p > 0.05), with image-quality scores of 3.51 ± 0.70 versus 3.55 ± 0.47, respectively. The effective radiation dose was significantly lower in protocol B (4.41 ± 0.83 mSv) than in protocol A (8.83 ± 1.74 mSv, p < 0.01). Conclusion: Compared with standard-dose CTCA using FBP, low-dose CTCA using SAFIRE can maintain diagnostic image quality with a 50% reduction of radiation dose.

  18. Dose reduction in chest CT: Comparison of the adaptive iterative dose reduction 3D, adaptive iterative dose reduction, and filtered back projection reconstruction techniques

    Objectives: To assess the effectiveness of adaptive iterative dose reduction (AIDR) and AIDR 3D in improving the image quality in low-dose chest CT (LDCT). Materials and methods: Fifty patients underwent standard-dose chest CT (SDCT) and LDCT simultaneously, performed under automatic exposure control with noise index of 19 and 38 (for a 2-mm slice thickness), respectively. The SDCT images were reconstructed with filtered back projection (SDCT-FBP images), and the LDCT images with FBP, AIDR and AIDR 3D (LDCT-FBP, LDCT-AIDR and LDCT-AIDR 3D images, respectively). On all the 200 lung and 200 mediastinal image series, objective image noise and signal-to-noise ratio (SNR) were measured in several regions, and two blinded radiologists independently assessed the subjective image quality. Wilcoxon's signed rank sum test with Bonferroni's correction was used for the statistical analyses. Results: The mean dose reduction in LDCT was 64.2% as compared with the dose in SDCT. LDCT-AIDR 3D images showed significantly reduced objective noise and significantly increased SNR in all regions as compared to the SDCT-FBP, LDCT-FBP and LDCT-AIDR images (all, P ≤ 0.003). In all assessments of the image quality, LDCT-AIDR 3D images were superior to LDCT-AIDR and LDCT-FBP images. The overall diagnostic acceptability of both the lung and mediastinal LDCT-AIDR 3D images was comparable to that of the lung and mediastinal SDCT-FBP images. Conclusions: AIDR 3D is superior to AIDR. Intra-individual comparisons between SDCT and LDCT suggest that AIDR 3D allows a 64.2% reduction of the radiation dose as compared to SDCT, by substantially reducing the objective image noise and increasing the SNR, while maintaining the overall diagnostic acceptability.

  19. Comparison of iterative model, hybrid iterative, and filtered back projection reconstruction techniques in low-dose brain CT: impact of thin-slice imaging

    The purpose of this study was to evaluate the utility of iterative model reconstruction (IMR) in brain CT, especially with thin-slice images. This prospective study received institutional review board approval, and prior informed consent to participate was obtained from all patients. We enrolled 34 patients who underwent brain CT and reconstructed axial images with filtered back projection (FBP), hybrid iterative reconstruction (HIR) and IMR at 1 and 5 mm slice thicknesses. We assessed the CT number, image noise, contrast, and contrast-to-noise ratio (CNR) between the thalamus and internal capsule, and the rate of increase of image noise between the 1 and 5 mm thickness images for each reconstruction method. Two independent radiologists assessed image contrast, image noise, image sharpness, and overall image quality on a 4-point scale. The CNRs at 1 and 5 mm slice thickness were significantly higher with IMR (1.2 ± 0.6 and 2.2 ± 0.8, respectively) than with FBP (0.4 ± 0.3 and 1.0 ± 0.4, respectively) and HIR (0.5 ± 0.3 and 1.2 ± 0.4, respectively) (p < 0.01). The mean rate of noise increase from the 5 mm to the 1 mm thickness images was significantly lower with IMR (1.7 ± 0.3) than with FBP (2.3 ± 0.3) and HIR (2.3 ± 0.4) (p < 0.01). There were no significant differences between the reconstruction techniques in the qualitative analysis of unfamiliar image texture. IMR offers significant noise reduction and higher contrast and CNR in brain CT, especially for thin-slice images, when compared to FBP and HIR. (orig.)

  20. Dose reduction in computed tomography of the chest. Image quality of iterative reconstructions at a 50% radiation dose compared to filtered back projection at a 100% radiation dose

    Purpose: The aim of this study was to evaluate the potential of iterative reconstruction (IR) in chest computed tomography (CT) to reduce radiation exposure. The qualitative and quantitative image quality of standard full-dose (FD) reconstructions with filtered back projection (FBP) and of half-dose (HD) chest CT data reconstructed with FBP and IR was assessed. Materials and Methods: 52 consecutive patients underwent contrast-enhanced chest CT on a dual-source CT system at 120 kV with automatic exposure control. The tube current was split equally between the two tube-detector systems. For the HD datasets, only data from one tube-detector system were utilized. Thus, FD and HD data were available for each patient from a single scan. Three datasets were reconstructed from the raw data: FD images with FBP, which served as the reference, and HD images with FBP and with IR. Objective image-quality analysis was performed by measuring the image noise in tissue and air. Subjective image quality was evaluated by 2 radiologists according to European guidelines. Artifacts, lesion conspicuity and edge sharpness were additionally assessed. Results: Image noise did not differ significantly between HD-IR and FD-FBP (p = 0.254) but increased substantially in HD-FBP (p < 0.001). No statistically significant differences were found between HD-IR and FD-FBP for the reproduction of anatomical and pathological structures, including subsegmental bronchi and bronchioli. The image quality of HD-FBP was rated inferior because of increased noise. Conclusion: A 50% dose reduction in contrast-enhanced chest CT is feasible without a loss of diagnostic confidence if IR is used for image data reconstruction. Iterative reconstruction is another powerful tool to reduce radiation exposure and can be combined with other dose-saving techniques. (orig.)

  1. Measurement of vascular wall attenuation: Comparison of CT angiography using model-based iterative reconstruction with standard filtered back-projection algorithm CT in vitro

    Objectives: To compare the performance of model-based iterative reconstruction (MBIR) with that of standard filtered back projection (FBP) for measuring vascular wall attenuation. Study design: After subjecting 9 vascular models (actual wall attenuation, 89 HU) with wall thicknesses of 0.5, 1.0, or 1.5 mm, filled with contrast material of 275, 396, or 542 HU, to scanning using 64-detector computed tomography (CT), we reconstructed images using MBIR and FBP (Bone and Detail kernels) and measured wall attenuation at the center of the wall for each model. We performed attenuation measurements for each model and additional supportive measurements using a differentiation curve. Statistical analysis used analyses of variance with repeated measures. Results: Using the Bone kernel, the standard deviation of the measurement exceeded 30 HU under most conditions. In measurements at the wall center, the attenuation values obtained using MBIR were comparable to or significantly closer to the actual wall attenuation than those acquired using the Detail kernel. Using differentiation curves, we could measure attenuation for models with walls of 1.0- or 1.5-mm thickness using MBIR but only those of 1.5-mm thickness using the Detail kernel. We detected no significant differences among the attenuation values of the vascular walls of either thickness (MBIR, P = 0.1606) or among the 3 densities of intravascular contrast material (MBIR, P = 0.8185; Detail kernel, P = 0.0802). Conclusions: Compared with FBP, MBIR reduces both reconstruction blur and image noise simultaneously, facilitates recognition of vascular wall boundaries, and can improve accuracy in measuring wall attenuation.

  2. Adaptive iterative dose reduction algorithm in CT: Effect on image quality compared with filtered back projection in body phantoms of different sizes

    To evaluate the impact of the adaptive iterative dose reduction (AIDR) three-dimensional (3D) algorithm in CT on noise reduction and image quality compared to the filtered back projection (FBP) algorithm, and to compare the effectiveness of AIDR 3D on noise reduction according to body habitus using phantoms of different sizes. Three different-sized phantoms with diameters of 24 cm, 30 cm, and 40 cm were built up using the American College of Radiology CT accreditation phantom and layers of pork belly fat. Each phantom was scanned eight times using different mAs settings. Images were reconstructed using FBP and three different strengths of AIDR 3D. The image noise, contrast-to-noise ratio (CNR) and signal-to-noise ratio (SNR) of the phantom were assessed. Two radiologists assessed the image quality of the 4 image sets in consensus. The effectiveness of AIDR 3D on noise reduction compared with FBP was also assessed according to phantom size. AIDR 3D significantly reduced image noise compared with FBP and enhanced the SNR and CNR (p < 0.05), with improved image quality (p < 0.05). When a stronger reconstruction algorithm was used, greater increases in SNR and CNR as well as noise reduction were achieved (p < 0.05). The AIDR 3D algorithm is effective in reducing image noise and improving image-quality parameters compared with the FBP algorithm, and its effectiveness may increase as phantom size increases.
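
    Objective noise, SNR and CNR recur throughout these records; conventions differ slightly between studies, but a common set of ROI-based definitions can be sketched as follows (illustrative only; individual papers may, for example, use a pooled or background noise term):

      import numpy as np

      def image_noise(roi):
          """Objective image noise: standard deviation of HU in an ROI."""
          return float(np.std(roi))

      def snr(roi):
          """Signal-to-noise ratio: mean attenuation over its own noise."""
          return float(np.mean(roi) / np.std(roi))

      def cnr(roi_obj, roi_bg):
          """Contrast-to-noise ratio between object and background ROIs."""
          return float((np.mean(roi_obj) - np.mean(roi_bg)) / np.std(roi_bg))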

  3. Impact of hybrid iterative reconstruction on Agatston coronary artery calcium scores in comparison to filtered back projection in native cardiac CT

    To investigate whether the effects of hybrid iterative reconstruction (HIR) on coronary artery calcium (CAC) measurements using the Agatston score lead to changes in the assignment of patients to cardiovascular risk groups compared to filtered back projection (FBP). 68 patients (mean age 61.5 years; 48 male; 20 female) underwent prospectively ECG-gated, non-enhanced cardiac 256-MSCT for coronary calcium scoring. Scanning parameters were as follows: tube voltage, 120 kV; mean tube current-time product, 63.67 mAs (50 - 150 mAs); collimation, 2 x 128 x 0.625 mm. Images were reconstructed with FBP and with HIR at all levels (L1 to L7). Two independent readers measured Agatston scores of all reconstructions and assigned patients to cardiovascular risk groups. Scores of HIR and FBP reconstructions were correlated (Spearman). Interobserver agreement and variability were assessed with κ statistics and Bland-Altman plots. Agatston scores of HIR reconstructions were closely correlated with FBP reconstructions (L1, R = 0.9996; L2, R = 0.9995; L3, R = 0.9991; L4, R = 0.986; L5, R = 0.9986; L6, R = 0.9987; and L7, R = 0.9986). In comparison to FBP, HIR reduced Agatston scores to between 97% (L1) and 87.4% (L7) of the FBP values. Using HIR levels L1-L3, all patients were assigned to the same risk groups as after FBP reconstruction. In 5.4% of patients, the risk group after HIR at the maximum iteration level differed from the group after FBP reconstruction. There was an excellent correlation of Agatston scores after HIR and FBP, with identical risk group assignment at levels 1 - 3 for all patients. Hence it appears that the application of HIR in routine calcium scoring does not entail any disadvantages. Future studies are needed to demonstrate whether HIR is a reliable method for reducing radiation dose in coronary calcium scoring.
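
    For readers unfamiliar with the metric: the Agatston score sums, over slices and lesions, the calcified area weighted by a density factor derived from the lesion's peak attenuation. A simplified sketch under the standard 130-HU threshold and 1-mm² minimum lesion area (not this study's implementation; scanner software applies additional rules):

      import numpy as np
      from scipy import ndimage

      def agatston_score(slices_hu, pixel_area_mm2):
          """Agatston score over contiguous axial slices (simplified)."""
          def weight(peak_hu):
              # Density factor from the lesion's maximum attenuation.
              if peak_hu >= 400: return 4
              if peak_hu >= 300: return 3
              if peak_hu >= 200: return 2
              return 1

          total = 0.0
          for sl in slices_hu:
              labels, n = ndimage.label(sl >= 130)   # candidate calcium
              for i in range(1, n + 1):
                  lesion = labels == i
                  area = np.count_nonzero(lesion) * pixel_area_mm2
                  if area < 1.0:                     # discard tiny specks
                      continue
                  total += area * weight(float(sl[lesion].max()))
          return total

    Because both the thresholded area and the peak-HU weight enter the sum, the noise smoothing of iterative reconstruction can pull scores downward, consistent with the reductions reported above.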

  4. Image quality of low mA CT pulmonary angiography reconstructed with model based iterative reconstruction versus standard CT pulmonary angiography reconstructed with filtered back projection: an equivalency trial

    Montet, Xavier; Hachulla, Anne-Lise; Neroladaki, Angeliki; Botsikas, Diomidis; Becker, Christoph D. [Geneva University Hospital, Division of Radiology, Geneva 4 (Switzerland); Lador, Frederic; Rochat, Thierry [Geneva University Hospital, Division of Pulmonary Medicine, Geneva 4 (Switzerland)

    2015-06-01

    To determine whether CT pulmonary angiography (CTPA) using a low mA setting reconstructed with model-based iterative reconstruction (MBIR) is equivalent to routine CTPA reconstructed with filtered back projection (FBP). This prospective study was approved by the institutional review board and patients provided written informed consent. Eighty-two patients were examined with low mA MBIR-CTPA (100 kV, 20 mA) and 82 patients with standard FBP-CTPA (100 kV, 250 mA). Regions of interest were drawn in nine pulmonary vessels; signal-to-noise ratio (SNR) and contrast-to-noise ratio (CNR) were calculated. A five-point scale was used to subjectively evaluate the image quality of FBP-CTPA and low mA MBIR-CTPA. Compared to routine FBP-CTPA, low mA MBIR-CTPA showed no differences in the attenuation measured in the nine pulmonary vessels, higher SNR (56 ± 19 vs 43 ± 20, p < 0.0001) and higher CNR (50 ± 17 vs 38 ± 18, p < 0.0001), despite a dose reduction of 93 % (p < 0.0001). The subjective image quality of low mA MBIR-CTPA was rated as diagnostic in 98 % of cases for patients with a body mass index of less than 30 kg/m². Low mA MBIR-CTPA is equivalent to routine FBP-CTPA and allows a significant dose reduction while improving SNR and CNR in the pulmonary vessels, as compared with routine FBP-CTPA. (orig.)

  5. Iterative reconstruction technique vs filtered back projection: utility for quantitative bronchial assessment on low-dose thin-section MDCT in patients with/without chronic obstructive pulmonary disease

    The aim of this study was to evaluate the utility of the iterative reconstruction (IR) technique for quantitative bronchial assessment on low-dose computed tomography (CT) as a substitute for standard-dose CT in patients with/without chronic obstructive pulmonary disease. Fifty patients (mean age, 69.2 years; mean % predicted FEV1, 79.4) underwent standard-dose CT (150 mAs) and low-dose CT (25 mAs). Except for tube current, the imaging parameters were identical for both protocols. Standard-dose CT was reconstructed using filtered back projection (FBP), and low-dose CT was reconstructed using IR and FBP. For quantitative bronchial assessment, the wall area percentage (WA%) of the sub-segmental bronchi and the airway luminal volume percentage (LV%) from the main bronchus to the peripheral bronchi were acquired for each dataset. The correlation and agreement of WA% and LV% between standard-dose CT and both low-dose CTs were statistically evaluated. WA% and LV% between standard-dose CT and both low-dose CTs were significantly correlated (r > 0.77, p < 0.00001); however, only the LV% agreement between standard-dose CT and low-dose CT reconstructed with IR was moderate (concordance correlation coefficient = 0.93); the other agreements were poor (concordance correlation coefficient < 0.90). Quantitative bronchial assessment via low-dose CT has potential as a substitute for standard-dose CT when using IR and airway luminal volumetry techniques. Key point: quantitative bronchial assessment of COPD using low-dose CT is possible. (orig.)

  6. Computed tomography of the cervical spine: comparison of image quality between a standard-dose and a low-dose protocol using filtered back-projection and iterative reconstruction

    To compare image quality of a standard-dose (SD) and a low-dose (LD) cervical spine CT protocol using filtered back-projection (FBP) and iterative reconstruction (IR). Forty patients investigated by cervical spine CT were prospectively randomised into two groups: SD (120 kVp, 275 mAs) and LD (120 kVp, 150 mAs), both applying automatic tube current modulation. Data were reconstructed using both FBP and sinogram-affirmed IR. Image noise, signal-to-noise (SNR) and contrast-to-noise (CNR) ratios were measured. Two radiologists independently and blindly assessed the following anatomical structures at C3-C4 and C6-C7 levels, using a four-point scale: intervertebral disc, content of neural foramina and dural sac, ligaments, soft tissues and vertebrae. They subsequently rated overall image quality using a ten-point scale. For both protocols and at each disc level, IR significantly decreased image noise and increased SNR and CNR, compared with FBP. SNR and CNR were statistically equivalent in LD-IR and SD-FBP protocols. Regardless of the dose and disc level, the qualitative scores with IR compared with FBP, and with LD-IR compared with SD-FBP, were significantly higher or not statistically different for intervertebral discs, neural foramina and ligaments, while significantly lower or not statistically different for soft tissues and vertebrae. The overall image quality scores were significantly higher with IR compared with FBP, and with LD-IR compared with SD-FBP. LD-IR cervical spine CT provides better image quality for intervertebral discs, neural foramina and ligaments, and worse image quality for soft tissues and vertebrae, compared with SD-FBP, while reducing radiation dose by approximately 40 %. (orig.)

  7. Computed tomography of the cervical spine: comparison of image quality between a standard-dose and a low-dose protocol using filtered back-projection and iterative reconstruction

    Becce, Fabio [University of Lausanne, Department of Diagnostic and Interventional Radiology, Centre Hospitalier Universitaire Vaudois, Lausanne (Switzerland); Universite Catholique Louvain, Department of Radiology, Cliniques Universitaires Saint-Luc, Brussels (Belgium); Ben Salah, Yosr; Berg, Bruno C. vande; Lecouvet, Frederic E.; Omoumi, Patrick [Universite Catholique Louvain, Department of Radiology, Cliniques Universitaires Saint-Luc, Brussels (Belgium); Verdun, Francis R. [University of Lausanne, Institute of Radiation Physics, Centre Hospitalier Universitaire Vaudois, Lausanne (Switzerland); Meuli, Reto [University of Lausanne, Department of Diagnostic and Interventional Radiology, Centre Hospitalier Universitaire Vaudois, Lausanne (Switzerland)

    2013-07-15

    To compare image quality of a standard-dose (SD) and a low-dose (LD) cervical spine CT protocol using filtered back-projection (FBP) and iterative reconstruction (IR). Forty patients investigated by cervical spine CT were prospectively randomised into two groups: SD (120 kVp, 275 mAs) and LD (120 kVp, 150 mAs), both applying automatic tube current modulation. Data were reconstructed using both FBP and sinogram-affirmed IR. Image noise, signal-to-noise (SNR) and contrast-to-noise (CNR) ratios were measured. Two radiologists independently and blindly assessed the following anatomical structures at C3-C4 and C6-C7 levels, using a four-point scale: intervertebral disc, content of neural foramina and dural sac, ligaments, soft tissues and vertebrae. They subsequently rated overall image quality using a ten-point scale. For both protocols and at each disc level, IR significantly decreased image noise and increased SNR and CNR, compared with FBP. SNR and CNR were statistically equivalent in LD-IR and SD-FBP protocols. Regardless of the dose and disc level, the qualitative scores with IR compared with FBP, and with LD-IR compared with SD-FBP, were significantly higher or not statistically different for intervertebral discs, neural foramina and ligaments, while significantly lower or not statistically different for soft tissues and vertebrae. The overall image quality scores were significantly higher with IR compared with FBP, and with LD-IR compared with SD-FBP. LD-IR cervical spine CT provides better image quality for intervertebral discs, neural foramina and ligaments, and worse image quality for soft tissues and vertebrae, compared with SD-FBP, while reducing radiation dose by approximately 40 %. (orig.)

  8. Iterative reconstruction technique vs filtered back projection: utility for quantitative bronchial assessment on low-dose thin-section MDCT in patients with/without chronic obstructive pulmonary disease

    Koyama, Hisanobu; Seki, Shinichiro; Sugimura, Kazuro [Kobe University Graduate School of Medicine, Division of Radiology, Department of Radiology, Kobe, Hyogo (Japan); Ohno, Yoshiharu; Nishio, Mizuho; Matsumoto, Sumiaki; Yoshikawa, Takeshi [Kobe University Graduate School of Medicine, Advanced Biomedical Imaging Research Centre, Kobe (Japan); Kobe University Graduate School of Medicine, Division of Functional and Diagnostic Imaging Research, Department of Radiology, Kobe (Japan); Sugihara, Naoki [Toshiba Medical Systems Corporation, Ohtawara, Tochigi (Japan)

    2014-08-15

    The aim of this study was to evaluate the utility of the iterative reconstruction (IR) technique for quantitative bronchial assessment on low-dose computed tomography (CT) as a substitute for standard-dose CT in patients with/without chronic obstructive pulmonary disease. Fifty patients (mean age, 69.2 years; mean % predicted FEV1, 79.4) underwent standard-dose CT (150 mAs) and low-dose CT (25 mAs). Except for tube current, the imaging parameters were identical for both protocols. Standard-dose CT was reconstructed using filtered back projection (FBP), and low-dose CT was reconstructed using IR and FBP. For quantitative bronchial assessment, the wall area percentage (WA%) of the sub-segmental bronchi and the airway luminal volume percentage (LV%) from the main bronchus to the peripheral bronchi were acquired for each dataset. The correlation and agreement of WA% and LV% between standard-dose CT and both low-dose CTs were statistically evaluated. WA% and LV% between standard-dose CT and both low-dose CTs were significantly correlated (r > 0.77, p < 0.00001); however, only the LV% agreement between standard-dose CT and low-dose CT reconstructed with IR was moderate (concordance correlation coefficient = 0.93); the other agreements were poor (concordance correlation coefficient < 0.90). Quantitative bronchial assessment via low-dose CT has potential as a substitute for standard-dose CT when using IR and airway luminal volumetry techniques. Key point: quantitative bronchial assessment of COPD using low-dose CT is possible. (orig.)

  10. Quantitative evaluation of calcium (content) in the coronary artery using hybrid iterative reconstruction (iDose) algorithm on low-dose 64-detector CT. Comparison of iDose and filtered back projection

    To evaluate the usefulness of hybrid iterative reconstruction (iDose) for quantification of calcium content in the coronary artery on 64-detector computed tomography (CT), an anthropomorphic cardiac CT phantom containing cylinders with known calcium content was scanned at tube current-time products of 15, 20, 25, and 50 mAs. The images obtained at 15, 20, 25, and 50 mAs were reconstructed using filtered back projection (FBP), and those at 15, 20, and 25 mAs were also reconstructed using iDose. The volume and mass of the calcium content in the cylinders were then calculated and compared with the true values; the Agatston score was also evaluated. The Agatston score and calcium mass obtained at 50 mAs using FBP were 656.92 and 159.91 mg, respectively; those obtained at 25 mAs using iDose were 641.91 and 159.05 mg. No significant differences were found between the calcium measurements obtained using FBP and iDose. In addition, the Agatston score and calcium mass obtained at 15 and 20 mAs using iDose did not differ significantly from those obtained at 25 mAs with iDose. By using iDose, accurate quantification of calcium in the coronary artery can be achieved at 15 mAs on 64-detector CT. The radiation dose can thus be significantly reduced in coronary artery calcium scoring without impairing the detection and quantification of coronary calcification. (author)

  11. Feasible Dose Reduction in Routine Chest Computed Tomography Maintaining Constant Image Quality Using the Last Three Scanner Generations: From Filtered Back Projection to Sinogram-affirmed Iterative Reconstruction and Impact of the Novel Fully Integrated Detector Design Minimizing Electronic Noise

    Lukas Ebner

    2014-01-01

    Objective: The aim of the present study was to evaluate dose reduction in contrast-enhanced chest computed tomography (CT) by comparing the three latest generations of Siemens CT scanners used in clinical practice. We analyzed the amount of radiation used with filtered back projection (FBP) and an iterative reconstruction (IR) algorithm to yield the same image quality. Furthermore, the influence on radiation dose of the most recent integrated circuit detector (ICD; Stellar detector, Siemens Healthcare, Erlangen, Germany) was investigated. Materials and Methods: 136 patients were included. Scan parameters were set to a routine thorax protocol: SOMATOM Sensation 64 (FBP), SOMATOM Definition Flash (IR), and SOMATOM Definition Edge (ICD and IR). Tube current was set constantly to the reference level of 100 mA, with automated tube current modulation using reference milliamperes. CARE kV was used on the Flash and Edge scanners, while tube potential was individually selected between 100 and 140 kVp by the medical technologists on the SOMATOM Sensation. Quality assessment was performed on soft-tissue kernel reconstructions. Dose was represented by the dose-length product (DLP). Results: The DLP with FBP for the average chest CT was 308 ± 99.6 mGy.cm. In contrast, the DLP for chest CT with the IR algorithm was 196.8 ± 68.8 mGy.cm (P = 0.0001). A further decline in dose was noted with IR and the ICD: DLP 166.4 ± 54.5 mGy.cm (P = 0.033). The dose reduction compared to FBP was 36.1% with IR and 45.6% with IR/ICD. The signal-to-noise ratio (SNR) was favorable in the aorta, bone, and soft tissue for the IR/ICD combination compared to FBP (P values ranged from 0.003 to 0.048). Overall contrast-to-noise ratio (CNR) improved with declining DLP. Conclusion: The most recent technical developments, namely IR in combination with integrated circuit detectors, can significantly lower the radiation dose in chest CT examinations.

  12. Clinical evaluation of image quality and radiation dose reduction in upper abdominal computed tomography using model-based iterative reconstruction; comparison with filtered back projection and adaptive statistical iterative reconstruction

    Highlights: • MBIR significantly improves objective image quality. • MBIR reduces the radiation dose by 87.5% without increasing objective image noise. • A half dose will be needed to maintain the subjective image quality. - Abstract: Purpose: To evaluate the image quality of upper abdominal CT images reconstructed with model-based iterative reconstruction (MBIR) in comparison with filtered back projection (FBP) and adaptive statistical iterative reconstruction (ASIR) on scans acquired with various radiation exposure dose protocols. Materials and methods: This prospective study was approved by our institutional review board, and informed consent was obtained from all 90 patients who underwent both control-dose (CD) and reduced-dose (RD) CT of the upper abdomen (unenhanced: n = 45, contrast-enhanced: n = 45). The RD scan protocol was randomly selected from three protocols; Protocol A: 12.5% dose, Protocol B: 25% dose, Protocol C: 50% dose. Objective image noise, signal-to-noise (SNR) ratio for the liver parenchyma, visual image score and lesion conspicuity were compared among CD images of FBP and RD images of FBP, ASIR and MBIR. Results: RD images of MBIR yielded significantly lower objective image noise and higher SNR compared with RD images of FBP and ASIR for all protocols (P < .01) and CD images of FBP for Protocol C (P < .05). Although the subjective image quality of RD images of MBIR was almost acceptable for Protocol C, it was inferior to that of CD images of FBP for Protocols A and B (P < .0083). The conspicuity of the small lesions in RD images of MBIR tended to be superior to that in RD images of FBP and ASIR and inferior to that in CD images for Protocols A and B, although the differences were not significant (P > .0083). Conclusion: Although 12.5%-dose MBIR images (mean size-specific dose estimates [SSDE] of 1.13 mGy) yielded objective image noise and SNR comparable to CD-FBP images, at least a 50% dose (mean SSDE of 4.63 mGy) would be needed to maintain the subjective image quality and the lesion conspicuity

  13. Impact of hybrid iterative reconstruction on Agatston coronary artery calcium scores in comparison to filtered back projection in native cardiac CT

    Obmann, V.C.; Heverhagen, J.T. [Inselspital - University Hospital Bern (Switzerland). University Inst. for Diagnostic, Interventional and Pediatric Radiology; Klink, T. [Wuerzburg Univ. (Germany). Inst. of Diagnostic and Interventional Radiology; Stork, A.; Begemann, P.G.C. [Roentgeninstitut Duesseldorf, Duesseldorf (Germany); Laqmani, A.; Adam, G. [University Medical Center Hamburg-Eppendorf, Hamburg (Germany). Dept. of Diagnostic and Interventional Radiology

    2015-05-15

    To investigate whether the effects of hybrid iterative reconstruction (HIR) on coronary artery calcium (CAC) measurements using the Agatston score lead to changes in the assignment of patients to cardiovascular risk groups compared to filtered back projection (FBP). 68 patients (mean age 61.5 years; 48 male; 20 female) underwent prospectively ECG-gated, non-enhanced cardiac 256-MSCT for coronary calcium scoring. Scanning parameters were as follows: tube voltage, 120 kV; mean tube current-time product, 63.67 mAs (50 - 150 mAs); collimation, 2 x 128 x 0.625 mm. Images were reconstructed with FBP and with HIR at all levels (L1 to L7). Two independent readers measured Agatston scores of all reconstructions and assigned patients to cardiovascular risk groups. Scores of HIR and FBP reconstructions were correlated (Spearman). Interobserver agreement and variability were assessed with κ statistics and Bland-Altman plots. Agatston scores of HIR reconstructions were closely correlated with FBP reconstructions (L1, R = 0.9996; L2, R = 0.9995; L3, R = 0.9991; L4, R = 0.986; L5, R = 0.9986; L6, R = 0.9987; and L7, R = 0.9986). In comparison to FBP, HIR reduced Agatston scores to between 97% (L1) and 87.4% (L7) of the FBP values. Using HIR levels L1-L3, all patients were assigned to the same risk groups as after FBP reconstruction. In 5.4% of patients, the risk group after HIR at the maximum iteration level differed from the group after FBP reconstruction. There was an excellent correlation of Agatston scores after HIR and FBP, with identical risk group assignment at levels 1 - 3 for all patients. Hence it appears that the application of HIR in routine calcium scoring does not entail any disadvantages. Future studies are needed to demonstrate whether HIR is a reliable method for reducing radiation dose in coronary calcium scoring.

  14. Fan beam and parallel beam projection and back-projection operators

    Expressions for the fan beam and parallel beam projection and back-projection operators are given, along with an evaluation of the point source response of the back-projection operators. The back-projection operator for the fan beam geometry requires the superposition of projection data measured over 360°. Both the fan beam and parallel beam geometries have back-projection operators whose point source responses are proportional to 1/|r - r0|, and thus two-dimensional Fourier filter techniques can be used to reconstruct transverse sections from fan beam and parallel beam projection data. The two-dimensional Fourier filter techniques may offer a speed advantage over other methods for reconstructing fan beam data, but the reconstructed image requires four times the core storage so that the convolution result of one period does not overlap the convolution result of the succeeding period when implementing the fast Fourier transform. 10 figures
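
    Since the point source response of plain back projection is proportional to 1/|r - r0|, applying a ramp (|f|) filter to the projections before back projecting removes this blur. A compact sketch for the parallel beam case, using 1D Fourier-domain filtering per projection and nearest-neighbour, pixel-driven back projection (a simplification of the 2D filtering the abstract analyses; names are illustrative):

      import numpy as np

      def fbp_parallel(sinogram, angles_deg):
          """Ramp-filtered back projection of parallel-beam projections.

          sinogram : (n_angles, n_det) array, one row per projection angle.
          Returns an (n_det, n_det) reconstruction on a centred grid.
          """
          n = sinogram.shape[1]
          ramp = np.abs(np.fft.fftfreq(n))                 # |f| ramp filter
          filt = np.real(np.fft.ifft(np.fft.fft(sinogram, axis=1) * ramp,
                                     axis=1))
          x = np.arange(n) - n / 2
          xx, yy = np.meshgrid(x, x)
          recon = np.zeros((n, n))
          for proj, th in zip(filt, np.deg2rad(angles_deg)):
              # Detector coordinate of each pixel at this view angle.
              t = xx * np.cos(th) + yy * np.sin(th) + n / 2
              recon += proj[np.clip(t.astype(int), 0, n - 1)]
          return recon * np.pi / (2 * len(angles_deg))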

  15. Reconstruction of CT images by the Bayes- back projection method

    Haruyama, M; Takase, M; Tobita, H

    2002-01-01

    In the course of research on quantitative assay by non-destructive measurement of radioactive waste, we have developed a unique program based on Bayesian theory for the reconstruction of transmission computed tomography (TCT) images. Reconstruction of cross-section images in CT usually employs the Filtered Back Projection method. The new image reconstruction program reported here is based on the Bayesian Back Projection method, and it iteratively improves the image at every measurement step. Namely, this method can promptly display a cross-section image corresponding to each angled projection datum from every measurement. Hence, it is possible to observe an improved cross-section view reflecting each projection datum in almost real time. From the basic theory of the Bayesian Back Projection method, it can be applied not only to CT of the 1st, 2nd, and 3rd generations. This report deals with a reconstruction program of cross-section images in the CT of ...

  16. Image reconstruction of simulated specimens using convolution back projection

    Mohd. Farhan Manzoor

    2001-04-01

    This paper reports on the reconstruction of cross-sections of composite structures. The convolution back projection (CBP) algorithm has been used to capture the attenuation field over the specimen. Five different test cases, representing varying degrees of complexity, have been taken up for evaluation. In addition, the role of filters in the nature of the reconstruction errors has also been discussed. Numerical results obtained in the study reveal that the CBP algorithm is a useful tool for qualitative as well as quantitative assessment of composite regions encountered in engineering applications.
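
    The filter choice the paper discusses amounts to apodising the ideal ramp in the frequency domain. A sketch of three common choices, expressed as frequency responses for an FFT of length n (the window definitions follow the usual textbook forms, not necessarily those used in this paper):

      import numpy as np

      def reconstruction_filter(n, window="ram-lak"):
          """Frequency response of common CBP/FBP reconstruction filters."""
          f = np.fft.fftfreq(n)
          ramp = np.abs(f)                       # ideal ramp: sharp but noisy
          if window == "ram-lak":
              return ramp
          if window == "shepp-logan":
              return ramp * np.sinc(f)           # mild high-frequency roll-off
          if window == "cosine":
              return ramp * np.cos(np.pi * f)    # stronger smoothing
          raise ValueError(f"unknown window: {window}")

    Smoother windows suppress the high-frequency noise that the bare ramp amplifies, at the price of blurred edges, which is one way the reconstruction errors trade off against each other across the test cases.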

  17. Procedure and apparatus for back projection

    (1) The procedure is for back projection in the form of a tomographic picture of a member, characterised in that the strip pictures are written onto the signal plate by a conversion pick-up unit, and in that voltages representing a rotating coordinate system are applied to the address inputs of the pick-up unit. (2) Procedure following claim 1, characterised in that the voltages are applied as sawtooth-waveform horizontal and vertical television deflections, respectively. (3) Procedure following claims 1 and 2, characterised in that, in order to correct the television deflection voltages for the effect of a fan-shaped radiation beam, first one and then the other of the amplitudes is modulated. (G.C.)

  18. UV Fluorescence Photography of Works of Art : Replacing the Traditional UV Cut Filters with Interference Filters

    Luís BRAVO PEREIRA

    2010-09-01

    For many years, filters like the Kodak Wratten E series, or the equivalent Schneider B+W 415, were used as standard UV cut filters, necessary to obtain good quality in UV fluorescence photography. The only problem with these filters is that, when they receive the UV radiation they are meant to remove, they exhibit an internal fluorescence of their own as a side effect, which usually reduces contrast and quality in the final image. This article presents the results of our experiments with some innovative filters that have become available on the market in recent years, designed to absorb UV radiation even more efficiently than the pigment-based standard filters mentioned above: interference filters for UV rejection (and usually for IR rejection too), manufactured using interference layers, which give better results than pigment-based filters. The only problem with interference filters is that they are sensitive to ray direction and are therefore not suitable for wide-angle lenses. The internal fluorescence of three filters: the B+W 415 UV cut (equivalent to the Kodak Wratten 2E, pigment based), the B+W 486 UV IR cut (an interference-type filter, frequently used on digital cameras to remove IR or UV) and the Baader UVIR rejection filter (two versions of this interference filter were used), was tested and compared. The final quality of the UV fluorescence images appears to be superior when compared to images obtained with classic filters.

  19. Traditional Tracking with Kalman Filter on Parallel Architectures

    Cerati, Giuseppe; Lantz, Steven; MacNeill, Ian; McDermott, Kevin; Riley, Dan; Tadel, Matevz; Wittich, Peter; Wuerthwein, Frank; Yagil, Avi

    2014-01-01

    Power density constraints are limiting the performance improvements of modern CPUs. To address this, we have seen the introduction of lower-power, multi-core processors, but the future will be even more exciting. In order to stay within the power density limits but still obtain Moore's Law performance/price gains, it will be necessary to parallelize algorithms to exploit larger numbers of lightweight cores and specialized functions like large vector units. Example technologies today include Intel's Xeon Phi and GPGPUs. Track finding and fitting is one of the most computationally challenging problems for event reconstruction in particle physics. At the High Luminosity LHC, for example, this will be by far the dominant problem. The most common track-finding techniques in use today, however, are those based on the Kalman filter. Significant experience has been accumulated with these techniques on real tracking detector systems, both in the trigger and offline. We report the results of our investigations into the p...
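
    For orientation, the core of any Kalman-filter tracker is one linear predict-update step per detector surface. A generic sketch, not any experiment's actual track model (F, Q, H, R stand for the assumed propagation, process-noise, measurement and hit-resolution matrices):

      import numpy as np

      def kalman_step(x, P, z, F, Q, H, R):
          """One predict + update step of a linear Kalman filter."""
          # Predict: propagate track state and covariance to the next surface.
          x = F @ x
          P = F @ P @ F.T + Q
          # Update: blend the prediction with the measured hit z.
          S = H @ P @ H.T + R                  # innovation covariance
          K = P @ H.T @ np.linalg.inv(S)       # Kalman gain
          x = x + K @ (z - H @ x)
          P = (np.eye(len(x)) - K @ H) @ P
          return x, P

    The recursion is inherently sequential for a single track, so exploiting large vector units and many lightweight cores means processing many track candidates concurrently, which is the setting the title refers to.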

  20. 3D Forward and Back-Projection for X-Ray CT Using Separable Footprints

    Long, Yong; Fessler, Jeffrey A; Balter, James M.

    2010-01-01

    Iterative methods for 3D image reconstruction have the potential to improve image quality over conventional filtered back projection (FBP) in X-ray computed tomography (CT). However, the computation burden of 3D cone-beam forward and back-projectors is one of the greatest challenges facing practical adoption of iterative methods for X-ray CT. Moreover, projector accuracy is also important for iterative methods. This paper describes two new separable footprint (SF) projector methods that appro...

  1. Error estimates for universal back-projection-based photoacoustic tomography

    Pandey, Prabodh K.; Naik, Naren; Munshi, Prabhat; Pradhan, Asima

    2015-07-01

    Photoacoustic tomography is a hybrid imaging modality that combines the advantages of optical and ultrasound imaging techniques to produce images with high resolution and good contrast at high penetration depths. The choice of reconstruction algorithm, as well as experimental and computational parameters, plays a major role in governing the accuracy of a tomographic technique; error estimates under variation of these parameters are therefore of great importance. Because a photoacoustic source has finite support, the pressure signals are not band-limited, but in practice our detection system is. Hence the reconstruction from ideal, noiseless band-limited forward data (which we will call the band-limited reconstruction) is the best approximation we have for the unknown object. In the present study, we report the error that arises in universal back-projection (UBP) based photoacoustic reconstruction for planar detection geometry due to sampling and filtering of the forward data (pressure signals). Computational validation of the error estimates has been carried out for synthetic phantoms. Validation with noisy forward data has also been carried out to study the effect of noise on the error estimates derived in our work. Although we derive the estimates here for planar detection geometry, the derivations for spherical and cylindrical geometries follow accordingly.
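
    For reference, the universal back-projection formula of Xu and Wang that UBP implements can be written, in one common form, as

      p_0(\mathbf{r}) \,=\, \frac{1}{\Omega_0} \int_{S_0}
          \Big[\, 2\,p(\mathbf{r}_0, t) \,-\, 2\,t\, \frac{\partial p(\mathbf{r}_0, t)}{\partial t} \,\Big]_{t = |\mathbf{r}-\mathbf{r}_0|/c}
          \, \mathrm{d}\Omega_0 ,

    where S_0 is the detection surface, \Omega_0 is the solid angle it subtends at the reconstruction point \mathbf{r} (2\pi for the planar geometry considered here), and \mathrm{d}\Omega_0 is the solid-angle element of a detector patch as seen from \mathbf{r}. Sampling and band-limiting the measured p(\mathbf{r}_0, t) before this back projection is precisely where the errors analysed in the paper enter.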

  2. Dose reduction in computed tomography of the chest. Image quality of iterative reconstructions at a 50% radiation dose compared to filtered back projection at a 100% radiation dose

    May, M.S.; Eller, A.; Stahl, C. [University Hospital Erlangen (Germany). Dept. of Radiology; and others

    2014-06-15

    Purpose: The aim of this study was to evaluate the potential of iterative reconstruction (IR) in chest computed tomography (CT) to reduce radiation exposure. The qualitative and quantitative image quality of standard full-dose (FD) reconstructions with filtered back projection (FBP) and of half-dose (HD) chest CT data reconstructed with FBP and IR was assessed. Materials and Methods: 52 consecutive patients underwent contrast-enhanced chest CT on a dual-source CT system at 120 kV with automatic exposure control. The tube current was split equally between the two tube-detector systems. For the HD datasets, only data from one tube-detector system were utilized. Thus, FD and HD data were available for each patient from a single scan. Three datasets were reconstructed from the raw data: FD images with FBP, which served as the reference, and HD images with FBP and with IR. Objective image-quality analysis was performed by measuring the image noise in tissue and air. Subjective image quality was evaluated by 2 radiologists according to European guidelines. Artifacts, lesion conspicuity and edge sharpness were additionally assessed. Results: Image noise did not differ significantly between HD-IR and FD-FBP (p = 0.254) but increased substantially in HD-FBP (p < 0.001). No statistically significant differences were found between HD-IR and FD-FBP for the reproduction of anatomical and pathological structures, including subsegmental bronchi and bronchioli. The image quality of HD-FBP was rated inferior because of increased noise. Conclusion: A 50% dose reduction in contrast-enhanced chest CT is feasible without a loss of diagnostic confidence if IR is used for image data reconstruction. Iterative reconstruction is another powerful tool to reduce radiation exposure and can be combined with other dose-saving techniques. (orig.)

  3. UV Fluorescence Photography of Works of Art : Replacing the Traditional UV Cut Filters with Interference Filters

    Luís BRAVO PEREIRA

    2010-01-01

    For many years, filters like the Kodak Wratten E series, or the equivalent Schneider B+W 415, were used as standard UV cut filters, necessary to obtain good quality in UV fluorescence photography. The only problem with these filters is that, when they receive the UV radiation they are meant to remove, they exhibit an internal fluorescence of their own as a side effect, which usually reduces contrast and quality in the final image. This article presents the results of our experiences on usi...

  4. Tsunami Wave Estimation Using GPS-TEC Back Projection

    Ito, T.

    2014-12-01

    A large tsunami generates an acoustic wave and shakes the atmospheric layer around the focal region. The generated acoustic wave propagates up to the ionospheric layer, located at about 300 km height, and causes an ionospheric disturbance. The acoustic wave therefore changes the density of total electron content (TEC) inside the ionospheric layer, and the changing TEC can be measured by a dense GPS network such as GEONET. We focus on large tsunamis such as that of the 2011 March 11 Tohoku-Oki earthquake (Mw 9.0), which caused vast damage to the country. The tsunami due to the 2011 Tohoku-Oki earthquake generated an acoustic wave that propagated to the ionospheric layer, and the Japanese dense GPS network detected a clear anomaly of ionospheric TEC around the focal region. We assume that the acoustic wave causes the ionospheric disturbance, and we estimate tsunami wave propagation by applying a back projection (BP) method to the ionospheric disturbance. The Japanese dense GPS array recorded the ionospheric disturbances as changes in TEC due to the 2011 Tohoku-Oki earthquake. In this study, we try to reveal the details of the generated tsunami's propagation using changing TEC from the GPS observation network. First, we process GPS-TEC above the focal region at 1-s sampling from GEONET. We remove slant GPS-TEC effects by filtering out a second-degree polynomial fit. We then adapt the BP method to the GPS-TEC time series. The BP product shows the beam-formed time history and location of coherent acoustic-wave energy generated by the large tsunami, observed at the ionospheric layer by regional arrays and across the GPS network. BPs are performed by beam-forming (stacking) energy onto a flat grid around the source region, with variable spatial resolution scaled by the magnitude of the generated tsunami velocity, for each second. As a result, we can obtain the tsunami wave generated by a large earthquake. The method is completely new and provides detailed tsunami wave propagation distributions from huge GPS-TEC datasets. We thereby obtain an indirect measurement of tsunami wave generation from ionospheric disturbances, and this result will bring a revolution in tsunami studies.
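
    The BP step itself is a delay-and-sum stack: for each candidate grid point, every station's filtered trace is shifted by its predicted travel time and the shifted traces are summed, so coherent energy maps back to its source. A heavily simplified 2D sketch with a constant propagation speed (all names are illustrative; the real processing works on slant TEC at ionospheric piercing points):

      import numpy as np

      def backproject_energy(traces, station_xy, grid_xy, dt, speed):
          """Delay-and-sum back projection of array time series onto a grid.

          traces     : (n_stations, n_samples) filtered TEC time series.
          station_xy : (n_stations, 2) station coordinates.
          grid_xy    : (n_points, 2) candidate source locations.
          dt         : sample interval; speed: assumed propagation speed.
          """
          n_samples = traces.shape[1]
          power = np.zeros(len(grid_xy))
          for g, pt in enumerate(grid_xy):
              delays = np.linalg.norm(station_xy - pt, axis=1) / speed
              stack = np.zeros(n_samples)
              for tr, d in zip(traces, delays):
                  s = min(int(round(d / dt)), n_samples)
                  stack[:n_samples - s] += tr[s:]   # align by travel time
              power[g] = np.max(stack ** 2)         # beam power at this point
          return power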

  5. Traditional Medicine Through the Filter of Modernity: A brief historical analysis

    R. Rabarihoela Razafimandimby

    2014-12-01

    Traditional medicines still prevail in the current Malagasy context. A careful historical analysis shows, however, that Malagasy traditional medicine was screened through many filters before being accepted in a global context. Traditional medicine in its authentic form was more or less rejected with the advent of modern medicine, although not without reaction. This paper retraces the historical encounter between modern and traditional medicine to determine the extent to which traditional medicine is acknowledged and used in the current prevailing modern, rational and scientific global context.

  6. An accelerated threshold-based back-projection algorithm for Compton camera image reconstruction

    Purpose: Compton camera imaging (CCI) systems are currently under investigation for radiotherapy dose reconstruction and verification. The ability of such a system to provide real-time images during dose delivery will be limited by the computational speed of the image reconstruction algorithm. In this work, the authors present a fast and simple method by which to generate an initial back-projected image from acquired CCI data, suitable for use in a filtered back-projection algorithm or as a starting point for iterative reconstruction algorithms, and compare its performance to the current state of the art. Methods: Each detector event in a CCI system describes a conical surface that includes the true point of origin of the detected photon. Numerical image reconstruction algorithms require, as a first step, the back-projection of each of these conical surfaces into an image space. The algorithm presented here first generates a solution matrix for each slice of the image space by solving the intersection of the conical surface with the image plane. Each element of the solution matrix is proportional to the distance of the corresponding voxel from the true intersection curve. A threshold function was developed to extract those pixels sufficiently close to the true intersection to generate a binary intersection curve. This process is repeated for each image plane for each CCI detector event, resulting in a three-dimensional back-projection image. The performance of this algorithm was tested against a marching algorithm known for speed and accuracy. Results: The threshold-based algorithm was found to be approximately four times faster than the current state of the art with minimal deficit to image quality, arising from the fact that a generically applicable threshold function cannot provide perfect results in all situations. The algorithm fails to extract a complete intersection curve in image slices near the detector surface for detector event cones having axes nearly parallel to the image plane. This effect decreases the sum of the image, thereby also affecting the mean, standard deviation, and SNR of the image. All back-projected events associated with a simulated point source intersected the voxel containing the source and the FWHM of the back-projected image was similar to that obtained from the marching method. Conclusions: The slight deficit to image quality observed with the threshold-based back-projection algorithm described here is outweighed by the 75% reduction in computation time. The implementation of this method requires the development of an optimum threshold function, which determines the overall accuracy of the method. This makes the algorithm well-suited to applications involving the reconstruction of many large images, where the time invested in threshold development is offset by the decreased image reconstruction time. Implemented in a parallel-computing environment, the threshold-based algorithm has the potential to provide real-time dose verification for radiation therapy.
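
    As an illustration of the thresholding idea, the sketch below back-projects a single detector-event cone by keeping voxels whose angular distance to the cone surface falls below a tolerance. This is a simplified angular-threshold variant, not the authors' per-slice solution-matrix formulation, and the toy event geometry is hypothetical.

      import numpy as np

      def backproject_event(grid, apex, axis, cone_angle, tol=0.02):
          """Binary back projection of one Compton cone onto voxel centers."""
          v = grid - apex                              # apex-to-voxel vectors
          v /= np.linalg.norm(v, axis=1, keepdims=True)
          angle = np.arccos(np.clip(v @ axis, -1.0, 1.0))
          return np.abs(angle - cone_angle) < tol      # voxels near the cone

      # Accumulate a few random events into a back-projected volume.
      ax = np.linspace(-5.0, 5.0, 32)
      grid = np.stack(np.meshgrid(ax, ax, ax), -1).reshape(-1, 3)
      image, rng = np.zeros(len(grid)), np.random.default_rng(0)
      for _ in range(100):
          axis = rng.normal(size=3)
          axis /= np.linalg.norm(axis)
          image += backproject_event(grid, np.array([0.0, 0.0, 10.0]), axis, 0.5)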

  7. Camera calibration based on the back projection process

    Gu, Feifei; Zhao, Hong; Ma, Yueyang; Bu, Penghui

    2015-12-01

    Camera calibration plays a crucial role in 3D measurement tasks of machine vision. In typical calibration processes, camera parameters are iteratively optimized in the forward imaging process (FIP). However, the results can only guarantee the minimum of 2D projection errors on the image plane, but not the minimum of 3D reconstruction errors. In this paper, we propose a universal method for camera calibration, which uses the back projection process (BPP). In our method, a forward projection model is used to obtain initial intrinsic and extrinsic parameters with a popular planar checkerboard pattern. Then, the extracted image points are projected back into 3D space and compared with the ideal point coordinates. Finally, the estimation of the camera parameters is refined by a non-linear function minimization process. The proposed method can obtain a more accurate calibration result, which is more physically useful. Simulation and practical data are given to demonstrate the accuracy of the proposed method.
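
    A minimal sketch of the back projection step described above, assuming a pinhole model with the checkerboard in the plane Z = 0: detected corners are cast back as rays, intersected with the board plane, and compared with the ideal corner coordinates. The resulting residual function could then be handed to any non-linear least-squares routine for the refinement stage; all names are hypothetical.

      import numpy as np

      def backprojection_residuals(K, R, t, pixels, board_xy):
          """In-plane residuals between back-projected corners and the board.

          K : (3,3) intrinsics; R, t : board-to-camera rotation and translation
          pixels : (n,2) detected corners; board_xy : (n,2) ideal corners, Z = 0
          """
          ph = np.hstack([pixels, np.ones((len(pixels), 1))])
          rays = (np.linalg.inv(K) @ ph.T).T      # pixel rays in camera frame
          c = -R.T @ t                            # camera center, board frame
          d = (R.T @ rays.T).T                    # ray directions, board frame
          s = -c[2] / d[:, 2]                     # scale putting points on Z = 0
          pts = c + s[:, None] * d                # back-projected board points
          return pts[:, :2] - board_xy            # residuals to be minimized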

  8. Image-domain sampling properties of the Hotelling Observer in CT using filtered back-projection

    Sanchez, Adrian A.; Sidky, Emil Y.; Pan, Xiaochuan

    2015-03-01

    The Hotelling Observer (HO) [1], along with its channelized variants [2], has been proposed for image quality evaluation in x-ray CT [3,4]. In this work, we investigate HO performance for a detection task in parallel-beam FBP as a function of two image-domain sampling parameters, namely pixel size and field-of-view. These two parameters are of central importance in adapting HO methods to use in CT, since the large number of pixels in a single image makes direct computation of HO performance for a full image infeasible in most cases. Reduction of the number of image pixels and/or restriction of the image to a region-of-interest (ROI) has the potential to make direct computation of HO statistics feasible in CT, provided that the signal and noise properties lead to redundant information in some regions of the image. For small signals, we hypothesize that reduction of image pixel size and enlargement of the image field-of-view are approximately equivalent means of gaining additional information relevant to a detection task. The rationale for this hypothesis is that the backprojection operation in FBP introduces long range correlations so that, for small signals, the reconstructed signal outside of a small ROI is not linearly independent of the signal within the ROI. In this work, we perform a preliminary investigation of this hypothesis by sweeping these two sampling parameters and computing HO performance for a signal detection task.
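
    For a known signal, HO performance reduces to the detectability SNR_HO^2 = dsbar^T K^(-1) dsbar, with dsbar the mean signal difference and K the image covariance. The sketch below estimates this from sample ROI images; the toy data are hypothetical, and the sample covariance is only invertible when the number of samples exceeds the number of pixels, which is precisely the dimensionality problem motivating the ROI restriction discussed above.

      import numpy as np

      def hotelling_snr(signal_images, background_images):
          """HO detectability from (n_samples, n_pix) image samples."""
          dsbar = signal_images.mean(0) - background_images.mean(0)
          K = 0.5 * (np.cov(signal_images.T) + np.cov(background_images.T))
          template = np.linalg.solve(K, dsbar)    # HO template w = K^(-1) dsbar
          return np.sqrt(dsbar @ template)

      # Toy example: a faint signal in a 6x6 ROI (36 pixels, 500 samples).
      rng = np.random.default_rng(1)
      sig = np.zeros(36)
      sig[14:16] = 0.5
      print(hotelling_snr(rng.normal(size=(500, 36)) + sig,
                          rng.normal(size=(500, 36))))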

  9. Filtering

    Jan Drugowitsch

    2015-04-01

    Predictive coding appears to be one of the fundamental working principles of brain processing. Amongst other aspects, brains often predict the sensory consequences of their own actions. Predictive coding resembles Kalman filtering, where incoming sensory information is filtered to produce prediction errors for subsequent adaptation and learning. However, to generate prediction errors given motor commands, a suitable temporal forward model is required to generate predictions. While in engineering applications it is usually assumed that this forward model is known, the brain has to learn it. When filtering sensory input and learning from the residual signal in parallel, a fundamental problem arises: the system can enter a delusional loop when filtering the sensory information using an overly trusted forward model. In this case, learning stalls before accurate convergence because uncertainty about the forward model is not properly accommodated. We present a Bayes-optimal solution to this generic and pernicious problem for the case of linear forward models, which we call Predictive Inference and Adaptive Filtering (PIAF). PIAF filters incoming sensory information and learns the forward model simultaneously. We show that PIAF is formally related to Kalman filtering and to the Recursive Least Squares linear approximation method, but combines these procedures in a Bayes-optimal fashion. Numerical evaluations confirm that the delusional loop is precluded and that learning of the forward model is more than ten times faster than with a naive combination of Kalman filtering and Recursive Least Squares.
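
    For orientation, the sketch below is the scalar textbook Kalman filter that the abstract takes as its starting point, with a known forward model (a, c); it is not the PIAF algorithm itself, and all names are illustrative.

      import numpy as np

      def kalman_1d(y, a, c, q, r, x0=0.0, p0=1.0):
          """Scalar Kalman filter for x_t = a*x_{t-1} + w, y_t = c*x_t + v.

          Note: the forward model (a, c) is assumed KNOWN here; PIAF instead
          learns the forward model while filtering.
          """
          x, p, out = x0, p0, []
          for yt in y:
              x, p = a * x, a * a * p + q                     # predict
              k = p * c / (c * c * p + r)                     # Kalman gain
              x, p = x + k * (yt - c * x), (1.0 - k * c) * p  # update
              out.append(x)
          return np.array(out)

      # Track a slowly drifting state observed in heavy noise.
      rng = np.random.default_rng(0)
      truth = np.cumsum(rng.normal(0.0, 0.1, 200))
      estimate = kalman_1d(truth + rng.normal(0.0, 1.0, 200), 1.0, 1.0, 0.01, 1.0)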

  10. A fast marching method based back projection algorithm for photoacoustic tomography in heterogeneous media

    Wang, Tianren

    2015-01-01

    This paper presents a numerical study of a fast marching method based back projection reconstruction algorithm for photoacoustic tomography in heterogeneous media. Transcranial imaging is used here as a case study. To correct for the phase aberration from the heterogeneity (i.e., the skull), the fast marching method is adopted to compute the phase delay based on the known speed of sound distribution, and the phase delay is taken into account by the back projection algorithm for more accurate reconstructions. It is shown that the proposed algorithm is more accurate than the conventional back projection algorithm, but slightly less accurate than the time reversal algorithm, particularly in the area close to the skull. However, the image reconstruction time for the proposed algorithm can be as little as 124 ms when implemented on a GPU (512 sensors, 21323 pixels reconstructed), which is two orders of magnitude faster than time reversal reconstruction. The proposed algorithm, therefore, not only corrects for the p...
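
    A rough sketch of the two stages described above, assuming the scikit-fmm package as the fast marching (eikonal) solver: travel times are first computed from each sensor through the heterogeneous speed map, then every trace is sampled at the pixel-specific delay and summed, i.e., delay-and-sum back projection with corrected phase delays. All names and the geometry handling are hypothetical simplifications.

      import numpy as np
      import skfmm  # scikit-fmm, an assumed dependency; any eikonal solver works

      def delay_map(speed, sensor_idx, dx):
          """Acoustic travel time from one sensor to every pixel."""
          phi = np.ones_like(speed)
          phi[sensor_idx] = -1.0  # zero level set at the sensor position
          return skfmm.travel_time(phi, speed, dx=dx)

      def backproject(signals, t, speed, sensor_indices, dx):
          """Delay-and-sum back projection with heterogeneous delays.

          signals : (n_sensors, n_t) photoacoustic traces
          speed   : (ny, nx) speed-of-sound map, e.g. including the skull
          """
          image = np.zeros_like(speed)
          for s, idx in enumerate(sensor_indices):
              delays = delay_map(speed, idx, dx)
              # sample each trace at the pixel-specific arrival time
              image += np.interp(delays.ravel(), t, signals[s]).reshape(speed.shape)
          return image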

  11. Super-resolution Reconstruction Algorithm Based on Patch Similarity and Back-projection Modification

    Wei-long Chen; Li Guo; Wu He; Wei Wu; Xiao-min Yang

    2014-01-01

    We propose an effective super-resolution reconstruction algorithm based on patch similarity and back-projection modification. In the proposed algorithm, we assume that similar patches recur in natural images and extract the high-frequency information from the best similar patch to add into the goal high-resolution image. In the process of reconstruction, the high-resolution patch is back-projected into the low-resolution patch so as to gain detailed modification. Experiments performed on simulated low-r...

  12. Navigating Earthquake Physics with High-Resolution Array Back-Projection

    Meng, Lingsen

    Understanding earthquake source dynamics is a fundamental goal of geophysics. Progress toward this goal has been slow due to the gap between state-of-the-art earthquake simulations and the limited source imaging techniques based on conventional low-frequency finite fault inversions. Seismic array processing is an alternative source imaging technique that employs the higher-frequency content of earthquakes and provides finer detail of the source process with few prior assumptions. While back-projection has provided key observations of previous large earthquakes, the standard beamforming back-projection suffers from low resolution and severe artifacts. This thesis introduces the MUSIC technique, a high-resolution array processing method that aims to narrow the gap between seismic observations and earthquake simulations. MUSIC is a high-resolution method that takes advantage of higher-order signal statistics. The method had not been widely used in seismology because of the non-stationary and incoherent nature of seismic signals. We adapt MUSIC to transient seismic signals by incorporating multitaper cross-spectrum estimates. We also adopt a "reference window" strategy that mitigates the "swimming artifact," a systematic drift effect in back projection. The improved MUSIC back projections allow the imaging of recent large earthquakes in finer detail, which gives rise to new perspectives on dynamic simulations. In the 2011 Tohoku-Oki earthquake, we observe frequency-dependent rupture behaviors which relate to the material variation along the dip of the subduction interface. In the 2012 off-Sumatra earthquake, we image complicated ruptures involving an orthogonal fault system and an unusual branching direction. This result, along with our complementary dynamic simulations, probes the pressure-insensitive strength of the deep oceanic lithosphere. In another example, back projection is applied to the 2010 M7 Haiti earthquake recorded at regional distance. The high-frequency subevents are located at the edges of the geodetic slip regions, which correlates with the stopping phases associated with rupture speed reduction when the earthquake arrests.

  13. Super-resolution Reconstruction Algorithm Based on Patch Similarity and Back-projection Modification

    Wei-long Chen

    2014-07-01

    We propose an effective super-resolution reconstruction algorithm based on patch similarity and back-projection modification. In the proposed algorithm, we assume that similar patches recur in natural images and extract the high-frequency information from the best similar patch to add into the goal high-resolution image. In the process of reconstruction, the high-resolution patch is back-projected into the low-resolution patch so as to gain detailed modification. Experiments performed on simulated and real low-resolution images show that the proposed super-resolution reconstruction algorithm is effective and efficient in improving image resolution and achieves better visual performance.
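
    The back-projection modification step echoes the classic iterative back-projection (IBP) scheme for super-resolution. The sketch below shows that generic IBP loop rather than the authors' patch-based algorithm: simulate the low-resolution image from the current estimate, then back-project the residual; blur level and step size are illustrative.

      import numpy as np
      from scipy.ndimage import gaussian_filter, zoom

      def iterative_back_projection(lr, scale=2, n_iter=20, lam=0.2, sigma=1.0):
          """Generic IBP super-resolution loop (sketch)."""
          hr = zoom(lr, scale, order=3)                   # initial upsampling
          for _ in range(n_iter):
              # simulate the observation: blur, then downsample
              simulated = zoom(gaussian_filter(hr, sigma), 1.0 / scale, order=3)
              residual = lr - simulated                   # low-resolution error
              hr += lam * zoom(residual, scale, order=3)  # back-project residual
          return hr

      print(iterative_back_projection(np.random.rand(32, 32)).shape)  # (64, 64)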

  14. Comparison of back projection methods of determining earthquake rupture process in time and frequency domains

    Wang, W.; Wen, L.

    2013-12-01

    Back projection is a method that projects the seismic energy recorded by a seismic array back to the earthquake source region to determine the rupture process of a large earthquake. The method takes advantage of the coherence of seismic energy across a seismic array and is quick in determining some important properties of the earthquake source. The method can be performed in both the time and frequency domains. In the time domain, the most conventional procedure is beamforming with some measures for suppressing noise, such as Nth-root stacking. In the frequency domain, the multiple signal classification method (MUSIC) estimates the directions of arrival of multiple waves propagating through an array using a subspace method. The advantages of this method are the ability to study rupture properties at various frequencies and to resolve simultaneous arrivals, making it suitable for detecting bilateral rupture of an earthquake source. We present a comparison of back projection results for some large earthquakes between the time-domain and frequency-domain methods. The time-domain procedure produces an image that is smeared and exhibits some artifacts, although enhanced stacking methods can to some extent alleviate the problem. The MUSIC method, on the other hand, resolves clear multiple arrivals and provides higher-resolution rupture imaging.
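
    A minimal narrowband MUSIC sketch for a linear array, illustrating the subspace idea: project trial steering vectors onto the noise subspace of the sample covariance and take the reciprocal as a pseudo-spectrum. The single-frequency plane-wave model and all names are simplifying assumptions, not any published implementation.

      import numpy as np

      def music_spectrum(X, n_sources, station_x, slowness_grid, freq):
          """MUSIC pseudo-spectrum over trial horizontal slownesses.

          X : (n_sta, n_snap) complex spectral snapshots at frequency `freq`
          """
          R = X @ X.conj().T / X.shape[1]            # sample covariance
          w, v = np.linalg.eigh(R)                   # eigenvalues ascending
          En = v[:, : X.shape[0] - n_sources]        # noise-subspace vectors
          spectrum = []
          for p in slowness_grid:
              a = np.exp(-2j * np.pi * freq * p * station_x)  # steering vector
              spectrum.append(1.0 / np.real(a.conj() @ En @ En.conj().T @ a))
          return np.array(spectrum)                  # peaks at source slownesses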

  15. External force back-projective composition and globally deformable optimization for 3-D coronary artery reconstruction

    The clinical value of the 3D reconstruction of a coronary artery is important for the diagnosis and intervention of cardiovascular diseases. This work proposes a method based on a deformable model for reconstructing coronary arteries from two monoplane angiographic images acquired from different angles. First, an external force back-projective composition model is developed to determine the external force, for which the force distributions in different views are back-projected to the 3D space and composited in the same coordinate system based on the perspective projection principle of x-ray imaging. The elasticity and bending forces are composited as an internal force to maintain the smoothness of the deformable curve. Second, the deformable curve evolves rapidly toward the true vascular centerlines in 3D space and angiographic images under the combination of internal and external forces. Third, densely matched correspondence among vessel centerlines is constructed using a curve alignment method. The bundle adjustment method is then utilized for the global optimization of the projection parameters and the 3D structures. The proposed method is validated on phantom data and routine angiographic images with consideration for space and re-projection image errors. Experimental results demonstrate the effectiveness and robustness of the proposed method for the reconstruction of coronary arteries from two monoplane angiographic images. The proposed method can achieve a mean space error of 0.564 mm and a mean re-projection error of 0.349 mm. (paper)

  16. Imaging the ruptures of the 2009 Samoan and Sumatran earthquakes using broadband network back-projections: Results and limitations

    Hutko, A. R.; Lay, T.; Koper, K. D.

    2009-12-01

    Applications of teleseismic P-wave back-projection to image gross characteristics of large earthquake finite-source ruptures have been enabled by ready availability of large digital data sets. Imaging with short-period data from dense arrays or broadband data from global networks can place constraints on rupture attributes that otherwise have to be treated parametrically in conventional modeling and inversion procedures. Back-projection imaging may constrain choice of fault plane and rupture direction, velocity, duration and length for large (M>~8.0) earthquakes, and can robustly locate early aftershocks embedded in mainshock surface waves. Back-projection methods seek locations of coherent energy release from the source region, ideally associated with down-going P wave energy. For shallow events, depth phase arrivals can produce artifacts in back-projection images that appear as secondary or even prominent features with incorrect apparent source locations and times, and such effects need to be recognized. We apply broadband P-wave back-projection imaging to the 29 September 2009 Samoa (Mw8.2) and 30 September 2009 Sumatra (Mw7.6) earthquakes using data from globally distributed broadband stations and compare results to back-projections of synthetic seismograms from finite-source models for these events to evaluate the artifacts from depth phases. Back-projection images for the great normal-faulting Samoa event feature two prominent bright spots, which could be interpreted to correspond to two distinct slip patches, one near the epicenter in the outer trench slope and the other approximately 80 km to the west near the plate boundary megathrust where many aftershocks occurred. This interpretation is at odds with finite-fault modeling results, which indicate a predominantly bilateral rupture in the NW-SE direction on a steeply dipping trench slope fault, with rupture extending about 60 km in each direction. Back-projections of data and synthetic seismograms from the finite-fault modeling with common source-receiver geometries are nearly identical, with both having the two prominent bright regions as well as many secondary features. The prominent feature to the west is an artifact resulting from constructive interference of azimuthally varying depth phases; this coincidentally projects in the same region as many aftershocks. Similar analysis for the Sumatra event yields back-projections of data and synthetics for a finite-fault model that closely match and indicate a very compact source with virtually all of the coherent radiation located within 20km of the epicenter. Weak features in the back-projections with arrival times consistent with pP and sP are visible offset from the epicenter. Back-projection methods can be useful for constraining aspects of large earthquake rupture processes, particularly if the variation of waveforms across the imaging network is small, but it is always important to assess what features of the back-projections are artifacts from the path geometry or depth phases to avoid misinterpreting the images, particularly when using globally distributed stations rather than large-aperture arrays.

  17. Visual Effects for Stereoscopic 3D Contents: Experiences from the Don't Look Back-project

    Matilainen, Tapio

    2012-01-01

    Visual effects for stereoscopic 3D contents, experiences from the Don't Look Back project, is a description of the workflow and phases I went through in creating visual effects for stereoscopic 3D footage filmed earlier in 2010. The work includes two shots from the Don't Look Back project, in which I had the opportunity to create computer-generated imagery and visual effects. Stereoscopic three-dimensional (s3D) in cinema refers to films which are viewed with specific glasses on. S3D material is create...

  18. Parathyroid scintigraphy - benefit of early and late SPECT with iterative reconstruction (IR) versus filtered back projection (FBP)

    The published literature is conflicting on the benefit of parathyroid SPECT. This study evaluates the incremental benefit of SPECT over planar imaging, comparing both the timing of SPECT and the processing technique used. Over 1 year, patients referred for parathyroid scintigraphy were studied with conventional dual-tracer and dual-phase planar imaging using 50 MBq of Tc-99m pertechnetate and 800 MBq of Tc-99m MIBI. SPECT was performed both after the initial and the 2 hour planar images, and processed using both FBP and IR. All studies were randomised and read as planar, planar+FBP, then planar+FBP+IR for both early and late SPECT by 2 blinded readers. Focal abnormalities were scored on a 5 point scale, with scores of 4 and 5 being called positive. In cases of observer disagreement a third blinded reader was used. Surgical follow up was available in 16 of 33 patients. 2 were surgically non-curative and excluded (including one probable scintigraphic mediastinal adenoma not located at surgery). Sensitivity and ROC analyses were performed to evaluate the incremental benefit of FBP and IR SPECT over planar imaging. No significant difference was found between early and late SPECT. ROC analysis of individual readers showed improved accuracy of SPECT over planar imaging for one of the two readers. IR SPECT improves sensitivity without loss of specificity compared to planar imaging. Late SPECT shows no additional benefit. Copyright (2002) The Australian and New Zealand Society of Nuclear Medicine Inc

  19. A comparative study between matched and mis-matched projection/back projection pairs used with ASIRT reconstruction method

    Guedouar, R., E-mail: raja_guedouar@yahoo.f [Higher School of Health Sciences and Techniques of Monastir, Av. Avicenne, 5060 Monastir, B.P. 128 (Tunisia); Zarrad, B., E-mail: boubakerzarrad@yahoo.f [Higher School of Health Sciences and Techniques of Monastir, Av. Avicenne, 5060 Monastir, B.P. 128 (Tunisia)

    2010-07-21

    For algebraic reconstruction techniques, both forward and back projection operators are needed. The ability to perform accurate reconstruction relies fundamentally on the forward and back projection methods, which are usually the transpose of each other. Even though mis-matched pairs may introduce additional errors during the iterative process, the usefulness of mis-matched projector/back-projector pairs has been demonstrated in image reconstruction. This work investigates the performance of matched and mis-matched reconstruction pairs using popular forward projectors and their transposes in reconstruction tasks with additive simultaneous iterative reconstruction techniques (ASIRT) in a parallel-beam approach. Simulated noiseless phantoms are used to compare the performance of the investigated pairs in terms of the root mean squared error (RMSE), calculated between reconstructed slices and the reference in different regions. Results show that mis-matched projection/back projection pairs can yield more accurate reconstructed images than matched ones. The performance of the forward projection operator seems independent of the choice of the back projection operator and vice versa.
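
    A toy sketch of a simultaneous iterative scheme in the spirit of ASIRT, in which the back projector B may differ from the matched transpose A^T, so matched and mis-matched pairs can be compared by RMSE. The random system, step size, and normalization are illustrative simplifications, not the paper's projectors.

      import numpy as np

      def sirt(A, b, n_iter, B=None, lam=0.05):
          """Simultaneous iteration x <- x + lam * (B @ (b - A @ x)) / norm."""
          if B is None:
              B = A.T                                    # matched pair
          norm = np.maximum(B @ np.ones(A.shape[0]), 1e-12)
          x = np.zeros(A.shape[1])
          for _ in range(n_iter):
              x += lam * (B @ (b - A @ x)) / norm        # simultaneous update
          return x

      # Compare a matched pair (B = A.T) with a slightly mis-matched one.
      rng = np.random.default_rng(0)
      A = rng.random((80, 40))
      x_true = rng.random(40)
      b = A @ x_true
      for B in (None, A.T + 0.01 * rng.random((40, 80))):
          x = sirt(A, b, 200, B)
          print(np.sqrt(np.mean((x - x_true) ** 2)))     # RMSE for each pair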

  1. Resolution Analysis of finite fault inversions: A back-projection approach.

    Ji, C.; Shao, G.

    2007-12-01

    The resolution of inverted source models of large earthquakes is controlled by the frequency content of "coherent" (or "useful") seismic observations and their spatial distribution. However, it is difficult to distinguish whether features that persist across different inversions are really required by the data or are a consequence of "prior" information, such as velocity structures, fault geometry, and model parameterizations. Here, we investigate the spatial resolution of the model by first back projecting and stacking the data at the source region and then analyzing the spatial-temporal variations of the focusing regions, which we arbitrarily define as the regions with 90% of the peak focusing amplitude. Our preliminary results indicate that 1) the spatial-temporal resolution in a particular direction is controlled by the range of the directivity parameter [p cos(θ)] within the seismic network, where p is the horizontal slowness from the hypocenter and θ is the difference between the station azimuth and this orientation; the network aperture is therefore more important than the number of stations. 2) Simple stacking is a robust method to capture the asperities, but the sizes of the focusing regions are usually much larger than what the data could resolve; carefully weighting the data before stacking can enhance the spatial resolution in a particular direction. 3) Results based on the teleseismic P waves of a local network usually suffer from the trade-off between the source's spatial location and its rupture time. The resolution of the 2001 Kunlunshan earthquake and the 2006 Kuril Islands earthquake will be investigated.

  2. Importance of point-by-point back projection correction for isocentric motion in digital breast tomosynthesis: Relevance to morphology of structures such as microcalcifications

    Digital breast tomosynthesis is a three-dimensional imaging technique that provides an arbitrary set of reconstruction planes in the breast from a limited-angle series of projection images acquired while the x-ray tube moves. Traditional shift-and-add (SAA) tomosynthesis reconstruction is a common mathematical method to line up each projection image based on its shifting amount to generate reconstruction slices. With parallel-path geometry of tube motion, the path of the tube lies in a plane parallel to the plane of the detector. The traditional SAA algorithm gives shift amounts for each projection image calculated only along the direction of x-ray tube movement. However, with the partial isocentric motion of the x-ray tube in breast tomosynthesis, small objects such as microcalcifications appear blurred (for instance, about 1-4 pixels of blur for a microcalcification in a human breast) in traditional SAA images in the direction perpendicular to the direction of tube motion. Some digital breast tomosynthesis algorithms reported in the literature utilize a traditional one-dimensional SAA method that is not wholly suitable for isocentric motion. In this paper, a point-by-point back projection (BP) method is described and compared with traditional SAA for the important clinical task of evaluating the morphology of small objects such as microcalcifications. Impulse responses at different three-dimensional locations with five different combinations of imaging acquisition parameters were investigated. Reconstruction images of microcalcifications in a human subject were also evaluated. Results showed that with traditional SAA and a 45 deg. view angle of tube movement with respect to the detector, at the same height above the detector, the in-plane blur artifacts were obvious for objects farther away from the x-ray source. In a human subject, the appearance of calcifications was blurred in the direction orthogonal to the tube motion with traditional SAA. With point-by-point BP, the appearance of calcifications was sharper. The point-by-point BP method demonstrated improved rendition of microcalcifications in the direction perpendicular to the tube motion direction. With wide angles or for imaging of larger breasts, this point-by-point BP rather than the traditional SAA should also be considered as the basis of further deblurring algorithms that work in conjunction with the BP method.
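
    The point-by-point idea can be sketched as follows: for every voxel and every view, the ray from the actual source position through the voxel is extended to the detector plane and the projection is sampled there, so the partial isocentric geometry is honored instead of a single one-dimensional shift. The flat-detector coordinates, nearest-neighbour sampling, and all names below are simplifying assumptions.

      import numpy as np

      def point_by_point_bp(projections, src_positions, voxels, du, dv):
          """Point-by-point back projection for tomosynthesis (sketch).

          projections   : (n_views, nv, nu) detector images in the plane z = 0
          src_positions : (n_views, 3) x-ray source positions along the arc
          voxels        : (n_vox, 3) reconstruction points, 0 < z < source z
          """
          recon = np.zeros(len(voxels))
          for proj, s in zip(projections, src_positions):
              # extend the source-to-voxel ray down to the detector plane z = 0
              f = s[2] / (s[2] - voxels[:, 2])
              u = s[0] + f * (voxels[:, 0] - s[0])
              v = s[1] + f * (voxels[:, 1] - s[1])
              iu = np.clip(np.round(u / du).astype(int), 0, proj.shape[1] - 1)
              iv = np.clip(np.round(v / dv).astype(int), 0, proj.shape[0] - 1)
              recon += proj[iv, iu]                # nearest-neighbour sampling
          return recon / len(projections)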

  3. GPU-accelerated back-projection revisited. Squeezing performance by careful tuning

    In recent years, GPUs have become an increasingly popular tool in computed tomography (CT) reconstruction. In this paper, we discuss performance optimization techniques for a GPU-based filtered-backprojection reconstruction implementation. We explore the different optimization techniques we used and explain how those techniques affected performance. Our results show a nearly 50% increase in performance when compared to the current top ranked GPU implementation. (orig.)

  4. Mitigating artifacts in back-projection source imaging with implications for frequency-dependent properties of the Tohoku-Oki earthquake

    Meng, Lingsen; Ampuero, Jean-Paul; Luo, Yingdi; Wu, Wenbo; Ni, Sidao

    2012-12-01

    Comparing teleseismic array back-projection source images of the 2011 Tohoku-Oki earthquake with results from static and kinematic finite source inversions has revealed little overlap between the regions of high- and low-frequency slip. Motivated by this interesting observation, back-projection studies extended to intermediate frequencies, down to about 0.1 Hz, have suggested that a progressive transition of rupture properties as a function of frequency is observable. Here, by adapting the concept of array response function to non-stationary signals, we demonstrate that the "swimming artifact", a systematic drift resulting from signal non-stationarity, induces significant bias on beamforming back-projection at low frequencies. We introduce a "reference window strategy" into the multitaper-MUSIC back-projection technique and significantly mitigate the "swimming artifact" at high frequencies (1 s to 4 s). At lower frequencies, this modification yields notable, but significantly smaller, artifacts than time-domain stacking. We perform extensive synthetic tests that include a 3D regional velocity model for Japan. We analyze the recordings of the Tohoku-Oki earthquake at the USArray and at the European array at periods from 1 s to 16 s. The migration of the source location as a function of period, regardless of the back-projection methods, has characteristics that are consistent with the expected effect of the "swimming artifact". In particular, the apparent up-dip migration as a function of frequency obtained with the USArray can be explained by the "swimming artifact". This indicates that the most substantial frequency-dependence of the Tohoku-Oki earthquake source occurs at periods longer than 16 s. Thus, low-frequency back-projection needs to be further tested and validated in order to contribute to the characterization of frequency-dependent rupture properties.

  5. Images of Gravitational and Magnetic Phenomena Derived from 2D Back-Projection Doppler Tomography of Interacting Binary Stars

    Richards, Mercedes T; Fisher, John G; Conover, Marshall J

    2014-01-01

    We have used 2D back-projection Doppler tomography as a tool to examine the influence of gravitational and magnetic phenomena in interacting binaries which undergo mass transfer from a magnetically-active star onto a non-magnetic main sequence star. This multi-tiered study of over 1300 time-resolved spectra of 13 Algol binaries involved calculations of the predicted dynamical behavior of the gravitational flow and the dynamics at the impact site, analysis of the velocity images constructed from tomography, and the influence on the tomograms of orbital inclination, systemic velocity, orbital coverage, and shadowing. The Hα tomograms revealed eight sources: chromospheric emission, a gas stream along the gravitational trajectory, a star-stream impact region, a bulge of absorption or emission around the mass-gaining star, a Keplerian accretion disk, an absorption zone associated with hotter gas, a disk-stream impact region, and a hot spot where the stream strikes the edge of a disk. We described several me...

  6. Electrical capacitance tomography two-phase oil-gas pipe flow imaging by the linear back-projection algorithm

    R. Martin

    2005-12-01

    Electrical Capacitance Tomography (ECT) is a novel technology that can deal with the complexity of two-phase gas-oil flow measurement by explicitly deriving the component distributions on two adjacent planes along a pipeline. One of its most promising applications is the visualization of gas-oil flows. ECT offers some advantages over other tomography modalities, such as no radiation, rapid response, low cost, being non-intrusive and non-invasive, and the ability to withstand high temperature and high pressure. The linear back-projection (LBP) algorithm is one of the most popular methods employed to perform image reconstruction in ECT. Despite its relatively poor accuracy, it is a simple and fast procedure capable of real-time operation in many applications, and it has remained a very popular choice. However, since it was first reported it has lacked clear formal support in the context of this application. Its only justification has been that it was an adaptation of a method normally used in linear X-ray medical tomography, and the fact that it actually does produce useful (albeit only ‘qualitative’) images. In this paper, one illustrative way of interpreting LBP is presented. It is shown that LBP is actually based on the linearisation of a normalised form of the forward problem. More specifically, the normalised forward problem is approximated by means of a series of hyper-planes. The reconstruction matrix used in LBP is found to be a ‘weighted’ transpose of the linear operator (matrix) that defines the linearised normalised forward problem. The rows of this latter matrix contain the information of the sensitivity maps used in LBP.
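
    In matrix terms, the interpretation above says that LBP applies a normalized ('weighted') transpose of the sensitivity-map matrix to the normalized capacitance data. A minimal sketch of that operation, with hypothetical names and a dense array standing in for the sensitivity maps:

      import numpy as np

      def lbp_reconstruct(cap_norm, S):
          """Linear back-projection for ECT.

          cap_norm : (n_meas,) normalized inter-electrode capacitances in [0, 1]
          S        : (n_meas, n_pix) sensitivity map of every electrode pair
          """
          g = S.T @ cap_norm                         # back-project measurements
          weight = np.maximum(S.T @ np.ones(S.shape[0]), 1e-12)
          return g / weight                          # pixel-wise normalization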

  8. Images of gravitational and magnetic phenomena derived from two-dimensional back-projection Doppler tomography of interacting binary stars

    Richards, Mercedes T.; Cocking, Alexander S.; Fisher, John G.; Conover, Marshall J., E-mail: mrichards@astro.psu.edu, E-mail: asc5097@psu.edu [Department of Astronomy and Astrophysics, Pennsylvania State University, University Park, PA 16802 (United States)

    2014-11-10

    We have used two-dimensional back-projection Doppler tomography as a tool to examine the influence of gravitational and magnetic phenomena in interacting binaries that undergo mass transfer from a magnetically active star onto a non-magnetic main-sequence star. This multitiered study of over 1300 time-resolved spectra of 13 Algol binaries involved calculations of the predicted dynamical behavior of the gravitational flow and the dynamics at the impact site, analysis of the velocity images constructed from tomography, and the influence on the tomograms of orbital inclination, systemic velocity, orbital coverage, and shadowing. The Hα tomograms revealed eight sources: chromospheric emission, a gas stream along the gravitational trajectory, a star-stream impact region, a bulge of absorption or emission around the mass-gaining star, a Keplerian accretion disk, an absorption zone associated with hotter gas, a disk-stream impact region, and a hot spot where the stream strikes the edge of a disk. We described several methods used to extract the physical properties of the emission sources directly from the velocity images, including S-wave analysis, the creation of simulated velocity tomograms from hydrodynamic simulations, and the use of synthetic spectra with tomography to sequentially extract the separate sources of emission from the velocity image. In summary, the tomography images have revealed results that cannot be explained solely by gravitational effects: chromospheric emission moving with the mass-losing star, a gas stream deflected from the gravitational trajectory, and alternating behavior between stream state and disk state. Our results demonstrate that magnetic effects cannot be ignored in these interacting binaries.

  9. Digital Filters for Low Frequency Equalization

    Tyril, Marni; Abildgaard, J.; Rubak, Per

    Digital filters with high resolution in the low-frequency range are studied. Specifically, for a given computational power, traditional IIR filters are compared with warped FIR filters, warped IIR filters, and modified warped FIR filters termed warped individual z FIR filters (WizFIR). The results...

  10. Generalized Nonlinear Complementary Attitude Filter

    Jensen, Kenneth

    2011-01-01

    This work describes a family of attitude estimators that are based on a generalization of Mahony's nonlinear complementary filter. This generalization reveals the close mathematical relationship between the nonlinear complementary filter and the more traditional multiplicative extended Kalman filter. In fact, the bias-free and constant gain multiplicative continuous-time extended Kalman filters may be interpreted as special cases of the generalized attitude estimator. The correspondence provides a rational means of choosing the gains for the nonlinear complementary filter and a proof of the near global asymptotic stability of special cases of the multiplicative extended Kalman filter.
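
    To show the complementary structure in its simplest form, the sketch below fuses an integrated rate gyro with a noisy absolute angle through a proportional correction term. This is a single-axis reduction with hypothetical names, not Mahony's full quaternion filter or the generalized estimator of the paper.

      import numpy as np

      def complementary_filter(gyro, accel_angle, dt, k=1.0):
          """Single-axis complementary attitude filter (sketch).

          The gyro is integrated for high-frequency response while the noisy
          accelerometer angle slowly corrects the drift via the gain k.
          """
          theta, out = accel_angle[0], []
          for w, a in zip(gyro, accel_angle):
              theta += dt * (w + k * (a - theta))  # propagate + proportional fix
              out.append(theta)
          return np.array(out)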

  11. Rupture process of the 2011 off the Pacific coast of Tohoku Earthquake ( M w 9.0) as imaged with back-projection of teleseismic P-waves

    Wang, Dun; Mori, Jim

    2011-07-01

    We use the back-projection method, with data recorded on the dense USArray network, to estimate the rupture propagation for the M w 9.0 earthquake that occurred offshore of the Tohoku region, Japan. The results show a variable rupture propagation ranging from about 1.0 to 3.0 km/s for the high-frequency radiation. The rupture propagates over about 450 km in approximately 150 s. Based on the rupture speed and direction, the high-frequency source process can be divided into two parts. The first part has a relatively slow rupture speed of 1.0 to 1.5 km/s and propagates northwestward. In the second part, the rupture progresses southwestward starting with a slow speed of about 1.5 km/s and accelerating to about 3.0 km/s. We see three large pulses at 30 s, 80 s and 130 s. The first two, including the largest second pulse, were located 50 to 70 km northwest of the epicenter. The third occurred about 250 km southwest of the epicenter. The variability of rupture velocity may be associated with significant changes of physical properties along the fault plane. Areas of low/high rupture speed are associated with large/small energy releases on the fault plane. These variations may reflect the strength properties along the fault. Also, locations of the high-frequency radiation derived from the back-projection analysis are significantly different from the areas of very large slip for this earthquake.

  12. Filtering apparatus

    Filtering apparatus for removing toxic particles from a fluid includes a sealed vessel having a fluid inlet and a fluid outlet between which is an array of elongate filter elements. The filter elements are so disposed that when viewed in the direction of their length the axis of each filter element is spaced from that of each adjacent filter element by a distance R whereby the axes lie at the apexes of a plurality of equilateral triangles. The axes of six of the filter elements are equispaced about a central pitch circle whose radius is R and at the centre of which is the axis of a seventh filter element. A portion of the wall of the vessel is removable to permit access to the filter elements and each filter element has means to locate a machine for remotely exchanging the filter elements. (author)

  13. Air filter

    An air filter consists of an upright cylinder of corrugated or pleated filter fabric, joined at its upper end to a tubular right-angled elbow. The open end of the elbow includes an internal lip seal, so the elbow can be slid onto a horizontal spigot in an air filter unit. The filter can be cleaned by subjecting the fabric to a reverse pressure pulse from a nozzle. The construction facilitates removal of the filter into a plastic bag secured round a frame behind a door, when the unit is used to filter radioactive dust from air. (author)

  14. Cold Crystal Reflector Filter Concept

    Muhrer, G

    2014-01-01

    In this paper the theoretical concept of a cold crystal reflector filter will be presented. The aim of this concept is to balance the shortcoming of the traditional cold polycrystalline reflector filter, which lies in the significant reduction of the neutron flux right above (in energy space) or right below (in wavelength space) the first Bragg edge.

  15. High security and robust optical image encryption approach based on computer-generated integral imaging pickup and iterative back-projection techniques

    Li, Xiao Wei; Cho, Sung Jin; Kim, Seok Tae

    2014-04-01

    In this paper, a novel optical image encryption algorithm combining the computer-generated integral imaging (CGII) pickup technique and the iterative back-projection (IBP) technique is proposed. In this scheme, a color image to be encrypted is first segregated into three channels: red, green, and blue. Each of these three channels is independently captured using a virtual pinhole array and computationally transformed into a sub-image array. Then, each of the three sub-image arrays is scrambled by the Fibonacci transformation (FT) algorithm and encrypted by hybrid cellular automata (HCA). Ultimately, the three encrypted images are combined to produce the colored encrypted image. In the reconstruction process, because computational integral imaging reconstruction (CIIR) is a pixel-overlapping reconstruction technique, interference from adjacent pixels decreases the quality of the reconstructed image. To address this problem, we introduce an image super-resolution reconstruction technique: the image can be computationally reconstructed by the IBP technique. Numerical simulations are performed to test the validity and capability of the proposed image encryption algorithm.

  16. Portable Wideband Microwave Imaging System for Intracranial Hemorrhage Detection Using Improved Back-projection Algorithm with Model of Effective Head Permittivity

    Mobashsher, Ahmed Toaha; Mahmoud, A.; Abbosh, A. M.

    2016-02-01

    Intracranial hemorrhage is a medical emergency that requires rapid detection and medication to restrict any brain damage to minimal. Here, an effective wideband microwave head imaging system for on-the-spot detection of intracranial hemorrhage is presented. The operation of the system relies on the dielectric contrast between healthy brain tissues and a hemorrhage that causes a strong microwave scattering. The system uses a compact sensing antenna, which has an ultra-wideband operation with directional radiation, and a portable, compact microwave transceiver for signal transmission and data acquisition. The collected data is processed to create a clear image of the brain using an improved back projection algorithm, which is based on a novel effective head permittivity model. The system is verified in realistic simulation and experimental environments using anatomically and electrically realistic human head phantoms. Quantitative and qualitative comparisons between the images from the proposed and existing algorithms demonstrate significant improvements in detection and localization accuracy. The radiation and thermal safety of the system are examined and verified. Initial human tests are conducted on healthy subjects with different head sizes. The reconstructed images are statistically analyzed and absence of false positive results indicate the efficacy of the proposed system in future preclinical trials.

  17. Improved resolution and reduced clutter in ultra-wideband microwave imaging using cross-correlated back projection: experimental and numerical results.

    Jacobsen, S; Birkelund, Y

    2010-01-01

    Microwave breast cancer detection is based on the dielectric contrast between healthy and malignant tissue. This radar-based imaging method involves illumination of the breast with an ultra-wideband pulse. Detection of tumors within the breast is achieved by some selected focusing technique. Image formation algorithms are tailored to enhance tumor responses and reduce early-time and late-time clutter associated with skin reflections and heterogeneity of breast tissue. In this contribution, we evaluate the performance of the so-called cross-correlated back projection imaging scheme by using a scanning system in phantom experiments. Supplementary numerical modeling based on commercial software is also presented. The phantom is synthetically scanned with a broadband elliptical antenna in a mono-static configuration. The respective signals are pre-processed by a data-adaptive RLS algorithm in order to remove artifacts caused by antenna reverberations and signal clutter. Successful detection of a 7 mm diameter cylindrical tumor immersed in a low permittivity medium was achieved in all cases. Selecting the widely used delay-and-sum (DAS) beamforming algorithm as a benchmark, we show that correlation based imaging methods improve the signal-to-clutter ratio by at least 10 dB and improves spatial resolution through a reduction of the imaged peak full-width half maximum (FWHM) of about 40-50%. PMID:21331362

  19. GMTI processing using back projection.

    Doerry, Armin Walter

    2013-07-01

    Backprojection has long been applied to SAR image formation. It has equal utility in forming the range-velocity maps for Ground Moving Target Indicator (GMTI) radar processing. In particular, it overcomes the problem of targets migrating through range resolution cells.

  20. Filter element

    A filter element for a precoat filter has a septum formed with longitudinal pleats having broad roots proximal to a perforate core and narrow tips distal from the core. The element may include circumferential supporting bands spaced axially along the filter element and constraining the septum tips. In use a deposit of mechanical or ion exchange precoat is applied to the element before filtering and later removed by backwashing. (author)

  1. Rectifier Filters

    Y. A. Bladyko

    2014-07-01

    The paper contains a definition of a smoothing factor that is suitable for any rectifier filter. Formulae for complex smoothing factors have been developed for simple and complex passive filters. The paper shows the conditions for application of the calculation formulae and filters.

  2. Optimal filtering

    Anderson, Brian D O

    2005-01-01

    This graduate-level text augments and extends beyond undergraduate studies of signal processing, particularly in regard to communication systems and digital filtering theory. Vital for students in the fields of control and communications, its contents are also relevant to students in such diverse areas as statistics, economics, bioengineering, and operations research.Topics include filtering, linear systems, and estimation; the discrete-time Kalman filter; time-invariant filters; properties of Kalman filters; computational aspects; and smoothing of discrete-time signals. Additional subjects e

  3. Rupture Processes of the Mw8.3 Sea of Okhotsk Earthquake and Aftershock Sequences from 3-D Back Projection Imaging

    Jian, P. R.; Hung, S. H.; Meng, L.

    2014-12-01

    On May 24, 2013, the largest deep earthquake ever recorded occurred near the southern tip of the Kamchatka Peninsula, where the Pacific Plate subducts underneath the Okhotsk Plate. Previous 2D beamforming back projection (BP) of P-coda waves suggests the mainshock ruptured bilaterally along a horizontal fault plane determined by the global centroid moment tensor solution. On the other hand, multiple point source inversion of P and SH waveforms argued that the earthquake comprises a sequence of 6 subevents that are not located on a single plane but distributed in a zone that extends 64 km horizontally and 35 km in depth. We therefore apply a three-dimensional MUSIC BP approach to resolve the rupture processes of the mainshock and two large aftershocks (M6.7) with no a priori assumption of a preferential orientation of the planar rupture. The maximum pseudo-spectrum of high-frequency P waves in a sequence of time windows recorded by the densely distributed stations of the US and EU arrays is used to image the 3-D temporal and spatial rupture distribution. The resulting image confirms a nearly N-S striking rupture with two antiparallel stages. The first, subhorizontal rupture initially propagates toward the NNE, while 18 s later the rupture reverses to the SSW and concurrently shifts to about 35 km greater depth, lasting for about 20 s. The rupture lengths of the first NNE-ward and second SSW-ward stages are about 30 km and 85 km, and the estimated rupture velocities are 3 km/s and 4.25 km/s, respectively. Synthetic experiments are undertaken to assess the capability of the 3D MUSIC BP to recover spatio-temporal rupture processes. In addition, high-frequency BP images based on the EU-Array data show that the two M6.7 aftershocks more likely ruptured on vertical fault planes.

  4. Improving back projection imaging with a novel physics-based aftershock calibration approach: A case study of the 2015 Gorkha earthquake

    Meng, Lingsen; Zhang, Ailin; Yagi, Yuji

    2016-01-01

    The 2015 Mw 7.8 Nepal-Gorkha earthquake with casualties of over 9000 people was the most devastating disaster to strike Nepal since the 1934 Nepal-Bihar earthquake. Its rupture process was imaged by teleseismic back projections (BP) of seismograms recorded by three, large regional networks in Australia, North America, and Europe. The source images of all three arrays reveal a unilateral eastward rupture; however, the propagation directions and speeds differ significantly between the arrays. To understand the spatial uncertainties of the BP analyses, we analyze four moderate size aftershocks recorded by all three arrays exactly as had been conducted for the main shock. The apparent source locations inferred from BPs are systematically biased from the catalog locations, as a result of a slowness error caused by three-dimensional Earth structures. We introduce a physics-based slowness correction that successfully mitigates the source location discrepancies among the arrays. Our calibrated BPs are found to be mutually consistent and reveal a unilateral rupture propagating eastward at a speed of 2.7 km/s, localized in a relatively narrow and deep swath along the downdip edge of the locked Himalayan thrust zone. We find that the 2015 Gorkha earthquake was a localized rupture that failed to break the entire Himalayan décollement to the surface, which can be regarded as an intermediate event during the interseismic period of larger Himalayan ruptures that break the whole seismogenic zone width. Thus, our physics-based slowness correction is an important technical improvement of BP, mitigating spatial uncertainties and improving the robustness of single and multiarray studies.

  5. Application of circular filter inserts

    High efficiency particulate air (HEPA) filters are used in the ventilation of nuclear plant as passive clean-up devices. Traditionally, the work-horse of the industry has been the rectangular HEPA filter. An assessment of the problems associated with remote handling, changing, and disposal of these rectangular filters suggested that significant advantages to filtration systems could be obtained by the adoption of HEPA filters with circular geometry for both new and existing ventilation plants. This paper covers the development of circular geometry filters and highlights the advantages of this design over their rectangular counterparts. The work has resulted in a range of commercially available filters for flows from 45 m3/h up to 3400 m3/h. This paper also covers the development of a range of sizes and types of housings that employ simple change techniques which take advantage of the circular geometry. The systems considered here have been designed in response to the requirements for shielded (remote filter change) and for unshielded facilities (potentially for bag changing of filters). Additionally the designs have allowed for the possibility of retrofitting circular geometry HEPA filters in place of the rectangular geometry filter

  6. Filter assemblies

    A filter assembly for the nuclear industry comprises a plurality of tubular filters welded at one end to a plenum chamber which is made of plastics by rotational moulding and includes an outlet. The other ends of the filters are closed and supported by a plate attached to the plenum chamber by tie rods. A central rod screws into a captive nut at one end and has a fitting at the other to facilitate remote handling. The assembly is cheap and destructible after use. (author)

  7. Filter This

    Audrey Barbakoff

    2011-03-01

    In the Library with the Lead Pipe welcomes Audrey Barbakoff, a librarian at the Milwaukee Public Library, and Ahniwa Ferrari, Virtual Experience Manager at the Pierce County Library System in Washington, for a point-counterpoint piece on filtering in libraries. The opinions expressed here are those of the authors and are not endorsed by their employers. [...]

  8. Particle filters

    Künsch, Hans R.

    2013-01-01

    This is a short review of Monte Carlo methods for approximating filter distributions in state space models. The basic algorithm and different strategies to reduce imbalance of the weights are discussed. Finally, methods for more difficult problems like smoothing and parameter estimation and applications outside the state space model context are presented.
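
    A minimal bootstrap particle filter with multinomial resampling, the baseline algorithm of this literature, is sketched below; the scalar state-space model and noise levels are illustrative assumptions:

      import numpy as np

      rng = np.random.default_rng(0)

      def bootstrap_pf(y, n_particles=500, sigma_x=1.0, sigma_y=1.0):
          """Bootstrap particle filter for x_t = 0.9 x_{t-1} + N(0, sigma_x^2),
          y_t = x_t + N(0, sigma_y^2); returns the filtered means."""
          x = rng.normal(0.0, 1.0, n_particles)                    # initial particle cloud
          means = []
          for yt in y:
              x = 0.9 * x + rng.normal(0.0, sigma_x, n_particles)  # propagate particles
              logw = -0.5 * ((yt - x) / sigma_y) ** 2              # log-likelihood weights
              w = np.exp(logw - logw.max())
              w /= w.sum()
              means.append(np.sum(w * x))
              x = x[rng.choice(n_particles, n_particles, p=w)]     # multinomial resampling
          return np.array(means)

    Resampling is the simplest strategy against weight imbalance; the review discusses lower-variance alternatives.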

  9. Data assimilation the ensemble Kalman filter

    Evensen, Geir

    2006-01-01

    Covers data assimilation and inverse methods, including both traditional state estimation and parameter estimation. This text and reference work focuses on various popular data assimilation methods, such as weak- and strong-constraint variational methods and ensemble filters and smoothers.

  10. Optimal filter systems for photometric redshift estimation

    Benitez, N.; Moles, M.; Aguerri, J. A. L.; Alfaro, E.; Broadhurst, T.; Cabrera, J.; Castander, F. J.; Cepa, J.; Cervino, M.; Cristobal-Hornillos, D.; Fernandez-Soto, A.; Gonzalez-Delgado, R. M.; Infante, L; Marquez, I.; Martinez, V. J.

    2008-01-01

    In the coming years, several cosmological surveys will rely on imaging data to estimate the redshift of galaxies, using traditional filter systems with 4-5 optical broad bands; narrower filters improve the spectral resolution, but strongly reduce the total system throughput. We explore how photometric redshift performance depends on the number of filters n_f, characterizing the survey depth through the fraction of galaxies with unambiguous redshift estimates. For a combination of total exposure...

  11. Water Filter

    1982-01-01

    A compact, lightweight electrolytic water sterilizer, available through Ambassador Marketing, generates silver ions in concentrations of 50 to 100 parts per billion in a water flow system. The silver ions serve as an effective bactericide/deodorizer. Tap water passes through a filtering element of silver that has been chemically plated onto activated carbon. The silver inhibits bacterial growth, and the activated carbon removes objectionable tastes and odors caused by the addition of chlorine and other chemicals to the municipal water supply. The three models available are a kitchen unit, a "Tourister" unit for portable use while traveling, and a refrigerator unit that attaches to the ice cube water line. A filter will treat 5,000 to 10,000 gallons of water.

  12. Energy filter

    An energy filter, for enclosing a Geiger Mueller tube, comprises a copper sheath, open at one end and possessing an aperture at the other, surrounded by a discontinuous jacket of 60-40 tin-lead alloy. The latter, assisted by the copper sheath, reduces those radiant frequencies at which the Geiger Mueller tube is most sensitive, enabling the tube to produce a response which is substantially independent of the energy of the incoming radiation. 60-40 tin-lead alloy is a commercial solder of eutectic composition. Consequently the filter is readily constructed by casting the tin-lead parts, either directly onto the copper sheath, or apart from it, for later assembly. The particular configuration of the alloy jacket, comprising two axially spaced rings and one end disc, and the sheath aperture, are chosen to give a circular polar response as nearly as possible. (author)

  13. Robust filtering for uncertain systems a parameter-dependent approach

    Gao, Huijun

    2014-01-01

    This monograph provides the reader with a systematic treatment of robust filter design, a key issue in systems, control and signal processing, because the inevitable presence of uncertainty in system and signal models often degrades the filtering performance and may even cause instability. The methods described are therefore not subject to the rigorous assumptions of traditional Kalman filtering. The monograph is concerned with robust filtering for various dynamical systems with parametric uncertainties, and focuses on parameter-dependent approaches to filter design. Classical filtering schemes, like H2 filtering and H∞ filtering, are addressed, and emerging issues such as robust filtering with constraints on communication channels and signal frequency characteristics are discussed. The text features: design approaches to robust filters arranged according to complexity level, emphasizing robust filtering in the parameter-dependent framework for the first time; ...

  14. Digital filters

    Hamming, Richard W

    1997-01-01

    Digital signals occur in an increasing number of applications: in telephone communications; in radio, television, and stereo sound systems; and in spacecraft transmissions, to name just a few. This introductory text examines digital filtering, the processes of smoothing, predicting, differentiating, integrating, and separating signals, as well as the removal of noise from a signal. The processes bear particular relevance to computer applications, one of the focuses of this book.Readers will find Hamming's analysis accessible and engaging, in recognition of the fact that many people with the s
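
    The smoothing and differentiating operations the book treats reduce to short FIR convolutions; a small sketch with illustrative kernels:

      import numpy as np

      t = np.linspace(0.0, 1.0, 200)
      rng = np.random.default_rng(1)
      x = np.sin(2 * np.pi * 5 * t) + 0.3 * rng.normal(size=t.size)  # noisy test signal

      smooth = np.convolve(x, np.ones(5) / 5, mode="same")           # 5-point moving average
      dt = t[1] - t[0]
      deriv = np.convolve(x, np.array([1.0, 0.0, -1.0]) / (2 * dt),
                          mode="same")                               # central-difference differentiator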

  15. Anti-Aliasing filter for reverse-time migration

    Zhan, Ge

    2012-01-01

    We develop an anti-aliasing filter for reverse-time migration (RTM). It is similar to the traditional anti-aliasing filter used for Kirchhoff migration in that it low-pass filters the migration operator so that the dominant wavelength in the operator is greater than twice the trace sampling interval, except that it is applied to both primary and multiple reflection events. Instead of applying this filter to the data, as in the traditional RTM operation, we apply the anti-aliasing filter to the generalized diffraction-stack migration operator. This gives the same migration image as computed by anti-aliased RTM.
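
    A hedged sketch of the underlying criterion: for a local operator slope p (s/m) and trace spacing dx (m), frequencies above 1/(2|p|dx) violate the two-samples-per-wavelength rule, so the operator is low-pass filtered accordingly. This is illustrative code, not the authors' implementation:

      import numpy as np

      def antialias_cutoff(p, dx):
          """Highest unaliased frequency (Hz): the dominant wavelength along
          the operator must exceed two trace intervals."""
          return 1.0 / (2.0 * abs(p) * dx + 1e-12)

      def lowpass(trace, dt, fmax):
          """Hard frequency-domain low-pass of a trace sampled at dt (s)."""
          F = np.fft.rfft(trace)
          f = np.fft.rfftfreq(trace.size, dt)
          F[f > fmax] = 0.0
          return np.fft.irfft(F, n=trace.size)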

  16. Adaptive particle filtering

    Stevens, Mark R.; Gutchess, Dan; Checka, Neal; Snorrason, Magnús

    2006-05-01

    Image exploitation algorithms for Intelligence, Surveillance and Reconnaissance (ISR) and weapon systems are extremely sensitive to differences between the operating conditions (OCs) under which they are trained and the extended operating conditions (EOCs) in which the fielded algorithms are tested. As an example, terrain type is an important OC for the problem of tracking hostile vehicles from an airborne camera. A system designed to track cars driving on highways and on major city streets would probably not do well in the EOC of parking lots because of the very different dynamics. In this paper, we present a system we call ALPS for Adaptive Learning in Particle Systems. ALPS takes as input a sequence of video images and produces labeled tracks. The system detects moving targets and tracks those targets across multiple frames using a multiple hypothesis tracker (MHT) tightly coupled with a particle filter. This tracker exploits the strengths of traditional MHT based tracking algorithms by directly incorporating tree-based hypothesis considerations into the particle filter update and resampling steps. We demonstrate results in a parking lot domain tracking objects through occlusions and object interactions.

  17. Traditional Agriculture and Permaculture.

    Pierce, Dick

    1997-01-01

    Discusses benefits of combining traditional agricultural techniques with the concepts of "permaculture," a framework for revitalizing traditions, culture, and spirituality. Describes school, college, and community projects that have assisted American Indian communities in revitalizing sustainable agricultural practices that incorporate cultural…

  19. Quantitative gated SPECT: the effect of reconstruction filter on calculated left ventricular ejection fractions and volumes

    Gated SPECT (GSPECT) offers the possibility of obtaining additional functional information from perfusion studies, including calculation of left ventricular ejection fraction (LVEF). The calculation of LVEF relies upon the identification of the endocardial surface, which will be affected by the spatial resolution and statistical noise in the reconstructed images. The aim of this study was to compare LVEFs and ventricular volumes calculated from GSPECT using six reconstruction filters. GSPECT and radionuclide ventriculography (RNVG) were performed on 40 patients; filtered back projection was used to reconstruct the datasets with each filter. LVEFs and volumes were calculated using the Cedars-Sinai QGS package. The correlation coefficient between RNVG and GSPECT ranged from 0.81 to 0.86 with higher correlations for smoother filters. The narrowest prediction interval was 11±2%. There was a trend towards higher LVEF values with smoother filters, the ramp filter yielding LVEFs 2.55±3.10% (p<0.001) lower than the Hann filter. There was an overall fall in ventricular volumes with smoother filters with a mean difference of 13.98±10.15 ml (p<0.001) in EDV between the Butterworth-0.5 and Butterworth-0.3 filters. In conclusion, smoother reconstruction filters lead to lower volumes and higher ejection fractions with the QGS algorithm, with the Butterworth-0.4 filter giving the highest correlation with LVEFs from RNVG. Even if the optimal filter is chosen the uncertainty in the measured ejection fractions is still too great to be clinically acceptable. (author)
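
    For orientation, the filters compared here are low-pass windows applied to each projection in frequency space before back projection; a sketch of the Butterworth case (the order and critical frequency mirror values typical of this study; the geometry is a placeholder):

      import numpy as np

      def butterworth(f, fc, order):
          """Butterworth low-pass window: 1 / sqrt(1 + (f/fc)^(2*order))."""
          return 1.0 / np.sqrt(1.0 + (f / fc) ** (2 * order))

      def filter_projection(proj, pixel_cm, fc=0.4, order=5):
          """Ramp filter damped by a Butterworth window, applied to one projection."""
          F = np.fft.rfft(proj)
          f = np.fft.rfftfreq(proj.size, d=pixel_cm)   # spatial frequency (cycles/cm)
          F *= f * butterworth(f, fc, order)
          return np.fft.irfft(F, n=proj.size)

    Lowering fc smooths more aggressively, which is the mechanism behind the higher LVEFs and lower volumes reported for smoother filters.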

  20. Filter Bank Design for Subband Adaptive Filtering

    Haan, Jan Mark de

    2001-01-01

    Adaptive filtering is an important subject in the field of signal processing and has numerous applications in fields such as speech processing and communications. Examples in speech processing include speech enhancement, echo- and interference- cancellation, and speech coding. Subband filter banks have been introduced in the area of adaptive filtering in order to improve the performance of time domain adaptive filters. The main improvements are faster convergence speed and the reduction of co...

  1. Single-layer optical bandpass filter technology.

    Niraula, Manoj; Yoon, Jae Woong; Magnusson, Robert

    2015-11-01

    Resonant periodic surfaces and films enable new functionalities with wide applicability in practical optical systems. Their material sparsity, ease of fabrication, and minimal interface count provide environmental and thermal stability and robustness in applications. Here, we report an experimental bandpass filter fashioned in a single patterned silicon layer on a quartz substrate. Its performance corresponds to bandpass filters requiring 15 traditional Si/SiO2 thin-film layers. The feasibility of sparse narrowband high-efficiency bandpass filters with extremely wide, flat, and low sidebands is thereby demonstrated. This class of devices is designed with rigorous solutions of Maxwell's equations while engaging the physical principles of resonant waveguide gratings. An experimental filter presented exhibits a transmittance of ~72%, bandwidth of ~0.5 nm, and low sidebands spanning ~100 nm. The proposed technology is integration-friendly and opens doors for further development in various disciplines and spectral regions where thin-film solutions are traditionally applied. PMID:26512519

  2. Fault Tolerant Parallel Filters Based On Bch Codes

    K.Mohana Krishna

    2015-04-01

    Digital filters are used in signal processing and communication systems. In some cases, the reliability of those systems is critical, and fault tolerant filter implementations are needed. Over the years, many techniques that exploit the filters' structure and properties to achieve fault tolerance have been proposed. As technology scales, it enables more complex systems that incorporate many filters. In those complex systems, it is common that some of the filters operate in parallel, for example, by applying the same filter to different input signals. Recently, a simple technique that exploits the presence of parallel filters to achieve multiple fault tolerance has been presented. In this brief, that idea is generalized to show that parallel filters can be protected using Bose–Chaudhuri–Hocquenghem (BCH) codes, in which each filter is the equivalent of a bit in a traditional ECC. This new scheme allows more efficient protection when the number of parallel filters is large.
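
    The construction rests on linearity: a filter applied to a sum of inputs equals the sum of the individual outputs, so a redundant "check" filter plays the role of a parity bit, and BCH codes generalize this to locate and correct several failed filters. A minimal single-parity sketch (impulse response and inputs are illustrative):

      import numpy as np

      h = np.array([0.5, 0.3, 0.2])          # impulse response shared by the parallel filters

      def run(x):
          return np.convolve(x, h)

      rng = np.random.default_rng(2)
      x1, x2 = rng.normal(size=64), rng.normal(size=64)

      y1, y2 = run(x1), run(x2)
      y_check = run(x1 + x2)                 # redundant parity filter

      fault_detected = not np.allclose(y1 + y2, y_check)  # nonzero syndrome flags a fault

    With several check filters fed by different input combinations (the columns of a BCH generator matrix), the syndrome also identifies which filter failed and allows its output to be reconstructed.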

  3. Substrate Integrated Evanescent Filters Employing Coaxial Stubs

    Zhurbenko, Vitaliy

    2015-01-01

    Evanescent mode substrate integrated waveguide (SIW) is one of the promising technologies for the design of light-weight, low-cost microwave components. Traditional realization methods used in standard evanescent waveguide technology are often not directly applicable to SIW due to the dielectric filling... A filter based on these principles is designed, fabricated, and tested. The filter exhibits a transmission zero due to the implemented stubs. The problem of evanescent mode filter analysis is formulated in terms of conventional network concepts. This formulation is then used for modelling of the filters. Strategies to further...

  4. An area efficient low noise 100 Hz low-pass filter

    Ølgaard, Christian; Sassene, Haoues; Perch-Nielsen, Ivan R.

    1996-01-01

    A technique based on scaling a filter's capacitor currents to improve the noise performance of low-frequency continuous-time filters is presented. Two 100 Hz low-pass filters have been implemented: a traditional low-pass filter (as reference), and a filter utilizing the above-mentioned current scaling technique. The two filters occupy approximately the same silicon area. The scaled filter implements the scaling by use of a MOS-based current conveyor of type CCII. Measurements indicate that the current-scaled filter achieves a noise improvement of approximately 5.5 dB over the reference filter when a class A/B biasing scheme is used in the current divider. Obtaining identical noise performance from the reference filter would require a 3.6 times larger filter capacitor, which would increase the reference filter's die area by 100%. Therefore, the current scaling technique allows filters with...

  5. Filters in 2D and 3D Cardiac SPECT Image Processing

    Ploussi, Agapi; Synefia, Stella

    2014-01-01

    Nuclear cardiac imaging is a noninvasive, sensitive method providing information on cardiac structure and physiology. Single photon emission tomography (SPECT) evaluates myocardial perfusion, viability, and function and is widely used in clinical routine. The quality of the tomographic image is key for accurate diagnosis. Image filtering, a mathematical processing step, compensates for loss of detail in an image while reducing image noise, and it can improve the image resolution and limit the degradation of the image. SPECT images are then reconstructed either by the filtered back projection (FBP) analytical technique or iteratively, by algebraic methods. The aim of this study is to review filters in cardiac 2D, 3D, and 4D SPECT applications and how these affect the image quality, mirroring the diagnostic accuracy of SPECT images. Several filters, including the Hanning, Butterworth, and Parzen filters, were evaluated in combination with the two reconstruction methods as well as with a specified MatLab program. Results showed that for both 3D and 4D cardiac SPECT the Butterworth filter, for different critical frequencies and orders, produced the best results. Between the two reconstruction methods, the iterative one might be more appropriate for cardiac SPECT, since it improves lesion detectability due to the significant improvement of image contrast. PMID:24804144

  6. Tap water filters.

    2003-02-01

    Moen PureTouch filters remove impurities from tap water without removing fluoride. These carbon block filters consist of finely powdered activated carbon that is combined with a plastic binder material and heated to form a hollow cylinder. The blocks are further wrapped with material to improve performance and reduce clogging. The filters are available with different filtering capabilities (Table 1). The filters mount in the faucet spout or under the sink. PMID:12636128

  7. Intelligent Optimize Design of LCL Filter for Three-Phase Voltage-Source PWM Rectifier

    Sun, Wei; Chen, Zhe; Wu, Xiaojie

    Compared to the traditional L filter, an LCL filter is more effective at reducing harmonic distortion at the switching frequency, so it is important to choose the LCL filter parameters to achieve a good filtering effect. This paper introduces some traditional design methods. Design of an LCL filter by genetic algorithm (GA) and particle swarm optimization (PSO) is presented, together with a comparison of the two intelligent optimizations. Simulation results and calculated data are provided to show that the intelligent optimizations are more effective and simpler than traditional methods.
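
    A hedged sketch of PSO applied to LCL parameter selection; the cost function, weights, and bounds below are illustrative assumptions rather than the paper's formulation:

      import numpy as np

      rng = np.random.default_rng(4)

      def cost(params, f_sw=10e3, f_res_target=2e3):
          """Penalize weak switching-frequency attenuation and deviation from
          a target resonance frequency (assumed design goals)."""
          L1, L2, C = params
          f_res = 1.0 / (2 * np.pi * np.sqrt((L1 * L2 / (L1 + L2)) * C))
          attenuation = (2 * np.pi * f_sw) ** 3 * L1 * L2 * C
          return 1.0 / attenuation + ((f_res - f_res_target) / f_res_target) ** 2

      lo = np.array([0.1e-3, 0.1e-3, 1e-6])   # assumed bounds: L1, L2 (H), C (F)
      hi = np.array([5e-3, 5e-3, 50e-6])

      n = 30                                  # swarm size
      x = rng.uniform(lo, hi, (n, 3))
      v = np.zeros_like(x)
      pbest, pcost = x.copy(), np.array([cost(p) for p in x])
      gbest = pbest[pcost.argmin()]

      for _ in range(200):                    # standard inertia/cognitive/social update
          r1, r2 = rng.random((2, n, 3))
          v = 0.7 * v + 1.5 * r1 * (pbest - x) + 1.5 * r2 * (gbest - x)
          x = np.clip(x + v, lo, hi)
          c = np.array([cost(p) for p in x])
          better = c < pcost
          pbest[better], pcost[better] = x[better], c[better]
          gbest = pbest[pcost.argmin()]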

  8. Filter synthesis using Genesys S/Filter

    Rhea, Randall W

    2014-01-01

    S/Filter includes tools beyond direct synthesis, including a wide variety of both exact and approximate equivalent network transforms, methods for selecting the most desirable out of potentially thousands of synthesized alternatives, and a transform history record that simplifies design attempts requiring iteration. Very few software programs are based on direct synthesis, and the additional features of S/Filter make it a uniquely effective tool for filter design.This resource presents a practical guide to using Genesys software for microwave and RF filter design and synthesis. The focus of th

  9. Tradition og Modernisme

    Bay, Carl Erik

    artiklen "Tradition og Modernisme" sit syn på konventionelle og moderne former i byplanlægning, arkitektur og design. I denne antologi er "Tradition og Modernisme" genoptrykt og fem forskere analyserer den med forskellige indfaldsvinkler. Den perspektiveres i forhold til PHs filosofiske afsæt i...

  10. Family Customs and Traditions.

    MacGregor, Cynthia

    Recognizing the importance of maintaining open communication with immediate and extended family members, this book provides a compilation of ideas for family traditions and customs that are grounded in compassion and human kindness. The traditions were gathered from families in the United States and Canada who responded to advertisements in…

  11. Traditional medicine and genomics

    Kalpana Joshi

    2010-01-01

    'Omics' developments in the form of genomics, proteomics and metabolomics have increased the impetus of traditional medicine research. Studies exploring the genomic, proteomic and metabolomic basis of human constitutional types based on Ayurveda and other systems of oriental medicine are becoming popular. Such studies remain important to developing a better understanding of human variations and individual differences. Countries like India, Korea, China and Japan are investing in research on evidence-based traditional medicines and scientific validation of fundamental principles. This review provides an account of studies addressing relationships between traditional medicine and genomics.

  12. Single-periodic-film optical bandpass filter

    Niraula, Manoj; Magnusson, Robert

    2015-01-01

    Resonant periodic surfaces and films enable new functionalities with wide applicability in practical optical systems. Their material sparsity, ease of fabrication, and minimal interface count provide environmental and thermal stability and robustness in applications. Here we report an experimental bandpass filter fashioned in a single patterned layer on a substrate. Its performance corresponds to bandpass filters requiring perhaps 30 traditional thin-film layers as shown by an example. We demonstrate an ultra-narrow, high-efficiency bandpass filter with extremely wide, flat, and low sidebands. This class of devices is designed with rigorous solutions of the Maxwell equations while engaging the physical principles of resonant waveguide gratings. The proposed technology is integration-friendly and opens doors for further development in various disciplines and spectral regions where thin-film solutions are traditionally applied.

  13. Application of CT filter algorithms to digitized film data

    The CT studies performed thus far in the RAS division have been based on neutron radiographs of two 7-pin reactor fuel bundles which were subjected to over-power accident simulations in the TREAT reactor. As a result of the tests, the pins were severely damaged, with molten fuel and steel spreading throughout the fuel assembly. The neutron radiographs are produced at the NRAD reactor facility at ANL-West in Idaho. The RAS reconstruction codes are based on the filtered back-projection technique, using standard fast Fourier transforms and filter algorithms. Because of the length of the fuel assemblies, and the fact that they are held only at the top by the rotation mechanism, it is nearly impossible to achieve a perfect vertical alignment, so a major part of the analysis time is spent in rotating and aligning the images. As part of this computerized alignment, each image is also normalized to a constant exposure time, based on the data in a neutron absorbing step wedge that is imaged along with the fuel pins. All computer codes were loosely developed from those given in the Donner Algorithms prepared for the National Cancer Institute and are currently run on a PDP-11/60 computer

  14. A Kalman filter technique applied for medical image reconstruction

    Medical images contain information about vital organic tissues inside the human body and are widely used for diagnosis of disease and for surgical purposes. Image reconstruction is essential for medical images in applications such as noise suppression or de-blurring, in order to provide images with better quality and contrast. Due to the vital role of image reconstruction in medical sciences, corresponding algorithms with better efficiency and higher speed are desirable. Most image reconstruction algorithms operate in the frequency domain, the most popular being filtered back projection. In this paper we introduce a Kalman filter technique which operates in the time domain for medical image reconstruction. Results indicated that, as the number of projections increases, for both noise-free ray sums and ray sums corrupted by noise, the quality of the reconstructed image improves in terms of contrast and transparency. It is also seen that as the number of projections increases the error index decreases.
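
    The idea can be sketched as a recursive Kalman update (for a static image, equivalent to recursive least squares) applied one ray sum at a time; the image, projection rows, and noise variance below are placeholders:

      import numpy as np

      rng = np.random.default_rng(5)

      n = 16 * 16                         # flattened image to reconstruct
      x_true = rng.random(n)
      x = np.zeros(n)                     # prior mean
      P = np.eye(n) * 10.0                # prior covariance
      r = 0.01                            # ray-sum noise variance

      for _ in range(400):                # one Kalman update per projection ray
          a = rng.random(n)               # placeholder row of the projection matrix
          a /= a.sum()
          y = a @ x_true + rng.normal(0.0, np.sqrt(r))
          Pa = P @ a
          k = Pa / (a @ Pa + r)           # Kalman gain
          x += k * (y - a @ x)            # state update
          P -= np.outer(k, Pa)            # covariance update

    Consistent with the abstract, the error between x and x_true shrinks as more projections are assimilated, with or without measurement noise.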

  15. Geometric computation theory for morphological filtering on freeform surfaces

    Lou, S.; Jiang, X.; Scott, P. J.

    2013-01-01

    Surfaces govern functional behaviours of geometrical products, especially high precision and high added-value products. Compared to the mean-line based filters, morphological filters, evolved from the traditional E-system, are relevant to functional performance of surfaces. The conventional implementation of morphological filters based on image processing does not work for state-of-the-art surfaces, for example freeform surfaces. A set of novel geometric computation theory is developed by app...

  16. An Efficient Divide-and-Conquer Algorithm for Morphological Filters

    Lou, Shan; Jiang, Xiangqian; Scott, Paul J

    2013-01-01

    Morphological filters, evolved from the traditional envelope filter, are function-oriented filtration techniques. Recent research on the implementation of morphological filters was based on the theoretical link between morphological operations and the alpha shape. However, the Delaunay triangulation on which the alpha-shape method depends is costly for large areal data. This paper proposes a divide-and-conquer method as an optimization of the alpha-shape method, aiming to speed up its perfor...
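
    For orientation, the filtration itself (setting aside the alpha-shape speed-up) amounts to greyscale morphological operations on the sampled surface; a sketch on a toy profile with a flat structuring element (real surface filters typically use a disk or ball):

      import numpy as np
      from scipy.ndimage import grey_closing, grey_opening

      rng = np.random.default_rng(6)
      z = np.cumsum(rng.normal(size=500)) * 0.01    # toy roughness profile

      width = 25                                    # structuring element length (samples)
      upper_envelope = grey_closing(z, size=width)  # element rolled over the peaks
      lower_envelope = grey_opening(z, size=width)  # element rolled under the valleys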

  17. HEPA Filter Vulnerability Assessment

    GUSTAVSON, R.D.

    2000-05-11

    This assessment of High Efficiency Particulate Air (HEPA) filter vulnerability was requested by the USDOE Office of River Protection (ORP) to satisfy a DOE-HQ directive to evaluate the effect of filter degradation on the facility authorization basis assumptions. Within the scope of this assessment are ventilation system HEPA filters that are classified as Safety-Class (SC) or Safety-Significant (SS) components that perform an accident mitigation function. The objective of the assessment is to verify whether HEPA filters that perform a safety function during an accident are likely to perform as intended to limit release of hazardous or radioactive materials, considering factors that could degrade the filters. Filter degradation factors considered include aging, wetting of filters, exposure to high temperature, exposure to corrosive or reactive chemicals, and exposure to radiation. Screening and evaluation criteria were developed by a site-wide group of HVAC engineers and HEPA filter experts from published empirical data. For River Protection Project (RPP) filters, the only degradation factor that exceeded the screening threshold was for filter aging. Subsequent evaluation of the effect of filter aging on the filter strength was conducted, and the results were compared with required performance to meet the conditions assumed in the RPP Authorization Basis (AB). It was found that the reduction in filter strength due to aging does not affect the filter performance requirements as specified in the AB. A portion of the HEPA filter vulnerability assessment is being conducted by the ORP and is not part of the scope of this study. The ORP is conducting an assessment of the existing policies and programs relating to maintenance, testing, and change-out of HEPA filters used for SC/SS service. This document presents the results of a HEPA filter vulnerability assessment conducted for the River protection project as requested by the DOE Office of River Protection.

  19. Traditional Urban Aboriginal Religion

    Kristina Everett

    2009-01-01

    This paper represents a group of Aboriginal people who claim traditional Aboriginal ownership of a large Australian metropolis. They have struggled for at least the last 25 to 30 years to articulate and represent their contemporary group identity to the wider Australian society, which very often does not take their expressions seriously. This is largely because dominant discourses claim that 'authentic' Aboriginal culture only exists in remote, pristine areas far away from western society and that urban Aboriginal traditions, especially urban religious traditions, are today defunct. This paper is an account of one occasion on which such traditional Aboriginal religious practice was performed before the eyes of a group of tourists.

  20. An Adaptive Approach to Mitigate Background Covariance Limitations in the Ensemble Kalman Filter

    Song, Hajoon

    2010-07-01

    A new approach is proposed to address the background covariance limitations arising from undersampled ensembles and unaccounted model errors in the ensemble Kalman filter (EnKF). The method enhances the representativeness of the EnKF ensemble by augmenting it with new members chosen adaptively to add missing information that prevents the EnKF from fully fitting the data to the ensemble. The vectors to be added are obtained by back projecting the residuals of the observation misfits from the EnKF analysis step onto the state space. The back projection is done using an optimal interpolation (OI) scheme based on an estimated covariance of the subspace missing from the ensemble. In the experiments reported here, the OI uses a preselected stationary background covariance matrix, as in the hybrid EnKF–three-dimensional variational data assimilation (3DVAR) approach, but the resulting correction is included as a new ensemble member instead of being added to all existing ensemble members. The adaptive approach is tested with the Lorenz-96 model. The hybrid EnKF–3DVAR is used as a benchmark to evaluate the performance of the adaptive approach. Assimilation experiments suggest that the new adaptive scheme significantly improves the EnKF behavior when it suffers from small size ensembles and neglected model errors. It was further found to be competitive with the hybrid EnKF–3DVAR approach, depending on ensemble size and data coverage.
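
    The augmentation step itself is a few lines of linear algebra; in this sketch the observation operator, covariances, and dimensions are placeholders:

      import numpy as np

      def augment_ensemble(Xa, H, y, R, B_static):
          """Append one adaptive member to the analysis ensemble Xa
          (n_state, n_members) by back projecting the residual misfit."""
          residual = y - H @ Xa.mean(axis=1)        # misfit the ensemble could not fit
          S = H @ B_static @ H.T + R
          correction = B_static @ H.T @ np.linalg.solve(S, residual)  # OI back projection
          new_member = Xa.mean(axis=1) + correction
          return np.column_stack([Xa, new_member])

    Unlike the hybrid EnKF-3DVAR, the correction enters as a new member rather than being added to every existing member.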

  1. Traditional Indonesian dairy foods.

    Surono, Ingrid S

    2015-12-01

    Indonesia is the largest archipelago blessed with one of the richest mega-biodiversities and also home to one of the most diverse cuisines and traditional fermented foods. There are 3 types of traditional dairy foods, namely the butter-like product minyak samin; yogurt-like product dadih; and cheese-like products dali or bagot in horbo, dangke, litsusu, and cologanti, which reflect the culture of dairy product consumption in Indonesia. PMID:26715081

  2. The Fine Dutch Tradition

    Hooimeijer, F.L.

    2012-01-01

    Publication of the exhibition and symposium on water-adaptive urban planning and architecture in Bangkok. The Urban Fine Dutch Tradition is a dynamic tradition of making urban designs using the parameters of the natural system – incorporating in an efficient way the hydrological cycle, the soil and subsurface conditions, technology and urban development opportunities. Sustainability is the capacity of making a sensible choice for enabling technology taking a perspective from the natural syste...

  3. In Defense of Filtering.

    Burt, David

    1997-01-01

    Presents responses to 10 common arguments against the use of Internet filters in libraries. Highlights include keyword blocking; selection of materials; liability of libraries using filters; users' judgments; Constitutional issues, including First Amendment rights; and censorship. (LRW)

  4. HEPA filter monitoring program

    Kirchner, K. N.; Johnson, C. M.; Aiken, W. F.; Lucerna, J. J.; Barnett, R. L.; Jensen, R. T.

    1986-07-01

    The testing and replacement of HEPA filters, widely used in the nuclear industry to purify process air, are costly and labor-intensive. Current methods of testing filter performance, such as differential pressure measurement and scanning air monitoring, allow determination of overall filter performance but preclude detection of incipient filter failure such as small holes in the filters. Using current technology, a continual in-situ monitoring system was designed which provides three major improvements over current methods of filter testing and replacement. The improvements include: cost savings by reducing the number of intact filters which are currently being replaced unnecessarily; more accurate and quantitative measurement of filter performance; and reduced personnel exposure to a radioactive environment by automatically performing most testing operations.

  5. MST Filterability Tests

    Poirier, M. R. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); Burket, P. R. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); Duignan, M. R. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL)

    2015-03-12

    The Savannah River Site (SRS) is currently treating radioactive liquid waste with the Actinide Removal Process (ARP) and the Modular Caustic Side Solvent Extraction Unit (MCU). The low filter flux through the ARP has limited the rate at which radioactive liquid waste can be treated. Recent filter flux has averaged approximately 5 gallons per minute (gpm). Salt Batch 6 has had a lower processing rate and required frequent filter cleaning. Savannah River Remediation (SRR) has a desire to understand the causes of the low filter flux and to increase ARP/MCU throughput. In addition, at the time the testing started, SRR was assessing the impact of replacing the 0.1 micron filter with a 0.5 micron filter. This report describes testing of MST filterability to investigate the impact of filter pore size and MST particle size on filter flux, and testing of filter enhancers to attempt to increase filter flux. The authors constructed a laboratory-scale crossflow filter apparatus with two crossflow filters operating in parallel. One filter was a 0.1 micron Mott sintered SS filter and the other was a 0.5 micron Mott sintered SS filter. The authors also constructed a dead-end filtration apparatus to conduct screening tests with potential filter aids and body feeds, referred to as filter enhancers. The original baseline for ARP was 5.6 M sodium salt solution with a free hydroxide concentration of approximately 1.7 M. ARP has been operating with a sodium concentration of approximately 6.4 M and a free hydroxide concentration of approximately 2.5 M. SRNL conducted tests varying the concentration of sodium and free hydroxide to determine whether those changes had a significant effect on filter flux. The feed slurries for the MST filterability tests were composed of simple salts (NaOH, NaNO2, and NaNO3) and MST (0.2–4.8 g/L). The feed slurry for the filter enhancer tests contained simulated salt batch 6 supernate, MST, and filter enhancers.

  6. Effect of different thickness of material filter on Tc-99m spectra and performance parameters of gamma camera

    Nazifah, A.; Norhanna, S.; Shah, S. I.; Zakaria, A.

    2014-11-01

    This study aimed to investigate the effects of the material filter technique on Tc-99m spectra and the performance parameters of a Philips ADAC Forte dual-head gamma camera. The thickness of the material filter was selected on the basis of the percentage attenuation of various gamma-ray energies by different thicknesses of zinc. A cylindrical source tank of the NEMA single photon emission computed tomography (SPECT) Triple Line Source Phantom, filled with water and injected with Tc-99m radionuclide, was used for spectra, uniformity and sensitivity measurements. A vinyl plastic tube was used as a line source for spatial resolution. Images for uniformity were reconstructed by the filtered back projection method. A Butterworth filter of order 5 and cutoff frequency 0.35 cycles/cm was selected. Chang's attenuation correction method was applied with a linear attenuation coefficient of 0.13/cm. The count rate from the Compton region of the Tc-99m energy spectrum was decreased by the material filter, as was that from the photopeak region. Spatial resolution was improved. However, the effect on tomographic image uniformity was equivocal, and system volume sensitivity was reduced by the material filter. The material filter improved the system's spatial resolution; therefore, the technique may be used for phantom studies to improve image quality.

  7. Filtering, FDR and power

    van Iterson Maarten

    2010-09-01

    Background: In high-dimensional data analysis, such as differential gene expression analysis, people often use filtering methods like fold-change or variance filters in an attempt to reduce the multiple testing penalty and improve power. However, filtering may introduce a bias on the multiple testing correction. The precise amount of bias depends on many quantities, such as the fraction of probes filtered out, the filter statistic and the test statistic used. Results: We show that a biased multiple testing correction results if non-differentially expressed probes are not filtered out with equal probability from the entire range of p-values. We illustrate our results using both a simulation study and an experimental dataset, where the FDR is shown to be biased mostly by filters that are associated with the hypothesis being tested, such as the fold change. Filters that induce little bias on the FDR yield less additional power for detecting differentially expressed genes. Finally, we propose a statistical test that can be used in practice to determine whether any chosen filter introduces bias on the FDR estimate used, given a general experimental setup. Conclusions: Filtering out probes must be done with care as it may bias the multiple testing correction. Researchers can use our test for FDR bias to guide their choice of filter and amount of filtering in practice.
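
    The phenomenon is easy to reproduce: simulate null and differentially expressed probes, filter on fold change (a statistic tied to the hypothesis being tested), and apply Benjamini-Hochberg to the survivors; all simulation parameters are illustrative:

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(7)
      n_null, n_alt, m = 9000, 1000, 6     # probes per class, samples per group

      null = rng.normal(0.0, 1.0, (n_null, 2 * m))
      alt = np.hstack([rng.normal(0.0, 1.0, (n_alt, m)),
                       rng.normal(1.0, 1.0, (n_alt, m))])
      data = np.vstack([null, alt])

      _, p = stats.ttest_ind(data[:, :m], data[:, m:], axis=1)
      fold = data[:, m:].mean(axis=1) - data[:, :m].mean(axis=1)
      keep = np.abs(fold) > 0.5            # fold-change filter: correlated with the test

      def bh_reject(p, q=0.05):
          """Benjamini-Hochberg step-up procedure."""
          order = np.argsort(p)
          passed = p[order] <= q * np.arange(1, p.size + 1) / p.size
          k = passed.nonzero()[0].max() + 1 if passed.any() else 0
          out = np.zeros(p.size, bool)
          out[order[:k]] = True
          return out

      rejected_after_filter = bh_reject(p[keep])

    Comparing the realized false discovery proportion against the nominal q, with and without the filter, exposes the bias the paper analyzes.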

  8. HEPA filter encapsulation

    Gates-Anderson, Dianne D. (Union City, CA); Kidd, Scott D. (Brentwood, CA); Bowers, John S. (Manteca, CA); Attebery, Ronald W. (San Lorenzo, CA)

    2003-01-01

    A low viscosity resin is delivered into a spent HEPA filter or other waste. The resin is introduced into the filter or other waste using a vacuum to assist in the mass transfer of the resin through the filter media or other waste.

  9. Optimal Filter Systems for Photometric Redshift Estimation

    Benítez, N.; Moles, M.; Aguerri, J. A. L.; Alfaro, E.; Broadhurst, T.; Cabrera-Caño, J.; Castander, F. J.; Cepa, J.; Cerviño, M.; Cristóbal-Hornillos, D.; Fernández-Soto, A.; González Delgado, R. M.; Infante, L.; Márquez, I.; Martínez, V. J.; Masegosa, J.; Del Olmo, A.; Perea, J.; Prada, F.; Quintana, J. M.; Sánchez, S. F.

    2009-02-01

    In the coming years, several cosmological surveys will rely on imaging data to estimate the redshift of galaxies, using traditional filter systems with 4-5 optical broad bands; narrower filters improve the spectral resolution, but strongly reduce the total system throughput. We explore how photometric redshift performance depends on the number of filters n_f, characterizing the survey depth by the fraction of galaxies with unambiguous redshift estimates. For a combination of total exposure time and telescope imaging area of 270 hr m2, 4-5 filter systems perform significantly worse, both in completeness depth and precision, than systems with n_f ≳ 8 filters. Our results suggest that for low n_f the color-redshift degeneracies overwhelm the improvements in photometric depth, and that even at higher n_f the effective photometric redshift depth decreases much more slowly with filter width than naively expected from the reduction in the signal-to-noise ratio. Adding near-IR observations improves the performance of low-n_f systems, but still the system which maximizes the photometric redshift completeness is formed by nine filters with logarithmically increasing bandwidth (constant resolution) and half-band overlap, reaching ~0.7 mag deeper, with 10% better redshift precision, than 4-5 filter systems. A system with 20 constant-width, nonoverlapping filters reaches only ~0.1 mag shallower than 4-5 filter systems, but has a precision almost three times better, δz = 0.014(1 + z) versus δz = 0.042(1 + z). We briefly discuss a practical implementation of such a photometric system: the ALHAMBRA Survey.

  10. KASTAMONU TRADITIONAL WOMEN CLOTHES

    E.Elhan ÖZUS

    2015-08-01

    Clothing is a unique dressing style of a community, a period or a profession. In clothing there is a principle of social status and difference rather than fashion. In this context, society creates a clothing style in line with its own customs, traditions and social structure. One of the features separating societies from each other and indicating their cultural and social classes is the clothing style. As is known, traditional Turkish clothes reflecting the characteristics of Turkish society are our most beautiful heritage from past to present. From this heritage, several examples of women's clothes have been carried to the present. When these examples are examined, it is possible to see the taste, the way of understanding art, the joy and the lifestyle of the period. These garments are also documents outlining the taste and grace of the Turkish people. In the present study, traditional Kastamonu women's clothing, which has an important place among the traditional cultural clothes of Anatolia, is investigated. The method of the present research is primarily defined as the examination of written sources. The study is completed with observations and examinations made in Kastamonu. According to the findings of the study, traditional Kastamonu women's clothing is examined and adapted to today's clothing.

  11. Bias aware Kalman filters

    Drecourt, J.-P.; Madsen, H.; Rosbjerg, Dan

    2006-01-01

    This paper reviews two different approaches that have been proposed to tackle the problem of model bias with the Kalman filter: the use of a colored noise model and the implementation of a separate bias filter. Both filters are implemented with and without feedback of the bias into the model state, and are illustrated on a simple one-dimensional groundwater problem. The results show that the presented filters outperform the standard Kalman filter and that the implementations with bias feedback work in more general conditions than the implementations without feedback.
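
    The separate-bias idea can be sketched by augmenting the state with a bias propagated as a random constant and fed back through the update; the matrices are whatever the caller supplies, and the formulation is a generic textbook one rather than the paper's exact implementation:

      import numpy as np

      def bias_aware_kf_step(x, b, P, y, A, H, Q, R):
          """One predict/update cycle for x_k = A x_{k-1} + b + w, y_k = H x_k + v,
          with the constant bias b estimated jointly with the state.
          P, Q are covariances of the augmented (state, bias) vector."""
          n = x.size
          Aa = np.block([[A, np.eye(n)],
                         [np.zeros((n, n)), np.eye(n)]])   # bias modeled as a random constant
          Ha = np.hstack([H, np.zeros((H.shape[0], n))])
          za = np.concatenate([x, b])

          za = Aa @ za                                     # predict
          P = Aa @ P @ Aa.T + Q
          S = Ha @ P @ Ha.T + R                            # update
          K = P @ Ha.T @ np.linalg.inv(S)
          za = za + K @ (y - Ha @ za)
          P = P - K @ Ha @ P
          return za[:n], za[n:], P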

  12. Traditional Chinese biotechnology.

    Xu, Yan; Wang, Dong; Fan, Wen Lai; Mu, Xiao Qing; Chen, Jian

    2010-01-01

    The earliest industrial biotechnology originated in ancient China and developed into a vibrant industry in traditional Chinese liquor, rice wine, soy sauce, and vinegar. It is now a significant component of the Chinese economy valued annually at about 150 billion RMB. Although the production methods had existed and remained basically unchanged for centuries, modern developments in biotechnology and related fields in the last decades have greatly impacted on these industries and led to numerous technological innovations. In this chapter, the main biochemical processes and related technological innovations in traditional Chinese biotechnology are illustrated with recent advances in functional microbiology, microbial ecology, solid-state fermentation, enzymology, chemistry of impact flavor compounds, and improvements made to relevant traditional industrial facilities. Recent biotechnological advances in making Chinese liquor, rice wine, soy sauce, and vinegar are reviewed. PMID:19888561

  14. Ceramic fiber filter technology

    Holmes, B.L.; Janney, M.A.

    1996-06-01

    Fibrous filters have been used for centuries to protect individuals from dust, disease, smoke, and other gases or particulates. In the 1970s and 1980s ceramic filters were developed for filtration of hot exhaust gases from diesel engines. Tubular, or candle, filters have been made to remove particles from gases in pressurized fluidized-bed combustion and gasification-combined-cycle power plants. Very efficient filtration is necessary in power plants to protect the turbine blades. The limited lifespan of ceramic candle filters has been a major obstacle in their development. The present work is focused on forming fibrous ceramic filters using a papermaking technique. These filters are highly porous and therefore very lightweight. The papermaking process consists of filtering a slurry of ceramic fibers through a steel screen to form paper. Papermaking and the selection of materials will be discussed, as well as preliminary results describing the geometry of papers and relative strengths.

  15. Family traditions and generations.

    Schneiderman, Gerald; Barrera, Maru

    2009-01-01

    Currently, traditional family values that have been passed down through generations appear to be at risk. This has significant implications for the stability and health of individuals, families, and communities. This article explores selected issues related to intergenerational transmission of family values and cultural beliefs, with particular reference to Western culture and values that are rooted in Jewish and Christian traditions. It also examines family values and parenting styles as they influence the developing perspective of children and the family's adaptation to a changing world. PMID:19752638

  16. Filter assessment applied to analytical reconstruction for industrial third-generation tomography

    Velo, Alexandre F.; Martins, Joao F.T.; Oliveira, Adriano S.; Carvalho, Diego V.S.; Faria, Fernando S.; Hamada, Margarida M.; Mesquita, Carlos H., E-mail: afvelo@usp.br [Instituto de Pesquisas Energeticas e Nucleares (IPEN/CNEN-SP), Sao Paulo, SP (Brazil)

    2015-07-01

    Multiphase systems are structures that contain a mixture of solids, liquids and gases inside a chemical reactor or pipes in a dynamic process. These systems are found in the chemical, food, pharmaceutical and petrochemical industries. The gamma ray computed tomography (CT) system has been applied to visualize the distribution of multiphase systems without interrupting production. CT systems have been used to improve design, operation and troubleshooting of industrial processes. Computed tomography for multiphase processes is being developed at several laboratories. It is well known that scanning systems demand high processing time and offer a limited set of data projections and views to obtain an image; because of this, the image quality depends on the number of projections, number of detectors, acquisition time and reconstruction time. A phantom containing air, iron and aluminum was used on a third-generation industrial tomograph with a 662 keV (137Cs) radioactive source. The filtered back projection algorithm was applied to reconstruct the images. An efficient tomograph depends on the image quality; thus the objective of this research was to apply different types of filters in the analytical algorithm and compare them using the figure of merit denominated root mean squared error (RMSE), where the filter that presents the lowest RMSE has the best quality. In this research, five types of filters were used: Ram-Lak, Shepp-Logan, cosine, Hamming and Hann filters. As a result, all filters presented low values of RMSE, meaning that the filters used have low standard deviation compared to the mass absorption coefficient; however, the Hann filter presented better RMSE and CNR compared to the others. (author)
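
    The five filters compared are all windowed versions of the ramp; a sketch of their frequency responses and the RMSE figure of merit, using standard textbook window definitions (the frequency normalization is illustrative):

      import numpy as np

      def recon_filter(n_samples, name):
          """Frequency response |f| * window(f) of common FBP filters."""
          f = np.abs(np.fft.fftfreq(n_samples))       # cycles/sample, Nyquist at 0.5
          u = f / 0.5                                 # frequency normalized to Nyquist
          windows = {
              "ram-lak": np.ones_like(f),
              "shepp-logan": np.sinc(u / 2.0),
              "cosine": np.cos(np.pi * u / 2.0),
              "hamming": 0.54 + 0.46 * np.cos(np.pi * u),
              "hann": 0.5 * (1.0 + np.cos(np.pi * u)),
          }
          return f * windows[name]

      def rmse(reconstruction, phantom):
          """Figure of merit used to rank the filters."""
          return np.sqrt(np.mean((reconstruction - phantom) ** 2))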

  18. The Traditional Rebel.

    Lemansky, Janet

    1993-01-01

    Outlines the Linden, New Jersey, schools' introduction and use of electronic musical technology and contemporary instruments in the orchestral music program, which has broadened the musical repertoire and the recruitment of talented students not schooled in the classical tradition. Four applications of technology for rehearsals and instrumental…

  19. Traditional Cherokee Food.

    Hendrix, Janey B.

    A collection for children and teachers of traditional Cherokee recipes emphasizes the art, rather than the science, of cooking. The hand-printed, illustrated format is designed to communicate the feeling of Cherokee history and culture and to encourage readers to collect and add family recipes. The cookbook could be used as a starting point for…

  20. Making Tradition Healthy

    2007-11-01

    In this podcast, a Latina nutrition educator shows how a community worked with local farmers to grow produce traditionally enjoyed by Hispanic/Latinos.  Created: 11/1/2007 by National Diabetes Education Program (NDEP), a joint program of the Centers for Disease Control and Prevention and the National Institutes of Health.   Date Released: 11/10/2007.

  1. Teaching Traditional Tropical Agriculture.

    Clawson, David L.

    1987-01-01

    Maintains that the teaching of traditional tropical agriculture through the presentation of large numbers of categories or types tends to overemphasize superficial differences at the expense of comprehending the inner essence of life as it exists for the majority of the world's farmers. Offers an alternative approach which claims to foster greater…

  2. Tradition in Science

    Heisenberg, Werner

    1973-01-01

    Discusses the influence of tradition in science on selection of scientific problems and methods and on the use of concepts as tools for research work. Indicates that future research studies will be directed toward the change of fundamental concepts in such fields as astrophysics, molecular biology, and environmental science. (CC)

  3. Traditional healers formalised?

    Van Niekerk, Jp

    2012-03-01

    Traditional healers are the first to be called for help when illness strikes the majority of South Africans. Their communities have faith in their ability to cure or alleviate conditions managed by doctors, and much more. A visit to such practitioners' websites (they are up with the latest advertising technology!) shows that they promise help with providing more power, love, security or money, protection from evil people and spirits, enhancing one's sex life with penis enlargement and vagina tightening spells, etc. Contemplating such claims, it is easy to be dismissive of traditional healers. But in this issue of the SAMJ Nompumelelo Mbatha and colleagues1 argue that the traditional healers' regulatory council, promised by an Act of Parliament, should be established, followed by (or preferably preceded by) formal recognition by employers of sick certificates issued by traditional healers. Can matters be so simply resolved? What does this mean for doctors and other formally recognised healthcare professionals, and how to respond to such claims and social pressures? PMID:22380886

  4. Survey of Sparse Adaptive Filters for Acoustic Echo Cancellation

    Krishna Samalla

    2013-01-01

    This paper reviews developments over the last decade in adaptive methods for sparse adaptive filters for the identification of sparse impulse responses in both network and acoustic echo cancellation. A variety of different architectures and novel training algorithms have been proposed in the literature. At present, most of the work in echo cancellation relies on more than one method. Sparse adaptive filters take advantage of each method, showing good improvement in sparseness-measure performance. This survey gives an overview of existing sparse adaptive filter mechanisms and discusses their advantages over the traditional adaptive filters developed for echo cancellation.
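
    A representative sparse adaptive filter from this literature is the proportionate NLMS (PNLMS), which gives larger coefficients proportionally larger step sizes so that sparse echo paths converge faster than under plain NLMS; the parameter values below are typical illustrative choices:

      import numpy as np

      def pnlms(x, d, n_taps, mu=0.5, delta=1e-2, rho=0.01):
          """Proportionate NLMS for sparse echo-path identification.
          x: far-end signal, d: microphone signal; returns estimated taps."""
          w = np.zeros(n_taps)
          for n in range(n_taps, x.size):
              u = x[n - n_taps:n][::-1]                    # most recent input vector
              e = d[n] - w @ u                             # a priori error
              g = np.maximum(np.abs(w), rho * max(np.abs(w).max(), delta))
              g /= g.sum()                                 # proportionate gain per tap
              w += mu * e * g * u / (u @ (g * u) + delta)  # gain-weighted NLMS update
          return w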

  5. Challenging tradition in Nigeria.

    Supriya, K E

    1991-01-01

    In Nigeria since 1987, the National Association of Nigeria Nurses and Midwives (NANNM) has used traditional medical and traditional health care workers to curtail the practice of female circumcision. Other harmful traditions are being changed also, such as early marriage, taboos of pregnancy and childbirth, and scarification. 30,000 members of NANNM are involved in this effort to halt the harmful practices and to change community opinion. The program involved national and state level workshops on the harmful health consequences of traditional practices and instruction on how to conduct focus group discussions to assess women's beliefs and practices. The focus groups were found to be a particularly successful method of opening up discussion of taboo topics and expressing deep emotions. The response to the knowledge that circumcision was not necessary was rage and anger, which was channeled into advocacy for change in the practice. The result was the development of books, leaflets and videos. One community group designed a dress with a decorative motif of tattoos and bodily cuts to symbolize circumcision and scarring. Plays and songs were written and performed. Artists provided models of female genitalia both before and after circumcision. The campaign has been successful in bringing this issue to public attention in prominent ways, such as national television, health talk shows, and women's magazines. One of the most important results of the effort has been the demonstration that culture and tradition can be changed from within, rather than by outside imposition of values and beliefs. PMID:12284522

  6. Traditional Medicine in Zimbabwe

    Takawira Kazembe

    2007-06-01

    Full Text Available This study was carried out to help demystify traditional medical practices in Zimbabwe and assist people in understanding Zimbabwean traditional medicine. The Zimbabwean traditional religion involves a hierarchy of spirit mediums differing in the way they practice traditional medicine, as well as in the origin and power of the spirit(s) that possess(es) them. MaGombwe, mediums of angels of God, occupy the highest level of the hierarchy. The second level is that of maSadunhu, the spirit mediums of the original leaders of clans, who look after the interests of members of their clans. The third level is that of maTateguru, the spirits who look after the interests of the families they left behind. These spirits of great grandparents are complemented by spirits of grandparents who possess their mediums only to get a specific thing done and then disappear. The fourth level is occupied by N’angas, the ‘real traditional medical practitioners.’ These mediums may be possessed by spirits from any of the above levels, and differ from mediums at the original level in that they charge clients and the powers of their spirits are lower. The spirits at any of the levels are complemented by maShave, spirits that were created to perform specific tasks. The role of the spirit mediums is to service the spiritual and medicinal interests of people. Training at the different levels of spirit mediums involves rigorous and tedious apprenticeship systems, and the mediums are willing to cooperate with other service providers if certain conditions are met.

  7. Filter material charging apparatus for filter assembly for radioactive contaminants

    A filter charging apparatus for a filter assembly is described. The filter assembly includes a housing with at least one filter bed therein. The filter charging apparatus for adding filter material to the filter assembly includes a tank with an opening therein, the tank opening being disposed in flow communication with opposed first and second conduit means, the first conduit means being in flow communication with the filter assembly housing and the second conduit means being in flow communication with a blower means. Upon activation of the blower means, the filter material is pneumatically conveyed from the tank to the filter housing

  8. Generic Kalman Filter Software

    Lisano, Michael E., II; Crues, Edwin Z.

    2005-01-01

    The Generic Kalman Filter (GKF) software provides a standard basis for the development of application-specific Kalman-filter programs. Historically, Kalman filters have been implemented by customized programs that must be written, coded, and debugged anew for each unique application, then tested and tuned with simulated or actual measurement data. Total development times for typical Kalman-filter application programs have ranged from weeks to months. The GKF software can simplify the development process and reduce the development time by eliminating the need to re-create the fundamental implementation of the Kalman filter for each new application. The GKF software is written in the ANSI C programming language. It contains a generic Kalman-filter-development directory that, in turn, contains code for a generic Kalman-filter function; more specifically, it contains a generically designed and generically coded implementation of linear, linearized, and extended Kalman filtering algorithms, including algorithms for state- and covariance-update and -propagation functions. The mathematical theory that underlies the algorithms is well known and has been reported extensively in the open technical literature. Also contained in the directory are a header file that defines generic Kalman-filter data structures and prototype functions, and template versions of application-specific subfunction and calling navigation/estimation routine code and headers. Once the user has provided a calling routine and the required application-specific subfunctions, the application-specific Kalman-filter software can be compiled and executed immediately. During execution, the generic Kalman-filter function is called from a higher-level navigation or estimation routine that preprocesses measurement data and post-processes output data. The generic Kalman-filter function uses the aforementioned data structures and five implementation-specific subfunctions, which have been developed by the user on the basis of the aforementioned templates. The GKF software can be used to develop many different types of unfactorized Kalman filters. A developer can choose to implement either a linearized or an extended Kalman filter algorithm, without having to modify the GKF software. Control dynamics can be taken into account or neglected in the filter-dynamics model. Filter programs developed by use of the GKF software can be made to propagate equations of motion for linear or nonlinear dynamical systems that are deterministic or stochastic. In addition, filter programs can be made to operate in user-selectable "covariance analysis" and "propagation-only" modes that are useful in design and development stages.
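
    For concreteness, the propagate/update cycle that any such linear Kalman filter implements can be written in a few lines. The sketch below is an illustrative Python analogue (the GKF itself is ANSI C), with all symbol names chosen here rather than taken from the GKF headers.

      import numpy as np

      def kalman_step(x, P, z, F, H, Q, R):
          # Time update: propagate state estimate and covariance.
          x = F @ x
          P = F @ P @ F.T + Q
          # Measurement update: blend the prediction with measurement z.
          S = H @ P @ H.T + R                 # innovation covariance
          K = P @ H.T @ np.linalg.inv(S)      # Kalman gain
          x = x + K @ (z - H @ x)
          P = (np.eye(len(x)) - K @ H) @ P
          return x, P

    A linearized or extended variant replaces F and H with Jacobians of the nonlinear models, evaluated at a reference trajectory or at the current estimate, respectively.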

  9. Hybrid Filter Membrane

    Laicer, Castro; Rasimick, Brian; Green, Zachary

    2012-01-01

    Cabin environmental control is an important issue for a successful Moon mission. Due to the unique environment of the Moon, lunar dust control is one of the main problems that significantly diminishes the air quality inside spacecraft cabins. Therefore, this innovation was motivated by NASA's need to minimize the negative health impact that air-suspended lunar dust particles have on astronauts in spacecraft cabins. It is based on fabrication of a hybrid filter comprising nanofiber nonwoven layers coated on porous polymer membranes with uniform cylindrical pores. This design results in a high-efficiency gas particulate filter with low pressure drop and the ability to be easily regenerated to restore filtration performance. A hybrid filter was developed consisting of a porous membrane with uniform, micron-sized, cylindrical pore channels coated with a thin nanofiber layer. Compared to conventional filter media such as a high-efficiency particulate air (HEPA) filter, this filter is designed to provide high particle efficiency, low pressure drop, and the ability to be regenerated. These membranes have well-defined micron-sized pores and can be used independently as air filters with a discrete particle size cut-off, or coated with nanofiber layers for filtration of ultrafine nanoscale particles. The filter has a thin design intended to facilitate filter regeneration by localized air pulsing. The main feature of this invention is the combination of a micro-engineered straight-pore membrane with nanofibers. The micro-engineered straight pore membrane can be prepared with extremely high precision. Because the resulting membrane pores are straight and not tortuous like those found in conventional filters, the pressure drop across the filter is significantly reduced. The nanofiber layer is applied as a very thin coating to enhance filtration efficiency for fine nanoscale particles. Additionally, the thin nanofiber coating is designed to promote capture of dust particles on the filter surface and to facilitate dust removal with pulse or back airflow.

  10. Decentralised Power Active Filters

    Wada M. Hosny

    2004-01-01

    Full Text Available This paper deals with a decentralised power active filter control based on the separate computation of the reference current for each active filter operating at a given harmonic frequency. The basic principle of such a controlled active filter is explained. It is shown how the nth harmonic component of the reference current can be calculated. Simulation results are shown at the end of the paper.
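
    As a sketch of that calculation: the nth harmonic of the sampled load current can be obtained with a single-bin DFT over one fundamental period, and the filter's compensation reference is then the negative of the reconstructed component. The Python fragment below is illustrative only; the names and the single-bin DFT formulation are assumptions, not taken from the paper.

      import numpy as np

      def nth_harmonic_reference(i_load, n, N):
          # N: samples per fundamental period; analyse the most recent period
          frame = i_load[-N:]
          k = np.arange(N)
          # single-bin DFT: complex amplitude of harmonic n
          c = (2.0 / N) * np.sum(frame * np.exp(-2j * np.pi * n * k / N))
          # reconstruct the n-th harmonic waveform over one period
          harmonic = np.real(c * np.exp(2j * np.pi * n * k / N))
          return -harmonic   # inject the negative to cancel this harmonic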

  11. Fractional and seasonal filtering

    Guegan, Dominique; Ferrara, Laurent

    2008-01-01

    We introduce in this study a new strategy for modelling simultaneously persistence and seasonality in economic data, using different stochastic filters based on Gegenbauer modelling. The limits and advantages of these filters are discussed in order to improve the adjustment of economic series, particularly when a specific trend is observed. The series of new car registrations in the Euro-zone is modelled using the previous filters

  12. Boosted Adaptive Filters

    Kari, Dariush; Delibalta, Ibrahim; Kozat, Suleyman Serdar

    2016-01-01

    We introduce the boosting notion, extensively used in different machine learning applications, to the adaptive signal processing literature and implement several different adaptive filtering algorithms. In this framework, we have several adaptive filtering algorithms, i.e., the weak learners, that run in parallel on a common task such as equalization, classification, regression or filtering. For each newly received input vector and observation pair, each algorithm adapts itself based on the perform...
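
    Although the abstract is truncated, the parallel-weak-learner idea can be illustrated with a performance-weighted combination of LMS filters with different step sizes, where better-performing learners receive exponentially larger mixture weights. This is a generic sketch in that spirit, not the authors' exact boosting scheme, and every name in it is illustrative.

      import numpy as np

      def combined_lms(x, d, L=8, mus=(0.01, 0.05, 0.2), beta=0.9):
          K = len(mus)
          W = np.zeros((K, L))          # one tap vector per weak learner
          loss = np.zeros(K)            # running squared error per learner
          xbuf = np.zeros(L)
          out = np.zeros(len(x))
          for n in range(len(x)):
              xbuf = np.roll(xbuf, 1)
              xbuf[0] = x[n]
              y = W @ xbuf              # each learner's prediction
              lam = np.exp(-loss)
              lam /= lam.sum()          # mixture weights favour low recent loss
              out[n] = lam @ y          # combined output
              e = d[n] - y              # per-learner errors
              loss = beta * loss + (1 - beta) * e ** 2
              W += (np.array(mus) * e)[:, None] * xbuf  # independent LMS updates
          return out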

  13. Bloofi: Multidimensional Bloom Filters

    Crainiceanu, Adina; Lemire, Daniel

    2015-01-01

    Bloom filters are probabilistic data structures commonly used for approximate membership problems in many areas of Computer Science (networking, distributed systems, databases, etc.). With the increase in data size and distribution of data, problems arise where a large number of Bloom filters are available, and all of them need to be searched for potential matches. As an example, in a federated cloud environment, each cloud provider could encode the information using Bloom filters and share the ...
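
    For readers unfamiliar with the underlying structure: a Bloom filter sets k hashed bit positions per inserted item, and a query answers "possibly present" only if all k bits are set, so false positives are possible but false negatives are not. Below is a minimal illustrative Python version (a single flat filter, not Bloofi's hierarchical variant).

      import hashlib

      class BloomFilter:
          def __init__(self, m=1024, k=3):
              self.m, self.k = m, k
              self.bits = bytearray(m // 8 + 1)

          def _positions(self, item):
              # derive k bit positions from salted SHA-256 digests
              for i in range(self.k):
                  h = hashlib.sha256(f"{i}:{item}".encode()).digest()
                  yield int.from_bytes(h[:8], "big") % self.m

          def add(self, item):
              for p in self._positions(item):
                  self.bits[p // 8] |= 1 << (p % 8)

          def __contains__(self, item):  # may report false positives
              return all(self.bits[p // 8] & (1 << (p % 8))
                         for p in self._positions(item))

    Usage is the obvious one: after bf.add("provider-42"), the test "provider-42" in bf returns True, while an unseen key returns False except with small false-positive probability.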

  14. Bayesian Trend Filtering

    Roualdes, Edward A.

    2015-01-01

    We develop a fully Bayesian hierarchical model for trend filtering, itself a new development in nonparametric, univariate regression. The framework more broadly applies to the generalized lasso, but focus is on Bayesian trend filtering. We compare two shrinkage priors, double exponential and generalized double Pareto. A simulation study, comparing Bayesian trend filtering to the original formulation and a number of other popular methods shows our method to improve estimation error while maint...

  15. Retrofitting fabric filters for clean stack emission

    The fly ash generated from New South Wales coals, which are predominantly low sulphur coals, has been difficult to collect in traditional electrostatic precipitators. During the early 1970s, development work was undertaken on the use of fabric filters at some of the Commission's older power stations. The satisfactory performance of the plant at those power stations led to the selection of fabric filters for flue gas cleaning at the next two new power stations constructed by the Electricity Commission of New South Wales. On-going pilot plant testing has continued to indicate the satisfactory performance of enhanced designs of fabric filters of varying types, and the Commission has recently retrofitted pulse cleaned fabric filters to 2 x 350 MW units at a further power station, with plans to retrofit similar plant to the remaining 2 x 350 MW units at that station. A contract has also been let for the retrofitting of pulse cleaned fabric filters to 4 x 500 MW units at another power station in the Commission's system. The paper reviews the performance of the 6000 MW of plant operating with fabric filters. Fabric selection and fabric life form an important aspect of this review

  16. Optimal filter systems for photometric redshift estimation

    Benítez, N; López-Aguerri, J A; Alfaro, E; Broadhurst, T; Cabrera, J; Castander, F J; Cepa, J; Cerviño, M; Cristobal-Hornillos, D; Fernández-Soto, A; González-Delgado, R M; Infante, L; Márquez, I; Martínez, V J; Masegosa, J; Del Olmo, A; Perea, J; Prada, F; Quintana, J M; Sánchez, S F

    2008-01-01

    In the next years, several cosmological surveys will rely on imaging data to estimate the redshift of galaxies, using traditional filter systems with 4-5 optical broad bands; narrower filters improve the spectral resolution, but strongly reduce the total system throughput. We explore how photometric redshift performance depends on the number of filters n_f, characterizing the survey depth through the fraction of galaxies with unambiguous redshift estimates. For a combination of total exposure time and telescope imaging area of 270 hrs m^2, 4-5 filter systems perform significantly worse, both in completeness depth and precision, than systems with n_f >= 8 filters. Our results suggest that for low n_f, the color-redshift degeneracies overwhelm the improvements in photometric depth, and that even at higher n_f, the effective photometric redshift depth decreases much more slowly with filter width than naively expected from the reduction in S/N. Adding near-IR observations improves the performance of low n_f syste...

  17. Fast Guided Filter

    He, Kaiming; Sun, Jian

    2015-01-01

    The guided filter is a technique for edge-aware image filtering. Because of its nice visual quality, fast speed, and ease of implementation, the guided filter has seen various applications in real products, such as image editing apps in phones and stereo reconstruction, and has been included in official MATLAB and OpenCV. In this note, we show that the guided filter can be simply sped up from O(N) time to O(N/s^2) time for a subsampling ratio s. In a variety of applications, this leads...
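
    The speed-up works by estimating the filter's local linear coefficients on s-times subsampled images and only applying them at full resolution. A compact Python sketch under that reading, using scipy for box filtering and bilinear resizing; the function and parameter names are ours, and I and p are assumed to be float grayscale arrays of equal shape.

      import numpy as np
      from scipy.ndimage import uniform_filter, zoom

      def fast_guided_filter(I, p, r=8, eps=1e-3, s=4):
          Is, ps = I[::s, ::s], p[::s, ::s]        # subsample guide and input
          rs = max(r // s, 1)
          box = lambda a: uniform_filter(a, size=2 * rs + 1)
          mI, mp = box(Is), box(ps)
          cov_Ip = box(Is * ps) - mI * mp
          var_I = box(Is * Is) - mI * mI
          a = cov_Ip / (var_I + eps)               # local linear model q = a*I + b
          b = mp - a * mI
          ma, mb = box(a), box(b)                  # smooth the coefficients
          up = lambda c: zoom(c, (I.shape[0] / c.shape[0],
                                  I.shape[1] / c.shape[1]), order=1)
          return up(ma) * I + up(mb)               # apply at full resolution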

  18. Paul Rodgersi filter Kohilas

    2000-01-01

    On 28 January, a site-specific sculpture and performance, "Filter", took place at Kohila secondary school. The undertaking, marking the school's 130th anniversary, was led by sculptor Paul Rodgers together with two final-year students, Marko Heinmäe and Hendrik Karm.

  19. Nanofiber Filters Eliminate Contaminants

    2009-01-01

    With support from Phase I and II SBIR funding from Johnson Space Center, Argonide Corporation of Sanford, Florida tested and developed its proprietary nanofiber water filter media. Capable of removing more than 99.99 percent of dangerous particles like bacteria, viruses, and parasites, the media was incorporated into the company's commercial NanoCeram water filter, an inductee into the Space Foundation's Space Technology Hall of Fame. In addition to its drinking water filters, Argonide now produces large-scale nanofiber filters used as part of the reverse osmosis process for industrial water purification.

  20. Updating the OMERACT filter

    Wells, George; Beaton, Dorcas E; Tugwell, Peter; Boers, Maarten; Kirwan, John R; Bingham, Clifton O; Boonen, Annelies; Brooks, Peter; Conaghan, Philip G; D'Agostino, Maria-Antonietta; Dougados, Maxime; Furst, Daniel E; Gossec, Laure; Guillemin, Francis; Helliwell, Philip; Hewlett, Sarah; Kvien, Tore K; Landewé, Robert B; March, Lyn; Mease, Philip J; Østergaard, Mikkel; Simon, Lee; Singh, Jasvinder A; Strand, Vibeke; van der Heijde, Désirée M

    2014-01-01

    The "Discrimination" part of the OMERACT Filter asks whether a measure discriminates between situations that are of interest. "Feasibility" in the OMERACT Filter encompasses the practical considerations of using an instrument, including its ease of use, time to complete, monetary costs, and...... interpretability of the question(s) included in the instrument. Both the Discrimination and Reliability parts of the filter have been helpful but were agreed on primarily by consensus of OMERACT participants rather than through explicit evidence-based guidelines. In Filter 2.0 we wanted to improve this definition...

  1. Filters in nuclear facilities

    The topics of the nine papers given include the behavior of HEPA filters during exposure to air flows of high humidity as well as of high differential pressure, the development of steel-fiber filters suitable for extreme operating conditions, and the occurrence of various radioactive iodine species in the exhaust air from boiling water reactors. In an introductory presentation the German view of the performance requirements to be met by filters in nuclear facilities as well as the present status of filter quality assurance are discussed. (orig.)

  2. Linear phase compressive filter

    McEwan, Thomas E. (Livermore, CA)

    1995-01-01

    A phase linear filter for soliton suppression is in the form of a laddered series of stages of non-commensurate low pass filters with each low pass filter having a series coupled inductance (L) and a reverse biased, voltage dependent varactor diode, to ground which acts as a variable capacitance (C). L and C values are set to levels which correspond to a linear or conventional phase linear filter. Inductance is mapped directly from that of an equivalent nonlinear transmission line and capacitance is mapped from the linear case using a large signal equivalent of a nonlinear transmission line.

  3. Traditional preventive treatment options

    Longbottom, C; Ekstrand, K; Zero, D

    2009-01-01

    Preventive treatment options can be divided into primary, secondary and tertiary prevention techniques, which can involve patient- or professionally applied methods. These include: oral hygiene (instruction), pit and fissure sealants ('temporary' or 'permanent'), fluoride applications (patient- or professionally applied), ... conventional operative care, and since controlling the caries process prior to first restoration is the key to breaking the repair cycle and improving care for patients, future research should address the shortcomings in the current level of supporting evidence for the various traditional preventive treatment...

  4. [Children and traditional practices].

    Aholi, P

    1990-07-01

    African traditional practices can be both beneficial and harmful to the newly born. This article describes these practices from 4 perspectives: 1) the period following childbirth or "maternage;" 2) nutrition; 3) curative care; and 4) social customs. The beneficial practices include: 1) giving the baby water as soon as he is washed to prevent neonatal hypoglycemia; 2) breast feeding; 3) carrying the baby on the mother's back and 4) the traditional massage. The harmful practices during maternage include: 1) the baby is rolled in the dirt to protect him and "give birth to his race;" 2) after birth the baby is given lemon juice or gin to prevent the obstruction of the respiratory cords; 3) mother and baby are "put in the dark" or a separate room for the rest of the family and community for 6 days to protect them against evil influences. The harmful nutritional practices are based on superstitions that relate to all animal products because they might produce diseases. 1) Eggs are known to cause diarrhea and throughout Africa eggs are forbidden because of their effect on children's physical development. 2) Chicken and pigeon and "everything that flies" causes convulsions. 3) Palm oil, oranges and bananas cause coughing. 4) Sugar cane, manioc leaves and everything with natural sugar cause intestinal ailments. Traditional health cures are used during an illness and are aimed at reestablishing the balance between man and his environment. Examples described include treatment for measles and chicken pox; fevers; diarrhea, and vomiting and convulsions. The positive traditional African practices need to be combined with those of modern medicine while discouraging the harmful practices. PMID:12342828

  5. Evaluation of median filtering after reconstruction with maximum likelihood expectation maximization (ML-EM) by real space and frequency space

    Maximum likelihood expectation maximization (ML-EM) image quality is sensitive to the number of iterations, because a large number of iterations leads to images with checkerboard noise. The use of median filtering in the reconstruction process allows both noise reduction and edge preservation. We examined the value of median filtering after reconstruction with ML-EM by comparing it with filtered back projection (FBP) with a ramp filter and with ML-EM without filtering. SPECT images were obtained with a dual-head gamma camera. The acquisition time was changed from 10 to 200 seconds/frame to examine the effect of the count statistics on the quality of the reconstructed images. First, images were reconstructed with ML-EM while changing the number of iterations from 1 to 150 in each study. Additionally, median filtering was applied following reconstruction with ML-EM. The quality of the reconstructed images was evaluated in terms of normalized mean square error (NMSE) values and two-dimensional power spectrum analysis. Median filtering after reconstruction by the ML-EM method provided stable NMSE values even when the number of iterations was increased. The signal component of the image remained close to the reference image for any number of iterations. Median filtering after reconstruction with ML-EM was useful in reducing noise, with a resolution similar to that achieved by reconstruction with FBP and a ramp filter. Especially in images with poor count statistics, median filtering after reconstruction with ML-EM is effective as a simple, widely available method. (author)
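
    For reference, one ML-EM iteration multiplies the current image estimate by the back projection of the measured-to-estimated projection ratio; the median filter is then applied to the final image. A toy Python sketch with a dense system matrix A of shape (n_bins, n_pixels) and a square image grid; this is an illustration of the technique, not the authors' implementation.

      import numpy as np
      from scipy.ndimage import median_filter

      def mlem_median(A, y, n_iter=50, med_size=3, eps=1e-12):
          x = np.ones(A.shape[1])                # flat initial image
          sens = A.sum(axis=0) + eps             # sensitivity image A^T 1
          for _ in range(n_iter):
              proj = A @ x + eps                 # forward projection
              x *= (A.T @ (y / proj)) / sens     # multiplicative EM update
          side = int(np.sqrt(x.size))            # assume a square image grid
          return median_filter(x.reshape(side, side), size=med_size)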

  6. The tyranny of tradition.

    Gulati, L

    1999-01-01

    This paper narrates the cruelty enforced by tradition on the lives of women in India. It begins with the life of the author's great-grandmother Ponnamma, whose family was rigidly patriarchal and applied Brahmin values. Here, women had very little say in the decisions men made, were forced into arranged marriage before puberty, were not sent to school, and were considered unimportant. This tradition lived on in the author's grandmother Seetha and in the life of her mother Saras. However, in the story of Saras, following the death of her husband, the family departed from rigid Brahmin tradition and orthodoxy. Her mother, unperturbed by the challenges she faced, consistently devised ways to cope and succeeded in a changing environment. Meaningless Brahminic rituals and prayers found no place in her life, which she approached with a cosmopolitan and humanitarian outlook. In essence, she shaped the lives of three daughters and a son, and all her grandchildren, making a success not only of her own life but of all the lives she touched. PMID:12322347

  7. FPGA Based Kalman Filter for Wireless Sensor Networks

    Vikrant Vij

    2011-01-01

    Full Text Available A Wireless Sensor Network (WSN) is a set of tiny and low-cost devices equipped with different kinds of sensors, a small microcontroller and a radio transceiver, typically powered by batteries. Target tracking is one of the most important applications of such a network system. Traditionally, the Kalman filter (KF) and its derivatives are used for tracking a random signal. The Kalman filter is a linear optimal filtering approach; to address problems where the system dynamics become nonlinear, researchers have developed sub-optimal extensions of the Kalman filter, two popular versions being the EKF (extended Kalman filter) and the UKF (unscented Kalman filter). The rapidly increasing popularity of WSNs has placed increased computational demands upon these systems, which can be met by FPGA-based design. FPGAs offer increased performance compared to microprocessors and increased flexibility compared to ASICs, while maintaining low power consumption
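
    The EKF mentioned here handles nonlinear dynamics or measurement models by linearizing both about the current estimate. One generic predict/update step is sketched below in Python; the paper's FPGA realization is not reproduced, f and h are the nonlinear models, F_jac and H_jac their Jacobians, and all names are ours.

      import numpy as np

      def ekf_step(x, P, z, f, F_jac, h, H_jac, Q, R):
          # predict through the nonlinear dynamics
          x_pred = f(x)
          F = F_jac(x)
          P = F @ P @ F.T + Q
          # update with the nonlinear measurement model, linearized at x_pred
          H = H_jac(x_pred)
          S = H @ P @ H.T + R
          K = P @ H.T @ np.linalg.inv(S)
          x_new = x_pred + K @ (z - h(x_pred))
          P = (np.eye(len(x_new)) - K @ H) @ P
          return x_new, P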

  8. Kalman filtering technique for reactivity measurement

    Measurement of reactivity and its on-line display is of great help in calibration of reactivity control and safety devices and in the planning of suitable actions during the reactor operation. In traditional approaches the reactivity is estimated from reactor period or by solving the inverse point kinetic equation. In this paper, an entirely new approach based on the Kalman filtering technique has been presented. The theory and design of the reactivity measuring instrument based on the approach has been explained. Its performance has been compared with traditional approaches by estimation of transient reactivity from flux variation data recorded in a research reactor. It is demonstrated that the Kalman filtering approach is superior to other methods from the viewpoints of accuracy, noise suppression, and robustness against uncertainties in the reactor parameters. (author). 1 fig
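
    For context, the traditional inverse point kinetics estimate that the Kalman approach is benchmarked against follows directly from the point kinetics equations: rho(t) = beta + Lambda*(dn/dt - sum_i lambda_i*C_i(t))/n(t), with the delayed-neutron precursor concentrations C_i integrated from the measured flux n(t). A minimal illustrative sketch follows; the finite difference on dn/dt is what makes this route noise-sensitive, which a Kalman formulation avoids by modelling reactivity as a filtered state.

      import numpy as np

      def inverse_point_kinetics(n, dt, beta_i, lam_i, Lambda):
          # n: measured flux/power samples; beta_i, lam_i: delayed group data
          beta = beta_i.sum()
          C = beta_i / (Lambda * lam_i) * n[0]   # equilibrium precursors at t=0
          rho = np.zeros(len(n))
          for k in range(1, len(n)):
              # advance precursor groups with explicit Euler
              C += dt * (beta_i / Lambda * n[k - 1] - lam_i * C)
              dndt = (n[k] - n[k - 1]) / dt      # noise-sensitive derivative
              rho[k] = beta + Lambda * (dndt - lam_i @ C) / n[k]
          return rho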

  9. Vena cava filter; Vena-cava-Filter

    Helmberger, T. [Klinikum Bogenhausen, Institut fuer Diagnostische und Interventionelle Radiologie und Nuklearmedizin, Muenchen (Germany)

    2007-05-15

    Fulminant pulmonary embolism is one of the major causes of death in the Western World. In most cases, deep leg and pelvic venous thrombosis are the cause. If an anticoagulant/thrombotic therapy is no longer possible or ineffective, a vena cava filter implant may be indicated if an embolism is threatening. Implantation of the filter is a simple and safe intervention. Nevertheless, it is necessary to take into consideration that the data base for determining the indications for this treatment is very limited. Currently, a reduction in the risk of thromboembolism with the use of filters of about 30%, of recurrences of almost 5% and fatal pulmonary embolism of 1% has been reported, with a risk of up to 20% of filter induced vena cava thrombosis. (orig.)

  10. Multilevel ensemble Kalman filtering

    Hoel, Håkon; Law, Kody J. H.; Tempone, Raul

    2015-01-01

    This work embeds a multilevel Monte Carlo (MLMC) sampling strategy into the Monte Carlo step of the ensemble Kalman filter (ENKF), thereby yielding a multilevel ensemble Kalman filter (MLENKF) which has provably superior asymptotic cost to a given accuracy level. The theoretical results are illustrated numerically.

  11. Updating the OMERACT filter

    Kirwan, John R; Boers, Maarten; Hewlett, Sarah; Beaton, Dorcas; Bingham, Clifton O; Choy, Ernest; Conaghan, Philip G; D'Agostino, Maria-Antonietta; Dougados, Maxime; Furst, Daniel E; Guillemin, Francis; Gossec, Laure; van der Heijde, Désirée M; Kloppenburg, Margreet; Kvien, Tore K; Landewé, Robert B M; Mackie, Sarah L; Matteson, Eric L; Mease, Philip J; Merkel, Peter A; Østergaard, Mikkel; Saketkoo, Lesley Ann; Simon, Lee; Singh, Jasvinder A; Strand, Vibeke; Tugwell, Peter

    2014-01-01

    OBJECTIVE: The Outcome Measures in Rheumatology (OMERACT) Filter provides guidelines for the development and validation of outcome measures for use in clinical research. The "Truth" section of the OMERACT Filter presupposes an explicit framework for identifying the relevant core outcomes that are...

  12. Updating the OMERACT filter

    Tugwell, Peter; Boers, Maarten; D'Agostino, Maria-Antonietta; Beaton, Dorcas; Boonen, Annelies; Bingham, Clifton O; Choy, Ernest; Conaghan, Philip G; Dougados, Maxime; Duarte, Catia; Furst, Daniel E; Guillemin, Francis; Gossec, Laure; Heiberg, Turid; van der Heijde, Désirée M; Hewlett, Sarah; Kirwan, John R; Kvien, Tore K; Landewé, Robert B; Mease, Philip J; Østergaard, Mikkel; Simon, Lee; Singh, Jasvinder A; Strand, Vibeke; Wells, George

    2014-01-01

    OBJECTIVE: The Outcome Measures in Rheumatology (OMERACT) Filter provides guidelines for the development and validation of outcome measures for use in clinical research. The "Truth" section of the OMERACT Filter requires that criteria be met to demonstrate that the outcome instrument meets the cr...

  13. Randomized Filtering Algorithms

    Katriel, Irit; Van Hentenryck, Pascal

    Filtering every global constraint of a CSP to arc consistency at every search step can be costly, and solvers often compromise on either the level of consistency or the frequency at which arc consistency is enforced. In this paper we propose two randomized filtering schemes for dense instances of...

  14. Internet Filtering in China

    Zittrain, Jonathan L.

    2003-01-01

    We collected data on the methods, scope, and depth of selective barriers to Internet usage through networks in China. Tests conducted from May through November 2002 indicated at least four distinct and independently operable Internet filtering methods - Web server IP address, DNS server IP address, keyword, and DNS redirection - with a quantifiable leap in filtering sophistication beginning in September 2002.

  15. Fuzzy Based Median Filtering for Removal of Salt-and-Pepper Noise

    Bhavana Deshpande

    2012-07-01

    Full Text Available This paper presents a filter for the restoration of images that are highly corrupted by salt-and-pepper noise. By incorporating fuzzy logic in the detection and correction of noisy pixels, the proposed filter is able to suppress noise and preserve details across a wide range of salt-and-pepper noise corruption, ranging from 1% to 60%. The proposed filter is tested on different images and is found to produce better results than the traditional median filter.
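
    The switching structure behind such filters is simple: detect the impulse-corrupted pixels, then replace only those with a local median, leaving clean pixels untouched; the fuzzy element in the paper softens this hard replacement with a membership weight. A crude Python sketch of the switching skeleton, where detection by exact 0/255 values is a simplification rather than the paper's fuzzy detector:

      import numpy as np
      from scipy.ndimage import median_filter

      def switching_median(img, low=0, high=255):
          noisy = (img == low) | (img == high)   # crude impulse detection
          med = median_filter(img, size=3)
          out = img.copy()
          out[noisy] = med[noisy]                # correct only detected pixels
          return out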

  16. Sub-micron filter

    Tepper, Frederick (Sanford, FL); Kaledin, Leonid (Port Orange, FL)

    2009-10-13

    Aluminum hydroxide fibers approximately 2 nanometers in diameter and with surface areas ranging from 200 to 650 m^2/g have been found to be highly electropositive. When dispersed in water they are able to attach to and retain electronegative particles. When combined into a composite filter with other fibers or particles they can filter bacteria and nano size particulates such as viruses and colloidal particles at high flux through the filter. Such filters can be used for purification and sterilization of water, biological, medical and pharmaceutical fluids, and as a collector/concentrator for detection and assay of microbes and viruses. The alumina fibers are also capable of filtering sub-micron inorganic and metallic particles to produce ultra pure water. The fibers are suitable as a substrate for growth of cells. Macromolecules such as proteins may be separated from each other based on their electronegative charges.

  17. Filter cake breaker systems

    Garcia, Marcelo H.F. [Poland Quimica Ltda., Duque de Caxias, RJ (Brazil)

    2004-07-01

    Drilling fluids filter cakes are based on a combination of properly graded dispersed particles and polysaccharide polymers. High efficiency filter cakes are formed by this combination, and their formation on wellbore walls during the drilling process has, among other roles, the task of protecting the formation from instantaneous or accumulative invasion of drilling fluid filtrate, granting stability to well and production zones. Filter cake minimizes contact between drilling fluid filtrate and water, hydrocarbons and clay existent in formations. The uniform removal of the filter cake from the entire interval is a critical factor of the completion process. The main methods used to break filter cake are classified into two groups, external or internal, according to their removal mechanism. The aim of this work is the presentation of these mechanisms as well as their efficiency. (author)

  18. In the Dirac tradition

    It was Paul Dirac who cast quantum mechanics into the form we now use, and many generations of theoreticians openly acknowledge his influence on their thinking. When Dirac died in 1984, St. John's College, Cambridge, his base for most of his lifetime, instituted an annual lecture in his memory at Cambridge. The first lecture, in 1986, attracted two heavyweights - Richard Feynman and Steven Weinberg. Far from using the lectures as a platform for their own work, in the Dirac tradition they presented stimulating material on deep underlying questions

  19. Non-Traditional Vectors for Paralytic Shellfish Poisoning

    Sara Watt Longan

    2008-06-01

    Full Text Available Paralytic shellfish poisoning (PSP), due to saxitoxin and related compounds, typically results from the consumption of filter-feeding molluscan shellfish that concentrate toxins from marine dinoflagellates. In addition to these microalgal sources, saxitoxin and related compounds, referred to in this review as STXs, are also produced in freshwater cyanobacteria and have been associated with calcareous red macroalgae. STXs are transferred and bioaccumulate throughout aquatic food webs, and can be vectored to terrestrial biota, including humans. Fisheries closures and human intoxications due to STXs have been documented in several non-traditional (i.e. non-filter-feeding) vectors. These include, but are not limited to, marine gastropods, both carnivorous and grazing, crustacea, and fish that acquire STXs through toxin transfer. Often due to a spatial, temporal, or species disconnection from the primary source of STXs (bloom-forming dinoflagellates), monitoring and management of such non-traditional PSP vectors has been challenging. A brief literature review is provided for filter-feeding (traditional) and non-filter-feeding (non-traditional) vectors of STXs with specific reference to human effects. We include several case studies pertaining to management actions to prevent PSP, as well as food poisoning incidents from STX(s) accumulation in non-traditional PSP vectors.

  20. An Ontology- Content-based Filtering Method

    Shoval, Peretz; Maidel, Veronica; Shapira, Bracha

    2008-01-01

    Traditional content-based filtering methods usually utilize text extraction and classification techniques for building user profiles as well as for representations of contents, i.e. item profiles. These methods have some disadvantages, e.g. a mismatch between user profile terms and item profile terms, leading to low performance. Some of the disadvantages can be overcome by incorporating a common ontology, which enables representing both the users' and the items' profiles with con...
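
    The ontology idea can be pictured as lifting both profiles from raw terms to shared concepts before matching, so that different vocabularies still meet. The fragment below is a deliberately small illustration with a hypothetical two-concept ontology; nothing in it is taken from the paper.

      # hypothetical mini-ontology: surface terms -> shared concepts
      ONTOLOGY = {"football": "sport", "soccer": "sport",
                  "equities": "finance", "stocks": "finance"}

      def concept_profile(term_weights):
          # lift a term-level profile (term -> weight) to concept level
          prof = {}
          for term, w in term_weights.items():
              concept = ONTOLOGY.get(term, term)
              prof[concept] = prof.get(concept, 0.0) + w
          return prof

      def cosine_match(user_prof, item_prof):
          # terms that never co-occur literally ("soccer" vs "football") still match
          common = set(user_prof) & set(item_prof)
          num = sum(user_prof[c] * item_prof[c] for c in common)
          den = (sum(v * v for v in user_prof.values()) ** 0.5 *
                 sum(v * v for v in item_prof.values()) ** 0.5)
          return num / den if den else 0.0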

  1. LCL Interface Filter Design for Shunt Active Power Filters

    DOBRICEANU, M.; Marin, D; Popescu, M.; BITOLEANU, A.

    2010-01-01

    This paper is focused on finding the parameters of a second order interface filter connected between the power system and the shunt active filter, based on the switching frequency of the active filter. Many publications on power active filters include various design methods for the interface inductive filter which take into account the injected current and its dynamics. Compared to these, the approach presented in this paper is oriented toward the design of the interface filter starting fro...

  2. Experimental validation of a single shaped filter approach for CT using variable source-to-filter distance for examination of arbitrary object diameters.

    Lück, Ferdinand; Kolditz, Daniel; Hupfer, Martin; Steiding, Christian; Kalender, Willi A

    2014-10-01

    The purpose of this study was to validate the use of a single shaped filter (SF) for computed tomography (CT) using variable source-to-filter distance (SFD) for the examination of different object diameters. A SF was designed by performing simulations with the purpose of achieving noise homogeneity in the reconstructed volume and dose reduction for arbitrary phantom diameters. This was accomplished by using a filter design method whose target is to achieve a homogeneous detector noise, but which also uses a correction factor for the filtered back projection process. According to simulation results, a single SF designed for one of the largest phantom diameters meets the requirements for all diameters when SFD can be adjusted. To validate these results, a SF made of aluminium alloy was manufactured. Measurements were performed on a CT scanner with polymethyl methacrylate (PMMA) phantoms of diameters from 40-100 mm. The filter was positioned at SFDs ranging from 97-168 mm depending on the phantom diameter. Image quality was evaluated for the reconstructed volume by assessing CT value accuracy, noise homogeneity, contrast-to-noise ratio weighted by dose (CNRD) and spatial resolution. Furthermore, scatter distribution was determined with the use of a beam-stop phantom. Dose was measured for a PMMA phantom with a diameter of 100 mm using a calibrated ionization chamber. The application of a single SF at variable SFD led to improved noise uniformity and dose reduction: noise homogeneity was improved from 15% down to about 0%, and dose was reduced by about 37%. Furthermore, scatter dropped by about 32%, which led to reduced cupping artifacts and improved CT value accuracy. Spatial resolution and CNRD were not affected by the SF. By means of a single SF with variable SFD designed for CT, significant dose reduction can be achieved and image quality can be improved by reducing noise inhomogeneity as well as scatter-induced artifacts. PMID:25198916

  3. Experimental validation of a single shaped filter approach for CT using variable source-to-filter distance for examination of arbitrary object diameters

    The purpose of this study was to validate the use of a single shaped filter (SF) for computed tomography (CT) using variable source-to-filter distance (SFD) for the examination of different object diameters. A SF was designed by performing simulations with the purpose of achieving noise homogeneity in the reconstructed volume and dose reduction for arbitrary phantom diameters. This was accomplished by using a filter design method whose target is to achieve a homogeneous detector noise, but which also uses a correction factor for the filtered back projection process. According to simulation results, a single SF designed for one of the largest phantom diameters meets the requirements for all diameters when SFD can be adjusted. To validate these results, a SF made of aluminium alloy was manufactured. Measurements were performed on a CT scanner with polymethyl methacrylate (PMMA) phantoms of diameters from 40–100 mm. The filter was positioned at SFDs ranging from 97–168 mm depending on the phantom diameter. Image quality was evaluated for the reconstructed volume by assessing CT value accuracy, noise homogeneity, contrast-to-noise ratio weighted by dose (CNRD) and spatial resolution. Furthermore, scatter distribution was determined with the use of a beam-stop phantom. Dose was measured for a PMMA phantom with a diameter of 100 mm using a calibrated ionization chamber. The application of a single SF at variable SFD led to improved noise uniformity and dose reduction: noise homogeneity was improved from 15% down to about 0%, and dose was reduced by about 37%. Furthermore, scatter dropped by about 32%, which led to reduced cupping artifacts and improved CT value accuracy. Spatial resolution and CNRD were not affected by the SF. By means of a single SF with variable SFD designed for CT, significant dose reduction can be achieved and image quality can be improved by reducing noise inhomogeneity as well as scatter-induced artifacts. (paper)

  4. Defueling filter test

    The Three Mile Island Unit 2 Reactor (TMI-2) has sustained core damage, creating a significant quantity of fine debris which can become suspended during the planned defueling operations and will have to be constantly removed to maintain water clarity and minimize radiation exposure. To accomplish these objectives, a Defueling Water Cleanup System (DWCS) has been designed. One of the primary components in the DWCS is a custom designed filter canister using an all stainless steel filter medium. The full scale filter canister is designed to remove suspended solids from 800 microns to 0.5 microns in size. Filter cartridges are fabricated into an element cluster to provide for a flowrate of greater than 100 gals/min. Babcock and Wilcox (B&W), under contract to GPU Nuclear Corporation, has evaluated two candidate DWCS filter concepts in a 1/100 scale proof-of-principle test program at B&W's Lynchburg Research Center. The filters were challenged with simulated solids suspensions of 1400 and 140 ppm in borated water (5000 ppm boron). Test data collected includes solids loading, effluent turbidity, and differential pressure trends versus time. From the proof-of-principle test results, a full-scale filter canister was generated

  5. Comparison Of Chest Filters

    Schiller, G.; Olson, A.; Nguyen, H.

    1985-09-01

    Recently, anatomically shaped lead acrylic filters have been introduced for chest radiography. Two of these filters were compared to several (Al, Cu, Y, and Pb acrylic) uniform filters. Phototimed exposures at 80, 100, 120 and 140 kVp were made on a realistic chest phantom. The optical density in the lung field was kept constant for all filters and kVp settings. Exposure time and entrance exposures to the mediastinum and lungs were measured. When compared to standard (3.2 mm Al HVL) aluminum filtration, a reduction of 44-60% in lung exposure and better visualization of the mediastinum and retrocardiac areas were noted. However, a significant (80-350%) increase in exposure time was required, and mediastinum exposure increased by 40-100 percent. When using uniform filters in addition to the standard aluminum filter, entrance exposures to the lung and mediastinum were reduced by 30-50% for yttrium and copper and by 27-35% for lead acrylic. Exposure times increased by up to 36%, 64%, and 52%, respectively. When using spatially shaped filters, improved image quality and reduced lung exposure result; however, one must be aware of the significant increase in exposure time, especially at low (80) kVp. There is also an increase of mediastinum exposure and the possibility of positioning artifacts.

  6. Ceramic fiber reinforced filter

    Stinton, David P. (Knoxville, TN); McLaughlin, Jerry C. (Oak Ridge, TN); Lowden, Richard A. (Powell, TN)

    1991-01-01

    A filter for removing particulate matter from high temperature flowing fluids, and in particular gases, that is reinforced with ceramic fibers. The filter has a ceramic base fiber material in the form of a fabric, felt, paper or the like, with the refractory fibers thereof coated with a thin layer of a protective and bonding refractory applied by chemical vapor deposition techniques. This coating causes each fiber to be physically joined to adjoining fibers so as to prevent movement of the fibers during use and to increase the strength and toughness of the composite filter. Further, the coating can be selected to minimize any reactions between the constituents of the fluids and the fibers. A description is given of the formation of a composite filter using a felt preform of commercial silicon carbide fibers together with the coating of these fibers with pure silicon carbide. Filter efficiency approaching 100% has been demonstrated with these filters. The fiber base material is alternately made from aluminosilicate fibers, zirconia fibers and alumina fibers. Coating with Al2O3 is also described. Advanced configurations for the composite filter are suggested.

  7. LCL Interface Filter Design for Shunt Active Power Filters

    DOBRICEANU, M.

    2010-08-01

    Full Text Available This paper is focused on finding the parameters of a second order interface filter connected between the power system and the shunt active filter, based on the switching frequency of the active filter. Many publications on power active filters include various design methods for the interface inductive filter which take into account the injected current and its dynamics. Compared to these, the approach presented in this paper is oriented toward the design of the interface filter starting from the filter transfer functions and imposing the desired filter performance.

  8. Non-traditional inheritance

    In the last few years, several non-traditional forms of inheritance have been recognized. These include mosaicism, cytoplasmic inheritance, uniparental disomy, imprinting, amplification/anticipation, and somatic recombination. Genomic imprinting (GI) is the dependence of the phenotype on the sex of the transmitting parent. GI in humans seems to involve growth, behaviour, and survival in utero. The detailed mechanism of genomic imprinting is not known, but it seems that some process is involved in turning a gene off; this probably involves two genes, one of which produces a product that turns the other gene off, and the gene that is itself turned off. The process of imprinting (turning off) may be associated with methylation. Erasure of imprinting can occur, and seems to be associated with meiosis. 10 refs

  9. EMI filter design

    Ozenbaugh, Richard Lee

    2011-01-01

    With today's electrical and electronics systems requiring increased levels of performance and reliability, the design of robust EMI filters plays a critical role in EMC compliance. Using a mix of practical methods and theoretical analysis, EMI Filter Design, Third Edition presents both a hands-on and academic approach to the design of EMI filters and the selection of components values. The design approaches covered include matrix methods using table data and the use of Fourier analysis, Laplace transforms, and transfer function realization of LC structures. This edition has been fully revised

  10. Circuits and filters handbook

    Chen, Wai-Kai

    2003-01-01

    A bestseller in its first edition, The Circuits and Filters Handbook has been thoroughly updated to provide the most current, most comprehensive information available in both the classical and emerging fields of circuits and filters, both analog and digital. This edition contains 29 new chapters, with significant additions in the areas of computer-aided design, circuit simulation, VLSI circuits, design automation, and active and digital filters. It will undoubtedly take its place as the engineer's first choice in looking for solutions to problems encountered in the design, analysis, and behavi

  11. Multilevel filtering elliptic preconditioners

    Kuo, C. C. Jay; Chan, Tony F.; Tong, Charles

    1989-01-01

    A class of preconditioners is presented for elliptic problems built on ideas borrowed from the digital filtering theory and implemented on a multilevel grid structure. They are designed to be both rapidly convergent and highly parallelizable. The digital filtering viewpoint allows the use of filter design techniques for constructing elliptic preconditioners and also provides an alternative framework for understanding several other recently proposed multilevel preconditioners. Numerical results are presented to assess the convergence behavior of the new methods and to compare them with other preconditioners of multilevel type, including the usual multigrid method as preconditioner, the hierarchical basis method and a recent method proposed by Bramble-Pasciak-Xu.

  12. HEPA filter jointer

    Hill, D.; Martinez, H.E.

    1998-02-01

    A HEPA filter jointer system was created to remove nitrate contaminated wood from the wooden frames of HEPA filters that are stored at the Rocky Flats Plant. A commercial jointer was chosen to remove the nitrated wood. The chips from the wood removal process are in the right form for caustic washing. The jointer was automated for safety and ease of operation. The HEPA filters are prepared for jointing by countersinking the nails with a modified air hammer. The equipment, computer program, and tests are described in this report.

  13. Smoke and pollutant filtering device

    A smoke and pollutant filtering device comprising a mask having a filter composed of a series of contiguous, serial layers of filtering material. The filter consists of front and rear gas permeable covers, a first filter layer of pressed vegetable matter, a second filter layer comprising a layer of activated charcoal adjacent a layer of aqua filter floss, a third filter comprising a gas permeable cloth situated between layers of pressed vegetable matter, and a fourth filter layer comprising an aqua filter floss. The first through fourth filter layers are sandwiched between the front and rear gas permeable covers. The filtering device is stitched together and mounted within a fire-retardant hood shaped to fit over a human head. Elastic bands are included in the hood to maintain the hood snugly about the head when worn

  14. X-ray differential phase-contrast tomographic reconstruction with a phase line integral retrieval filter

    We report an alternative reconstruction technique for x-ray differential phase-contrast computed tomography (DPC-CT). This approach is based on a new phase line integral projection retrieval filter, which is rooted in the derivative property of the Fourier transform and counteracts the differential nature of the DPC-CT projections. It first retrieves the phase line integrals from the DPC-CT projections. Then the standard filtered back-projection (FBP) algorithms popular in x-ray absorption-contrast CT are directly applied to the retrieved phase line integrals to reconstruct the DPC-CT images. Compared with the conventional DPC-CT reconstruction algorithms, the proposed method removes the Hilbert imaginary filter and allows for the direct use of absorption-contrast FBP algorithms. Consequently, FBP-oriented image processing techniques and reconstruction acceleration software that have already been successfully used in absorption-contrast CT can be directly adopted to improve the DPC-CT image quality and speed up the reconstruction
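
    Concretely, since the DPC projections approximate the derivative of the phase line integrals along the detector axis, dividing their Fourier transform by i*2*pi*f (with the unrecoverable DC term zeroed) retrieves the line integrals, after which an ordinary ramp-filtered back projection applies. The Python sketch below illustrates this reading, borrowing scikit-image's standard absorption-CT iradon for the FBP step; it is an illustration of the principle, not the authors' code.

      import numpy as np
      from skimage.transform import iradon   # standard absorption-CT FBP

      def dpc_fbp(dpc_sino, theta):
          # dpc_sino: differential projections, shape (n_detector, n_angles)
          n = dpc_sino.shape[0]
          f = np.fft.fftfreq(n)                      # detector-axis frequencies
          F = np.fft.fft(dpc_sino, axis=0)
          with np.errstate(divide="ignore", invalid="ignore"):
              F_int = F / (2j * np.pi * f[:, None])  # undo the derivative
          F_int[0, :] = 0.0                          # DC term is unrecoverable
          line_integrals = np.real(np.fft.ifft(F_int, axis=0))
          # ordinary ramp-filtered back projection on the retrieved integrals
          return iradon(line_integrals, theta=theta, filter_name="ramp")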

  15. HEPA air filter (image)

    ... pet dander and other irritating allergens from the air. Along with other methods to reduce allergens, such ... controlling the amount of allergens circulating in the air. HEPA filters can be found in most air ...

  16. Cleaning an air filter

    A hydrophobic HEPA air filter partially blocked by small ferric oxide and iron particles, such as arise from the plasma arc cutting of iron or steel, is unblocked by subjecting the filter to a flow of a vapour (which term includes droplets), atomised liquid or solution, aerosol or the like, of such concentration and duration as not to block the filter but to dissolve the oxide, which is redistributed to leave gas-conducting voids when the liquid subsequently dries out. The vapour may be a cold water fog produced by solid carbon dioxide and hot water, or an acid or organic solvent may be used. The unblocked filter may be coated with silane-coated glass spheres to facilitate subsequent filtration. A particular application involves plasma cutting of equipment contaminated with plutonium. (author)

  17. Paul Rodgersi filter Kohilas

    2000-01-01

    On 28 January, a site-specific sculpture and performance, "Filter", took place at Kohila secondary school. The undertaking, marking the school's 130th anniversary, was led by sculptor Paul Rodgers together with two final-year students, Marko Heinmäe and Hendrik Karm.

  18. Metalcasting: Filtering Molten Metal

    A more efficient method has been created to filter cast molten metal for impurities. Read about the resulting energy and money savings that can accrue to many different industries from the use of this exciting new technology

  19. Adaptive digital filters

    Kovačević, Branko; Milosavljević, Milan

    2013-01-01

    “Adaptive Digital Filters” presents an important discipline applied to the domain of speech processing. The book first makes the reader acquainted with the basic terms of filtering and adaptive filtering, before introducing the field of advanced modern algorithms, some of which are contributed by the authors themselves. Working in the field of adaptive signal processing requires the use of complex mathematical tools. The book offers a detailed presentation of the mathematical models that is clear and consistent, an approach that allows everyone with a college level of mathematics knowledge to successfully follow the mathematical derivations and descriptions of algorithms.   The algorithms are presented in flow charts, which facilitates their practical implementation. The book presents many experimental results and treats the aspects of practical application of adaptive filtering in real systems, making it a valuable resource for both undergraduate and graduate students, and for all others interested in m...

  20. Ultra accurate collaborative information filtering via directed user similarity

    Guo, Qiang; Song, Wen-Jun; Liu, Jian-Guo

    2014-01-01

    A key challenge of collaborative filtering (CF) information filtering is how to obtain reliable and accurate results with the help of peers' recommendations. Since the similarities from small-degree users to large-degree users would be larger than the ones in the opposite direction, the large-degree users' selections are recommended extensively by the traditional second-order CF algorithms. By considering the users' similarity direction and the second-order correlations to depress the influen...

  1. A Contextual Item-Based Collaborative Filtering Technology

    Pan Pan; Xueqing Tan

    2012-01-01

    This paper proposes a contextual item-based collaborative filtering technology, which builds on the traditional item-based collaborative filtering technology. In the recommendation process, the user's important mobile contextual information is taken into account, and the technology combines the ratings given to items under historical contexts that are similar to the user's current context, in order to predict which items will be preferred by user...
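
    As background, plain item-based collaborative filtering scores an unseen item by a similarity-weighted average of the user's own ratings; the contextual variant described here would first restrict the rating matrix to entries made under contexts similar to the current one. A minimal non-contextual Python sketch, where R is a users x items matrix with 0 for missing ratings and all names are illustrative:

      import numpy as np

      def item_based_scores(R, user):
          norms = np.linalg.norm(R, axis=0) + 1e-12
          S = (R.T @ R) / np.outer(norms, norms)   # item-item cosine similarity
          np.fill_diagonal(S, 0.0)
          r_u = R[user]
          rated = r_u > 0
          # similarity-weighted average of the user's existing ratings
          scores = (S[:, rated] @ r_u[rated]) / (np.abs(S[:, rated]).sum(axis=1) + 1e-12)
          scores[rated] = -np.inf                  # skip already-rated items
          return scores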

  2. Kalman Filtering in R

    Fernando Tusell

    2011-01-01

    Support in R for state space estimation via Kalman filtering was limited to one package, until fairly recently. In the last five years, the situation has changed with no less than four additional packages offering general implementations of the Kalman filter, including in some cases smoothing, simulation smoothing and other functionality. This paper reviews some of the offerings in R to help the prospective user to make an informed choice.

  3. Spatial filter issues

    Experiments and calculations indicate that the threshold pressure in spatial filters for distortion of a transmitted pulse scales approximately as I^0.2 and (F#)^2 over the intensity range from 10^14 to 2x10^15 W/cm^2. We also demonstrated an interferometric diagnostic that will be used to measure the scaling relationships governing pinhole closure in spatial filters

  4. NIRCam filter wheels

    McCully, Sean; Schermerhorn, Michael; Thatcher, John

    2005-08-01

    The NIRCam instrument will provide near-infrared imaging capabilities for the James Webb Space Telescope. In addition, this instrument contains the wavefront-sensing elements necessary for optimizing the performance of the primary mirror. Several of these wavefront-sensing elements will reside in the NIRCam Filter Wheel Assembly. The instrument and its complement of mechanisms and optics will operate at a cryogenic temperature of 35K. This paper describes the design of the NIRCam Filter Wheel Assembly.

  5. Filtered stochastic calculus

    Lenczewski, Romuald

    2001-01-01

    By introducing a color filtration to the multiplicity space, we extend the quantum Ito calculus on multiple symmetric Fock space to the framework of filtered adapted biprocesses. In this new notion of adaptedness, "classical" time filtration makes the integrands similar to adapted processes, whereas "quantum" color filtration produces their deviations from adaptedness. An important feature of this calculus, which we call filtered stochastic calculus, is that it provides an explicit interpo...

  6. Miniaturized superconducting microwave filters

    In this paper we present methods for the miniaturization of superconducting filters. We consider two designs of seventh-order bandpass Chebyshev filters based on lumped elements and a novel quasi-lumped element resonator. In both designs the area of the filters, with a central frequency of 2-5 GHz, is less than 1.2 mm^2. Such small filters can be readily integrated on a single board for multi-channel microwave control of superconducting qubits. The filters have been experimentally tested and the results are compared with simulations. The miniaturization resulted in parasitic coupling between resonators and within each resonator that affected primarily the stopband and increased the bandwidth. The severity of the error depends on the particular design, and was smaller when a groundplane was used under the inductances of the resonators. The best performance was reached for the quasi-lumped filter with a central frequency of 4.45 GHz, a quality factor of 40 and a 50 dB stopband

  7. Multivariate granulometric filters

    Batman, Sinan; Dougherty, Edward R.

    1997-04-01

    As introduced by Matheron, granulometries depend on a single sizing parameter for each structuring element. The concept of granulometry has recently been extended in such a way that each structuring element has its own sizing parameter, resulting in a filter Ψ_t depending on the vector parameter t = (t_1, ..., t_n). The present paper generalizes the concept of a parameterized reconstructive τ-opening to the multivariate setting, where the reconstructive filter Λ_t fully passes any connected component not fully eliminated by Ψ_t. The problem of minimizing the MAE between the filtered and ideal image processes becomes one of vector optimization in an n-dimensional search space. Unlike the univariate case, the MAE achieved by the optimum filter Λ_t is global in the sense that it is independent of the relative sizes of structuring elements in the filter basis. As a consequence, multivariate granulometries provide a natural environment in which to study the optimality of the choice of structuring elements. If the shapes of the structuring elements are themselves parameterized, the expected error is a deterministic function of the shape and size parameters and its minimization yields the optimal MAE filter.

  8. Filter Paper: Solution to High Self-Attenuation Corrections in HEPA Filter Measurements

    Oberer, R.B.; Harold, N.B.; Gunn, C.A.; Brummett, M.; Chaing, L.G.

    2005-10-01

    An 8 by 8 by 6 inch High Efficiency Particulate Air (HEPA) filter was measured as part of a uranium holdup survey in June of 2005, as it has been routinely measured every two months since 1998. Although the survey relies on gross gamma count measurements, this was one of a few measurements that had been converted to a quantitative measurement in 1998. The measurement was analyzed using the traditional Generalized Geometry Holdup (GGH) approach, using HMS3 software, with an area calibration and self-attenuation corrected with an empirical correction factor of 1.06. A result of 172 grams of 235U was reported. The actual quantity of 235U in the filter was approximately 1700 g. Because of this unusually large discrepancy, the measurement of HEPA filters will be discussed. Various techniques for measuring HEPA filters will be described using the measurement of a 24 by 24 by 12 inch HEPA filter as an example. A new method to correct for self-attenuation will be proposed for this measurement. Following the discussion of the 24 by 24 by 12 inch HEPA filter, the measurement of the 8 by 8 by 6 inch filter will be discussed in detail.

  9. Filter Paper: Solution to High Self-Attenuation Corrections in HEPA Filter Measurements

    An 8 by 8 by 6 inch High Efficiency Particulate Air (HEPA) filter was measured as part of a uranium holdup survey in June of 2005, as it has been routinely measured every two months since 1998. Although the survey relies on gross gamma count measurements, this was one of a few measurements that had been converted to a quantitative measurement in 1998. The measurement was analyzed using the traditional Generalized Geometry Holdup (GGH) approach, using HMS3 software, with an area calibration and self-attenuation corrected with an empirical correction factor of 1.06. A result of 172 grams of 235U was reported. The actual quantity of 235U in the filter was approximately 1700 g. Because of this unusually large discrepancy, the measurement of HEPA filters will be discussed. Various techniques for measuring HEPA filters will be described using the measurement of a 24 by 24 by 12 inch HEPA filter as an example. A new method to correct for self-attenuation will be proposed for this measurement. Following the discussion of the 24 by 24 by 12 inch HEPA filter, the measurement of the 8 by 8 by 6 inch filter will be discussed in detail.

  10. Dip-separated structural filtering using seislet transform and adaptive empirical mode decomposition based dip filter

    Chen, Yangkang

    2016-04-01

    The seislet transform has been demonstrated to have a better compression performance for seismic data compared with other well-known sparsity promoting transforms, thus it can be used to remove random noise by simply applying a thresholding operator in the seislet domain. Since the seislet transform compresses the seismic data along the local structures, seislet thresholding can be viewed as a simple structural filtering approach. Because of its dependence on a precise local slope estimation, the seislet transform usually suffers from low compression ratio and high reconstruction error for seismic profiles that have dip conflicts. In order to remove the limitation of seislet thresholding in dealing with conflicting-dip data, I propose a dip-separated filtering strategy. In this method, I first use an adaptive empirical mode decomposition based dip filter to separate the seismic data into several dip bands (5 or 6). Next, I apply seislet thresholding to each separated dip component to remove random noise. Then I combine all the denoised components to form the final denoised data. Compared with other dip filters, the empirical mode decomposition based dip filter is data-adaptive; one only needs to specify the number of dip components to be separated. Both complicated synthetic and field data examples show the superior performance of the proposed approach over the traditional alternatives. The dip-separated structural filtering is not limited to seislet thresholding, and can also be extended to all those methods that require slope information.

  11. Containment venting filter designs incorporating stainless-steel fiber filters

    Failure of the containment of a PWR as a consequence of a major reactor accident can be prevented by filtered containment venting into the stack through an accident filter system. This greatly reduces the environmental contamination by fission products, which otherwise would be released. Possible filter concepts and their embodiment in stainless-steel fiber filters are described. (orig./HP)

  12. Circular filter bag change ladderack system video presentation

    A great deal of research and development at Harwell over the last few years has centered around the design of circular radial-flow HEPA filters as alternatives to the traditional rectangular HEPA filter. With a circular insert there are inherent features which give this geometry certain advantages over its counterpart, such as ease of sealing and compatibility with remote handling and disposal routes; these have been well publicized in previous works. A mock-up is shown of a bag change ladderack system for a 3400 m³/h circular filter. It highlights the space requirements for bag changing and demonstrates the ease with which a filter may be replaced. The filter throat incorporates a silicone rubber lip seal which forms a flap seal against a tapered spigot feature built into the wall. The novelty of this filter design is that the bag is an integral part of the filter and is attached onto the filter flange. This enables the inside of the filter, where the contaminated particulate has collected, to be sealed/bagged off and hence the dust burden retained

  13. Distortion Parameters Analysis Method Based on Improved Filtering Algorithm

    ZHANG Shutuan

    2013-10-01

    Full Text Available In order to realize accurate distortion parameter testing of aircraft power supply systems and to satisfy the requirements of the corresponding equipment in the aircraft, a novel power parameter test system based on an improved filtering algorithm is introduced in this paper. The hardware of the test system is portable and offers high-speed data acquisition and processing; the software uses LabWindows/CVI as the development environment and adopts a pre-processing technique together with the improved filtering algorithm. Compared with the traditional filtering algorithm, the improved filtering algorithm increases the test accuracy. The application shows that the test system with the improved filtering algorithm achieves accurate test results and meets the design requirements.

  14. Ceramic filters for bulk inoculation of nickel alloy castings

    F. Binczyk

    2011-07-01

    Full Text Available The work includes the results of research on the production technology of ceramic filters which, besides the traditional filtering function, also play the role of an inoculant modifying the macrostructure of cast nickel alloys. To play this additional role, filters should demonstrate sufficient compression strength and ensure a proper flow rate of liquid alloy. The role of an inoculant is played by cobalt aluminate introduced into the composition of the external coating in an amount from 5 to 10 wt.%. The required compression strength (over 1 MPa) is provided by the supporting layers, deposited on the preform, which is a polyurethane foam. Based on a two-level fractional experiment 2^(4-1), the significance of the impact of various technological parameters (independent variables) on selected functional parameters of the ready filters was determined. An important effect of the number of supporting layers and of the sintering temperature of the filters after evaporation of the polyurethane foam was found.

  15. Fuzzy Stack Filters for Image Processing

    Cs. Stupak

    1999-06-01

    Full Text Available This paper is oriented to the filtering of monochromatic images distorted by impulsive noise using fuzzy stack filters. The fuzzy stack filter is obtained by extending stack filters by means of fuzzy logic. By adding to this filter some parameters that are adjusted by a neural adaptation algorithm, a new class of filters is obtained, the so-called fuzzy rank-order filters. This class of filters is compared with other well known filters such as stack filters and neural stack filters.

  16. Optimal filtering and filter stability of linear stochastic delay systems

    Kwong, R. H.-S.; Willsky, A. S.

    1977-01-01

    Optimal filtering equations are obtained for very general linear stochastic delay systems. Stability of the optimal filter is studied in the case where there are no delays in the observations. Using the duality between linear filtering and control, asymptotic stability of the optimal filter is proved. Finally, the cascade of the optimal filter and the deterministic optimal quadratic control system is shown to be asymptotically stable as well.

  17. Design of a cavity filter

    A cavity filter was developed for the SSRF 0-mode beam feedback. The filter is used to pick up the 500 MHz signal from the storage ring beam. Superfish was used to simulate the cavity bandpass filter model. The design method, the parameters of the filter and the results of beam measurements are described in this paper. (authors)

  18. Kalman Filtering for Manufacturing Processes

    Oakes, Thomas; Tang, Lie; Landers, Robert G.; Balakrishnan, S. N.

    2009-01-01

    This chapter presented a methodology, based on stochastic process modeling and Kalman filtering, to filter manufacturing process measurements, which are known to be inherently noisy. Via simulation studies, the methodology was compared to low pass and Butterworth filters. The methodology was applied in a Friction Stir Welding (FSW) process to filter data

  19. Choosing and using astronomical filters

    Griffiths, Martin

    2014-01-01

    As a casual read through any of the major amateur astronomical magazines will demonstrate, there are filters available for all aspects of optical astronomy. This book provides a ready resource on the use of the following filters, among others, for observational astronomy or for imaging: light pollution filters, planetary filters, solar filters, neutral density filters for Moon observation, and deep-sky filters for such objects as galaxies and nebulae. Deep-sky objects can be imaged in much greater detail than was possible many years ago. Amateur astronomers can take

  20. An IIR median hybrid filter

    Bauer, Peter H.; Sartori, Michael A.; Bryden, Timothy M.

    1992-01-01

    A new class of nonlinear filters, the so-called class of multidirectional infinite impulse response median hybrid filters, is presented and analyzed. The input signal is processed twice using a linear shift-invariant infinite impulse response filtering module: once with normal causality and a second time with inverted causality. The final output of the MIMH filter is the median of the two-directional outputs and the original input signal. Thus, the MIMH filter is a concatenation of linear filtering and nonlinear filtering (a median filtering module). Because of this unique scheme, the MIMH filter possesses many desirable properties which are both proven and analyzed (including impulse removal, step preservation, and noise suppression). A comparison to other existing median type filters is also provided.
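
    The description above is concrete enough to sketch. The following Python fragment (a rough sketch; the first-order IIR coefficients and the test signal are illustrative choices, not taken from the paper) runs the same IIR module with normal and inverted causality and takes the pointwise median of the two outputs and the input:

```python
# Sketch of a multidirectional IIR median hybrid (MIMH) filter as described
# above: the same IIR module is run with normal and inverted causality, and
# the output is the pointwise median of both results and the raw input.
# The first-order smoother coefficients and test signal are illustrative.
import numpy as np
from scipy.signal import lfilter

def mimh(x, b=(0.2,), a=(1.0, -0.8)):
    forward = lfilter(b, a, x)                # normal causality
    backward = lfilter(b, a, x[::-1])[::-1]   # inverted causality
    return np.median(np.vstack([forward, backward, x]), axis=0)

# Impulses are removed while the step edge is preserved.
x = np.concatenate([np.zeros(50), np.ones(50)])
x[25] = 10.0                                  # impulsive noise sample
y = mimh(x)
```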

  1. Boolean filters of distributive lattices

    M. Sambasiva Rao

    2013-07-01

    Full Text Available In this paper we introduce the notion of Boolean filters in a pseudo-complemented distributive lattice and characterize the class of all Boolean filters. Further a set of equivalent conditions are derived for a proper filter to become a prime Boolean filter. Also a set of equivalent conditions is derived for a pseudo-complemented distributive lattice to become a Boolean algebra. Finally, a Boolean filter is characterized in terms of congruences.

  2. DOE HEPA filter test program

    NONE

    1998-05-01

    This standard establishes essential elements of a Department of Energy (DOE) program for testing HEPA filters to be installed in DOE nuclear facilities or used in DOE-contracted activities. A key element is the testing of HEPA filters for performance at a DOE Filter Test Facility (FTF) prior to installation. Other key elements are (1) providing for a DOE HEPA filter procurement program, and (2) verifying that HEPA filters to be installed in nuclear facilities appear on a Qualified Products List (QPL).

  3. DOE HEPA filter test program

    This standard establishes essential elements of a Department of Energy (DOE) program for testing HEPA filters to be installed in DOE nuclear facilities or used in DOE-contracted activities. A key element is the testing of HEPA filters for performance at a DOE Filter Test Facility (FTF) prior to installation. Other key elements are (1) providing for a DOE HEPA filter procurement program, and (2) verifying that HEPA filters to be installed in nuclear facilities appear on a Qualified Products List (QPL)

  4. Fuzzy Digital Filtering: Signal Interpretation

    Juan C. Sánchez García; J. Jesús Medel Juárez; Juan C. García Infante

    2011-01-01

    The paper describes the properties of the fuzzy filter, considering its operational principles. A digital filter interacts with a reference model signal in a real process in order to obtain the best corresponding answer, having the minimum error at the filter output under the mean square criterion. Adding a fuzzy mechanism to this filter structure yields an intelligent filter, because it adaptively selects and emits a decision answer according to changes in the external reference signal, ...

  5. Stack filter classifiers

    Porter, Reid B [Los Alamos National Laboratory; Hush, Don [Los Alamos National Laboratory

    2009-01-01

    Just as linear models generalize the sample mean and weighted average, weighted order statistic models generalize the sample median and weighted median. This analogy can be continued informally to generalized additive models in the case of the mean, and Stack Filters in the case of the median. Both of these model classes have been extensively studied for signal and image processing, but it is surprising to find that for pattern classification, their treatment has been significantly one-sided. Generalized additive models are now a major tool in pattern classification and many different learning algorithms have been developed to fit model parameters to finite data. However, Stack Filters remain largely confined to signal and image processing, and learning algorithms for classification are yet to be seen. This paper is a step towards Stack Filter Classifiers and it shows that the approach is interesting from both a theoretical and a practical perspective.

  6. Morphing Ensemble Kalman Filters

    Beezley, Jonathan D

    2007-01-01

    A new type of ensemble filter is proposed, which combines an ensemble Kalman filter (EnKF) with the ideas of morphing and registration from image processing. This results in filters suitable for nonlinear problems whose solutions exhibit moving coherent features, such as thin interfaces in wildfire modeling. The ensemble members are represented as the composition of one common state with a spatial transformation, called registration mapping, plus a residual. A fully automatic registration method is used that requires only gridded data, so the features in the model state do not need to be identified by the user. The morphing EnKF operates on a transformed state consisting of the registration mapping and the residual. Essentially, the morphing EnKF uses intermediate states obtained by morphing instead of linear combinations of the states.

  7. Glove-box filters

    A device is described for simply and rapidly assembling and disassembling the filters used inside sealed enclosures, such as glove-boxes and shielded cells equipped with nippers or manipulators. The filters are of the type comprising a cylindrical casing containing a filtering member, the upper portion of the casing being open so as to allow the gases to be cleaned to flow in, whereas the casing bottom is centrally provided with a hole extended outwardly by a threaded collar on which is screwed a connecting-sleeve to be fixed to the mouth of a gas outlet pipe. To a yoke transverse bar is welded a pin which can be likened to a bent spring-blade, one arm of which, welded to said transverse bar, is rectilinear, whereas its other arm is provided with a boss cooperating with a cavity made in a protrusion of said pipe, right under the mouth thereof

  8. Filters in topology optimization

    Bourdin, Blaise

    1999-01-01

    In this article, a modified ("filtered") version of the minimum compliance topology optimization problem is studied. The direct dependence of the material properties on its pointwise density is replaced by a regularization of the density field using a convolution operator. In this setting it is possible to establish the existence of solutions. Moreover, convergence of an approximation by means of finite elements can be obtained. This is illustrated through some numerical experiments. The "filtering" technique is also shown to cope with two important numerical problems in topology optimization...

  9. Die filosofiese filter

    L F. Schulze

    1971-06-01

    Full Text Available A filter is a collector. It can serve as a purifier that captures and retains the impurities in moisture or water. It can, however, also serve to capture light rays of a particular wavelength and keep them away from the eyes or from the lens of a camera. In this way the image or surroundings that I see or photograph are coloured. We use the word filter here in the latter sense, although most people do not want to see or acknowledge it as such, but mean to use it merely as a purifier.

  10. Alarm filtering and presentation

    This paper discusses alarm filtering and presentation in the control room of nuclear and other process control plants. Alarm generation and presentation is widely recognized as a general process control problem. Alarm systems often fail to provide meaningful alarms to operators. Alarm generation and presentation is an area in which computer aiding is feasible and provides clear benefits. Therefore, researchers have developed several computerized alarm filtering and presentation approaches. This paper discusses problems associated with alarm generation and presentation. Approaches to improving the alarm situation and installation issues of alarm system improvements are discussed. The impact of artificial intelligence (AI) technology on alarm system improvements is assessed. (orig.)

  11. Digital filters in spectrometry

    This work presents the development and application of digital signal processing to different multichannel analysis spectra. The use of classic smoothing methods in signal processing applications is illustrated by a discussion of filters: autoregressive, moving average and ARMA filters. In general, simple linear smoothing routines do not provide appropriate smoothing of data that show local ruggedness such as strong discontinuities; however, the algorithms developed here have proven adaptable to this task. Four algorithms were tested: autoregressive, moving average, ARMA and binomial methods for windows of 5, 7, and 9 data points, all in the time domain and programmed in Matlab. (Author)
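
    As a point of reference for the moving average and binomial methods mentioned above, here is a minimal Python sketch (window lengths 5, 7 and 9 as in the record; the test signal is an illustrative assumption):

```python
# Moving-average and binomial smoothing for odd window lengths (5, 7, 9),
# matching two of the four methods above. The noisy peak is illustrative.
import numpy as np
from math import comb

def moving_average(x, n=5):
    """Equal-weight smoothing over an n-point window."""
    return np.convolve(x, np.ones(n) / n, mode="same")

def binomial_smooth(x, n=5):
    """Binomial (approximately Gaussian) smoothing over an n-point window."""
    kernel = np.array([comb(n - 1, k) for k in range(n)], dtype=float)
    return np.convolve(x, kernel / kernel.sum(), mode="same")

# Example: smooth a noisy spectrum-like peak with 5-, 7- and 9-point windows.
t = np.arange(200)
x = np.exp(-0.5 * ((t - 100) / 5.0) ** 2) + 0.05 * np.random.randn(200)
smoothed = {n: binomial_smooth(x, n) for n in (5, 7, 9)}
```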

  12. A new algorithm of inter-frame filtering in IR image based on threshold value

    Liu, Wei; Leng, Hanbing; Chen, Weining; Yang, Hongtao; Xie, Qingsheng; Yi, Bo; Zhang, Haifeng

    2013-09-01

    This paper proposes a new threshold-based inter-frame filtering algorithm for IR images, intended to solve the image blur and smear introduced by traditional inter-frame filtering algorithms. First, the causes of image blur and smear are identified by analyzing the general inter-frame filtering algorithm and the dynamic inter-frame filtering algorithm, leading to a new kind of time-domain filter. To obtain the filter coefficients, the difference image between the present and previous frames is computed; a noise threshold is then derived from the difference image by probability analysis. The relationship between the difference image and the threshold yields the filter coefficients. Finally, inter-frame filtering is applied to pixels corrupted by noise. Experimental results show that this algorithm successfully suppresses IR image blur and smear; the NETD measured with the traditional inter-frame filtering algorithm and with the new algorithm is 78 mK and 70 mK respectively, showing better noise reduction performance than traditional methods. The algorithm applies not only to still images but also to moving scenes. As a new algorithm of great practical value, it is easy to implement on an FPGA, has excellent real-time performance, and effectively extends the application scope of time-domain filtering algorithms.
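
    A rough Python sketch of this kind of threshold-gated inter-frame filter is given below. The mean-plus-three-sigma threshold rule stands in for the paper's probability analysis, and the blending factor alpha is an illustrative assumption:

```python
# Sketch of a threshold-gated inter-frame (temporal) filter in the spirit of
# the algorithm above: pixels whose frame-to-frame difference stays below a
# noise threshold are recursively averaged; pixels that change more (moving
# targets) pass through unfiltered, avoiding blur and smear. The
# mean + 3*std threshold rule stands in for the paper's probability
# analysis, and alpha is an illustrative blending factor.
import numpy as np

def temporal_filter(prev_out, current, alpha=0.5):
    diff = np.abs(current.astype(float) - prev_out.astype(float))
    threshold = diff.mean() + 3.0 * diff.std()   # assumed noise threshold
    static = diff < threshold                    # True where change is just noise
    out = current.astype(float)
    out[static] = alpha * prev_out[static] + (1 - alpha) * current[static]
    return out

# Example: a static scene with one fast-moving hot pixel.
prev = np.full((8, 8), 100.0)
cur = prev + np.random.randn(8, 8)               # sensor noise
cur[4, 4] += 50.0                                # moving target: left unfiltered
out = temporal_filter(prev, cur)
```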

  13. Traditional West Coast Native Medicine

    Deagle, George

    1988-01-01

    An important part of the complex culture of the Native people of Canada's Pacific coast is the traditional system of medicine each culture has developed. Population loss from epidemics and the influence of dominant European cultures has resulted in loss of many aspects of traditional medicine. Although some Native practices are potentially hazardous, continuation of traditional approaches to illness remains an important part of health care for many Native people. The use of “devil's club” pla...

  14. Traditional West Coast Native Medicine

    Deagle, George

    1988-01-01

    An important part of the complex culture of the Native people of Canada's Pacific coast is the traditional system of medicine each culture has developed. Population loss from epidemics and the influence of dominant European cultures has resulted in loss of many aspects of traditional medicine. Although some Native practices are potentially hazardous, continuation of traditional approaches to illness remains an important part of health care for many Native people. The use of “devil's club” plant by the Haida people illustrates that Native medicine has both spiritual and physical properties. Modern family practice shares many important foundations with traditional healing systems. PMID:21253031

  15. A FUZZY FILTERING MODEL FOR CONTOUR DETECTION

    T.C. Rajakumar

    2011-04-01

    Full Text Available Contour detection is a basic operation of image processing. A fuzzy filtering technique is proposed to generate thick edges in two-dimensional gray images. Fuzzy logic is applied to extract values from an image and is used for object contour detection. Fuzzy-based pixel selection can reduce the drawbacks of conventional methods (Prewitt, Roberts), in which one filter mask is used for all kinds of images and may succeed on one kind of image but fail on another. In this framework the threshold parameter values are obtained from the fuzzy histogram of the input image. The fuzzy inference method selects the complete information about the border of the object, and the resultant image has less impulse noise and increased edge contrast. The extracted object contour is thicker than in the existing methods. The performance of the algorithm is tested with the Peak Signal-to-Noise Ratio (PSNR) and the Complex Wavelet Structural Similarity Metric (CWSSIM).

  16. Directional bilateral filters for smoothing fluorescence microscopy images

    Venkatesh, Manasij; Mohan, Kavya; Seelamantula, Chandra Sekhar

    2015-08-01

    Images obtained through fluorescence microscopy at low numerical aperture (NA) are noisy and have poor resolution. Images of specimens such as F-actin filaments obtained using confocal or widefield fluorescence microscopes contain directional information, and it is important that an image smoothing or filtering technique preserve the directionality. F-actin filaments are widely studied in pathology because abnormalities in actin dynamics play a key role in the diagnosis of cancer, cardiac diseases, vascular diseases, myofibrillar myopathies, neurological disorders, etc. We develop the directional bilateral filter as a means of filtering out the noise in the image without significantly altering the directionality of the F-actin filaments. The bilateral filter is anisotropic to start with, but we add an additional degree of anisotropy by employing an oriented domain kernel for smoothing. The orientation is locally adapted using a structure tensor, and the parameters of the bilateral filter are optimized within the framework of statistical risk minimization. We show that the directional bilateral filter has better denoising performance than the traditional Gaussian bilateral filter and other denoising techniques such as SURE-LET, non-local means, and guided image filtering at various noise levels in terms of peak signal-to-noise ratio (PSNR). We also show quantitative improvements in low NA images of F-actin filaments.
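
    For orientation, the standard (isotropic) bilateral filter that the directional variant builds on can be sketched in a few lines of Python; the oriented domain kernel and structure-tensor adaptation described above would replace the isotropic spatial weight. Parameter values are illustrative:

```python
# Standard grayscale bilateral filter: each output pixel is a normalized sum
# weighted by a spatial (domain) Gaussian times an intensity (range)
# Gaussian. The directional variant above would swap the isotropic domain
# kernel for an oriented one. Parameter values are illustrative.
import numpy as np

def bilateral(img, radius=3, sigma_d=2.0, sigma_r=0.1):
    H, W = img.shape
    ys, xs = np.mgrid[-radius:radius + 1, -radius:radius + 1]
    domain = np.exp(-(xs ** 2 + ys ** 2) / (2 * sigma_d ** 2))  # spatial weights
    pad = np.pad(img, radius, mode="reflect")
    out = np.zeros_like(img)
    for i in range(H):
        for j in range(W):
            patch = pad[i:i + 2 * radius + 1, j:j + 2 * radius + 1]
            rng = np.exp(-(patch - img[i, j]) ** 2 / (2 * sigma_r ** 2))
            w = domain * rng                     # combined weights
            out[i, j] = (w * patch).sum() / w.sum()
    return out

# Example: smooth a noisy random image while preserving strong edges.
img = np.random.rand(64, 64)
smooth = bilateral(img)
```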

  17. Application of Unscented Kalman Filter for Sonar Signal Processing

    Leela Kumari. B , Padma Raju.K

    2012-06-01

    Full Text Available State estimation theory is one of the best mathematical approaches to analyze variations in the states of a system or process. The state of the system is defined by a set of variables that provide a complete representation of its internal condition at any given instant of time. Filtering of random processes is referred to as estimation, and is a well defined statistical technique. There are two types of state estimation processes, linear and nonlinear. Linear estimation of a system can easily be analyzed using the Kalman filter (KF), which computes the target state parameters with a priori information under a noisy environment. But the traditional KF is optimal only when the model is linear, and its performance is well defined only under the assumptions that the system model and noise statistics are well known. Most state estimation problems are nonlinear, which limits the practical applications of the KF. Its nonlinear extensions, the extended Kalman filter (EKF), the unscented Kalman filter (UKF) and the particle filter, are the best known nonlinear estimators. The EKF is the nonlinear version of the Kalman filter which linearizes about the current mean and covariance, and it has been considered the standard in the theory of nonlinear state estimation. Since truly linear systems do not really exist, a novel transformation is adopted instead. The approach in this paper is to analyze an algorithm for maneuvering target tracking using bearings-only measurements, where the UKF provides better state estimates.
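
    The unscented transform at the core of the UKF can be sketched compactly. The following Python fragment (a sketch under the common (alpha, beta, kappa) scaling convention; the range-bearing example is an illustrative assumption) propagates 2n+1 sigma points through a nonlinearity instead of linearizing it:

```python
# Sketch of the unscented transform at the heart of the UKF: 2n+1 sigma
# points capture the mean and covariance of the state and are propagated
# through the nonlinearity, avoiding the EKF's linearization. The scaling
# parameters follow the common (alpha, beta, kappa) convention; the
# range-bearing measurement example is illustrative.
import numpy as np

def unscented_transform(mean, cov, f, alpha=1e-3, beta=2.0, kappa=0.0):
    n = len(mean)
    lam = alpha ** 2 * (n + kappa) - n
    S = np.linalg.cholesky((n + lam) * cov)       # matrix square root
    sigma = np.vstack([mean, mean + S.T, mean - S.T])
    wm = np.full(2 * n + 1, 0.5 / (n + lam))      # mean weights
    wc = wm.copy()                                # covariance weights
    wm[0] = lam / (n + lam)
    wc[0] = lam / (n + lam) + (1 - alpha ** 2 + beta)
    Y = np.array([f(s) for s in sigma])           # propagate sigma points
    y_mean = wm @ Y
    d = Y - y_mean
    return y_mean, (wc[:, None] * d).T @ d

# Example: push a Gaussian 2-D position through a range-bearing measurement.
f = lambda p: np.array([np.hypot(p[0], p[1]), np.arctan2(p[1], p[0])])
m, P = unscented_transform(np.array([10.0, 5.0]), np.eye(2), f)
```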

  18. Survey of HEPA filter experience

    A survey of high efficiency particulate air (HEPA) filter applications and experience at Department of Energy (DOE) sites was conducted to provide an overview of the reasons and magnitude of HEPA filter changeouts and failures. Results indicated that approximately 58% of the filters surveyed were changed out in the three year study period, and some 18% of all filters were changed out more than once. Most changeouts (63%) were due to the existence of a high pressure drop across the filter, indicative of filter plugging. Other reasons for changeout included leak-test failure (15%), preventive maintenance service life limit (13%), suspected damage (5%) and radiation buildup (4%). Filter failures occurred with approximately 12% of all installed filters. Of these failures, most (64%) occurred for unknown or unreported reasons. Handling or installation damage accounted for an additional 19% of reported failures. Media ruptures, filter-frame failures and seal failures each accounted for approximately 5 to 6% of the reported failures

  19. Filtered beam spectrometers

    Filtered Beam Spectrometers (FBS) are one type of spectrometer with which it has now been shown that inelastic scattering experiments can be made using neutrons with energies in the electron volt range. A general description of the FBS is given in this paper. Examples of several types of data are presented and discussed

  20. Efficient Iterated Filtering

    Lindström, Erik; Ionides, Edward; Frydendall, Jan; Madsen, Henrik

    ...Cramér-Rao efficient. The proposed estimator is easy to implement as it only relies on non-linear filtering. This makes the framework flexible, as it is easy to tune the implementation to achieve computational efficiency. This is done by using the approximation of the score function derived from the theory on Iterative...

  1. Morphing and Ensemble Filtering

    Mandel, J.; Beezley, J.; Resler, Jaroslav; Juruš, Pavel; Eben, Kryštof

    Prague : Institute of Computer Science of the AS CR, v.v.i, 2010, s. 1-9. [Workshop on "GHG reduction using IT" /2./. Prague (CZ), 28.05.2010] Institutional research plan: CEZ:AV0Z10300504 Keywords : data assimilation * Kalman filter Subject RIV: JE - Non-nuclear Energetics, Energy Consumption ; Use

  2. Spectral Ensemble Kalman Filters

    Mandel, Jan; Kasanický, Ivan; Vejmelka, Martin; Fuglík, Viktor; Turčičová, Marie; Eben, Kryštof; Resler, Jaroslav; Juruš, Pavel

    2014-01-01

    Roč. 11, - (2014), EMS2014-446. [EMS Annual Meeting /14./ & European Conference on Applied Climatology (ECAC) /10./. 06.10.2014-10.10.2014, Prague] R&D Projects: GA ČR GA13-34856S Grant ostatní: NSF DMS-1216481 Institutional support: RVO:67985807 Keywords : data assimilation * spectral filter Subject RIV: DG - Athmosphere Sciences, Meteorology

  3. Printed analogue filter structures

    Evans, PSA; Ramsey, BJ; Harrey, PM; Harrison, DJ

    1999-01-01

    The authors report progress in conductive lithographic film (CLF) technology, which uses the offset lithographic printing process to form electrically conductive patterns on flexible substrates. Networks of planar passive components and interconnects fabricated simultaneously via the CLF process form notch filter networks at 85 kHz.

  4. Spot- Zombie Filtering System

    Arathy Rajagopal

    2014-01-01

    Full Text Available A major security challenge on the Internet is the existence of a large number of compromised machines. Such machines have been increasingly used to launch various security attacks, including spamming, spreading malware, DDoS, and identity theft. These compromised machines are called "Zombies". In general, e-mail applications and providers use spam filters to filter out spam messages. Spam filtering is a technique for discriminating genuine messages from spam messages. Attackers send spam messages to targeted machines by evading the filters, which causes an increase in false positives and false negatives. We develop an effective spam zombie detection system named SPOT by monitoring the outgoing messages of a network. SPOT focuses on the number of outgoing messages that are originated or forwarded by each computer on a network to identify the presence of zombies. SPOT is designed based on a powerful statistical tool called the Sequential Probability Ratio Test, which has bounded false positive and false negative error rates.
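
    The Sequential Probability Ratio Test underlying SPOT admits a compact sketch. In the following Python fragment, the per-message spam probabilities p0 (normal machine) and p1 (compromised machine) and the error targets are illustrative assumptions; the thresholds are Wald's classical bounds:

```python
# Sketch of the sequential probability ratio test (SPRT) behind SPOT.
# Assumed per-message spam probabilities: p0 for a normal machine, p1 for a
# compromised one; alpha/beta are target error rates. Thresholds are Wald's
# classical bounds. All numbers are illustrative.
from math import log

def sprt(observations, p0=0.2, p1=0.8, alpha=0.01, beta=0.01):
    upper = log((1 - beta) / alpha)   # cross above: accept "compromised"
    lower = log(beta / (1 - alpha))   # cross below: accept "normal"
    llr = 0.0
    for is_spam in observations:      # one boolean per outgoing message
        llr += log(p1 / p0) if is_spam else log((1 - p1) / (1 - p0))
        if llr >= upper:
            return "compromised"
        if llr <= lower:
            return "normal"
    return "undecided"                # keep monitoring

print(sprt([True, True, True, True, True]))  # -> "compromised"
```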

  5. Spot- Zombie Filtering System

    Arathy Rajagopal

    2015-10-01

    Full Text Available A major security challenge on the Internet is the existence of a large number of compromised machines. Such machines have been increasingly used to launch various security attacks, including spamming, spreading malware, DDoS, and identity theft. These compromised machines are called "Zombies". In general, e-mail applications and providers use spam filters to filter out spam messages. Spam filtering is a technique for discriminating genuine messages from spam messages. Attackers send spam messages to targeted machines by evading the filters, which causes an increase in false positives and false negatives. We develop an effective spam zombie detection system named SPOT by monitoring outgoing messages of a network. SPOT focuses on the number of outgoing messages that are originated or forwarded by each computer on a network to identify the presence of zombies. SPOT is designed based on a powerful statistical tool called the Sequential Probability Ratio Test, which has bounded false positive and false negative error rates.

  6. Enhanced Optical Filter Design

    Cushing, David

    2011-01-01

    This book serves as a supplement to the classic texts by Angus Macleod and Philip Baumeister, taking an intuitive approach to the enhancement of optical coating (or filter) performance. Drawing from 40 years of experience in thin film design, Cushing introduces the basics of thin films, the commonly used materials and their deposition, the major coatings and their applications, and improvement methods for each.

  7. Ceramic HEPA Filter Program

    Mitchell, M A; Bergman, W; Haslam, J; Brown, E P; Sawyer, S; Beaulieu, R; Althouse, P; Meike, A

    2012-04-30

    Potential benefits of ceramic filters in nuclear facilities: (1) Short term benefit for DOE, NRC, and industry - (a) CalPoly HTTU provides unique testing capability to answer questions for DOE - High temperature testing of materials, components, filter, (b) Several DNFSB correspondences and presentations by DNFSB members have highlighted the need for HEPA filter R and D - DNFSB Recommendation 2009-2 highlighted a nuclear facility response to an evaluation basis earthquake followed by a fire (aka shake-n-bake) and CalPoly has capability for a shake-n-bake test; (2) Intermediate term benefit for DOE and industry - (a) Filtration for specialty applications, e.g., explosive applications at Nevada, (b) Spin-off technologies applicable to other commercial industries; and (3) Long term benefit for DOE, NRC, and industry - (a) Across industry, strong desire for better performance filter, (b) Engineering solution to safety problem will improve facility safety and decrease dependence on associated support systems, (c) Large potential life-cycle cost savings, and (d) Facilitates development and deployment of LLNL process innovations to allow continuous ventilation system operation during a fire.

  8. Structural notch filter optimization

    Felton, R.; Burge, S.; Bradshaw, A. [Lancaster Univ. (United Kingdom). Dept. of Engineering

    1995-09-01

    A modified algorithm for nonlinear constrained optimization of structural mode filters for an aeroelastic aircraft model is presented. The optimizer set-up and control is implemented in a MATLAB™ graphical user interface environment. It is shown that the modified algorithm gives improved performance over existing nonlinear constrained optimization methods.

  9. Ozone decomposing filter

    Simandl, Ronald F. (Farragut, TN); Brown, John D. (Harriman, TN); Whinnery, Jr., LeRoy L. (Dublin, CA)

    1999-01-01

    In an improved ozone decomposing air filter carbon fibers are held together with a carbonized binder in a perforated structure. The structure is made by combining rayon fibers with gelatin, forming the mixture in a mold, freeze-drying, and vacuum baking.

  10. Mirrors as power filters

    Multilayer mirrors offer advantages in power filtering compared to total reflection mirrors in both wiggler and undulator beams at third generation synchrotron radiation sources currently under construction. These advantages come at the expense of increased absorbed power in the mirror itself, and of added complexity of beamline optical design. This paper discusses these aspects

  11. Parzen Particle Filters

    Lehn-Schiøler, Tue; Erdogmus, Deniz; Principe, Jose C.

    Using a Parzen density estimator any distribution can be approximated arbitrarily close by a sum of kernels. In particle filtering this fact is utilized to estimate a probability density function with Dirac delta kernels; when the distribution is discretized it becomes possible to solve an otherw...

  12. Signalverarbeitung, Filter und Effekte

    Zölzer, Udo

    In this chapter, the fundamentals of digital signal processing, an introduction to digital filters, and subsequently digital audio effects are presented. For this purpose a simple mathematical formulation is introduced, based on algorithms in the time domain. The equivalent treatment of these algorithms in the frequency domain is made possible by using the discrete-time Fourier transform.

  13. Bayesian Filters in Practice

    Krejsa, Jiří; Věchet, S.

    Bratislava : Slovak University of Technology in Bratislava, 2010, s. 217-222. ISBN 978-80-227-3353-3. [ Robotics in Education. Bratislava (SK), 16.09.2010-17.09.2010] Institutional research plan: CEZ:AV0Z20760514 Keywords : mobile robot localization * bearing only beacons * Bayesian filters Subject RIV: JD - Computer Applications, Robotics

  14. High temperature filter materials

    Alvin, M. A.; Lippert, T. E.; Bachovchin, D. M.; Tressler, R. E.

    Objectives of this program are to identify the potential long-term thermal/chemical effects that advanced coal-based power generating system environments have on the stability of porous ceramic filter materials, as well as to assess the influence of these effects on filter operating performance and life. We have principally focused our efforts on developing an understanding of the stability of the alumina/mullite filter material at high temperature (i.e., 870, 980, and 1100 C) under oxidizing conditions which contain gas phase alkali species. Testing has typically been performed in two continuous flow-through, high temperature test facilities at the Westinghouse Science and Technology Center, using 7 cm diameter by 6.4 mm thick discs (Alvin, 1992). Each disc of ceramic filter material is exposed for periods of 100 to 3,000 hours in duration. Additional efforts have been performed at Westinghouse to broaden our understanding of the stability of cordierite, cordierite-silicon nitride, reaction-bonded and sintered silicon nitride, and clay bonded silicon carbide under similar simulated advanced coal-fired process conditions. The results of these efforts are presented in this paper.

  15. The Kalman filter

    Andrade-Cetto, J.

    2002-01-01

    The Kalman filter, developed in the early sixties by R. E. Kalman, is a recursive state estimator for partially observed non-stationary stochastic processes. It gives an optimal estimate, in the least squares sense, of the actual value of the state vector from noisy observations.
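
    A minimal scalar example makes the recursion concrete. The sketch below (all noise variances and the constant-state model are illustrative assumptions) predicts, computes the gain, and updates with the innovation at each step:

```python
# Minimal scalar Kalman filter for a constant state observed in noise:
# x_k = x_{k-1} + w_k, z_k = x_k + v_k. Process variance q, measurement
# variance r, and the toy data are illustrative assumptions.
import random

def kalman_1d(measurements, q=1e-4, r=0.1, x0=0.0, p0=1.0):
    x, p = x0, p0
    estimates = []
    for z in measurements:
        p = p + q              # predict: uncertainty grows by process noise
        k = p / (p + r)        # Kalman gain
        x = x + k * (z - x)    # update with the innovation z - x
        p = (1 - k) * p        # posterior variance shrinks
        estimates.append(x)
    return estimates

# Example: noisy measurements of a constant value 1.0 converge to ~1.0.
zs = [1.0 + random.gauss(0, 0.3) for _ in range(50)]
print(kalman_1d(zs)[-1])
```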

  16. Ceramic HEPA Filter Program

    Potential benefits of ceramic filters in nuclear facilities: (1) Short term benefit for DOE, NRC, and industry - (a) CalPoly HTTU provides unique testing capability to answer questions for DOE - High temperature testing of materials, components, filter, (b) Several DNFSB correspondences and presentations by DNFSB members have highlighted the need for HEPA filter R and D - DNFSB Recommendation 2009-2 highlighted a nuclear facility response to an evaluation basis earthquake followed by a fire (aka shake-n-bake) and CalPoly has capability for a shake-n-bake test; (2) Intermediate term benefit for DOE and industry - (a) Filtration for specialty applications, e.g., explosive applications at Nevada, (b) Spin-off technologies applicable to other commercial industries; and (3) Long term benefit for DOE, NRC, and industry - (a) Across industry, strong desire for better performance filter, (b) Engineering solution to safety problem will improve facility safety and decrease dependence on associated support systems, (c) Large potential life-cycle cost savings, and (d) Facilitates development and deployment of LLNL process innovations to allow continuous ventilation system operation during a fire.

  17. Magnetic-Optical Filter

    Formicola, I; Pinto, C; Cerulo, P

    2007-01-01

    The Magnetic-Optical Filter (MOF) is an instrument suited to high precision spectral measurements owing to its particular characteristics. It is employed in astronomy and in the field of telecommunications (where it is called a FADOF). In this brief paper we summarize its fundamental structure and functioning.

  18. CSA noise measurement used on oscilloscope and digital filter

    A new method for the measurement of low-noise Charge Sensitive Preamplifiers, using a digital oscilloscope and a digital filter, is discussed in this paper. Compared with the traditional measurement, this method has the advantages of flexible parameters and convenience of use. The test result accords reasonably well with the conventional result. (authors)

  19. The impact of metallic filter media on HEPA filtration

    Traditional HEPA filter systems have limitations that often prevent them from solving many of the filtration problems in the nuclear industry, particularly in applications where long service or storage life, high levels of radioactivity, dangerous decomposition products, chemical aggression, organic solvents, elevated operating temperatures, fire resistance and resistance to moisture are issues. This paper addresses several of these matters of concern by considering the use of metallic filter media to solve HEPA filtration problems ranging from the long-term storage of transuranic waste at the WIPP site and of spent and damaged fuel assemblies, through glove box ventilation and tank venting, to the venting of fumes at elevated temperatures from incinerators, vitrification processes and conversion and sintering furnaces, as well as downstream of iodine absorbers in gas cooled reactors in the UK. The paper reviews the basic technology, development, performance characteristics and filtration efficiency, flow versus differential pressure, cleanability and costs of sintered metal fiber in comparison with traditional resin bonded glass fiber filter media and sintered metal powder filter media. Examples of typical filter element and system configurations and applications will be presented. The paper will also address the economic case for installing self-cleaning pre-filtration, using metallic media, to recover the small volumes of dust that would otherwise blind large volumes of final disposable HEPA filters, thus presenting a route to reduce ultimate disposal volumes and secondary waste streams. (authors)

  20. The impact of metallic filter media on HEPA filtration

    Chadwick, Chris; Kaufman, Seth [Microfiltrex, Porvair Filtration Group Ltd Fareham Industrial Park, Fareham, Hampshire, PO16 8XG (United Kingdom)

    2006-07-01

    Traditional HEPA filter systems have limitations that often prevent them from solving many of the filtration problems in the nuclear industry, particularly in applications where long service or storage life, high levels of radioactivity, dangerous decomposition products, chemical aggression, organic solvents, elevated operating temperatures, fire resistance and resistance to moisture are issues. This paper addresses several of these matters of concern by considering the use of metallic filter media to solve HEPA filtration problems ranging from the long-term storage of transuranic waste at the WIPP site and of spent and damaged fuel assemblies, through glove box ventilation and tank venting, to the venting of fumes at elevated temperatures from incinerators, vitrification processes and conversion and sintering furnaces, as well as downstream of iodine absorbers in gas cooled reactors in the UK. The paper reviews the basic technology, development, performance characteristics and filtration efficiency, flow versus differential pressure, cleanability and costs of sintered metal fiber in comparison with traditional resin bonded glass fiber filter media and sintered metal powder filter media. Examples of typical filter element and system configurations and applications will be presented. The paper will also address the economic case for installing self-cleaning pre-filtration, using metallic media, to recover the small volumes of dust that would otherwise blind large volumes of final disposable HEPA filters, thus presenting a route to reduce ultimate disposal volumes and secondary waste streams. (authors)

  1. Improved Passive-Damped LCL Filter to Enhance Stability in Grid-Connected Voltage-Source Converters

    Beres, Remus Narcis; Wang, Xiongfei; Blaabjerg, Frede; Bak, Claus Leth; Liserre, Marco

    2015-01-01

    This paper proposes an improved passive-damped LCL filter to be used as an interface between grid-connected voltage-source converters and the utility grid. The proposed filter replaces the LCL filter capacitor with a traditional C-type filter, with the resonant circuit tuned in such a way that switching harmonics due to pulse width modulation are cancelled. Since the tuned circuit of the C-type filter suppresses the switching harmonics more effectively, the total inductance of the filter can be reduced. Additionally, the rating of the damping resistor is lower, compared with conventional...

  2. Traditional Medicine Programmes in Madagascar

    Rasoanaivo, Philippe

    2006-01-01

    The note reviews Madagascar's inherited wealth of ethno-medical knowledge, being endowed with a flora of unique global importance on account of its biodiversity, endemicity, and ethno-medical uses. The government of Madagascar has shown its political commitment to traditional medicine by supporting, through an inter-ministerial convention, a commission to study regulations on traditional m...

  3. Experimental study of filter cake formation on different filter media

    Removal of particulate matter from gases generated in the process industry is important for product recovery as well as emission control. The dynamics of a filtration plant depend on operating conditions. The models that predict filter plant behaviour involve empirical resistance parameters which are usually derived from limited experimental data and are characteristics of the filter media and the filter cake (dust deposited on the filter medium). Filter cake characteristics are affected by the nature of the filter media, the process parameters and the mode of filter regeneration. Removal of dust particles from air is studied in a pilot-scale jet-pulsed bag filter facility closely resembling industrial filters. Limestone dust and ambient air are used in this study with two widely different filter media. All important parameters, such as pressure drop, gas flow rate and dust settling, are recorded continuously at 1 s intervals. The data are processed for estimation of the resistance parameters. The pressure drop rise on the test filter media is compared. Results reveal that the surface of the filter media has an influence on the pressure drop rise (concave pressure drop rise); a similar effect is produced by a partially jet-pulsed filter surface. Filter behaviour is also simulated using the estimated parameters and a simplified model, and compared with the experimental results. The distribution of cake area load is therefore an important aspect of jet-pulse-cleaned bag filter modeling. The mean specific cake resistance remains nearly constant on thoroughly jet-pulse-cleaned membrane-coated filter bags; however, this trend cannot be confirmed without independent cake height and density measurements. Thus the results reveal the importance of independent measurements of cake resistance. (author)

  4. Toward Green Cloud Computing: An Attribute Clustering Based Collaborative Filtering Method for Virtual Machine Migration

    Zhang Liu-Mei; Ma Jian-Feng; Wang Yi-Chuan; Lu Di

    2013-01-01

    In this study, an attribute-clustering-based collaborative filtering algorithm is described for virtual machine migration towards green Cloud computing. The algorithm utilizes the similarity characteristics of virtual machine task-related attributes, especially CPU-related attributes, to filter redundant data by feature selection. K-Means clustering is then applied to effectively solve the rating scale problems existing in the traditional collaborative filtering recommendation algorithm. Exper...

  5. Advanced Techniques in Harmonic Suppression via Active Power Filter (APF): A Review

    Ekhlas Mhawi; Hamdan Daniyal; Mohd Herwan Sulaiman

    2015-01-01

    This paper presents the recent development of artificial intelligence (AI) applications in active power filters (APF). As a result of developments in power electronic technology, the APF continues to attract ample attention. Compared with the traditional reactive LC filter, the active power filter is considered to be more effective in compensating the harmonic current generated by nonlinear loads. The APF can correct the power quality and improve the reliability and stability of the power utility. ...

  6. Confidence-Level-Based New Adaptive Particle Filter for Nonlinear Object Tracking

    Xiaoyong Zhang; Jun Peng; Wentao Yu; Kuo-chi Lin

    2012-01-01

    Nonlinear object tracking from noisy measurements is a basic skill and a challenging task in mobile robotics, especially in dynamic environments. The particle filter is a useful tool for nonlinear object tracking with non-Gaussian noise, but nonlinear object tracking requires real-time processing capability from the particle filter. Because the number of particles in a traditional particle filter is fixed, a lot of unnecessary computation can result. To address this issue, a confidence-level-based new ...
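
    For contrast with the adaptive scheme proposed above, a plain bootstrap particle filter with a fixed particle count can be sketched as follows (the model, noise levels and the effective-sample-size resampling rule are illustrative assumptions):

```python
# Plain bootstrap particle filter for 1-D tracking with a fixed particle
# count -- the overhead the adaptive filter above aims to avoid. Random-walk
# dynamics, Gaussian likelihood and the ESS resampling rule are illustrative.
import numpy as np

rng = np.random.default_rng(0)

def pf_step(particles, weights, z, q=0.5, r=1.0):
    particles = particles + rng.normal(0.0, q, size=particles.shape)  # propagate
    weights = weights * np.exp(-0.5 * ((z - particles) / r) ** 2)     # likelihood
    weights = weights / weights.sum()
    if 1.0 / np.sum(weights ** 2) < 0.5 * len(particles):             # low ESS
        idx = rng.choice(len(particles), size=len(particles), p=weights)
        particles = particles[idx]
        weights = np.full(len(particles), 1.0 / len(particles))
    return particles, weights

particles = rng.normal(0.0, 5.0, 500)
weights = np.full(500, 1.0 / 500)
for z in [0.4, 1.1, 2.2, 2.9, 4.1]:          # noisy positions of a moving target
    particles, weights = pf_step(particles, weights, z)
print((particles * weights).sum())           # posterior mean estimate
```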

  7. Study on New Method of Heart Disturbance Filtering on Measurement of Impedance Pneumograph

    The impedance pneumograph trace often contains cardiac disturbance. Traditional hardware and software filtering methods have been used to decrease this disturbance, but their effect is limited by the small frequency difference between the cardiac and pulmonary signals. This paper presents a new filtering approach in which the heart is modelled as a varying capacitance; the cardiac disturbance is then decreased by integration of this varying capacitance. Test results showed that the new filtering approach is feasible and satisfies the requirements of clinical measurement

  8. Location Estimation for an Autonomously Guided Vehicle using an Augmented Kalman Filter to Autocalibrate the Odometry

    Larsen, Thomas Dall; Bak, Martin; Andersen, Nils Axel; Ravn, Ole

    1998-01-01

    A Kalman filter using encoder readings as inputs and vision measurements as observations is designed as a location estimator for an autonomously guided vehicle (AGV). To reduce the effect of modelling errors an augmented filter that estimates the true system parameters is designed. The traditional...

  9. Inorganic UV filters

    Eloísa Berbel Manaia; Renata Cristina Kiatkoski Kaminski; Marcos Antonio Corrêa; Leila Aparecida Chiavacci

    2013-06-01

    Full Text Available Nowadays, concern over skin cancer has been growing more and more, especially in tropical countries where the incidence of UVA/B radiation is higher. The correct use of sunscreen is the most efficient way to prevent the development of this disease. The ingredients of sunscreen can be organic and/or inorganic sun filters. Inorganic filters present some advantages over organic filters, such as photostability, non-irritability and broad spectrum protection. Nevertheless, inorganic filters have a whitening effect in sunscreen formulations owing to their high refractive index, decreasing their esthetic appeal. Many techniques have been developed to overcome this problem and among them, the use of nanotechnology stands out. The estimated amount of nanomaterial in use must increase from 2000 tons in 2004 to a projected 58000 tons in 2020. In this context, this article aims to analyze critically both the different features of the production of inorganic filters (synthesis routes proposed in recent years) and the permeability, the safety and other characteristics of the new generation of inorganic filters.

  10. Hierarchical Bayes Ensemble Kalman Filtering

    Tsyrulnikov, Michael

    2015-01-01

    Ensemble Kalman filtering (EnKF), when applied to high-dimensional systems, suffers from an inevitably small affordable ensemble size, which results in poor estimates of the background error covariance matrix B. The common remedy is a kind of regularization, usually an ad-hoc spatial covariance localization (tapering) combined with artificial covariance inflation. Instead of using an ad-hoc regularization, we adopt the idea by Myrseth and Omre (2010) and explicitly admit that the B matrix is unknown and random and estimate it along with the state (x) in an optimal hierarchical Bayes analysis scheme. We separate forecast errors into predictability errors (i.e. forecast errors due to uncertainties in the initial data) and model errors (forecast errors due to imperfections in the forecast model) and include the two respective components P and Q of the B matrix into the extended control vector (x, P, Q). Similarly, we break the traditional backgrou...
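
    The baseline that the hierarchical scheme modifies is the stochastic EnKF analysis step, sketched below in Python (the dimensions, the observation operator and the noise levels are illustrative assumptions; the sample covariance computed from a small ensemble is exactly the rank-deficient estimate of B discussed above):

```python
# Stochastic (perturbed-observation) EnKF analysis step -- the baseline whose
# small-ensemble covariance estimate the hierarchical Bayes scheme above
# regularizes. Dimensions, observation operator and noise are illustrative.
import numpy as np

rng = np.random.default_rng(1)

def enkf_analysis(E, y, H, R):
    """E: n x N ensemble; y: m-vector of obs; H: m x n; R: m x m obs covariance."""
    n, N = E.shape
    A = E - E.mean(axis=1, keepdims=True)         # ensemble anomalies
    P = A @ A.T / (N - 1)                         # sample covariance: rank <= N-1
    K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)  # Kalman gain
    Y = y[:, None] + rng.multivariate_normal(np.zeros(len(y)), R, N).T
    return E + K @ (Y - H @ E)                    # update every member

E = rng.normal(size=(10, 20))                     # 10 state variables, 20 members
H = np.eye(2, 10)                                 # observe the first two components
R = 0.25 * np.eye(2)
E_a = enkf_analysis(E, np.array([1.0, -0.5]), H, R)
```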

  11. Intraindividual evaluation of the influence of iterative reconstruction and filter kernel on subjective and objective image quality in computed tomography of the brain

    Buhk, J.H. [Univ. Medical Center, Hamburg-Eppendorf (Germany). Dept. of Neuroradiology; Laqmani, A.; Schultzendorff, H.C. von; Hammerle, D.; Adam, G.; Regier, M. [Univ. Medical Center, Hamburg-Eppendorf (Germany). Dept. of Diagnostic and Interventional Radiology; Sehner, S. [Univ. Medical Center, Hamburg-Eppendorf (Germany). Inst. of Medical Biometry and Epidemiology; Fiehler, J. [Univ. Medical Center, Hamburg-Eppendorf (Germany). Neuroradiology; Nagel, H.D. [Dr. HD Nagel, Science and Technology for Radiology, Buchholz (Germany)

    2013-08-15

    Objectives: To intraindividually evaluate the potential of 4th generation iterative reconstruction (IR) on brain CT with regard to subjective and objective image quality. Methods: 31 consecutive raw data sets of clinical routine native sequential brain CT scans were reconstructed with IR level 0 (= filtered back projection), 1, 3 and 4; 3 different brain filter kernels (smooth/standard/sharp) were applied respectively. Five independent radiologists with different levels of experience performed subjective image rating. Detailed ROI analysis of image contrast and noise was performed. Statistical analysis was carried out by applying a random intercept model. Results: Subjective scores for the smooth and the standard kernels were best at low IR levels, but both, in particular the smooth kernel, scored inferior with an increasing IR level. The sharp kernel scored lowest at IR 0, while the scores substantially increased at high IR levels, reaching significantly best scores at IR 4. Objective measurements revealed an overall increase in contrast-to-noise ratio at higher IR levels, which was highest when applying the soft filter kernel. The absolute grey-white contrast decreased with an increasing IR level and was highest when applying the sharp filter kernel. All subjective effects were independent of the raters' experience and the patients' age and sex. Conclusion: Different combinations of IR level and filter kernel substantially influence subjective and objective image quality of brain CT. (orig.)

  13. An Adjoint-Based Adaptive Ensemble Kalman Filter

    Song, Hajoon

    2013-10-01

    A new hybrid ensemble Kalman filter/four-dimensional variational data assimilation (EnKF/4D-VAR) approach is introduced to mitigate background covariance limitations in the EnKF. The work is based on the adaptive EnKF (AEnKF) method, which bears a strong resemblance to the hybrid EnKF/three-dimensional variational data assimilation (3D-VAR) method. In the AEnKF, the representativeness of the EnKF ensemble is regularly enhanced with new members generated after back projection of the EnKF analysis residuals to state space using a 3D-VAR [or optimal interpolation (OI)] scheme with a preselected background covariance matrix. The idea here is to reformulate the transformation of the residuals as a 4D-VAR problem, constraining the new member with model dynamics and the previous observations. This should provide more information for the estimation of the new member and reduce dependence of the AEnKF on the assumed stationary background covariance matrix. This is done by integrating the analysis residuals backward in time with the adjoint model. Numerical experiments are performed with the Lorenz-96 model under different scenarios to test the new approach and to evaluate its performance with respect to the EnKF and the hybrid EnKF/3D-VAR. The new method leads to the least root-mean-square estimation errors as long as the linear assumption guaranteeing the stability of the adjoint model holds. It is also found to be less sensitive to choices of the assimilation system inputs and parameters.

  14. Resonant γ-filter for Mössbauer spectroscopy

    The parameters of a resonant filter used to measure the proportion of resonance γ-quanta emitted without recoil are analyzed. The optimal thickness and blackness of the filter versus its chemical composition are determined.

  15. Filters used in scoliosis radiography

    The use of X-ray filters during full spinal radiography for scoliosis in adolescent patients is discussed. The filters compensate for differences in body thickness while maintaining optimum image quality. They also help to reduce patient dose

  16. Analog filters in nanometer CMOS

    Uhrmann, Heimo; Zimmermann, Horst

    2014-01-01

    Starting from the basics of analog filters and the poor transistor characteristics in nanometer CMOS, 10 high-performance analog filters developed by the authors in 120 nm and 65 nm CMOS are described extensively. Among them are gm-C filters, current-mode filters, and active filters for system-on-chip realization for Bluetooth, WCDMA, UWB, DVB-H, and LTE applications. For the active filters several operational amplifier designs are described. The book, furthermore, contains a review of the newest state of research on low-voltage low-power analog filters. To cover the topic of the book comprehensively, linearization issues and measurement methods for the characterization of advanced analog filters are introduced in addition. Numerous elaborate illustrations promote an easy comprehension. This book will be of value to engineers and researchers in industry as well as scientists and Ph.D. students at universities. The book is also recommendable to graduate students specializing in nanoelectronics, microelectronics ...

  17. Approaching Traditional Literature in Non-Traditional Ways.

    Tensen, Tracy Anderson; And Others

    1996-01-01

    Presents three brief essays that discuss approaching traditional literature (Thornton Wilder's "Our Town," Mark Twain's "Adventures of Huckleberry Finn," and Geoffrey Chaucer's "Canterbury Tales") in imaginative ways in high school English and vocational/technical classrooms. (RS)

  18. Stochastic stacking without filters

    The rate of accumulation of antiprotons is a critical factor in the design of p̄p colliders. A design of a system to accumulate higher p̄ fluxes is presented here as an alternative to the schemes used at the CERN AA and in the Fermilab Tevatron I design. Contrary to these stacking schemes, which use a system of notch filters to protect the dense core of antiprotons from the high power of the stack-tail stochastic cooling, an eddy-current shutter is used to protect the core in the region of the stack-tail cooling kicker. Without filters one can have larger cooling bandwidths, better mixing for stochastic cooling, and easier operational criteria for the power amplifiers. In the case considered here a flux of 1.4 × 10⁸ per sec is achieved with a 4 to 8 GHz bandwidth.

  19. Active pre-filters for dc/dc Boost regulators

    Carlos Andrés Ramos-Paja

    2014-07-01

    This paper proposes an active pre-filter to mitigate the current harmonics generated by classical dc/dc Boost regulators, which generate current ripples proportional to the duty cycle. Therefore, high output voltage conditions, i.e., high voltage conversion ratios, produce high current harmonics that must be filtered to avoid damage or source losses. Traditionally, these current components are filtered using electrolytic capacitors, which introduce reliability problems because of their high failure rate. The solution introduced in this paper instead uses a dc/dc converter based on the parallel connection of the Boost canonical cells to filter the current ripples generated by the Boost regulator, improving the system reliability. This solution provides the additional benefits of improving the overall efficiency and the voltage conversion ratio. Finally, the solution is validated with simulations and experimental results.

  20. Traditional Methods for Mineral Analysis

    Ward, Robert E.; Carpenter, Charles E.

    This chapter describes traditional methods for analysis of minerals involving titrimetric and colorimetric procedures, and the use of ion-selective electrodes. Other traditional methods of mineral analysis include gravimetric titration (i.e., insoluble forms of minerals are precipitated, rinsed, dried, and weighed) and redox reactions (i.e., the mineral is part of an oxidation-reduction reaction, and the product is quantitated). However, these latter two methods will not be covered because they are currently little used in the food industry. The traditional methods that will be described have maintained widespread usage in the food industry despite the development of more modern instrumentation such as atomic absorption spectroscopy and inductively coupled plasma-atomic emission spectroscopy (Chap. 24). Traditional methods generally require chemicals and equipment that are routinely available in an analytical laboratory and are within the experience of most laboratory technicians. Additionally, traditional methods often form the basis for rapid analysis kits (e.g., Quantab® for salt determination) that are increasingly in demand. Procedures for analysis of minerals of major nutritional or food processing concern are used for illustrative purposes. For additional examples of traditional methods refer to references (1-6). Slight modifications of these traditional methods are often needed for specific foodstuffs to minimize interferences or to be in the range of analytical performance. For analytical requirements for specific foods see the Official Methods of Analysis of AOAC International (5) and related official methods (6).

  1. A new nonlinear filter

    Elliott, Robert J.; Haykin, Simon

    2006-01-01

    A discrete time filter is constructed where both the observation and signal process have non-linear dynamics with additive white Gaussian noise. Using the reference probability framework, a convolution Zakai equation is obtained which updates the unnormalized conditional density. Our work obtains approximate solutions of this equation in terms of Gaussian sums when second-order expansions are introduced for the non-linear terms.

  2. Metamaterial Tunable Filter Design

    Naima Benmostefa; M. MELIANI; H. Ouslimani

    2013-01-01

    This paper presents a new concept to implement a tunable filter metamaterial with dual negative refraction composed of ferrite slabs and metallic resonators, including split-ring resonators (SRR) and short wire pairs. The ferrite slabs under an applied magnetic bias provide one magnetic resonance frequency band and the metallic resonators provide another one. The continuous wires within the metamaterials provide the negative permittivity in a wide frequency band covering the two magnetic r...

  3. Interactive collaborative filtering

    X. Zhao; Zhang, W.(School of Physics, University College Dublin, Dublin, Ireland); J Wang*

    2013-01-01

    In this paper, we study collaborative filtering (CF) in an interactive setting, in which a recommender system continuously recommends items to individual users and receives interactive feedback. Whilst users enjoy sequential recommendations, the recommendation predictions are constantly refined using up-to-date feedback on the recommended items. Bringing the interactive mechanism back to the CF process is fundamental because the ultimate goal for a recommender system is about the discovery o...

  4. Convolution filters for triangles

    Nicollier, Grégoire

    2014-01-01

    The construction of a new triangle by erecting similar ears on the sides of a given triangle (as in Napoleon's theorem) can be considered as the convolution of the initial triangle with another triangle. We use the discrete Fourier transformation and a shape function to give a complete and explicit description of such convolution filters and their iterates. Our method leads to many old and new results in a very direct way.
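
    As a hedged numerical illustration of the ear construction (the authors' full Fourier/shape-function machinery is not reproduced), the sketch below represents a triangle by three complex vertices, erects equilateral ears on its sides, and checks that the ear centroids form an equilateral triangle (Napoleon's theorem). The complex-vertex convention and helper names are assumptions for illustration.

    ```python
    import numpy as np

    def napoleon_centroids(z, outward=True):
        """Centroids of equilateral 'ears' erected on the sides of triangle z.

        z : length-3 complex array of vertices (counterclockwise). Erecting
        an ear is multiplication of each side by a fixed rotation in the
        complex plane, i.e., a convolution-type operation on the vertices.
        """
        rot = np.exp(-1j * np.pi / 3) if outward else np.exp(1j * np.pi / 3)
        c = np.empty(3, dtype=complex)
        for j in range(3):
            a, b = z[j], z[(j + 1) % 3]
            apex = a + (b - a) * rot       # apex of the equilateral ear
            c[j] = (a + b + apex) / 3      # centroid of the ear triangle
        return c

    z = np.array([0 + 0j, 4 + 0j, 1 + 3j])   # an arbitrary triangle
    c = napoleon_centroids(z)
    sides = [abs(c[1] - c[0]), abs(c[2] - c[1]), abs(c[0] - c[2])]
    print(sides)   # all three lengths agree: Napoleon's theorem
    ```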

  5. Resampling in particle filters

    Hol, Jeroen D.

    2004-01-01

    In this report a comparison is made between four frequently encountered resampling algorithms for particle filters. A theoretical framework is introduced to be able to understand and explain the differences between the resampling algorithms. This facilitates a comparison of the algorithms based on resampling quality and on computational complexity. Using extensive Monte Carlo simulations the theoretical results are verified. It is found that systematic resampling is favourable, both in resamp...
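
    For reference, a minimal systematic resampling step (a standard textbook formulation, not necessarily the report's exact code): one uniform draw places N evenly spaced pointers over the cumulative weights, which keeps the resampling variance low.

    ```python
    import numpy as np

    def systematic_resample(weights, rng):
        """Systematic resampling: one uniform draw, N evenly spaced pointers.
        Returns the indices of the selected particles."""
        N = len(weights)
        positions = (rng.random() + np.arange(N)) / N   # stratified comb
        cumsum = np.cumsum(weights)
        cumsum[-1] = 1.0                                # guard against round-off
        return np.searchsorted(cumsum, positions)

    rng = np.random.default_rng(1)
    w = rng.random(8); w /= w.sum()
    idx = systematic_resample(w, rng)   # particles to keep for the next step
    ```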

  6. Superconducting notch filter

    Results of a preliminary investigation of a superconducting notch filter for possible application in the 2 to 30 MHz high frequency (HF) communication band are presented. The circuit was successfully implemented using planar geometry so that closed-cycle refrigeration could be used to cool circuits fabricated from high-Tc Nb3Sn or Nb3Ge thin films. In the present design, circuit Qs of about 2 × 10³ were obtained with 50-ohm source and output impedance.

  7. Wire frame filter

    Babev, D.A.; Abasov, S.M.

    1981-10-23

    A filter is presented that has a cylindrical hollow support frame with sliding openings arranged uniformly along its perimeter, longitudinal shafts mounted on the frame with their ends rigidly fixed in connection pieces, and a wire wound spirally around the support shafts. To increase productivity and simplify fabrication, the support frame is made in the form of a spiral.

  8. Carbon nanotube filters

    Srivastava, A.; Srivastava, O. N.; Talapatra, S.; Vajtai, R.; Ajayan, P. M.

    2004-09-01

    Over the past decade of nanotube research, a variety of organized nanotube architectures have been fabricated using chemical vapour deposition. The idea of using nanotube structures in separation technology has been proposed, but building macroscopic structures that have controlled geometric shapes, density and dimensions for specific applications still remains a challenge. Here we report the fabrication of freestanding monolithic uniform macroscopic hollow cylinders having radially aligned carbon nanotube walls, with diameters and lengths up to several centimetres. These cylindrical membranes are used as filters to demonstrate their utility in two important settings: the elimination of multiple components of heavy hydrocarbons from petroleum (a crucial step in post-distillation of crude oil) with a single-step filtering process, and the filtration of bacterial contaminants such as Escherichia coli or the nanometre-sized poliovirus (~25 nm) from water. These macro filters can be cleaned for repeated filtration through ultrasonication and autoclaving. The exceptional thermal and mechanical stability of nanotubes, and the high surface area, ease and cost-effective fabrication of the nanotube membranes may allow them to compete with ceramic- and polymer-based separation membranes used commercially.

  9. Assessment of ceramic membrane filters

    Ahluwalia, Rajesh K.; Geyer, Howard K.; Im, Kwan H.; Zhu, Chao; Shelleman, David; Tressler, Richard E.

    The objectives of this project are (1) to develop analytical models for evaluating the fluid mechanics of membrane coated, dead-end ceramic filters; and (2) to determine the effects of thermal and thermo-chemical aging on the material properties of emerging ceramic hot gas filters. A honeycomb cordierite monolith with a thin ceramic coating and a rigid candle filter were evaluated.

  10. Filters via Neutrosophic Crisp Sets

    A. A. Salama

    2013-03-01

    In this paper we introduce the notion of a filter on the neutrosophic crisp set, and we then consider a generalization of filter studies. Afterwards, we present the important neutrosophic crisp filters. We also study several relations between different neutrosophic crisp filters and neutrosophic topologies. Possible applications to database systems are touched upon.

  11. Aurorae in Australian Aboriginal Traditions

    Hamacher, Duane W.

    2013-07-01

    Transient celestial phenomena feature prominently in the astronomical knowledge and traditions of Aboriginal Australians. In this paper, I collect accounts of the Aurora Australis from the literature regarding Aboriginal culture. Using previous studies of meteors, eclipses, and comets in Aboriginal traditions, I anticipate that the physical properties of aurora, such as their generally red colour as seen from southern Australia, will be associated with fire, death, blood, and evil spirits. The survey reveals this to be the case and also explores historical auroral events in Aboriginal cultures, aurorae in rock art, and briefly compares Aboriginal auroral traditions with other global indigenous groups, including the Maori of New Zealand.

  13. Controlling flow conditions of test filters in iodine filters

    Several different iodine filter and test filter designs and experience gained from their operation are presented. For the flow experiments, an iodine filter system equipped with flow regulating and measuring devices was built. In the experiments the influence of the packing method of the iodine sorption material and the influence of the flow regulating and measuring devices upon the flow conditions in the test filters was studied. On the basis of the experiments it has been shown that the flows through the test filters can always be adjusted to a correct value if there is a high enough pressure difference available across the test filter ducting. As a result of the research, several different methods are presented with which the flows through the test filters in both operating and future iodine sorption systems can easily be measured and adjusted to their correct values. (author)

  14. Little Eyolf and dramatic tradition

    Roland Lysell

    2015-02-01

    The article criticises a tradition in Ibsen scholarship that has seen the last scene of Little Eyolf as a reconciliation. Instead, the article discusses the improbability of a happy marriage characterised by social engagement. The play is open, but it is hardly probable that Rita, with her erotic desire, and Allmers, whose desire has turned into metaphysics, can be happy together. The arguments refer to inner criteria and the constantly present dramatic tradition.

  15. Richard Rorty on Traditional Philosophy

    Mohammad Asghari

    2007-01-01

    This paper investigates different ideas of Richard Rorty and shows that he attributes several important features to the Western traditional philosophy that begins with Plato, features which Rorty criticizes. The investigation refers to six important features of traditional philosophy discussed in various writings of Rorty that are central to his philosophy. These features are: (1) escape from history, (2) being based on epistemology, (3) being base of cul...

  16. Electronic commerce versus traditional commerce

    Dorin Vicentiu Popescu; Manoela Popescu

    2007-01-01

    The Internet represents new opportunities for traditional companies, including the diversification of existing services and the promotion of new, personalized and attractive ones made possible by information and communication technologies. Accordingly, the impact of the Internet has allowed the development of a new form of commerce, commerce via the Internet (a component of electronic commerce), against the traditional global comme...

  17. Was the Monetarist Tradition Invented?

    George S. Tavlas

    1998-01-01

    In 1969, Harry Johnson charged that Milton Friedman 'invented' a Chicago oral quantity theory tradition, the idea being that in order to launch a monetarist counter-revolution, Friedman needed to establish a linkage with pre-Keynesian orthodoxy. This paper shows that there was a distinct pre-Keynesian Chicago quantity-theory tradition that advocated increased government expenditure during the Great Depression in order to put money directly into circulation. This policy stance distinguished th...

  18. Chapter 1. Traditional marketing revisited

    Lambin, Jean-Jacques

    2013-01-01

    The objective of this chapter is to review the traditional marketing concept and to analyse its main ambiguities as presented in popular textbooks. The traditional marketing management model, placing heavy emphasis on the marketing mix, is in fact a supply-driven approach to the market, using the understanding of consumers' needs to mould demand to the requirements of supply, instead of adapting supply to the expectations of demand. To clarify the true role of marketing, a distinction is made b...

  19. Manufacturing a low-cost ceramic water filter and filter system for the elimination of common pathogenic bacteria

    Simonis, J. J.; Basson, A. K.

    Africa is one of the most water-scarce continents in the world, but it is the lack of potable water which results in diarrhoea being the leading cause of death amongst children under the age of five in Africa (696 million children under 5 years old in Africa contract diarrhoea, resulting in 2000 deaths per day: WHO and UNICEF, 2009). Most potable water treatment methods use bulk water treatment not suitable or available to the majority of the rural poor in Sub-Saharan Africa. One simple but effective way of making sure that water is of good quality is by purifying it by means of a household ceramic water filter. The making and supply of water filters suitable for the removal of suspended solids, pathogenic bacteria and other toxins from drinking water is therefore critical. A micro-porous ceramic water filter with micron-sized pores was developed using the traditional slip-casting process. This locally produced filter requires less raw material, cost, labour, energy and expertise, and is more effective and efficient than other low-cost filters. The filter is fitted with a silicone tube inserted into a collapsible bag that acts as container and protection for the filter. Enhanced flow is obtained through this filter system. The product was tested using water inoculated with high concentrations of different bacterial cultures as well as with locally polluted stream water. The filter is highly effective (log10 > 4, i.e. 99.99% reduction efficiency) in providing protection from bacteria and suspended solids found in natural water. With correct cleaning and basic maintenance this filter technology can effectively provide drinking water to rural families affected by polluted surface water sources. This is an African solution for the more than 340 million people in Africa without access to clean drinking water (WHO and UNICEF, 2008).

  20. Collaborative Filtering Recommendation on Users' Interest Sequences.

    Cheng, Weijie; Yin, Guisheng; Dong, Yuxin; Dong, Hongbin; Zhang, Wansong

    2016-01-01

    As an important factor for improving recommendations, time information has been introduced to model users' dynamic preferences in many papers. However, the sequence of users' behaviour is rarely studied in recommender systems. Due to the users' unique behavior evolution patterns and personalized interest transitions among items, users' similarity in sequential dimension should be introduced to further distinguish users' preferences and interests. In this paper, we propose a new collaborative filtering recommendation method based on users' interest sequences (IS) that rank users' ratings or other online behaviors according to the timestamps when they occurred. This method extracts the semantics hidden in the interest sequences by the length of users' longest common sub-IS (LCSIS) and the count of users' total common sub-IS (ACSIS). Then, these semantics are utilized to obtain users' IS-based similarities and, further, to refine the similarities acquired from traditional collaborative filtering approaches. With these updated similarities, transition characteristics and dynamic evolution patterns of users' preferences are considered. Our new proposed method was compared with state-of-the-art time-aware collaborative filtering algorithms on datasets MovieLens, Flixster and Ciao. The experimental results validate that the proposed recommendation method is effective and outperforms several existing algorithms in the accuracy of rating prediction. PMID:27195787
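
    The abstract does not fully specify the LCSIS/ACSIS statistics, but the core ingredient is a longest-common-subsequence comparison of interest sequences. A minimal sketch, assuming sequences of item IDs ordered by rating timestamp and a simple normalization by the shorter sequence length (both assumptions, not the paper's exact definitions):

    ```python
    def lcs_length(s, t):
        """Length of the longest common subsequence of item sequences s and t."""
        m, n = len(s), len(t)
        dp = [[0] * (n + 1) for _ in range(m + 1)]
        for i in range(1, m + 1):
            for j in range(1, n + 1):
                dp[i][j] = (dp[i - 1][j - 1] + 1 if s[i - 1] == t[j - 1]
                            else max(dp[i - 1][j], dp[i][j - 1]))
        return dp[m][n]

    def is_similarity(s, t):
        """Sequence-aware similarity in [0, 1] from the LCS length."""
        return lcs_length(s, t) / min(len(s), len(t)) if s and t else 0.0

    # Interest sequences = item IDs ordered by the timestamps of the ratings.
    u = ["i3", "i7", "i1", "i9"]
    v = ["i7", "i3", "i1", "i4", "i9"]
    print(is_similarity(u, v))   # 0.75: e.g. common subsequence i3, i1, i9
    ```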

  1. OPTIMIZATION OF ADVANCED FILTER SYSTEMS

    R.A. Newby; G.J. Bruck; M.A. Alvin; T.E. Lippert

    1998-04-30

    Reliable, maintainable and cost effective hot gas particulate filter technology is critical to the successful commercialization of advanced, coal-fired power generation technologies, such as IGCC and PFBC. In pilot plant testing, the operating reliability of hot gas particulate filters has been periodically compromised by process issues, such as process upsets and difficult ash cake behavior (ash bridging and sintering), and by design issues, such as cantilevered filter elements damaged by ash bridging, or excessively close packing of filtering surfaces resulting in unacceptable pressure drop or filtering surface plugging. This test experience has focused the issues and has helped to define advanced hot gas filter design concepts that offer higher reliability. Westinghouse has identified two advanced ceramic barrier filter concepts that are configured to minimize the possibility of ash bridge formation and to be robust against ash bridges should they occur. The "inverted candle filter system" uses arrays of thin-walled, ceramic candle-type filter elements with inside-surface filtering, and contains the filter elements in metal enclosures for complete separation from ash bridges. The "sheet filter system" uses ceramic, flat plate filter elements supported from vertical pipe-header arrays that provide geometry that avoids the buildup of ash bridges and allows free fall of the back-pulse released filter cake. The Optimization of Advanced Filter Systems program is being conducted to evaluate these two advanced designs and to ultimately demonstrate one of the concepts in pilot scale. In the Base Contract program, the subject of this report, Westinghouse has developed conceptual designs of the two advanced ceramic barrier filter systems to assess their performance, availability and cost potential, and to identify technical issues that may hinder the commercialization of the technologies. A plan for the Option I, bench-scale test program has also been developed based on the issues identified. The two advanced barrier filter systems have been found to have the potential to be significantly more reliable and less expensive to operate than standard ceramic candle filter system designs. Their key development requirements are the assessment of the design and manufacturing feasibility of the ceramic filter elements, and the small-scale demonstration of their conceptual reliability and availability merits.

  2. Filter for reactor emergency cooling system

    The invention describes the design of a filter for the emergency cooling system. The new type of filter can be rinsed by flushing water backwards through the filter. The arrangement prevents the filter from silting up.

  3. Tradition?! Traditional Cultural Institutions on Customary Practices in Uganda

    Joanna R. Quinn

    2014-01-01

    This contribution traces the importance of traditional institutions in rehabilitating societies in general terms and more particularly in post-independence Uganda. The current regime, partly by inventing "traditional" cultural institutions, partly by co-opting them for its own interests, contributed to a loss of legitimacy of those who claim responsibility for customary law. More recently, international prosecutions have complicated the use of customary mechanisms within such societies. This article shows that some traditional and cultural leaders continue to struggle to restore their original institutions, some having taken the initiative of inventing new forms of engaging with society. Uganda is presented as a test case for the International Criminal Court’s ability to work with traditional judicial institutions in Africa.

  4. Factorized Kalman Filtering

    Suzdaleva, Evgenia

    Praha : ÚTIA AV ČR, 2006 - (Přikryl, J.; Andrýsek, J.; Šmídl, V.), s. 226-233 [International PhD Workshop on Interplay of Societal and Technical Decision-Making, Young Generation Viewpoint /7./. Hrubá Skála (CZ), 25.09.2006-30.09.2006] R&D Projects: GA MŠk 1M0572; GA ČR(CZ) GP201/06/P434 Grant ostatní: project TED ESF Institutional research plan: CEZ:AV0Z10750506 Keywords : state estimation * factorized filters * traffic control Subject RIV: BC - Control Systems Theory

  5. Factorized Kalman Filtering

    Suzdaleva, Evgenia

    Praha : ÚTIA AV ČR, 2006 - (Přikryl, J.; Šmídl, V.). s. 51-52 [International PhD Workshop on Interplay of Societal and Technical Decision-Making, Young Generation Viewpoint /7./. 25.09.2006-30.09.2006, Hrubá Skála] R&D Projects: GA MŠk 1M0572; GA ČR GP201/06/P434 Institutional research plan: CEZ:AV0Z10750506 Keywords : state estimation * factorized filters * traffic control Subject RIV: BC - Control Systems Theory

  6. Advances in Collaborative Filtering

    Koren, Yehuda; Bell, Robert

    The collaborative filtering (CF) approach to recommenders has recently enjoyed much interest and progress. The fact that it played a central role within the recently completed Netflix competition has contributed to its popularity. This chapter surveys the recent progress in the field. Matrix factorization techniques, which became a first choice for implementing CF, are described together with recent innovations. We also describe several extensions that bring competitive accuracy into neighborhood methods, which used to dominate the field. The chapter demonstrates how to utilize temporal models and implicit feedback to extend model accuracy. In passing, we include detailed descriptions of some of the central methods developed for tackling the challenge of the Netflix Prize competition.
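
    As a hedged sketch of the matrix factorization technique the chapter surveys (plain SGD with L2 regularization; the bias, temporal, and implicit-feedback terms of the full models are omitted, and the hyperparameters are illustrative):

    ```python
    import numpy as np

    def mf_sgd(ratings, n_users, n_items, k=8, lr=0.05, reg=0.02,
               epochs=200, seed=0):
        """Plain matrix factorization r_ui ~ p_u . q_i trained by SGD."""
        rng = np.random.default_rng(seed)
        P = 0.1 * rng.standard_normal((n_users, k))
        Q = 0.1 * rng.standard_normal((n_items, k))
        for _ in range(epochs):
            for u, i, r in ratings:
                e = r - P[u] @ Q[i]                   # prediction error
                P[u] += lr * (e * Q[i] - reg * P[u])  # gradient steps with
                Q[i] += lr * (e * P[u] - reg * Q[i])  # L2 regularization
        return P, Q

    ratings = [(0, 0, 5.0), (0, 1, 3.0), (1, 0, 4.0), (1, 2, 1.0), (2, 1, 2.0)]
    P, Q = mf_sgd(ratings, n_users=3, n_items=3)
    print(P[0] @ Q[0])   # prediction for (user 0, item 0), near the observed 5.0
    ```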

  7. Charcoal filter testing

    Lyons, J. [Nuclear Regulatory Commission, Washington, DC (United States)

    1997-08-01

    In this very brief, informal presentation, a representative of the US Nuclear Regulatory Commission outlines some problems with charcoal filter testing procedures and actions being taken to correct the problems. Two primary concerns are addressed: (1) the process to find the test method is confusing, and (2) the requirements of the reference test procedures result in condensation on the charcoal and causes the test to fail. To address these problems, emergency technical specifications were processed for three nuclear plants. A generic or an administrative letter is proposed as a more permanent solution. 1 fig.

  8. Vertical media bed filter and method of cleaning filter panels

    A vertical media bed dust collector in which the media bed of a filter panel is rejuvenated, when necessary, by interrupting the gas flow through the panel, withdrawing the filter media from the panel, separating the agglomerated dust from the filter media, returning the filter media to the filter panel, and re-establishing the gas flow through the panel. The system further includes apparatus for removing collected dust from the separating and recirculating surfaces of the media-handling apparatus and also from the remote face of the filter panels before the cleaned gas is allowed to pass out of the collector, so that the cleaned gas is not recontaminated by small amounts of dust adhering to those surfaces.

  9. Traditional botanical medicine: an introduction.

    Rosenbloom, Richard A; Chaudhary, Jayesh; Castro-Eschenbach, Diane

    2011-01-01

    The role of traditional medicine in the well-being of mankind has certainly journeyed a long way. From an ancient era, in which knowledge was limited to a few traditional healers and dominated by the use of whole plants or crude drugs, the science has gradually evolved into a complete healthcare system with global recognition. Technologic advancements have facilitated traditional science to deliver numerous breakthrough botanicals with potency equivalent to those of conventional drugs. The renewed interest in traditional medicine is mainly attributed to its ability to prevent disease, promote health, and improve quality of life. Despite the support received from public bodies and research organizations, development of botanical medicines continues to be a challenging process. The present article gives a summarized description of the various difficulties encountered in the development and evaluation of botanical drugs, including isolation of active compounds and standardization of plant ingredients. It indicates a future direction of traditional medicine toward evidence-based evaluation of health claims through well-controlled safety and efficacy studies. PMID:21336093

  10. Design and Implementation for a Non Linear State Filter for LEO Micro Satellite

    S. Chouraqui

    2009-01-01

    This study preliminarily investigates the numerical application of both the Extended Kalman Filter (EKF), which has traditionally been used for nonlinear estimation, and a relatively new filter, the Unscented Kalman Filter (UKF), to the nonlinear estimation problem. The new method can be applied to nonlinear systems without the linearization process necessary for the EKF, and it does not demand a Gaussian distribution of noise; moreover, its ease of implementation and more accurate estimation allow it to demonstrate good performance. The experimental results and analysis presented indicate that the unscented Kalman filter shows better performance in the presence of severe nonlinearity in the state equations.
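
    The key UKF ingredient that removes the EKF's linearization step is the unscented transform. A minimal sketch in Julier's classic parameterization (kappa = 3 - n is a common choice; the polar-to-Cartesian demo nonlinearity is illustrative, not the satellite model of the paper):

    ```python
    import numpy as np

    def unscented_transform(mean, cov, f, kappa=1.0):
        """Propagate (mean, cov) through a nonlinearity f via 2n+1 sigma points."""
        n = len(mean)
        S = np.linalg.cholesky((n + kappa) * cov)          # matrix square root
        sigma = np.vstack([mean, mean + S.T, mean - S.T])  # 2n+1 sigma points
        w = np.full(2 * n + 1, 1.0 / (2 * (n + kappa)))
        w[0] = kappa / (n + kappa)                         # weights sum to 1
        Y = np.array([f(s) for s in sigma])                # push points through f
        ym = w @ Y                                         # transformed mean
        yc = (w[:, None] * (Y - ym)).T @ (Y - ym)          # transformed covariance
        return ym, yc

    # Polar-to-Cartesian conversion, a standard test nonlinearity for the UT.
    f = lambda x: np.array([x[0] * np.cos(x[1]), x[0] * np.sin(x[1])])
    m, C = np.array([1.0, 0.5]), np.diag([0.02, 0.1])
    print(unscented_transform(m, C, f, kappa=1.0))
    ```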

  11. Low-power implementation of polyphase filters in Quadratic Residue Number System

    Cardarilli, Gian Carlo; Re, Andrea Del; Nannarelli, Alberto; Re, Marco

    The aim of this work is the reduction of the power dissipated in digital filters, while maintaining the timing unchanged. A polyphase filter bank in the Quadratic Residue Number System (QRNS) has been implemented and then compared, in terms of performance, area, and power dissipation, to the implementation of a polyphase filter bank in the traditional two's complement system (TCS). The resulting implementations, designed to have the same clock rates, show that the QRNS filter is smaller and consumes less power than the TCS one.
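
    The power advantage stems from the residue number system's carry-free parallel channels; the QRNS extends this to complex multiplication. A minimal sketch of ordinary RNS arithmetic only (the moduli are chosen for illustration):

    ```python
    from math import prod

    MODULI = (13, 17, 19)               # pairwise coprime; dynamic range M = 4199
    M = prod(MODULI)

    def to_rns(x):
        """Residue representation of x: one small residue per modulus."""
        return tuple(x % m for m in MODULI)

    def rns_op(a, b, op):
        """Add or multiply channel-wise: no carries cross between channels."""
        return tuple(op(x, y) % m for x, y, m in zip(a, b, MODULI))

    def from_rns(r):
        """Chinese Remainder Theorem reconstruction."""
        x = 0
        for ri, mi in zip(r, MODULI):
            Mi = M // mi
            x += ri * Mi * pow(Mi, -1, mi)   # pow(.., -1, m): modular inverse
        return x % M

    a, b = 1234, 567
    s = rns_op(to_rns(a), to_rns(b), lambda x, y: x + y)
    p = rns_op(to_rns(a), to_rns(b), lambda x, y: x * y)
    print(from_rns(s), (a + b) % M)     # equal
    print(from_rns(p), (a * b) % M)     # equal
    ```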

  12. Toward Green Cloud Computing: An Attribute Clustering Based Collaborative Filtering Method for Virtual Machine Migration

    Zhang Liu-Mei

    2013-01-01

    In this study, an attribute-clustering-based collaborative filtering algorithm is described for virtual machine migration towards green Cloud computing. The algorithm uses the similarity characteristics of virtual machine task-related attributes, especially CPU-related attributes, to filter redundant data by feature selection, and then applies K-Means clustering to solve the rating-scale problems of the traditional collaborative filtering recommendation algorithm. Experiments use virtual machine task-related information for clustering the data. By integrating a scaled rating scheme on task-related properties with the collaborative filtering philosophy, the method provides migration recommendations for system administrators.

  13. Location Estimation for an Autonomously Guided Vehicle using an Augmented Kalman Filter to Autocalibrate the Odometry

    Larsen, Thomas Dall; Bak, Martin; Andersen, Nils Axel; Ravn, Ole

    1998-01-01

    A Kalman filter using encoder readings as inputs and vision measurements as observations is designed as a location estimator for an autonomously guided vehicle (AGV). To reduce the effect of modelling errors an augmented filter that estimates the true system parameters is designed. The traditional way of reducing these errors is by fictitious noise injection in the filter model. The main problem with that approach however is that the filter does not learn about its bad model, it just puts more confidence in incoming measurements and less in the model. As a result the estimates will drift and...
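
    A minimal sketch of the augmentation idea, assuming a 1-D vehicle whose odometry increments are corrupted by an unknown wheel-scale factor appended to the state (the dynamics, noise levels, and vision rate are illustrative, not the paper's AGV model):

    ```python
    import numpy as np

    def augmented_kf(us, zs, q_pos=1e-4, q_scale=1e-6, r=0.05**2):
        """KF with state x = [position, wheel-scale]: the odometry increment u
        advances the position by scale * u, and the scale itself is estimated
        from occasional absolute (vision) position fixes."""
        x, P = np.array([0.0, 1.0]), np.diag([1.0, 0.1])
        H = np.array([[1.0, 0.0]])                 # vision observes position only
        est = []
        for u, z in zip(us, zs):
            F = np.array([[1.0, u], [0.0, 1.0]])   # Jacobian of the prediction
            x = np.array([x[0] + x[1] * u, x[1]])  # predict
            P = F @ P @ F.T + np.diag([q_pos, q_scale])
            if z is not None:                      # vision fix available
                S = (H @ P @ H.T)[0, 0] + r
                K = P @ H.T / S                    # (2, 1) Kalman gain
                x = x + K[:, 0] * (z - x[0])
                P = (np.eye(2) - K @ H) @ P
            est.append(x.copy())
        return np.array(est)

    rng = np.random.default_rng(2)
    true_scale = 1.05                              # wheels 5 % off nominal
    us = np.full(200, 0.1)                         # commanded 0.1 m steps
    pos = np.cumsum(true_scale * us)
    zs = [p + rng.normal(0, 0.05) if k % 10 == 0 else None
          for k, p in enumerate(pos)]
    est = augmented_kf(us, zs)
    print(est[-1, 1])                              # should approach 1.05
    ```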

  14. A New Stateless Packet Classification and Filter against DoS Attacks

    Guang Jin

    2014-02-01

    Capabilities are a typical scheme of stateless filtering. In order to classify and filter packets effectively, a novel scheme of packet classification and filtering based on capabilities is proposed in this paper. In our scheme, a new classifier module is added and a new filter structure is designed. We employ capabilities as verification and introduce new authorization into the communications. These innovations give packet classification good performance under attack scenarios. Experimental results based on large-scale topology datasets and NS2 show that our scheme outperforms traditional packet classification algorithms, especially in complex cyber environments.

  15. Kalman Filtered Compressed Sensing

    Vaswani, Namrata

    2008-01-01

    We consider the problem of reconstructing time sequences of spatially sparse signals (with unknown and time-varying sparsity patterns) from a limited number of linear "incoherent" measurements, in real-time. The signals are sparse in some transform domain referred to as the sparsity basis. For a single spatial signal, the solution is provided by Compressed Sensing (CS). The question that we address is, for a sequence of sparse signals, can we do better than CS, if (a) the sparsity pattern of the signal's transform coefficients' vector changes slowly over time, and (b) a simple prior model on the temporal dynamics of its current non-zero elements is available. The overall idea of our solution is to use CS to estimate the support set of the initial signal's transform vector. At future times, run a reduced order Kalman filter with the currently estimated support and estimate new additions to the support set by applying CS to the Kalman innovations or filtering error (whenever it is "large").
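
    A much-simplified sketch of the first part of that idea: an l1 solver (here plain ISTA, standing in for "CS") estimates the initial support, and a reduced-order Kalman filter then tracks only those coefficients. Thresholds, dynamics, and dimensions are illustrative, and the detection of support additions via CS on the filtering error is omitted.

    ```python
    import numpy as np

    def ista(A, y, lam=0.1, iters=200):
        """Basic l1 solver (ISTA), standing in for the 'CS' step."""
        L = np.linalg.norm(A, 2) ** 2                    # Lipschitz constant
        x = np.zeros(A.shape[1])
        for _ in range(iters):
            g = x + A.T @ (y - A @ x) / L                # gradient step
            x = np.sign(g) * np.maximum(np.abs(g) - lam / L, 0.0)  # shrinkage
        return x

    rng = np.random.default_rng(3)
    n, m = 60, 25
    A = rng.normal(size=(m, n)) / np.sqrt(m)
    x_true = np.zeros(n); x_true[[5, 17, 42]] = [2.0, -1.5, 1.0]

    # Step 1: CS on the first measurement estimates the support.
    y0 = A @ x_true + 0.01 * rng.normal(size=m)
    support = np.flatnonzero(np.abs(ista(A, y0)) > 0.2)

    # Step 2: reduced-order KF restricted to the support (random-walk dynamics).
    As = A[:, support]
    k = len(support)
    x, P = np.zeros(k), np.eye(k)
    Q, R = 1e-4 * np.eye(k), 1e-4 * np.eye(m)
    for t in range(20):
        y_t = A @ x_true + 0.01 * rng.normal(size=m)     # signal static here
        P = P + Q                                        # predict
        K = P @ As.T @ np.linalg.inv(As @ P @ As.T + R)  # gain
        x = x + K @ (y_t - As @ x)                       # update
        P = (np.eye(k) - K @ As) @ P
    print(dict(zip(support.tolist(), np.round(x, 2).tolist())))
    ```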

  16. Nanoparticle optical notch filters

    Kasinadhuni, Pradeep Kumar

    Developing novel light blocking products involves the design of a nanoparticle optical notch filter, working on the principle of localized surface plasmon resonance (LSPR). These light blocking products can be used in many applications. One such application is to naturally reduce migraine headaches and light sensitivity. Melanopsin ganglion cells present in the retina of the human eye connect to the suprachiasmatic nucleus (SCN, the body's clock) in the brain, where they participate in the entrainment of the circadian rhythms. As the melanopsin ganglion cells are involved in triggering migraine headaches in photophobic patients, it is necessary to block the part of the visible spectrum that activates these cells. It is observed from the action potential spectrum of the ganglion cells that they absorb light ranging from 450-500 nm (the blue-green part of the visible spectrum) with a λmax (peak sensitivity) of around 480 nm. Currently prescribed for migraine patients is the FL-41 coating, which blocks a broad range of wavelengths, including wavelengths associated with melanopsin absorption. The nanoparticle optical notch filter is designed to block light only at 480 nm, hence offering an effective prescription for the treatment of migraine headaches.

  17. Demonstration of polarization-independent resonant subwavelength grating filter arrays.

    Peters, D W; Boye, R R; Wendt, J R; Kellogg, R A; Kemme, S A; Carter, T R; Samora, S

    2010-10-01

    We demonstrate a two-dimensional (2D) polarization-independent resonant subwavelength grating (RSG) in a filter array. RSGs, also called guided mode resonant filters, are traditionally one-dimensional gratings; however, this leads to TE and TM resonances at different wavelengths and with different spectral shape. A 2D grating can remove the polarization dependence at normal incidence, while maintaining the desirable RSG properties of high reflectivity, narrow passband, and low sidebands without ripple. We designed and fabricated 2D gratings with near-identical responses for both polarizations at normal incidence in the telecommunication band. Ninety percent reflectivity is achieved at the resonant wavelengths. PMID:20890333

  18. Direct analysis of air filter samples for alpha emitting isotopes

    The traditional method for determination of alpha emitting isotopes on air filters has been to process the samples by radiochemical methods. However, this method is too slow for incidents involving radioactive materials, where determination of the dose received by personnel is urgent. A method is developed to directly analyze the air filters taken from personal and area air monitors. Site knowledge is used in combination with alpha spectral information to identify isotopes. A mathematical function is developed to estimate the activity of each isotope. The strengths and weaknesses of the method are discussed.

  19. Filter and Filter Bank Design for Image Texture Recognition

    Randen, Trygve

    1997-12-31

    The relevance of this thesis to energy and environment lies in its application to remote sensing, such as sea-floor mapping and seismic pattern recognition. The focus is on the design of two-dimensional filters for feature extraction, segmentation, and classification of digital images with textural content. The features are extracted by filtering with a linear filter and estimating the local energy in the filter response. The thesis gives a review covering broadly most previous approaches to texture feature extraction and continues with proposals of some new techniques. 143 refs., 59 figs., 7 tabs.

  20. New filter efficiency test for future nuclear grade HEPA filters

    We have developed a new test procedure for evaluating filter penetrations as low as 10⁻⁹ at 0.1-µm particle diameter. In comparison, the present US nuclear filter certification test has a lower penetration limit of 10⁻⁵. Our new test procedure is unique not only in its much higher sensitivity, but also in avoiding the undesirable effect of clogging the filter. It consists of a two-step process: (1) we challenge the test filter with a very high concentration of heterodisperse aerosol for a short time while passing all or a significant portion of the filtered exhaust into an inflatable bag; (2) we then measure the aerosol concentration in the bag using a new laser particle counter sensitive to 0.07-µm diameter. The ratio of the particle concentration in the bag to the concentration challenging the filter gives the filter penetration as a function of particle diameter. The bag functions as a particle accumulator for subsequent analysis, minimizing the filter exposure time. We have studied the particle losses in the bag over time and find that they are negligible when the measurements are taken within one hour. We also compared filter penetration measurements taken in the conventional direct-sampling method with the indirect bag-sampling method and found excellent agreement. 6 refs., 18 figs., 1 tab

  1. Robust fault detection filter design

    Douglas, Randal Kirk

    The detection filter is a specially tuned linear observer that forms the residual generation part of an analytical redundancy system designed for model-based fault detection and identification. The detection filter has an invariant state subspace structure that produces a residual with known and fixed directional characteristics in response to a known design fault direction. In addition to a parameterization of the detection filter gain, three methods are given for improving performance in the presence of system disturbances, sensor noise, model mismatch and sensitivity to small parameter variations. First, it is shown that by solving a modified algebraic Riccati equation, a stabilizing detection filter gain is found that bounds the H-infinity norm of the transfer matrix from system disturbances and sensor noise to the detection filter residual. Second, a specially chosen expanded-order detection filter is formed with fault detection properties identical to a set of independent reduced-order filters that have no structural constraints. This result is important to the practitioner because the difficult problem of finding a detection filter insensitive to disturbances and sensor noise is converted to the easier problem of finding a set of uncoupled noise insensitive filters. Furthermore, the statistical properties of the reduced-order filter residuals are easier to find than the statistical properties of the structurally constrained detection filter residual. Third, an interpretation of the detection filter as a special case of the dual of the restricted decoupling problem leads to a new detection filter eigenstructure assignment algorithm. The new algorithm places detection filter left eigenvectors, which annihilate the detection spaces, rather than right eigenvectors, which span the detection spaces. This allows for a more flexible observer based fault detection system structure that could not be formulated as a detection filter. Furthermore, the link to the dual problem allows existing results relating supremal controllability subspaces and ill-conditioned eigenvector to be easily applied to the detection filter. The practitioner will find these results useful as they provide guidelines for desensitizing the detection filter to small parameter variations.

  2. Analysis of Traditional Historical Clothing

    Jensen, Karsten; Schmidt, A. L.; Petersen, A. H.

    2013-01-01

    A recurrent problem for scholars who investigate traditional and historical clothing is the measuring of items of clothing and subsequent pattern construction. The challenge is to produce exact data without damaging the item. The main focus of this paper is to present a new procedure for establishing a three-dimensional model and the corresponding two-dimensional pattern for items of skin clothing that are not flat. The new method is non-destructive, and also accurate and fast. Furthermore, this paper presents an overview of the more traditional methods of pattern documentation and measurement...

  3. Initial results of a new generation dual source CT system using only an in-plane comb filter for ultra-high resolution temporal bone imaging

    Meyer, Mathias; Haubenreisser, Holger; Schoenberg, Stefan O.; Henzler, Thomas [Heidelberg University, Institute of Clinical Radiology and Nuclear Medicine, University Medical Center Mannheim, Medical Faculty Mannheim, Mannheim (Germany); Raupach, Rainer; Schmidt, Bernhard; Leidecker, Christianne; Allmendinger, Thomas; Flohr, Thomas [Siemens Healthcare, Imaging and Therapy Division, Forchheim (Germany); Lietzmann, Florian; Schad, Lothar R. [Heidelberg University, Computer Assisted Clinical Medicine, University Medical Center Mannheim, Medical Faculty Mannheim, Mannheim (Germany)

    2015-01-15

    To prospectively evaluate radiation dose and image quality of a third generation dual-source CT (DSCT) without a z-axis filter behind the patient for temporal bone CT. Forty-five patients were examined on a first, second, or third generation DSCT in an ultra-high-resolution (UHR) temporal bone-imaging mode. On the third generation DSCT system, the tighter focal spot of 0.2 mm² removes the necessity for an additional z-axis filter, leading to improved z-axis radiation dose efficiency. Images of 0.4 mm were reconstructed using standard filtered-back-projection or iterative reconstruction (IR) technique for previous generations of DSCT and a novel IR algorithm for the third generation DSCT. Radiation dose and image quality were compared between the three DSCT systems. The statistically significantly highest subjective and objective image quality was found for the third generation DSCT when compared to the first or second generation DSCT systems (all p < 0.05). Total effective dose was 63 %/39 % lower for the third generation examination as compared to the first and second generation DSCT. Temporal bone imaging without a z-axis UHR filter and with a novel third generation IR algorithm allows for significantly higher image quality while lowering effective dose when compared to the first two generations of DSCTs. (orig.)

  6. In Situ Cleanable HEPA Filter

    Phillips, T.D.

    1999-11-18

    This paper describes a welded steel HEPA filter which uses liquid spray cleaning and vacuum drying. Development of the filter was initiated in order to eliminate personnel exposure, disposal cost, and short lifetime associated with systems commonly employed throughout the Department of Energy complex. In addition the design promises to resolve the issues of fire, elevated temperatures, wetting, filter strength, air leaks and aging documented in the May, 1999 DNFSB-TECH-23 report.

  7. Turbo Equalization without MMSE Filtering

    Xiao, P; Carrasco, R; Wassell, I.

    2005-01-01

    The filter-based turbo equalization scheme has been proposed in several papers to avoid the prohibitive complexity imposed by trellis-based turbo equalization. In the existing literature, the filter-based approach has been implemented solely with a linear MMSE filter, the coefficients of which are updated to minimize the mean-square error for every output symbol of the equalizer. A new turbo equalization algorithm is introduced in this paper. It has a lower computational complexity compared...

  8. Digital filtering in nuclear medicine

    Digital filtering is a powerful mathematical technique in computer analysis of nuclear medicine studies. The basic concepts of object-domain and frequency-domain filtering are presented in simple, largely nonmathematical terms. Computational methods are described using both the Fourier transform and convolution techniques. The frequency response is described and used to represent the behavior of several classes of filters. These concepts are illustrated with examples drawn from a variety of important applications in nuclear medicine.
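
    A generic illustration of the object-/frequency-domain duality described above: multiplying the Fourier transform of an image by a frequency response is equivalent to convolving in object space. The Butterworth-style response, its cutoff, and its order here are arbitrary choices, not a nuclear-medicine-specific filter such as Metz or Wiener.

    ```python
    import numpy as np

    def lowpass_fft(image, cutoff):
        """Frequency-domain low-pass: transform, multiply by a frequency
        response, transform back (the dual of object-space convolution)."""
        F = np.fft.fft2(image)
        fy = np.fft.fftfreq(image.shape[0])[:, None]
        fx = np.fft.fftfreq(image.shape[1])[None, :]
        f = np.hypot(fy, fx)                       # radial spatial frequency
        Hresp = 1.0 / (1.0 + (f / cutoff) ** 4)    # Butterworth-style response
        return np.fft.ifft2(F * Hresp).real

    rng = np.random.default_rng(4)
    noisy = rng.poisson(50, size=(64, 64)).astype(float)  # count-statistics noise
    smooth = lowpass_fft(noisy, cutoff=0.15)
    ```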

  9. Multi-filter spectrophotometry simulations

    Callaghan, Kim A. S.; Gibson, Brad K.; Hickson, Paul

    1993-01-01

    To complement both the multi-filter observations of quasar environments described in these proceedings, as well as the proposed UBC 2.7 m Liquid Mirror Telescope (LMT) redshift survey, we have initiated a program of simulated multi-filter spectrophotometry. The goal of this work, still very much in progress, is a better quantitative assessment of the multiband technique as a viable mechanism for obtaining useful redshift and morphological class information from large scale multi-filter surveys.

  10. FILT - Filtering Indexed Lucene Triples

    Stuhr, Magnus

    2012-01-01

    The Resource Description Framework (RDF) is the W3C recommended standard for data on the semantic web, while the SPARQL Protocol and RDF Query Language (SPARQL) is the query language that retrieves RDF triples by subject, predicate, or object. RDF data often contain valuable information that can only be queried through filter functions. The SPARQL query language for RDF can include filter clauses in order to define specific data criteria, such as full-text searches, numerical filtering, and c...

  11. Liquid filter for liquids containing radioactive materials

    A filter insert consisting of several filter plates is suspended in a filter case. The radioactive liquid containing solids flows into the filter case from above and is distributed from its centre via a central duct into intermediate spaces between the filter plates. A filter cake is formed on the filters. The filtrate flows through the centre of the filter and is taken radially outwards. The filtrate is collected in a lower collector space. The filter insert can be removed from the filter case by remote operation when it is used up. (DG)

  12. DSP Control of Line Hybrid Active Filter

    Dan, Stan George; Benjamin, Doniga Daniel; Magureanu, R.; Asiminoaei, Lucian; Teodorescu, Remus; Blaabjerg, Frede

    Active Power Filters have been intensively explored in the past decade. Hybrid active filters inherit the efficiency of passive filters and the improved performance of active filters, and thus constitute a viable improved approach for harmonic compensation. In this paper a parallel hybrid filter is studied for current harmonic compensation. The hybrid filter is formed by a single tuned LC filter and a small-rated power active filter, which are directly connected in series without any matching transformer. Thus the required rating of the active filter is much smaller than that of a conventional standalone active filter. Simulation and experimental results obtained in the laboratory confirmed the validity and effectiveness of the control.

  13. Complex Coherence Estimation Based on Adaptive Refined Lee Filter

    LONG Jiangping

    2015-12-01

    Polarimetric synthetic aperture radar interferometry (PolInSAR) data processing and its applications are based on the estimation of the polarimetric complex coherence, which is influenced by the window size and the filtering method. In this paper an adaptive refined Lee filter, based on the traditional refined Lee filter, is used to estimate the interferometric coherence. The size of the filter window is adapted according to the correlation coefficient between the central sub-window and the neighboring sub-windows. A correlation coefficient larger than the threshold value indicates homogeneous pixels in the selected window, and a boxcar filter is then chosen to estimate the complex coherence. When the maximum correlation coefficient over the different window sizes is smaller than the threshold value, the refined Lee filter is used to estimate the complex coherence. The efficiency and advantage of the new algorithm are demonstrated with E-SAR data sets. The results show that the influence of speckle noise and edge information is reduced; the more accurate complex coherence estimated over the selected windows and pixels increases the accuracy of the forest parameter inversion.
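
    The quantity being estimated is the complex coherence gamma = E[s1 s2*] / sqrt(E|s1|^2 E|s2|^2). A minimal boxcar estimator is sketched below (NumPy, integral-image window sums); the adaptive window-size selection and refined Lee weighting of the paper are not reproduced, and the zero padding biases estimates near the borders.

    ```python
    import numpy as np

    def boxsum(a, win):
        """Sliding win x win window sum via an integral image (win odd;
        zero-padded, so values near the borders are biased low)."""
        p = win // 2
        c = np.cumsum(np.cumsum(np.pad(a, p), axis=0), axis=1)
        c = np.pad(c, ((1, 0), (1, 0)))
        return c[win:, win:] - c[:-win, win:] - c[win:, :-win] + c[:-win, :-win]

    def coherence(s1, s2, win=7):
        """Boxcar estimate of the complex interferometric coherence."""
        num = boxsum(s1 * np.conj(s2), win)
        den = np.sqrt(boxsum(np.abs(s1) ** 2, win) * boxsum(np.abs(s2) ** 2, win))
        return num / np.maximum(den, 1e-12)

    rng = np.random.default_rng(6)
    noise = rng.normal(size=(32, 32)) + 1j * rng.normal(size=(32, 32))
    s1 = rng.normal(size=(32, 32)) + 1j * rng.normal(size=(32, 32))
    s2 = 0.8 * s1 + 0.6 * noise          # true coherence magnitude = 0.8
    print(np.abs(coherence(s1, s2)).mean())
    ```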

  14. Device for filtering gaseous media

    The air filter system for gaseous radioactive substances consists of a vertical chamber with filter material (e.g. impregnated charcoal). On one side of the chamber there is an inlet compartment and an outlet compartment. On the other side a guiding compartment turns the gas flow coming from the untreated-air side through the lower part of the filter chamber to the upper part of the filter. The gas flow leaves the upper part through the outlet compartment as the cleaned-air flow. The filter material may be filled into the chamber from above and drawn off below. For better utilization of the filter material the filter chamber is separated by a wall between the inlet and outlet compartments. This partition wall consists of two sheets arranged one above the other and provided with slots which may be superposed in alignment. In this case filter material trickles from the upper part of the chamber into the lower part, avoiding the formation of a crater in the filter bed. (DG)

  15. Optimization of integrated polarization filters

    Gagnon, Denis; Déziel, Jean-Luc; Dubé, Louis J

    2014-01-01

    This study reports on the design of small footprint, integrated polarization filters based on engineered photonic lattices. Using a rods-in-air lattice as a basis for a TE filter and a holes-in-slab lattice for the analogous TM filter, we are able to maximize the degree of polarization of the output beams up to 98 % with a transmission efficiency greater than 75 %. The proposed designs allow not only for logical polarization filtering, but can also be tailored to output an arbitrary transverse beam profile. The lattice configurations are found using a recently proposed parallel tabu search algorithm for combinatorial optimization problems in integrated photonics.

  16. Adaptive filtering and change detection

    Gustafsson, Fredrik

    2003-01-01

    Adaptive filtering is a classical branch of digital signal processing (DSP). Industrial interest in adaptive filtering grows continuously with the increase in computer performance that allows ever more complex algorithms to be run in real time. Change detection is a type of adaptive filtering for non-stationary signals and is also the basic tool in fault detection and diagnosis. Often considered as separate subjects, Adaptive Filtering and Change Detection bridges a gap in the literature with a unified treatment of these areas, emphasizing that change detection is a natural extension of adaptive filtering.

  17. Spatial filtering with photonic crystals

    Photonic crystals are well known for their celebrated photonic band-gaps: the forbidden frequency ranges for which light waves cannot propagate through the structure. The frequency (or chromatic) band-gaps of photonic crystals can be utilized for frequency filtering. In analogy to chromatic band-gaps and frequency filtering, angular band-gaps and angular (spatial) filtering are also possible in photonic crystals. In this article, we review recent advances in spatial filtering using photonic crystals in different propagation regimes and for different geometries. We review the most evident configuration of filtering in the Bragg regime (with back-reflection, i.e., in the configuration with band-gaps) as well as in the Laue regime (with forward deflection, i.e., in the configuration without band-gaps). We explore spatial filtering in crystals with different symmetries, including axisymmetric crystals, and we discuss the role of chirping, i.e., the dependence of the longitudinal period on position along the structure. We also review the experimental techniques used to fabricate the photonic crystals and the numerical techniques used to explore the spatial filtering. Finally, we discuss several implementations of such filters for intracavity spatial filtering.

  18. Cryptographically Secure Bloom-Filters

    Ryo Nojima

    2009-08-01

    In this paper, we propose a privacy-preserving variant of Bloom filters. The Bloom filter has many applications, such as hash-based IP-traceback systems and Web cache sharing. In some of those applications, equipping the Bloom filter with a privacy-preserving mechanism is crucial for deployment. In this paper, we propose a cryptographically secure privacy-preserving Bloom-filter protocol. We propose two such protocols, based on blind signatures and oblivious pseudorandom functions, respectively. To show that the proposed protocols are secure, we provide a reasonable security definition and prove the security.
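
    For context, the privacy-preserving protocols above wrap a plain Bloom filter of the kind sketched below; the bit-array size m, hash count k and SHA-256-based hashing are illustrative choices, and the blind-signature/OPRF layers are not shown.

```python
import hashlib

class BloomFilter:
    # Plain (non-private) Bloom filter: k hash positions per item,
    # set-on-add, membership test may give false positives but never
    # false negatives.
    def __init__(self, m=1024, k=4):
        self.m, self.k = m, k
        self.bits = bytearray(m)

    def _positions(self, item):
        for i in range(self.k):
            h = hashlib.sha256(f"{i}:{item}".encode()).hexdigest()
            yield int(h, 16) % self.m

    def add(self, item):
        for p in self._positions(item):
            self.bits[p] = 1

    def __contains__(self, item):
        return all(self.bits[p] for p in self._positions(item))

bf = BloomFilter()
bf.add("198.51.100.7")          # e.g. an address in a traceback table
print("198.51.100.7" in bf)     # True
print("203.0.113.9" in bf)      # False (with high probability)
```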

  19. Kalman filtering implementation with Matlab

    Kleinbauer, Rachel

    2004-01-01

    In 1960 and 1961 Rudolf Emil Kalman published his work on a recursive predictive filter based on the use of recursive algorithms, and with it revolutionized the field of estimation. Since then the so-called Kalman filter has been the subject of extensive research and to this day finds application in numerous fields. The Kalman filter estimates the state of a dynamic system, even if the exact form of that system is unknown. The filter is very powerful...
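
    The thesis implements the recursion in MATLAB; the following is a minimal Python version of the same predict/update cycle, with a constant-velocity tracking model as an illustrative assumption.

```python
import numpy as np

def kalman_step(x, P, z, F, H, Q, R):
    # Predict with the state model, then correct with the measurement.
    x = F @ x
    P = F @ P @ F.T + Q
    S = H @ P @ H.T + R                      # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)           # Kalman gain
    x = x + K @ (z - H @ x)
    P = (np.eye(len(x)) - K @ H) @ P
    return x, P

# Constant-velocity tracking of a noisy 1-D position measurement.
dt = 1.0
F = np.array([[1.0, dt], [0.0, 1.0]])
H = np.array([[1.0, 0.0]])
Q = 1e-4 * np.eye(2)
R = np.array([[0.25]])
x, P = np.zeros(2), np.eye(2)
for z in [0.9, 2.1, 2.9, 4.2]:
    x, P = kalman_step(x, P, np.array([z]), F, H, Q, R)
print(x)   # position/velocity estimate near (4, 1)
```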

  20. Innovating Traditional Nursing Administration Challenges.

    Joseph, M Lindell; Fowler, Debra

    2016-03-01

    The evolving and complex practice environment calls for new mindsets among nurse leaders, academics, and nurse innovators to envision innovative ways to manage and optimize traditional tasks and processes in nursing administration. The purpose of this article is to present 3 case studies that used linear programming and simulation to innovate staffing enterprises, financial management of healthcare systems, and curricula development. PMID:26906516

  1. Traditional Teacher Education Still Matters

    Jacobs, Nick

    2013-01-01

    Fresh from teaching his first full school year the author reflects on his traditional teacher preparation path into the classroom and finds he was instilled with a common sense of ethics, compassion, a demand for reflective practice, and a robust guiding philosophy. As a college student, he learned theory and was able to augment that with…

  2. Individualizing in Traditional Classroom Settings.

    Thornell, John G.

    1980-01-01

    Effective individualized instruction depends primarily on the teacher possessing the skills to implement it. Individualization is therefore quite compatible with the traditional self-contained elementary classroom model, but not with its alternative, departmentalization, which allows teachers neither the time flexibility nor the familiarity with…

  3. Active Learning versus Traditional Teaching

    L.A. Azzalis

    2009-05-01

    In traditional teaching most of the class time is spent with the professor lecturing and the students watching and listening. The students work individually, and cooperation is discouraged. On the other hand, active learning shifts the focus of activity from the teacher to the learners: students solve problems, answer questions, formulate questions of their own, discuss, explain, and debate during class; moreover, students work in teams on problems and projects under conditions that assure positive interdependence and individual accountability. Although student-centered methods have repeatedly been shown to be superior to the traditional teacher-centered approach to instruction, the literature regarding the efficacy of various teaching methods is inconclusive. The purpose of this study was to compare student perceptions of course and instructor effectiveness, course difficulty, and amount learned between the active learning and lecture sections of Health Sciences courses, using statistical data from Anhembi Morumbi University. Results indicated a significant difference between active learning and traditional teaching. Our conclusion was that strategies promoting active learning in place of traditional lectures could increase knowledge and understanding.

  4. Adolescent Obesity: Rethinking Traditional Approaches.

    Morrill, Correen M.; And Others

    1991-01-01

    Describes traditional approaches to working with obese students (weight loss programs, nutrition programs, self-esteem groups). Suggests system-based alternative. Suggests providing in-service workshops for staff; developing team to work with large students; providing individual counseling; assisting students in locating peer support groups; and…

  5. Goddess Traditions in Tantric Hinduism

    Hinduism cannot be understood without the Great Goddess and the goddess-orientated Śākta traditions. The Goddess pervades Hinduism at all levels, from aniconic village deities to high-caste pan-Hindu goddesses to esoteric, tantric goddesses. Nevertheless, the highly influential tantric forms of...

  6. A Grand Tradition of Struggle.

    West, Cornel

    2000-01-01

    Offers an "inspirational speech" delivered by Harvard professor Cornel West at the 1994 National Council of Teachers of English convention. Discusses ways in which English teachers can help to keep alive the tradition of struggle for decency, dignity, freedom, and democracy. Shares his belief in the significant role English teachers play in…

  7. Bringing Traditional Teachings to Leadership

    Washington, Siemthlut Michelle

    2005-01-01

    The purpose of this article is to examine how our Kootegan Yix Meh Towlth (traditional governance) might contribute to the development and implementation of a culturally relevant Sliammon governance model. Our Uk woom he heow (ancestors) lived their everyday lives guided by a complex system of practices and beliefs based on our Ta-ow (traditional…

  8. Fabric filter system study

    Chambers, R. L.; Plunk, O. C.; Kunka, S. L.

    1984-08-01

    Results of the fourth year of operation of a fabric filter installed on a coal-fired boiler are reported. Project work during the fourth year concentrated on fabric studies. The 10-oz/sq yd fabrics of the 150 1/2 warp, 150 2/2T fill construction demonstrated superior performance over the most common 14-oz/sq yd constructions, regardless of coating. It was determined that improving cleaning by increasing shaking amplitude is more detrimental to bag life than increasing shaker frequency. Maintenance and operation observations continued, and the resolution of these types of problems became more efficient because of the increased experience of maintenance personnel with baghouse-related problems.

  9. Efficient Multicore Collaborative Filtering

    Wu, Yao; Bickson, Danny; Low, Yucheng; Yang, Qing

    2011-01-01

    This paper describes the solution method taken by the LeBuSiShu team for track 1 in the ACM KDD CUP 2011 contest (resulting in 5th place). We identified two main challenges: the unique item-taxonomy characteristics and the large size of the data set. To handle the item taxonomy, we present a novel method called Matrix Factorization Item Taxonomy Regularization (MFITR). MFITR obtained the second-best prediction result out of more than ten implemented algorithms. For rapidly computing multiple solutions of various algorithms, we implemented an open-source parallel collaborative filtering library on top of the GraphLab machine learning framework. We report some preliminary performance results obtained using the BlackLight supercomputer.
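
    For orientation, MFITR builds on plain matrix factorization of the kind sketched below; the taxonomy regularization term, learning rate and other hyperparameters here are illustrative assumptions, not the team's actual settings.

```python
import numpy as np

def factorize(ratings, n_users, n_items, k=8, lr=0.01, reg=0.05, epochs=50):
    # Plain SGD matrix factorization: r_ui ~ U[u] . V[i]. MFITR adds an
    # item-taxonomy regularization term on top of a model of this shape;
    # that term is omitted here.
    rng = np.random.default_rng(0)
    U = rng.normal(scale=0.1, size=(n_users, k))
    V = rng.normal(scale=0.1, size=(n_items, k))
    for _ in range(epochs):
        for u, i, r in ratings:             # (user, item, rating) triples
            err = r - U[u] @ V[i]
            U[u] += lr * (err * V[i] - reg * U[u])
            V[i] += lr * (err * U[u] - reg * V[i])
    return U, V

U, V = factorize([(0, 0, 5.0), (0, 1, 1.0), (1, 0, 4.0)], n_users=2, n_items=2)
print(U[1] @ V[1])   # predicted rating for the unseen (user 1, item 1) pair
```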

  10. AER image filtering

    Gómez-Rodríguez, F.; Linares-Barranco, A.; Paz, R.; Miró-Amarante, L.; Jiménez, G.; Civit, A.

    2007-05-01

    Address Event Representation (AER) is an emergent neuromorphic interchip communication protocol that allows real-time virtual massive connectivity among huge numbers of neurons located on different chips [1]. By exploiting high-speed digital communication circuits (with nanosecond timing), synaptic neural connections can be time multiplexed, while neural activity signals (with millisecond timing) are sampled at low frequencies. Neurons generate "events" according to their activity levels; that is, more active neurons generate more events per unit time and access the interchip communication channel more frequently than neurons with low activity. In neuromorphic system development, AER brings several advantages to real-time image processing systems: (1) AER represents the information as a stream continuous in time, not as a frame; (2) AER sends the most important information first (although this depends on the sender); (3) AER allows information to be processed as soon as it is received. When AER is used in the artificial vision field, each pixel is treated like a neuron, so a pixel's intensity is represented as a sequence of events; by modifying the number and frequency of these events, it is possible to perform image filtering. In this paper we present four image filters using AER: (a) noise addition and suppression, (b) brightness modification, (c) single moving object tracking and (d) geometrical transformations (rotation, translation, reduction and magnification). For testing and debugging, we use the USB-AER board developed by the Robotic and Technology of Computers Applied to Rehabilitation (RTCAR) research group. This board is based on an FPGA devoted to managing the AER functionality, and also includes a microcontroller for USB communication, 2 Mbytes of RAM and 2 AER ports (one for input and one for output).

  11. Sensory Pollution from Bag Filters, Carbon Filters and Combinations

    Bekö, Gabriel; Clausen, Geo; Weschler, Charles J.

    2008-01-01

    Used ventilation filters are a major source of sensory pollutants in air handling systems. The objective of the present study was to evaluate the net effect that different combinations of filters had on perceived air quality after 5 months of continuous filtration of outdoor suburban air. A panel...

  12. A parallel Kalman filter via the square root Kalman filtering

    Romera, Rosario; Cipra, Tomas

    1993-01-01

    A parallel algorithm for Kalman filtering with contaminated observations is developed. The parallel implementation is based on the square root version of the Kalman filter (see [3]). This represents a great improvement over serial implementations, drastically reducing the computational costs for each state update.

  13. Cost volume refinement filter for post filtering of visual corresponding

    Fujita, Shu; Matsuo, Takuya; Fukushima, Norishige; Ishibashi, Yutaka

    2015-03-01

    In this paper, we propose a generalized framework of cost volume refinement filtering for visual correspondence problems. When we estimate a visual correspondence map, e.g., a depth map, optical flow, or a segmentation, the estimated map often contains noise and blur. One solution to this problem is post filtering. Edge-preserving filtering, such as joint bilateral filtering, can remove the noise, but it causes blur on object boundaries at the same time. As an approach to removing noise without blurring, cost volume refinement filtering (CVRF) is an effective solution for refining such labelings of correspondence problems. Several papers propose methods categorized as CVRF for various applications. These methods use various reconstruction metric functions, such as the L1 norm, the L2 norm or an exponential function, and various edge-preserving filters, such as joint bilateral filtering, guided image filtering and so on. In this paper, we generalize these factors and add a range-spatial domain resizing factor to CVRF. Experimental results show that our generalized formulation outperforms the conventional approaches, and also show which configuration of CVRF is appropriate for various applications of stereo matching and optical flow estimation.
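
    The following sketch shows one concrete CVRF instance under stated assumptions: guided filtering as the edge-preserving filter, per-slice filtering of the cost volume, and winner-take-all label selection; the paper's generalized metric functions and resizing factor are not reproduced.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def guided_filter(I, p, r=8, eps=1e-3):
    # Edge-preserving smoothing of p, guided by image I.
    mean = lambda x: uniform_filter(x, size=2 * r + 1)
    mI, mp = mean(I), mean(p)
    a = (mean(I * p) - mI * mp) / (mean(I * I) - mI * mI + eps)
    b = mp - a * mI
    return mean(a) * I + mean(b)

def refine_labels(cost, guide):
    # Filter every slice of the cost volume, then take the per-pixel
    # winner: the basic CVRF recipe.
    filtered = np.stack([guided_filter(guide, cost[d])
                         for d in range(cost.shape[0])])
    return filtered.argmin(axis=0)

# Toy usage: a 3-label cost volume over a random "guide" image.
rng = np.random.default_rng(1)
guide = rng.random((64, 64))
cost = rng.random((3, 64, 64))
print(refine_labels(cost, guide).shape)   # (64, 64) label map
```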

  14. Advanced filters for nuclear facilities and filter conditioning for disposal

    This paper reports the advantages of the cylinder shape selected for the filter elements for aerosol and iodine removal from the offgas of nuclear facilities, above all in view of remote and manual operation and transport, conditioning and disposal. In order to test the conditioning of polygonal HEPA filter elements, several filter elements not exposed to radioactivity were crushed remotely and embedded in concrete in a 400 l waste drum. The waste drum was subsequently saw cut in order to verify the quality of concrete embedding. The result of concrete embedding is satisfactory. The design is presented of a filter element capable of accommodating gas flows up to 500 m3/h for wet aerosol removal with a high removal efficiency. Also the design of a filter element for gas flows up to 800 m3/h to be used in iodine removal from offgases with low iodine contents is described. In order to be able to use the cylindrical filter elements developed for remote handling in manual operation too, e.g., for cleaning low level offgases, a manually operated filter housing was developed. It is suited for working pressures up to 10 bar and working temperatures up to 160 degree C. The filter elements are replaced by the usual bagging technique

  15. Particle Kalman Filtering: A Nonlinear Framework for Ensemble Kalman Filters

    Hoteit, Ibrahim

    2010-09-19

    Optimal nonlinear filtering consists of sequentially determining the conditional probability distribution functions (pdf) of the system state, given the information of the dynamical and measurement processes and the previous measurements. Once the pdfs are obtained, one can determine different estimates of the system state, for instance the minimum variance estimate or the maximum a posteriori estimate. It can be shown that many filters, including the Kalman filter (KF) and the particle filter (PF), can be derived within this sequential Bayesian estimation framework. In this contribution, we present a Gaussian mixture-based framework, called the particle Kalman filter (PKF), and discuss how the different EnKF methods can be derived as simplified variants of the PKF. We also discuss approaches to reducing the computational burden of the PKF in order to make it suitable for complex geosciences applications. We use the strongly nonlinear Lorenz-96 model to illustrate the performance of the PKF.
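
    As a point of reference for the simplified variants mentioned above, here is a minimal stochastic EnKF analysis step; the observation operator, noise levels and Lorenz-96-sized state in the usage lines are arbitrary assumptions, and the full Gaussian-mixture machinery of the PKF is not shown.

```python
import numpy as np

def enkf_analysis(E, y, H, R, rng):
    # Stochastic EnKF analysis on an (n_state, N) ensemble E with
    # perturbed observations.
    N = E.shape[1]
    A = E - E.mean(axis=1, keepdims=True)          # ensemble anomalies
    HE = H @ E
    HA = HE - HE.mean(axis=1, keepdims=True)
    Pyy = HA @ HA.T / (N - 1) + R                  # innovation covariance
    Pxy = A @ HA.T / (N - 1)                       # cross covariance
    K = Pxy @ np.linalg.inv(Pyy)                   # Kalman gain
    Y = y[:, None] + rng.multivariate_normal(np.zeros(len(y)), R, size=N).T
    return E + K @ (Y - HE)                        # updated ensemble

rng = np.random.default_rng(0)
E = rng.normal(size=(40, 20))                      # e.g. Lorenz-96-sized state
H = np.eye(10, 40)                                 # observe first 10 variables
R = 0.5 * np.eye(10)
E = enkf_analysis(E, rng.normal(size=10), H, R, rng)
print(E.shape)   # (40, 20)
```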

  16. T Source Inverter Based Shunt Active Filter with LCL Passive Filter for the 415V 50 Hz Distribution systems

    S. Sellakumar

    2015-06-01

    The inverter topology is being used as an active filter to reduce harmonics in the power system [1]. Traditional voltage source and current source inverters have several disadvantages: a limited output voltage range, so they may not be able to supply enough compensating current during heavy switching surges; vulnerability to EMI noise, with devices damaged in either open- or short-circuit conditions; and the main switching devices of the VSI and CSI are not interchangeable. Active filters are DC-AC systems for which a wide range of voltage regulation and the integration of energy storage are often required. This cannot be achieved with conventional inverters, and hence impedance source inverters have been suggested. The T source inverter is basically an impedance source inverter which can be used as an active filter in the power system. A MATLAB simulation is performed and the results are discussed in this paper for both types. The proposed dampening system is fully characterized by LCL-based passive filters [6] and a T source inverter based shunt active filter. The disturbances in the supply voltage and load current due to the nonlinear loads are observed in the simulation, and studied again after connecting the designed hybrid shunt active filter in the distribution system. The simulation results obtained from the proposed method prove that it gives a comparatively better THD value.

  17. Compressed sensing & sparse filtering

    Carmi, Avishy Y; Godsill, Simon J

    2013-01-01

    This book is aimed at presenting concepts, methods and algorithms able to cope with undersampled and limited data. One such trend that recently gained popularity, and to some extent revolutionised signal processing, is compressed sensing. Compressed sensing builds upon the observation that many signals in nature are nearly sparse (or compressible, as they are normally referred to) in some domain, and consequently they can be reconstructed to within high accuracy from far fewer observations than traditionally held to be necessary. Apart from compressed sensing, this book contains other related approaches...

  18. Application of DFT Filter Banks and Cosine Modulated Filter Banks in Filtering

    Lin, Yuan-Pei; Vaidyanathan, P. P.

    1994-01-01

    None given. This is a proposal for a paper to be presented at APCCAS '94 in Taipei, Taiwan. (From the outline): This work is organized as follows: Sec. II is devoted to the construction of the new 2m-channel under-decimated DFT filter bank. Implementation and complexity of this DFT filter bank are discussed therein. In a similar manner, the new 2m-channel cosine modulated filter bank is discussed in Sec. III. Design examples are given in Sec. IV.

  19. Testing Dual Rotary Filters - 12373

    The Savannah River National Laboratory (SRNL) installed and tested two hydraulically connected SpinTek® Rotary Micro-filter units to determine the behavior of a multiple-filter system and develop a multi-filter automated control scheme. Developing and testing the control of multiple filters was the next step in the development of the rotary filter for deployment. The test stand was assembled using as much of the hardware planned for use in the field as possible, including instrumentation and valving. The control scheme developed will serve as the basis for the scheme used in deployment. The multi-filter setup was controlled via an Emerson DeltaV control system running version 10.3 software. Emerson model MD controllers were installed to run the control algorithms developed during this test. Savannah River Remediation (SRR) Process Control Engineering personnel developed the software used to operate the process test model. While a variety of control schemes were tested, two primary algorithms provided extremely stable control as well as significant resistance to process upsets that could lead to equipment interlock conditions. The control system was tuned to provide satisfactory response to changing conditions during the operation of the multi-filter system. Stability was maintained through the startup and shutdown of one of the filter units while the second was still in operation. The equipment selected for deployment, including the concentrate discharge control valve, the pressure transmitters, and flow meters, performed well. Automation of the valve control integrated well with the control scheme and, when used in concert with the other control variables, allowed automated control of the dual rotary filter system. Experience acquired on multi-filter system behavior and with the system layout during this test helped to identify areas where the current deployment rotary filter installation design could be improved. Completion of this testing provides the necessary information on the control and system behavior that will be used in deployment on actual waste. (authors)

  20. Kalman Filter Design, Smoothing and Analysis

    2001-01-01

    This thesis is based on three different aspects of Kalman filtering. Kalman filters for navigation: investigate the difference between an Extended Kalman Filter and a Linearized Kalman Filter with feedback, and show how different system models relate to these Kalman filters when implemented in a filter. Smoothing: investigate how much there is to be gained from smoothing. We will only look at the fixed-interval smoother, using the method of forward and backward filtering. ...

  1. Shunt Active Filter in Damping Harmonics Propagation

    BERBAOUI, B.; RAHLI, M.; MESLEM, Y.; TEDJINI, H.

    2010-01-01

    This paper deals with a hybrid shunt active power filter applied to a 500 kV HVDC system. After a description of the causes and effects of harmonic pollution, which may damage equipment and interrupt service to electric power customers, we present the different solutions to this problem, among which we study the two most recent types of filtering: passive and hybrid. The hybrid filter consists of an active filter connected in shunt with a passive filter. The hybrid shunt active filter pro...

  2. Spatial Filter with Volume Gratings for High-peak-power Multistage Laser Amplifiers

    Tan, Yi-zhou; Zheng, Guang-wei; Shen, Ben-jian; Pan, Heng-yue; Li, Liu

    2012-01-01

    Regular spatial filters comprising a lens and a pinhole are essential components in high power laser systems, such as lasers for inertial confinement fusion, nonlinear optical technology and directed-energy weapons. On the other hand, the pinhole is treated as a bottleneck of high power lasers due to the harmful plasma created by the focusing beam. In this paper we present a spatial filter based on the angular selectivity of a Bragg diffraction grating, which avoids the harmful focusing effect of the traditional pinhole filter. A spatial filter consisting of volume phase gratings in a two-pass amplifier cavity is reported. A two-dimensional filter is proposed using a single Pi-phase-shifted Bragg grating; numerical simulation results show that its angular spectrum bandwidth can be less than 160 µrad. The angular selectivity of photo-thermo-refractive glass and RUGATE film filters, construction stability, thermal stability and the effects of misalignments of the gratings on the diffraction efficiencies under high-pulse-energy laser...

  3. Traditional Smallpox Vaccines and Atopic Dermatitis

    Frequently Asked Questions: What is the traditional smallpox vaccine? The traditional smallpox vaccine is made from ...

  4. Interactions between microbial activity and distribution and mineral coatings on sand grains from rapid sand filters treating groundwater

    Gülay, Arda; Tatari, Karolina; Musovic, Sanin; Mateiu, Ramona Valentina; Albrechtsen, Hans-Jørgen; Smets, Barth F.

    Rapid sand filtration is a traditional and widespread technology for drinking water purification which combines biological, chemical and physical processes. Granular media, especially sand, is a common filter material that allows several oxidized compounds to accumulate on its surface...

  5. Ground roll attenuation using non-stationary matching filtering

    Jiao, Shebao; Chen, Yangkang; Bai, Min; Yang, Wencheng; Wang, Erying; Gan, Shuwei

    2015-12-01

    Conventional approaches based on adaptive subtraction for ground roll attenuation first predict an initial model for ground rolls and then adaptively subtract it from the original data using a stationary matching filter (MF). Because of the non-stationary property of seismic data and ground rolls, the application of a traditional stationary MF is not physically plausible. Thus, in the case of highly non-stationary seismic reflections and ground rolls, a stationary MF cannot obtain satisfactory results. In this paper, we apply a non-stationary matching filter (NMF) to adaptively subtract the ground rolls. The NMF can be obtained by solving a highly under-determined inversion problem using non-stationary autoregression. We apply the proposed approach to one synthetic example and two field data examples, and demonstrate a much improved performance compared with the traditional MF approach.
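
    For comparison with the non-stationary approach the paper proposes, the sketch below implements the conventional stationary least-squares matching filter it improves upon; the filter length and the toy signals are illustrative assumptions, and the non-stationary autoregression solver is not shown.

```python
import numpy as np

def match_and_subtract(model, data, nf=11):
    # Stationary matching filter: find f minimizing
    # ||data - conv(model, f)||^2, then subtract the matched model.
    # The paper's NMF instead lets the coefficients vary smoothly in
    # time and space.
    n = len(data)
    M = np.zeros((n, nf))
    for j in range(nf):                 # convolution matrix of the model
        M[j:, j] = model[:n - j]
    f, *_ = np.linalg.lstsq(M, data, rcond=None)
    return data - M @ f                 # residual after subtraction

t = np.linspace(0, 1, 200)
groundroll = np.sin(2 * np.pi * 5 * t) * np.exp(-2 * t)
data = 0.8 * np.roll(groundroll, 3) \
     + 0.1 * np.random.default_rng(2).normal(size=200)
print(np.abs(match_and_subtract(groundroll, data)).mean())  # small residual
```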

  6. The double well mass filter

    Various mass filter concepts based on rotating plasmas have been suggested with the specific purpose of nuclear waste remediation. We report on a new rotating mass filter combining radial separation with axial extraction. The radial separation of the masses is the result of a “double-well” in effective radial potential in rotating plasma with a sheared rotation profile.

  7. The double well mass filter

    Gueroult, Renaud; Fisch, Nathaniel J. [Princeton Plasma Physics Laboratory, Princeton University, Princeton, New Jersey 08543 (United States); Rax, Jean-Marcel [Laboratoire d' optique appliquée-LOA, Ecole Polytechnique, Chemin de la Hunière, 91761 Palaiseau Cedex (France)

    2014-02-15

    Various mass filter concepts based on rotating plasmas have been suggested with the specific purpose of nuclear waste remediation. We report on a new rotating mass filter combining radial separation with axial extraction. The radial separation of the masses is the result of a “double-well” in effective radial potential in rotating plasma with a sheared rotation profile.

  8. Derivative free filtering using Kalmtool

    Bayramoglu, Enis; Hansen, Søren; Ravn, Ole; Poulsen, Niels Kjølstad

    2010-01-01

    In this paper we present a toolbox enabling easy evaluation and comparison of different filtering algorithms. The toolbox is called Kalmtool 4 and is a set of MATLAB tools for state estimation of nonlinear systems. The toolbox contains functions for extended Kalman filtering as well as for DD1 filtering...

  9. Chopped filter for nuclear spectroscopy

    Some of the theoretical and practical factors affecting the energy resolution of a spectrometry system are considered, especially those related to the signal-to-noise ratio, and a time-variant filter with the transfer function of the theoretical optimum filter during its active time is proposed. A prototype has been tested and experimental results are presented. (Author)

  10. Informativeness of Parallel Kalman Filters

    Hajiyev, Chingiz

    2004-01-01

    This article considers the informativeness of parallel Kalman filters. Expressions are derived for determination of the amount of information obtained by additional measurements with a reserved measurement channel during processing. The theorems asserting that there is an increase in the informativeness of Kalman filters when there is a failure-free reserved measurement channel are proved.

  11. Filter-extruded liposomes revisited

    Hinna, Askell; Steiniger, Frank; Hupfeld, Stefan; Stein, Paul C.; Kuntsche, Judith; Brandl, Martin

    Filter extrusion is a widely used technique for down-sizing phospholipid vesicles. In order to gain detailed insight into the size and size distributions of filter-extruded vesicles composed of egg phosphatidylcholine (with varying fractions of cholesterol), in relation to extrusion parameters ...

  12. Market Risk Beta Estimation using Adaptive Kalman Filter

    Atanu Das,

    2010-06-01

    Market risk of an asset or portfolio is recognized through beta in the Capital Asset Pricing Model (CAPM). Traditional estimation techniques yield poor results when beta in the CAPM is assumed to be dynamic and to follow an autoregressive model. The Kalman Filter (KF) can optimally estimate a dynamic beta when the measurement noise covariance and state noise covariance are assumed known in a state-space framework. This paper applies the Adaptive Kalman Filter (AKF) for beta estimation when the above covariances are not known and are estimated dynamically. The technique is first characterized through a simulation study and then applied to empirical data from the Indian security market. A modification of the AKF used is also proposed to address problems in applying the AKF to beta estimation, and simulations show that the modified method improves the performance of the filter as measured by RMSE.
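
    A minimal sketch of the idea, assuming a random-walk beta and a simple innovation-based adaptation of the measurement noise variance; the exact AKF and modification used in the paper may differ, and the window length and noise levels here are illustrative.

```python
import numpy as np

def adaptive_beta(r_asset, r_market, q0=1e-5, r0=1e-3, win=30):
    # Scalar Kalman filter for beta_t = beta_{t-1} + w (random walk),
    # observed through r_asset = beta * r_market + v. R is re-estimated
    # from recent innovations as a crude stand-in for the paper's AKF.
    beta, P, R = 1.0, 1.0, r0
    innovations, betas = [], []
    for y, x in zip(r_asset, r_market):
        P += q0                              # predict (random-walk state)
        e = y - beta * x                     # innovation
        innovations.append(e)
        if len(innovations) >= win:          # adapt R: var(e) ~ x P x + R
            R = max(np.var(innovations[-win:]) - P * x * x, 1e-8)
        S = x * P * x + R
        K = P * x / S                        # Kalman gain
        beta += K * e
        P *= (1 - K * x)
        betas.append(beta)
    return np.array(betas)

rng = np.random.default_rng(3)
rm = rng.normal(scale=0.01, size=500)                # market returns
true_beta = 1.0 + 0.3 * np.sin(np.arange(500) / 50)  # slowly varying beta
ra = true_beta * rm + rng.normal(scale=0.005, size=500)
print(adaptive_beta(ra, rm)[-1])                     # tracks the final beta
```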

  13. Segmentation of Retinal Blood Vessels Based on Cake Filter.

    Bao, Xi-Rong; Ge, Xin; She, Li-Huang; Zhang, Shi

    2015-01-01

    Segmentation of retinal blood vessels is significant for the diagnosis and evaluation of ocular diseases like glaucoma and systemic diseases such as diabetes and hypertension. Retinal blood vessel segmentation for small and low-contrast vessels is still a challenging problem. To solve this problem, a new method based on the cake filter is proposed. Firstly, a quadrature filter bank called the cake filter bank is constructed in the Fourier domain. Then real-component fusion is used to separate the blood vessels from the background. Finally, the blood vessel network is obtained by an adaptive threshold. Experiments on the STARE database indicate that the new method performs better than traditional ones on small vessel extraction, average accuracy rate, and true and false positive rates. PMID:26636095

  14. Traditional Medicine in Developing Countries

    Thorsen, Rikke Stamp

    ... medicine has led to the formulation of policies on the integration of traditional medicine into public health care. Local-level integration is already taking place as people use multiple treatments when experiencing illness. Research on local-level use of traditional medicine for health care, in particular ... treatments which are "traditional" reflect other conceptualizations than the "modern/traditional" dichotomy. People conceptualized treatments through notions related to time and place, positioning treatments on spectrums ranging from home treatments to city treatments as well as from past to present. Through ... ways: as the preferred, the pragmatic, the convenient, the compelled and the auxiliary treatment. Prevalence of use and ways of using self-treatment with medicinal plants varied among sites, reflecting differences in social and health care needs as well as everyday realities and people's experiences of ...

  15. Filter wheel equalization for DSA

    This paper reports on the design of a practical system for radiographic equalization in digital subtraction angiography (DSA), using multiple filter wheels mounted near the x-ray tube. Using an unequalized scout image, multiple filter wheels were independently rotated under computer control in order to vary the spatial distribution of the attenuator material on the filter wheel surfaces intersecting the x-ray beam. Computer simulations using clinical DSA images were used to analyze the optimal configuration of attenuator material on the filter wheels, and the resulting improvement in signal-to-noise ratio in some typical DSA images was quantified. Neural networks were evaluated as a mathematical technique for rapidly determining the optimal filter wheel positioning from the scout image data.

  16. Identification Filtering with fuzzy estimations

    J.J Medel J

    2012-10-01

    A digital identification filter interacts with the output signal of a reference model, treated as the output of a black-box system. The identification technique commonly needs the transition and gain matrices. In both estimation cases a mean square criterion is used, with the minimum output error taken as the best filtering estimate. The evolving system has adaptive properties, which the identification mechanism incorporates through fuzzy logic strategies that affect, in a probabilistic sense, the evolution of the identification filter. The fuzzy estimation filter describes the transition and gain matrices in two forms, applying actions that affect the identification structure. Basically, the adaptive criterion shapes the set of inference mechanisms and the knowledge and rule bases, selecting the optimal coefficients in a distributed form. This paper describes the fuzzy strategies applied to the Kalman filter transition function and gain matrices. The simulation results were developed using Matlab.

  17. Software Development: Agile vs. Traditional

    Marian STOICA

    2013-01-01

    Organizations face the need to adapt themselves to a complex business environment in continuous change and transformation. Under these circumstances, organizational agility is a key element in gaining strategic advantages and market success. Achieving and maintaining agility requires agile architectures, techniques, methods and tools, able to react in real time to changing requirements. This paper proposes an excursion through software development, from traditional to agile.

  18. Post-traditional corporate governance

    Mason, Michael; O'Mahony, Joan

    2007-01-01

    Traditional definitions of corporate governance are narrow, focusing on legal relations between managers and shareholders. More recent definitions extend the boundaries of governance to consider the role that various stakeholders play in shaping the behaviour of firms. While stakeholding theory embraces a broader set of corporate constituencies, our argument in this paper is that even these definitions are too narrow – they lack the analytical capacity to account for the social embeddedness a...

  19. Symmetry in Traditional Persian Poetry

    BEHNEJAD, S. ALIREZA; Zahedi, Maryam

    2010-01-01

    A great many Persian poems have been composed by famous and obscure poets throughout the centuries, which Persians have learned, memorized and recited throughout their lives. Regardless of their meaning, there are other aspects that make learning these poems simple and pleasant. It seems that the rhythm in traditional Persian poems is an important factor that makes it possible for non-Persian-speaking people to enjoy them. As Marco Polo writes in his travelogue: 'Persians are peop...

  20. Ginseng in Traditional Herbal Prescriptions

    Park, Ho Jae; Kim, Dong Hyun; Park, Se Jin; Kim, Jong Min; Ryu, Jong Hoon

    2012-01-01

    Panax ginseng Meyer has been widely used as a tonic in traditional Korean, Chinese, and Japanese herbal medicines and in Western herbal preparations for thousands of years. In the past, ginseng was very rare and was considered to have mysterious powers. Today, the efficacy of drugs must be tested through well-designed clinical trials or meta-analyses, and ginseng is no exception. In the present review, we discuss the functions of ginseng described in historical documents and describe how these...

  1. TRADITIONAL FERMENTED FOODS OF LESOTHO

    Gadaga, Tendekayi H.; Molupe Lehohla; Victor Ntuli

    2013-01-01

    This paper describes the traditional methods of preparing fermented foods and beverages of Lesotho. Information on the preparation methods was obtained through a combination of literature review and face-to-face interviews with respondents from Roma in Lesotho. An unstructured questionnaire was used to capture information on the processes, raw materials and utensils used. Four products: motoho (a fermented porridge), Sesotho (a sorghum-based alcoholic beverage), hopose (sorghum fermented beer...

  2. A collaborative filtering similarity measure based on singularities.

    Bobadilla Sancho, Jesus; Ortega Requena, Fernando; Hernando Esteban, Antonio

    2012-01-01

    Recommender systems play an important role in reducing the negative impact of information overload on those websites where users have the possibility of voting for their preferences on items. The most common technique for the recommendation mechanism is collaborative filtering, in which it is essential to discover the users most similar to the one to whom you wish to make recommendations. The hypothesis of this paper is that the results obtained by applying traditional similari...
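
    For reference, the sketch below computes the traditional Pearson similarity over co-rated items, i.e. the kind of baseline measure the singularity-based approach is compared against; the singularity measure itself is not reproduced here, and the zero-means-unrated encoding is an illustrative convention.

```python
import numpy as np

def pearson_sim(u, v):
    # Pearson correlation restricted to items both users rated
    # (0 encodes "unrated" in these toy vectors).
    mask = (u > 0) & (v > 0)
    if mask.sum() < 2:
        return 0.0
    a = u[mask] - u[mask].mean()
    b = v[mask] - v[mask].mean()
    denom = np.linalg.norm(a) * np.linalg.norm(b)
    return float(a @ b / denom) if denom > 0 else 0.0

u = np.array([5, 3, 0, 4])
v = np.array([4, 2, 5, 5])
print(pearson_sim(u, v))   # similarity from the three co-rated items
```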

  3. Joint MIMO radar waveform and receiving filter optimization

    Chen, Chun-Yang; Vaidyanathan, P.P.

    2009-01-01

    The concept of MIMO (multiple-input multiple-output) radar allows each transmitting antenna element to transmit an arbitrary waveform. This provides extra degrees of freedom compared to the traditional transmit beamforming approach. It has been shown in the recent literature that MIMO radar systems have many advantages. In this paper, we consider the joint optimization of waveforms and receiving filters in MIMO radar when prior information on the target and clutter ...

  4. Health traditions of Sikkim Himalaya

    Ashok Kumar Panda

    2010-01-01

    Ancient medical systems are still prevalent in Sikkim, popularly nurtured by Buddhist groups using the traditional Tibetan pharmacopoeia overlapping with Ayurvedic medicine. Traditional medical practices and their associated cultural values are based around Sikkim's three major communities: Lepcha, Bhutia and Nepali. In this study, a semi-structured questionnaire was prepared for folk healers covering age and sex, educational qualification, source of knowledge, types of practices, experience and generation of practice, and transformation of knowledge. It was administered to forty-eight folk healers identified in different parts of Sikkim. 490 medicinal plants find their habitats in Sikkim because of its large variations in altitude and climate. For 31 plants commonly used by these folk healers, we present the botanical name, family, local name, distribution, and parts used, together with their therapeutic uses, mostly rheumatoid arthritis, gout, gonorrhea, fever, viral flu, asthma, cough and cold, indigestion, jaundice, etc. A case treated by a folk healer is also recounted. This study indicates that, in the studied area, Sikkim's health traditions and folk practices are declining due to shifts in socio-economic patterns and the unwillingness of the younger generation to adopt folk healing as a profession.

  5. On-line filtering

    Present-day electronic detectors used in high energy physics make it possible to obtain high event rates, and it is likely that future experiments will face even higher data rates than at present. The complexity of the apparatus increases very rapidly with time, and the criteria for selecting desired events also become more and more complex; so complex, in fact, that the fast trigger system cannot be designed to fully cope with them. The interesting events thus become contaminated with multitudes of uninteresting ones. To distinguish the 'good' events from the often overwhelming background of other events, one has to resort to computing techniques. Normally this selection is made in the first part of the analysis of the events, an analysis normally performed on a powerful scientific computer. This implies, however, that many uninteresting or background events have to be recorded during the experiment for subsequent analysis. A number of undesired consequences result, and these constitute a sufficient reason for trying to perform the selection at an earlier stage, ideally before the events are recorded on magnetic tape. This early selection is called 'on-line filtering' and it is the topic of the present lectures. (Auth.)

  6. Filter for radioactive iodine

    Purpose: To prevent the reduction in the activity of radioactive iodine adsorbent material at high temperature. Constitution: Regenerated-cellulose-type fibrous activated carbon with a pore volume of 0.08 cc/g is reactivated by impregnating it with 10% by weight of magnesium acetate, to obtain fibrous activated carbon with a pore volume of 0.40 cc/g for pores with diameters between 30 and 300 Å. 60 parts of the activated carbon, 40 parts of bleached kraft pulp made from coniferous trees and 7 parts by weight of polyvinyl alcohol fibers are subjected to a papermaking process to obtain activated carbon paper. It is then molded into a single-sided corrugated sheet, which is immersed in an ethanol solution containing 20% by weight of triethylenediamine, then dried and molded into a honeycomb filter. It is necessary that the activated carbon material have a pore volume of more than 5 cc/g for pores with diameters between 30 and 300 Å. (Horiuchi, T.)

  7. Collaborative Filtering Recommender Systems

    Mehrbakhsh Nilashi

    2013-04-01

    Recommender systems are software tools and techniques for suggesting items to users by considering their preferences in an automated fashion. The suggestions provided are aimed at supporting users in various decision-making processes. Technically, recommender systems have their origins in different fields such as Information Retrieval (IR), text classification, machine learning and Decision Support Systems (DSS). Recommender systems are used to address the Information Overload (IO) problem by recommending potentially interesting or useful items to users. They have proven to be worthy tools for online users to deal with IO and have become one of the most popular and powerful tools in e-commerce. Many existing recommender systems rely on Collaborative Filtering (CF) and have been extensively used in e-commerce, where they have proven to be very effective, with powerful techniques in use at many well-known e-commerce companies. This study presents an overview of the field of recommender systems with the current generation of recommendation methods, and comprehensively examines CF systems and their algorithms.

  8. Implementational Aspects of the Contourlet Filter Bank and Application in Image Coding

    Truong T. Nguyen

    2009-02-01

    This paper analyzes the implementational aspects of the contourlet filter bank (or pyramidal directional filter bank, PDFB) and considers its application to image coding. First, details of the binary tree-structured directional filter bank (DFB) are presented, including a modification to minimize the phase delay factor and the steps necessary for handling rectangular images. The PDFB is viewed as an overcomplete filter bank, and the directional filters are expressed in terms of polyphase components of the pyramidal filter bank and the conventional DFB. The aliasing effect of the conventional DFB and the Laplacian pyramid on the directional filters is then considered, and conditions for reducing this effect are presented. The new filters obtained by redesigning the PDFBs to satisfy these requirements have much better frequency responses. A hybrid multiscale filter bank consisting of the PDFB at higher scales and the traditional maximally decimated wavelet filter bank at lower scales is constructed to provide a sparse image representation. A novel embedded image coding system based on this image decomposition and a morphological dilation algorithm is then presented. The coding algorithm efficiently clusters the significant coefficients using progressive morphological operations. Context models for arithmetic coding are designed to exploit the intraband dependency and the correlation existing among the neighboring directional subbands. Experimental results show that the proposed coding algorithm outperforms current state-of-the-art wavelet-based coders, such as JPEG2000, for images with directional features.

  9. Modernism and tradition and the traditions of modernism

    Kros Džonatan

    2006-01-01

    Conventionally, the story of musical modernism has been told in terms of a catastrophic break with the (tonal) past and the search for entirely new techniques and modes of expression suitable to a new age. The resulting notion of a single, linear, modernist mainstream (predicated on a Schoenbergian model of musical progress) has served to conceal a more subtle relationship between past and present. Increasingly, it is being recognized that there exist many modernisms, and their various identities are forged from a continual renegotiation between past and present, between tradition(s) and the avant-garde. This is especially relevant when attempting to discuss the reception of modernism outside central Europe, where the adoption of (Germanic) avant-garde attitudes was often interpreted as being "unpatriotic". The case of Great Britain is examined in detail: Harrison Birtwistle's opera The Mask of Orpheus (1973-83) forms the focus for a wider discussion of modernism within the context of late/post-modern thought.

  10. Gas separating and venting filter

    A gas separating and venting filter is disclosed for separating gases and liquids and venting the gases in any position of the filter. A housing defines an interior chamber, with inlet and outlet means for the flow of liquid into and out of the chamber. A hydrophilic filter membrane extends along one major wall of the chamber, with longitudinally extending open-sided passageways in the one major wall facing the hydrophilic filter membrane and leading to the outlet means. The hydrophilic filter membrane is flexible for ballooning into the passageways in response to a build-up of pressure in the chamber to restrict and/or cut off the flow of liquid through the passageways. A hydrophobic filter membrane extends along substantially the entire length of an opposite major wall of the chamber between the inlet and outlet means for passing gas but not liquid therethrough. A plurality of spaced vent holes are formed in the opposite major wall for venting gas which has passed through the hydrophobic filter membrane

  11. Nanophotonic filters for digital imaging

    Walls, Kirsty

    There has been increasing demand for low-cost, portable CMOS image sensors because of increased integration and new applications in the automotive, mobile communication and medical industries, amongst others. Colour reproduction remains imperfect in conventional digital image sensors due to the limitations of the dye-based filters. Further improvement is required if the full potential of digital imaging is to be realised. In alternative systems, where accurate colour reproduction is a priority, existing equipment is too bulky for anything but specialist use. In this work both these issues are addressed by exploiting nanophotonic techniques to create enhanced trichromatic filters and multispectral filters, all of which can be fabricated on-chip, i.e. integrated into a conventional digital image sensor, to create compact, low-cost, mass-producible imaging systems with accurate colour reproduction. The trichromatic filters are based on plasmonic structures; they exploit the excitation of surface plasmon resonances in arrays of subwavelength holes in metal films to filter light. The currently known analytical expressions are inadequate for optimising all relevant parameters of a plasmonic structure. In order to obtain arbitrary filter characteristics, an automated design procedure was developed that integrated a genetic algorithm and a 3D finite-difference time-domain tool. The optimisation procedure's efficacy is demonstrated by designing a set of plasmonic filters that replicate the CIE (1931) colour matching functions, which themselves mimic the human eye's daytime colour response.

  12. Factors Influencing HEPA Filter Performance

    Properly functioning HEPA air filtration systems depend on a variety of factors that start with the use of fully characterized challenge conditions for system design, followed by process control during operation. This paper addresses factors that should be considered during the design phase as well as operating parameters that can be monitored to ensure filter function and lifetime. HEPA filters used in nuclear applications are expected to meet the design, fabrication, and performance requirements set forth in the ASME AG-1 standard. The DOE publication Nuclear Air Cleaning Handbook (NACH) is an additional guidance document for the design and operation of HEPA filter systems in DOE facilities. These two guidelines establish basic maximum operating parameters for temperature, maximum aerosol particle size, maximum particulate matter mass concentration, acceptable differential pressure range, and filter media velocity. Each of these parameters is discussed along with data linking the variability of each parameter with filter function and lifetime. Temporal uncertainty associated with gas composition, temperature, and absolute pressure of the air flow can have a direct impact on the volumetric flow rate of the system, with a corresponding impact on filter media velocity. Correlations between standard units of flow rate (standard cubic meters per minute or cubic feet per minute) and actual units of volumetric flow rate are shown for variations in relative humidity over a 70 deg. C to 200 deg. C temperature range, as an example of a gas-composition effect that, uncorrected, will influence media velocity. The AG-1 standard establishes a 2.5 cm/s (5 feet per minute) ceiling for media velocities of nuclear grade HEPA filters. Data are presented that show the impact of media velocities from 2.0 to 4.0 cm/s (4 to 8 fpm) on differential pressure, filter efficiency, and filter lifetime. Data are also presented correlating media velocity effects with two different particle size distributions. (authors)

  13. Thick Filter Transmission Measurements

    In multigroup schemes for calculating the neutron space-energy distribution, the fluxes <Φ(r)>_g and reaction rates <σ_a Φ(r)>_g averaged over energy intervals g (groups) are determined. The group-averaged transmissions and cross sections with self-indication, T_g(n) = <exp(-nσ)>_g and <σ_a(n)>_g = <σ_a exp(-nσ)>_g, are considered here as the basic input information about the elementary processes. In principle, for the resolved resonance region these cross-section functionals can be determined directly using data from evaluated data files. The situation in the region of unresolved resonance levels, where direct information about the resonance structure of the cross sections in the averaging interval is not available, is much more complicated. Indirectly, the effect of resonance structure appears as a disagreement between thick-sample transmission data and the calculated exp(-n<σ>) in its dependence on the sample thickness n. The strong dependence of the average transmission at large n on the cross-section values in resonance minima and resonance wings implies a requirement to validate, and possibly correct, the evaluated data files against the results of direct transmission measurements for relatively thick samples with beam attenuations of 100 to 1000 times. Thick-sample transmission measurements can be treated as the simplest benchmark experiments. They are the only integral measurements easily reproduced from evaluated files and can be used for validation and improvement of the evaluated libraries. The main advantage of using such measurements is the possibility of achieving high experimental accuracy. A brief overview of available thick-filter transmission and self-indication data, including our previously obtained results, is presented. The prospects of new high-intensity neutron sources for thick-sample transmission measurements are discussed, concerning the possibility of reaching greater attenuation of the neutron beam and enlarging the number and variety of nuclei involved in these investigations. (author)

  14. Two traditions of interaction research.

    Peräkylä, Anssi

    2004-03-01

    The paper compares Bales' Interaction Process Analysis (IPA) with Sacks' Conversation Analysis (CA), arguing that CA has answered several questions that originally motivated the development of IPA, and while doing so, it has re-specified the phenomena of interaction research. These two research traditions are in many ways diametrically opposed: the former is quantitative, theory-oriented and aims at global characterizations of interactional situations, while the latter is qualitative, inductive and aims at characterizing specific layers of organization (such as turn taking or sequence organization) that give structure to interactional situations. Their primary objects of study are different. For the Balesian tradition, it is the functioning and the structure of a small group, whereas in the Sacksian tradition, it is the structures and practices of human social interaction per se. It is argued, however, that CA has radically expanded understanding of the questions IPA was originally developed to address. These questions include allocation of resources, control and solidarity. Bales' research deals with them in terms of the differentiation of participants of a group, whereas CA has re-specified them as emergent aspects of the very rules and structures that constitute and regulate interaction sequences. The uniqueness of the CA perspective on social interaction is demonstrated by exploring the display of emotion as an interactional phenomenon. It is argued that the display of emotion is intrinsically embedded in the sequential organization of action. Sensitive 'coding and counting' approaches can detect emotion displays, but the contribution of CA is to show the specific ways in which they are part of the business of interaction. PMID:15035695

  15. Face Recognition using Gabor Filters

    Sajjad MOHSIN

    2011-01-01

    An Elastic Bunch Graph Map (EBGM) algorithm is proposed in this research paper that implements face recognition using Gabor filters. The proposed system applies 40 different Gabor filters to an image, as a result of which 40 images with different angles and orientations are obtained. Next, the maximum-intensity points in each filtered image are calculated and marked as fiducial points. The system reduces these points according to the distance between them. The next step is calculating the distances between the reduced points using the distance formula. Finally, the distances are compared with the database; if a match occurs, the image is recognized.
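
    The 40-filter bank used in such pipelines is typically 8 orientations times 5 scales; the sketch below builds one under that assumption, with kernel size, wavelengths and the Gaussian envelope parameters as illustrative choices. The graph-matching stage of EBGM is omitted.

```python
import numpy as np

def gabor_kernel(ksize, sigma, theta, lam, psi=0.0, gamma=0.5):
    # Real Gabor kernel: a cosine carrier of wavelength lam, rotated by
    # theta, under an elongated Gaussian envelope.
    half = ksize // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    xr = x * np.cos(theta) + y * np.sin(theta)
    yr = -x * np.sin(theta) + y * np.cos(theta)
    return np.exp(-(xr**2 + (gamma * yr)**2) / (2 * sigma**2)) \
           * np.cos(2 * np.pi * xr / lam + psi)

bank = [gabor_kernel(31, sigma=4.0, theta=o * np.pi / 8, lam=l)
        for o in range(8)              # 8 orientations
        for l in (4, 6, 8, 11, 16)]    # 5 wavelengths
print(len(bank))   # 40 filters, one response image each after convolution
```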

  16. Advanced simulation of digital filters

    Doyle, G. S.

    1980-09-01

    An Advanced Simulation of Digital Filters (ASDF) has been implemented on the IBM 360/67 computer utilizing Tektronix hardware and software. The program package is appropriate for use by persons beginning their study of digital signal processing or for filter analysis. The ASDF programs provide the user with an interactive method by which filter pole and zero locations can be manipulated. Graphical output on both the Tektronix graphics screen and the Versatec plotter is provided to observe the effects of pole-zero movement.
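
    The quantity such a tool plots as the user drags poles and zeros around the z-plane is the frequency response evaluated on the unit circle; a minimal sketch follows, with the example pole/zero placement chosen arbitrarily.

```python
import numpy as np

def freq_response(zeros, poles, gain=1.0, n=512):
    # Evaluate H(z) = gain * prod(z - zeros) / prod(z - poles) on the
    # unit circle z = exp(jw), w in [0, pi].
    w = np.linspace(0, np.pi, n)
    z = np.exp(1j * w)
    H = gain * np.ones_like(z)
    for q in zeros:
        H *= (z - q)
    for p in poles:
        H /= (z - p)
    return w, H

# A zero at z = -1 kills the response at Nyquist; poles near +/-0.9j
# create a resonance around w = pi/2.
w, H = freq_response(zeros=[-1.0], poles=[0.9j, -0.9j])
print(abs(H[0]), abs(H[len(H) // 2]), abs(H[-1]))
```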

  17. ADVANCED HOT GAS FILTER DEVELOPMENT

    E.S. Connolly; G.D. Forsythe

    2000-09-30

    DuPont Lanxide Composites, Inc. undertook a sixty-month program, under DOE Contract DEAC21-94MC31214, in order to develop hot gas candle filters from a patented material technology known as PRD-66. The goal of this program was to extend the development of this material as a filter element and fully assess the capability of this technology to meet the needs of Pressurized Fluidized Bed Combustion (PFBC) and Integrated Gasification Combined Cycle (IGCC) power generation systems at commercial scale. The principal objective of Task 3 was to build on the initial PRD-66 filter development, optimize its structure, and evaluate basic material properties relevant to the hot gas filter application. Initially, this consisted of an evaluation of an advanced filament-wound core structure that had been designed to produce an effective bulk filter underneath the barrier filter formed by the outer membrane. The basic material properties to be evaluated (as established by the DOE/METC materials working group) would include mechanical, thermal, and fracture toughness parameters for both new and used material, for the purpose of building a material database consistent with what is being done for the alternative candle filter systems. Task 3 was later expanded to include analysis of PRD-66 candle filters, which had been exposed to actual PFBC conditions, development of an improved membrane, and installation of equipment necessary for the processing of a modified composition. Task 4 would address essential technical issues involving the scale-up of PRD-66 candle filter manufacturing from prototype production to commercial scale manufacturing. The focus would be on capacity (as it affects the ability to deliver commercial order quantities), process specification (as it affects yields, quality, and costs), and manufacturing systems (e.g. QA/QC, materials handling, parts flow, and cost data acquisition). Any filters fabricated during this task would be used for product qualification tests being conducted by Westinghouse at Foster-Wheeler's Pressurized Circulating Fluidized Bed (PCFBC) test facility in Karhula, Finland. Task 5 was designed to demonstrate the improvements implemented in Task 4 by fabricating fifty 1.5-meter hot gas filters. These filters were to be made available for DOE-sponsored field trials at the Power Systems Development Facility (PSDF), operated by Southern Company Services in Wilsonville, Alabama.

  18. Attitude Representations for Kalman Filtering

    Markley, F. Landis; Bauer, Frank H. (Technical Monitor)

    2001-01-01

    The four-component quaternion has the lowest dimensionality possible for a globally nonsingular attitude representation, it represents the attitude matrix as a homogeneous quadratic function, and its dynamic propagation equation is bilinear in the quaternion and the angular velocity. The quaternion is required to obey a unit norm constraint, though, so Kalman filters often employ a quaternion for the global attitude estimate and a three-component representation for small errors about the estimate. We consider these mixed attitude representations for both a first-order Extended Kalman filter and a second-order filter, as well as for quaternion-norm-preserving attitude propagation.
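
    A minimal sketch of the mixed representation the abstract describes: a filtered three-component error estimate is folded back into the global quaternion and the unit norm is restored. Conventions and function names are illustrative, not from the paper:

        import numpy as np

        def quat_mult(q, r):
            # Hamilton product, scalar-last convention (an assumption; conventions vary).
            x1, y1, z1, w1 = q
            x2, y2, z2, w2 = r
            return np.array([
                w1*x2 + x1*w2 + y1*z2 - z1*y2,
                w1*y2 - x1*z2 + y1*w2 + z1*x2,
                w1*z2 + x1*y2 - y1*x2 + z1*w2,
                w1*w2 - x1*x2 - y1*y2 - z1*z2,
            ])

        def apply_small_error(q_ref, delta_theta):
            # Fold a three-component small-error estimate back into the global
            # quaternion, then re-normalize to restore the unit constraint.
            dq = np.concatenate([0.5 * delta_theta, [1.0]])  # small-angle quaternion
            q_new = quat_mult(q_ref, dq)
            return q_new / np.linalg.norm(q_new)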

  19. Pragmatic circuits signals and filters

    Eccles, William

    2006-01-01

    Pragmatic Circuits: Signals and Filters is built around the processing of signals. Topics include spectra, a short introduction to the Fourier series, design of filters, and the properties of the Fourier transform. The focus is on signals rather than power. But the treatment is still pragmatic. For example, the author accepts the work of Butterworth and uses his results to design filters in a fairly methodical fashion. This third of three volumes finishes with a look at spectra by showing how to get a spectrum even if a signal is not periodic. The Fourier transform provides a way of dealing wi
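
    For a flavor of the methodical Butterworth design the book works through, a minimal modern sketch (SciPy stands in for the book's by-hand procedure; the numeric spec is invented):

        import numpy as np
        from scipy import signal

        # 4th-order low-pass Butterworth, 1 kHz cutoff at fs = 8 kHz.
        b, a = signal.butter(N=4, Wn=1000, btype="low", fs=8000)
        w, h = signal.freqz(b, a, fs=8000)
        idx = np.argmin(np.abs(w - 1000))
        print(20 * np.log10(abs(h[idx])))   # approximately -3 dB at the cutoff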

  20. Simplified design of filter circuits

    Lenk, John

    1999-01-01

    Simplified Design of Filter Circuits, the eighth book in this popular series, is a step-by-step guide to designing filters using off-the-shelf ICs. The book starts with the basic operating principles of filters and common applications, then moves on to describe how to design circuits by using and modifying chips available on the market today. Lenk's emphasis is on practical, simplified approaches to solving design problems. Features: practical designs using off-the-shelf ICs; a straightforward, no-nonsense approach; extensive illustration with manufacturer's data sheets.

  1. Gas cleaning with Granular Filters

    Natvig, Ingunn Roald

    2007-01-01

    The panel bed filter (PBF) is a granular filter patented by A. M. Squires in the late sixties. PBFs consist of louvers with stationary, granular beds. Dust is deposited in the top layers and on the bed surface when gas flows through. PBFs are resistant to high temperatures, variations in the gas flow and hot particles. The filter is cleaned by releasing a pressure pulse in the opposite direction of the bulk flow (a puff back pulse). A new louver geometry patented by A. M. Squires is the...

  2. Properties of ceramic candle filters

    Pontius, D.H.

    1995-06-01

    The mechanical integrity of ceramic filter elements is a key issue for hot gas cleanup systems. To meet the demands of the advanced power systems, the filter components must sustain the thermal stresses of normal operations (pulse cleaning), of start-up and shut-down conditions, and of unanticipated process upsets such as excessive ash accumulation without catastrophic failure. They must also survive the various mechanical loads associated with handling and assembly, normal operation, and process upsets. For near-term filter systems, these elements must survive at operating temperatures of 1650°F for three years.

  3. Advanced Filtering Techniques Applied to Spaceflight Project

    National Aeronautics and Space Administration — IST-Rolla developed two nonlinear filters for spacecraft orbit determination during the Phase I contract. The theta-D filter and the cost based filter, CBF, were...

  4. Shunt Active Filter in Damping Harmonics Propagation

    BERBAOUI, B.

    2010-08-01

    Full Text Available This paper deals with a hybrid shunt active power filter applied to a 500 kV HVDC system. After a description of the causes and effects of harmonic pollution, which may damage equipment and interrupt service to electric power customers, we present the different solutions to this problem, studying the two most relevant types of filtering: passive and hybrid. The hybrid filter consists of an active filter connected in shunt with a passive filter. The proposed hybrid shunt active filter is based on a three-level PWM inverter and is characterized by detecting the harmonic current flowing into the passive filter, controlled by a notch algorithm. This structure, applied to a test HVDC power system, is presented as a technical solution that makes it possible to eliminate the disadvantages of passive filtering while limiting the economic cost of the active filtering part. The simulation results confirm the effectiveness of this type of filter compared with the classic passive filter.

  5. Adapting agriculture with traditional knowledge

    Swiderska, Krystyna; Reid, Hannah [IIED, London (United Kingdom); Song, Yiching; Li, Jingsong [Centre for Chinese Agriculutral Policy (China); Mutta, Doris [Kenya Forestry Research Institute (Kenya)

    2011-10-15

    Over the coming decades, climate change is likely to pose a major challenge to agriculture; temperatures are rising, rainfall is becoming more variable and extreme weather is becoming a more common event. Researchers and policymakers agree that adapting agriculture to these impacts is a priority for ensuring future food security. Strategies to achieve that in practice tend to focus on modern science. But evidence, both old and new, suggests that the traditional knowledge and crop varieties of indigenous peoples and local communities could prove even more important in adapting agriculture to climate change.

  6. Time Weight Update Model Based on the Memory Principle in Collaborative Filtering

    Dan Li

    2013-11-01

    Full Text Available Collaborative filtering is the most widely used technology in recommender systems. Existing collaborative filtering algorithms do not take the time factor into account. However, users' interests always change with time, and traditional collaborative filtering cannot reflect these changes. In this paper, the change of users' interests is considered as a memory process, and a time weight iteration model is designed based on the memory principle. For a certain user, the proposed model introduces a time weight for each item, and updates the weight by computing the similarity with the items chosen in a recent period. In the recommendation process, the weight is applied to the prediction algorithm. Experimental results show that the modified algorithm can optimize the result of the recommendation to a certain extent, and performs better than traditional collaborative filtering.
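
    A minimal sketch of the idea under stated assumptions: the abstract does not give the exact weight update, so an exponential forgetting curve with an assumed 30-day half-life stands in for the memory-principle iteration:

        import time

        def time_weight(t_rated, t_now, half_life_days=30.0):
            # Exponential forgetting curve (assumed form, not the paper's rule).
            age_days = (t_now - t_rated) / 86400.0
            return 0.5 ** (age_days / half_life_days)

        def weighted_prediction(ratings):
            # ratings: list of (rating, unix_timestamp) from similar items;
            # recent ratings count more toward the prediction.
            now = time.time()
            num = sum(r * time_weight(t, now) for r, t in ratings)
            den = sum(time_weight(t, now) for r, t in ratings)
            return num / den if den else 0.0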

  7. Optimized Beam Sculpting with Generalized Fringe-rate Filters

    Parsons, Aaron R.; Liu, Adrian; Ali, Zaki S.; Cheng, Carina

    2016-03-01

    We generalize the technique of fringe-rate filtering, whereby visibilities measured by a radio interferometer are re-weighted according to their temporal variation. As the Earth rotates, radio sources traverse through an interferometer’s fringe pattern at rates that depend on their position on the sky. Capitalizing on this geometric interpretation of fringe rates, we employ time-domain convolution kernels to enact fringe-rate filters that sculpt the effective primary beam of antennas in an interferometer. As we show, beam sculpting through fringe-rate filtering can be used to optimize measurements for a variety of applications, including mapmaking, minimizing polarization leakage, suppressing instrumental systematics, and enhancing the sensitivity of power-spectrum measurements. We show that fringe-rate filtering arises naturally in minimum variance treatments of many of these problems, enabling optimal visibility-based approaches to analyses of interferometric data that avoid systematics potentially introduced by traditional approaches such as imaging. Our techniques have recently been demonstrated in Ali et al., where new upper limits were placed on the 21 cm power spectrum from reionization, showcasing the ability of fringe-rate filtering to successfully boost sensitivity and reduce the impact of systematics in deep observations.
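
    A bare-bones sketch of the underlying operation, a top-hat filter in the fringe-rate domain (the paper's shaped time-domain convolution kernels are more sophisticated):

        import numpy as np

        def fringe_rate_filter(vis, dt, fr_lo, fr_hi):
            # Keep only temporal Fourier modes (fringe rates, in Hz) inside
            # [fr_lo, fr_hi]; vis is a complex time series for one baseline.
            rates = np.fft.fftfreq(len(vis), d=dt)       # fringe-rate axis
            spec = np.fft.fft(vis)
            spec[(rates < fr_lo) | (rates > fr_hi)] = 0  # sculpt the response
            return np.fft.ifft(spec)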

  8. A local particle filter for high dimensional geophysical systems

    Penny, S. G.; Miyoshi, T.

    2015-12-01

    A local particle filter (LPF) is introduced that outperforms traditional ensemble Kalman filters in highly nonlinear/non-Gaussian scenarios, both in accuracy and computational cost. The standard Sampling Importance Resampling (SIR) particle filter is augmented with an observation-space localization approach, for which an independent analysis is computed locally at each gridpoint. The deterministic resampling approach of Kitagawa is adapted for application locally and combined with interpolation of the analysis weights to smooth the transition between neighboring points. Gaussian noise is applied with magnitude equal to the local analysis spread to prevent particle degeneracy while maintaining the estimate of the growing dynamical instabilities. The approach is validated against the Local Ensemble Transform Kalman Filter (LETKF) using the 40-variable Lorenz-96 model. The results show that: (1) the accuracy of LPF surpasses LETKF as the forecast length increases (thus increasing the degree of nonlinearity), (2) the cost of LPF is significantly lower than LETKF as the ensemble size increases, and (3) LPF prevents filter divergence experienced by LETKF in cases with non-Gaussian observation error distributions.
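
    A toy sketch of one local analysis step under stated simplifications (multinomial rather than Kitagawa's deterministic resampling, a scalar state per gridpoint, and a Gaussian observation likelihood):

        import numpy as np

        rng = np.random.default_rng(0)

        def local_sir_update(particles, obs, obs_err):
            # Importance weights from a Gaussian observation likelihood.
            w = np.exp(-0.5 * ((particles - obs) / obs_err) ** 2)
            w /= w.sum()
            # Resample (multinomial here for brevity).
            idx = rng.choice(len(particles), size=len(particles), p=w)
            resampled = particles[idx]
            # Gaussian jitter scaled by the local spread to prevent degeneracy.
            jitter = rng.normal(0.0, resampled.std() + 1e-12, size=resampled.shape)
            return resampled + jitter

        particles = rng.normal(0.0, 1.0, size=100)
        analysis = local_sir_update(particles, obs=0.5, obs_err=0.2)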

  9. Concrete ensemble Kalman filters with rigorous catastrophic filter divergence.

    Kelly, David; Majda, Andrew J; Tong, Xin T

    2015-08-25

    The ensemble Kalman filter and ensemble square root filters are data assimilation methods used to combine high-dimensional, nonlinear dynamical models with observed data. Ensemble methods are indispensable tools in science and engineering and have enjoyed great success in geophysical sciences, because they allow for computationally cheap low-ensemble-state approximation for extremely high-dimensional turbulent forecast models. From a theoretical perspective, the dynamical properties of these methods are poorly understood. One of the central mysteries is the numerical phenomenon known as catastrophic filter divergence, whereby ensemble-state estimates explode to machine infinity, despite the true state remaining in a bounded region. In this article we provide a breakthrough insight into the phenomenon, by introducing a simple and natural forecast model that transparently exhibits catastrophic filter divergence under all ensemble methods and a large set of initializations. For this model, catastrophic filter divergence is not an artifact of numerical instability, but rather a true dynamical property of the filter. The divergence is not only validated numerically but also proven rigorously. The model cleanly illustrates mechanisms that give rise to catastrophic divergence and confirms intuitive accounts of the phenomena given in past literature. PMID:26261335

  10. Block implementation of adaptive digital filters

    Block digital filtering involves the calculation of a block or finite set of filter outputs from a block of input values. This paper presents a block adaptive filtering procedure in which the filter coefficients are adjusted once per output block in accordance with a generalized least mean-square (LMS) algorithm. Analyses of convergence properties and computational complexity show that the block adaptive filter permits fast implementations while maintaining performance equivalent to that of the widely used LMS adaptive filter
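
    A textbook sketch of the block LMS idea described above (not the paper's implementation): the gradient is accumulated over a block of L samples and the coefficients are updated once per block:

        import numpy as np

        def block_lms(x, d, L=32, order=16, mu=0.01):
            w = np.zeros(order)
            y = np.zeros(len(x))
            for start in range(order, len(x) - L, L):
                grad = np.zeros(order)
                for n in range(start, start + L):
                    u = x[n - order:n][::-1]    # most recent samples first
                    y[n] = w @ u
                    e = d[n] - y[n]
                    grad += e * u               # accumulate gradient over block
                w += mu * grad / L              # one update per block
            return w, y

        rng = np.random.default_rng(0)
        x = rng.standard_normal(4000)
        d = np.convolve(x, [0.5, -0.3, 0.1], mode="same")  # system to identify
        w, y = block_lms(x, d)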

  11. Traditional Chinese culture in modern Product Design

    Wu, Qing

    2015-01-01

    This paper describes the sources and features of traditional Chinese culture, and discusses its aesthetic thought. Traditional Chinese culture has had a broad and far-reaching impact on design since ancient times. In the context of globalization and the rapid development of science and technology, the difference between traditional design and modern product design should be explored. Traditional Chinese culture cannot be totally absorbed. It is im...

  12. TRADITIONAL FERMENTED FOODS OF LESOTHO

    Tendekayi H. Gadaga

    2013-06-01

    Full Text Available This paper describes the traditional methods of preparing fermented foods and beverages of Lesotho. Information on the preparation methods was obtained through a combination of literature review and face to face interviews with respondents from Roma in Lesotho. An unstructured questionnaire was used to capture information on the processes, raw materials and utensils used. Four products, motoho (a fermented porridge), Sesotho (a sorghum-based alcoholic beverage), hopose (a sorghum fermented beer with added hops) and mafi (spontaneously fermented milk), were found to be the main fermented foods prepared and consumed at household level in Lesotho. Motoho is a thin gruel, popular as a refreshing beverage as well as a weaning food. Sesotho is a sorghum-based alcoholic beverage prepared for household consumption as well as for sale. It is consumed in the actively fermenting state. Mafi is the name given to spontaneously fermented milk with a thick consistency. Little research has been done on the technological aspects, including the microbiological and biochemical characteristics, of fermented foods in Lesotho. Some of the traditional aspects of the preparation methods, such as the use of earthenware pots, are being replaced, and modern equipment including plastic utensils is being used. There is need for further systematic studies on the microbiological and biochemical characteristics of these products.

  13. Traditional and Modern Morphometrics: Review

    Gökhan OCAKOĞLU

    2013-01-01

    Full Text Available Morphometrics, a branch of morphology, is the study of the size and shape components of biological forms and their variation in the population. In biological and medical sciences, there is a long history of attempts to quantitatively express the diversity of the size and shape of biological forms. On the basis of historical developments in morphometry, we address several questions related to the shape of organs or organisms that are considered in biological and medical studies. In the field of morphometrics, multivariate statistical analysis is used to rigorously address such questions. Historically, these methods have involved the analysis of collections of distances or angles, but recent theoretical, computational, and other advances have shifted the focus of morphometric procedures to the Cartesian coordinates of anatomical points. In recent years, in biology and medicine, the traditional morphometric studies that aim to analyze shape variation have been replaced by modern morphometric studies. In the biological and medical sciences, morphometric methods are frequently preferred for examining the morphologic structures of organs or organisms with regard to diseases or environmental factors. These methods are also preferred for evaluating and classifying, in a time-dependent manner, the variation of organs or organisms with respect to growth or allometry. Geometric morphometric methods are more valid than traditional morphometric methods in preserving more morphological information and in permitting analysis of this information.

  14. Elephant resource-use traditions.

    Fishlock, Victoria; Caldwell, Christine; Lee, Phyllis C

    2016-03-01

    African elephants (Loxodonta africana) use unusual and restricted habitats such as swampy clearings, montane outcrops and dry rivers for a variety of social and ecological reasons. Within these habitats, elephants focus on very specific areas for resource exploitation, resulting in deep caves, large forest clearings and sand pits as well as long-established and highly demarcated routes for moving between resources. We review evidence for specific habitat exploitation in elephants and suggest that this represents socially learned cultural behaviour. Although elephants show high fidelity to precise locations over the very long term, these location preferences are explained neither by resource quality nor by accessibility. Acquiring techniques for exploiting specific resource sites requires observing conspecifics and practice and is evidence for social learning. Elephants possess sophisticated cognitive capacities used to track relationships and resources over their long lifespans, and they have an extended period of juvenile dependency as a result of the need to acquire this considerable social and ecological knowledge. Thus, elephant fidelity to particular sites results in traditional behaviour over generations, with the potential to weaken relationships between resource quality and site preferences. Illustrating the evidence for such powerful traditions in a species such as elephants contributes to understanding animal cognition in natural contexts. PMID:26359083

  15. Matched Spectral Filter Imager Project

    National Aeronautics and Space Administration — OPTRA proposes the development of an imaging spectrometer for greenhouse gas and volcanic gas imaging based on matched spectral filtering and compressive imaging....

  16. Filtering Dialysis Myths from Facts

    Myth: The only option for ...

  17. Integrated Spatial Filter Array Project

    National Aeronautics and Space Administration — To address the NASA Earth Science Division need for spatial filter arrays for amplitude and wavefront control, Luminit proposes to develop a novel Integrated...

  18. Improvement of Stopband Performance in Parallel-Coupled Bandpass Filters Using Quasi-Lumped Elements

    Zhurbenko, Vitaliy; Krozer, Viktor; Meincke, Peter

    bandpass filter has a compact footprint, and exhibits good stopband rejection with no repeated passband at twice the center frequency in comparison with the traditional coupled-line filter. By introducing the quasi-lumped element resonator, two transmission zeros at upper and lower stopbands are created......B relative bandwidth of 60 % is implemented with an area of approximately lambda/6 times lambda/4. Measured and simulated results exhibit good agreement....

  19. Experimental results of a single-phase shunt active filter prototype with different switching techniques

    Neves, Pedro; Pinto, J. G.; Pregitzer, Ricardo G.; Luís F. C. Monteiro; João L Afonso; Sepúlveda, João

    2007-01-01

    This paper presents experimental results obtained with a developed single-phase shunt active power filter laboratory prototype operating with different switching techniques. This active filter can compensate harmonic currents and power factor in single-phase electric installations. Its power circuit is based on a two-leg IGBT inverter, with a single capacitor in the dc side, and an inductor in the ac side. Its control system is based on a simple stratagem that enables the use of the tradition...

  20. Factored particle filtering with dependent and constrained partition dynamics for tracking deformable objects

    Eskil, Mustafa Taner

    2014-01-01

    In particle filtering, the dimensionality of the state space can be reduced by tracking control (or feature) points as independent objects, traditionally named partitions. Two critical decisions have to be made in implementing reduced state-space dimensionality. The first is how to construct a dynamic (transition) model for partitions that are inherently dependent. The second is how to filter partition states such that a viable and likely object state is achieved. In t...

  1. MEEPTOOLS: A maximum expected error based FASTQ read filtering and trimming toolkit

    Koparde, Vishal N.; Parikh, Hardik I.; Bradley, Steven P.; Nihar U. Sheth

    2015-01-01

    Next generation sequencing technology rapidly produces massive volume of data and quality control of this sequencing data is essential to any genomic analysis. Here we present MEEPTOOLS, which is a collection of open-source tools based on maximum expected error as a percentage of read length (MEEP score) to filter, trim, truncate and assess next generation DNA sequencing data in FASTQ file format. MEEPTOOLS provides a non-traditional approach towards read filtering/trimming based on maximum e...
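
    The MEEP score itself is simple to compute from a FASTQ quality string; a sketch following the definition in the abstract (Phred+33 encoding assumed):

        def meep_score(quality, offset=33):
            # Maximum expected error as a percentage of read length:
            # 100 * (sum of per-base error probabilities) / read length.
            expected_errors = sum(10 ** (-(ord(c) - offset) / 10.0) for c in quality)
            return 100.0 * expected_errors / len(quality)

        # Example: Q40 throughout gives a very low MEEP; reads above a chosen
        # MEEP cutoff would be filtered or trimmed.
        print(meep_score("IIIIIIIIII"))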

  2. The Value of Rotational Venography Versus Anterior–Posterior Venography in 100 Consecutive IVC Filter Retrievals

    Kiefer, Ryan M., E-mail: rkiefer11@gmail.com; Pandey, Nirnimesh; Trerotola, Scott O.; Nadolski, Gregory J.; Stavropoulos, S. William, E-mail: stav@uphs.upenn.edu [Hospital of University of Pennsylvania Medical Center, Division of Interventional Radiology, Department of Radiology (United States)

    2016-03-15

    Purpose: Accurately detecting inferior vena cava (IVC) filter complications is important for safe and successful retrieval as tip-embedded filters require removal with non-standard techniques. Venography prior to IVC filter retrieval has traditionally used a single anterior–posterior (AP) projection. This study compares the utility of rotational venography to AP venography prior to IVC filter removal. Materials and Methods: The rotational venograms from 100 consecutive IVC filter retrievals over a 35-month period were evaluated retrospectively. The AP view of the rotational venogram was examined separately from the full series by a radiologist blinded to alternative imaging and operative findings. The venograms were evaluated for tip embedding, filter fracture, filter thrombus, and IVC thrombus. Statistical analysis was performed. Results: Using operative findings and peri-procedural imaging as the reference standard, tip embedding occurred in 59 of the 100 filters (59 %). AP venography was used to correctly identify 31 tip-embedded filters (53 % sensitivity) with two false positives (95 % specificity) for an accuracy of 70 %. Rotational venography was used to correctly identify 58 tip-embedded filters (98 % sensitivity) with one false positive (98 % specificity) for an accuracy of 98 %. A significant difference was found in the sensitivities of the two diagnostic approaches (P < .01). Other findings of thrombus and filter fracture were not significantly different between the two groups. Conclusion: Rotational venograms allow for more accurate detection of tip-embedded IVC filters compared to AP views alone. As this determines the approach taken, rotational venograms are helpful if obtained prior to IVC filter retrieval.

  4. Stochastic processes and filtering theory

    Jazwinski, Andrew H

    2007-01-01

    This unified treatment of linear and nonlinear filtering theory presents material previously available only in journals, and in terms accessible to engineering students. Its sole prerequisites are advanced calculus, the theory of ordinary differential equations, and matrix analysis. Although theory is emphasized, the text discusses numerous practical applications as well.Taking the state-space approach to filtering, this text models dynamical systems by finite-dimensional Markov processes, outputs of stochastic difference, and differential equations. Starting with background material on probab

  5. A taxonomy fuzzy filtering approach

    Vrettos S.

    2003-01-01

    Full Text Available Our work proposes the use of topic taxonomies as part of a filtering language. Given a taxonomy, a classifier is trained for each one of its topics. The user is able to formulate logical rules combining the available topics, e.g. (Topic1 AND Topic2) OR Topic3, in order to filter related documents in a stream. Using the trained classifiers, every document in the stream is assigned a belief value of belonging to the topics of the filter. These belief values are then aggregated using logical operators to yield the belief in the filter. In our study, Support Vector Machines and Naïve Bayes classifiers were used to provide topic probabilities. Aggregation of topic probabilities based on fuzzy logic operators was found to improve filtering performance on the Reuters text corpus, as compared to the use of their Boolean counterparts. Finally, we deployed a filtering system on the web using a sample taxonomy of the Open Directory Project.
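
    A minimal sketch of the aggregation step using Zadeh's min/max fuzzy operators (one common choice; the paper may use other t-norms), for the example filter given above with invented belief values:

        def fuzzy_and(a, b):   # Zadeh t-norm
            return min(a, b)

        def fuzzy_or(a, b):    # Zadeh t-conorm
            return max(a, b)

        # Belief in "(Topic1 AND Topic2) OR Topic3" for one document, given
        # per-topic classifier probabilities (illustrative numbers only):
        belief = {"t1": 0.8, "t2": 0.6, "t3": 0.3}
        doc_belief = fuzzy_or(fuzzy_and(belief["t1"], belief["t2"]), belief["t3"])
        print(doc_belief)   # 0.6: the document would pass a 0.5 threshold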

  6. Polarising filter tests at ISIS

    For the efficient use of polarised neutrons on a pulsed source such as ISIS, a neutron polarizer capable of polarising over a broad band of neutron energies is required, so that time of flight techniques can be used. For this reason monochromating polarizers such as Heusler crystals, which are used at reactor sources, are not suitable. Furthermore, although supermirrors do provide white beam polarisation, they are inefficient at energies greater than ~30 meV. In principle neutron polarising filters, based on preferential absorption of one of the two neutron spin states by aligned Sm149 nuclei, give a good performance at all neutron energies below 200 meV. However, in order for adequate nuclear alignment to be obtained, the filter must be maintained at ~0.02 K in a dilution refrigerator. Previous measurements have shown that heating of the filter is too great for efficient operation of the filter in a white beam on ISIS, given the limited cooling power of the currently available dilution refrigerator. The primary purpose of the measurements presented here was to determine whether the filter is suitable for polarising an incident beam which is monochromated by a Fermi chopper. It was anticipated that the large reduction of neutron intensity by the chopper would largely eliminate any beam heating problems. In section 2 we outline the principle of neutron polarisation by Sm filters, in section 3 we describe the measurements and in section 4 we discuss the interpretation of these measurements and their implications for future progress. (author)

  7. Traditional Procurement is too Slow

    Ann Kong

    2012-11-01

    Full Text Available This paper reports on an exploratory interview survey of construction project participants aimed at identifying the reasons for the decrease in use of the traditional, lump-sum, procurement system in Malaysia. The results show that most people believe it is too slow. This appears to be in part due to the contiguous nature of the various phases and stages of the process and especially the separation of the design and construction phases. The delays caused by disputes between the various parties are also seen as a contributory factor - the most prominent cause being the frequency of variations, with design and scope changes being a particular source of discontent. It is concluded that scaling up the time-related reward/penalty system as a whole may be the most appropriate measure for future practice.

  8. The evolution of traditional knowledge:

    Saslis Lagoudakis, C Haris; Hawkins, Julie A; Greenhill, Simon J; Pendry, Colin A; Watson, Mark F; Tuladhar-Douglas, Will; Baral, Sushim R; Savolainen, Vincent

    2014-01-01

    environments and able to exchange knowledge readily. Medicinal plant use is one of the most important components of traditional knowledge, since plants provide healthcare for up to 80% of the world's population. Here, we assess the significance of ancestry, geographical proximity of cultures and the...... environment in determining medicinal plant use for 12 ethnic groups in Nepal. Incorporating phylogenetic information to account for plant evolutionary relatedness, we calculate pairwise distances that describe differences in the ethnic groups' medicinal floras and floristic environments. We also determine...... linguistic relatedness and geographical separation for all pairs of ethnic groups. We show that medicinal uses are most similar when cultures are found in similar floristic environments. The correlation between medicinal flora and floristic environment was positive and strongly significant, in contrast to...

  9. Experience with three percutaneous vena cava filters

    Twenty-one Kimray-Greenfield, 33 bird's nest, and 19 Amplatz vena cava filters were placed percutaneously. The Kimray-Greenfield filter was the most difficult to insert. The major problem was the insertion site, which required venipuncture with a 24-F catheter. Minor hemorrhage was frequent, and femoral vein thrombosis occurred in four patients. No migration, caval thrombosis, or pulmonary emboli were seen after Kimray-Greenfield filter placement. The bird's nest filter was relatively easy to insert, although in two cases the filter prongs could not be adequately seated in the wall of the inferior vena cava. Three patients with bird's nest filters had thrombosis below the filter, and three filters migrated to the heart. One migrated filter could not be removed. One patient had multiple small pulmonary emboli at autopsy. No other pulmonary emboli after filter placement were noted. The Amplatz filter was the easiest of the three filters to insert. Only one patient with an Amplatz filter had thrombosis of the vena cava below the filter. No filter migrations were documented, and no recurrent pulmonary emboli were found on clinical or radiologic follow-up. The Amplatz vena cava filter is easier to place than percutaneous Kimray-Greenfield or bird's nest filters, has a low complication rate, and has proven to be clinically effective in preventing pulmonary emboli

  10. [Cataplasma of traditional Chinese medicine].

    Jia, Wei; Gao, Wen-yuan; Wang, Tao; Liu, Yun-bin; Xue, Jing; Xiao, Pei-gen

    2003-01-01

    The TCM (traditional Chinese medicine) transdermal plasters (also known as "cataplasma") are flexible adhesive patches used for the treatment of pain resulting from arthritis, sprains and bruises, tendovaginitis, lumbar spine protrusion, neuralgia, hyperosteogeny ache, abdominal discomfort and metastatic cancer, etc. Since the 1980s, investigators in China have used this modern patch delivery system for herbal drugs and obtained satisfactory results, especially in the treatment of various types of pain associated with bone diseases, abdominal discomfort, and tumors. The production of TCM cataplasma was successfully scaled up in the early 1990s and the commercial product line for an antirheumatic agent was first established in Shanghai by Leiyunshang Group. Thus far, a number of products in the form of TCM cataplasma have become commercially available in the market, and clinical investigations with these products indicated that topically applicable herbal preparations, especially in the form of cataplasma, are preferred formulations with respect to the treatment comfort of the patient. Compared to the traditional preparations which utilize rubber and rosin as adhesives, cataplasma is advantageous in that the lipophilic and hydrophilic ingredients of the herbal extracts are solubilized and then "gellified" with the organic polymers, and that the drug matrix containing up to 40%-70% of water serves as a "drug reservoir" that will sustain the quick and continuous release of herbal ingredients over several days across the skin. While there are conventional remedies for palliation of pain and discomfort associated with bone diseases or cancers, administration of oral medicinal herbs combined with topical agents such as TCM cataplasma may significantly alleviate the symptoms and improve patients' quality of life. This article provides a review of three aspects: the process development, characteristics and developmental status of TCM cataplasma, and the future development of such a technology. PMID:15015257

  11. Development of Test Protocols for International Space Station Particulate Filters

    Green, Robert D.; Vijayakumar, R.; Agui, Juan H.

    2014-01-01

    Air quality control on the International Space Station (ISS) is a vital requirement for maintaining a clean environment for the crew and the hardware. This becomes a serious challenge in pressurized space compartments since no outside air ventilation is possible, and a larger particulate load is imposed on the filtration system due to lack of gravitational settling. The ISS Environmental Control and Life Support System (ECLSS) uses a filtration system that has been in use for over 14 years and has proven to meet this challenge. The heart of this system is a traditional High-Efficiency Particulate Air (HEPA) filter configured to interface with the rest of the life support elements and provide effective cabin filtration. Over the years, the service life of these filters has been re-evaluated based on limited post-flight tests of returned filters and risk factors. On Earth, a well-designed and installed HEPA filter will last for several years, e.g. in industrial and research clean room applications. Test methods for evaluating these filters are being developed on the basis of established test protocols used by the industry and the military. This paper will discuss the test methods adopted and test results on prototypes of the ISS filters. The results will assist in establishing whether the service life can be extended for these filters. Results from unused filters that have been in storage will also be presented to ascertain the shelf life and performance deterioration, if any, and to determine if the shelf life may be extended. Presently, the inventory of ISS bacterial filters for the ISS Air Revitalization System is nearing the end of its specified shelf life, and a means of testing to confirm whether the shelf life can be extended is of interest to the ISS Program. A filter test setup was designed and built to meet industry testing standards. A CFD analysis was performed to initially determine the optimal duct geometry and flow configuration. Both a screen and a flow straightener were added to the test duct design to improve flow uniformity, and face velocity profiles were subsequently measured to confirm this. Flow quality and aerosol mixing assessments show that the duct flow is satisfactory for the intended leak testing. Preliminary leak testing was performed on two different ISS filters, one with known perforations and one with limited use, and results confirmed that the testing methods and photometer instrument are sensitive enough to detect and locate compromised sections of an ISS BFE. This work is focused on developing test protocols for testing the ISS BFE filters, but the methodology is general enough to be extended to other present and future spacecraft filters. These techniques for characterizing the test duct and performing leak testing can be applied to conducting acceptance testing and inventory testing for future manned exploration programs with air revitalization filtration needs, possibly even for in-situ filter element integrity testing for extensively long-duration missions. We plan to address the unique needs for test protocols for crewed spacecraft particulate filters by preparing the initial version of a standard, to be documented as a NASA Technical Memorandum (TM), that can potentially be submitted to IEST and ASHRAE for consideration as a new standard for spacecraft applications.

  12. Position USBL/DVL Sensor-based Navigation Filter in the presence of Unknown Ocean Currents

    Morgado, M; Oliveira, P; Silvestre, C

    2010-01-01

    This paper presents a novel approach to the design of globally asymptotically stable (GAS) position filters for Autonomous Underwater Vehicles (AUVs) based directly on the nonlinear sensor readings of an Ultra-short Baseline (USBL) and a Doppler Velocity Log (DVL). Central to the proposed solution is the derivation of a linear time-varying (LTV) system that fully captures the dynamics of the nonlinear system, allowing for the use of powerful linear system analysis and filtering design tools that yield GAS filter error dynamics. Simulation results reveal that the proposed filter is able to achieve the same level of performance as more traditional solutions, such as the Extended Kalman Filter (EKF), while providing, at the same time, GAS guarantees, which are absent for the EKF.

  13. A planar and tunable bandpass filter on a ferrite substrate with integrated windings

    Arabi, Eyad

    2015-05-01

    Tunable filters that are based on ferrite materials are often biased by external magnets or coils, which are large and bulky. In this work a completely planar, CPW-based bandpass filter is presented with integrated windings. Due to these windings the size of the filter is only 26 mm × 34 mm × 0.38 mm, which is orders of magnitude smaller than the traditional designs with external windings. The filter is realized by electroplating copper over seed layers of titanium and gold on a YIG substrate. The fabricated filter achieves a tunability of 3.4% without any external magnets or coils. A good insertion loss of 2.3 dB and rejection greater than 50 dB have been obtained. To the best of the authors' knowledge, this design is the first ferrite-based design that is completely planar and self-biased.

  14. A robust strong tracking cubature Kalman filter for spacecraft attitude estimation with quaternion constraint

    Huang, Wei; Xie, Hongsheng; Shen, Chen; Li, Jinpeng

    2016-04-01

    This paper considers a robust strong tracking nonlinear filtering problem for spacecraft attitude estimation with a quaternion constraint, in the presence of model uncertainties including model mismatch, unknown disturbances and abrupt state changes. Two multiple fading factor matrices are employed to regulate the prediction error covariance matrix, which guarantees its symmetry. The spherical-radial cubature rule is developed to deal with the multi-dimensional integrals. The quaternion constraint is maintained by utilizing the gain correction method. A robust strong tracking cubature Kalman filter (RSTCKF) is thereby formed for spacecraft attitude estimation with a quaternion constraint. Unlike the single fading factor adopted in the traditional strong tracking filter, the presented filter uses two multiple fading factor matrices so that different channels have their own filter adjustment capability, which improves the tracking performance of the algorithm. Simulation results show the effectiveness of the proposed RSTCKF.
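
    For reference, the spherical-radial cubature rule mentioned above propagates 2n equally weighted points; a standard sketch (not the paper's full RSTCKF):

        import numpy as np

        def cubature_points(mean, cov):
            # 2n equally weighted points at +/- sqrt(n) along the columns of a
            # square root of the covariance; weights are all 1/(2n).
            n = len(mean)
            S = np.linalg.cholesky(cov)
            xi = np.sqrt(n) * np.hstack([np.eye(n), -np.eye(n)])  # unit directions
            return mean[:, None] + S @ xi    # shape (n, 2n)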

  15. LC Filter Design for Wide Band Gap Device Based Adjustable Speed Drives

    Vadstrup, Casper; Wang, Xiongfei; Blaabjerg, Frede

    and must therefore be reduced. This may be accomplished by a sine LC filter with DC link feedback, which reduces both the differential mode and common mode dV/dt. Wide band gap devices are capable of switching at frequencies much higher than traditional Si-based devices. This makes it possible to design the LC filter with a higher cut-off frequency and without damping resistors. The selection of inductance and capacitance is based on capacitor voltage ripple and current ripple. The filter adds a base load to the inverter, which increases the inverter losses. It is shown how the modulation index affects the capacitor and the inverter current.
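
    The basic corner-frequency relation behind the L and C selection is standard; a one-line sketch with invented component values:

        import math

        def lc_cutoff_hz(L_henry, C_farad):
            # Undamped corner frequency of a second-order LC low-pass filter.
            return 1.0 / (2.0 * math.pi * math.sqrt(L_henry * C_farad))

        # Illustrative values only (not from the paper): 1 mH and 10 uF give
        # roughly 1.6 kHz, placed well below an assumed WBG switching frequency.
        print(lc_cutoff_hz(1e-3, 10e-6))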

  16. Photonic Color Filters Integrated with Organic Solar Cells for Energy Harvesting

    Park, Hui Joon

    2011-09-27

    Color filters are indispensable in most color display applications. In most cases, they are chemical pigment-based filters, which produce a particular color by absorbing its complementary color, and the absorbed energy is totally wasted. If the absorbed and wasted energy can be utilized, e.g., to generate electricity, innovative energy-efficient electronic media could be envisioned. Here we show photonic nanostructures incorporated with photovoltaics that produce desirable colors in the visible band while utilizing the absorbed light to simultaneously generate electrical power. In contrast to the traditional colorant-based filters, these devices offer great advantages for electro-optic applications. © 2011 American Chemical Society.

  17. Traditional perception of Greeks in Serbian oral tradition

    Konjik Ivana

    2006-01-01

    Full Text Available Based on material on Greeks from Vuk’s corpus of epic poems, we discuss the construction of the ethnic stereotype of Greeks in the Serbian language. However, the nature of the corpus limits the paper’s possible conclusions: Vuk deliberately chose some material over other, so the corpus relating to Greeks cannot be considered representative of Serbian folk poems as a whole, and the discussion is therefore limited to certain elements of the stereotype. Nevertheless, these Serbian epic folk poems contain many layers: historical, geographical, sociological, mythological and so on, with a strong foundation in traditional culture; thus, they provide an insight into the geo-political situation of the time period and into the viewpoints, perspectives and experiences of other ethnic groups that Serbs have been in contact with. In particular, the relationship toward Greeks was marked by a pronounced patriarchal attitude concerning others: we-others, ours-foreign, good-bad. In this sense, Greeks are portrayed as foreign and, as such, as a potential source of danger. On the other hand, Greeks are Christian Orthodox, which associates them with the category ours. In the socio-economic sense, they were traders and wealthy, respected gentlemen. In the epic-heroic profile, they were not considered great heroes, but a "lousy army", and frequently unfaithful.

  18. Sensory pollution from bag-type fiberglass ventilation filters: Conventional filter compared with filters containing various amounts of activated carbon

    Bekö, Gabriel; Fadeyi, M.O.; Clausen, Geo; Weschler, Charles J.

    2009-01-01

    As ventilation filters accumulate particles removed from the airstream, they become emitters of sensory pollutants that degrade indoor air quality. Previously we demonstrated that an F7 bag-type filter that incorporates activated carbon (a "combination filter") reduces this adverse effect compared...... to an equivalent filter without carbon. The aim of the present study was to examine how the amount of activated carbon (AC) used in combination filters affects their ability to remove both sensory offending pollutants and ozone. A panel evaluated the air downstream of four different filters after...... conventional F7 fiberglass filter and three modifications of a bag-type fiberglass combination filter: the "Heavy" corresponded to a commercially available filter containing 400 g of carbon per square meter of filter area, the "Medium" contained half as much carbon (200 g/m(2)), and the "Light" contained a...

  19. Io's Sodium Cloud (Clear Filter)

    1997-01-01

    This image of Jupiter's moon Io and its surrounding sky is shown in false color. It was taken at 5 hours 30 minutes Universal Time on Nov. 9, 1996 by the solid state imaging (CCD) system aboard NASA's Galileo spacecraft, using a clear filter whose wavelength range was approximately 400 to 1100 nanometers. This picture differs in two main ways from the green-yellow filter image of the same scene which was released yesterday.First, the sky around Io is brighter, partly because the wider wavelength range of the clear filter lets in more scattered light from Io's illuminated crescent and from Prometheus' sunlit plume. Nonetheless, the overall sky brightness in this frame is comparable to that seen through the green-yellow filter, indicating that even here much of the diffuse sky emission is coming from the wavelength range of the green-yellow filter (i.e., from Io's Sodium Cloud).The second major difference is that a quite large roundish spot has appeared in Io's southern hemisphere. This spot -- which has been colored red -- corresponds to thermal emission from the volcano Pele. The green-yellow filter image bears a much smaller trace of this emission because the clear filter is far more sensitive to those relatively long wavelengths where thermal emission is strongest.The Jet Propulsion Laboratory, Pasadena, CA manages the mission for NASA's Office of Space Science, Washington, DC.This image and other images and data received from Galileo are posted on the World Wide Web, on the Galileo mission home page at URL http://galileo.jpl.nasa.gov.

  20. Filter-adsorber aging assessment

    An aging assessment of high-efficiency particulate (HEPA) air filters and activated carbon gas adsorption units was performed by the Pacific Northwest Laboratory as part of the U.S. Nuclear Regulatory Commission's (USNRC) Nuclear Plant Aging Research (NPAR) Program. This evaluation of the general process in which characteristics of these two components gradually change with time or use included the compilation of information concerning failure experience, stressors, aging mechanisms and effects, and inspection, surveillance, and monitoring methods (ISMM). Stressors, the agents or stimuli that can produce aging degradation, include heat, radiation, volatile contaminants, and even normal concentrations of aerosol particles and gases. In an experimental evaluation of degradation in terms of the tensile breaking strength of aged filter media specimens, over forty percent of the samples did not meet specifications for new material. Chemical and physical reactions can gradually embrittle sealants and gaskets as well as filter media. Mechanisms that can lead to impaired adsorber performance are associated with the loss of potentially available active sites as a result of the exposure of the carbon to airborne moisture or volatile organic compounds. Inspection, surveillance, and monitoring methods have been established to observe filter pressure drop buildup, check HEPA filters and adsorbers for bypass, and determine the retention effectiveness of aged carbon. These evaluations of installed filters do not reveal degradation in terms of reduced media strength, but they show that under normal conditions aged media can continue to retain particles effectively. However, this degradation may be important when considering the likelihood of moisture, steam, and higher particle loadings during severe accidents, and the probability that the filters have been in use for an extended period.

  1. Pixelated filters for spatial imaging

    Mathieu, Karine; Lequime, Michel; Lumeau, Julien; Abel-Tiberini, Laetitia; Savin De Larclause, Isabelle; Berthon, Jacques

    2015-10-01

    Small satellites are often used by space agencies to meet scientific space mission requirements. Their payloads are composed of various instruments collecting an increasing amount of data while respecting growing constraints on volume and mass, so small integrated cameras have taken a favored place among these instruments. To ensure scene-specific color information sensing, pixelated filters seem to be more attractive than filter wheels. The work presented here, in collaboration with Institut Fresnel, deals with the manufacturing of this kind of component, based on thin-film technologies and photolithography processes. CCD detectors with a pixel pitch of about 30 μm were considered. In the configuration where the matrix filters are positioned closest to the detector, the matrix filters are composed of 2x2 macro pixels (i.e., 4 filters). These 4 filters have a bandwidth of about 40 nm and are respectively centered at 550, 700, 770 and 840 nm with a specific rejection rate defined on the visible spectral range [500 - 900 nm]. After an intense design step, 4 thin-film structures were elaborated with a maximum thickness of 5 μm. A run of tests allowed us to choose the optimal micro-structuration parameters. The 100x100 matrix filter prototypes were successfully manufactured with lift-off and ion-assisted deposition processes. High spatial and spectral characterization, with a dedicated metrology bench, showed that the initial specifications and simulations were globally met. These excellent performances knock down the technological barriers for high-end integrated specific multi-spectral imaging.

  2. Dry Filter Method for Filtered Venting of the Containment

    In the course of severe accident scenarios pressure builds up in the containment. In these cases, reliable pressure release of the containment will be required to maintain containment integrity. To effectively perform such a containment depressurization while minimizing radioactive release to the environment, a Filtered Containment Venting System (FCVS) is needed. The Dry Filter Method (DFM) is a FCVS with very high Decontamination Factors (DF) for radioactive aerosols and elemental and organic iodine. It was designed for the back fit of German nuclear power stations after the Chernobyl accident in 1986. This paper describes a passive, flexible and modular dry filter system which can cope with a number of hypothetical scenarios and can be very flexibly adapted to plant-specific requirements

  3. Digital notch filter based active damping for LCL filters

    Yao, Wenli; Yang, Yongheng; Zhang, Xiaobin; Blaabjerg, Frede

    In contrast, active damping does not require any dissipation elements, and has thus become of increasing interest. As a result, a host of active damping solutions have been reported, among which multi-loop control systems and additional sensors are necessary, leading to increased cost and complexity. In this paper, a notch filter based active damping without the requirement of additional sensors is proposed, where the inverter current is employed as the feedback variable. Firstly, a design method of the notch filter for active damping is presented. The entire system stability has then been analyzed in the z-domain. Simulations and experiments are carried out to verify the proposed active damping method. Both results have confirmed that the notch filter based active damping can ensure the entire system stability in the case of resonances with a good system performance.
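
    A hedged sketch of the central ingredient, a discrete-time notch centered on the LCL resonance. The frequencies and Q below are assumed placeholders; the paper's design method is not reproduced here:

        from scipy import signal

        fs = 10_000          # control/sampling frequency in Hz (assumed)
        f_res = 1_500        # LCL resonance frequency in Hz (assumed)
        b, a = signal.iirnotch(w0=f_res, Q=2.0, fs=fs)
        # In a controller, the inverter-current feedback would be passed
        # through (b, a) before the current regulator, attenuating the
        # resonant component without any damping resistor.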

  4. Nonlinear filtering in oil/gas reservoir simulation: filter design

    Arnold, E.M.; Voss, D.A.; Mayer, D.W.

    1980-10-01

    In order to provide an additional mode of utility to the USGS reservoir model VARGOW, a nonlinear filter was designed and incorporated into the system. As a result, optimal (in the least squares sense) estimates of reservoir pressure, liquid mass, and gas cap plus free gas mass are obtained from an input of reservoir initial condition estimates and pressure history. These optimal estimates are provided continuously for each time after the initial time, and the input pressure history is allowed to be corrupted by measurement error. Preliminary testing of the VARGOW filter was begun and the results show promise. Synthetic data which could be readily manipulated during testing was used in tracking tests. The results were positive when the initial estimates of the reservoir initial conditions were reasonably close. Further testing is necessary to investigate the filter performance with real reservoir data.

  5. Water washable stainless steel HEPA filter

    Phillips, Terrance D.

    2001-01-01

    The invention is a high efficiency particulate (HEPA) filter apparatus and system, and method for assaying particulates. The HEPA filter provides for capture of 99.99% or greater of particulates from a gas stream, with collection of particulates on the surface of the filter media. The invention provides a filter system that can be cleaned and regenerated in situ.

  6. Unscented Kalman Filtering for Articulated Human Tracking

    Boesen Lindbo Larsen, Anders; Hauberg, Søren; Pedersen, Kim Steenstrup

    2011-01-01

    While state-of-the-art trackers utilize particle filters, our unimodal likelihood model allows us to use an unscented Kalman filter. This robust and efficient filter allows us to improve the quality of the tracker while using substantially fewer likelihood evaluations. The system is compared to one based on a particle filter...
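
    For reference, the unscented transform underlying the tracker propagates 2n+1 sigma points; a standard van der Merwe-style sketch (parameters are the usual defaults, not the paper's):

        import numpy as np

        def sigma_points(mean, cov, alpha=1e-3, beta=2.0, kappa=0.0):
            n = len(mean)
            lam = alpha**2 * (n + kappa) - n
            S = np.linalg.cholesky((n + lam) * cov)
            pts = np.vstack([mean, mean + S.T, mean - S.T])   # 2n + 1 points
            wm = np.full(2 * n + 1, 0.5 / (n + lam))          # mean weights
            wc = wm.copy()                                    # covariance weights
            wm[0] = lam / (n + lam)
            wc[0] = wm[0] + (1 - alpha**2 + beta)
            return pts, wm, wc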

  7. Analysis and Simulation of Reduced FIR Filters

    Lj. Radic

    2005-01-01

    Full Text Available High order FIR filters employ model reduction techniques, in order to decrease power consumption and time delay. During reduction high order FIR filters are converted into low order IIR filters preserving stability and phase linearity as main features. Matlab simulations of an audio system with these reduced filters are presented. Furthermore, the influence of order on power consumption is discussed.
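
    To see the motivation numerically, one can compare the order of a direct FIR design against a low-order IIR meeting the same spec; a sketch with an invented spec (an elliptic design stands in for the paper's reduction technique):

        from scipy import signal

        fs = 48_000
        # FIR: Kaiser-window design for 60 dB attenuation, 200 Hz transition.
        numtaps, beta = signal.kaiserord(ripple=60, width=200 / (fs / 2))
        numtaps |= 1   # force an odd length (Type I linear phase)
        fir = signal.firwin(numtaps, 4_000, window=("kaiser", beta), fs=fs)

        # IIR: elliptic filter meeting a comparable magnitude spec.
        iir_order, wn = signal.ellipord(4_000, 4_200, gpass=1, gstop=60, fs=fs)
        b, a = signal.ellip(iir_order, 1, 60, wn, fs=fs)
        print(f"FIR taps: {numtaps}, IIR order: {iir_order}")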

  8. Fungal decay of traditional fishing craft

    Gupta, R.

    ...their role in the biodeterioration process, traditional and modern preventive methods, and their economics, with suggestions for future work. This study becomes important in view of the fishing industry turning towards traditional fishing craft for low energy...

  9. Constrained digital matched filter method for optimum filter synthesis

    We present a new method to directly calculate the optimum filter in the presence of any additive stationary noise, with arbitrary time and domain constraints (flat-top, zero-area, etc.). A more concise re-derivation of the digital penalized LMS (DPLMS) method is given. The method is fully developed, and synthesis results for a typical situation are given and compared with the DPLMS method. The optimum filter can be synthesized without prior knowledge of the noise power spectral density, which makes it suitable for use in adaptive, self-calibrating digital spectroscopy.
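
    A minimal sketch of the constrained-synthesis idea (the noise model and constraints below are assumptions, not the DPLMS recipe): minimize the output noise power w'Rw subject to linear constraints C'w = d, solved in closed form with Lagrange multipliers.

```python
import numpy as np

def optimum_filter(R, C, d):
    """argmin w'Rw s.t. C'w = d  ->  w = R^-1 C (C' R^-1 C)^-1 d."""
    RiC = np.linalg.solve(R, C)
    return RiC @ np.linalg.solve(C.T @ RiC, d)

N = 64
i, j = np.indices((N, N))
R = np.eye(N) + 0.5 * 0.95 ** np.abs(i - j)           # assumed noise covariance
s = np.exp(-0.5 * ((np.arange(N) - N / 2) / 6) ** 2)  # reference pulse shape
C = np.column_stack([s, np.ones(N)])                  # unit gain + zero area
w = optimum_filter(R, C, np.array([1.0, 0.0]))
print(w @ s, w.sum())   # -> 1.0 (pulse gain) and ~0.0 (zero area)
```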

  10. Satellite image restoration filter comparison

    Arbel, Dan; Kopeika, Norman S.

    2002-11-01

    The quality of satellite images propagating through the atmosphere is affected by phenomena such as scattering and absorption of light, and by turbulence, which degrade the image by blurring it and reducing its contrast. Here, a new approach to digital restoration of Landsat thematic mapper (TM) imagery is presented, implementing several filters as atmospheric filters which correct for turbulence blur, aerosol blur, and path radiance simultaneously. The aerosol modulation transfer function (MTF) is consistent with optical depth, and the turbulence MTF is calculated from meteorological data; the product of the two yields the atmospheric MTF, which is implemented in the atmospheric filters. Restoration improves both the smallness of resolvable detail and contrast, and restorations are quite apparent even under clear weather conditions. Restoration results of the Kalman filter and the atmospheric Wiener filter are presented, along with restoration results based on wavelets and multifractals. A way to determine which restoration result is best, and how good the restored image is, is presented based on visual comparison and several mathematical criteria.
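
    The core of the approach, a Wiener filter built from the product of turbulence and aerosol MTFs, can be sketched in a few lines. The MTF constants below stand in for values that the paper derives from optical depth and meteorological data.

```python
import numpy as np

n = 256
fy, fx = np.meshgrid(np.fft.fftfreq(n), np.fft.fftfreq(n), indexing="ij")
f = np.hypot(fx, fy)                       # spatial frequency magnitude
H = np.exp(-57.0 * f ** (5 / 3)) * np.exp(-8.0 * f)   # turbulence x aerosol MTF

scene = np.zeros((n, n)); scene[96:160, 96:160] = 1.0  # synthetic target
blurred = np.real(np.fft.ifft2(np.fft.fft2(scene) * H))
blurred += 0.01 * np.random.default_rng(0).normal(size=(n, n))

W = H / (H ** 2 + 1e-3)                    # Wiener filter (H real, >= 0 here)
restored = np.real(np.fft.ifft2(np.fft.fft2(blurred) * W))
```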

  11. Properties of Ceramic Candle Filters

    Coal-fired Pressurized Fluidized Bed Combustion (PFBC) and Integrated Gasification Combined Cycle (IGCC) systems require ceramic candle filter elements which can withstand the mechanical, thermal, and chemical environment of hot gas cleanup applications. These systems demand filter elements to sustain the thermal stresses of normal operations (pulse cleaning), of start-up and shut-down conditions, and of unanticipated process upsets such as excessive ash accumulation without catastrophic failure. The filter elements must also survive the mechanical loads associated with handling and assembly, normal operation, and process upsets. Objectives of the test program at Southern Research are as follows: (1) Provide material characterization to develop an understanding of the physical, mechanical, and thermal behavior of hot gas filter materials. (2) Develop a material property data base from which the behavior of materials in the hot gas cleanup environment may be predicted. (3) Perform testing and analysis of filter elements after exposure to actual operating conditions to determine the effects of the thermal and chemical environments in hot gas filtration on material properties. (4) Explore the glass-like nature of the matrix material

  12. Iodine filters in nuclear installations

    The present report discusses the significance for environmental exposure of the iodine released with the gaseous effluents of nuclear power stations and reprocessing plants in relation to releases of other airborne radionuclides. Iodine filtration processes are described. The release pathways and the composition of airborne fission product iodine mixtures and their bearing on environmental exposure are discussed on the basis of measured fission product iodine emissions. The sorbents which can be used for iodine filtration, their removal efficiencies and range of applications are dealt with in detail. The particular conditions governing iodine removal, which are determined by the various gaseous iodine species, are illustrated on the basis of experimentally determined retention profiles. Particular attention is given to the limitations imposed by temperature, humidity, radiation and filter poisoning. The types of filter normally used are described, their advantages and drawbacks discussed, the principles underlying their design are outlined and the sources of error indicated. The methods normally applied to test the efficiency of various iodine sorbents are described and assessed. Operating experience with iodine filters, gathered from surveillance periods of many years, is supplemented by a large number of test results and the findings of extensive experiments. Possible ways of prolonging the permissible service lives of iodine filters are discussed and information is given on protective measures. The various iodine removal processes applied in reprocessing plants are described and compared with reference to efficiency and cost. The latest developments in filter technology in reprocessing plants are briefly outlined

  13. A biological oil adsorption filter

    Pasila, A. [University of Helsinki (Finland). Dept. of Agricultural Engineering and Household Technology

    2005-12-01

    A new oil adsorption method called adsorption filtration (AF) has been developed. It is a technology whereby oil residues can be cleaned from water by running it through a simple filter made from freeze-treated, dried, milled and then fragmented plant material. By choosing suitable plants and fragmentation sizes it is possible to produce filters which pass water but adsorb oil. The aim of this study was to investigate the possibilities of manufacturing oil-adsorbing filter materials from reed canary grass (Phalaris arundinacea), flax (Linum usitatissimum L.) or hemp fibre (Cannabis sativa L.). The oil (80 ml) was mixed with de-ionised water (200 ml) and this mixture was filtered through 10 or 20 g adsorption filters. Fine spring-harvested hemp fibre (diameter less than 1 mm) and reed canary grass fragments adsorb 2-4 g of oil per gram of adsorption material, compared to 1-3 g of water. Adsorption filtration is thus a novel way of gathering spilled oil in shallow coastal waters before the oil reaches the shore. (author)

  14. Risk Sensitive Filtering with Poisson Process Observations

    In this paper we consider risk sensitive filtering for Poisson process observations. Risk sensitive filtering is a type of robust filtering which offers performance benefits in the presence of uncertainties. We derive a risk sensitive filter for a stochastic system where the signal variable has dynamics described by a diffusion equation and determines the rate function for an observation process. The filtering equations are stochastic integral equations. Computer simulations are presented to demonstrate the performance gain for the risk sensitive filter compared with the risk neutral filter
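
    For orientation, one common form of the risk-sensitive criterion is the exponential-of-integral cost below (an assumed generic form; the paper's exact functional for Poisson observations may differ), which recovers the risk-neutral MMSE filter as the risk parameter tends to zero.

```latex
% Risk-sensitive cost: theta > 0 penalizes large errors more than the
% quadratic (risk-neutral) cost; theta -> 0 recovers the MMSE filter.
J_\theta(\hat{x}) = \mathbb{E}\left[\exp\left(\frac{\theta}{2}\int_0^T
    (x_t-\hat{x}_t)^{\top} Q\,(x_t-\hat{x}_t)\,\mathrm{d}t\right)\right]
```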

  15. A Study of Speckle Noise Reduction Filters

    Jyoti Jaybhay

    2015-06-01

    Ultrasound images and SAR (synthetic aperture radar) images are usually corrupted by speckle noise, also called granular noise. Removing such noise and analyzing the corrupted images is a tedious task. To date, many researchers have worked on removing speckle noise using frequency domain methods, temporal methods, and adaptive methods. Different filters have been developed, such as the mean and median filters, the Lee and Kuan statistical filters, the Frost filter, and the SRAD filter. This paper reviews filters used to remove speckle noise.
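
    Of the filters surveyed, the Lee filter is the easiest to state: a local linear minimum-mean-square-error estimator driven by local statistics. A minimal sketch (window size and noise variance are assumptions):

```python
import numpy as np
from scipy.ndimage import uniform_filter

def lee_filter(img, win=7, noise_var=0.05):
    mean = uniform_filter(img, win)                   # local mean
    var = uniform_filter(img * img, win) - mean ** 2  # local variance
    gain = var / (var + noise_var)                    # ~0 flat, ~1 at edges
    return mean + gain * (img - mean)
```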

  16. Progress towards the use of disposable filters

    Thermally degradable materials have been evaluated for service in HEPA filter units used to filter gases from active plants. The motivation was to reduce the bulk storage problems of contaminated filters by thermal decomposition to gaseous products and a solid residue substantially comprised of the filtered particulates. It is shown that while there are no commercially available alternatives to the glass fibre used in the filter medium, it would be feasible to manufacture the filter case and spacers from materials which could be incinerated. Operating temperatures, costs and the type of residues for disposal are discussed for filter case materials. (U.K.)

  17. A personalized web page content filtering model based on segmentation

    Kuppusamy, K. S.; DOI: 10.5121/ijist.2012.2104

    2012-01-01

    In view of the massive content explosion in the World Wide Web through diverse sources, content filtering tools have become mandatory. The filtering of web page contents holds greater significance in cases of access by minor-age people. Traditional web page blocking systems follow a Boolean methodology of either displaying the full page or blocking it completely. With the increased dynamism of web pages, it has become a common phenomenon that different portions of a page hold different types of content at different time instances. This paper proposes a model to block contents at a fine-grained level: instead of completely blocking the page, it blocks only those segments which hold the content to be blocked. The advantages of this method over traditional methods are the fine-grained level of blocking and the automatic identification of the portions of the page to be blocked. Experiments conducted on the proposed model indicate 88% accuracy in filtering...
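
    A minimal sketch of the segment-level idea (the segmentation into top-level blocks and the keyword scorer below are hypothetical stand-ins for the paper's model):

```python
from bs4 import BeautifulSoup

BLOCKLIST = {"violence", "gambling"}              # hypothetical term list

def filter_page(html, threshold=0.02):
    soup = BeautifulSoup(html, "html.parser")
    for seg in soup.find_all(["div", "section", "article"]):
        words = seg.get_text(" ", strip=True).lower().split()
        if not words:
            continue
        score = sum(w in BLOCKLIST for w in words) / len(words)
        if score >= threshold:                    # mask only this segment
            seg.clear()
            seg.append("[segment blocked]")
    return str(soup)
```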

  18. A Level Set Filter for Speckle Reduction in SAR Images

    Huang Bo

    2010-01-01

    Despite much effort and significant progress in recent years, speckle removal for Synthetic Aperture Radar (SAR) images is still a challenging problem in image processing. Unlike traditional noise filters, which are mainly based on local neighborhood statistical averages or frequency transforms, in this paper we propose a speckle reduction method based on the theory of level sets, one form of curvature flow propagation. Firstly, based on a partial differential equation, the Lee filter can be cast as an anisotropic diffusion function, and we further deduce a level set formulation from it. Casting the flow into a level set formulation allows the front interface to propagate naturally with topological changes, with a speed proportional to the curvature of the intensity contours in the image. Hence, small speckle disappears quickly, while large-scale interfaces evolve slowly. Secondly, to preserve finer detailed structures when smoothing the speckle, the evolution switches between minimum and maximum curvature speed depending on the scale of the speckle. The proposed method is illustrated by experiments on a simulated image and on ERS-2 SAR images under different circumstances. Its advantages over traditional speckle reduction filter approaches are also demonstrated.
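
    The geometric core of the method, contours moving with speed proportional to their curvature, can be sketched as plain mean-curvature flow; the paper's min/max speed switching and the Lee-filter diffusion coefficient are omitted here.

```python
import numpy as np

def curvature_flow(img, n_iter=50, dt=0.1, eps=1e-8):
    u = img.astype(float).copy()
    for _ in range(n_iter):
        uy, ux = np.gradient(u)
        mag = np.hypot(ux, uy) + eps
        ky, _ = np.gradient(uy / mag)      # d/dy of unit normal's y-component
        _, kx = np.gradient(ux / mag)      # d/dx of unit normal's x-component
        u += dt * mag * (kx + ky)          # u_t = |grad u| * curvature
    return u                               # small speckle vanishes first
```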

  19. Ensemble Kalman filtering without the intrinsic need for inflation

    Bocquet, M.

    2011-10-01

    The main intrinsic source of error in the ensemble Kalman filter (EnKF) is sampling error. External sources of error, such as model error or deviations from Gaussianity, depend on the dynamical properties of the model. Sampling errors can lead to instability of the filter which, as a consequence, often requires inflation and localization. The goal of this article is to derive an ensemble Kalman filter which is less sensitive to sampling errors. A prior probability density function conditional on the forecast ensemble is derived using Bayesian principles. Even though this prior is built upon the assumption that the ensemble is Gaussian-distributed, it is different from the Gaussian probability density function defined by the empirical mean and the empirical error covariance matrix of the ensemble, which is implicitly used in traditional EnKFs. This new prior generates a new class of ensemble Kalman filters, called finite-size ensemble Kalman filter (EnKF-N). One deterministic variant, the finite-size ensemble transform Kalman filter (ETKF-N), is derived. It is tested on the Lorenz '63 and Lorenz '95 models. In this context, ETKF-N is shown to be stable without inflation for ensemble size greater than the model unstable subspace dimension, at the same numerical cost as the ensemble transform Kalman filter (ETKF). One variant of ETKF-N seems to systematically outperform the ETKF with optimally tuned inflation. However it is shown that ETKF-N does not account for all sampling errors, and necessitates localization like any EnKF, whenever the ensemble size is too small. In order to explore the need for inflation in this small ensemble size regime, a local version of the new class of filters is defined (LETKF-N) and tested on the Lorenz '95 toy model. Whatever the size of the ensemble, the filter is stable. Its performance without inflation is slightly inferior to that of LETKF with optimally tuned inflation for small interval between updates, and superior to LETKF with optimally tuned inflation for large time interval between updates.
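
    The baseline that the EnKF-N modifies is the standard ETKF analysis step, sketched below with a linear observation operator; the finite-size prior of the paper would replace the (N-1) terms that encode the implicit Gaussian prior.

```python
import numpy as np

def etkf_analysis(ens, y, H, R):
    """ens: (n, N) ensemble; y: (p,) observations; H: (p, n); R: (p, p)."""
    n, N = ens.shape
    xb = ens.mean(axis=1)
    Xb = ens - xb[:, None]                        # background perturbations
    Yb = H @ Xb                                   # obs-space perturbations
    Rin = np.linalg.inv(R)
    Pa = np.linalg.inv((N - 1) * np.eye(N) + Yb.T @ Rin @ Yb)
    wm = Pa @ Yb.T @ Rin @ (y - H @ xb)           # mean-update weights
    evals, evecs = np.linalg.eigh((N - 1) * Pa)   # symmetric square root
    W = evecs @ np.diag(np.sqrt(evals)) @ evecs.T
    return xb[:, None] + Xb @ (wm[:, None] + W)   # analysis ensemble
```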

  1. Development of Test Protocols for International Space Station Particulate Filters

    Vijayakumar, R.; Green, Robert D.; Agui, Juan H.

    2015-01-01

    Air quality control on the International Space Station (ISS) is a vital requirement for maintaining a clean environment for the crew and the hardware. This becomes a serious challenge in pressurized space compartments since no outside air ventilation is possible, and a larger particulate load is imposed on the filtration system due to the lack of gravitational settling. The ISS Environmental Control and Life Support System (ECLSS) uses a filtration system that has been in use for over 14 years and has proven to meet this challenge. The heart of this system is a traditional High-Efficiency Particulate Air (HEPA) filter configured to interface with the rest of the life support elements and provide effective cabin filtration. The filter element for this system has a non-standard cross-section with a length-to-width ratio (L/W) of 6.6. A filter test setup was designed and built to meet industry testing standards. A CFD analysis was performed to determine the optimal duct geometry and flow configuration. Both a screen and a flow straightener were added to the test duct design to improve flow uniformity, and face velocity profiles were subsequently measured to confirm this. Flow quality and aerosol mixing assessments show that the duct flow is satisfactory for the intended leak testing. Preliminary leak testing was performed on two different ISS filters, one with known perforations and one with limited use, and the results confirmed that the testing methods and photometer instrument are sensitive enough to detect and locate compromised sections of an ISS BFE. Given the engineering constraints in designing spacecraft life support systems, it is anticipated that non-industry-standard filters will be required in future designs. This work is focused on developing test protocols for testing the ISS BFE filters, but the methodology is general enough to be extended to other present and future spacecraft filters. These techniques for characterizing the test duct and performing leak testing can be applied to acceptance testing and inventory testing for future manned exploration programs with air revitalization filtration needs, possibly even to in-situ filter element integrity testing for extensively long-duration missions. We plan to address the unique needs for test protocols for crewed spacecraft particulate filters by preparing the initial version of a standard, to be documented as a NASA Technical Memorandum (TM).

  2. Abortion traditions in rural Jamaica.

    Sobo, E J

    1996-02-01

    Abortion is not condoned in Jamaica. Its meaning is linked to the meanings of kinship and parenthood, which are expressed through procreation and involve altruism and the assumption of responsibility for the well-being of others. Abortion subverts these ideals but indigenous methods for it are known and are secretly used. The inconsistencies between abortion talk and abortion practice are examined, and the structural functions of abortion (and of its culturally constructed, ideological meaning) are discussed. The distinction--and the overlap--between abortion as such and menstrual regulation is explored. The use of the culturally constructed 'witchcraft baby' syndrome to justify abortion is also investigated. Traditional abortion techniques follow from (and can illuminate) general health practices, which focus on inducing the ejection of 'blockages' and toxins, and from ethnophysiological beliefs about procreation and reproductive health, which easily allow for menstrual delays not caused by conception. The latter understanding and the similarity between abortifacients, emmenagogues and general purgatives allows women flexibility in interpreting the meanings of their missed periods and the physical effects of the remedy. PMID:8643976

  3. Nutraceutical enriched Indian traditional chikki.

    Ramakrishna, Chetana; Pamisetty, Aruna; Reddy, Sunki Reddy Yella

    2015-08-01

    Chikki, or peanut brittle, a traditional sweet snack, was chosen as a vehicle for enrichment with natural nutraceuticals through added herbs. The formulation and process for preparation of chikki with added herbs like ashwagandha (Withania somnifera), tulasi (Ocimum sanctum L.) and ajwain (Trachyspermum ammi S.) were standardized. The polyphenol content of chikki with added herbs ranged from 0.29 to 0.46 g/100 g. Among the herbs, ajwain showed the most potent antioxidant activity, followed by tulasi, whereas ashwagandha and the product prepared with it showed the least activity. Total carotenoid contents of chikki with added herbs ranged between 1.5 and 4.3 mg/100 g. Storage studies showed that chikki prepared with tulasi and ajwain were sensorily acceptable up to 90 days, while rancid notes were observed in the control and in chikki with added ashwagandha at the end of 30 days. Thus chikki with added herbs, in addition to containing natural nutraceuticals like polyphenols and carotenoids, had improved storage stability compared to the control. PMID:26243935

  4. Traditional budgeting during financial crisis

    Marie Anne Lorain

    2015-10-01

    This study examines the evolution of budgeting practices in the extremely difficult Spanish economic environment. In order to analyse whether companies are still maintaining their budgeting process and whether they now face more difficulties in forecasting accurate indicators, two similar web surveys were administered over two periods of time: first in 2008, at the beginning of the financial crisis, and second in 2013, after five years of a downward trend. In addition, in-depth interviews were conducted to investigate how companies brought more flexibility to their budgeting process in order to cope with environmental uncertainty. The survey indicates that 97% of respondents still use a traditional budgeting process, a result similar to the one found in 2008. However, the 2013 survey showed that reliance on forecasted information is being increasingly questioned. Furthermore, the study revealed that respondents are bringing more flexibility to their processes, being able to modify objectives once the budget is approved and to obtain new resources outside the budgeting process. This paper contributes by revealing the difficulties of setting reliable objectives in a turbulent environment and provides data on the evolution of budgeting practices over five years of an austere economic crisis.

  5. Kazakh Traditional Dance Gesture Recognition

    Nussipbekov, A. K.; Amirgaliyev, E. N.; Hahn, Minsoo

    2014-04-01

    Full-body gesture recognition is an important and interdisciplinary research field which is widely used in many application areas, including dance gesture recognition. The rapid growth of technology in recent years has contributed much to this domain. However, it is still a challenging task. In this paper we implement Kazakh traditional dance gesture recognition. We use a Microsoft Kinect camera to obtain human skeleton and depth information. We then apply a tree-structured Bayesian network and the Expectation Maximization algorithm with K-means clustering to calculate conditional linear Gaussians for classifying poses. Finally, we use a Hidden Markov Model to detect dance gestures. Our main contribution is that we extend the Kinect skeleton by adding the headwear as a new skeleton joint, which is calculated from the depth image. This novelty allows us to significantly improve the accuracy of head gesture recognition of a dancer, which in turn plays a considerable role in whole-body gesture recognition. Experimental results show the efficiency of the proposed method and that its performance is comparable to state-of-the-art system performances.
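
    Only the last stage of the pipeline is sketched below: one Gaussian HMM per gesture class, with classification by maximum log-likelihood, using the third-party hmmlearn package. The pose-feature data are random placeholders; the Kinect skeleton extraction and the Bayesian-network pose classifier are not reproduced.

```python
import numpy as np
from hmmlearn import hmm

rng = np.random.default_rng(0)
# placeholder training data: 4 sequences of 50 pose-feature vectors per class
train = {g: rng.normal(g, 1.0, size=(200, 6)) for g in range(3)}

models = {}
for g, X in train.items():
    m = hmm.GaussianHMM(n_components=4, covariance_type="diag", n_iter=20)
    m.fit(X, lengths=[50, 50, 50, 50])
    models[g] = m

seq = rng.normal(1, 1.0, size=(40, 6))            # unseen feature sequence
print("recognized gesture:",
      max(models, key=lambda g: models[g].score(seq)))
```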

  6. Exploring Oral Traditions through the Written Text.

    Metting, Fred

    1995-01-01

    Argues that, by reading literature that incorporates folklore and oral traditions, students learn to recognize and appreciate how oral traditions have influenced all cultures. Argues that a study of contemporary American written literature which incorporates elements of the oral tradition introduces students to old and deep wisdom and to a diverse…

  7. Infusing Qualitative Traditions in Counseling Research Designs

    Hays, Danica G.; Wood, Chris

    2011-01-01

    Research traditions serve as a blueprint or guide for a variety of design decisions throughout qualitative inquiry. This article presents 6 qualitative research traditions: grounded theory, phenomenology, consensual qualitative research, ethnography, narratology, and participatory action research. For each tradition, the authors describe its…

  8. Fuel Efficient Diesel Particulate Filter (DPF) Modeling and Development

    Stewart, Mark L.; Gallant, Thomas R.; Kim, Do Heui; Maupin, Gary D.; Zelenyuk, Alla

    2010-08-01

    The project described in this report seeks to promote effective diesel particulate filter technology with minimum fuel penalty by enhancing fundamental understanding of filtration mechanisms through targeted experiments and computer simulations. The overall backpressure of a filtration system depends upon complex interactions of particulate matter and ash with the microscopic pores in the filter media. Better characterization of these phenomena is essential for exhaust system optimization. The acicular mullite (ACM) diesel particulate filter substrate is under continuing development by Dow Automotive. ACM is made up of long mullite crystals which intersect to form the filter wall framework and protrude from the wall surface into the DPF channels. ACM filters have been demonstrated to effectively remove diesel exhaust particles while maintaining relatively low backpressure. Modeling approaches developed for more conventional ceramic filter materials, such as silicon carbide and cordierite, have been difficult to apply to ACM because of properties arising from its unique microstructure. Penetration of soot into the high-porosity region of the projecting crystal structures leads to a somewhat extended depth-filtration mode, but with less dramatic increases in pressure drop than are normally observed during depth filtration in cordierite or silicon carbide filters. Another consequence is greater contact between the soot and solid surfaces, which may enhance the action of some catalyst coatings in filter regeneration. The projecting crystals appear to provide a two-fold benefit for maintaining low backpressures during filter loading: they help prevent soot from being forced into the throats of pores in the lower-porosity region of the filter wall, and they also tend to support the forming filter cake, resulting in lower average cake density and higher permeability. Other simulations suggest that soot deposits may also tend to form at the tips of projecting crystals due to the axial velocity component of exhaust moving down the filter inlet channel. Soot mass collected in this way would have a smaller impact on backpressure than soot forced into the flow restrictions deeper in the porous wall structure. This project has focused on the development of computational, analytical, and experimental techniques that are generally applicable to a wide variety of exhaust aftertreatment technologies. By helping to develop an improved fundamental understanding of pore-scale phenomena affecting filtration, soot oxidation, and NOx abatement, this cooperative research and development agreement (CRADA) has also assisted Dow Automotive in continuing development and commercialization of the ACM filter substrate. Over the course of this research project, ACM filters were successfully deployed on the Audi R10 TDI racecar which won the 24 Hours of Le Mans endurance race in 2006, 2007, and 2008, and the 12 Hours of Sebring endurance race in 2006 and 2007. It would not have been possible for the R10 to compete in these traditionally gasoline-dominated events without reliable and effective exhaust particulate filtration. These successes demonstrated not only the performance of automotive diesel engines, but also the efficacy of DPF technology as it was being deployed around the world to meet new emissions standards on consumer vehicles. During the course of this CRADA project, Dow Automotive commercialized their ACM DPF technology under the AERIFY™ DPF brand.
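
    The wall contribution to backpressure that the project models can be put in one line of Darcy's law; the numbers below are generic assumptions for the sketch, not Dow ACM data.

```python
# Darcy's law for flow through the filter wall: dp = mu * w * U / k.
mu = 3.0e-5   # exhaust dynamic viscosity at ~250 C [Pa s]  (assumed)
w = 0.4e-3    # wall thickness [m]                          (assumed)
U = 0.03      # superficial wall-through velocity [m/s]     (assumed)
for k in (5e-13, 2e-13):                  # clean vs partially laden wall [m^2]
    print(f"k = {k:.0e} m^2 -> wall dp = {mu * w * U / k:.0f} Pa")
```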

  9. Inferior vena cava filters in cancer patients: to filter or not to filter

    Hikmat Abdel-Razeq

    2011-03-01

    Department of Internal Medicine and Department of Radiology, King Hussein Cancer Center, Amman, Jordan. Purpose: Cancer and its treatment are recognized risk factors for venous thromboembolism (VTE); active cancer accounts for almost 20% of all newly diagnosed VTE. Inferior vena cava (IVC) filters are utilized to provide mechanical thromboprophylaxis to prevent pulmonary embolism (PE) or to avoid bleeding from systemic anticoagulation in high-risk situations. In this report, utilizing a case study, we address the appropriate utilization of such filters in cancer patients. Methods: The case of a 43-year-old female patient with rectal cancer, who developed deep vein thrombosis following a complicated medical course, is presented. The patient was anticoagulated with a low molecular weight heparin, but a few months later, following an episode of bleeding, an IVC filter was planned. Using the PubMed database, articles published in the English language addressing issues related to IVC filters in cancer patients were accessed and are presented. Results: Many recent studies have questioned the need to insert IVC filters in advanced-stage cancer patients, particularly those whose anticipated survival is short, for whom prevention of PE may be of little clinical benefit and a poor utilization of resources. Conclusion: Systemic anticoagulation can be safely offered to the majority of cancer patients. When the risk of bleeding or pulmonary embolism is high, IVC filters can be utilized. However, placement of such filters should take into consideration the stage of disease and the life expectancy of such patients. Keywords: anticoagulation, bleeding, chemotherapy

  10. Merging particle filter for sequential data assimilation

    Nakano, S.; Ueno, G.; Higuchi, T.

    2007-01-01

    A new filtering technique for sequential data assimilation, the merging particle filter (MPF), is proposed. The MPF is devised to avoid the degeneration problem, which is inevitable in the particle filter (PF), without prohibitive computational cost. In addition, it is applicable to cases in which a nonlinear relationship exists between a state and observed data, where the application of the ensemble Kalman filter (EnKF) is not effectual. In the MPF, the filtering procedure is performed...
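
    The distinctive step is the merge: each posterior particle is a weighted sum of three independently resampled particles, with weights chosen so that the ensemble mean and covariance are preserved (sum of a_i = 1 and sum of a_i^2 = 1). A sketch with one standard coefficient choice satisfying those two conditions:

```python
import numpy as np

a = np.array([3 / 4, (np.sqrt(13) + 1) / 8, -(np.sqrt(13) - 1) / 8])
assert np.isclose(a.sum(), 1.0) and np.isclose((a ** 2).sum(), 1.0)

def mpf_merge(particles, weights, rng=np.random.default_rng(0)):
    """particles: (N, n); weights: (N,) normalized importance weights."""
    N = len(particles)
    idx = rng.choice(N, size=(3, N), p=weights)   # three resampled sets
    return (a[0] * particles[idx[0]]              # pointwise merge keeps the
          + a[1] * particles[idx[1]]              # first two moments of the
          + a[2] * particles[idx[2]])             # weighted ensemble
```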

  11. Novel Nonlinear Hybrid Filters for Image Enhancement

    Peng, Shaomin

    1995-01-01

    Image noise removal and enhancement are important subjects in image processing. Nonlinear techniques for image enhancement and noise reduction challenge linear techniques by improving image quality while removing noise. This thesis is devoted to systematically unifying theory and techniques for mixed noise removal and image enhancement, and to developing new techniques for removing large amounts of mixed Gaussian and impulsive noise while preserving image details. In this thesis, we introduce three new hybrid filters which combine linear and nonlinear filters so as to remove large amounts of mixed noise. To efficiently use the ambiguous information in an image, both fuzzy set concepts and fuzzy logic operating rules are utilized in the filter design. The three new filters are the single-level trained fuzzy filter (SLTF), the multi-level adaptive fuzzy filter (MLAF), and the decision directed window adaptive hybrid filter (DDWAH). The SLTF filter is designed to remove large amounts of mixed noise by combining an impulse filter with a fuzzy filter; its efficiency in removing large amounts of mixed noise while preserving image edges is demonstrated. The MLAF filter is an adaptive SLTF filter which uses the local variance of image gray scales to adapt the weights used in the linear portion of the filter to local image statistics, and it provides improved visual performance compared with the SLTF filter. The adaptive DDWAH filter uses local statistics to adapt the window size of the filter; this approach prevents distortion of small objects in the image and removes noise more effectively than non-adaptive filters. The experimental results clearly show the improved noise removal performance and good edge preservation properties, and theoretical analysis verifies the measured results.
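
    In the same spirit (though not the thesis's trained fuzzy rules), a hybrid filter can blend a nonlinear (median) and a linear (mean) output with a fuzzy weight derived from how impulsive each neighborhood looks; the membership thresholds below are illustrative assumptions.

```python
import numpy as np
from scipy.ndimage import median_filter, uniform_filter

def hybrid_filter(img, win=3, t0=10.0, t1=40.0):
    med = median_filter(img, win)
    avg = uniform_filter(img, win)
    dev = np.abs(img - med)                     # impulsiveness evidence
    mu = np.clip((dev - t0) / (t1 - t0), 0, 1)  # fuzzy membership in [0, 1]
    return mu * med + (1 - mu) * avg            # impulses -> median output
```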

  12. Fast algorithm for Morphological Filters

    Lou, Shan; Jiang, Xiangqian; Scott, Paul J. (Centre for Precision Technologies, University of Huddersfield, Huddersfield HD1 3DH, United Kingdom; e-mail: x.jiang@hud.ac.uk)

    2011-08-19

    In surface metrology, morphological filters, which evolved from the envelope filtering system (E-system), work well for functional prediction of surface finish in the analysis of surfaces in contact. The naive algorithms are time consuming, especially for areal data, and are not generally adopted in real practice. A fast algorithm is proposed based on the alpha shape. The hull obtained by rolling the alpha ball is equivalent to the morphological opening/closing in theory. The algorithm depends on Delaunay triangulation, with time complexity O(n log n). In comparison to the naive algorithms, it generates the opening and closing envelopes without combining dilation and erosion. Edge distortion is corrected by reflective padding for open profiles/surfaces. Spikes in the sample data are detected and points interpolated to prevent singularities. The proposed algorithm works well for both morphological profile and areal filters. Examples are presented to demonstrate the validity of this algorithm and its superior efficiency over the naive algorithm.
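
    The naive rolling-ball computation that the alpha-shape algorithm accelerates can be written directly with grey-scale morphology; the sketch below closes a 1-D profile with a disc structuring function (the sampling interval and radius are assumptions).

```python
import numpy as np
from scipy.ndimage import grey_closing

def closing_envelope(profile, dx=1e-6, radius=50e-6):
    """Naive morphological closing of a profile with a disc of given radius."""
    r = int(round(radius / dx))                 # radius in samples
    x = np.arange(-r, r + 1) * dx
    disc = np.sqrt(np.maximum(radius**2 - x**2, 0.0)) - radius  # cap, <= 0
    return grey_closing(profile, structure=disc, mode="nearest")
```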

  13. Quantum Diffusion, Measurement and Filtering

    Belavkin, V. P.

    1993-01-01

    A brief presentation of the basic concepts in quantum probability theory is given, in comparison with the classical theory. The notion of quantum white noise, its explicit representation in Fock space, and necessary results of noncommutative stochastic analysis and integration are outlined. Algebraic differential equations that unify quantum non-Markovian diffusion with continuous non-demolition observation are derived. A stochastic equation of quantum diffusion filtering, generalising the classical Markov filtering equation to quantum flows over an arbitrary *-algebra, is obtained. A Gaussian quantum diffusion with one-dimensional continuous observation is considered. The a posteriori quantum state diffusion in this case is reduced to a linear quantum stochastic filter equation of Kalman-Bucy type and to the operator Riccati equation for quantum correlations. An example of continuous non-demolition observation of the coordinate of a free quantum particle is considered, describing a continuous collapse to the stationary...
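
    For reference, the classical Kalman-Bucy filter to which the quantum filter reduces in the linear Gaussian case takes the familiar form below (standard textbook form, not the paper's operator-valued version):

```latex
% Classical Kalman-Bucy filter and Riccati equation (textbook form).
d\hat{x}_t = A\,\hat{x}_t\,dt + P_t C^{\top} R^{-1}\,(dy_t - C\,\hat{x}_t\,dt),
\qquad
\dot{P}_t = A P_t + P_t A^{\top} + Q - P_t C^{\top} R^{-1} C P_t
```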

  14. HEPA filter concerns - an overview

    The U.S. Department of Energy (DOE) recently initiated a complete review of the DOE High Efficiency Particulate Air (HEPA) Filter Program to identify areas for improvement. Although this process is currently ongoing, various issues and problems have already been identified for action that not only impacts the DOE HEPA filter program, but potentially the national and international air cleaning community as well. This paper briefly reviews a few of those concerns that may be of interest, and discusses actions initiated by the DOE to address the associated issues and problems. Issues discussed include: guidance standards, in-place testing, specifications, Test Facilities, portable units, vacuum cleaners, substitute aerosols, filter efficiencies, aging/shelf life/service life, fire suppression, handbook, Quality Products List (QPL), QA testing, and evaluations

  15. An Improved Morphological Algorithm for Filtering Airborne LiDAR Point Cloud Based on Multi-Level Kriging Interpolation

    Zhenyang Hui

    2016-01-01

    Filtering is one of the core post-processing steps for airborne LiDAR point clouds. In recent years, morphology-based filtering algorithms have proven to be a powerful and efficient tool for filtering airborne LiDAR point clouds. However, most traditional morphology-based algorithms have difficulty preserving abrupt terrain features, especially when using larger filtering windows. In order to suppress the omission error caused by protruding terrain features, this paper proposes an improved morphological algorithm based on multi-level kriging interpolation. The algorithm is essentially a combination of a progressive morphological filtering algorithm and a multi-level interpolation filtering algorithm. The morphological opening operation is performed with the filtering window gradually downsizing, while kriging interpolation is conducted at different levels according to the different filtering windows. This process iterates in a top-down fashion until the filtering window is no longer greater than the preset minimum filtering window. Fifteen samples provided by the ISPRS commission were chosen to test the performance of the proposed algorithm. Experimental results show that the proposed method achieves promising results not only in flat urban areas but also in rural areas. Compared with eight classical filtering methods, the proposed method obtained the lowest omission error and preserved protruding terrain features better.
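
    The progressive-morphological baseline that the paper modifies can be sketched on a gridded minimum-elevation surface; the classical version below grows the window (the paper instead shrinks it and re-interpolates by kriging at each level). Window sizes, slope, and thresholds are assumptions.

```python
import numpy as np
from scipy.ndimage import grey_opening

def progressive_morph_filter(z, cell=1.0, windows=(3, 5, 9, 17),
                             slope=0.3, dh0=0.2):
    """z: gridded minimum elevations (2-D). Returns a boolean ground mask."""
    ground = np.ones(z.shape, bool)
    surf, prev_w = z.copy(), 1
    for w in windows:
        opened = grey_opening(surf, size=(w, w))   # morphological opening
        dh = slope * (w - prev_w) * cell + dh0     # elevation-difference limit
        ground &= (surf - opened) <= dh            # keep points near the opening
        surf, prev_w = opened, w
    return ground
```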

  16. FILTER CAKE REDEPOSITION IN A PULSE-JET FILTER

    The report gives results of a pilot-scale study of pulse-jet filter cleaning, a process that is ineffective to the extent that collected dust redeposits rather than falling to the hopper. Dust tracer techniques were used to measure the amount of redeposition. A mathematical model...

  17. Flat microwave photonic filter based on hybrid of two filters

    Qi, Chunhui; Pei, Li; Ning, Tigang; Li, Jing; Gao, Song

    2010-05-01

    A new microwave photonic filter (MPF), a hybrid of two filters, that can realize both multiple taps and a flat bandpass or bandstop response is presented. Based on the phase character of a Mach-Zehnder modulator (MZM), a two-tap finite impulse response (FIR) filter is obtained as the first part. The second part is obtained by taking full advantage of the wavelength selectivity of a fiber Bragg grating (FBG) and the gain of an erbium-doped fiber (EDF). Combining the two filters, a flat bandpass or bandstop response is realized by changing the coupler's factor k, the reflectivity R1 of FBG1, or the gain g of the EDF. Optimizing the system parameters, a flat bandpass response with an amplitude depth of more than 45 dB is obtained at k = 0.5, R1 = 0.33, g = 10, and a flat bandstop response is also obtained at k = 0.4, R1 = 0.5, g = 2. In addition, the free-spectral range (FSR) can be controlled by changing the length of the EDF and the length difference between the two MZMs. The method is proved feasible by experiments. Such a method offers realistic solutions to support future radio-frequency (RF) optical communication systems.
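
    The first stage's behaviour is a textbook two-tap response: |H(f)| = |1 + k exp(-j2*pi*f*T)| for differential delay T, so the free-spectral range is FSR = 1/T. A quick numeric check (the delay and frequency range are illustrative):

```python
import numpy as np

T = 100e-12                      # assumed differential delay: 100 ps
k = 1.0                          # equal-amplitude taps
f = np.linspace(0, 40e9, 2001)   # RF frequencies up to 40 GHz
H = 1 + k * np.exp(-2j * np.pi * f * T)
print(f"FSR = {1 / T / 1e9:.0f} GHz; first notch at {1 / (2 * T) / 1e9:.0f} GHz")
```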