WorldWideScience

Sample records for traditional filtered back-projection

  1. Coronary CT angiography: image quality, diagnostic accuracy, and potential for radiation dose reduction using a novel iterative image reconstruction technique - comparison with traditional filtered back projection

    International Nuclear Information System (INIS)

    To compare image noise, image quality and diagnostic accuracy of coronary CT angiography (cCTA) using a novel iterative reconstruction algorithm versus traditional filtered back projection (FBP), and to estimate the potential for radiation dose savings. Sixty-five consecutive patients (48 men; 59.3 ± 7.7 years) prospectively underwent cCTA and coronary catheter angiography (CCA). Full radiation dose data, using all projections, were reconstructed with FBP. To simulate image acquisition at half the radiation dose, 50% of the projections were discarded from the raw data. The resulting half-dose data were reconstructed with sinogram-affirmed iterative reconstruction (SAFIRE). Full-dose FBP and half-dose iterative reconstructions were compared with regard to image noise and image quality, and their respective accuracy for stenosis detection was compared against CCA. Compared with full-dose FBP, half-dose iterative reconstructions showed significantly (p = 0.001 to p = 0.025) lower image noise and slightly higher image quality. Iterative reconstruction improved the accuracy of stenosis detection compared with FBP (per-patient: accuracy 96.9% vs. 93.8%, sensitivity 100% vs. 100%, specificity 94.6% vs. 89.2%, NPV 100% vs. 100%, PPV 93.3% vs. 87.5%). Iterative reconstruction significantly reduces image noise without loss of diagnostic information and holds the potential for substantial radiation dose reduction in cCTA. (orig.)

  2. Filter back-projection technique applied to Abel inversion

    International Nuclear Information System (INIS)

    The inverse Abel transform is applicable to optically thin plasmas with cylindrical symmetry, which are often encountered in plasma physics and inertial (or magnetic) confinement fusion. The filtered back-projection technique is modified, and a new method of inverse Abel transform is then presented
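
    For readers unfamiliar with the connection, a minimal sketch of the underlying idea (not this record's modified algorithm) is shown below: for a cylindrically symmetric object every parallel projection is the same Abel-transformed profile, so tiling that one profile over many angles and applying standard filtered back-projection performs the Abel inversion. The function names and the uniform-disc test are illustrative assumptions; skimage's iradon supplies the ramp-filtered back-projection.

    ```python
    # A minimal sketch (not this record's modified algorithm): for a cylindrically
    # symmetric object every parallel projection equals the same Abel-transformed
    # profile, so standard filtered back-projection of that one profile, tiled over
    # many angles, performs the Abel inversion.
    import numpy as np
    from skimage.transform import iradon

    def abel_invert_fbp(profile, n_angles=180):
        """Recover a radial slice f(r) from one chord-integrated (Abel) profile.

        profile : 1-D array sampled symmetrically about the symmetry axis,
                  one sample per detector pixel (unit spacing assumed).
        Returns the reconstructed 2-D slice; f(r) is any radial cut through it.
        """
        theta = np.linspace(0.0, 180.0, n_angles, endpoint=False)
        # Cylindrical symmetry: the projection is identical at every angle.
        sinogram = np.tile(profile[:, None], (1, n_angles))
        return iradon(sinogram, theta=theta, circle=True)

    # Example: the Abel transform of a uniform disc of radius R (in pixels)
    # is 2*sqrt(R^2 - y^2); its inversion should return ~1 inside the disc.
    y = np.arange(-100, 101, dtype=float)
    R = 60.0
    proj = 2.0 * np.sqrt(np.clip(R**2 - y**2, 0.0, None))
    slice_2d = abel_invert_fbp(proj)
    f_r = slice_2d[slice_2d.shape[0] // 2, slice_2d.shape[0] // 2:]  # radial profile
    ```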

  3. An adaptive filtered back-projection for photoacoustic image reconstruction

    Energy Technology Data Exchange (ETDEWEB)

    Huang, He; Bustamante, Gilbert; Peterson, Ralph; Ye, Jing Yong, E-mail: jingyong.ye@utsa.edu [Department of Biomedical Engineering, University of Texas at San Antonio, San Antonio, Texas 78249 (United States)

    2015-05-15

    Purpose: The purpose of this study is to develop an improved filtered-back-projection (FBP) algorithm for photoacoustic tomography (PAT), which allows image reconstruction with higher quality than traditional algorithms. Methods: A rigorous expression for a weighting function has been derived directly from the photoacoustic wave equation and used as a ramp filter in the Fourier domain. The authors’ new algorithm utilizes this weighting function to precisely calculate each photoacoustic signal’s contribution and then reconstructs the image based on the retarded potential generated from the photoacoustic sources. In addition, an adaptive criterion has been derived for selecting the cutoff frequency of a low-pass filter. Two computational phantoms were created to test the algorithm. The first phantom contained five spheres, each with a different absorbance, and was used to test the capability to correctly represent both the geometry and the relative absorbed energy in a planar measurement system. The authors also used another phantom containing absorbers of different sizes with overlapping geometry to evaluate the performance of the new method for complicated geometries. In addition, a random noise background was added to the simulated data, which were obtained using an arc-shaped array of 50 evenly distributed transducers that spanned 160° over a circle with a radius of 65 mm. A normalization factor between neighboring transducers was applied to correct the measurement signals in the PAT simulations. The authors assumed that the scanned object was mounted on a holder that rotated over the full 360° and that the scans were acquired at a sampling rate of 20.48 MHz. Results: The authors have obtained reconstructed images of the computerized phantoms by utilizing the new FBP algorithm. The reconstructed image of the first phantom shows that the new approach not only yields a sharp image but also recovers the correct signal strength of the absorbers. The reconstructed image of the second phantom further demonstrates the capability to form clear images of the spheres with sharp borders in the overlapping geometry. The smallest sphere is clearly visible and distinguishable, even though it is surrounded by two large spheres. In addition, image reconstructions were conducted with randomized noise added to the observed signals to mimic realistic experimental conditions. Conclusions: The authors have developed a new FBP algorithm that is capable of reconstructing high-quality images with correct relative intensities and sharp borders for PAT. The results demonstrate that the weighting function serves as a precise ramp filter for processing the observed signals in the Fourier domain. In addition, this algorithm allows an adaptive determination of the cutoff frequency of the applied low-pass filter.
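
    The adaptive-cutoff idea can be illustrated with a hedged sketch: estimate the noise floor from the high-frequency tail of a detected signal's spectrum, place the low-pass cutoff where the smoothed spectrum falls toward that floor, and combine the low-pass window with a ramp filter before back-projection. The paper's derived weighting function and exact criterion are not reproduced here; the moving-average smoothing, the 2x noise-floor threshold and the cosine taper are assumptions.

    ```python
    # Hedged illustration only: this is NOT the derived weighting function of the
    # paper above, just a generic heuristic for an adaptive low-pass cutoff.
    # Assumptions: noise floor taken from the top 10% of frequencies, cutoff where
    # the smoothed spectrum drops to 2x that floor, cosine-tapered low-pass window.
    import numpy as np

    def adaptive_cutoff(signal, fs):
        """Pick a low-pass cutoff (Hz) where the smoothed amplitude spectrum
        falls to roughly twice the estimated noise floor."""
        spec = np.abs(np.fft.rfft(signal))
        freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
        smooth = np.convolve(spec, np.ones(9) / 9.0, mode="same")
        noise_floor = smooth[int(0.9 * len(smooth)):].mean()   # assumption
        above = np.nonzero(smooth > 2.0 * noise_floor)[0]      # assumption
        return freqs[above[-1]] if above.size else freqs[-1]

    def ramp_lowpass(n, fs, f_cut):
        """Normalized ramp filter multiplied by a cosine-tapered low-pass window."""
        freqs = np.fft.rfftfreq(n, d=1.0 / fs)
        ramp = np.abs(freqs) / freqs.max()
        window = 0.5 * (1.0 + np.cos(np.pi * np.clip(freqs / f_cut, 0.0, 1.0)))
        return ramp * window

    # Usage on one simulated transducer trace sampled at 20.48 MHz (the rate quoted
    # above); the filtered trace would then be fed to the back-projection step.
    fs = 20.48e6
    t = np.arange(2048) / fs
    trace = np.exp(-((t - 5e-5) * 2e6) ** 2) \
        + 0.05 * np.random.default_rng(0).standard_normal(t.size)
    H = ramp_lowpass(trace.size, fs, adaptive_cutoff(trace, fs))
    filtered = np.fft.irfft(np.fft.rfft(trace) * H, n=trace.size)
    ```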

  4. Improvement of wavelet threshold filtered back-projection image reconstruction algorithm

    Science.gov (United States)

    Ren, Zhong; Liu, Guodong; Huang, Zhen

    2014-11-01

    Image reconstruction techniques have been applied in many fields, including medical imaging modalities such as X-ray computed tomography (X-CT), positron emission tomography (PET) and magnetic resonance imaging (MRI), but the reconstruction results are still not satisfactory because the original projection data are inevitably contaminated by noise. Although traditional filters such as the Shepp-Logan (SL) and Ram-Lak (RL) filters can suppress some of this noise, the Gibbs oscillation phenomenon is still generated and the artifacts introduced by back-projection are not greatly reduced. Wavelet threshold denoising can overcome the interference of noise in image reconstruction. Since the traditional soft and hard threshold functions have some inherent defects, an improved wavelet threshold function combined with the filtered back-projection (FBP) algorithm is proposed in this paper. Four different reconstruction algorithms were compared in simulated experiments. Experimental results demonstrate that the improved algorithm largely eliminates the discontinuity and large distortion of the traditional threshold functions as well as the Gibbs oscillation. Finally, the effectiveness of the improved algorithm was verified by comparing two evaluation criteria, the mean square error (MSE) and the peak signal-to-noise ratio (PSNR), among the four algorithms, and the optimal dual threshold values of the improved wavelet threshold function were obtained.
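
    A hedged sketch of the pipeline described here (wavelet thresholding of the noisy projection data followed by FBP) is given below, using the classical soft/hard threshold functions from PyWavelets; the paper's improved threshold function is not specified in the abstract and so is not reproduced, and the universal threshold estimate is an assumption.

    ```python
    # Hedged sketch of the pipeline described above: wavelet-threshold denoising of
    # a noisy sinogram, then filtered back-projection. The paper's improved threshold
    # function is not given in the abstract, so only the classical 'soft' and 'hard'
    # modes are shown; the universal threshold estimate is an assumption.
    import numpy as np
    import pywt
    from skimage.data import shepp_logan_phantom
    from skimage.transform import radon, iradon, rescale

    phantom = rescale(shepp_logan_phantom(), 0.5)            # 200 x 200 test image
    theta = np.linspace(0.0, 180.0, 180, endpoint=False)
    sino = radon(phantom, theta=theta)
    sino_noisy = sino + np.random.default_rng(0).normal(0.0, 0.05 * sino.max(), sino.shape)

    def denoise_sinogram(s, wavelet="db4", level=3, mode="soft"):
        """Threshold the detail coefficients of a 2-D wavelet decomposition of the
        sinogram, using a median-based noise estimate and the universal threshold."""
        coeffs = pywt.wavedec2(s, wavelet, level=level)
        sigma = np.median(np.abs(coeffs[-1][-1])) / 0.6745    # finest diagonal band
        thr = sigma * np.sqrt(2.0 * np.log(s.size))
        new = [coeffs[0]] + [
            tuple(pywt.threshold(d, thr, mode=mode) for d in band)
            for band in coeffs[1:]
        ]
        return pywt.waverec2(new, wavelet)[: s.shape[0], : s.shape[1]]

    recon_plain = iradon(sino_noisy, theta=theta, circle=True)
    recon_soft = iradon(denoise_sinogram(sino_noisy, mode="soft"), theta=theta, circle=True)
    mse = np.mean((recon_soft - phantom) ** 2)                # evaluation criterion
    ```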

  5. A reconstruction algorithm for coherent scatter computed tomography based on filtered back-projection.

    Science.gov (United States)

    van Stevendaal, U; Schlomka, J P; Harding, A; Grass, M

    2003-09-01

    Coherent scatter computed tomography (CSCT) is a reconstructive x-ray imaging technique that yields the spatially resolved coherent-scatter form factor of the investigated object. Reconstruction from coherently scattered x-rays is commonly done using algebraic reconstruction techniques (ART). In this paper, we propose an alternative approach based on filtered back-projection. For the first time, a three-dimensional (3D) filtered back-projection technique using curved 3D back-projection lines is applied to two-dimensional coherent scatter projection data. The proposed algorithm is tested with simulated projection data as well as with projection data acquired with a demonstrator setup similar to a multi-line CT scanner geometry. While yielding image quality comparable to that of ART reconstruction, the modified 3D filtered back-projection algorithm is about two orders of magnitude faster. In contrast to iterative reconstruction schemes, it has the advantage that subfield-of-view reconstruction becomes feasible. This allows a selective reconstruction of the coherent-scatter form factor for a region of interest. The proposed modified 3D filtered back-projection algorithm is a powerful reconstruction technique to be implemented in a CSCT scanning system. This method gives coherent scatter CT the potential of becoming a competitive modality for medical imaging or nondestructive testing. PMID:14528968

  6. PET reconstruction artifact can be minimized by using sinogram correction and filtered back-projection technique

    OpenAIRE

    Jha, Ashish Kumar; Purandare, Nilendu C.; Shah, Sneha; Agrawal, Archi; Puranik, Ameya D; Rangarajan, Venkatesh

    2014-01-01

    Filtered Back-Projection (FBP) has become an outdated image reconstruction technique in new-generation positron emission tomography (PET)/computed tomography (CT) scanners. Iterative reconstruction, used in all new-generation PET scanners, is a much improved reconstruction technique. Though only a well-calibrated PET system should be used for clinical imaging, in a few situations like ours, when a compromised PET scanner with one PET module bypassed was used for PET acquisition, FBP with sinogram corre...

  7. PET reconstruction artifact can be minimized by using sinogram correction and filtered back-projection technique

    Directory of Open Access Journals (Sweden)

    Ashish Kumar Jha

    2014-01-01

    Full Text Available Filtered Back-Projection (FBP) has become an outdated image reconstruction technique in new-generation positron emission tomography (PET)/computed tomography (CT) scanners. Iterative reconstruction, used in all new-generation PET scanners, is a much improved reconstruction technique. Though only a well-calibrated PET system should be used for clinical imaging, in a few situations like ours, when a compromised PET scanner with one PET module bypassed was used for PET acquisition, FBP with sinogram correction proved to be the better reconstruction technique, minimizing the streak artifact present in the image reconstructed by the iterative technique.

  8. Two-dimensional water temperature reconstruction by filtered back-projection method

    International Nuclear Information System (INIS)

    The reconstruction of water temperature in combustion is realized by the tunable diode laser absorption spectroscopy technique. The H2O temperature distribution is modeled as a Gaussian function, ranging from 300 K to 1300 K. The Radon transform is used to simulate the experimental results. The temperature distribution is reconstructed from two temperature-dependent line strengths using the filtered back-projection method. The reconstructed temperature agrees well with the original model. Moreover, the influences of the number of projections and of random errors in the projections on the reconstruction are also studied. The simulation results indicate that decreasing the number of projections or increasing the noise increases the mean square error of the reconstructed temperature, deteriorating the reconstructed image. The temperature reconstruction cannot reveal the original temperature distribution when the number of projections is reduced to four. (authors)
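
    The simulation logic of this record can be sketched in a few lines under stated assumptions: a Gaussian temperature field is forward-projected with the Radon transform, noise is added, FBP reconstructs the field, and the mean square error is tracked as the number of projections changes. The spectroscopic step that converts two line strengths into temperature is omitted, and the field width and noise level are illustrative.

    ```python
    # Hedged sketch of the simulation logic described above: a Gaussian "temperature"
    # field is forward-projected with the Radon transform, noise is added, FBP
    # reconstructs the field, and the mean square error is tracked versus the number
    # of projections. The spectroscopic conversion of two line strengths into
    # temperature is omitted; the field width and noise level are illustrative.
    import numpy as np
    from skimage.transform import radon, iradon

    n = 128
    x, y = np.meshgrid(np.linspace(-1, 1, n), np.linspace(-1, 1, n))
    ambient = 300.0
    field = ambient + 1000.0 * np.exp(-(x**2 + y**2) / 0.08)   # 300 K to 1300 K

    def reconstruct(field, n_proj, noise_frac=0.0, seed=0):
        rng = np.random.default_rng(seed)
        theta = np.linspace(0.0, 180.0, n_proj, endpoint=False)
        # Project the field above ambient so it is ~zero outside the circle.
        sino = radon(field - ambient, theta=theta, circle=True)
        sino = sino + rng.normal(0.0, noise_frac * np.abs(sino).max(), sino.shape)
        return ambient + iradon(sino, theta=theta, circle=True)

    for n_proj in (4, 16, 64, 180):
        rec = reconstruct(field, n_proj, noise_frac=0.01)
        print(f"{n_proj:4d} projections, MSE = {np.mean((rec - field) ** 2):10.1f} K^2")
    ```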

  9. Comparison of adaptive statistical iterative and filtered back projection reconstruction techniques in brain CT

    Energy Technology Data Exchange (ETDEWEB)

    Ren, Qingguo, E-mail: renqg83@163.com [Department of Radiology, Hua Dong Hospital of Fudan University, Shanghai 200040 (China); Dewan, Sheilesh Kumar, E-mail: sheilesh_d1@hotmail.com [Department of Geriatrics, Hua Dong Hospital of Fudan University, Shanghai 200040 (China); Li, Ming, E-mail: minli77@163.com [Department of Radiology, Hua Dong Hospital of Fudan University, Shanghai 200040 (China); Li, Jianying, E-mail: Jianying.Li@med.ge.com [CT Imaging Research Center, GE Healthcare China, Beijing (China); Mao, Dingbiao, E-mail: maodingbiao74@163.com [Department of Radiology, Hua Dong Hospital of Fudan University, Shanghai 200040 (China); Wang, Zhenglei, E-mail: Williswang_doc@yahoo.com.cn [Department of Radiology, Shanghai Electricity Hospital, Shanghai 200050 (China); Hua, Yanqing, E-mail: cjr.huayanqing@vip.163.com [Department of Radiology, Hua Dong Hospital of Fudan University, Shanghai 200040 (China)

    2012-10-15

    Purpose: To compare image quality and visualization of normal structures and lesions in brain computed tomography (CT) with adaptive statistical iterative reconstruction (ASIR) and filtered back projection (FBP) reconstruction techniques at different X-ray tube current–time products. Materials and methods: In this IRB-approved prospective study, forty patients (nineteen men, twenty-one women; mean age 69.5 ± 11.2 years) underwent brain scans at different tube current–time products (300 and 200 mAs) on a 64-section multi-detector CT (GE, Discovery CT750 HD). Images were reconstructed with FBP and four levels of ASIR-FBP blending. Two radiologists (note that our hospital is renowned for its geriatric medicine department, and these two radiologists are more experienced in chronic cerebrovascular disease than in neoplastic disease, so cerebral tumors were not included in this study but are addressed in the discussion) assessed all the reconstructed images for visibility of normal structures, lesion conspicuity, image contrast and diagnostic confidence in a blinded and randomized manner. Volume CT dose index (CTDIvol) and dose-length product (DLP) were recorded. All data were analyzed using SPSS 13.0 statistical analysis software. Results: There was no statistically significant difference in image quality between 200 mAs with the 50% ASIR blending technique and 300 mAs with the FBP technique (p > .05), whereas a statistically significant difference (p < .05) was found between 200 mAs with FBP and 300 mAs with FBP. Conclusion: ASIR provided the same image quality and diagnostic ability in brain imaging with a greater than 30% dose reduction compared with the FBP reconstruction technique.

  10. Comparison of adaptive statistical iterative and filtered back projection reconstruction techniques in brain CT

    International Nuclear Information System (INIS)

    Purpose: To compare image quality and visualization of normal structures and lesions in brain computed tomography (CT) with adaptive statistical iterative reconstruction (ASIR) and filtered back projection (FBP) reconstruction techniques at different X-ray tube current–time products. Materials and methods: In this IRB-approved prospective study, forty patients (nineteen men, twenty-one women; mean age 69.5 ± 11.2 years) underwent brain scans at different tube current–time products (300 and 200 mAs) on a 64-section multi-detector CT (GE, Discovery CT750 HD). Images were reconstructed with FBP and four levels of ASIR-FBP blending. Two radiologists (note that our hospital is renowned for its geriatric medicine department, and these two radiologists are more experienced in chronic cerebrovascular disease than in neoplastic disease, so cerebral tumors were not included in this study but are addressed in the discussion) assessed all the reconstructed images for visibility of normal structures, lesion conspicuity, image contrast and diagnostic confidence in a blinded and randomized manner. Volume CT dose index (CTDIvol) and dose-length product (DLP) were recorded. All data were analyzed using SPSS 13.0 statistical analysis software. Results: There was no statistically significant difference in image quality between 200 mAs with the 50% ASIR blending technique and 300 mAs with the FBP technique (p > .05), whereas a statistically significant difference (p < .05) was found between 200 mAs with FBP and 300 mAs with FBP. Conclusion: ASIR provided the same image quality and diagnostic ability in brain imaging with a greater than 30% dose reduction compared with the FBP reconstruction technique

  11. CT coronary angiography: Image quality with sinogram-affirmed iterative reconstruction compared with filtered back-projection

    International Nuclear Information System (INIS)

    Aim: To investigate image quality and potential for radiation dose reduction using sinogram-affirmed iterative reconstruction (SAFIRE) at computed tomography (CT) coronary angiography (CTCA) compared with filtered back-projection (FBP) reconstruction. Materials and methods: A water phantom and 49 consecutive patients were scanned using a retrospectively electrocardiography (ECG)-gated CTCA protocol on a dual-source CT system. Image reconstructions were performed with both conventional FBP and SAFIRE. The SAFIRE series were reconstructed from the data of only one tube, simulating a 50% radiation dose reduction. Two blinded observers independently assessed the image quality of each coronary segment using a four-point scale and measured image noise (the standard deviation of Hounsfield values, SD), signal-to-noise ratio (SNR), and contrast-to-noise ratio (CNR). Radiation dose estimates were calculated. Results: In the water phantom, image noise decreased at the same ratio as the tube current increased for both reconstruction algorithms. Despite an estimated radiation dose reduction from 7.9 ± 2.8 to 4 ± 1.4 mSv, there was no significant difference in the SD and SNR within the aortic root and left ventricular chamber between the two reconstruction methods. There was also no significant difference in the image quality between the FBP and SAFIRE series. Conclusion: Compared with traditional FBP, there is potential for substantial radiation dose reduction at CTCA with use of SAFIRE, while maintaining similar diagnostic image quality

  12. Single Image Super-Resolution VIA Iterative Back Projection Based Canny Edge Detection and a Gabor Filter Prior

    Directory of Open Access Journals (Sweden)

    Rujul R Makwana

    2013-03-01

    Full Text Available Iterative back-projection (IBP) is a classical super-resolution method with low computational complexity that can be applied in real-time applications. This paper presents an effective novel single-image super-resolution approach to recover a high-resolution image from a single low-resolution input image. The approach is based on the iterative back-projection (IBP) method combined with Canny edge detection and a Gabor filter to recover high-frequency information. The method is applied to different natural grayscale images and compared with existing image super-resolution approaches. Simulation results show that the proposed algorithm enlarges the low-resolution image more accurately than previous approaches. The proposed algorithm increases the MSSIM and the PSNR and decreases the MSE compared to other existing algorithms, and also improves the visual quality of the enlarged images.
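
    A minimal sketch of plain iterative back-projection super-resolution is shown below; the Canny-edge and Gabor-filter priors that this paper adds are omitted, and the Gaussian-blur-plus-decimation degradation model, step size and iteration count are assumptions.

    ```python
    # Minimal, hedged sketch of plain iterative back-projection (IBP) super-resolution.
    # The Canny-edge and Gabor-filter priors that the paper adds are omitted; the
    # Gaussian blur + decimation degradation model, step size and iteration count
    # are assumptions.
    import numpy as np
    from scipy import ndimage

    def degrade(hr, scale, sigma=1.0):
        """Assumed imaging model: blur, then decimate."""
        return ndimage.gaussian_filter(hr, sigma)[::scale, ::scale]

    def iterative_back_projection(lr, scale=2, n_iter=30, step=1.0, sigma=1.0):
        hr = ndimage.zoom(lr, scale, order=3)                 # initial spline upscale
        for _ in range(n_iter):
            err = lr - degrade(hr, scale, sigma)              # residual in LR domain
            up = ndimage.zoom(err, scale, order=1)            # back-project: upsample...
            up = up[: hr.shape[0], : hr.shape[1]]             # (guard against rounding)
            hr = hr + step * ndimage.gaussian_filter(up, sigma)  # ...and re-blur
        return hr

    # Usage: recover a 2x enlarged image from a synthetic low-resolution input.
    rng = np.random.default_rng(0)
    truth = ndimage.gaussian_filter(rng.random((128, 128)), 2.0)
    lr = degrade(truth, 2)
    sr = iterative_back_projection(lr, scale=2)
    mse = np.mean((sr - truth) ** 2)
    ```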

  13. Comprehensive analysis of high-performance computing methods for filtered back-projection

    OpenAIRE

    Mendl, Christian B.; Eliuk, Steven; Noga, Michelle; Boulanger, Pierre

    2013-01-01

    This paper provides an extensive runtime, accuracy, and noise analysis of Computed Tomography (CT) reconstruction algorithms using various High-Performance Computing (HPC) frameworks such as "conventional" multi-core, multi-threaded CPUs, Compute Unified Device Architecture (CUDA), and DirectX or OpenGL graphics pipeline programming. The proposed algorithms exploit various built-in hardwired features of GPUs such as rasterization and texture filtering. We compare implementations of the Filt...

  14. Gamma-ray momentum reconstruction from Compton electron trajectories by filtered back-projection

    Energy Technology Data Exchange (ETDEWEB)

    Haefner, A., E-mail: ahaefner@berkeley.edu; Plimley, B.; Pavlovsky, R. [Department of Nuclear Engineering, University of California Berkeley, 4155 Etcheverry Hall, MC 1730, Berkeley, California 94720-1730 (United States); Gunter, D. [Applied Nuclear Physics, Lawrence Berkeley National Laboratory, 1 Cyclotron Road Berkeley, California 94720 (United States); Vetter, K. [Department of Nuclear Engineering, University of California Berkeley, 4155 Etcheverry Hall, MC 1730, Berkeley, California 94720-1730 (United States); Applied Nuclear Physics, Lawrence Berkeley National Laboratory, 1 Cyclotron Road Berkeley, California 94720 (United States)

    2014-11-03

    Gamma-ray imaging utilizing Compton scattering has traditionally relied on measuring coincident gamma-ray interactions to map directional information of the source distribution. This coincidence requirement makes it an inherently inefficient process. We present an approach to gamma-ray reconstruction from Compton scattering that requires only a single electron-tracking detector, thus removing the coincidence requirement. From the Compton-scattered electron momentum distribution, our algorithm analytically computes the incident photon's correlated direction and energy distributions. Because this method maps the source energy and location, it is useful in applications where prior information about the source distribution is unknown. We demonstrate this method with electron tracks measured in a scientific Si charge-coupled device. While the method was demonstrated with electron tracks in a Si-based detector, it is applicable to any detector that can measure electron direction and energy, or equivalently the electron momentum. For example, it can increase the sensitivity to obtain energy and direction in gas-based systems that suffer from limited efficiency.

  15. Evaluation of dose reduction and image quality in CT colonography: Comparison of low-dose CT with iterative reconstruction and routine-dose CT with filtered back projection

    Energy Technology Data Exchange (ETDEWEB)

    Nagata, Koichi [Kameda Medical Center, Department of Radiology, Kamogawa, Chiba (Japan); Jichi Medical University, Department of Radiology, Tochigi (Japan); National Cancer Center, Cancer Screening Technology Division, Research Center for Cancer Prevention and Screening, Tokyo (Japan); Fujiwara, Masanori; Mogi, Tomohiro; Iida, Nao [Kameda Medical Center Makuhari, Department of Radiology, Chiba (Japan); Kanazawa, Hidenori; Sugimoto, Hideharu [Jichi Medical University, Department of Radiology, Tochigi (Japan); Mitsushima, Toru [Kameda Medical Center Makuhari, Department of Gastroenterology, Chiba (Japan); Lefor, Alan T. [Jichi Medical University, Department of Surgery, Tochigi (Japan)

    2015-01-15

    To prospectively evaluate the radiation dose and image quality comparing low-dose CT colonography (CTC) reconstructed using different levels of iterative reconstruction techniques with routine-dose CTC reconstructed with filtered back projection. Following institutional ethics clearance and informed consent procedures, 210 patients underwent screening CTC using automatic tube current modulation for dual positions. Examinations were performed in the supine position with a routine-dose protocol and in the prone position, randomly applying four different low-dose protocols. Supine images were reconstructed with filtered back projection and prone images with iterative reconstruction. Two blinded observers assessed the image quality of endoluminal images. Image noise was quantitatively assessed by region-of-interest measurements. The mean effective dose in the supine series was 1.88 mSv using routine-dose CTC, compared to 0.92, 0.69, 0.57, and 0.46 mSv at four different low doses in the prone series (p < 0.01). Overall image quality and noise of low-dose CTC with iterative reconstruction were significantly improved compared to routine-dose CTC using filtered back projection. The lowest dose group had image quality comparable to routine-dose images. Low-dose CTC with iterative reconstruction reduces the radiation dose by 48.5 to 75.1 % without image quality degradation compared to routine-dose CTC with filtered back projection. (orig.)

  16. Chest computed tomography using iterative reconstruction vs filtered back projection (Part 1): evaluation of image noise reduction in 32 patients

    International Nuclear Information System (INIS)

    To assess noise reduction achievable with an iterative reconstruction algorithm. 32 consecutive chest CT angiograms were reconstructed with regular filtered back projection (FBP) (Group 1) and an iterative reconstruction technique (IRIS) with 3 (Group 2a) and 5 (Group 2b) iterations. Objective image noise was significantly reduced in Group 2a and Group 2b compared with FBP (p < 0.0001). There was a significant reduction in the level of subjective image noise in Group 2a compared with Group 1 images (p < 0.003), further reinforced on Group 2b images (Group 2b vs Group 1; p < 0.0001) (Group 2b vs Group 2a; p = 0.0006). The overall image quality scores significantly improved on Group 2a images compared with Group 1 images (p = 0.0081) and on Group 2b images compared with Group 2a images (p < 0.0001). Comparative analysis of individual CT features of mild lung infiltration showed improved conspicuity of ground glass attenuation (p < 0.0001), ill-defined micronodules (p = 0.0351) and emphysematous lesions (p < 0.0001) on Group 2a images, further improved on Group 2b images for ground glass attenuation (p < 0.0001), and emphysematous lesions (p = 0.0087). Compared with regular FBP, iterative reconstructions enable significant reduction of image noise without loss of diagnostic information, thus having the potential to decrease radiation dose during chest CT examinations. (orig.)

  17. Rapid mapping of visual receptive fields by filtered back projection: application to multi-neuronal electrophysiology and imaging.

    Science.gov (United States)

    Johnston, Jamie; Ding, Huayu; Seibel, Sofie H; Esposti, Federico; Lagnado, Leon

    2014-11-15

    Neurons in the visual system vary widely in the spatiotemporal properties of their receptive fields (RFs), and understanding these variations is key to elucidating how visual information is processed. We present a new approach for mapping RFs based on the filtered back projection (FBP), an algorithm used for tomographic reconstructions. To estimate RFs, a series of bars were flashed across the retina at pseudo-random positions and at a minimum of five orientations. We apply this method to retinal neurons and show that it can accurately recover the spatial RF and impulse response of ganglion cells recorded on a multi-electrode array. We also demonstrate its utility for in vivo imaging by mapping the RFs of an array of bipolar cell synapses expressing a genetically encoded Ca(2+) indicator. We find that FBP offers several advantages over the commonly used spike-triggered average (STA): (i) ON and OFF components of a RF can be separated; (ii) the impulse response can be reconstructed at sample rates of 125 Hz, rather than the refresh rate of a monitor; (iii) FBP reveals the response properties of neurons that are not evident using STA, including those that display orientation selectivity, or fire at low mean spike rates; and (iv) the FBP method is fast, allowing the RFs of all the bipolar cell synaptic terminals in a field of view to be reconstructed in under 4 min. Use of the FBP will benefit investigations of the visual system that employ electrophysiology or optical reporters to measure activity across populations of neurons. PMID:25172952
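
    The tomographic analogy can be sketched briefly: for a linear model neuron, the summed response to a thin bar at a given orientation and offset is a line integral of the spatial RF, so stacking bar responses into a sinogram and applying filtered back-projection recovers the RF map. The model neuron, noise level and five orientations below are illustrative assumptions, not the authors' experimental pipeline.

    ```python
    # Hedged sketch of the tomographic idea above: for a linear model neuron, the
    # response to a thin bar at orientation theta and offset s is a line integral of
    # the spatial RF, so bar responses stacked into a sinogram can be inverted with
    # filtered back-projection. The model RF, noise level and the five orientations
    # are illustrative assumptions, not the authors' experimental pipeline.
    import numpy as np
    from skimage.transform import radon, iradon

    n = 101
    xx, yy = np.meshgrid(np.linspace(-1, 1, n), np.linspace(-1, 1, n))
    # Ground-truth RF: an ON centre with a weaker, displaced OFF component.
    rf_true = (np.exp(-((xx - 0.1) ** 2 + yy ** 2) / 0.02)
               - 0.5 * np.exp(-((xx + 0.2) ** 2 + yy ** 2) / 0.05))

    theta = np.linspace(0.0, 180.0, 5, endpoint=False)        # minimum of 5 orientations
    responses = radon(rf_true, theta=theta, circle=True)      # simulated bar responses
    responses += np.random.default_rng(1).normal(0.0, 0.02 * responses.max(), responses.shape)
    rf_est = iradon(responses, theta=theta, circle=True)      # FBP estimate of the RF
    ```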

  18. Coronary CT angiography: Comparison of a novel iterative reconstruction with filtered back projection for reconstruction of low-dose CT—Initial experience

    International Nuclear Information System (INIS)

    Objective: To prospectively compare subjective and objective image quality in 20% tube current coronary CT angiography (cCTA) datasets between an iterative reconstruction algorithm (SAFIRE) and traditional filtered back projection (FBP). Materials and methods: Twenty patients underwent a prospectively ECG-triggered dual-step cCTA protocol using 2nd generation dual-source CT (DSCT). CT raw data was reconstructed using standard FBP at full-dose (Group 1a) and 80% tube current reduced low-dose (Group 1b). The low-dose raw data was additionally reconstructed using iterative raw data reconstruction (Group 2). Attenuation and image noise were measured in three regions of interest and signal-to-noise-ratio (SNR) as well as contrast-to-noise-ratio (CNR) was calculated. Subjective diagnostic image quality was evaluated using a 4-point Likert scale. Results: Mean image noise of Group 2 was lowered by 22% on average when compared to Group 1b (p < 0.0001–0.0033), while there were no significant differences in mean attenuation within the same anatomical regions. The lower image noise resulted in significantly higher SNR and CNR ratios in Group 2 compared to Group 1b (p < 0.0001–0.0232). Subjective image quality of Group 2 (1.88 ± 0.63) was also rated significantly higher when compared to Group 1b (1.58 ± 0.63, p = 0.004). Conclusions: Image quality of 80% tube current reduced iteratively reconstructed cCTA raw data is significantly improved when compared to standard FBP and consequently may improve the diagnostic accuracy of cCTA

  19. Coronary CT angiography: Comparison of a novel iterative reconstruction with filtered back projection for reconstruction of low-dose CT—Initial experience

    Energy Technology Data Exchange (ETDEWEB)

    Takx, Richard A.P. [Department of Radiology and Radiological Science, Medical University of South Carolina, Charleston, SC (United States); Department of Radiology, Maastricht University Medical Centre, Maastricht (Netherlands); Schoepf, U. Joseph, E-mail: schoepf@musc.edu [Department of Radiology and Radiological Science, Medical University of South Carolina, Charleston, SC (United States); Division of Cardiology, Department of Medicine, Medical University of South Carolina, Charleston, SC (United States); Moscariello, Antonio [Department of Radiology and Radiological Science, Medical University of South Carolina, Charleston, SC (United States); Department of Radiology, Policlinico Universitario Campus Bio-Medico, Rome (Italy); Das, Marco [Department of Radiology, Maastricht University Medical Centre, Maastricht (Netherlands); Rowe, Garrett [Department of Radiology and Radiological Science, Medical University of South Carolina, Charleston, SC (United States); Schoenberg, Stefan O.; Fink, Christian [Institute of Clinical Radiology and Nuclear Medicine, University Medical Center Mannheim, Medical Faculty Mannheim, Heidelberg University (Germany); Henzler, Thomas [Department of Radiology and Radiological Science, Medical University of South Carolina, Charleston, SC (United States); Institute of Clinical Radiology and Nuclear Medicine, University Medical Center Mannheim, Medical Faculty Mannheim, Heidelberg University (Germany)

    2013-02-15

    Objective: To prospectively compare subjective and objective image quality in 20% tube current coronary CT angiography (cCTA) datasets between an iterative reconstruction algorithm (SAFIRE) and traditional filtered back projection (FBP). Materials and methods: Twenty patients underwent a prospectively ECG-triggered dual-step cCTA protocol using 2nd generation dual-source CT (DSCT). CT raw data was reconstructed using standard FBP at full-dose (Group 1a) and 80% tube current reduced low-dose (Group 1b). The low-dose raw data was additionally reconstructed using iterative raw data reconstruction (Group 2). Attenuation and image noise were measured in three regions of interest and signal-to-noise-ratio (SNR) as well as contrast-to-noise-ratio (CNR) was calculated. Subjective diagnostic image quality was evaluated using a 4-point Likert scale. Results: Mean image noise of Group 2 was lowered by 22% on average when compared to Group 1b (p < 0.0001–0.0033), while there were no significant differences in mean attenuation within the same anatomical regions. The lower image noise resulted in significantly higher SNR and CNR ratios in Group 2 compared to Group 1b (p < 0.0001–0.0232). Subjective image quality of Group 2 (1.88 ± 0.63) was also rated significantly higher when compared to Group 1b (1.58 ± 0.63, p = 0.004). Conclusions: Image quality of 80% tube current reduced iteratively reconstructed cCTA raw data is significantly improved when compared to standard FBP and consequently may improve the diagnostic accuracy of cCTA.

  20. Why do commercial CT scanners still employ traditional, filtered back-projection for image reconstruction?

    International Nuclear Information System (INIS)

    Despite major advances in x-ray sources, detector arrays, gantry mechanical design and especially computer performance, one component of computed tomography (CT) scanners has remained virtually constant for the past 25 years—the reconstruction algorithm. Fundamental advances have been made in the solution of inverse problems, especially tomographic reconstruction, but these works have not been translated into clinical and related practice. The reasons are not obvious and seldom discussed. This review seeks to examine the reasons for this discrepancy and provides recommendations on how it can be resolved. We take the example of the field of compressive sensing (CS), summarizing this new area of research from the perspective of practical medical physicists and explaining the disconnect between theoretical and application-oriented research. Using a few issues specific to CT, which engineers have addressed in very specific ways, we try to distill the mathematical problem underlying each of these issues, with the hope of demonstrating that interesting mathematical problems of general importance can result from in-depth analysis of specific issues. We then sketch some unconventional CT-imaging designs that have the potential to impact CT applications if the link between applied mathematicians and engineers/physicists were stronger. Finally, we close with some observations on how the link could be strengthened. There is, we believe, an important opportunity to rapidly improve the performance of CT and related tomographic imaging techniques by addressing these issues. (topical review)

  1. Ultra low-dose chest CT using filtered back projection: Comparison of 80-, 100- and 120 kVp protocols in a prospective randomized study

    International Nuclear Information System (INIS)

    Highlights:
    • Filtered back projection enables acceptable image quality for chest CT examinations at 0.9 mGy (estimated effective dose of 0.5 mSv) in selected sizes of patients.
    • Lesion detection in the lung parenchyma (such as solid non-calcified lung nodules) is optimal at 0.9 mGy, with limited visualization of thyroid nodules in FBP images.
    • Further dose reduction down to 0.4 mGy is possible for most patients undergoing follow-up chest CT for evaluation of larger lung nodules and GGOs.
    • Our results may help set the reference ALARA dose for chest CT examinations reconstructed with the filtered back projection technique, using the minimum possible radiation dose with acceptable image quality and lesion detection.
    Abstract: Purpose: To assess lesion detection and diagnostic image quality of the filtered back projection (FBP) reconstruction technique in ultra low-dose chest CT examinations. Methods and materials: In this IRB-approved, ongoing prospective clinical study, 116 CT image series at four different radiation doses were acquired in 29 patients (age, 57–87 years; F:M, 15:12; BMI 16–32 kg/m2). All patients provided written informed consent for the acquisition of additional ultra low-dose (ULD) series on a 256-slice MDCT (iCT, Philips Healthcare). In addition to their clinical standard-dose chest CT (SD, 120 kV, mean CTDIvol 6 ± 1 mGy), ULD-CT was subsequently performed at three dose levels (0.9 mGy [120 kV], 0.5 mGy [100 kV] and 0.2 mGy [80 kV]). Images were reconstructed with FBP (2.5 mm * 1.25 mm), resulting in four stacks: SD-FBP (reference standard), FBP0.9, FBP0.5, and FBP0.2. Four thoracic radiologists from two teaching hospitals independently evaluated the data for lesion detection and visibility of small structures. Friedman's non-parametric test with post hoc Dunn's test was used for data analysis. Results: Interobserver agreement between radiologists was substantial (k = 0.6–0.8). In the pooled analysis, 146 pulmonary lesions (27 ground-glass opacities, 64 solid lung nodules, 7 consolidations, 27 emphysema) and 347 mediastinal/soft tissue lesions (87 mediastinal, 46 hilar and 62 axillary lymph nodes, and 11 mediastinal masses) were evaluated. Compared to SD-FBP, 100% of pulmonary lesions were seen with FBP0.9, up to 81% with FBP0.5 (missed: 4), and up to 30% with FBP0.2 images (missed: 16). Compared to SD-FBP, all enlarged mediastinal lymph nodes were seen on FBP0.9 images. All mediastinal masses (>2 cm, 11/11) were seen equivalently to SD-FBP images at 0.9 mGy. Across all patient sizes, FBP0.9 images provided optimal visualization of lung findings; they were optimal for mediastinal soft tissues only in non-obese patients. Conclusion: The filtered back projection technique allows optimal lesion detection and acceptable image quality for chest CT examinations at a CTDIvol of 0.9 mGy for lung and mediastinal findings in selected sizes of patients

  2. Half-dose abdominal CT with sinogram-affirmed iterative reconstruction technique in children - comparison with full-dose CT with filtered back projection

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Minwook; Kim, Myung-Joon; Lee, Mi-Jung [Yonsei University College of Medicine, Department of Radiology and Research Institute of Radiological Science, Severance Children's Hospital, 50 Yonsei-ro, Seodaemun-gu, Seoul (Korea, Republic of); Han, Kyung Hwa [Yonsei University College of Medicine, Gangnam Medical Research Center, Biostatistics Collaboration Unit, Seoul (Korea, Republic of)

    2014-07-17

    Iterative reconstruction can be helpful to reduce radiation dose while maintaining image quality. However, this technique has not been fully evaluated in children undergoing abdominal CT. To compare objective and subjective image quality between half-dose images reconstructed with iterative reconstruction at iteration strength levels 1 to 5 (half-S1 to half-S5 studies) and full-dose images reconstructed with filtered back projection (full studies) in pediatric abdominal CT. Twenty-one children (M:F = 13:8; mean age 8.2 ± 5.7 years) underwent dual-source abdominal CT (mean effective dose 4.8 ± 2.1 mSv). Objective image quality was evaluated as noise. Subjective image quality analysis was performed comparing each half study to the full study for noise, sharpness, artifact and diagnostic acceptability. Both objective and subjective image noise decreased with increasing iteration strength. Half-S4 and -S5 studies showed objective image noise similar to or lower than that of full studies. Half-S2 and -S3 studies produced the greatest sharpness, and half-S5 studies were the worst due to a blocky appearance. Full and half studies did not differ in artifacts. Half-S3 studies showed the best diagnostic acceptability. Half-S4 and -S5 studies objectively, and half-S3 studies subjectively, showed image quality comparable to full studies in pediatric abdominal CT. (orig.)

  3. Quantitative analysis of emphysema and airway measurements according to iterative reconstruction algorithms: comparison of filtered back projection, adaptive statistical iterative reconstruction and model-based iterative reconstruction

    Energy Technology Data Exchange (ETDEWEB)

    Choo, Ji Yung [Seoul National University Medical Research Center, Department of Radiology, Seoul National University College of Medicine, and Institute of Radiation Medicine, Seoul (Korea, Republic of); Korea University Ansan Hospital, Ansan-si, Department of Radiology, Gyeonggi-do (Korea, Republic of); Goo, Jin Mo; Park, Chang Min; Park, Sang Joon [Seoul National University Medical Research Center, Department of Radiology, Seoul National University College of Medicine, and Institute of Radiation Medicine, Seoul (Korea, Republic of); Seoul National University, Cancer Research Institute, Seoul (Korea, Republic of); Lee, Chang Hyun; Shim, Mi-Suk [Seoul National University Medical Research Center, Department of Radiology, Seoul National University College of Medicine, and Institute of Radiation Medicine, Seoul (Korea, Republic of)

    2014-04-15

    To evaluate filtered back projection (FBP) and two iterative reconstruction (IR) algorithms and their effects on the quantitative analysis of lung parenchyma and airway measurements on computed tomography (CT) images. Low-dose chest CT examinations obtained in 281 adult patients were reconstructed using three algorithms: FBP, adaptive statistical IR (ASIR) and model-based IR (MBIR). Measurements from each dataset were compared: total lung volume, emphysema index (EI), airway measurements of lumen and wall area, and average wall thickness. The accuracy of the airway measurements of each algorithm was also evaluated using an airway phantom. The EI using a threshold of -950 HU was significantly different among the three algorithms, in decreasing order of FBP (2.30 %), ASIR (1.49 %) and MBIR (1.20 %) (P < 0.01). Wall thickness was also significantly different among the three algorithms, with FBP (2.09 mm) demonstrating thicker walls than ASIR (2.00 mm) and MBIR (1.88 mm) (P < 0.01). Airway phantom analysis revealed that MBIR gave the most accurate airway measurements. The three algorithms presented different EIs and wall thicknesses, decreasing in the order of FBP, ASIR and MBIR. Thus, care should be taken in selecting the appropriate IR algorithm for quantitative analysis of the lung. (orig.)

  4. Half-dose abdominal CT with sinogram-affirmed iterative reconstruction technique in children - comparison with full-dose CT with filtered back projection

    International Nuclear Information System (INIS)

    Iterative reconstruction can be helpful to reduce radiation dose while maintaining image quality. However, this technique has not been fully evaluated in children undergoing abdominal CT. To compare objective and subjective image quality between half-dose images reconstructed with iterative reconstruction at iteration strength levels 1 to 5 (half-S1 to half-S5 studies) and full-dose images reconstructed with filtered back projection (full studies) in pediatric abdominal CT. Twenty-one children (M:F = 13:8; mean age 8.2 ± 5.7 years) underwent dual-source abdominal CT (mean effective dose 4.8 ± 2.1 mSv). Objective image quality was evaluated as noise. Subjective image quality analysis was performed comparing each half study to the full study for noise, sharpness, artifact and diagnostic acceptability. Both objective and subjective image noise decreased with increasing iteration strength. Half-S4 and -S5 studies showed objective image noise similar to or lower than that of full studies. Half-S2 and -S3 studies produced the greatest sharpness, and half-S5 studies were the worst due to a blocky appearance. Full and half studies did not differ in artifacts. Half-S3 studies showed the best diagnostic acceptability. Half-S4 and -S5 studies objectively, and half-S3 studies subjectively, showed image quality comparable to full studies in pediatric abdominal CT. (orig.)

  5. Adaptive statistical iterative reconstruction versus filtered back projection in the same patient: 64 channel liver CT image quality and patient radiation dose

    Energy Technology Data Exchange (ETDEWEB)

    Mitsumori, Lee M.; Shuman, William P.; Busey, Janet M.; Kolokythas, Orpheus; Koprowicz, Kent M. [University of Washington School of Medicine, Department of Radiology, Seattle, WA (United States)

    2012-01-15

    To compare routine dose liver CT reconstructed with filtered back projection (FBP) versus low dose images reconstructed with FBP and adaptive statistical iterative reconstruction (ASIR). In this retrospective study, patients had a routine dose protocol reconstructed with FBP, and again within 17 months (median 6.1 months), had a low dose protocol reconstructed twice, with FBP and ASIR. These reconstructions were compared for noise, image quality, and radiation dose. Nineteen patients were included (12 male, mean age 58). Noise was significantly lower in low dose images reconstructed with ASIR compared to routine dose images reconstructed with FBP (liver: p < 0.05, aorta: p < 0.001). Low dose FBP images were scored significantly lower for subjective image quality than low dose ASIR (2.1 ± 0.5, 3.2 ± 0.8, p < 0.001). There was no difference in subjective image quality scores between routine dose FBP images and low dose ASIR images (3.6 ± 0.5, 3.2 ± 0.8, NS). Radiation dose was 41% less for the low dose protocol (4.4 ± 2.4 mSv versus 7.5 ± 5.5 mSv, p < 0.05). Our initial results suggest low dose CT images reconstructed with ASIR may have lower measured noise, similar image quality, yet significantly less radiation dose compared with higher dose images reconstructed with FBP. (orig.)

  6. Chest computed tomography using iterative reconstruction vs filtered back projection (Part 2): image quality of low-dose CT examinations in 80 patients

    International Nuclear Information System (INIS)

    To evaluate the image quality of an iterative reconstruction algorithm (IRIS) in low-dose chest CT in comparison with standard-dose filtered back projection (FBP) CT. Eighty consecutive patients referred for a follow-up chest CT examination underwent a low-dose CT examination (Group 2) under technical conditions similar to those of the initial examination (Group 1), except for the milliamperage selection and the replacement of regular FBP reconstruction by iterative reconstructions using three (Group 2a) and five (Group 2b) iterations. Despite a mean decrease of 35.5% in the dose-length product, there was no statistically significant difference between Group 2a and Group 1 in objective noise, signal-to-noise (SNR) and contrast-to-noise (CNR) ratios, or the distribution of overall image quality scores. Compared to Group 1, objective image noise in Group 2b was significantly reduced, with increased SNR and CNR and a trend towards improved image quality. Iterative reconstruction using three iterations provides image quality similar to that of the conventionally used FBP reconstruction at 35% less dose, thus enabling dose reduction without loss of diagnostic information. According to our preliminary results, dose reductions even greater than 35% may be feasible by using more than three iterations. (orig.)

  7. Quantitative analysis of emphysema and airway measurements according to iterative reconstruction algorithms: comparison of filtered back projection, adaptive statistical iterative reconstruction and model-based iterative reconstruction

    International Nuclear Information System (INIS)

    To evaluate filtered back projection (FBP) and two iterative reconstruction (IR) algorithms and their effects on the quantitative analysis of lung parenchyma and airway measurements on computed tomography (CT) images. Low-dose chest CT examinations obtained in 281 adult patients were reconstructed using three algorithms: FBP, adaptive statistical IR (ASIR) and model-based IR (MBIR). Measurements from each dataset were compared: total lung volume, emphysema index (EI), airway measurements of lumen and wall area, and average wall thickness. The accuracy of the airway measurements of each algorithm was also evaluated using an airway phantom. The EI using a threshold of -950 HU was significantly different among the three algorithms, in decreasing order of FBP (2.30 %), ASIR (1.49 %) and MBIR (1.20 %) (P < 0.01). Wall thickness was also significantly different among the three algorithms, with FBP (2.09 mm) demonstrating thicker walls than ASIR (2.00 mm) and MBIR (1.88 mm) (P < 0.01). Airway phantom analysis revealed that MBIR gave the most accurate airway measurements. The three algorithms presented different EIs and wall thicknesses, decreasing in the order of FBP, ASIR and MBIR. Thus, care should be taken in selecting the appropriate IR algorithm for quantitative analysis of the lung. (orig.)

  8. Local detection of prostate cancer by positron emission tomography with 2-fluorodeoxyglucose comparison of filtered back projection and iterative reconstruction with segmented attenuation correction

    International Nuclear Information System (INIS)

    To compare filtered back projection (FBP) and iterative reconstruction with segmented attenuation correction (IRSAC) for local imaging of prostate cancer by positron emission tomography with 2-fluorodeoxyglucose (FDG-PET). Thirteen patients with primary (n=7) or recurrent (n=6) prostate cancer, who had increased uptake in the prostate on FDG-PET performed without urinary catheterization and contemporaneous biopsy confirming the presence of active tumor in the prostate, were retrospectively identified. Two independent nuclear medicine physicians separately rated FBP and IRSAC images for visualization of prostatic activity on a 4-point scale. Results were compared using biopsy and cross-sectional imaging findings as the standard of reference. IRSAC images were significantly better than FBP in terms of visualization of prostatic activity in 12 of 13 patients, and were equivalent in 1 patient (p<0.001, Wilcoxon signed ranks test). In particular, 2 foci of tumor activity in 2 different patients seen on IRSAC images were not visible on FBP images. In 11 patients who had a gross tumor mass evident on cross-sectional imaging, there was good agreement between PET and cross-sectional anatomic imaging with respect to tumor localization. In selected patients, cancer can be imaged within the prostate using FDG-PET, and IRSAC is superior to FBP in image reconstruction for local tumor visualization

  9. Forward problem solution as the operator of filtered and back projection matrix to reconstruct the various method of data collection and the object element model in electrical impedance tomography

    Energy Technology Data Exchange (ETDEWEB)

    Ain, Khusnul [Engineering Physics Program, ITB, Bandung - Indonesia (Indonesia); Physics Department - Airlangga University, Surabaya – Indonesia, khusnulainunair@yahoo.com (Indonesia); Kurniadi, Deddy; Suprijanto [Engineering Physics Program, ITB, Bandung - Indonesia (Indonesia); Santoso, Oerip [Informatics Program, ITB, Bandung - Indonesia (Indonesia); Wibowo, Arif [Physics Department - Airlangga University, Surabaya – Indonesia, khusnulainunair@yahoo.com (Indonesia)

    2015-04-16

    Back projection reconstruction has been implemented to obtain dynamic images in electrical impedance tomography. However, the implementation has so far been limited to the adjacent data collection method and a circular object element model. This study aims to develop the back projection approach into a reconstruction method with high speed, accuracy, and flexibility that can be used for various data collection methods and object element models. The proposed method uses the forward problem solution as the operator of the filtered and back projection matrix. This is done through a simulation study of several data collection methods and various object element models. The results indicate that the developed method is capable of producing images quickly and accurately for reconstruction with the various data collection methods and object element models.

  10. Forward problem solution as the operator of filtered and back projection matrix to reconstruct the various method of data collection and the object element model in electrical impedance tomography

    International Nuclear Information System (INIS)

    Back projection reconstruction has been implemented to obtain dynamic images in electrical impedance tomography. However, the implementation has so far been limited to the adjacent data collection method and a circular object element model. This study aims to develop the back projection approach into a reconstruction method with high speed, accuracy, and flexibility that can be used for various data collection methods and object element models. The proposed method uses the forward problem solution as the operator of the filtered and back projection matrix. This is done through a simulation study of several data collection methods and various object element models. The results indicate that the developed method is capable of producing images quickly and accurately for reconstruction with the various data collection methods and object element models

  11. Image quality and radiation dose of low dose coronary CT angiography in obese patients: Sinogram affirmed iterative reconstruction versus filtered back projection

    International Nuclear Information System (INIS)

    Purpose: To investigate the image quality and radiation dose of low radiation dose CT coronary angiography (CTCA) using sinogram affirmed iterative reconstruction (SAFIRE) compared with standard dose CTCA using filtered back-projection (FBP) in obese patients. Materials and methods: Seventy-eight consecutive obese patients were randomized into two groups and scanned using a prospectively ECG-triggered step-and-shoot (SAS) CTCA protocol on a dual-source CT scanner. Thirty-nine patients (protocol A) were examined using a routine radiation dose protocol at 120 kV, with images reconstructed with FBP. Thirty-nine patients (protocol B) were examined using a low dose protocol at 100 kV, with images reconstructed with SAFIRE. Two blinded observers independently assessed the image quality of each coronary segment using a 4-point scale (1 = non-diagnostic, 4 = excellent) and measured the objective parameters image noise, signal-to-noise ratio (SNR), and contrast-to-noise ratio (CNR). Radiation dose was calculated. Results: The coronary artery image quality scores, image noise, SNR and CNR were not significantly different between protocols A and B (all p > 0.05), with image quality scores of 3.51 ± 0.70 versus 3.55 ± 0.47, respectively. The effective radiation dose was significantly lower in protocol B (4.41 ± 0.83 mSv) than in protocol A (8.83 ± 1.74 mSv, p < 0.01). Conclusion: Compared with standard dose CTCA using FBP, low dose CTCA using SAFIRE can maintain diagnostic image quality with a 50% reduction of radiation dose.

  12. Dose reduction in computed tomography of the chest. Image quality of iterative reconstructions at a 50% radiation dose compared to filtered back projection at a 100% radiation dose

    International Nuclear Information System (INIS)

    Purpose: The aim of this study was to evaluate the potential of iterative reconstruction (IR) in chest computed tomography (CT) to reduce radiation exposure. The qualitative and quantitative image quality of standard full-dose reconstructions with filtered back projection (FBP) and of half-dose (HD) chest CT data reconstructed with FBP and IR was assessed. Materials and Methods: 52 consecutive patients underwent contrast-enhanced chest CT on a dual-source CT system at 120 kV with automatic exposure control. The tube current was split equally between the two tube-detector systems. For the HD datasets, only data from one tube-detector system were utilized. Thus, full-dose (FD) and HD data were available for each patient from a single scan. Three datasets were reconstructed from the raw data: standard FD images with FBP, which served as the reference, and HD images with FBP and with IR. Objective image quality analysis was performed by measuring the image noise in tissue and air. Subjective image quality was evaluated by 2 radiologists according to European guidelines. Additional assessment of artifacts, lesion conspicuity and edge sharpness was performed. Results: Image noise did not differ significantly between HD-IR and FD-FBP (p = 0.254) but increased substantially in HD-FBP (p < 0.001). No statistically significant differences were found between HD-IR and FD-FBP for the reproduction of anatomical and pathological structures, including subsegmental bronchi and bronchioli. The image quality of HD-FBP was rated inferior because of increased noise. Conclusion: A 50% dose reduction in contrast-enhanced chest CT is feasible without a loss of diagnostic confidence if IR is used for image data reconstruction. Iterative reconstruction is another powerful tool to reduce radiation exposure and can be combined with other dose-saving techniques. (orig.)

  13. Adaptive iterative dose reduction algorithm in CT: Effect on image quality compared with filtered back projection in body phantoms of different sizes

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Milim; Lee, Jeong Min; Son, Hyo Shin; Han, Joon Koo; Choi, Byung Ihn [College of Medicine, Seoul National University, Seoul (Korea, Republic of); Yoon, Jeong Hee; Choi, Jin Woo [Dept. of Radiology, Seoul National University Hospital, Seoul (Korea, Republic of)

    2014-04-15

    To evaluate the impact of the adaptive iterative dose reduction (AIDR) three-dimensional (3D) algorithm in CT on noise reduction and image quality compared to the filtered back projection (FBP) algorithm, and to compare the effectiveness of AIDR 3D on noise reduction according to body habitus using phantoms of different sizes. Three different-sized phantoms with diameters of 24 cm, 30 cm, and 40 cm were built up using the American College of Radiology CT accreditation phantom and layers of pork belly fat. Each phantom was scanned eight times using different mAs. Images were reconstructed using FBP and three different strengths of AIDR 3D. The image noise, contrast-to-noise ratio (CNR) and signal-to-noise ratio (SNR) of the phantom were assessed. Two radiologists assessed the image quality of the 4 image sets in consensus. The effectiveness of AIDR 3D on noise reduction compared with FBP was also compared according to phantom size. Adaptive iterative dose reduction 3D significantly reduced the image noise compared with FBP and enhanced the SNR and CNR (p < 0.05) with improved image quality (p < 0.05). When a stronger reconstruction algorithm was used, a greater increase in SNR and CNR as well as greater noise reduction was achieved (p < 0.05). The noise reduction effect of AIDR 3D was significantly greater in the 40-cm phantom than in the 24-cm or 30-cm phantoms (p < 0.05). The AIDR 3D algorithm is effective in reducing image noise and in improving image-quality parameters compared with the FBP algorithm, and its effectiveness may increase as the phantom size increases.

  14. Dose reduction in chest CT: Comparison of the adaptive iterative dose reduction 3D, adaptive iterative dose reduction, and filtered back projection reconstruction techniques

    International Nuclear Information System (INIS)

    Objectives: To assess the effectiveness of adaptive iterative dose reduction (AIDR) and AIDR 3D in improving the image quality in low-dose chest CT (LDCT). Materials and methods: Fifty patients underwent standard-dose chest CT (SDCT) and LDCT simultaneously, performed under automatic exposure control with noise indices of 19 and 38 (for a 2-mm slice thickness), respectively. The SDCT images were reconstructed with filtered back projection (SDCT-FBP images), and the LDCT images with FBP, AIDR and AIDR 3D (LDCT-FBP, LDCT-AIDR and LDCT-AIDR 3D images, respectively). On all the 200 lung and 200 mediastinal image series, objective image noise and signal-to-noise ratio (SNR) were measured in several regions, and two blinded radiologists independently assessed the subjective image quality. Wilcoxon's signed rank sum test with Bonferroni's correction was used for the statistical analyses. Results: The mean dose reduction in LDCT was 64.2% as compared with the dose in SDCT. LDCT-AIDR 3D images showed significantly reduced objective noise and significantly increased SNR in all regions as compared to the SDCT-FBP, LDCT-FBP and LDCT-AIDR images (all, P ≤ 0.003). In all assessments of the image quality, LDCT-AIDR 3D images were superior to LDCT-AIDR and LDCT-FBP images. The overall diagnostic acceptability of both the lung and mediastinal LDCT-AIDR 3D images was comparable to that of the lung and mediastinal SDCT-FBP images. Conclusions: AIDR 3D is superior to AIDR. Intra-individual comparisons between SDCT and LDCT suggest that AIDR 3D allows a 64.2% reduction of the radiation dose as compared to SDCT, by substantially reducing the objective image noise and increasing the SNR, while maintaining the overall diagnostic acceptability.

  15. Impact of adaptive iterative dose reduction (AIDR) 3D on low-dose abdominal CT - Comparison with routine-dose CT using filtered back projection

    Energy Technology Data Exchange (ETDEWEB)

    Matsuki, Mitsuru; Murakami, Takamichi [Dept. of Radiology, Kinki Univ. School of Medicine, Osaka (Japan)], e-mail: rad053@poh.osaka-med.ac.jp; Juri, Hiroshi; Yoshikawa, Shushi; Narumi, Yoshifumi [Dept. of Radiology, Osaka Medical Coll., Osaka (Japan)

    2013-10-15

    Background: While CT is widely used in medical practice, it is a substantial source of radiation exposure, which is associated with an increased lifetime risk of cancer. Therefore, concern about dose reduction in CT examinations is increasing, and iterative reconstruction algorithms, which allow for dose reduction by compensating for image noise during image reconstruction, have been developed. Purpose: To investigate the performance of low-dose abdominal CT using adaptive iterative dose reduction 3D (AIDR 3D) compared to routine-dose CT using filtered back projection (FBP). Material and Methods: Fifty-eight patients underwent both routine-dose CT scans using FBP and low-dose CT scans using AIDR 3D in the abdomen. The image noise levels, signal-to-noise ratios (SNRs), and contrast-to-noise ratios (CNRs) of the aorta, portal vein, liver, and pancreas were measured and compared for both scans. Visual evaluations were performed. The volume CT dose index (CTDIvol) was measured. Results: Image noise levels on low-dose CT images using AIDR 3D were significantly lower than, or not significantly different from, those on routine-dose CT images using FBP when reviewing the data on the basis of all patients and the three BMI groups. SNRs and CNRs on low-dose CT images using AIDR 3D were significantly higher than, or not significantly different from, those on routine-dose CT images using FBP when reviewing the data on the basis of all patients and the three BMI groups. In the visual evaluation of the images, there were no statistically significant differences between the scans in any organ, independently of BMI. The average CTDIvol at routine-dose and low-dose CT was 21.4 and 10.8 mGy, respectively. Conclusion: Low-dose abdominal CT using AIDR 3D allows for an approximately 50% reduction in radiation dose without degradation of image quality compared to routine-dose CT using FBP, independently of BMI.

  16. Impact of hybrid iterative reconstruction on Agatston coronary artery calcium scores in comparison to filtered back projection in native cardiac CT

    International Nuclear Information System (INIS)

    To investigate whether the effects of hybrid iterative reconstruction (HIR) on coronary artery calcium (CAC) measurements using the Agatston score lead to changes in the assignment of patients to cardiovascular risk groups compared to filtered back projection (FBP). 68 patients (mean age 61.5 years; 48 male; 20 female) underwent prospectively ECG-gated, non-enhanced, cardiac 256-MSCT for coronary calcium scoring. Scanning parameters were as follows: tube voltage, 120 kV; mean tube current-time product, 63.67 mAs (50 - 150 mAs); collimation, 2 x 128 x 0.625 mm. Images were reconstructed with FBP and with HIR at all levels (L1 to L7). Two independent readers measured Agatston scores of all reconstructions and assigned patients to cardiovascular risk groups. Scores of HIR and FBP reconstructions were correlated (Spearman). Interobserver agreement and variability were assessed with kappa statistics and Bland-Altman plots. Agatston scores of HIR reconstructions were closely correlated with FBP reconstructions (L1, R = 0.9996; L2, R = 0.9995; L3, R = 0.9991; L4, R = 0.986; L5, R = 0.9986; L6, R = 0.9987; and L7, R = 0.9986). In comparison to FBP, HIR reduced Agatston scores to between 97% (L1) and 87.4% (L7) of the FBP values. Using HIR iterations L1-L3, all patients were assigned to identical risk groups as after FBP reconstruction. In 5.4% of patients the risk group after HIR with the maximum iteration level was different from the group after FBP reconstruction. There was an excellent correlation of Agatston scores after HIR and FBP, with identical risk group assignment at levels 1 - 3 for all patients. Hence it appears that the application of HIR in routine calcium scoring does not entail any disadvantages. Thus, future studies are needed to demonstrate whether HIR is a reliable method for reducing radiation dose in coronary calcium scoring.
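
    For readers unfamiliar with the scoring procedure compared in this record, the following is a minimal sketch of how an Agatston-style score can be computed per axial slice. It assumes the standard Agatston conventions (130 HU threshold, density weights of 1-4, a minimum lesion area of about 1 mm²) and that SciPy is available for connected-component labelling; the function and variable names are illustrative and this is not the implementation used in the study.

        import numpy as np
        from scipy import ndimage  # assumed available for connected-component labelling

        def agatston_slice_score(hu_slice, pixel_area_mm2, min_area_mm2=1.0):
            """Agatston contribution of one axial slice (standard 130-HU threshold)."""
            mask = hu_slice >= 130                      # calcium candidate pixels
            labels, n_lesions = ndimage.label(mask)     # group them into lesions
            score = 0.0
            for lesion in range(1, n_lesions + 1):
                lesion_mask = labels == lesion
                area = lesion_mask.sum() * pixel_area_mm2
                if area < min_area_mm2:                 # ignore tiny noise specks
                    continue
                peak = hu_slice[lesion_mask].max()      # peak attenuation sets the weight
                if peak >= 400:
                    weight = 4
                elif peak >= 300:
                    weight = 3
                elif peak >= 200:
                    weight = 2
                else:
                    weight = 1                          # 130-199 HU
                score += area * weight
            return score

        # Total score = sum of agatston_slice_score(...) over all axial slices.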

  17. Iterative reconstruction technique vs filter back projection: utility for quantitative bronchial assessment on low-dose thin-section MDCT in patients with/without chronic obstructive pulmonary disease

    International Nuclear Information System (INIS)

    The aim of this study was to evaluate the utility of the iterative reconstruction (IR) technique for quantitative bronchial assessment during low-dose computed tomography (CT) as a substitute for standard-dose CT in patients with/without chronic obstructive pulmonary disease. Fifty patients (mean age, 69.2; mean % predicted FEV1, 79.4) underwent standard-dose CT (150 mAs) and low-dose CT (25 mAs). Except for tube current, the imaging parameters were identical for both protocols. Standard-dose CT was reconstructed using filtered back-projection (FBP), and low-dose CT was reconstructed using IR and FBP. For quantitative bronchial assessment, the wall area percentage (WA%) of the sub-segmental bronchi and the airway luminal volume percentage (LV%) from the main bronchus to the peripheral bronchi were acquired in each dataset. The correlation and agreement of WA% and LV% between standard-dose CT and both low-dose CTs were statistically evaluated. WA% and LV% between standard-dose CT and both low-dose CTs were significantly correlated (r > 0.77, p < 0.00001); however, only the LV% agreement between standard-dose CT and low-dose CT reconstructed with IR was moderate (concordance correlation coefficient = 0.93); the other agreements were poor (concordance correlation coefficient < 0.90). Quantitative bronchial assessment via low-dose CT has potential as a substitute for standard-dose CT by using IR and airway luminal volumetry techniques. • Quantitative bronchial assessment of COPD using low-dose CT is possible. (orig.)

  18. Computed tomography of the cervical spine: comparison of image quality between a standard-dose and a low-dose protocol using filtered back-projection and iterative reconstruction

    International Nuclear Information System (INIS)

    To compare image quality of a standard-dose (SD) and a low-dose (LD) cervical spine CT protocol using filtered back-projection (FBP) and iterative reconstruction (IR). Forty patients investigated by cervical spine CT were prospectively randomised into two groups: SD (120 kVp, 275 mAs) and LD (120 kVp, 150 mAs), both applying automatic tube current modulation. Data were reconstructed using both FBP and sinogram-affirmed IR. Image noise, signal-to-noise (SNR) and contrast-to-noise (CNR) ratios were measured. Two radiologists independently and blindly assessed the following anatomical structures at C3-C4 and C6-C7 levels, using a four-point scale: intervertebral disc, content of neural foramina and dural sac, ligaments, soft tissues and vertebrae. They subsequently rated overall image quality using a ten-point scale. For both protocols and at each disc level, IR significantly decreased image noise and increased SNR and CNR, compared with FBP. SNR and CNR were statistically equivalent in LD-IR and SD-FBP protocols. Regardless of the dose and disc level, the qualitative scores with IR compared with FBP, and with LD-IR compared with SD-FBP, were significantly higher or not statistically different for intervertebral discs, neural foramina and ligaments, while significantly lower or not statistically different for soft tissues and vertebrae. The overall image quality scores were significantly higher with IR compared with FBP, and with LD-IR compared with SD-FBP. LD-IR cervical spine CT provides better image quality for intervertebral discs, neural foramina and ligaments, and worse image quality for soft tissues and vertebrae, compared with SD-FBP, while reducing radiation dose by approximately 40 %. (orig.)

  19. Computed tomography of the cervical spine: comparison of image quality between a standard-dose and a low-dose protocol using filtered back-projection and iterative reconstruction

    Energy Technology Data Exchange (ETDEWEB)

    Becce, Fabio [University of Lausanne, Department of Diagnostic and Interventional Radiology, Centre Hospitalier Universitaire Vaudois, Lausanne (Switzerland); Universite Catholique Louvain, Department of Radiology, Cliniques Universitaires Saint-Luc, Brussels (Belgium); Ben Salah, Yosr; Berg, Bruno C. vande; Lecouvet, Frederic E.; Omoumi, Patrick [Universite Catholique Louvain, Department of Radiology, Cliniques Universitaires Saint-Luc, Brussels (Belgium); Verdun, Francis R. [University of Lausanne, Institute of Radiation Physics, Centre Hospitalier Universitaire Vaudois, Lausanne (Switzerland); Meuli, Reto [University of Lausanne, Department of Diagnostic and Interventional Radiology, Centre Hospitalier Universitaire Vaudois, Lausanne (Switzerland)

    2013-07-15

    To compare image quality of a standard-dose (SD) and a low-dose (LD) cervical spine CT protocol using filtered back-projection (FBP) and iterative reconstruction (IR). Forty patients investigated by cervical spine CT were prospectively randomised into two groups: SD (120 kVp, 275 mAs) and LD (120 kVp, 150 mAs), both applying automatic tube current modulation. Data were reconstructed using both FBP and sinogram-affirmed IR. Image noise, signal-to-noise (SNR) and contrast-to-noise (CNR) ratios were measured. Two radiologists independently and blindly assessed the following anatomical structures at C3-C4 and C6-C7 levels, using a four-point scale: intervertebral disc, content of neural foramina and dural sac, ligaments, soft tissues and vertebrae. They subsequently rated overall image quality using a ten-point scale. For both protocols and at each disc level, IR significantly decreased image noise and increased SNR and CNR, compared with FBP. SNR and CNR were statistically equivalent in LD-IR and SD-FBP protocols. Regardless of the dose and disc level, the qualitative scores with IR compared with FBP, and with LD-IR compared with SD-FBP, were significantly higher or not statistically different for intervertebral discs, neural foramina and ligaments, while significantly lower or not statistically different for soft tissues and vertebrae. The overall image quality scores were significantly higher with IR compared with FBP, and with LD-IR compared with SD-FBP. LD-IR cervical spine CT provides better image quality for intervertebral discs, neural foramina and ligaments, and worse image quality for soft tissues and vertebrae, compared with SD-FBP, while reducing radiation dose by approximately 40 %. (orig.)

  20. Iterative reconstruction technique vs filter back projection: utility for quantitative bronchial assessment on low-dose thin-section MDCT in patients with/without chronic obstructive pulmonary disease

    Energy Technology Data Exchange (ETDEWEB)

    Koyama, Hisanobu; Seki, Shinichiro; Sugimura, Kazuro [Kobe University Graduate School of Medicine, Division of Radiology, Department of Radiology, Kobe, Hyogo (Japan); Ohno, Yoshiharu; Nishio, Mizuho; Matsumoto, Sumiaki; Yoshikawa, Takeshi [Kobe University Graduate School of Medicine, Advanced Biomedical Imaging Research Centre, Kobe (Japan); Kobe University Graduate School of Medicine, Division of Functional and Diagnostic Imaging Research, Department of Radiology, Kobe (Japan); Sugihara, Naoki [Toshiba Medical Systems Corporation, Ohtawara, Tochigi (Japan)

    2014-08-15

    The aim of this study was to evaluate the utility of the iterative reconstruction (IR) technique for quantitative bronchial assessment during low-dose computed tomography (CT) as a substitute for standard-dose CT in patients with/without chronic obstructive pulmonary disease. Fifty patients (mean age, 69.2; mean % predicted FEV1, 79.4) underwent standard-dose CT (150 mAs) and low-dose CT (25 mAs). Except for tube current, the imaging parameters were identical for both protocols. Standard-dose CT was reconstructed using filtered back-projection (FBP), and low-dose CT was reconstructed using IR and FBP. For quantitative bronchial assessment, the wall area percentage (WA%) of the sub-segmental bronchi and the airway luminal volume percentage (LV%) from the main bronchus to the peripheral bronchi were acquired in each dataset. The correlation and agreement of WA% and LV% between standard-dose CT and both low-dose CTs were statistically evaluated. WA% and LV% between standard-dose CT and both low-dose CTs were significantly correlated (r > 0.77, p < 0.00001); however, only the LV% agreement between standard-dose CT and low-dose CT reconstructed with IR was moderate (concordance correlation coefficient = 0.93); the other agreements were poor (concordance correlation coefficient < 0.90). Quantitative bronchial assessment via low-dose CT has potential as a substitute for standard-dose CT by using IR and airway luminal volumetry techniques. • Quantitative bronchial assessment of COPD using low-dose CT is possible. (orig.)

  1. Quantitative evaluation of calcium (content) in the coronary artery using hybrid iterative reconstruction (iDose) algorithm on low-dose 64-detector CT. Comparison of iDose and filtered back projection

    International Nuclear Information System (INIS)

    To evaluate the usefulness of hybrid iterative reconstruction (iDose) for quantification of calcium content in the coronary artery on 64-detector computed tomography (CT), an anthropomorphic cardiac CT phantom containing cylinders with known calcium content was scanned at tube current-time products of 15, 20, 25, and 50 mAs using 64-detector CT. The images obtained at 15, 20, 25, and 50 mAs were reconstructed using filtered back projection (FBP), and those at 15, 20, and 25 mAs were also reconstructed using iDose. Then the volume and mass of the calcium content in the cylinders were calculated and compared with the true values. The Agatston score was also evaluated. The Agatston score and mass of calcium obtained at 50 mAs using FBP were 656.92 and 159.91 mg, respectively. In contrast, those obtained at 25 mAs using iDose were 641.91 and 159.05 mg, respectively. No significant differences were found in the calcium measurements obtained using FBP and iDose. In addition, the Agatston score and mass of calcium obtained at 15 mAs and 20 mAs using iDose were not significantly different from those obtained at 25 mAs with iDose. By using iDose, accurate quantification of calcium in the coronary artery can be achieved at 15 mAs using 64-detector CT. The radiation dose can be significantly reduced in coronary artery calcium scoring without impairing the detection and quantification of coronary calcification. (author)

  2. Image quality of low mA CT pulmonary angiography reconstructed with model based iterative reconstruction versus standard CT pulmonary angiography reconstructed with filtered back projection: an equivalency trial

    International Nuclear Information System (INIS)

    To determine whether CT pulmonary angiography (CTPA) using a low mA setting reconstructed with model-based iterative reconstruction (MBIR) is equivalent to routine CTPA reconstructed with filtered back projection (FBP). This prospective study was approved by the institutional review board and patients provided written informed consent. Eighty-two patients were examined with low mA MBIR-CTPA (100 kV, 20 mA) and 82 patients with standard FBP-CTPA (100 kV, 250 mA). Regions of interest were drawn in nine pulmonary vessels; signal-to-noise ratio (SNR) and contrast-to-noise ratio (CNR) were calculated. A five-point scale was used to subjectively evaluate the image quality of FBP-CTPA and low mA MBIR-CTPA. Compared to routine FBP-CTPA, low mA MBIR-CTPA showed no differences in the attenuation measured in nine pulmonary vessels, higher SNR (56 ± 19 vs 43 ± 20, p < 0.0001) and higher CNR (50 ± 17 vs 38 ± 18, p < 0.0001) despite a dose reduction of 93% (p < 0.0001). The subjective image quality of low mA MBIR-CTPA was rated as diagnostic in 98% of the cases for patients with a body mass index less than 30 kg/m2. Low mA MBIR-CTPA is equivalent to routine FBP-CTPA and allows a significant dose reduction while improving SNR and CNR in the pulmonary vessels, as compared with routine FBP-CTPA. (orig.)
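
    As a side note on the objective metrics reported above, the following is a small sketch of one common way SNR and CNR are computed from region-of-interest (ROI) measurements. Exact definitions vary between studies, so the formulas, the choice of background ROI and all names here are illustrative assumptions rather than the authors' protocol.

        import numpy as np

        def roi_stats(image, roi_mask):
            """Mean and standard deviation of the values inside a boolean ROI mask."""
            values = image[roi_mask]
            return values.mean(), values.std(ddof=1)

        def snr_cnr(image, vessel_roi, background_roi):
            """One common convention: SNR = vessel mean / vessel noise,
            CNR = (vessel mean - background mean) / background noise."""
            mu_v, sd_v = roi_stats(image, vessel_roi)
            mu_b, sd_b = roi_stats(image, background_roi)
            return mu_v / sd_v, (mu_v - mu_b) / sd_b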

  3. Image quality of low mA CT pulmonary angiography reconstructed with model based iterative reconstruction versus standard CT pulmonary angiography reconstructed with filtered back projection: an equivalency trial

    Energy Technology Data Exchange (ETDEWEB)

    Montet, Xavier; Hachulla, Anne-Lise; Neroladaki, Angeliki; Botsikas, Diomidis; Becker, Christoph D. [Geneva University Hospital, Division of Radiology, Geneva 4 (Switzerland); Lador, Frederic; Rochat, Thierry [Geneva University Hospital, Division of Pulmonary Medicine, Geneva 4 (Switzerland)

    2015-06-01

    To determine whether CT pulmonary angiography (CTPA) using a low mA setting reconstructed with model-based iterative reconstruction (MBIR) is equivalent to routine CTPA reconstructed with filtered back projection (FBP). This prospective study was approved by the institutional review board and patients provided written informed consent. Eighty-two patients were examined with low mA MBIR-CTPA (100 kV, 20 mA) and 82 patients with standard FBP-CTPA (100 kV, 250 mA). Regions of interest were drawn in nine pulmonary vessels; signal-to-noise ratio (SNR) and contrast-to-noise ratio (CNR) were calculated. A five-point scale was used to subjectively evaluate the image quality of FBP-CTPA and low mA MBIR-CTPA. Compared to routine FBP-CTPA, low mA MBIR-CTPA showed no differences in the attenuation measured in nine pulmonary vessels, higher SNR (56 ± 19 vs 43 ± 20, p < 0.0001) and higher CNR (50 ± 17 vs 38 ± 18, p < 0.0001) despite a dose reduction of 93% (p < 0.0001). The subjective image quality of low mA MBIR-CTPA was rated as diagnostic in 98% of the cases for patients with a body mass index less than 30 kg/m2. Low mA MBIR-CTPA is equivalent to routine FBP-CTPA and allows a significant dose reduction while improving SNR and CNR in the pulmonary vessels, as compared with routine FBP-CTPA. (orig.)

  4. Feasible Dose Reduction in Routine Chest Computed Tomography Maintaining Constant Image Quality Using the Last Three Scanner Generations: From Filtered Back Projection to Sinogram-affirmed Iterative Reconstruction and Impact of the Novel Fully Integrated Detector Design Minimizing Electronic Noise

    Directory of Open Access Journals (Sweden)

    Lukas Ebner

    2014-01-01

    Objective: The aim of the present study was to evaluate dose reduction in contrast-enhanced chest computed tomography (CT) by comparing the three latest generations of Siemens CT scanners used in clinical practice. We analyzed the amount of radiation used with filtered back projection (FBP) and an iterative reconstruction (IR) algorithm to yield the same image quality. Furthermore, the influence on radiation dose of the most recent integrated circuit detector (ICD; Stellar detector, Siemens Healthcare, Erlangen, Germany) was investigated. Materials and Methods: 136 patients were included. Scan parameters were set to a thorax routine: SOMATOM Sensation 64 (FBP), SOMATOM Definition Flash (IR), and SOMATOM Definition Edge (ICD and IR). Tube current was set constantly to the reference level of 100 mA, using automated tube current modulation with reference milliamperes. CARE kV was used on the Flash and Edge scanners, while tube potential was individually selected between 100 and 140 kVp by the medical technologists on the SOMATOM Sensation. Quality assessment was performed on soft-tissue kernel reconstructions. Dose was represented by the dose-length product. Results: The dose-length product (DLP) with FBP for the average chest CT was 308 mGycm ± 99.6. In contrast, the DLP for chest CT with the IR algorithm was 196.8 mGycm ± 68.8 (P = 0.0001). A further decline in dose can be noted with IR and the ICD: DLP 166.4 mGycm ± 54.5 (P = 0.033). The dose reduction compared to FBP was 36.1% with IR and 45.6% with IR/ICD. The signal-to-noise ratio (SNR) was favorable in the aorta, bone, and soft tissue for the IR/ICD combination compared to FBP (the P values ranged from 0.003 to 0.048). Overall, the contrast-to-noise ratio (CNR) improved with declining DLP. Conclusion: The most recent technical developments, namely IR in combination with integrated circuit detectors, can significantly lower radiation dose in chest CT examinations.

  5. Impact of hybrid iterative reconstruction on Agatston coronary artery calcium scores in comparison to filtered back projection in native cardiac CT; Einfluss der hybriden iterativen Rekonstruktion bei der nativen CT des Herzens auf die Agatston-Kalziumscores der Koronararterien

    Energy Technology Data Exchange (ETDEWEB)

    Obmann, V.C.; Heverhagen, J.T. [Inselspital - University Hospital Bern (Switzerland). University Inst. for Diagnostic, Interventional and Pediatric Radiology; Klink, T. [Wuerzburg Univ. (Germany). Inst. of Diagnostic and Interventional Radiology; Stork, A.; Begemann, P.G.C. [Roentgeninstitut Duesseldorf, Duesseldorf (Germany); Laqmani, A.; Adam, G. [University Medical Center Hamburg-Eppendorf, Hamburg (Germany). Dept. of Diagnostic and Interventional Radiology

    2015-05-15

    To investigate whether the effects of hybrid iterative reconstruction (HIR) on coronary artery calcium (CAC) measurements using the Agatston score lead to changes in the assignment of patients to cardiovascular risk groups compared to filtered back projection (FBP). 68 patients (mean age 61.5 years; 48 male; 20 female) underwent prospectively ECG-gated, non-enhanced, cardiac 256-MSCT for coronary calcium scoring. Scanning parameters were as follows: tube voltage, 120 kV; mean tube current-time product, 63.67 mAs (50 - 150 mAs); collimation, 2 x 128 x 0.625 mm. Images were reconstructed with FBP and with HIR at all levels (L1 to L7). Two independent readers measured Agatston scores of all reconstructions and assigned patients to cardiovascular risk groups. Scores of HIR and FBP reconstructions were correlated (Spearman). Interobserver agreement and variability were assessed with kappa statistics and Bland-Altman plots. Agatston scores of HIR reconstructions were closely correlated with FBP reconstructions (L1, R = 0.9996; L2, R = 0.9995; L3, R = 0.9991; L4, R = 0.986; L5, R = 0.9986; L6, R = 0.9987; and L7, R = 0.9986). In comparison to FBP, HIR reduced Agatston scores to between 97% (L1) and 87.4% (L7) of the FBP values. Using HIR iterations L1-L3, all patients were assigned to identical risk groups as after FBP reconstruction. In 5.4% of patients the risk group after HIR with the maximum iteration level was different from the group after FBP reconstruction. There was an excellent correlation of Agatston scores after HIR and FBP, with identical risk group assignment at levels 1 - 3 for all patients. Hence it appears that the application of HIR in routine calcium scoring does not entail any disadvantages. Thus, future studies are needed to demonstrate whether HIR is a reliable method for reducing radiation dose in coronary calcium scoring.

  6. Reconstruction of CT images by the Bayes- back projection method

    CERN Document Server

    Haruyama, M; Takase, M; Tobita, H

    2002-01-01

    In the course of research on quantitative assay for non-destructive measurement of radioactive waste, the authors have developed a unique program based on Bayesian theory for the reconstruction of transmission computed tomography (TCT) images. The reconstruction of cross-section images in CT technology usually employs the Filtered Back Projection method. The new image reconstruction program reported here is based on the Bayesian Back Projection method, and it has a function for iteratively improving images at every step of the measurement. Namely, this method has the capability of promptly displaying a cross-section image corresponding to each angled projection data from every measurement. Hence, it is possible to observe an improved cross-section view by reflecting each projection data in almost real time. From the basic theory of the Bayesian Back Projection method, it can be applied not only to CT of the 1st, 2nd, and 3rd generation. This report deals with a reconstruction program of cross-section images in the CT of ...

  7. Image reconstruction of simulated specimens using convolution back projection

    Directory of Open Access Journals (Sweden)

    Mohd. Farhan Manzoor

    2001-04-01

    This paper reports on the reconstruction of cross-sections of composite structures. The convolution back projection (CBP) algorithm has been used to capture the attenuation field over the specimen. Five different test cases have been taken up for evaluation. These cases represent varying degrees of complexity. In addition, the role of filters on the nature of the reconstruction errors has also been discussed. Numerical results obtained in the study reveal that the CBP algorithm is a useful tool for qualitative as well as quantitative assessment of composite regions encountered in engineering applications.

  8. Procedure and apparatus for back projection

    International Nuclear Information System (INIS)

    (1) The procedure is for back projection in the form of a tomographic picture of a member, characterised in that the strip pictures are written onto the signal plate by a conversion pick-up unit, and that voltages representing a rotating coordinate system are applied to the address inputs of the pick-up unit. (2) Procedure following claim 1, characterised by the fact that the voltages are respectively applied as sawtooth-waveform horizontal and vertical television deflections. (3) Procedure following claims 1 and 2, characterised in that, in order to correct the television deflection voltages for the effect of a fan-shaped radiation beam, first one and then the other of the amplitudes is modulated. (G.C.)

  9. UV Fluorescence Photography of Works of Art : Replacing the Traditional UV Cut Filters with Interference Filters

    Directory of Open Access Journals (Sweden)

    Luís BRAVO PEREIRA

    2010-09-01

    For many years, filters like the Kodak Wratten E series, or the equivalent Schneider B+W 415, were used as standard UV cut filters, necessary to obtain good quality in UV fluorescence photography. The only problem with the use of these filters is that, when they receive the UV radiation that they should remove, they present an internal fluorescence of their own as a side effect, which usually reduces contrast and quality in the final image. This article presents the results of our experiences in using some innovative filters that have appeared on the market in recent years, designed to absorb UV radiation even more efficiently than the pigment-based standard filters mentioned above: interference filters for UV rejection (and, usually, for IR rejection too), manufactured using interference layers, which give better results than the pigment-based filters. The only problem with the interference filter type is that they are sensitive to the direction of the rays and, because of that, they are not adequate for wide-angle lenses. The internal fluorescence of three filters was tested and compared: the B+W 415 UV cut (equivalent to the Kodak Wratten 2E, pigment based), the B+W 486 UV IR cut (an interference-type filter, used frequently on digital cameras to remove IR or UV), and the Baader UVIR rejection filter (two versions of this interference filter were used). The final quality of the UV fluorescence images appears to be superior when compared to the images obtained with the classic filters.

  10. Error estimates for universal back-projection-based photoacoustic tomography

    Science.gov (United States)

    Pandey, Prabodh K.; Naik, Naren; Munshi, Prabhat; Pradhan, Asima

    2015-07-01

    Photo-acoustic tomography is a hybrid imaging modality that combines the advantages of optical as well as ultrasound imaging techniques to produce images with high resolution and good contrast at high penetration depths. The choice of reconstruction algorithm as well as experimental and computational parameters plays a major role in governing the accuracy of a tomographic technique. Therefore, error estimates with respect to the variation of these parameters are extremely important. Due to the finite support of the photo-acoustic source, the pressure signals are not band-limited, but in practice our detection system is. Hence the reconstructed image from ideal, noiseless band-limited forward data (for future reference we will call this the band-limited reconstruction) is the best approximation that we have for the unknown object. In the present study, we report the error that arises in universal back-projection (UBP) based photo-acoustic reconstruction for planar detection geometry due to sampling and filtering of the forward data (pressure signals). Computational validation of the error estimates has been carried out for synthetic phantoms. Validation with noisy forward data has also been carried out to study the effect of noise on the error estimates derived in our work. Although here we have derived the estimates for planar detection geometry, the derivations for spherical and cylindrical geometries follow accordingly.
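
    To make the reconstruction step being analysed more concrete, the sketch below shows a simple delay-and-sum form of universal back-projection for a planar (2-D) detection geometry, using the common b(t) = 2p(t) - 2t dp/dt back-projection term. The uniform speed of sound, the sampling rate and all names are illustrative assumptions, and the finite-difference derivative and nearest-sample interpolation are exactly the kind of discretisation the abstract's error estimates address.

        import numpy as np

        def ubp_reconstruct(signals, det_pos, grid, c=1500.0, fs=40e6):
            """Delay-and-sum universal back-projection for a 2-D planar geometry.
            signals: (n_det, n_t) pressure traces; det_pos: (n_det, 2) detector
            positions [m]; grid: (n_pix, 2) reconstruction points [m];
            c: assumed uniform speed of sound [m/s]; fs: sampling rate [Hz]."""
            n_det, n_t = signals.shape
            t = np.arange(n_t) / fs
            # UBP back-projection term b(t) = 2 p(t) - 2 t dp/dt (finite differences)
            dpdt = np.gradient(signals, 1.0 / fs, axis=1)
            b = 2.0 * signals - 2.0 * t[None, :] * dpdt
            image = np.zeros(len(grid))
            for d in range(n_det):
                dist = np.linalg.norm(grid - det_pos[d], axis=1)   # pixel-detector distances
                idx = np.clip(np.round(dist / c * fs).astype(int), 0, n_t - 1)
                image += b[d, idx]                                 # nearest-sample interpolation
            return image / n_det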

  11. Traditional Tracking with Kalman Filter on Parallel Architectures

    CERN Document Server

    Cerati, Giuseppe; Lantz, Steven; MacNeill, Ian; McDermott, Kevin; Riley, Dan; Tadel, Matevz; Wittich, Peter; Wuerthwein, Frank; Yagil, Avi

    2014-01-01

    Power density constraints are limiting the performance improvements of modern CPUs. To address this, we have seen the introduction of lower-power, multi-core processors, but the future will be even more exciting. In order to stay within the power density limits but still obtain Moore's Law performance/price gains, it will be necessary to parallelize algorithms to exploit larger numbers of lightweight cores and specialized functions like large vector units. Example technologies today include Intel's Xeon Phi and GPGPUs. Track finding and fitting is one of the most computationally challenging problems for event reconstruction in particle physics. At the High Luminosity LHC, for example, this will be by far the dominant problem. The most common track finding techniques in use today are however those based on the Kalman Filter. Significant experience has been accumulated with these techniques on real tracking detector systems, both in the trigger and offline. We report the results of our investigations into the p...

  12. Tsunami Wave Estimation Using GPS-TEC Back Projection

    Science.gov (United States)

    Ito, T.

    2014-12-01

    A large tsunami generates an acoustic wave and shakes the atmospheric layer around the focal region. The generated acoustic wave propagates to the ionospheric layer, located at about 300 km height, and causes an ionospheric disturbance. The acoustic wave therefore causes the total electron content (TEC) inside the ionospheric layer to change. This changing TEC can be measured by a dense GPS network such as GEONET. We focus on a large tsunami such as that of the 2011 March 11 Tohoku-Oki earthquake (Mw 9.0), which caused vast damage to the country. The tsunami due to the 2011 Tohoku-Oki earthquake generated an acoustic wave that propagated to the ionospheric layer. The Japanese dense GPS network detected a clear anomaly of ionospheric TEC due to the tsunami wave around the focal region. We assume that the acoustic wave causes the ionospheric disturbance and estimate tsunami wave propagation using a back-projection (BP) method applied to the ionospheric disturbance. The Japanese dense GPS array recorded ionospheric disturbances as changes in TEC due to the 2011 Tohoku-Oki earthquake. In this study, we try to reveal the details of the generated tsunami propagation using the changing TEC from the GPS observation network. First, we process GPS-TEC above the focal region at 1 s sampling from GEONET. We remove slant GPS-TEC trends by filtering out a second-degree polynomial fit. We then adapt the back-projection (BP) method to the GPS-TEC time series. The BP product shows the beam-formed time history and location of coherent acoustic-wave energy generated by the large tsunami, observed at the ionospheric layer by regional arrays and across the GPS network. BPs are performed by beam forming (stacking) energy onto a flat grid around the source region with variable spatial resolution, scaled by the magnitude of the generated tsunami velocity for each second. As a result, we can obtain the generated tsunami wave due to a large earthquake. The method is entirely new and provides details of the tsunami propagating wave distribution from huge GPS-TEC data sets. We can obtain an indirect measurement of tsunami wave generation from ionospheric disturbances, so this result will bring a revolution to tsunami studies.
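
    The beam-forming (stacking) step described above can be illustrated with the toy sketch below: detrended TEC traces are aligned with travel-time delays to each candidate grid point and stacked, and the stacked energy maps the source. The assumed constant horizontal propagation speed, the use of ionospheric pierce-point coordinates and all names are simplifying assumptions, not the authors' processing chain.

        import numpy as np

        def bp_stack_energy(tec, pierce_xy, grid_xy, v_prop, fs):
            """Linear beam-forming back-projection of detrended TEC traces.
            tec: (n_sta, n_t) detrended TEC series; pierce_xy: (n_sta, 2) ionospheric
            pierce-point coordinates [km]; grid_xy: (n_grid, 2) candidate source
            points [km]; v_prop: assumed propagation speed [km/s]; fs: sampling rate [Hz]."""
            n_sta, n_t = tec.shape
            energy = np.zeros(len(grid_xy))
            for g, src in enumerate(grid_xy):
                delays = np.linalg.norm(pierce_xy - src, axis=1) / v_prop   # seconds
                beam = np.zeros(n_t)
                for s in range(n_sta):
                    # align each trace to a common origin time (wrap-around ignored)
                    beam += np.roll(tec[s], -int(round(delays[s] * fs)))
                energy[g] = np.sum(beam ** 2) / n_sta                       # stacked energy
            return energy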

  13. Dose reduction in computed tomography of the chest. Image quality of iterative reconstructions at a 50% radiation dose compared to filtered back projection at a 100% radiation dose; Dosisreduktion in der Thorax-CT. Vergleich der Bildqualitaet bei 50% Dosis und iterativer Bildrekonstruktion mit 100% Dosis und gefilterter Rueckprojektion

    Energy Technology Data Exchange (ETDEWEB)

    May, M.S.; Eller, A.; Stahl, C. [University Hospital Erlangen (Germany). Dept. of Radiology; and others

    2014-06-15

    Purpose: The aim of this study was to evaluate the potential of iterative reconstruction (IR) in chest computed tomography (CT) to reduce radiation exposure. The qualitative and quantitative image quality of standard reconstructions with filtered back projection (FBP) and of half dose (HD) chest CT data reconstructed with FBP and IR was assessed. Materials and Methods: 52 consecutive patients underwent contrast-enhanced chest CT on a dual-source CT system at 120 kV with automatic exposure control. The tube current was split equally between both tube detector systems. For the HD datasets, only data from one tube detector system were utilized. Thus, full dose (FD) and HD data were available for each patient from a single scan. Three datasets were reconstructed from the raw data: standard FD images applying FBP, which served as a reference, and HD images applying FBP and IR. Objective image quality analysis was performed by measuring the image noise in tissue and air. The subjective image quality was evaluated by 2 radiologists according to European guidelines. Additional assessment of artifacts, lesion conspicuity and edge sharpness was performed. Results: Image noise did not differ significantly between HD-IR and FD-FBP (p = 0.254) but increased substantially in HD-FBP (p < 0.001). No statistically significant differences were found between HD-IR and FD-FBP for the reproduction of anatomical and pathological structures, including subsegmental bronchi and bronchioli. The image quality of HD-FBP was rated inferior because of increased noise. Conclusion: A 50% dose reduction in contrast-enhanced chest CT is feasible without a loss of diagnostic confidence if IR is used for image data reconstruction. Iterative reconstruction is another powerful tool to reduce radiation exposure and can be combined with other dose-saving techniques. (orig.)

  14. Traditional Medicine Through the Filter of Modernity: A brief historical analysis

    Directory of Open Access Journals (Sweden)

    R. Rabarihoela Razafimandimby

    2014-12-01

    Traditional medicines still prevail in the current Malagasy context. A careful historical analysis shows, however, that Malagasy traditional medicine has been screened through many filters before being accepted in a global context. Traditional medicine in its authentic form has been more or less rejected with the advent of modern medicine, although not without reaction. This paper retraces the historical encounter of the modern and the traditional to determine the extent to which traditional medicine is acknowledged and used in the current prevailing modern, rational and scientific global context.

  15. Traditional Tracking with Kalman Filter on Parallel Architectures

    Science.gov (United States)

    Cerati, Giuseppe; Elmer, Peter; Lantz, Steven; MacNeill, Ian; McDermott, Kevin; Riley, Dan; Tadel, Matevž; Wittich, Peter; Würthwein, Frank; Yagil, Avi

    2015-05-01

    Power density constraints are limiting the performance improvements of modern CPUs. To address this, we have seen the introduction of lower-power, multi-core processors, but the future will be even more exciting. In order to stay within the power density limits but still obtain Moore's Law performance/price gains, it will be necessary to parallelize algorithms to exploit larger numbers of lightweight cores and specialized functions like large vector units. Example technologies today include Intel's Xeon Phi and GPGPUs. Track finding and fitting is one of the most computationally challenging problems for event reconstruction in particle physics. At the High Luminosity LHC, for example, this will be by far the dominant problem. The most common track finding techniques in use today are however those based on the Kalman Filter. Significant experience has been accumulated with these techniques on real tracking detector systems, both in the trigger and offline. We report the results of our investigations into the potential and limitations of these algorithms on the new parallel hardware.

  16. UV Fluorescence Photography of Works of Art : Replacing the Traditional UV Cut Filters with Interference Filters

    OpenAIRE

    Luís BRAVO PEREIRA

    2010-01-01

    For many years filters like the Kodak Wratten E series, or the equivalent Schneider B+W 415, were used as standard UV cut filters, necessary to obtain good quality on UV Fluorescence photography. The only problem with the use of these filters is that, when they receive the UV radiation that they should remove, they present themselves an internal fluorescence as a side effect, which usually reduces contrast and quality in the final image. This article presents the results of our experiences on usi...

  17. Fast tomographic reconstruction using a hardwired optical back-projection device

    International Nuclear Information System (INIS)

    Back-projection of filtered projections is one of the most commonly used techniques for reconstruction of tomographic slices from multiple views of an object. In the device we have developed, 60 linear projections are stored in a digital memory, organized as a matrix of 60 lines and 64 columns, with a capacity of 8 bits for each channel. The input-output procedures of this memory are under the control of a microprocessor. The first results achieved by means of this device from theoretical data, phantom and patient studies demonstrate the good quality of this mode of tomographic reconstruction and introduce future developments.

  18. Earthquake Characteristics as Imaged by the Back-Projection Method

    OpenAIRE

    Kiser, Eric

    2012-01-01

    This dissertation explores the capability of dense seismic array data for imaging the rupture properties of earthquake sources using a method known as back-projection. Only within the past 10 or 15 years has implementation of the method become feasible through the development of large aperture seismic arrays such as the High Sensitivity Seismograph Network in Japan and the Transportable Array in the United States. Coincidentally, this buildup in data coverage has also been accompanied by a gl...

  19. An accelerated threshold-based back-projection algorithm for Compton camera image reconstruction

    International Nuclear Information System (INIS)

    Purpose: Compton camera imaging (CCI) systems are currently under investigation for radiotherapy dose reconstruction and verification. The ability of such a system to provide real-time images during dose delivery will be limited by the computational speed of the image reconstruction algorithm. In this work, the authors present a fast and simple method by which to generate an initial back-projected image from acquired CCI data, suitable for use in a filtered back-projection algorithm or as a starting point for iterative reconstruction algorithms, and compare its performance to the current state of the art. Methods: Each detector event in a CCI system describes a conical surface that includes the true point of origin of the detected photon. Numerical image reconstruction algorithms require, as a first step, the back-projection of each of these conical surfaces into an image space. The algorithm presented here first generates a solution matrix for each slice of the image space by solving the intersection of the conical surface with the image plane. Each element of the solution matrix is proportional to the distance of the corresponding voxel from the true intersection curve. A threshold function was developed to extract those pixels sufficiently close to the true intersection to generate a binary intersection curve. This process is repeated for each image plane for each CCI detector event, resulting in a three-dimensional back-projection image. The performance of this algorithm was tested against a marching algorithm known for speed and accuracy. Results: The threshold-based algorithm was found to be approximately four times faster than the current state of the art with minimal deficit to image quality, arising from the fact that a generically applicable threshold function cannot provide perfect results in all situations. The algorithm fails to extract a complete intersection curve in image slices near the detector surface for detector event cones having axes nearly parallel to the image plane. This effect decreases the sum of the image, thereby also affecting the mean, standard deviation, and SNR of the image. All back-projected events associated with a simulated point source intersected the voxel containing the source and the FWHM of the back-projected image was similar to that obtained from the marching method. Conclusions: The slight deficit to image quality observed with the threshold-based back-projection algorithm described here is outweighed by the 75% reduction in computation time. The implementation of this method requires the development of an optimum threshold function, which determines the overall accuracy of the method. This makes the algorithm well-suited to applications involving the reconstruction of many large images, where the time invested in threshold development is offset by the decreased image reconstruction time. Implemented in a parallel-computing environment, the threshold-based algorithm has the potential to provide real-time dose verification for radiation therapy.
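
    As an illustration of the geometric core of the algorithm described above (a per-slice distance-like solution matrix followed by a threshold), the sketch below back-projects a single Compton-event cone onto one image plane. The angular-residual distance measure, the fixed threshold value and all names are illustrative assumptions and are not the authors' optimized threshold function.

        import numpy as np

        def backproject_cone(apex, axis, half_angle, xs, ys, z, thresh=0.01):
            """Thresholded intersection of one Compton-event cone with the plane z = const.
            apex: (3,) cone apex; axis: (3,) unit cone axis; half_angle: scatter
            angle [rad]; xs, ys: 1-D pixel coordinate vectors; thresh: angular-residual
            cutoff (a stand-in for an optimized threshold function)."""
            X, Y = np.meshgrid(xs, ys, indexing="ij")
            P = np.stack([X, Y, np.full_like(X, z)], axis=-1) - apex   # voxel-to-apex vectors
            norms = np.maximum(np.linalg.norm(P, axis=-1), 1e-12)
            cos_angle = (P @ axis) / norms                             # angle to the cone axis
            residual = np.abs(cos_angle - np.cos(half_angle))          # per-pixel "solution matrix"
            return residual < thresh                                   # binary intersection curve

        # Summing these binary curves over all image planes and all detected events
        # yields a three-dimensional back-projected image of the kind described above.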

  20. Distance driven back projection image reconstruction in digital tomosysthesis

    Science.gov (United States)

    Malalla, Nuhad A. Y.; Xu, Shiyu; Chen, Ying

    2015-03-01

    In this paper, distance driven (DD) back projection image reconstruction was investigated for digital tomosynthesis. Digital tomosynthesis is an imaging technique that produces three-dimensional information about the object with a low radiation dose. This paper is our new study of DD back projection for image reconstruction in digital tomosynthesis. Since DD considers that the image pixel and detector cell have width, a convolution operation is used to calculate the DD coefficients. The approximation characteristics of some other methods, such as the ray-driven (RD) method, can thus be avoided. A computer simulation of DD with the Maximum Likelihood Expectation Maximization (MLEM) tomosynthesis reconstruction algorithm was studied. The sequence of projection images was simulated with 25 projections and a total view angle of 48 degrees. DD with MLEM reconstruction results were demonstrated. A line profile along the x direction was used to evaluate the DD and RD methods. Compared with RD, the computation time of DD with MLEM to provide the reconstruction results was shorter, since the main loop of DD is over x-y plane intercepts, not over the image pixels or detector cells. In clinical applications, both accuracy and computation speed are necessary requirements. DD back projection may satisfy these requirements.
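
    The MLEM loop that the abstract combines with distance-driven back projection is itself compact; the sketch below shows a generic version in which the system matrix A is assumed to be precomputed (for example with distance-driven coefficients, which are not reproduced here). The explicit matrix form, the iteration count and the names are illustrative assumptions.

        import numpy as np

        def mlem(A, projections, n_iter=20):
            """MLEM reconstruction with a precomputed system matrix A, where A[i, j]
            is the contribution of image pixel j to projection bin i (for example
            computed with distance-driven coefficients)."""
            x = np.ones(A.shape[1])                 # start from a uniform image
            sensitivity = A.sum(axis=0) + 1e-12     # column sums (back projection of ones)
            for _ in range(n_iter):
                forward = A @ x + 1e-12             # forward projection of the estimate
                ratio = projections / forward       # measured / estimated
                x *= (A.T @ ratio) / sensitivity    # multiplicative EM update
            return x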

  1. Camera calibration based on the back projection process

    Science.gov (United States)

    Gu, Feifei; Zhao, Hong; Ma, Yueyang; Bu, Penghui

    2015-12-01

    Camera calibration plays a crucial role in 3D measurement tasks of machine vision. In typical calibration processes, camera parameters are iteratively optimized in the forward imaging process (FIP). However, the results can only guarantee the minimum of 2D projection errors on the image plane, but not the minimum of 3D reconstruction errors. In this paper, we propose a universal method for camera calibration, which uses the back projection process (BPP). In our method, a forward projection model is used to obtain initial intrinsic and extrinsic parameters with a popular planar checkerboard pattern. Then, the extracted image points are projected back into 3D space and compared with the ideal point coordinates. Finally, the estimation of the camera parameters is refined by a non-linear function minimization process. The proposed method can obtain a more accurate calibration result, which is more physically useful. Simulation and practical data are given to demonstrate the accuracy of the proposed method.

  2. Instantaneous tomographic reconstruction device by back-projection

    International Nuclear Information System (INIS)

    Tomographic slices are easily reconstructed from a series of projections using an optical back-projection technique. In the device we have developed, the 60 mono-dimensional projections are stored in a digital memory under the control of a microprocessor. The contents of this memory are displayed on an oscilloscope in two modes: a biparametric presentation of the set of projections, and a tomographic display consisting of a high-speed rotation of the pictures after a two-dimensional spreading of the mono-dimensional projections over the screen. The preliminary results achieved by means of this device from theoretical data, phantom and patient studies demonstrate the good quality of this mode of tomographic reconstruction and introduce future developments.

  3. Imaging Seismic Source Variations Using Back-Projection Methods at El Tatio Geyser Field, Northern Chile

    Science.gov (United States)

    Kelly, C. L.; Lawrence, J. F.

    2014-12-01

    During October 2012, 51 geophones and 6 broadband seismometers were deployed in an ~50x50m region surrounding a periodically erupting columnar geyser in the El Tatio Geyser Field, Chile. The dense array served as the seismic framework for a collaborative project to study the mechanics of complex hydrothermal systems. Contemporaneously, complementary geophysical measurements (including down-hole temperature and pressure, discharge rates, thermal imaging, water chemistry, and video) were also collected. Located on the western flanks of the Andes Mountains at an elevation of 4200m, El Tatio is the third largest geyser field in the world. Its non-pristine condition makes it an ideal location to perform minimally invasive geophysical studies. The El Jefe Geyser was chosen for its easily accessible conduit and extremely periodic eruption cycle (~120s). During approximately 2 weeks of continuous recording, we recorded ~2500 nighttime eruptions which lack cultural noise from tourism. With ample data, we aim to study how the source varies spatially and temporally during each phase of the geyser's eruption cycle. We are developing a new back-projection processing technique to improve source imaging for diffuse signals. Our method was previously applied to the Sierra Negra Volcano system, which also exhibits repeating harmonic and diffuse seismic sources. We back-project correlated seismic signals from the receivers back to their sources, assuming linear source to receiver paths and a known velocity model (obtained from ambient noise tomography). We apply polarization filters to isolate individual and concurrent geyser energy associated with P and S phases. We generate 4D, time-lapsed images of the geyser source field that illustrate how the source distribution changes through the eruption cycle. We compare images for pre-eruption, co-eruption, post-eruption and quiescent periods. We use our images to assess eruption mechanics in the system (i.e. top-down vs. bottom-up) and determine variations in source depth and distribution in the conduit and larger geyser field over many eruption cycles.

  4. Image-domain sampling properties of the Hotelling Observer in CT using filtered back-projection

    Science.gov (United States)

    Sanchez, Adrian A.; Sidky, Emil Y.; Pan, Xiaochuan

    2015-03-01

    The Hotelling Observer (HO) [1], along with its channelized variants [2], has been proposed for image quality evaluation in x-ray CT [3,4]. In this work, we investigate HO performance for a detection task in parallel-beam FBP as a function of two image-domain sampling parameters, namely pixel size and field-of-view. These two parameters are of central importance in adapting HO methods to use in CT, since the large number of pixels in a single image makes direct computation of HO performance for a full image infeasible in most cases. Reduction of the number of image pixels and/or restriction of the image to a region-of-interest (ROI) has the potential to make direct computation of HO statistics feasible in CT, provided that the signal and noise properties lead to redundant information in some regions of the image. For small signals, we hypothesize that reduction of image pixel size and enlargement of the image field-of-view are approximately equivalent means of gaining additional information relevant to a detection task. The rationale for this hypothesis is that the backprojection operation in FBP introduces long-range correlations so that, for small signals, the reconstructed signal outside of a small ROI is not linearly independent of the signal within the ROI. In this work, we perform a preliminary investigation of this hypothesis by sweeping these two sampling parameters and computing HO performance for a signal detection task.
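
    The quantity being swept in these experiments has a standard closed form; the sketch below evaluates Hotelling observer detectability from a mean signal-difference image and an image covariance matrix restricted to an ROI. It assumes the usual SNR_HO^2 = ds^T K^{-1} ds definition; how ds and K would be estimated for FBP images is not shown, and the names are illustrative.

        import numpy as np

        def hotelling_snr(signal_diff, covariance):
            """Hotelling observer detectability SNR_HO = sqrt(ds^T K^{-1} ds), where
            ds is the mean signal-difference image over the ROI pixels (flattened)
            and K is the image covariance matrix over the same pixels."""
            template = np.linalg.solve(covariance, signal_diff)   # Hotelling template K^{-1} ds
            return float(np.sqrt(signal_diff @ template))

        # Restricting signal_diff and covariance to a small ROI or to coarser pixels
        # is what keeps this direct computation tractable for CT-sized images.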

  5. A fast marching method based back projection algorithm for photoacoustic tomography in heterogeneous media

    CERN Document Server

    Wang, Tianren

    2015-01-01

    This paper presents a numerical study of a fast marching method based back projection reconstruction algorithm for photoacoustic tomography in heterogeneous media. Transcranial imaging is used here as a case study. To correct for the phase aberration from the heterogeneity (i.e., the skull), the fast marching method is adopted to compute the phase delay based on the known speed of sound distribution, and the phase delay is taken into account by the back projection algorithm for more accurate reconstructions. It is shown that the proposed algorithm is more accurate than the conventional back projection algorithm, but slightly less accurate than the time reversal algorithm, particularly in the area close to the skull. However, the image reconstruction time for the proposed algorithm can be as little as 124 ms when implemented on a GPU (512 sensors, 21323 pixels reconstructed), which is two orders of magnitude faster than the time reversal reconstruction. The proposed algorithm, therefore, not only corrects for the p...
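
    A minimal sketch of the reconstruction step described above is given below: a delay-and-sum back projection in which straight-ray distance/c delays are replaced by per-detector travel-time maps, assumed here to have been precomputed with a fast-marching eikonal solver on the known speed-of-sound map. The computation of those travel-time maps is not shown, and the array shapes and names are illustrative assumptions.

        import numpy as np

        def backproject_with_delays(signals, travel_times, fs):
            """Delay-and-sum back projection with heterogeneous-medium delays.
            signals: (n_det, n_t) photoacoustic traces; travel_times: (n_det, ny, nx)
            travel time [s] from each detector to every pixel, assumed precomputed
            with a fast-marching eikonal solver; fs: sampling rate [Hz]."""
            n_det, n_t = signals.shape
            image = np.zeros(travel_times.shape[1:])
            for d in range(n_det):
                idx = np.clip(np.round(travel_times[d] * fs).astype(int), 0, n_t - 1)
                image += signals[d][idx]        # pick the delayed sample for every pixel
            return image / n_det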

  6. Filtering

    Directory of Open Access Journals (Sweden)

    Jan Drugowitsch

    2015-04-01

    Predictive coding appears to be one of the fundamental working principles of brain processing. Amongst other aspects, brains often predict the sensory consequences of their own actions. Predictive coding resembles Kalman filtering, where incoming sensory information is filtered to produce prediction errors for subsequent adaptation and learning. However, to generate prediction errors given motor commands, a suitable temporal forward model is required to generate predictions. While in engineering applications it is usually assumed that this forward model is known, the brain has to learn it. When filtering sensory input and learning from the residual signal in parallel, a fundamental problem arises: the system can enter a delusional loop when filtering the sensory information using an overly trusted forward model. In this case, learning stalls before accurate convergence because uncertainty about the forward model is not properly accommodated. We present a Bayes-optimal solution to this generic and pernicious problem for the case of linear forward models, which we call Predictive Inference and Adaptive Filtering (PIAF). PIAF filters incoming sensory information and learns the forward model simultaneously. We show that PIAF is formally related to Kalman filtering and to the Recursive Least Squares linear approximation method, but combines these procedures in a Bayes-optimal fashion. Numerical evaluations confirm that the delusional loop is precluded and that the learning of the forward model is more than ten times faster when compared to a naive combination of Kalman filtering and Recursive Least Squares.
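
    For readers who want the filtering baseline that PIAF is compared against, the sketch below implements one predict/update cycle of a standard linear Kalman filter with a known forward model. It is not the PIAF algorithm itself (which additionally learns the forward model); the matrices and names follow the usual textbook convention and are illustrative.

        import numpy as np

        def kalman_step(x, P, u, z, A, B, C, Q, R):
            """One predict/update cycle of a linear Kalman filter for
            x' = A x + B u + w,  z = C x + v,  w ~ N(0, Q),  v ~ N(0, R)."""
            # Predict with the (assumed known) forward model
            x_pred = A @ x + B @ u
            P_pred = A @ P @ A.T + Q
            # Update: the prediction error (innovation) drives the correction
            S = C @ P_pred @ C.T + R                    # innovation covariance
            K = P_pred @ C.T @ np.linalg.inv(S)         # Kalman gain
            x_new = x_pred + K @ (z - C @ x_pred)
            P_new = (np.eye(len(x)) - K @ C) @ P_pred
            return x_new, P_new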

  7. Navigating Earthquake Physics with High-Resolution Array Back-Projection

    Science.gov (United States)

    Meng, Lingsen

    Understanding earthquake source dynamics is a fundamental goal of geophysics. Progress toward this goal has been slow due to the gap between state-of-the-art earthquake simulations and the limited source imaging techniques based on conventional low-frequency finite fault inversions. Seismic array processing is an alternative source imaging technique that employs the higher frequency content of the earthquakes and provides finer detail of the source process with few prior assumptions. While back-projection provides key observations of previous large earthquakes, the standard beamforming back-projection suffers from low resolution and severe artifacts. This thesis introduces the MUSIC technique, a high-resolution array processing method that aims to narrow the gap between the seismic observations and earthquake simulations. MUSIC is a high-resolution method that takes advantage of higher-order signal statistics. The method has not been widely used in seismology yet because of the nonstationary and incoherent nature of the seismic signal. We adapt MUSIC to transient seismic signals by incorporating multitaper cross-spectrum estimates. We also adopt a "reference window" strategy that mitigates the "swimming artifact," a systematic drift effect in back projection. The improved MUSIC back projections allow the imaging of recent large earthquakes in finer detail, which gives rise to new perspectives on dynamic simulations. In the 2011 Tohoku-Oki earthquake, we observe frequency-dependent rupture behaviors which relate to the material variation along the dip of the subduction interface. In the 2012 off-Sumatra earthquake, we image the complicated ruptures involving an orthogonal fault system and an unusual branching direction. This result, along with our complementary dynamic simulations, probes the pressure-insensitive strength of the deep oceanic lithosphere. In another example, back projection is applied to the 2010 M7 Haiti earthquake recorded at regional distance. The high-frequency subevents are located at the edges of geodetic slip regions, which are correlated to the stopping phases associated with rupture speed reduction when the earthquake arrests.

  8. Super-resolution Reconstruction Algorithm Based on Patch Similarity and Back-projection Modification

    Directory of Open Access Journals (Sweden)

    Wei-long Chen

    2014-07-01

    Full Text Available We propose an effective super-resolution reconstruction algorithm based on patch similarity and back-projection modification. In the proposed algorithm, we assume that patches in natural images are self-similar and extract high-frequency information from the most similar patch to add to the goal high-resolution image. In the process of reconstruction, the high-resolution patch is back-projected onto the low-resolution patch so as to obtain a detail correction. Experiments performed on simulated and real low-resolution images prove that the proposed super-resolution reconstruction algorithm is effective and efficient at improving image resolution and achieves better visual performance.
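
    The patch-similarity stage is dataset specific, but the back-projection modification itself is the classic iterative back-projection correction: simulate the low-resolution image from the current high-resolution estimate, then back-project the residual. A hedged scipy-based sketch of that loop is given below, with illustrative parameter names; it assumes the initial estimate is exactly scale times larger than the observation.

```python
import numpy as np
from scipy.ndimage import gaussian_filter, zoom

def back_projection_refine(hr_init, lr_obs, scale, n_iter=10, sigma=1.0):
    """Iterative back-projection refinement of a high-resolution estimate.

    hr_init : initial HR image (e.g. from patch-based synthesis),
              with shape equal to scale * lr_obs.shape
    lr_obs  : observed low-resolution image
    scale   : integer upscaling factor
    """
    hr = hr_init.astype(float).copy()
    for _ in range(n_iter):
        # Simulate the imaging model: blur, then downsample the HR estimate.
        lr_sim = zoom(gaussian_filter(hr, sigma), 1.0 / scale, order=1)
        err = lr_obs - lr_sim                 # low-resolution residual
        hr += zoom(err, scale, order=1)       # back-project the residual
    return hr
```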

  9. Comparison of back projection methods of determining earthquake rupture process in time and frequency domains

    Science.gov (United States)

    Wang, W.; Wen, L.

    2013-12-01

    Back projection is a method that projects the seismic energy recorded by a seismic array back to the earthquake source region to determine the rupture process of a large earthquake. The method takes advantage of the coherence of seismic energy across a seismic array and is quick in determining some important properties of the earthquake source. The method can be performed in both the time and frequency domains. In the time domain, the most conventional procedure is beamforming with some measures for suppressing noise, such as Nth-root stacking. In the frequency domain, the multiple signal classification method (MUSIC) estimates the directions of arrival of multiple waves propagating through an array using a subspace method. The advantage of this method is the ability to study rupture properties at various frequencies and to resolve simultaneous arrivals, making it suitable for detecting bilateral rupture of an earthquake source. We present a comparison of back projection results for some large earthquakes between the time-domain and frequency-domain methods. The time-domain procedure produces an image that is smeared and exhibits some artifacts, although enhanced stacking methods can to some extent alleviate the problem. On the other hand, the MUSIC method resolves clear multiple arrivals and provides higher-resolution rupture imaging.
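
    As a concrete illustration of the time-domain branch, the sketch below performs a delay-and-sum beamforming back projection with Nth-root stacking over a grid of candidate source points; the travel times from each grid point to each station are assumed precomputed from a 1-D velocity model, and all names are hypothetical.

```python
import numpy as np

def beamform_stack(traces, fs, delays, nth_root=4):
    """Time-domain back projection with Nth-root stacking.

    traces : (n_sta, n_samples) array seismograms
    fs     : sampling rate in Hz
    delays : (n_grid, n_sta) predicted travel times in seconds from each
             candidate source grid point to each station
    Returns the stacked beam power per grid point.
    """
    n_sta, n_samples = traces.shape
    power = np.zeros(delays.shape[0])
    for g, t_sta in enumerate(delays):
        shift = np.rint((t_sta - t_sta.min()) * fs).astype(int)
        aligned = np.array([np.roll(traces[s], -shift[s]) for s in range(n_sta)])
        # Nth-root stack: incoherent noise is suppressed relative to a plain mean.
        stack = np.mean(np.sign(aligned) * np.abs(aligned) ** (1.0 / nth_root), axis=0)
        beam = np.sign(stack) * np.abs(stack) ** nth_root
        power[g] = np.sum(beam ** 2)
    return power
```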

  10. Accurate two-dimensional IMRT verification using a back-projection EPID dosimetry method

    International Nuclear Information System (INIS)

    The use of electronic portal imaging devices (EPIDs) is a promising method for the dosimetric verification of external beam, megavoltage radiation therapy, both pretreatment and in vivo. In this study, a previously developed EPID back-projection algorithm was modified for IMRT techniques and applied to an amorphous silicon EPID. By using this back-projection algorithm, two-dimensional dose distributions inside a phantom or patient are reconstructed from portal images. The model requires the primary dose component at the position of the EPID. A parametrized description of the lateral scatter within the imager was obtained from measurements with an ionization chamber in a miniphantom. In addition to point dose measurements on the central axis of square fields of different size, we also used dose profiles of those fields as reference input data for our model. This yielded a better description of the lateral scatter within the EPID, which resulted in a higher accuracy in the back-projected, two-dimensional dose distributions. The accuracy of our approach was tested for pretreatment verification of a five-field IMRT plan for the treatment of prostate cancer. Each field had between six and eight segments and was evaluated by comparing the back-projected, two-dimensional EPID dose distribution with a film measurement inside a homogeneous slab phantom. For this purpose, the γ-evaluation method was used with a dose-difference criterion of 2% of dose maximum and a distance-to-agreement criterion of 2 mm. Excellent agreement was found between EPID and film measurements for each field, both in the central part of the beam and in the penumbra and low-dose regions. It can be concluded that our modified algorithm is able to accurately predict the dose in the midplane of a homogeneous slab phantom. For pretreatment IMRT plan verification, EPID dosimetry is a reliable and potentially fast tool to check the absolute dose in two dimensions inside a phantom for individual IMRT fields. Film measurements inside a phantom can therefore be replaced by EPID measurements.
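
    For reference, the γ-evaluation used here combines a dose-difference and a distance-to-agreement criterion into a single index; a point passes when γ is at most 1. A brute-force 2-D sketch, adequate for small grids and with illustrative names only, is shown below.

```python
import numpy as np

def gamma_index(dose_ref, dose_eval, pixel_mm, dd=0.02, dta_mm=2.0):
    """Brute-force 2-D gamma evaluation with a global dose-difference criterion.

    dose_ref, dose_eval : dose planes on the same grid (e.g. film vs. EPID)
    pixel_mm            : pixel spacing in mm
    dd                  : dose-difference criterion, fraction of the maximum dose
    dta_mm              : distance-to-agreement criterion in mm
    """
    ny, nx = dose_ref.shape
    dd_abs = dd * dose_ref.max()
    y, x = np.mgrid[0:ny, 0:nx]
    gamma = np.zeros_like(dose_ref, dtype=float)
    for i in range(ny):
        for j in range(nx):
            dist2 = ((y - i) ** 2 + (x - j) ** 2) * pixel_mm ** 2
            diff2 = (dose_eval - dose_ref[i, j]) ** 2
            gamma[i, j] = np.sqrt(np.min(dist2 / dta_mm ** 2 + diff2 / dd_abs ** 2))
    return gamma
```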

  11. A new linear back projection algorithm to electrical tomography based on measuring data decomposition

    Science.gov (United States)

    Sun, Benyuan; Yue, Shihong; Cui, Ziqiang; Wang, Huaxiang

    2015-12-01

    As an advanced measurement technique that is non-radiative, non-intrusive, rapid in response, and low in cost, electrical tomography (ET) has developed rapidly in recent decades. The ET imaging algorithm plays an important role in the ET imaging process. Linear back projection (LBP) is the most widely used ET algorithm due to its advantages of a dynamic imaging process, real-time response, and easy realization. But the LBP algorithm has low spatial resolution due to the natural ‘soft field’ effect and ‘ill-posed solution’ problems; thus its range of application is greatly limited. In this paper, an original data decomposition method is proposed, in which each ET measurement is decomposed into two independent new data based on the positive and negative sensing areas of the measurement. Consequently, the total number of measurements is extended to twice the number of original data, effectively reducing the ‘ill-posed solution’ problem. On the other hand, an index to measure the ‘soft field’ effect is proposed. The index shows that the decomposed data can distinguish between the different contributions of various units (pixels) for any ET measurement, and can efficiently reduce the ‘soft field’ effect in the ET imaging process. In light of the data decomposition method, a new linear back projection algorithm is proposed to improve the spatial resolution of the ET image. A series of simulations and experiments validate the proposed algorithm in terms of real-time performance and improved spatial resolution.
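
    For orientation, conventional LBP itself amounts to normalizing the measurements between two reference states and spreading them over the pixels with the sensitivity maps; the decomposition proposed in the record acts on the measurement vector before this step. A minimal sketch of the conventional baseline, with illustrative names, is given below.

```python
import numpy as np

def linear_back_projection(meas, meas_low, meas_high, sens):
    """Conventional LBP reconstruction for electrical tomography.

    meas, meas_low, meas_high : (n_meas,) measurements for the object and
                                for the low/high reference distributions
    sens                      : (n_meas, n_pix) sensitivity maps
    Returns a normalized image in [0, 1].
    """
    lam = (meas - meas_low) / (meas_high - meas_low)   # normalized data
    img = sens.T @ lam                                  # back projection
    weight = sens.T @ np.ones_like(lam)                 # per-pixel normalization
    return np.clip(img / weight, 0.0, 1.0)
```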

  12. External force back-projective composition and globally deformable optimization for 3-D coronary artery reconstruction

    International Nuclear Information System (INIS)

    The 3D reconstruction of coronary arteries is of important clinical value for the diagnosis and intervention of cardiovascular diseases. This work proposes a method based on a deformable model for reconstructing coronary arteries from two monoplane angiographic images acquired from different angles. First, an external force back-projective composition model is developed to determine the external force, in which the force distributions in different views are back-projected into 3D space and composited in the same coordinate system based on the perspective projection principle of x-ray imaging. The elasticity and bending forces are composited as an internal force to maintain the smoothness of the deformable curve. Second, the deformable curve evolves rapidly toward the true vascular centerlines in 3D space and in the angiographic images under the combination of internal and external forces. Third, densely matched correspondence among vessel centerlines is constructed using a curve alignment method. The bundle adjustment method is then utilized for the global optimization of the projection parameters and the 3D structures. The proposed method is validated on phantom data and routine angiographic images with consideration of space errors and re-projection image errors. Experimental results demonstrate the effectiveness and robustness of the proposed method for the reconstruction of coronary arteries from two monoplane angiographic images. The proposed method achieves a mean space error of 0.564 mm and a mean re-projection error of 0.349 mm. (paper)

  13. Visual Effects for Stereoscopic 3D Contents: Experiences from the Don't Look Back-project

    OpenAIRE

    Matilainen, Tapio

    2012-01-01

    Visual Effects for Stereoscopic 3D Contents: Experiences from the Don't Look Back Project describes the workflow and phases I went through in making visual effects for stereoscopic 3D footage filmed earlier in 2010. The work includes two shots from the Don't Look Back project, for which I had the opportunity to make computer-generated imagery and visual effects. Stereoscopic three-dimensional (S3D) refers in cinema to films that are viewed with specific glasses on. S3D material is create...

  14. Resolution Analysis of finite fault inversions: A back-projection approach.

    Science.gov (United States)

    Ji, C.; Shao, G.

    2007-12-01

    The resolution of inverted source models of large earthquakes is controlled by the frequency content of "coherent" (or "useful") seismic observations and their spatial distribution. But it is difficult to distinguish whether features that are consistent across different inversions are really required by the data or are a consequence of "prior" information, such as velocity structure, fault geometry, and model parameterization. Here, we investigate the spatial resolution of the model by first back projecting and stacking the data at the source region and then analyzing the spatial-temporal variations of the focusing regions, which are arbitrarily defined as the regions within 90% of the peak focusing amplitude. Our preliminary results indicate that: 1) The spatial-temporal resolution in a particular direction is controlled by the range of the directivity parameter [p·cos(θ)] spanned by the seismic network, where p is the horizontal slowness from the hypocenter and θ is the difference between the station azimuth and this orientation. Therefore, the network aperture is more important than the number of stations. 2) Simple stacking is a robust method for capturing the asperities, but the sizes of the focusing regions are usually much larger than what the data can resolve; carefully weighting the data before stacking can enhance the spatial resolution in a particular direction. 3) Results based on the teleseismic P waves of a local network usually suffer from the trade-off between the source's spatial location and its rupture time. The resolution of the 2001 Kunlunshan earthquake and the 2006 Kuril Islands earthquake will be investigated.

  15. Parathyroid scintigraphy - benefit of early and late SPECT with iterative reconstruction (IR) versus filtered back projection (FBP)

    International Nuclear Information System (INIS)

    Full text: The published literature is conflicting on the benefit of parathyroid SPECT. This study evaluates the incremental benefit of SPECT over planar imaging, comparing both the timing of SPECT and the processing technique used. Over 1 year, patients referred for parathyroid scintigraphy were studied with conventional dual tracer and dual phase planar imaging using 50 MBq of Tc-99m-pertechnetate and 800 MBq of Tc-99m MIBI. SPECT was performed both after the initial and 2 hour planars, and processed using both FBP and IR. All studies were randomised and read as planar, planar+FBP then planar+FBP+IR for both early and late SPECT by 2 blinded readers. Focal abnormalities were scored on a 5 point scale, with scores of 4 and 5 being called positive. In cases of observer disagreement a third blinded reader was used. Surgical follow up was available in 16 of 33 patients. Two were surgically non-curative and excluded (including one probable scintigraphic mediastinal adenoma not located at surgery). Sensitivity and ROC analyses were performed to evaluate the incremental benefit of FBP and IR SPECT over planars. No significant difference was found between early and late SPECT. ROC analysis of individual readers showed improved accuracy of SPECT over planars for one of the two readers. IR SPECT improves sensitivity without loss of specificity compared to planar imaging. Late SPECT shows no additional benefit. Copyright (2002) The Australian and New Zealand Society of Nuclear Medicine Inc

  16. Importance of point-by-point back projection correction for isocentric motion in digital breast tomosynthesis: Relevance to morphology of structures such as microcalcifications

    International Nuclear Information System (INIS)

    Digital breast tomosynthesis is a three-dimensional imaging technique that provides an arbitrary set of reconstruction planes in the breast from a limited-angle series of projection images acquired while the x-ray tube moves. Traditional shift-and-add (SAA) tomosynthesis reconstruction is a common mathematical method to line up each projection image based on its shifting amount to generate reconstruction slices. With parallel-path geometry of tube motion, the path of the tube lies in a plane parallel to the plane of the detector. The traditional SAA algorithm gives shift amounts for each projection image calculated only along the direction of x-ray tube movement. However, with the partial isocentric motion of the x-ray tube in breast tomosynthesis, small objects such as microcalcifications appear blurred (for instance, about 1-4 pixels in blur for a microcalcification in a human breast) in traditional SAA images in the direction perpendicular to the direction of tube motion. Some digital breast tomosynthesis algorithms reported in the literature utilize a traditional one-dimensional SAA method that is not wholly suitable for isocentric motion. In this paper, a point-by-point back projection (BP) method is described and compared with traditional SAA for the important clinical task of evaluating morphology of small objects such as microcalcifications. Impulse responses at different three-dimensional locations with five different combinations of imaging acquisition parameters were investigated. Reconstruction images of microcalcifications in a human subject were also evaluated. Results showed that with traditional SAA and 45 deg. view angle of tube movement with respect to the detector, at the same height above the detector, the in-plane blur artifacts were obvious for objects farther away from x-ray source. In a human subject, the appearance of calcifications was blurred in the direction orthogonal to the tube motion with traditional SAA. With point-by-point BP, the appearance of calcifications was sharper. The point-by-point BP method demonstrated improved rendition of microcalcifications in the direction perpendicular to the tube motion direction. With wide angles or for imaging of larger breasts, this point-by-point BP rather than the traditional SAA should also be considered as the basis of further deblurring algorithms that work in conjunction with the BP method

  17. Importance of point-by-point back projection correction for isocentric motion in digital breast tomosynthesis: relevance to morphology of structures such as microcalcifications.

    Science.gov (United States)

    Chen, Ying; Lo, Joseph Y; Dobbins, James T

    2007-10-01

    Digital breast tomosynthesis is a three-dimensional imaging technique that provides an arbitrary set of reconstruction planes in the breast from a limited-angle series of projection images acquired while the x-ray tube moves. Traditional shift-and-add (SAA) tomosynthesis reconstruction is a common mathematical method to line up each projection image based on its shifting amount to generate reconstruction slices. With parallel-path geometry of tube motion, the path of the tube lies in a plane parallel to the plane of the detector. The traditional SAA algorithm gives shift amounts for each projection image calculated only along the direction of x-ray tube movement. However, with the partial isocentric motion of the x-ray tube in breast tomosynthesis, small objects such as microcalcifications appear blurred (for instance, about 1-4 pixels in blur for a microcalcification in a human breast) in traditional SAA images in the direction perpendicular to the direction of tube motion. Some digital breast tomosynthesis algorithms reported in the literature utilize a traditional one-dimensional SAA method that is not wholly suitable for isocentric motion. In this paper, a point-by-point back projection (BP) method is described and compared with traditional SAA for the important clinical task of evaluating morphology of small objects such as microcalcifications. Impulse responses at different three-dimensional locations with five different combinations of imaging acquisition parameters were investigated. Reconstruction images of microcalcifications in a human subject were also evaluated. Results showed that with traditional SAA and 45 degrees view angle of tube movement with respect to the detector, at the same height above the detector, the in-plane blur artifacts were obvious for objects farther away from x-ray source. In a human subject, the appearance of calcifications was blurred in the direction orthogonal to the tube motion with traditional SAA. With point-by-point BP, the appearance of calcifications was sharper. The point-by-point BP method demonstrated improved rendition of microcalcifications in the direction perpendicular to the tube motion direction. With wide angles or for imaging of larger breasts, this point-by-point BP rather than the traditional SAA should also be considered as the basis of further deblurring algorithms that work in conjunction with the BP method. PMID:17985634

  18. GPU-accelerated back-projection revisited. Squeezing performance by careful tuning

    International Nuclear Information System (INIS)

    In recent years, GPUs have become an increasingly popular tool in computed tomography (CT) reconstruction. In this paper, we discuss performance optimization techniques for a GPU-based filtered-backprojection reconstruction implementation. We explore the different optimization techniques we used and explain how those techniques affected performance. Our results show a nearly 50% increase in performance when compared to the current top ranked GPU implementation. (orig.)
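
    The kernel being tuned in such work is essentially a pixel-driven back projection: every pixel (one thread each on a GPU) accumulates, for each view, a linearly interpolated sample of the filtered projection. The plain numpy sketch below shows the operation that a GPU implementation parallelizes; it is not the optimized code from the record, and parallel-beam geometry is assumed for simplicity.

```python
import numpy as np

def backproject(filtered_sino, angles_deg, n_pix):
    """Pixel-driven parallel-beam back projection of a filtered sinogram.

    filtered_sino : (n_angles, n_det) ramp-filtered projections
    angles_deg    : projection angles in degrees
    n_pix         : side length of the square output image
    """
    n_angles, n_det = filtered_sino.shape
    c = (n_pix - 1) / 2.0
    x, y = np.meshgrid(np.arange(n_pix) - c, np.arange(n_pix) - c)
    image = np.zeros((n_pix, n_pix))
    for a, theta in enumerate(np.deg2rad(angles_deg)):
        # Detector coordinate of every pixel for this view.
        t = x * np.cos(theta) + y * np.sin(theta) + (n_det - 1) / 2.0
        # Pixels falling outside the detector are clamped to the edge sample.
        t0 = np.clip(np.floor(t).astype(int), 0, n_det - 2)
        w = np.clip(t - t0, 0.0, 1.0)            # linear interpolation weight
        image += (1 - w) * filtered_sino[a, t0] + w * filtered_sino[a, t0 + 1]
    return image * np.pi / n_angles
```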

  19. Mitigating artifacts in back-projection source imaging with implications for frequency-dependent properties of the Tohoku-Oki earthquake

    Science.gov (United States)

    Meng, Lingsen; Ampuero, Jean-Paul; Luo, Yingdi; Wu, Wenbo; Ni, Sidao

    2012-12-01

    Comparing teleseismic array back-projection source images of the 2011 Tohoku-Oki earthquake with results from static and kinematic finite source inversions has revealed little overlap between the regions of high- and low-frequency slip. Motivated by this interesting observation, back-projection studies extended to intermediate frequencies, down to about 0.1 Hz, have suggested that a progressive transition of rupture properties as a function of frequency is observable. Here, by adapting the concept of array response function to non-stationary signals, we demonstrate that the "swimming artifact", a systematic drift resulting from signal non-stationarity, induces significant bias on beamforming back-projection at low frequencies. We introduce a "reference window strategy" into the multitaper-MUSIC back-projection technique and significantly mitigate the "swimming artifact" at high frequencies (1 s to 4 s). At lower frequencies, this modification yields notable, but significantly smaller, artifacts than time-domain stacking. We perform extensive synthetic tests that include a 3D regional velocity model for Japan. We analyze the recordings of the Tohoku-Oki earthquake at the USArray and at the European array at periods from 1 s to 16 s. The migration of the source location as a function of period, regardless of the back-projection methods, has characteristics that are consistent with the expected effect of the "swimming artifact". In particular, the apparent up-dip migration as a function of frequency obtained with the USArray can be explained by the "swimming artifact". This indicates that the most substantial frequency-dependence of the Tohoku-Oki earthquake source occurs at periods longer than 16 s. Thus, low-frequency back-projection needs to be further tested and validated in order to contribute to the characterization of frequency-dependent rupture properties.

  20. Images of gravitational and magnetic phenomena derived from two-dimensional back-projection Doppler tomography of interacting binary stars

    Energy Technology Data Exchange (ETDEWEB)

    Richards, Mercedes T.; Cocking, Alexander S.; Fisher, John G.; Conover, Marshall J., E-mail: mrichards@astro.psu.edu, E-mail: asc5097@psu.edu [Department of Astronomy and Astrophysics, Pennsylvania State University, University Park, PA 16802 (United States)

    2014-11-10

    We have used two-dimensional back-projection Doppler tomography as a tool to examine the influence of gravitational and magnetic phenomena in interacting binaries that undergo mass transfer from a magnetically active star onto a non-magnetic main-sequence star. This multitiered study of over 1300 time-resolved spectra of 13 Algol binaries involved calculations of the predicted dynamical behavior of the gravitational flow and the dynamics at the impact site, analysis of the velocity images constructed from tomography, and the influence on the tomograms of orbital inclination, systemic velocity, orbital coverage, and shadowing. The Hα tomograms revealed eight sources: chromospheric emission, a gas stream along the gravitational trajectory, a star-stream impact region, a bulge of absorption or emission around the mass-gaining star, a Keplerian accretion disk, an absorption zone associated with hotter gas, a disk-stream impact region, and a hot spot where the stream strikes the edge of a disk. We described several methods used to extract the physical properties of the emission sources directly from the velocity images, including S-wave analysis, the creation of simulated velocity tomograms from hydrodynamic simulations, and the use of synthetic spectra with tomography to sequentially extract the separate sources of emission from the velocity image. In summary, the tomography images have revealed results that cannot be explained solely by gravitational effects: chromospheric emission moving with the mass-losing star, a gas stream deflected from the gravitational trajectory, and alternating behavior between stream state and disk state. Our results demonstrate that magnetic effects cannot be ignored in these interacting binaries.

  1. Images of gravitational and magnetic phenomena derived from two-dimensional back-projection Doppler tomography of interacting binary stars

    International Nuclear Information System (INIS)

    We have used two-dimensional back-projection Doppler tomography as a tool to examine the influence of gravitational and magnetic phenomena in interacting binaries that undergo mass transfer from a magnetically active star onto a non-magnetic main-sequence star. This multitiered study of over 1300 time-resolved spectra of 13 Algol binaries involved calculations of the predicted dynamical behavior of the gravitational flow and the dynamics at the impact site, analysis of the velocity images constructed from tomography, and the influence on the tomograms of orbital inclination, systemic velocity, orbital coverage, and shadowing. The Hα tomograms revealed eight sources: chromospheric emission, a gas stream along the gravitational trajectory, a star-stream impact region, a bulge of absorption or emission around the mass-gaining star, a Keplerian accretion disk, an absorption zone associated with hotter gas, a disk-stream impact region, and a hot spot where the stream strikes the edge of a disk. We described several methods used to extract the physical properties of the emission sources directly from the velocity images, including S-wave analysis, the creation of simulated velocity tomograms from hydrodynamic simulations, and the use of synthetic spectra with tomography to sequentially extract the separate sources of emission from the velocity image. In summary, the tomography images have revealed results that cannot be explained solely by gravitational effects: chromospheric emission moving with the mass-losing star, a gas stream deflected from the gravitational trajectory, and alternating behavior between stream state and disk state. Our results demonstrate that magnetic effects cannot be ignored in these interacting binaries.

  2. Electrical capacitance tomography two-phase oil-gas pipe flow imaging by the linear back-projection algorithm

    Directory of Open Access Journals (Sweden)

    R. Martin

    2005-12-01

    Full Text Available Electrical Capacitance Tomography (ECT) is a novel technology that can deal with the complexity of two-phase gas-oil flow measurement by explicitly deriving the component distributions on two adjacent planes along a pipeline. One of its most promising applications is the visualization of gas-oil flows. ECT offers some advantages over other tomography modalities, such as no radiation, rapid response, low cost, being non-intrusive and non-invasive, and the ability to withstand high temperature and high pressure. The linear back-projection (LBP) algorithm is one of the most popular methods employed to perform image reconstruction in ECT. Despite its relatively poor accuracy, it is a simple and fast procedure capable of real-time operation in many applications, and it has remained a very popular choice. However, since it was first reported it has lacked clear formal support in the context of this application. Its only justification has been that it was an adaptation of a method normally used in linear X-ray medical tomography, and the fact that it actually does produce useful (albeit only ‘qualitative’) images. In this paper, one illustrative way of interpreting LBP is presented. It is shown how LBP is actually based on the linearisation of a normalised form of the forward problem. More specifically, the normalised forward problem is approximated by means of a series of hyper-planes. The reconstruction matrix used in LBP is found to be a ‘weighted’ transpose of the linear operator (matrix) that defines the linearised normalised forward problem. The rows of this latter matrix contain the information of the sensitivity maps used in LBP.

  3. Semicircular microstrip low pass filter

    OpenAIRE

    Kumud Ranjan Jha; Manish Rai

    2008-01-01

    This paper presents a semicircular microstrip low-pass filter with sharp rejection and a wide stop band. The proposed filter design is based on the calculation of filter parameters from the traditional hi-lo impedance method available in the microstrip filter literature. To further improve the design performance, high impedance lines are magnetically coupled, resulting in an attenuation pole near the -3 dB cut-off point of the filter. This design gives insight into designing a low pass filter wit...

  4. Digital Filters for Low Frequency Equalization

    DEFF Research Database (Denmark)

    Tyril, Marni; Abildgaard, J.; Rubak, Per

    Digital filters with high resolution in the low-frequency range are studied. Specifically, for a given computational power, traditional IIR filters are compared with warped FIR filters, warped IIR filters, and modified warped FIR filters termed warped individual z FIR filters (WizFIR). The results...

  5. Filter Program

    International Nuclear Information System (INIS)

    Air filter research and development in the following areas is discussed: post-accident recirculation filter for removal of fission products from containment atmospheres; exhaust air filters for reprocessing facilities; and exhaust air filters for reactors. (JWR)

  6. Generalized Nonlinear Complementary Attitude Filter

    OpenAIRE

    Jensen, Kenneth

    2011-01-01

    This work describes a family of attitude estimators that are based on a generalization of Mahony's nonlinear complementary filter. This generalization reveals the close mathematical relationship between the nonlinear complementary filter and the more traditional multiplicative extended Kalman filter. In fact, the bias-free and constant gain multiplicative continuous-time extended Kalman filters may be interpreted as special cases of the generalized attitude estimator. The co...

  7. Rupture process of the 2011 off the Pacific coast of Tohoku Earthquake ( M w 9.0) as imaged with back-projection of teleseismic P-waves

    Science.gov (United States)

    Wang, Dun; Mori, Jim

    2011-07-01

    We use the back-projection method, with data recorded on the dense USArray network, to estimate the rupture propagation for the M w 9.0 earthquake that occurred offshore of the Tohoku region, Japan. The results show a variable rupture propagation ranging from about 1.0 to 3.0 km/s for the high-frequency radiation. The rupture propagates over about 450 km in approximately 150 s. Based on the rupture speed and direction, the high-frequency source process can be divided into two parts. The first part has a relatively slow rupture speed of 1.0 to 1.5 km/s and propagates northwestward. In the second part, the rupture progresses southwestward starting with a slow speed of about 1.5 km/s and accelerating to about 3.0 km/s. We see three large pulses at 30 s, 80 s and 130 s. The first two, including the largest second pulse, were located 50 to 70 km northwest of the epicenter. The third occurred about 250 km southwest of the epicenter. The variability of rupture velocity may be associated with significant changes of physical properties along the fault plane. Areas of low/high rupture speed are associated with large/small energy releases on the fault plane. These variations may reflect the strength properties along the fault. Also, locations of the high-frequency radiation derived from the back-projection analysis are significantly different from the areas of very large slip for this earthquake.

  8. Additive filtering

    International Nuclear Information System (INIS)

    We have developed and used a filtering system with ferromagnetic plates at the tube housing and magnets glued onto the filters. The system allows free positioning of filters of different sizes, the use of more than one filter at the same time and, with a second coupling plate, even the addition of two filters. (author)

  9. Evaluating low pass filters on SPECT reconstructed cardiac orientation estimation

    Science.gov (United States)

    Dwivedi, Shekhar

    2009-02-01

    Low pass filters can affect the quality of clinical SPECT images by smoothing. Appropriate filter and parameter selection leads to optimum smoothing, which in turn leads to better quantification, correct diagnosis, and accurate interpretation by the physician. This study aims to evaluate low pass filters applied with SPECT reconstruction algorithms. The criterion for evaluating the filters is the estimation of the SPECT-reconstructed cardiac azimuth and elevation angles. The low pass filters studied are Butterworth, Gaussian, Hamming, Hanning, and Parzen. Experiments are conducted using three reconstruction algorithms, FBP (filtered back projection), MLEM (maximum likelihood expectation maximization) and OSEM (ordered subsets expectation maximization), on four gated cardiac patient projections (two patients with stress and rest projections). Each filter is applied with varying cutoff and order for each reconstruction algorithm (only Butterworth is used for MLEM and OSEM). The azimuth and elevation angles are calculated from the reconstructed volume, and the variation observed in the angles with varying filter parameters is reported. Our results demonstrate that the behavior of the Hamming, Hanning, and Parzen filters (used with FBP) with varying cutoff is similar for all the datasets. The Butterworth filter (cutoff > 0.4) behaves similarly for all the datasets using all the algorithms, whereas for a cutoff < 0.4 it fails to generate a cardiac orientation with OSEM due to oversmoothing and gives an unstable response with FBP and MLEM. This study of the effect of low pass filter cutoff and order on cardiac orientation using three different reconstruction algorithms provides useful insight into the optimal selection of filter parameters.
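
    In practice the cutoff and order enter FBP through the window applied to the ramp filter before back projection. The sketch below builds a Butterworth-windowed ramp filter and applies it to every projection row; the cutoff is expressed as a fraction of the Nyquist frequency, which is the usual SPECT convention, and all names are illustrative.

```python
import numpy as np

def butterworth_ramp(n_det, cutoff=0.4, order=5):
    """Ramp filter apodized by a Butterworth low-pass window.

    cutoff : cutoff as a fraction of the Nyquist frequency
    order  : Butterworth order controlling the roll-off steepness
    """
    freqs = np.fft.rfftfreq(n_det)                  # cycles/sample, 0 .. 0.5
    ramp = 2.0 * freqs
    nyquist = 0.5
    window = 1.0 / np.sqrt(1.0 + (freqs / (cutoff * nyquist)) ** (2 * order))
    return ramp * window

def filter_sinogram(sinogram, cutoff=0.4, order=5):
    """Apply the windowed ramp filter to every projection row before FBP."""
    n_det = sinogram.shape[1]
    H = butterworth_ramp(n_det, cutoff, order)
    return np.fft.irfft(np.fft.rfft(sinogram, axis=1) * H, n=n_det, axis=1)
```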

  10. Filtering apparatus

    International Nuclear Information System (INIS)

    Filtering apparatus for removing toxic particles from a fluid includes a sealed vessel having a fluid inlet and a fluid outlet between which is an array of elongate filter elements. The filter elements are so disposed that when viewed in the direction of their length the axis of each filter element is spaced from that of each adjacent filter element by a distance R whereby the axes lie at the apexes of a plurality of equilateral triangles. The axes of six of the filter elements are equispaced about a central pitch circle whose radius is R and at the centre of which is the axis of a seventh filter element. A portion of the wall of the vessel is removable to permit access to the filter elements and each filter element has means to locate a machine for remotely exchanging the filter elements. (author)

  11. Keeping Tradition

    OpenAIRE

    Zenhong, C.; Buwalda, P.L.

    2011-01-01

    Chinese dumplings such as Jiao Zi and Bao Zi are two of the popular traditional foods in Asia. They are usually made from wheat flour dough (rice flour or starch is sometimes used) that contains fillings. They can be steamed, boiled and fried and are consumed either as a main meal or dessert. As these tasty dumplings are easy to prepare, they have become one of Asia's fastest growing products in the frozen and ready-to-eat sector.

  12. Portable Wideband Microwave Imaging System for Intracranial Hemorrhage Detection Using Improved Back-projection Algorithm with Model of Effective Head Permittivity.

    Science.gov (United States)

    Mobashsher, Ahmed Toaha; Mahmoud, A; Abbosh, A M

    2016-01-01

    Intracranial hemorrhage is a medical emergency that requires rapid detection and medication to restrict any brain damage to a minimum. Here, an effective wideband microwave head imaging system for on-the-spot detection of intracranial hemorrhage is presented. The operation of the system relies on the dielectric contrast between healthy brain tissues and a hemorrhage that causes a strong microwave scattering. The system uses a compact sensing antenna, which has an ultra-wideband operation with directional radiation, and a portable, compact microwave transceiver for signal transmission and data acquisition. The collected data are processed to create a clear image of the brain using an improved back projection algorithm, which is based on a novel effective head permittivity model. The system is verified in realistic simulation and experimental environments using anatomically and electrically realistic human head phantoms. Quantitative and qualitative comparisons between the images from the proposed and existing algorithms demonstrate significant improvements in detection and localization accuracy. The radiation and thermal safety of the system are examined and verified. Initial human tests are conducted on healthy subjects with different head sizes. The reconstructed images are statistically analyzed, and the absence of false positive results indicates the efficacy of the proposed system in future preclinical trials. PMID:26842761

  13. Relationship between high-frequency radiation and asperity ruptures, revealed by hybrid back-projection with a non-planar fault model.

    Science.gov (United States)

    Okuwaki, Ryo; Yagi, Yuji; Hirano, Shiro

    2014-01-01

    High-frequency seismic waves are generated by abrupt changes of rupture velocity and slip-rate during an earthquake. Therefore, analysis of high-frequency waves is crucial to understanding the dynamic rupture process. Here, we developed a hybrid back-projection method that considers variations in focal mechanisms by introducing a non-planar fault model that reflects the subducting slab geometry. We applied it to teleseismic P-waveforms of the Mw 8.8 2010 Chile earthquake to estimate the spatiotemporal distribution of high-frequency (0.5-2.0 Hz) radiation. By comparing the result with the coseismic slip distribution obtained by waveform inversion, we found that strong high-frequency radiation can precede and may trigger a large asperity rupture. Moreover, in between the large slip events, high-frequency radiation of intermediate strength was concentrated along the rupture front. This distribution suggests that by bridging the two large slips, this intermediate-strength high-frequency radiation might play a key role in the interaction of the large slip events. PMID:25406638

  14. GMTI processing using back projection.

    Energy Technology Data Exchange (ETDEWEB)

    Doerry, Armin Walter

    2013-07-01

    Backprojection has long been applied to SAR image formation. It has equal utility in forming the range-velocity maps for Ground Moving Target Indicator (GMTI) radar processing. In particular, it overcomes the problem of targets migrating through range resolution cells.

  15. Air filter

    International Nuclear Information System (INIS)

    An air filter consists of an upright cylinder of corrugated or pleated filter fabric, joined at its upper end to a tubular right-angled elbow. The open end of the elbow includes an internal lip seal, so the elbow can be slid onto a horizontal spigot in an air filter unit. The filter can be cleaned by subjecting the fabric to a reverse pressure pulse from a nozzle. The construction facilitates removal of the filter into a plastic bag secured round a frame behind a door, when the unit is used to filter radioactive dust from air. (author)

  16. Characterizing trends in HIV infection among men who have sex with men in Australia by birth cohorts: results from a modified back-projection method

    Directory of Open Access Journals (Sweden)

    Wand Handan

    2009-09-01

    Full Text Available Abstract Background We set out to estimate historical trends in HIV incidence in Australian men who have sex with men with respect to age at infection and birth cohort. Methods A modified back-projection technique is applied to data from the HIV/AIDS Surveillance System in Australia, including "newly diagnosed HIV infections", "newly acquired HIV infections" and "AIDS diagnoses", to estimate trends in HIV incidence over both calendar time and age at infection. Results Our results demonstrate that since 2000, there has been an increase in new HIV infections in Australian men who have sex with men across all age groups. The estimated mean age at infection increased from ~35 years in 2000 to ~37 years in 2007. When the epidemic peaked in the mid 1980s, the majority of the infections (56%) occurred among men aged 30 years and younger; 30% occurred in ages 31 to 40 years; and only ~14% of them were attributed to the group who were older than 40 years of age. In 2007, the proportion of infections occurring in persons 40 years or older doubled to 31% compared to the mid 1980s, while the proportion of infections attributed to the group younger than 30 years of age decreased to 36%. Conclusion The distribution of HIV incidence for birth cohorts by infection year suggests that the HIV epidemic continues to affect older homosexual men as much as, if not more than, younger men. The results are useful for evaluating the impact of the epidemic across successive birth cohorts and for studying trends among the age groups most at risk.
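
    The back-projection (back-calculation) idea underlying this kind of analysis is a convolution model: expected diagnoses in a given year are past infections weighted by the probability of being diagnosed a given number of years after infection, and the incidence curve is then chosen (for example by penalized maximum likelihood) so that this forward model matches the observed counts. The toy sketch below shows only the forward model, with made-up numbers; it is not the modified method of the record.

```python
import numpy as np

def expected_diagnoses(incidence, diag_prob):
    """Expected diagnoses per year, given yearly incidence and the
    probability of diagnosis k years after infection."""
    n = len(incidence)
    expected = np.zeros(n)
    for year in range(n):
        for k, p in enumerate(diag_prob):
            if year + k < n:
                expected[year + k] += incidence[year] * p
    return expected

# Hypothetical numbers, for illustration only.
incidence = np.array([100.0, 300.0, 500.0, 400.0, 250.0, 200.0])
diag_prob = np.array([0.10, 0.20, 0.30, 0.20, 0.10, 0.10])
print(expected_diagnoses(incidence, diag_prob))
```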

  17. Rupture Processes of the Mw8.3 Sea of Okhotsk Earthquake and Aftershock Sequences from 3-D Back Projection Imaging

    Science.gov (United States)

    Jian, P. R.; Hung, S. H.; Meng, L.

    2014-12-01

    On May 24, 2013, the largest deep earthquake ever recorded in history occurred near the southern tip of the Kamchatka Peninsula, where the Pacific Plate subducts underneath the Okhotsk Plate. Previous 2D beamforming back projection (BP) of P-coda waves suggests the mainshock ruptured bilaterally along a horizontal fault plane determined by the global centroid moment tensor solution. On the other hand, the multiple point source inversion of P and SH waveforms argued that the earthquake comprises a sequence of 6 subevents not located on a single plane but actually distributed in a zone that extends 64 km horizontally and 35 km in depth. We then apply a three-dimensional MUSIC BP approach to resolve the rupture processes of the mainshock and two large aftershocks (M6.7) with no a priori setup of preferential orientations of the planar rupture. The maximum pseudo-spectrum of high-frequency P waves in a sequence of time windows recorded by the densely distributed stations of the US and EU Arrays is used to image the 3-D temporal and spatial rupture distribution. The resulting image confirms two nearly N-S-striking but antiparallel rupture stages. The first, subhorizontal rupture initially propagates toward the NNE, while about 18 s later the rupture reverses toward the SSW and concurrently shifts about 35 km deeper, lasting for about 20 s. The rupture lengths in the first NNE-ward and second SSW-ward stages are about 30 km and 85 km; the estimated rupture velocities are 3 km/s and 4.25 km/s, respectively. Synthetic experiments are undertaken to assess the capability of the 3D MUSIC BP to recover spatio-temporal rupture processes. In addition, high-frequency BP images based on the EU-Array data show that the two M6.7 aftershocks more likely ruptured on vertical fault planes.

  18. Filtration device

    Energy Technology Data Exchange (ETDEWEB)

    Igarashi, Ryokichi.

    1990-05-31

    The present invention concerns a filtration device for filtering impurities in condensates or radioactive fluids generated in BWR type reactors. A filter device for removing radioactive particles and iodine from pressurized air is disposed in the filtration device. The filter device comprises a wet component removing device, a high performance particle filter for removing radioactive particles, and an iodine removing filter. When the backwashing water used for backwashing the filtration material in the filter device is transported under pressure by pressurizing air, the backwashing water is in contact with the pressurizing air, so radioactive particles and iodine contained in the backwashing water are released and mixed into the pressurizing air. In this case, the radioactive particles and iodine in the pressurizing air are removed by passing it through the filter device. Accordingly, clean air is discharged to the ventilating air conditioner system, thereby decreasing operator exposure. (I.N.).

  19. Filter element

    International Nuclear Information System (INIS)

    A filter element for a precoat filter has a septum formed with longitudinal pleats having broad roots proximal to a perforate core and narrow tips distal from the core. The element may include circumferential supporting bands spaced axially along the filter element and constraining the septum tips. In use a deposit of mechanical or ion exchange precoat is applied to the element before filtering and later removed by backwashing. (author)

  20. Rectifier Filters

    Directory of Open Access Journals (Sweden)

    Y. A. Bladyko

    2014-07-01

    Full Text Available The paper contains a definition of a smoothing factor that is suitable for any rectifier filter. Formulae for complex smoothing factors have been developed for simple and complex passive filters. The paper presents the conditions for application of the calculation formulae and filters

  1. Optimal filtering

    CERN Document Server

    Anderson, Brian D O

    2005-01-01

    This graduate-level text augments and extends beyond undergraduate studies of signal processing, particularly in regard to communication systems and digital filtering theory. Vital for students in the fields of control and communications, its contents are also relevant to students in such diverse areas as statistics, economics, bioengineering, and operations research.Topics include filtering, linear systems, and estimation; the discrete-time Kalman filter; time-invariant filters; properties of Kalman filters; computational aspects; and smoothing of discrete-time signals. Additional subjects e
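
    The discrete-time Kalman filter developed in such texts boils down to a predict/update recursion on the state mean and covariance. A compact sketch of one cycle is given below; the matrices are whatever system model the reader supplies, and the names are generic rather than taken from the book.

```python
import numpy as np

def kalman_step(x, P, z, F, H, Q, R):
    """One predict/update cycle of the discrete-time Kalman filter.

    x, P : prior state mean and covariance
    z    : new measurement
    F, H : state-transition and observation matrices
    Q, R : process and measurement noise covariances
    """
    # Predict.
    x_pred = F @ x
    P_pred = F @ P @ F.T + Q
    # Update.
    S = H @ P_pred @ H.T + R                # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)     # Kalman gain
    x_new = x_pred + K @ (z - H @ x_pred)
    P_new = (np.eye(len(x_pred)) - K @ H) @ P_pred
    return x_new, P_new
```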

  2. Filter casing

    International Nuclear Information System (INIS)

    The filter housing includes filters for removing noxious constituents from gases and a test system for checking the tightness of the filter. Between the next-to-last prefilter, exposed from the top, and the following fine-mesh filter, exposed from below, there is an inlet for a test medium, e.g., an oil mist. Downstream of the fine-mesh filter there is a window in the wall of the housing through which a visible thread of the test-medium flow can be observed. This allows any leaks in the fine-mesh filter or its seat to be detected. Replacement is possible through a replacement aperture. Any dust removed from the filter in this process will not be dragged to the clean air side. (DG)

  3. Extensions to polar formatting with spatially variant post-filtering

    Science.gov (United States)

    Garber, Wendy L.; Hawley, Robert W.

    2011-06-01

    The polar format algorithm (PFA) is computationally faster than back projection for producing spotlight mode synthetic aperture radar (SAR). This is very important in applications such as video SAR for persistent surveillance, as images may need to be produced in real time. PFA's speed is largely due to making a planar wavefront assumption and forming the image onto a regular grid of pixels lying in a plane. Unfortunately, both assumptions cause loss of focus in airborne persistent surveillance applications. The planar wavefront assumption causes a loss of focus in the scene for pixels that are far from scene center. The planar grid of image pixels causes loss of the depth of focus for conic flight geometries. In this paper, we present a method to compensate for the loss of depth of focus while warping the image onto a terrain map to produce orthorectified imagery. This technique applies a spatially variant post-filter and resampling to correct the defocus while dewarping the image. This work builds on spatially variant post-filtering techniques previously developed at Sandia National Laboratories in that it incorporates corrections for terrain height and circular flight paths. This approach produces high quality SAR images many times faster than back projection.

  4. Filter Program

    International Nuclear Information System (INIS)

    Performance studies are described for a recirculation air filter system for removal of radioactive iodine from the containment of an HTGR reactor following a depressurization accident. Included are data on aging of adsorbent at room temperature, temperature effects on adsorbent performance, and filter testing procedures

  5. Application of circular filter inserts

    International Nuclear Information System (INIS)

    High efficiency particulate air (HEPA) filters are used in the ventilation of nuclear plant as passive clean-up devices. Traditionally, the work-horse of the industry has been the rectangular HEPA filter. An assessment of the problems associated with remote handling, changing, and disposal of these rectangular filters suggested that significant advantages to filtration systems could be obtained by the adoption of HEPA filters with circular geometry for both new and existing ventilation plants. This paper covers the development of circular geometry filters and highlights the advantages of this design over their rectangular counterparts. The work has resulted in a range of commercially available filters for flows from 45 m3/h up to 3400 m3/h. This paper also covers the development of a range of sizes and types of housings that employ simple change techniques which take advantage of the circular geometry. The systems considered here have been designed in response to the requirements for shielded (remote filter change) and for unshielded facilities (potentially for bag changing of filters). Additionally the designs have allowed for the possibility of retrofitting circular geometry HEPA filters in place of the rectangular geometry filter

  6. Filter assemblies

    International Nuclear Information System (INIS)

    A filter assembly for the nuclear industry comprises a plurality of tubular filters welded at one end to a plenum chamber which is made of plastics by rotational moulding and includes an outlet. The other ends of the filters are closed and supported by a plate attached to the plenum chamber by tie rods. A central rod screws into the capture nut at one end and has a fitting, to facilitate remote handling, at the other. The assembly is cheap and destructible after use. (author)

  7. Data assimilation the ensemble Kalman filter

    CERN Document Server

    Evensen, Geir

    2006-01-01

    Covers data assimilation and inverse methods, including both traditional state estimation and parameter estimation. This text and reference focuses on various popular data assimilation methods, such as weak and strong constraint variational methods and ensemble filters and smoothers.
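
    The analysis step of the stochastic ensemble Kalman filter replaces the exact covariance of the classical Kalman filter with a sample covariance computed from the forecast ensemble, and updates each member against a perturbed copy of the observations. A minimal sketch under those standard assumptions is shown below; the names are illustrative and not taken from the book.

```python
import numpy as np

def enkf_analysis(ensemble, obs, H, R, rng=None):
    """Stochastic EnKF analysis step with perturbed observations.

    ensemble : (n_state, n_members) forecast ensemble
    obs      : (n_obs,) observation vector
    H        : (n_obs, n_state) observation operator
    R        : (n_obs, n_obs) observation error covariance
    """
    rng = np.random.default_rng() if rng is None else rng
    n_state, n_mem = ensemble.shape
    X = ensemble - ensemble.mean(axis=1, keepdims=True)
    Pf = X @ X.T / (n_mem - 1)                        # sample forecast covariance
    K = Pf @ H.T @ np.linalg.inv(H @ Pf @ H.T + R)    # Kalman gain
    # Perturb the observations so the analysis ensemble keeps the right spread.
    obs_pert = obs[:, None] + rng.multivariate_normal(np.zeros(len(obs)), R, n_mem).T
    return ensemble + K @ (obs_pert - H @ ensemble)
```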

  8. An area efficient low noise 100 Hz low-pass filter

    DEFF Research Database (Denmark)

    Ølgaard, Christian; Sassene, Haoues; Perch-Nielsen, Ivan R.

    1996-01-01

    A technique based on scaling a filter's capacitor currents to improve the noise performance of low frequency continuous-time filters is presented. Two 100 Hz low-pass filters have been implemented: a traditional low pass filter (as reference), and a filter utilizing the above mentioned current scaling technique. The two filters utilize approximately the same silicon area. The scaled filter implements the scaling by use of a MOS based current conveyor type CCII. Measurements indicate that the cur...

  9. An area efficient low noise 100 Hz low-pass filter

    OpenAIRE

    Ølgaard, Christian; Sassene, Haoues; Perch-Nielsen, Ivan R.

    2010-01-01

    A technique based on scaling a filter's capacitor currents to improve the noise performance of low frequency continuous-time filters is presented. Two 100 Hz low-pass filters have been implemented: a traditional low pass filter (as reference), and a filter utilizing the above mentioned current scaling technique. The two filters utilize approximately the same silicon area. The scaled filter implements the scaling by use of a MOS based current conveyor type CCII. Measurements indicate that the ...

  10. Well filter

    Energy Technology Data Exchange (ETDEWEB)

    Aliverdizade, T.K.; Agayev, Sh,K.; Matveyenko, L.M.; Yelizarova, Z.Ye.

    1984-01-01

    The purpose of the invention is to improve the reliable operation of the filter by guaranteeing self-cleaning during operation. This goal is achieved because, in the well filter, which contains two concentrically arranged perforated metal pipes with openings of the same shape and dimensions, whose corresponding opening edges face each other from opposite sides and are shifted relative to each other to form transitional channels, a bottom in the lower part of the outer pipes, and a connecting element, the bottom of the outer pipes is equipped with an overflow valve, and the inner perforated pipe has a cap on its lower end forming a gas-filled chamber with the bottom. The gap between the perforated pipes is filled with a glue composition with electrical-insulating properties, while the connecting element is made of electrical-insulating material.

  11. Robust filtering for uncertain systems a parameter-dependent approach

    CERN Document Server

    Gao, Huijun

    2014-01-01

    This monograph provides the reader with a systematic treatment of robust filter design, a key issue in systems, control and signal processing, because of the fact that the inevitable presence of uncertainty in system and signal models often degrades the filtering performance and may even cause instability. The methods described are therefore not subject to the rigorous assumptions of traditional Kalman filtering. The monograph is concerned with robust filtering for various dynamical systems with parametric uncertainties, and focuses on parameter-dependent approaches to filter design. Classical filtering schemes, like H2 filtering and H∞ filtering, are addressed, and emerging issues such as robust filtering with constraints on communication channels and signal frequency characteristics are discussed. The text features: design approaches to robust filters arranged according to varying complexity level, and emphasizing robust filtering in the parameter-dependent framework for the first time; ...

  12. Anti-Aliasing filter for reverse-time migration

    KAUST Repository

    Zhan, Ge

    2012-01-01

    We develop an anti-aliasing filter for reverse-time migration (RTM). It is similar to the traditional anti-aliasing filter used for Kirchhoff migration in that it low-pass filters the migration operator so that the dominant wavelength in the operator is greater than two times the trace sampling interval, except it is applied to both primary and multiple reflection events. Instead of applying this filter to the data in the traditional RTM operation, we apply the anti-aliasing filter to the generalized diffraction-stack migration operator. This gives the same migration image as computed by anti-aliased RTM.

  13. Intelligent Optimize Design of LCL Filter for Three-Phase Voltage-Source PWM Rectifier

    OpenAIRE

    Sun, Wei; Chen, Zhe; WU, XIAOJIE

    2009-01-01

    Compared to a traditional L filter, an LCL filter is more effective at reducing harmonic distortion at the switching frequency. So it is important to choose the LCL filter parameters to achieve a good filtering effect. This paper introduces some traditional design methods. The design of an LCL filter by genetic algorithm (GA) and particle swarm optimization (PSO) is presented in this paper, together with a comparison of the two intelligent optimizations. Simulation results and calculated data are provided to prove that intel...

  14. Digital filters

    CERN Document Server

    Hamming, Richard W

    1997-01-01

    Digital signals occur in an increasing number of applications: in telephone communications; in radio, television, and stereo sound systems; and in spacecraft transmissions, to name just a few. This introductory text examines digital filtering, the processes of smoothing, predicting, differentiating, integrating, and separating signals, as well as the removal of noise from a signal. The processes bear particular relevance to computer applications, one of the focuses of this book.Readers will find Hamming's analysis accessible and engaging, in recognition of the fact that many people with the s
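
    As a minimal illustration of two of the operations the book covers (smoothing and differentiating a sampled signal), the sketch below applies short FIR filters by convolution; the particular kernels are common textbook choices, not taken from the text.

```python
# Minimal illustration of smoothing and differentiating a sampled signal with
# short FIR filters (5-point moving average, central difference). These are
# generic textbook examples, not filters specified in the book.
import numpy as np

def smooth(x, width=5):
    """Moving-average low-pass filter."""
    kernel = np.ones(width) / width
    return np.convolve(x, kernel, mode="same")

def differentiate(x, dt=1.0):
    """Central-difference differentiator."""
    kernel = np.array([1.0, 0.0, -1.0]) / (2.0 * dt)
    return np.convolve(x, kernel, mode="same")

if __name__ == "__main__":
    t = np.linspace(0, 1, 200)
    noisy = np.sin(2 * np.pi * 3 * t) + 0.2 * np.random.default_rng(1).standard_normal(t.size)
    print(smooth(noisy)[:5])
    print(differentiate(smooth(noisy), dt=t[1] - t[0])[:5])
```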

  15. The 3-D alignment of objects in dynamic PET scans using filtered sinusoidal trajectories of sinogram

    Science.gov (United States)

    Kostopoulos, Aristotelis E.; Happonen, Antti P.; Ruotsalainen, Ulla

    2006-12-01

    In this study, our goal is to employ a novel 3-D alignment method for dynamic positron emission tomography (PET) scans. Because the acquired data (i.e. sinograms) often contain considerable noise, filtering of the data prior to the alignment presumably improves the final results. In this study, we utilized a novel 3-D stackgram domain approach. In the stackgram domain, the signals along the sinusoidal trajectories of the sinogram can be processed separately. In this work, we performed angular stackgram domain filtering by employing well-known 1-D filters: the Gaussian low-pass filter and the median filter. In addition, we employed two wavelet de-noising techniques. After filtering we performed alignment of objects in the stackgram domain. The local alignment technique we used is based on similarity comparisons between locus vectors (i.e. the signals along the sinusoidal trajectories of the sinogram) in a 3-D neighborhood of sequences of the stackgrams. Aligned stackgrams can be transformed back to sinograms (Method 1), or alternatively directly to filtered back-projected images (Method 2). In order to evaluate the alignment process, simulated data with different kinds of additive noise were used. The results indicated that filtering prior to the alignment can be important for the accuracy.
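
    The following sketch illustrates the filtering step described above, treating each locus vector as a 1-D signal and smoothing it with a Gaussian low-pass or a median filter; extraction of the locus vectors and the subsequent alignment are not shown, and the array shapes and filter parameters are assumptions.

```python
# Hedged sketch of the angular filtering step: each locus vector (the signal
# along one sinusoidal trajectory of the sinogram) is treated as a 1-D signal
# and smoothed with a Gaussian low-pass or a median filter. The stackgram
# extraction and alignment steps are not shown; shapes and parameters are assumed.
import numpy as np
from scipy.ndimage import gaussian_filter1d, median_filter

def filter_locus_vectors(stackgram, method="gaussian", sigma=1.5, size=5):
    """stackgram : 2-D array, one locus vector per row."""
    if method == "gaussian":
        return gaussian_filter1d(stackgram, sigma=sigma, axis=1)
    if method == "median":
        return median_filter(stackgram, size=(1, size))
    raise ValueError("unknown method")

if __name__ == "__main__":
    rng = np.random.default_rng(2)
    stackgram = rng.poisson(5.0, size=(128, 192)).astype(float)   # toy noisy data
    print(filter_locus_vectors(stackgram, "gaussian").shape)
    print(filter_locus_vectors(stackgram, "median").shape)
```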

  16. The 3-D alignment of objects in dynamic PET scans using filtered sinusoidal trajectories of sinogram

    International Nuclear Information System (INIS)

    In this study, our goal is to employ a novel 3-D alignment method for dynamic positron emission tomography (PET) scans. Because the acquired data (i.e. sinograms) often contain considerable noise, filtering of the data prior to the alignment presumably improves the final results. In this study, we utilized a novel 3-D stackgram domain approach. In the stackgram domain, the signals along the sinusoidal trajectories of the sinogram can be processed separately. In this work, we performed angular stackgram domain filtering by employing well-known 1-D filters: the Gaussian low-pass filter and the median filter. In addition, we employed two wavelet de-noising techniques. After filtering we performed alignment of objects in the stackgram domain. The local alignment technique we used is based on similarity comparisons between locus vectors (i.e. the signals along the sinusoidal trajectories of the sinogram) in a 3-D neighborhood of sequences of the stackgrams. Aligned stackgrams can be transformed back to sinograms (Method 1), or alternatively directly to filtered back-projected images (Method 2). In order to evaluate the alignment process, simulated data with different kinds of additive noise were used. The results indicated that filtering prior to the alignment can be important for the accuracy.

  17. Traditional Agriculture and Permaculture.

    Science.gov (United States)

    Pierce, Dick

    1997-01-01

    Discusses benefits of combining traditional agricultural techniques with the concepts of "permaculture," a framework for revitalizing traditions, culture, and spirituality. Describes school, college, and community projects that have assisted American Indian communities in revitalizing sustainable agricultural practices that incorporate cultural…

  18. Fault Tolerant Parallel Filters Based On Bch Codes

    Directory of Open Access Journals (Sweden)

    K.Mohana Krishna

    2015-04-01

    Digital filters are used in signal processing and communication systems. In some cases, the reliability of those systems is critical, and fault tolerant filter implementations are needed. Over the years, many techniques that exploit the filters' structure and properties to achieve fault tolerance have been proposed. As technology scales, it enables more complex systems that incorporate many filters. In those complex systems, it is common that some of the filters operate in parallel, for example, by applying the same filter to different input signals. Recently, a simple technique that exploits the presence of parallel filters to achieve multiple fault tolerance has been presented. In this brief, that idea is generalized to show that parallel filters can be protected using Bose–Chaudhuri–Hocquenghem (BCH) codes, in which each filter is the equivalent of a bit in a traditional ECC. This new scheme allows more efficient protection when the number of parallel filters is large.
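
    A minimal sketch of the simpler single-redundancy idea that the brief generalizes with BCH codes is given below: because identical FIR filters are linear, one extra filter run on the sum of the inputs should reproduce the sum of the outputs, and a mismatch flags a fault. The taps and signals are arbitrary; the paper's actual BCH construction is not reproduced here.

```python
# Hedged sketch of single-redundancy protection for parallel filters: a
# redundant filter processes the sum of all inputs and, by linearity, its
# output must equal the sum of the individual outputs. A mismatch indicates a
# fault in one of the parallel filters. Taps and signals are arbitrary.
import numpy as np

def fir(x, taps):
    return np.convolve(x, taps, mode="full")

def check_parallel_filters(inputs, taps, outputs, tol=1e-9):
    """inputs/outputs: lists of equal-length arrays processed by the same filter."""
    redundant = fir(np.sum(inputs, axis=0), taps)      # filter applied to the summed inputs
    return np.allclose(redundant, np.sum(outputs, axis=0), atol=tol)

if __name__ == "__main__":
    rng = np.random.default_rng(3)
    taps = np.array([0.25, 0.5, 0.25])
    xs = [rng.standard_normal(100) for _ in range(4)]
    ys = [fir(x, taps) for x in xs]
    print("fault-free check:", check_parallel_filters(xs, taps, ys))
    ys[2][10] += 1.0                                    # inject a fault
    print("faulty check:    ", check_parallel_filters(xs, taps, ys))
```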

  19. Electromagnetic filter

    International Nuclear Information System (INIS)

    The twin-flow electromagnetic filter consists of a vertical cylindrical casing, an upper and lower connecting nozzle, an upper and lower perforated sheet enclosing a filling of magnetizable spheres, and a magnet coil extending over the height of the bed of spheres. In order to avoid bursts of wash water, spoke-shaped installations are provided above the lower perforated sheet. The width of the spokes is at least ten times the diameter of the spheres and the largest space between the spokes five times the width of the spokes. The surface of the bed of spheres is about 10 to 15 per cent of the height of the bed away from the upper perforated sheet. (RW)

  20. Quantitative gated SPECT: the effect of reconstruction filter on calculated left ventricular ejection fractions and volumes

    International Nuclear Information System (INIS)

    Gated SPECT (GSPECT) offers the possibility of obtaining additional functional information from perfusion studies, including calculation of left ventricular ejection fraction (LVEF). The calculation of LVEF relies upon the identification of the endocardial surface, which will be affected by the spatial resolution and statistical noise in the reconstructed images. The aim of this study was to compare LVEFs and ventricular volumes calculated from GSPECT using six reconstruction filters. GSPECT and radionuclide ventriculography (RNVG) were performed on 40 patients; filtered back projection was used to reconstruct the datasets with each filter. LVEFs and volumes were calculated using the Cedars-Sinai QGS package. The correlation coefficient between RNVG and GSPECT ranged from 0.81 to 0.86 with higher correlations for smoother filters. The narrowest prediction interval was 11±2%. There was a trend towards higher LVEF values with smoother filters, the ramp filter yielding LVEFs 2.55±3.10% (p<0.001) lower than the Hann filter. There was an overall fall in ventricular volumes with smoother filters with a mean difference of 13.98±10.15 ml (p<0.001) in EDV between the Butterworth-0.5 and Butterworth-0.3 filters. In conclusion, smoother reconstruction filters lead to lower volumes and higher ejection fractions with the QGS algorithm, with the Butterworth-0.4 filter giving the highest correlation with LVEFs from RNVG. Even if the optimal filter is chosen the uncertainty in the measured ejection fractions is still too great to be clinically acceptable. (author)

  1. Filtered - a tool for editing SVG filters

    OpenAIRE

    Kallio, Kiia

    2013-01-01

    Vector graphic image files defined in SVG format can contain filters, graphical effects that can be used for modifying the image pixels algorithmically. Filtered is an open source tool for visual editing of these filters. Although the process that eventually led to Filtered started over ten years ago, Filtered is still a relevant tool for SVG content development, as no other tool supports visual editing of SVG filters to the same extent. During the past ten years, SVG has also become an ...

  2. SAW Filter Performance Improvement

    OpenAIRE

    Monali R. Dave

    2012-01-01

    Surface acoustic wave (SAW) filters have a wide range of applications, including, for example, in mobile/wireless transceivers, radio frequency (RF) filters, intermediate frequency (IF) filters, resonator-filters, filters for mobile and wireless circuits, IF filters in a base transceiver station (BTS), RF front-end filters for mobile/wireless circuitry, multimode frequency-agile oscillators for spread-spectrum secure communications, Nyquist ...

  3. Gadamers Verständnis der Tradition

    Directory of Open Access Journals (Sweden)

    Radojčić Saša

    2009-01-01

    (In German) This essay discusses Gadamer's understanding of the important hermeneutic concepts of prejudice, authority and tradition. The act of understanding, in which prejudices are unavoidable, is characterized as a process of their continual correction. The positive evaluation of the conceptual pair authority-tradition is a characteristic motif of philosophical hermeneutics, for which authority carries no negative connotation but is founded on free and rational acceptance. The relationship between understanding and tradition is dynamic: neither the tradition nor the understanding subject remains unchanged. Two implications follow: that the meaning of a text can never be exhausted, so that its understanding is an endless process; and that the suspension of prejudices succeeds only where the tradition, so to speak, 'filters' them. The author points out a tension between Gadamer's understanding of the hermeneutic productivity of tradition and of temporal distance as an instance that contributes to understanding.

  4. Adaptive particle filtering

    Science.gov (United States)

    Stevens, Mark R.; Gutchess, Dan; Checka, Neal; Snorrason, Magnús

    2006-05-01

    Image exploitation algorithms for Intelligence, Surveillance and Reconnaissance (ISR) and weapon systems are extremely sensitive to differences between the operating conditions (OCs) under which they are trained and the extended operating conditions (EOCs) in which the fielded algorithms are tested. As an example, terrain type is an important OC for the problem of tracking hostile vehicles from an airborne camera. A system designed to track cars driving on highways and on major city streets would probably not do well in the EOC of parking lots because of the very different dynamics. In this paper, we present a system we call ALPS for Adaptive Learning in Particle Systems. ALPS takes as input a sequence of video images and produces labeled tracks. The system detects moving targets and tracks those targets across multiple frames using a multiple hypothesis tracker (MHT) tightly coupled with a particle filter. This tracker exploits the strengths of traditional MHT based tracking algorithms by directly incorporating tree-based hypothesis considerations into the particle filter update and resampling steps. We demonstrate results in a parking lot domain tracking objects through occlusions and object interactions.
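
    For readers unfamiliar with the particle filter that ALPS couples to its multiple hypothesis tracker, the sketch below shows a generic bootstrap particle filter for a 1-D constant-velocity target (predict, weight, resample); it is a textbook illustration with assumed noise levels, not the ALPS system.

```python
# Minimal bootstrap particle filter for a 1-D constant-velocity target,
# illustrating the predict / weight / resample cycle. Generic textbook sketch;
# noise levels and the measurement model are assumptions.
import numpy as np

rng = np.random.default_rng(4)

def run_particle_filter(measurements, n_particles=500, q=0.5, r=1.0):
    # state: [position, velocity]
    particles = rng.standard_normal((n_particles, 2)) * [5.0, 1.0]
    estimates = []
    for z in measurements:
        # predict: constant-velocity motion plus process noise
        particles[:, 0] += particles[:, 1]
        particles += rng.standard_normal(particles.shape) * q
        # weight: Gaussian likelihood of the position measurement
        w = np.exp(-0.5 * ((z - particles[:, 0]) / r) ** 2)
        w /= w.sum()
        estimates.append(w @ particles[:, 0])
        # resample (multinomial)
        idx = rng.choice(n_particles, size=n_particles, p=w)
        particles = particles[idx]
    return np.array(estimates)

if __name__ == "__main__":
    truth = np.cumsum(np.full(50, 1.0))                 # target moving at unit speed
    zs = truth + rng.standard_normal(50)                # noisy position measurements
    est = run_particle_filter(zs)
    print("final error:", abs(est[-1] - truth[-1]))
```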

  5. Single-layer optical bandpass filter technology.

    Science.gov (United States)

    Niraula, Manoj; Yoon, Jae Woong; Magnusson, Robert

    2015-11-01

    Resonant periodic surfaces and films enable new functionalities with wide applicability in practical optical systems. Their material sparsity, ease of fabrication, and minimal interface count provide environmental and thermal stability and robustness in applications. Here, we report an experimental bandpass filter fashioned in a single patterned silicon layer on a quartz substrate. Its performance corresponds to bandpass filters requiring 15 traditional Si/SiO2 thin-film layers. The feasibility of sparse narrowband high-efficiency bandpass filters with extremely wide, flat, and low sidebands is thereby demonstrated. This class of devices is designed with rigorous solutions of Maxwell's equations while engaging the physical principles of resonant waveguide gratings. An experimental filter presented exhibits a transmittance of ~72%, bandwidth of ~0.5 nm, and low sidebands spanning ~100 nm. The proposed technology is integration-friendly and opens doors for further development in various disciplines and spectral regions where thin-film solutions are traditionally applied. PMID:26512519

  6. Traditional fishing tools

    Directory of Open Access Journals (Sweden)

    STOICA Georgeta

    2009-09-01

    In this paper I present the traditional fishing tools used in the area of the Danube Delta. More precisely, I speak about the village of Sfantu Gheorghe, a traditional fishing village, where the fishing activity has been the main activity along the years and where, lately, there have been major changes due to the decrease of the fish species.

  7. Traditional fishing tools

    OpenAIRE

    Stoica, Georgeta

    2009-01-01

    In this paper I present the traditional fishing tools used in the area of the Danube Delta. More precisely, I speak about the village of Sfantu Gheorghe, a traditional fishing village, where the fishing activity has been the main activity along the years and where, lately, there have been major changes due to the decrease of the fish species.

  8. Family Customs and Traditions.

    Science.gov (United States)

    MacGregor, Cynthia

    Recognizing the importance of maintaining open communication with immediate and extended family members, this book provides a compilation of ideas for family traditions and customs that are grounded in compassion and human kindness. The traditions were gathered from families in the United States and Canada who responded to advertisements in…

  9. Varactor-tuned Substrate Integrated Evanescent Filter

    DEFF Research Database (Denmark)

    Zhurbenko, Vitaliy; Acar, Öncel

    Evanescent mode waveguides allow for more compact microwave component design in comparison to the traditional fundamental mode waveguide technology. Evanescent waveguides can be integrated into a dielectric substrate in order to further reduce the mass and volume. Unfortunately, traditional realization methods used in the standard evanescent waveguides are often not directly applicable to substrate integrated waveguide (SIW) technology due to dielectric filling and small height of the waveguide. In this work, one of the realization methods of evanescent waveguides using lumped elements is considered. In contrast to other methods described in the literature, it avoids etching split ring resonators in the metal layer of the SIW. The filters presented here use varactors as tuning elements. The varactors (as well as DC decoupling circuits) are mounted on the surface of the PCB, bringing the lower metal layer of the waveguide to the top layer with metalized via holes. The present filters are analyzed using models based on impedance matrix representation. The developed models allow computationally efficient and relatively accurate prediction of the filter behavior in a wide frequency range (at least up to frequencies below the cut-off of the second propagating mode). This work investigates the applicability of the evanescent SIW approach to tunable filter realization. The advantages and disadvantages of the approach are analyzed. As an example, a second order microwave filter is designed, fabricated and tested in order to validate the developed filter models as well as the implemented realization method. The filter structure as well as its tuning are shown in Figure 1.

  10. Filters in 2D and 3D Cardiac SPECT Image Processing.

    Science.gov (United States)

    Lyra, Maria; Ploussi, Agapi; Rouchota, Maritina; Synefia, Stella

    2014-01-01

    Nuclear cardiac imaging is a noninvasive, sensitive method providing information on cardiac structure and physiology. Single photon emission tomography (SPECT) evaluates myocardial perfusion, viability, and function and is widely used in clinical routine. The quality of the tomographic image is a key for accurate diagnosis. Image filtering, a mathematical processing, compensates for loss of detail in an image while reducing image noise, and it can improve the image resolution and limit the degradation of the image. SPECT images are then reconstructed, either by the filtered back projection (FBP) analytical technique or iteratively, by algebraic methods. The aim of this study is to review filters in cardiac 2D, 3D, and 4D SPECT applications and how these affect the image quality, mirroring the diagnostic accuracy of SPECT images. Several filters, including the Hanning, Butterworth, and Parzen filters, were evaluated in combination with the two reconstruction methods as well as with a specified MATLAB program. Results showed that for both 3D and 4D cardiac SPECT the Butterworth filter, for different critical frequencies and orders, produced the best results. Between the two reconstruction methods, the iterative one might be more appropriate for cardiac SPECT, since it improves lesion detectability due to the significant improvement of image contrast. PMID:24804144
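
    The sketch below constructs the low-pass windows named above in the form commonly combined with the ramp filter in FBP, using the usual nuclear-medicine definitions of the Butterworth and Hann windows; the cutoffs and order are illustrative and are not the settings evaluated in the article.

```python
# Hedged sketch of common SPECT reconstruction windows applied to the ramp
# filter: Butterworth 1/sqrt(1 + (f/fc)^(2n)) and Hann 0.5*(1 + cos(pi*f/fc)).
# Cutoffs and order are illustrative, not the article's evaluated settings.
import numpy as np

def butterworth_window(f, cutoff, order):
    return 1.0 / np.sqrt(1.0 + (f / cutoff) ** (2 * order))

def hann_window(f, cutoff):
    w = 0.5 * (1.0 + np.cos(np.pi * f / cutoff))
    return np.where(f <= cutoff, w, 0.0)

if __name__ == "__main__":
    f = np.linspace(0.0, 0.5, 65)            # spatial frequency, cycles/pixel
    ramp = np.abs(f)
    fbp_butter = ramp * butterworth_window(f, cutoff=0.35, order=5)
    fbp_hann = ramp * hann_window(f, cutoff=0.4)
    print(fbp_butter[:5], fbp_hann[:5])
```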

  11. Traditional medicine and genomics

    Directory of Open Access Journals (Sweden)

    Kalpana Joshi

    2010-01-01

    Full Text Available ′Omics′ developments in the form of genomics, proteomics and metabolomics have increased the impetus of traditional medicine research. Studies exploring the genomic, proteomic and metabolomic basis of human constitutional types based on Ayurveda and other systems of oriental medicine are becoming popular. Such studies remain important to developing better understanding of human variations and individual differences. Countries like India, Korea, China and Japan are investing in research on evidence-based traditional medicines and scientific validation of fundamental principles. This review provides an account of studies addressing relationships between traditional medicine and genomics.

  12. Multichannel distance filter

    OpenAIRE

    Pappas, M.; Pitas, I.

    2010-01-01

    A nonlinear multichannel digital filter is presented in this correspondence. The output is a weighted sum of all samples in the filter window, with a single parameter controlling the filter nonlinearity. Although input data ordering is not required, performance can surpass the performance of other ordering-based multichannel filters

  13. Recirculating electric air filter

    Science.gov (United States)

    Bergman, W.

    1985-01-01

    An electric air filter cartridge has a cylindrical inner high voltage electrode, a layer of filter material, and an outer ground electrode formed of a plurality of segments moveably connected together. The outer electrode can be easily opened to remove or insert filter material. Air flows through the two electrodes and the filter material and is exhausted from the center of the inner electrode.

  14. Active Optical Lattice Filters

    OpenAIRE

    Gary Evans; MacFarlane, Duncan L.; Govind Kannan; Jian Tong; Issa Panahi; Vishnupriya Govindan; L. Roberts Hunt

    2005-01-01

    Optical lattice filter structures including gains are introduced and analyzed. The photonic realization of the active, adaptive lattice filter is described. The algorithms which map between gains space and filter coefficients space are presented and studied. The sensitivities of filter parameters with respect to gains are derived and calculated. An example which is relevant to adaptive signal processing is also provided.

  15. Collaboration and Tradition

    OpenAIRE

    Armstrong, T.; Capulet, E

    2009-01-01

    It is not uncommon to find the collaborative process associated with innovation, whether by the participants themselves (Fitch 2007: 93) or the funding bodies supporting them (Haydn and Windsor 2007: 30-31). This paper investigates collaboration within a more traditional musical context and addresses two main questions in so doing: to what extent did composer and performer collaborate and can collaboration play a role within traditional compositional practice? By viewing the composer-performe...

  16. Traditional Indonesian dairy foods.

    Science.gov (United States)

    Surono, Ingrid S

    2015-12-01

    Indonesia is the largest archipelago blessed with one of the richest mega-biodiversities and also home to one of the most diverse cuisines and traditional fermented foods. There are 3 types of traditional dairy foods, namely the butter-like product minyak samin; yogurt-like product dadih; and cheese-like products dali or bagot in horbo, dangke, litsusu, and cologanti, which reflect the culture of dairy product consumption in Indonesia. PMID:26715081

  17. The Fine Dutch Tradition:

    OpenAIRE

    Hooimeijer, F.L.

    2013-01-01

    Publication of the exhibition and symposium on water adaptive urban planning and architecture in Bangkok. The Urban Fine Dutch Tradition is a dynamic tradition of making urban designs using the parameters of the natural system – incorporating in an efficient way the hydrological cycle, the soil and subsurface conditions, technology and urban development opportunities. Sustainability is the capacity of making a sensible choice for enabling technology taking a perspective from the natural s...

  18. Applications of the spline filter for areal filtration

    Science.gov (United States)

    Tong, Mingsi; Zhang, Hao; Ott, Daniel; Chu, Wei; Song, John

    2015-12-01

    This paper proposes a general use isotropic areal spline filter. This new areal spline filter can achieve isotropy by approximating the transmission characteristic of the Gaussian filter. It can also eliminate the effect of void areas using a weighting factor, and resolve end-effect issues by applying new boundary conditions, which replace the first order finite difference in the traditional spline formulation. These improvements make the spline filter widely applicable to 3D surfaces and extend the applications of the spline filter in areal filtration.
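
    As background, the sketch below implements the basic 1-D (profile) spline filter that areal spline filters extend: the mean line solves (I + alpha^4 D2' D2) w = z with D2 the second-difference operator, and alpha chosen so the transmission approximates the Gaussian filter at the cutoff. The alpha relation and all parameter values are stated as assumptions; the paper's areal, isotropic formulation and boundary treatment are not reproduced.

```python
# Hedged sketch of a 1-D spline filter: solve (I + alpha^4 * D2' D2) w = z.
# Choosing alpha = 1 / (2*sin(pi*dx/cutoff)) is assumed here to approximate
# Gaussian transmission at the cutoff wavelength; this is not the paper's
# areal scheme or end-effect treatment.
import numpy as np

def spline_filter_profile(z, dx, cutoff):
    n = z.size
    alpha = 1.0 / (2.0 * np.sin(np.pi * dx / cutoff))
    d2 = np.zeros((n - 2, n))
    for i in range(n - 2):
        d2[i, i:i + 3] = (1.0, -2.0, 1.0)          # second differences
    a = np.eye(n) + alpha ** 4 * d2.T @ d2
    return np.linalg.solve(a, z)                   # smooth mean line

if __name__ == "__main__":
    x = np.arange(0, 8.0, 0.01)                    # mm, 0.01 mm spacing
    profile = 0.5 * np.sin(2 * np.pi * x / 2.5) + 0.05 * np.sin(2 * np.pi * x / 0.1)
    mean_line = spline_filter_profile(profile, dx=0.01, cutoff=0.8)
    roughness = profile - mean_line
    print(mean_line[:3], roughness[:3])
```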

  19. Spam filter analysis

    CERN Document Server

    Garcia, F D; Basiuk, Vincent; Becoulet, Alain; Coulon, Jean-Pierre; Garcia, Flavio D.; Hoepman, Jaap-Henk; Hutter, Thierry; Saoutic, Vincent

    2004-01-01

    Unsolicited bulk email (aka. spam) is a major problem on the Internet. To counter spam, several techniques, ranging from spam filters to mail protocol extensions like hashcash, have been proposed. In this paper we investigate the effectiveness of several spam filtering techniques and technologies. Our analysis was performed by simulating email traffic under different conditions. We show that genetic algorithm based spam filters perform best at server level and naive Bayesian filters are the most appropriate for filtering at user level.
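
    A minimal sketch of the naive Bayesian filtering idea evaluated in the paper is shown below: each token contributes a Laplace-smoothed log-likelihood ratio learned from labelled mail, and a message is flagged when the summed score exceeds a threshold. The toy corpus and threshold are illustrative, not the paper's experimental setup.

```python
# Hedged sketch of naive Bayesian spam scoring: tokens contribute
# Laplace-smoothed log-likelihood ratios; positive total score -> spam.
# Corpus, smoothing and threshold are illustrative choices.
import math
from collections import Counter

def train(spam_msgs, ham_msgs):
    spam_counts = Counter(w for m in spam_msgs for w in m.lower().split())
    ham_counts = Counter(w for m in ham_msgs for w in m.lower().split())
    vocab = set(spam_counts) | set(ham_counts)
    s_total, h_total = sum(spam_counts.values()), sum(ham_counts.values())
    llr = {}
    for w in vocab:                                   # Laplace-smoothed log ratios
        p_s = (spam_counts[w] + 1) / (s_total + len(vocab))
        p_h = (ham_counts[w] + 1) / (h_total + len(vocab))
        llr[w] = math.log(p_s / p_h)
    return llr

def score(msg, llr):
    return sum(llr.get(w, 0.0) for w in msg.lower().split())

if __name__ == "__main__":
    spam = ["buy cheap pills now", "cheap offer click now"]
    ham = ["meeting agenda attached", "lunch at noon tomorrow"]
    model = train(spam, ham)
    print(score("cheap pills offer", model) > 0)       # True  -> classified as spam
    print(score("agenda for the meeting", model) > 0)  # False -> ham
```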

  20. Bias aware Kalman filters

    DEFF Research Database (Denmark)

    Drecourt, J.-P.; Madsen, H.; Rosbjerg, Dan

    2006-01-01

    This paper reviews two different approaches that have been proposed to tackle the problems of model bias with the Kalman filter: the use of a colored noise model and the implementation of a separate bias filter. Both filters are implemented with and without feedback of the bias into the model state. The colored noise filter formulation is extended to correct both time correlated and uncorrelated model error components. A more stable version of the separate filter without feedback is presented. T...
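
    One common way to implement a bias-aware filter is to augment the state with a slowly varying bias term and let a standard Kalman filter estimate it; the sketch below does this for a 1-D random walk. It only illustrates the augmentation idea under assumed noise levels and differs in detail from the colored-noise and separate-bias formulations reviewed in the paper.

```python
# Hedged sketch: bias-aware Kalman filtering via state augmentation. The state
# is [x, b] with measurement z = x + b + noise; the filter estimates the bias
# alongside the state. Noise levels are assumptions; this is not the paper's
# colored-noise or separate-bias implementation.
import numpy as np

def bias_aware_kf(measurements, q_state=0.1, q_bias=1e-4, r=0.5):
    F = np.eye(2)
    H = np.array([[1.0, 1.0]])
    Q = np.diag([q_state, q_bias])
    R = np.array([[r]])
    x = np.zeros(2)
    P = np.eye(2)
    history = []
    for z in measurements:
        # predict
        x = F @ x
        P = F @ P @ F.T + Q
        # update
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)
        x = x + (K @ (np.atleast_1d(z) - H @ x)).ravel()
        P = (np.eye(2) - K @ H) @ P
        history.append(x.copy())
    return np.array(history)          # columns: state estimate, bias estimate

if __name__ == "__main__":
    rng = np.random.default_rng(5)
    truth = np.cumsum(rng.standard_normal(200) * 0.3)
    zs = truth + 1.5 + rng.standard_normal(200) * 0.7    # biased, noisy observations
    est = bias_aware_kf(zs)
    print("estimated bias ~", est[-1, 1])
```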

  1. Analog Frequency Tracking Filter

    OpenAIRE

    ISAR, A.; ISAR, D.

    2013-01-01

    This paper proposes a new type of analog adaptive filter derived as a generalization of the concept of the matched filter. We conceive such a filter to track the instantaneous frequency of frequency modulated signals. Some properties of the proposed analog frequency tracking filter are established using the theory of time-frequency representations. A constructive solution, based on common analog integrated circuits, is also proposed. The performance of the analog frequency tracking filter built ...

  2. KASTAMONU TRADITIONAL WOMEN CLOTHES

    Directory of Open Access Journals (Sweden)

    E.Elhan ÖZUS

    2015-08-01

    Clothing is a unique dressing style of a community, a period or a profession. In clothing there is a principle of social status and distinction rather than fashion. In this context, the society created a clothing style in line with its own customs, traditions and social structure. One of the features separating societies from each other and indicating their cultural and social classes is the clothing style. As it is known, traditional Turkish clothes reflecting the characteristics of Turkish society are our most beautiful heritage from past to present. From this heritage several examples of women's clothes have been carried to the present. When these examples are examined, it is possible to see the taste, the way of understanding art, the joy and the lifestyle of the history. These garments are also documents outlining the taste and grace of Turkish people. In the present study, traditional Kastamonu women's clothing, which has an important place in the traditional cultural clothes of Anatolia, is investigated. The method of the present research is primarily defined as the examination of the written sources. The study is completed with the observations and examinations made in Kastamonu. According to the findings of the study, traditional Kastamonu women's clothing is examined and adapted to today's clothing.

  3. Traditional Chinese Biotechnology

    Science.gov (United States)

    Xu, Yan; Wang, Dong; Fan, Wen Lai; Mu, Xiao Qing; Chen, Jian

    The earliest industrial biotechnology originated in ancient China and developed into a vibrant industry in traditional Chinese liquor, rice wine, soy sauce, and vinegar. It is now a significant component of the Chinese economy valued annually at about 150 billion RMB. Although the production methods had existed and remained basically unchanged for centuries, modern developments in biotechnology and related fields in the last decades have greatly impacted on these industries and led to numerous technological innovations. In this chapter, the main biochemical processes and related technological innovations in traditional Chinese biotechnology are illustrated with recent advances in functional microbiology, microbial ecology, solid-state fermentation, enzymology, chemistry of impact flavor compounds, and improvements made to relevant traditional industrial facilities. Recent biotechnological advances in making Chinese liquor, rice wine, soy sauce, and vinegar are reviewed.

  4. Application of CT filter algorithms to digitized film data

    International Nuclear Information System (INIS)

    The CT studies performed thus far in the RAS division have been based on neutron radiographs of two 7-pin reactor fuel bundles which were subjected to over-power accident simulations in the TREAT reactor. As a result of the tests, the pins were severely damaged, with molten fuel and steel spreading throughout the fuel assembly. The neutron radiographs are produced at the NRAD reactor facility at ANL-West in Idaho. The RAS reconstruction codes are based on the filtered back-projection technique, using standard fast Fourier transforms and filter algorithms. Because of the length of the fuel assemblies, and the fact that they are held only at the top by the rotation mechanism, it is nearly impossible to achieve a perfect vertical alignment, so a major part of the analysis time is spent in rotating and aligning the images. As part of this computerized alignment, each image is also normalized to a constant exposure time, based on the data in a neutron absorbing step wedge that is imaged along with the fuel pins. All computer codes were loosely developed from those given in the Donner Algorithms prepared for the National Cancer Institute and are currently run on a PDP-11/60 computer
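
    For orientation, a minimal filtered back-projection in the spirit of the codes described above is sketched below: each projection is ramp-filtered with an FFT and then back-projected. It is a generic textbook version with an arbitrary phantom and geometry, not the RAS/Donner implementation.

```python
# Hedged, minimal filtered back-projection: ramp-filter each projection in the
# Fourier domain, then back-project. Generic textbook sketch with an arbitrary
# phantom and geometry, not the RAS/Donner codes described above.
import numpy as np
from scipy.ndimage import rotate

def ramp_filter(sinogram):
    n = sinogram.shape[1]
    freqs = np.fft.fftfreq(n)
    return np.real(np.fft.ifft(np.fft.fft(sinogram, axis=1) * np.abs(freqs), axis=1))

def back_project(filtered, angles_deg, size):
    recon = np.zeros((size, size))
    centre = (size - 1) / 2.0
    y, x = np.mgrid[0:size, 0:size] - centre
    for proj, angle in zip(filtered, np.deg2rad(angles_deg)):
        t = x * np.cos(angle) + y * np.sin(angle) + centre     # detector coordinate
        vals = np.interp(t.ravel(), np.arange(proj.size), proj,
                         left=0.0, right=0.0).reshape(t.shape)
        recon += vals
    return recon * np.pi / (2 * len(angles_deg))

if __name__ == "__main__":
    size, angles = 64, np.arange(0.0, 180.0, 1.0)
    phantom = np.zeros((size, size))
    phantom[24:40, 28:36] = 1.0
    # simple parallel-beam forward projection by rotating and summing
    sino = np.array([rotate(phantom, -a, reshape=False, order=1).sum(axis=0) for a in angles])
    image = back_project(ramp_filter(sino), angles, size)
    print(image.shape, float(image.max()))
```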

  5. A Kalman filter technique applied for medical image reconstruction

    International Nuclear Information System (INIS)

    Medical images contain information about vital organic tissues inside the human body and are widely used for diagnosis of disease or for surgical purposes. Image reconstruction is essential for medical images in applications such as suppression of noise or de-blurring, in order to provide images with better quality and contrast. Due to the vital role of image reconstruction in the medical sciences, algorithms with better efficiency and higher speed are desirable. Most algorithms in image reconstruction operate in the frequency domain, the most popular one being filtered back projection. In this paper we introduce a Kalman filter technique which operates in the time domain for medical image reconstruction. Results indicated that, as the number of projections increases, for both the normally collected ray sums and the collected ray sums corrupted by noise, the quality of the reconstructed image improves in terms of contrast and transparency. It is also seen that as the number of projections increases the error index decreases.

  6. Single-periodic-film optical bandpass filter

    CERN Document Server

    Niraula, Manoj; Magnusson, Robert

    2015-01-01

    Resonant periodic surfaces and films enable new functionalities with wide applicability in practical optical systems. Their material sparsity, ease of fabrication, and minimal interface count provide environmental and thermal stability and robustness in applications. Here we report an experimental bandpass filter fashioned in a single patterned layer on a substrate. Its performance corresponds to bandpass filters requiring perhaps 30 traditional thin-film layers as shown by an example. We demonstrate an ultra-narrow, high-efficiency bandpass filter with extremely wide, flat, and low sidebands. This class of devices is designed with rigorous solutions of the Maxwell equations while engaging the physical principles of resonant waveguide gratings. The proposed technology is integration-friendly and opens doors for further development in various disciplines and spectral regions where thin-film solutions are traditionally applied.

  7. Geometric computation theory for morphological filtering on freeform surfaces

    OpenAIRE

    Lou, S.; Jiang, X.; Scott, P. J.

    2013-01-01

    Surfaces govern functional behaviours of geometrical products, especially high precision and high added-value products. Compared to the mean-line based filters, morphological filters, evolved from the traditional E-system, are relevant to functional performance of surfaces. The conventional implementation of morphological filters based on image processing does not work for state-of-the-art surfaces, for example freeform surfaces. A set of novel geometric computation theory is developed by app...

  8. An Efficient Divide-and-Conquer Algorithm for Morphological Filters

    OpenAIRE

    Lou, Shan; Jiang, Xiangqian; Scott, Paul J

    2013-01-01

    Morphological filters, evolved from the traditional envelope filter, are function-oriented filtration techniques. Recent research on the implementation of morphological filters was based on the theoretical link between morphological operations and the alpha shape. However, the Delaunay triangulation on which the alpha shape method depends is costly for large areal data. This paper proposes a divide-and-conquer method as an optimization to the alpha shape method aiming to speed up its perfor...
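
    The sketch below shows a morphological closing on a 1-D profile with a disk (rolling-ball) structuring element, the basic envelope operation that the papers above compute efficiently for areal data; it uses plain grey-scale morphology rather than the alpha-shape or divide-and-conquer schemes, and the radius and profile are arbitrary.

```python
# Hedged sketch: morphological closing of a 1-D profile with a disk structuring
# element (a rolling-ball upper envelope). Plain grey-scale morphology, not the
# alpha-shape or divide-and-conquer implementations in the papers above.
import numpy as np
from scipy.ndimage import grey_closing

def disk_structure(radius, dx):
    half = int(radius / dx)
    xs = np.arange(-half, half + 1) * dx
    return np.sqrt(radius ** 2 - xs ** 2)          # heights of the disk profile

def morphological_closing(profile, radius, dx):
    ball = disk_structure(radius, dx)
    return grey_closing(profile, structure=ball, mode="nearest")

if __name__ == "__main__":
    x = np.arange(0, 10, 0.01)
    profile = np.sin(x) + 0.05 * np.random.default_rng(6).standard_normal(x.size)
    envelope = morphological_closing(profile, radius=0.5, dx=0.01)
    interior = slice(101, -101)                    # ignore boundary effects
    # away from the ends the closing stays on or above the profile
    print(float(np.min(envelope[interior] - profile[interior])) >= -1e-9)
```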

  9. An Adaptive Approach to Mitigate Background Covariance Limitations in the Ensemble Kalman Filter

    KAUST Repository

    Song, Hajoon

    2010-07-01

    A new approach is proposed to address the background covariance limitations arising from undersampled ensembles and unaccounted model errors in the ensemble Kalman filter (EnKF). The method enhances the representativeness of the EnKF ensemble by augmenting it with new members chosen adaptively to add missing information that prevents the EnKF from fully fitting the data to the ensemble. The vectors to be added are obtained by back projecting the residuals of the observation misfits from the EnKF analysis step onto the state space. The back projection is done using an optimal interpolation (OI) scheme based on an estimated covariance of the subspace missing from the ensemble. In the experiments reported here, the OI uses a preselected stationary background covariance matrix, as in the hybrid EnKF–three-dimensional variational data assimilation (3DVAR) approach, but the resulting correction is included as a new ensemble member instead of being added to all existing ensemble members. The adaptive approach is tested with the Lorenz-96 model. The hybrid EnKF–3DVAR is used as a benchmark to evaluate the performance of the adaptive approach. Assimilation experiments suggest that the new adaptive scheme significantly improves the EnKF behavior when it suffers from small size ensembles and neglected model errors. It was further found to be competitive with the hybrid EnKF–3DVAR approach, depending on ensemble size and data coverage.
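
    For reference, the sketch below implements a plain stochastic EnKF analysis step, the baseline that the adaptive scheme augments with extra members; the linear observation operator, noise level and ensemble size are illustrative assumptions.

```python
# Hedged sketch of a plain stochastic EnKF analysis step (perturbed
# observations, gain from sample covariances). The observation operator H,
# noise level and ensemble size are illustrative assumptions.
import numpy as np

def enkf_analysis(ensemble, y_obs, H, obs_std, rng):
    """ensemble: (n_members, n_state); y_obs: (n_obs,); H: (n_obs, n_state)."""
    n_members = ensemble.shape[0]
    X = ensemble - ensemble.mean(axis=0)                  # state perturbations
    Y = X @ H.T                                           # observation-space perturbations
    P_yy = Y.T @ Y / (n_members - 1) + np.eye(len(y_obs)) * obs_std ** 2
    P_xy = X.T @ Y / (n_members - 1)
    K = P_xy @ np.linalg.inv(P_yy)                        # Kalman gain from sample covariances
    perturbed_obs = y_obs + rng.normal(0.0, obs_std, size=(n_members, len(y_obs)))
    innovations = perturbed_obs - ensemble @ H.T
    return ensemble + innovations @ K.T

if __name__ == "__main__":
    rng = np.random.default_rng(7)
    n_state, n_members = 40, 20
    truth = np.sin(np.linspace(0, 2 * np.pi, n_state))
    ensemble = truth + rng.normal(0.0, 1.0, size=(n_members, n_state))   # prior ensemble
    H = np.eye(n_state)[::2]                                             # observe every other variable
    y = H @ truth + rng.normal(0.0, 0.1, size=H.shape[0])
    analysis = enkf_analysis(ensemble, y, H, 0.1, rng)
    print("prior RMSE ", np.sqrt(np.mean((ensemble.mean(0) - truth) ** 2)))
    print("posterior  ", np.sqrt(np.mean((analysis.mean(0) - truth) ** 2)))
```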

  10. Enhanced 3D PET OSEM reconstruction using inter-update Metz filtering

    International Nuclear Information System (INIS)

    We present an enhancement of the OSEM (ordered set expectation maximization) algorithm for 3D PET reconstruction, which we call the inter-update Metz filtered OSEM (IMF-OSEM). The IMF-OSEM algorithm incorporates filtering action into the image updating process in order to improve the quality of the reconstruction. With this technique, the multiplicative correction image - ordinarily used to update image estimates in plain OSEM - is applied to a Metz-filtered version of the image estimate at certain intervals. In addition, we present a software implementation that employs several high-speed features to accelerate reconstruction. These features include, firstly, forward and back projection functions which make full use of symmetry as well as a fast incremental computation technique. Secondly, the software has the capability of running in parallel mode on several processors. The parallelization approach employed yields a significant speed-up, which is nearly independent of the amount of data. Together, these features lead to reasonable reconstruction times even when using large image arrays and non-axially compressed projection data. The performance of IMF-OSEM was tested on phantom data acquired on the GE Advance scanner. Our results demonstrate that an appropriate choice of Metz filter parameters can improve the contrast-noise balance of certain regions of interest relative to both plain and post-filtered OSEM, and to the GE commercial reprojection algorithm software. (author)
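
    The sketch below shows the multiplicative EM update that OSEM accelerates with subsets, with an optional Gaussian smoothing applied between updates as a simple stand-in for the Metz filtering described above (the true Metz filter is an MTF-dependent, partially amplifying filter and is not reproduced here); the toy system matrix and data are arbitrary.

```python
# Hedged sketch: MLEM update (OSEM with a single subset) with optional
# inter-update Gaussian smoothing standing in for the Metz filter described
# above. The tiny system matrix and data are arbitrary.
import numpy as np
from scipy.ndimage import gaussian_filter

def mlem(system_matrix, measured, n_iters=20, filter_every=5, sigma=0.5, shape=None):
    x = np.ones(system_matrix.shape[1])
    sensitivity = system_matrix.sum(axis=0)
    for it in range(1, n_iters + 1):
        forward = system_matrix @ x
        ratio = measured / np.maximum(forward, 1e-12)
        x *= (system_matrix.T @ ratio) / np.maximum(sensitivity, 1e-12)
        if shape is not None and it % filter_every == 0:      # inter-update smoothing
            x = gaussian_filter(x.reshape(shape), sigma).ravel()
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(8)
    n_pix, n_bins = 16 * 16, 400
    A = rng.random((n_bins, n_pix)) * (rng.random((n_bins, n_pix)) > 0.9)   # sparse toy system matrix
    truth = np.zeros((16, 16))
    truth[5:11, 6:10] = 4.0
    data = rng.poisson(A @ truth.ravel())
    recon = mlem(A, data.astype(float), shape=(16, 16))
    print(recon.reshape(16, 16).round(1))
```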

  11. Effect of different thickness of material filter on Tc-99m spectra and performance parameters of gamma camera

    Science.gov (United States)

    Nazifah, A.; Norhanna, S.; Shah, S. I.; Zakaria, A.

    2014-11-01

    This study aimed to investigate the effects of the material filter technique on Tc-99m spectra and performance parameters of a Philips ADAC Forte dual-head gamma camera. The thickness of the material filter was selected on the basis of the percentage attenuation of various gamma-ray energies by different thicknesses of zinc. A cylindrical source tank of the NEMA single photon emission computed tomography (SPECT) Triple Line Source Phantom, filled with water and injected with Tc-99m radionuclide, was used for spectra, uniformity and sensitivity measurements. A vinyl plastic tube was used as a line source for spatial resolution. Images for uniformity were reconstructed by the filtered back projection method. A Butterworth filter of order 5 and cut-off frequency 0.35 cycles/cm was selected. Chang's attenuation correction method was applied by selecting a linear attenuation coefficient of 0.13/cm. The count rate was decreased by the material filter in the Compton region of the Tc-99m energy spectrum, and also in the photopeak region. Spatial resolution was improved. However, the uniformity of the tomographic image was equivocal, and system volume sensitivity was reduced by the material filter. The material filter improved the system's spatial resolution; therefore, the technique may be used for phantom studies to improve image quality.

  12. Making Tradition Healthy

    Centers for Disease Control (CDC) Podcasts

    2007-11-01

    In this podcast, a Latina nutrition educator shows how a community worked with local farmers to grow produce traditionally enjoyed by Hispanic/Latinos.  Created: 11/1/2007 by National Diabetes Education Program (NDEP), a joint program of the Centers for Disease Control and Prevention and the National Institutes of Health.   Date Released: 11/10/2007.

  13. Tradition in Science

    Science.gov (United States)

    Heisenberg, Werner

    1973-01-01

    Discusses the influence of tradition in science on selection of scientific problems and methods and on the use of concepts as tools for research work. Indicates that future research studies will be directed toward the change of fundamental concepts in such fields as astrophysics, molecular biology, and environmental science. (CC)

  14. The Traditional Rebel.

    Science.gov (United States)

    Lemansky, Janet

    1993-01-01

    Outlines the Linden, New Jersey, schools' introduction and use of electronic musical technology and contemporary instruments in the orchestral music program, which has broadened the musical repertoire and the recruitment of talented students not schooled in the classical tradition. Four applications of technology for rehearsals and instrumental…

  15. Child Psychotherapy: Converging Traditions

    Science.gov (United States)

    Altman, Neil

    2004-01-01

    In this paper I outline some of the ways in which I believe the psychoanalytic traditions in North America and in Great Britain are influencing each other. I identify points of convergence and divergence at this moment in the evolution of psychoanalytic theory and technique. I then point out some of the implications of relational perspectives in…

  16. Traditional Cherokee Food.

    Science.gov (United States)

    Hendrix, Janey B.

    A collection for children and teachers of traditional Cherokee recipes emphasizes the art, rather than the science, of cooking. The hand-printed, illustrated format is designed to communicate the feeling of Cherokee history and culture and to encourage readers to collect and add family recipes. The cookbook could be used as a starting point for…

  17. Traditions of technology

    Energy Technology Data Exchange (ETDEWEB)

    Nandy, A.

    1979-01-01

    Modern technology, with about 300 years of history behind it, has become the dominant tradition by marginalizing the other traditions of technology in the West and in the rest of the world. Important roles have been played in this marginalization by the ideology of the Enlightenment, by the Industrial Revolution, and by nineteenth and twentieth century colonialism. They have blurred the difference between science and technology, underwritten the mechanomorphic world-image and promoted the concept of a value-free, ethically unrestrained technology. However, the present crisis of technological consciousness has brought to the fore alternative traditions of technology, not as ethnotechnologies from which a universal, secular, modern technology can draw lessons, but as competing philosophies of universality which can provide correctives to the alienating, exploitative, and dehumanizing role of modern science and technology. An alternative ideology of science is needed for this, as well as a new legitimacy for the traditional technosystems and their cultural environments. Such a legitimacy will have to be based on a different set of values relating to the man--nature and man--man relationships and a deeper understanding of the politics of technology in its cross-national and cross-cultural contexts.

  18. Sensory Pollution from Bag Filters, Carbon Filters and Combinations

    DEFF Research Database (Denmark)

    Bekö, Gabriel; Clausen, Geo; Weschler, Charles J.

    2008-01-01

    by an upstream pre-filter (changed monthly), an EU7 filter protected by an upstream activated carbon (AC) filter, and EU7 filters with an AC filter either downstream or both upstream and downstream. In addition, two types of stand-alone combination filters were evaluated: a bag-type fiberglass filter...... that contained AC and a synthetic fiber cartridge filter that contained AC. Air that had passed through used filters was most acceptable for those sets in which an AC filter was used downstream of the particle filter. Comparable air quality was achieved with the stand-alone bag filter that contained AC...

  19. HEPA Filter Vulnerability Assessment

    International Nuclear Information System (INIS)

    This assessment of High Efficiency Particulate Air (HEPA) filter vulnerability was requested by the USDOE Office of River Protection (ORP) to satisfy a DOE-HQ directive to evaluate the effect of filter degradation on the facility authorization basis assumptions. Within the scope of this assessment are ventilation system HEPA filters that are classified as Safety-Class (SC) or Safety-Significant (SS) components that perform an accident mitigation function. The objective of the assessment is to verify whether HEPA filters that perform a safety function during an accident are likely to perform as intended to limit release of hazardous or radioactive materials, considering factors that could degrade the filters. Filter degradation factors considered include aging, wetting of filters, exposure to high temperature, exposure to corrosive or reactive chemicals, and exposure to radiation. Screening and evaluation criteria were developed by a site-wide group of HVAC engineers and HEPA filter experts from published empirical data. For River Protection Project (RPP) filters, the only degradation factor that exceeded the screening threshold was for filter aging. Subsequent evaluation of the effect of filter aging on the filter strength was conducted, and the results were compared with required performance to meet the conditions assumed in the RPP Authorization Basis (AB). It was found that the reduction in filter strength due to aging does not affect the filter performance requirements as specified in the AB. A portion of the HEPA filter vulnerability assessment is being conducted by the ORP and is not part of the scope of this study. The ORP is conducting an assessment of the existing policies and programs relating to maintenance, testing, and change-out of HEPA filters used for SC/SS service. This document presents the results of a HEPA filter vulnerability assessment conducted for the River protection project as requested by the DOE Office of River Protection

  20. Challenging tradition in Nigeria.

    Science.gov (United States)

    Supriya, K E

    1991-01-01

    In Nigeria since 1987, the National Association of Nigeria Nurses and Midwives (NANNM) has used traditional media and traditional health care workers to curtail the practice of female circumcision. Other harmful traditions are being changed also, such as early marriage, taboos of pregnancy and childbirth, and scarification. 30,000 members of NANNM are involved in this effort to halt the harmful practices themselves and to change community opinion. The program involved national and state level workshops on the harmful health consequences of traditional practices and instruction on how to conduct focus group discussions to assess women's beliefs and practices. The focus groups were found to be a particularly successful method of opening up discussion of taboo topics and expressing deep emotions. The response to the knowledge that circumcision was not necessary was rage and anger, which was channeled into advocacy roles for change in the practice. The result was the development of books, leaflets and videos. One community group designed a dress with a decorative motif of tattoos and bodily cuts to symbolize circumcision and scarring. Plays and songs were written and performed. Artists provided models of female genitalia both before and after circumcision. The campaign has been successful in bringing this issue to public attention in prominent ways, such as national television, health talk shows, and women's magazines. One of the most important results of the effort has been the demonstration that culture and tradition can be changed from within, rather than by outside imposition of values and beliefs. PMID:12284522

  1. Robot Visual Servo with Fuzzy Particle Filter

    Directory of Open Access Journals (Sweden)

    Jie Ma

    2012-04-01

    This paper proposes a robot visual servo method with an adaptive particle filter based on fuzzy logic theory to estimate online the total Jacobian matrix of a robot visual servo system. A set of fuzzy rules is used to select an appropriate number of particles according to the filtering estimation errors. When the estimation error is high more particles are used, and when the estimation error is low fewer particles are used. The visual servo results on a two degree-of-freedom robot system show that the proposed fuzzy adaptive particle filter visual servo method needs less time than the traditional particle filter visual servo method to achieve comparable tracking accuracy.
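
    The adaptation rule described above can be caricatured with simple thresholds, as in the sketch below: more particles are allocated when the recent estimation error is large and fewer when it is small. The thresholds and limits stand in for the paper's fuzzy rule base and are assumptions.

```python
# Hedged sketch of adapting the particle count to the filtering error. Plain
# thresholds stand in for the paper's fuzzy rules; error metric and limits are
# assumptions.
def adapt_particle_count(current_n, recent_error, err_low=0.5, err_high=2.0,
                         n_min=100, n_max=2000, step=100):
    if recent_error > err_high:          # poor tracking -> spend more particles
        return min(current_n + step, n_max)
    if recent_error < err_low:           # good tracking -> save computation
        return max(current_n - step, n_min)
    return current_n                     # acceptable error -> keep the budget

if __name__ == "__main__":
    n = 500
    for err in [0.3, 0.4, 1.0, 2.5, 3.0, 0.2]:
        n = adapt_particle_count(n, err)
        print(err, "->", n)
```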

  2. Ceramic chemical filter; Seramikku kemikaru fuiruta

    Energy Technology Data Exchange (ETDEWEB)

    Sakata, Soichiro [Takasago Thermal Engineering Corp., Tokyo (Japan)]

    1999-07-01

    In clean rooms for producing semiconductor and liquid-crystal device elements, airborne impurities such as acids, alkalis and trace organic gases at the ppb level by volume degrade the quality of the elements. There are two countermeasures for reducing gaseous impurities: reducing the amount generated, for example the impurities carried in with the supply air, production equipment, raw materials, workers and the construction materials of the clean room; and removing the impurities from the air with a chemical filter. Chemical filters are traditionally made of activated carbon or ion-exchange fibers. This paper introduces a new chemical filter that uses a ceramic developed by the authors as the base material. The ceramic chemical filters are divided into types for removing alkaline gases, for removing acid gases, for removing organic gases, and for doping processes. (NEDO)

  3. A New Class of Nonlinear Filters: Microstatistic Volterra Filters

    Directory of Open Access Journals (Sweden)

    S. Marchevsky

    1996-04-01

    In this paper a new subset of the time-invariant microstatistic filters, so-called microstatistic Volterra filters, is proposed. This class of nonlinear filters is based on the idea of generalizing the conventional microstatistic filter by substituting the Wiener filters applied in the conventional microstatistic filter structure with Volterra filters. The advantage of the microstatistic Volterra filters in comparison with Wiener filters, Volterra filters and conventional microstatistic filters is that in the case of non-Gaussian signal processing the microstatistic Volterra filters can outperform them. The validity of this basic property of the microstatistic Volterra filters is verified by a number of computer experiments. The disadvantage of the microstatistic Volterra filters is their relatively high computational complexity.

  4. Basalt Fiber Based Filters

    International Science & Technology Center (ISTC)

    Development of Highly Effective Filtering Systems on the Basis of a Super-Thin Basalt Fiber for Radioactive Aerosols Purification and Creation of a Work Cycle for Filters Manufacturing with the Purpose of Their Operation at the Nuclear Power Plants

  5. MST Filterability Tests

    Energy Technology Data Exchange (ETDEWEB)

    Poirier, M. R. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); Burket, P. R. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); Duignan, M. R. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL)

    2015-03-12

    The Savannah River Site (SRS) is currently treating radioactive liquid waste with the Actinide Removal Process (ARP) and the Modular Caustic Side Solvent Extraction Unit (MCU). The low filter flux through the ARP has limited the rate at which radioactive liquid waste can be treated. Recent filter flux has averaged approximately 5 gallons per minute (gpm). Salt Batch 6 has had a lower processing rate and required frequent filter cleaning. Savannah River Remediation (SRR) has a desire to understand the causes of the low filter flux and to increase ARP/MCU throughput. In addition, at the time the testing started, SRR was assessing the impact of replacing the 0.1 micron filter with a 0.5 micron filter. This report describes testing of MST filterability to investigate the impact of filter pore size and MST particle size on filter flux and testing of filter enhancers to attempt to increase filter flux. The authors constructed a laboratory-scale crossflow filter apparatus with two crossflow filters operating in parallel. One filter was a 0.1 micron Mott sintered SS filter and the other was a 0.5 micron Mott sintered SS filter. The authors also constructed a dead-end filtration apparatus to conduct screening tests with potential filter aids and body feeds, referred to as filter enhancers. The original baseline for ARP was 5.6 M sodium salt solution with a free hydroxide concentration of approximately 1.7 M. ARP has been operating with a sodium concentration of approximately 6.4 M and a free hydroxide concentration of approximately 2.5 M. SRNL conducted tests varying the concentration of sodium and free hydroxide to determine whether those changes had a significant effect on filter flux. The feed slurries for the MST filterability tests were composed of simple salts (NaOH, NaNO2, and NaNO3) and MST (0.2 – 4.8 g/L). The feed slurry for the filter enhancer tests contained simulated salt batch 6 supernate, MST, and filter enhancers.

  6. Oriented Fiber Filter Media

    OpenAIRE

    Bharadwaj, R.; A. Patel, S. Chokdeepanich, Ph.D.; G.G. Chase, Ph.D.

    2008-01-01

    Coalescing filters are widely used throughout industry and improved performance will reduce droplet emissions and operating costs. Experimental observations show that the orientation of micro fibers in filter media affects the permeability and the separation efficiency of the filter media. In this work two methods are used to align the fibers to alter the filter structure. The results show that axially aligned fiber media improve quality factor on the order of 20% and cutting media on an angle from a t...

  7. Filtering, FDR and power

    Directory of Open Access Journals (Sweden)

    van Iterson Maarten

    2010-09-01

    Background: In high-dimensional data analysis such as differential gene expression analysis, people often use filtering methods like fold-change or variance filters in an attempt to reduce the multiple testing penalty and improve power. However, filtering may introduce a bias on the multiple testing correction. The precise amount of bias depends on many quantities, such as the fraction of probes filtered out, the filter statistic and the test statistic used. Results: We show that a biased multiple testing correction results if non-differentially expressed probes are not filtered out with equal probability from the entire range of p-values. We illustrate our results using both a simulation study and an experimental dataset, where the FDR is shown to be biased mostly by filters that are associated with the hypothesis being tested, such as the fold change. Filters that induce little bias on the FDR yield less additional power of detecting differentially expressed genes. Finally, we propose a statistical test that can be used in practice to determine whether any chosen filter introduces bias on the FDR estimate used, given a general experimental setup. Conclusions: Filtering out of probes must be used with care as it may bias the multiple testing correction. Researchers can use our test for FDR bias to guide their choice of filter and amount of filtering in practice.
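
    The sketch below illustrates the workflow discussed above: probes are filtered on overall variance and the surviving p-values receive a Benjamini-Hochberg correction; the data, filter fraction and test are illustrative, and the paper's point is precisely that filters tied to the test statistic (such as fold change) can bias this correction.

```python
# Hedged sketch: variance filtering followed by Benjamini-Hochberg FDR
# correction of the remaining p-values. Data, filter fraction and test are
# illustrative; filters correlated with the test statistic can bias the FDR.
import numpy as np
from scipy import stats

def bh_adjust(pvals):
    """Benjamini-Hochberg adjusted p-values."""
    p = np.asarray(pvals)
    order = np.argsort(p)
    ranked = p[order] * len(p) / (np.arange(len(p)) + 1)
    ranked = np.minimum.accumulate(ranked[::-1])[::-1]       # enforce monotonicity
    adjusted = np.empty_like(ranked)
    adjusted[order] = np.clip(ranked, 0, 1)
    return adjusted

if __name__ == "__main__":
    rng = np.random.default_rng(9)
    n_probes, n_per_group = 2000, 6
    data = rng.normal(size=(n_probes, 2 * n_per_group))
    data[:100, n_per_group:] += 1.5                           # 100 truly changed probes
    keep = data.var(axis=1) >= np.quantile(data.var(axis=1), 0.5)   # variance filter
    t, p = stats.ttest_ind(data[keep, :n_per_group], data[keep, n_per_group:], axis=1)
    print("significant at FDR 0.05:", int((bh_adjust(p) < 0.05).sum()))
```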

  8. Practical Active Capacitor Filter

    Science.gov (United States)

    Shuler, Robert L., Jr. (Inventor)

    2005-01-01

    A method and apparatus is described that filters an electrical signal. The filtering uses a capacitor multiplier circuit where the capacitor multiplier circuit uses at least one amplifier circuit and at least one capacitor. A filtered electrical signal results from a direct connection from an output of the at least one amplifier circuit.

  9. The Ribosome Filter Redux

    OpenAIRE

    Mauro, Vincent P.; Edelman, Gerald M.

    2007-01-01

    The ribosome filter hypothesis postulates that ribosomes are not simply translation machines but also function as regulatory elements that differentially affect or filter the translation of particular mRNAs. On the basis of new information, we take the opportunity here to review the ribosome filter hypothesis, suggest specific mechanisms of action, and discuss recent examples from the literature that support it.

  10. [Children and traditional practices].

    Science.gov (United States)

    Aholi, P

    1990-07-01

    African traditional practices can be both beneficial and harmful to the newly born. This article describes these practices from 4 perspectives: 1) the period following childbirth or "maternage;" 2) nutrition; 3) curative care; and 4) social customs. The beneficial practices include: 1) giving the baby water as soon as he is washed to prevent neonatal hypoglycemia; 2) breast feeding; 3) carrying the baby on the mother's back and 4) the traditional massage. The harmful practices during maternage include: 1) the baby is rolled in the dirt to protect him and "give birth to his race;" 2) after birth the baby is given lemon juice or gin to prevent the obstruction of the respiratory cords; 3) mother and baby are "put in the dark" or a separate room for the rest of the family and community for 6 days to protect them against evil influences. The harmful nutritional practices are based on superstitions that relate to all animal products because they might produce diseases. 1) Eggs are known to cause diarrhea and throughout Africa eggs are forbidden because of their effect on children's physical development. 2) Chicken and pigeon and "everything that flies" causes convulsions. 3) Palm oil, oranges and bananas cause coughing. 4) Sugar cane, manioc leaves and everything with natural sugar cause intestinal ailments. Traditional health cures are used during an illness and are aimed at reestablishing the balance between man and his environment. Examples described include treatment for measles and chicken pox; fevers; diarrhea, and vomiting and convulsions. The positive traditional African practices need to be combined with those of modern medicine while discouraging the harmful practices. PMID:12342828

  11. Bias aware Kalman filters

    DEFF Research Database (Denmark)

    Drecourt, J.-P.; Madsen, H.; Rosbjerg, Dan

    2006-01-01

    This paper reviews two different approaches that have been proposed to tackle the problems of model bias with the Kalman filter: the use of a colored noise model and the implementation of a separate bias filter. Both filters are implemented with and without feedback of the bias into the model state...... illustrated on a simple one-dimensional groundwater problem. The results show that the presented filters outperform the standard Kalman filter and that the implementations with bias feedback work in more general conditions than the implementations without feedback. 2005 Elsevier Ltd. All rights reserved....

  12. Ceramic fiber filter technology

    Energy Technology Data Exchange (ETDEWEB)

    Holmes, B.L.; Janney, M.A.

    1996-06-01

    Fibrous filters have been used for centuries to protect individuals from dust, disease, smoke, and other gases or particulates. In the 1970s and 1980s ceramic filters were developed for filtration of hot exhaust gases from diesel engines. Tubular, or candle, filters have been made to remove particles from gases in pressurized fluidized-bed combustion and gasification-combined-cycle power plants. Very efficient filtration is necessary in power plants to protect the turbine blades. The limited lifespan of ceramic candle filters has been a major obstacle in their development. The present work is focused on forming fibrous ceramic filters using a papermaking technique. These filters are highly porous and therefore very lightweight. The papermaking process consists of filtering a slurry of ceramic fibers through a steel screen to form paper. Papermaking and the selection of materials will be discussed, as well as preliminary results describing the geometry of papers and relative strengths.

  13. Miniature wideband filter based on coupled-line sections and quasi-lumped element resonator

    DEFF Research Database (Denmark)

    Zhurbenko, Vitaliy; Krozer, Viktor

    2007-01-01

    A new design of a wideband bandpass filter is proposed, based on coupled-line sections and a quasi-lumped element resonator, taking advantage of the latter to introduce two transmission zeros and suppress a spurious response. The proposed filter demonstrates significantly improved characteristics in comparison with a traditional coupled-line filter and exhibits a very compact structure.

  14. In the Dirac tradition

    International Nuclear Information System (INIS)

    It was Paul Dirac who cast quantum mechanics into the form we now use, and many generations of theoreticians openly acknowledge his influence on their thinking. When Dirac died in 1984, St. John's College, Cambridge, his base for most of his lifetime, instituted an annual lecture in his memory at Cambridge. The first lecture, in 1986, attracted two heavyweights - Richard Feynman and Steven Weinberg. Far from using the lectures as a platform for their own work, in the Dirac tradition they presented stimulating material on deep underlying questions

  15. Non-Traditional Vectors for Paralytic Shellfish Poisoning

    Directory of Open Access Journals (Sweden)

    Sara Watt Longan

    2008-06-01

    Full Text Available Paralytic shellfish poisoning (PSP), due to saxitoxin and related compounds, typically results from the consumption of filter-feeding molluscan shellfish that concentrate toxins from marine dinoflagellates. In addition to these microalgal sources, saxitoxin and related compounds, referred to in this review as STXs, are also produced in freshwater cyanobacteria and have been associated with calcareous red macroalgae. STXs are transferred and bioaccumulate throughout aquatic food webs, and can be vectored to terrestrial biota, including humans. Fisheries closures and human intoxications due to STXs have been documented in several non-traditional (i.e. non-filter-feeding) vectors. These include, but are not limited to, marine gastropods, both carnivorous and grazing, crustacea, and fish that acquire STXs through toxin transfer. Often due to spatial, temporal, or species disconnection from the primary source of STXs (bloom-forming dinoflagellates), monitoring and management of such non-traditional PSP vectors has been challenging. A brief literature review is provided for filter-feeding (traditional) and non-filter-feeding (non-traditional) vectors of STXs with specific reference to human effects. We include several case studies pertaining to management actions to prevent PSP, as well as food poisoning incidents from STX(s) accumulation in non-traditional PSP vectors.

  16. Evaluation of median filtering after reconstruction with maximum likelihood expectation maximization (ML-EM) by real space and frequency space

    International Nuclear Information System (INIS)

    Maximum likelihood expectation maximization (ML-EM) image quality is sensitive to the number of iterations, because a large number of iterations leads to images with checkerboard noise. The use of median filtering in the reconstruction process allows both noise reduction and edge preservation. We examined the value of median filtering after reconstruction with ML-EM by comparing filtered back projection (FBP) with a ramp filter or ML-EM without filtering. SPECT images were obtained with a dual-head gamma camera. The acquisition time was changed from 10 to 200 (seconds/frame) to examine the effect of the count statistics on the quality of the reconstructed images. First, images were reconstructed with ML-EM by changing the number of iterations from 1 to 150 in each study. Additionally, median filtering was applied following reconstruction with ML-EM. The quality of the reconstructed images was evaluated in terms of normalized mean square error (NMSE) values and two-dimensional power spectrum analysis. Median filtering after reconstruction by the ML-EM method provided stable NMSE values even when the number of iterations was increased. The signal component of the image remained close to the reference image for any number of iterations. Median filtering after reconstruction with ML-EM was useful in reducing noise, with a resolution similar to that achieved by reconstruction with FBP and a ramp filter. Especially in images with poor count statistics, median filtering after reconstruction with ML-EM is effective as a simple, widely available method. (author)
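
    As a rough illustration of the post-filtering step described above (a sketch, not the authors' implementation; the system matrix and image sizes below are toy placeholders rather than a real SPECT projector), the following Python snippet runs a basic ML-EM loop and applies a 3x3 median filter to the result:

        import numpy as np
        from scipy.ndimage import median_filter

        def mlem(A, y, n_iter):
            # Basic ML-EM update: x <- x * A^T(y / Ax) / A^T(1)
            x = np.ones(A.shape[1])
            sens = A.T @ np.ones(A.shape[0])
            for _ in range(n_iter):
                ratio = y / np.maximum(A @ x, 1e-12)
                x *= (A.T @ ratio) / np.maximum(sens, 1e-12)
            return x

        def nmse(x, ref):
            # Normalized mean square error against a reference image.
            return np.sum((x - ref) ** 2) / np.sum(ref ** 2)

        rng = np.random.default_rng(0)
        A = rng.random((400, 16 * 16))              # toy system matrix (placeholder)
        truth = rng.random(16 * 16)
        y = rng.poisson(A @ truth).astype(float)    # Poisson-noisy projections

        recon = mlem(A, y, n_iter=50).reshape(16, 16)
        recon_med = median_filter(recon, size=3)    # median filtering after ML-EM
        print(nmse(recon, truth.reshape(16, 16)), nmse(recon_med, truth.reshape(16, 16)))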

  17. Generic Kalman Filter Software

    Science.gov (United States)

    Lisano, Michael E., II; Crues, Edwin Z.

    2005-01-01

    The Generic Kalman Filter (GKF) software provides a standard basis for the development of application-specific Kalman-filter programs. Historically, Kalman filters have been implemented by customized programs that must be written, coded, and debugged anew for each unique application, then tested and tuned with simulated or actual measurement data. Total development times for typical Kalman-filter application programs have ranged from weeks to months. The GKF software can simplify the development process and reduce the development time by eliminating the need to re-create the fundamental implementation of the Kalman filter for each new application. The GKF software is written in the ANSI C programming language. It contains a generic Kalman-filter-development directory that, in turn, contains code for a generic Kalman-filter function; more specifically, it contains a generically designed and generically coded implementation of linear, linearized, and extended Kalman filtering algorithms, including algorithms for state- and covariance-update and -propagation functions. The mathematical theory that underlies the algorithms is well known and has been reported extensively in the open technical literature. Also contained in the directory are a header file that defines generic Kalman-filter data structures and prototype functions and template versions of application-specific subfunction and calling navigation/estimation routine code and headers. Once the user has provided a calling routine and the required application-specific subfunctions, the application-specific Kalman-filter software can be compiled and executed immediately. During execution, the generic Kalman-filter function is called from a higher-level navigation or estimation routine that preprocesses measurement data and post-processes output data. The generic Kalman-filter function uses the aforementioned data structures and five implementation-specific subfunctions, which have been developed by the user on the basis of the aforementioned templates. The GKF software can be used to develop many different types of unfactorized Kalman filters. A developer can choose to implement either a linearized or an extended Kalman filter algorithm, without having to modify the GKF software. Control dynamics can be taken into account or neglected in the filter-dynamics model. Filter programs developed by use of the GKF software can be made to propagate equations of motion for linear or nonlinear dynamical systems that are deterministic or stochastic. In addition, filter programs can be made to operate in user-selectable "covariance analysis" and "propagation-only" modes that are useful in design and development stages.
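
    The GKF itself is ANSI C; the following is only a minimal Python sketch of the linear predict/update structure the record describes, with a toy constant-velocity example (the matrices F, H, Q, R and the measurement list are illustrative assumptions, not part of the GKF software):

        import numpy as np

        def kf_predict(x, P, F, Q):
            # State and covariance propagation: x = F x, P = F P F^T + Q
            return F @ x, F @ P @ F.T + Q

        def kf_update(x, P, z, H, R):
            # Measurement update with gain K = P H^T (H P H^T + R)^-1
            S = H @ P @ H.T + R
            K = P @ H.T @ np.linalg.inv(S)
            x = x + K @ (z - H @ x)
            P = (np.eye(len(x)) - K @ H) @ P
            return x, P

        dt = 1.0
        F = np.array([[1.0, dt], [0.0, 1.0]])       # constant-velocity model
        H = np.array([[1.0, 0.0]])                  # position is measured
        Q, R = 0.01 * np.eye(2), np.array([[0.5]])
        x, P = np.zeros(2), np.eye(2)
        for z in [1.1, 2.0, 2.9, 4.2]:              # hypothetical position measurements
            x, P = kf_predict(x, P, F, Q)
            x, P = kf_update(x, P, np.array([z]), H, R)
        print(x)                                    # estimated position and velocity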

  18. Retrofitting fabric filters for clean stack emission

    International Nuclear Information System (INIS)

    The fly ash generated from New South Wales coals, which are predominantly low sulphur coals, has been difficult to collect in traditional electrostatic precipitators. During the early 1970s, development work was undertaken on the use of fabric filters at some of the Commission's older power stations. The satisfactory performance of the plant at those power stations led to the selection of fabric filters for flue gas cleaning at the next two new power stations constructed by the Electricity Commission of New South Wales. On-going pilot plant testing has continued to indicate the satisfactory performance of enhanced designs of fabric filters of varying types, and the Commission has recently retrofitted pulse cleaned fabric filters to 2 x 350 MW units at a further power station, with plans to retrofit similar plant to the remaining 2 x 350 MW units at that station. A contract has also been let for the retrofitting of pulse cleaned fabric filters to 4 x 500 MW units at another power station in the Commission's system. The paper reviews the performance of the 6000 MW of plant operating with fabric filters. Fabric selection and fabric life form an important aspect of this review

  19. Concentric Split Flow Filter

    Science.gov (United States)

    Stapleton, Thomas J. (Inventor)

    2015-01-01

    A concentric split flow filter may be configured to remove odor and/or bacteria from pumped air used to collect urine and fecal waste products. For instance, the filter may be designed to effectively fill the volume that was previously considered wasted surrounding the transport tube of a waste management system. The concentric split flow filter may be configured to split the air flow, with substantially half of the air flow to be treated traveling through a first bed of filter media and substantially the other half of the air flow to be treated traveling through the second bed of filter media. This split flow design reduces the air velocity by 50%. In this way, the pressure drop of the filter may be reduced by as much as a factor of 4 as compared to the conventional design.
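
    One hedged way to read the quoted factor of 4 (the record itself does not state the scaling law): if the pressure drop is assumed to grow roughly with the square of the face velocity, halving the velocity gives

        $$\Delta p \propto v^{2} \quad\Rightarrow\quad \Delta p\!\left(\tfrac{v}{2}\right) = \tfrac{1}{4}\,\Delta p(v);$$

    for filter media operating in the viscous (Darcy) regime the dependence is closer to linear, so the factor of 4 would be an upper-bound estimate.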

  20. Spot-Zombie Filtering System

    OpenAIRE

    Arathy Rajagopal; B. Geethanjali; Arulprakash. P

    2014-01-01

    A major security challenge on the Internet is the existence of the large number of compromised machines. Such machines have been increasingly used to launch various security attacks including spamming and spreading malware, DDoS, and identity theft. These compromised machines are called “Zombies”. In general, e-mail applications and providers use spam filters to filter the spam messages. Spam filtering is a technique for discriminating the genuine message from the spam message...

  1. Bayesian Trend Filtering

    OpenAIRE

    Roualdes, Edward A.

    2015-01-01

    We develop a fully Bayesian hierarchical model for trend filtering, itself a new development in nonparametric, univariate regression. The framework more broadly applies to the generalized lasso, but focus is on Bayesian trend filtering. We compare two shrinkage priors, double exponential and generalized double Pareto. A simulation study, comparing Bayesian trend filtering to the original formulation and a number of other popular methods shows our method to improve estimation...
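
    For context, and not as a statement of the authors' hierarchy: the (non-Bayesian) trend filtering estimate of order k is usually written as

        $$\hat{\beta} = \arg\min_{\beta \in \mathbb{R}^{n}} \tfrac{1}{2}\lVert y - \beta\rVert_{2}^{2} + \lambda\,\lVert D^{(k+1)}\beta\rVert_{1},$$

    where D^{(k+1)} is the discrete difference operator of order k+1; the Bayesian formulation replaces the l1 penalty with shrinkage priors such as the double exponential or the generalized double Pareto.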

  2. Hybrid Filter Membrane

    Science.gov (United States)

    Laicer, Castro; Rasimick, Brian; Green, Zachary

    2012-01-01

    Cabin environmental control is an important issue for a successful Moon mission. Due to the unique environment of the Moon, lunar dust control is one of the main problems that significantly diminishes the air quality inside spacecraft cabins. Therefore, this innovation was motivated by NASA's need to minimize the negative health impact that air-suspended lunar dust particles have on astronauts in spacecraft cabins. It is based on fabrication of a hybrid filter comprising nanofiber nonwoven layers coated on porous polymer membranes with uniform cylindrical pores. This design results in a high-efficiency gas particulate filter with low pressure drop and the ability to be easily regenerated to restore filtration performance. A hybrid filter was developed consisting of a porous membrane with uniform, micron-sized, cylindrical pore channels coated with a thin nanofiber layer. Compared to conventional filter media such as a high-efficiency particulate air (HEPA) filter, this filter is designed to provide high particle efficiency, low pressure drop, and the ability to be regenerated. These membranes have well-defined micron-sized pores and can be used independently as air filters with discrete particle size cut-off, or coated with nanofiber layers for filtration of ultrafine nanoscale particles. The filter consists of a thin design intended to facilitate filter regeneration by localized air pulsing. The main feature of this invention is the concept of combining a micro-engineered straight-pore membrane with nanofibers. The micro-engineered straight pore membrane can be prepared with extremely high precision. Because the resulting membrane pores are straight and not tortuous like those found in conventional filters, the pressure drop across the filter is significantly reduced. The nanofiber layer is applied as a very thin coating to enhance filtration efficiency for fine nanoscale particles. Additionally, the thin nanofiber coating is designed to promote capture of dust particles on the filter surface and to facilitate dust removal with pulse or back airflow.

  3. Practical alarm filtering

    International Nuclear Information System (INIS)

    An expert system-based alarm filtering method is described which prioritizes and reduces the number of alarms facing an operator. This patented alarm filtering methodology was originally developed and implemented in a pressurized water reactor, and subsequently in a chemical processing facility. Both applications were in LISP and both were successful. In the chemical processing facility, for instance, alarm filtering reduced the quantity of alarm messages by 90%. 6 figs

  4. Updating the OMERACT filter

    DEFF Research Database (Denmark)

    Wells, George; Beaton, Dorcas E; Tugwell, Peter; Boers, Maarten; Kirwan, John R; Bingham, Clifton O; Boonen, Annelies; Brooks, Peter; Conaghan, Philip G; D'Agostino, Maria-Antonietta; Dougados, Maxime; Furst, Daniel E; Gossec, Laure; Guillemin, Francis; Helliwell, Philip; Hewlett, Sarah; Kvien, Tore K; Landewé, Robert B; March, Lyn; Mease, Philip J; Østergaard, Mikkel; Simon, Lee; Singh, Jasvinder A; Strand, Vibeke; van der Heijde, Désirée M

    2014-01-01

    The "Discrimination" part of the OMERACT Filter asks whether a measure discriminates between situations that are of interest. "Feasibility" in the OMERACT Filter encompasses the practical considerations of using an instrument, including its ease of use, time to complete, monetary costs, and interpretability of the question(s) included in the instrument. Both the Discrimination and Reliability parts of the filter have been helpful but were agreed on primarily by consensus of OMERACT participants ...

  5. Randomized Filtering Algorithms

    DEFF Research Database (Denmark)

    Katriel, Irit; Van Hentenryck, Pascal

    2008-01-01

    Filtering every global constraint of a CSP to arc consistency at every search step can be costly, and solvers often compromise on either the level of consistency or the frequency at which arc consistency is enforced. In this paper we propose two randomized filtering schemes for dense instances of AllDifferent and its generalization, the Global Cardinality Constraint. The first delayed filtering scheme is a Monte Carlo algorithm: its running time is superior, in the worst case, to that of enforcing...

  6. Approximate Kalman filtering

    CERN Document Server

    Chen, G

    1993-01-01

    The Kalman filtering algorithm gives optimal (linear, unbiased and minimum error-variance) estimates of the unknown state vectors of a linear dynamic-observation system, under regular conditions such as perfect data information; complete noise statistics; exact linear modeling; ideal well-conditioned matrices in computation and strictly centralized filtering. In practice, however, one or more of the aforementioned conditions may not be satisfied, so that the standard Kalman filtering algorithm cannot be directly used, and hence "approximate Kalman filtering" becomes necessary. In the last decad

  7. Linear phase compressive filter

    Science.gov (United States)

    McEwan, Thomas E. (Livermore, CA)

    1995-01-01

    A phase linear filter for soliton suppression is in the form of a laddered series of stages of non-commensurate low pass filters with each low pass filter having a series coupled inductance (L) and a reverse biased, voltage dependent varactor diode, to ground which acts as a variable capacitance (C). L and C values are set to levels which correspond to a linear or conventional phase linear filter. Inductance is mapped directly from that of an equivalent nonlinear transmission line and capacitance is mapped from the linear case using a large signal equivalent of a nonlinear transmission line.

  8. Filters in nuclear facilities

    International Nuclear Information System (INIS)

    The topics of the nine papers given include the behavior of HEPA filters during exposure to air flows of high humidity as well as of high differential pressure, the development of steel-fiber filters suitable for extreme operating conditions, and the occurrence of various radioactive iodine species in the exhaust air from boiling water reactors. In an introductory presentation the German view of the performance requirements to be met by filters in nuclear facilities as well as the present status of filter quality assurance are discussed. (orig.)

  9. Nanofiber Filters Eliminate Contaminants

    Science.gov (United States)

    2009-01-01

    With support from Phase I and II SBIR funding from Johnson Space Center, Argonide Corporation of Sanford, Florida tested and developed its proprietary nanofiber water filter media. Capable of removing more than 99.99 percent of dangerous particles like bacteria, viruses, and parasites, the media was incorporated into the company's commercial NanoCeram water filter, an inductee into the Space Foundation's Space Technology Hall of Fame. In addition to its drinking water filters, Argonide now produces large-scale nanofiber filters used as part of the reverse osmosis process for industrial water purification.

  10. Paul Rodgersi filter Kohilas

    Index Scriptorium Estoniae

    2000-01-01

    On 28 January, a site-specific sculpture and performance, "Filter", took place at Kohila secondary school. The undertaking, marking the school's 130th anniversary, was led by the sculptor Paul Rodgers together with two final-year students, Marko Heinmäe and Hendrik Karm.

  11. Updating the OMERACT filter

    DEFF Research Database (Denmark)

    Wells, George; Beaton, Dorcas E; Tugwell, Peter; Boers, Maarten; Kirwan, John R; Bingham, Clifton O; Boonen, Annelies; Brooks, Peter; Conaghan, Philip G; D'Agostino, Maria-Antonietta; Dougados, Maxime; Furst, Daniel E; Gossec, Laure; Guillemin, Francis; Helliwell, Philip; Hewlett, Sarah; Kvien, Tore K; Landewé, Robert B; March, Lyn; Mease, Philip J; Østergaard, Mikkel; Simon, Lee; Singh, Jasvinder A; Strand, Vibeke; van der Heijde, Désirée M

    2014-01-01

    The "Discrimination" part of the OMERACT Filter asks whether a measure discriminates between situations that are of interest. "Feasibility" in the OMERACT Filter encompasses the practical considerations of using an instrument, including its ease of use, time to complete, monetary costs, and...... interpretability of the question(s) included in the instrument. Both the Discrimination and Reliability parts of the filter have been helpful but were agreed on primarily by consensus of OMERACT participants rather than through explicit evidence-based guidelines. In Filter 2.0 we wanted to improve this definition...

  12. Fast Guided Filter

    OpenAIRE

    He, Kaiming; Sun, Jian

    2015-01-01

    The guided filter is a technique for edge-aware image filtering. Because of its nice visual quality, fast speed, and ease of implementation, the guided filter has witnessed various applications in real products, such as image editing apps in phones and stereo reconstruction, and has been included in official MATLAB and OpenCV. In this note, we remind that the guided filter can be simply sped up from O(N) time to O(N/s^2) time for a subsampling ratio s. In a variety of applications, this leads...
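
    A minimal sketch of the subsampling idea for a gray-scale guide (illustrative only; the parameter values and the use of SciPy for box filtering and resizing are our assumptions, not the authors' reference code):

        import numpy as np
        from scipy.ndimage import uniform_filter, zoom

        def box(img, r):
            # Mean over a (2r+1) x (2r+1) window.
            return uniform_filter(img, size=2 * r + 1, mode='nearest')

        def fast_guided_filter(I, p, r=8, eps=1e-3, s=4):
            # Compute the guided-filter coefficients on s-times subsampled images,
            # then upsample them and apply to the full-resolution guide I.
            I_sub, p_sub = zoom(I, 1.0 / s, order=1), zoom(p, 1.0 / s, order=1)
            r_sub = max(r // s, 1)
            mean_I, mean_p = box(I_sub, r_sub), box(p_sub, r_sub)
            var_I = box(I_sub * I_sub, r_sub) - mean_I ** 2
            cov_Ip = box(I_sub * p_sub, r_sub) - mean_I * mean_p
            a = cov_Ip / (var_I + eps)                 # per-pixel linear coefficients
            b = mean_p - a * mean_I
            mean_a, mean_b = box(a, r_sub), box(b, r_sub)
            scale = (I.shape[0] / mean_a.shape[0], I.shape[1] / mean_a.shape[1])
            return zoom(mean_a, scale, order=1) * I + zoom(mean_b, scale, order=1)

        img = np.random.rand(128, 128)
        out = fast_guided_filter(img, img)             # edge-aware smoothing, image as its own guide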

  13. Oriented Fiber Filter Media

    Directory of Open Access Journals (Sweden)

    R. Bharadwaj

    2008-06-01

    Full Text Available Coalescing filters are widely used throughout industry, and improved performance will reduce droplet emissions and operating costs. Experimental observations show that the orientation of micro fibers in filter media affects the permeability and the separation efficiency of the filter media. In this work two methods are used to align the fibers to alter the filter structure. The results show that axially aligned fiber media improve the quality factor on the order of 20%, and cutting media on an angle from a thick layered medium can improve performance by about 40%. The results also show the improved performance is not monotonically correlated to the average fiber angle of the medium.

  14. Sensory Pollution from Bag Filters, Carbon Filters and Combinations

    DEFF Research Database (Denmark)

    Bekö, Gabriel; Clausen, Geo

    2008-01-01

    Used ventilation filters are a major source of sensory pollutants in air handling systems. The objective of the present study was to evaluate the net effect that different combinations of filters had on perceived air quality after 5 months of continuous filtration of outdoor suburban air. A panel of 32 subjects assessed different sets of used filters and identical sets consisting of new filters. Additionally, filter weights and pressure drops were measured at the beginning and end of the operation period. The filter sets included single EU5 and EU7 fiberglass filters, an EU7 filter protected by an upstream pre-filter (changed monthly), an EU7 filter protected by an upstream activated carbon (AC) filter, and EU7 filters with an AC filter either downstream or both upstream and downstream. In addition, two types of stand-alone combination filters were evaluated: a bag-type fiberglass filter that contained AC and a synthetic fiber cartridge filter that contained AC. Air that had passed through used filters was most acceptable for those sets in which an AC filter was used downstream of the particle filter. Comparable air quality was achieved with the stand-alone bag filter that contained AC. Furthermore, its pressure drop changed very little during the 5 months of service, and it had the added benefit of removing a large fraction of ozone from the airstream. If similar results are obtained over a wider variety of soiling conditions, such filters may be a viable solution to a long recognized problem.

  15. FPGA Based Kalman Filter for Wireless Sensor Networks

    Directory of Open Access Journals (Sweden)

    Vikrant Vij

    2011-01-01

    Full Text Available A Wireless Sensor Network (WSN) is a set of tiny and low-cost devices equipped with different kinds of sensors, a small microcontroller and a radio transceiver, typically powered by batteries. Target tracking is one of the very important applications of such a network system. Traditionally, KF (Kalman filtering) and its derivatives are used for tracking of a random signal. The Kalman filter is a linear optimal filtering approach; to address the problem when system dynamics become nonlinear, researchers developed sub-optimal extensions of the Kalman filter, two popular versions being the EKF (extended Kalman filter) and the UKF (unscented Kalman filter). The rapidly increasing popularity of WSNs has placed increased computational demands upon these systems, which can be met by FPGA-based design. FPGAs offer increased performance compared to microprocessors and increased flexibility compared to ASICs, while maintaining low power consumption

  16. Fuzzy Based Median Filtering for Removal of Salt-and-Pepper Noise

    Directory of Open Access Journals (Sweden)

    Bhavana Deshpande

    2012-07-01

    Full Text Available This paper presents a filter for restoration of images that are highly corrupted by salt and pepper noise. By incorporating fuzzy logic after detecting and correcting the noisy pixel, the proposed filter is able to suppress noise and preserve details across a wide range of salt and pepper noise corruption, ranging from 1% to 60%. The proposed filter is tested on different images and is found to produce better results than the Traditional Median Filter.
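
    A plain detect-and-correct median sketch (without the fuzzy weighting of the proposed filter; the noise values 0 and 255 and the 3x3 window are assumptions) illustrates the basic idea of filtering only the suspected noisy pixels:

        import numpy as np

        def sp_median_filter(img, lo=0, hi=255):
            # Replace only pixels at the extreme values (assumed salt/pepper)
            # with the median of their 3x3 neighbourhood; other pixels are untouched.
            out = img.copy()
            padded = np.pad(img, 1, mode='edge')
            rows, cols = np.nonzero((img == lo) | (img == hi))
            for r, c in zip(rows, cols):
                out[r, c] = np.median(padded[r:r + 3, c:c + 3])
            return out

        rng = np.random.default_rng(1)
        clean = (rng.random((64, 64)) * 200 + 25).astype(np.uint8)
        noisy = clean.copy()
        mask = rng.random(clean.shape) < 0.10          # corrupt 10% of the pixels
        noisy[mask] = rng.choice([0, 255], size=int(mask.sum()))
        restored = sp_median_filter(noisy)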

  17. Location Estimation for an Autonomously Guided Vehicle using an Augmented Kalman Filter to Autocalibrate the Odometry

    DEFF Research Database (Denmark)

    Larsen, Thomas Dall; Bak, Martin; Andersen, Nils Axel; Ravn, Ole

    1998-01-01

    A Kalman filter using encoder readings as inputs and vision measurements as observations is designed as a location estimator for an autonomously guided vehicle (AGV). To reduce the effect of modelling errors an augmented filter that estimates the true system parameters is designed. The traditional way of reducing these errors is by fictitious noise injection in the filter model. The main problem with that approach however is that the filter does not learn about its bad model, it just puts more c...

  18. Low-power implementation of polyphase filters in Quadratic Residue Number System

    DEFF Research Database (Denmark)

    Cardarilli, Gian Carlo; Re, Andrea Del; Nannarelli, Alberto; Re, Marco

    2004-01-01

    The aim of this work is the reduction of the power dissipated in digital filters, while maintaining the timing unchanged. A polyphase filter bank in the Quadratic Residue Number System (QRNS) has been implemented and then compared, in terms of performance, area, and power dissipation to the implementation of a polyphase filter bank in the traditional two's complement system (TCS). The resulting implementations, designed to have the same clock rates, show that the QRNS filter is smaller and consu...

  19. Varactor-tuned Substrate Integrated Evanescent Filter

    DEFF Research Database (Denmark)

    Zhurbenko, Vitaliy; Acar, Öncel; Dong, Yunfeng

    Evanescent mode waveguides allow for more compact microwave component design in comparison to the traditional fundamental mode waveguide technology. Evanescent waveguides can be integrated into a dielectric substrate in order to further reduce the mass and volume. Unfortunately, traditional...... realization methods used in the standard evanescent waveguides are often not directly applicable to substrate integrated waveguide (SIW) technology due to dielectric filling and small height of the waveguide. In this work, one of the realization methods of evanescent waveguides using lumped elements is...... metal layer of the waveguide to the top layer with metalized via holes. The present filters are analyzed using models based on impedance matrix representation. The developed models allow computationally efficient and relatively accurate prediction of the filter behavior in a wide frequency range (at...

  20. Kalman filtering technique for reactivity measurement

    International Nuclear Information System (INIS)

    Measurement of reactivity and its on-line display is of great help in calibration of reactivity control and safety devices and in the planning of suitable actions during the reactor operation. In traditional approaches the reactivity is estimated from reactor period or by solving the inverse point kinetic equation. In this paper, an entirely new approach based on the Kalman filtering technique has been presented. The theory and design of the reactivity measuring instrument based on the approach has been explained. Its performance has been compared with traditional approaches by estimation of transient reactivity from flux variation data recorded in a research reactor. It is demonstrated that the Kalman filtering approach is superior to other methods from the viewpoints of accuracy, noise suppression, and robustness against uncertainties in the reactor parameters. (author). 1 fig
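
    For orientation only (standard inverse point kinetics in common notation, not the Kalman formulation of the paper): with neutron density n(t), precursor concentrations C_i(t), decay constants lambda_i, delayed fractions beta_i (total beta) and generation time Lambda, the reactivity follows from

        $$\rho(t) = \beta + \frac{\Lambda}{n(t)}\left(\frac{dn}{dt} - \sum_{i}\lambda_{i}C_{i}(t)\right), \qquad \frac{dC_{i}}{dt} = \frac{\beta_{i}}{\Lambda}\,n(t) - \lambda_{i}C_{i}(t);$$

    the Kalman filtering approach instead estimates the reactivity as a state within a noisy dynamic model, which is what gives it the reported noise suppression and robustness.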

  1. Dual filtered backprojection for micro-rotation confocal microscopy

    International Nuclear Information System (INIS)

    Micro-rotation confocal microscopy is a novel optical imaging technique which employs dielectric fields to trap and rotate individual cells to facilitate 3D fluorescence imaging using a confocal microscope. In contrast to computed tomography (CT) where an image can be modelled as parallel projection of an object, the ideal confocal image is recorded as a central slice of the object corresponding to the focal plane. In CT, the projection images and the 3D object are related by the Fourier slice theorem which states that the Fourier transform of a CT image is equal to the central slice of the Fourier transform of the 3D object. In the micro-rotation application, we have a dual form of this setting, i.e. the Fourier transform of the confocal image equals the parallel projection of the Fourier transform of the 3D object. Based on the observed duality, we present here the dual of the classical filtered back projection (FBP) algorithm and apply it in micro-rotation confocal imaging. Our experiments on real data demonstrate that the proposed method is a fast and reliable algorithm for the micro-rotation application, as FBP is for CT application
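
    A hedged restatement of the duality in simplified notation of our own (2D object for the classical case, 3D object for the confocal case): the classical Fourier slice theorem reads

        $$\mathcal{F}_{1}\{p_{\theta}\}(\omega) \;=\; \mathcal{F}_{2}\{f\}(\omega\cos\theta,\ \omega\sin\theta),$$

    i.e. the 1D Fourier transform of a projection is a central slice of the object's Fourier transform; in the micro-rotation setting the recorded image is itself a central slice of the (rotated) object, so its Fourier transform equals a parallel projection of the object's 3D Fourier transform. The roles of "slice" and "projection" are thus exchanged, which is what permits a dual form of FBP.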

  2. Spatial filters for focusing ultrasound images

    DEFF Research Database (Denmark)

    Jensen, Jørgen Arendt; Gori, Paola

    Traditionally, focusing is done by taking out one sample in the received signal from each transducer element and then summing these signals. This method does not take into account the temporal or spatial spread of the received signal from a point scatterer and does not make an optimal focus of the data...... filter for beamforming the received RF signals from the individual transducer elements. The matched filter is applied on RF signals from individual transducer elements, thus properly taking into account the spatial spread of the received signal. The method can be applied to any transducer and can also be...... of the autocovariance function of the image shows a -6 dB width reduction by a factor of 3.3 at 20 mm and by a factor of 1.8 at 30 mm. Other simulations use a 64-element, 3 MHz linear array. Different receiving conditions are compared and this shows that the effect of the filter is progressively...

  3. A Robust Gaussian Filter Corresponding to the Transmission Characteristic of the Gaussian Filter

    International Nuclear Information System (INIS)

    A surface roughness profile of an object can be measured by extracting a mean line of the long wavelength component from the primary profile and subtracting it from the primary profile. This mean line is usually computed by convolving the traditional Gaussian filter (GF) with the primary profile. However, if an outlier exists in the primary profile, the output of a Gaussian filter will be greatly affected by the outlier. To solve the outlier problem, several schemes of robust Gaussian filter have been proposed. However, these schemes have a serious problem: the mean line determined from measurement data containing no outliers does not agree with the mean line of the Gaussian filter output. To solve this problem, this paper proposes a new robust Gaussian filter based on a fast M-estimation method (FMGF), and the performance of the new robust Gaussian filter was experimentally clarified. As a result, if an outlier exists, the proposed method shows robust behaviour. If no outlier exists, the output wave pattern, RMSE and transmission characteristic agree with those of the Gaussian filter
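
    For reference (as standardized for profile filtering, e.g. in ISO 16610-21, and not taken from the abstract itself): the weighting function of the traditional Gaussian filter with cutoff wavelength lambda_c is

        $$s(x) = \frac{1}{\alpha\lambda_{c}}\exp\!\left[-\pi\left(\frac{x}{\alpha\lambda_{c}}\right)^{2}\right],\qquad \alpha = \sqrt{\frac{\ln 2}{\pi}} \approx 0.4697,$$

    and the mean line is the convolution of this weighting function with the primary profile; a single outlier therefore spreads into the mean line over a width of roughly lambda_c, which is the behaviour the robust variants are designed to suppress.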

  4. Invariant texture segmentation via circular gabor filter

    OpenAIRE

    Zhang, JianGuo; Tan, Tieniu

    2002-01-01

    In this paper, we focus on invariant texture segmentation, and propose a new method using circular Gabor filters (CGF) for rotation invariant texture segmentation. The traditional Gabor function is modified into a circular symmetric version. The rotation invariant texture features are achieved via the channel output of the CGF. A new scheme of the selection of Gabor parameters is also proposed for texture segmentation. Experiments show the efficacy of this method
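
    In the notation commonly used for this construction (our paraphrase, with sigma the Gaussian scale and F the central frequency), the circular Gabor filter replaces the oriented sinusoid of the traditional Gabor function by a radially symmetric one,

        $$G(x, y) = \frac{1}{2\pi\sigma^{2}}\exp\!\left(-\frac{x^{2}+y^{2}}{2\sigma^{2}}\right)\exp\!\left(2\pi i F\sqrt{x^{2}+y^{2}}\right),$$

    so the magnitude response, and hence the texture feature derived from it, is unchanged when the input image is rotated.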

  5. An Ontology- Content-based Filtering Method

    OpenAIRE

    Shoval, Peretz; Maidel, Veronica; Shapira, Bracha

    2008-01-01

    Traditional content-based filtering methods usually utilize text extraction and classification techniques for building user profiles as well as for representations of contents, i.e. item profiles. These methods have some disadvantages e.g. mismatch between user profile terms and item profile terms, leading to low performance. Some of the disadvantages can be overcome by incorporating a common ontology which enables representing both the users' and the items' profiles with con...

  6. Vena cava filter; Vena-cava-Filter

    Energy Technology Data Exchange (ETDEWEB)

    Helmberger, T. [Klinikum Bogenhausen, Institut fuer Diagnostische und Interventionelle Radiologie und Nuklearmedizin, Muenchen (Germany)

    2007-05-15

    Fulminant pulmonary embolism is one of the major causes of death in the Western World. In most cases, deep leg and pelvic venous thrombosis are the cause. If an anticoagulant/thrombotic therapy is no longer possible or ineffective, a vena cava filter implant may be indicated if an embolism is threatening. Implantation of the filter is a simple and safe intervention. Nevertheless, it is necessary to take into consideration that the data base for determining the indications for this treatment are very limited. Currently, a reduction in the risk of thromboembolism with the use of filters of about 30%, of recurrences of almost 5% and fatal pulmonary embolism of 1% has been reported, with a risk of up to 20% of filter induced vena cava thrombosis. (orig.) [German] Die fulminante Lungenembolie zaehlt zu den Haupttodesursachen in der westlichen Welt. In der Mehrzahl der Faelle sind tiefe Bein- und Beckenvenenthrombosen ursaechlich verantwortlich. Ist eine antikoagulative/-thrombotische Therapie nicht (mehr) moeglich oder unwirksam, kann bei drohender Emboliegefahr die Vena-cava-Filterimplantation indiziert sein. Die Filterimplantation ist eine einfache und sehr sichere Intervention. Dennoch muss bei der Indikationsstellung beruecksichtigt werden, dass die Datenlage zur Wirksamkeit sehr limitiert ist. So wird aktuell ueber eine Reduktion des Thrombembolierisikos um 30% bei Embolierezidiven von knapp 5% und fatalen Lungenembolien von 1% unter Filterprophylaxe berichtet, bei einem Risiko von bis zu 20% fuer die filterinduzierte Vena-cava-Thrombose. (orig.)

  7. Band-elimination filter

    Science.gov (United States)

    Shelton, G. B.

    1977-01-01

    A helical resonator is employed to produce a stable, highly selective filter. Other features of the filter include bandwidth control by cascading identical stages and stagger tuning, adjustable notch depth, good isolation between stages, gain set by proper choice of resistors, and elimination of spurious responses.

  8. Internet Filtering in China

    OpenAIRE

    Zittrain, Jonathan L.

    2003-01-01

    We collected data on the methods, scope, and depth of selective barriers to Internet usage through networks in China. Tests conducted from May through November 2002 indicated at least four distinct and independently operable Internet filtering methods - Web server IP address, DNS server IP address, keyword, and DNS redirection - with a quantifiable leap in filtering sophistication beginning in September 2002.

  9. Multilevel ensemble Kalman filtering

    OpenAIRE

    Hoel, HÃ¥kon; Law, Kody J. H.; Tempone, Raul

    2015-01-01

    This work embeds a multilevel Monte Carlo (MLMC) sampling strategy into the Monte Carlo step of the ensemble Kalman filter (ENKF), thereby yielding a multilevel ensemble Kalman filter (MLENKF) which has provably superior asymptotic cost to a given accuracy level. The theoretical results are illustrated numerically.

  10. Updating the OMERACT filter

    DEFF Research Database (Denmark)

    Kirwan, John R; Boers, Maarten; Hewlett, Sarah; Beaton, Dorcas; Bingham, Clifton O; Choy, Ernest; Conaghan, Philip G; D'Agostino, Maria-Antonietta; Dougados, Maxime; Furst, Daniel E; Guillemin, Francis; Gossec, Laure; van der Heijde, Désirée M; Kloppenburg, Margreet; Kvien, Tore K; Landewé, Robert B M; Mackie, Sarah L; Matteson, Eric L; Mease, Philip J; Merkel, Peter A; Østergaard, Mikkel; Saketkoo, Lesley Ann; Simon, Lee; Singh, Jasvinder A; Strand, Vibeke; Tugwell, Peter

    2014-01-01

    OBJECTIVE: The Outcome Measures in Rheumatology (OMERACT) Filter provides guidelines for the development and validation of outcome measures for use in clinical research. The "Truth" section of the OMERACT Filter presupposes an explicit framework for identifying the relevant core outcomes that are...

  11. Filter cake breaker systems

    Energy Technology Data Exchange (ETDEWEB)

    Garcia, Marcelo H.F. [Poland Quimica Ltda., Duque de Caxias, RJ (Brazil)

    2004-07-01

    Drilling fluid filter cakes are based on a combination of properly graded dispersed particles and polysaccharide polymers. High efficiency filter cakes are formed by this combination, and their formation on wellbore walls during the drilling process has, among other roles, the task of protecting the formation from instantaneous or accumulative invasion of drilling fluid filtrate, granting stability to the well and production zones. The filter cake minimizes contact between the drilling fluid filtrate and the water, hydrocarbons and clay existing in formations. The uniform removal of the filter cake from the entire interval is a critical factor of the completion process. The main methods used to break the filter cake are classified into two groups, external or internal, according to their removal mechanism. The aim of this work is the presentation of these mechanisms as well as their efficiency. (author)

  12. Sub-micron filter

    Science.gov (United States)

    Tepper, Frederick (Sanford, FL); Kaledin, Leonid (Port Orange, FL)

    2009-10-13

    Aluminum hydroxide fibers approximately 2 nanometers in diameter and with surface areas ranging from 200 to 650 m.sup.2/g have been found to be highly electropositive. When dispersed in water they are able to attach to and retain electronegative particles. When combined into a composite filter with other fibers or particles they can filter bacteria and nano size particulates such as viruses and colloidal particles at high flux through the filter. Such filters can be used for purification and sterilization of water, biological, medical and pharmaceutical fluids, and as a collector/concentrator for detection and assay of microbes and viruses. The alumina fibers are also capable of filtering sub-micron inorganic and metallic particles to produce ultra pure water. The fibers are suitable as a substrate for growth of cells. Macromolecules such as proteins may be separated from each other based on their electronegative charges.

  13. Use of changing filters

    International Nuclear Information System (INIS)

    The invention concerns a changing filter for accommodating bulk contact material, e.g. activated carbon, for cleaning gas or air flows. The filter consists of a square housing that accommodates a meandering filter bed surrounded by perforated sheets, with windings facing alternately towards the incoming and outgoing flow sides. The windings of the filter bed are flattened on the outgoing flow side, where the perforated sheet partitions towards the flattened part are open and can be closed by perforated sheet lids. Pressure plates with air-permeable rubber or plastic mats are laid on the filter bed below the perforated sheet lids and tensioned against the bed. The advantage is that, while allowing many applications, the danger of leaks at the mechanical parts of the closing lid is overcome. (orig./HP)

  14. Experimental validation of a single shaped filter approach for CT using variable source-to-filter distance for examination of arbitrary object diameters

    International Nuclear Information System (INIS)

    The purpose of this study was to validate the use of a single shaped filter (SF) for computed tomography (CT) using variable source-to-filter distance (SFD) for the examination of different object diameters. A SF was designed by performing simulations with the purpose of achieving noise homogeneity in the reconstructed volume and dose reduction for arbitrary phantom diameters. This was accomplished by using a filter design method whose target is to achieve a homogeneous detector noise, but which also uses a correction factor for the filtered back projection process. According to simulation results, a single SF designed for one of the largest phantom diameters meets the requirements for all diameters when SFD can be adjusted. To validate these results, a SF made of aluminium alloy was manufactured. Measurements were performed on a CT scanner with polymethyl methacrylate (PMMA) phantoms of diameters from 40–100 mm. The filter was positioned at SFDs ranging from 97–168 mm depending on the phantom diameter. Image quality was evaluated for the reconstructed volume by assessing CT value accuracy, noise homogeneity, contrast-to-noise ratio weighted by dose (CNRD) and spatial resolution. Furthermore, scatter distribution was determined with the use of a beam-stop phantom. Dose was measured for a PMMA phantom with a diameter of 100 mm using a calibrated ionization chamber. The application of a single SF at variable SFD led to improved noise uniformity and dose reduction: noise homogeneity was improved from 15% down to about 0%, and dose was reduced by about 37%. Furthermore, scatter dropped by about 32%, which led to reduced cupping artifacts and improved CT value accuracy. Spatial resolution and CNRD were not affected by the SF. By means of a single SF with variable SFD designed for CT, significant dose reduction can be achieved and image quality can be improved by reducing noise inhomogeneity as well as scatter-induced artifacts. (paper)

  15. Experimental validation of a single shaped filter approach for CT using variable source-to-filter distance for examination of arbitrary object diameters

    Science.gov (United States)

    Lück, Ferdinand; Kolditz, Daniel; Hupfer, Martin; Steiding, Christian; Kalender, Willi A.

    2014-10-01

    The purpose of this study was to validate the use of a single shaped filter (SF) for computed tomography (CT) using variable source-to-filter distance (SFD) for the examination of different object diameters. A SF was designed by performing simulations with the purpose of achieving noise homogeneity in the reconstructed volume and dose reduction for arbitrary phantom diameters. This was accomplished by using a filter design method whose target is to achieve a homogeneous detector noise, but which also uses a correction factor for the filtered back projection process. According to simulation results, a single SF designed for one of the largest phantom diameters meets the requirements for all diameters when SFD can be adjusted. To validate these results, a SF made of aluminium alloy was manufactured. Measurements were performed on a CT scanner with polymethyl methacrylate (PMMA) phantoms of diameters from 40-100 mm. The filter was positioned at SFDs ranging from 97-168 mm depending on the phantom diameter. Image quality was evaluated for the reconstructed volume by assessing CT value accuracy, noise homogeneity, contrast-to-noise ratio weighted by dose (CNRD) and spatial resolution. Furthermore, scatter distribution was determined with the use of a beam-stop phantom. Dose was measured for a PMMA phantom with a diameter of 100 mm using a calibrated ionization chamber. The application of a single SF at variable SFD led to improved noise uniformity and dose reduction: noise homogeneity was improved from 15% down to about 0%, and dose was reduced by about 37%. Furthermore, scatter dropped by about 32%, which led to reduced cupping artifacts and improved CT value accuracy. Spatial resolution and CNRD were not affected by the SF. By means of a single SF with variable SFD designed for CT, significant dose reduction can be achieved and image quality can be improved by reducing noise inhomogeneity as well as scatter-induced artifacts.

  16. Experimental validation of a single shaped filter approach for CT using variable source-to-filter distance for examination of arbitrary object diameters.

    Science.gov (United States)

    Lück, Ferdinand; Kolditz, Daniel; Hupfer, Martin; Steiding, Christian; Kalender, Willi A

    2014-10-01

    The purpose of this study was to validate the use of a single shaped filter (SF) for computed tomography (CT) using variable source-to-filter distance (SFD) for the examination of different object diameters. A SF was designed by performing simulations with the purpose of achieving noise homogeneity in the reconstructed volume and dose reduction for arbitrary phantom diameters. This was accomplished by using a filter design method whose target is to achieve a homogeneous detector noise, but which also uses a correction factor for the filtered back projection process. According to simulation results, a single SF designed for one of the largest phantom diameters meets the requirements for all diameters when SFD can be adjusted. To validate these results, a SF made of aluminium alloy was manufactured. Measurements were performed on a CT scanner with polymethyl methacrylate (PMMA) phantoms of diameters from 40-100 mm. The filter was positioned at SFDs ranging from 97-168 mm depending on the phantom diameter. Image quality was evaluated for the reconstructed volume by assessing CT value accuracy, noise homogeneity, contrast-to-noise ratio weighted by dose (CNRD) and spatial resolution. Furthermore, scatter distribution was determined with the use of a beam-stop phantom. Dose was measured for a PMMA phantom with a diameter of 100 mm using a calibrated ionization chamber. The application of a single SF at variable SFD led to improved noise uniformity and dose reduction: noise homogeneity was improved from 15% down to about 0%, and dose was reduced by about 37%. Furthermore, scatter dropped by about 32%, which led to reduced cupping artifacts and improved CT value accuracy. Spatial resolution and CNRD were not affected by the SF. By means of a single SF with variable SFD designed for CT, significant dose reduction can be achieved and image quality can be improved by reducing noise inhomogeneity as well as scatter-induced artifacts. PMID:25198916

  17. LCL Interface Filter Design for Shunt Active Power Filters

    OpenAIRE

    DOBRICEANU, M.; Marin, D; Popescu, M.; BITOLEANU, A.

    2010-01-01

    This paper is focused on finding the parameters of a second order interface filter connected between the power system and the shunt active filter based on switching frequency of the active filter. Many publications on power active filters include various design methods for the interface inductive filter which take into account the injected current and its dynamic. Compared to these ones, the approach presented in this paper is oriented toward the design of the interface filter starting fro...

  18. X-ray differential phase-contrast tomographic reconstruction with a phase line integral retrieval filter

    International Nuclear Information System (INIS)

    We report an alternative reconstruction technique for x-ray differential phase-contrast computed tomography (DPC-CT). This approach is based on a new phase line integral projection retrieval filter, which is rooted in the derivative property of the Fourier transform and counteracts the differential nature of the DPC-CT projections. It first retrieves the phase line integral from the DPC-CT projections. Then the standard filtered back-projection (FBP) algorithms popular in x-ray absorption-contrast CT are directly applied to the retrieved phase line integrals to reconstruct the DPC-CT images. Compared with the conventional DPC-CT reconstruction algorithms, the proposed method removes the Hilbert imaginary filter and allows for the direct use of absorption-contrast FBP algorithms. Consequently, FBP-oriented image processing techniques and reconstruction acceleration softwares that have already been successfully used in absorption-contrast CT can be directly adopted to improve the DPC-CT image quality and speed up the reconstruction
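
    A hedged sketch of the retrieval step in our own notation: if a DPC projection d_theta(x) is, up to a constant, the derivative of the phase line integral phi_theta(x), the derivative property of the Fourier transform gives

        $$\mathcal{F}\{d_{\theta}\}(\omega) = 2\pi i\,\omega\,\mathcal{F}\{\phi_{\theta}\}(\omega) \;\Longrightarrow\; \phi_{\theta}(x) = \mathcal{F}^{-1}\!\left[\frac{\mathcal{F}\{d_{\theta}\}(\omega)}{2\pi i\,\omega}\right](x),$$

    after which the retrieved phi_theta can be passed to a standard ramp-filtered back-projection, rather than back-projecting the differential data through a Hilbert-type imaginary filter.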

  19. LCL Interface Filter Design for Shunt Active Power Filters

    Directory of Open Access Journals (Sweden)

    DOBRICEANU, M.

    2010-08-01

    Full Text Available This paper is focused on finding the parameters of a second order interface filter connected between the power system and the shunt active filter based on switching frequency of the active filter. Many publications on power active filters include various design methods for the interface inductive filter which take into account the injected current and its dynamic. Compared to these ones, the approach presented in this paper is oriented toward the design of the interface filter starting from filter transfer functions by imposing the performances of the filter.

  20. Substrate Integrated Evanescent Filters Employing Coaxial Stubs

    DEFF Research Database (Denmark)

    Zhurbenko, Vitaliy

    2015-01-01

    Evanescent mode substrate integrated waveguide (SIW) is one of the promising technologies for design of light-weight low-cost microwave components. Traditional realization methods used in the standard evanescent waveguide technology are often not directly applicable to SIW due to dielectric filling and small height of the waveguide. In this work, one of the realization methods of evanescent mode waveguides using a single layer substrate is considered. The method is based on the use of coaxial stubs as capacitive susceptances externally connected to a SIW. A microwave filter based on these principles is designed, fabricated, and tested. The filter exhibits a transmission zero due to the implemented stubs. The problem of evanescent mode filter analysis is formulated in terms of conventional network concepts. This formulation is then used for modelling of the filters. Strategies to further miniaturization of the microwave filter are discussed. The approach is useful in applications where a sharp roll-off at the upper stop-band is required.

  1. Ceramic fiber reinforced filter

    Science.gov (United States)

    Stinton, David P. (Knoxville, TN); McLaughlin, Jerry C. (Oak Ridge, TN); Lowden, Richard A. (Powell, TN)

    1991-01-01

    A filter for removing particulate matter from high temperature flowing fluids, and in particular gases, that is reinforced with ceramic fibers. The filter has a ceramic base fiber material in the form of a fabric, felt, paper or the like, with the refractory fibers thereof coated with a thin layer of a protective and bonding refractory applied by chemical vapor deposition techniques. This coating causes each fiber to be physically joined to adjoining fibers so as to prevent movement of the fibers during use and to increase the strength and toughness of the composite filter. Further, the coating can be selected to minimize any reactions between the constituents of the fluids and the fibers. A description is given of the formation of a composite filter using a felt preform of commercial silicon carbide fibers together with the coating of these fibers with pure silicon carbide. Filter efficiency approaching 100% has been demonstrated with these filters. The fiber base material is alternatively made from aluminosilicate fibers, zirconia fibers and alumina fibers. Coating with Al2O3 is also described. Advanced configurations for the composite filter are suggested.

  2. Static Filtered Skin Detection

    Directory of Open Access Journals (Sweden)

    Rehanullah Khan

    2012-03-01

    Full Text Available A static skin filter defines explicitly (using a number of rules) the boundaries the skin cluster has in a color space. Single or multiple ranges of threshold values for each color space component are created, and the image pixel values falling within these range(s) for all the chosen color components are defined as skin pixels. In this paper, we investigate and evaluate static skin filters for skin segmentation. As a contribution, two new static skin filters for the IHLS and CIELAB color spaces are developed. The two new static filters and four state-of-the-art static filters in YCbCr, HSI, RGB and normalized RGB color spaces are evaluated on the two datasets DS1 and DS2, on the basis of F-measure. Experimental results reveal the feasibility of the developed static skin filters. We also found that since the static filters use static boundaries, any shift of skin color ranges from the static boundaries will result in varying performance. Therefore, the F-measure rankings of the color spaces are different for the datasets DS1 and DS2.
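
    As an illustration of what an explicit rule-based filter looks like, here is one widely cited static RGB rule (uniform daylight variant) in Python; it is not one of the IHLS or CIELAB filters developed in the paper:

        import numpy as np

        def rgb_skin_mask(img):
            # Boolean skin mask for an HxWx3 uint8 RGB image using fixed thresholds.
            r = img[..., 0].astype(int)
            g = img[..., 1].astype(int)
            b = img[..., 2].astype(int)
            spread = img.max(axis=-1).astype(int) - img.min(axis=-1).astype(int)
            return ((r > 95) & (g > 40) & (b > 20) & (spread > 15) &
                    (np.abs(r - g) > 15) & (r > g) & (r > b))

        img = np.random.randint(0, 256, (64, 64, 3), dtype=np.uint8)   # stand-in for a photograph
        mask = rgb_skin_mask(img)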

  3. Comparison Of Chest Filters

    Science.gov (United States)

    Schiller, G.; Olson, A.; Nguyen, H.

    1985-09-01

    Recently, anatomically shaped lead acrylic filters have been introduced for chest radiography. Two of these filters were compared to several (Al, Cu, Y, and Pb acrylic) uniform filters. Phototimed exposures at 80, 100, 120 and 140 kVp were made on a realistic chest phantom. The optical density in the lung field was kept constant for all filters and kVp's. Exposure time and entrance exposures to the mediastinum and lungs were measured. When compared to standard (3.2 mm Al HVL) aluminum filtration, a reduction of 44% - 60% in lung exposure and better visualization of the mediastinum and retrocardiac areas were noted. However, a significant (80 - 350%) increase in exposure time was required and mediastinum exposure increased by 40 - 100 percent. When using uniform filters, in addition to the standard aluminum filter, entrance exposures to the lung and mediastinum were reduced by 30 - 50% for Yttrium and Copper and by 27 to 35% for lead acrylic. Exposure times increased by up to 36%, 64%, and 52% respectively. When using spatially shaped filters, improved image quality and reduced lung exposure result; however, one must be aware of the significant increase in exposure time especially at low kVp (80). There is also an increase of mediastinum exposure and the possibility of positioning artifacts.

  4. Network synthesis and filter design

    International Nuclear Information System (INIS)

    This book deals with network synthesis and filter design. It contains twelve chapters, covering: historical background; network functions of typical types; Hurwitz polynomials and LC network functions; filter functions with transitional Butterworth-Chebyshev filters, step response and impulse response; frequency transformations such as frequency scaling, LP : HP, LP : BP and LP : BS transformations; basics of network synthesis; Butterworth filters; Chebyshev filters; sensitivity of circuitry; operational amplifiers and their applications; VCVS RC filters; higher-order filters; op-amp ladder network simulation; and switched capacitor filters. It places emphasis on the Butterworth and Chebyshev filters.
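
    Not from the book, but a minimal SciPy sketch contrasting the two classic approximations it covers (fourth order, normalized cutoff 0.2 and 1 dB ripple are arbitrary example values):

        import numpy as np
        from scipy import signal

        b_butter, a_butter = signal.butter(4, 0.2)     # Butterworth: maximally flat passband
        b_cheby, a_cheby = signal.cheby1(4, 1, 0.2)    # Chebyshev type I: 1 dB equiripple passband

        # Compare the low-frequency ends of the two magnitude responses.
        w, h_butter = signal.freqz(b_butter, a_butter)
        _, h_cheby = signal.freqz(b_cheby, a_cheby)
        print(np.abs(h_butter[:5]), np.abs(h_cheby[:5]))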

  5. Smoke and pollutant filtering device

    International Nuclear Information System (INIS)

    A smoke and pollutant filtering device comprising a mask having a filter composed of a series of contiguous, serial layers of filtering material. The filter consists of front and rear gas permeable covers, a first filter layer of pressed vegetable matter, a second filter layer comprising a layer of activated charcoal adjacent a layer of aqua filter floss, a third filter layer comprising a gas permeable cloth situated between layers of pressed vegetable matter, and a fourth filter layer comprising an aqua filter floss. The first through fourth filter layers are sandwiched between the front and rear gas permeable covers. The filtering device is stitched together and mounted within a fire-retardant hood shaped to fit over a human head. Elastic bands are included in the hood to maintain the hood snugly about the head when worn

  6. A New Approach for Cluster Based Collaborative Filters

    OpenAIRE

    R. Venu Babu,; K. Srinivas,; S. Anjali Devi

    2010-01-01

    In modern E-Commerce it is not easy for customers to find the best suitable goods of their interest as more and more information is placed on line (like movies, audios, books, documents etc...). So in order to provide most suitable information of high value to customers of an e-commerce business system, a customized recommender system is required. Collaborative Filtering has become a popular technique for reducing this information overload. While traditional collaborative filtering systems ha...

  7. A personalized web page content filtering model based on segmentation

    OpenAIRE

    K.S.Kuppusamy; AGHILA G

    2012-01-01

    In the view of the massive content explosion in the World Wide Web through diverse sources, it has become mandatory to have content filtering tools. The filtering of the contents of web pages holds greater significance in cases of access by minor-age people. Traditional web page blocking systems go by the Boolean methodology of either displaying the full page or blocking it completely. With the increased dynamism in web pages, it has become a common phenomenon that differe...

  8. A Contextual Item-Based Collaborative Filtering Technology

    OpenAIRE

    Pan Pan; Xueqing Tan

    2012-01-01

    This paper proposes a contextual item-based collaborative filtering technology, which builds on the traditional item-based collaborative filtering technology. In the recommendation process, the user's important mobile contextual information is taken into account, and the technology combines it with the ratings given on items by users whose historical contextual information is similar to the user's current context, in order to predict which items will be preferred by the user...
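
    For readers unfamiliar with the underlying technique, here is a minimal, non-contextual item-based collaborative-filtering sketch; cosine similarity over the rating matrix and a weighted average of the user's ratings on the most similar items are generic textbook choices, not details taken from this paper, and the contextual weighting the paper proposes is not reproduced.

```python
import numpy as np

def item_based_predict(ratings, user, item, k=2):
    """Predict ratings[user, item] from the user's ratings on the k items
    most similar to `item` (cosine similarity over the whole rating matrix).
    Zeros in `ratings` are treated as 'not rated'."""
    target = ratings[:, item]
    norms = np.linalg.norm(ratings, axis=0) * np.linalg.norm(target) + 1e-12
    sims = ratings.T @ target / norms          # similarity of every item to `item`
    sims[item] = -np.inf                       # exclude the item itself
    rated = np.where(ratings[user] > 0)[0]     # items this user has already rated
    neighbours = rated[np.argsort(sims[rated])[::-1][:k]]
    w = np.clip(sims[neighbours], 0, None)
    if w.sum() == 0:
        return float(ratings[user, rated].mean())   # fall back to the user's mean
    return float(np.dot(w, ratings[user, neighbours]) / w.sum())

# Toy example: 4 users x 5 items, 0 = unrated.
R = np.array([[5, 4, 0, 1, 0],
              [4, 5, 1, 0, 1],
              [1, 0, 5, 4, 5],
              [0, 1, 4, 5, 4]], dtype=float)
print(item_based_predict(R, user=0, item=2))
```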

  9. Various Topics on Angle-Only Tracking using Particle Filters

    OpenAIRE

    Karlsson, Rickard

    2002-01-01

    Angle-only tracking estimates range and range rate from measured angle information by maneuvering the observation platform to gain observability. Traditionally, linear or linearized models are used, where the uncertainty in the sensor and motion models is typically modeled by Gaussian densities. Hence, classical sub-optimal Bayesian methods based on linearized Kalman filters can be used. The sequential Monte Carlo method, or particle filter, provides an approximate solution to the non-linea...
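
    As background, a generic bootstrap (SIR) particle filter is sketched below; the scalar random-walk state and the quadratic measurement function are purely illustrative assumptions and are not the angle-only tracking models discussed in the thesis.

```python
import numpy as np

def bootstrap_particle_filter(measurements, n_particles=500,
                              process_std=0.5, meas_std=1.0):
    """Generic bootstrap (SIR) particle filter for an illustrative 1-D
    random-walk state observed through the nonlinearity y = x**2 / 20.
    Returns the posterior-mean state estimate for every measurement."""
    rng = np.random.default_rng(0)
    particles = rng.normal(0.0, 2.0, n_particles)        # samples from the prior
    estimates = []
    for y in measurements:
        # Propagate every particle through the (assumed) motion model.
        particles = particles + rng.normal(0.0, process_std, n_particles)
        # Weight each particle by the likelihood of the new measurement.
        weights = np.exp(-0.5 * ((y - particles**2 / 20.0) / meas_std) ** 2)
        weights = (weights + 1e-300) / (weights + 1e-300).sum()
        estimates.append(float(np.dot(weights, particles)))
        # Resample to concentrate particles in high-likelihood regions.
        particles = particles[rng.choice(n_particles, n_particles, p=weights)]
    return np.array(estimates)
```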

  10. An LLCL Power Filter for Single-Phase Grid-Tied Inverter

    DEFF Research Database (Denmark)

    Wu, Weimin; He, Yuanbin

    2012-01-01

    This paper presents a new topology of higher order power filter for grid-tied voltage-source inverters, named the LLCL filter, which inserts a small inductor in the branch loop of the capacitor in the traditional LCL filter to compose a series resonant circuit at the switching frequency. In particular, it can attenuate the switching-frequency current ripple components much better than an LCL filter, leading to a decrease in the total inductance and volume. Furthermore, by decreasing the inductance of the grid-side inductor, it raises the characteristic resonance frequency, which is beneficial to the inverter system control. The parameter design criteria of the proposed LLCL filter are also introduced. A comparative analysis and discussion of the traditional LCL filter and the proposed LLCL filter are presented and evaluated through experiments on a 1.8-kW single-phase grid-tied inverter prototype.
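
    The stated design rule, that the added inductor and the filter capacitor resonate in series at the switching frequency, reduces to a one-line calculation; the component values in the sketch below are assumptions for illustration, not the values of the 1.8-kW prototype.

```python
import math

f_sw = 10e3   # switching frequency in Hz (assumed value)
C_f = 4.7e-6  # filter capacitance in F (assumed value)

# Series-resonance condition: 2*pi*f_sw = 1 / sqrt(L_c * C_f)
L_c = 1.0 / ((2 * math.pi * f_sw) ** 2 * C_f)
print(f"Inductor in the capacitor branch: {L_c * 1e6:.1f} uH")
```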

  11. EMI filter design

    CERN Document Server

    Ozenbaugh, Richard Lee

    2011-01-01

    With today's electrical and electronics systems requiring increased levels of performance and reliability, the design of robust EMI filters plays a critical role in EMC compliance. Using a mix of practical methods and theoretical analysis, EMI Filter Design, Third Edition presents both a hands-on and academic approach to the design of EMI filters and the selection of component values. The design approaches covered include matrix methods using table data and the use of Fourier analysis, Laplace transforms, and transfer function realization of LC structures. This edition has been fully revised

  12. Static Filtered Skin Detection

    OpenAIRE

    Rehanullah Khan; Zeeshan Khan; Muhammad Aamir

    2012-01-01

    A static skin filter defines explicitly (using a number of rules) the boundaries the skin cluster has in a color space. Single or multiple ranges of threshold values for each color space component are created and the image pixel values falling within these range(s) for all the chosen color components are defined as skin pixels. In this paper, we investigate and evaluate static skin filters for skin segmentation. As a contribution, two new static skin filters for the IHLS and CIELAB color spac...

  13. Circuits and filters handbook

    CERN Document Server

    Chen, Wai-Kai

    2003-01-01

    A bestseller in its first edition, The Circuits and Filters Handbook has been thoroughly updated to provide the most current, most comprehensive information available in both the classical and emerging fields of circuits and filters, both analog and digital. This edition contains 29 new chapters, with significant additions in the areas of computer-aided design, circuit simulation, VLSI circuits, design automation, and active and digital filters. It will undoubtedly take its place as the engineer's first choice in looking for solutions to problems encountered in the design, analysis, and behavi

  14. Implementing Cepstral Filtering Technique using Gabor Filters

    OpenAIRE

    Sharma, Sheena

    2012-01-01

    The cepstral filtering technique is applied to an interlaced image, a pattern similar to that found in layer IV of the primate visual cortex. Unless the signals from the left and right eyes are presented simultaneously, the disparity cannot be detected; the technique therefore has great significance in the sphere of stereo vision. It involves computing the power spectrum, i.e., the squared magnitude of the Fast Fourier Transform (FFT), which is complicated and hardware-unfriendly. This paper shows the estimatio...

  15. A Reconfigurable FIR Filter System

    OpenAIRE

    Guosheng Xu

    2013-01-01

    A practical reconfigurable FIR filter system is introduced. The filter coefficients are calculated on a computer according to the filter specification, and the configured coefficients of the multistage FIR filter are downloaded to the chip; the filtering computation itself is completed by the FPGA. All this makes the computed RMS ("virtual") values of voltage and current more accurate and steady, so research on FIR filter algorithms is both the emphasis and the difficulty. Performance testing and application examples are given to ...
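
    As a generic illustration of the workflow described above (coefficients computed offline, then applied to the sampled signal), the sketch below designs a low-pass FIR filter and applies it in software; the sample rate, cutoff and tap count are assumptions, and the FPGA implementation itself is naturally not shown.

```python
import numpy as np
from scipy.signal import firwin, lfilter

fs = 10_000                                   # sample rate in Hz (assumed)
taps = firwin(numtaps=63, cutoff=400, fs=fs)  # low-pass design done on the host

t = np.arange(0, 0.1, 1 / fs)
signal = np.sin(2 * np.pi * 50 * t) + 0.5 * np.sin(2 * np.pi * 2_000 * t)
filtered = lfilter(taps, 1.0, signal)         # the convolution an FPGA would perform

# RMS of the filtered signal, with the 2 kHz interference largely removed.
print(np.sqrt(np.mean(filtered**2)))
```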

  16. Derivative free filtering using Kalmtool

    DEFF Research Database (Denmark)

    Bayramoglu, Enis; Hansen, Søren; Ravn, Ole; Poulsen, Niels Kjølstad

    2010-01-01

    In this paper we present a toolbox enabling easy evaluation and comparison of different filtering algorithms. The toolbox is called Kalmtool 4 and is a set of MATLAB tools for state estimation of nonlinear systems. The toolbox contains functions for extended Kalman filtering as well as for DD1 filter and the DD2 filter. It also contains functions for Unscented Kalman filters as well as several versions of particle filters. The toolbox requires MATLAB version 7, but no additional toolboxes are re...

  17. Derivative free filtering using Kalmtool

    DEFF Research Database (Denmark)

    Bayramoglu, Enis (Technical University of Denmark)

    2010-01-01

    In this paper we present a toolbox enabling easy evaluation and comparison of different filtering algorithms. The toolbox is called Kalmtool 4 and is a set of MATLAB tools for state estimation of nonlinear systems. The toolbox contains functions for extended Kalman filtering as well as for DD1 filter and the DD2 filter. It also contains functions for Unscented Kalman filters as well as several versions of particle filters. The toolbox requires MATLAB version 7, but no additional toolboxes are required.

  18. Cleaning an air filter

    International Nuclear Information System (INIS)

    A hydrophobic HEPA air filter partially blocked by small ferric oxide and iron particles, such as arise from the plasma arc cutting of iron or steel, is unblocked by subjecting the filter to a flow of a vapour (which term includes droplets), atomised liquid or solution, aerosol or the like, of such concentration and duration as not to block the filter but to dissolve the oxide, which is redistributed to leave gas-conducting voids when the liquid subsequently dries out. The vapour may be a cold water fog produced by solid carbon dioxide and hot water, or an acid or organic solvent may be used. The unblocked filter may be coated with silane-coated glass spheres to facilitate subsequent filtration. A particular application involves plasma cutting of equipment contaminated with plutonium. (author)

  19. Cryogenic coaxial microwave filters

    CERN Document Server

    Tancredi, G; Meeson, P J

    2014-01-01

    At millikelvin temperatures the careful filtering of electromagnetic radiation, especially in the microwave regime, is critical for controlling the electromagnetic environment for experiments in fields such as solid-state quantum information processing and quantum metrology. We present a design for a filter consisting of small diameter dissipative coaxial cables that is straightforward to construct and provides a quantitatively predictable attenuation spectrum. We describe the fabrication process and demonstrate that the performance of the filters is in good agreement with theoretical modelling. We further perform an indicative test of the performance of the filters by making current-voltage measurements of small, underdamped Josephson Junctions at 15 mK and we present the results.

  20. Adaptive digital filters

    CERN Document Server

    Kovačević, Branko; Milosavljević, Milan

    2013-01-01

    “Adaptive Digital Filters” presents an important discipline applied to the domain of speech processing. The book first makes the reader acquainted with the basic terms of filtering and adaptive filtering, before introducing the field of advanced modern algorithms, some of which are contributed by the authors themselves. Working in the field of adaptive signal processing requires the use of complex mathematical tools. The book offers a detailed presentation of the mathematical models that is clear and consistent, an approach that allows everyone with a college level of mathematics knowledge to successfully follow the mathematical derivations and descriptions of algorithms.   The algorithms are presented in flow charts, which facilitates their practical implementation. The book presents many experimental results and treats the aspects of practical application of adaptive filtering in real systems, making it a valuable resource for both undergraduate and graduate students, and for all others interested in m...

  1. Paul Rodgers' filter in Kohila

    Index Scriptorium Estoniae

    2000-01-01

    On 28 January, a site-specific sculpture and the performance "Filter" took place at Kohila secondary school. The undertaking, marking the school's 130th anniversary, was headed by sculptor Paul Rodgers and two final-year students, Marko Heinmäe and Hendrik Karm.

  2. Bloofi: Multidimensional Bloom Filters

    OpenAIRE

    Crainiceanu, Adina; Lemire, Daniel

    2015-01-01

    Bloom filters are probabilistic data structures commonly used for approximate membership problems in many areas of Computer Science (networking, distributed systems, databases, etc.). With the increase in data size and distribution of data, problems arise where a large number of Bloom filters are available, and all of them need to be searched for potential matches. As an example, in a federated cloud environment, each cloud provider could encode the information using Bloom filt...
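
    A minimal single Bloom filter (not the multidimensional Bloofi structure proposed in the paper) is sketched below for readers unfamiliar with the data structure; the bit-array size and the salted-hash scheme are illustrative assumptions.

```python
import hashlib

class BloomFilter:
    """Tiny Bloom filter: k hash functions derived from SHA-256 with different
    salts, m bits of storage. May return false positives, never false negatives."""
    def __init__(self, m=1024, k=3):
        self.m, self.k = m, k
        self.bits = bytearray(m)          # one byte per bit, for simplicity

    def _positions(self, item):
        for i in range(self.k):
            digest = hashlib.sha256(f"{i}:{item}".encode()).hexdigest()
            yield int(digest, 16) % self.m

    def add(self, item):
        for pos in self._positions(item):
            self.bits[pos] = 1

    def __contains__(self, item):
        return all(self.bits[pos] for pos in self._positions(item))

bf = BloomFilter()
bf.add("cloud-provider-42")
print("cloud-provider-42" in bf, "cloud-provider-99" in bf)
```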

  3. NIRCam filter wheels

    Science.gov (United States)

    McCully, Sean; Schermerhorn, Michael; Thatcher, John

    2005-08-01

    The NIRCam instrument will provide near-infrared imaging capabilities for the James Webb Space Telescope. In addition, this instrument contains the wavefront-sensing elements necessary for optimizing the performance of the primary mirror. Several of these wavefront-sensing elements will reside in the NIRCam Filter Wheel Assembly. The instrument and its complement of mechanisms and optics will operate at a cryogenic temperature of 35K. This paper describes the design of the NIRCam Filter Wheel Assembly.

  4. Concurrent filtering and smoothing

    OpenAIRE

    Kaess, Michael; Williams, Stephen; Indelman, Vadim; Roberts, Richard; Leonard, John Joseph; Dellaert, Frank

    2012-01-01

    This paper presents a novel algorithm for integrating real-time filtering of navigation data with full map/trajectory smoothing. Unlike conventional mapping strategies, the result of loop closures within the smoother serves to correct the real-time navigation solution in addition to the map. This solution views filtering and smoothing as different operations applied within a single graphical model known as a Bayes tree. By maintaining all information within a single graph, the optimal linear e...

  5. Updating the OMERACT filter

    DEFF Research Database (Denmark)

    Kirwan, John R; Boers, Maarten; Hewlett, Sarah; Beaton, Dorcas; Bingham, Clifton O; Choy, Ernest; Conaghan, Philip G; D'Agostino, Maria-Antonietta; Dougados, Maxime; Furst, Daniel E; Guillemin, Francis; Gossec, Laure; van der Heijde, Désirée M; Kloppenburg, Margreet; Kvien, Tore K; Landewé, Robert B M; Mackie, Sarah L; Matteson, Eric L; Mease, Philip J; Merkel, Peter A; Østergaard, Mikkel; Saketkoo, Lesley Ann; Simon, Lee; Singh, Jasvinder A; Strand, Vibeke; Tugwell, Peter

    2014-01-01

    OBJECTIVE: The Outcome Measures in Rheumatology (OMERACT) Filter provides guidelines for the development and validation of outcome measures for use in clinical research. The "Truth" section of the OMERACT Filter presupposes an explicit framework for identifying the relevant core outcomes that are universal to all studies of the effects of interventions. There is no published outline for instrument choice or development that is aimed at measuring outcome, was derived from broad consensus o...

  6. Updating the OMERACT filter

    DEFF Research Database (Denmark)

    Tugwell, Peter; Boers, Maarten; D'Agostino, Maria-Antonietta; Beaton, Dorcas; Boonen, Annelies; Bingham, Clifton O; Choy, Ernest; Conaghan, Philip G; Dougados, Maxime; Duarte, Catia; Furst, Daniel E; Guillemin, Francis; Gossec, Laure; Heiberg, Turid; van der Heijde, Désirée M; Hewlett, Sarah; Kirwan, John R; Kvien, Tore K; Landewé, Robert B; Mease, Philip J; Østergaard, Mikkel; Simon, Lee; Singh, Jasvinder A; Strand, Vibeke; Wells, George

    2014-01-01

    OBJECTIVE: The Outcome Measures in Rheumatology (OMERACT) Filter provides guidelines for the development and validation of outcome measures for use in clinical research. The "Truth" section of the OMERACT Filter requires that criteria be met to demonstrate that the outcome instrument meets the criteria for content, face, and construct validity. METHODS: Discussion groups critically reviewed a variety of ways in which case studies of current OMERACT Working Groups complied with the Truth componen...

  7. Kalman Filtering in R

    OpenAIRE

    Fernando Tusell

    2011-01-01

    Support in R for state space estimation via Kalman filtering was limited to one package, until fairly recently. In the last five years, the situation has changed with no less than four additional packages offering general implementations of the Kalman filter, including in some cases smoothing, simulation smoothing and other functionality. This paper reviews some of the offerings in R to help the prospective user to make an informed choice.

  8. Spatial filter issues

    International Nuclear Information System (INIS)

    Experiments and calculations indicate that the threshold pressure in spatial filters for distortion of a transmitted pulse scales approximately as I^0.2 and (F#)^2 over the intensity range from 10^14 to 2x10^15 W/cm^2. We also demonstrated an interferometric diagnostic that will be used to measure the scaling relationships governing pinhole closure in spatial filters

  9. Multilevel Mixture Kalman Filter

    OpenAIRE

    Xiaodong Wang; Dong Guo; Rong Chen

    2004-01-01

    The mixture Kalman filter is a general sequential Monte Carlo technique for conditional linear dynamic systems. It generates samples of some indicator variables recursively based on sequential importance sampling (SIS) and integrates out the linear and Gaussian state variables conditioned on these indicators. Due to the marginalization process, the complexity of the mixture Kalman filter is quite high if the dimension of the indicator sampling space is high. In this paper, we address this di...

  10. Ceramic filters for bulk inoculation of nickel alloy castings

    Directory of Open Access Journals (Sweden)

    F. Binczyk

    2011-07-01

    Full Text Available The work includes the results of research on the production technology of ceramic filters which, besides the traditional filtering function, play also the role of an inoculant modifying the macrostructure of cast nickel alloys. To play this additional role, filters should demonstrate sufficient compression strength and ensure a proper flow rate of liquid alloy. The role of an inoculant is played by cobalt aluminate introduced to the composition of the external coating in an amount from 5 to 10 wt.%. The required compression strength (over 1 MPa) is provided by the supporting layers, deposited on the preform, which is a polyurethane foam. Based on a two-level fractional factorial experiment 2^(4-1), the significance of the impact of various technological parameters (independent variables) on selected functional parameters of the ready filters was determined. An important effect of the number of supporting layers and of the sintering temperature of the filters after evaporation of the polyurethane foam was stated.

  11. Distortion Parameters Analysis Method Based on Improved Filtering Algorithm

    Directory of Open Access Journals (Sweden)

    ZHANG Shutuan

    2013-10-01

    Full Text Available In order to realize accurate testing of the distortion parameters of an aircraft power supply system, and to satisfy the requirements of the corresponding equipment in the aircraft, a novel power parameters test system based on an improved filtering algorithm is introduced in this paper. The hardware of the test system provides portable, high-speed data acquisition and processing, while the software uses Labwindows/CVI as the development environment and adopts a pre-processing technique together with the improved filtering algorithm. Compared with the traditional filtering algorithm, the test system adopting the improved filtering algorithm increases the test accuracy. The application shows that the test system with the improved filtering algorithm can realize accurate test results and reach the design requirements.

  12. Traditional West Coast Native Medicine

    Science.gov (United States)

    Deagle, George

    1988-01-01

    An important part of the complex culture of the Native people of Canada's Pacific coast is the traditional system of medicine each culture has developed. Population loss from epidemics and the influence of dominant European cultures has resulted in loss of many aspects of traditional medicine. Although some Native practices are potentially hazardous, continuation of traditional approaches to illness remains an important part of health care for many Native people. The use of “devil's club” plant by the Haida people illustrates that Native medicine has both spiritual and physical properties. Modern family practice shares many important foundations with traditional healing systems. PMID:21253031

  13. Miniaturized superconducting microwave filters

    International Nuclear Information System (INIS)

    In this paper we present methods for the miniaturization of superconducting filters. We consider two designs of seventh-order bandpass Chebyshev filters based on lumped elements and a novel quasi-lumped element resonator. In both designs the area of the filters, with a central frequency of 2-5 GHz, is less than 1.2 mm2. Such small filters can be readily integrated on a single board for multi-channel microwave control of superconducting qubits. The filters have been experimentally tested and the results are compared with simulations. The miniaturization resulted in parasitic coupling between resonators and within each resonator that affected primarily the stopband and increased the bandwidth. The severity of the error depends on the particular design, and was less severe when a groundplane was used under the inductances of the resonators. The best performance was reached for the quasi-lumped filter with a central frequency of 4.45 GHz, a quality factor of 40 and a 50 dB stopband

  14. NICMOS Filter Wheel Test

    Science.gov (United States)

    Wheeler, Thomas

    2009-07-01

    This is an engineering test {described in SMOV4 Activity Description NICMOS-04} to verify the aliveness, functionality, operability, and electro-mechanical calibration of the NICMOS filter wheel motors and assembly after NCS restart in SMOV4. This test has been designed to obviate concerns over possible deformation or breakage of the filter wheel "soda-straw" shafts due to excess rotational drag torque and/or bending moments which may be imparted due to changes in the dewar metrology from warm-up/cool-down. This test should be executed after the NCS {and filter wheel housing} has reached and approximately equilibrated to its nominal operating temperature. Addition of visits G0 - G9 {9/9/09}: Ten visits copied from proposal 11868 {visits 20, 30, ..., 90, A0, B0}. Each visit moves two filter positions, takes lamp ON/OFF exposures and then moves back to the blank position. Visits G0, G1 and G2 will leave the filter wheels disabled. The remaining visits will leave the filter wheels enabled. There are sufficient in-between times to allow for data download and analysis. In case a problem is encountered, the filter wheels will be disabled through a real-time command. The in-between times are all set to 22-50 hours. It is preferable to have as short an in-between time as possible.

  15. Containment venting filter designs incorporating stainless-steel fiber filters

    International Nuclear Information System (INIS)

    Failure of the containment of a PWR as a consequence of a major reactor accident can be prevented by filtered containment venting into the stack through an accident filter system. This greatly reduces the environmental contamination by fission products, which otherwise would be released. Possible filter concepts and their embodiment in stainless-steel fiber filters are described. (orig./HP)

  16. Filtering-efficiency measurement of Chinese-made filter 1 by double-filter method

    International Nuclear Information System (INIS)

    Filtering efficiency of the Chinese-made filter 1 has been measured by the double-filter method with only one set of measurement equipment. The activity counts of the two filters are measured in turn every 3 minutes, from which the decay constants are extracted to be 0.0131 min-1 and 0.0129 min-1 for the first and second filter respectively. After correcting the counts of the second filter, the filtering efficiency of the filter is obtained to be (87.4 ± 0.7)% on average

  17. Sensory Pollution from Bag Filters, Carbon Filters and Combinations

    DEFF Research Database (Denmark)

    Bekö, Gabriel; Clausen, Geo; Weschler, Charles J.

    2008-01-01

    Used ventilation filters are a major source of sensory pollutants in air handling systems. The objective of the present study was to evaluate the net effect that different combinations of filters had on perceived air quality after 5 months of continuous filtration of outdoor suburban air. A panel of 32 subjects assessed different sets of used filters and identical sets consisting of new filters. Additionally, filter weights and pressure drops were measured at the beginning and end of the operati...

  18. Optimal filtering and filter stability of linear stochastic delay systems

    Science.gov (United States)

    Kwong, R. H.-S.; Willsky, A. S.

    1977-01-01

    Optimal filtering equations are obtained for very general linear stochastic delay systems. Stability of the optimal filter is studied in the case where there are no delays in the observations. Using the duality between linear filtering and control, asymptotic stability of the optimal filter is proved. Finally, the cascade of the optimal filter and the deterministic optimal quadratic control system is shown to be asymptotically stable as well.

  19. Filter and method of fabricating

    Science.gov (United States)

    Janney, Mark A.

    2006-02-14

    A method of making a filter includes the steps of: providing a substrate having a porous surface; applying to the porous surface a coating of dry powder comprising particles to form a filter preform; and heating the filter preform to bind the substrate and the particles together to form a filter.

  20. Kalman Filtering for Manufacturing Processes

    OpenAIRE

    Oakes, Thomas; Tang, Lie; Robert G. Landers; Balakrishnan, S.N.

    2009-01-01

    This chapter presented a methodology, based on stochastic process modeling and Kalman filtering, to filter manufacturing process measurements, which are known to be inherently noisy. Via simulation studies, the methodology was compared to low pass and Butterworth filters. The methodology was applied in a Friction Stir Welding (FSW) process to filter data
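
    As a generic illustration of the idea, a scalar Kalman filter applied to a noisy, roughly constant signal is sketched below; the random-walk state model and the noise variances are assumptions, not the FSW process model used in the chapter.

```python
import numpy as np

def scalar_kalman(measurements, q=1e-3, r=0.5, x0=0.0, p0=1.0):
    """Filter a noisy scalar signal assuming a random-walk state model
    x_k = x_{k-1} + w (variance q) observed as y_k = x_k + v (variance r)."""
    x, p = x0, p0
    out = []
    for y in measurements:
        p = p + q             # predict: state persists, uncertainty grows
        k = p / (p + r)       # Kalman gain
        x = x + k * (y - x)   # update with the innovation
        p = (1 - k) * p
        out.append(x)
    return np.array(out)

noisy = 2.0 + np.random.default_rng(1).normal(0, 0.7, 200)
print(scalar_kalman(noisy)[-1])   # settles near the true value 2.0
```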

  1. Choosing and using astronomical filters

    CERN Document Server

    Griffiths, Martin

    2014-01-01

    As a casual read through any of the major amateur astronomical magazines will demonstrate, there are filters available for all aspects of optical astronomy. This book provides a ready resource on the use of the following filters, among others, for observational astronomy or for imaging: light pollution filters, planetary filters, solar filters, neutral density filters for Moon observation, and deep-sky filters for such objects as galaxies and nebulae. Deep-sky objects can be imaged in much greater detail than was possible many years ago. Amateur astronomers can take

  2. Asymmetric Baxter-King filter

    OpenAIRE

    Buss, Ginters

    2011-01-01

    The paper proposes an extension of the symmetric Baxter-King band pass filter to an asymmetric Baxter-King filter. The optimal correction scheme of the ideal filter weights is the same as in the symmetric version, i.e., cut the ideal filter at the appropriate length and add a constant to all filter weights to ensure zero weight on zero frequency. Since the symmetric Baxter-King filter is unable to extract the desired signal at the very ends of the series, the extension to an asymmetric filter...
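
    The weight construction referred to above can be written down directly. The sketch below computes the symmetric Baxter-King weights for the common 6-32 period business-cycle band with truncation K = 12; the constant correction that forces zero weight at frequency zero is the step the asymmetric extension also relies on. The band and truncation length are conventional textbook choices, not values taken from the paper.

```python
import numpy as np

def baxter_king_weights(low_period=6, high_period=32, K=12):
    """Symmetric Baxter-King band-pass weights b_{-K..K}: the ideal band-pass
    weights truncated at +/-K and shifted by a constant so that they sum to
    zero (zero gain at frequency zero)."""
    w1, w2 = 2 * np.pi / high_period, 2 * np.pi / low_period
    j = np.arange(1, K + 1)
    b = np.empty(2 * K + 1)
    b[K] = (w2 - w1) / np.pi                             # central weight b_0
    b[K + 1:] = (np.sin(w2 * j) - np.sin(w1 * j)) / (np.pi * j)
    b[:K] = b[K + 1:][::-1]                              # symmetry b_{-j} = b_j
    b -= b.mean()                                        # constant correction
    return b

w = baxter_king_weights()
print(w.sum())   # ~0: the filter removes the zero-frequency (trend) component
```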

  3. A new algorithm of inter-frame filtering in IR image based on threshold value

    Science.gov (United States)

    Liu, Wei; Leng, Hanbing; Chen, Weining; Yang, Hongtao; Xie, Qingsheng; Yi, Bo; Zhang, Haifeng

    2013-09-01

    This paper proposes a new threshold-based inter-frame filtering algorithm for IR images, intended to overcome the image blur and smear introduced by traditional inter-frame filtering. The causes of blur and smear are first identified by analyzing the general and dynamic inter-frame filtering algorithms, and a new kind of time-domain filter is then proposed. To obtain the filter coefficients, the difference image between the present and previous frames is computed, and a noise threshold is derived from the difference image by probability analysis. The relationship between the difference image and the threshold yields the filter coefficients. Finally, inter-frame filtering is applied to the pixels corrupted by noise. Experimental results show that the algorithm successfully suppresses IR image blur and smear; the NETD measured with the traditional inter-frame filtering algorithm and with the new algorithm is 78 mK and 70 mK respectively, indicating better noise reduction than traditional methods. The algorithm applies not only to still images but also to moving imagery. As a new algorithm of great practical value, it is easy to implement on an FPGA, has excellent real-time performance, and effectively extends the application scope of time-domain filtering algorithms.
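
    The core recursion, blending the current and previous frames only where the inter-frame difference stays below a noise threshold and passing moving pixels through unchanged, can be sketched as follows; the blending weight and the simple way the threshold is derived from the difference-image statistics are assumptions, not the probability analysis of the paper.

```python
import numpy as np

def interframe_filter(prev_frame, curr_frame, alpha=0.5, k=3.0):
    """Temporal filtering gated by a per-frame noise threshold: pixels whose
    |difference| is below the threshold are averaged with the previous frame,
    larger differences (motion) are left untouched."""
    diff = curr_frame.astype(np.float32) - prev_frame.astype(np.float32)
    threshold = k * np.std(diff)              # simplified noise threshold
    static = np.abs(diff) < threshold
    out = curr_frame.astype(np.float32)
    out[static] = alpha * prev_frame[static] + (1 - alpha) * curr_frame[static]
    return out
```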

  4. An IIR median hybrid filter

    Science.gov (United States)

    Bauer, Peter H.; Sartori, Michael A.; Bryden, Timothy M.

    1992-01-01

    A new class of nonlinear filters, the so-called class of multidirectional infinite impulse response median hybrid filters, is presented and analyzed. The input signal is processed twice using a linear shift-invariant infinite impulse response filtering module: once with normal causality and a second time with inverted causality. The final output of the MIMH filter is the median of the two-directional outputs and the original input signal. Thus, the MIMH filter is a concatenation of linear filtering and nonlinear filtering (a median filtering module). Because of this unique scheme, the MIMH filter possesses many desirable properties which are both proven and analyzed (including impulse removal, step preservation, and noise suppression). A comparison to other existing median type filters is also provided.
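
    The structure described above, the same IIR module run with normal and with inverted causality and the sample-wise median of the two directional outputs and the input taken as the final output, can be sketched directly; the first-order low-pass section used here is an illustrative choice, not the module analyzed in the paper.

```python
import numpy as np
from scipy.signal import lfilter

def mimh_filter(x, a=0.8):
    """Median hybrid of a forward and a time-reversed first-order IIR low-pass
    y[n] = (1-a)*x[n] + a*y[n-1], together with the original input signal."""
    b, den = [1 - a], [1, -a]
    forward = lfilter(b, den, x)                  # normal causality
    backward = lfilter(b, den, x[::-1])[::-1]     # inverted causality
    return np.median(np.vstack([forward, x, backward]), axis=0)

x = np.zeros(50); x[25:] = 1.0; x[10] = 5.0       # step plus an impulse
print(mimh_filter(x)[10], mimh_filter(x)[30])     # impulse suppressed, step kept
```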

  5. Multilevel Mixture Kalman Filter

    Directory of Open Access Journals (Sweden)

    Xiaodong Wang

    2004-11-01

    Full Text Available The mixture Kalman filter is a general sequential Monte Carlo technique for conditional linear dynamic systems. It generates samples of some indicator variables recursively based on sequential importance sampling (SIS) and integrates out the linear and Gaussian state variables conditioned on these indicators. Due to the marginalization process, the complexity of the mixture Kalman filter is quite high if the dimension of the indicator sampling space is high. In this paper, we address this difficulty by developing a new Monte Carlo sampling scheme, namely, the multilevel mixture Kalman filter. The basic idea is to make use of the multilevel or hierarchical structure of the space from which the indicator variables take values. That is, we draw samples in a multilevel fashion, beginning with sampling from the highest-level sampling space and then drawing samples from the associated subspace of the newly drawn samples in a lower-level sampling space, until reaching the desired sampling space. Such a multilevel sampling scheme can be used in conjunction with the delayed estimation method, such as the delayed-sample method, resulting in a delayed multilevel mixture Kalman filter. Examples in wireless communication, specifically coherent and noncoherent 16-QAM over flat-fading channels, are provided to demonstrate the performance of the proposed multilevel mixture Kalman filter.

  6. Improved Passive-Damped LCL Filter to Enhance Stability in Grid-Connected Voltage-Source Converters

    DEFF Research Database (Denmark)

    Beres, Remus Narcis; Wang, Xiongfei

    2015-01-01

    This paper proposes an improved passive-damped LCL filter to be used as an interface between grid-connected voltage-source converters and the utility grid. The proposed filter replaces the LCL filter capacitor with a traditional C-type filter, with the resonant circuit tuned in such a way that the switching harmonics due to pulse width modulation are cancelled. Since the tuned circuit of the C-type filter suppresses the switching harmonics more effectively, the total inductance of the filter can be reduced. Additionally, the rating of the damping resistor is lower compared with the conventional passive-damped LCL filter. To verify the benefits of the proposed filter, a comparison with the conventional filter is made in terms of losses and ratings when both filters are designed under the same conditions.

  7. NOTCH FILTER USING SIMULATED INDUCTOR

    OpenAIRE

    D.SUSAN,; Dr.S.JAYALALITHA

    2011-01-01

    The design of analog filters at low frequencies is not possible because the size of the inductors becomes very large. In such cases, simulated inductors using operational amplifiers are used. This paper deals with the implementation of a notch filter using a band-pass filter built around a simulated inductor, since direct implementation of the notch filter with a simulated inductor is not possible because of the floating inductor. The design of the notch filter and its simulation in PSPICE are presented.

  8. DOE HEPA filter test program

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1998-05-01

    This standard establishes essential elements of a Department of Energy (DOE) program for testing HEPA filters to be installed in DOE nuclear facilities or used in DOE-contracted activities. A key element is the testing of HEPA filters for performance at a DOE Filter Test Facility (FTF) prior to installation. Other key elements are (1) providing for a DOE HEPA filter procurement program, and (2) verifying that HEPA filters to be installed in nuclear facilities appear on a Qualified Products List (QPL).

  9. DOE HEPA filter test program

    International Nuclear Information System (INIS)

    This standard establishes essential elements of a Department of Energy (DOE) program for testing HEPA filters to be installed in DOE nuclear facilities or used in DOE-contracted activities. A key element is the testing of HEPA filters for performance at a DOE Filter Test Facility (FTF) prior to installation. Other key elements are (1) providing for a DOE HEPA filter procurement program, and (2) verifying that HEPA filters to be installed in nuclear facilities appear on a Qualified Products List (QPL)

  10. Boolean filters of distributive lattices

    Directory of Open Access Journals (Sweden)

    M. Sambasiva Rao

    2013-07-01

    Full Text Available In this paper we introduce the notion of Boolean filters in a pseudo-complemented distributive lattice and characterize the class of all Boolean filters. Further, a set of equivalent conditions is derived for a proper filter to become a prime Boolean filter. Also, a set of equivalent conditions is derived for a pseudo-complemented distributive lattice to become a Boolean algebra. Finally, a Boolean filter is characterized in terms of congruences.

  11. Fuzzy Digital Filtering: Signal Interpretation

    OpenAIRE

    Juan C. Sánchez García; J. Jesús Medel Juárez; Juan C. García Infante

    2011-01-01

    The paper describes the properties of the fuzzy filter, considering its operational principles. A digital filter interacts with a reference model signal in a real process in order to obtain the best corresponding answer, having the minimum error at the filter output under the mean-square criterion. Adding a fuzzy mechanism to this filter structure yields intelligent filtering, because it adaptively selects and emits a decision answer according to the changes in the external reference signal, ...

  12. Disinfecting Filters For Recirculated Air

    Science.gov (United States)

    Pilichi, Carmine A.

    1992-01-01

    Simple treatment disinfects air filters by killing bacteria, algae, fungi, mycobacteria, viruses, spores, and any other micro-organisms filters might harbor. Concept applied to reusable stainless-steel wire mesh filters and disposable air filters. Treatment used on filters in air-circulation systems in spacecraft, airplanes, other vehicles, and buildings to help prevent spread of colds, sore throats, and more-serious illnesses.

  13. Tradition?! Traditional Cultural Institutions on Customary Practices in Uganda

    Directory of Open Access Journals (Sweden)

    Joanna R. Quinn

    2014-01-01

    Full Text Available This contribution traces the importance of traditional institutions in rehabilitating societies in general terms and more particularly in post-independence Uganda. The current regime, partly by inventing “traditional” cultural institutions, partly by co-opting them for its own interests, contributed to a loss of legitimacy of those who claim responsibility for customary law. More recently, international prosecutions have complicated the use of customary mechanisms within such societies. This article shows that some traditional and cultural leaders continue to struggle to restore their original institutions, some having taken the initiative of inventing new forms of engaging with society. Uganda is presented as a test case for the International Criminal Court’s ability to work with traditional judicial institutions in Africa.

  14. Approaching Traditional Literature in Non-Traditional Ways.

    Science.gov (United States)

    Tensen, Tracy Anderson; And Others

    1996-01-01

    Presents three brief essays that discuss approaching traditional literature (Thornton Wilder's "Our Town," Mark Twain's "Adventures of Huckleberry Finn," and Geoffrey Chaucer's "Canterbury Tales") in imaginative ways in high school English and vocational/technical classrooms. (RS)

  15. Hybrid Data Assimilation without Ensemble Filtering

    Science.gov (United States)

    Todling, Ricardo; Akkraoui, Amal El

    2014-01-01

    The Global Modeling and Assimilation Office is preparing to upgrade its three-dimensional variational system to a hybrid approach in which the ensemble is generated using a square-root ensemble Kalman filter (EnKF) and the variational problem is solved using the Grid-point Statistical Interpolation system. As in most EnKF applications, we found it necessary to employ a combination of multiplicative and additive inflations to compensate for sampling and modeling errors, respectively; and, to maintain the small-member ensemble solution close to the variational solution, we also found it necessary to re-center the members of the ensemble about the variational analysis. During tuning of the filter we have found re-centering and additive inflation to play a considerably larger role than expected, particularly in a dual-resolution context when the variational analysis is run at larger resolution than the ensemble. This led us to consider a hybrid strategy in which the members of the ensemble are generated by simply converting the variational analysis to the resolution of the ensemble and applying additive inflation, thus bypassing the EnKF. Comparisons of this, so-called, filter-free hybrid procedure with an EnKF-based hybrid procedure and a control non-hybrid, traditional, scheme show both hybrid strategies to provide equally significant improvement over the control; more interestingly, the filter-free procedure was found to give qualitatively similar results to the EnKF-based procedure.

  16. Traditional Methods for Mineral Analysis

    Science.gov (United States)

    Ward, Robert E.; Carpenter, Charles E.

    This chapter describes traditional methods for analysis of minerals involving titrimetric and colorimetric procedures, and the use of ion selective electrodes. Other traditional methods of mineral analysis include gravimetric titration (i.e., insoluble forms of minerals are precipitated, rinsed, dried, and weighed) and redox reactions (i.e., the mineral is part of an oxidation-reduction reaction, and the product is quantitated). However, these latter two methods will not be covered because they currently are used little in the food industry. The traditional methods that will be described have maintained widespread usage in the food industry despite the development of more modern instrumentation such as atomic absorption spectroscopy and inductively coupled plasma-atomic emission spectroscopy (Chap. 24). Traditional methods generally require chemicals and equipment that are routinely available in an analytical laboratory and are within the experience of most laboratory technicians. Additionally, traditional methods often form the basis for rapid analysis kits (e.g., Quantab® for salt determination) that are increasingly in demand. Procedures for analysis of minerals of major nutritional or food processing concern are used for illustrative purposes. For additional examples of traditional methods refer to references (1-6). Slight modifications of these traditional methods are often needed for specific foodstuffs to minimize interferences or to be in the range of analytical performance. For analytical requirements for specific foods see the Official Methods of Analysis of AOAC International (5) and related official methods (6).

  17. Glove-box filters

    International Nuclear Information System (INIS)

    Description is given of a device for simply and rapidly assembling and dissassembling the filters used inside sealed enclosures, such as glove-boxes and shielded cells equipped with nippers or manipulators, said filters being of the type comprising a cylindrical casing containing a filtering member, the upper portion of said casing being open so as to allow the gases to be cleaned to flow in, whereas the casing bottom is centrally provided with a hole extended outwardly by a threaded collar on which is screwed a connecting-sleeve to be fixed to the mouth of a gas outlet pipe. To a yoke transverse bar is welded a pin which can be likened to a bent spring-blade, one arm of which welded to said transverse bar, is rectilinear whereas its other arm is provided with a boss cooperating with a cavity made in a protrusion of said pipe, right under the mouth thereof

  18. Morphing Ensemble Kalman Filters

    CERN Document Server

    Beezley, Jonathan D

    2007-01-01

    A new type of ensemble filter is proposed, which combines an ensemble Kalman filter (EnKF) with the ideas of morphing and registration from image processing. This results in filters suitable for nonlinear problems whose solutions exhibit moving coherent features, such as thin interfaces in wildfire modeling. The ensemble members are represented as the composition of one common state with a spatial transformation, called registration mapping, plus a residual. A fully automatic registration method is used that requires only gridded data, so the features in the model state do not need to be identified by the user. The morphing EnKF operates on a transformed state consisting of the registration mapping and the residual. Essentially, the morphing EnKF uses intermediate states obtained by morphing instead of linear combinations of the states.

  19. Ferroelectric electronically tunable filters

    International Nuclear Information System (INIS)

    A cylindrical cavity is loaded with a ferroelectric rod and is resonant at the dominant mode. The loaded cylindrical cavity is a band pass filter. As a bias voltage is applied across the ferroelectric rod, its permittivity changes resulting in a new resonant frequency for the loaded cylindrical cavity. The ferroelectric rod is operated at a temperature slightly above its Curie temperature. The loaded cylindrical cavity is kept at a constant designed temperature. The cylindrical cavity is made of conductors, a single crystal high Tc superconductor including YBCO and a single crystal dielectric, including sapphire and lanthanum aluminate, the interior conducting surfaces of which are deposited with a film of a single crystal high Tc superconductor. Embodiments also include waveguide single and multiple cavity type tunable filters. Embodiments also include tunable band reject filters. 10 figs

  20. Filtered containment venting

    International Nuclear Information System (INIS)

    After the TMI accident in the USA, the Swedish Government in 1981 decided that all Swedish nuclear power plants must be upgraded to mitigate the consequences of severe accidents. Mitigating measures were required to be in place by 1985 for the Barsebaeck plants and by 1988 for the remaining plants. The technical solution selected for accident mitigation was filtered venting of the reactor containment. The filtering requirement was that the release of radioactivity should be less than 0.1% of the core inventory. In Barsebaeck the filter consists of a gravel bed with a volume of 10,000 cubic metres. For the other Swedish plants (7 BWRs and 3 PWRs) a wet scrubber system of significantly smaller volume (300-400 cubic metres) has been selected. This filter system, which is called FILTRA-MVSS (Multi Venturi Scrubber System), has been jointly developed by ASEA-ATOM and FLAEKT Industri, two companies belonging to the ASEA BROWN BOVERI Group of Companies. The FILTRA-MVSS, which can accommodate a wide range of flow rates based on an automatic passive technique, consists of a number of venturi nozzles submerged in a pool of water. The venturi separation technique has been employed in the field of industrial air pollution control for several decades. The technique has now been further developed and adapted for the cleaning of contaminated radioactive off-gases that might be the consequence of a severe reactor accident. After the Chernobyl accident the discussion on filtered containment venting intensified also in other European countries, and several countries, for example France, the Federal Republic of Germany and Finland, are now planning for filtered venting systems. The FILTRA-MVSS system can be designed to meet a wide range of hypothetical design basis events for both BWR and PWR plants. The system can be optimized with respect to its size depending on various pressure requirements and it can be optimized for specified decontamination factors. (author). Poster presentation. 3 figs

  1. Aurorae in Australian Aboriginal Traditions

    CERN Document Server

    Hamacher, Duane W

    2013-01-01

    Transient celestial phenomena feature prominently in the astronomical knowledge and traditions of Aboriginal Australians. In this paper, I collect accounts of the Aurora Australis from the literature regarding Aboriginal culture. Using previous studies of meteors, eclipses, and comets in Aboriginal traditions, I anticipate that the physical properties of aurora, such as their generally red colour as seen from southern Australia, will be associated with fire, death, blood, and evil spirits. The survey reveals this to be the case and also explores historical auroral events in Aboriginal cultures, aurorae in rock art, and briefly compares Aboriginal auroral traditions with other global indigenous groups, including the Maori of New Zealand.

  2. Aurorae in Australian Aboriginal Traditions

    Science.gov (United States)

    Hamacher, Duane W.

    2013-07-01

    Transient celestial phenomena feature prominently in the astronomical knowledge and traditions of Aboriginal Australians. In this paper, I collect accounts of the Aurora Australis from the literature regarding Aboriginal culture. Using previous studies of meteors, eclipses, and comets in Aboriginal traditions, I anticipate that the physical properties of aurora, such as their generally red colour as seen from southern Australia, will be associated with fire, death, blood, and evil spirits. The survey reveals this to be the case and also explores historical auroral events in Aboriginal cultures, aurorae in rock art, and briefly compares Aboriginal auroral traditions with other global indigenous groups, including the Maori of New Zealand.

  3. Application of Unscented Kalman Filter for Sonar Signal Processing

    Directory of Open Access Journals (Sweden)

    Leela Kumari. B , Padma Raju.K

    2012-06-01

    Full Text Available State estimation theory is one of the best mathematical approaches to analyze variations in the states of a system or process. The state of the system is defined by a set of variables that provide a complete representation of its internal condition at any given instant of time. Filtering of random processes is referred to as estimation, and is a well defined statistical technique. There are two types of state estimation processes, linear and nonlinear. Linear estimation of a system can easily be analyzed by using the Kalman filter (KF), which is used to compute the target state parameters with a priori information under a noisy environment. But the traditional KF is optimal only when the model is linear, and its performance is well defined under the assumptions that the system model and noise statistics are well known. Most state estimation problems are nonlinear, which limits the practical applications of the KF. The modified versions of the KF, namely the extended Kalman filter (EKF), the unscented Kalman filter (UKF) and the particle filter, are the best known nonlinear estimators. The EKF is the nonlinear version of the Kalman filter which linearizes about the current mean and covariance, and has been considered the standard in the theory of nonlinear state estimation; since truly linear systems do not really exist, a novel transformation, the unscented transform, is adopted instead. The approach in this paper is to analyze an algorithm for maneuvering target tracking using bearings-only measurements, where the UKF provides better state estimation.
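
    The step that distinguishes the UKF from the EKF, propagating a small deterministic set of sigma points through the nonlinearity instead of linearizing it, is sketched below for a single unscented transform; the scaling parameters and the polar-to-Cartesian example are illustrative assumptions, and this is not the bearings-only tracker of the paper.

```python
import numpy as np

def unscented_transform(mean, cov, f, alpha=1.0, beta=2.0, kappa=1.0):
    """Propagate a Gaussian (mean, cov) through a nonlinear function f using
    2n+1 sigma points; returns the transformed mean and covariance."""
    n = mean.size
    lam = alpha**2 * (n + kappa) - n
    sqrt_cov = np.linalg.cholesky((n + lam) * cov)
    sigma = np.vstack([mean, mean + sqrt_cov.T, mean - sqrt_cov.T])   # (2n+1, n)
    wm = np.full(2 * n + 1, 1.0 / (2 * (n + lam)))
    wc = wm.copy()
    wm[0] = lam / (n + lam)
    wc[0] = lam / (n + lam) + (1 - alpha**2 + beta)
    y = np.array([f(s) for s in sigma])       # push sigma points through f
    y_mean = wm @ y
    diff = y - y_mean
    y_cov = (wc[:, None] * diff).T @ diff
    return y_mean, y_cov

# Example: polar-to-Cartesian conversion, a classic range/bearing nonlinearity.
m, P = np.array([1.0, np.pi / 4]), np.diag([0.01, 0.05])
print(unscented_transform(m, P, lambda s: np.array([s[0] * np.cos(s[1]),
                                                    s[0] * np.sin(s[1])])))
```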

  4. Directional bilateral filters for smoothing fluorescence microscopy images

    Science.gov (United States)

    Venkatesh, Manasij; Mohan, Kavya; Seelamantula, Chandra Sekhar

    2015-08-01

    Images obtained through fluorescence microscopy at low numerical aperture (NA) are noisy and have poor resolution. Images of specimens such as F-actin filaments obtained using confocal or widefield fluorescence microscopes contain directional information and it is important that an image smoothing or filtering technique preserve the directionality. F-actin filaments are widely studied in pathology because the abnormalities in actin dynamics play a key role in diagnosis of cancer, cardiac diseases, vascular diseases, myofibrillar myopathies, neurological disorders, etc. We develop the directional bilateral filter as a means of filtering out the noise in the image without significantly altering the directionality of the F-actin filaments. The bilateral filter is anisotropic to start with, but we add an additional degree of anisotropy by employing an oriented domain kernel for smoothing. The orientation is locally adapted using a structure tensor and the parameters of the bilateral filter are optimized for within the framework of statistical risk minimization. We show that the directional bilateral filter has better denoising performance than the traditional Gaussian bilateral filter and other denoising techniques such as SURE-LET, non-local means, and guided image filtering at various noise levels in terms of peak signal-to-noise ratio (PSNR). We also show quantitative improvements in low NA images of F-actin filaments.
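
    For reference, a plain (isotropic) Gaussian bilateral filter is sketched below; the directional variant proposed in the paper additionally orients the spatial kernel along the local structure-tensor direction, which is not reproduced here, and the parameter values are illustrative.

```python
import numpy as np

def bilateral_filter(img, radius=3, sigma_s=2.0, sigma_r=0.1):
    """Brute-force Gaussian bilateral filter on a 2-D float image in [0, 1].
    Each pixel becomes a weighted mean of its neighbours, with weights that
    fall off both with spatial distance and with intensity difference."""
    H, W = img.shape
    out = np.zeros_like(img)
    ys, xs = np.mgrid[-radius:radius + 1, -radius:radius + 1]
    spatial = np.exp(-(xs**2 + ys**2) / (2 * sigma_s**2))
    padded = np.pad(img, radius, mode='reflect')
    for i in range(H):
        for j in range(W):
            patch = padded[i:i + 2 * radius + 1, j:j + 2 * radius + 1]
            range_w = np.exp(-((patch - img[i, j])**2) / (2 * sigma_r**2))
            w = spatial * range_w
            out[i, j] = np.sum(w * patch) / np.sum(w)
    return out
```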

  5. Alarm filtering and presentation

    International Nuclear Information System (INIS)

    This paper discusses alarm filtering and presentation in the control room of nuclear and other process control plants. Alarm generation and presentation is widely recognized as a general process control problem. Alarm systems often fail to provide meaningful alarms to operators. Alarm generation and presentation is an area in which computer aiding is feasible and provides clear benefits. Therefore, researchers have developed several computerized alarm filtering and presentation approaches. This paper discusses problems associated with alarm generation and presentation. Approaches to improving the alarm situation and installation issues of alarm system improvements are discussed. The impact of artificial intelligence (AI) technology on alarm system improvements is assessed. (orig.)

  6. Digital filters in spectrometry

    International Nuclear Information System (INIS)

    In this work the development and application of digital signal processing for different multichannel analysis spectra is presented. The use of classic smoothing methods in signal processing applications is illustrated by a discussion of filters: autoregressive, moving-average and ARMA filters. Generally, simple linear smoothing routines do not provide appropriate smoothing of data that show local ruggedness such as strong discontinuities; however, the algorithms developed here proved adequate for this task. Four algorithms were tested: autoregressive, moving-average, ARMA and binomial methods for 5, 7, and 9 data points, all in the time domain and programmed in Matlab. (Author)
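
    As an illustration of the simplest of these smoothers, the sketch below applies a normalized binomial window of 5, 7 or 9 points to a spectrum; the window lengths match those mentioned above, but the implementation itself is only an illustrative sketch (the work described here used Matlab).

```python
import numpy as np
from math import comb

def binomial_smooth(y, points=5):
    """Smooth a spectrum with normalized binomial weights (e.g. 5, 7 or 9 points)."""
    n = points - 1
    w = np.array([comb(n, k) for k in range(points)], dtype=float)
    w /= w.sum()
    return np.convolve(y, w, mode='same')

spectrum = np.random.default_rng(0).poisson(100, size=256).astype(float)
print(binomial_smooth(spectrum, points=7)[:5])
```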

  7. A FUZZY FILTERING MODEL FOR CONTOUR DETECTION

    Directory of Open Access Journals (Sweden)

    T.C. Rajakumar

    2011-04-01

    Full Text Available Contour detection is a basic operation of image processing. A fuzzy filtering technique is proposed to generate thick edges in two-dimensional gray images. Fuzzy logic is applied to extract values for an image and is used for object contour detection. Fuzzy-based pixel selection can reduce the drawbacks of conventional methods (Prewitt, Roberts). In the traditional methods, one filter mask is used for all kinds of images; it may succeed for one kind of image but fail for another. In this framework the threshold parameter values are obtained from the fuzzy histogram of the input image. The fuzzy inference method selects the complete information about the border of the object, and the resultant image has less impulse noise and increased edge contrast. The extracted object contour is thicker than with the existing methods. The performance of the algorithm is tested with the Peak Signal-to-Noise Ratio (PSNR) and the Complex Wavelet Structural Similarity Metric (CWSSIM).

  8. Active flutter suppression using dipole filters

    Science.gov (United States)

    Srinathkumar, S.; Waszak, Martin R.

    1992-01-01

    By using traditional control concepts of gain root locus, the active suppression of a flutter mode of a flexible wing is examined. It is shown that the attraction of the unstable mode towards a critical system zero determines the degree to which the flutter mode can be stabilized. For control situations where the critical zero is adversely placed in the complex plane, a novel compensation scheme called a 'Dipole' filter is proposed. This filter ensures that the flutter mode is stabilized with acceptable control energy. The control strategy is illustrated by designing flutter suppression laws for an active flexible wing (AFW) wind-tunnel model, where minimal control effort solutions are mandated by control rate saturation problems caused by wind-tunnel turbulence.

  9. Traditional birth attendants in Malawi

    Directory of Open Access Journals (Sweden)

    J. J. M. Smit

    1994-05-01

    Full Text Available Traditional Birth Attendants (TBAs) and traditional healers form an important link in the chain of health personnel providing primary health care in Malawi. In spite of the establishment of hospitals and health centres, it is to these traditional healers and TBAs that the majority of people turn in times of sickness and child-birth. Approximately 60 percent of all deliveries in Malawi occur in the villages. It is therefore important that due regard be paid to the activities of these traditional practitioners in order to ensure the achievement of the goal - "Health for all by the year 2000". The training of TBAs is seen as part of the Maternal and Child Health Services in the country.

  10. Traditional Medicine in Developing Countries

    DEFF Research Database (Denmark)

    Thorsen, Rikke Stamp

    or spiritual healer and self-treatment with herbal medicine or medicinal plants. Reliance on traditional medicine varies between countries and rural and urban areas, but is reported to be as high as 80% in some developing countries. Increased realization of the continued importance of traditional...... the use of self-treatment with medicinal plants, is however scarce. Thus, this thesis contributes to understanding the extent of traditional medicine use as well as why people use it in rural areas in developing countries. This is important for the formulation of inclusive health care policies which...... can address the health care needs of people. Using Nepal as a case, the specific objectives of this thesis are; 1) to quantify the reliance on traditional medicine for health care as well as study the determinants of this reliance in rural Nepal; 2) to increase the understanding of why people use...

  11. Little Eyolf and dramatic tradition

    Directory of Open Access Journals (Sweden)

    Roland Lysell

    2015-02-01

    Full Text Available The article criticises an Ibsen tradition that has seen the last scene of Little Eyolf as a reconciliation. Instead, the article discusses the improbability of a happy marriage characterised by social engagement. The play is open, but it is hardly probable that Rita, with her erotic desire, and Allmers, whose desire has turned into metaphysics, can be happy together. The arguments refer to inner criteria and the constantly present dramatic tradition.

  12. The evolution of traditional knowledge:

    DEFF Research Database (Denmark)

    Saslis Lagoudakis, C Haris; Hawkins, Julie A; Greenhill, Simon J; Pendry, Colin A; Watson, Mark F; Tuladhar-Douglas, Will; Baral, Sushim R; Savolainen, Vincent

    2014-01-01

    Traditional knowledge is influenced by ancestry, inter-cultural diffusion and interaction with the natural environment. It is problematic to assess the contributions of these influences independently because closely related ethnic groups may also be geographically close, exposed to similar environments and able to exchange knowledge readily. Medicinal plant use is one of the most important components of traditional knowledge, since plants provide healthcare for up to 80% of the world's populatio...

  13. Chapter 1. Traditional marketing revisited

    OpenAIRE

    LAMBIN, Jean-Jacques

    2013-01-01

    The objective of this chapter is to review the traditional marketing concept and to analyse its main ambiguities as presented in popular textbooks. The traditional marketing management model placing heavy emphasis of the marketing mix is in fact a supply-driven approach of the market, using the understanding of consumers’ needs to mould demand to the requirements of supply, instead of adapting supply to the expectations of demand. To clarify the true role of marketing, a distinction is made b...

  14. Health traditions of Sikkim Himalaya

    OpenAIRE

    Panda, Ashok Kumar; Misra, Sangram

    2010-01-01

    Ancient medical systems are still prevalent in Sikkim, popularly nurtured by Buddhist groups using the traditional Tibetan pharmacopoeia overlapping with Ayurvedic medicine. Traditional medical practices and their associated cultural values are based round Sikkim’s three major communities, Lepcha, Bhutia and Nepalis. In this study, a semi-structured questionnaire was prepared for folk healers covering age and sex, educational qualification, source of knowledge, types of practices, experience ...

  15. The impact of metallic filter media on HEPA filtration

    International Nuclear Information System (INIS)

    Traditional HEPA filter systems have limitations that often prevent them from solving many of the filtration problems in the nuclear industry; particularly in applications where long service or storage life, high levels of radioactivity, dangerous decomposition products, chemical aggression, organic solvents, elevated operating temperatures, fire resistance and resistance to moisture are issues. This paper addresses several of these matters of concern by considering the use of metallic filter media to solve HEPA filtration problems ranging from the long term storage of transuranic waste at the WIPP site, spent and damaged fuel assemblies, glove box ventilation and tank venting, to the venting of fumes at elevated temperatures from incinerators, vitrification processes and conversion and sintering furnaces, as well as downstream of iodine absorbers in gas cooled reactors in the UK. The paper reviews the basic technology, development, performance characteristics and filtration efficiency, flow versus differential pressure, cleanability and costs of sintered metal fiber in comparison with traditional resin bonded glass fiber filter media and sintered metal powder filter media. Examples of typical filter element and system configurations and applications will be presented. The paper will also address the economic case for installing self-cleaning pre-filtration, using metallic media, to recover the small volumes of dust that would otherwise blind large volumes of final disposable HEPA filters, thus presenting a route to reduce ultimate disposal volumes and secondary waste streams. (authors)

  16. Study on New Method of Heart Disturbance Filtering on Measurement of Impedance Pneumograph

    International Nuclear Information System (INIS)

    The impedance pneumograph trace is often corrupted by cardiac (heart) disturbance. To decrease this disturbance, traditional hardware and software filtering methods have been used, but their effect is limited because of the small frequency difference between the heart and lung signals. This paper presents a new filtering approach in which the heart is treated as a varying capacitance, and the heart disturbance is decreased by integrating this varying capacitance. Test results showed that the new filtering approach is feasible and satisfies the requirements of clinical measurement.

  17. Toward Green Cloud Computing: An Attribute Clustering Based Collaborative Filtering Method for Virtual Machine Migration

    OpenAIRE

    Zhang Liu-Mei; Ma Jian-Feng; Wang Yi-Chuan; Lu Di

    2013-01-01

    In this study, an attribute clustering based collaborative filtering algorithm is described for virtual machine migration towards green Cloud computing. The algorithm utilizes similarity characteristics of virtual machine task related attributes, especially CPU related attributes, to filter redundant data by feature selection. K-Means clustering is then applied to address the rating scale problems of the traditional collaborative filtering recommendation algorithm. Exper...
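
    As a rough illustration of the idea (clustering VMs on CPU-related attributes, then filling unknown ratings from cluster neighbours), a hedged Python sketch follows; the array shapes, the scikit-learn KMeans call and the cluster-mean imputation are assumptions for illustration, not the algorithm from the paper.

      import numpy as np
      from sklearn.cluster import KMeans

      def cluster_and_recommend(vm_features, utilization, k=3):
          # vm_features: (n_vms, n_attrs) CPU-related attributes
          # utilization: (n_vms, n_hosts) ratings, np.nan where unknown
          labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(vm_features)
          filled = utilization.copy()
          for c in range(k):
              members = np.where(labels == c)[0]
              cluster_mean = np.nanmean(utilization[members], axis=0)
              cluster_mean = np.where(np.isnan(cluster_mean), 0.0, cluster_mean)
              for i in members:
                  missing = np.isnan(filled[i])
                  filled[i, missing] = cluster_mean[missing]   # neighbourhood average
          return labels, filled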

  18. A novel metamaterial filter with stable passband performance based on frequency selective surface

    OpenAIRE

    Fang, C Y; J. S. Gao; Hai Liu

    2014-01-01

    In this paper, a novel metamaterial filter based on frequency selective surface (FSS) is proposed. Using the mode matching method, we theoretically studied the transmission performance of the structure. Results show that, by rotating its neighboring elements by 90 degrees, the novel filter has better stability with respect to the angle of incidence than traditional structures for both TE and TM polarization. As the incident angle varies from 0 to 50 degrees, the metamaterial filter exhibits a transmittance higher tha...

  19. Location Estimation for an Autonomously Guided Vehicle using an Augmented Kalman Filter to Autocalibrate the Odometry

    DEFF Research Database (Denmark)

    Larsen, Thomas Dall; Bak, Martin; Andersen, Nils Axel; Ravn, Ole

    1998-01-01

    A Kalman filter using encoder readings as inputs and vision measurements as observations is designed as a location estimator for an autonomously guided vehicle (AGV). To reduce the effect of modelling errors an augmented filter that estimates the true system parameters is designed. The traditional...

  20. Ceramic HEPA Filter Program

    Energy Technology Data Exchange (ETDEWEB)

    Mitchell, M A; Bergman, W; Haslam, J; Brown, E P; Sawyer, S; Beaulieu, R; Althouse, P; Meike, A

    2012-04-30

    Potential benefits of ceramic filters in nuclear facilities: (1) Short term benefit for DOE, NRC, and industry - (a) CalPoly HTTU provides unique testing capability to answer questions for DOE - High temperature testing of materials, components, filter, (b) Several DNFSB correspondences and presentations by DNFSB members have highlighted the need for HEPA filter R and D - DNFSB Recommendation 2009-2 highlighted a nuclear facility response to an evaluation basis earthquake followed by a fire (aka shake-n-bake) and CalPoly has capability for a shake-n-bake test; (2) Intermediate term benefit for DOE and industry - (a) Filtration for specialty applications, e.g., explosive applications at Nevada, (b) Spin-off technologies applicable to other commercial industries; and (3) Long term benefit for DOE, NRC, and industry - (a) Across industry, strong desire for better performance filter, (b) Engineering solution to safety problem will improve facility safety and decrease dependence on associated support systems, (c) Large potential life-cycle cost savings, and (d) Facilitates development and deployment of LLNL process innovations to allow continuous ventilation system operation during a fire.

  1. Soft morphological filters

    Science.gov (United States)

    Koskinen, Lasse; Astola, Jaakko T.; Neuvo, Yrjo A.

    1991-07-01

    New morphological operations, called soft morphological operations, are introduced. They maintain most of the properties of standard morphological operations, yet give improved performance under certain conditions. The main difference to standard morphological operations is that soft morphological operations are less sensitive to additive noise and to small variations in the shape of the objects to be filtered.

  2. Morphing and Ensemble Filtering.

    Czech Academy of Sciences Publication Activity Database

    Mandel, J.; Beezley, J.; Resler, Jaroslav; Juruš, Pavel; Eben, Kryštof

    Prague : Institute of Computer Science of the AS CR, v.v.i, 2010, s. 1-9. [Workshop on "GHG reduction using IT" /2./. Prague (CZ), 28.05.2010] Institutional research plan: CEZ:AV0Z10300504 Keywords : data assimilation * Kalman filter Subject RIV: JE - Non-nuclear Energetics, Energy Consumption ; Use

  3. Spectral Ensemble Kalman Filters.

    Czech Academy of Sciences Publication Activity Database

    Mandel, Jan; Kasanický, Ivan; Vejmelka, Martin; Fuglík, Viktor; Turčičová, Marie; Eben, Kryštof; Resler, Jaroslav; Juruš, Pavel

    2014-01-01

    Roč. 11, - (2014), EMS2014-446. [EMS Annual Meeting /14./ & European Conference on Applied Climatology (ECAC) /10./. 06.10.2014-10.10.2014, Prague] R&D Projects: GA ČR GA13-34856S Grant ostatní: NSF DMS-1216481 Institutional support: RVO:67985807 Keywords : data assimilation * spectral filter Subject RIV: DG - Athmosphere Sciences, Meteorology

  4. Enhanced Optical Filter Design

    CERN Document Server

    Cushing, David

    2011-01-01

    This book serves as a supplement to the classic texts by Angus Macleod and Philip Baumeister, taking an intuitive approach to the enhancement of optical coating (or filter) performance. Drawing from 40 years of experience in thin film design, Cushing introduces the basics of thin films, the commonly used materials and their deposition, the major coatings and their applications, and improvement methods for each.

  5. Domain wall filters

    CERN Document Server

    Bär, O; Neuberger, H; Witzel, O; Baer, Oliver; Narayanan, Rajamani; Neuberger, Herbert; Witzel, Oliver

    2007-01-01

    We propose using the extra dimension separating the domain walls carrying lattice quarks of opposite handedness to gradually filter out the ultraviolet fluctuations of the gauge fields that are felt by the fermionic excitations living in the bulk. This generalization of the homogeneous domain wall construction has some theoretical features that seem nontrivial.

  6. Domain wall filters

    International Nuclear Information System (INIS)

    We propose using the extra dimension separating the domain walls carrying lattice quarks of opposite handedness to gradually filter out the ultraviolet fluctuations of the gauge fields that are felt by the fermionic excitations living in the bulk. This generalization of the homogeneous domain wall construction has some theoretical features that seem nontrivial

  7. Digital hum filtering

    Science.gov (United States)

    Knapp, Ralph W.; Anderson, Neil L.

    1994-06-01

    Data may be overprinted by a steady-state cyclical noise (hum). Steady-state indicates that the noise is invariant with time; its attributes, frequency, amplitude, and phase, do not change with time. Hum recorded on seismic data usually is powerline noise and associated higher harmonics; leakage from full-waveform rectified cathodic protection devices that contain the odd higher harmonics of powerline frequencies; or vibrational noise from mechanical devices. The fundamental frequency of powerline hum may be removed during data acquisition with the use of notch filters. Unfortunately, notch filters do not discriminate signal and noise, attenuating both. They also distort adjacent frequencies by phase shifting. Finally, they attenuate only the fundamental mode of the powerline noise; higher harmonics and frequencies other than that of powerlines are not removed. Digital notch filters, applied during processing, have many of the same problems as analog filters applied in the field. The method described here removes hum of a particular frequency. Hum attributes are measured by discrete Fourier analysis, and the hum is canceled from the data by subtraction. Errors are slight and the result of the presence of (random) noise in the window or asynchrony of the hum and data sampling. Error is minimized by increasing window size or by resampling to a finer interval. Errors affect the degree of hum attenuation, not the signal. The residual is steady-state hum of the same frequency.
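
    A minimal measure-and-subtract sketch in Python is shown below, assuming the hum frequency is known (e.g. 60 Hz powerline); the single-bin Fourier estimate over the whole trace stands in for the paper's windowed analysis and resampling refinements.

      import numpy as np

      def remove_hum(trace, dt, hum_freq=60.0):
          # Estimate amplitude and phase of the hum by projecting the trace
          # onto a complex exponential at hum_freq, then subtract the fit.
          n = trace.size
          t = np.arange(n) * dt
          coeff = np.sum(trace * np.exp(-2j * np.pi * hum_freq * t)) * 2.0 / n
          hum = np.real(coeff * np.exp(2j * np.pi * hum_freq * t))
          return trace - hum

      # 60 Hz hum on a synthetic 7 Hz signal sampled at 1 kHz.
      dt = 0.001
      t = np.arange(2000) * dt
      noisy = np.sin(2 * np.pi * 7 * t) + 0.5 * np.cos(2 * np.pi * 60 * t + 0.3)
      clean = remove_hum(noisy, dt, hum_freq=60.0)

    Because only the fitted sinusoid is subtracted, the 7 Hz signal is left essentially untouched, mirroring the abstract's point that the steady-state hum, not the signal, is attenuated.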

  8. High temperature filter materials

    Science.gov (United States)

    Alvin, M. A.; Lippert, T. E.; Bachovchin, D. M.; Tressler, R. E.

    Objectives of this program are to identify the potential long-term thermal/chemical effects that advanced coal-based power generating system environments have on the stability of porous ceramic filter materials, as well as to assess the influence of these effects on filter operating performance and life. We have principally focused our efforts on developing an understanding of the stability of the alumina/mullite filter material at high temperature (i.e., 870, 980, and 1100 C) under oxidizing conditions which contain gas phase alkali species. Testing has typically been performed in two continuous flow-through, high temperature test facilities at the Westinghouse Science and Technology Center, using 7 cm diameter times 6.4 mm thick discs. (Alvin, 1992) Each disc of ceramic filter material is exposed for periods of 100 to 3,000 hours in duration. Additional efforts have been performed at Westinghouse to broaden our understanding of the stability of cordierite, cordierite-silicon nitride, reaction and sintered silicon nitride, and clay bonded silicon carbide under similar simulated advanced coal fired process conditions. The results of these efforts are presented in this paper.

  9. Parzen Particle Filters

    DEFF Research Database (Denmark)

    Lehn-Schiøler, Tue; Erdogmus, Deniz; Principe, Jose C.

    Using a Parzen density estimator any distribution can be approximated arbitrarily close by a sum of kernels. In particle filtering this fact is utilized to estimate a probability density function with Dirac delta kernels; when the distribution is discretized it becomes possible to solve an otherw...
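
    The kernel-sum idea is easy to state in code; the sketch below, with an arbitrary Gaussian kernel and bandwidth, shows a Parzen estimate built from weighted particles (it is the density representation only, not the filter recursion itself).

      import numpy as np

      def parzen_density(x, particles, weights, bandwidth=0.2):
          # Weighted sum of Gaussian kernels centred on the particles,
          # replacing the Dirac-delta kernels of a plain particle filter.
          diff = (x[:, None] - particles[None, :]) / bandwidth
          kernels = np.exp(-0.5 * diff**2) / (np.sqrt(2 * np.pi) * bandwidth)
          return kernels @ weights

      particles = np.random.randn(100)            # filter samples
      weights = np.full(100, 1.0 / 100)           # normalised importance weights
      grid = np.linspace(-4, 4, 200)
      density = parzen_density(grid, particles, weights)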

  10. Printed analogue filter structures

    OpenAIRE

    Evans, PSA; Ramsey, BJ; Harrey, PM; Harrison, DJ

    1999-01-01

    The authors report progress in conductive lithographic film (CLF) technology, which uses the offset lithographic printing process to form electrically conductive patterns on flexible substrates. Networks of planar passive components and interconnects fabricated simultaneously via the CLF process form notch filter networks at 85 kHz.

  11. Ceramic HEPA Filter Program

    International Nuclear Information System (INIS)

    Potential benefits of ceramic filters in nuclear facilities: (1) Short term benefit for DOE, NRC, and industry - (a) CalPoly HTTU provides unique testing capability to answer questions for DOE - High temperature testing of materials, components, filter, (b) Several DNFSB correspondences and presentations by DNFSB members have highlighted the need for HEPA filter R and D - DNFSB Recommendation 2009-2 highlighted a nuclear facility response to an evaluation basis earthquake followed by a fire (aka shake-n-bake) and CalPoly has capability for a shake-n-bake test; (2) Intermediate term benefit for DOE and industry - (a) Filtration for specialty applications, e.g., explosive applications at Nevada, (b) Spin-off technologies applicable to other commercial industries; and (3) Long term benefit for DOE, NRC, and industry - (a) Across industry, strong desire for better performance filter, (b) Engineering solution to safety problem will improve facility safety and decrease dependence on associated support systems, (c) Large potential life-cycle cost savings, and (d) Facilitates development and deployment of LLNL process innovations to allow continuous ventilation system operation during a fire.

  12. Structural notch filter optimization

    Energy Technology Data Exchange (ETDEWEB)

    Felton, R.; Burge, S.; Bradshaw, A. [Lancaster Univ. (United Kingdom). Dept. of Engineering

    1995-09-01

    A modified algorithm for nonlinear constrained optimization of structural mode filters for an aeroelastic aircraft model is presented. The optimizer set-up and control is implemented in a MATLAB™ graphical user interface environment. It is shown that the modified algorithm gives improved performance over existing nonlinear constrained optimization methods.

  13. Filtered beam spectrometers

    International Nuclear Information System (INIS)

    Filtered Beam Spectrometers (FBS) are one type of spectrometer that has now shown that inelastic scattering experiments can be made using neutrons with energy in the electron volt range. A general description of the FBS is given in this paper. Examples of several types of data are presented and discussed

  14. Signalverarbeitung, Filter und Effekte

    Science.gov (United States)

    Zölzer, Udo

    In this chapter the fundamentals of digital signal processing, an introduction to digital filters and, building on that, digital audio effects are presented. For this purpose a simple mathematical formulation based on time-domain algorithms is introduced. The equivalent treatment of these algorithms in the frequency domain is made possible by using the discrete-time Fourier transform.

  15. Ozone decomposing filter

    Science.gov (United States)

    Simandl, Ronald F. (Farragut, TN); Brown, John D. (Harriman, TN); Whinnery, Jr., LeRoy L. (Dublin, CA)

    1999-01-01

    In an improved ozone decomposing air filter carbon fibers are held together with a carbonized binder in a perforated structure. The structure is made by combining rayon fibers with gelatin, forming the mixture in a mold, freeze-drying, and vacuum baking.

  16. Bayesian Filters in Practice.

    Czech Academy of Sciences Publication Activity Database

    Krejsa, Jiří; Věchet, S.

    Bratislava : Slovak University of Technology in Bratislava, 2010, s. 217-222. ISBN 978-80-227-3353-3. [Robotics in Education . Bratislava (SK), 16.09.2010-17.09.2010] Institutional research plan: CEZ:AV0Z20760514 Keywords : mobile robot localization * bearing only beacons * Bayesian filters Subject RIV: JD - Computer Applications, Robotics

  17. Efficient Iterated Filtering

    DEFF Research Database (Denmark)

    Lindstro?m, Erik; Ionides, Edward; Frydendall, Jan; Madsen, Henrik

    -Rao efficient. The proposed estimator is easy to implement as it only relies on non-linear filtering. This makes the framework flexible as it is easy to tune the implementation to achieve computational efficiency. This is done by using the approximation of the score function derived from the theory on Iterative...

  18. Spot- Zombie Filtering System

    Directory of Open Access Journals (Sweden)

    Arathy Rajagopal

    2014-01-01

    Full Text Available A major security challenge on the Internet is the existence of a large number of compromised machines. Such machines have been increasingly used to launch various security attacks, including spamming, spreading malware, DDoS, and identity theft. These compromised machines are called "Zombies". In general, e-mail applications and providers use spam filters to filter out spam messages. Spam filtering is a technique for discriminating genuine messages from spam messages. Attackers send spam messages to the targeted machines while evading the filters, which increases false positives and false negatives. We develop an effective spam zombie detection system named SPOT by monitoring the outgoing messages of a network. SPOT focuses on the number of outgoing messages that are originated or forwarded by each computer on a network to identify the presence of zombies. SPOT is designed based on a powerful statistical tool called the Sequential Probability Ratio Test, which has bounded false positive and false negative error rates.

  19. Spot- Zombie Filtering System

    Directory of Open Access Journals (Sweden)

    Arathy Rajagopal

    2015-10-01

    Full Text Available A major security challenge on the Internet is the existence of a large number of compromised machines. Such machines have been increasingly used to launch various security attacks, including spamming, spreading malware, DDoS, and identity theft. These compromised machines are called "Zombies". In general, e-mail applications and providers use spam filters to filter out spam messages. Spam filtering is a technique for discriminating genuine messages from spam messages. Attackers send spam messages to the targeted machines while evading the filters, which increases false positives and false negatives. We develop an effective spam zombie detection system named SPOT by monitoring the outgoing messages of a network. SPOT focuses on the number of outgoing messages that are originated or forwarded by each computer on a network to identify the presence of zombies. SPOT is designed based on a powerful statistical tool called the Sequential Probability Ratio Test, which has bounded false positive and false negative error rates.
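
    The Sequential Probability Ratio Test at the core of SPOT can be sketched as follows; the spam probabilities, error bounds and per-message flags are illustrative assumptions, not SPOT's actual parameters.

      import math

      def sprt_zombie_monitor(spam_flags, p_ham=0.2, p_zombie=0.9,
                              alpha=0.01, beta=0.01):
          # spam_flags: 1 if an outgoing message is flagged as spam, else 0.
          # alpha/beta bound the false positive / false negative rates.
          upper = math.log((1 - beta) / alpha)    # accept H1: compromised
          lower = math.log(beta / (1 - alpha))    # accept H0: normal
          llr = 0.0
          for x in spam_flags:
              if x:
                  llr += math.log(p_zombie / p_ham)
              else:
                  llr += math.log((1 - p_zombie) / (1 - p_ham))
              if llr >= upper:
                  return 'zombie'
              if llr <= lower:
                  return 'normal'
          return 'undecided'

      print(sprt_zombie_monitor([1, 1, 0, 1, 1, 1, 1]))   # -> 'zombie'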

  20. The Kalman filter

    OpenAIRE

    Andrade-Cetto, J

    2002-01-01

    The Kalman filter, developed in the early sixties by R. E. Kalman, is a recursive state estimator for partially observed non-stationary stochastic processes. It gives an optimal estimate, in the least squares sense, of the actual value of the state vector from noisy observations.
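
    A minimal scalar illustration of the recursion (a random-walk state observed directly, with assumed noise variances) is given below; real applications use the full vector-matrix form.

      def kalman_1d(measurements, q=1e-4, r=0.1, x0=0.0, p0=1.0):
          # q: process noise variance, r: measurement noise variance.
          x, p = x0, p0
          estimates = []
          for z in measurements:
              p = p + q                  # predict (random walk: mean unchanged)
              k = p / (p + r)            # Kalman gain
              x = x + k * (z - x)        # update with the innovation
              p = (1.0 - k) * p
              estimates.append(x)
          return estimates

      print(kalman_1d([1.1, 0.9, 1.05, 0.98, 1.02]))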

  1. Substrate Integrated Evanescent Filters Employing Coaxial Stubs

    DEFF Research Database (Denmark)

    Zhurbenko, Vitaliy

    2015-01-01

    Evanescent mode substrate integrated waveguide (SIW) is one of the promising technologies for design of light-weight low-cost microwave components. Traditional realization methods used in the standard evanescent waveguide technology are often not directly applicable to SIW due to dielectric filling...... and small height of the waveguide. In this work, one of the realization methods of evanescent mode waveguides using a single layer substrate is considered. The method is based on the use of coaxial stubs as capacitive susceptances externally connected to a SIW. A microwave filter based on these...

  2. Intraindividual evaluation of the influence of iterative reconstruction and filter kernel on subjective and objective image quality in computed tomography of the brain

    Energy Technology Data Exchange (ETDEWEB)

    Buhk, J.H. [Univ. Medical Center, Hamburg-Eppendorf (Germany). Dept. of Neuroradiology; Laqmani, A.; Schultzendorff, H.C. von; Hammerle, D.; Adam, G.; Regier, M. [Univ. Medical Center, Hamburg-Eppendorf (Germany). Dept. of Diagnostic and Interventional Radiology; Sehner, S. [Univ. Medical Center, Hamburg-Eppendorf (Germany). Inst. of Medical Biometry and Epidemiology; Fiehler, J. [Univ. Medical Center, Hamburg-Eppendorf (Germany). Neuroradiology; Nagel, H.D. [Dr. HD Nagel, Science and Technology for Radiology, Buchholz (Germany)

    2013-08-15

    Objectives: To intraindividually evaluate the potential of 4th generation iterative reconstruction (IR) on brain CT with regard to subjective and objective image quality. Methods: 31 consecutive raw data sets of clinical routine native sequential brain CT scans were reconstructed with IR level 0 (= filtered back projection), 1, 3 and 4; 3 different brain filter kernels (smooth/standard/sharp) were applied respectively. Five independent radiologists with different levels of experience performed subjective image rating. Detailed ROI analysis of image contrast and noise was performed. Statistical analysis was carried out by applying a random intercept model. Results: Subjective scores for the smooth and the standard kernels were best at low IR levels, but both, in particular the smooth kernel, scored inferior with an increasing IR level. The sharp kernel scored lowest at IR 0, while the scores substantially increased at high IR levels, reaching significantly best scores at IR 4. Objective measurements revealed an overall increase in contrast-to-noise ratio at higher IR levels, which was highest when applying the soft filter kernel. The absolute grey-white contrast decreased with an increasing IR level and was highest when applying the sharp filter kernel. All subjective effects were independent of the raters' experience and the patients' age and sex. Conclusion: Different combinations of IR level and filter kernel substantially influence subjective and objective image quality of brain CT. (orig.)

  3. Intraindividual evaluation of the influence of iterative reconstruction and filter kernel on subjective and objective image quality in computed tomography of the brain

    International Nuclear Information System (INIS)

    Objectives: To intraindividually evaluate the potential of 4th generation iterative reconstruction (IR) on brain CT with regard to subjective and objective image quality. Methods: 31 consecutive raw data sets of clinical routine native sequential brain CT scans were reconstructed with IR level 0 (= filtered back projection), 1, 3 and 4; 3 different brain filter kernels (smooth/standard/sharp) were applied respectively. Five independent radiologists with different levels of experience performed subjective image rating. Detailed ROI analysis of image contrast and noise was performed. Statistical analysis was carried out by applying a random intercept model. Results: Subjective scores for the smooth and the standard kernels were best at low IR levels, but both, in particular the smooth kernel, scored inferior with an increasing IR level. The sharp kernel scored lowest at IR 0, while the scores substantially increased at high IR levels, reaching significantly best scores at IR 4. Objective measurements revealed an overall increase in contrast-to-noise ratio at higher IR levels, which was highest when applying the soft filter kernel. The absolute grey-white contrast decreased with an increasing IR level and was highest when applying the sharp filter kernel. All subjective effects were independent of the raters' experience and the patients' age and sex. Conclusion: Different combinations of IR level and filter kernel substantially influence subjective and objective image quality of brain CT. (orig.)

  4. An Adjoint-Based Adaptive Ensemble Kalman Filter

    KAUST Repository

    Song, Hajoon

    2013-10-01

    A new hybrid ensemble Kalman filter/four-dimensional variational data assimilation (EnKF/4D-VAR) approach is introduced to mitigate background covariance limitations in the EnKF. The work is based on the adaptive EnKF (AEnKF) method, which bears a strong resemblance to the hybrid EnKF/three-dimensional variational data assimilation (3D-VAR) method. In the AEnKF, the representativeness of the EnKF ensemble is regularly enhanced with new members generated after back projection of the EnKF analysis residuals to state space using a 3D-VAR [or optimal interpolation (OI)] scheme with a preselected background covariance matrix. The idea here is to reformulate the transformation of the residuals as a 4D-VAR problem, constraining the new member with model dynamics and the previous observations. This should provide more information for the estimation of the new member and reduce dependence of the AEnKF on the assumed stationary background covariance matrix. This is done by integrating the analysis residuals backward in time with the adjoint model. Numerical experiments are performed with the Lorenz-96 model under different scenarios to test the new approach and to evaluate its performance with respect to the EnKF and the hybrid EnKF/3D-VAR. The new method leads to the least root-mean-square estimation errors as long as the linear assumption guaranteeing the stability of the adjoint model holds. It is also found to be less sensitive to choices of the assimilation system inputs and parameters.

  5. Soft morphological filters: a robust morphological filtering method

    Science.gov (United States)

    Koskinen, Lasse; Astola, Jaakko T.

    1994-01-01

    We introduce new morphological filters, called soft morphological filters. They maintain most of the desirable properties of standard morphological operations yet are less sensitive to additive noise and to small variations in the shapes of the objects to be filtered. The main difference from standard morphological filters is that maximum and minimum operations are replaced by more general weighted-order statistics. This results in the loss of some algebraic properties but improved performance under noisy conditions.
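
    A simplified 1-D sketch of the idea follows: inside each window the "hard core" samples are repeated k times and the k-th largest value of the resulting multiset is taken instead of the plain maximum of standard dilation. The parameter names and the 1-D restriction are illustrative, not the operators exactly as defined in the paper.

      import numpy as np

      def soft_dilate_1d(signal, half_width=2, core_half_width=0, k=2):
          n = len(signal)
          out = np.empty(n)
          for i in range(n):
              values = []
              for j in range(-half_width, half_width + 1):
                  idx = min(max(i + j, 0), n - 1)           # replicate borders
                  repeats = k if abs(j) <= core_half_width else 1
                  values.extend([signal[idx]] * repeats)
              values.sort(reverse=True)
              out[i] = values[k - 1]                        # k-th largest
          return out

      spiky = np.array([0, 0, 5, 0, 0, 9, 0, 0], dtype=float)
      print(soft_dilate_1d(spiky, half_width=1, core_half_width=0, k=2))

    Unlike a standard dilation, the isolated spikes are not spread across the window, which illustrates the reduced sensitivity to additive noise mentioned in the abstract.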

  6. Experimental study of filter cake formation on different filter media

    International Nuclear Information System (INIS)

    Removal of particulate matter from gases generated in the process industry is important for product recovery as well as emission control. The dynamics of a filtration plant depend on operating conditions. The models that predict filter plant behaviour involve empirical resistance parameters which are usually derived from limited experimental data and are characteristic of the filter media and filter cake (dust deposited on the filter medium). Filter cake characteristics are affected by the nature of the filter media, process parameters and the mode of filter regeneration. Removal of dust particles from air is studied in a pilot-scale jet-pulsed bag filter facility closely resembling industrial filters. Limestone dust and ambient air are used in this study with two widely different filter media. All important parameters, such as pressure drop, gas flow rate and dust settling, are recorded continuously at 1 s intervals. The data are processed for estimation of the resistance parameters. The pressure drop rise on the test filter media is compared. Results reveal that the surface of the filter media has an influence on the pressure drop rise (concave pressure drop rise). A similar effect is produced by a partially jet-pulsed filter surface. Filter behaviour is also simulated using the estimated parameters and a simplified model and compared with the experimental results. Distribution of cake area load is therefore an important aspect of modelling jet-pulse-cleaned bag filters. Mean specific cake resistance remains nearly constant on thoroughly jet-pulse-cleaned membrane-coated filter bags. However, the trend cannot be confirmed without independent cake height and density measurements. Thus the results reveal the importance of independent measurements of cake resistance. (author)

  7. EnFilter: a Password Enforcement and Filter

    OpenAIRE

    Bergadano, Francesco; RUFFO, GIANCARLO

    2005-01-01

    EnFilter is a Proactive Password Checking System, designed to avoid password guessing attacks. It is made of a set of configurable filters, each one based on a specific pattern recognition measure that can be tuned by the system administrator depending on the adopted password policy. Filters use decision trees, lexical analysers, as well as Levenshtein distance based techniques. EnFilter is implemented for Windows 2000/2003/XP.
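
    A toy proactive check combining a length rule with a Levenshtein-distance filter is sketched below; the thresholds and word list are assumptions for illustration and do not reflect EnFilter's actual filter set or configuration.

      def levenshtein(a, b):
          # Classic dynamic-programming edit distance.
          prev = list(range(len(b) + 1))
          for i, ca in enumerate(a, 1):
              curr = [i]
              for j, cb in enumerate(b, 1):
                  curr.append(min(prev[j] + 1,               # deletion
                                  curr[j - 1] + 1,           # insertion
                                  prev[j - 1] + (ca != cb))) # substitution
              prev = curr
          return prev[-1]

      def password_ok(candidate, dictionary, min_len=8, min_distance=3):
          if len(candidate) < min_len:
              return False
          return all(levenshtein(candidate.lower(), w) >= min_distance
                     for w in dictionary)

      print(password_ok("passw0rd1", ["password", "qwerty", "letmein"]))   # False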

  8. The Rao-Blackwellized Particle Filter: A Filter Bank Implementation

    OpenAIRE

    Hendeby, Gustaf; Hendeby G.; Karlsson, Rickard; Karlsson R; Gustafsson, Fredrik; Gustafsson F.

    2010-01-01

    For computational efficiency, it is important to utilize model structure in particle filtering. One of the most important cases occurs when there exists a linear Gaussian substructure, which can be efficiently handled by Kalman filters. This is the standard formulation of the Rao-Blackwellized particle filter (RBPF). This contribution suggests an alternative formulation of this well-known result that facilitates reuse of standard filtering components and which is also suitable for object-ori...

  9. Hierarchical Bayes Ensemble Kalman Filtering

    CERN Document Server

    Tsyrulnikov, Michael

    2015-01-01

    Ensemble Kalman filtering (EnKF), when applied to high-dimensional systems, suffers from an inevitably small affordable ensemble size, which results in poor estimates of the background error covariance matrix B. The common remedy is a kind of regularization, usually an ad-hoc spatial covariance localization (tapering) combined with artificial covariance inflation. Instead of using an ad-hoc regularization, we adopt the idea by Myrseth and Omre (2010) and explicitly admit that the B matrix is unknown and random and estimate it along with the state (x) in an optimal hierarchical Bayes analysis scheme. We separate forecast errors into predictability errors (i.e. forecast errors due to uncertainties in the initial data) and model errors (forecast errors due to imperfections in the forecast model) and include the two respective components P and Q of the B matrix into the extended control vector (x, P, Q). Similarly, we break the traditional backgrou...

  10. OPTIMIZATION OF ADVANCED FILTER SYSTEMS

    Energy Technology Data Exchange (ETDEWEB)

    R.A. Newby; M.A. Alvin; G.J. Bruck; T.E. Lippert; E.E. Smeltzer; M.E. Stampahar

    2002-06-30

    Two advanced, hot gas, barrier filter system concepts have been proposed by the Siemens Westinghouse Power Corporation to improve the reliability and availability of barrier filter systems in applications such as PFBC and IGCC power generation. The two hot gas, barrier filter system concepts, the inverted candle filter system and the sheet filter system, were the focus of bench-scale testing, data evaluations, and commercial cost evaluations to assess their feasibility as viable barrier filter systems. The program results show that the inverted candle filter system has high potential to be a highly reliable, commercially successful, hot gas, barrier filter system. Some types of thin-walled, standard candle filter elements can be used directly as inverted candle filter elements, and the development of a new type of filter element is not a requirement of this technology. Six types of inverted candle filter elements were procured and assessed in the program in cold flow and high-temperature test campaigns. The thin-walled McDermott 610 CFCC inverted candle filter elements, and the thin-walled Pall iron aluminide inverted candle filter elements are the best candidates for demonstration of the technology. Although the capital cost of the inverted candle filter system is estimated to range from about 0 to 15% greater than the capital cost of the standard candle filter system, the operating cost and life-cycle cost of the inverted candle filter system is expected to be superior to that of the standard candle filter system. Improved hot gas, barrier filter system availability will result in improved overall power plant economics. The inverted candle filter system is recommended for continued development through larger-scale testing in a coal-fueled test facility, and inverted candle containment equipment has been fabricated and shipped to a gasifier development site for potential future testing. Two types of sheet filter elements were procured and assessed in the program through cold flow and high-temperature testing. The Blasch, mullite-bonded alumina sheet filter element is the only candidate currently approaching qualification for demonstration, although this oxide-based, monolithic sheet filter element may be restricted to operating temperatures of 538 C (1000 F) or less. Many other types of ceramic and intermetallic sheet filter elements could be fabricated. The estimated capital cost of the sheet filter system is comparable to the capital cost of the standard candle filter system, although this cost estimate is very uncertain because the commercial price of sheet filter element manufacturing has not been established. The development of the sheet filter system could result in a higher reliability and availability than the standard candle filter system, but not as high as that of the inverted candle filter system. The sheet filter system has not reached the same level of development as the inverted candle filter system, and it will require more design development, filter element fabrication development, small-scale testing and evaluation before larger-scale testing could be recommended.

  11. Active pre-filters for dc/dc Boost regulators

    Directory of Open Access Journals (Sweden)

    Carlos Andrés Ramos-Paja

    2014-07-01

    Full Text Available This paper proposes an active pre-filter to mitigate the current harmonics generated by classical dc/dc Boost regulators, which generate current ripples proportional to the duty cycle. Therefore, high output voltage conditions, i.e., high voltage conversion ratios, produce high current harmonics that must be filtered to avoid damage or source losses. Traditionally, these current components are filtered using electrolytic capacitors, which introduce reliability problems because of their high failure rate. The solution introduced in this paper instead uses a dc/dc converter based on the parallel connection of the Boost canonical cells to filter the current ripples generated by the Boost regulator, improving the system reliability. This solution provides the additional benefits of improving the overall efficiency and the voltage conversion ratio. Finally, the solution is validated with simulations and experimental results.

  12. Traditional botanical medicine: an introduction.

    Science.gov (United States)

    Rosenbloom, Richard A; Chaudhary, Jayesh; Castro-Eschenbach, Diane

    2011-01-01

    The role of traditional medicine in the well-being of mankind has certainly journeyed a long way. From an ancient era, in which knowledge was limited to a few traditional healers and dominated by the use of whole plants or crude drugs, the science has gradually evolved into a complete healthcare system with global recognition. Technologic advancements have facilitated traditional science to deliver numerous breakthrough botanicals with potency equivalent to those of conventional drugs. The renewed interest in traditional medicine is mainly attributed to its ability to prevent disease, promote health, and improve quality of life. Despite the support received from public bodies and research organizations, development of botanical medicines continues to be a challenging process. The present article gives a summarized description of the various difficulties encountered in the development and evaluation of botanical drugs, including isolation of active compounds and standardization of plant ingredients. It indicates a future direction of traditional medicine toward evidence-based evaluation of health claims through well-controlled safety and efficacy studies. PMID:21336093

  13. Inorganic UV filters

    Scientific Electronic Library Online (English)

    Eloísa Berbel, Manaia; Renata Cristina Kiatkoski, Kaminski; Marcos Antonio, Corrêa; Leila Aparecida, Chiavacci.

    2013-06-01

    Full Text Available Nowadays, concern over skin cancer has been growing more and more, especially in tropical countries where the incidence of UVA/B radiation is higher. The correct use of sunscreen is the most efficient way to prevent the development of this disease. The ingredients of sunscreen can be organic and/or inorganic sun filters. Inorganic filters present some advantages over organic filters, such as photostability, non-irritability and broad spectrum protection. Nevertheless, inorganic filters have a whitening effect in sunscreen formulations owing to their high refractive index, decreasing their esthetic appeal. Many techniques have been developed to overcome this problem and among them, the use of nanotechnology stands out. The estimated amount of nanomaterial in use must increase from 2000 tons in 2004 to a projected 58000 tons in 2020. In this context, this article aims to analyze critically both the different features of the production of inorganic filters (synthesis routes proposed in recent years) and the permeability, the safety and other characteristics of the new generation of inorganic filters.

  14. Positive implicative ordered filters of implicative semigroups

    OpenAIRE

    Kyung Ho Kim; Young Bae Jun

    2000-01-01

    We introduce the notion of positive implicative ordered filters in implicative semigroups. We show that every positive implicative ordered filter is both an ordered filter and an implicative ordered filter. We give examples that an ordered filter (an implicative ordered filter) may not be a positive implicative ordered filter. We also give equivalent conditions of positive implicative ordered filters. Finally we establish the extension property for positive implicative ordered filters.

  15. Filters used in scoliosis radiography

    International Nuclear Information System (INIS)

    The use of X-ray filters during full spinal radiography for scoliosis in adolescent patients is discussed. The filters compensate for differences in body thickness while maintaining optimum image quality. They also help to reduce patient dose

  16. Resonant γ-filter for Moessbauer spectroscopy

    International Nuclear Information System (INIS)

    The parameters of a resonant filter used to measure a proportion of resonance γ-quanta emitted without recoil are analyzed. The optimal thickness and darkness of the filter versus its chemical composition are determined

  17. Analog filters in nanometer CMOS

    CERN Document Server

    Uhrmann, Heimo; Zimmermann, Horst

    2014-01-01

    Starting from the basics of analog filters and the poor transistor characteristics in nanometer CMOS 10 high-performance analog filters developed by the authors in 120 nm and 65 nm CMOS are described extensively. Among them are gm-C filters, current-mode filters, and active filters for system-on-chip realization for Bluetooth, WCDMA, UWB, DVB-H, and LTE applications. For the active filters several operational amplifier designs are described. The book, furthermore, contains a review of the newest state of research on low-voltage low-power analog filters. To cover the topic of the book comprehensively, linearization issues and measurement methods for the characterization of advanced analog filters are introduced in addition. Numerous elaborate illustrations promote an easy comprehension. This book will be of value to engineers and researchers in industry as well as scientists and Ph.D students at universities. The book is also recommendable to graduate students specializing on nanoelectronics, microelectronics ...

  18. Active resistance capacitance filter design

    Science.gov (United States)

    Kerwin, W. J.

    1970-01-01

    Filters, formed by combinations of distributed RC elements with positive-feedback voltage amplifiers, provide transfer functions similar to those the heavier LC filters ordinarily employ. They also provide signal amplification.

  19. Kalman filtering implementation with Matlab

    OpenAIRE

    Kleinbauer, Rachel

    2004-01-01

    In 1960 and 1961 Rudolf Emil Kalman published his work on a recursive predictive filter based on the use of recursive algorithms, and in doing so revolutionized the field of estimation. Since then, the so-called Kalman filter has been the subject of extensive research and continues to be applied in numerous fields today. The Kalman filter estimates the state of a dynamic system even when the exact form of that system is unknown. The filter is very...

  20. Analysis of Traditional Historical Clothing

    DEFF Research Database (Denmark)

    Jensen, Karsten; Schmidt, A. L.; Petersen, A. H.

    2013-01-01

    A recurrent problem for scholars who investigate traditional and historical clothing is the measuring of items of clothing and subsequent pattern construction. The challenge is to produce exact data without damaging the item. The main focus of this paper is to present a new procedure for establishing a three-dimensional model and the corresponding two-dimensional pattern for items of skin clothing that are not flat. The new method is non-destructive, and also accurate and fast. Furthermore, this paper presents an overview of the more traditional methods of pattern documentation and measurement...

  1. The evolution of traditional knowledge:

    DEFF Research Database (Denmark)

    Saslis Lagoudakis, C Haris; Hawkins, Julie A; Greenhill, Simon J; Pendry, Colin A; Watson, Mark F; Tuladhar-Douglas, Will; Baral, Sushim R; Savolainen, Vincent

    2014-01-01

    Traditional knowledge is influenced by ancestry, inter-cultural diffusion and interaction with the natural environment. It is problematic to assess the contributions of these influences independently because closely related ethnic groups may also be geographically close, exposed to similar environments and able to exchange knowledge readily. Medicinal plant use is one of the most important components of traditional knowledge, since plants provide healthcare for up to 80% of the world's population. Here, we assess the significance of ancestry, geographical proximity of cultures and the ... the effects of shared ancestry and geographical proximity. These findings demonstrate the importance of adaptation to local environments, even at small spatial scale, in shaping traditional knowledge during human cultural evolution.

  2. Drilling fluid filter

    Science.gov (United States)

    Hall, David R.; Fox, Joe; Garner, Kory

    2007-01-23

    A drilling fluid filter for placement within a bore wall of a tubular drill string component comprises a perforated receptacle with an open end and a closed end. A hanger for engagement with the bore wall is mounted at the open end of the perforated receptacle. A mandrel is adjacent and attached to the open end of the perforated receptacle. A linkage connects the mandrel to the hanger. The linkage may be selected from the group consisting of struts, articulated struts and cams. The mandrel operates on the hanger through the linkage to engage and disengage the drilling fluid filter from the tubular drill string component. The mandrel may have a stationary portion comprising a first attachment to the open end of the perforated receptacle and a telescoping adjustable portion comprising a second attachment to the linkage. The mandrel may also comprise a top-hole interface for top-hole equipment.

  3. Stochastic stacking without filters

    International Nuclear Information System (INIS)

    The rate of accumulation of antiprotons is a critical factor in the design of p anti p colliders. A design of a system to accumulate higher anti p fluxes is presented here which is an alternative to the schemes used at the CERN AA and in the Fermilab Tevatron I design. Contrary to these stacking schemes, which use a system of notch filters to protect the dense core of antiprotons from the high power of the stack tail stochastic cooling, an eddy current shutter is used to protect the core in the region of the stack tail cooling kicker. Without filters one can have larger cooling bandwidths, better mixing for stochastic cooling, and easier operational criteria for the power amplifiers. In the case considered here a flux of 1.4 x 10^8 per sec is achieved with a 4 to 8 GHz bandwidth.

  4. Assessment of ceramic membrane filters

    Science.gov (United States)

    Ahluwalia, Rajesh K.; Geyer, Howard K.; Im, Kwan H.; Zhu, Chao; Shelleman, David; Tressler, Richard E.

    The objectives of this project are (1) to develop analytical models for evaluating the fluid mechanics of membrane coated, dead-end ceramic filters; and (2) to determine the effects of thermal and thermo-chemical aging on the material properties of emerging ceramic hot gas filters. A honeycomb cordierite monolith with a thin ceramic coating and a rigid candle filter were evaluated.

  5. A new nonlinear filter

    OpenAIRE

    Elliott, Robert J.; Haykin, Simon

    2006-01-01

    A discrete time filter is constructed where both the observation and signal process have non-linear dynamics with additive white Gaussian noise. Using the reference probability framework, a convolution Zakai equation is obtained which updates the unnormalized conditional density. Our work obtains approximate solutions of this equation in terms of Gaussian sums when second order expansions are introduced for the non-linear terms.

  6. Efficient Iterated Filtering

    DEFF Research Database (Denmark)

    Lindstro?m, Erik; Ionides, Edward; Frydendall, Jan; Madsen, Henrik

    2012-01-01

    Parameter estimation in general state space models is not trivial as the likelihood is unknown. We propose a recursive estimator for general state space models, and show that the estimates converge to the true parameters with probability one. The estimates are also asymptotically Cramer-Rao efficient. The proposed estimator is easy to implement as it only relies on non-linear filtering. This makes the framework flexible as it is easy to tune the implementation to achieve computational efficiency...

  7. Factorized Kalman Filtering.

    Czech Academy of Sciences Publication Activity Database

    Suzdaleva, Evgenia

    Praha : ÚTIA AV ČR, 2006 - (Přikryl, J.; Šmídl, V.). s. 51-52 [International PhD Workshop on Interplay of Societal and Technical Decision-Making, Young Generation Viewpoint /7./. 25.09.2006-30.09.2006, Hrubá Skála] R&D Projects: GA MŠk 1M0572; GA ČR GP201/06/P434 Institutional research plan: CEZ:AV0Z10750506 Keywords : state estimation * factorized filters * traffic control Subject RIV: BC - Control Systems Theory

  8. Metamaterial Tunable Filter Design

    OpenAIRE

    Naima Benmostefa; M. MELIANI; H. Ouslimani

    2013-01-01

    This paper presents a new concept to implement a tunable metamaterial filter with dual negative refraction, composed of ferrite slabs and metallic resonators, including split-ring resonators (SRR) and short wire pairs. The ferrite slabs under an applied magnetic bias provide one magnetic resonance frequency band and the metallic resonators provide another one. The continuous wires within the metamaterials provide the negative permittivity in a wide frequency band covering the two magnetic r...

  9. Decayed MCMC Filtering

    OpenAIRE

    Marthi, Bhaskara; Pasula, Hanna; Russell, Stuart; Peres, Yuval

    2012-01-01

    Filtering---estimating the state of a partially observable Markov process from a sequence of observations---is one of the most widely studied problems in control theory, AI, and computational statistics. Exact computation of the posterior distribution is generally intractable for large discrete systems and for nonlinear continuous systems, so a good deal of effort has gone into developing robust approximation algorithms. This paper describes a simple stochastic approximation...

  10. Convolution filters for triangles

    OpenAIRE

    Nicollier, Grégoire

    2014-01-01

    The construction of a new triangle by erecting similar ears on the sides of a given triangle (as in Napoleon's theorem) can be considered as the convolution of the initial triangle with another triangle. We use the discrete Fourier transformation and a shape function to give a complete and explicit description of such convolution filters and their iterates. Our method leads to many old and new results in a very direct way.
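
    The circular-convolution viewpoint is easy to verify numerically; the sketch below treats a triangle as three complex vertices, convolves it with a "kernel triangle", and checks the result against the length-3 DFT route. The midpoint kernel is just an example, not one of the paper's ear constructions.

      import numpy as np

      def convolve_triangles(u, v):
          # Circular convolution of two vertex triples; by the convolution
          # theorem this equals the inverse DFT of the product of the DFTs.
          u, v = np.asarray(u, complex), np.asarray(v, complex)
          direct = np.array([sum(u[j] * v[(k - j) % 3] for j in range(3))
                             for k in range(3)])
          via_dft = np.fft.ifft(np.fft.fft(u) * np.fft.fft(v))
          assert np.allclose(direct, via_dft)
          return direct

      triangle = [0 + 0j, 1 + 0j, 0.3 + 0.8j]
      kernel = [0.5, 0.5, 0.0]          # each new vertex is an edge midpoint
      print(convolve_triangles(triangle, kernel))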

  11. Wire frame filter

    Energy Technology Data Exchange (ETDEWEB)

    Babev, D.A.; Abasov, S.M.

    1981-10-23

    A filter is presented that has a hollow cylindrical support frame with sliding openings arranged uniformly along its perimeter, longitudinal shafts mounted on the frame with their ends rigidly fixed in connection pieces, and a wire wound in a spiral fashion around the support shafts. To increase productivity and simplify fabrication, the support frame is made in the form of a spiral.

  12. Updating the OMERACT filter

    DEFF Research Database (Denmark)

    D'Agostino, Maria-Antonietta; Boers, Maarten; Kirwan, John; van der Heijde, Désirée; Østergaard, Mikkel; Schett, Georg; Landewé, Robert B; Maksymowych, Walter P; Naredo, Esperanza; Dougados, Maxime; Iagnocco, Annamaria; Bingham, Clifton O; Brooks, Peter M; Beaton, Dorcas E; Gandjbakhch, Frederique; Gossec, Laure; Guillemin, Francis; Hewlett, Sarah E; Kloppenburg, Margreet; March, Lyn; Mease, Philip J; Moller, Ingrid; Simon, Lee S; Singh, Jasvinder A; Strand, Vibeke; Wakefield, Richard J; Wells, George A; Tugwell, Peter; Conaghan, Philip G

    2014-01-01

    OBJECTIVE: The Outcome Measures in Rheumatology (OMERACT) Filter provides a framework for the validation of outcome measures for use in rheumatology clinical research. However, imaging and biochemical measures may face additional validation challenges because of their technical nature. The Imaging and Soluble Biomarker Session at OMERACT 11 aimed to provide a guide for the iterative development of an imaging or biochemical measurement instrument so it can be used in therapeutic assessment. METHO...

  13. Filters in topology optimization

    DEFF Research Database (Denmark)

    Bourdin, Blaise

    1999-01-01

    In this article, a modified ("filtered") version of the minimum compliance topology optimization problem is studied. The direct dependence of the material properties on its pointwise density is replaced by a regularization of the density field using a convolution operator. In this setting it is possible to establish the existence of solutions. Moreover, convergence of an approximation by means of finite elements can be obtained. This is illustrated through some numerical experiments. The "fil...
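
    A minimal 1-D density filter of this convolution type is sketched below, with a linear "hat" weighting over a radius r_min measured in element widths; the weighting and radius are illustrative assumptions, not the regularization analysed in the article.

      import numpy as np

      def density_filter_1d(rho, r_min=2.5):
          # Each filtered density is a cone-weighted average of the design
          # densities of neighbouring elements within radius r_min.
          n = rho.size
          filtered = np.empty(n)
          for e in range(n):
              idx = np.arange(max(0, int(e - r_min)), min(n, int(e + r_min) + 1))
              w = np.maximum(0.0, r_min - np.abs(idx - e))   # linear hat weights
              filtered[e] = np.sum(w * rho[idx]) / np.sum(w)
          return filtered

      raw = np.random.rand(20)            # unfiltered element densities
      print(density_filter_1d(raw))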

  14. Multimodal microwave filters

    OpenAIRE

    Contreras Lizarraga, Adrián Arturo

    2013-01-01

    This thesis presents the conception, design and implementation of new topologies of multimodal microwave resonators and filters, using a combination of uniplanar technologies such as coplanar waveguide (CPW), coplanar strips (CPS) and slotlines. The term "multimodal" refers to uniplanar circuits in which the two fundamental modes of the CPW propagate (the even and the odd mode). By using both modes of the CPW, it is possible to achieve added functions, such as additional transmission zeros to...

  15. Superconducting notch filter

    International Nuclear Information System (INIS)

    Results of a preliminary investigation of a superconducting notch filter for possible application in the 2 to 30 MHz high frequency (HF) communication band are presented. The circuit was successfully implemented using planar geometry so that closed cycle refrigeration could be used to cool circuits fabricated from high-Tc Nb3Sn or Nb3Ge thin films. In the present design, circuit Q's of about 2 × 10³ were obtained with 50-ohm source and output impedance

  16. Software filter strategies

    International Nuclear Information System (INIS)

    At an SSC luminosity of 10³³, the hardware triggers should lower the event rate to a few kHz. A farm of microprocessors can then be used to further reduce the rate to the 1 Hz level. This paper uses the proposed D0 filter strategy as an example of what an SSC software filter could look like. A sketch of the current D0 software filter design is given with a detailed example of the muon trigger. The hardware trigger will have imposed cuts on various trigger 'components' (e.g. muon p_T, total E_T, number of jets). The first step is to recalculate each of the components which passed the hardware level, but with an improved resolution due to utilizing more information (calibration constants, corrections, muon drift tube times). The various trigger components can then be correlated. Next, specified regions in other detector subsystems can be inspected; for example, an electron trigger should have a singly ionizing track approximately pointing to it. The processing time for a complete central tracking analysis is prohibitive, but tracking only around the 'electron' can be done. Finally, all components can be used to classify and reject events. Those events which pass this point can then have a complete analysis performed on them

  17. Carbon nanotube filters

    Science.gov (United States)

    Srivastava, A.; Srivastava, O. N.; Talapatra, S.; Vajtai, R.; Ajayan, P. M.

    2004-09-01

    Over the past decade of nanotube research, a variety of organized nanotube architectures have been fabricated using chemical vapour deposition. The idea of using nanotube structures in separation technology has been proposed, but building macroscopic structures that have controlled geometric shapes, density and dimensions for specific applications still remains a challenge. Here we report the fabrication of freestanding monolithic uniform macroscopic hollow cylinders having radially aligned carbon nanotube walls, with diameters and lengths up to several centimetres. These cylindrical membranes are used as filters to demonstrate their utility in two important settings: the elimination of multiple components of heavy hydrocarbons from petroleum-a crucial step in post-distillation of crude oil-with a single-step filtering process, and the filtration of bacterial contaminants such as Escherichia coli or the nanometre-sized poliovirus (~25 nm) from water. These macro filters can be cleaned for repeated filtration through ultrasonication and autoclaving. The exceptional thermal and mechanical stability of nanotubes, and the high surface area, ease and cost-effective fabrication of the nanotube membranes may allow them to compete with ceramic- and polymer-based separation membranes used commercially.

  18. Controlling flow conditions of test filters in iodine filters

    International Nuclear Information System (INIS)

    Several different iodine filter and test filter designs and experience gained from their operation are presented. For the flow experiments, an iodine filter system equipped with flow regulating and measuring devices was built. In the experiments the influence of the packing method of the iodine sorption material and the influence of the flow regulating and measuring devices upon the flow conditions in the test filters were studied. On the basis of the experiments it has been shown that the flows through the test filters can always be adjusted to a correct value provided there is a high enough pressure difference available across the test filter ducting. As a result of the research, several different methods are presented with which the flows through the test filters in both operating and future iodine sorption systems can easily be measured and adjusted to their correct values. (author)

  19. Improvement in birefringent filters. 4: The alternate partial polarizer filter.

    Science.gov (United States)

    Title, A M

    1976-11-01

    A design for a birefringent filter is proposed in which alternate polarizers are partial polarizers. Calculated performance characteristics of alternate partial polarizer filters (APP) are compared with those of Lyot and contrast element Lyot filters. These calculations show that the APP design has significant advantages in both transmission and profile shape. Using pulse techniques, partial polarizer systems are shown to be a natural evolution from the standard Lyot and contrast element Lyot systems. The APP filter using achromatic waveplates discussed in earlier papers of this series has been used to construct a universal alternate partial polarizer filter. This filter has a measured full width at half-maximum (FWHM) of 0.09 Å at 5500 Å and a transmission in polarized light of 38%. It is tunable from 4500 Å to 8500 Å. The measured characteristics of the filter agree well with theoretical predictions. PMID:20165504

  20. Manufacturing a low-cost ceramic water filter and filter system for the elimination of common pathogenic bacteria

    Science.gov (United States)

    Simonis, J. J.; Basson, A. K.

    Africa is one of the most water-scarce continents in the world but it is the lack of potable water which results in diarrhoea being the leading cause of death amongst children under the age of five in Africa (696 million children under 5 years old in Africa contract diarrhoea resulting in 2000 deaths per day: WHO and UNICEF, 2009). Most potable water treatment methods use bulk water treatment not suitable or available to the majority of rural poor in Sub-Saharan Africa. One simple but effective way of making sure that water is of good quality is by purifying it by means of a household ceramic water filter. The making and supply of water filters suitable for the removal of suspended solids, pathogenic bacteria and other toxins from drinking water is therefore critical. A micro-porous ceramic water filter with micron-sized pores was developed using the traditional slip casting process. This locally produced filter has the advantage of making use of less raw materials, cost, labour, energy and expertise and being more effective and efficient than other low cost produced filters. The filter is fitted with a silicone tube inserted into a collapsible bag that acts as container and protection for the filter. Enhanced flow is obtained through this filter system. The product was tested using water inoculated with high concentrations of different bacterial cultures as well as with locally polluted stream water. The filter is highly effective (log10 > 4 with 99.99% reduction efficiency) in providing protection from bacteria and suspended solids found in natural water. With correct cleaning and basic maintenance this filter technology can effectively provide drinking water to rural families affected by polluted surface water sources. This is an African solution for the more than 340 million people in Africa without access to clean drinking water (WHO and UNICEF, 2008).

  1. Active Learning versus Traditional Teaching

    Directory of Open Access Journals (Sweden)

    L.A. Azzalis

    2009-05-01

    In traditional teaching most of the class time is spent with the professor lecturing and the students watching and listening. The students work individually, and cooperation is discouraged. On the other hand, active learning shifts the focus of activity from the teacher to the learners: students solve problems, answer questions, formulate questions of their own, discuss, explain, and debate during class; moreover, students work in teams on problems and projects under conditions that assure positive interdependence and individual accountability. Although student-centered methods have repeatedly been shown to be superior to the traditional teacher-centered approach to instruction, the literature regarding the efficacy of various teaching methods is inconclusive. The purpose of this study was to compare student perceptions of course and instructor effectiveness, course difficulty, and amount learned between the active learning and lecture sections in Health Sciences courses, using statistical data from Anhembi Morumbi University. Results indicated a significant difference between active learning and traditional teaching. Our conclusion was that strategies promoting active learning, compared with traditional lectures, could increase knowledge and understanding.

  2. Editorial: Between Tradition and Modernity

    OpenAIRE

    Editor Al-Jami'ah: Journal of Islamic Studies

    2011-01-01

    The relationship between religious ‘tradition’ and ‘modernity’ is a central theme in various academic debates. Among the most heatedly debated topics is religious identity in the face of constant political, economic, and global change. As with other religious communities, Muslims have to respond to these changes, on the one hand, and to the call for preserving their religious identity, on the other.

  3. Traditional Teacher Education Still Matters

    Science.gov (United States)

    Jacobs, Nick

    2013-01-01

    Fresh from teaching his first full school year the author reflects on his traditional teacher preparation path into the classroom and finds he was instilled with a common sense of ethics, compassion, a demand for reflective practice, and a robust guiding philosophy. As a college student, he learned theory and was able to augment that with…

  4. Digital filters basics and design

    CERN Document Server

    Schlichtharle, Dietrich

    2011-01-01

    The second, strongly enlarged edition of the textbook gives a substantial insight into the characteristics and the design of digital filters. It briefly introduces to the theory of continuous-time systems and the design methods for analog filters. Time-discrete systems, the basic structures of digital filters, sampling theorem, and the design of IIR filters are widely discussed. The author devotes important parts to the design of non-recursive filters and the effects of finite register length. The explanation of techniques like oversampling and noise shaping conclude the book. The author has s

  5. Three-Dimensional Shuman Filter.

    Science.gov (United States)

    Nelson, Stephan P.; Weible, Michael L.

    1980-04-01

    Equations for the three-dimensional Shuman filter and its response function are presented. The filter heavily dampens even fairly long waves, but this can be alleviated somewhat by a second amplifying pass (tandem filter). Due to the number of points (27) needed to filter the data, irregular boundaries and missing data can be a problem. The effects of one possible solution (reverting to three one-dimensional filters) are shown. In some instances it is preferable to make only one pass to prevent propagation of boundary discontinuities.
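
    As a hedged illustration of the smoothing-plus-amplifying idea described above, the sketch below applies the usual one-dimensional form of the Shuman operator (an assumed simplification; the 27-point three-dimensional stencil of the paper is not reproduced), leaving boundary points unchanged as one simple treatment of the boundary problem mentioned in the abstract.

        import numpy as np

        def shuman_pass(field, s):
            """One 1-D Shuman pass: f_i + (s/2) * (f_(i-1) + f_(i+1) - 2*f_i).

            s > 0 smooths (it removes two-gridpoint waves exactly when s = 0.5);
            s < 0 is the amplifying pass. Boundary points are copied through unchanged.
            """
            out = field.copy()
            out[1:-1] = field[1:-1] + 0.5 * s * (field[:-2] + field[2:] - 2.0 * field[1:-1])
            return out

        def tandem_filter(field, s_smooth=0.5, s_amplify=-0.5):
            """Smoothing pass followed by an amplifying pass (tandem filter)."""
            return shuman_pass(shuman_pass(field, s_smooth), s_amplify)

        x = np.linspace(0.0, 2.0 * np.pi, 64)
        signal = np.sin(x) + 0.3 * (-1.0) ** np.arange(x.size)  # long wave + 2-gridpoint noise
        filtered = tandem_filter(signal)
        # In the interior the two-gridpoint noise is removed while the long wave is
        # left almost untouched; only points next to the fixed boundaries retain noise.
        print(np.abs(filtered[2:-2] - np.sin(x)[2:-2]).max())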

  6. A Short Note on t-filters, II-filters and Extended Filters on Residuated Lattices.

    Czech Academy of Sciences Publication Activity Database

    Víta, Martin

    2015-01-01

    Roč. 271, 15 July (2015), s. 168-171. ISSN 0165-0114 R&D Projects: GA ČR GAP202/10/1826 Institutional support: RVO:67985807 Keywords : t-filters * II-filters * extended filters * residuated lattices Subject RIV: BA - General Mathematics Impact factor: 1.986, year: 2014

  7. Filter element in analysis apparatus

    International Nuclear Information System (INIS)

    This invention is concerned with the continuous-flow analysis of liquids, and more particularly, with the separation of mixtures of solid and liquid phases. The separator is comprised of a first conduit through which the mixture is passed and a lateral second conduit joined to the first conduit via a filter. Liquid is pumped from the first conduit through the filter into the second conduit, solid phase being retained by the filter. Means for backwashing the filter is provided. In continuous-flow analysis, the mixture in the first conduit is segmented by air or air and wash liquid, and the backflow washing is controlled to occur when an air or wash flow segment in the first conduit is in contact with the filter. The filter may be of sintered glass particles. The liquid phase is directed on to the filter via a special protuberance to facilitate and accelerate separation. (author)

  8. Optimization of HEPA filter design

    International Nuclear Information System (INIS)

    Often high efficiency filters are associated with high pressure drop, particularly at high airflow velocity. With design optimization, a cost effective design of clean room can be achieved by lowering energy cost or reducing the size of the filter housing. Through mathematical analysis, optimum filter designs are obtained for the separator type HEPA filter. In the mathematical analysis, a similarity solution obtained from the Navier-Stokes equations for airflow between the filter pleats with uniform mass addition and extraction is applied to each finite element along the pleat channel. The optimum pleat aspect ratio is obtained by combining the expressions for the axial pressure gradient for the upstream channel with mass extraction, the axial pressure gradient for the downstream channel with mass addition, and the filter media flow characteristics for finite elements along the filter pleat channel with varying wall airflow rate

  9. Filter for reactor emergency cooling system

    International Nuclear Information System (INIS)

    The invention describes the design of a filter for the emergency cooling system. The new type of filter can be rinsed by flushing water backwards through the filter. The arrangement prevents the filter from becoming silted up

  10. A New Stateless Packet Classification and Filter against DoS Attacks

    Directory of Open Access Journals (Sweden)

    Guang Jin

    2014-02-01

    Capabilities are a typical scheme for stateless filtering. In order to classify and filter packets effectively, a novel capabilities-based scheme for packet classification and filtering is proposed in this paper. In our scheme, a new classifier module is added and a new filter structure is designed. We employ capabilities for verification and introduce a new authorization step into the communications. These innovations make packet classification effective in attack scenarios. Experimental results based on large-scale topology datasets and NS2 show that our scheme outperforms traditional packet classification algorithms, especially in complex network environments.

  11. Design and Implementation for a Non Linear State Filter for LEO Micro Satellite

    Directory of Open Access Journals (Sweden)

    S. Chouraqui

    2009-01-01

    This study preliminarily investigates the numerical application of both the Extended Kalman Filter (EKF), which has traditionally been used for nonlinear estimation, and a relatively new filter, the Unscented Kalman Filter (UKF), to the nonlinear estimation problem. The new method can be applied to nonlinear systems without the linearization required by the EKF and does not demand a Gaussian distribution of noise; moreover, its ease of implementation and more accurate estimation allow it to demonstrate good performance. The experimental results and analysis presented here indicate that the UKF shows better performance in the presence of severe nonlinearity in the state equations.
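
    A hedged, minimal sketch of the core difference the record describes, for a scalar nonlinearity: the unscented transform propagates a small set of sigma points instead of linearizing. The weights follow the standard scaled unscented-transform formulas, but the parameter values, the test function and the Monte Carlo check are illustrative assumptions, not the satellite state model used in the study.

        import numpy as np

        def unscented_moments(mean, var, f, alpha=1.0, beta=0.0, kappa=2.0):
            """Propagate a scalar Gaussian (mean, var) through f with the unscented transform."""
            n = 1
            lam = alpha ** 2 * (n + kappa) - n
            spread = np.sqrt((n + lam) * var)
            sigmas = np.array([mean, mean + spread, mean - spread])
            wm = np.array([lam / (n + lam), 0.5 / (n + lam), 0.5 / (n + lam)])
            wc = wm.copy()
            wc[0] += 1.0 - alpha ** 2 + beta
            y = f(sigmas)
            y_mean = np.dot(wm, y)
            y_var = np.dot(wc, (y - y_mean) ** 2)
            return y_mean, y_var

        def linearized_moments(mean, var, f, df):
            """EKF-style first-order propagation using the Jacobian df."""
            return f(mean), df(mean) ** 2 * var

        f = lambda x: np.sin(x) + 0.1 * x ** 2
        df = lambda x: np.cos(x) + 0.2 * x

        print("UT :", unscented_moments(1.0, 0.5, f))
        print("EKF:", linearized_moments(1.0, 0.5, f, df))
        samples = f(np.random.default_rng(0).normal(1.0, np.sqrt(0.5), 200_000))
        print("MC :", (samples.mean(), samples.var()))   # Monte Carlo reference moments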

  12. Toward Green Cloud Computing: An Attribute Clustering Based Collaborative Filtering Method for Virtual Machine Migration

    Directory of Open Access Journals (Sweden)

    Zhang Liu-Mei

    2013-01-01

    In this study, an attribute-clustering-based collaborative filtering algorithm is described for virtual machine migration towards green cloud computing. The algorithm uses the similarity characteristics of virtual machine task-related attributes, especially CPU-related attributes, to filter out redundant data through feature selection, and then applies K-means clustering to address the rating-scale problems of the traditional collaborative filtering recommendation algorithm. Experiments use virtual machine task-related information for clustering the data. By integrating a scaled rating scheme on task-related properties with the collaborative filtering approach, the method provides migration recommendations for system administrators.

  13. Fast algorithm of the robust Gaussian regression filter for areal surface analysis

    International Nuclear Information System (INIS)

    In this paper, the general model of the Gaussian regression filter for areal surface analysis is explored. The intrinsic relationships between the linear Gaussian filter and the robust filter are addressed. A general mathematical solution for this model is presented. Based on this technique, a fast algorithm is created. Both simulated and practical engineering data (stochastic and structured) have been used in the testing of the fast algorithm. Results show that with the same accuracy, the processing time of the second-order nonlinear regression filters for a dataset of 1024*1024 points has been reduced to several seconds from the several hours of traditional algorithms
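
    As a hedged point of reference for the model discussed above (not the paper's fast algorithm), the sketch below is a brute-force one-dimensional zero-order Gaussian regression filter with a simple iterative reweighting step for robustness; the Gaussian weighting constant follows a common ISO-style convention, while the cutoff, iteration count and Tukey tuning constant are illustrative choices.

        import numpy as np

        def gaussian_weights(x, x0, cutoff):
            """Gaussian weighting function centred at x0 (ISO-style constant alpha)."""
            alpha = np.sqrt(np.log(2.0) / np.pi)
            return np.exp(-np.pi * ((x - x0) / (alpha * cutoff)) ** 2)

        def robust_gaussian_regression(x, z, cutoff, n_iter=3, c=4.4):
            """Zero-order robust Gaussian regression filter (brute force, O(N^2))."""
            delta = np.ones_like(z)                    # robustness weights, start uniform
            mean_line = np.zeros_like(z)
            for _ in range(n_iter):
                for i, x0 in enumerate(x):
                    w = gaussian_weights(x, x0, cutoff) * delta
                    mean_line[i] = np.sum(w * z) / np.sum(w)
                r = z - mean_line                      # residuals against the current mean line
                scale = np.median(np.abs(r)) / 0.6745 + 1e-12
                u = r / (c * scale)
                delta = np.where(np.abs(u) < 1.0, (1.0 - u ** 2) ** 2, 0.0)  # Tukey biweight
            return mean_line

        x = np.linspace(0.0, 10.0, 500)
        z = 0.5 * np.sin(2.0 * np.pi * x / 8.0) + 0.05 * np.random.default_rng(0).standard_normal(x.size)
        z[100] += 2.0                                  # an outlier, e.g. a dust particle
        waviness = robust_gaussian_regression(x, z, cutoff=2.5)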

  14. OPTIMIZATION OF ADVANCED FILTER SYSTEMS

    Energy Technology Data Exchange (ETDEWEB)

    R.A. Newby; G.J. Bruck; M.A. Alvin; T.E. Lippert

    1998-04-30

    Reliable, maintainable and cost effective hot gas particulate filter technology is critical to the successful commercialization of advanced, coal-fired power generation technologies, such as IGCC and PFBC. In pilot plant testing, the operating reliability of hot gas particulate filters have been periodically compromised by process issues, such as process upsets and difficult ash cake behavior (ash bridging and sintering), and by design issues, such as cantilevered filter elements damaged by ash bridging, or excessively close packing of filtering surfaces resulting in unacceptable pressure drop or filtering surface plugging. This test experience has focused the issues and has helped to define advanced hot gas filter design concepts that offer higher reliability. Westinghouse has identified two advanced ceramic barrier filter concepts that are configured to minimize the possibility of ash bridge formation and to be robust against ash bridges should they occur. The ''inverted candle filter system'' uses arrays of thin-walled, ceramic candle-type filter elements with inside-surface filtering, and contains the filter elements in metal enclosures for complete separation from ash bridges. The ''sheet filter system'' uses ceramic, flat plate filter elements supported from vertical pipe-header arrays that provide geometry that avoids the buildup of ash bridges and allows free fall of the back-pulse released filter cake. The Optimization of Advanced Filter Systems program is being conducted to evaluate these two advanced designs and to ultimately demonstrate one of the concepts in pilot scale. In the Base Contract program, the subject of this report, Westinghouse has developed conceptual designs of the two advanced ceramic barrier filter systems to assess their performance, availability and cost potential, and to identify technical issues that may hinder the commercialization of the technologies. A plan for the Option I, bench-scale test program has also been developed based on the issues identified. The two advanced barrier filter systems have been found to have the potential to be significantly more reliable and less expensive to operate than standard ceramic candle filter system designs. Their key development requirements are the assessment of the design and manufacturing feasibility of the ceramic filter elements, and the small-scale demonstration of their conceptual reliability and availability merits.

  15. Transversal filters with charge transfer devices

    International Nuclear Information System (INIS)

    Analog sampled-data transversal filters can be realized using charge transfer devices. Low pass, bandpass and various matched filters have been fabricated using this technology. In addition, these transversal filters can be used to obtain the discrete Fourier transform via the chirp Z algorithm. The state of the art in charge transfer transversal filters is reviewed. Applications of these filters to low pass filtering, matched filtering and spectrum analysis are described. These transversal filters are uniquely suited to providing an optimum filter for nuclear radiation spectrometers. The design of a charge transfer transversal filter for filtering the pulse obtained from a silicon or germanium nuclear detector is described

  16. Demonstration of polarization-independent resonant subwavelength grating filter arrays.

    Science.gov (United States)

    Peters, D W; Boye, R R; Wendt, J R; Kellogg, R A; Kemme, S A; Carter, T R; Samora, S

    2010-10-01

    We demonstrate a two-dimensional (2D) polarization-independent resonant subwavelength grating (RSG) in a filter array. RSGs, also called guided mode resonant filters, are traditionally one-dimensional gratings; however, this leads to TE and TM resonances at different wavelengths and with different spectral shape. A 2D grating can remove the polarization dependence at normal incidence, while maintaining the desirable RSG properties of high reflectivity, narrow passband, and low sidebands without ripple. We designed and fabricated 2D gratings with near-identical responses for both polarizations at normal incidence in the telecommunication band. Ninety percent reflectivity is achieved at the resonant wavelengths. PMID:20890333

  17. Direct analysis of air filter samples for alpha emitting isotopes

    International Nuclear Information System (INIS)

    The traditional method for determination of alpha emitting isotopes on air filters has been to process the samples by radiochemical methods. However, this method is too slow for incidents involving radioactive materials, where prompt determination of the dose received by personnel is required. A method is developed to directly analyze the air filters taken from personal and area air monitors. Site knowledge is used in combination with alpha spectral information to identify isotopes. A mathematical function is developed to estimate the activity for each isotope. The strengths and weaknesses of the method are discussed

  18. Initial results of a new generation dual source CT system using only an in-plane comb filter for ultra-high resolution temporal bone imaging

    International Nuclear Information System (INIS)

    To prospectively evaluate radiation dose and image quality of a third generation dual-source CT (DSCT) without z-axis filter behind the patient for temporal bone CT. Forty-five patients were examined on either a first, second, or third generation DSCT in an ultra-high-resolution (UHR) temporal bone-imaging mode. On the third generation DSCT system, the tighter focal spot of 0.2 mm² removes the necessity for an additional z-axis filter, leading to an improved z-axis radiation dose efficiency. Images of 0.4 mm were reconstructed using a standard filtered-back-projection or iterative reconstruction (IR) technique for previous generations of DSCT and a novel IR algorithm for the third generation DSCT. Radiation dose and image quality were compared between the three DSCT systems. The statistically significantly highest subjective and objective image quality was found for the third generation DSCT when compared to the first or second generation DSCT systems (all p < 0.05). Total effective dose was 63 %/39 % lower for the third generation examination as compared to the first and second generation DSCT. Temporal bone imaging without a z-axis UHR filter and with a novel third generation IR algorithm allows for significantly higher image quality while lowering effective dose when compared to the first two generations of DSCTs. (orig.)

  19. Initial results of a new generation dual source CT system using only an in-plane comb filter for ultra-high resolution temporal bone imaging

    Energy Technology Data Exchange (ETDEWEB)

    Meyer, Mathias; Haubenreisser, Holger; Schoenberg, Stefan O.; Henzler, Thomas [Heidelberg University, Institute of Clinical Radiology and Nuclear Medicine, University Medical Center Mannheim, Medical Faculty Mannheim, Mannheim (Germany); Raupach, Rainer; Schmidt, Bernhard; Leidecker, Christianne; Allmendinger, Thomas; Flohr, Thomas [Siemens Healthcare, Imaging and Therapy Division, Forchheim (Germany); Lietzmann, Florian; Schad, Lothar R. [Heidelberg University, Computer Assisted Clinical Medicine, University Medical Center Mannheim, Medical Faculty Mannheim, Mannheim (Germany)

    2015-01-15

    To prospectively evaluate radiation dose and image quality of a third generation dual-source CT (DSCT) without z-axis filter behind the patient for temporal bone CT. Forty-five patients were examined on either a first, second, or third generation DSCT in an ultra-high-resolution (UHR) temporal bone-imaging mode. On the third generation DSCT system, the tighter focal spot of 0.2 mm² removes the necessity for an additional z-axis filter, leading to an improved z-axis radiation dose efficiency. Images of 0.4 mm were reconstructed using a standard filtered-back-projection or iterative reconstruction (IR) technique for previous generations of DSCT and a novel IR algorithm for the third generation DSCT. Radiation dose and image quality were compared between the three DSCT systems. The statistically significantly highest subjective and objective image quality was found for the third generation DSCT when compared to the first or second generation DSCT systems (all p < 0.05). Total effective dose was 63 %/39 % lower for the third generation examination as compared to the first and second generation DSCT. Temporal bone imaging without a z-axis UHR filter and with a novel third generation IR algorithm allows for significantly higher image quality while lowering effective dose when compared to the first two generations of DSCTs. (orig.)

  20. Factorized Kalman Filtering.

    Czech Academy of Sciences Publication Activity Database

    Suzdaleva, Evgenia

    Praha : ÚTIA AV ČR, 2006 - (Přikryl, J.; Andrýsek, J.; Šmídl, V.), s. 226-233 [International PhD Workshop on Interplay of Societal and Technical Decision-Making, Young Generation Viewpoint /7./. Hrubá Skála (CZ), 25.09.2006-30.09.2006] R&D Projects: GA MŠk 1M0572; GA ČR(CZ) GP201/06/P434 Grant ostatní: project TED ESF Institutional research plan: CEZ:AV0Z10750506 Keywords : state estimation * factorized filters * traffic control Subject RIV: BC - Control Systems Theory

  1. Charcoal filter testing

    Energy Technology Data Exchange (ETDEWEB)

    Lyons, J. [Nuclear Regulatory Commission, Washington, DC (United States)

    1997-08-01

    In this very brief, informal presentation, a representative of the US Nuclear Regulatory Commission outlines some problems with charcoal filter testing procedures and actions being taken to correct the problems. Two primary concerns are addressed: (1) the process to find the test method is confusing, and (2) the requirements of the reference test procedures result in condensation on the charcoal and cause the test to fail. To address these problems, emergency technical specifications were processed for three nuclear plants. A generic or an administrative letter is proposed as a more permanent solution. 1 fig.

  2. Air Sampling Filter

    Science.gov (United States)

    1980-01-01

    General Metal Works' Accu-Vol is a high-volume air sampling system used by many government agencies to monitor air quality for pollution control purposes. To prevent possible test-invalidating contamination from materials other than particulate pollutants, caused by manual handling or by penetration of windblown matter during transit, a cassette was developed in which the filter is sealed within a metal frame and protected in transit by a snap-on aluminum cover, so that it is handled only under clean conditions in the laboratory.

  3. Adaptive projective filters

    International Nuclear Information System (INIS)

    A new approach to solving the track finding problem is proposed. The method is based on Discrete Projective Transformations (DPT) and Least Squares Fitting (LSF) and uses information feedback in tracing linear or quadratic track segments (TS). A fast recurrent algorithm, stable with respect to measurement errors and background points, is suggested. The algorithm realizes a family of digital adaptive projective filters (APF) with known nonlinear weight functions (projective invariants). APF can be used in adequate control systems for collection, processing and compression of data, including tracking problems for a wide class of detectors. 10 refs.; 9 figs

  4. Fantastic filters of lattice implication algebras

    OpenAIRE

    Young Bae Jun

    2000-01-01

    The notion of a fantastic filter in a lattice implication algebra is introduced, and the relations among filter, positive implicative filter, and fantastic filter are given. We investigate an equivalent condition for a filter to be fantastic, and state an extension property for fantastic filter.

  5. Vertical media bed filter and method of cleaning filter panels

    International Nuclear Information System (INIS)

    A vertical media bed dust collector in which the media bed of a filter panel is rejuvenated when necessary by interrupting the gas flow through the panel, withdrawing the filter media from the panel, separating the agglomerated dust from the filter media, returning the filter media to the filter panel, and reestablishing the gas flow through the panel. The system further includes apparatus for removing collected dust from the separating and recirculating surfaces of the media handling apparatus and also from the remote face of the filter panels before the cleaned gas is allowed to pass out of the collector, so that the cleaned gas is not recontaminated by small amounts of dust adhering to those surfaces

  6. Radiopasteurization of traditional herbal medicine

    International Nuclear Information System (INIS)

    An investigation of the effects of irradiation at a pasteurization dose of 500 krad (5 kGy) on microbes contaminating traditional herbal medicine, produced by 3 large manufacturers in Indonesia, was carried out. Storage effects on the microbial count and moisture content of traditional herbal medicine packed in microbe-tight packages were also observed. The results showed that initial bacterial counts varied between 10⁴ and 10⁸ per gram, and mould and yeast counts varied between 0 and 10⁵ per gram. These numbers decreased by as much as 2 to 5 log cycles after irradiation with 500 krad. After 6 months of storage, bacterial counts of irradiated samples had decreased by as much as 0 to 10³ per gram. Initial moisture content varied from 5 to 12%, and after 6 months of storage the moisture content of most samples had increased by as much as 0 to 5%. Irradiated samples were found to be mould free, and most of the surviving microbes consisted of spore forming aerobic bacteria and yeast. (author)

  7. Post-traditional corporate governance

    OpenAIRE

    Mason, Michael; O'Mahony, Joan

    2007-01-01

    Traditional definitions of corporate governance are narrow, focusing on legal relations between managers and shareholders. More recent definitions extend the boundaries of governance to consider the role that various stakeholders play in shaping the behaviour of firms. While stakeholding theory embraces a broader set of corporate constituencies, our argument in this paper is that even these definitions are too narrow – they lack the analytical capacity to account for the social embeddedness a...

  8. Two Thoughts about Traditional Knowledge

    OpenAIRE

    Fisher, William W

    2007-01-01

    Fisher examines traditional knowledge alongside the ideas of environmentalism and the public domain by presenting two related themes involving the British colonization of Native America: the devaluing of the Indians' nonacquisitive, natural, respectful way of living lightly upon the land while conserving it, and the fostering of imperialism and unjust conquest. Among other things, he formulates three parallel provisions to the TRIPS Agreement to increase the leverage of the countries in determining ...

  9. Traditional grasslands : Conservation measures needed?

    OpenAIRE

    Treuren, R., van; Visser, L.

    2006-01-01

    A number of temperate grasses and legumes, important for animal feeding, have their centre of diversity in the North-West European region, including perennial ryegrass (Lolium perenne L.; Engels raaigras), white clover (Trifolium repens L.; witte klaver) and Kentucky bluegrass (Poa pratensis L.; veldbeemdgras). These species traditionally occur in Dutch grasslands where they can be considered as typical. Undisturbed grasslands that are still in agricultural use have become severely reduced in...

  10. TRADITIONAL FERMENTED FOODS OF LESOTHO

    OpenAIRE

    Gadaga, Tendekayi H; Molupe Lehohla; Victor Ntuli

    2013-01-01

    This paper describes the traditional methods of preparing fermented foods and beverages of Lesotho. Information on the preparation methods was obtained through a combination of literature review and face to face interviews with respondents from Roma in Lesotho. An unstructured questionnaire was used to capture information on the processes, raw materials and utensils used. Four products: motoho (a fermented porridge), Sesotho (a sorghum based alcoholic beverage), hopose (sorghum fermented beer...

  11. Ginseng in Traditional Herbal Prescriptions

    OpenAIRE

    Park, Ho Jae; Kim, Dong Hyun; Park, Se Jin; Kim, Jong Min; Ryu, Jong Hoon

    2012-01-01

    Panax ginseng Meyer has been widely used as a tonic in traditional Korean, Chinese, and Japanese herbal medicines and in Western herbal preparations for thousands of years. In the past, ginseng was very rare and was considered to have mysterious powers. Today, the efficacy of drugs must be tested through well-designed clinical trials or meta-analyses, and ginseng is no exception. In the present review, we discuss the functions of ginseng described in historical documents and describe how thes...

  12. Symmetry in Traditional Persian Poetry

    OpenAIRE

    BEHNEJAD, S. ALIREZA; Zahedi, Maryam

    2010-01-01

    A great many Persian poems have been composed by many famous or obscure poets throughout the centuries which Persians have learned, memorized and recited throughout their lives. Regardless of their meaning, there are other aspects that make learning these poems simple and pleasant. It seems that the rhythm in traditional Persian poems is an important factor that makes it possible for non-Persian speaking people to enjoy Persian poems. As Marco Polo writes in his itinerary: ‘Persians are peop...

  13. Traditional communities, multinationals and biodiversity

    OpenAIRE

    Greissing, Anna; Le Tourneau, François-Michel

    2009-01-01

    The Iratapuru Sustainable Development Reserve is mainly exploited by the community of the São Francisco village. Due to its efforts to organize its members around a production co-operative and to improve their standard of living, but also as a result of massive funding from local and international institutions, this community has become a symbol for the actions of sustainable development undertaken with and in the benefit of « traditional » communities living in protected areas in the Amazon ...

  14. From Microwave Filter to Digital Filter and Back Again

    DEFF Research Database (Denmark)

    Dalby, Arne Brejning

    1989-01-01

    A new very simple state variable flow graph representation for interdigital transmission line bandpass filters is presented, which has led to two important results: 1) A new type of digital filter with properties that surpass those of most other (all-pole) digital filter types. 2) The study of the new digital filter type has led to design formulas for interdigital transmission line filters that are very simple compared to the hitherto known formulas. The accuracy is the same or better.

  15. From Microwave Filter to Digital Filter and Back Again

    OpenAIRE

    Dalby, Arne Brejning

    2010-01-01

    A new very simple state variable flow graph representation for interdigital transmission line bandpass filters is presented, which has led to two important results: 1) A new type of digital filter with properties that surpass those of most other (all-pole) digital filter types. 2) The study of the new digital filter type has led to design formulas for interdigital transmission line filters that are very simple compared to the hitherto known formulas. The accuracy is the same or better.

  16. Health traditions of Sikkim Himalaya

    Directory of Open Access Journals (Sweden)

    Ashok Kumar Panda

    2010-01-01

    Ancient medical systems are still prevalent in Sikkim, popularly nurtured by Buddhist groups using the traditional Tibetan pharmacopoeia overlapping with Ayurvedic medicine. Traditional medical practices and their associated cultural values are based around Sikkim's three major communities, Lepcha, Bhutia and Nepalis. In this study, a semi-structured questionnaire was prepared for folk healers covering age and sex, educational qualification, source of knowledge, types of practices, experience and generation of practice, and transformation of knowledge. These were administered to forty-eight folk healers identified in different parts of Sikkim. 490 medicinal plants find their habitats in Sikkim because of its large variations in altitude and climate. For 31 plants commonly used by these folk healers, we present botanical name, family, local name, distribution, and parts used, together with their therapeutic uses, mostly rheumatoid arthritis, gout, gonorrhea, fever, viral flu, asthma, cough and cold, indigestion, jaundice etc. A case treated by a folk healer is also recounted. This study indicates that, in the studied area, Sikkim's health traditions and folk practices are declining due to shifts in socio-economic patterns and the unwillingness of the younger generation to adopt folk healing as a profession.

  17. Kalman Filtered Compressed Sensing

    CERN Document Server

    Vaswani, Namrata

    2008-01-01

    We consider the problem of reconstructing time sequences of spatially sparse signals (with unknown and time-varying sparsity patterns) from a limited number of linear "incoherent" measurements, in real-time. The signals are sparse in some transform domain referred to as the sparsity basis. For a single spatial signal, the solution is provided by Compressed Sensing (CS). The question that we address is, for a sequence of sparse signals, can we do better than CS, if (a) the sparsity pattern of the signal's transform coefficients' vector changes slowly over time, and (b) a simple prior model on the temporal dynamics of its current non-zero elements is available. The overall idea of our solution is to use CS to estimate the support set of the initial signal's transform vector. At future times, run a reduced order Kalman filter with the currently estimated support and estimate new additions to the support set by applying CS to the Kalman innovations or filtering error (whenever it is "large").
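
    A heavily simplified sketch of the flow described above: a reduced-order Kalman filter runs on the currently estimated support, and a CS-like step extends the support whenever the filtering error grows. Everything here is an illustrative assumption: the random measurement matrix, the random-walk dynamics, the thresholds, and especially the crude correlation-thresholding detector standing in for a real l1-minimisation step.

        import numpy as np

        rng = np.random.default_rng(0)
        n, m = 60, 30                                   # signal length, number of measurements
        A = rng.standard_normal((m, n)) / np.sqrt(m)    # measurement matrix (illustrative)
        sigma, q = 0.01, 0.01                           # measurement / process noise levels
        true_support = [5, 17, 42]

        def detect_new_support(residual, thresh=0.45):
            """Stand-in for the CS step: flag coordinates whose back-projected residual
            is large. A real l1 solver would be far more selective; any spurious index
            added here simply gets its estimate driven towards zero by the filter."""
            return set(np.flatnonzero(np.abs(A.T @ residual) > thresh))

        x_true = np.zeros(n)
        x_true[true_support] = [1.0, -0.8, 0.6]
        x_hat, P, support = np.zeros(n), np.zeros((n, n)), set()

        for t in range(20):
            x_true[true_support] += np.sqrt(q) * rng.standard_normal(len(true_support))
            y = A @ x_true + sigma * rng.standard_normal(m)

            residual = y - A @ x_hat                     # filtering error
            if np.linalg.norm(residual) > 3.0 * sigma * np.sqrt(m):
                for i in detect_new_support(residual) - support:
                    P[i, i] = 1.0                        # large prior variance for new entries
                    support.add(i)

            T = sorted(support)                          # reduced-order KF on the support
            if T:
                A_T = A[:, T]
                P_T = P[np.ix_(T, T)] + q * np.eye(len(T))        # random-walk prediction
                S = A_T @ P_T @ A_T.T + sigma ** 2 * np.eye(m)    # innovation covariance
                K = np.linalg.solve(S, A_T @ P_T).T               # Kalman gain
                x_hat[T] += K @ (y - A_T @ x_hat[T])
                P[np.ix_(T, T)] = (np.eye(len(T)) - K @ A_T) @ P_T

        print("estimated support:", sorted(support), "true:", true_support)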

  18. Nanoparticle optical notch filters

    Science.gov (United States)

    Kasinadhuni, Pradeep Kumar

    Developing novel light blocking products involves the design of a nanoparticle optical notch filter, working on the principle of localized surface plasmon resonance (LSPR). These light blocking products can be used in many applications. One such application is to naturally reduce migraine headaches and light sensitivity. Melanopsin ganglion cells present in the retina of the human eye connect to the suprachiasmatic nucleus (SCN, the body's clock) in the brain, where they participate in the entrainment of the circadian rhythms. As the melanopsin ganglion cells are involved in triggering migraine headaches in photophobic patients, it is necessary to block the part of the visible spectrum that activates these cells. It is observed from the action potential spectrum of the ganglion cells that they absorb light ranging from 450-500 nm (the blue-green part of the visible spectrum) with a λmax (peak sensitivity) of around 480 nm. Currently prescribed for migraine patients is the FL-41 coating, which blocks a broad range of wavelengths, including wavelengths associated with melanopsin absorption. The nanoparticle optical notch filter is designed to block light only at 480 nm, hence offering an effective prescription for the treatment of migraine headaches.

  19. Filtered multitensor tractography.

    Science.gov (United States)

    Malcolm, James G; Shenton, Martha E; Rathi, Yogesh

    2010-09-01

    We describe a technique that uses tractography to drive the local fiber model estimation. Existing techniques use independent estimation at each voxel so there is no running knowledge of confidence in the estimated model fit. We formulate fiber tracking as recursive estimation: at each step of tracing the fiber, the current estimate is guided by those previous. To do this we perform tractography within a filter framework and use a discrete mixture of Gaussian tensors to model the signal. Starting from a seed point, each fiber is traced to its termination using an unscented Kalman filter to simultaneously fit the local model to the signal and propagate in the most consistent direction. Despite the presence of noise and uncertainty, this provides a causal estimate of the local structure at each point along the fiber. Using two- and three-fiber models we demonstrate in synthetic experiments that this approach significantly improves the angular resolution at crossings and branchings. In vivo experiments confirm the ability to trace through regions known to contain such crossing and branching while providing inherent path regularization. PMID:20805043

  20. New filter efficiency test for future nuclear grade HEPA filters

    International Nuclear Information System (INIS)

    We have developed a new test procedure for evaluating filter penetrations as low as 10⁻⁹ at 0.1-µm particle diameter. In comparison, the present US nuclear filter certification test has a lower penetration limit of 10⁻⁵. Our new test procedure is unique not only in its much higher sensitivity, but also in avoiding the undesirable effect of clogging the filter. Our new test procedure consists of a two-step process: (1) We challenge the test filter with a very high concentration of heterodisperse aerosol for a short time while passing all or a significant portion of the filtered exhaust into an inflatable bag; (2) We then measure the aerosol concentration in the bag using a new laser particle counter sensitive to 0.07-µm diameter. The ratio of particle concentration in the bag to the concentration challenging the filter gives the filter penetration as a function of particle diameter. The bag functions as a particle accumulator for subsequent analysis to minimize the filter exposure time. We have studied the particle losses in the bag over time and find that they are negligible when the measurements are taken within one hour. We also compared filter penetration measurements taken in the conventional direct-sampling method with the indirect bag-sampling method and found excellent agreement

  1. New filter efficiency test for future nuclear grade HEPA filters

    International Nuclear Information System (INIS)

    We have developed a new test procedure for evaluating filter penetrations as low as 10⁻⁹ at 0.1-µm particle diameter. In comparison, the present US nuclear filter certification test has a lower penetration limit of 10⁻⁵. Our new test procedure is unique not only in its much higher sensitivity, but also in avoiding the undesirable effect of clogging the filter. Our new test procedure consists of a two-step process: (1) We challenge the test filter with a very high concentration of heterodisperse aerosol for a short time while passing all or a significant portion of the filtered exhaust into an inflatable bag; (2) We then measure the aerosol concentration in the bag using a new laser particle counter sensitive to 0.07-µm diameter. The ratio of particle concentration in the bag to the concentration challenging the filter gives the filter penetration as a function of particle diameter. The bag functions as a particle accumulator for subsequent analysis to minimize the filter exposure time. We have studied the particle losses in the bag over time and find that they are negligible when the measurements are taken within one hour. We also compared filter penetration measurements taken in the conventional direct-sampling method with the indirect bag-sampling method and found excellent agreement. 6 refs., 18 figs., 1 tab

  2. Filter and Filter Bank Design for Image Texture Recognition

    Energy Technology Data Exchange (ETDEWEB)

    Randen, Trygve

    1997-12-31

    The relevance of this thesis to energy and environment lies in its application to remote sensing, for instance sea floor mapping and seismic pattern recognition. The focus is on the design of two-dimensional filters for feature extraction, segmentation, and classification of digital images with textural content. The features are extracted by filtering with a linear filter and estimating the local energy in the filter response. The thesis gives a review covering broadly most previous approaches to texture feature extraction and continues with proposals of some new techniques. 143 refs., 59 figs., 7 tabs.

  3. Morphological and Median Adaptive Filters Based on LCBP Rank Filter

    Directory of Open Access Journals (Sweden)

    D. Prokin

    2013-11-01

    The presented median and morphological (min and max) filters based on the low complexity bit-pipeline (LCBP) rank filter provide reduced complexity of the required processing hardware, owing to similar pipeline stages and the complete absence of sorting networks, in comparison with other solutions. FPGA realization of the bit-pipeline median and morphological filter and of the adaptive bit-pipeline rank filter according to this paper provides a significantly higher maximum operating frequency and much smaller used chip resources in comparison with state-of-the-art sorting methods.
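
    For readers who want a functional reference for what such a rank filter computes (the bit-pipeline FPGA architecture itself is not reproduced here), a short sliding-window sketch, with reflection padding chosen arbitrarily for the edges:

        import numpy as np

        def rank_filter(signal, window, rank):
            """Sliding-window rank filter: rank 0 = min (erosion), the middle rank = median,
            rank window-1 = max (dilation)."""
            half = window // 2
            padded = np.pad(signal, half, mode="reflect")
            out = np.empty(signal.size)
            for i in range(signal.size):
                out[i] = np.sort(padded[i:i + window])[rank]
            return out

        x = np.array([1.0, 1.2, 9.0, 1.1, 1.3, 1.2, 0.9, 5.0, 1.0])
        print(rank_filter(x, 3, 1))   # median of 3: single-sample impulses removed
        print(rank_filter(x, 3, 0))   # min filter (morphological erosion)
        print(rank_filter(x, 3, 2))   # max filter (morphological dilation)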

  4. Complex Coherence Estimation Based on Adaptive Refined Lee Filter

    Directory of Open Access Journals (Sweden)

    LONG Jiangping

    2015-12-01

    Polarimetric synthetic aperture radar interferometry (PolInSAR) data processing and its applications are based on the estimation of the polarimetric complex coherence, which is influenced by the size of the window and the filtering method. In this paper, an adaptive refined Lee filter, based on the traditional refined Lee filter, is used to estimate the interferometric coherence. The size of the filter window is changed according to the correlation coefficient between the central sub-window and the neighboring sub-windows. A correlation coefficient larger than the threshold value indicates homogeneous pixels in the selected window, and a boxcar filter is then chosen to estimate the complex coherence. When the maximum correlation coefficient over the different window sizes is smaller than the threshold value, the refined Lee filter is used to estimate the complex coherence. The efficiency and advantages of the new algorithm are demonstrated with E-SAR data sets. The results show that the influence of speckle noise is reduced and edge information is better preserved; the more accurate complex coherence estimated from the selected window size and selected pixels increases the accuracy of forest parameter inversion.

  5. A taxonomy fuzzy filtering approach

    Directory of Open Access Journals (Sweden)

    Vrettos S.

    2003-01-01

    Our work proposes the use of topic taxonomies as part of a filtering language. Given a taxonomy, a classifier is trained for each one of its topics. The user is able to formulate logical rules combining the available topics, e.g. (Topic1 AND Topic2) OR Topic3, in order to filter related documents in a stream. Using the trained classifiers, every document in the stream is assigned a belief value of belonging to each topic of the filter. These belief values are then aggregated using logical operators to yield the belief in the filter as a whole. In our study, Support Vector Machines and Naïve Bayes classifiers were used to provide topic probabilities. Aggregation of topic probabilities based on fuzzy logic operators was found to improve filtering performance on the Reuters text corpus, as compared to the use of their Boolean counterparts. Finally, we deployed a filtering system on the web using a sample taxonomy of the Open Directory Project.
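
    A minimal sketch of the aggregation step described above, assuming the per-topic belief values have already been produced by the trained classifiers (they are simply hard-coded here); min and max are one common choice of fuzzy AND/OR operators, and the 0.5 threshold is illustrative:

        def fuzzy_and(*vals):
            return min(vals)        # the product is another common t-norm choice

        def fuzzy_or(*vals):
            return max(vals)

        # Belief values of one document for three taxonomy topics,
        # e.g. produced by SVM or Naive Bayes classifiers.
        belief = {"Topic1": 0.82, "Topic2": 0.65, "Topic3": 0.10}

        # Filter "(Topic1 AND Topic2) OR Topic3" from the abstract:
        fuzzy_score = fuzzy_or(fuzzy_and(belief["Topic1"], belief["Topic2"]), belief["Topic3"])
        print(fuzzy_score >= 0.5, fuzzy_score)          # graded decision: True, 0.65

        # Boolean counterpart: threshold each topic first, losing the graded information.
        boolean_pass = (belief["Topic1"] >= 0.5 and belief["Topic2"] >= 0.5) or belief["Topic3"] >= 0.5
        print(boolean_pass)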

  6. Transversal filters for pulse spectroscopy

    International Nuclear Information System (INIS)

    Transversal filtering consists of forming an output signal from the suitably weighted sum of successively delayed samples of a given input waveform. Although such filters exhibit many attractive features, economic considerations have previously limited their hardware implementation to rather few areas, primarily the fields of communications and radar. However, recent technological developments have significantly reduced the cost of the multiple tapped-delay operations that are required in such filters, thereby greatly expanding their potential areas of application. The purpose of the present paper is two-fold, first to serve as a general introduction to the possibilities of transversal filters for the processing of isolated randomly distributed pulses (such as occur in experimental nuclear physics), and second to introduce a new implementation of such filters in the form of capacitively tapped delay lines. This latter form of transversal filter is readily fabricated and is well suited for operation on pulses in the few nanosecond to few microsecond time range. (auth)
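
    A minimal sketch of the defining operation above (the output as a weighted sum of successively delayed input samples), applied to an idealized step-like detector pulse; the tap weights form a simple trapezoidal shaper and are illustrative only, not a recommended weighting for any particular spectrometer.

        import numpy as np

        def transversal_filter(x, taps):
            """y[n] = sum_k taps[k] * x[n - k]  (weighted sum of delayed samples)."""
            y = np.zeros(x.size)
            for k, w in enumerate(taps):
                if w != 0.0:
                    y[k:] += w * x[:x.size - k]
            return y

        # Illustrative taps: the difference of two short moving averages separated by
        # a gap, which turns a step input into a trapezoidal (flat-topped) pulse.
        taps = np.array([1, 1, 1, 0, 0, -1, -1, -1], dtype=float) / 3.0

        pulse = np.zeros(50)
        pulse[20:] = 1.0                 # idealized step from a charge-sensitive preamplifier
        shaped = transversal_filter(pulse, taps)
        print(shaped[18:30])             # rises to 1, holds briefly, then returns to 0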

  7. Modernism and tradition and the traditions of modernism

    Directory of Open Access Journals (Sweden)

    Kros Džonatan

    2006-01-01

    Conventionally, the story of musical modernism has been told in terms of a catastrophic break with the (tonal) past and the search for entirely new techniques and modes of expression suitable to a new age. The resulting notion of a single, linear, modernist mainstream (predicated on a Schoenbergian model of musical progress) has served to conceal a more subtle relationship between past and present. Increasingly, it is being recognized that there exist many modernisms, and their various identities are forged from a continual renegotiation between past and present, between tradition(s) and the avant-garde. This is especially relevant when attempting to discuss the reception of modernism outside central Europe, where the adoption of (Germanic) avant-garde attitudes was often interpreted as being "unpatriotic". The case of Great Britain is examined in detail: Harrison Birtwistle’s opera The Mask of Orpheus (1973–83) forms the focus for a wider discussion of modernism within the context of late/post-modern thought.

  8. Liquid filter for liquids containing radioactive materials

    International Nuclear Information System (INIS)

    A filter insert consisting of several filter plates is suspended in a filter case. The radioactive liquid containing solids flows into the filter case from above and is distributed from its centre via a central duct into intermediate spaces between the filter plates. A filter cake is formed on the filters. The filtrate flows through the centre of the filter and is taken radially outwards. The filtrate is collected in a lower collector space. The filter insert can be removed from the filter case by remote operation when it is used up. (DG)

  9. Digital filtering in nuclear medicine

    International Nuclear Information System (INIS)

    Digital filtering is a powerful mathematical technique in computer analysis of nuclear medicine studies. The basic concepts of object-domain and frequency-domain filtering are presented in simple, largely nonmathematical terms. Computational methods are described using both the Fourier transform and convolution techniques. The frequency response is described and used to represent the behavior of several classes of filters. These concepts are illustrated with examples drawn from a variety of important applications in nuclear medicine
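
    A hedged one-dimensional sketch of the two computational routes mentioned above (multiplying the spectrum by a frequency response versus convolving with the corresponding kernel), using an assumed Butterworth response on a synthetic noisy count profile; the record itself does not name a specific filter or cutoff.

        import numpy as np

        def butterworth_response(freqs, cutoff=0.2, order=5):
            """Low-pass Butterworth frequency response H(f)."""
            return 1.0 / np.sqrt(1.0 + (freqs / cutoff) ** (2 * order))

        counts = np.random.default_rng(0).poisson(lam=50, size=128).astype(float)

        # Frequency-domain filtering: multiply the spectrum by the response.
        freqs = np.fft.rfftfreq(counts.size, d=1.0)
        H = butterworth_response(freqs)
        smoothed_freq = np.fft.irfft(np.fft.rfft(counts) * H, n=counts.size)

        # Object-domain filtering: circular convolution with the kernel whose
        # spectrum is H; both routes give the same result up to round-off.
        kernel = np.fft.irfft(H, n=counts.size)
        smoothed_conv = np.real(np.fft.ifft(np.fft.fft(counts) * np.fft.fft(kernel)))
        print(np.allclose(smoothed_freq, smoothed_conv))   # True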

  10. FILT - Filtering Indexed Lucene Triples

    OpenAIRE

    Stuhr, Magnus

    2012-01-01

    The Resource Description Framework (RDF) is the W3C recommended standard for data on the semantic web, while the SPARQL Protocol and RDF Query Language (SPARQL) is the query language that retrieves RDF triples by subject, predicate, or object. RDF data often contain valuable information that can only be queried through filter functions. The SPARQL query language for RDF can include filter clauses in order to define specific data criteria, such as full-text searches, numerical filtering, and c...

  11. Filter-extruded liposomes revisited

    DEFF Research Database (Denmark)

    Hinna, Askell; Steiniger, Frank; Hupfeld, Stefan; Stein, Paul C.; Kuntsche, Judith; Brandl, Martin

    2015-01-01

    Filter-extrusion is a widely used technique for down-sizing of phospholipid vesicles. In order to gain a detailed insight into size and size distributions of filter-extruded vesicles composed of egg phosphatidyl-choline (with varying fractions of cholesterol) – in relation to extrusion parameters (pore-size, number of filter passages, and flow-rate), flow field-flow fractionation in conjunction with multi-angle laser light scattering (AF4-MALLS, Wyatt Technology Corp., Santa Barbara, CA) was empl...

  12. Sample-whitened matched filters

    DEFF Research Database (Denmark)

    Andersen, Ib

    1973-01-01

    A sample-whitened matched filter (SWMF) for a channel with intersymbol interference and additive white Gaussian noise is defined as a linear filter with the properties that its output samples are a sufficient statistic for the MAP estimation of the transmitted sequence and have uncorrelated noise components. These filters are shown to exist for all realistic channels and the complete set of SWMF's for any channel is determined. It is shown that for nonpathological channels there is a unique SWMF...

  13. Matrix Factorisation with Linear Filters

    OpenAIRE

    Aky?ld?z, Ömer Deniz

    2015-01-01

    This text investigates relations between two well-known families of algorithms, matrix factorisations and recursive linear filters, by describing a probabilistic model in which approximate inference corresponds to a matrix factorisation algorithm. Using the probabilistic model, we derive a matrix factorisation algorithm as a recursive linear filter. More precisely, we derive a matrix-variate recursive linear filter in order to perform efficient inference in high dimensions. We...

  14. Multi-filter spectrophotometry simulations

    Science.gov (United States)

    Callaghan, Kim A. S.; Gibson, Brad K.; Hickson, Paul

    1993-01-01

    To complement both the multi-filter observations of quasar environments described in these proceedings, as well as the proposed UBC 2.7 m Liquid Mirror Telescope (LMT) redshift survey, we have initiated a program of simulated multi-filter spectrophotometry. The goal of this work, still very much in progress, is a better quantitative assessment of the multiband technique as a viable mechanism for obtaining useful redshift and morphological class information from large scale multi-filter surveys.

  15. Ozone removal by HVAC filters

    Science.gov (United States)

    Zhao, P.; Siegel, J. A.; Corsi, R. L.

    Residential and commercial HVAC filters that have been loaded with particles during operation in the field can remove ozone from intake or recirculated air. However, knowledge of the relative importance of HVAC filters as a removal mechanism for ozone in residential and commercial buildings is incomplete. We measured the ozone removal efficiencies of clean (unused) fiberglass filters, clean synthetic filters, and field-loaded residential and commercial filters in a controlled laboratory setting. For most filters, the ozone removal efficiency declined rapidly but converged to a non-zero (steady-state) value. This steady-state ozone removal efficiency varied from 0% to 9% for clean filters. The mean steady-state ozone removal efficiencies for loaded residential and commercial filters were 10% and 41%, respectively. Repeated exposure of filters to ozone following a 24-h period of no exposure led to a regeneration of ozone removal efficiency. Based on a theoretical scaling analysis of mechanisms that are involved in the ozone removal process, we speculate that the steady-state ozone removal efficiency is limited by reactant diffusion out of particles, and that regeneration is due to internal diffusion of reactive species to sites available to ozone for reaction. Finally, by applying our results to a screening model for typical residential and commercial buildings, HVAC filters were estimated to contribute 22% and 95%, respectively, of total ozone removal in HVAC systems.

  16. Device for filtering gaseous media

    International Nuclear Information System (INIS)

    The air filter system for gaseous radioactive substances consists of a vertical chamber with filter material (charcoal, e.g. impregnated). On one side of the chamber there is an inlet compartment and an outlet compartment. On the other side a guiding compartment turns the gas flow coming from the raw-air side through the lower part of the filter chamber to the upper part of the filter. The gas flow leaves the upper part through the outlet compartment as the cleaned-air flow. The filter material may be filled into the chamber from above and drawn off below. For better utilization of the filter material, the filter chamber is separated by a wall between the inlet and outlet compartments. This partition wall consists of two sheets arranged one above the other and provided with slots which may be superposed in alignment. In this case the filter material trickles from the upper part of the chamber into the lower part, avoiding crater formation in the filter bed. (DG)

  17. Optimization of integrated polarization filters

    CERN Document Server

    Gagnon, Denis; Déziel, Jean-Luc; Dubé, Louis J

    2014-01-01

    This study reports on the design of small footprint, integrated polarization filters based on engineered photonic lattices. Using a rods-in-air lattice as a basis for a TE filter and a holes-in-slab lattice for the analogous TM filter, we are able to maximize the degree of polarization of the output beams up to 98 % with a transmission efficiency greater than 75 %. The proposed designs allow not only for logical polarization filtering, but can also be tailored to output an arbitrary transverse beam profile. The lattice configurations are found using a recently proposed parallel tabu search algorithm for combinatorial optimization problems in integrated photonics.

  18. Optimization of integrated polarization filters

    OpenAIRE

    Gagnon, Denis; Dumont, Joey; Déziel, Jean-Luc; Dubé, Louis J.

    2014-01-01

    This study reports on the design of small footprint, integrated polarization filters based on engineered photonic lattices. Using a rods-in-air lattice as a basis for a TE filter and a holes-in-slab lattice for the analogous TM filter, we are able to maximize the degree of polarization of the output beams up to 98 % with a transmission efficiency greater than 75 %. The proposed designs allow not only for logical polarization filtering, but can also be tailored to output an a...

  19. Optimization of integrated polarization filters.

    Science.gov (United States)

    Gagnon, Denis; Dumont, Joey; Déziel, Jean-Luc; Dubé, Louis J

    2014-10-01

    This study reports on the design of small footprint, integrated polarization filters based on engineered photonic lattices. Using a rods-in-air lattice as a basis for a TE filter and a holes-in-slab lattice for the analogous TM filter, we are able to maximize the degree of polarization of the output beams up to 98% with a transmission efficiency greater than 75%. The proposed designs allow not only for logical polarization filtering, but can also be tailored to output an arbitrary transverse beam profile. The lattice configurations are found using a recently proposed parallel tabu search algorithm for combinatorial optimization problems in integrated photonics. PMID:25360980

  20. Adaptive filtering and change detection

    CERN Document Server

    Gustafsson, Fredrik

    2003-01-01

    Adaptive filtering is a classical branch of digital signal processing (DSP). Industrial interest in adaptive filtering grows continuously with the increase in computer performance that allows ever more complex algorithms to be run in real-time. Change detection is a type of adaptive filtering for non-stationary signals and is also the basic tool in fault detection and diagnosis. Often considered as separate subjects, Adaptive Filtering and Change Detection bridges a gap in the literature with a unified treatment of these areas, emphasizing that change detection is a natural extension...

  1. Spatial filtering with photonic crystals

    International Nuclear Information System (INIS)

    Photonic crystals are well known for their celebrated photonic band-gaps—the forbidden frequency ranges, for which the light waves cannot propagate through the structure. The frequency (or chromatic) band-gaps of photonic crystals can be utilized for frequency filtering. In analogy to the chromatic band-gaps and the frequency filtering, the angular band-gaps and the angular (spatial) filtering are also possible in photonic crystals. In this article, we review the recent advances of the spatial filtering using the photonic crystals in different propagation regimes and for different geometries. We review the most evident configuration of filtering in Bragg regime (with the back-reflection—i.e., in the configuration with band-gaps) as well as in Laue regime (with forward deflection—i.e., in the configuration without band-gaps). We explore the spatial filtering in crystals with different symmetries, including axisymmetric crystals; we discuss the role of chirping, i.e., the dependence of the longitudinal period along the structure. We also review the experimental techniques to fabricate the photonic crystals and numerical techniques to explore the spatial filtering. Finally, we discuss several implementations of such filters for intracavity spatial filtering

  2. Cryptographically Secure Bloom-Filters

    Directory of Open Access Journals (Sweden)

    Ryo Nojima

    2009-08-01

    In this paper, we propose a privacy-preserving variant of Bloom-filters. The Bloom-filter has many applications such as hash-based IP-traceback systems and Web cache sharing. In some of those applications, equipping the Bloom-filter with a privacy-preserving mechanism is crucial for deployment. In this paper, we propose a cryptographically secure privacy-preserving Bloom-filter protocol. We propose two such protocols, based on blind signatures and oblivious pseudorandom functions, respectively. To show that the proposed protocols are secure, we provide a reasonable security definition and prove the security.
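
    A quick illustrative sketch may help readers unfamiliar with the underlying data structure. The Python code below is a plain (non-private) Bloom-filter, not the privacy-preserving protocols of the paper; the bit-array size, the two hash functions and the double-hashing scheme are assumptions chosen for brevity.

        import hashlib

        class BloomFilter:
            """Minimal plain Bloom-filter (illustration only, no privacy protection)."""

            def __init__(self, num_bits=1024, num_hashes=4):
                self.num_bits = num_bits
                self.num_hashes = num_hashes
                self.bits = bytearray(num_bits // 8 + 1)

            def _positions(self, item):
                # Derive k indices from two independent digests (double hashing).
                h1 = int.from_bytes(hashlib.sha256(item.encode()).digest()[:8], "big")
                h2 = int.from_bytes(hashlib.md5(item.encode()).digest()[:8], "big")
                return [(h1 + i * h2) % self.num_bits for i in range(self.num_hashes)]

            def add(self, item):
                for pos in self._positions(item):
                    self.bits[pos // 8] |= 1 << (pos % 8)

            def __contains__(self, item):
                # May return false positives, never false negatives.
                return all(self.bits[pos // 8] & (1 << (pos % 8)) for pos in self._positions(item))

        bf = BloomFilter()
        bf.add("198.51.100.7")
        print("198.51.100.7" in bf)   # True
        print("203.0.113.9" in bf)    # almost certainly False

    Membership is answered without storing the elements themselves, which is why applications such as IP traceback and cache sharing then need the cryptographic layer the paper adds.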

  3. Spatial Filter with Volume Gratings for High-peak-power Multistage Laser Amplifiers

    CERN Document Server

    Tan, Yi-zhou; Zheng, Guang-wei; Shen, Ben-jian; Pan, Heng-yue; Li, Liu

    2012-01-01

    Regular spatial filters comprising a lens and a pinhole are essential components in high-power laser systems, such as lasers for inertial confinement fusion, nonlinear optical technology and directed-energy applications. On the other hand, the pinhole is treated as a bottleneck of high-power lasers because of the harmful plasma created by the focusing beam. In this paper we present a spatial filter based on the angular selectivity of a Bragg diffraction grating, which avoids the harmful focusing effect of the traditional pinhole filter. A spatial filter consisting of volume phase gratings in a two-pass amplifier cavity is reported. A two-dimensional filter using a single π-phase-shifted Bragg grating is proposed; numerical simulation results show that its angular spectrum bandwidth can be less than 160 µrad. The angular selectivity of photo-thermo-refractive glass and RUGATE film filters, construction stability, thermal stability and the effects of grating misalignments on the diffraction efficiencies under high-pulse-energy laser...

  4. T Source Inverter Based Shunt Active Filter with LCL Passive Filter for the 415V 50 Hz Distribution systems

    Directory of Open Access Journals (Sweden)

    S. Sellakumar

    2015-06-01

    The inverter topology is being used as an active filter to reduce harmonics in the power system [1]. Traditional voltage source or current source inverters have the disadvantage of a limited output voltage range, so they may not be able to supply enough compensating current during heavy switching surges; they are vulnerable to EMI noise; the devices are damaged in either open- or short-circuit conditions; and the main switching devices of the VSI and CSI are not interchangeable. Active filters are DC-AC systems for which a wide range of voltage regulation and integration of energy storage is often required. This cannot be achieved with conventional inverters, and hence impedance source inverters have been suggested. The T source inverter is basically an impedance source inverter which can be used as an active filter in the power system. The MATLAB simulation is done and the results for both types are discussed in this paper. The proposed dampening system is fully characterized by LCL-based passive filters [6] and a T source inverter based shunt active filter. The disturbances in the supply voltage and load current due to nonlinear loads are observed in the simulation, and again after connecting the designed hybrid shunt active filter in the distribution system. The simulation results obtained from the proposed method prove that it gives a comparatively better THD value.

  5. Two traditions of interaction research.

    Science.gov (United States)

    Peräkylä, Anssi

    2004-03-01

    The paper compares Bales' Interaction Process Analysis (IPA) with Sacks' Conversation Analysis (CA), arguing that CA has answered several questions that originally motivated the development of IPA, and while doing so, it has re-specified the phenomena of interaction research. These two research traditions are in many ways diametrically opposed: the former is quantitative, theory-oriented and aims at global characterizations of interactional situations, while the latter is qualitative, inductive and aims at characterizing specific layers of organization (such as turn taking or sequence organization) that give structure to interactional situations. Their primary objects of study are different. For the Balesian tradition, it is the functioning and the structure of a small group, whereas in the Sacksian tradition, it is the structures and practices of human social interaction per se. It is argued, however, that CA has radically expanded understanding of the questions IPA was originally developed to address. These questions include allocation of resources, control and solidarity. Bales' research deals with them in terms of the differentiation of participants of a group, whereas CA has re-specified them as emergent aspects of the very rules and structures that constitute and regulate interaction sequences. The uniqueness of the CA perspective on social interaction is demonstrated by exploring the display of emotion as an interactional phenomenon. It is argued that the display of emotion is intrinsically embedded in the sequential organization of action. Sensitive 'coding and counting' approaches can detect emotion displays, but the contribution of CA is to show the specific ways in which they are part of the business of interaction. PMID:15035695

  6. Adapting agriculture with traditional knowledge

    Energy Technology Data Exchange (ETDEWEB)

    Swiderska, Krystyna; Reid, Hannah [IIED, London (United Kingdom); Song, Yiching; Li, Jingsong [Centre for Chinese Agriculutral Policy (China); Mutta, Doris [Kenya Forestry Research Institute (Kenya)

    2011-10-15

    Over the coming decades, climate change is likely to pose a major challenge to agriculture; temperatures are rising, rainfall is becoming more variable and extreme weather is becoming a more common event. Researchers and policymakers agree that adapting agriculture to these impacts is a priority for ensuring future food security. Strategies to achieve that in practice tend to focus on modern science. But evidence, both old and new, suggests that the traditional knowledge and crop varieties of indigenous peoples and local communities could prove even more important in adapting agriculture to climate change.

  7. Fabric filter system study

    Science.gov (United States)

    Chambers, R. L.; Plunk, O. C.; Kunka, S. L.

    1984-08-01

    Results of the fourth year of operation of a fabric filter installed on a coal-fired boiler are reported. Project work during the fourth year concentrated on fabric studies. The 10-oz/sq yd fabrics of the 150 1/2 warp, 150 2/2T fill construction demonstrated superior performance over the most common 14-oz/sq yd constructions, regardless of coating. It was determined that improving cleaning by increasing shaking amplitude is more detrimental to bag life than increasing shaker frequency. Maintenance and operation observations continued, and the resolution of these types of problems became more efficient because of increased experience of maintenance personnel with baghouse-related problems.

  8. Efficient Multicore Collaborative Filtering

    CERN Document Server

    Wu, Yao; Bickson, Danny; Low, Yucheng; Yang, Qing

    2011-01-01

    This paper describes the solution method taken by the LeBuSiShu team for track 1 in the ACM KDD CUP 2011 contest (resulting in 5th place). We identified two main challenges: the unique item taxonomy characteristics and the large data set size. To handle the item taxonomy, we present a novel method called Matrix Factorization Item Taxonomy Regularization (MFITR). MFITR obtained the 2nd best prediction result out of more than ten implemented algorithms. For rapidly computing multiple solutions of various algorithms, we have implemented an open source parallel collaborative filtering library on top of the GraphLab machine learning framework. We report some preliminary performance results obtained using the BlackLight supercomputer.
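
    The abstract does not give the MFITR update rules, so the Python sketch below shows only the generic stochastic-gradient matrix factorisation that such collaborative-filtering methods build on; the latent dimension, learning rate, regularisation and toy ratings are all invented for illustration.

        import numpy as np

        def factorize(ratings, n_factors=8, lr=0.01, reg=0.05, epochs=50, seed=0):
            """Plain SGD matrix factorization: predict r_ui ~ p_u . q_i (no taxonomy term)."""
            rng = np.random.default_rng(seed)
            users = 1 + max(u for u, _, _ in ratings)
            items = 1 + max(i for _, i, _ in ratings)
            P = 0.1 * rng.standard_normal((users, n_factors))
            Q = 0.1 * rng.standard_normal((items, n_factors))
            for _ in range(epochs):
                for u, i, r in ratings:
                    pu = P[u].copy()
                    err = r - pu @ Q[i]                      # prediction error on this rating
                    P[u] += lr * (err * Q[i] - reg * pu)     # gradient step on user factors
                    Q[i] += lr * (err * pu - reg * Q[i])     # gradient step on item factors
            return P, Q

        # Toy data: (user, item, rating)
        data = [(0, 0, 5.0), (0, 1, 3.0), (1, 0, 4.0), (1, 2, 1.0), (2, 1, 2.0), (2, 2, 5.0)]
        P, Q = factorize(data)
        print(round(float(P[0] @ Q[2]), 2))   # predicted rating of user 0 for item 2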

  9. A parallel Kalman filter via the square root Kalman filtering

    OpenAIRE

    Romera, Rosario; Cipra, Tomas

    1993-01-01

    A parallel algorithm for Kalman filtering with contaminated observations is developed. The parallel implementation is based on the square root version of the Kalman filter (see [3]). This represents a great improvement over serial implementations, drastically reducing the computational cost of each state update.

  10. AER image filtering

    Science.gov (United States)

    Gómez-Rodríguez, F.; Linares-Barranco, A.; Paz, R.; Miró-Amarante, L.; Jiménez, G.; Civit, A.

    2007-05-01

    Address Event Representation (AER) is an emergent neuromorphic interchip communication protocol that allows real-time virtual massive connectivity among a huge number of neurons located on different chips.[1] By exploiting high-speed digital communication circuits (with nanosecond timing), synaptic neural connections can be time multiplexed, while neural activity signals (with millisecond timings) are sampled at low frequencies. Neurons generate "events" according to their activity levels. That is, more active neurons generate more events per unit time and access the interchip communication channel more frequently than neurons with low activity. In neuromorphic system development, AER brings some advantages for developing real-time image processing systems: (1) AER represents the information as a time-continuous stream, not as frames; (2) AER sends the most important information first (although this depends on the sender); (3) AER allows information to be processed as soon as it is received. When AER is used in the artificial vision field, each pixel is considered as a neuron, so the pixel's intensity is represented as a sequence of events; by modifying the number and the frequency of these events, it is possible to perform image filtering. In this paper we present four image filters using AER: (a) noise addition and suppression, (b) brightness modification, (c) single moving object tracking and (d) geometrical transformations (rotation, translation, reduction and magnification). For testing and debugging, we use the USB-AER board developed by the Robotic and Technology of Computers Applied to Rehabilitation (RTCAR) research group. This board is based on an FPGA devoted to managing the AER functionality. The board also includes a micro-controller for USB communication, 2 Mbytes of RAM and 2 AER ports (one for input and one for output).

  11. Ground roll attenuation using non-stationary matching filtering

    Science.gov (United States)

    Jiao, Shebao; Chen, Yangkang; Bai, Min; Yang, Wencheng; Wang, Erying; Gan, Shuwei

    2015-12-01

    Conventional approaches based on adaptive subtraction for ground roll attenuation first predict an initial model for ground rolls and then adaptively subtract it from the original data using a stationary matching filter (MF). Because of the non-stationary property of seismic data and ground rolls, the application of a traditional stationary MF is not physically plausible. Thus, in the case of highly non-stationary seismic reflections and ground rolls, a stationary MF cannot obtain satisfactory results. In this paper, we apply a non-stationary matching filter (NMF) to adaptively subtract the ground rolls. The NMF can be obtained by solving a highly under-determined inversion problem using non-stationary autoregression. We apply the proposed approach to one synthetic example and two field data examples, and demonstrate a much improved performance compared with the traditional MF approach.
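
    A full non-stationary matching filter needs the non-stationary autoregression machinery cited above; as a baseline, the stationary matching filter it generalises can be sketched as a least-squares fit of a short convolutional filter. In the Python sketch below the filter length, the synthetic "reflection" and "ground roll" traces and the 1.3 scaling are all invented.

        import numpy as np

        def matching_filter(model, data, n_taps=11):
            """Stationary matching filter: find f minimizing ||data - f * model||_2."""
            # Build a convolution matrix whose columns are shifted copies of the model trace.
            pad = n_taps // 2
            cols = [np.roll(np.pad(model, (pad, pad)), s)[pad:pad + len(model)]
                    for s in range(-pad, pad + 1)]
            M = np.stack(cols, axis=1)
            f, *_ = np.linalg.lstsq(M, data, rcond=None)
            return f, data - M @ f          # filter and the adaptively subtracted result

        rng = np.random.default_rng(3)
        reflection = np.sin(2 * np.pi * 0.02 * np.arange(400))                    # stand-in for signal
        true_gr = np.convolve(rng.standard_normal(400), np.ones(5) / 5, "same")   # stand-in ground roll
        record = reflection + 1.3 * true_gr
        f, cleaned = matching_filter(true_gr, record)
        # Residual ground roll should be noticeably reduced after adaptive subtraction
        print(round(float(np.std(record - reflection)), 2),
              round(float(np.std(cleaned - reflection)), 2))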

  12. Shunt Active Filter in Damping Harmonics Propagation

    OpenAIRE

    BERBAOUI, B.; M. Rahli; MESLEM, Y.; TEDJINI, H.

    2010-01-01

    This paper deals with a hybrid shunt active power filter applied to a 500 kV HVDC system. After a description of the causes and effects of harmonic pollution, which may damage equipment and interrupt service to electric power customers, we present the different solutions to this problem, among which the two most recent types of filtering are studied: the passive and the hybrid filter. The hybrid filter consists of an active filter connected in shunt with a passive filter. The hybrid shunt active filter pro...

  13. Particle Kalman Filtering: A Nonlinear Framework for Ensemble Kalman Filters

    KAUST Repository

    Hoteit, Ibrahim

    2010-09-19

    Optimal nonlinear filtering consists of sequentially determining the conditional probability distribution functions (pdf) of the system state, given the information of the dynamical and measurement processes and the previous measurements. Once the pdfs are obtained, one can determine different estimates, for instance, the minimum variance estimate, or the maximum a posteriori estimate, of the system state. It can be shown that, many filters, including the Kalman filter (KF) and the particle filter (PF), can be derived based on this sequential Bayesian estimation framework. In this contribution, we present a Gaussian mixture‐based framework, called the particle Kalman filter (PKF), and discuss how the different EnKF methods can be derived as simplified variants of the PKF. We also discuss approaches to reducing the computational burden of the PKF in order to make it suitable for complex geosciences applications. We use the strongly nonlinear Lorenz‐96 model to illustrate the performance of the PKF.

  14. Compressed sensing & sparse filtering

    CERN Document Server

    Carmi, Avishy Y; Godsill, Simon J

    2013-01-01

    This book is aimed at presenting concepts, methods and algorithms able to cope with undersampled and limited data. One such trend that recently gained popularity and to some extent revolutionised signal processing is compressed sensing. Compressed sensing builds upon the observation that many signals in nature are nearly sparse (or compressible, as they are normally referred to) in some domain, and consequently they can be reconstructed to within high accuracy from far fewer observations than traditionally held to be necessary. Apart from compressed sensing this book contains other related app...
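
    As a pointer to the kind of algorithm such books cover, here is a minimal sketch of sparse recovery by iterative soft-thresholding (ISTA) in Python; the measurement matrix, sparsity level and regularisation weight are arbitrary illustrative choices.

        import numpy as np

        def ista(A, y, lam=0.1, n_iter=500):
            """Minimize 0.5*||Ax - y||^2 + lam*||x||_1 by iterative soft-thresholding."""
            L = np.linalg.norm(A, 2) ** 2            # Lipschitz constant of the gradient
            x = np.zeros(A.shape[1])
            for _ in range(n_iter):
                grad = A.T @ (A @ x - y)
                z = x - grad / L                     # gradient step
                x = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)   # soft threshold
            return x

        rng = np.random.default_rng(1)
        A = rng.standard_normal((40, 100))           # 40 measurements of a 100-dim signal
        x_true = np.zeros(100)
        x_true[[5, 37, 80]] = [1.5, -2.0, 1.0]       # 3-sparse ground truth
        y = A @ x_true
        x_hat = ista(A, y, lam=0.05)
        print(np.flatnonzero(np.abs(x_hat) > 0.5))   # should (approximately) recover {5, 37, 80}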

  15. Testing Dual Rotary Filters - 12373

    International Nuclear Information System (INIS)

    The Savannah River National Laboratory (SRNL) installed and tested two hydraulically connected SpinTekR Rotary Micro-filter units to determine the behavior of a multiple filter system and develop a multi-filter automated control scheme. Developing and testing the control of multiple filters was the next step in the development of the rotary filter for deployment. The test stand was assembled using as much of the hardware planned for use in the field including instrumentation and valving. The control scheme developed will serve as the basis for the scheme used in deployment. The multi filter setup was controlled via an Emerson DeltaV control system running version 10.3 software. Emerson model MD controllers were installed to run the control algorithms developed during this test. Savannah River Remediation (SRR) Process Control Engineering personnel developed the software used to operate the process test model. While a variety of control schemes were tested, two primary algorithms provided extremely stable control as well as significant resistance to process upsets that could lead to equipment interlock conditions. The control system was tuned to provide satisfactory response to changing conditions during the operation of the multi-filter system. Stability was maintained through the startup and shutdown of one of the filter units while the second was still in operation. The equipment selected for deployment, including the concentrate discharge control valve, the pressure transmitters, and flow meters, performed well. Automation of the valve control integrated well with the control scheme and when used in concert with the other control variables, allowed automated control of the dual rotary filter system. Experience acquired on a multi-filter system behavior and with the system layout during this test helped to identify areas where the current deployment rotary filter installation design could be improved. Completion of this testing provides the necessary information on the control and system behavior that will be used in deployment on actual waste. (authors)

  16. Elephant resource-use traditions.

    Science.gov (United States)

    Fishlock, Victoria; Caldwell, Christine; Lee, Phyllis C

    2016-03-01

    African elephants (Loxodonta africana) use unusual and restricted habitats such as swampy clearings, montane outcrops and dry rivers for a variety of social and ecological reasons. Within these habitats, elephants focus on very specific areas for resource exploitation, resulting in deep caves, large forest clearings and sand pits as well as long-established and highly demarcated routes for moving between resources. We review evidence for specific habitat exploitation in elephants and suggest that this represents socially learned cultural behaviour. Although elephants show high fidelity to precise locations over the very long term, these location preferences are explained neither by resource quality nor by accessibility. Acquiring techniques for exploiting specific resource sites requires observing conspecifics and practice and is evidence for social learning. Elephants possess sophisticated cognitive capacities used to track relationships and resources over their long lifespans, and they have an extended period of juvenile dependency as a result of the need to acquire this considerable social and ecological knowledge. Thus, elephant fidelity to particular sites results in traditional behaviour over generations, with the potential to weaken relationships between resource quality and site preferences. Illustrating the evidence for such powerful traditions in a species such as elephants contributes to understanding animal cognition in natural contexts. PMID:26359083

  17. Traditional and Modern Morphometrics: Review

    Directory of Open Access Journals (Sweden)

    Gökhan OCAKOĞLU

    2013-01-01

    Morphometrics, a branch of morphology, is the study of the size and shape components of biological forms and their variation in the population. In biological and medical sciences, there is a long history of attempts to quantitatively express the diversity of the size and shape of biological forms. On the basis of historical developments in morphometry, we address several questions related to the shape of organs or organisms that are considered in biological and medical studies. In the field of morphometrics, multivariate statistical analysis is used to rigorously address such questions. Historically, these methods have involved the analysis of collections of distances or angles, but recent theoretical, computational, and other advances have shifted the focus of morphometric procedures to the Cartesian coordinates of anatomical points. In recent years, in biology and medicine, the traditional morphometric studies that aim to analyze shape variation have been replaced by modern morphometric studies. In the biological and medical sciences, morphometric methods are frequently preferred for examining the morphologic structures of organs or organisms with regard to diseases or environmental factors. These methods are also preferred for evaluating and classifying the variation of organs or organisms with respect to growth or allometry over time. Geometric morphometric methods are more valid than traditional morphometric methods in preserving more morphological information and in permitting analysis of this information.

  18. TRADITIONAL FERMENTED FOODS OF LESOTHO

    Directory of Open Access Journals (Sweden)

    Tendekayi H. Gadaga

    2013-06-01

    This paper describes the traditional methods of preparing fermented foods and beverages of Lesotho. Information on the preparation methods was obtained through a combination of literature review and face-to-face interviews with respondents from Roma in Lesotho. An unstructured questionnaire was used to capture information on the processes, raw materials and utensils used. Four products – motoho (a fermented porridge), Sesotho (a sorghum-based alcoholic beverage), hopose (a sorghum beer fermented with added hops) and mafi (spontaneously fermented milk) – were found to be the main fermented foods prepared and consumed at household level in Lesotho. Motoho is a thin gruel, popular as a refreshing beverage as well as a weaning food. Sesotho is a sorghum-based alcoholic beverage prepared for household consumption as well as for sale. It is consumed in the actively fermenting state. Mafi is the name given to spontaneously fermented milk with a thick consistency. Little research has been done on the technological aspects, including the microbiological and biochemical characteristics, of fermented foods in Lesotho. Some of the traditional aspects of the preparation methods, such as the use of earthenware pots, are being replaced, and modern equipment including plastic utensils is being used. There is need for further systematic studies on the microbiological and biochemical characteristics of these products.

  19. Estimation of aircraft aerodynamic derivatives using Extended Kalman Filter

    OpenAIRE

    Curvo M.

    2000-01-01

    Design of flight control laws, verification of performance predictions, and the implementation of flight simulations are tasks that require a mathematical model of the aircraft dynamics. The dynamical models are characterized by coefficients (aerodynamic derivatives) whose values must be determined from flight tests. This work outlines the use of the Extended Kalman Filter (EKF) in obtaining the aerodynamic derivatives of an aircraft. The EKF shows several advantages over the more traditional...
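
    The abstract does not spell out the filter equations, so the Python sketch below shows only a generic EKF predict/update cycle on an invented two-state model; the dynamics, Jacobians and noise covariances are assumptions, not the aircraft model of the paper.

        import numpy as np

        def ekf_step(x, P, z, f, F_jac, h, H_jac, Q, R):
            """One EKF predict/update cycle for x_{k+1} = f(x) + w, z_k = h(x) + v."""
            # Predict
            x_pred = f(x)
            F = F_jac(x)
            P_pred = F @ P @ F.T + Q
            # Update
            H = H_jac(x_pred)
            S = H @ P_pred @ H.T + R
            K = P_pred @ H.T @ np.linalg.inv(S)          # Kalman gain
            x_new = x_pred + K @ (z - h(x_pred))
            P_new = (np.eye(len(x)) - K @ H) @ P_pred
            return x_new, P_new

        # Toy example: two-state linear dynamics with a mildly nonlinear measurement
        f = lambda x: np.array([x[0] + 0.1 * x[1], 0.95 * x[1]])
        F_jac = lambda x: np.array([[1.0, 0.1], [0.0, 0.95]])
        h = lambda x: np.array([np.sin(x[0])])
        H_jac = lambda x: np.array([[np.cos(x[0]), 0.0]])
        Q = 1e-4 * np.eye(2)
        R = np.array([[1e-2]])
        x, P = np.zeros(2), np.eye(2)
        x, P = ekf_step(x, P, np.array([0.2]), f, F_jac, h, H_jac, Q, R)
        print(x)   # state estimate after one measurement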

  20. UV-filtered overlap fermions

    CERN Document Server

    Dürr, Stephan; Hoelbling, Christian; Wenger, Urs

    2005-01-01

    We discuss the kernel spectrum, locality properties and the axial-vector renormalization constant of UV-filtered overlap fermions. We find that UV-filtered overlap fermions have a better conditioned kernel, better locality and an axial-vector renormalization constant closer to 1 than their unfiltered counterparts, even if the shift parameter ρ is simply set to 1.

  1. X-band preamplifier filter

    Science.gov (United States)

    Manshadi, F.

    1986-01-01

    A low-loss bandstop filter designed and developed for the Deep Space Network's 34-meter high-efficiency antennas is described. The filter is used for protection of the X-band traveling wave masers from the 20-kW transmitter signal. A combination of empirical and theoretical techniques was employed as well as computer simulation to verify the design before fabrication.

  2. The double well mass filter

    International Nuclear Information System (INIS)

    Various mass filter concepts based on rotating plasmas have been suggested with the specific purpose of nuclear waste remediation. We report on a new rotating mass filter combining radial separation with axial extraction. The radial separation of the masses is the result of a “double-well” in effective radial potential in rotating plasma with a sheared rotation profile

  3. The double well mass filter

    Energy Technology Data Exchange (ETDEWEB)

    Gueroult, Renaud; Fisch, Nathaniel J. [Princeton Plasma Physics Laboratory, Princeton University, Princeton, New Jersey 08543 (United States); Rax, Jean-Marcel [Laboratoire d' optique appliquée-LOA, Ecole Polytechnique, Chemin de la Hunière, 91761 Palaiseau Cedex (France)

    2014-02-15

    Various mass filter concepts based on rotating plasmas have been suggested with the specific purpose of nuclear waste remediation. We report on a new rotating mass filter combining radial separation with axial extraction. The radial separation of the masses is the result of a “double-well” in effective radial potential in rotating plasma with a sheared rotation profile.

  4. Filter desulfation system and method

    Science.gov (United States)

    Lowe, Michael D. (Metamora, IL); Robel, Wade J. (Normale, IL); Verkiel, Maarten (Metamora, IL); Driscoll, James J. (Dunlap, IL)

    2010-08-10

    A method of removing sulfur from a filter system of an engine includes continuously passing an exhaust flow through a desulfation leg of the filter system during desulfation. The method also includes sensing at least one characteristic of the exhaust flow and modifying a flow rate of the exhaust flow during desulfation in response to the sensing.

  5. Mobile filters in nuclear engineering

    International Nuclear Information System (INIS)

    The need for filters with high efficiencies which may be used at any place originated in nuclear power plants. Filters of this type, called Filtermobil, have been developed by Sulzer. They have been used successfully in nuclear plants for several years. (orig.)

  6. Filters in Fuzzy Class Theory.

    Czech Academy of Sciences Publication Activity Database

    Kroupa, Tomáš

    2008-01-01

    Roč. 159, č. 14 (2008), s. 1773-1787. ISSN 0165-0114 R&D Projects: GA MŠk 1M0572; GA AV ČR KJB100300502 Institutional research plan: CEZ:AV0Z10750506 Keywords: filter * prime filter * fuzzy class theory Subject RIV: BA - General Mathematics Impact factor: 1.833, year: 2008

  7. Comparison Types of Filter Used in Viewing Inner Structure of Materials Using X-Ray Computed Tomography

    International Nuclear Information System (INIS)

    Inspection of the inner structure of materials plays a very important role in safety and quality in industry, and it needs to be carried out in the most accurate and efficient manner. In current practice, the region of interest is normally scanned directly to the radiation detector without any added filtering; the resulting image reconstruction is assumed to be accurate without justification, even though the inspection process itself may unintentionally generate defects (artefacts). Besides safety and quality considerations, the use of several filtering techniques can overcome these deficiencies. X-ray computed tomography is a powerful non-invasive imaging technique for viewing an object's inner structures in 2-D or 3-D cross-section images without the need to physically section it. The invention of CT techniques revolutionised the field of medical diagnostic imaging because it provided more detailed and useful information than any previous non-invasive imaging technique. The method is increasingly being used in industry, aerospace, geosciences and archaeology. This paper compares filter types (aluminium and copper) in an X-ray computed tomography system for imaging and visualising a casting material (rear bracket engine mounting). The theoretical aspects of the CT scanner, the system configuration and the adopted algorithm for image reconstruction are discussed. The penetrating rays from a 160 kV/10 mA industrial X-ray machine and a bank of Linear Array Detectors (LAD) in combination with a three-axis sample table were used to construct the CT system. The movement of the sample table in vertical, linear and rotary motion is controlled by LabVIEW-based software, the X-ray transmission data are collected using a commercial image-grabber package, and the image reconstruction is performed using the classical Linear Back Projection (LBP) algorithm. Samples of the rear bracket engine mounting were scanned using this CT scanner with different types of filter. Some of the reconstructed images are presented in this paper. (author)
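
    For readers unfamiliar with the Linear Back Projection step mentioned above, the Python sketch below shows a schematic, unfiltered back-projection of a parallel-beam sinogram with nearest-neighbour interpolation; it is not the authors' implementation, and the geometry and test data are invented.

        import numpy as np

        def back_project(sinogram, angles_deg):
            """Unfiltered (linear) back-projection of a parallel-beam sinogram.

            sinogram: array of shape (n_angles, n_detectors)
            angles_deg: projection angles in degrees, one per sinogram row
            """
            n_det = sinogram.shape[1]
            image = np.zeros((n_det, n_det))
            # Pixel-centre coordinates, origin at the image centre
            xs = np.arange(n_det) - (n_det - 1) / 2.0
            X, Y = np.meshgrid(xs, xs)
            for proj, theta in zip(sinogram, np.deg2rad(angles_deg)):
                # Detector coordinate hit by each pixel for this view
                t = X * np.cos(theta) + Y * np.sin(theta) + (n_det - 1) / 2.0
                idx = np.clip(np.round(t).astype(int), 0, n_det - 1)
                image += proj[idx]                   # smear the projection across the image
            return image / len(angles_deg)

        # Tiny smoke test: a point-like object reconstructed from 36 views
        angles = np.arange(0, 180, 5)
        sino = np.zeros((len(angles), 65))
        sino[:, 32] = 1.0                            # every ray through the centre
        rec = back_project(sino, angles)
        print(rec.shape, bool(rec[32, 32] > rec[0, 0]))   # centre pixel brightest -> True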

  8. Implementational Aspects of the Contourlet Filter Bank and Application in Image Coding

    Directory of Open Access Journals (Sweden)

    Truong T. Nguyen

    2009-02-01

    This paper analyzes the implementational aspects of the contourlet filter bank (or pyramidal directional filter bank, PDFB) and considers its application in image coding. First, details of the binary tree-structured directional filter bank (DFB) are presented, including a modification to minimize the phase delay factor and the necessary steps for handling rectangular images. The PDFB is viewed as an overcomplete filter bank, and the directional filters are expressed in terms of polyphase components of the pyramidal filter bank and the conventional DFB. The aliasing effect of the conventional DFB and the Laplacian pyramid on the directional filters is then considered, and the conditions for reducing this effect are presented. The new filters obtained by redesigning the PDFBs to satisfy these requirements have much better frequency responses. A hybrid multiscale filter bank consisting of the PDFB at higher scales and the traditional maximally decimated wavelet filter bank at lower scales is constructed to provide a sparse image representation. A novel embedded image coding system based on the image decomposition and a morphological dilation algorithm is then presented. The coding algorithm efficiently clusters the significant coefficients using progressive morphological operations. Context models for arithmetic coding are designed to exploit the intraband dependency and the correlation existing among the neighboring directional subbands. Experimental results show that the proposed coding algorithm outperforms current state-of-the-art wavelet-based coders, such as JPEG2000, for images with directional features.

  9. Precision filters. Seimitsu roka sochi

    Energy Technology Data Exchange (ETDEWEB)

    Okugawa, Katsumi (Japan Organo Co., Ltd., Tokyo, (Japan))

    1990-03-01

    The purpose of precision filtration is to separate suspended or colloidal particles which cannot be separated by conventional sedimentation or sand filtration. Recently, filters for super-precision filtration, which lie between ultrafiltration and precision filtration, have been used to manufacture highly refined pure water for semiconductors, medicines and nuclear services. The super-precision filters are classified into direct and precoated types; the former are further divided into the solid type and the packed-layer type, and the latter into the cylinder type and the leaf-like type. New materials such as ion exchange fibers, ceramic filters and hollow fiber membranes are used. Super-fine filters may be further developed and widely used in the future; in particular, the hollow fiber membrane is expected to be applied more extensively than the precoated type because of its stability after treatment, simple handling and operation, and ease of capacity enlargement. 2 refs., 6 figs.

  10. Identification Filtering with fuzzy estimations

    Directory of Open Access Journals (Sweden)

    J.J Medel J

    2012-10-01

    A digital identification filter interacts with the output of a reference model, known as the black-box output system. The identification technique commonly needs the transition and gain matrices, both estimated under a mean-square criterion so that the minimum output error gives the best filter estimate. The system's evolution exhibits adaptive properties, which the identification mechanism incorporates through fuzzy logic strategies that affect, in a probabilistic sense, the evolution of the identification filter. The fuzzy estimation filter describes the transition and gain matrices in two forms, applying actions that affect the identification structure. Basically, the adaptive criterion builds the set of inference mechanisms and the knowledge and rule bases, selecting the optimal coefficients in distribution form. This paper describes the fuzzy strategies applied to the Kalman filter transition function and gain matrices. The simulation results were developed using Matlab®.

  11. Statistical properties of soft morphological filters

    Science.gov (United States)

    Koskinen, Lasse; Astola, Jaakko T.

    1992-04-01

    In this paper, statistical properties of standard and soft morphological filters are analyzed using the stack filter representation. Asymptotically tight bounds are derived for the outputs of two-dimensional morphological filters. It is shown that soft morphological filters are less sensitive to noise than standard flat morphological filters. Simulation results illustrating this behavior are presented.

  12. DSP Control of Line Hybrid Active Filter

    DEFF Research Database (Denmark)

    Dan, Stan George; Benjamin, Doniga Daniel; Magureanu, R.; Asiminoaei, Lucian; Teodorescu, Remus; Blaabjerg, Frede

    Active Power Filters have been intensively explored in the past decade. Hybrid active filters inherit the efficiency of passive filters and the improved performance of active filters, and thus constitute a viable improved approach for harmonic compensation. In this paper a parallel hybrid filter ...

  13. Trust and Traditions in Transitions

    DEFF Research Database (Denmark)

    McQuaid, Sara Dybris

    On New Year’s Eve 2013, months of talks on ‘Dealing with the past’, ‘Flags’ and ‘Parades’ ended without agreement on how to move towards a reconciliation of positions in Northern Ireland. The failure of the talks illustrates the importance of culture and (mis)trust in divided societies, where politics often pivot around whose culture shall be official and whose subordinated, whose history shall be remembered and whose forgotten (Jordan and Weedon 1995). These struggles are particularly intense in times of transition where traditions, power relations and frames of relevant remembrance are reconfigured. Historically, parading traditions have been important cultural carriers of identity in Northern Ireland. (Jarman 1997). Correspondingly, the marching season has been an arena for politico-cultural struggles and resistance, indexing relations of trust between communities, between society and the state and more recently, trust in the peace process. As the contest over meaning is always determined by the context of articulation, this paper examines the role of parades in the current ‘post-conflict’ phase of the peace process. Using theories of cultural and collective memory (Assman 2011, Olick 2011, Bodnar 1994), politics of affect (Hogget and Thompson) and data from republican and loyalist parades in North Belfast it is argued that a) there is fear of memory collapse in particular communities on the margins of the peace process with a conscious doubling of efforts to articulate the hidden recesses of memory in the current transition. And b) that patterns of ‘competitive commemoration’ in parades should be understood in relation to the increasing dissonance between vernacular languages of conflict and the official post-conflict discourses in Northern Ireland.

  14. Anti-aliasing Filter in Hybrid Filter Banks

    OpenAIRE

    Poulton, Daniel

    2006-01-01

    Hybrid Filter Banks allow wide-band, high-frequency conversion. All existing design methods suppose that the input signal is band-limited and that each sub-band signal is sampled at 1/M times the effective Nyquist frequency of the input signal 1/T. To avoid aliasing in the sampling process, an analog anti-aliasing filter should be used in order to eliminate noise in frequency bands in which there is no signal (or very little signal). In this paper, it is shown that this pre-filtering operation is...

  15. On-line filtering

    International Nuclear Information System (INIS)

    Present day electronic detectors used in high energy physics make it possible to obtain high event rates and it is likely that future experiments will face even higher data rates than at present. The complexity of the apparatus increases very rapidly with time and also the criteria for selecting desired events become more and more complex. So complex in fact that the fast trigger system cannot be designed to fully cope with it. The interesting events become thus contaminated with multitudes of uninteresting ones. To distinguish the 'good' events from the often overwhelming background of other events one has to resort to computing techniques. Normally this selection is made in the first part of the analysis of the events, analysis normally performed on a powerful scientific computer. This implies however that many uninteresting or background events have to be recorded during the experiment for subsequent analysis. A number of undesired consequences result; and these constitute a sufficient reason for trying to perform the selection at an earlier stage, in fact ideally before the events are recorded on magnetic tape. This early selection is called 'on-line filtering' and it is the topic of the present lectures. (Auth.)

  16. Traditional perception of Greeks in Serbian oral tradition

    Directory of Open Access Journals (Sweden)

    Konjik Ivana

    2006-01-01

    Based on material on Greeks from Vuk's corpus of epic poems, we discuss the construction of the ethnic stereotype of Greeks in the Serbian language. However, the limitation of the paper's possible conclusions lies in the nature of the corpus: Vuk deliberately chose one material over another; therefore, the corpus relating to Greeks cannot be considered representative of Serbian folk poems as a whole, and the discussion is limited to certain elements of the stereotype. Nevertheless, these Serbian epic folk poems contain many layers: historical, geographical, sociological, mythological and so on, with a strong foundation in traditional culture; thus, they provide an insight into the geo-political situation of the time period, and the viewpoints, perspectives and experiences of other ethnic groups that Serbs have been in contact with. In particular, the relationship toward Greeks was marked by a pronounced patriarchal attitude concerning others: we-others, ours-foreign, good-bad. In this sense, Greeks are portrayed as foreign, and as such, as a potential source of danger. On the other hand, Greeks are Christian Orthodox, which associates them with the category ours. In the socio-economic sense, they were traders and wealthy, respected gentlemen. In the epical-heroic profile, they were not considered great heroes, but a "lousy army", and frequently unfaithful.

  17. Time Weight Update Model Based on the Memory Principle in Collaborative Filtering

    Directory of Open Access Journals (Sweden)

    Dan Li

    2013-11-01

    Collaborative filtering is the most widely used technology in recommender systems. Existing collaborative filtering algorithms do not take the time factor into account. However, users' interests always change with time, and traditional collaborative filtering cannot reflect these changes. In this paper, the change of users' interests is treated as a memory process, and a time weight iteration model is designed based on the memory principle. For a given user, the proposed model introduces a time weight for each item and updates the weight by computing the similarity with the items chosen in a recent period. In the recommendation process, the weight is applied to the prediction algorithm. Experimental results show that the modified algorithm can improve the recommendation results to a certain extent and performs better than traditional collaborative filtering.
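
    The paper's exact weighting scheme is not reproduced in the abstract; a common choice consistent with the memory-principle idea is an exponential forgetting curve. The Python sketch below applies such a hypothetical time weight inside a user-based prediction; the half-life, similarities and toy ratings are invented.

        import numpy as np

        def time_weight(age_days, half_life=30.0):
            """Exponential forgetting: recent ratings count more (half-life is an assumption)."""
            return 0.5 ** (age_days / half_life)

        def predict(user_ratings, neighbor_ratings_list, similarities, item, now):
            """Weighted average of neighbors' ratings for `item`, scaled by recency and similarity."""
            num = den = 0.0
            for sim, ratings in zip(similarities, neighbor_ratings_list):
                if item in ratings:
                    r, t = ratings[item]                    # (rating, day it was rated)
                    w = sim * time_weight(now - t)
                    num += w * r
                    den += w
            if den == 0.0:
                return float(np.mean([r for r, _ in user_ratings.values()]))  # fallback: own mean
            return num / den

        # Toy data: item -> (rating, day it was rated)
        me = {"a": (4.0, 100), "b": (2.0, 300)}
        neighbors = [{"c": (5.0, 290)}, {"c": (1.0, 10)}]   # second neighbor's rating is very old
        print(round(predict(me, neighbors, [0.9, 0.8], "c", now=300), 2))   # pulled toward 5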

  18. A local particle filter for high dimensional geophysical systems

    Science.gov (United States)

    Penny, S. G.; Miyoshi, T.

    2015-12-01

    A local particle filter (LPF) is introduced that outperforms traditional ensemble Kalman filters in highly nonlinear/non-Gaussian scenarios, both in accuracy and computational cost. The standard Sampling Importance Resampling (SIR) particle filter is augmented with an observation-space localization approach, for which an independent analysis is computed locally at each gridpoint. The deterministic resampling approach of Kitagawa is adapted for application locally and combined with interpolation of the analysis weights to smooth the transition between neighboring points. Gaussian noise is applied with magnitude equal to the local analysis spread to prevent particle degeneracy while maintaining the estimate of the growing dynamical instabilities. The approach is validated against the Local Ensemble Transform Kalman Filter (LETKF) using the 40-variable Lorenz-96 model. The results show that: (1) the accuracy of LPF surpasses LETKF as the forecast length increases (thus increasing the degree of nonlinearity), (2) the cost of LPF is significantly lower than LETKF as the ensemble size increases, and (3) LPF prevents filter divergence experienced by LETKF in cases with non-Gaussian observation error distributions.
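
    The localisation and weight-interpolation machinery of the LPF is beyond a short snippet, but the Sampling Importance Resampling core it builds on is compact. The Python sketch below runs one SIR step with systematic (deterministic) resampling on an invented scalar model; the noise levels and ensemble size are assumptions.

        import numpy as np

        def sir_step(particles, weights, z, obs_std=0.5, proc_std=0.2, rng=None):
            """One SIR particle filter step for x_k = x_{k-1} + noise, z_k = x_k + noise."""
            rng = rng or np.random.default_rng()
            # Propagate through the (here trivial) dynamics
            particles = particles + proc_std * rng.standard_normal(particles.shape)
            # Importance weighting against a Gaussian observation likelihood
            weights = weights * np.exp(-0.5 * ((z - particles) / obs_std) ** 2)
            weights /= weights.sum()
            # Systematic (deterministic) resampling
            positions = (rng.random() + np.arange(len(particles))) / len(particles)
            idx = np.searchsorted(np.cumsum(weights), positions)
            particles = particles[idx]
            weights = np.full(len(particles), 1.0 / len(particles))
            return particles, weights

        rng = np.random.default_rng(0)
        p = rng.standard_normal(500)                 # prior ensemble around 0
        w = np.full(500, 1.0 / 500)
        p, w = sir_step(p, w, z=1.0, rng=rng)
        print(round(float(np.mean(p)), 2))           # posterior mean shifts toward the observation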

  19. Gas separating and venting filter

    International Nuclear Information System (INIS)

    A gas separating and venting filter is disclosed for separating gases and liquids and venting the gases in any position of the filter. A housing defines an interior chamber, with inlet and outlet means for the flow of liquid into and out of the chamber. A hydrophilic filter membrane extends along one major wall of the chamber, with longitudinally extending open-sided passageways in the one major wall facing the hydrophilic filter membrane and leading to the outlet means. The hydrophilic filter membrane is flexible for ballooning into the passageways in response to a build-up of pressure in the chamber to restrict and/or cut off the flow of liquid through the passageways. A hydrophobic filter membrane extends along substantially the entire length of an opposite major wall of the chamber between the inlet and outlet means for passing gas but not liquid therethrough. A plurality of spaced vent holes are formed in the opposite major wall for venting gas which has passed through the hydrophobic filter membrane

  20. Nanophotonic filters for digital imaging

    Science.gov (United States)

    Walls, Kirsty

    There has been an increasing demand for low cost, portable CMOS image sensors because of increased integration, and new applications in the automotive, mobile communication and medical industries, amongst others. Colour reproduction remains imperfect in conventional digital image sensors, due to the limitations of the dye-based filters. Further improvement is required if the full potential of digital imaging is to be realised. In alternative systems, where accurate colour reproduction is a priority, existing equipment is too bulky for anything but specialist use. In this work both these issues are addressed by exploiting nanophotonic techniques to create enhanced trichromatic filters, and multispectral filters, all of which can be fabricated on-chip, i.e. integrated into a conventional digital image sensor, to create compact, low cost, mass-producible imaging systems with accurate colour reproduction. The trichromatic filters are based on plasmonic structures. They exploit the excitation of surface plasmon resonances in arrays of subwavelength holes in metal films to filter light. The currently known analytical expressions are inadequate for optimising all relevant parameters of a plasmonic structure. In order to obtain arbitrary filter characteristics, an automated design procedure was developed that integrated a genetic algorithm and a 3D finite-difference time-domain tool. The optimisation procedure's efficacy is demonstrated by designing a set of plasmonic filters that replicate the CIE (1931) colour matching functions, which themselves mimic the human eye's daytime colour response.

  1. Factors Influencing HEPA Filter Performance

    International Nuclear Information System (INIS)

    Properly functioning HEPA air filtration systems depend on a variety of factors that start with the use of fully characterized challenge conditions for system design and then process control during operation. This paper addresses factors that should be considered during the design phase as well as operating parameters that can be monitored to ensure filter function and lifetime. HEPA filters used in nuclear applications are expected to meet design, fabrication, and performance requirements set forth in the ASME AG-1 standard. The DOE publication Nuclear Air Cleaning Handbook (NACH) is an additional guidance document for design and operation of HEPA filter systems in DOE facilities. These two guidelines establish basic maximum operating parameters for temperature, maximum aerosol particle size, maximum particulate matter mass concentration, acceptable differential pressure range, and filter media velocity. Each of these parameters is discussed along with data linking variability of each parameter with filter function and lifetime. Temporal uncertainty associated with gas composition, temperature, and absolute pressure of the air flow can have a direct impact on the volumetric flow rate of the system with a corresponding impact on filter media velocity. Correlations between standard units of flow rate (standard cubic meters per minute or cubic feet per minute) and actual units of volumetric flow rate are shown for variations in relative humidity over a 70 deg. C to 200 deg. C temperature range as an example of gas composition that, uncorrected, will influence media velocity. The AG-1 standard establishes a 2.5 cm/s (5 feet per minute) ceiling for media velocities of nuclear grade HEPA filters. Data are presented that show the impact of media velocities from 2.0 to 4.0 cm/s (4 to 8 fpm) on differential pressure, filter efficiency, and filter lifetime. Data will also be presented correlating media velocity effects with two different particle size distributions. (authors)
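
    To make the flow-rate/media-velocity relationship above concrete, the Python sketch below converts a standard volumetric flow to actual conditions with an ideal-gas temperature/pressure correction and divides by the media area; the 28 m3/min flow and 22 m2 of media are hypothetical, while the 2.5 cm/s ceiling is the AG-1 figure quoted above.

        def actual_flow(std_flow_m3_min, t_actual_c, p_actual_kpa,
                        t_std_c=21.1, p_std_kpa=101.325):
            """Convert a standard volumetric flow to actual conditions (ideal-gas correction)."""
            return (std_flow_m3_min
                    * ((t_actual_c + 273.15) / (t_std_c + 273.15))
                    * (p_std_kpa / p_actual_kpa))

        def media_velocity_cm_s(actual_flow_m3_min, media_area_m2):
            """Media (face) velocity through the filter medium in cm/s."""
            return actual_flow_m3_min / media_area_m2 / 60.0 * 100.0

        q_actual = actual_flow(28.0, t_actual_c=70.0, p_actual_kpa=101.325)   # hypothetical 28 m3/min unit
        v = media_velocity_cm_s(q_actual, media_area_m2=22.0)                 # hypothetical 22 m2 of media
        print(round(q_actual, 1), round(v, 2), v <= 2.5)   # compare against the 2.5 cm/s AG-1 ceiling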

  2. Advanced Filtering Techniques Applied to Spaceflight Project

    Data.gov (United States)

    National Aeronautics and Space Administration — IST-Rolla developed two nonlinear filters for spacecraft orbit determination during the Phase I contract. The theta-D filter and the cost based filter, CBF, were...

  3. The impact of ensemble filter definition on the assimilation of temperature profiles in the tropical Pacific

    OpenAIRE

    Leeuwenburgh, O.; Evensen, Geir; Bertino, Laurent

    2005-01-01

    The traditional analysis scheme in the Ensemble Kalman Filter (EnKF) uses a stochastic perturbation or randomization of the measurements which ensures a correct variance in the updated ensemble. An alternative so called deterministic analysis algorithm is based on a square-root formulation where the perturbation of measurements is avoided. Experiments with simple models have indicated that ensemble collapse is likely to occur when deterministic filters are applied to nonlinear problems. I...

  4. Experimental results of a single-phase shunt active filter prototype with different switching techniques

    OpenAIRE

    Neves, Pedro; Pinto, J.G.; Pregitzer, Ricardo G.; Luís F. C. Monteiro; Afonso, João L.; Sepúlveda, João

    2007-01-01

    This paper presents experimental results obtained with a developed single-phase shunt active power filter laboratory prototype operating with different switching techniques. This active filter can compensate harmonic currents and power factor in single-phase electric installations. Its power circuit is based on a two-leg IGBT inverter, with a single capacitor in the dc side, and an inductor in the ac side. Its control system is based on a simple stratagem that enables the use of the tradition...

  5. Grid filter design for a multi-megawatt medium-voltage voltage source inverter

    OpenAIRE

    Rockhill, A.A.; Liserre, M.; TEODORESCU, Remus; Rodríguez Cortés, Pedro

    2010-01-01

    This paper describes the design procedure and performance of an LCL grid filter for a medium-voltage neutral point clamped (NPC) converter to be adopted for a multimegawatt wind turbine. The unique filter design challenges in this application are driven by a combination of the medium voltage converter, a limited allowable switching frequency, component physical size and weight concerns, and the stringent limits for allowable injected current harmonics. Traditional design ...

  6. Application of Archimedes Filter for Reduction of Hanford HLW

    International Nuclear Information System (INIS)

    Archimedes Technology Group, Inc., is developing a plasma mass separator called the Archimedes Filter that separates waste oxide mixtures ion by ion into two mass groups: light and heavy. For the first time, it is feasible to separate large amounts of material atom by atom in a single pass device. Although vacuum ion based electromagnetic separations have been around for many decades, they have traditionally depended on ion beam manipulation. Neutral plasma devices, on the other hand, are much easier, less costly, and permit several orders of magnitude greater throughput. The Filter has many potential applications in areas where separation of species is otherwise difficult or expensive. In particular, radioactive waste sludges at Hanford have been a particularly difficult issue for pretreatment and immobilization. Over 75% of Hanford HLW oxide mass (excluding water, carbon, and nitrogen) has mass less than 59 g/mol. On the other hand, 99.9% of radionuclide activity has mass greater than 89 g/mol. Therefore, Filter mass separation tuned to this cutoff would have a dramatic effect on the amount of IHLW produced--in fact IHLW would be reduced by a factor of at least four. The Archimedes Filter is a brand new tool for the separations specialist's toolbox. In this paper, we show results that describe the extent to which the Filter separates ionized material. Such results provide estimates for the potential advantages of Filter tunability, both in cutoff mass (electric and magnetic fields) and in degree of ionization (plasma power). Archimedes is now engaged in design and fabrication of its Demonstration Filter separator and intends on performing a full-scale treatment of Hanford high-level waste surrogates. The status of the Demo project will be described

  7. Tunable Imaging Filters in Astronomy

    CERN Document Server

    Bland-Hawthorn, J

    2000-01-01

    While tunable filters are a recent development in night time astronomy, they have long been used in other physical sciences, e.g. solar physics, remote sensing and underwater communications. With their ability to tune precisely to a given wavelength using a bandpass optimized for the experiment, tunable filters are already producing some of the deepest narrowband images to date of astrophysical sources. Furthermore, some classes of tunable filters can be used in fast telescope beams and therefore allow for narrowband imaging over angular fields of more than a degree over the sky.

  8. Properties of ceramic candle filters

    Energy Technology Data Exchange (ETDEWEB)

    Pontius, D.H.

    1995-06-01

    The mechanical integrity of ceramic filter elements is a key issue for hot gas cleanup systems. To meet the demands of the advanced power systems, the filter components must sustain the thermal stresses of normal operations (pulse cleaning), of start-up and shut-down conditions, and of unanticipated process upsets such as excessive ash accumulation without catastrophic failure. They must also survive the various mechanical loads associated with handling and assembly, normal operation, and process upsets. For near-term filter systems, these elements must survive at operating temperatures of 1650°F for three years.

  9. Adaptive filtering primer with Matlab

    CERN Document Server

    Poularikas, Alexander D

    2006-01-01

    Adaptive Filtering Primer with MATLAB® clearly explains the fundamentals of adaptive filtering supported by numerous examples and computer simulations. The authors introduce discrete-time signal processing, random variables and stochastic processes, the Wiener filter, properties of the error surface, the steepest descent method, and the least mean square (LMS) algorithm. They also supply many MATLAB® functions and m-files along with computer experiments to illustrate how to apply the concepts to real-world problems. The book includes problems along with hints, suggestions, and solutions for solving them. An appendix on matrix computations completes the self-contained coverage.
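
    As a concrete illustration of the LMS algorithm covered by the book, the short Python sketch below identifies an unknown FIR system from noisy data; the system taps, filter length, and step size are assumed values chosen for the example, not taken from the text.

      import numpy as np

      rng = np.random.default_rng(0)

      # Assumed toy setup: identify an unknown 4-tap FIR system from noisy data.
      h_true = np.array([0.6, -0.3, 0.1, 0.05])   # unknown system (illustrative)
      N, M, mu = 5000, 4, 0.01                     # samples, filter length, step size

      x = rng.standard_normal(N)                                       # input signal
      d = np.convolve(x, h_true)[:N] + 0.01 * rng.standard_normal(N)   # desired + noise

      w = np.zeros(M)                              # adaptive filter coefficients
      for n in range(M, N):
          u = x[n - M + 1:n + 1][::-1]             # most recent M input samples
          e = d[n] - w @ u                         # a-priori error
          w = w + mu * e * u                       # LMS coefficient update

      print("estimated taps:", np.round(w, 3))     # should approach h_true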

  10. Pragmatic circuits signals and filters

    CERN Document Server

    Eccles, William

    2006-01-01

    Pragmatic Circuits: Signals and Filters is built around the processing of signals. Topics include spectra, a short introduction to the Fourier series, design of filters, and the properties of the Fourier transform. The focus is on signals rather than power. But the treatment is still pragmatic. For example, the author accepts the work of Butterworth and uses his results to design filters in a fairly methodical fashion. This third of three volumes finishes with a look at spectra by showing how to get a spectrum even if a signal is not periodic. The Fourier transform provides a way of dealing with such aperiodic signals.
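
    As a rough illustration of the methodical, Butterworth-based design the book describes, the following sketch designs and applies a low-pass filter with SciPy; the order, cut-off frequency, and sampling rate are assumptions for the example, not values from the book.

      import numpy as np
      from scipy.signal import butter, lfilter

      fs = 1000.0      # sampling rate in Hz (assumed)
      fc = 50.0        # desired cut-off frequency in Hz (assumed)
      order = 4        # Butterworth order (assumed)

      # Design a digital low-pass Butterworth filter.
      b, a = butter(order, fc / (fs / 2), btype="low")

      # Apply it to a signal containing a 10 Hz tone plus 200 Hz interference.
      t = np.arange(0, 1, 1 / fs)
      x = np.sin(2 * np.pi * 10 * t) + 0.5 * np.sin(2 * np.pi * 200 * t)
      y = lfilter(b, a, x)

      print("output-to-input power ratio:", np.round(np.var(y) / np.var(x), 3))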

  11. ADVANCED HOT GAS FILTER DEVELOPMENT

    Energy Technology Data Exchange (ETDEWEB)

    E.S. Connolly; G.D. Forsythe

    2000-09-30

    DuPont Lanxide Composites, Inc. undertook a sixty-month program, under DOE Contract DEAC21-94MC31214, in order to develop hot gas candle filters from a patented material technology known as PRD-66. The goal of this program was to extend the development of this material as a filter element and fully assess the capability of this technology to meet the needs of Pressurized Fluidized Bed Combustion (PFBC) and Integrated Gasification Combined Cycle (IGCC) power generation systems at commercial scale. The principal objective of Task 3 was to build on the initial PRD-66 filter development, optimize its structure, and evaluate basic material properties relevant to the hot gas filter application. Initially, this consisted of an evaluation of an advanced filament-wound core structure that had been designed to produce an effective bulk filter underneath the barrier filter formed by the outer membrane. The basic material properties to be evaluated (as established by the DOE/METC materials working group) would include mechanical, thermal, and fracture toughness parameters for both new and used material, for the purpose of building a material database consistent with what is being done for the alternative candle filter systems. Task 3 was later expanded to include analysis of PRD-66 candle filters, which had been exposed to actual PFBC conditions, development of an improved membrane, and installation of equipment necessary for the processing of a modified composition. Task 4 would address essential technical issues involving the scale-up of PRD-66 candle filter manufacturing from prototype production to commercial scale manufacturing. The focus would be on capacity (as it affects the ability to deliver commercial order quantities), process specification (as it affects yields, quality, and costs), and manufacturing systems (e.g. QA/QC, materials handling, parts flow, and cost data acquisition). Any filters fabricated during this task would be used for product qualification tests being conducted by Westinghouse at Foster-Wheeler's Pressurized Circulating Fluidized Bed (PCFBC) test facility in Karhula, Finland. Task 5 was designed to demonstrate the improvements implemented in Task 4 by fabricating fifty 1.5-meter hot gas filters. These filters were to be made available for DOE-sponsored field trials at the Power Systems Development Facility (PSDF), operated by Southern Company Services in Wilsonville, Alabama.

  12. Advanced simulation of digital filters

    Science.gov (United States)

    Doyle, G. S.

    1980-09-01

    An Advanced Simulation of Digital Filters has been implemented on the IBM 360/67 computer utilizing Tektronix hardware and software. The program package is appropriate for use by persons beginning their study of digital signal processing or for filter analysis. The ASDF programs provide the user with an interactive method by which filter pole and zero locations can be manipulated. Graphical output on both the Tektronix graphics screen and the Versatec plotter is provided to observe the effects of pole-zero movement.
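
    An interactive pole-zero study of this kind can be approximated today in a few lines of Python; the pole and zero locations below are arbitrary placeholders rather than anything from the original IBM 360/67 package.

      import numpy as np
      from scipy.signal import zpk2tf, freqz

      # Illustrative pole/zero placement for a digital filter (assumed values).
      zeros = [1.0, -1.0]                        # zeros at DC and Nyquist
      poles = [0.9 * np.exp(1j * np.pi / 4),     # resonant pole pair inside unit circle
               0.9 * np.exp(-1j * np.pi / 4)]

      b, a = zpk2tf(zeros, poles, 1.0)           # convert to transfer-function form
      w, h = freqz(b, a, worN=512)               # evaluate the frequency response

      peak = w[np.argmax(np.abs(h))]
      print(f"response peaks near {peak:.2f} rad/sample")   # near pi/4, set by the poles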

  13. Face Recognition using Gabor Filters

    Directory of Open Access Journals (Sweden)

    Sajjad MOHSIN

    2011-01-01

    Full Text Available An Elastic Bunch Graph Map (EBGM) algorithm is proposed in this research paper that implements face recognition using Gabor filters. The proposed system applies 40 different Gabor filters to an image. As a result, 40 filtered images with different angles and orientations are obtained. Next, maximum-intensity points in each filtered image are calculated and marked as fiducial points. The system reduces these points according to the distance between them. The next step is calculating the distances between the reduced points using the distance formula. Finally, the distances are compared with the database; if a match occurs, the image is recognized.
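
    The filter-bank stage described above can be sketched as follows; the kernel size and the 5-scale by 8-orientation parameter grid (40 filters) follow a common EBGM convention and are assumptions, since the paper's exact parameters are not given here.

      import numpy as np
      from scipy.signal import fftconvolve

      def gabor_kernel(size, wavelength, theta, sigma, gamma=0.5):
          """Real part of a Gabor kernel: Gaussian envelope times a cosine carrier."""
          half = size // 2
          y, x = np.mgrid[-half:half + 1, -half:half + 1]
          xr = x * np.cos(theta) + y * np.sin(theta)     # rotate coordinates
          yr = -x * np.sin(theta) + y * np.cos(theta)
          envelope = np.exp(-(xr**2 + gamma**2 * yr**2) / (2 * sigma**2))
          return envelope * np.cos(2 * np.pi * xr / wavelength)

      # Assumed 5 scales x 8 orientations = 40 filters, as is typical for EBGM.
      bank = [gabor_kernel(size=21, wavelength=wl, theta=th, sigma=wl / 2)
              for wl in (4, 6, 8, 11, 16)
              for th in np.arange(8) * np.pi / 8]

      image = np.random.rand(64, 64)                     # stand-in for a face image
      response = fftconvolve(image, bank[0], mode="same")
      fiducial = np.unravel_index(np.argmax(np.abs(response)), response.shape)
      print(len(bank), "filters; strongest response of filter 0 at pixel", fiducial)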

  14. Simplified design of filter circuits

    CERN Document Server

    Lenk, John

    1999-01-01

    Simplified Design of Filter Circuits, the eighth book in this popular series, is a step-by-step guide to designing filters using off-the-shelf ICs. The book starts with the basic operating principles of filters and common applications, then moves on to describe how to design circuits by using and modifying chips available on the market today. Lenk's emphasis is on practical, simplified approaches to solving design problems. Contains practical designs using off-the-shelf ICs; straightforward, no-nonsense approach; highly illustrated with manufacturer's data sheets.

  15. Gas cleaning with Granular Filters

    OpenAIRE

    Natvig, Ingunn Roald

    2007-01-01

    The panel bed filter (PBF) is a granular filter patented by A. M. Squires in the late sixties. PBFs consist of louvers with stationary, granular beds. Dust is deposited in the top layers and on the bed surface when gas flows through. PBFs are resistant to high temperatures, variations in the gas flow and hot particles. The filter is cleaned by releasing a pressure pulse in the opposite direction of the bulk flow (a puff back pulse). A new louver geometry patented by A. M. Squires is the...

  16. Attitude Representations for Kalman Filtering

    Science.gov (United States)

    Markley, F. Landis; Bauer, Frank H. (Technical Monitor)

    2001-01-01

    The four-component quaternion has the lowest dimensionality possible for a globally nonsingular attitude representation, it represents the attitude matrix as a homogeneous quadratic function, and its dynamic propagation equation is bilinear in the quaternion and the angular velocity. The quaternion is required to obey a unit norm constraint, though, so Kalman filters often employ a quaternion for the global attitude estimate and a three-component representation for small errors about the estimate. We consider these mixed attitude representations for both a first-order Extended Kalman filter and a second-order filter, as well as for quaternion-norm-preserving attitude propagation.
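
    A minimal sketch of the quaternion propagation and re-normalization step that such filters rely on is given below; the rotation rate, time step, and integration scheme are illustrative and do not reproduce the authors' filter.

      import numpy as np

      def quat_mult(q, p):
          """Hamilton product of quaternions stored as [x, y, z, w]."""
          x1, y1, z1, w1 = q
          x2, y2, z2, w2 = p
          return np.array([
              w1 * x2 + x1 * w2 + y1 * z2 - z1 * y2,
              w1 * y2 - x1 * z2 + y1 * w2 + z1 * x2,
              w1 * z2 + x1 * y2 - y1 * x2 + z1 * w2,
              w1 * w2 - x1 * x2 - y1 * y2 - z1 * z2,
          ])

      def propagate(q, omega, dt):
          """Advance the attitude quaternion by body rate omega [rad/s] over dt seconds."""
          angle = np.linalg.norm(omega) * dt
          axis = omega / np.linalg.norm(omega) if angle > 0 else np.zeros(3)
          dq = np.concatenate([axis * np.sin(angle / 2), [np.cos(angle / 2)]])
          q_new = quat_mult(q, dq)
          return q_new / np.linalg.norm(q_new)   # enforce the unit-norm constraint

      q = np.array([0.0, 0.0, 0.0, 1.0])          # identity attitude
      q = propagate(q, omega=np.array([0.0, 0.0, 0.1]), dt=0.5)
      print(q, np.linalg.norm(q))                 # norm stays 1 after propagation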

  17. Thick Filter Transmission Measurements

    International Nuclear Information System (INIS)

    In multigroup schemes for the calculation of the neutron space-energy distribution, the fluxes <F(r)>g and reaction rates <σa*F(r)>g averaged over energy intervals g (groups) are determined. The group transmissions and cross sections with self-indication averaged over the resonances in the group, Tg(n) = <exp(-n*σ)>g and <σa(n)>g = <σa*exp(-n*σ)>g, are considered here as the basic input information about elementary processes. In principle, for the resolved resonance region these cross-section functionals can be determined directly by using the data from evaluated data files. The situation in the region of unresolved resonance levels, where direct information about the cross-section resonance structure in the averaging interval is not available, is much more complicated. Indirectly, the effect of the resonance structure appears as a disagreement of thick-sample transmission data with the calculated exp(-n*<σ>g), in dependence on the sample thickness n. The strong dependence of the average transmissions at large n on the cross-section values in resonance minima and resonance wings implies a requirement for validation of the evaluated data files, with possible corrections, against the results of direct transmission measurements for relatively thick samples with beam attenuation of 100 - 1000 times. Thick-sample transmission measurements can be treated as the simplest benchmark experiments. They are the only integral measurements easily reproduced from evaluated files and can be used for validation and improvement of the evaluated libraries. The main advantage of using such measurements for this purpose is the possibility of achieving high accuracy in the experiment. A brief overview of the available thick-filter transmission and self-indication data, including our previously obtained results, is presented. The prospects of new high-intensity neutron sources for thick-sample transmission measurements are discussed, concerning the possibility of reaching greater attenuation of the neutron beam and enlarging the number and variety of nuclei involved in these investigations. (author)
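
    The gap between <exp(-n*σ)>g and exp(-n*<σ>g) that thick-sample experiments probe can be illustrated numerically; the two-level cross-section model below is a toy assumption, not data from any evaluated file.

      import numpy as np

      # Toy resonance structure inside one group: the cross section spends 20% of the
      # group in a resonance peak and 80% near the potential-scattering floor (barns).
      sigma = np.where(np.linspace(0, 1, 10000) < 0.2, 100.0, 2.0)
      sigma_mean = sigma.mean()                      # group-average cross section

      for n in (0.01, 0.1, 0.5):                     # sample thickness in atoms/barn
          t_true = np.mean(np.exp(-n * sigma))       # <exp(-n*sigma)> over the group
          t_naive = np.exp(-n * sigma_mean)          # exp(-n*<sigma>)
          print(f"n={n:4}: <exp(-n*sigma)>={t_true:.4f}  exp(-n*<sigma>)={t_naive:.5f}")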

  18. LMS order statistic filters adaptation by backpropagation

    OpenAIRE

    Pitas, I.; Vougioukas, S.

    2010-01-01

    A novel class of nonlinear adaptive filters based on order statistics is presented. An LMS algorithm for their adaptation is proposed. This algorithm is essentially a backpropagation algorithm for the adaptation of coefficients that are used before data sorting. The nonlinear filters that can become adaptive by the techniques presented in this paper are the median hybrid filter, the general nonlinear filter structure, the L-filters and the Ll-filters.

  19. Block implementation of adaptive digital filters

    International Nuclear Information System (INIS)

    Block digital filtering involves the calculation of a block or finite set of filter outputs from a block of input values. This paper presents a block adaptive filtering procedure in which the filter coefficients are adjusted once per output block in accordance with a generalized least mean-square (LMS) algorithm. Analyses of convergence properties and computational complexity show that the block adaptive filter permits fast implementations while maintaining performance equivalent to that of the widely used LMS adaptive filter
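
    A minimal block-LMS sketch follows, with the coefficients updated once per block from the errors accumulated over that block; the block length, step size, and signal model are assumed for illustration.

      import numpy as np

      rng = np.random.default_rng(1)
      h_true = np.array([0.5, -0.4, 0.2])            # unknown system (illustrative)
      N, M, L, mu = 4096, 3, 64, 0.005               # samples, taps, block length, step

      x = rng.standard_normal(N)
      d = np.convolve(x, h_true)[:N]                 # desired response

      w = np.zeros(M)
      for start in range(M, N - L, L):
          grad = np.zeros(M)
          for n in range(start, start + L):          # accumulate the gradient over a block
              u = x[n - M + 1:n + 1][::-1]
              e = d[n] - w @ u
              grad += e * u
          w = w + mu * grad                          # one coefficient update per block

      print("estimated taps:", np.round(w, 3))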

  20. Characterizing a Tune-all bandstop filter

    OpenAIRE

    Musoll, Carles; Llamas Garro, Ignacio; Brito Brito, Zabdiel; Pradell i Cara, Lluís; Corona, Alfonso

    2009-01-01

    In this paper a reconfigurable bandstop filter able to reconfigure central frequency, bandwidth and selectivity for fine tuning applications is presented. The reconfigurable filter topology has four poles and a quasielliptic bandstop filter response. The filter is tuned by varactor diodes placed at different locations on the filter topology. The varactors are voltage controlled in pairs due to filter symmetry for central frequency and bandwidth control. An additional v...

  1. Infusing Qualitative Traditions in Counseling Research Designs

    Science.gov (United States)

    Hays, Danica G.; Wood, Chris

    2011-01-01

    Research traditions serve as a blueprint or guide for a variety of design decisions throughout qualitative inquiry. This article presents 6 qualitative research traditions: grounded theory, phenomenology, consensual qualitative research, ethnography, narratology, and participatory action research. For each tradition, the authors describe its…

  2. Exploring Oral Traditions through the Written Text.

    Science.gov (United States)

    Metting, Fred

    1995-01-01

    Argues that, by reading literature that incorporates folklore and oral traditions, students learn to recognize and appreciate how oral traditions have influenced all cultures. Argues that a study of contemporary American written literature which incorporates elements of the oral tradition introduces students to old and deep wisdom and to a diverse…

  3. Digital notch filter based active damping for LCL filters

    DEFF Research Database (Denmark)

    Yao, Wenli; Yang, Yongheng

    2015-01-01

    LCL filters are widely used in Pulse Width Modulation (PWM) inverters. However, the LCL filter also introduces a pair of poorly damped resonant poles that may challenge the controller stability. Passive damping is a convenient way to tackle the resonance problem, but at the cost of overall system efficiency. In contrast, active damping does not require any dissipative elements and has therefore become of increasing interest. As a result, a large number of active damping solutions have been reported, many of which require multi-loop control systems and additional sensors, leading to increased cost and complexity. In this paper, a notch-filter-based active damping scheme without the requirement of additional sensors is proposed, where the inverter current is employed as the feedback variable. Firstly, a design method of the notch filter for active damping is presented. The entire system stability has then been investigated, which reveals that negative variations of the resonant frequency can seriously affect the system stability. In order to make the controller more robust against grid impedance variations, the notch filter frequency is designed to be lower than the LCL filter resonant frequency, which is done in the z-domain. Simulations and experiments are carried out to verify the proposed active damping method. Both results confirm that the notch-filter-based active damping can ensure the entire system stability in the case of resonances with good system performance.
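
    To make the design idea concrete, the sketch below places a digital notch slightly below the LCL resonance, in the spirit of the robustness argument above; the inductances, capacitance, sampling frequency, and detuning factor are illustrative assumptions, not values from the paper.

      import numpy as np
      from scipy.signal import iirnotch, freqz

      # Assumed LCL filter parameters (per phase) and control sampling frequency.
      L1, L2, Cf = 2.0e-3, 1.0e-3, 10.0e-6          # H, H, F
      fs = 10_000.0                                  # Hz

      # LCL resonance frequency seen from the converter side.
      f_res = 1.0 / (2 * np.pi * np.sqrt(Cf * L1 * L2 / (L1 + L2)))

      # Place the notch a little below resonance (assumed 10% detuning) so that a drop
      # in resonance frequency due to grid impedance does not destabilize the loop.
      f_notch = 0.9 * f_res
      b, a = iirnotch(w0=f_notch, Q=2.0, fs=fs)

      w, h = freqz(b, a, worN=2048, fs=fs)
      gain_db = 20 * np.log10(np.abs(h[np.argmin(np.abs(w - f_notch))]))
      print(f"resonance ~{f_res:.0f} Hz, notch at {f_notch:.0f} Hz, gain there {gain_db:.1f} dB")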

  4. Concrete ensemble Kalman filters with rigorous catastrophic filter divergence.

    Science.gov (United States)

    Kelly, David; Majda, Andrew J; Tong, Xin T

    2015-08-25

    The ensemble Kalman filter and ensemble square root filters are data assimilation methods used to combine high-dimensional, nonlinear dynamical models with observed data. Ensemble methods are indispensable tools in science and engineering and have enjoyed great success in geophysical sciences, because they allow for computationally cheap low-ensemble-state approximation for extremely high-dimensional turbulent forecast models. From a theoretical perspective, the dynamical properties of these methods are poorly understood. One of the central mysteries is the numerical phenomenon known as catastrophic filter divergence, whereby ensemble-state estimates explode to machine infinity, despite the true state remaining in a bounded region. In this article we provide a breakthrough insight into the phenomenon, by introducing a simple and natural forecast model that transparently exhibits catastrophic filter divergence under all ensemble methods and a large set of initializations. For this model, catastrophic filter divergence is not an artifact of numerical instability, but rather a true dynamical property of the filter. The divergence is not only validated numerically but also proven rigorously. The model cleanly illustrates mechanisms that give rise to catastrophic divergence and confirms intuitive accounts of the phenomena given in past literature. PMID:26261335
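
    For orientation, a single perturbed-observation ensemble Kalman filter analysis step for a generic linear observation looks roughly as follows; the state dimension, ensemble size, and noise levels are placeholders unrelated to the forecast model constructed in the paper.

      import numpy as np

      rng = np.random.default_rng(2)

      d, m, K = 3, 1, 20                           # state dim, obs dim, ensemble size
      H = np.array([[1.0, 0.0, 0.0]])              # observe the first state component
      R = np.array([[0.1]])                        # observation error covariance

      ensemble = rng.standard_normal((K, d))       # forecast ensemble (placeholder)
      y_obs = np.array([0.7])                      # the new observation (placeholder)

      # Sample covariance of the forecast ensemble.
      X = ensemble - ensemble.mean(axis=0)
      P = X.T @ X / (K - 1)

      # Kalman gain and perturbed-observation update of each member.
      S = H @ P @ H.T + R
      gain = P @ H.T @ np.linalg.inv(S)
      for k in range(K):
          perturbed = y_obs + rng.multivariate_normal(np.zeros(m), R)
          ensemble[k] += gain @ (perturbed - H @ ensemble[k])

      print("analysis mean:", np.round(ensemble.mean(axis=0), 3))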

  5. Abortion traditions in rural Jamaica.

    Science.gov (United States)

    Sobo, E J

    1996-02-01

    Abortion is not condoned in Jamaica. Its meaning is linked to the meanings of kinship and parenthood, which are expressed through procreation and involve altruism and the assumption of responsibility for the well-being of others. Abortion subverts these ideals but indigenous methods for it are known and are secretly used. The inconsistencies between abortion talk and abortion practice are examined, and the structural functions of abortion (and of its culturally constructed, ideological meaning) are discussed. The distinction--and the overlap--between abortion as such and menstrual regulation is explored. The use of the culturally constructed 'witchcraft baby' syndrome to justify abortion is also investigated. Traditional abortion techniques follow from (and can illuminate) general health practices, which focus on inducing the ejection of 'blockages' and toxins, and from ethnophysiological beliefs about procreation and reproductive health, which easily allow for menstrual delays not caused by conception. The latter understanding and the similarity between abortifacients, emmenagogues and general purgatives allows women flexibility in interpreting the meanings of their missed periods and the physical effects of the remedy. PMID:8643976

  6. Kazakh Traditional Dance Gesture Recognition

    Science.gov (United States)

    Nussipbekov, A. K.; Amirgaliyev, E. N.; Hahn, Minsoo

    2014-04-01

    Full body gesture recognition is an important and interdisciplinary research field which is widely used in many application areas, including dance gesture recognition. The rapid growth of technology in recent years has brought many contributions to this domain; however, it remains a challenging task. In this paper we implement Kazakh traditional dance gesture recognition. We use a Microsoft Kinect camera to obtain human skeleton and depth information. We then apply a tree-structured Bayesian network and the Expectation Maximization algorithm with K-means clustering to calculate conditional linear Gaussians for classifying poses. Finally, we use a Hidden Markov Model to detect dance gestures. Our main contribution is that we extend the Kinect skeleton by adding headwear as a new skeleton joint, calculated from the depth image. This novelty allows us to significantly improve the accuracy of head gesture recognition of a dancer, which in turn plays a considerable role in whole-body gesture recognition. Experimental results show the efficiency of the proposed method and that its performance is comparable to that of state-of-the-art systems.

  7. Nutraceutical enriched Indian traditional chikki.

    Science.gov (United States)

    Ramakrishna, Chetana; Pamisetty, Aruna; Reddy, Sunki Reddy Yella

    2015-08-01

    Chikki, or peanut brittle, a traditional sweet snack, was chosen as a vehicle for enrichment with natural nutraceuticals through added herbs. The formulation and process for the preparation of chikki with added herbs like ashwagandha (Withania somnifera), tulasi (Ocimum sanctum L.) and ajwain (Trachyspermum ammi S.) were standardized. The polyphenol content of chikki with added herbs ranged from 0.29 to 0.46 g/100 g. Among the herbs, ajwain showed the most potent antioxidant activity followed by tulasi, whereas ashwagandha and the product prepared with it showed the least activity. Total carotenoid contents of chikki with added herbs ranged between 1.5 and 4.3 mg/100 g. Storage studies showed that chikki prepared with tulasi and ajwain remained sensorially acceptable for up to 90 days, while rancid notes were observed in the control and in chikki with added ashwagandha at the end of 30 days. Thus chikki with added herbs, in addition to containing natural nutraceuticals like polyphenols and carotenoids, had improved storage stability compared to the control. PMID:26243935

  8. Integrated Spatial Filter Array Project

    Data.gov (United States)

    National Aeronautics and Space Administration — To address the NASA Earth Science Division need for spatial filter arrays for amplitude and wavefront control, Luminit proposes to develop a novel Integrated...

  9. Matched Spectral Filter Imager Project

    Data.gov (United States)

    National Aeronautics and Space Administration — OPTRA proposes the development of an imaging spectrometer for greenhouse gas and volcanic gas imaging based on matched spectral filtering and compressive imaging....

  10. A matched filter for chaos.

    Science.gov (United States)

    Corron, Ned J; Blakely, Jonathan N; Stahl, Mark T

    2010-06-01

    A novel chaotic oscillator is shown to admit an exact analytic solution and a simple matched filter. The oscillator is a hybrid dynamical system including both a differential equation and a discrete switching condition. The analytic solution is written as a linear convolution of a symbol sequence and a fixed basis function, similar to that of conventional communication waveforms. Waveform returns at switching times are shown to be conjugate to a chaotic shift map, effectively proving the existence of chaos in the system. A matched filter in the form of a delay differential equation is derived for the basis function. Applying the matched filter to a received waveform, the bit error rate for detecting symbols is derived, and explicit closed-form expressions are presented for special cases. The oscillator and matched filter are realized in a low-frequency electronic circuit. Remarkable agreement between the analytic solution and the measured chaotic waveform is observed. PMID:20590319
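
    In discrete time, the matched-filter idea of correlating a received waveform against a known basis function can be sketched as below; the basis pulse and symbol sequence are generic stand-ins, not the chaotic basis function derived in the paper.

      import numpy as np

      rng = np.random.default_rng(3)

      # Generic basis pulse and random bipolar symbol sequence (stand-ins only).
      pulse = np.hanning(32)
      symbols = rng.choice([-1.0, 1.0], size=50)

      # Build the transmitted waveform as a linear combination of shifted pulses,
      # then add channel noise.
      waveform = np.zeros(len(symbols) * len(pulse))
      for i, s in enumerate(symbols):
          waveform[i * len(pulse):(i + 1) * len(pulse)] += s * pulse
      received = waveform + 0.5 * rng.standard_normal(waveform.size)

      # Matched filter: correlate with the time-reversed pulse, sample at the symbol rate.
      mf_out = np.convolve(received, pulse[::-1], mode="full")
      samples = mf_out[len(pulse) - 1::len(pulse)][:len(symbols)]
      decisions = np.sign(samples)

      print("bit errors:", int(np.sum(decisions != symbols)), "of", len(symbols))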

  11. Periodic systems filtering and control

    CERN Document Server

    Bittanti, Sergio

    2008-01-01

    This book offers a comprehensive treatment of the theory of periodic systems, including the problems of filtering and control. It covers an array of topics, presenting an overview of the field and focusing on discrete-time signals and systems..

  12. Challenging traditional authority in the platinum belt

    Scientific Electronic Library Online (English)

    Boitumelo, Matlala.

    2014-09-01

    Full Text Available Members of the Bakgatla-ba-Kgafela traditional community have attempted to hold their traditional leader to account for decisions affecting the community. This article describes the interactions between some community members, traditional leaders, the state and courts, as members of the community have sought to challenge unilateral action by the traditional leader with regard to how community assets and revenue are managed and accounted for. The article examines the various actions groups and individuals have resorted to in an effort to confront traditional leadership and appeal to politicians, officials and the North West provincial government.

  13. Stochastic processes and filtering theory

    CERN Document Server

    Jazwinski, Andrew H

    2007-01-01

    This unified treatment of linear and nonlinear filtering theory presents material previously available only in journals, and in terms accessible to engineering students. Its sole prerequisites are advanced calculus, the theory of ordinary differential equations, and matrix analysis. Although theory is emphasized, the text discusses numerous practical applications as well. Taking the state-space approach to filtering, this text models dynamical systems by finite-dimensional Markov processes, outputs of stochastic difference and differential equations. Starting with background material on probability...

  14. Quantized, piecewise linear filter network

    DEFF Research Database (Denmark)

    Sørensen, John Aasted

    1993-01-01

    A quantization-based piecewise linear filter network is defined. A method for the training of this network based on local approximation in the input space is devised. The training is carried out by repeatedly alternating between vector quantization of the training set into quantization classes and equalization of the quantization classes' linear-filter mean-square training errors. The equalization of the mean-square training errors is carried out by adapting the boundaries between neighboring quantization classes...
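
    A rough sketch of the alternation described above follows: input vectors are vector-quantized, and a separate linear filter is fit by least squares within each quantization class; the data, number of classes, and filter length are assumptions for the example.

      import numpy as np

      rng = np.random.default_rng(4)

      M, Q, N = 4, 3, 3000                          # filter length, classes, samples
      X = rng.standard_normal((N, M))               # input vectors
      # Piecewise-linear target: the active linear map depends on the sign of x[0].
      d = np.where(X[:, 0] > 0, X @ [1.0, 0.5, 0.0, 0.0], X @ [-0.5, 0.0, 1.0, 0.0])

      centers = X[rng.choice(N, Q, replace=False)].copy()   # initial quantizer codebook
      for _ in range(10):
          # Vector quantization: assign each input vector to its nearest center.
          labels = np.argmin(((X[:, None, :] - centers) ** 2).sum(axis=2), axis=1)
          for q in range(Q):
              if np.any(labels == q):               # keep the old center if a class empties
                  centers[q] = X[labels == q].mean(axis=0)
          # Fit one linear filter per quantization class by least squares.
          filters = [np.linalg.lstsq(X[labels == q], d[labels == q], rcond=None)[0]
                     if np.any(labels == q) else np.zeros(M)
                     for q in range(Q)]

      errors = [float(np.mean((X[labels == q] @ filters[q] - d[labels == q]) ** 2))
                for q in range(Q) if np.any(labels == q)]
      print("per-class mean-square errors:", np.round(errors, 4))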

  15. The Kalman-Levy filter

    OpenAIRE

    SORNETTE, D; Ide, K.

    2000-01-01

    The Kalman filter combines forecasts and new observations to obtain an estimate which is optimal in the sense of a minimum average quadratic error. The Kalman filter has two main restrictions: (i) the dynamical system is assumed linear and (ii) forecasting errors and observational noises are taken to be Gaussian. Here, we offer an important generalization to the case where errors and noises have heavy-tail distributions such as power laws and Lévy laws. The main tool needed to...
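
    For reference, the standard linear-Gaussian Kalman recursion that this work generalizes can be written in a few lines; the scalar random-walk model below is only an illustration.

      import numpy as np

      rng = np.random.default_rng(5)

      # Scalar random-walk state observed in Gaussian noise (illustrative model).
      q_var, r_var = 0.01, 0.25          # process and observation noise variances
      x_true, x_est, p_est = 0.0, 0.0, 1.0

      for _ in range(100):
          x_true += rng.normal(0, np.sqrt(q_var))             # true state evolves
          y = x_true + rng.normal(0, np.sqrt(r_var))          # noisy observation

          # Forecast step.
          x_pred, p_pred = x_est, p_est + q_var
          # Update step: the gain minimizes the average quadratic estimation error.
          gain = p_pred / (p_pred + r_var)
          x_est = x_pred + gain * (y - x_pred)
          p_est = (1 - gain) * p_pred

      print(f"final estimate {x_est:.3f}, truth {x_true:.3f}, variance {p_est:.4f}")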

  16. Filter indexing for spectrophotometer system

    International Nuclear Information System (INIS)

    A spectrophotometer system has an optical system for transmitting a beam from a source at select wavelengths onto a detector. A plurality of filters are positioned in a tray. A stepper mechanism indexes the tray along a path. A microcomputer controls the stepper mechanism and the optical system. The wavelength is successively changed over a range, the tray is indexed to move a select filter into the beam at a predetermined wavelength and the changing is discontinued during indexing

  17. Narrow-Band Microwave Filters

    Directory of Open Access Journals (Sweden)

    A.V. Strizhachenko

    2010-01-01

    Full Text Available An original design of narrow-band compact filters based on a high-quality waveguide-dielectric resonator with anisotropic materials is presented in this work. The designed filters satisfy contradictory requirements: they provide a narrow frequency band (0.05 ÷ 0.1 % of the main frequency f0) and low initial losses α0 ≤ 1 dB.

  18. Contraception: traditional and religious attitudes.

    Science.gov (United States)

    Schenker, J G; Rabenou, V

    1993-04-01

    Humans have tried to control fertility for centuries. Primitive, preliterate societies practiced infanticide and abortion. When primitive women understood the advantages of conception control, they tried, when possible, to use contraception. In the 4th century B.C., Plato and Aristotle advocated a one-child family. Greek medical literature reported a hollow tube inserted through the cervix into the uterus and a potion as contraceptives. Islamic physicians had much knowledge about conception control. Religious attitudes toward contraception have varied. In the 5th century A.D., Saint Augustine condemned contraception, even among married couples. The condom emerged in the early modern period. Yet, condoms were usually worn to protect against disease, e.g., bilharzia in Egypt and syphilis in Europe. The cervical cap and the diaphragm are examples of occlusive pessaries. By 1880, contraceptives and spermicides were advertised. In 1928, the IUD joined the existing contraceptives. Today we have combined oral contraceptives. Judaic law requires husbands to fulfill their wives' sexual needs, separate from their duty to procreate. It also calls men, not women, to procreate and forbids men from masturbating; thus Judaic law does not forbid women from practicing contraception. The Roman Catholic church forbids contraceptive use because it is a sin against nature. Some Protestant denominations have allowed contraceptive use. Islamic law states that children are gifts from Allah. Some Moslems believe that they must have many children, but Allah and the Prophet state that children have rights to education and future security. These rights allow couples to prevent pregnancy. Neither Hinduism nor Buddhism prohibits contraceptive use. Differences in husband-wife communication, sex roles, access to contraceptives, and traditional family values will have more of an effect on contraceptive use and fertility than theological barriers or the social class of religious groups. PMID:8365507

  19. Photonic Color Filters Integrated with Organic Solar Cells for Energy Harvesting

    KAUST Repository

    Park, Hui Joon

    2011-09-27

    Color filters are indispensable in most color display applications. In most cases, they are chemical pigment-based filters, which produce a particular color by absorbing its complementary color, and the absorbed energy is totally wasted. If the absorbed and wasted energy can be utilized, e.g., to generate electricity, innovative energy-efficient electronic media could be envisioned. Here we show photonic nanostructures incorporated with photovoltaics capable of producing desirable colors in the visible band while utilizing the absorbed light to simultaneously generate electrical power. In contrast to traditional colorant-based filters, these devices offer great advantages for electro-optic applications. © 2011 American Chemical Society.

  20. LC Filter Design for Wide Band Gap Device Based Adjustable Speed Drives

    DEFF Research Database (Denmark)

    Vadstrup, Casper; Wang, Xiongfei; Blaabjerg, Frede

    The high output dV/dt of the inverter stresses the motor and must therefore be reduced. This may be accomplished by a sine LC filter with DC link feedback, which reduces both the differential-mode and common-mode dV/dt. Wide band gap devices are capable of switching at frequencies much higher than traditional Si based devices. This makes it possible to design the LC filter with a higher cut-off frequency and without damping resistors. The inductance and capacitance are chosen based on the capacitor voltage ripple and current ripple. The filter adds a base load to the inverter, which increases the inverter losses. It is also shown how the modulation index affects the capacitor and inverter currents.
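
    A first-cut selection of the LC values from a target cut-off frequency and a current-ripple limit can be sketched as below; the DC-link voltage, switching frequency, ripple limit, and the two-level phase-leg ripple formula are assumptions for illustration, not figures or methods from the paper.

      import numpy as np

      # Assumed drive parameters for a first-cut sine-filter sizing.
      f_sw = 50_000.0        # switching frequency of a wide-band-gap inverter [Hz]
      f_c_target = 5_000.0   # desired filter cut-off, well below f_sw [Hz]
      i_ripple_max = 2.0     # allowed peak-to-peak inductor current ripple [A]
      v_dc = 600.0           # DC-link voltage [V]

      # Inductance from the worst-case current ripple of a two-level phase leg
      # (ripple is largest at 50% duty): di = Vdc / (4 * L * f_sw).
      L = v_dc / (4 * i_ripple_max * f_sw)

      # Capacitance from the LC cut-off frequency: f_c = 1 / (2*pi*sqrt(L*C)).
      C = 1.0 / ((2 * np.pi * f_c_target) ** 2 * L)

      print(f"L = {L*1e3:.2f} mH, C = {C*1e6:.2f} uF, "
            f"f_c = {1/(2*np.pi*np.sqrt(L*C)):.0f} Hz")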