WorldWideScience

Sample records for traditional filtered back-projection

  1. Coronary CT angiography: image quality, diagnostic accuracy, and potential for radiation dose reduction using a novel iterative image reconstruction technique - comparison with traditional filtered back projection

    International Nuclear Information System (INIS)

    To compare image noise, image quality and diagnostic accuracy of coronary CT angiography (cCTA) using a novel iterative reconstruction algorithm versus traditional filtered back projection (FBP), and to estimate the potential for radiation dose savings. Sixty-five consecutive patients (48 men; 59.3 ± 7.7 years) prospectively underwent cCTA and coronary catheter angiography (CCA). Full radiation dose data, using all projections, were reconstructed with FBP. To simulate image acquisition at half the radiation dose, 50% of the projections were discarded from the raw data. The resulting half-dose data were reconstructed with sinogram-affirmed iterative reconstruction (SAFIRE). Full-dose FBP and half-dose iterative reconstructions were compared with regard to image noise and image quality, and their respective accuracy for stenosis detection was compared against CCA. Compared with full-dose FBP, half-dose iterative reconstructions showed significantly (p = 0.001 to p = 0.025) lower image noise and slightly higher image quality. Iterative reconstruction improved the accuracy of stenosis detection compared with FBP (per-patient: accuracy 96.9% vs. 93.8%, sensitivity 100% vs. 100%, specificity 94.6% vs. 89.2%, NPV 100% vs. 100%, PPV 93.3% vs. 87.5%). Iterative reconstruction significantly reduces image noise without loss of diagnostic information and holds the potential for substantial radiation dose reduction in cCTA. (orig.)

  2. Coronary CT angiography: image quality, diagnostic accuracy, and potential for radiation dose reduction using a novel iterative image reconstruction technique - comparison with traditional filtered back projection

    Energy Technology Data Exchange (ETDEWEB)

    Moscariello, Antonio [Medical University of South Carolina, Heart and Vascular Center, Charleston, SC (United States); Catholic University of the Sacred Heart, ' ' A. Gemelli' ' Hospital, Department of Bioimaging and Radiological Sciences, Rome (Italy); Takx, Richard A.P. [Medical University of South Carolina, Heart and Vascular Center, Charleston, SC (United States); Maastricht University Medical Center, Department of Radiology, Maastricht (Netherlands); Schoepf, U.J.; Renker, Matthias; Zwerner, Peter L. [Medical University of South Carolina, Heart and Vascular Center, Charleston, SC (United States); O' Brien, Terrence X. [Medical University of South Carolina, Heart and Vascular Center, Charleston, SC (United States); The Ralph H. Johnson Veterans Affairs Medical Center, Charleston, SC (United States); Allmendinger, Thomas; Schmidt, Bernhard [Siemens AG, Healthcare Sector, Forchheim (Germany); Vogt, Sebastian [Siemens Medical Solutions USA, Malvern, PA (United States); Savino, Giancarlo; Bonomo, Lorenzo [Catholic University of the Sacred Heart, ' ' A. Gemelli' ' Hospital, Department of Bioimaging and Radiological Sciences, Rome (Italy); Fink, Christian; Henzler, Thomas [Medical University of South Carolina, Heart and Vascular Center, Charleston, SC (United States); Heidelberg University, Institute of Clinical Radiology and Nuclear Medicine, University Medical Center Mannheim, Medical Faculty Mannheim, Heidelberg (Germany)

    2011-10-15

    To compare image noise, image quality and diagnostic accuracy of coronary CT angiography (cCTA) using a novel iterative reconstruction algorithm versus traditional filtered back projection (FBP), and to estimate the potential for radiation dose savings. Sixty-five consecutive patients (48 men; 59.3 ± 7.7 years) prospectively underwent cCTA and coronary catheter angiography (CCA). Full radiation dose data, using all projections, were reconstructed with FBP. To simulate image acquisition at half the radiation dose, 50% of the projections were discarded from the raw data. The resulting half-dose data were reconstructed with sinogram-affirmed iterative reconstruction (SAFIRE). Full-dose FBP and half-dose iterative reconstructions were compared with regard to image noise and image quality, and their respective accuracy for stenosis detection was compared against CCA. Compared with full-dose FBP, half-dose iterative reconstructions showed significantly (p = 0.001 to p = 0.025) lower image noise and slightly higher image quality. Iterative reconstruction improved the accuracy of stenosis detection compared with FBP (per-patient: accuracy 96.9% vs. 93.8%, sensitivity 100% vs. 100%, specificity 94.6% vs. 89.2%, NPV 100% vs. 100%, PPV 93.3% vs. 87.5%). Iterative reconstruction significantly reduces image noise without loss of diagnostic information and holds the potential for substantial radiation dose reduction in cCTA. (orig.)

  3. Improvement of wavelet threshold filtered back-projection image reconstruction algorithm

    Science.gov (United States)

    Ren, Zhong; Liu, Guodong; Huang, Zhen

    2014-11-01

    Image reconstruction techniques have been applied in many fields, including medical imaging modalities such as X-ray computed tomography (X-CT), positron emission tomography (PET) and magnetic resonance imaging (MRI), but the reconstructed results remain unsatisfactory because the original projection data are inevitably contaminated by noise during image reconstruction. Although traditional filters, e.g. the Shepp-Logan (SL) and Ram-Lak (RL) filters, can suppress some of this noise, the Gibbs oscillation phenomenon arises and the artifacts introduced by back-projection are not greatly improved. Wavelet threshold denoising can overcome the interference of noise in image reconstruction. Since the traditional soft and hard threshold functions have inherent defects, an improved wavelet threshold function combined with the filtered back-projection (FBP) algorithm is proposed in this paper. Four different reconstruction algorithms were compared in simulation experiments. Experimental results demonstrate that the improved algorithm largely eliminates the discontinuity and large distortion of the traditional threshold functions as well as the Gibbs oscillation. Finally, the effectiveness of the improved algorithm is verified by comparing two evaluation criteria, mean square error (MSE) and peak signal-to-noise ratio (PSNR), among the four algorithms, and the optimal dual threshold values of the improved wavelet threshold function are obtained.
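The abstract does not reproduce the paper's improved threshold function, but the traditional soft and hard threshold functions it criticizes, plus one standard compromise between them (the non-negative garrote), can be sketched as follows; the garrote stands in here only as an illustrative "improved" function, not as the authors' actual proposal:

```python
import numpy as np

def hard_threshold(w, t):
    """Hard threshold: keep coefficients with |w| > t, zero the rest.
    Discontinuous at |w| = t, which causes the oscillation artifacts."""
    return np.where(np.abs(w) > t, w, 0.0)

def soft_threshold(w, t):
    """Soft threshold: shrink every coefficient toward zero by t.
    Continuous, but biases large coefficients by a constant t."""
    return np.sign(w) * np.maximum(np.abs(w) - t, 0.0)

def garrote_threshold(w, t):
    """Non-negative garrote: continuous like soft thresholding, but the
    shrinkage t**2/|w| vanishes for large coefficients, reducing the bias."""
    out = np.zeros_like(w, dtype=float)
    mask = np.abs(w) > t
    out[mask] = w[mask] - t**2 / w[mask]
    return out
```

In a wavelet-denoised FBP pipeline, such a function would be applied to the detail coefficients of each projection (or of the reconstructed image) before the inverse wavelet transform.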

  4. Filtered back-projection algorithm for Compton telescopes

    Science.gov (United States)

    Gunter, Donald L. (Lisle, IL)

    2008-03-18

    A method for the conversion of Compton camera data into a 2D image of the incident-radiation flux on the celestial sphere includes detecting coincident gamma radiation flux arriving from various directions of a 2-sphere. These events are mapped by back-projection onto the 2-sphere to produce a convolution integral that is subsequently stereographically projected onto a 2-plane to produce a second convolution integral which is deconvolved by the Fourier method to produce an image that is then projected onto the 2-sphere.
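The stereographic mapping between the celestial 2-sphere and the 2-plane used in this method can be sketched with the standard formulas below; projecting from the north pole (0, 0, 1) onto the equatorial plane is one common convention and is assumed here, not taken from the patent text:

```python
import numpy as np

def stereographic(p):
    """Project a point on the unit 2-sphere (excluding the north pole)
    onto the plane z = 0, projecting from the north pole (0, 0, 1)."""
    x, y, z = p
    return np.array([x / (1.0 - z), y / (1.0 - z)])

def inverse_stereographic(q):
    """Map a plane point back onto the unit sphere (inverse of the above)."""
    X, Y = q
    d = 1.0 + X**2 + Y**2
    return np.array([2.0 * X / d, 2.0 * Y / d, (X**2 + Y**2 - 1.0) / d])
```

After this change of variables, the convolution integral on the plane can be deconvolved with ordinary 2D Fourier methods and the result mapped back to the sphere.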

  5. Research of inverse synthetic aperture imaging lidar based on filtered back-projection tomography technique

    Science.gov (United States)

    Liu, Zhi-chao; Yang, Jin-hua

    2014-07-01

    To obtain a clear two-dimensional image with inverse synthetic aperture lidar (ISAL) without using heterodyne interferometry, an imaging algorithm based on the filtered back-projection tomography technique was designed, and the target "A" was reconstructed by simulation with the system in the turntable model. The working process of ISAL was analyzed, and the function describing the reconstructed image was given. The physical meaning of the various parameters in the processing of the echo data, and how those parameters affect the reconstructed image, were analyzed in detail. The image in the test area was reconstructed from one-dimensional range information using the filtered back-projection tomography technique. As the measured target rotates, the sum of the echo light intensity at a given range is made up of contributions from different positions on the target; when the total number of collected samples is large enough, the resulting system of equations can be solved. An ideal filtered back-projection image was obtained through MATLAB simulation, and the effects on the reconstruction of the angular interval and of the ratio of echo light intensity to lost light intensity were analyzed. Simulation results show that the smaller the sampling angle, the higher the resolution of the reconstructed image of the measured target, and the greater the ratio of echo light intensity to lost light intensity, the higher the resolution. In conclusion, after some data processing, the reconstructed image basically meets the requirements for effective identification.

  6. Data correction in reconstructing the image of CBS in the manner of filtered back-projection

    International Nuclear Information System (INIS)

    CBS is a recently developed non-destructive detection technology. The difference from conventional CT is that conventional CT reconstructs the image from integrals of the X-ray absorptivity through the object, whereas CBS reconstructs the image from the rate at which the incident rays are scattered from a particular region. The paper primarily analyses the data correction involved in reconstructing the CBS image in the manner of classical filtered back-projection. (authors)

  7. Characterization of filters applied in filtered back-projection reconstruction for absorption and differential phase-contrast imaging

    International Nuclear Information System (INIS)

    In computed X-ray tomography, window functions like "Ram-Lak" and "Hamming" (absorption) or the "Hilbert filter" (phase contrast) are commonly used for correct tomographic reconstruction and for improving image quality in the back-projection process. Choosing the appropriate filter, i.e. the adequate weighting of the spatial frequencies, is not obvious. For instance, the noise spectrum of phase-contrast tomography differs considerably from that of absorption tomography. Existing literature does not provide a clear comparison and characterization of either absorption or differential phase-contrast (DPC) filters with respect to the target application. In this study, we examine the modulation transfer function (MTF) of a simulated phantom in contrast to its tomographic reconstructions, obtained by forward- and filtered back-projection with different filters. We thereby optimize the reconstruction with respect to sharpness of edges as well as to remaining noise. These parameters are especially studied using the "Hilbert filter" in order to improve the frequency weighting, considering amongst others the half-pixel shift method introduced in earlier studies. As a result, we provide a set of rules to facilitate the choice of the appropriate filter for both absorption and DPC tomography. In biomedical imaging, this filter selection allows for individual contrast enhancement depending on the structure of interest (e.g. bones, soft tissue).
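The windowed ramp filters named in this record can be sketched in the frequency domain; the Ram-Lak filter is the bare ramp |ν|, and apodizing windows such as Hamming or Hann taper it toward the Nyquist frequency to suppress high-frequency noise. The window formulas below follow one common convention (cosine taper over [-ν_Nyquist, ν_Nyquist]) and are a sketch, not the exact filters of the paper:

```python
import numpy as np

def ramp_filter(n, window="ram-lak", d=1.0):
    """Frequency response of the FBP ramp filter |nu|, optionally apodized
    by a window that attenuates high (noisy) spatial frequencies.
    n: number of detector samples, d: detector spacing."""
    nu = np.fft.fftfreq(n, d=d)    # spatial frequencies (cycles per unit)
    ramp = np.abs(nu)              # the ideal ramp ("Ram-Lak") filter
    nyq = 0.5 / d                  # Nyquist frequency
    if window == "ram-lak":
        w = np.ones(n)
    elif window == "hamming":
        w = 0.54 + 0.46 * np.cos(np.pi * nu / nyq)
    elif window == "hann":
        w = 0.5 * (1.0 + np.cos(np.pi * nu / nyq))
    else:
        raise ValueError(window)
    return ramp * w
```

Multiplying each projection's FFT by this response and inverse-transforming gives the filtered projection used in the back-projection step; the Hann window drives the response to zero at Nyquist, while Hamming leaves a small residual.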

  8. Single Image Super-Resolution VIA Iterative Back Projection Based Canny Edge Detection and a Gabor Filter Prior

    OpenAIRE

    Makwana, Rujul R.; Mehta, Prof Nita D.

    2013-01-01

    The Iterative back-projection (IBP) is a classical super-resolution method with low computational complexity that can be applied in real time applications. This paper presents an effective novel single image super resolution approach to recover a high resolution image from a single low resolution input image. The approach is based on an Iterative back projection (IBP) method combined with the Canny Edge Detection and Gabor Filter to recover high frequency information. This method is applied o...

  9. Two-dimensional water temperature reconstruction by filtered back-projection method

    International Nuclear Information System (INIS)

    The reconstruction of water temperature in combustion is realized by the tunable diode laser absorption spectroscopy technique. The model for the H2O temperature distribution is assumed to be a Gaussian function, ranging from 300 K to 1300 K. The Radon transform is used to simulate the experimental results. The reconstruction of the temperature distribution is achieved by reconstructing two temperature-dependent line strengths based on the filtered back-projection method. The temperature reconstruction result agrees well with the original model. Moreover, the influences of the number of projections and of random errors in the projections on the reconstruction are also studied. The simulation results indicate that a decrease in the number of projections or an increase in noise increases the mean square error of the reconstructed temperature, deteriorating the reconstructed image. The temperature reconstruction cannot reveal the original temperature distribution when the number of projections is reduced to four. (authors)
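The filtered back-projection pipeline underlying this record (project, ramp-filter, back-project) can be sketched with NumPy alone. The uniform disk phantom below stands in for the paper's Gaussian temperature field because its parallel-beam projection has a simple closed form; the grid sizes and angle count are illustrative choices, not the paper's setup:

```python
import numpy as np

# Detector geometry
n = 256
s = np.linspace(-1.0, 1.0, n)        # detector coordinate
ds = s[1] - s[0]
r = 0.5                              # disk radius

# Analytic parallel-beam projection of a unit-density disk (same at every angle)
p = 2.0 * np.sqrt(np.maximum(r**2 - s**2, 0.0))

# Ramp ("Ram-Lak") filtering in the Fourier domain
nu = np.fft.fftfreq(n, d=ds)
q = np.real(np.fft.ifft(np.fft.fft(p) * np.abs(nu)))

# Back-project the filtered projection over 180 angles onto an image grid
angles = np.linspace(0.0, np.pi, 180, endpoint=False)
xx, yy = np.meshgrid(np.linspace(-1, 1, 65), np.linspace(-1, 1, 65))
img = np.zeros_like(xx)
for theta in angles:
    proj_coord = xx * np.cos(theta) + yy * np.sin(theta)
    img += np.interp(proj_coord, s, q)   # sample the filtered projection
img *= np.pi / len(angles)               # quadrature weight d(theta)

center = img[32, 32]   # should approach the true density 1.0 inside the disk
```

Reducing `len(angles)` toward four reproduces the degradation described in the abstract: the back-projection sum no longer averages out the streaks from individual views.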

  10. Comparison of parabolic filtration methods for 3D filtered back projection in pulsed EPR imaging

    Science.gov (United States)

    Qiao, Zhiwei; Redler, Gage; Epel, Boris; Halpern, Howard J.

    2014-11-01

    Pulse electron paramagnetic resonance imaging (pulse EPRI) is a robust method for noninvasively measuring local oxygen concentrations in vivo. For 3D tomographic EPRI, the most commonly used reconstruction algorithm is filtered back projection (FBP), in which the parabolic filtration process strongly influences image quality. In this work, we designed and compared 7 parabolic filtration methods to reconstruct both simulated and real phantoms. To evaluate these methods, we designed 3 error criteria and 1 spatial resolution criterion. It was determined that the 2-point derivative filtration method and the two-ramp-filter method have unavoidable negative effects, resulting in diminished spatial resolution and increased artifacts, respectively. For the noiseless phantom, the rectangular-window and sinc-window parabolic filtration methods were found to be optimal, providing high spatial resolution and small errors. In the presence of noise, the 3-point derivative method and the Hamming-window parabolic filtration method resulted in the best compromise between low image noise and high spatial resolution. The 3-point derivative method is faster than the Hamming-window parabolic filtration method, so we conclude that the 3-point derivative method is optimal for 3D FBP.

  11. The Comparison of Iterative and Filtered Back Projection Method for Gamma-ray CT

    International Nuclear Information System (INIS)

    There are two categories of image reconstruction: transform-based reconstruction and iterative reconstruction techniques. Transform-based reconstruction is based on inverse Radon transform theory. Filtered back projection (FBP) is a frequently used algorithm based on the Radon model; in FBP, it is assumed that the measured data consist of line integrals of the object distribution. Iterative reconstruction is a technique in which the estimated image is progressively refined in a repetitive calculation. For X-ray CT, FBP has been the most powerful technique because a sufficient number of total ray-sums is available. Unlike X-ray CT, there are situations in industrial gamma-ray CT where it is not possible to measure a large number of projections. In addition to the aforementioned factors, the characteristics of gamma rays differ from those of X-rays in many respects. To obtain a precise image from gamma-ray CT, an adequate image reconstruction algorithm should be adopted. To evaluate which algorithm is suitable for gamma-ray CT, a comparison of iterative and FBP results from a gamma-ray CT system is presented
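As an illustration of the iterative family this record contrasts with FBP, here is a minimal sketch of Kaczmarz's method (the classical ART), which refines the image by cycling through the ray-sum equations; this is a generic representative of the approach, not necessarily the algorithm used in the paper, and the toy 2x2 "image" and ray geometry are invented for the example:

```python
import numpy as np

def art(A, b, iters=200, relax=1.0):
    """Kaczmarz's method (ART): cycle through the ray-sum equations
    a_i . x = b_i, projecting the estimate onto each hyperplane in turn."""
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        for a_i, b_i in zip(A, b):
            x += relax * (b_i - a_i @ x) / (a_i @ a_i) * a_i
    return x

# Toy 2x2 "image" (pixels x0..x3) probed by 5 ray sums
A = np.array([
    [1.0, 1.0, 0.0, 0.0],   # ray through row 0
    [0.0, 0.0, 1.0, 1.0],   # ray through row 1
    [1.0, 0.0, 1.0, 0.0],   # ray through column 0
    [0.0, 1.0, 0.0, 1.0],   # ray through column 1
    [1.0, 0.0, 0.0, 1.0],   # diagonal ray
])
x_true = np.array([1.0, 2.0, 3.0, 4.0])
x_rec = art(A, A @ x_true)   # reconstruct from the simulated ray sums
```

With only five ray sums the system is tiny, which is exactly the regime the abstract describes for industrial gamma-ray CT: too few projections for FBP, yet enough constraints for an iterative solver to converge.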

  12. CT coronary angiography: Image quality with sinogram-affirmed iterative reconstruction compared with filtered back-projection

    International Nuclear Information System (INIS)

    Aim: To investigate image quality and the potential for radiation dose reduction using sinogram-affirmed iterative reconstruction (SAFIRE) at computed tomography (CT) coronary angiography (CTCA) compared with filtered back-projection (FBP) reconstruction. Materials and methods: A water phantom and 49 consecutive patients were scanned using a retrospectively electrocardiography (ECG)-gated CTCA protocol on a dual-source CT system. Image reconstructions were performed with both conventional FBP and SAFIRE. The SAFIRE series were reconstructed from the data of only one tube, simulating a 50% radiation dose reduction. Two blinded observers independently assessed the image quality of each coronary segment using a four-point scale and measured image noise (the standard deviation of Hounsfield values, SD), signal-to-noise ratio (SNR), and contrast-to-noise ratio (CNR). Radiation dose estimates were calculated. Results: In the water phantom, image noise decreased at the same rate as the tube current increased for both reconstruction algorithms. Despite an estimated radiation dose reduction from 7.9 ± 2.8 to 4 ± 1.4 mSv, there was no significant difference in the SD and SNR within the aortic root and left ventricular chamber between the two reconstruction methods. There was also no significant difference in image quality between the FBP and SAFIRE series. Conclusion: Compared with traditional FBP, there is potential for substantial radiation dose reduction at CTCA with use of SAFIRE, while maintaining similar diagnostic image quality

  13. Comparison of adaptive statistical iterative and filtered back projection reconstruction techniques in brain CT

    Energy Technology Data Exchange (ETDEWEB)

    Ren, Qingguo, E-mail: renqg83@163.com [Department of Radiology, Hua Dong Hospital of Fudan University, Shanghai 200040 (China); Dewan, Sheilesh Kumar, E-mail: sheilesh_d1@hotmail.com [Department of Geriatrics, Hua Dong Hospital of Fudan University, Shanghai 200040 (China); Li, Ming, E-mail: minli77@163.com [Department of Radiology, Hua Dong Hospital of Fudan University, Shanghai 200040 (China); Li, Jianying, E-mail: Jianying.Li@med.ge.com [CT Imaging Research Center, GE Healthcare China, Beijing (China); Mao, Dingbiao, E-mail: maodingbiao74@163.com [Department of Radiology, Hua Dong Hospital of Fudan University, Shanghai 200040 (China); Wang, Zhenglei, E-mail: Williswang_doc@yahoo.com.cn [Department of Radiology, Shanghai Electricity Hospital, Shanghai 200050 (China); Hua, Yanqing, E-mail: cjr.huayanqing@vip.163.com [Department of Radiology, Hua Dong Hospital of Fudan University, Shanghai 200040 (China)

    2012-10-15

    Purpose: To compare image quality and visualization of normal structures and lesions in brain computed tomography (CT) with adaptive statistical iterative reconstruction (ASIR) and filtered back projection (FBP) reconstruction techniques at different X-ray tube current–time products. Materials and methods: In this IRB-approved prospective study, forty patients (nineteen men, twenty-one women; mean age 69.5 ± 11.2 years) received brain scans at different tube current–time products (300 and 200 mAs) on a 64-section multi-detector CT (GE, Discovery CT750 HD). Images were reconstructed with FBP and four levels of ASIR-FBP blending. Two radiologists (note that our hospital is renowned for its geriatric medicine department, and these two radiologists are more experienced in chronic cerebral vascular disease than in neoplastic disease, so this research did not include cerebral tumors; this point is taken up in the discussion) assessed all the reconstructed images for visibility of normal structures, lesion conspicuity, image contrast and diagnostic confidence in a blinded and randomized manner. Volume CT dose index (CTDIvol) and dose–length product (DLP) were recorded. All data were analyzed using SPSS 13.0 statistical analysis software. Results: There was no statistically significant difference between image quality at 200 mAs with the 50% ASIR blending technique and at 300 mAs with the FBP technique (p > .05), whereas a statistically significant difference (p < .05) was found between image quality at 200 mAs with FBP and at 300 mAs with FBP. Conclusion: ASIR provided the same image quality and diagnostic ability in brain imaging with a greater than 30% dose reduction compared with the FBP reconstruction technique.

  14. Comparison of adaptive statistical iterative and filtered back projection reconstruction techniques in brain CT

    International Nuclear Information System (INIS)

    Purpose: To compare image quality and visualization of normal structures and lesions in brain computed tomography (CT) with adaptive statistical iterative reconstruction (ASIR) and filtered back projection (FBP) reconstruction techniques at different X-ray tube current–time products. Materials and methods: In this IRB-approved prospective study, forty patients (nineteen men, twenty-one women; mean age 69.5 ± 11.2 years) received brain scans at different tube current–time products (300 and 200 mAs) on a 64-section multi-detector CT (GE, Discovery CT750 HD). Images were reconstructed with FBP and four levels of ASIR-FBP blending. Two radiologists (note that our hospital is renowned for its geriatric medicine department, and these two radiologists are more experienced in chronic cerebral vascular disease than in neoplastic disease, so this research did not include cerebral tumors; this point is taken up in the discussion) assessed all the reconstructed images for visibility of normal structures, lesion conspicuity, image contrast and diagnostic confidence in a blinded and randomized manner. Volume CT dose index (CTDIvol) and dose–length product (DLP) were recorded. All data were analyzed using SPSS 13.0 statistical analysis software. Results: There was no statistically significant difference between image quality at 200 mAs with the 50% ASIR blending technique and at 300 mAs with the FBP technique (p > .05), whereas a statistically significant difference (p < .05) was found between image quality at 200 mAs with FBP and at 300 mAs with FBP. Conclusion: ASIR provided the same image quality and diagnostic ability in brain imaging with a greater than 30% dose reduction compared with the FBP reconstruction technique

  15. The influence of filtered back-projection and iterative reconstruction on partial volume correction in PET

    International Nuclear Information System (INIS)

    Aim: We assess the influence of the reconstruction algorithm [OS-EM for the iterative one vs. filtered back-projection in Fourier space (DiFT)] on partial volume correction in PET, employing a fully 3D 3-compartment MR-based PV-correction algorithm. The gray matter voxels in the PET image - after removal of the white matter and cerebrospinal fluid contributions - are corrected voxel-by-voxel using the image resolution. Material, methods: Phantom measurements and one healthy human brain FDG study were carried out. For the OSEM reconstruction, combinations of iteration steps and subset numbers (It/Sub) were used, whereby in case of non-convergence the image resolution had to be fitted. The results from the DiFT reconstruction were equivalent to those obtained from the OSEM reconstruction with the 10/32 combination for objects with widespread activity concentration. For the sphere phantom, the mean recovery based on the actual values achieved 99.2% ± 1.8 for all spheres and all reconstruction modes at all It/Sub combinations (except for 2/8). In case of the Hoffman 3D brain phantom, the mean recovery of the cortical regions was 101% ± 1.2 (the increase based on the uncorrected values: 35.5% ± 1.5), while the subcortical regions reached a mean recovery of 80% with an increase of 43.9% ± 2.5. For the human data, the increase of the metabolized values of several cortical regions ranged between 42% and 48%, independent of the reconstruction mode. Conclusions: Our data show that the 3-compartment fully 3D MR-based PV-correction is sensitive to the choice of reconstruction algorithm and to the parameter choice. They indicate that, despite improved spatial resolution, the use of the iterative reconstruction algorithm for PV-correction results in similar recovery factors when compared to a correction using DiFT reconstruction, insofar as the image resolution values are fitted at the It/Sub combinations. (orig.)

  16. Single Image Super-Resolution VIA Iterative Back Projection Based Canny Edge Detection and a Gabor Filter Prior

    Directory of Open Access Journals (Sweden)

    Rujul R Makwana

    2013-03-01

    The iterative back-projection (IBP) is a classical super-resolution method with low computational complexity that can be applied in real-time applications. This paper presents an effective novel single-image super-resolution approach to recover a high-resolution image from a single low-resolution input image. The approach is based on an iterative back-projection (IBP) method combined with Canny edge detection and a Gabor filter to recover high-frequency information. The method is applied to different natural gray images and compared with existing image super-resolution approaches. Simulation results show that the proposed algorithm can more accurately enlarge the low-resolution image than previous approaches. The proposed algorithm increases the MSSIM and the PSNR and decreases the MSE compared to other existing algorithms, and also improves the visual quality of the enlarged images.
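The core IBP loop this record builds on can be sketched as follows: simulate the low-resolution imaging process on the current high-resolution estimate, and back-project the residual. The block-average/replication operator pair below is a deliberately simplified forward model (no blur kernel, no Canny or Gabor prior), chosen so the sketch stays self-contained; with these matched operators the residual vanishes after the first pass, whereas a realistic PSF makes the iteration genuinely gradual:

```python
import numpy as np

def downsample(img, f):
    """Simulate low-resolution imaging: f x f block averaging."""
    h, w = img.shape
    return img.reshape(h // f, f, w // f, f).mean(axis=(1, 3))

def upsample(img, f):
    """Nearest-neighbour expansion, used to back-project the LR residual."""
    return np.repeat(np.repeat(img, f, axis=0), f, axis=1)

def ibp(lr, f=2, iters=10):
    """Iterative back-projection: refine the HR estimate until simulating
    the imaging process on it reproduces the observed LR image."""
    hr = np.zeros((lr.shape[0] * f, lr.shape[1] * f))
    for _ in range(iters):
        err = lr - downsample(hr, f)    # residual in LR space
        hr += upsample(err, f)          # back-project the residual
    return hr
```

The paper's contribution sits inside this loop: edge maps and Gabor responses steer how the back-projected residual is distributed, recovering high-frequency detail that plain replication cannot.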

  17. Gaussian frequency blending algorithm with matrix inversion tomosynthesis (MITS) and filtered back projection (FBP) for better digital breast tomosynthesis reconstruction

    Science.gov (United States)

    Chen, Ying; Lo, Joseph Y.; Baker, Jay A.; Dobbins, James T., III

    2006-03-01

    Breast cancer is a major problem and the most common cancer among women. The nature of conventional mammography makes it very difficult to distinguish a cancer from overlying breast tissues. Digital tomosynthesis refers to a three-dimensional imaging technique that allows reconstruction of an arbitrary set of planes in the breast from a limited-angle series of projection images acquired as the x-ray source moves. Several tomosynthesis algorithms have been proposed, including Matrix Inversion Tomosynthesis (MITS) and Filtered Back Projection (FBP), which have been investigated in our lab. MITS shows better high-frequency response in removing out-of-plane blur, while FBP shows better low-frequency noise properties. This paper presents an effort to combine MITS and FBP for better breast tomosynthesis reconstruction. A high-pass Gaussian filter was designed and applied to three-slice "slabbing" MITS reconstructions. A low-pass Gaussian filter was designed and applied to the FBP reconstructions. A frequency weighting parameter was studied to blend the high-passed MITS with the low-passed FBP frequency components. Four different reconstruction methods were investigated and compared on human subject images: 1) MITS blended with Shift-And-Add (SAA), 2) FBP alone, 3) FBP with applied Hamming and Gaussian filters, and 4) Gaussian Frequency Blending (GFB) of MITS and FBP. Results showed that, compared with FBP, Gaussian Frequency Blending (GFB) has better performance for high-frequency content, such as better reconstruction of micro-calcifications and removal of high-frequency noise. Compared with MITS, GFB showed more low-frequency breast tissue content.
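The frequency-blending idea in this record can be sketched generically: weight one reconstruction by a Gaussian low-pass in the Fourier domain and the other by its complement, so the two weights sum to one at every frequency. The isotropic Gaussian and the cutoff `sigma` below are illustrative assumptions, not the paper's tuned filters:

```python
import numpy as np

def gaussian_blend(img_high, img_low, sigma=0.1):
    """Blend two reconstructions in the Fourier domain: a Gaussian low-pass
    weight g keeps the low frequencies of img_low (FBP-like input), and its
    complement (1 - g) keeps the high frequencies of img_high (MITS-like
    input). Since g + (1 - g) = 1, no frequency band is lost or doubled."""
    fy = np.fft.fftfreq(img_high.shape[0])[:, None]
    fx = np.fft.fftfreq(img_high.shape[1])[None, :]
    g = np.exp(-(fx**2 + fy**2) / (2.0 * sigma**2))   # low-pass weight
    F = g * np.fft.fft2(img_low) + (1.0 - g) * np.fft.fft2(img_high)
    return np.real(np.fft.ifft2(F))
```

A frequency weighting parameter, as studied in the paper, would correspond here to varying `sigma`, i.e. moving the crossover between the two reconstructions.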

  18. Evaluation of dose reduction and image quality in CT colonography: Comparison of low-dose CT with iterative reconstruction and routine-dose CT with filtered back projection

    Energy Technology Data Exchange (ETDEWEB)

    Nagata, Koichi [Kameda Medical Center, Department of Radiology, Kamogawa, Chiba (Japan); Jichi Medical University, Department of Radiology, Tochigi (Japan); National Cancer Center, Cancer Screening Technology Division, Research Center for Cancer Prevention and Screening, Tokyo (Japan); Fujiwara, Masanori; Mogi, Tomohiro; Iida, Nao [Kameda Medical Center Makuhari, Department of Radiology, Chiba (Japan); Kanazawa, Hidenori; Sugimoto, Hideharu [Jichi Medical University, Department of Radiology, Tochigi (Japan); Mitsushima, Toru [Kameda Medical Center Makuhari, Department of Gastroenterology, Chiba (Japan); Lefor, Alan T. [Jichi Medical University, Department of Surgery, Tochigi (Japan)

    2015-01-15

    To prospectively evaluate the radiation dose and image quality comparing low-dose CT colonography (CTC) reconstructed using different levels of iterative reconstruction techniques with routine-dose CTC reconstructed with filtered back projection. Following institutional ethics clearance and informed consent procedures, 210 patients underwent screening CTC using automatic tube current modulation for dual positions. Examinations were performed in the supine position with a routine-dose protocol and in the prone position, randomly applying four different low-dose protocols. Supine images were reconstructed with filtered back projection and prone images with iterative reconstruction. Two blinded observers assessed the image quality of endoluminal images. Image noise was quantitatively assessed by region-of-interest measurements. The mean effective dose in the supine series was 1.88 mSv using routine-dose CTC, compared to 0.92, 0.69, 0.57, and 0.46 mSv at four different low doses in the prone series (p < 0.01). Overall image quality and noise of low-dose CTC with iterative reconstruction were significantly improved compared to routine-dose CTC using filtered back projection. The lowest dose group had image quality comparable to routine-dose images. Low-dose CTC with iterative reconstruction reduces the radiation dose by 48.5 to 75.1 % without image quality degradation compared to routine-dose CTC with filtered back projection. (orig.)

  19. Evaluation of dose reduction and image quality in CT colonography: Comparison of low-dose CT with iterative reconstruction and routine-dose CT with filtered back projection

    International Nuclear Information System (INIS)

    To prospectively evaluate the radiation dose and image quality comparing low-dose CT colonography (CTC) reconstructed using different levels of iterative reconstruction techniques with routine-dose CTC reconstructed with filtered back projection. Following institutional ethics clearance and informed consent procedures, 210 patients underwent screening CTC using automatic tube current modulation for dual positions. Examinations were performed in the supine position with a routine-dose protocol and in the prone position, randomly applying four different low-dose protocols. Supine images were reconstructed with filtered back projection and prone images with iterative reconstruction. Two blinded observers assessed the image quality of endoluminal images. Image noise was quantitatively assessed by region-of-interest measurements. The mean effective dose in the supine series was 1.88 mSv using routine-dose CTC, compared to 0.92, 0.69, 0.57, and 0.46 mSv at four different low doses in the prone series (p < 0.01). Overall image quality and noise of low-dose CTC with iterative reconstruction were significantly improved compared to routine-dose CTC using filtered back projection. The lowest dose group had image quality comparable to routine-dose images. Low-dose CTC with iterative reconstruction reduces the radiation dose by 48.5 to 75.1 % without image quality degradation compared to routine-dose CTC with filtered back projection. (orig.)

  20. Gamma-ray momentum reconstruction from Compton electron trajectories by filtered back-projection

    International Nuclear Information System (INIS)

    Gamma-ray imaging utilizing Compton scattering has traditionally relied on measuring coincident gamma-ray interactions to map directional information of the source distribution. This coincidence requirement makes it an inherently inefficient process. We present an approach to gamma-ray reconstruction from Compton scattering that requires only a single electron-tracking detector, thus removing the coincidence requirement. From the Compton-scattered electron momentum distribution, our algorithm analytically computes the incident photon's correlated direction and energy distributions. Because this method maps the source energy and location, it is useful in applications where prior information about the source distribution is unavailable. We demonstrate this method with electron tracks measured in a scientific Si charge-coupled device. While this method was demonstrated with electron tracks in a Si-based detector, it is applicable to any detector that can measure electron direction and energy, or equivalently the electron momentum. For example, it can increase the sensitivity for obtaining energy and direction in gas-based systems that suffer from limited efficiency.

  1. Coronary CT angiography: Comparison of a novel iterative reconstruction with filtered back projection for reconstruction of low-dose CT—Initial experience

    International Nuclear Information System (INIS)

    Objective: To prospectively compare subjective and objective image quality in 20% tube current coronary CT angiography (cCTA) datasets between an iterative reconstruction algorithm (SAFIRE) and traditional filtered back projection (FBP). Materials and methods: Twenty patients underwent a prospectively ECG-triggered dual-step cCTA protocol using 2nd generation dual-source CT (DSCT). CT raw data was reconstructed using standard FBP at full-dose (Group 1a) and 80% tube current reduced low-dose (Group 1b). The low-dose raw data was additionally reconstructed using iterative raw data reconstruction (Group 2). Attenuation and image noise were measured in three regions of interest and signal-to-noise-ratio (SNR) as well as contrast-to-noise-ratio (CNR) was calculated. Subjective diagnostic image quality was evaluated using a 4-point Likert scale. Results: Mean image noise of group 2 was lowered by 22% on average when compared to group 1b (p < 0.0001–0.0033), while there were no significant differences in mean attenuation within the same anatomical regions. The lower image noise resulted in significantly higher SNR and CNR ratios in group 2 compared to group 1b (p < 0.0001–0.0232). Subjective image quality of group 2 (1.88 ± 0.63) was also rated significantly higher when compared to group 1b (1.58 ± 0.63, p = 0.004). Conclusions: Image quality of 80% tube current reduced iteratively reconstructed cCTA raw data is significantly improved when compared to standard FBP and consequently may improve the diagnostic accuracy of cCTA.

  2. Rapid mapping of visual receptive fields by filtered back projection: application to multi-neuronal electrophysiology and imaging.

    Science.gov (United States)

    Johnston, Jamie; Ding, Huayu; Seibel, Sofie H; Esposti, Federico; Lagnado, Leon

    2014-11-15

    Neurons in the visual system vary widely in the spatiotemporal properties of their receptive fields (RFs), and understanding these variations is key to elucidating how visual information is processed. We present a new approach for mapping RFs based on the filtered back projection (FBP), an algorithm used for tomographic reconstructions. To estimate RFs, a series of bars were flashed across the retina at pseudo-random positions and at a minimum of five orientations. We apply this method to retinal neurons and show that it can accurately recover the spatial RF and impulse response of ganglion cells recorded on a multi-electrode array. We also demonstrate its utility for in vivo imaging by mapping the RFs of an array of bipolar cell synapses expressing a genetically encoded Ca(2+) indicator. We find that FBP offers several advantages over the commonly used spike-triggered average (STA): (i) ON and OFF components of a RF can be separated; (ii) the impulse response can be reconstructed at sample rates of 125 Hz, rather than the refresh rate of a monitor; (iii) FBP reveals the response properties of neurons that are not evident using STA, including those that display orientation selectivity, or fire at low mean spike rates; and (iv) the FBP method is fast, allowing the RFs of all the bipolar cell synaptic terminals in a field of view to be reconstructed in under 4 min. Use of the FBP will benefit investigations of the visual system that employ electrophysiology or optical reporters to measure activity across populations of neurons. PMID:25172952

  3. Comparison of pure and hybrid iterative reconstruction techniques with conventional filtered back projection: Image quality assessment in the cervicothoracic region

    Energy Technology Data Exchange (ETDEWEB)

    Katsura, Masaki, E-mail: mkatsura-tky@umin.ac.jp [Department of Radiology, Graduate School of Medicine, The University of Tokyo, 7-3-1 Hongo, Bunkyo-ku, Tokyo 113-8655 (Japan); Sato, Jiro; Akahane, Masaaki; Matsuda, Izuru; Ishida, Masanori; Yasaka, Koichiro; Kunimatsu, Akira; Ohtomo, Kuni [Department of Radiology, Graduate School of Medicine, The University of Tokyo, 7-3-1 Hongo, Bunkyo-ku, Tokyo 113-8655 (Japan)

    2013-02-15

    Objectives: To evaluate the impact on image quality of three different image reconstruction techniques in the cervicothoracic region: model-based iterative reconstruction (MBIR), adaptive statistical iterative reconstruction (ASIR), and filtered back projection (FBP). Methods: Forty-four patients underwent unenhanced standard-of-care clinical computed tomography (CT) examinations which included the cervicothoracic region with a 64-row multidetector CT scanner. Images were reconstructed with FBP, 50% ASIR-FBP blending (ASIR50), and MBIR. Two radiologists assessed the cervicothoracic region in a blinded manner for streak artifacts, pixelated blotchy appearances, critical reproduction of visually sharp anatomical structures (thyroid gland, common carotid artery, and esophagus), and overall diagnostic acceptability. Objective image noise was measured in the internal jugular vein. Data were analyzed using the sign test and pair-wise Student's t-test. Results: MBIR images had significantly lower quantitative image noise (8.88 ± 1.32) compared to ASIR images (18.63 ± 4.19, P < 0.01) and FBP images (26.52 ± 5.8, P < 0.01). Significant improvements in streak artifacts of the cervicothoracic region were observed with the use of MBIR (P < 0.001 each for MBIR vs. the other two image data sets for both readers), while no significant difference was observed between ASIR and FBP (P > 0.9 for ASIR vs. FBP for both readers). MBIR images were all diagnostically acceptable. Unique features of MBIR images included pixelated blotchy appearances, which did not adversely affect diagnostic acceptability. Conclusions: MBIR significantly improves image noise and streak artifacts of the cervicothoracic region over ASIR and FBP. MBIR is expected to enhance the value of CT examinations for areas where image noise and streak artifacts are problematic.

  4. Coronary CT angiography: Comparison of a novel iterative reconstruction with filtered back projection for reconstruction of low-dose CT—Initial experience

    Energy Technology Data Exchange (ETDEWEB)

    Takx, Richard A.P. [Department of Radiology and Radiological Science, Medical University of South Carolina, Charleston, SC (United States); Department of Radiology, Maastricht University Medical Centre, Maastricht (Netherlands); Schoepf, U. Joseph, E-mail: schoepf@musc.edu [Department of Radiology and Radiological Science, Medical University of South Carolina, Charleston, SC (United States); Division of Cardiology, Department of Medicine, Medical University of South Carolina, Charleston, SC (United States); Moscariello, Antonio [Department of Radiology and Radiological Science, Medical University of South Carolina, Charleston, SC (United States); Department of Radiology, Policlinico Universitario Campus Bio-Medico, Rome (Italy); Das, Marco [Department of Radiology, Maastricht University Medical Centre, Maastricht (Netherlands); Rowe, Garrett [Department of Radiology and Radiological Science, Medical University of South Carolina, Charleston, SC (United States); Schoenberg, Stefan O.; Fink, Christian [Institute of Clinical Radiology and Nuclear Medicine, University Medical Center Mannheim, Medical Faculty Mannheim, Heidelberg University (Germany); Henzler, Thomas [Department of Radiology and Radiological Science, Medical University of South Carolina, Charleston, SC (United States); Institute of Clinical Radiology and Nuclear Medicine, University Medical Center Mannheim, Medical Faculty Mannheim, Heidelberg University (Germany)

    2013-02-15

    Objective: To prospectively compare subjective and objective image quality in 20% tube current coronary CT angiography (cCTA) datasets between an iterative reconstruction algorithm (SAFIRE) and traditional filtered back projection (FBP). Materials and methods: Twenty patients underwent a prospectively ECG-triggered dual-step cCTA protocol using 2nd generation dual-source CT (DSCT). CT raw data was reconstructed using standard FBP at full-dose (Group 1a) and 80% tube current reduced low-dose (Group 1b). The low-dose raw data was additionally reconstructed using iterative raw data reconstruction (Group 2). Attenuation and image noise were measured in three regions of interest and signal-to-noise-ratio (SNR) as well as contrast-to-noise-ratio (CNR) was calculated. Subjective diagnostic image quality was evaluated using a 4-point Likert scale. Results: Mean image noise of group 2 was lowered by 22% on average when compared to group 1b (p < 0.0001–0.0033), while there were no significant differences in mean attenuation within the same anatomical regions. The lower image noise resulted in significantly higher SNR and CNR ratios in group 2 compared to group 1b (p < 0.0001–0.0232). Subjective image quality of group 2 (1.88 ± 0.63) was also rated significantly higher when compared to group 1b (1.58 ± 0.63, p = 0.004). Conclusions: Image quality of 80% tube current reduced iteratively reconstructed cCTA raw data is significantly improved when compared to standard FBP and consequently may improve the diagnostic accuracy of cCTA.

  5. Why do commercial CT scanners still employ traditional, filtered back-projection for image reconstruction?

    International Nuclear Information System (INIS)

    Despite major advances in x-ray sources, detector arrays, gantry mechanical design and especially computer performance, one component of computed tomography (CT) scanners has remained virtually constant for the past 25 years—the reconstruction algorithm. Fundamental advances have been made in the solution of inverse problems, especially tomographic reconstruction, but these works have not been translated into clinical and related practice. The reasons are not obvious and seldom discussed. This review seeks to examine the reasons for this discrepancy and provides recommendations on how it can be resolved. We take the example of the field of compressive sensing (CS), summarizing this new area of research from the eyes of practical medical physicists and explaining the disconnection between theoretical and application-oriented research. Using a few issues specific to CT, which engineers have addressed in very specific ways, we try to distill the mathematical problem underlying each of these issues with the hope of demonstrating that there are interesting mathematical problems of general importance that can result from in-depth analysis of specific issues. We then sketch some unconventional CT-imaging designs that have the potential to impact on CT applications, if the link between applied mathematicians and engineers/physicists were stronger. Finally, we close with some observations on how the link could be strengthened. There is, we believe, an important opportunity to rapidly improve the performance of CT and related tomographic imaging techniques by addressing these issues. (topical review)
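Part of the answer the review hints at is FBP's simplicity and speed: the entire algorithm is a 1D ramp filter followed by a smearing step. The following is a minimal parallel-beam sketch, not any vendor's implementation; the Ram-Lak filter, the rotate-and-sum projector, and the angle sampling in the usage example are illustrative assumptions.

```python
import numpy as np
from scipy.ndimage import rotate

def radon(img, angles_deg):
    """Toy parallel-beam forward projection: rotate the image, sum along rays."""
    return np.stack([rotate(img, -a, reshape=False, order=1).sum(axis=0)
                     for a in angles_deg])

def ramp_filter(sino):
    """Ram-Lak (ramp) filtering of each projection in the Fourier domain."""
    n = sino.shape[-1]
    ramp = np.abs(np.fft.fftfreq(n))
    return np.real(np.fft.ifft(np.fft.fft(sino, axis=-1) * ramp, axis=-1))

def fbp(sino, angles_deg):
    """Filtered back projection: smear each filtered profile across the image."""
    n = sino.shape[-1]
    recon = np.zeros((n, n))
    for a, proj in zip(angles_deg, ramp_filter(sino)):
        # back-project: replicate the profile along its rays, then rotate back
        recon += rotate(np.tile(proj, (n, 1)), a, reshape=False, order=1)
    return recon * np.pi / (2 * len(angles_deg))
```

Reconstructing a small disk phantom from, say, 90 projections recovers the disk's position in a single non-iterative pass. An iterative method instead starts from an estimate and repeatedly compares simulated projections against the measured sinogram, which is where both the higher computational cost and the noise-model flexibility of the newer algorithms come from.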

  6. Investigation of the quantitative accuracy of 3D iterative reconstruction algorithms in comparison to filtered back projection method: a phantom study

    Science.gov (United States)

    Abuhadi, Nouf; Bradley, David; Katarey, Dev; Podolyak, Zsolt; Sassi, Salem

    2014-03-01

    Introduction: Single-Photon Emission Computed Tomography (SPECT) is used to measure and quantify radiopharmaceutical distribution within the body. The accuracy of quantification depends on acquisition parameters and reconstruction algorithms. Until recently, most SPECT images were reconstructed using filtered back projection techniques with no attenuation or scatter corrections. The introduction of 3-D iterative reconstruction algorithms with the availability of both computed tomography (CT)-based attenuation correction and scatter correction may provide for more accurate measurement of radiotracer bio-distribution. The effect of attenuation and scatter corrections on the accuracy of SPECT measurements is well researched. It has been suggested that the combination of CT-based attenuation correction and scatter correction can allow for more accurate quantification of radiopharmaceutical distribution in SPECT studies (Bushberg et al., 2012). However, the effect of respiratory induced cardiac motion on SPECT images acquired using higher resolution algorithms such as 3-D iterative reconstruction with attenuation and scatter corrections has not been investigated. Aims: To investigate the quantitative accuracy of 3D iterative reconstruction algorithms in comparison to filtered back projection (FBP) methods implemented on cardiac SPECT/CT imaging with and without CT-attenuation and scatter corrections. Also to investigate the effects of respiratory induced cardiac motion on myocardium perfusion quantification. Lastly, to present a comparison of spatial resolution for FBP and ordered subset expectation maximization (OSEM) Flash 3D with and without respiratory induced motion, and with and without attenuation and scatter correction. Methods: This study was performed on a Siemens Symbia T16 SPECT/CT system using clinical acquisition protocols. 
Respiratory induced cardiac motion was simulated by imaging a cardiac phantom insert whilst moving it using a respiratory motion motor inducing cyclical elliptical motion of the apex of the cardiac insert. Results: Our analyses revealed that the use of the Flash 3-D reconstruction algorithm without scatter or attenuation correction improved spatial resolution by 30% relative to FBP. The reduction in spatial resolution due to respiratory induced motion was 12% and 38% for FBP and Flash 3-D respectively. The implementation of scatter correction resulted in a reduction in resolution by up to 6%. The application of CT-based attenuation correction resulted in 13% and 26% reduction in spatial resolution for SPECT images reconstructed using FBP and Flash 3-D algorithms respectively. Conclusion: We conclude that iterative reconstruction (Flash 3-D) provides significant improvement in image spatial resolution; however, as a result the effects of respiratory induced motion have become more evident, and correction of this is required before the full potential of these algorithms can be realised for myocardial perfusion imaging. Attenuation and scatter correction can improve image contrast, but may have a significant detrimental effect on spatial resolution.

  7. FDG-PET standardized uptake values in normal anatomical structures using iterative reconstruction segmented attenuation correction and filtered back-projection

    International Nuclear Information System (INIS)

    Filtered back-projection (FBP) is the most commonly used reconstruction method for PET images, which are usually noisy. The iterative reconstruction segmented attenuation correction (IRSAC) algorithm improves image quality without reducing image resolution. The standardized uptake value (SUV) is the most clinically utilized quantitative parameter of [fluorine-18]fluoro-2-deoxy-d-glucose (FDG) accumulation. The objective of this study was to obtain a table of SUVs for several normal anatomical structures from both routinely used FBP and IRSAC reconstructed images and to compare the data obtained with both methods. Twenty whole-body PET scans performed in consecutive patients with proven or suspected non-small cell lung cancer were retrospectively analyzed. Images were processed using both IRSAC and FBP algorithms. Nonquantitative or Gaussian filters were used to smooth the transmission scan when using FBP or IRSAC algorithms, respectively. A phantom study was performed to evaluate the effect of different filters on SUV. Maximum and average SUVs (SUVmax and SUVavg) were calculated in 28 normal anatomical structures and in one pathological site. The phantom study showed that the use of a nonquantitative smoothing filter in the transmission scan results in a less accurate quantification and in a 20% underestimation of the actual measurement. Most anatomical structures were identified in all patients using the IRSAC images. On average, SUVavg and SUVmax measured on IRSAC images using a Gaussian filter in the transmission scan were respectively 20% and 8% higher than the SUVs calculated from conventional FBP images. Scatterplots of the data values showed an overall strong relationship between IRSAC and FBP SUVs. Individual scatterplots of each site demonstrated a weaker relationship for lower SUVs and for SUVmax than for higher SUVs and SUVavg. 
A set of reference values was obtained for SUVmax and SUVavg of normal anatomical structures, calculated with both IRSAC and FBP image reconstruction algorithms. The use of IRSAC and a Gaussian filter for the transmission scan seems to give more accurate SUVs than are obtained from conventional FBP images using a nonquantitative filter for the transmission scan. (orig.)
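For context, the body-weight SUV that this record tabulates is the tissue activity concentration normalized by the injected dose per unit body weight. The following is a minimal sketch; it assumes the concentration and dose are already decay-corrected to the same time point and uses a ~1 g/ml tissue density, simplifications not stated in the abstract.

```python
def suv_bw(tissue_conc_bq_per_ml, injected_dose_bq, body_weight_g):
    """Body-weight SUV: tissue activity concentration / (injected dose / weight).

    Assumes concentration and injected dose are decay-corrected to the same
    time point, and that 1 ml of tissue weighs ~1 g.
    """
    return tissue_conc_bq_per_ml * body_weight_g / injected_dose_bq
```

A tracer distributed perfectly uniformly gives SUV ≈ 1 everywhere, which is why values well above 1 flag focal FDG accumulation, and why a systematic 20% shift between reconstruction methods, as reported above, matters clinically.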

  8. Quantitative analysis of emphysema and airway measurements according to iterative reconstruction algorithms: comparison of filtered back projection, adaptive statistical iterative reconstruction and model-based iterative reconstruction

    International Nuclear Information System (INIS)

    To evaluate filtered back projection (FBP) and two iterative reconstruction (IR) algorithms and their effects on the quantitative analysis of lung parenchyma and airway measurements on computed tomography (CT) images. Low-dose chest CT scans obtained in 281 adult patients were reconstructed using three algorithms: FBP, adaptive statistical IR (ASIR) and model-based IR (MBIR). Measurements of each dataset were compared: total lung volume, emphysema index (EI), airway measurements of the lumen and wall area as well as average wall thickness. The accuracy of the airway measurements of each algorithm was also evaluated using an airway phantom. EI using a threshold of -950 HU was significantly different among the three algorithms, in decreasing order of FBP (2.30 %), ASIR (1.49 %) and MBIR (1.20 %) (P < 0.01). Wall thickness was also significantly different among the three algorithms, with FBP (2.09 mm) demonstrating thicker walls than ASIR (2.00 mm) and MBIR (1.88 mm) (P < 0.01). Airway phantom analysis revealed that MBIR showed the most accurate values for airway measurements. The three algorithms presented different EIs and wall thicknesses, decreasing in the order of FBP, ASIR and MBIR. Thus, care should be taken in selecting the appropriate IR algorithm for quantitative analysis of the lung. (orig.)
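The emphysema index above is simply the percentage of lung voxels falling below an attenuation threshold, which is why it is sensitive to the noise texture of the reconstruction algorithm. A minimal sketch; the -950 HU cutoff comes from the abstract, while the mask-based interface is an illustrative assumption:

```python
import numpy as np

def emphysema_index(hu_volume, lung_mask, threshold_hu=-950):
    """Percentage of lung voxels whose attenuation falls below threshold_hu."""
    lung_vox = hu_volume[lung_mask]
    return 100.0 * np.count_nonzero(lung_vox < threshold_hu) / lung_vox.size
```

Because iterative reconstruction smooths noise, fewer voxels dip below -950 HU, which is consistent with the FBP > ASIR > MBIR ordering of EI reported above.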

  9. Chest computed tomography using iterative reconstruction vs filtered back projection (Part 2): image quality of low-dose CT examinations in 80 patients

    International Nuclear Information System (INIS)

    To evaluate the image quality of an iterative reconstruction algorithm (IRIS) in low-dose chest CT in comparison with standard-dose filtered back projection (FBP) CT. Eighty consecutive patients referred for a follow-up CT examination of the chest underwent a low-dose CT examination (Group 2) under technical conditions similar to those of the initial examination (Group 1), except for the milliamperage selection and the replacement of regular FBP reconstruction by iterative reconstruction using three (Group 2a) and five (Group 2b) iterations. Despite a mean decrease of 35.5% in the dose-length product, there was no statistically significant difference between Group 2a and Group 1 in the objective noise, signal-to-noise (SNR) and contrast-to-noise (CNR) ratios and distribution of the overall image quality scores. Compared to Group 1, objective image noise in Group 2b was significantly reduced with increased SNR and CNR and a trend towards improved image quality. Iterative reconstruction using three iterations provides similar image quality compared with the conventionally used FBP reconstruction at 35% less dose, thus enabling dose reduction without loss of diagnostic information. According to our preliminary results, even higher dose reductions than 35% may be feasible by using more than three iterations. (orig.)

  10. Adaptive statistical iterative reconstruction versus filtered back projection in the same patient: 64 channel liver CT image quality and patient radiation dose

    International Nuclear Information System (INIS)

    To compare routine-dose liver CT reconstructed with filtered back projection (FBP) versus low-dose images reconstructed with FBP and adaptive statistical iterative reconstruction (ASIR). In this retrospective study, patients underwent a routine-dose protocol reconstructed with FBP and, within 17 months (median, 6.1 months), a low-dose protocol reconstructed twice, with FBP and with ASIR. These reconstructions were compared for noise, image quality, and radiation dose. Nineteen patients were included (12 male; mean age, 58 years). Noise was significantly lower in low-dose images reconstructed with ASIR compared to routine-dose images reconstructed with FBP (liver: p < 0.05, aorta: p < 0.001). Low-dose FBP images were scored significantly lower for subjective image quality than low-dose ASIR (2.1 ± 0.5, 3.2 ± 0.8, p < 0.001). There was no difference in subjective image quality scores between routine-dose FBP images and low-dose ASIR images (3.6 ± 0.5, 3.2 ± 0.8, NS). Radiation dose was 41% lower for the low-dose protocol (4.4 ± 2.4 mSv versus 7.5 ± 5.5 mSv, p < 0.05). Our initial results suggest low-dose CT images reconstructed with ASIR may have lower measured noise and similar image quality, yet a significantly lower radiation dose, compared with higher-dose images reconstructed with FBP. (orig.)

  11. Standard dose versus low-dose abdominal and pelvic CT: Comparison between filtered back projection versus adaptive iterative dose reduction 3D

    International Nuclear Information System (INIS)

    Purpose: To compare the dose and image quality of a standard-dose abdominal and pelvic CT with filtered back projection (FBP) to low-dose CT with Adaptive Iterative Dose Reduction 3D (AIDR 3D). Materials and methods: We retrospectively examined the images of 21 patients in the portal phase of an abdominal and pelvic CT scan before and after implementation of AIDR 3D iterative reconstruction. The acquisition length, dose and evaluations of the image quality were compared between standard-dose FBP images and low-dose images reconstructed with AIDR 3D and FBP using the Wilcoxon test. Results: The mean acquisition length was similar for both CT scans. There was a significant dose reduction of 49.5% with low-dose CT compared to standard-dose CT (mean DLP of 451 mGy·cm versus 892 mGy·cm, P < 0.001). There were no differences in image quality scores between standard-dose FBP and low-dose AIDR 3D images (4.6 ± 0.6 versus 4.4 ± 0.6 respectively, P = 0.147). Conclusion: AIDR 3D iterative reconstruction enables a significant dose reduction of 49.5% to be achieved with abdominal and pelvic CT compared to FBP, whilst maintaining equivalent image quality. (authors)
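The reported ~49.5% saving can be checked directly from the two dose-length products, and DLP converts to an effective-dose estimate via a body-region coefficient. A minimal sketch; the k value below is a typical abdomen/pelvis coefficient chosen for illustration, not a figure from the abstract:

```python
def dose_reduction_pct(dlp_standard, dlp_low):
    """Relative dose saving from two dose-length products (mGy*cm)."""
    return 100.0 * (dlp_standard - dlp_low) / dlp_standard

def effective_dose_msv(dlp_mgy_cm, k=0.015):
    """Effective dose estimate E = k * DLP; k in mSv/(mGy*cm) is region-specific."""
    return k * dlp_mgy_cm

# DLP values from the abstract above
print(round(dose_reduction_pct(892, 451), 1))  # -> 49.4, matching the reported 49.5% up to rounding
```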

  12. Quantitative analysis of emphysema and airway measurements according to iterative reconstruction algorithms: comparison of filtered back projection, adaptive statistical iterative reconstruction and model-based iterative reconstruction

    Energy Technology Data Exchange (ETDEWEB)

    Choo, Ji Yung [Seoul National University Medical Research Center, Department of Radiology, Seoul National University College of Medicine, and Institute of Radiation Medicine, Seoul (Korea, Republic of); Korea University Ansan Hospital, Ansan-si, Department of Radiology, Gyeonggi-do (Korea, Republic of); Goo, Jin Mo; Park, Chang Min; Park, Sang Joon [Seoul National University Medical Research Center, Department of Radiology, Seoul National University College of Medicine, and Institute of Radiation Medicine, Seoul (Korea, Republic of); Seoul National University, Cancer Research Institute, Seoul (Korea, Republic of); Lee, Chang Hyun; Shim, Mi-Suk [Seoul National University Medical Research Center, Department of Radiology, Seoul National University College of Medicine, and Institute of Radiation Medicine, Seoul (Korea, Republic of)

    2014-04-15

    To evaluate filtered back projection (FBP) and two iterative reconstruction (IR) algorithms and their effects on the quantitative analysis of lung parenchyma and airway measurements on computed tomography (CT) images. Low-dose chest CT scans obtained in 281 adult patients were reconstructed using three algorithms: FBP, adaptive statistical IR (ASIR) and model-based IR (MBIR). Measurements of each dataset were compared: total lung volume, emphysema index (EI), airway measurements of the lumen and wall area as well as average wall thickness. The accuracy of the airway measurements of each algorithm was also evaluated using an airway phantom. EI using a threshold of -950 HU was significantly different among the three algorithms, in decreasing order of FBP (2.30 %), ASIR (1.49 %) and MBIR (1.20 %) (P < 0.01). Wall thickness was also significantly different among the three algorithms, with FBP (2.09 mm) demonstrating thicker walls than ASIR (2.00 mm) and MBIR (1.88 mm) (P < 0.01). Airway phantom analysis revealed that MBIR showed the most accurate values for airway measurements. The three algorithms presented different EIs and wall thicknesses, decreasing in the order of FBP, ASIR and MBIR. Thus, care should be taken in selecting the appropriate IR algorithm for quantitative analysis of the lung. (orig.)

  13. Evaluation of Back Projection Methods for Breast Tomosynthesis Image Reconstruction.

    Science.gov (United States)

    Zhou, Weihua; Lu, Jianping; Zhou, Otto; Chen, Ying

    2014-11-11

    Breast cancer is the most common cancer among women in the USA. Compared to mammography, digital breast tomosynthesis is a new imaging technique that may improve the diagnostic accuracy by removing the ambiguities of overlapped tissues and providing 3D information of the breast. Tomosynthesis reconstruction algorithms generate 3D reconstructed slices from a few limited angle projection images. Among different reconstruction algorithms, back projection (BP) is considered an important foundation of quite a few reconstruction techniques with deblurring algorithms such as filtered back projection. In this paper, two BP variants, including α-trimmed BP and principal component analysis-based BP, were proposed to improve the image quality over that of traditional BP. Computer simulations and phantom studies demonstrated that the α-trimmed BP may improve signal response performance and suppress noise in breast tomosynthesis image reconstruction. PMID:25384538

  14. Image quality and radiation dose of low dose coronary CT angiography in obese patients: Sinogram affirmed iterative reconstruction versus filtered back projection

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Rui, E-mail: rui_wang1979@yahoo.cn [Department of Radiology, Beijing Anzhen Hospital, Capital Medical University, 100029 Beijing (China); Schoepf, U. Joseph, E-mail: schoepf@musc.edu [Heart and Vascular Center, Medical University of South Carolina, Ashley River Tower, 25 Courtenay Drive, Charleston, SC 29425-2260 (United States); Wu, Runze, E-mail: runze.wu@gmail.com [Siemens Healthcare China, 7 Wangjing Zhonghuan Nanlu, 100102 Beijing (China); Reddy, Ryan P., E-mail: reddyr@musc.edu [Heart and Vascular Center, Medical University of South Carolina, Ashley River Tower, 25 Courtenay Drive, Charleston, SC 29425-2260 (United States); Zhang, Chuanchen, E-mail: zhangchuanchen666@163.com [Department of Radiology, Liaocheng People Hospital, 252000 Shandong (China); Yu, Wei, E-mail: yuwei02@gmail.com [Department of Radiology, Beijing Anzhen Hospital, Capital Medical University, 100029 Beijing (China); Liu, Yi, E-mail: liuyi198311@yahoo.cn [Department of Radiology, Beijing Anzhen Hospital, Capital Medical University, 100029 Beijing (China); Zhang, Zhaoqi, E-mail: zhaoqi5000@vip.sohu.com [Department of Radiology, Beijing Anzhen Hospital, Capital Medical University, 100029 Beijing (China)

    2012-11-15

    Purpose: To investigate the image quality and radiation dose of low radiation dose CT coronary angiography (CTCA) using sinogram affirmed iterative reconstruction (SAFIRE) compared with standard dose CTCA using filtered back-projection (FBP) in obese patients. Materials and methods: Seventy-eight consecutive obese patients were randomized into two groups and scanned using a prospectively ECG-triggered step-and-shoot (SAS) CTCA protocol on a dual-source CT scanner. Thirty-nine patients (protocol A) were examined using a routine radiation dose protocol at 120 kV and images were reconstructed with FBP. Thirty-nine patients (protocol B) were examined using a low dose protocol at 100 kV and images were reconstructed with SAFIRE. Two blinded observers independently assessed the image quality of each coronary segment using a 4-point scale (1 = non-diagnostic, 4 = excellent) and measured the objective parameters image noise, signal-to-noise ratio (SNR), and contrast-to-noise ratio (CNR). Radiation dose was calculated. Results: The coronary artery image quality scores, image noise, SNR and CNR were not significantly different between protocols A and B (all p > 0.05), with image quality scores of 3.51 ± 0.70 versus 3.55 ± 0.47, respectively. The effective radiation dose was significantly lower in protocol B (4.41 ± 0.83 mSv) than in protocol A (8.83 ± 1.74 mSv, p < 0.01). Conclusion: Compared with standard dose CTCA using FBP, low dose CTCA using SAFIRE can maintain diagnostic image quality with a 50% reduction of radiation dose.
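The objective metrics compared in this and the other studies reduce to simple region-of-interest statistics. A minimal sketch of the usual definitions; which ROI serves as signal and which as background (for example, aortic root attenuation against perivascular fat) varies by study and is not specified here:

```python
import numpy as np

def roi_mean_sd(image, mask):
    """Mean attenuation and noise (standard deviation) inside an ROI."""
    vals = image[mask]
    return float(vals.mean()), float(vals.std(ddof=1))

def snr(roi_mean, noise_sd):
    """Signal-to-noise ratio of an ROI."""
    return roi_mean / noise_sd

def cnr(roi_mean, background_mean, noise_sd):
    """Contrast-to-noise ratio between an ROI and a background region."""
    return (roi_mean - background_mean) / noise_sd
```

Iterative reconstruction lowers the noise term in the denominator, which is how SAFIRE at 100 kV can match the SNR and CNR of FBP at 120 kV despite the lower dose.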

  15. Adaptive iterative dose reduction algorithm in CT: Effect on image quality compared with filtered back projection in body phantoms of different sizes

    International Nuclear Information System (INIS)

    To evaluate the impact of the adaptive iterative dose reduction (AIDR) three-dimensional (3D) algorithm in CT on noise reduction and image quality compared to the filtered back projection (FBP) algorithm, and to compare the effectiveness of AIDR 3D on noise reduction according to body habitus using phantoms of different sizes. Three different-sized phantoms with diameters of 24 cm, 30 cm, and 40 cm were built up using the American College of Radiology CT accreditation phantom and layers of pork belly fat. Each phantom was scanned eight times using different mAs. Images were reconstructed using FBP and three different strengths of AIDR 3D. The image noise, the contrast-to-noise ratio (CNR) and the signal-to-noise ratio (SNR) of the phantom were assessed. Two radiologists assessed the image quality of the 4 image sets in consensus. The effectiveness of AIDR 3D on noise reduction compared with FBP was also compared according to the phantom sizes. Adaptive iterative dose reduction 3D significantly reduced image noise compared with FBP and enhanced the SNR and CNR (p < 0.05) with improved image quality (p < 0.05). When a stronger reconstruction algorithm was used, a greater increase of SNR and CNR as well as noise reduction was achieved (p < 0.05). The noise reduction effect of AIDR 3D was significantly greater in the 40-cm phantom than in the 24-cm or 30-cm phantoms (p < 0.05). The AIDR 3D algorithm is effective in reducing image noise as well as improving image-quality parameters compared with the FBP algorithm, and its effectiveness may increase as the phantom size increases.

  16. Adaptive iterative dose reduction algorithm in CT: Effect on image quality compared with filtered back projection in body phantoms of different sizes

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Milim; Lee, Jeong Min; Son, Hyo Shin; Han, Joon Koo; Choi, Byung Ihn [College of Medicine, Seoul National University, Seoul (Korea, Republic of); Yoon, Jeong Hee; Choi, Jin Woo [Dept. of Radiology, Seoul National University Hospital, Seoul (Korea, Republic of)

    2014-04-15

    To evaluate the impact of the adaptive iterative dose reduction (AIDR) three-dimensional (3D) algorithm in CT on noise reduction and image quality compared with the filtered back projection (FBP) algorithm, and to compare the effectiveness of AIDR 3D on noise reduction according to body habitus using phantoms of different sizes. Three different-sized phantoms with diameters of 24 cm, 30 cm, and 40 cm were built up using the American College of Radiology CT accreditation phantom and layers of pork belly fat. Each phantom was scanned eight times using different mAs. Images were reconstructed using FBP and three different strengths of AIDR 3D. The image noise, the contrast-to-noise ratio (CNR) and the signal-to-noise ratio (SNR) of the phantom were assessed. Two radiologists assessed the image quality of the 4 image sets in consensus. The effectiveness of AIDR 3D on noise reduction compared with FBP was also compared according to phantom size. Adaptive iterative dose reduction 3D significantly reduced the image noise compared with FBP and enhanced the SNR and CNR (p < 0.05) with improved image quality (p < 0.05). When a stronger reconstruction algorithm was used, a greater increase in SNR and CNR as well as greater noise reduction was achieved (p < 0.05). The noise reduction effect of AIDR 3D was significantly greater in the 40-cm phantom than in the 24-cm or 30-cm phantoms (p < 0.05). The AIDR 3D algorithm is effective in reducing image noise as well as in improving image-quality parameters compared with the FBP algorithm, and its effectiveness may increase as phantom size increases.

  17. Dose reduction in computed tomography of the chest. Image quality of iterative reconstructions at a 50% radiation dose compared to filtered back projection at a 100% radiation dose

    International Nuclear Information System (INIS)

    Purpose: The aim of this study was to evaluate the potential of iterative reconstruction (IR) in chest computed tomography (CT) to reduce radiation exposure. The qualitative and quantitative image quality of standard reconstructions with filtered back projection (FBP) and half-dose (HD) chest CT data reconstructed with FBP and IR was assessed. Materials and Methods: 52 consecutive patients underwent contrast-enhanced chest CT on a dual-source CT system at 120 kV with automatic exposure control. The tube current was split equally between both tube-detector systems. For the HD datasets, only data from one tube-detector system were utilized. Thus, full-dose (FD) and HD data were available for each patient from a single scan. Three datasets were reconstructed from the raw data: standard FD images using FBP, which served as a reference, and HD images using FBP and IR. Objective image quality analysis was performed by measuring the image noise in tissue and air. The subjective image quality was evaluated by 2 radiologists according to European guidelines. Additional assessment of artifacts, lesion conspicuity and edge sharpness was performed. Results: Image noise did not differ significantly between HD-IR and FD-FBP (p = 0.254) but increased substantially in HD-FBP (p < 0.001). No statistically significant differences were found between HD-IR and FD-FBP for the reproduction of anatomical and pathological structures, including subsegmental bronchi and bronchioli. The image quality of HD-FBP was rated inferior because of increased noise. Conclusion: A 50% dose reduction in contrast-enhanced chest CT is feasible without a loss of diagnostic confidence if IR is used for image data reconstruction. Iterative reconstruction is another powerful tool to reduce radiation exposure and can be combined with other dose-saving techniques. (orig.)

  18. The comparison of ordered subset expectation maximization and filtered back projection technique for RBC blood pool SPECT in detection of liver hemangioma

    International Nuclear Information System (INIS)

    Ordered subset expectation maximization (OSEM) is an iterative reconstruction technique for tomographic images that can reduce reconstruction time compared with the conventional iterative method. We adopted this method for RBC blood pool SPECT and tried to validate the usefulness of OSEM in the detection of liver hemangioma compared with filtered back projection (FBP). A 64-projection SPECT study was acquired over 360° by dual-head cameras after the injection of 750 MBq of 99mTc-RBC. OSEM was performed with various combinations of subsets (1, 2, 4, 8, 16 and 32) and iteration numbers (1, 2, 4, 8 and 16) to obtain the best set for lesion detection. OSEM was performed in 17 lesions of 15 patients with liver hemangioma and compared with FBP images. Two nuclear medicine physicians reviewed these results independently. The best image set was obtained with 4 iterations and 16 subsets. In general, OSEM revealed more homogeneous images than FBP. Eighty-eight percent (15/17) of OSEM images were superior or equal to FBP in anatomic resolution. According to the blind review of images, 70.5% (12/17) of OSEM images were better in contrast (4/17), anatomic detail (4/17) and both (2/17). Two small lesions were detected by OSEM only, and another 2 small lesions failed to be depicted by both methods. The remaining 3 lesions revealed no difference in image quality. OSEM can provide better image quality as well as better results in the detection of liver hemangioma than the conventional FBP technique.
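
The update that makes OSEM faster than plain MLEM is that each sub-iteration uses only one subset of projections, so the estimate is refreshed several times per pass through the data. A hedged toy sketch on a 1-D problem, with an invented 3-ray system matrix rather than SPECT data:

```python
def osem(A, y, x0, subsets, iterations):
    """Ordered-subset EM: multiplicative update using one subset at a time."""
    x = list(x0)
    n = len(x)
    for _ in range(iterations):
        for sub in subsets:
            # Forward-project the current estimate along this subset's rays.
            proj = [sum(A[i][k] * x[k] for k in range(n)) for i in sub]
            ratio = [y[i] / p if p > 0 else 0.0 for i, p in zip(sub, proj)]
            # Back-project the measured/estimated ratios and normalise
            # by the subset sensitivity (simultaneous update of all pixels).
            x = [
                x[j] * sum(A[i][j] * r for i, r in zip(sub, ratio))
                    / max(sum(A[i][j] for i in sub), 1e-12)
                for j in range(n)
            ]
    return x

# Toy system: two pixels, three rays; y is the noiseless projection of [2, 3].
A = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
y = [2.0, 3.0, 5.0]
recon = osem(A, y, x0=[1.0, 1.0], subsets=[[0, 1], [2]], iterations=4)
print([round(v, 3) for v in recon])   # → [2.0, 3.0]
```

With subsets of size one the scheme degenerates to row-action updates; with a single subset containing all rays it reduces to classic MLEM.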

  19. Impact of adaptive iterative dose reduction (AIDR) 3D on low-dose abdominal CT - Comparison with routine-dose CT using filtered back projection

    Energy Technology Data Exchange (ETDEWEB)

    Matsuki, Mitsuru; Murakami, Takamichi [Dept. of Radiology, Kinki Univ. School of Medicine, Osaka (Japan)], e-mail: rad053@poh.osaka-med.ac.jp; Juri, Hiroshi; Yoshikawa, Shushi; Narumi, Yoshifumi [Dept. of Radiology, Osaka Medical Coll., Osaka (Japan)

    2013-10-15

    Background: While CT is widely used in medical practice, it is a substantial source of radiation exposure, which is associated with an increased lifetime risk of cancer. Therefore, concern about dose reduction in CT examinations is increasing, and an iterative reconstruction algorithm, which allows for dose reduction by compensating for image noise during image reconstruction, has been developed. Purpose: To investigate the performance of low-dose abdominal CT using adaptive iterative dose reduction 3D (AIDR 3D) compared with routine-dose CT using filtered back projection (FBP). Material and Methods: Fifty-eight patients underwent both routine-dose abdominal CT scans using FBP and low-dose CT scans using AIDR 3D. The image noise levels, signal-to-noise ratios (SNRs), and contrast-to-noise ratios (CNRs) of the aorta, portal vein, liver, and pancreas were measured and compared between both scans. Visual evaluations were performed. The volume CT dose index (CTDIvol) was measured. Results: Image noise levels on low-dose CT images using AIDR 3D were significantly lower than, or not significantly different from, those on routine-dose CT images using FBP when reviewing the data on the basis of all patients and the three BMI groups. SNRs and CNRs on low-dose CT images using AIDR 3D were significantly higher than, or not significantly different from, those on routine-dose CT images using FBP when reviewing the data on the basis of all patients and the three BMI groups. In visual evaluation of the images, there were no statistically significant differences between the scans in all organs, independently of BMI. The average CTDIvol at routine-dose and low-dose CT was 21.4 and 10.8 mGy, respectively. Conclusion: Low-dose abdominal CT using AIDR 3D allows for an approximately 50% reduction in radiation dose without degradation of image quality compared with routine-dose CT using FBP, independently of BMI.

  20. Image quality and radiation dose of low dose coronary CT angiography in obese patients: Sinogram affirmed iterative reconstruction versus filtered back projection

    International Nuclear Information System (INIS)

    Purpose: To investigate the image quality and radiation dose of low radiation dose CT coronary angiography (CTCA) using sinogram affirmed iterative reconstruction (SAFIRE) compared with standard dose CTCA using filtered back-projection (FBP) in obese patients. Materials and methods: Seventy-eight consecutive obese patients were randomized into two groups and scanned using a prospectively ECG-triggered step-and-shoot (SAS) CTCA protocol on a dual-source CT scanner. Thirty-nine patients (protocol A) were examined using a routine radiation dose protocol at 120 kV, and images were reconstructed with FBP. Thirty-nine patients (protocol B) were examined using a low dose protocol at 100 kV, and images were reconstructed with SAFIRE. Two blinded observers independently assessed the image quality of each coronary segment using a 4-point scale (1 = non-diagnostic, 4 = excellent) and measured the objective parameters image noise, signal-to-noise ratio (SNR), and contrast-to-noise ratio (CNR). Radiation dose was calculated. Results: The coronary artery image quality scores, image noise, SNR and CNR were not significantly different between protocols A and B (all p > 0.05), with image quality scores of 3.51 ± 0.70 versus 3.55 ± 0.47, respectively. The effective radiation dose was significantly lower in protocol B (4.41 ± 0.83 mSv) than in protocol A (8.83 ± 1.74 mSv, p < 0.01). Conclusion: Compared with standard dose CTCA using FBP, low dose CTCA using SAFIRE can maintain diagnostic image quality with a 50% reduction in radiation dose.

  1. Evaluation of iterative reconstruction (OSEM) versus filtered back-projection for the assessment of myocardial glucose uptake and myocardial perfusion using dynamic PET

    International Nuclear Information System (INIS)

    Iterative reconstruction methods based on ordered-subset expectation maximisation (OSEM) have replaced filtered backprojection (FBP) in many clinical settings owing to their superior image quality. Whether OSEM is as accurate as FBP in quantitative positron emission tomography (PET) is uncertain. We compared the accuracy of OSEM and FBP for regional myocardial 18F-FDG uptake and 13NH3 perfusion measurements in cardiac PET. Ten healthy volunteers were studied. Five underwent dynamic 18F-FDG PET during hyperinsulinaemic-euglycaemic clamp, and five underwent 13NH3 perfusion measurement during rest and adenosine-induced hyperaemia. Images were reconstructed using FBP and OSEM ± an 8-mm Gaussian post-reconstruction filter. Filtered and unfiltered images showed agreement between the reconstruction methods within ±2SD in Bland-Altman plots of Ki values. The use of a Gaussian filter resulted in a systematic underestimation of Ki in the filtered images of 11%. The mean deviation between the reconstruction methods for both unfiltered and filtered images was 1.3%. Agreement within ±2SD between the methods was demonstrated for perfusion rate constants up to 2.5 min-1, corresponding to a perfusion of 3.4 ml g-1 min-1. The mean deviation between the two methods for unfiltered data was 2.7%, and for filtered data, 5.3%. The 18F-FDG uptake rate constants showed excellent agreement between the two reconstruction methods. In the perfusion range up to 3.4 ml g-1 min-1, agreement between 13NH3 perfusion obtained with OSEM and FBP was acceptable. The use of OSEM for measurement of perfusion values higher than 3.4 ml g-1 min-1 requires further evaluation. (orig.)
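
The "agreement within ±2SD" criterion used in the record above is the classic Bland-Altman limits-of-agreement check: two methods agree if their paired differences fall inside the mean difference plus or minus twice its standard deviation. A minimal sketch with invented paired Ki values (the study's actual measurements are not reproduced here):

```python
from statistics import mean, stdev

def bland_altman_limits(a, b):
    """Limits of agreement: mean difference +/- 2 SD of the differences."""
    diffs = [x - y for x, y in zip(a, b)]
    bias = mean(diffs)
    spread = stdev(diffs)
    return bias - 2 * spread, bias + 2 * spread

fbp_ki = [0.021, 0.034, 0.028, 0.040, 0.025]    # hypothetical Ki, FBP
osem_ki = [0.022, 0.033, 0.029, 0.041, 0.024]   # hypothetical Ki, OSEM
lo, hi = bland_altman_limits(fbp_ki, osem_ki)
within = all(lo <= x - y <= hi for x, y in zip(fbp_ki, osem_ki))
print(within)   # agreement within +/-2SD, as in the study's criterion
```

The same `bias` term is what reveals systematic offsets such as the 11% Ki underestimation the authors attribute to Gaussian post-filtering.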

  2. Iterative reconstruction technique vs filter back projection: utility for quantitative bronchial assessment on low-dose thin-section MDCT in patients with/without chronic obstructive pulmonary disease

    International Nuclear Information System (INIS)

    The aim of this study was to evaluate the utility of the iterative reconstruction (IR) technique for quantitative bronchial assessment during low-dose computed tomography (CT) as a substitute for standard-dose CT in patients with/without chronic obstructive pulmonary disease. Fifty patients (mean age, 69.2; mean % predicted FEV1, 79.4) underwent standard-dose CT (150 mAs) and low-dose CT (25 mAs). Except for tube current, the imaging parameters were identical for both protocols. Standard-dose CT was reconstructed using filtered back-projection (FBP), and low-dose CT was reconstructed using IR and FBP. For quantitative bronchial assessment, the wall area percentage (WA%) of the sub-segmental bronchi and the airway luminal volume percentage (LV%) from the main bronchus to the peripheral bronchi were acquired in each dataset. The correlation and agreement of WA% and LV% between standard-dose CT and both low-dose CTs were statistically evaluated. WA% and LV% between standard-dose CT and both low-dose CTs were significantly correlated (r > 0.77, p < 0.00001); however, only the LV% agreement between standard-dose CT and low-dose CT reconstructed with IR was moderate (concordance correlation coefficient = 0.93); the other agreements were poor (concordance correlation coefficient < 0.90). Quantitative bronchial assessment via low-dose CT has potential as a substitute for standard-dose CT by using IR and airway luminal volumetry techniques. • Quantitative bronchial assessment of COPD using low-dose CT is possible. (orig.)
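
The agreement statistic quoted here is Lin's concordance correlation coefficient (CCC), which, unlike Pearson's r, penalises both scatter and systematic offset between two measurement series. A sketch with invented paired LV% values, not the study's data:

```python
def ccc(x, y):
    """Lin's concordance correlation coefficient between paired series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sx = sum((v - mx) ** 2 for v in x) / n       # population variances
    sy = sum((v - my) ** 2 for v in y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y)) / n
    # Scatter and location shift both reduce the coefficient below 1.
    return 2 * sxy / (sx + sy + (mx - my) ** 2)

sd_ct = [40.0, 45.0, 50.0, 55.0, 60.0]   # hypothetical LV%, standard dose
ld_ct = [41.0, 44.0, 51.0, 56.0, 59.0]   # hypothetical LV%, low dose + IR
print(round(ccc(sd_ct, ld_ct), 3))       # → 0.99
```

A CCC of 0.93, as reported for LV% with IR, therefore indicates only moderate agreement even though the underlying correlation can be high.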

  3. Iterative reconstruction technique vs filter back projection: utility for quantitative bronchial assessment on low-dose thin-section MDCT in patients with/without chronic obstructive pulmonary disease

    Energy Technology Data Exchange (ETDEWEB)

    Koyama, Hisanobu; Seki, Shinichiro; Sugimura, Kazuro [Kobe University Graduate School of Medicine, Division of Radiology, Department of Radiology, Kobe, Hyogo (Japan); Ohno, Yoshiharu; Nishio, Mizuho; Matsumoto, Sumiaki; Yoshikawa, Takeshi [Kobe University Graduate School of Medicine, Advanced Biomedical Imaging Research Centre, Kobe (Japan); Kobe University Graduate School of Medicine, Division of Functional and Diagnostic Imaging Research, Department of Radiology, Kobe (Japan); Sugihara, Naoki [Toshiba Medical Systems Corporation, Ohtawara, Tochigi (Japan)

    2014-08-15

    The aim of this study was to evaluate the utility of the iterative reconstruction (IR) technique for quantitative bronchial assessment during low-dose computed tomography (CT) as a substitute for standard-dose CT in patients with/without chronic obstructive pulmonary disease. Fifty patients (mean age, 69.2; mean % predicted FEV1, 79.4) underwent standard-dose CT (150mAs) and low-dose CT (25mAs). Except for tube current, the imaging parameters were identical for both protocols. Standard-dose CT was reconstructed using filtered back-projection (FBP), and low-dose CT was reconstructed using IR and FBP. For quantitative bronchial assessment, the wall area percentage (WA%) of the sub-segmental bronchi and the airway luminal volume percentage (LV%) from the main bronchus to the peripheral bronchi were acquired in each dataset. The correlation and agreement of WA% and LV% between standard-dose CT and both low-dose CTs were statistically evaluated. WA% and LV% between standard-dose CT and both low-dose CTs were significant correlated (r > 0.77, p < 0.00001); however, only the LV% agreement between SD-CT and low-dose CT reconstructed with IR was moderate (concordance correlation coefficient = 0.93); the other agreement was poor (concordance correlation coefficient <0.90). Quantitative bronchial assessment via low-dose CT has potential as a substitute for standard-dose CT by using IR and airway luminal volumetry techniques. circle Quantitative bronchial assessment of COPD using low-dose CT is possible. (orig.)

  4. Computed tomography of the cervical spine: comparison of image quality between a standard-dose and a low-dose protocol using filtered back-projection and iterative reconstruction

    International Nuclear Information System (INIS)

    To compare image quality of a standard-dose (SD) and a low-dose (LD) cervical spine CT protocol using filtered back-projection (FBP) and iterative reconstruction (IR). Forty patients investigated by cervical spine CT were prospectively randomised into two groups: SD (120 kVp, 275 mAs) and LD (120 kVp, 150 mAs), both applying automatic tube current modulation. Data were reconstructed using both FBP and sinogram-affirmed IR. Image noise, signal-to-noise (SNR) and contrast-to-noise (CNR) ratios were measured. Two radiologists independently and blindly assessed the following anatomical structures at C3-C4 and C6-C7 levels, using a four-point scale: intervertebral disc, content of neural foramina and dural sac, ligaments, soft tissues and vertebrae. They subsequently rated overall image quality using a ten-point scale. For both protocols and at each disc level, IR significantly decreased image noise and increased SNR and CNR, compared with FBP. SNR and CNR were statistically equivalent in LD-IR and SD-FBP protocols. Regardless of the dose and disc level, the qualitative scores with IR compared with FBP, and with LD-IR compared with SD-FBP, were significantly higher or not statistically different for intervertebral discs, neural foramina and ligaments, while significantly lower or not statistically different for soft tissues and vertebrae. The overall image quality scores were significantly higher with IR compared with FBP, and with LD-IR compared with SD-FBP. LD-IR cervical spine CT provides better image quality for intervertebral discs, neural foramina and ligaments, and worse image quality for soft tissues and vertebrae, compared with SD-FBP, while reducing radiation dose by approximately 40 %. (orig.)

  5. Computed tomography of the cervical spine: comparison of image quality between a standard-dose and a low-dose protocol using filtered back-projection and iterative reconstruction

    Energy Technology Data Exchange (ETDEWEB)

    Becce, Fabio [University of Lausanne, Department of Diagnostic and Interventional Radiology, Centre Hospitalier Universitaire Vaudois, Lausanne (Switzerland); Universite Catholique Louvain, Department of Radiology, Cliniques Universitaires Saint-Luc, Brussels (Belgium); Ben Salah, Yosr; Berg, Bruno C. vande; Lecouvet, Frederic E.; Omoumi, Patrick [Universite Catholique Louvain, Department of Radiology, Cliniques Universitaires Saint-Luc, Brussels (Belgium); Verdun, Francis R. [University of Lausanne, Institute of Radiation Physics, Centre Hospitalier Universitaire Vaudois, Lausanne (Switzerland); Meuli, Reto [University of Lausanne, Department of Diagnostic and Interventional Radiology, Centre Hospitalier Universitaire Vaudois, Lausanne (Switzerland)

    2013-07-15

    To compare image quality of a standard-dose (SD) and a low-dose (LD) cervical spine CT protocol using filtered back-projection (FBP) and iterative reconstruction (IR). Forty patients investigated by cervical spine CT were prospectively randomised into two groups: SD (120 kVp, 275 mAs) and LD (120 kVp, 150 mAs), both applying automatic tube current modulation. Data were reconstructed using both FBP and sinogram-affirmed IR. Image noise, signal-to-noise (SNR) and contrast-to-noise (CNR) ratios were measured. Two radiologists independently and blindly assessed the following anatomical structures at C3-C4 and C6-C7 levels, using a four-point scale: intervertebral disc, content of neural foramina and dural sac, ligaments, soft tissues and vertebrae. They subsequently rated overall image quality using a ten-point scale. For both protocols and at each disc level, IR significantly decreased image noise and increased SNR and CNR, compared with FBP. SNR and CNR were statistically equivalent in LD-IR and SD-FBP protocols. Regardless of the dose and disc level, the qualitative scores with IR compared with FBP, and with LD-IR compared with SD-FBP, were significantly higher or not statistically different for intervertebral discs, neural foramina and ligaments, while significantly lower or not statistically different for soft tissues and vertebrae. The overall image quality scores were significantly higher with IR compared with FBP, and with LD-IR compared with SD-FBP. LD-IR cervical spine CT provides better image quality for intervertebral discs, neural foramina and ligaments, and worse image quality for soft tissues and vertebrae, compared with SD-FBP, while reducing radiation dose by approximately 40 %. (orig.)

  6. Reconstruction of CT images by the Bayes- back projection method

    CERN Document Server

    Haruyama, M; Takase, M; Tobita, H

    2002-01-01

    In the course of research on quantitative assay for non-destructive measurement of radioactive waste, the authors have developed a unique program based on Bayesian theory for the reconstruction of transmission computed tomography (TCT) images. Reconstruction of cross-section images in CT usually employs the Filtered Back Projection method. The new image reconstruction program reported here is based on the Bayesian Back Projection method, and it iteratively improves the image at every step of the measurement. Namely, this method can promptly display a cross-section image corresponding to each angled projection datum from every measurement. Hence, it is possible to observe an improved cross-section view reflecting each projection datum in almost real time. From the basic theory of the Bayesian Back Projection method, it is applicable to CT of the 1st, 2nd, and 3rd generations. This report deals with a reconstruction program of cross-section images in the CT of ...

  7. Image reconstruction of simulated specimens using convolution back projection

    Directory of Open Access Journals (Sweden)

    Mohd. Farhan Manzoor

    2012-04-01

    This paper reports on the reconstruction of cross-sections of composite structures. The convolution back projection (CBP) algorithm has been used to capture the attenuation field over the specimen. Five different test cases, representing varying degrees of complexity, have been taken up for evaluation. In addition, the role of filters in the nature of the reconstruction errors has also been discussed. Numerical results obtained in the study reveal that the CBP algorithm is a useful tool for qualitative as well as quantitative assessment of composite regions encountered in engineering applications.
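
The CBP algorithm itself is compact enough to sketch: each parallel-beam projection is convolved with a discrete ramp (Ram-Lak) kernel and then smeared back across the image grid. This is a hedged, nearest-neighbour toy version under invented geometry, not the authors' implementation; the phantom is a single bright pixel so the reconstruction peak is easy to check:

```python
import math

def ramlak(half):
    # Discrete ramp-filter kernel (unit detector spacing).
    k = []
    for m in range(-half, half + 1):
        if m == 0:
            k.append(0.25)
        elif m % 2 == 0:
            k.append(0.0)
        else:
            k.append(-1.0 / (math.pi * m) ** 2)
    return k

def convolve(p, k):
    # Direct convolution of a projection with the filter kernel.
    half = len(k) // 2
    out = []
    for i in range(len(p)):
        s = 0.0
        for j, kv in enumerate(k):
            idx = i + j - half
            if 0 <= idx < len(p):
                s += p[idx] * kv
        out.append(s)
    return out

def project(img, theta):
    # Nearest-detector parallel-beam projection of a square image.
    n = len(img)
    c = (n - 1) / 2.0
    proj = [0.0] * n
    for y in range(n):
        for x in range(n):
            t = (x - c) * math.cos(theta) + (y - c) * math.sin(theta)
            d = int(round(t + c))
            if 0 <= d < n:
                proj[d] += img[y][x]
    return proj

def cbp(projs, thetas, n):
    # Smear each filtered projection back across the grid.
    c = (n - 1) / 2.0
    img = [[0.0] * n for _ in range(n)]
    for proj, theta in zip(projs, thetas):
        for y in range(n):
            for x in range(n):
                t = (x - c) * math.cos(theta) + (y - c) * math.sin(theta)
                d = int(round(t + c))
                if 0 <= d < n:
                    img[y][x] += proj[d] * math.pi / len(thetas)
    return img

n = 15
phantom = [[0.0] * n for _ in range(n)]
phantom[7][10] = 1.0   # single bright pixel
thetas = [i * math.pi / 36 for i in range(36)]
filtered = [convolve(project(phantom, t), ramlak(n)) for t in thetas]
recon = cbp(filtered, thetas, n)
peak = max((recon[y][x], y, x) for y in range(n) for x in range(n))
print(peak[1], peak[2])   # peak location of the reconstruction
```

Swapping `ramlak` for a windowed variant (e.g. a Shepp-Logan-style kernel) changes the noise/sharpness trade-off, which is exactly the filter effect the paper discusses.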

  8. A Multi-Scale Weighted Back Projection Imaging Technique for Ground Penetrating Radar Applications

    OpenAIRE

    Wentai Lei; Ronghua Shi; Jian Dong; Yujia Shi

    2014-01-01

    In this paper, we propose a new ground penetrating radar (GPR) imaging technique based on multi-scale weighted back projection (BP) processing. Firstly, the whole imaging region is discretized at a large scale and a low-resolution imaging result is obtained using the traditional BP imaging technique. Secondly, the potential target regions (PTR) are delineated from the low-resolution imaging result using an intensity detection method. In the PTR, small-scale discretization is implemented and higher r...
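
The "traditional BP imaging technique" this method builds on is delay-and-sum back projection: each image pixel is scored by summing the recorded traces at the two-way travel time from antenna to pixel. A toy sketch with an invented velocity, antenna line, and point scatterer (not the paper's setup):

```python
import math

V = 0.1      # hypothetical propagation velocity, m/ns
DT = 0.1     # trace sampling interval, ns
N_SAMP = 400

antennas = [(i / 5, 0.0) for i in range(11)]   # surface positions, 0-2 m
target = (1.0, 0.6)                            # buried point scatterer (m)

def two_way_time(ant, pt):
    return 2 * math.dist(ant, pt) / V

# Synthesize traces: a unit spike at each antenna's two-way travel time.
traces = []
for ant in antennas:
    tr = [0.0] * N_SAMP
    idx = int(round(two_way_time(ant, target) / DT))
    if idx < N_SAMP:
        tr[idx] = 1.0
    traces.append(tr)

def back_project(x, z):
    # Delay-and-sum: read each trace at the travel time for pixel (x, z).
    s = 0.0
    for ant, tr in zip(antennas, traces):
        idx = int(round(two_way_time(ant, (x, z)) / DT))
        if idx < len(tr):
            s += tr[idx]
    return s

# Image a coarse grid and locate the energy peak.
best = max(((back_project(i / 10, j / 10), i / 10, j / 10)
            for i in range(21) for j in range(1, 11)))
print(best)   # the strongest pixel coincides with the scatterer
```

The multi-scale refinement in the paper amounts to running this loop first on a coarse grid, keeping only high-intensity regions, and re-running it there with finer discretization.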

  9. Dose reduction in computed tomography of the chest. Image quality of iterative reconstructions at a 50% radiation dose compared to filtered back projection at a 100% radiation dose; Dosisreduktion in der Thorax-CT. Vergleich der Bildqualitaet bei 50% Dosis und iterativer Bildrekonstruktion mit 100% Dosis und gefilterter Rueckprojektion

    Energy Technology Data Exchange (ETDEWEB)

    May, M.S.; Eller, A.; Stahl, C. [University Hospital Erlangen (Germany). Dept. of Radiology; and others

    2014-06-15

    Purpose: The aim of this study was to evaluate the potential of iterative reconstruction (IR) in chest computed tomography (CT) to reduce radiation exposure. The qualitative and quantitative image quality of standard reconstructions with filtered back projection (FBP) and half dose (HD) chest CT data reconstructed with FBP and IR was assessed. Materials and Methods: 52 consecutive patients underwent contrast-enhanced chest CT on a dual-source CT system at 120 kV and automatic exposure control. The tube current was equally split on both tube detector systems. For the HD datasets, only data from one tube detector system was utilized. Thus, FD and HD data was available for each patient with a single scan. Three datasets were reconstructed from the raw data: standard full dose (FD) images applying FBP which served as a reference, HD images applying FBP and IR. Objective image quality analysis was performed by measuring the image noise in tissue and air. The subjective image quality was evaluated by 2 radiologists according to European guidelines. Additional assessment of artifacts, lesion conspicuity and edge sharpness was performed. Results: Image noise did not differ significantly between HD-IR and FD-FBP (p = 0.254) but increased substantially in HD-FBP (p < 0.001). No statistically significant differences were found for the reproduction of anatomical and pathological structures between HD-IR and FD-FBP, subsegmental bronchi and bronchioli. The image quality of HD-FBP was rated inferior because of increased noise. Conclusion: A 50% dose reduction in contrast-enhanced chest CT is feasible without a loss of diagnostic confidence if IR is used for image data reconstruction. Iterative reconstruction is another powerful tool to reduce radiation exposure and can be combined with other dose-saving techniques. (orig.)

  10. UV Fluorescence Photography of Works of Art : Replacing the Traditional UV Cut Filters with Interference Filters

    Directory of Open Access Journals (Sweden)

    Luís BRAVO PEREIRA

    2010-09-01

    For many years filters like the Kodak Wratten E series, or the equivalent Schneider B+W 415, were used as standard UV cut filters, necessary to obtain good quality in UV fluorescence photography. The only problem with the use of these filters is that, when they receive the UV radiation they should remove, they exhibit an internal fluorescence of their own as a side effect, which usually reduces contrast and quality in the final image. This article presents the results of our experiments with some innovative filters that have appeared on the market in recent years, designed to absorb UV radiation even more efficiently than the pigment-based standard filters mentioned above: interference filters for UV rejection (and, usually, for IR rejection too), manufactured using interference layers, which give better results than the pigment-based filters. The only problem with interference filters is that they are sensitive to the direction of the rays and, because of that, they are not suitable for wide-angle lenses. The internal fluorescence of three filters was tested and compared: the B+W 415 UV cut (equivalent to the Kodak Wratten 2E), pigment based; the B+W 486 UV IR cut (an interference-type filter, used frequently on digital cameras to remove IR or UV); and the Baader UVIR rejection filter (two versions of this interference filter were used). The final quality of the UV fluorescence images appears superior when compared to the images obtained with classic filters.

  11. Traditional Tracking with Kalman Filter on Parallel Architectures

    CERN Document Server

    Cerati, Giuseppe; Lantz, Steven; MacNeill, Ian; McDermott, Kevin; Riley, Dan; Tadel, Matevz; Wittich, Peter; Wuerthwein, Frank; Yagil, Avi

    2014-01-01

    Power density constraints are limiting the performance improvements of modern CPUs. To address this, we have seen the introduction of lower-power, multi-core processors, but the future will be even more exciting. In order to stay within the power density limits but still obtain Moore's Law performance/price gains, it will be necessary to parallelize algorithms to exploit larger numbers of lightweight cores and specialized functions like large vector units. Example technologies today include Intel's Xeon Phi and GPGPUs. Track finding and fitting is one of the most computationally challenging problems for event reconstruction in particle physics. At the High Luminosity LHC, for example, this will be by far the dominant problem. The most common track finding techniques in use today are, however, those based on the Kalman Filter. Significant experience has been accumulated with these techniques on real tracking detector systems, both in the trigger and offline. We report the results of our investigations into the p...
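
As a reminder of the building block involved, here is a minimal 1-D constant-velocity Kalman filter over position-only measurements, with the 2x2 covariance algebra written out explicitly. The noise parameters and hit values are illustrative, not those of any real tracker:

```python
def kalman_track(zs, dt=1.0, q=1e-3, r=0.25):
    """Filter noisy position measurements with a constant-velocity model."""
    x, v = zs[0], 0.0                    # state: position and velocity
    P = [[1.0, 0.0], [0.0, 1.0]]         # state covariance
    for z in zs[1:]:
        # Predict: x' = x + v*dt, P' = F P F^T + Q with F = [[1, dt], [0, 1]].
        x += v * dt
        p00 = P[0][0] + dt * (P[0][1] + P[1][0]) + dt * dt * P[1][1] + q
        p01 = P[0][1] + dt * P[1][1]
        p10 = P[1][0] + dt * P[1][1]
        p11 = P[1][1] + q
        # Update with a position measurement z (H = [1, 0]).
        S = p00 + r                      # innovation variance
        k0, k1 = p00 / S, p10 / S        # Kalman gain
        resid = z - x
        x += k0 * resid
        v += k1 * resid
        P = [[(1 - k0) * p00, (1 - k0) * p01],
             [p10 - k1 * p00, p11 - k1 * p01]]
    return x, v

# Hits from a straight track pos = 2 + t; the filter should lock on.
hits = [2.0, 3.0, 4.0, 5.0, 6.0, 7.0, 8.0, 9.0]
pos, vel = kalman_track(hits)
print(round(pos, 2), round(vel, 2))
```

The sequential, per-hit structure of this update is precisely what makes Kalman-based track finding hard to vectorize, which is the parallelization problem the record describes.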

  12. Impulse response and Modulation Transfer Function analysis for Shift-And-Add and Back Projection image reconstruction algorithms in Digital Breast Tomosynthesis (DBT)

    OpenAIRE

    Chen, Ying; Lo, Joseph Y.; Dobbins, James T.

    2008-01-01

    Breast cancer is second only to lung cancer as the leading cause of non-preventable cancer death in women. Digital Breast Tomosynthesis (DBT) is a promising technique to improve early breast cancer detection. In this paper, we present the impulse response and Modulation Transfer Function (MTF) analysis to quantitatively compare Shift-And-Add (SAA) and point-by-point Back Projection (BP) three-dimensional image reconstruction algorithms in DBT. A Filtered Back Projection (FBP) deblurring algor...

  13. UV Fluorescence Photography of Works of Art : Replacing the Traditional UV Cut Filters with Interference Filters

    OpenAIRE

    Bravo Pereira, Luís

    2010-01-01

    For many years filters like the Kodak Wratten E series, or the equivalent Schneider B+W 415, were used as standard UV cut filters, necessary to obtain good quality on UV Fluorescence photography. The only problem with the use of these filters is that, when they receive the UV radiation that they should remove, they present themselves an internal fluorescence as side effect, that usually reduce contrast and quality on the final image. This article presents the results of our experiences on usi...

  14. Investigating the 2010 Moro Gulf deep earthquake sequence using the continuous back-projection technique

    Science.gov (United States)

    Kowalke, S. M.; Kiser, E.; Ishii, M.

    2011-12-01

    Deep earthquakes make up approximately one-quarter of all earthquakes, yet current understanding of the mechanisms for deep (300-700 km) earthquake generation fails to explain why earthquakes at these depths occur at all. Various mechanisms, such as metastable phase change, have been proposed; however, there is a lack of observational constraints on the properties of deep earthquakes, which deters our understanding. In order to advance our knowledge of the deep Earth, such as its chemistry, mineralogy, and dynamics, robust constraints on the mechanisms of deep earthquakes are required. The goal of this project is to explore the mechanism(s) behind deep earthquakes by studying the 2010 Moro Gulf deep earthquake sequence using a continuous back-projection technique. The Moro Gulf sequence features a 'triplet' of earthquakes with hypocentral depths between 585 and 640 km. The triplet earthquakes are particularly unusual because they are large-magnitude events (Mw7.3, 7.6, and 7.4) that occurred within an hour and a half of each other, which does not agree with the Gutenberg-Richter relationship. No other triplet sequences of this magnitude have been recorded within such a short time period. Another anomalous characteristic of these earthquakes is the emergent waveforms of the sequence. Typically, deep earthquakes have impulsive waveforms, which is thought to be associated with a more rapid stress drop than in shallow events, for which the stress drop occurs more gradually. The back-projection technique is an efficient method to constrain earthquake rupture properties, such as rupture direction, rupture speed, location, timing, and relative energy release of an earthquake. It requires high-quality data from a dense network of seismic stations covering a large area, and we use data from the High-Sensitivity Seismograph Network (Hi-net) in Japan and the US Transportable Array (US-TA). The technique creates a grid of potential source locations around the hypocenter.
The seismograms are time-shifted and stacked at each grid point. The data from the three earthquakes are filtered to a frequency range of 0.8-2 Hz in order to maximize the resolution of the back-projection and to capture the first-arriving P-waves. Another crucial step is aligning the P- and PKP(DF)-phases for Hi-net and US-TA, respectively. Since alignment has a strong influence on the back-projection, we aligned the US-TA phases using an aftershock, because smaller-magnitude earthquakes often improve alignment. The back-projection technique has been used in previous studies to investigate both shallow (0-100 km) and intermediate-depth (100-300 km) earthquakes. Applying back-projection to deep earthquakes is expected to improve depth resolution through the incorporation of depth phases, which, when combined with downward take-off phases, constrain the hypocentral depth more accurately. Better spatial resolution is also achieved by combining the Hi-net and US-TA arrays, whose signals overlap at a more accurate location of the rupture. Back-projection analysis of the Moro Gulf sequence is expected to provide a detailed rupture process of these deep events with high depth and spatial resolution.
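
The core of the technique described above, time-shifting and stacking array seismograms at each candidate grid point, can be sketched as follows. This is a minimal illustration with hypothetical names; real implementations add alignment corrections, tapering, and sliding time windows:

```python
import numpy as np

def beam_power(traces, delays, dt):
    """Shift each trace back by its predicted travel time and stack;
    return the power of the linear stack (a measure of coherent energy)."""
    n_sta = traces.shape[0]
    stack = np.zeros(traces.shape[1])
    for trace, delay in zip(traces, delays):
        stack += np.roll(trace, -int(round(delay / dt)))
    stack /= n_sta
    return float(np.sum(stack ** 2))

def back_project(traces, travel_times, dt):
    """travel_times[g, k]: predicted travel time from grid point g to
    station k; returns the beam power at every candidate grid point."""
    return np.array([beam_power(traces, tt, dt) for tt in travel_times])
```

The grid point whose predicted delays best align the waveforms receives the largest beam power; mapping that power over the grid through time images the rupture.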

  15. An accelerated threshold-based back-projection algorithm for Compton camera image reconstruction

    International Nuclear Information System (INIS)

    Purpose: Compton camera imaging (CCI) systems are currently under investigation for radiotherapy dose reconstruction and verification. The ability of such a system to provide real-time images during dose delivery will be limited by the computational speed of the image reconstruction algorithm. In this work, the authors present a fast and simple method by which to generate an initial back-projected image from acquired CCI data, suitable for use in a filtered back-projection algorithm or as a starting point for iterative reconstruction algorithms, and compare its performance to the current state of the art. Methods: Each detector event in a CCI system describes a conical surface that includes the true point of origin of the detected photon. Numerical image reconstruction algorithms require, as a first step, the back-projection of each of these conical surfaces into an image space. The algorithm presented here first generates a solution matrix for each slice of the image space by solving the intersection of the conical surface with the image plane. Each element of the solution matrix is proportional to the distance of the corresponding voxel from the true intersection curve. A threshold function was developed to extract those pixels sufficiently close to the true intersection to generate a binary intersection curve. This process is repeated for each image plane for each CCI detector event, resulting in a three-dimensional back-projection image. The performance of this algorithm was tested against a marching algorithm known for speed and accuracy. Results: The threshold-based algorithm was found to be approximately four times faster than the current state of the art with minimal deficit to image quality, arising from the fact that a generically applicable threshold function cannot provide perfect results in all situations. 
The algorithm fails to extract a complete intersection curve in image slices near the detector surface for detector event cones having axes nearly parallel to the image plane. This effect decreases the sum of the image, thereby also affecting the mean, standard deviation, and SNR of the image. All back-projected events associated with a simulated point source intersected the voxel containing the source and the FWHM of the back-projected image was similar to that obtained from the marching method. Conclusions: The slight deficit to image quality observed with the threshold-based back-projection algorithm described here is outweighed by the 75% reduction in computation time. The implementation of this method requires the development of an optimum threshold function, which determines the overall accuracy of the method. This makes the algorithm well-suited to applications involving the reconstruction of many large images, where the time invested in threshold development is offset by the decreased image reconstruction time. Implemented in a parallel-computing environment, the threshold-based algorithm has the potential to provide real-time dose verification for radiation therapy.
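
The per-slice solution-matrix idea can be illustrated with a simple algebraic residual: for each pixel, how far its direction from the cone apex deviates from the cone half-angle, followed by a fixed threshold. The paper develops an optimized threshold function; the constant tolerance and all names below are illustrative assumptions:

```python
import numpy as np

def cone_slice_mask(apex, axis, half_angle, z, xs, ys, tol):
    """Binary mask of pixels in the plane at height z that lie close to
    the intersection of that plane with the event cone."""
    axis = np.asarray(axis, float) / np.linalg.norm(axis)
    X, Y = np.meshgrid(xs, ys, indexing="ij")
    vx, vy, vz = X - apex[0], Y - apex[1], np.full_like(X, z - apex[2])
    norm = np.sqrt(vx ** 2 + vy ** 2 + vz ** 2)
    cos_angle = (axis[0] * vx + axis[1] * vy + axis[2] * vz) / np.maximum(norm, 1e-12)
    residual = np.abs(cos_angle - np.cos(half_angle))  # zero exactly on the cone
    return residual < tol

def back_project_events(events, zs, xs, ys, tol):
    """Sum the binary intersection curves of all detector events into a
    three-dimensional back-projection image."""
    img = np.zeros((len(zs), len(xs), len(ys)))
    for apex, axis, half_angle in events:
        for k, z in enumerate(zs):
            img[k] += cone_slice_mask(apex, axis, half_angle, z, xs, ys, tol)
    return img
```

For a cone with apex at the origin, axis along z, and 45° half-angle, the z = 1 slice yields a circle of radius 1, as expected.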

  17. An improved back projection algorithm of ultrasound tomography

    Science.gov (United States)

    Xiaozhen, Chen; Mingxu, Su; Xiaoshu, Cai

    2014-04-01

    The binary logic back projection algorithm is improved in this work to develop a fast ultrasound tomography system with better image reconstruction. The new algorithm is characterized by an extra logical value `2' and dual-threshold processing of the collected raw data. For comparison with the original algorithm, a numerical simulation was first verified against COMSOL simulations, and an ultrasonic tomography system was then established to perform experiments with one, two, and three cylindrical objects. The object images are reconstructed by inverting the signal matrix acquired by the transducer array after preconditioning, and the corresponding spatial imaging errors indicate that the improved back projection method achieves a better inversion result.
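
The dual-threshold step with the extra logical value `2' can be sketched as below. The mapping of the three values to `blocked', `clear', and `ambiguous' is our assumption for illustration, not necessarily the paper's exact convention:

```python
import numpy as np

UNCERTAIN = 2  # the extra logical value added by the improved algorithm

def dual_threshold(amplitudes, lo, hi):
    """Classify received path amplitudes with two thresholds instead of
    one: 1 = path blocked by an object, 0 = path clear, 2 = ambiguous."""
    amplitudes = np.asarray(amplitudes, float)
    out = np.full(amplitudes.shape, UNCERTAIN, dtype=int)
    out[amplitudes >= hi] = 0
    out[amplitudes <= lo] = 1
    return out
```

The back projection then treats only the unambiguous paths as hard evidence, deferring the `2'-valued paths to later processing.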

  18. Tradition.

    Science.gov (United States)

    Cowan, Elizabeth

    Abstract words such as "tradition" are like ancient coins whose concrete images have worn away. Traditions can be of two forms--either alive, amendable, and expandable (such as those in a family's annual Christmas celebration), or dead, empty formalities. An example of an empty tradition is the strict rule in freshman composition classes that…

  19. Modified Kalman Filter-based Approach in Comparison with Traditional Speech Enhancement Algorithms from Adverse Noisy Environments

    OpenAIRE

    G. Ramesh Babu; Rameshwara Rao

    2011-01-01

    The paper presents a new approach for single-channel speech enhancement in a noisy environment. In this method, speech is mixed with real-world noises from babble, car, and street environments. We propose a modified Kalman method for effective speech enhancement. The proposed method is compared to the traditional Spectral Subtraction (SS), Wiener Filter (WF), Minimum Mean Square Error (MMSE), and Wavelet-based Filter (WAVELET). Experiments showed that the modifi...

  20. Solving the Electrical Impedance Tomography (EIT) inverse problem by the conductivity and back projection methods

    Directory of Open Access Journals (Sweden)

    I. O. Rybina

    2011-06-01

    Full Text Available In this paper, a comparative analysis of the back projection method and the finite element method for image reconstruction in EIT is carried out (voltages measured on electrodes attached around the phantom correspond to transfer resistances obtained by solving the forward problem, followed by an iterative procedure for solving the inverse problem). Advantages of the back projection method are that it avoids solving the unwieldy forward problem (with its great number of finite elements) and that it reconstructs the image by simple projection of the transfer resistances measured along equipotential lines. A disadvantage of the back projection method in EIT is the absence of information about the trajectories of the equipotential lines when the section contains inhomogeneities, whose detection and monitoring is precisely the task of EIT. Moreover, with the back projection method it is not the true resistivity (conductivity) distribution of the tomographic section that is visualized, but the transfer resistances, which are complex functions of the desired resistivities; the method therefore cannot be considered mathematically rigorous. A modernized finite element method, using the modification method and the conductivity-zones method, avoids (by accounting for the phantom structure and structuring the iterative procedure through the introduction of conductivity zones) the standard difficulties that hinder the use of the more rigorous finite element method for solving the image reconstruction problem in Electrical Impedance Tomography. The presented theses are illustrated with a computational phantom chosen to allow simple verification of the results.

  1. A fast marching method based back projection algorithm for photoacoustic tomography in heterogeneous media

    CERN Document Server

    Wang, Tianren

    2015-01-01

    This paper presents a numerical study on a fast marching method based back projection reconstruction algorithm for photoacoustic tomography in heterogeneous media. Transcranial imaging is used here as a case study. To correct for the phase aberration from the heterogeneity (i.e., skull), the fast marching method is adopted to compute the phase delay based on the known speed of sound distribution, and the phase delay is taken into account by the back projection algorithm for more accurate reconstructions. It is shown that the proposed algorithm is more accurate than the conventional back projection algorithm, but slightly less accurate than the time reversal algorithm particularly in the area close to the skull. However, the image reconstruction time for the proposed algorithm can be as little as 124 ms when implemented by a GPU (512 sensors, 21323 pixels reconstructed), which is two orders of magnitude faster than the time reversal reconstruction. The proposed algorithm, therefore, not only corrects for the p...
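
A rough sketch of the idea: compute a travel-time map per sensor from the known speed-of-sound distribution, then delay-and-sum. Here a Dijkstra grid search stands in for the fast marching eikonal solver (first-order, 4-connected, so only a crude approximation), and all names are illustrative:

```python
import heapq
import numpy as np

def grid_travel_times(slowness, src, h):
    """Dijkstra on a 4-connected grid: travel time from src to every
    pixel; edge cost = step length times the mean slowness of its ends."""
    ny, nx = slowness.shape
    tt = np.full((ny, nx), np.inf)
    tt[src] = 0.0
    pq = [(0.0, src)]
    while pq:
        t, (i, j) = heapq.heappop(pq)
        if t > tt[i, j]:
            continue
        for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            ni, nj = i + di, j + dj
            if 0 <= ni < ny and 0 <= nj < nx:
                cand = t + h * 0.5 * (slowness[i, j] + slowness[ni, nj])
                if cand < tt[ni, nj]:
                    tt[ni, nj] = cand
                    heapq.heappush(pq, (cand, (ni, nj)))
    return tt

def back_project(signals, tt_maps, dt):
    """Delay-and-sum: read each sensor's signal at each pixel's travel time."""
    img = np.zeros_like(tt_maps[0])
    for sig, tt in zip(signals, tt_maps):
        idx = np.clip(np.round(tt / dt).astype(int), 0, len(sig) - 1)
        img += sig[idx]
    return img / len(signals)
```

Accounting for a spatially varying slowness in the travel-time maps is what corrects the phase aberration that a constant-speed back projection would suffer from.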

  2. SAR focusing of P-band ice sounding data using back-projection

    DEFF Research Database (Denmark)

    Kusk, Anders; Dall, Jørgen

    2010-01-01

    SAR processing can be applied to ice sounder data to improve along-track resolution and clutter suppression. This paper presents a time-domain back-projection technique for SAR focusing of ice sounder data. With this technique, variations in flight track and ice surface slope can be accurately accommodated at the expense of computation time. The back-projection algorithm can be easily parallelized, however, and can advantageously be implemented on a graphics processing unit (GPU). Results from using the back-projection algorithm on POLARIS ice sounder data from North Greenland show that the quality of the data is improved by the processing, and the performance of the GPU implementation allows for very fast focusing.

  3. Improved reconstructions and generalized filtered back projection for optical projection tomography.

    Science.gov (United States)

    Birk, Udo Jochen; Darrell, Alex; Konstantinides, Nikos; Sarasa-Renedo, Ana; Ripoll, Jorge

    2011-02-01

    Optical projection tomography (OPT) is a noninvasive imaging technique that enables imaging of small specimens (Parhyale hawaiensis embedded in glycerol and in sea water. Successful reconstructions of fluorescence and absorption OPT images have been obtained for weakly scattering specimens embedded in media with nonmatched refractive index, thus advancing OPT toward routine in vivo imaging. PMID:21283227

  4. Combining time-domain back-projection and capon beamforming for tomographic SAR processing

    OpenAIRE

    Frey, O.; Meier, E.

    2008-01-01

    Various tomographic processing methods have been investigated in recent years. The quality of the focused tomographic image is usually limited by several factors. In particular, Fourier-based focusing methods are susceptible to irregular and sparse sampling, two problems that are unavoidable in case of multi-pass, multi-baseline SAR data acquired by an airborne system. Neither time-domain back-projection (TDBP) processing, although providing a very accurate processing framework, is able to ove...

  5. Comparison of back projection methods of determining earthquake rupture process in time and frequency domains

    Science.gov (United States)

    Wang, W.; Wen, L.

    2013-12-01

    Back projection is a method that projects the seismic energy recorded by a seismic array back to the earthquake source region to determine the rupture process of a large earthquake. The method takes advantage of the coherence of seismic energy across a seismic array and is quick in determining some important properties of the earthquake source. The method can be performed in both the time and frequency domains. In the time domain, the most conventional procedure is beam forming with some measures for suppressing noise, such as Nth-root stacking. In the frequency domain, the multiple signal classification (MUSIC) method estimates the directions of arrival of multiple waves propagating through an array using a subspace method. The advantage of this method is the ability to study rupture properties at various frequencies and to resolve simultaneous arrivals, making it suitable for detecting bilateral rupture of an earthquake source. We present a comparison of back projection results for some large earthquakes between the time-domain and frequency-domain methods. The time-domain procedure produces an image that is smeared and exhibits some artifacts, although enhanced stacking methods can to some extent alleviate the problem. On the other hand, the MUSIC method resolves clear multiple arrivals and provides higher-resolution rupture imaging.
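
For a uniform linear array, the MUSIC pseudospectrum mentioned above can be computed from the eigendecomposition of the sample covariance. A minimal narrowband, far-field sketch (names illustrative):

```python
import numpy as np

def music_spectrum(R, n_src, d, wavelength, angles_deg):
    """MUSIC pseudospectrum 1 / ||E_n^H a(theta)||^2, where E_n spans the
    noise subspace (eigenvectors of the smallest eigenvalues of R)."""
    n = R.shape[0]
    _, V = np.linalg.eigh(R)       # eigenvalues in ascending order
    En = V[:, : n - n_src]         # noise-subspace eigenvectors
    k = 2 * np.pi / wavelength
    spec = []
    for th in np.deg2rad(angles_deg):
        a = np.exp(1j * k * d * np.arange(n) * np.sin(th))  # steering vector
        spec.append(1.0 / np.linalg.norm(En.conj().T @ a) ** 2)
    return np.array(spec)
```

Peaks of the pseudospectrum mark the directions of arrival; with enough snapshots, two simultaneous arrivals produce two distinct peaks, which is what makes the method attractive for bilateral ruptures.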

  6. Realtime parallel back-projection algorithm for three-dimensional optoacoustic imaging devices

    Science.gov (United States)

    Ozbek, Ali; Deán-Ben, X. L.; Razansky, Daniel

    2013-06-01

    Back-projection algorithms are probably the fastest approach to reconstructing an image from an optoacoustic (photoacoustic) data set. However, standard implementations of back-projection formulae are still not adequate for real-time (greater than 5 frames per second) visualization of three-dimensional structures. This is due to the fact that the number of voxels to be reconstructed in three dimensions is orders of magnitude larger than the number of pixels in two dimensions. Herein we describe a parallel implementation of optoacoustic signal processing and back-projection reconstruction in an attempt to achieve real-time visualization of structures with three-dimensional optoacoustic tomographic systems. For this purpose, the parallel computing power of a graphics processing unit (GPU) is utilized. The GPU is programmed with OpenCL, a programming language for heterogeneous platforms. We show that, with the implementation suggested in this work, imaging at frame rates of up to 50 high-resolution three-dimensional images per second is achievable.

  7. [ART reconstruction from few views using bilateral-filtering iterative method].

    Science.gov (United States)

    Qi, Hongliang; Zhou, Linghong; Xu, Yuan; Hong, Hong; Lu, Wenting; Zhen, Xin

    2013-04-01

    An algebraic image reconstruction from few views using a bilateral-filtering iterative method is proposed in the present study to address the problem of insufficient computed tomography data. In each iteration, we first used the algebraic reconstruction technique (ART) algorithm to reconstruct an image, ensuring the non-negativity of the reconstructed image at the same time, and then applied bilateral filtering to this image. In order to improve reconstructed image quality and accelerate convergence, we developed a modified bilateral-filtering method. Shepp-Logan simulation experiments and reconstructions from real CT projection data showed the feasibility of the algorithm. The results showed that, compared with the traditional methods of filtered back projection (FBP), ART, and GF-ART, the proposed method has a higher signal-to-noise ratio and maintains image edge information more effectively. PMID:23858736
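
One combined iteration of such a scheme, an ART (Kaczmarz) sweep with non-negativity followed by bilateral filtering of the intermediate image, might look like this. This uses a plain bilateral filter, not the paper's modified variant, and all parameters are illustrative:

```python
import numpy as np

def art_sweep(A, b, x, relax=0.5):
    """One Kaczmarz sweep of ART, enforcing non-negativity afterwards."""
    for ai, bi in zip(A, b):
        nrm = ai @ ai
        if nrm > 0:
            x = x + relax * (bi - ai @ x) / nrm * ai
    return np.maximum(x, 0.0)

def bilateral(img, sigma_s=1.0, sigma_r=0.1, radius=2):
    """Brute-force bilateral filter: spatial and range Gaussian weights,
    so smoothing happens within regions while edges are preserved."""
    out = np.empty_like(img)
    H, W = img.shape
    for i in range(H):
        for j in range(W):
            i0, i1 = max(i - radius, 0), min(i + radius + 1, H)
            j0, j1 = max(j - radius, 0), min(j + radius + 1, W)
            patch = img[i0:i1, j0:j1]
            yy, xx = np.mgrid[i0:i1, j0:j1]
            w = np.exp(-((yy - i) ** 2 + (xx - j) ** 2) / (2 * sigma_s ** 2)
                       - (patch - img[i, j]) ** 2 / (2 * sigma_r ** 2))
            out[i, j] = (w * patch).sum() / w.sum()
    return out
```

The outer loop alternates `x = art_sweep(A, b, x)` with `x = bilateral(x.reshape(n, n)).ravel()` until convergence.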

  8. Decoding using back-project algorithm from coded image in ICF

    International Nuclear Information System (INIS)

    The principle of coded imaging and its decoding in inertial confinement fusion is briefly described. The authors take the ring aperture microscope as an example and use a back-project (BP) algorithm to decode the coded image. The decoding program has been implemented for numerical simulation. Simulations of two models were made, and the results show that the BP algorithm is accurate and the reconstruction quality is good. This indicates that the BP algorithm is applicable to decoding coded images in ICF experiments

  9. Earthquake source characterization by the isochrone back projection method using near-source ground motions

    Science.gov (United States)

    Jakka, R. S.; Cochran, E. S.; Lawrence, J. F.

    2010-08-01

    Rapid and accurate assessment of the source characteristics of moderate to large earthquakes is extremely useful for hazard assessment and to guide the response of emergency services. Using the back projection method (BPM) it is possible to obtain an image of the source rupture process rapidly. The potential of the method for identifying the rupture propagation and its slip distribution has been shown in previous studies. However, most of the earlier back projection implementations obtain only slip intensities, not slip amplitudes. Here, we propose a method that is capable of providing quick estimates of the slip amplitude, in addition to its distribution across the fault plane, using high-frequency near-source records. First, we explore the advantages and limitations of the proposed BPM through a series of synthetic examples. We demonstrate the utility of the method to identify slip asperities and their associated intensities, with a limited number of stations (weighting scheme is introduced. To test the BPM, we apply the method to the 2004 Mw 6.0 Parkfield earthquake using available near-source seismic data. The method identifies similar locations and amplitudes of slip using either P- or S-wave displacement records. And, for real earthquakes, we find that a significant number of observations are needed around the source to reduce the influence of local propagation and site effects.

  11. A comparative study between matched and mis-matched projection/back projection pairs used with ASIRT reconstruction method

    Energy Technology Data Exchange (ETDEWEB)

    Guedouar, R., E-mail: raja_guedouar@yahoo.f [Higher School of Health Sciences and Techniques of Monastir, Av. Avicenne, 5060 Monastir, B.P. 128 (Tunisia); Zarrad, B., E-mail: boubakerzarrad@yahoo.f [Higher School of Health Sciences and Techniques of Monastir, Av. Avicenne, 5060 Monastir, B.P. 128 (Tunisia)

    2010-07-21

    For algebraic reconstruction techniques, both forward and back projection operators are needed. The ability to perform accurate reconstruction relies fundamentally on the forward projection and back projection methods, which are usually the transpose of each other. Even though mis-matched pairs may introduce additional errors during the iterative process, the usefulness of mis-matched projector/back-projector pairs has been proved in image reconstruction. This work investigates the performance of matched and mis-matched reconstruction pairs using popular forward projectors and their transposes when used in reconstruction tasks with additive simultaneous iterative reconstruction techniques (ASIRT) in a parallel-beam approach. Simulated noiseless phantoms are used to compare the performance of the investigated pairs in terms of the root mean squared errors (RMSE), calculated between reconstructed slices and the reference in different regions. Results show that mis-matched projection/back projection pairs can yield more accurate reconstructed images than matched ones. The forward projection operator's performance seems independent of the choice of the back projection operator, and vice versa.

  12. Radiation dose reduction in time-resolved CT angiography using highly constrained back projection reconstruction

    Energy Technology Data Exchange (ETDEWEB)

    Supanich, Mark; Tao Yinghua; Nett, Brian; Mistretta, Charles; Chen Guanghong [Department of Medical Physics, University of Wisconsin in Madison, 1111 Highland Avenue, Madison, WI, 53705 (United States); Pulfer, Kari; Turski, Patrick; Rowley, Howard [Department of Radiology, University of Wisconsin in Madison, 600 Highland Avenue, Madison, WI, 53792 (United States); Hsieh Jiang [GE Medical Systems, 300 N, Grandview Blvd., Waukesha, WI 53188 (United States)], E-mail: gchen7@wisc.edu

    2009-07-21

    Recently, dynamic time-resolved three-dimensional computed tomography angiography (CTA) has been introduced to the neurological imaging community. However, the radiation dose delivered to patients in a time-resolved CTA protocol is high, and there are potential risks associated with ionizing radiation. Thus, minimizing the radiation dose is highly desirable for time-resolved CTA. In order to reduce the radiation dose delivered during dynamic, contrast-enhanced CT applications, we introduce here the CT formulation of HighlY constrained back PRojection (HYPR) imaging. We explore two radiation dose reduction approaches: acquiring a reduced number of projections for each image, and lowering the tube current used during acquisition. We then apply HYPR image reconstruction to produce image sets at a reduced patient dose and with low image noise. Numerical phantom experiments and retrospective analysis of in vivo canine studies are used to assess the accuracy and quality of HYPR reduced-dose image sets and validate our approach. Experimental results demonstrate that a factor of 6-8 radiation dose reduction is possible when the HYPR algorithm is applied to time-resolved CTA exams.
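
The essence of HYPR can be shown schematically in one dimension: a high-SNR composite built from many acquisitions is modulated by the ratio of low-pass-filtered frame data to the low-pass-filtered composite, so each frame inherits the composite's SNR. This toy version operates directly on signals rather than on projections, so it only illustrates the weighting idea; the names are ours:

```python
import numpy as np

def smooth(x, width):
    """Moving-average low-pass filter (a stand-in for the unfiltered
    back-projection blur that real HYPR uses)."""
    return np.convolve(x, np.ones(width) / width, mode="same")

def hypr_frame(frame_noisy, composite, width=9, eps=1e-9):
    """HYPR-style weighting: composite modulated by a smoothed ratio."""
    return composite * smooth(frame_noisy, width) / (smooth(composite, width) + eps)
```

Because the noisy frame only enters through a low-pass ratio, its noise is suppressed while its temporal weighting is retained, which is how dose can be cut per frame.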

  13. Direct aperture optimization using an inverse form of back-projection.

    Science.gov (United States)

    Zhu, Xiaofeng; Cullip, Timothy; Tracton, Gregg; Tang, Xiaoli; Lian, Jun; Dooley, John; Chang, Sha X

    2014-01-01

    Direct aperture optimization (DAO) has been used to produce high dosimetric quality intensity-modulated radiotherapy (IMRT) treatment plans with fast treatment delivery by directly modeling the multileaf collimator segment shapes and weights. To improve plan quality and reduce treatment time for our in-house treatment planning system, we implemented a new DAO approach without using a global objective function (GOF). An index concept is introduced as an inverse form of back-projection used in the CT multiplicative algebraic reconstruction technique (MART). The index, introduced for IMRT optimization in this work, is analogous to the multiplicand in MART. The index is defined as the ratio of the optima over the current. It is assigned to each voxel and beamlet to optimize the fluence map. The indices for beamlets and segments are used to optimize multileaf collimator (MLC) segment shapes and segment weights, respectively. Preliminary data show that without sacrificing dosimetric quality, the implementation of the DAO reduced average IMRT treatment time from 13 min to 8 min for the prostate, and from 15 min to 9 min for the head and neck using our in-house treatment planning system PlanUNC. The DAO approach has also shown promise in optimizing rotational IMRT with burst mode in a head and neck test case. PMID:24710439
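
The `index' is borrowed from MART, where each measurement updates the image multiplicatively by the ratio of measured to predicted projection. A minimal MART sweep for a generic non-negative linear system (illustrative only, not the planning-system code):

```python
import numpy as np

def mart_sweep(A, b, x, relax=1.0, eps=1e-12):
    """One MART sweep: x_j *= (b_i / predicted_i) ** (relax * a_ij).
    Requires non-negative A and strictly positive b and starting x."""
    for ai, bi in zip(A, b):
        pred = ai @ x
        if pred > eps and bi > eps:
            index = bi / pred          # the measured/predicted ratio
            x = x * index ** (relax * ai)
    return x
```

In the DAO analogue described above, the same ratio-of-optimal-to-current index is attached to voxels and beamlets to drive the fluence map, segment shapes, and segment weights toward their optima without a global objective function.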

  14. Rupture imaging of the Mw 7.9 12 May 2008 Wenchuan earthquake from back projection of teleseismic P waves

    Science.gov (United States)

    Xu, Yan; Koper, Keith D.; Sufri, Oner; Zhu, Lupei; Hutko, Alexander R.

    2009-04-01

    The Mw 7.9 Wenchuan earthquake of 12 May 2008 was the most destructive Chinese earthquake since the 1976 Tangshan event. Tens of thousands of people were killed, hundreds of thousands were injured, and millions were left homeless. Here we infer the detailed rupture process of the Wenchuan earthquake by back-projecting teleseismic P energy from several arrays of seismometers. This technique has only recently become feasible and is potentially faster than traditional finite-fault inversion of teleseismic body waves; therefore, it may reduce the notification time to emergency response agencies. Using the IRIS DMC, we collected 255 vertical component broadband P waves at 30-95° from the epicenter. We found that at periods of 5 s and greater, nearly all of these P waves were coherent enough to be used in a global array. We applied a simple down-sampling heuristic to define a global subarray of 70 stations that reduced the asymmetry and sidelobes of the array response function (ARF). We also considered three regional subarrays of seismometers in Alaska, Australia, and Europe that had apertures less than 30° and P waves that were coherent to periods as short as 1 s. Individual ARFs for these subarrays were skewed toward the subarrays; however, the linear sum of the regional subarray beams at 1 s produced a symmetric ARF, similar to that of the groomed global subarray at 5 s. For both configurations we obtained the same rupture direction, rupture length, and rupture time. We found that the Wenchuan earthquake had three distinct pulses of high beam power at 0, 23, and 57 s after the origin time, with the pulse at 23 s being highest, and that it ruptured unilaterally to the northeast for about 300 km and 110 s, with an average speed of 2.8 km/s. 
It is possible that similar results can be determined for future large dip-slip earthquakes within 20-30 min of the origin time using relatively sparse global networks of seismometers such as those the USGS uses to locate earthquakes in near-real time.

  15. Importance of point-by-point back projection correction for isocentric motion in digital breast tomosynthesis: Relevance to morphology of structures such as microcalcifications

    International Nuclear Information System (INIS)

    Digital breast tomosynthesis is a three-dimensional imaging technique that provides an arbitrary set of reconstruction planes in the breast from a limited-angle series of projection images acquired while the x-ray tube moves. Traditional shift-and-add (SAA) tomosynthesis reconstruction is a common mathematical method to line up each projection image based on its shifting amount to generate reconstruction slices. With parallel-path geometry of tube motion, the path of the tube lies in a plane parallel to the plane of the detector. The traditional SAA algorithm gives shift amounts for each projection image calculated only along the direction of x-ray tube movement. However, with the partial isocentric motion of the x-ray tube in breast tomosynthesis, small objects such as microcalcifications appear blurred (for instance, about 1-4 pixels in blur for a microcalcification in a human breast) in traditional SAA images in the direction perpendicular to the direction of tube motion. Some digital breast tomosynthesis algorithms reported in the literature utilize a traditional one-dimensional SAA method that is not wholly suitable for isocentric motion. In this paper, a point-by-point back projection (BP) method is described and compared with traditional SAA for the important clinical task of evaluating morphology of small objects such as microcalcifications. Impulse responses at different three-dimensional locations with five different combinations of imaging acquisition parameters were investigated. Reconstruction images of microcalcifications in a human subject were also evaluated. Results showed that with traditional SAA and 45 deg. view angle of tube movement with respect to the detector, at the same height above the detector, the in-plane blur artifacts were obvious for objects farther away from x-ray source. 
In a human subject, the appearance of calcifications was blurred in the direction orthogonal to the tube motion with traditional SAA. With point-by-point BP, the appearance of calcifications was sharper. The point-by-point BP method demonstrated improved rendition of microcalcifications in the direction perpendicular to the tube motion direction. With wide angles or for imaging of larger breasts, this point-by-point BP rather than the traditional SAA should also be considered as the basis of further deblurring algorithms that work in conjunction with the BP method.
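
The geometric difference between point-by-point BP and one-dimensional SAA can be seen from the exact projection of a voxel onto the detector plane (z = 0): with partial isocentric tube motion, the detector coordinate perpendicular to the tube travel also changes with tube angle, which SAA ignores. A small sketch with illustrative dimensions:

```python
import numpy as np

def project_voxel(src, voxel):
    """Detector-plane (z = 0) intersection of the ray from the x-ray
    source through a voxel: the sampling point used by point-by-point BP."""
    sx, sy, sz = src
    vx, vy, vz = voxel
    t = sz / (sz - vz)              # ray parameter at the detector plane
    return sx + t * (vx - sx), sy + t * (vy - sy)

def src_at(theta_deg, dist=600.0):
    """Tube position on a 600 mm arc about the detector origin
    (a simplified stand-in for partial isocentric motion)."""
    th = np.deg2rad(theta_deg)
    return (dist * np.sin(th), 0.0, dist * np.cos(th))
```

For a voxel at (0, 50, 40) mm, the v (y) detector coordinate differs between the 0° and 22.5° views; since SAA applies only a single x shift per plane, that difference appears as blur perpendicular to the tube motion.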

  16. GPU-accelerated back-projection revisited. Squeezing performance by careful tuning

    International Nuclear Information System (INIS)

    In recent years, GPUs have become an increasingly popular tool in computed tomography (CT) reconstruction. In this paper, we discuss performance optimization techniques for a GPU-based filtered-backprojection reconstruction implementation. We explore the different optimization techniques we used and explain how those techniques affected performance. Our results show a nearly 50% increase in performance when compared to the current top-ranked GPU implementation. (orig.)

  17. UWB Short-Pulse Radar: Combining Trilateration and Back Projection for Through-the-Wall Radar Imaging

    Science.gov (United States)

    Daho, O. B.; Khamlichi, J.; Ménard, M.; Gaugue, A.

    In this chapter, we propose a novel way to combine back projection and trilateration algorithms for through-the-wall imaging using an ultra-wideband (UWB) short-pulse radar system. The combination of the two algorithms increases the detection-localization performance. To accomplish this improvement, the multi-target localization problem of trilateration is addressed by the calculation of the root-mean-square error with regard to the estimated position and those of all possible target positions. The radar system's entire processing pipeline is described, with a focus on the imaging block. The data were acquired using a multistatic radar system with a 3.2 GHz bandwidth. Simulations and experiments indicate that our combined method outperforms other methods. Simulation and experimental results are shown, compared, and discussed.
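
The trilateration half of the combination admits a simple linear least-squares solution: subtracting the first range equation cancels the quadratic term. A minimal 2-D sketch (names illustrative; the chapter's method additionally scores candidate positions by root-mean-square error):

```python
import numpy as np

def trilaterate(anchors, ranges):
    """Position from ranges to known anchors via linearized least squares:
    subtracting the first equation |p - a_0|^2 = r_0^2 from each
    |p - a_i|^2 = r_i^2 leaves a linear system in p."""
    anchors = np.asarray(anchors, float)
    ranges = np.asarray(ranges, float)
    a0 = anchors[0]
    A = 2.0 * (anchors[1:] - a0)
    b = (ranges[0] ** 2 - ranges[1:] ** 2
         + (anchors[1:] ** 2).sum(axis=1) - (a0 ** 2).sum())
    pos, *_ = np.linalg.lstsq(A, b, rcond=None)
    return pos
```

With noisy ranges the least-squares residual itself provides the error measure used to disambiguate multiple targets, which is where fusing with the back-projection image helps.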

  18. High-frequency seismic emission during Maule earthquake (Mw 8.8, 27/02/2010) inferred from high-resolution back-projection analysis of P waves

    Science.gov (United States)

    Palo, Mauro; Tilmann, Frederik; Ehlert, Lutz; Krüger, Frank; Lange, Dietrich

    2013-04-01

Since its first application to the Sumatra-Andaman earthquake, back-projection analysis has been widely exploited to infer the time evolution of the rupture fronts of mega-earthquakes. In this technique, selected seismic phases recorded at teleseismic distances by a network of sensors are shifted according to a candidate source position and a velocity model, and a multichannel version of the cross-correlation function is estimated. In this way, a time-dependent map of the seismic energy emission in the source area can be inferred. We have back-projected the mainshock of the Maule earthquake (Mw 8.8), which nucleated on 27/02/2010 in central Chile and is one of the largest earthquakes recorded in modern times. We analyzed P phases filtered in the frequency range (0.4-3) Hz recorded by three seismic arrays located in the US, Africa, and Antarctica. Relative time shifts between sensors (inferred by maximizing the cross-correlation function) were estimated with respect to a 1D global velocity model (ak135) and refined by introducing two corrections, a static correction and a dynamic correction. The former is the time shift induced by local effects in the sensor area, whereas the latter is associated with the source-sensor path and is mostly affected by medium properties in the source area. We inferred these two corrections by analyzing the waveforms of 23 aftershocks and foreshocks of high magnitude (>5.3). Specifically, the static correction was taken as the mean time shift averaged over all the events recorded by one station, while the dynamic correction was the remaining part of the travel time after removing the 1D model travel time and the static correction. Moreover, the dynamic corrections (and hence the complete travel times) were interpolated over the whole source area by kriging, a spatial interpolation method.
Results show that high-frequency seismic energy emission mostly occurs along the coastline, with a general northward migration during the event. Specifically, in the first minute of the rupture process, the energy emission occurs south of or close to the epicenter. Afterwards, the seismic emission moves northwards, with a gap with respect to the first emission zone, and a further northward migration occurs until the end of emission. Both the spatial gap in seismic emission and the northward migration are in line with the results of other studies in the same area, whereas we find a shallower emission area and different emission features in the zone close to the epicenter. Results for different frequency bands and the analysis of secondary maxima of energy emission are being investigated. In particular, we are shifting towards higher frequencies, looking at the frequency bands (1-4) Hz and (2-8) Hz. The former band displays an emission pattern similar to that of (0.4-3) Hz, but with a sharper gap of about 50 km; the latter band shows coherent arrivals only during the first 80 s, with a clear energy emission south of the epicenter at the onset of the event, preserving the northward migration afterwards.

  19. Images of Gravitational and Magnetic Phenomena Derived from 2D Back-Projection Doppler Tomography of Interacting Binary Stars

    CERN Document Server

    Richards, Mercedes T; Fisher, John G; Conover, Marshall J

    2014-01-01

We have used 2D back-projection Doppler tomography as a tool to examine the influence of gravitational and magnetic phenomena in interacting binaries which undergo mass transfer from a magnetically-active star onto a non-magnetic main sequence star. This multi-tiered study of over 1300 time-resolved spectra of 13 Algol binaries involved calculations of the predicted dynamical behavior of the gravitational flow and the dynamics at the impact site, analysis of the velocity images constructed from tomography, and the influence on the tomograms of orbital inclination, systemic velocity, orbital coverage, and shadowing. The Hα tomograms revealed eight sources: chromospheric emission, a gas stream along the gravitational trajectory, a star-stream impact region, a bulge of absorption or emission around the mass-gaining star, a Keplerian accretion disk, an absorption zone associated with hotter gas, a disk-stream impact region, and a hot spot where the stream strikes the edge of a disk. We described several me...

  20. Electrical capacitance tomography two-phase oil-gas pipe flow imaging by the linear back-projection algorithm

    Directory of Open Access Journals (Sweden)

    R. Martin

    2005-12-01

Full Text Available Electrical Capacitance Tomography (ECT) is a novel technology that can deal with the complexity of two-phase gas-oil flow measurement by explicitly deriving the component distributions on two adjacent planes along a pipeline. One of its most promising applications is the visualization of gas-oil flows. ECT offers some advantages over other tomography modalities, such as no radiation, rapid response, low cost, being non-intrusive and non-invasive, and the ability to withstand high temperature and high pressure. The linear back-projection (LBP) algorithm is one of the most popular methods employed to perform image reconstruction in ECT. Despite its relatively poor accuracy, it is a simple and fast procedure capable of real-time operation in many applications, and it has remained a very popular choice. However, since it was first reported it has lacked a clear formal support in the context of this application. Its only justification has been that it was an adaptation of a method normally used in linear X-ray medical tomography, and the fact that it actually does produce useful (albeit only ‘qualitative’) images. In this paper, one illustrative way of interpreting LBP is presented. It is shown how LBP is actually based on the linearisation of a normalised form of the forward problem. More specifically, the normalised forward problem is approximated by means of a series of hyper-planes. The reconstruction matrix used in LBP is found to be a ‘weighted’ transpose of the linear operator (matrix) that defines the linearised normalised forward problem. The rows of this latter matrix contain the information of the sensitivity maps used in LBP.
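The "weighted transpose" interpretation of LBP can be sketched in a few lines; the sensitivity matrix below is a toy stand-in for a real ECT sensor model, not data from the paper.

```python
import numpy as np

# Toy sensitivity matrix: 4 electrode-pair measurements x 6 pixels.
# Values are illustrative, not from a real ECT sensor model.
S = np.array([
    [0.9, 0.5, 0.1, 0.8, 0.4, 0.1],
    [0.1, 0.5, 0.9, 0.1, 0.4, 0.8],
    [0.6, 0.7, 0.6, 0.2, 0.3, 0.2],
    [0.2, 0.3, 0.2, 0.6, 0.7, 0.6],
])

def lbp(c_norm, S):
    """Linear back-projection: smear each normalised capacitance value
    back over the pixels, weighted by the sensitivity map, then
    normalise by the summed sensitivity per pixel."""
    raw = S.T @ c_norm                  # 'weighted transpose' back-projection
    weights = S.T @ np.ones(S.shape[0])
    return raw / weights                # normalised grey levels

# Sanity check: a uniform high-permittivity distribution gives
# c_norm = 1 for every measurement, so LBP returns 1 for every pixel.
g = lbp(np.ones(4), S)
```

This reproduces the qualitative character of LBP noted in the abstract: fast and simple, but only an approximate inverse of the normalised forward problem.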

  1. Images of Gravitational and Magnetic Phenomena Derived from Two-dimensional Back-projection Doppler Tomography of Interacting Binary Stars

    Science.gov (United States)

    Richards, Mercedes T.; Cocking, Alexander S.; Fisher, John G.; Conover, Marshall J.

    2014-11-01

We have used two-dimensional back-projection Doppler tomography as a tool to examine the influence of gravitational and magnetic phenomena in interacting binaries that undergo mass transfer from a magnetically active star onto a non-magnetic main-sequence star. This multitiered study of over 1300 time-resolved spectra of 13 Algol binaries involved calculations of the predicted dynamical behavior of the gravitational flow and the dynamics at the impact site, analysis of the velocity images constructed from tomography, and the influence on the tomograms of orbital inclination, systemic velocity, orbital coverage, and shadowing. The Hα tomograms revealed eight sources: chromospheric emission, a gas stream along the gravitational trajectory, a star-stream impact region, a bulge of absorption or emission around the mass-gaining star, a Keplerian accretion disk, an absorption zone associated with hotter gas, a disk-stream impact region, and a hot spot where the stream strikes the edge of a disk. We described several methods used to extract the physical properties of the emission sources directly from the velocity images, including S-wave analysis, the creation of simulated velocity tomograms from hydrodynamic simulations, and the use of synthetic spectra with tomography to sequentially extract the separate sources of emission from the velocity image. In summary, the tomography images have revealed results that cannot be explained solely by gravitational effects: chromospheric emission moving with the mass-losing star, a gas stream deflected from the gravitational trajectory, and alternating behavior between stream state and disk state. Our results demonstrate that magnetic effects cannot be ignored in these interacting binaries.

  2. Development of the code for filter calculation

    International Nuclear Information System (INIS)

    This paper describes a calculation method, which commonly used in the Neutron Physics Department to develop a new neutron filter or to improve the existing neutron filter. This calculation is the first step of the traditional filter development procedure. It allows easy selection of the qualitative and quantitative contents of a composite filter in order to receive the filtered neutron beam with given parameters

  3. A new field-of-view autotracking method based on back-projected ray image cross-correlation for online tomography reconstruction.

    Science.gov (United States)

    Tomonaga, Sachihiko; Baba, Misuzu; Baba, Norio

    2014-11-01

In general, a tomogram cannot be observed immediately after the acquisition of a series of specimen tilt images, but is instead observed after the post-processing of the tilt series alignment, which often requires a substantial amount of time. Moreover, for general specimens, the automatic acquisition of the tilt series is difficult because field-of-view tracking frequently fails as the tilt angle or specimen thickness increases. In this study, we focus on improving the field-of-view autotracking technique for the purpose of online tomography reconstruction and propose a new alternative technique [1,2]. The proposed method uses a so-called 'back-projected ray image' instead of a specimen tilt image. The back-projected ray image is a cross-section image calculated from each projection image only during reconstruction. In a study of these ray images, the quality and accuracy of the cross-correlation between a pair of neighboring ray images in the tilt series were observed to be very high compared with those between a pair of projection images. We observed that a back-projected ray image reliably cross-correlates with other neighboring ray images at the position of an existing three-dimensional object. The proposed method can therefore consistently track the field-of-view, overcoming the weakness of conventional image-matching-based methods. In addition, the present method is simple, and high-speed processing is expected because fast Fourier transform (FFT) and inverse fast Fourier transform (IFFT) algorithms can be used. We applied this method to real specimens in online experiments using a TEM and thereby demonstrated its successful performance. Online autotracking experiments with thin-section samples were used to demonstrate the effectiveness of the proposed method. The field-of-view was automatically tracked with high accuracy throughout the tilt-angle range.
Furthermore, online tomograms were obtained immediately after the last specimen tilt. With increases in tracking speed, in situ tomographic observations for analyzing dynamic behavior might become feasible in the future. Fig. 1. Comparison of the proposed autotracking method with the conventional PCF-based alignment method using the yeast cell thin-section. a and b: Reconstructed X-Y cross-section images from tracking results at 8° increment angle with the PCF method and with the proposed method. N, nucleus; V, vacuole; NVJ, nucleus-vacuole junction. c: A reconstructed cross-section image from the autotracking result at 1° increment angle with the proposed method. (scale bar: 100 nm). PMID:25359820

  4. A back projection dosimetry method for diagnostic and orthovoltage x-ray from 40 to 140 kVp for patients and phantoms

    Science.gov (United States)

    Afrashteh, Hossein

    2005-07-01

Patient dosimetry in practice involves time-consuming, tedious calculations during the measurement process. There is a need for a straightforward and accurate method to perform patient dosimetry when required. A back-projection dosimetry method for patients/phantoms using the Entrance Surface Dose (ESD) and its corresponding Exit Surface Dose, with an average value for the attenuation coefficient (mu), e.g., a mean effective attenuation coefficient (mu'), was developed. The method focused on low-energy X-ray units (40--140 kVp), primarily for conventional diagnostic radiography and low-energy radiation therapy procedures. The assumption is that it may be used for similar concepts and modalities within the same energy range, e.g., fluoroscopy, where skin injuries have been common in the past, or mammography, where radiation carcinogenesis has been a matter of concern. A new Gafchromic film, XR-QA, was assessed as a precision dosimeter and used with this algorithm. Because the dose range often seen in conventional radiography exams is in most cases not high enough to activate the sensitive layer of this film sufficiently, the measured net Optical Density (OD) changes were not substantial enough. Therefore, a conventional and relatively low-speed dental film, DF58 Ultra, was used. Various thicknesses of acrylic, a tissue-equivalent material, were used with the algorithm. When compared with other sources and reference data, the results from the developed mathematical algorithm are in reasonable agreement with these values. The developed method is straightforward and within the acceptable accuracy range. The back-projection dosimetry method is effective and may be used individually for the desired body parts or fetal areas, depending on the clinical practice and interests.
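A minimal sketch of the back-projection dose estimate, assuming single-exponential attenuation with a mean effective coefficient mu' inferred from the entrance and exit surface doses; all numbers are illustrative, not the dissertation's measured data.

```python
import math

def mean_effective_mu(esd, exit_dose, thickness_cm):
    """Mean effective attenuation coefficient mu' (1/cm) inferred from
    the entrance and exit surface doses across a slab of known
    thickness."""
    return math.log(esd / exit_dose) / thickness_cm

def dose_at_depth(esd, mu_eff, depth_cm):
    """Back-projected dose at a given depth, assuming single-exponential
    attenuation with the mean effective coefficient."""
    return esd * math.exp(-mu_eff * depth_cm)

# Illustrative numbers: a 20 cm acrylic phantom with a 10 mGy entrance
# dose and a 0.5 mGy exit dose.
mu = mean_effective_mu(10.0, 0.5, 20.0)
mid_dose = dose_at_depth(10.0, mu, 10.0)   # estimated dose at mid-depth
```

Under this model the mid-depth dose is the geometric mean of entrance and exit doses, which is the kind of closed-form estimate that makes the method "straightforward".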

  5. Implicit Kalman filtering

    Science.gov (United States)

    Skliar, M.; Ramirez, W. F.

    1997-01-01

For an implicitly defined discrete system, a new algorithm for Kalman filtering is developed and an efficient numerical implementation scheme is proposed. Unlike the traditional explicit approach, the implicit filter can be readily applied to ill-conditioned systems and allows for generalization to descriptor systems. The implementation of the implicit filter depends on the solution of the congruence matrix equation A_1 P_x A_1^T = P_y. We develop a general iterative method for the solution of this equation, and prove necessary and sufficient conditions for convergence. It is shown that when the system matrices of an implicit system are sparse, the implicit Kalman filter requires significantly less computer time and storage to implement as compared to the traditional explicit Kalman filter. Simulation results are presented to illustrate and substantiate the theoretical developments.
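For contrast with the implicit formulation, the traditional explicit Kalman filter that the paper compares against can be sketched in scalar form; the model constants and measurement sequence are illustrative.

```python
def kalman_step(x, P, z, A=1.0, Q=0.01, H=1.0, R=0.1):
    """One predict/update cycle of the explicit scalar Kalman filter."""
    x_pred = A * x                            # state prediction
    P_pred = A * P * A + Q                    # covariance prediction
    K = P_pred * H / (H * P_pred * H + R)     # Kalman gain
    x_new = x_pred + K * (z - H * x_pred)     # measurement update
    P_new = (1.0 - K * H) * P_pred
    return x_new, P_new

# Track a constant state near 1.0 from slightly scattered measurements.
x, P = 0.0, 1.0
for z in [0.9, 1.1, 1.0, 0.95, 1.05]:
    x, P = kalman_step(x, P, z)
```

The implicit filter replaces the explicit covariance propagation with the solution of the congruence equation above, which is what pays off for sparse, ill-conditioned system matrices.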

  6. Reduction of noise by wiener filtering in the nuclear tomography image

    International Nuclear Information System (INIS)

The Wiener filter is important for several radiological applications in nuclear medical image analysis. Filtered back projection is an image reconstruction method used here as part of a unified approach that is robust to noise and blurring. Although the filter function in computed tomography is usually given in analytical form in the frequency domain, the Wiener filter is defined in terms of power spectral density functions, under the assumption that the signal and noise are jointly stationary. A number of numerical experiments performed on synthetic images showed that the Wiener filter typically performs better than conventional filters, such as the ramp filter. Further, this method yields improvements with little increase in computational overhead. A further advantage derives from the attenuation, in the Fourier domain, of those frequency components of the projection data that mainly carry noise. (author)
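The power-spectral-density definition of the Wiener filter can be sketched as below; the PSD shapes and test projection are assumptions for illustration, not the paper's experimental spectra.

```python
import numpy as np

def wiener_filter(signal_psd, noise_psd):
    """Frequency-domain Wiener filter H = Pss / (Pss + Pnn), built from
    the power spectral densities of jointly stationary signal and noise."""
    return signal_psd / (signal_psd + noise_psd)

def denoise(y, signal_psd, noise_psd):
    """Apply the Wiener filter to a 1-D projection in the Fourier domain."""
    H = wiener_filter(signal_psd, noise_psd)
    return np.real(np.fft.ifft(np.fft.fft(y) * H))

# Illustrative PSDs: a low-pass signal spectrum and flat (white) noise.
n = 64
freqs = np.fft.fftfreq(n)
signal_psd = 1.0 / (1.0 + (freqs / 0.05) ** 2)
noise_psd = np.full(n, 0.1)
y = np.cos(2.0 * np.pi * 2.0 * np.arange(n) / n)  # low-frequency projection
y_hat = denoise(y, signal_psd, noise_psd)          # gently attenuated copy
```

Wherever the noise PSD dominates, H falls toward zero, which is exactly the noise-carrying-frequency suppression the abstract describes.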

  7. Keeping Tradition

    OpenAIRE

    Zenhong, C.; Buwalda, P. L.

    2011-01-01

    Chinese dumplings such as Jiao Zi and Bao Zi are two of the popular traditional foods in Asia. They are usually made from wheat flour dough (rice flour or starch is sometimes used) that contains fillings. They can be steamed, boiled and fried and are consumed either as a main meal or dessert. As these tasty dumplings are easy to prepare, they have become one of Asia's fastest growing products in the frozen and ready-to-eat sector.

  8. Improved Resolution and Reduced Clutter in Ultra-Wideband Microwave Imaging Using Cross-Correlated Back Projection: Experimental and Numerical Results

    Science.gov (United States)

    Jacobsen, S.; Birkelund, Y.

    2010-01-01

Microwave breast cancer detection is based on the dielectric contrast between healthy and malignant tissue. This radar-based imaging method involves illumination of the breast with an ultra-wideband pulse. Detection of tumors within the breast is achieved by some selected focusing technique. Image formation algorithms are tailored to enhance tumor responses and reduce early-time and late-time clutter associated with skin reflections and heterogeneity of breast tissue. In this contribution, we evaluate the performance of the so-called cross-correlated back projection imaging scheme by using a scanning system in phantom experiments. Supplementary numerical modeling based on commercial software is also presented. The phantom is synthetically scanned with a broadband elliptical antenna in a mono-static configuration. The respective signals are pre-processed by a data-adaptive RLS algorithm in order to remove artifacts caused by antenna reverberations and signal clutter. Successful detection of a 7 mm diameter cylindrical tumor immersed in a low-permittivity medium was achieved in all cases. Selecting the widely used delay-and-sum (DAS) beamforming algorithm as a benchmark, we show that correlation-based imaging methods improve the signal-to-clutter ratio by at least 10 dB and improve spatial resolution through a reduction of the imaged peak full-width at half maximum (FWHM) of about 40–50%. PMID:21331362
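The delay-and-sum (DAS) benchmark mentioned above can be sketched as follows; the mono-static geometry, propagation speed, sampling rate, and impulse-like echoes are illustrative assumptions, not the paper's experimental setup.

```python
import numpy as np

C = 3e8    # propagation speed (m/s); free-space assumption
FS = 50e9  # sampling rate (Hz); illustrative

def das_pixel(traces, antennas, point):
    """Delay-and-sum focus: shift each channel by its mono-static
    round-trip delay (2*d/c) to `point` and sum the samples."""
    out = 0.0
    for trace, ant in zip(traces, antennas):
        delay = 2.0 * np.linalg.norm(point - ant) / C
        idx = int(round(delay * FS))
        if idx < len(trace):
            out += trace[idx]
    return out

# Synthetic scene: a point scatterer 10 cm in front of a 5-element scan
# line, with an ideal impulse echo recorded on each channel.
antennas = np.array([[x, 0.0] for x in np.linspace(-0.04, 0.04, 5)])
scatterer = np.array([0.0, 0.10])
traces = np.zeros((5, 200))
for k, ant in enumerate(antennas):
    idx = int(round(2.0 * np.linalg.norm(scatterer - ant) / C * FS))
    traces[k, idx] = 1.0

focus_on = das_pixel(traces, antennas, scatterer)        # coherent sum
focus_off = das_pixel(traces, antennas, np.array([0.03, 0.15]))
```

Cross-correlated back projection replaces the plain sum with pairwise channel correlations, which is what buys the reported clutter and resolution improvements over DAS.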

  9. Short-range ground-based synthetic aperture radar imaging: performance comparison between frequency-wavenumber migration and back-projection algorithms

    Science.gov (United States)

Yigit, Enes; Demirci, Sevket; Özdemir, Caner; Tekbaş, Mustafa

    2013-01-01

Two popular synthetic aperture radar (SAR) reconstruction algorithms, namely the back-projection (BP) and the frequency-wavenumber (ω-k) algorithms, were tested and compared against each other, especially for their use in ground-based (GB) SAR applications directed to foreign object debris removal. For this purpose, an experimental setup in a semi-anechoic chamber room was assembled to obtain near-field SAR images of objects on the ground. Then, the 90 to 95 GHz scattering data were acquired by using a stepped-frequency continuous-wave radar operation. The performances of the setup and the imaging algorithms were then assessed by exploiting various metrics including point spread function, signal-to-clutter ratio, integrated side-lobe ratio, and computational complexity. Results demonstrate that although both algorithms produce comparably accurate images of targets, the BP algorithm is shown to be superior to the ω-k algorithm due to some inherent advantages specifically suited for short-range GB-SAR applications.

  10. High security and robust optical image encryption approach based on computer-generated integral imaging pickup and iterative back-projection techniques

    Science.gov (United States)

    Li, Xiao Wei; Cho, Sung Jin; Kim, Seok Tae

    2014-04-01

In this paper, a novel optical image encryption algorithm combining the computer-generated integral imaging (CGII) pickup technique and the iterative back-projection (IBP) technique is proposed. In this scheme, the color image to be encrypted is first separated into three channels: red, green, and blue. Each channel is independently captured using a virtual pinhole array and computationally transformed into a sub-image array. Each of the three sub-image arrays is then scrambled by the Fibonacci transformation (FT) algorithm and encrypted by hybrid cellular automata (HCA). Finally, the three encrypted images are combined to produce the colored encrypted image. In the reconstruction process, because computational integral imaging reconstruction (CIIR) is a pixel-overlapping reconstruction technique, interference from adjacent pixels decreases the quality of the reconstructed image. To address this problem, we introduce an image super-resolution reconstruction technique: the image can be computationally reconstructed by the IBP technique. Numerical simulations are presented to test the validity and capability of the proposed image encryption algorithm.
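The iterative back-projection (IBP) super-resolution step can be sketched in one dimension; the 2x-averaging forward model, step size, and iteration count are assumptions for illustration, not the paper's CIIR-specific model.

```python
import numpy as np

def downsample(x):
    """Forward model H: 2x decimation by pairwise averaging."""
    return 0.5 * (x[0::2] + x[1::2])

def upsample(e):
    """Back-project a low-resolution error onto the high-resolution grid."""
    return np.repeat(e, 2)

def ibp(low_res, n_iter=60, step=0.5):
    """Iterative back-projection: repeatedly back-project the error
    between the observed low-res image and the simulated one."""
    high = np.zeros(2 * len(low_res))
    for _ in range(n_iter):
        err = low_res - downsample(high)
        high = high + step * upsample(err)
    return high

truth = np.array([1.0, 3.0, 2.0, 2.0, 5.0, 1.0])
low = downsample(truth)   # observed low-resolution image
rec = ibp(low)            # super-resolved estimate consistent with `low`
```

After convergence the reconstruction reproduces the low-resolution observation exactly under the forward model, which is the consistency property IBP exploits to undo the pixel-overlapping blur of CIIR.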

  11. The diffuse ensemble filter

    Directory of Open Access Journals (Sweden)

    X. Yang

    2009-07-01

Full Text Available A new class of ensemble filters, called the Diffuse Ensemble Filter (DEnF), is proposed in this paper. The DEnF assumes that the forecast errors orthogonal to the first guess ensemble are uncorrelated with the latter ensemble and have infinite variance. The assumption of infinite variance corresponds to the limit of "complete lack of knowledge" and differs dramatically from the implicit assumption made in most other ensemble filters, which is that the forecast errors orthogonal to the first guess ensemble have vanishing errors. The DEnF is independent of the detailed covariances assumed in the space orthogonal to the ensemble space, and reduces to conventional ensemble square root filters when the number of ensembles exceeds the model dimension. The DEnF is well defined only in data-rich regimes and involves the inversion of relatively large matrices, although this barrier might be circumvented by variational methods. Two algorithms for solving the DEnF, namely the Diffuse Ensemble Kalman Filter (DEnKF) and the Diffuse Ensemble Transform Kalman Filter (DETKF), are proposed and found to give comparable results. These filters generally converge to the traditional EnKF and ETKF, respectively, when the ensemble size exceeds the model dimension. Numerical experiments demonstrate that the DEnF eliminates filter collapse, which occurs in ensemble Kalman filters for small ensemble sizes. Also, the use of the DEnF to initialize a conventional square root filter dramatically accelerates the spin-up time for convergence. However, in a perfect model scenario, the DEnF produces larger errors than ensemble square root filters that have covariance localization and inflation. For imperfect forecast models, the DEnF produces smaller errors than the ensemble square root filter with inflation.
These experiments suggest that the DEnF has some advantages relative to the ensemble square root filters in the regime of small ensemble size, imperfect model, and copious observations.

  12. Electromechanical Frequency Filters

    Science.gov (United States)

    Wersing, W.; Lubitz, K.

Frequency filters select signals with a frequency inside a definite frequency range or band from signals outside this band, traditionally achieved by a combination of L-C resonators. The fundamental principle of all modern frequency filters is the constructive interference of travelling waves. If a filter is set up from coupled resonators, this interference occurs as a result of successive wave reflection at the resonators' ends. In this case, the center frequency f_c of a filter, e.g., one set up from symmetrical λ/2-resonators of length l, is given by f_c = f_r = v_ph/λ = v_ph/(2l), where v_ph is the phase velocity of the wave. This clearly shows the big advantage of acoustic waves for filter applications in comparison to electromagnetic waves. Because v_ph of acoustic waves in solids is about 10^4-10^5 times smaller than that of electromagnetic waves, much smaller filters can be realised. Today, piezoelectric materials and processing technologies exist such that electromechanical resonators and filters can be produced in the frequency range from 1 kHz up to 10 GHz. Further requirements for frequency filters, such as low losses (high resonator Q) and low temperature coefficients of frequency constants, can also be fulfilled with these filters. Important examples are quartz-crystal resonators and filters (1 kHz-200 MHz) as discussed in Chap. 2, electromechanical channel filters (50 kHz and 130 kHz) for long-haul communication systems as discussed in this section, surface acoustic wave (SAW) filters (20 MHz-5 GHz) as discussed in Chap. 14, and thin film bulk acoustic resonators (FBAR) and filters (500 MHz-10 GHz) as discussed in Chap. 15.
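The center-frequency relation f_c = v_ph/(2l) translates directly into code; the phase velocity and resonator length below are illustrative values, not taken from the chapter.

```python
def center_frequency(v_ph, length_m):
    """Center frequency of a symmetrical half-wavelength resonator:
    f_c = v_ph / (2 * l)."""
    return v_ph / (2.0 * length_m)

# Illustrative values: an acoustic phase velocity of 3000 m/s and a
# 1.5 mm resonator give f_c = 1 MHz. An electromagnetic resonator at
# the same frequency would need to be roughly 10^5 times longer,
# which is the size advantage the text describes.
f_c = center_frequency(3000.0, 1.5e-3)
```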

  13. Qualitative Evaluation of Filter Function in Brain SPECT

    Directory of Open Access Journals (Sweden)

    Nahid Yaghobi

    2007-06-01

Full Text Available Introduction: Filtering can greatly affect the quality of clinical images. Determining the best filter and the proper degree of smoothing can help to ensure the most accurate diagnosis. Methods: Forty-five patients' data acquired during brain phantom SPECT studies were reconstructed using the filtered back-projection technique. The ramp, Shepp-Logan, cosine, Hamming, Hanning, Butterworth, Metz and Wiener filters were examined to find the optimum condition for each filter. For each slice image, 6200 reconstruction options were considered. The corresponding planar image of each slice was used as the reference image. The quality of reconstructed images was determined using the universal image quality index (UIQI). Four nuclear medicine physicians evaluated the images to choose the best of the filters. Results: Images with the best resolution, contrast, smoothness and overall quality were selected by the nuclear medicine physicians, depending on the filters used to generate the best image. A significant difference (p<0.05) between the filters regarding these parameters was observed. Conclusion: The results of this study revealed that maximum resolution and contrast could be obtained using both the Metz and Wiener filters. However, the best quality images were generated using the Butterworth filter.

  14. Characterizing trends in HIV infection among men who have sex with men in Australia by birth cohorts: results from a modified back-projection method

    Directory of Open Access Journals (Sweden)

    Wand Handan

    2009-09-01

Full Text Available Abstract Background We set out to estimate historical trends in HIV incidence in Australian men who have sex with men with respect to age at infection and birth cohort. Methods A modified back-projection technique is applied to data from the HIV/AIDS Surveillance System in Australia, including "newly diagnosed HIV infections", "newly acquired HIV infections" and "AIDS diagnoses", to estimate trends in HIV incidence over both calendar time and age at infection. Results Our results demonstrate that since 2000, there has been an increase in new HIV infections in Australian men who have sex with men across all age groups. The estimated mean age at infection increased from ~35 years in 2000 to ~37 years in 2007. When the epidemic peaked in the mid 1980s, the majority of the infections (56%) occurred among men aged 30 years and younger; 30% occurred in ages 31 to 40 years; and only ~14% of them were attributed to the group who were older than 40 years of age. In 2007, the proportion of infections occurring in persons 40 years or older doubled to 31% compared to the mid 1980s, while the proportion of infections attributed to the group younger than 30 years of age decreased to 36%. Conclusion The distribution of HIV incidence for birth cohorts by infection year suggests that the HIV epidemic continues to affect older homosexual men as much as, if not more than, younger men. The results are useful for evaluating the impact of the epidemic across successive birth cohorts and study trends among the age groups most at risk.

  15. Air filter

    International Nuclear Information System (INIS)

    An air filter consists of an upright cylinder of corrugated or pleated filter fabric, joined at its upper end to a tubular right-angled elbow. The open end of the elbow includes an internal lip seal, so the elbow can be slid onto a horizontal spigot in an air filter unit. The filter can be cleaned by subjecting the fabric to a reverse pressure pulse from a nozzle. The construction facilitates removal of the filter into a plastic bag secured round a frame behind a door, when the unit is used to filter radioactive dust from air. (author)

  16. Distance protection using a wavelet-based filtering algorithm

    Energy Technology Data Exchange (ETDEWEB)

    Silva, Kleber M. [University of Brasilia, Electrical Engineering Department, 70919-970 Brasilia (Brazil); Neves, Washington L.A.; Souza, Benemar A. [Federal University of Campina Grande, Paraiba, Brazil, Electrical Engineering Department, 882 Aprigio Veloso Ave, Bodocongo, 58109-970 Campina Grande (Brazil)

    2010-01-15

This paper presents a novel filter design technique based on the maximal overlap discrete wavelet transform (MODWT). The time and frequency responses of the designed filters are compared with those of traditional discrete Fourier transform (DFT) based filters. The obtained results show improvements in response speed compared to full-cycle traditional filters and improvements in frequency response compared to half-cycle traditional filters. Given these characteristics, the designed filter may be used in secure high-speed distance protection, since it provides fast relay operating times without compromising frequency response. (author)
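The traditional full-cycle DFT filter used as the comparison baseline can be sketched as a one-cycle phasor estimator; the number of samples per cycle and the test signal are illustrative.

```python
import cmath
import math

def full_cycle_dft_phasor(samples):
    """Fundamental-frequency phasor (peak magnitude, phase in radians)
    estimated from exactly one cycle of N samples, as in a full-cycle
    DFT relay filter."""
    N = len(samples)
    acc = sum(s * cmath.exp(-2j * math.pi * k / N)
              for k, s in enumerate(samples))
    phasor = 2.0 * acc / N
    return abs(phasor), cmath.phase(phasor)

# One cycle of a 10 A (peak) cosine sampled 16 times per cycle.
N = 16
samples = [10.0 * math.cos(2.0 * math.pi * k / N) for k in range(N)]
mag, phase = full_cycle_dft_phasor(samples)
```

The one-cycle data window is exactly why the full-cycle filter is slower to respond than half-cycle variants, which is the trade-off the MODWT design targets.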

  17. Kidney Filtering

    Science.gov (United States)

    Integrated Teaching and Learning Program,

In this activity, students filter different substances through a plastic window screen, different-sized hardware cloth, and poultry netting. Their model shows how the fineness of a filter in the kidney is critical in deciding what will be filtered out and what will stay within the bloodstream.

  18. Filter apparatus

    International Nuclear Information System (INIS)

    This invention relates to liquid filters, precoated by replaceable powders, which are used in the production of ultra pure water required for steam generation of electricity. The filter elements are capable of being installed and removed by remote control so that they can be used in nuclear power reactors. (UK)

  19. Data assimilation the ensemble Kalman filter

    CERN Document Server

    Evensen, Geir

    2006-01-01

    Covers data assimilation and inverse methods, including both traditional state estimation and parameter estimation. This text and reference focuses on various popular data assimilation methods, such as weak and strong constraint variational methods and ensemble filters and smoothers.

  20. Traditional Agriculture and Permaculture.

    Science.gov (United States)

    Pierce, Dick

    1997-01-01

    Discusses benefits of combining traditional agricultural techniques with the concepts of "permaculture," a framework for revitalizing traditions, culture, and spirituality. Describes school, college, and community projects that have assisted American Indian communities in revitalizing sustainable agricultural practices that incorporate cultural…

  1. Emission computerized axial tomography from multiple gamma-camera views using frequency filtering

    International Nuclear Information System (INIS)

    Emission computerized axial tomography is achievable in any nuclear medicine department from multiple gamma camera views. Data are collected by rotating the patient in front of the camera. A simple fast algorithm is implemented, known as the convolution technique: first the projection data are Fourier transformed and then an original filter designed for optimizing resolution and noise suppression is applied; finally the inverse transform of the latter operation is back-projected. This program, which can also take into account the attenuation for single photon events, was executed with good results on phantoms and patients. We think that it can be easily implemented for specific diagnostic problems. (orig.)
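The convolution-technique steps described above (Fourier transform the projections, apply a frequency-domain filter, inverse transform, back-project) can be sketched for parallel-beam geometry; a plain ramp filter stands in for the paper's custom resolution/noise-suppression filter, and all sizes are illustrative.

```python
import numpy as np

def fbp(sinogram, angles_deg, size):
    """Filtered back-projection: FFT each projection, multiply by a
    ramp filter, inverse-FFT, then smear each filtered projection back
    across the image grid (nearest-detector interpolation)."""
    n_det = sinogram.shape[1]
    ramp = np.abs(np.fft.fftfreq(n_det))                  # ramp filter
    filtered = np.real(np.fft.ifft(np.fft.fft(sinogram, axis=1) * ramp,
                                   axis=1))
    xs = np.arange(size) - size / 2.0
    X, Y = np.meshgrid(xs, xs)
    recon = np.zeros((size, size))
    for proj, theta in zip(filtered, np.deg2rad(angles_deg)):
        t = X * np.cos(theta) + Y * np.sin(theta) + n_det / 2.0
        idx = np.clip(np.round(t).astype(int), 0, n_det - 1)
        recon += proj[idx]                                # back-project
    return recon * np.pi / len(angles_deg)

# Smoke test: a single bright detector bin seen from 4 views should
# reconstruct to a peak at the image center.
sino = np.zeros((4, 32))
sino[:, 16] = 1.0
img = fbp(sino, [0, 45, 90, 135], 32)
```

A gamma-camera implementation would add many more views, an apodized filter in place of the bare ramp, and the attenuation correction for single-photon events mentioned in the abstract.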

  2. Favorite Family Traditions

    Science.gov (United States)

    2012-12-08

    Students use the text The Relatives Came by Cynthia Rylant as a springboard for discussion about family traditions. After identifying the traditions observed by the relatives, students meet in small groups to brainstorm new traditions that could arise from the families gathering together during the winter. The lesson is concluded by having each student write about their own favorite family tradition and share it with a small group.

  3. Filter Dimension

    CERN Document Server

    Bavula, V

    2005-01-01

    Intuitively, the filter dimension of an algebra or a module measures how `close' standard filtrations of the algebra or the module are. In particular, for a simple algebra it also measures the growth of how `fast' one can prove that the algebra is simple. The filter dimension appears naturally when one wants to generalize the Bernstein's inequality for the Weyl algebras to the class of simple finitely generated algebras. This paper is a review of the author's results on the filter dimension and its connections with the Gelfand-Kirillov dimension, the Krull dimension, the representation theory of simple finitely generated algebras and the `size' of their maximal commutative subalgebras.

  4. Robust filtering for uncertain systems a parameter-dependent approach

    CERN Document Server

    Gao, Huijun

    2014-01-01

    This monograph provides the reader with a systematic treatment of robust filter design, a key issue in systems, control and signal processing, because the inevitable presence of uncertainty in system and signal models often degrades the filtering performance and may even cause instability. The methods described are therefore not subject to the rigorous assumptions of traditional Kalman filtering. The monograph is concerned with robust filtering for various dynamical systems with parametric uncertainties, and focuses on parameter-dependent approaches to filter design. Classical filtering schemes, like H2 filtering and H∞ filtering, are addressed, and emerging issues such as robust filtering with constraints on communication channels and signal frequency characteristics are discussed. The text features design approaches to robust filters arranged according to varying complexity level, emphasizing robust filtering in the parameter-dependent framework for the first time; ...

  5. Intelligent Optimize Design of LCL Filter for Three-Phase Voltage-Source PWM Rectifier

    OpenAIRE

    Sun, Wei; Chen, Zhe; Wu, Xiaojie

    2009-01-01

    Compared to the traditional L filter, an LCL filter is more effective at reducing harmonic distortion at the switching frequency, so it is important to choose the LCL filter parameters to achieve a good filtering effect. This paper introduces some traditional design methods. The design of an LCL filter by genetic algorithm (GA) and particle swarm optimization (PSO) is presented, together with a comparison of the two intelligent optimization approaches. Simulation results and calculated data are provided to prove that intelligent optimization is more effective and simpler than traditional methods.

  6. The 3-D alignment of objects in dynamic PET scans using filtered sinusoidal trajectories of sinogram

    International Nuclear Information System (INIS)

    In this study, our goal is to employ a novel 3-D alignment method for dynamic positron emission tomography (PET) scans. Because the acquired data (i.e. sinograms) often contain considerable noise, filtering the data prior to alignment presumably improves the final results. In this study, we utilized a novel 3-D stackgram domain approach. In the stackgram domain, the signals along the sinusoidal trajectories of the sinogram can be processed separately. In this work, we performed angular stackgram domain filtering by employing well-known 1-D filters: the Gaussian low-pass filter and the median filter. In addition, we employed two wavelet de-noising techniques. After filtering we performed alignment of objects in the stackgram domain. The local alignment technique we used is based on similarity comparisons between locus vectors (i.e. the signals along the sinusoidal trajectories of the sinogram) in a 3-D neighborhood of sequences of the stackgrams. Aligned stackgrams can be transformed back to sinograms (Method 1), or alternatively directly to filtered back-projected images (Method 2). In order to evaluate the alignment process, simulated data with different kinds of additive noise were used. The results indicated that the filtering prior to the alignment can be important concerning the accuracy

  7. Traditional fishing tools

    OpenAIRE

    Stoica, Georgeta

    2009-01-01

    In this paper I present the traditional fishing tools used in the area of the Danube Delta. More precisely, I speak about the village of Sfantu Gheorghe, a traditional fishing village, where the fishing activity has been the main activity along the years and where, lately, there have been major changes due to the decrease of the fish species.

  8. Spectral CT Using Multiple Balanced K-Edge Filters.

    Science.gov (United States)

    Rakvongthai, Yothin; Worstell, William; El Fakhri, Georges; Bian, Junguo; Lorsakul, Auranuch; Ouyang, Jinsong

    2015-03-01

    Our goal is to validate a spectral computed tomography (CT) system design that uses a conventional X-ray source with multiple balanced K-edge filters. By performing simultaneous synthetic reconstruction in multiple energy bins, we obtained good agreement between measurements and model expectations for a reasonably complex phantom. We performed simulation and data acquisition on a phantom containing multiple rods of different materials using a NeuroLogica CT scanner. Five balanced K-edge filters, namely Molybdenum, Cerium, Dysprosium, Erbium, and Tungsten, were used separately proximal to the X-ray tube. For each sinogram bin, the measured filter vector can be expressed as the product of a transmission matrix, which is determined by the filters and is independent of the imaged object, and the energy-binned intensity vector. The energy-binned sinograms were then obtained by inverting the transmission matrix and applying it to the filter measurement vector. For each energy bin defined by two consecutive K-edges, a synthesized energy-binned attenuation image was obtained using filtered back-projection reconstruction. The reconstructed attenuation coefficients for each rod obtained from the experiment were in good agreement with the corresponding simulated results. Furthermore, the reconstructed attenuation coefficients for a given energy bin agreed with National Institute of Standards and Technology reference values when beam hardening within the energy bin was small. The proposed cost-effective system design using multiple balanced K-edge filters can be used to perform spectral CT imaging at clinically relevant flux rates using conventional detectors and integrating electronics. PMID:25252276
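
    The per-bin recovery step described above reduces to a small linear solve. In this sketch the 5×5 transmission matrix and the bin intensities are made-up stand-ins for the calibrated values; only the structure (measurement = transmission matrix × bin vector, recovery by inversion) follows the record.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    # Hypothetical energy-binned intensities for one sinogram bin (5 bins
    # bounded by the Mo, Ce, Dy, Er, W K-edges) and a hypothetical
    # filter-by-bin transmission matrix; real values come from calibration.
    true_binned = np.array([120.0, 95.0, 80.0, 60.0, 40.0])
    T = 0.2 + 0.6 * rng.random((5, 5))

    # Each filtered acquisition records one weighted sum of the bin intensities
    measured = T @ true_binned

    # Inverting the (object-independent) transmission matrix recovers the bins
    recovered = np.linalg.solve(T, measured)
    ```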

  9. Drug Filtering

    Science.gov (United States)

    Lawrence F. Iles

    2010-01-01

    In this math meets health science activity, learners observe a model of exponential decay, and how kidneys filter blood. Learners will calculate the amount of a drug in the body over a period of time. Then, they will make and analyze the graphical representation of this exponential function. This lesson guide includes questions for learners, assessment options, extensions, and reflection questions.
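
    The exponential-decay model the learners work with can be written directly; the 1000 mg dose and 6-hour half-life are arbitrary example numbers, not part of the lesson guide.

    ```python
    # First-order elimination: the remaining amount halves every half-life.
    def drug_remaining(dose_mg, half_life_h, hours):
        return dose_mg * 0.5 ** (hours / half_life_h)

    # Hypothetical 1000 mg dose with a 6-hour half-life
    levels = [drug_remaining(1000.0, 6.0, t) for t in (0, 6, 12, 24)]
    # → [1000.0, 500.0, 250.0, 62.5]
    ```

    Plotting `levels` against time gives the graphical representation of the exponential function the activity asks for.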

  10. Compact Microstrip Spurline Bandstop Filter with Defected Ground Structure (Dgs)

    OpenAIRE

    Abhijeet Kumar; Prity Mishra

    2014-01-01

    This paper presents a new structure for implementing a compact, narrowband, high-rejection microstrip band-stop filter (BSF). The structure combines two traditional BSFs: a spurline filter and a BSF using a defected ground structure (DGS). Owing to the inherently compact characteristics of both the spurline and the interdigital capacitance (used as the DGS), the proposed filter shows better rejection performance than a spurline filter or a conventional open-stub BSF without increasing the circui...

  11. IMPLEMENTATION OF DIGITAL FILTERS FOR HIGH THROUGHPUT APPLICATIONS ON FPGA

    OpenAIRE

    Pushpa, T.; Venkatlakshmi, M.

    2013-01-01

    Digital filter implementation on an FPGA, utilising dedicated hardware resources, can effectively achieve ASIC-like performance while reducing development time, cost, and risk. Advantages of the FPGA approach to digital filter implementation include higher sampling rates than are available from traditional DSP chips. In this paper, low-pass, band-pass and high-pass FIR filters are implemented on an FPGA. This approach gives better performance than the common filter structures in terms of speed of operation...
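
    A direct-form FIR filter is a shift register feeding a multiply-accumulate chain, the structure that maps naturally onto FPGA DSP slices. This NumPy sketch shows the arithmetic, not the HDL; the 4-tap moving-average response is a hypothetical example, not a design from the paper.

    ```python
    import numpy as np

    def fir_direct_form(x, taps):
        """Direct-form FIR: each output sample is a multiply-accumulate
        of the tap coefficients against a shift-register of past inputs."""
        delay = np.zeros(len(taps))
        out = []
        for sample in x:
            delay = np.roll(delay, 1)      # shift the delay line
            delay[0] = sample              # newest sample enters at tap 0
            out.append(float(np.dot(taps, delay)))
        return np.array(out)

    # 4-tap moving-average low-pass: DC passes, the Nyquist-rate signal is nulled
    taps = np.ones(4) / 4.0
    dc = fir_direct_form(np.ones(32), taps)
    nyq = fir_direct_form(np.array([1.0, -1.0] * 16), taps)
    ```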

  12. Oral Tradition Journal

    Science.gov (United States)

    2008-01-01

    Stretching back thousands of years, the oral traditions that have enriched and documented human existence remain a subject of much fascination. The Oral Tradition Journal was founded in 1986 in order to "serve as an international and interdisciplinary forum for discussion of worldwide oral traditions and related forms." The journal is based at the University of Missouri, and visitors to the site can search the entire run of the journal on this site by keyword or author. Clicking over to the "Browse the Journal" area, visitors can look over back issues that include special issues on the Serbo-Croatian oral tradition, performance literature, and the performance artistry of Bob Dylan. The site is a real treat for anyone interested in the subject, and visitors can also learn how to submit their own work for possible inclusion in a forthcoming volume.

  13. Traditional Urban Aboriginal Religion

    Directory of Open Access Journals (Sweden)

    Kristina Everett

    2009-01-01

    Full Text Available This paper represents a group of Aboriginal people who claim traditional Aboriginal ownership of a large Australian metropolis. They have struggled for at least the last 25 to 30 years to articulate and represent their contemporary group identity to the wider Australian society, which very often does not take their expressions seriously. This is largely because dominant discourses claim that ‘authentic’ Aboriginal culture only exists in remote, pristine areas far away from western society and that urban Aboriginal traditions, especially urban religious traditions, are today defunct. This paper is an account of one occasion on which such traditional Aboriginal religious practice was performed before the eyes of a group of tourists.

  14. The "Natural Law Tradition."

    Science.gov (United States)

    Finnis, John

    1986-01-01

    A discussion of natural law outlines some of the theory and tradition surrounding it and examines its relationship to the social science and legal curriculum and to the teaching of jurisprudence. (MSE)

  15. Energy filter

    International Nuclear Information System (INIS)

    An energy filter, for enclosing a Geiger Mueller tube, comprises a copper sheath, open at one end and possessing an aperture at the other, surrounded by a discontinuous jacket of 60-40 tin-lead alloy. The latter, assisted by the copper sheath, reduces those radiant frequencies at which the Geiger Mueller tube is most sensitive, enabling the tube to produce a response which is substantially independent of the energy of the incoming radiation. 60-40 tin-lead alloy is a commercial solder of eutectic composition. Consequently the filter is readily constructed by casting the tin-lead parts, either directly onto the copper sheath, or apart from it, for later assembly. The particular configuration of the alloy jacket, comprising two axially spaced rings and one end disc, and the sheath aperture, are chosen to give a circular polar response as nearly as possible. (author)

  16. Evaluating collaborative filtering over time

    OpenAIRE

    Lathia, N. K.

    2010-01-01

    Recommender systems have become essential tools for users to navigate the plethora of content in the online world. Collaborative filtering—a broad term referring to the use of a variety, or combination, of machine learning algorithms operating on user ratings—lies at the heart of recommender systems’ success. These algorithms have been traditionally studied from the point of view of how well they can predict users’ ratings and how precisely they rank content; state of the ...

  17. Non-traditional vectors for paralytic shellfish poisoning.

    Science.gov (United States)

    Deeds, Jonathan R; Landsberg, Jan H; Etheridge, Stacey M; Pitcher, Grant C; Longan, Sara Watt

    2008-01-01

    Paralytic shellfish poisoning (PSP), due to saxitoxin and related compounds, typically results from the consumption of filter-feeding molluscan shellfish that concentrate toxins from marine dinoflagellates. In addition to these microalgal sources, saxitoxin and related compounds, referred to in this review as STXs, are also produced in freshwater cyanobacteria and have been associated with calcareous red macroalgae. STXs are transferred and bioaccumulate throughout aquatic food webs, and can be vectored to terrestrial biota, including humans. Fisheries closures and human intoxications due to STXs have been documented in several non-traditional (i.e. non-filter-feeding) vectors. These include, but are not limited to, marine gastropods, both carnivorous and grazing, crustacea, and fish that acquire STXs through toxin transfer. Often due to spatial, temporal, or a species disconnection from the primary source of STXs (bloom forming dinoflagellates), monitoring and management of such non-traditional PSP vectors has been challenging. A brief literature review is provided for filter feeding (traditional) and non-filter feeding (non-traditional) vectors of STXs with specific reference to human effects. We include several case studies pertaining to management actions to prevent PSP, as well as food poisoning incidents from STX(s) accumulation in non-traditional PSP vectors. PMID:18728730

  18. Digital filters

    CERN Document Server

    Hamming, Richard W

    2013-01-01

    Digital signals occur in an increasing number of applications: in telephone communications; in radio, television, and stereo sound systems; and in spacecraft transmissions, to name just a few. This introductory text examines digital filtering, the processes of smoothing, predicting, differentiating, integrating, and separating signals, as well as the removal of noise from a signal. The processes bear particular relevance to computer applications, one of the focuses of this book.Readers will find Hamming's analysis accessible and engaging, in recognition of the fact that many people with the s

  19. Traditional Chinese Biotechnology

    Science.gov (United States)

    Xu, Yan; Wang, Dong; Fan, Wen Lai; Mu, Xiao Qing; Chen, Jian

    The earliest industrial biotechnology originated in ancient China and developed into a vibrant industry in traditional Chinese liquor, rice wine, soy sauce, and vinegar. It is now a significant component of the Chinese economy valued annually at about 150 billion RMB. Although the production methods had existed and remained basically unchanged for centuries, modern developments in biotechnology and related fields in the last decades have greatly impacted on these industries and led to numerous technological innovations. In this chapter, the main biochemical processes and related technological innovations in traditional Chinese biotechnology are illustrated with recent advances in functional microbiology, microbial ecology, solid-state fermentation, enzymology, chemistry of impact flavor compounds, and improvements made to relevant traditional industrial facilities. Recent biotechnological advances in making Chinese liquor, rice wine, soy sauce, and vinegar are reviewed.

  20. Filters in 2D and 3D Cardiac SPECT Image Processing.

    Science.gov (United States)

    Lyra, Maria; Ploussi, Agapi; Rouchota, Maritina; Synefia, Stella

    2014-01-01

    Nuclear cardiac imaging is a noninvasive, sensitive method providing information on cardiac structure and physiology. Single photon emission tomography (SPECT) evaluates myocardial perfusion, viability, and function and is widely used in clinical routine. The quality of the tomographic image is key to accurate diagnosis. Image filtering, a mathematical processing, compensates for loss of detail in an image while reducing image noise, and it can improve the image resolution and limit the degradation of the image. SPECT images are then reconstructed, either by the filtered back projection (FBP) analytical technique or iteratively, by algebraic methods. The aim of this study is to review filters in cardiac 2D, 3D, and 4D SPECT applications and how these affect the image quality, mirroring the diagnostic accuracy of SPECT images. Several filters, including the Hanning, Butterworth, and Parzen filters, were evaluated in combination with the two reconstruction methods as well as with a specified MatLab program. Results showed that for both 3D and 4D cardiac SPECT the Butterworth filter, for different critical frequencies and orders, produced the best results. Between the two reconstruction methods, the iterative one might be more appropriate for cardiac SPECT, since it improves lesion detectability due to the significant improvement of image contrast. PMID:24804144

  1. Intelligent Optimize Design of LCL Filter for Three-Phase Voltage-Source PWM Rectifier

    DEFF Research Database (Denmark)

    Sun, Wei; Chen, Zhe

    2009-01-01

    Compared to the traditional L filter, an LCL filter is more effective at reducing harmonic distortion at the switching frequency, so it is important to choose the LCL filter parameters to achieve a good filtering effect. This paper introduces some traditional design methods. The design of an LCL filter by genetic algorithm (GA) and particle swarm optimization (PSO) is presented, together with a comparison of the two intelligent optimization approaches. Simulation results and calculated data are provided to prove that intelligent optimization is more effective and simpler than traditional methods.

  2. Building on tradition

    Directory of Open Access Journals (Sweden)

    John Ebdon

    2001-09-01

    Full Text Available Sheffield University has a long and distinguished tradition of research on polymeric materials and in polymer engineering. However, the University's profile in polymer research was greatly strengthened, first by the appointments in 1998 of Tony Ryan and Richard Jones, and then by the transfer last year of seven polymer chemists, together with their research students, research associates and equipment, from Lancaster University.

  3. Effect of Post-Reconstruction Gaussian Filtering on Image Quality and Myocardial Blood Flow Measurement with N-13 Ammonia PET

    Directory of Open Access Journals (Sweden)

    Hyeon Sik Kim

    2014-10-01

    Full Text Available Objective(s): In order to evaluate the effect of post-reconstruction Gaussian filtering on image quality and myocardial blood flow (MBF) measurement by dynamic N-13 ammonia positron emission tomography (PET), we compared various reconstruction and filtering methods with respect to image characteristics. Methods: Dynamic PET images of three patients with coronary artery disease (male-female ratio of 2:1; ages 57, 53, and 76 years) were reconstructed using filtered back projection (FBP) and ordered subset expectation maximization (OSEM) methods. OSEM reconstruction consisted of OSEM_2I, OSEM_4I, and OSEM_6I with 2, 4, and 6 iterations, respectively. Images reconstructed and then filtered with Gaussian filters of 5, 10, and 15 mm were obtained, as well as non-filtered images. Visual analysis of image quality (IQ) was performed using a 3-grade scoring system by 2 independent readers, blinded to the reconstruction and filtering methods of the stress images. Then, the signal-to-noise ratio (SNR) was calculated from noise and contrast recovery (CR). Stress and rest MBF and coronary flow reserve (CFR) were obtained for each method. IQ scores, stress and rest MBF, and CFR were compared between the methods using Chi-square and Kruskal-Wallis tests. Results: In the visual analysis, IQ was significantly higher with 10 mm Gaussian filtering than with the other filter sizes (P=0.923 and 0.855 for readers 1 and 2, respectively). SNR was significantly higher with the 10 mm Gaussian filter. There was a significant difference in stress and rest MBF between several vascular territories. However, CFR was not significantly different across the filtering methods. Conclusion: Post-reconstruction Gaussian filtering with a filter size of 10 mm significantly enhances the IQ of N-13 ammonia PET-CT without changing the results of CFR calculation.

  4. Proposing a New Metric for Collaborative Filtering

    Directory of Open Access Journals (Sweden)

    Arash Bahrehmand

    2011-07-01

    Full Text Available The aim of a recommender system is to filter the enormous quantity of information available and obtain useful information based on the user’s interests. Collaborative filtering is a technique which improves the efficiency of recommendation systems by considering the similarity between users. The similarity is based on the ratings given to items by similar users. However, a user’s interests may change over time. In this paper we propose an adaptive metric which takes time into account when measuring the similarity of users. The experimental results show that our approach is more accurate than the traditional collaborative filtering algorithm.
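
    One simple way to make user similarity time-aware, in the spirit of the proposed metric, is to down-weight old co-ratings with an exponential decay. The decay form and the `decay_days` constant below are illustrative assumptions, not the paper's exact formula.

    ```python
    import math

    def time_weighted_similarity(ratings_a, ratings_b, now, decay_days=30.0):
        """Cosine-style similarity over co-rated items, down-weighting old ratings.

        ratings_*: dict item -> (rating, timestamp_in_days)
        decay_days: hypothetical decay constant controlling how fast old
        ratings lose influence.
        """
        common = ratings_a.keys() & ratings_b.keys()
        if not common:
            return 0.0
        num = den_a = den_b = 0.0
        for item in common:
            ra, ta = ratings_a[item]
            rb, tb = ratings_b[item]
            w = math.exp(-(now - min(ta, tb)) / decay_days)  # older ratings count less
            num += w * ra * rb
            den_a += w * ra * ra
            den_b += w * rb * rb
        return num / math.sqrt(den_a * den_b)
    ```

    Two users with identical rating histories still score 1.0; the weighting only changes how much each co-rated item contributes when histories differ.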

  5. The application of CIC filter in NQR explosive detection system

    International Nuclear Information System (INIS)

    This paper introduces the principle and application of the CIC digital filter, from digital signal processing and analysis, in an NQR explosive detection system. Compared with the FIR and IIR digital filters traditionally used, the CIC digital filter can be decimated. In an NQR explosive detection system, the digital receiver often adopts large over-sampling ratios. Facing the requirement of high-speed processing of large quantities of data, the CIC filter has the advantages of flexibility, high efficiency, and real-time operation. The CIC digital filter provides a favorable method for acquiring, processing, and analyzing NQR signals from several kinds of explosives. (authors)
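
    A Hogenauer CIC decimator is just cascaded integrators, a rate drop, and cascaded combs, which is why it suits receivers with large over-sampling ratios: it needs no multipliers until the final gain normalization. The parameters R=8, N=3 in this sketch are arbitrary illustrative choices.

    ```python
    import numpy as np

    def cic_decimate(x, R=8, N=3, M=1):
        """Hogenauer CIC decimator: N integrators, decimate by R, N combs
        with differential delay M; DC gain (R*M)**N is normalized out."""
        y = np.asarray(x, dtype=float)
        for _ in range(N):                 # cascaded integrators at the input rate
            y = np.cumsum(y)
        y = y[::R]                         # rate reduction by R
        for _ in range(N):                 # cascaded combs at the output rate
            y = y - np.concatenate(([0.0] * M, y[:-M]))
        return y / (R * M) ** N

    # A DC input passes with unit gain once the transient has settled
    out = cic_decimate(np.ones(256), R=8, N=3)
    ```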

  6. Effect of different thickness of material filter on Tc-99m spectra and performance parameters of gamma camera

    Science.gov (United States)

    Nazifah, A.; Norhanna, S.; Shah, S. I.; Zakaria, A.

    2014-11-01

    This study aimed to investigate the effects of the material filter technique on Tc-99m spectra and on the performance parameters of a Philips ADAC Forte dual-head gamma camera. The thickness of the material filter was selected on the basis of the percentage attenuation of various gamma-ray energies by different thicknesses of zinc. A cylindrical source tank of the NEMA single photon emission computed tomography (SPECT) Triple Line Source Phantom, filled with water into which Tc-99m radionuclide was injected, was used for spectra, uniformity, and sensitivity measurements. Vinyl plastic tube was used as a line source for spatial resolution. Images for uniformity were reconstructed by the filtered back projection method. A Butterworth filter of order 5 and cut-off frequency 0.35 cycles/cm was selected. Chang's attenuation correction method was applied with a linear attenuation coefficient of 0.13/cm. The count rate with the material filter was decreased in the Compton region of the Tc-99m energy spectrum, and also in the photopeak region. Spatial resolution was improved. However, the uniformity of the tomographic image was equivocal, and system volume sensitivity was reduced by the material filter. The material filter improved the system's spatial resolution; therefore, the technique may be used for phantom studies to improve image quality.
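
    The Butterworth window referred to here (order 5, cut-off 0.35 cycles/cm) has the standard low-pass form; a minimal sketch:

    ```python
    import math

    def butterworth_window(f, f_cutoff=0.35, order=5):
        """Standard SPECT Butterworth low-pass window: flat passband,
        smooth roll-off beyond the cut-off frequency (cycles/cm)."""
        return 1.0 / math.sqrt(1.0 + (f / f_cutoff) ** (2 * order))
    ```

    At f = 0 the window is 1 (low frequencies pass untouched), at the cut-off it drops to 1/√2, and higher frequencies, where image noise dominates, are progressively suppressed.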

  7. Non-Traditional Vectors for Paralytic Shellfish Poisoning

    Directory of Open Access Journals (Sweden)

    Sara Watt Longan

    2008-06-01

    Full Text Available Paralytic shellfish poisoning (PSP), due to saxitoxin and related compounds, typically results from the consumption of filter-feeding molluscan shellfish that concentrate toxins from marine dinoflagellates. In addition to these microalgal sources, saxitoxin and related compounds, referred to in this review as STXs, are also produced in freshwater cyanobacteria and have been associated with calcareous red macroalgae. STXs are transferred and bioaccumulate throughout aquatic food webs, and can be vectored to terrestrial biota, including humans. Fisheries closures and human intoxications due to STXs have been documented in several non-traditional (i.e. non-filter-feeding) vectors. These include, but are not limited to, marine gastropods, both carnivorous and grazing, crustacea, and fish that acquire STXs through toxin transfer. Often due to spatial, temporal, or species disconnection from the primary source of STXs (bloom-forming dinoflagellates), monitoring and management of such non-traditional PSP vectors has been challenging. A brief literature review is provided for filter-feeding (traditional) and non-filter-feeding (non-traditional) vectors of STXs with specific reference to human effects. We include several case studies pertaining to management actions to prevent PSP, as well as food poisoning incidents from STXs accumulation in non-traditional PSP vectors.

  8. HEPA Filter Performance under Adverse Conditions

    International Nuclear Information System (INIS)

    This study involved challenging nuclear grade high-efficiency particulate air (HEPA) filters under a variety of conditions that can arise in Department of Energy (DOE) applications, such as low or high relative humidity (RH), controlled and uncontrolled challenge, and filters with physically damaged media or seals (i.e., leaks). Reported findings correlate filter function, as measured by traditional differential pressure techniques, with simultaneous instrumental determination of upstream and downstream particulate matter (PM) concentrations. Additionally, emission rates and failure signatures are discussed for filters that have either failed or exceeded their usable lifetime. Significant findings from this effort include the use of thermocouples upstream and downstream of the filter housing to detect the presence of moisture. Also demonstrated in the moisture challenge series of tests is the effect of repeated wetting of the filter. This produces a phenomenon referred to as transient failure before the tensile strength of the media weakens to the point of physical failure. An evaluation of the effect of the particle size distribution of the challenge aerosol on the loading capacity of filters is also included. Results for soot and two size distributions of KCl are reported. Loading capacities for filters ranged from approximately 70 g for soot to nearly 900 g for the larger particle size distribution of KCl. (authors)

  9. Filter quality of pleated filter cartridges.

    Science.gov (United States)

    Chen, Chun-Wan; Huang, Sheng-Hsiu; Chiang, Che-Ming; Hsiao, Ta-Chih; Chen, Chih-Chieh

    2008-04-01

    The performance of dust cartridge filters commonly used in dust masks and in room ventilation depends both on the collection efficiency of the filter material and the pressure drop across the filter. Currently, the optimization of filter design is based only on minimizing the pressure drop at a set velocity chosen by the manufacturer. The collection efficiency, an equally important factor, is rarely considered in the optimization process. In this work, a filter quality factor, which combines the collection efficiency and the pressure drop, is used as the optimization criterion for filter evaluation. Most respirator manufacturers pleat the filter to various extents to increase the filtration area in the limited space within the dust cartridge. Six sizes of filter holders were fabricated to hold just one pleat of filter, simulating six different pleat counts, ranging from 0.5 to 3.33 pleats cm(-1). The possible electrostatic charges on the filter were removed by dipping in isopropyl alcohol, and the air velocity was fixed at 100 cm s(-1). Liquid dioctyl phthalate particles generated by a constant output atomizer were used as challenge aerosols to minimize particle loading effects. A scanning mobility particle sizer was used to measure the challenge aerosol number concentrations and size distributions upstream and downstream of the pleated filter. The pressure drop across the filter was monitored by using a calibrated pressure transducer. The results showed that the performance of pleated filters depends not only on the size of the particle but also on the pleat count of the pleated filter. Based on the filter quality factor, the optimal pleat count (OPC) is always higher than that based on pressure drop by about 0.3-0.5 pleats cm(-1). For example, the OPC is 2.15 pleats cm(-1) from the standpoint of pressure drop, but for the highest filter quality factor, the pleated filter needed to have a pleat count of 2.65 pleats cm(-1) at a particle diameter of 122 nm.
From the aspect of filter quality factor, this study suggests that the respirator manufacturers should add approximately 0.5 pleats cm(-1) to the OPC derived from the generalized correlation curve for pleated filter design based on minimum pressure drop. PMID:18326869
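
    The quality factor used as the optimization criterion combines penetration and pressure drop as q_F = -ln(P)/Δp. The two operating points below are invented numbers chosen to show why a higher-efficiency design is not automatically the better filter:

    ```python
    import math

    def filter_quality(efficiency, pressure_drop_pa):
        """Filter quality factor q_F = -ln(penetration) / pressure drop."""
        penetration = 1.0 - efficiency
        return -math.log(penetration) / pressure_drop_pa

    # Hypothetical pleat-count trade-off at a fixed face velocity:
    few_pleats = filter_quality(0.95, 120.0)    # lower efficiency, lower pressure drop
    many_pleats = filter_quality(0.99, 200.0)   # higher efficiency, higher pressure drop
    # Here the lower-efficiency design has the higher quality factor.
    ```

    With these made-up numbers, penetration falls from 5% to 1% but the pressure-drop penalty grows faster, so q_F favors the design with fewer pleats.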

  10. Bayesian filtering in electronic surveillance

    Science.gov (United States)

    Coraluppi, Stefano; Carthel, Craig

    2012-06-01

    Fusion of passive electronic support measures (ESM) with active radar data enables tracking and identification of platforms in air, ground, and maritime domains. An effective multi-sensor fusion architecture adopts hierarchical real-time multi-stage processing. This paper focuses on the recursive filtering challenges. The first challenge is to achieve effective platform identification based on noisy emitter type measurements; we show that while optimal processing is computationally infeasible, a good suboptimal solution is available via a sequential measurement processing approach. The second challenge is to process waveform feature measurements that enable disambiguation in multi-target scenarios where targets may be using the same emitters. We show that an approach that explicitly considers the Markov jump process outperforms the traditional Kalman filtering solution.

  11. HEPA filter dissolution process

    Energy Technology Data Exchange (ETDEWEB)

    Brewer, K.N.; Murphy, J.A.

    1992-12-31

    This invention comprises a process for dissolution of spent high efficiency particulate air (HEPA) filters, in which the complexed filter solution is then combined with other radioactive wastes prior to calcining the mixed and blended waste feed. The process is an alternative to a prior method of acid leaching the spent filters, which is an inefficient method of treating spent HEPA filters for disposal.

  12. Hepa filter dissolution process

    Science.gov (United States)

    Brewer, Ken N. (Arco, ID); Murphy, James A. (Idaho Falls, ID)

    1994-01-01

    A process for dissolution of spent high efficiency particulate air (HEPA) filters and then combining the complexed filter solution with other radioactive wastes prior to calcining the mixed and blended waste feed. The process is an alternate to a prior method of acid leaching the spent filters which is an inefficient method of treating spent HEPA filters for disposal.

  13. Evaluation of median filtering after reconstruction with maximum likelihood expectation maximization (ML-EM) by real space and frequency space

    International Nuclear Information System (INIS)

    Maximum likelihood expectation maximization (ML-EM) image quality is sensitive to the number of iterations, because a large number of iterations leads to images with checkerboard noise. The use of median filtering in the reconstruction process allows both noise reduction and edge preservation. We examined the value of median filtering after reconstruction with ML-EM by comparing filtered back projection (FBP) with a ramp filter or ML-EM without filtering. SPECT images were obtained with a dual-head gamma camera. The acquisition time was changed from 10 to 200 (seconds/frame) to examine the effect of the count statistics on the quality of the reconstructed images. First, images were reconstructed with ML-EM by changing the number of iterations from 1 to 150 in each study. Additionally, median filtering was applied following reconstruction with ML-EM. The quality of the reconstructed images was evaluated in terms of normalized mean square error (NMSE) values and two-dimensional power spectrum analysis. Median filtering after reconstruction by the ML-EM method provided stable NMSE values even when the number of iterations was increased. The signal element of the image was close to the reference image for any number of iterations. Median filtering after reconstruction with ML-EM was useful in reducing noise, with a similar resolution achieved by reconstruction with FBP and a ramp filter. Especially in images with poor count statistics, median filtering after reconstruction with ML-EM is effective as a simple, widely available method. (author)
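
    A plain 2-D median filter of the kind applied after ML-EM removes isolated checkerboard spikes while preserving flat regions and edges. This is a minimal NumPy sketch; the 3×3 kernel and edge replication are common defaults, not the paper's exact settings.

    ```python
    import numpy as np

    def median_filter_2d(img, k=3):
        """k x k median filter with edge replication at the borders."""
        pad = k // 2
        padded = np.pad(img, pad, mode="edge")
        out = np.empty_like(img, dtype=float)
        h, w = img.shape
        for i in range(h):
            for j in range(w):
                out[i, j] = np.median(padded[i:i + k, j:j + k])
        return out

    # An isolated noise spike on a flat region is removed; the level is preserved
    img = np.full((8, 8), 10.0)
    img[3, 4] = 200.0
    smoothed = median_filter_2d(img)
    ```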

  14. Two-Channel IIR Filter Banks Utilizing the Frequency-Response Masking Technique

    Directory of Open Access Journals (Sweden)

    L. D. Milić

    2009-11-01

    Full Text Available In this paper, a new approach for a two-channel IIR filter bank based on the frequency-response masking technique is presented. A model filter pair is a double complementary IIR filter pair implemented as a parallel connection of two all-pass filters. Masking filters are linear-phase FIR filters. The resulting overall filter pair is nearly power complementary, and simultaneously achieves high sub-channel selectivity. The approximately linear phase of channel filters is achieved with an IIR model filter pair of an approximately linear phase. Compared to the traditional solution based on FIR filters only, the proposed filter bank exhibits a smaller overall delay and requires a smaller number of multipliers for implementation.

  15. Filtering, FDR and power

    OpenAIRE

    van Iterson Maarten; Boer Judith M; Menezes Renée X

    2010-01-01

    Abstract Background In high-dimensional data analysis such as differential gene expression analysis, people often use filtering methods like fold-change or variance filters in an attempt to reduce the multiple testing penalty and improve power. However, filtering may introduce a bias on the multiple testing correction. The precise amount of bias depends on many quantities, such as fraction of probes filtered out, filter statistic and test statistic used. Results We show that a biased multiple...

  16. Optimization of filter loading

    International Nuclear Information System (INIS)

    The introduction of 10 CFR Part 61 has created potential difficulties in the disposal of spent cartridge filters. When this report was prepared, Rancho Seco had no method of packaging and disposing of class B or C filters. This work examined methods to minimize the total operating cost of cartridge filters while maintaining them below the class A limit. It was found that by encapsulating filters in cement the filter operating costs could be minimized.

  17. IMPLEMENTATION OF DIGITAL FILTERS FOR HIGH THROUGHPUT APPLICATIONS ON FPGA

    Directory of Open Access Journals (Sweden)

    T.PUSHPA

    2013-04-01

    Full Text Available The digital filter implementation in FPGA, utilising the dedicated hardware resources, can effectively achieve ASIC-like performance while reducing development time, cost and risks. Advantages of the FPGA approach to digital filter implementation include higher sampling rates than are available from traditional DSP chips. In this paper low-pass, band-pass and high-pass FIR filters are implemented on FPGA. This approach gives a better performance than the common filter structures in terms of speed of operation, cost and power consumption in real time. In this technique, codes for a direct fixed-point FIR filter have been realized. Modules such as multiplier, adder, RAM and two's complement were used. For an N-order filter the number of registers and adders required is N and the number of multipliers required is N+1. For high-speed and high-throughput applications, a MAC unit is used, which consumes less power.

  18. Median filtering in multispectral filter array demosaicking

    OpenAIRE

    Wang, Xingbo; Thomas, Jean-baptiste; Hardeberg, Jon Yngve; Gouton, Pierre

    2013-01-01

    Inspired by the concept of the colour filter array (CFA), the research community has shown much interest in adapting the idea of CFA to the multispectral domain, producing multispectral filter array (MSFAs). In addition to newly devised methods of MSFA demosaicking, there exists a wide spectrum of methods developed for CFA. Among others, some vector based operations can be adapted naturally for multispectral purposes. In this paper, we focused on studying two vector based median filtering met...

  19. Compact Microstrip Spurline Bandstop Filter with Defected Ground Structure (DGS)

    Directory of Open Access Journals (Sweden)

    Abhijeet Kumar

    2014-08-01

    Full Text Available This paper presents a new structure to implement a compact narrowband high-rejection microstrip band-stop filter (BSF). This structure is the combination of two traditional BSFs: the Spurline filter and the BSF using a defected ground structure (DGS). Due to the inherently compact characteristics of both the Spurline and the interdigital capacitance (used as the DGS), the proposed filter shows a better rejection performance than the Spurline filter and the open-stub conventional BSF without increasing the circuit size. The proposed BSF has a rejection of better than 20 dB and a maximum rejection level of 41 dB.

  20. SAW Filter Performance Improvement

    Directory of Open Access Journals (Sweden)

    Monali R. Dave

    2012-01-01

    Full Text Available Surface acoustic wave (SAW) filters have a wide range of applications, including, for example, in mobile/wireless transceivers, radio frequency (RF) filters, intermediate frequency (IF) filters, resonator-filters, filters for mobile and wireless circuits, IF filters in a base transceiver station (BTS), RF front-end filters for mobile/wireless circuitry, multimode frequency-agile oscillators for spread-spectrum secure communications, Nyquist filters for microwave digital radio, voltage controlled oscillators for first- or second-stage mixing in mobile transceivers, delay lines for low-power time-diversity wireless receivers, pseudo-noise-coded delay lines for combined code division multiple access/time division multiple access (CDMA/TDMA) access, clock recovery filters for fiber-optics communication repeater stages, synchronous spread-spectrum communications, televisions, video recorders, and many other applications. It is, however, supported and led by various technologies of public communication systems such as fiber optics, digital microwave and satellites. Various custom SAW devices for public communication systems have already been widely used and are still progressing [2]. SAW filters are also finding increasing use as picture-signal intermediate frequency (PIF) filters, vestigial sideband (VSB) filters, and other types of communication filters, and as filters for digital signal processing [1]. This paper describes various methods to minimize some of the distortions in SAW filters, including bulk wave distortion and feed-through distortion.

  1. Simon-nitinol filter

    International Nuclear Information System (INIS)

    This paper discusses a filter that exploits the thermal shape-memory properties of the nitinol alloy to achieve an optimized filter shape and a fine-bore introducer. Experimental methods and materials are given and results are analyzed

  2. HEPA filter monitoring program

    Science.gov (United States)

    Kirchner, K. N.; Johnson, C. M.; Aiken, W. F.; Lucerna, J. J.; Barnett, R. L.; Jensen, R. T.

    1986-07-01

    The testing and replacement of HEPA filters, widely used in the nuclear industry to purify process air, are costly and labor-intensive. Current methods of testing filter performance, such as differential pressure measurement and scanning air monitoring, allow determination of overall filter performance but preclude detection of incipient filter failure such as small holes in the filters. Using current technology, a continual in-situ monitoring system was designed which provides three major improvements over current methods of filter testing and replacement. The improvements include: cost savings by reducing the number of intact filters which are currently being replaced unnecessarily; more accurate and quantitative measurement of filter performance; and reduced personnel exposure to a radioactive environment by automatically performing most testing operations.

  3. Bag filters for TPP

    Energy Technology Data Exchange (ETDEWEB)

    L.V. Chekalov; Yu.I. Gromov; V.V. Chekalov [JSC ' Kondor-Eko,' Yaroslavl' Oblast' (Russian Federation)

    2007-05-15

    Cleaning of TPP flue gases with bag filters capable of pulsed regeneration is examined. A new filtering element with a three-dimensional filtering material formed from a needle-broached cloth, in which the filtration area, as compared with a conventional smooth bag, is increased by more than two times, is proposed. The design of a new FRMI type of modular filter is also proposed. A standard series of FRMI filters with a filtration area ranging from 800 to 16,000 m{sup 2} is designed for an output of more than 1 million m{sup 3}/h of cleaned gas. The new bag filter permits dry collection of sulfur oxides from waste gases at TPPs operating on high-sulfur coals. The design of the filter makes it possible to replace filter elements without taking the entire unit out of service.

  4. A FUZZY FILTERING MODEL FOR CONTOUR DETECTION

    OpenAIRE

    Rajakumar, T. C.; Arumuga Perumal, S.; Krishnan, N.

    2011-01-01

    Contour detection is a basic operation in image processing. A fuzzy filtering technique is proposed to generate thick edges in two-dimensional gray images. Fuzzy logic is applied to extract a value for an image and is used for object contour detection. Fuzzy-based pixel selection can reduce the drawbacks of conventional methods (Prewitt, Roberts). In the traditional methods, one filter mask is used for all kinds of images. It may succeed in one kind of image but fail in another one. In this framework...

  5. Bloom Filters: A Review

    Directory of Open Access Journals (Sweden)

    K. Premalatha

    2010-10-01

    Full Text Available This paper presents different representations and applications of the Bloom filter. A Bloom filter is a simple but powerful data structure that can check membership of a static set. Bloom filters have become popular for networking system applications, spell-checkers, string matching algorithms, network packet analysis tools, network/internet caches and database optimization. This paper examines and analyzes different types of Bloom filter and their applications.
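A minimal sketch of the data structure described above, using k SHA-256-derived hash positions over an m-bit array. The parameter choices and hashing scheme are illustrative assumptions, not those of any particular variant surveyed in the paper:

```python
import hashlib

class BloomFilter:
    """Minimal Bloom filter: k hash positions over an m-bit array."""
    def __init__(self, m=1024, k=3):
        self.m, self.k = m, k
        self.bits = bytearray(m)  # one byte per bit, for simplicity

    def _positions(self, item):
        # Derive k independent positions by salting the hash input.
        for i in range(self.k):
            h = hashlib.sha256(f"{i}:{item}".encode()).digest()
            yield int.from_bytes(h[:8], "big") % self.m

    def add(self, item):
        for p in self._positions(item):
            self.bits[p] = 1

    def __contains__(self, item):
        # No false negatives; false positives possible.
        return all(self.bits[p] for p in self._positions(item))

bf = BloomFilter()
for word in ("cat", "dog", "fish"):
    bf.add(word)
print("cat" in bf)    # True: added items are always reported present
print("zebra" in bf)  # false positives are possible but unlikely here
```

The false-positive rate is controlled by the bits-per-item ratio m/n and the number of hashes k, which is the trade-off the surveyed variants manipulate.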

  6. Bloom Filters: A Review

    OpenAIRE

    Premalatha, K.; Arulanand Natarajan; Subramanian, S.

    2010-01-01

    This paper presents different representations and applications of the Bloom filter. A Bloom filter is a simple but powerful data structure that can check membership of a static set. Bloom filters have become popular for networking system applications, spell-checkers, string matching algorithms, network packet analysis tools, network/internet caches and database optimization. This paper examines and analyzes different types of Bloom filter and their applications.

  7. Oriented Fiber Filter Media

    OpenAIRE

    Bharadwaj, R.; Patel, A.; Chokdeepanich, S.; Chase, G. G.

    2008-01-01

    Coalescing filters are widely used throughout industry and improved performance will reduce droplet emissions and operating costs. Experimental observations show that the orientation of micro fibers in filter media affects the permeability and the separation efficiency of the filter media. In this work two methods are used to align the fibers to alter the filter structure. The results show that axially aligned fiber media improve quality factor on the order of 20% and cutting media on an angle from a t...

  8. Oversampled Filter Banks

    OpenAIRE

    Cvetkovic, Zoran; Vetterli, Martin

    1998-01-01

    Perfect reconstruction oversampled filter banks are equivalent to a particular class of frames. These frames are the subject of this paper. First, necessary and sufficient conditions on a filter bank for implementing a frame or a tight frame expansion are established, as well as a necessary and sufficient condition for perfect reconstruction using FIR filters after an FIR analysis. Complete parameterizations of oversampled filter banks satisfying these conditions are g...

  9. Earth Water Filter

    Science.gov (United States)

    Designing a filter that turns black, salty, muck into drinkable water is a tall order. In this video segment, ZOOM cast members take cues from what they know about natural sediment filters, use similar materials to create their own water filters, and evaluate which combinations of materials make the fastest, most efficient filters. The segment is four minutes fifty-four seconds in length. A background essay and discussion questions are included.

  10. Frequency Spectrum Based Low-Area Low-Power Parallel FIR Filter Design

    OpenAIRE

    Parhi, Keshab K.; Chung, Jin-Gyun

    2002-01-01

    Parallel (or block) FIR digital filters can be used either for high-speed or low-power (with reduced supply voltage) applications. Traditional parallel filter implementations cause linear increase in the hardware cost with respect to the block size. Recently, an efficient parallel FIR filter implementation technique requiring a less-than linear increase in the hardware cost was proposed. This paper makes two contributions. First, the filter spectrum characteristics are exploited to select th...

  11. Fuzzy Adaptive Interacting Multiple Model Nonlinear Filter for Integrated Navigation Sensor Fusion

    OpenAIRE

    Dah-Jing Jwo; Chih-Wen Chang; Chien-Hao Tseng

    2011-01-01

    In this paper, the application of the fuzzy interacting multiple model unscented Kalman filter (FUZZY-IMMUKF) approach to integrated navigation processing for the maneuvering vehicle is presented. The unscented Kalman filter (UKF) employs a set of sigma points through deterministic sampling, such that a linearization process is not necessary, and therefore the errors caused by linearization as in the traditional extended Kalman filter (EKF) can be avoided. The nonlinear filters naturally suff...

  12. Fuzzy Based Median Filtering for Removal of Salt-and-Pepper Noise

    OpenAIRE

    Bhavana Deshpande; Verma, H. K.; Prachi Deshpande

    2012-01-01

    This paper presents a filter for restoration of images that are highly corrupted by salt and pepper noise. By incorporating fuzzy logic after detecting and correcting the noisy pixel, the proposed filter is able to suppress noise and preserve details across a wide range of salt and pepper noise corruption, ranging from 1% to 60%. The proposed filter is tested on different images and is found to produce better results than the Traditional Median Filter.
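The detect-then-correct idea above can be sketched with a plain switching median filter that replaces only extreme-valued (salt or pepper) pixels. The fuzzy detection stage of the paper is replaced here by a simple threshold test, so this is an assumption-laden stand-in, not the proposed filter itself:

```python
import numpy as np

def switching_median(img, lo=0, hi=255):
    """Replace only suspected pepper (lo) / salt (hi) pixels with the
    median of their 3x3 neighbourhood; all other pixels are untouched,
    which is how detail is preserved."""
    out = img.copy()
    for i in range(1, img.shape[0] - 1):
        for j in range(1, img.shape[1] - 1):
            if img[i, j] in (lo, hi):
                out[i, j] = np.median(img[i - 1:i + 2, j - 1:j + 2])
    return out

img = np.full((8, 8), 100, dtype=np.uint8)
img[2, 2], img[5, 5] = 255, 0       # one salt and one pepper pixel
clean = switching_median(img)
print(clean[2, 2], clean[5, 5])     # both restored to 100
```

Because uncorrupted pixels are never modified, this class of filter degrades far less at high noise densities than the traditional median filter, which smooths every pixel.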

  13. Filter service system

    Science.gov (United States)

    Sellers, Cheryl L. (Peoria, IL); Nordyke, Daniel S. (Arlington Heights, IL); Crandell, Richard A. (Morton, IL); Tomlins, Gregory (Peoria, IL); Fei, Dong (Peoria, IL); Panov, Alexander (Dunlap, IL); Lane, William H. (Chillicothe, IL); Habeger, Craig F. (Chillicothe, IL)

    2008-12-09

    According to an exemplary embodiment of the present disclosure, a system for removing matter from a filtering device includes a gas pressurization assembly. An element of the assembly is removably attachable to a first orifice of the filtering device. The system also includes a vacuum source fluidly connected to a second orifice of the filtering device.

  14. The 'Filter' program algorithm

    International Nuclear Information System (INIS)

    An algorithm of the FILTER program is described in detail. The FILTER program complex is intended for identification of the track points in the pre-sifted data obtained from a viewing and scanning device. The FILTER program is written in FORTRAN.

  15. Practical Active Capacitor Filter

    Science.gov (United States)

    Shuler, Robert L., Jr. (Inventor)

    2005-01-01

    A method and apparatus is described that filters an electrical signal. The filtering uses a capacitor multiplier circuit where the capacitor multiplier circuit uses at least one amplifier circuit and at least one capacitor. A filtered electrical signal results from a direct connection from an output of the at least one amplifier circuit.

  16. HEPA filter encapsulation

    Science.gov (United States)

    Gates-Anderson, Dianne D. (Union City, CA); Kidd, Scott D. (Brentwood, CA); Bowers, John S. (Manteca, CA); Attebery, Ronald W. (San Lorenzo, CA)

    2003-01-01

    A low viscosity resin is delivered into a spent HEPA filter or other waste. The resin is introduced into the filter or other waste using a vacuum to assist in the mass transfer of the resin through the filter media or other waste.

  17. Filtering, FDR and power

    Directory of Open Access Journals (Sweden)

    van Iterson Maarten

    2010-09-01

    Full Text Available Abstract Background In high-dimensional data analysis such as differential gene expression analysis, people often use filtering methods like fold-change or variance filters in an attempt to reduce the multiple testing penalty and improve power. However, filtering may introduce a bias on the multiple testing correction. The precise amount of bias depends on many quantities, such as fraction of probes filtered out, filter statistic and test statistic used. Results We show that a biased multiple testing correction results if non-differentially expressed probes are not filtered out with equal probability from the entire range of p-values. We illustrate our results using both a simulation study and an experimental dataset, where the FDR is shown to be biased mostly by filters that are associated with the hypothesis being tested, such as the fold change. Filters that induce little bias on the FDR yield less additional power of detecting differentially expressed genes. Finally, we propose a statistical test that can be used in practice to determine whether any chosen filter introduces bias on the FDR estimate used, given a general experimental setup. Conclusions Filtering out of probes must be used with care as it may bias the multiple testing correction. Researchers can use our test for FDR bias to guide their choice of filter and amount of filtering in practice.
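A small sketch of the mechanism discussed above: the Benjamini-Hochberg step-up procedure, and how pre-filtering shrinks the multiple-testing penalty. The p-values are synthetic; note that filtering on a statistic associated with the hypothesis being tested (done crudely below by filtering on the p-values themselves) is exactly the biased case the authors warn about:

```python
import numpy as np

def benjamini_hochberg(pvals, alpha=0.05):
    """Benjamini-Hochberg step-up procedure; returns a boolean reject mask."""
    p = np.asarray(pvals)
    m = len(p)
    order = np.argsort(p)
    thresh = alpha * np.arange(1, m + 1) / m   # i/m * alpha for sorted p
    passed = p[order] <= thresh
    k = np.max(np.nonzero(passed)[0]) + 1 if passed.any() else 0
    reject = np.zeros(m, dtype=bool)
    reject[order[:k]] = True                   # reject the k smallest p-values
    return reject

# Three small p-values hidden among 97 uninteresting tests
pvals = [0.001, 0.004, 0.012] + [0.8] * 97
print(benjamini_hochberg(pvals).sum())   # 0 rejections: penalty over m=100 tests

# Filtering first shrinks m and boosts power -- but a filter correlated
# with the test statistic biases the FDR estimate, per the abstract.
kept = [p for p in pvals if p < 0.5]
print(benjamini_hochberg(kept).sum())    # 3 rejections over m=3 tests
```

The gain in rejections comes entirely from the smaller m in the correction, which is why the choice and amount of filtering must be justified, not just its effect on power.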

  18. A Christiansen filter realized with a cylindrical lens

    International Nuclear Information System (INIS)

    A type of Christiansen filter that takes the form of a cylindrical lens is proposed. This kind of Christiansen filter has a pass band narrower than the traditional one, with more means of control of its performance; it could approximate arbitrary functions over a frequency range of interest. An inverse scattering theory for the synthesis of such a filter is developed and a systematic design technique is established. A desired, prescribed response can be tailored by properly configuring the lens of the filter. A Gaussian filter centered at 225 nm with a full width at half maximum of 1 nm is synthesized and is numerically modeled on a computer using the finite difference beam propagation method. The simulation results confirm the feasibility of the proposed Christiansen filter

  19. Optimal Sharpening of Compensated Comb Decimation Filters: Analysis and Design

    Science.gov (United States)

    Troncoso Romero, David Ernesto

    2014-01-01

    Comb filters are a class of low-complexity filters especially useful for multistage decimation processes. However, the magnitude response of comb filters presents a droop in the passband region and low stopband attenuation, which is undesirable in many applications. In this work, it is shown that, for stringent magnitude specifications, sharpening compensated comb filters requires a lower-degree sharpening polynomial compared to sharpening comb filters without compensation, resulting in a solution with lower computational complexity. Using a simple three-addition compensator and an optimization-based derivation of sharpening polynomials, we introduce an effective low-complexity filtering scheme. Design examples are presented in order to show the performance improvement in terms of passband distortion and selectivity compared to other methods based on the traditional Kaiser-Hamming sharpening and the Chebyshev sharpening techniques recently introduced in the literature. PMID:24578674

  20. Retrofitting fabric filters for clean stack emission

    International Nuclear Information System (INIS)

    The fly ash generated from New South Wales coals, which are predominantly low sulphur coals, has been difficult to collect in traditional electrostatic precipitators. During the early 1970s development work was undertaken on the use of fabric filters at some of the Commission's older power stations. The satisfactory performance of the plant at those power stations led to the selection of fabric filters for flue gas cleaning at the next two new power stations constructed by the Electricity Commission of New South Wales. On-going pilot plant testing has continued to indicate the satisfactory performance of enhanced designs of fabric filters of varying types and the Commission has recently retrofitted pulse cleaned fabric filters to 2 x 350 MW units at a further power station with plans to retrofit similar plant to the remaining 2 x 350 MW units at that station. A contract has also been let for the retrofitting of pulse cleaned fabric filters to 4 x 500 MW units at another power station in the Commission's system. The paper reviews the performance of the 6000 MW of plant operating with fabric filters. Fabric selection and fabric life form an important aspect of this review

  1. Low pass filters with linear phase

    Science.gov (United States)

    Lerner, R. M.

    1985-12-01

    This report is intended as a modern sequel to Band Pass Filters with Linear Phase, published over 20 years ago. Here, the linear phase algorithms are adapted to low-pass integrated active circuit topologies for potential use as the input filter in sampled data control systems and other low-pass applications for which traditional coil-capacitor-transformer filter designs are no longer appropriate. The linear phase algorithm is applicable to a variety of circuit topologies which are not mathematically equivalent. Application to the overall transfer function (the simple approach) results in transmission that is nearly equal-ripple in both amplitude and phase, the errors being equivalent to echoes in a misterminated transmission line. The resulting filters compare favorably with modest order Tchebychef filters (lacking phase control). Application to branches of a feedback topology (mathematically equivalent to a lattice) leads to control of echoes and stop-band zeros at the cost of extra elements. The results compare favorably with ordinary Butterworth filters (lacking phase control) having the same total number of poles. Practical means for inserting stop band zeros without redesign of the pass-band are considered. Circuits of both the R-C and switched-capacitor types are discussed.

  2. Wavelet transform adaptive filtering

    Science.gov (United States)

    Dang, Laurence M.

    1994-10-01

    An LMS adaptive filtering algorithm is presented utilizing wavelet transforms. Its performance is compared to DCT and Walsh-Hadamard transform-based adaptive filtering. The experimental analysis is performed in the case of the system identification of an unknown system or filter for stationary input signals. The results show some improvement in the weight modelling of the filter with comparable convergence rates. A new performance criterion, the diagonality factor, is introduced in order to show the specific effect of the wavelet transform on a signal. A Mean Average Difference is also utilized to compare the weight modelling performance of the various transform-based LMS adaptive filtering schemes studied in this paper.
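For reference, a plain time-domain LMS system-identification loop of the kind evaluated above; the transform-domain variants apply the same update to DCT-, Walsh-Hadamard- or wavelet-transformed inputs to speed convergence for colored inputs. The system coefficients and step size below are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)
h_true = np.array([0.5, -0.3, 0.2])   # unknown FIR system to identify
n_taps = 3
w = np.zeros(n_taps)                  # adaptive filter weights
mu = 0.05                             # LMS step size

x = rng.normal(size=2000)             # stationary (white) input signal
d = np.convolve(x, h_true)[:len(x)]   # desired signal: output of unknown system

for n in range(n_taps, len(x)):
    u = x[n - n_taps + 1:n + 1][::-1]  # tap-input vector [x[n], x[n-1], x[n-2]]
    e = d[n] - w @ u                   # a-priori estimation error
    w = w + mu * e * u                 # LMS weight update

print(np.round(w, 2))                  # converges toward [0.5, -0.3, 0.2]
```

With white input the time-domain LMS already converges quickly; the transform-domain versions matter most when the input spectrum is far from flat, which the diagonality factor in the abstract is designed to quantify.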

  3. Filtered Perverse Complexes

    OpenAIRE

    Bressler, P.; Saito, M.; Youssin, B.

    1996-01-01

    We introduce the notion of filtered perversity of a filtered differential complex on a complex analytic manifold $X$, without any assumptions of coherence, with the purpose of studying the connection between the pure Hodge modules and the \\lt-complexes. We show that if a filtered differential complex $(\\cM^\\bullet,F_\\bullet)$ is filtered perverse then $\\aDR(\\cM^\\bullet,F_\\bullet)$ is isomorphic to a filtered $\\cD$-module; a coherence assumption on the cohomology of $(\\cM^\\bu...

  4. Dual filtered backprojection for micro-rotation confocal microscopy

    International Nuclear Information System (INIS)

    Micro-rotation confocal microscopy is a novel optical imaging technique which employs dielectric fields to trap and rotate individual cells to facilitate 3D fluorescence imaging using a confocal microscope. In contrast to computed tomography (CT) where an image can be modelled as parallel projection of an object, the ideal confocal image is recorded as a central slice of the object corresponding to the focal plane. In CT, the projection images and the 3D object are related by the Fourier slice theorem which states that the Fourier transform of a CT image is equal to the central slice of the Fourier transform of the 3D object. In the micro-rotation application, we have a dual form of this setting, i.e. the Fourier transform of the confocal image equals the parallel projection of the Fourier transform of the 3D object. Based on the observed duality, we present here the dual of the classical filtered back projection (FBP) algorithm and apply it in micro-rotation confocal imaging. Our experiments on real data demonstrate that the proposed method is a fast and reliable algorithm for the micro-rotation application, as FBP is for CT application

  5. Exhaust gas filter

    International Nuclear Information System (INIS)

    A filter material formed by joining glass clothes to both surfaces of a glass fiber non-woven fabric is used. The filter material is disposed at the inside of a square filter material support frame made of stainless steel. The filter material is attached in a zig-zag manner in the flowing direction of the exhaust gases so as to increase the filtration area. Separators, for example, made of stainless steel are inserted between the filter materials. The separator is corrugated so as to sandwich and support the filter materials from both sides by the ridged crests. The longitudinal bottom of the separator formed by corrugating it defines a flow channel for the exhaust gases. The longitudinal bottom is also used as a channel for back blowing air. With such a constitution, combustion gases of radioactive miscellaneous solid wastes can be completely filtered. In addition, a back wash can be conducted under high temperature. (I.N.)

  6. Ceramic fiber filter technology

    Energy Technology Data Exchange (ETDEWEB)

    Holmes, B.L.; Janney, M.A.

    1996-06-01

    Fibrous filters have been used for centuries to protect individuals from dust, disease, smoke, and other gases or particulates. In the 1970s and 1980s ceramic filters were developed for filtration of hot exhaust gases from diesel engines. Tubular, or candle, filters have been made to remove particles from gases in pressurized fluidized-bed combustion and gasification-combined-cycle power plants. Very efficient filtration is necessary in power plants to protect the turbine blades. The limited lifespan of ceramic candle filters has been a major obstacle in their development. The present work is focused on forming fibrous ceramic filters using a papermaking technique. These filters are highly porous and therefore very lightweight. The papermaking process consists of filtering a slurry of ceramic fibers through a steel screen to form paper. Papermaking and the selection of materials will be discussed, as well as preliminary results describing the geometry of papers and relative strengths.

  7. Histogram Filters for Noise Reduction

    OpenAIRE

    Wrangsjö, Andreas; Knutsson, Hans

    2003-01-01

    A class of filters based on histograms is presented. The signal probability density function is estimated and filtering is performed in the pdf domain. Such filters can be designed to preserve signal features such as sharp edges while suppressing stochastic variations. One particular histogram filter scheme is evaluated and compared to a median filter and a normal Gaussian blurring filter.

  8. Kalman filtering technique for reactivity measurement

    International Nuclear Information System (INIS)

    Measurement of reactivity and its on-line display is of great help in calibration of reactivity control and safety devices and in the planning of suitable actions during the reactor operation. In traditional approaches the reactivity is estimated from reactor period or by solving the inverse point kinetic equation. In this paper, an entirely new approach based on the Kalman filtering technique has been presented. The theory and design of the reactivity measuring instrument based on the approach has been explained. Its performance has been compared with traditional approaches by estimation of transient reactivity from flux variation data recorded in a research reactor. It is demonstrated that the Kalman filtering approach is superior to other methods from the viewpoints of accuracy, noise suppression, and robustness against uncertainties in the reactor parameters. (author). 1 fig
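A scalar Kalman filter illustrating the principle above: a random-walk state model with direct noisy measurements, showing the noise suppression the abstract credits the approach with. The noise variances and the "reactivity" value are illustrative assumptions, not the reactor kinetics model of the paper:

```python
import numpy as np

rng = np.random.default_rng(2)
true_rho = 0.5                                # reactivity treated as a slowly varying state
meas = true_rho + rng.normal(0, 0.2, 200)     # noisy flux-derived measurements

# Scalar Kalman filter: x[k] = x[k-1] + w (process), z[k] = x[k] + v (measurement)
x, P = 0.0, 1.0                # state estimate and its variance
Q, R = 1e-5, 0.2 ** 2          # process and measurement noise variances

for z in meas:
    P = P + Q                  # predict: variance grows by process noise
    K = P / (P + R)            # Kalman gain balances prior vs. measurement
    x = x + K * (z - x)        # update with the measurement residual
    P = (1 - K) * P            # posterior variance shrinks

print(round(x, 2))             # close to 0.5, with measurement noise strongly suppressed
```

The gain K adapts automatically: it is large while the estimate is uncertain and shrinks as confidence grows, which is what gives the approach its noise suppression and robustness compared with inverting the point kinetics equation directly.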

  9. Composed filter for attenuating bi-frequential noise on LCD-generated ronchigrams

    Science.gov (United States)

    Mora-González, Miguel; Chiu-Zarate, Roger; Muñoz-Maciel, Jesús; Martinez-Romo, Julio C.; Salinas-Luna, Javier; Campos Colma, Juan; Luna-Rosas, Francisco J.

    2012-06-01

    This work aims to design a filter to attenuate high- and medium-frequency noise in optical test images without altering the edges and original characteristics of the test image, as traditional filters (spatial or frequential) do. The noise produced by the LCD pixels (used as a diffraction grating in the Ronchi test) was analyzed. The diffraction is modulated by the spherical wavefront of the mirror under test, generating at least two frequency band noise levels. To reduce this bi-frequential noise, we propose to use an array of filters with the following structure: a low-pass frequential filter LPFF, a band-pass frequential filter BPFF and a circular mask spatial filter CMSF; thus obtaining the composed filter CF=LPFF-(BPFF)(CMSF). Various sizes of filters were used to compare their signal-to-noise ratio against simple filters (low-pass and band-stop).

  10. An LLCL Power Filter for Single-Phase Grid-Tied Inverter

    DEFF Research Database (Denmark)

    Wu, Weimin; He, Yuanbin

    2012-01-01

    This paper presents a new topology of higher order power filter for grid-tied voltage-source inverters, named the LLCL filter, which inserts a small inductor in the branch loop of the capacitor in the traditional LCL filter to compose a series resonant circuit at the switching frequency. In particular, it attenuates the switching-frequency current ripple components much better than an LCL filter, leading to a decrease in the total inductance and volume. Furthermore, by decreasing the inductance of the grid-side inductor, it raises the characteristic resonance frequency, which is beneficial to the inverter system control. The parameter design criteria of the proposed LLCL filter are also introduced. The comparative analysis and discussions regarding the traditional LCL filter and the proposed LLCL filter have been presented and evaluated through experiment on a 1.8-kW single-phase grid-tied inverter prototype.
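The notch behaviour described above can be checked numerically: tuning the series branch so that Lf·Cf resonates at the switching frequency makes the grid-current transfer function of the LLCL filter vanish there, while the LCL filter leaves a finite ripple. The component values below are illustrative assumptions, not the prototype's parameters:

```python
import numpy as np

L1, L2, C = 2e-3, 1e-3, 5e-6              # example LCL component values (assumed)
f_sw = 10e3                               # switching frequency, Hz
Lf = 1 / ((2 * np.pi * f_sw) ** 2 * C)    # tune Lf*C resonance to f_sw

def h_lcl(f):
    """|i_grid / v_inv| of the LCL filter (grid voltage shorted)."""
    s = 1j * 2 * np.pi * f
    return abs(1 / (s**3 * L1 * L2 * C + s * (L1 + L2)))

def h_llcl(f):
    """|i_grid / v_inv| of the LLCL filter: Lf in series with C."""
    s = 1j * 2 * np.pi * f
    num = s**2 * Lf * C + 1               # zero of the series-resonant branch
    den = s**3 * L1 * L2 * C + num * s * (L1 + L2)
    return abs(num / den)

print(h_lcl(f_sw))    # finite residual ripple transfer of the LCL at f_sw
print(h_llcl(f_sw))   # ~0: the series-resonant branch notches out f_sw
```

At the tuned frequency the branch impedance sLf + 1/(sC) goes to zero, short-circuiting the switching-frequency ripple, which is why the LLCL can meet the same ripple limit with less total inductance.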

  11. Applicapability of Traditional Ceramic for Water Treatment in Small Communities

    Directory of Open Access Journals (Sweden)

    K Naddafi

    2005-08-01

    Full Text Available There is a need for simple and inexpensive water supply systems in small communities, mainly because of high costs and water resource shortages. Ceramic filters used as a Point-of-Use (POU) system could serve as a safe and inexpensive means for supplying water. In this research we looked at the possibility of using Iranian traditional ceramics as filters for drinking water, bearing in mind the importance of ceramic filters as a POU system. A number of parameters relating to water quality were measured before and after filtration through a ceramic pipe wall. We used ceramic pipes made of clay (with 90% purity). It turned out that ceramic filters are capable of eliminating indicator microorganisms and turbidity to a considerable degree. They can also remove 70% of the water's color. But this system could not eliminate the dissolved solids, electrical conductivity, hardness and nitrate ion content of water. Using the Mann-Whitney U test and t-test, it is confirmed that increasing the thickness and number of ceramic pipes is not effective for improving some water parameters. This research showed that ceramic filters can be useful for household water treatment in places where there is microbial pollution or high turbidity.

  12. Sensory Pollution from Bag Filters, Carbon Filters and Combinations

    DEFF Research Database (Denmark)

    Bekö, Gabriel; Clausen, Geo

    2008-01-01

    Used ventilation filters are a major source of sensory pollutants in air handling systems. The objective of the present study was to evaluate the net effect that different combinations of filters had on perceived air quality after 5 months of continuous filtration of outdoor suburban air. A panel of 32 subjects assessed different sets of used filters and identical sets consisting of new filters. Additionally, filter weights and pressure drops were measured at the beginning and end of the operation period. The filter sets included single EU5 and EU7 fiberglass filters, an EU7 filter protected by an upstream pre-filter (changed monthly), an EU7 filter protected by an upstream activated carbon (AC) filter, and EU7 filters with an AC filter either downstream or both upstream and downstream. In addition, two types of stand-alone combination filters were evaluated: a bag-type fiberglass filter that contained AC and a synthetic fiber cartridge filter that contained AC. Air that had passed through used filters was most acceptable for those sets in which an AC filter was used downstream of the particle filter. Comparable air quality was achieved with the stand-alone bag filter that contained AC. Furthermore, its pressure drop changed very little during the 5 months of service, and it had the added benefit of removing a large fraction of ozone from the airstream. If similar results are obtained over a wider variety of soiling conditions, such filters may be a viable solution to a long recognized problem.

  13. Aurorae in Australian Aboriginal Traditions

    Science.gov (United States)

    Hamacher, Duane W.

    2013-07-01

    Transient celestial phenomena feature prominently in the astronomical knowledge and traditions of Aboriginal Australians. In this paper, I collect accounts of the Aurora Australis from the literature regarding Aboriginal culture. Using previous studies of meteors, eclipses, and comets in Aboriginal traditions, I anticipate that the physical properties of aurora, such as their generally red colour as seen from southern Australia, will be associated with fire, death, blood, and evil spirits. The survey reveals this to be the case and also explores historical auroral events in Aboriginal cultures, aurorae in rock art, and briefly compares Aboriginal auroral traditions with other global indigenous groups, including the Maori of New Zealand.

  14. Aurorae in Australian Aboriginal Traditions

    CERN Document Server

    Hamacher, Duane W

    2013-01-01

    Transient celestial phenomena feature prominently in the astronomical knowledge and traditions of Aboriginal Australians. In this paper, I collect accounts of the Aurora Australis from the literature regarding Aboriginal culture. Using previous studies of meteors, eclipses, and comets in Aboriginal traditions, I anticipate that the physical properties of aurora, such as their generally red colour as seen from southern Australia, will be associated with fire, death, blood, and evil spirits. The survey reveals this to be the case and also explores historical auroral events in Aboriginal cultures, aurorae in rock art, and briefly compares Aboriginal auroral traditions with other global indigenous groups, including the Maori of New Zealand.

  15. Generic Kalman Filter Software

    Science.gov (United States)

    Lisano, Michael E., II; Crues, Edwin Z.

    2005-01-01

    The Generic Kalman Filter (GKF) software provides a standard basis for the development of application-specific Kalman-filter programs. Historically, Kalman filters have been implemented by customized programs that must be written, coded, and debugged anew for each unique application, then tested and tuned with simulated or actual measurement data. Total development times for typical Kalman-filter application programs have ranged from weeks to months. The GKF software can simplify the development process and reduce the development time by eliminating the need to re-create the fundamental implementation of the Kalman filter for each new application. The GKF software is written in the ANSI C programming language. It contains a generic Kalman-filter-development directory that, in turn, contains code for a generic Kalman-filter function; more specifically, it contains a generically designed and generically coded implementation of linear, linearized, and extended Kalman filtering algorithms, including algorithms for state- and covariance-update and -propagation functions. The mathematical theory that underlies the algorithms is well known and has been reported extensively in the open technical literature. Also contained in the directory are a header file that defines generic Kalman-filter data structures and prototype functions, and template versions of application-specific subfunction and calling navigation/estimation routine code and headers. Once the user has provided a calling routine and the required application-specific subfunctions, the application-specific Kalman-filter software can be compiled and executed immediately. During execution, the generic Kalman-filter function is called from a higher-level navigation or estimation routine that preprocesses measurement data and post-processes output data.
The generic Kalman-filter function uses the aforementioned data structures and five implementation-specific subfunctions, which have been developed by the user on the basis of the aforementioned templates. The GKF software can be used to develop many different types of unfactorized Kalman filters. A developer can choose to implement either a linearized or an extended Kalman filter algorithm, without having to modify the GKF software. Control dynamics can be taken into account or neglected in the filter-dynamics model. Filter programs developed by use of the GKF software can be made to propagate equations of motion for linear or nonlinear dynamical systems that are deterministic or stochastic. In addition, filter programs can be made to operate in user-selectable "covariance analysis" and "propagation-only" modes that are useful in design and development stages.
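The propagation and update functions that the GKF templates separate out can be illustrated with a scalar (one-state) Kalman filter. This is a generic textbook sketch, not the GKF code itself; the noise variances and measurement sequence are arbitrary.

```python
def kf_step(x, p, z, f=1.0, q=1e-4, h=1.0, r=0.25):
    """One scalar Kalman cycle: propagate state/covariance, then update.

    x, p : prior state estimate and covariance
    z    : new measurement
    f, q : state-transition term and process-noise variance
    h, r : measurement model and measurement-noise variance
    """
    # Propagation (time update)
    x_pred = f * x
    p_pred = f * p * f + q
    # Update (measurement update)
    k = p_pred * h / (h * p_pred * h + r)      # Kalman gain
    x_new = x_pred + k * (z - h * x_pred)
    p_new = (1.0 - k * h) * p_pred
    return x_new, p_new

# Estimate a constant value (true value 5.0) from noisy readings.
x, p = 0.0, 1.0
for z in [4.8, 5.2, 5.1, 4.9, 5.0]:
    x, p = kf_step(x, p, z)
print(f"estimate {x:.2f}, covariance {p:.3f}")
```

Setting q to zero and skipping the update lines corresponds to the "propagation-only" mode described above; tracking only p corresponds to "covariance analysis".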

  16. Complex Hilbert Transform Filter

    Directory of Open Access Journals (Sweden)

    Hannu Olkkonen

    2011-05-01

    Full Text Available The Hilbert transform is a basic tool for constructing analytic signals in a variety of applications, such as amplitude modulation, envelope and instantaneous frequency analysis, quadrature decoding, shift-invariant multi-rate signal processing, and Hilbert-Huang decomposition. This work introduces a complex Hilbert transform (CHT) filter, whose real and imaginary parts form a Hilbert transform pair. The CHT-filtered signal is analytic, i.e. its Fourier transform is zero in the negative frequency range. The CHT filter is constructed from half-sample delay operators based on the B-spline transform interpolation and decimation procedure. The CHT filter has an ideal phase response, and its magnitude response is maximally flat in the frequency range 0 ≤ ω ≤ π. The CHT filter has integer coefficients, and its implementation in VLSI requires only summations and register shifts. We demonstrate the feasibility of the CHT filter in reconstructing sign-modulated CMOS logic pulses in a fibre-optic link.
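    The defining property of an analytic signal, a one-sided Fourier spectrum, can be checked with a direct DFT construction. This is the standard frequency-domain method, not the paper's B-spline half-sample-delay filter, and the O(n^2) DFT is written out for clarity rather than speed.

```python
import cmath
import math

def dft(x):
    n = len(x)
    return [sum(x[t] * cmath.exp(-2j * cmath.pi * k * t / n) for t in range(n))
            for k in range(n)]

def idft(X):
    n = len(X)
    return [sum(X[k] * cmath.exp(2j * cmath.pi * k * t / n) for k in range(n)) / n
            for t in range(n)]

def analytic_signal(x):
    """Zero the negative-frequency bins, double the positive ones, invert."""
    n = len(x)
    X = dft(x)
    h = [0.0] * n
    h[0] = 1.0
    if n % 2 == 0:
        h[n // 2] = 1.0          # Nyquist bin is kept once
        top = n // 2
    else:
        top = (n + 1) // 2
    for k in range(1, top):
        h[k] = 2.0
    return idft([hk * Xk for hk, Xk in zip(h, X)])

# A pure cosine becomes exp(j*w*t): real part cos, imaginary part sin.
x = [math.cos(2 * math.pi * t / 8) for t in range(8)]
a = analytic_signal(x)
print(round(a[2].imag, 6))  # 1.0, i.e. sin(pi/2)
```

    The imaginary part of the result is exactly the Hilbert transform of the input, which is the pair relationship the CHT filter realizes with integer-coefficient hardware.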

  17. Method of magnetic filtering

    International Nuclear Information System (INIS)

    Particles of at least one paramagnetic material are filtered from a fluid by passing the fluid through a filter of permanently magnetised, magnetically hard ferromagnetic material in a divided form, which generates magnetic field gradients adjacent to the surfaces of the ferromagnetic material. Subsequently, the filter is regenerated for re-use by eluting substantially all the captured particles using a fluid, or a series of fluids, whose magnetic susceptibility is equal or close to that of the captured particles, or of each kind of captured particle. In this way the use of a heavy magnet, either to magnetise the filter during filtering or to de-magnetise it for elution, is avoided. Applications of the invention include the filtering of air circulated through a glove box in the nuclear industry, the separation of asbestos particles from aqueous effluent, and the detection of malaria by separating parasitised cells from a blood sample. (author)

  18. High gradient magnetic filters

    International Nuclear Information System (INIS)

    A high gradient magnetic filter is enhanced by the electrophoretic effect whereby the range of effectiveness of the conventional high gradient magnetic filter for collecting suspended magnetic solid particles from liquid streams, such as power plant coolant streams, is extended to collect suspended magnetic solid particles of submicron size as well as particulates not susceptible to a magnetic field gradient. A conventional high gradient magnetic filter is connected into a pipeline through which a liquid stream containing the suspended solid particles to be removed is flowing, so that the filter is electrically insulated from the remainder of the pipeline, and a potential is applied to the filter to polarize same relative to the pipeline with a polarity which is opposite the polarity of the charged particles in the stream which are to be collected. An insulating arrangement for connection flanges of the filter vessel and pipeline sections is described. (author)

  19. A Level Set Filter for Speckle Reduction in SAR Images

    OpenAIRE

    Huang Bo; Li Hongga; Huang Xiaoxia

    2010-01-01

    Despite much effort and significant progress in recent years, speckle removal for Synthetic Aperture Radar (SAR) images is still a challenging problem in image processing. Unlike traditional noise filters, which are mainly based on local neighborhood statistical averages or frequency transforms, in this paper we propose a speckle reduction method based on the theory of level sets, one form of curvature flow propagation. Firstly, based on a partial differential equation, the Lee filter can b...

  20. Ultra accurate collaborative information filtering via directed user similarity

    OpenAIRE

    Guo, Qiang; Song, Wen-jun; Liu, Jian-guo

    2014-01-01

    A key challenge in collaborative filtering (CF) information filtering is how to obtain reliable and accurate results with the help of peers' recommendations. Since the similarities from small-degree users to large-degree users are larger than the ones in the opposite direction, the large-degree users' selections are recommended extensively by the traditional second-order CF algorithms. By considering the users' similarity direction and the second-order correlations to d...

  1. A New Approach for Cluster Based Collaborative Filters

    OpenAIRE

    R. Venu Babu; K. Srinivas; Anjali Devi, S.

    2010-01-01

    In modern e-commerce it is not easy for customers to find the most suitable goods of interest as more and more information is placed online (like movies, audios, books, documents etc...). So, in order to provide the most suitable, high-value information to customers of an e-commerce business system, a customized recommender system is required. Collaborative filtering has become a popular technique for reducing this information overload. While traditional collaborative filtering systems ha...

  2. The Marginalized Particle Filter for Automotive Tracking Applications

    OpenAIRE

    Eidehall, Andreas; Schön, Thomas; Gustafsson, Fredrik

    2005-01-01

    This paper deals with the problem of estimating the vehicle surroundings (lane geometry and the position of other vehicles), which is needed for intelligent automotive systems, such as adaptive cruise control, collision avoidance and lane guidance. This results in a nonlinear estimation problem. For automotive tracking systems, these problems are traditionally handled using the extended Kalman filter. In this paper we describe the application of the marginalized particle filter to this proble...

  3. Nonlinear Filter Based Image Denoising Using AMF Approach

    OpenAIRE

    Thivakaran, T. K.; Rm, Dr Chandrasekaran

    2010-01-01

    This paper proposes a new technique based on nonlinear Adaptive Median filter (AMF) for image restoration. Image denoising is a common procedure in digital image processing aiming at the removal of noise, which may corrupt an image during its acquisition or transmission, while retaining its quality. This procedure is traditionally performed in the spatial or frequency domain by filtering. The aim of image enhancement is to reconstruct the true image from the corrupted image....
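    The adaptive median idea (grow the window until the local median is not itself an impulse, then decide whether the pixel can be kept) can be sketched as follows. This is the generic two-stage adaptive median scheme in plain Python, not the paper's implementation; window limits are arbitrary.

```python
def adaptive_median(img, s_max=7):
    """2-D adaptive median filter (two-stage grow-the-window scheme)."""
    h, w = len(img), len(img[0])
    out = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            s = 3                                # start with a 3x3 window
            while True:
                r = s // 2
                win = [img[j][i]
                       for j in range(max(0, y - r), min(h, y + r + 1))
                       for i in range(max(0, x - r), min(w, x + r + 1))]
                win.sort()
                zmin, zmed, zmax = win[0], win[len(win) // 2], win[-1]
                if zmin < zmed < zmax:           # median is not an impulse
                    # Keep the pixel if it is not an impulse either.
                    out[y][x] = img[y][x] if zmin < img[y][x] < zmax else zmed
                    break
                s += 2                           # impulse-like median: grow window
                if s > s_max:
                    out[y][x] = zmed             # give up, output the median
                    break
    return out

# A salt impulse in a flat region is replaced; surrounding values survive.
img = [[5] * 5 for _ in range(5)]
img[2][2] = 255
print(adaptive_median(img)[2][2])  # 5
```

    Unlike a fixed-size median filter, the growing window lets the scheme remove dense impulse noise while leaving fine detail (values strictly between the local extremes) untouched.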

  4. Hybrid Filter Membrane

    Science.gov (United States)

    Laicer, Castro; Rasimick, Brian; Green, Zachary

    2012-01-01

    Cabin environmental control is an important issue for a successful Moon mission. Due to the unique environment of the Moon, lunar dust control is one of the main problems that significantly diminishes the air quality inside spacecraft cabins. Therefore, this innovation was motivated by NASA's need to minimize the negative health impact that air-suspended lunar dust particles have on astronauts in spacecraft cabins. It is based on fabrication of a hybrid filter comprising nanofiber nonwoven layers coated on porous polymer membranes with uniform cylindrical pores. This design results in a high-efficiency gas particulate filter with low pressure drop and the ability to be easily regenerated to restore filtration performance. A hybrid filter was developed consisting of a porous membrane with uniform, micron-sized, cylindrical pore channels coated with a thin nanofiber layer. Compared to conventional filter media such as a high-efficiency particulate air (HEPA) filter, this filter is designed to provide high particle efficiency, low pressure drop, and the ability to be regenerated. These membranes have well-defined micron-sized pores and can be used independently as air filters with a discrete particle-size cut-off, or coated with nanofiber layers for filtration of ultrafine nanoscale particles. The filter has a thin design intended to facilitate filter regeneration by localized air pulsing. The main novelty of this invention is the combination of a micro-engineered straight-pore membrane with nanofibers. The micro-engineered straight-pore membrane can be prepared with extremely high precision. Because the resulting membrane pores are straight and not tortuous like those found in conventional filters, the pressure drop across the filter is significantly reduced. The nanofiber layer is applied as a very thin coating to enhance filtration efficiency for fine nanoscale particles.
Additionally, the thin nanofiber coating is designed to promote capture of dust particles on the filter surface and to facilitate dust removal with pulse or back airflow.

  5. Interdigital cavity filter design

    International Nuclear Information System (INIS)

    The lower hybrid current drive system is subject to interference from jamming signals at the EAST experiment site, so an interdigital cavity filter with a passband centered at 2.45 GHz was designed to suppress the jamming signal. This paper gives a detailed description of the theoretical analysis and practical design of the interdigital cavity filter; the filter was simulated with HFSS software, which greatly improves the efficiency and accuracy of the design. (authors)

  6. Spot- Zombie Filtering System

    OpenAIRE

    Arathy Rajagopal; Geethanjali, B.; Arulprakash. P

    2014-01-01

    A major security challenge on the Internet is the existence of a large number of compromised machines. Such machines have been increasingly used to launch various security attacks, including spamming, spreading malware, DDoS, and identity theft. These compromised machines are called “Zombies”. In general, e-mail applications and providers use spam filters to filter out spam messages. Spam filtering is a technique for discriminating the genuine message from the spam mes...

  7. Sparse Norm Filtering

    OpenAIRE

    Ye, Chengxi; Tao, Dacheng; Song, Mingli; Jacobs, David W.; Wu, Min

    2013-01-01

    Optimization-based filtering smooths an image by minimizing a fidelity function while simultaneously preserving edges by exploiting a sparse norm penalty over gradients. It has obtained promising performance in practical problems, such as detail manipulation, HDR compression and deblurring, and thus has received increasing attention in the fields of graphics, computer vision and image processing. This paper derives a new type of image filter called the sparse norm filter (SNF) from o...

  8. Decentralised Power Active Filters

    Directory of Open Access Journals (Sweden)

    Marek Roch

    2004-01-01

    Full Text Available This paper deals with decentralised power active filter control based on the separate computation of the reference current for each active filter operating at a determined harmonic frequency. The basic principle of such a controlled active filter is explained, and it is shown how the nth harmonic component of the reference current can be calculated. Simulation results are shown at the end of the paper.
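    The per-harmonic reference computation can be sketched with a plain Fourier correlation over one fundamental period, which is the quantity a decentralised filter cell tuned to one harmonic would track. The load-current waveform and its 20% fifth harmonic are invented for illustration; the paper's actual reference-current algorithm may differ.

```python
import math

def nth_harmonic(samples, n):
    """Amplitude and phase of the n-th harmonic over one fundamental period.

    Correlates the sampled waveform with cos/sin at n times the fundamental
    frequency (valid for 0 < n < len(samples)/2).
    """
    m = len(samples)
    c = sum(s * math.cos(2 * math.pi * n * k / m) for k, s in enumerate(samples)) * 2 / m
    d = sum(s * math.sin(2 * math.pi * n * k / m) for k, s in enumerate(samples)) * 2 / m
    return math.hypot(c, d), math.atan2(d, c)

# Hypothetical load current: unit fundamental plus a 20% fifth harmonic.
m = 200
i_load = [math.sin(2 * math.pi * k / m) + 0.2 * math.sin(5 * 2 * math.pi * k / m)
          for k in range(m)]
amp5, _ = nth_harmonic(i_load, 5)
print(round(amp5, 3))  # 0.2, the 5th-harmonic reference amplitude
```

    Because the discrete sinusoids are exactly orthogonal over a full period, each cell's correlation isolates its own harmonic regardless of what the other cells compensate.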

  9. Kernel Monte Carlo Filter

    OpenAIRE

    Kanagawa, Motonobu; Nishiyama, Yu; Gretton, Arthur; Fukumizu, Kenji

    2013-01-01

    Filtering methods for state-space models have been successfully applied to a wide range of applications. However, standard methods assume that the observation model is explicitly known, at least in a parametric form. This can be a problem for tasks where the observation model cannot be easily obtained. This paper proposes a filtering method for such situations based on the recent nonparametric framework of RKHS embeddings. The proposed kernel Monte Carlo filter combines Mont...

  10. Cross-flow filter

    International Nuclear Information System (INIS)

    A filter for concentrating radioactive sludge is described. It comprises a housing, open at the top and having an inlet and outlet for the sludge at the bottom and an outlet for the filtrate. An assembly of open-ended filter tubes forms a module which is removable from the housing by remote handling equipment. The assembly includes headers which locate in the inlet and outlet. The filter tubes are of bonded carbon, lined with metal oxide. (author)

  11. FPGA BASED FIR FILTER

    OpenAIRE

    SUVARNA JOSHI; BHARATI AINAPURE

    2010-01-01

    This paper gives a brief overview of the basic structure and hardware characteristics of the Finite Impulse Response (FIR) digital filter. The FIR filter has been designed efficiently using MATLAB and implemented on a Field Programmable Gate Array (FPGA) platform. The MATLAB FDATool has been used to determine the filter coefficients, and a 4th-order 32-bit filter has been prototyped. The design has been prototyped on an XC3S500-4FG320 in the Spartan-3E platform using Integrated Synthesis Environment (ISE) 9.1/10.1...
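    Although the paper obtains its coefficients from MATLAB's FDATool, the same kind of low-pass FIR coefficients can be derived by hand with the textbook windowed-sinc method. The tap count and cutoff below are arbitrary examples, not the paper's 4th-order design.

```python
import math

def fir_lowpass(num_taps, cutoff):
    """Windowed-sinc low-pass FIR design with a Hamming window.

    cutoff is the normalized frequency in cycles/sample (0 < cutoff < 0.5).
    """
    m = num_taps - 1
    h = []
    for n in range(num_taps):
        k = n - m / 2.0
        ideal = 2 * cutoff if k == 0 else math.sin(2 * math.pi * cutoff * k) / (math.pi * k)
        window = 0.54 - 0.46 * math.cos(2 * math.pi * n / m)
        h.append(ideal * window)
    s = sum(h)                      # normalize for unity DC gain
    return [c / s for c in h]

def convolve(x, h):
    """Direct-form FIR filtering (full convolution)."""
    y = [0.0] * (len(x) + len(h) - 1)
    for i, xi in enumerate(x):
        for j, hj in enumerate(h):
            y[i + j] += xi * hj
    return y

h = fir_lowpass(33, 0.1)
print(round(sum(h), 6))  # 1.0 (unity DC gain)
```

    The direct-form convolution above is exactly the multiply-accumulate structure that maps onto FPGA DSP slices in an implementation like the one described.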

  12. Approximate Kalman filtering

    CERN Document Server

    Chen, G

    1993-01-01

    The Kalman filtering algorithm gives optimal (linear, unbiased, and minimum error-variance) estimates of the unknown state vectors of a linear dynamic-observation system, under regular conditions such as perfect data information, complete noise statistics, exact linear modeling, ideal well-conditioned matrices in computation, and strictly centralized filtering. In practice, however, one or more of the aforementioned conditions may not be satisfied, so that the standard Kalman filtering algorithm cannot be directly used, and hence "approximate Kalman filtering" becomes necessary. In the last decad...

  13. Oriented Fiber Filter Media

    Directory of Open Access Journals (Sweden)

    R. Bharadwaj

    2008-06-01

    Full Text Available Coalescing filters are widely used throughout industry, and improved performance will reduce droplet emissions and operating costs. Experimental observations show that the orientation of microfibers in filter media affects the permeability and separation efficiency of the media. In this work two methods are used to align the fibers and thereby alter the filter structure. The results show that axially aligned fiber media improve the quality factor by about 20%, and cutting media at an angle from a thick layered medium can improve performance by about 40%. The results also show that the improved performance is not monotonically correlated with the average fiber angle of the medium.

  14. Filters in nuclear facilities

    International Nuclear Information System (INIS)

    The topics of the nine papers given include the behavior of HEPA filters during exposure to air flows of high humidity as well as of high differential pressure, the development of steel-fiber filters suitable for extreme operating conditions, and the occurrence of various radioactive iodine species in the exhaust air from boiling water reactors. In an introductory presentation the German view of the performance requirements to be met by filters in nuclear facilities as well as the present status of filter quality assurance are discussed. (orig.)

  15. Preaching in the Lutheran Tradition

    DEFF Research Database (Denmark)

    Lorensen, Marlene Ringgaard

    2010-01-01

    The ministry of preaching has traditionally been regarded as the most important characteristic - the sine qua non - of the Lutheran church. Luther characterized preaching as an indispensable means of grace and regarded it as central to the church liturgy. Contemporary Lutheran preachers, however, often find themselves in a dilemma, trying to integrate traditional Lutheran ideals with contemporary practical experiences of preaching. The following portrait of preaching in the Lutheran tradition is written from the perspective of the Evangelical Lutheran Church of Denmark. On one hand, there is a strong Reformation emphasis on the belief that Praedicatio verbi Dei est verbum Dei - the preaching of the Word of God is the Word of God. On the other hand, the majority of preachers cannot easily make themselves advocates for continuing the category "Word of God" as a homiletical basis. Dialectical theology's attempt to revive the category in the middle of the 20th century led homiletics into grave difficulties and has been accused of considerable co-responsibility for the drying out of the church's preaching tradition. The claim that preaching is the Word of God has, in the opinion of many preachers and homileticians, led to too much listener-immune, monological preaching. Yet, although most preachers struggle to identify with a traditional high homiletics, the continuing study of Luther within Danish theology contributes to a continued consciousness among preachers that proclamation demands to be understood, somehow, as the "Word of God".

  16. Distortion Parameters Analysis Method Based on Improved Filtering Algorithm

    Directory of Open Access Journals (Sweden)

    ZHANG Shutuan

    2013-10-01

    Full Text Available In order to realize accurate distortion parameter testing of an aircraft power supply system, and to satisfy the requirements of the corresponding equipment in the aircraft, a novel power parameter test system based on an improved filtering algorithm is introduced in this paper. The hardware of the test system is portable and provides high-speed data acquisition and processing, and the software uses LabWindows/CVI as the development environment and adopts a pre-processing technique together with the improved filtering algorithm. Compared with the traditional filtering algorithm, the improved filtering algorithm helps to increase the test accuracy. The application shows that the test system with the improved filtering algorithm achieves accurate test results and meets the design requirements.

  17. Ceramic filters for bulk inoculation of nickel alloy castings

    Directory of Open Access Journals (Sweden)

    F. Binczyk

    2011-07-01

    Full Text Available The work includes the results of research on the production technology of ceramic filters which, besides the traditional filtering function, also play the role of an inoculant modifying the macrostructure of cast nickel alloys. To play this additional role, filters should demonstrate sufficient compression strength and ensure a proper flow rate of liquid alloy. The role of the inoculant is played by cobalt aluminate, introduced into the composition of the external coating in an amount from 5 to 10 wt.%. The required compression strength (over 1 MPa) is provided by the supporting layers deposited on the preform, which is a polyurethane foam. Based on a two-level fractional factorial experiment 2^(4-1), the significance of the impact of various technological parameters (independent variables) on selected functional parameters of the finished filters was determined. A significant effect of the number of supporting layers and of the sintering temperature of the filters after evaporation of the polyurethane foam was found.

  18. An Amplitude Spectral Capon Estimator with a Variable Filter Length

    DEFF Research Database (Denmark)

    Nielsen, Jesper Kjær; Smaragdis, Paris

    2012-01-01

    The filter bank methods have been a popular non-parametric way of computing the complex amplitude spectrum. So far, the length of the filters in these filter banks has been set to some constant value independently of the data. In this paper, we take the first step towards considering the filter length as an unknown parameter. Specifically, we derive a very simple and approximate way of determining the optimal filter length in a data-adaptive way. Based on this analysis, we also derive a model averaged version of the forward and the forward-backward amplitude spectral Capon estimators. Through simulations, we show that these estimators significantly improve the estimation accuracy compared to the traditional Capon estimators.

  19. Side loading filter apparatus

    International Nuclear Information System (INIS)

    A side loading filter chamber for use with radioactive gases is described. The equipment incorporates an inexpensive, manually operated, mechanism for aligning filter units with a number of laterally spaced wall openings and for removing the units from the chamber. (U.K.)

  20. Hodge filtered complex bordism

    OpenAIRE

    Hopkins, Michael J.; Quick, Gereon

    2012-01-01

    We construct Hodge filtered cohomology groups for complex manifolds that combine the topological information of generalized cohomology theories with geometric data of Hodge filtered holomorphic forms. This theory provides a natural generalization of Deligne cohomology. For smooth complex algebraic varieties, we show that the theory satisfies a projective bundle formula and $\\A^1$-homotopy invariance. Moreover, we obtain transfer maps along projective morphisms.

  1. Pinhole diffraction filter

    Science.gov (United States)

    Woodgate, B. E.

    1977-01-01

    A multistage diffraction filter consisting of a co-aligned series of pinholes on parallel sheets can be used as a nondegradable UV filter. The beam is attenuated as each pinhole diffracts radiation in a controlled manner into a divergent beam, and the following pinhole accepts only a small part of that beam.

  2. Filter Paper: Solution to High Self-Attenuation Corrections in HEPA Filter Measurements

    International Nuclear Information System (INIS)

    An 8 by 8 by 6 inch High Efficiency Particulate Air (HEPA) filter was measured as part of a uranium holdup survey in June of 2005, as it has been routinely measured every two months since 1998. Although the survey relies on gross gamma count measurements, this was one of a few measurements that had been converted to a quantitative measurement in 1998. The measurement was analyzed using the traditional Generalized Geometry Holdup (GGH) approach, using HMS3 software, with an area calibration and self-attenuation corrected with an empirical correction factor of 1.06. A result of 172 grams of 235U was reported. The actual quantity of 235U in the filter was approximately 1700 g. Because of this unusually large discrepancy, the measurement of HEPA filters will be discussed. Various techniques for measuring HEPA filters will be described using the measurement of a 24 by 24 by 12 inch HEPA filter as an example. A new method to correct for self-attenuation will be proposed for this measurement. Following the discussion of the 24 by 24 by 12 inch HEPA filter, the measurement of the 8 by 8 by 6 inch filter will be discussed in detail.
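    Why a fixed empirical factor of 1.06 fails for a heavily loaded filter can be seen from the standard far-field self-attenuation factor for a uniform slab. This is a generic formula sketch, not the HMS3/GGH code, and the attenuation values below are invented for illustration.

```python
import math

def slab_self_attenuation_cf(mu, t):
    """Far-field correction factor CF = x / (1 - exp(-x)) for a uniform slab.

    mu: linear attenuation coefficient (1/cm); t: slab thickness (cm); x = mu*t.
    CF approaches 1 for a thin, transparent deposit and grows without bound
    as the deposit becomes opaque to its own gamma rays.
    """
    x = mu * t
    if x == 0.0:
        return 1.0
    return x / (1.0 - math.exp(-x))

# A lightly loaded filter barely needs correction; a heavily loaded one does.
print(round(slab_self_attenuation_cf(0.02, 3.0), 3))   # thin deposit, CF near 1
print(round(slab_self_attenuation_cf(1.0, 3.0), 3))    # dense deposit, CF above 3
```

    A correction factor calibrated on a lightly loaded filter therefore underestimates the holdup by a large factor once the deposit becomes thick, which is consistent with the 172 g versus roughly 1700 g discrepancy described above.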

  3. RESEARCH ON SPATIAL FILTERS AND HOMOMORPHIC FILTERING METHODS

    OpenAIRE

    Abhinash Singla; Gurjant Singh; Kanwaljeet Kaur

    2012-01-01

    In image processing, denoising is one of the important tasks. Despite the significant research conducted on this topic, the development of efficient denoising methods is still a compelling challenge. In this paper, spatial filter methods are compared with homomorphic filtering methods. The spatial filter methods, such as the median filter and the Wiener filter, are based on simple formulas proposed by different authors. In the homomorphic filtering methods, NormalShrink and BayesShrink are used. ...

  4. Use of changing filters

    International Nuclear Information System (INIS)

    The invention concerns a changing filter for accommodating bulk contact material, e.g. activated carbon, for cleaning gas or air flows. The filter consists of a square casing in which the meandering filter bed, surrounded by perforated sheets with windings alternately facing the inflow and outflow sides, is accommodated. The windings of the filter bed are flattened on the outflow side, where the part of the perforated sheet partitions towards the flattened part is open on the outflow side and can be closed by perforated sheet lids. Pressure plates with air-permeable rubber or plastic mats are laid on the filter bed below the perforated sheet lids and tensioned against the bed. The advantage is that, while many applications remain possible, the danger of leaks at the mechanical parts of the closing lid is avoided. (orig./HP)

  5. Filter cake breaker systems

    Energy Technology Data Exchange (ETDEWEB)

    Garcia, Marcelo H.F. [Poland Quimica Ltda., Duque de Caxias, RJ (Brazil)

    2004-07-01

    Drilling fluid filter cakes are based on a combination of properly graded dispersed particles and polysaccharide polymers. High-efficiency filter cakes are formed by this combination, and their formation on wellbore walls during the drilling process has, among other roles, the task of protecting the formation from instantaneous or accumulative invasion of drilling fluid filtrate, granting stability to the well and production zones. The filter cake minimizes contact between the drilling fluid filtrate and the water, hydrocarbons, and clay existing in the formations. The uniform removal of the filter cake from the entire interval is a critical factor in the completion process. The main methods used to break the filter cake are classified into two groups, external or internal, according to their removal mechanism. The aim of this work is the presentation of these mechanisms as well as their efficiency. (author)

  6. Weighted guided image filtering.

    Science.gov (United States)

    Li, Zhengguo; Zheng, Jinghong; Zhu, Zijian; Yao, Wei; Wu, Shiqian

    2015-01-01

    It is known that local filtering-based edge-preserving smoothing techniques suffer from halo artifacts. In this paper, a weighted guided image filter (WGIF) is introduced by incorporating an edge-aware weighting into an existing guided image filter (GIF) to address the problem. The WGIF inherits the advantages of both global and local smoothing filters in the sense that: 1) the complexity of the WGIF is O(N) for an image with N pixels, which is the same as the GIF, and 2) the WGIF can avoid halo artifacts like the existing global smoothing filters. The WGIF is applied to single-image detail enhancement, single-image haze removal, and fusion of differently exposed images. Experimental results show that the resultant algorithms produce images with better visual quality while halo artifacts are reduced or avoided in the final images, with a negligible increase in running time. PMID:25415986
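    The local linear model q = a*I + b behind the GIF (and hence the WGIF) can be sketched in one dimension. This is the plain guided filter without the edge-aware weighting that the paper adds; the radius and regularization eps are arbitrary example values.

```python
def box_mean(x, r):
    """Mean over a sliding window of radius r (clamped at the borders)."""
    n = len(x)
    return [sum(x[max(0, i - r):min(n, i + r + 1)]) /
            (min(n, i + r + 1) - max(0, i - r)) for i in range(n)]

def guided_filter(guide, src, r=2, eps=1e-2):
    """1-D guided filter: per-window a = cov(I,p)/(var(I)+eps), b = mean(p) - a*mean(I)."""
    mean_i = box_mean(guide, r)
    mean_p = box_mean(src, r)
    corr_ip = box_mean([i * p for i, p in zip(guide, src)], r)
    var_i = [ii - mi * mi for ii, mi in zip(box_mean([i * i for i in guide], r), mean_i)]
    cov_ip = [c - mi * mp for c, mi, mp in zip(corr_ip, mean_i, mean_p)]
    a = [c / (v + eps) for c, v in zip(cov_ip, var_i)]
    b = [mp - ai * mi for mp, ai, mi in zip(mean_p, a, mean_i)]
    # Average the per-window coefficients, then apply the linear model.
    a_bar, b_bar = box_mean(a, r), box_mean(b, r)
    return [ab * i + bb for ab, i, bb in zip(a_bar, guide, b_bar)]

# Self-guided smoothing of a step: flat regions smooth, the edge survives.
step = [0.0] * 10 + [1.0] * 10
smoothed = guided_filter(step, step)
```

    Every box_mean is O(N) when implemented with running sums, which is where the O(N) complexity claim for the GIF and WGIF comes from; the WGIF additionally scales eps per pixel by an edge-aware weight.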

  7. Vena cava filter; Vena-cava-Filter

    Energy Technology Data Exchange (ETDEWEB)

    Helmberger, T. [Klinikum Bogenhausen, Institut fuer Diagnostische und Interventionelle Radiologie und Nuklearmedizin, Muenchen (Germany)

    2007-05-15

    Fulminant pulmonary embolism is one of the major causes of death in the Western World. In most cases, deep leg and pelvic venous thrombosis are the cause. If an anticoagulant/thrombotic therapy is no longer possible or ineffective, a vena cava filter implant may be indicated if an embolism is threatening. Implantation of the filter is a simple and safe intervention. Nevertheless, it is necessary to take into consideration that the data base for determining the indications for this treatment are very limited. Currently, a reduction in the risk of thromboembolism with the use of filters of about 30%, of recurrences of almost 5% and fatal pulmonary embolism of 1% has been reported, with a risk of up to 20% of filter induced vena cava thrombosis. (orig.) [German] Die fulminante Lungenembolie zaehlt zu den Haupttodesursachen in der westlichen Welt. In der Mehrzahl der Faelle sind tiefe Bein- und Beckenvenenthrombosen ursaechlich verantwortlich. Ist eine antikoagulative/-thrombotische Therapie nicht (mehr) moeglich oder unwirksam, kann bei drohender Emboliegefahr die Vena-cava-Filterimplantation indiziert sein. Die Filterimplantation ist eine einfache und sehr sichere Intervention. Dennoch muss bei der Indikationsstellung beruecksichtigt werden, dass die Datenlage zur Wirksamkeit sehr limitiert ist. So wird aktuell ueber eine Reduktion des Thrombembolierisikos um 30% bei Embolierezidiven von knapp 5% und fatalen Lungenembolien von 1% unter Filterprophylaxe berichtet, bei einem Risiko von bis zu 20% fuer die filterinduzierte Vena-cava-Thrombose. (orig.)

  8. Noise Removal of Spaceborne SAR Image Based on the FIR Digital Filter

    Directory of Open Access Journals (Sweden)

    Wei Zhang

    2010-01-01

    Full Text Available The speckle effect inevitably exists in Synthetic Aperture Radar (SAR) images. Removal of speckle noise is a necessary step before automatic partition, classification, target detection and extraction of other quantitative information from a SAR image, so it is very meaningful to eliminate, or at least strongly restrain, the speckle noise without reducing the spatial resolution of the image. In this paper, an FIR filter is used to remove the noise in the SAR image, and the optimal filtering coefficients are selected through experiment and analysis during the filtering process. The results show that the FIR filter is better than other traditional filtering methods at keeping the radiolocation features and restraining the speckle noise in SAR images, and its filtering speed is faster. At the same time, the selection of the filtering coefficients largely influences the de-noising performance of the FIR filter.
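As a hedged illustration of the direct-form FIR filtering the abstract relies on (not the authors' coefficient-selection procedure, and with made-up data), a moving-average FIR applied to a 1-D signal with an isolated speckle-like spike:

```python
def fir_filter(x, taps):
    """Direct-form FIR: y[n] = sum_k taps[k] * x[n-k], zero initial state."""
    y = []
    for n in range(len(x)):
        acc = 0.0
        for k, h in enumerate(taps):
            if n - k >= 0:
                acc += h * x[n - k]
        y.append(acc)
    return y

# A 5-tap moving average as a crude speckle smoother (illustrative taps).
taps = [0.2] * 5
noisy = [1.0, 1.0, 5.0, 1.0, 1.0, 1.0, 1.0]   # isolated spike at n = 2
smooth = fir_filter(noisy, taps)
```

The spike of amplitude 5 is spread and attenuated (no output sample exceeds 2), which is the trade-off between noise suppression and resolution the abstract discusses: longer taps smooth more speckle but blur more detail.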

  9. Frequency Spectrum Based Low-Area Low-Power Parallel FIR Filter Design

    Directory of Open Access Journals (Sweden)

    Jin-Gyun Chung

    2002-09-01

    Full Text Available Parallel (or block) FIR digital filters can be used either for high-speed or for low-power (with reduced supply voltage) applications. Traditional parallel filter implementations cause a linear increase in hardware cost with respect to the block size. Recently, an efficient parallel FIR filter implementation technique requiring a less-than-linear increase in the hardware cost was proposed. This paper makes two contributions. First, the filter spectrum characteristics are exploited to select the best fast filter structures. Second, a novel block filter quantization algorithm is introduced. Using filter benchmarks, it is shown that the use of appropriate fast FIR filter structures and the proposed quantization scheme can reduce the number of binary adders by up to 20%.

  10. The recognition of traditional midwives.

    Science.gov (United States)

    Lecky-Thompson, M

    1994-12-01

    This article explores the reasons why Australian traditional midwives need to be recognised by their registered sisterhood. This issue is of particular significance as NSW midwives gear themselves for a campaign to redress the 1991 changes to the NSW Nurses Act. This Act threatens in a variety of ways to denigrate the future of midwifery as an autonomous profession. Do we leave the plight of the traditional midwife to feminist groups to raise and address, or do we work in harmony and with a responsibility to the less recognised, to achieve a profession that is responsive to all women's needs, whatever their race, locale and class? PMID:7887820

  11. Derivative free filtering using Kalmtool

    DEFF Research Database (Denmark)

    Bayramoglu, Enis (Technical University of Denmark)

    2010-01-01

    In this paper we present a toolbox enabling easy evaluation and comparison of different filtering algorithms. The toolbox is called Kalmtool 4 and is a set of MATLAB tools for state estimation of nonlinear systems. The toolbox contains functions for extended Kalman filtering as well as for the DD1 and DD2 filters. It also contains functions for unscented Kalman filters, as well as several versions of particle filters. The toolbox requires MATLAB version 7, but no additional toolboxes are required.

  12. Static Filtered Skin Detection

    Directory of Open Access Journals (Sweden)

    Rehanullah Khan

    2012-03-01

    Full Text Available A static skin filter defines explicitly (using a number of rules) the boundaries the skin cluster has in a color space. Single or multiple ranges of threshold values for each color space component are created, and the image pixel values falling within these range(s) for all the chosen color components are defined as skin pixels. In this paper, we investigate and evaluate static skin filters for skin segmentation. As a contribution, two new static skin filters for the IHLS and CIELAB color spaces are developed. The two new static filters and four state-of-the-art static filters in the YCbCr, HSI, RGB and normalized RGB color spaces are evaluated on the two datasets DS1 and DS2 on the basis of F-measure. Experimental results reveal the feasibility of the developed static skin filters. We also found that since the static filters use static boundaries, any shift of skin color ranges away from the static boundaries will result in varying performance. Therefore, the F-measure rankings of the color spaces are different for the datasets DS1 and DS2.

  13. Defueling filter test

    International Nuclear Information System (INIS)

    The Three Mile Island Unit 2 Reactor (TMI-2) has sustained core damage creating a significant quantity of fine debris, which can become suspended during the planned defueling operations and will have to be constantly removed to maintain water clarity and minimize radiation exposure. To accomplish these objectives, a Defueling Water Cleanup System (DWCS) has been designed. One of the primary components in the DWCS is a custom designed filter canister using an all stainless steel filter medium. The full scale filter canister is designed to remove suspended solids from 800 microns to 0.5 microns in size. Filter cartridges are fabricated into an element cluster to provide for a flowrate of greater than 100 gals/min. Babcock and Wilcox (B and W), under contract to GPU Nuclear Corporation, has evaluated two candidate DWCS filter concepts in a 1/100 scale proof-of-principle test program at B and W's Lynchburg Research Center. The filters were challenged with simulated solids suspensions of 1400 and 140 ppm in borated water (5000 ppm boron). Test data collected include solids loading, effluent turbidity, and differential pressure trends versus time. From the proof-of-principle test results, a full-scale filter canister was generated.

  14. Intraindividual evaluation of the influence of iterative reconstruction and filter kernel on subjective and objective image quality in computed tomography of the brain

    International Nuclear Information System (INIS)

    Objectives: To intraindividually evaluate the potential of 4th generation iterative reconstruction (IR) on brain CT with regard to subjective and objective image quality. Methods: 31 consecutive raw data sets of clinical routine native sequential brain CT scans were reconstructed with IR level 0 (= filtered back projection), 1, 3 and 4; 3 different brain filter kernels (smooth/standard/sharp) were applied respectively. Five independent radiologists with different levels of experience performed subjective image rating. Detailed ROI analysis of image contrast and noise was performed. Statistical analysis was carried out by applying a random intercept model. Results: Subjective scores for the smooth and the standard kernels were best at low IR levels, but both, in particular the smooth kernel, scored inferior with an increasing IR level. The sharp kernel scored lowest at IR 0, while the scores substantially increased at high IR levels, reaching significantly best scores at IR 4. Objective measurements revealed an overall increase in contrast-to-noise ratio at higher IR levels, which was highest when applying the soft filter kernel. The absolute grey-white contrast decreased with an increasing IR level and was highest when applying the sharp filter kernel. All subjective effects were independent of the raters' experience and the patients' age and sex. Conclusion: Different combinations of IR level and filter kernel substantially influence subjective and objective image quality of brain CT. (orig.)

  15. EMI filter design

    CERN Document Server

    Ozenbaugh, Richard Lee

    2011-01-01

    With today's electrical and electronics systems requiring increased levels of performance and reliability, the design of robust EMI filters plays a critical role in EMC compliance. Using a mix of practical methods and theoretical analysis, EMI Filter Design, Third Edition presents both a hands-on and academic approach to the design of EMI filters and the selection of components values. The design approaches covered include matrix methods using table data and the use of Fourier analysis, Laplace transforms, and transfer function realization of LC structures. This edition has been fully revised

  16. Filtering, stability, and robustness

    Science.gov (United States)

    van Handel, Ramon

    The theory of nonlinear filtering concerns the optimal estimation of a Markov signal in noisy observations. Such estimates necessarily depend on the model that is chosen for the signal and observations processes. This thesis studies the sensitivity of the filter to the choice of underlying model over long periods of time, within the framework of continuous time filtering with white noise type observations. The first topic of this thesis is the asymptotic stability of the filter, which is studied using the theory of conditional diffusions. This leads to improvements on pathwise stability bounds, and to new insight into existing stability results in a fully probabilistic setting. Furthermore, I develop in detail the theory of conditional diffusions for finite-state Markov signals and clarify the duality between estimation and stochastic control in this context. The second topic of this thesis is the sensitivity of the nonlinear filter to the model parameters of the signal and observations processes. This section concentrates on the finite state case, where the corresponding model parameters are the jump rates of the signal, the observation function, and the initial measure. The main result is that the expected difference between the filters with the true and modified model parameters is bounded uniformly on the infinite time interval, provided that the signal process satisfies a mixing property. The proof uses properties of the stochastic flow generated by the filter on the simplex, as well as the Malliavin calculus and anticipative stochastic calculus. The third and final topic of this thesis is the asymptotic stability of quantum filters. I begin by developing quantum filtering theory using reference probability methods. The stability of the resulting filters is not easily studied using the preceding methods, as smoothing violates the nondemolition requirement. Fortunately, progress can be made by randomizing the initial state of the filter. 
Using this technique, I prove that the filtered estimate of the measurement observable is stable regardless of the underlying model, provided that the initial states are absolutely continuous in a suitable sense.

  17. Traditional Navajo Maps and Wayfinding

    Science.gov (United States)

    Francis, Harris; Kelley, Klara

    2005-01-01

    An example of the way finding process when using verbal and other traditional maps among the Navajo Indians of the southwestern United States is presented. The scholarly literature on the Southwest offers examples of verbal maps that construct both linear space, such as trails, and broad geographical space, including hunting territories and large…

  18. Africanisms in Gullah Oral Tradition.

    Science.gov (United States)

    Holloway, Joseph E.

    1989-01-01

    The Sea Islands off the coast of South Carolina, Georgia, and Northern Florida retain almost every element of African culture, including language, oral tradition, folklore, and aesthetics. Examines the African influence in the lifestyle of the Gullah people of the Sea Islands, especially in terms of their concept of time. (AF)

  19. The Filtered Abel Transform and Its Application in Combustion Diagnostics

    Science.gov (United States)

    Simons, Stephen N. (Technical Monitor); Yuan, Zeng-Guang

    2003-01-01

    Many non-intrusive combustion diagnosis methods generate line-of-sight projections of a flame field. To reconstruct the spatial field of the measured properties, these projections need to be deconvoluted. When the spatial field is axisymmetric, commonly used deconvolution methods include the Abel transform, the onion peeling method, and the two-dimensional Fourier transform method and its derivatives such as the filtered back projection methods. This paper proposes a new approach for performing the Abel transform, which possesses the exactness of the Abel transform and the flexibility of incorporating various filters in the reconstruction process. The Abel transform is an exact method and the simplest among these commonly used methods. It is evinced in this paper that all exact reconstruction methods for axisymmetric distributions must be equivalent to the Abel transform because of its uniqueness and exactness. Detailed proof is presented to show that the two-dimensional Fourier method, when applied to axisymmetric cases, is identical to the Abel transform. Discrepancies among the various reconstruction methods stem from the different approximations made to perform numerical calculations. An equation relating the spectrum of a set of projection data to that of the corresponding spatial distribution is obtained, which shows that the spectrum of the projection is equal to the Abel transform of the spectrum of the corresponding spatial distribution. From this equation, if either the projection or the distribution is bandwidth limited, the other is also bandwidth limited, and both have the same bandwidth. If the two are not bandwidth limited, the Abel transform has a bias against low-wave-number components in most practical cases. This explains why the Abel transform and all exact deconvolution methods are sensitive to high-wave-number noise.
The filtered Abel transform is based on the fact that the Abel transform of filtered projection data is equal to an integral transform of the original projection data, with the kernel function being the Abel transform of the filtering function. The kernel function is independent of the projection data and can be obtained separately once the filtering function is selected. Users can select the best filtering function for a particular set of experimental data. Once the kernel function is obtained, it can be applied repeatedly to a number of projection data sets (rows) from the same experiment. When an entire flame image that contains a large number of projection lines needs to be processed, the new approach significantly reduces the computational effort in comparison with the conventional approach, in which each projection data set is deconvoluted separately. Computer codes have been developed to perform the filtered Abel transform for an entire flame field. Measured soot volume fraction data from a jet diffusion flame are processed as an example.
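For orientation, the forward operation that all of these deconvolution methods invert, the line-of-sight projection of an axisymmetric field, can be sketched numerically. This is an illustrative check against the analytic projection of a uniform disk, not the paper's filtered-transform code; function names are ours.

```python
import math

def project(f, y, R=1.0, n=2000):
    """Line-of-sight projection P(y) = integral of f(sqrt(x^2 + y^2)) dx
    along the chord through an axisymmetric field f of radius R, by a
    midpoint Riemann sum. This forward Abel transform is what methods
    like the Abel inversion and onion peeling deconvolute."""
    half = math.sqrt(max(R * R - y * y, 0.0))  # half chord length at offset y
    dx = 2.0 * half / n
    return sum(f(math.hypot(-half + (i + 0.5) * dx, y)) for i in range(n)) * dx

# Uniform unit disk: the projection is known analytically, 2*sqrt(1 - y^2).
disk = lambda r: 1.0 if r <= 1.0 else 0.0
p = project(disk, 0.5)
```

Comparing the numeric result at y = 0.5 with the analytic value 2*sqrt(0.75) verifies the discretization; real projection data would replace this synthetic field.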

  20. A novel metamaterial filter with stable passband performance based on frequency selective surface

    OpenAIRE

    Fang, C. Y.; Gao, J. S.; Hai Liu

    2014-01-01

    In this paper, a novel metamaterial filter based on a frequency selective surface (FSS) is proposed. Using the mode matching method, we theoretically studied the transmission performance of the structure. Results show that, by rotating its neighboring elements by 90 degrees, the novel filter has better stability with respect to the angle of incidence than traditional structures for both TE and TM polarization. As the incident angle varies from 0 to 50 degrees, the metamaterial filter exhibits a transmittance higher tha...

  1. Design and Implementation for a Non Linear State Filter for LEO Micro Satellite

    OpenAIRE

    Chouraqui, S.; Benyettou, M.

    2009-01-01

    This study preliminarily investigates the numerical application of both the Extended Kalman Filter (EKF), which has traditionally been used for nonlinear estimation, and a relatively new filter, the Unscented Kalman Filter (UKF), to the nonlinear estimation problem. The new method can be applied to nonlinear systems without the linearization process necessary for the EKF, and it does not demand a Gaussian distribution of noise; what's more, its ease of implementation and more accurate estimation fe...

  2. Time Weight Update Model Based on the Memory Principle in Collaborative Filtering

    OpenAIRE

    Dan Li; Peng Cao; Yucui Guo; Min Lei

    2013-01-01

    Collaborative filtering is the most widely used technology in the recommender systems. Existing collaborative filtering algorithms do not take the time factor into account. However, users’ interests always change with time, and traditional collaborative filtering cannot reflect the changes. In this paper, the change of users’ interests is considered as the memory process, and a time weight iteration model is designed based on memory principle. For a certain user, the proposed mode...

  3. A Robust Collaborative Filtering Recommendation Algorithm Based on Multidimensional Trust Model

    OpenAIRE

    Dongyan Jia; Fuzhi Zhang; Sai Liu

    2013-01-01

    Collaborative filtering is one of the widely used technologies in the e-commerce recommender systems. It can predict the interests of a user based on the rating information of many other users. But the traditional collaborative filtering recommendation algorithm has the problems such as lower recommendation precision and weaker robustness. To solve these problems, in this paper we present a robust collaborative filtering recommendation algorithm based on multidimensional trust model. Firstly,...

  4. Toward Green Cloud Computing: An Attribute Clustering Based Collaborative Filtering Method for Virtual Machine Migration

    OpenAIRE

    Zhang Liu-Mei; Ma Jian-Feng; Wang Yi-Chuan; Lu Di

    2013-01-01

    In this study, an attribute clustering based collaborative filtering algorithm is presented for virtual machine migration towards green Cloud computing. The algorithm utilizes similarity characteristics of virtual machine task related attributes, especially CPU related attributes, to filter redundant data by feature selection. K-Means clustering is then referenced to effectively solve the rating scale problem existing in the traditional collaborative filtering recommendation algorithm. Exper...

  5. Remotely serviced filter and housing

    Science.gov (United States)

    Ross, Maurice J. (Pocatello, ID); Zaladonis, Larry A. (Idaho Falls, ID)

    1988-09-27

    A filter system for a hot cell comprises a housing adapted for input of air or other gas to be filtered, flow of the air through a filter element, and exit of filtered air. The housing is tapered at the top to make it easy to insert a filter cartridge using an overhead crane. The filter cartridge holds the filter element while the air or other gas is passed through the filter element. Captive bolts in trunnion nuts are readily operated by electromechanical manipulators operating power wrenches to secure and release the filter cartridge. The filter cartridge is adapted to make it easy to change a filter element by using a master-slave manipulator at a shielded window station.

  6. Application of Unscented Kalman Filter for Sonar Signal Processing

    Directory of Open Access Journals (Sweden)

    Leela Kumari, B.; Padma Raju, K.

    2012-06-01

    Full Text Available State estimation theory is one of the best mathematical approaches to analyze variations in the states of a system or process. The state of the system is defined by a set of variables that provide a complete representation of the internal condition at any given instant of time. Filtering of random processes is referred to as estimation, and is a well-defined statistical technique. There are two types of state estimation processes, linear and nonlinear. Linear estimation of a system can easily be analyzed using the Kalman Filter (KF), which is used to compute the target state parameters with a priori information under a noisy environment. But the traditional KF is optimal only when the model is linear, and its performance is well defined under the assumptions that the system model and noise statistics are well known. Most state estimation problems are nonlinear, thereby limiting the practical applications of the KF. The Extended Kalman filter (EKF) is the nonlinear version of the Kalman filter, which linearizes about the current mean and covariance; it has been considered the standard in the theory of nonlinear state estimation. Since truly linear systems do not really exist, a novel transformation is adopted: the Unscented Kalman filter and the Particle filter are the best known nonlinear estimators. The approach in this paper is to analyze an algorithm for maneuvering target tracking using bearings-only measurements, where the UKF provides a better probability of state estimation.
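The unscented transform at the heart of the UKF can be sketched in the scalar case: instead of linearizing as the EKF does, it propagates a small set of sigma points through the nonlinearity. This is a generic textbook-style illustration (the alpha/beta/kappa parameter names follow common UKF conventions), not the algorithm evaluated in the paper.

```python
import math

def unscented_transform_1d(mean, var, f, alpha=0.1, beta=2.0, kappa=0.0):
    """Scalar unscented transform: propagate (mean, var) through a
    nonlinearity f using 2n+1 = 3 sigma points (n = 1)."""
    n = 1
    lam = alpha ** 2 * (n + kappa) - n
    spread = math.sqrt((n + lam) * var)
    sigmas = [mean, mean + spread, mean - spread]
    wm = [lam / (n + lam)] + [1.0 / (2 * (n + lam))] * 2  # mean weights
    wc = list(wm)
    wc[0] += 1 - alpha ** 2 + beta                        # covariance weights
    ys = [f(x) for x in sigmas]
    y_mean = sum(w * y for w, y in zip(wm, ys))
    y_var = sum(w * (y - y_mean) ** 2 for w, y in zip(wc, ys))
    return y_mean, y_var

# Sanity check: for a linear f the transform is exact.
# mean 2*3 + 1 = 7, variance 2^2 * 2 = 8.
m, v = unscented_transform_1d(3.0, 2.0, lambda x: 2 * x + 1)
```

For a genuinely nonlinear f (e.g. a bearing measurement) the sigma points capture the posterior mean and covariance to higher order than the EKF's first-order linearization, which is the advantage the abstract alludes to.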

  7. Metalcasting: Filtering Molten Metal

    International Nuclear Information System (INIS)

    A more efficient method has been created to filter cast molten metal for impurities. Read about the resulting energy and money savings that can accrue to many different industries from the use of this exciting new technology

  8. Scalable bloom filters

    OpenAIRE

    Baquero, Carlos; Almeida, Paulo Sérgio; Preguiça, Nuno

    2007-01-01

    Bloom filters provide space-efficient storage of sets at the cost of a probability of false positives on membership queries. The size of the filter must be defined a priori based on the number of elements to store and the desired false positive probability, being impossible to store extra elements without increasing the false positive probability. This leads typically to a conservative assumption regarding maximum set size, possibly by orders of magnitude, and a consequent space waste. This p...
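For reference, the fixed-size Bloom filter that the scalable variant generalizes works as sketched below: k hash probes set bits on insertion, and a query reports membership only if all k probed bits are set. This is a minimal illustrative sketch (class and parameter names are ours, not the paper's).

```python
import hashlib

class BloomFilter:
    """Minimal Bloom filter: k hash probes into an m-bit array.
    m and k must be fixed a priori, which is exactly the limitation
    scalable Bloom filters address."""
    def __init__(self, m=1024, k=3):
        self.m, self.k = m, k
        self.bits = bytearray(m)  # one byte per bit, for simplicity

    def _probes(self, item):
        # Derive k independent probe positions from salted SHA-256 digests.
        for i in range(self.k):
            h = hashlib.sha256(f"{i}:{item}".encode()).digest()
            yield int.from_bytes(h[:8], "big") % self.m

    def add(self, item):
        for pos in self._probes(item):
            self.bits[pos] = 1

    def __contains__(self, item):
        # May return True for an item never added (false positive),
        # but never False for an item that was added.
        return all(self.bits[pos] for pos in self._probes(item))

bf = BloomFilter()
bf.add("alpha")
print("alpha" in bf)  # True (no false negatives)
```

As more items are added the false-positive rate rises toward 1, since no extra capacity can be added to a fixed m; the paper's scalable construction chains additional filters instead.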

  9. Sliding Bloom Filters

    OpenAIRE

    Naor, Moni; Yogev, Eylon

    2013-01-01

    A Bloom filter is a method for reducing the space (memory) required for representing a set by allowing a small error probability. In this paper we consider a \\emph{Sliding Bloom Filter}: a data structure that, given a stream of elements, supports membership queries of the set of the last $n$ elements (a sliding window), while allowing a small error probability. We formally define the data structure and its relevant parameters and analyze the time and memory requirements need...

  10. Complex Hilbert Transform Filter

    OpenAIRE

    Hannu Olkkonen; Olkkonen, Juuso T.

    2011-01-01

    Hilbert transform is a basic tool for constructing analytical signals for various applications such as amplitude modulation, envelope and instantaneous frequency analysis, quadrature decoding, shift-invariant multi-rate signal processing and Hilbert-Huang decomposition. This work introduces a complex Hilbert transform (CHT) filter, in which the real and imaginary parts are a Hilbert transform pair. The CHT filtered signal is analytic, i.e. its Fourier transform is zero in the negative frequency ran...
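The analytic-signal property described here (Fourier transform zero at negative frequencies) can be illustrated with a DFT-based construction: zero the negative-frequency bins and double the positive ones. A dependency-free sketch of that generic construction, not the paper's CHT filter:

```python
import cmath
import math

def analytic_signal(x):
    """Analytic signal via the DFT: zero the negative-frequency bins,
    double the positive ones, inverse-transform. The O(N^2) DFT keeps
    this sketch dependency-free; an FFT would be used in practice."""
    N = len(x)
    X = [sum(x[n] * cmath.exp(-2j * cmath.pi * k * n / N) for n in range(N))
         for k in range(N)]
    for k in range(N):
        if 0 < k < N / 2:
            X[k] *= 2       # positive frequencies doubled
        elif k > N / 2:
            X[k] = 0        # negative frequencies removed
    return [sum(X[k] * cmath.exp(2j * cmath.pi * k * n / N) for k in range(N)) / N
            for n in range(N)]

# The Hilbert transform of cos is sin: it appears as the imaginary part.
N = 16
x = [math.cos(2 * math.pi * n / N) for n in range(N)]
z = analytic_signal(x)
```

The real part of z reproduces the input and the imaginary part is its Hilbert transform, so |z| gives the envelope and the phase of z the instantaneous frequency, the applications the abstract lists.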

  11. Bloofi: Multidimensional Bloom Filters

    OpenAIRE

    Crainiceanu, Adina; Lemire, Daniel

    2015-01-01

    Bloom filters are probabilistic data structures commonly used for approximate membership problems in many areas of Computer Science (networking, distributed systems, databases, etc.). With the increase in data size and distribution of data, problems arise where a large number of Bloom filters are available and all of them need to be searched for potential matches. As an example, in a federated cloud environment, each cloud provider could encode the information using Bloom filt...

  12. Robustifying Vector Median Filter

    OpenAIRE

    Valentín Gregori; Samuel Morillas

    2011-01-01

    This paper describes two methods for impulse noise reduction in colour images that outperform the vector median filter from the noise reduction capability point of view. Both methods work by determining first the vector median in a given filtering window. Then, the use of complimentary information from componentwise analysis allows to build robust outputs from more reliable components. The correlation among the colour channels is taken into account in the processing and, as a r...

  13. Hybrid Data Assimilation without Ensemble Filtering

    Science.gov (United States)

    Todling, Ricardo; Akkraoui, Amal El

    2014-01-01

    The Global Modeling and Assimilation Office is preparing to upgrade its three-dimensional variational system to a hybrid approach in which the ensemble is generated using a square-root ensemble Kalman filter (EnKF) and the variational problem is solved using the Grid-point Statistical Interpolation system. As in most EnKF applications, we found it necessary to employ a combination of multiplicative and additive inflations to compensate for sampling and modeling errors, respectively, and to maintain the small-member ensemble solution close to the variational solution; we also found it necessary to re-center the members of the ensemble about the variational analysis. During tuning of the filter we found re-centering and additive inflation to play a considerably larger role than expected, particularly in a dual-resolution context when the variational analysis is run at a larger resolution than the ensemble. This led us to consider a hybrid strategy in which the members of the ensemble are generated by simply converting the variational analysis to the resolution of the ensemble and applying additive inflation, thus bypassing the EnKF. Comparisons of this, so-called, filter-free hybrid procedure with an EnKF-based hybrid procedure and a control non-hybrid, traditional, scheme show both hybrid strategies to provide equally significant improvement over the control; more interestingly, the filter-free procedure was found to give qualitatively similar results to the EnKF-based procedure.

  14. Venous Thromboembolism After Removal of Retrievable Inferior Vena Cava Filters

    International Nuclear Information System (INIS)

    The purpose of this study was to examine the incidence of new or recurrent venous thromboembolism (VTE) after retrieval of inferior vena cava (IVC) filters and the risk factors associated with such recurrence. Between March 2001 and September 2008, at our institution, implanted retrievable vena cava filters were retrieved in 76 patients. The incidence of new or recurrent VTE after retrieval was reviewed and numerous variables were analyzed to assess risk factors for redevelopment of VTE after filter retrieval. In 5 (6.6%) of the 76 patients, redevelopment or worsening of VTE was seen after retrieval of the filter. Three patients (4.0%) had recurrent deep venous thrombosis (DVT) in the lower extremities and 2 (2.6%) had development of pulmonary embolism, resulting in death. Although there was no significant difference in the incidence of new or recurrent VTE related to any risk factor investigated, the tendency for development of VTE after filter retrieval was higher in patients in whom DVT in the lower extremities had been so severe during filter implantation that interventional radiological therapies in addition to traditional anticoagulation therapies were required (40% in patients with recurrent VTE vs. 23% in those without VTE; p = 0.5866 according to Fisher's exact probability test) and in patients in whom DVT remained at the time of filter retrieval (60% in patients with recurrent VTE vs. 37% in those without VTE; p = 0.3637). In conclusion, new or recurrent VTE was rare after retrieval of IVC filters but was most likely to occur in patients who had severe DVT during filter implantation and/or in patients with a DVT that remained at the time of filter retrieval. We must point out that the fatality rate from PE after filter removal was high (2.6%).

  15. The impact of metallic filter media on HEPA filtration

    International Nuclear Information System (INIS)

    Traditional HEPA filter systems have limitations that often prevent them from solving many of the filtration problems in the nuclear industry, particularly in applications where long service or storage life, high levels of radioactivity, dangerous decomposition products, chemical aggression, organic solvents, elevated operating temperatures, fire resistance and resistance to moisture are issues. This paper addresses several of these matters of concern by considering the use of metallic filter media to solve HEPA filtration problems ranging from the long term storage of transuranic waste at the WIPP site, spent and damaged fuel assemblies, glove box ventilation and tank venting, to the venting of fumes at elevated temperatures from incinerators, vitrification processes and conversion and sintering furnaces, as well as downstream of iodine absorbers in gas cooled reactors in the UK. The paper reviews the basic technology, development, performance characteristics and filtration efficiency, flow versus differential pressure, cleanability and costs of sintered metal fiber in comparison with traditional resin-bonded glass fiber filter media and sintered metal powder filter media. Examples of typical filter element and system configurations and applications will be presented. The paper will also address the economic case for installing self-cleaning pre-filtration, using metallic media, to recover the small volumes of dust that would otherwise blind large volumes of final disposable HEPA filters, thus presenting a route to reduce ultimate disposal volumes and secondary waste streams. (authors)

  16. Accurate and Efficient Filtering using Anisotropic Filter Decomposition

    OpenAIRE

    Soler, Cyril; Bagher, Mahdi; Nowrouzezahrai, Derek

    2013-01-01

    Efficient filtering remains an important challenge in computer graphics, particularly when filters are spatially-varying, have large extent, and/or exhibit complex anisotropic profiles. We present an efficient filtering approach for these difficult cases based on anisotropic filter decomposition (IFD). By decomposing complex filters into linear combinations of simpler, displaced isotropic kernels, and precomputing a compact prefiltered dataset, we are able to interactively apply any number of...

  17. Esoteric healing traditions: a conceptual overview.

    Science.gov (United States)

    Levin, Jeff

    2008-01-01

    This paper presents, for the first time, a comprehensive scholarly examination of the history and principles of major traditions of esoteric healing. After a brief conceptual overview of esoteric religion and healing, summaries are provided of eight major esoteric traditions, including descriptions of beliefs and practices related to health, healing, and medicine. These include what are termed the kabbalistic tradition, the mystery school tradition, the gnostic tradition, the brotherhoods tradition, the Eastern mystical tradition, the Western mystical tradition, the shamanic tradition, and the new age tradition. Next, commonalities across these traditions are summarized with respect to beliefs and practices related to anatomy and physiology; nosology and etiology; pathophysiology; and therapeutic modalities. Finally, the implications of this survey of esoteric healing are discussed for clinicians, biomedical researchers, and medical educators. PMID:18316053

  18. A FUZZY FILTERING MODEL FOR CONTOUR DETECTION

    Directory of Open Access Journals (Sweden)

    T.C. Rajakumar

    2011-04-01

    Full Text Available Contour detection is a basic operation in image processing. A fuzzy filtering technique is proposed to generate thick edges in two-dimensional gray images. Fuzzy logic is applied to extract a value for an image and is used for object contour detection. Fuzzy-based pixel selection can reduce the drawbacks of conventional methods (Prewitt, Roberts). In the traditional methods, one filter mask is used for all kinds of images; it may succeed on one kind of image but fail on another. In this framework the threshold parameter values are obtained from the fuzzy histogram of the input image. The fuzzy inference method selects the complete information about the border of the object, and the resultant image has less impulse noise and increased edge contrast. The extracted object contour is thicker than in the existing methods. The performance of the algorithm is tested with Peak Signal to Noise Ratio (PSNR) and Complex Wavelet Structural Similarity Metrics (CWSSIM).

  19. TRADITIONAL FERMENTED FOODS OF LESOTHO

    OpenAIRE

    Gadaga, Tendekayi H.; Molupe Lehohla; Victor Ntuli

    2013-01-01

    This paper describes the traditional methods of preparing fermented foods and beverages of Lesotho. Information on the preparation methods was obtained through a combination of literature review and face to face interviews with respondents from Roma in Lesotho. An unstructured questionnaire was used to capture information on the processes, raw materials and utensils used. Four products; motoho (a fermented porridge), Sesotho (a sorghum based alcoholic beverage), hopose (sorghum fermented beer...

  20. Post-traditional corporate governance

    OpenAIRE

    Mason, Michael; O Mahony, Joan

    2008-01-01

    Traditional definitions of corporate governance are narrow, focusing on legal relations between managers and shareholders. More recent definitions extend the boundaries of governance to consider the role that various stakeholders play in shaping the behaviour of firms. While stakeholding theory embraces a broader set of corporate constituencies, our argument in this paper is that even these definitions are too narrow – they lack the analytical capacity to account for the social embeddedness...

  1. Traditional transfusion practices are changing

    OpenAIRE

    Holcomb, John B.

    2010-01-01

    Schochl and co-authors have described a 5-year retrospective study that outlines a novel, important and controversial transfusion concept in seriously injured trauma patients. Traditionally, clinicians have been taught to use a serial approach, resuscitating hypovolemic trauma patients with a form of crystalloid or colloid, followed by red blood cells (RBCs), then fresh frozen plasma (FFP), and lastly platelets. The data supporting this widely accepted approach are remarkably weak. Conversely...

  2. Ginseng in Traditional Herbal Prescriptions

    OpenAIRE

    Park, Ho Jae; Kim, Dong Hyun; Park, Se Jin; Kim, Jong Min; Ryu, Jong Hoon

    2012-01-01

    Panax ginseng Meyer has been widely used as a tonic in traditional Korean, Chinese, and Japanese herbal medicines and in Western herbal preparations for thousands of years. In the past, ginseng was very rare and was considered to have mysterious powers. Today, the efficacy of drugs must be tested through well-designed clinical trials or meta-analyses, and ginseng is no exception. In the present review, we discuss the functions of ginseng described in historical documents and describe how thes...

  3. Software Development: Agile vs. Traditional

    OpenAIRE

    Stoica, Marian; Mircea, Marinela; Ghilic-micu, Bogdan

    2013-01-01

    Organizations face the need to adapt themselves to a complex business environment, in continuous change and transformation. Under these circumstances, organization agility is a key element in gaining strategic advantages and market success. Achieving and maintaining agility requires agile architectures, techniques, methods and tools, able to react in real time to change requirements. This paper proposes an incursion in the software development, from traditional to agile.

  4. Software Development: Agile vs. Traditional

    Directory of Open Access Journals (Sweden)

    Marian STOICA

    2013-01-01

    Full Text Available Organizations face the need to adapt themselves to a complex business environment, in continuous change and transformation. Under these circumstances, organization agility is a key element in gaining strategic advantages and market success. Achieving and maintaining agility requires agile architectures, techniques, methods and tools, able to react in real time to change requirements. This paper proposes an incursion in the software development, from traditional to agile.

  5. The Promise of Traditional Medicines

    OpenAIRE

    Inamdar, N. N.; Ansari, J. A.

    2010-01-01

    The usage of plants, plant extracts or plant-derived pure chemicals to treat disease become a therapeutic modality, which has stood the test of time. Today several pharmacological classes of drugs include a natural product prototype. Aspirin, atropine, ephedrine, digoxin, morphine, quinine, reserpine and tubocurarine are a few examples of modern drugs, which were originally discovered through the study of traditional cures and folk knowledge of indigenous people. A team work amongst ethnobota...

  6. Traditional communities, multinationals and biodiversity

    OpenAIRE

    Greissing, Anna; Le Tourneau, François-Michel

    2009-01-01

    The Iratapuru Sustainable Development Reserve is mainly exploited by the community of the São Francisco village. Due to its efforts to organize its members around a production co-operative and to improve their standard of living, but also as a result of massive funding from local and international institutions, this community has become a symbol for the actions of sustainable development undertaken with and for the benefit of « traditional » communities living in protected areas in the Amaz...

  7. Analog filters in nanometer CMOS

    CERN Document Server

    Uhrmann, Heimo; Zimmermann, Horst

    2013-01-01

    Starting from the basics of analog filters and the poor transistor characteristics in nanometer CMOS 10 high-performance analog filters developed by the authors in 120 nm and 65 nm CMOS are described extensively. Among them are gm-C filters, current-mode filters, and active filters for system-on-chip realization for Bluetooth, WCDMA, UWB, DVB-H, and LTE applications. For the active filters several operational amplifier designs are described. The book, furthermore, contains a review of the newest state of research on low-voltage low-power analog filters. To cover the topic of the book comprehen

  8. An IIR median hybrid filter

    Science.gov (United States)

    Bauer, Peter H.; Sartori, Michael A.; Bryden, Timothy M.

    1992-01-01

    A new class of nonlinear filters, the so-called class of multidirectional infinite impulse response median hybrid filters, is presented and analyzed. The input signal is processed twice using a linear shift-invariant infinite impulse response filtering module: once with normal causality and a second time with inverted causality. The final output of the MIMH filter is the median of the two-directional outputs and the original input signal. Thus, the MIMH filter is a concatenation of linear filtering and nonlinear filtering (a median filtering module). Because of this unique scheme, the MIMH filter possesses many desirable properties which are both proven and analyzed (including impulse removal, step preservation, and noise suppression). A comparison to other existing median type filters is also provided.
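
    The two-directional scheme described above can be sketched in a few lines of Python. This is a minimal illustration, not the paper's implementation: it assumes a first-order IIR predictor as the linear shift-invariant module, with an arbitrary smoothing coefficient.

```python
def iir_predict(x, a=0.5):
    """Causal first-order IIR predictor: output at n uses samples up to n-1."""
    y, state = [], x[0]
    for s in x:
        y.append(state)                  # prediction from past samples only
        state = a * s + (1 - a) * state
    return y

def mimh_filter(x, a=0.5):
    """Median of the forward pass, the backward pass, and the input itself."""
    fwd = iir_predict(x, a)              # normal causality
    bwd = iir_predict(x[::-1], a)[::-1]  # inverted causality
    return [sorted((f, b, s))[1] for f, b, s in zip(fwd, bwd, x)]

# The impulse at n=3 is suppressed while the later step level is preserved.
print(mimh_filter([0, 0, 0, 9, 0, 1, 1, 1, 1]))
# → [0, 0, 0, 0.5, 1, 1, 1, 1, 1]
```

    The median stage is what gives the impulse removal and step preservation the abstract mentions: an outlier can dominate at most one of the three median inputs.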

  9. Bloomier Filters: A second look

    OpenAIRE

    Charles, Denis; Chellapilla, Kumar

    2008-01-01

    A Bloom filter is a space efficient structure for storing static sets, where the space efficiency is gained at the expense of a small probability of false-positives. A Bloomier filter generalizes a Bloom filter to compactly store a function with a static support. In this article we give a simple construction of a Bloomier filter. The construction is linear in space and requires constant time to evaluate. The creation of our Bloomier filter takes linear time which is faster t...
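
    For readers unfamiliar with the baseline structure, a plain Bloom filter (the set-membership structure that the Bloomier filter generalizes) can be sketched as follows; the bit-array size and hash construction are illustrative choices, not those of the article.

```python
import hashlib

class BloomFilter:
    """Plain Bloom filter: set membership with one-sided (false-positive) error.
    A Bloomier filter extends this idea to store function values, not just membership."""
    def __init__(self, m_bits=1024, k_hashes=4):
        self.m, self.k = m_bits, k_hashes
        self.bits = bytearray(m_bits)        # one byte per bit, for clarity

    def _positions(self, item):
        # k independent hashes derived by salting a single SHA-256
        for i in range(self.k):
            h = hashlib.sha256(f"{i}:{item}".encode()).hexdigest()
            yield int(h, 16) % self.m

    def add(self, item):
        for p in self._positions(item):
            self.bits[p] = 1

    def __contains__(self, item):
        return all(self.bits[p] for p in self._positions(item))

bf = BloomFilter()
for word in ["alice", "bob", "carol"]:
    bf.add(word)
print("alice" in bf)     # True: a Bloom filter never gives false negatives
print("mallory" in bf)   # almost surely False at this low load factor
```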

  10. NOTCH FILTER USING SIMULATED INDUCTOR

    Directory of Open Access Journals (Sweden)

    D.SUSAN,

    2011-06-01

    Full Text Available The design of analog filters at low frequencies is impractical because the size of the inductors becomes very large. In such cases, simulated inductors using operational amplifiers are used. This paper deals with the implementation of a notch filter using a band-pass filter built around a simulated inductor; direct implementation of the notch filter with a simulated inductor is not possible because the inductor would be floating. The design of the notch filter and its simulation in PSPICE are presented.

  11. Boolean filters of distributive lattices

    Directory of Open Access Journals (Sweden)

    M. Sambasiva Rao

    2013-07-01

    Full Text Available In this paper we introduce the notion of Boolean filters in a pseudo-complemented distributive lattice and characterize the class of all Boolean filters. Further, a set of equivalent conditions is derived for a proper filter to become a prime Boolean filter. Also, a set of equivalent conditions is derived for a pseudo-complemented distributive lattice to become a Boolean algebra. Finally, a Boolean filter is characterized in terms of congruences.

  12. DOE HEPA filter test program

    International Nuclear Information System (INIS)

    This standard establishes essential elements of a Department of Energy (DOE) program for testing HEPA filters to be installed in DOE nuclear facilities or used in DOE-contracted activities. A key element is the testing of HEPA filters for performance at a DOE Filter Test Facility (FTF) prior to installation. Other key elements are (1) providing for a DOE HEPA filter procurement program, and (2) verifying that HEPA filters to be installed in nuclear facilities appear on a Qualified Products List (QPL)

  13. EMD-based 60-Hz noise filtering of the ECG.

    Science.gov (United States)

    Nimunkar, Amit J; Tompkins, Willis J

    2007-01-01

    This study used empirical mode decomposition (EMD) for filtering power line noise in electrocardiogram signals. When the signal-to-noise ratio (SNR) is low, the power line noise is separated out as the first intrinsic mode function (IMF), but when the SNR is high, part of the signal is decomposed along with the noise into the first IMF. To overcome this problem, we add a pseudo-noise at a frequency higher than the highest frequency of the signal so that only the power line noise is filtered out in the first IMF. The results are compared with traditional IIR-based bandstop filtering. This technique is also implemented for filtering power line noise during enhancement of stress ECG signals. PMID:18002354
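
    The traditional IIR-based bandstop filtering used as the comparison baseline can be sketched as a second-order notch biquad. The coefficient recipe below is the widely used RBJ cookbook form with an assumed sampling rate and Q, not parameters from the paper.

```python
import math

def notch_coeffs(f0, fs, q=30.0):
    """Biquad notch (RBJ cookbook form): unity gain except near f0."""
    w0 = 2 * math.pi * f0 / fs
    alpha = math.sin(w0) / (2 * q)
    b = [1.0, -2 * math.cos(w0), 1.0]          # zeros exactly on the unit circle
    a = [1 + alpha, -2 * math.cos(w0), 1 - alpha]
    return [c / a[0] for c in b], [c / a[0] for c in a]

def biquad(x, b, a):
    """Direct-form I second-order IIR filter."""
    y, x1, x2, y1, y2 = [], 0.0, 0.0, 0.0, 0.0
    for s in x:
        out = b[0]*s + b[1]*x1 + b[2]*x2 - a[1]*y1 - a[2]*y2
        x2, x1 = x1, s
        y2, y1 = y1, out
        y.append(out)
    return y

fs = 1000.0
b, a = notch_coeffs(60.0, fs)
hum = [math.sin(2 * math.pi * 60 * i / fs) for i in range(2000)]  # pure 60 Hz
out = biquad(hum, b, a)
tail = out[1000:]                  # after the transient has decayed
print(max(abs(v) for v in tail))   # far below the unit input amplitude
```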

  14. Cubature Kalman filtering for relative spacecraft attitude and position estimation

    Science.gov (United States)

    Zhang, Lijun; Yang, Huabo; Lu, Heping; Zhang, Shifeng; Cai, Hong; Qian, Shan

    2014-12-01

    A novel relative spacecraft attitude and position estimation approach based on the cubature Kalman filter is derived. The integrated sensor suite comprises the gyro sensors on each spacecraft and a vision-based navigation system on the deputy spacecraft. In the traditional algorithm, an assumption that the chief's body frame coincides with its Local Vertical Local Horizontal (LVLH) frame is made to construct the line-of-sight observations for convenience. To solve this problem, two relative quaternions that map the chief's LVLH frame to the deputy and chief body frames are involved. The general relative equations of motion for eccentric orbits are used to describe the positional dynamics. The implementation equations for the cubature Kalman filter are derived. Simulation results indicate that the proposed filter provides more accurate estimates of relative attitude and position than the extended Kalman filter.
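
    The heart of the cubature Kalman filter is the spherical-radial cubature rule: 2n equally weighted points propagated through the nonlinearity. A minimal sketch, assuming a diagonal state covariance so no full Cholesky factorization is needed:

```python
import math

def cubature_points(mean, cov_diag):
    """Spherical-radial cubature points for a diagonal-covariance state:
    2n points at mean +/- sqrt(n)*sqrt(P)*e_i, each with weight 1/(2n)."""
    n = len(mean)
    pts = []
    for i in range(n):
        for sign in (+1.0, -1.0):
            p = list(mean)
            p[i] += sign * math.sqrt(n) * math.sqrt(cov_diag[i])
            pts.append(p)
    return pts

def propagate_mean(f, mean, cov_diag):
    """Predicted mean: equally weighted average of the propagated points."""
    imgs = [f(p) for p in cubature_points(mean, cov_diag)]
    return [sum(v[j] for v in imgs) / len(imgs) for j in range(len(mean))]

# Sanity check: a linear map must leave the mean exact.
f = lambda x: [2 * x[0] + x[1], x[1]]
m = propagate_mean(f, [1.0, 3.0], [0.5, 0.2])
print(m)   # [5.0, 3.0]
```

    The same point set, pushed through the measurement model, yields the predicted measurement and cross-covariances in the full filter.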

  15. An Implementation of Content Boosted Collaborative Filtering Algorithm

    Directory of Open Access Journals (Sweden)

    Boddu Raja Sarath Kumar,

    2011-04-01

    Full Text Available Collaborative filtering (CF) systems have been proven to be very effective for personalized and accurate recommendations. These systems are based on recommendations derived from previous ratings by various users and products. Since the underlying database is very sparse, the missing values are imputed first, and on that basis a complete prediction dataset is made. In this paper, some standard computational techniques are applied within the framework of content-boosted collaborative filtering with imputed rating data to evaluate and produce CF predictions. The content-boosted collaborative filtering algorithm uses either naive Bayes or mean imputation, depending on the sparsity of the original CF rating dataset. Results are presented, showing that this approach performs better than a traditional content-based predictor and collaborative filters.
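
    A toy sketch of the idea, assuming item-mean imputation (the paper also uses naive Bayes, depending on sparsity) followed by a similarity-weighted CF prediction over the densified matrix; the data and the cosine similarity are illustrative choices.

```python
def impute_means(R):
    """Fill missing ratings (None) with each item's mean, a simple stand-in
    for the content-based imputation step."""
    filled = [row[:] for row in R]
    for j in range(len(R[0])):
        vals = [row[j] for row in R if row[j] is not None]
        mean = sum(vals) / len(vals)
        for row in filled:
            if row[j] is None:
                row[j] = mean
    return filled

def cosine(u, v):
    num = sum(a * b for a, b in zip(u, v))
    den = (sum(a * a for a in u) ** 0.5) * (sum(b * b for b in v) ** 0.5)
    return num / den

def predict(R, user, item):
    """Similarity-weighted average over the dense (imputed) matrix."""
    dense = impute_means(R)
    others = [v for v in range(len(R)) if v != user]
    sims = [cosine(dense[user], dense[v]) for v in others]
    vals = [dense[v][item] for v in others]
    return sum(s * r for s, r in zip(sims, vals)) / sum(sims)

# 3 users x 3 items; None = unrated
R = [[5, 3, None],
     [4, None, 4],
     [1, 1, 5]]
print(round(predict(R, 0, 2), 2))
```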

  16. Cleaning a gas filter

    International Nuclear Information System (INIS)

    Filter elements in a housing are used to filter particles from waste gas before it is discharged from radioactive waste processing plant. When the elements are clogged they are cleaned by a reverse flow of liquid, delivered through a valve while venting off the gas, until the housing is full of liquid. Then compressed air is delivered through a valve, above the liquid, and a drain valve is opened, to force the liquid through the filter elements and out of the housing. Elements are HEPA or ULPA rated and preferably of sintered metal fibres. The backwash liquid may be water with or without surfactant, caustic soda, nitric acid or organic solvent, and helps to remove small particles by neutralising electrostatic forces holding them in the filter element pores. The liquid and air are both delivered to the housing through filters which are rated at least as finely pored as the HEPA or ULPA elements. Used backwash liquid is collected for further treatment. Several housings may operate in parallel and be cleaned in sequence. (Author)

  17. Multilevel Mixture Kalman Filter

    Directory of Open Access Journals (Sweden)

    Chen Rong

    2004-01-01

    Full Text Available The mixture Kalman filter is a general sequential Monte Carlo technique for conditional linear dynamic systems. It generates samples of some indicator variables recursively based on sequential importance sampling (SIS and integrates out the linear and Gaussian state variables conditioned on these indicators. Due to the marginalization process, the complexity of the mixture Kalman filter is quite high if the dimension of the indicator sampling space is high. In this paper, we address this difficulty by developing a new Monte Carlo sampling scheme, namely, the multilevel mixture Kalman filter. The basic idea is to make use of the multilevel or hierarchical structure of the space from which the indicator variables take values. That is, we draw samples in a multilevel fashion, beginning with sampling from the highest-level sampling space and then draw samples from the associate subspace of the newly drawn samples in a lower-level sampling space, until reaching the desired sampling space. Such a multilevel sampling scheme can be used in conjunction with the delayed estimation method, such as the delayed-sample method, resulting in delayed multilevel mixture Kalman filter. Examples in wireless communication, specifically the coherent and noncoherent 16-QAM over flat-fading channels, are provided to demonstrate the performance of the proposed multilevel mixture Kalman filter.
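
    The multilevel idea, drawing an indicator first in a coarse space and then refining it within the chosen subspace, can be illustrated on the paper's 16-QAM example: sample a quadrant first, then a constellation point inside it. The uniform within-quadrant draw and the weights below are placeholders for the SIS importance weights of the real filter.

```python
import random

# 16-QAM: each symbol is (I, Q) with I, Q in {-3, -1, 1, 3}.
LEVELS = [-3, -1, 1, 3]

def sample_symbol_multilevel(quadrant_weights, rng):
    """Two-level draw: first a quadrant (highest-level space), then one of
    the 4 constellation points inside it (lower-level space)."""
    quadrant = rng.choices([0, 1, 2, 3], weights=quadrant_weights)[0]
    i_sign = 1 if quadrant in (0, 3) else -1
    q_sign = 1 if quadrant in (0, 1) else -1
    i = rng.choice([1, 3]) * i_sign
    q = rng.choice([1, 3]) * q_sign
    return (i, q)

rng = random.Random(7)
sym = sample_symbol_multilevel([0.7, 0.1, 0.1, 0.1], rng)
print(sym in {(i, q) for i in LEVELS for q in LEVELS})   # True
```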

  18. Filter testing experience

    International Nuclear Information System (INIS)

    Completing the requisite ANSI N510 acceptance testing program on any project requires a team effort between the testing contractor, the owner, and the station designer. On a large multi-unit project with several ventilation systems serving both units and built-up filter systems in addition to the packaged filter units, team effort is paramount in achieving a successful program. This paper discusses several of the administrative and technical problems encountered during the successful execution of the ANSI N510 in-place testing program. The paper also reviews the ANSI N510 acceptance testing program requirements and provides an example of how these requirements were successfully implemented. It also reviews several of the challenging technical problems and solutions encountered. These included an interim testing, adjusting, and balancing (TAB) plan, TAB concepts, system operating combinations, the balancing of multiple plenum filter systems and plenum bypass leakage. The project is a twin 1100-MW pressurized water reactor electrical generating station. All HVAC systems process approximately 600,000 CFM through both package and built-up atmospheric cleanup filter units, with final filtration consisting of HEPA filters and charcoal adsorbers

  19. Systolic Filter Design Using Multi-Filtering Techniques

    Directory of Open Access Journals (Sweden)

    DEEPAN RAJ.B

    2013-04-01

    Full Text Available To improve the performance of the system in terms of processing power, a new architecture and clocking technique is realized in this paper: the signal is processed in Embedded Parallel Systolic Filters (EPSF) and the noise present in the signal is eliminated using a flag-bit and flicker-clock condition. The Kalman filter and extended Kalman filter are the filtering techniques used by the systolic arrays, which can be triggered simultaneously on all data elements with different clock cycles. The Kalman filter and extended Kalman filter, working in two conditions, namely with and without the flag-bit and flicker clock, are synthesized and compared.

  20. Filter against suspended matter

    International Nuclear Information System (INIS)

    The invention has been aimed at a filter against bacteria, viruses, carcinogenic and radioactive aerosols in solid or liquid forms. The filter presents a removal capacity of η(DOP) > 99.9%, has a variable surface with a low drag, can be folded by hand and has a stable form. It consists of glass fibres (36-46 percent in weight, 0.8 x 1.5 x 10^-4 cm in diameter), medium-staple cotton (17-25 percent in weight), and viscose staple fibre (33-43 percent in weight, Tex 0.17). The filter can be applied to respiratory protection apparatuses, breathing and anaesthetic apparatuses, and other medical instruments

  1. Nonlinear Filter Based Image Denoising Using AMF Approach

    CERN Document Server

    Thivakaran, T K

    2010-01-01

    This paper proposes a new technique based on the nonlinear Adaptive Median Filter (AMF) for image restoration. Image denoising is a common procedure in digital image processing, aiming at the removal of noise which may corrupt an image during its acquisition or transmission, while retaining its quality. This procedure is traditionally performed in the spatial or frequency domain by filtering. The aim of image enhancement is to reconstruct the true image from the corrupted image. The process of image acquisition frequently leads to degradation, and the quality of the digitized image becomes inferior to the original image. Filtering is a technique for enhancing the image. A linear filter is one in which the value of an output pixel is a linear combination of neighborhood values, which can produce blur in the image. Thus a variety of smoothing techniques have been developed that are nonlinear. The median filter is one of the most popular nonlinear filters. When considering a small neighborhood it is highly e...
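
    A simplified adaptive median filter can be sketched as follows; the window-growth logic follows the classic two-stage AMF scheme, though the paper's exact variant is not specified here.

```python
def adaptive_median(img, max_win=5):
    """Simplified adaptive median filter for salt-and-pepper noise.
    Stage A: grow the window until the median is not an extreme value.
    Stage B: replace the pixel only if it is itself an extreme value."""
    h, w = len(img), len(img[0])
    out = [row[:] for row in img]
    for y in range(h):
        for x in range(w):
            win = 3
            while True:
                r = win // 2
                vals = sorted(img[j][i]
                              for j in range(max(0, y - r), min(h, y + r + 1))
                              for i in range(max(0, x - r), min(w, x + r + 1)))
                lo, med, hi = vals[0], vals[len(vals) // 2], vals[-1]
                if lo < med < hi:                  # stage A passed
                    if not (lo < img[y][x] < hi):  # stage B: pixel is extreme
                        out[y][x] = med
                    break
                win += 2
                if win > max_win:                  # give up; use last median
                    out[y][x] = med
                    break
    return out

# Flat 10-valued patch with a single "salt" (255) impulse at the center
img = [[10] * 5 for _ in range(5)]
img[2][2] = 255
print(adaptive_median(img)[2][2])   # 10
```

    Unlike a fixed-window median, the adaptive variant preserves fine detail in clean regions because it only rewrites pixels it has classified as impulses.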

  2. Initial results of a new generation dual source CT system using only an in-plane comb filter for ultra-high resolution temporal bone imaging

    International Nuclear Information System (INIS)

    To prospectively evaluate radiation dose and image quality of a third generation dual-source CT (DSCT) without z-axis filter behind the patient for temporal bone CT. Forty-five patients were either examined on a first, second, or third generation DSCT in an ultra-high-resolution (UHR) temporal bone-imaging mode. On the third generation DSCT system, the tighter focal spot of 0.2 mm² removes the necessity for an additional z-axis-filter, leading to an improved z-axis radiation dose efficiency. Images of 0.4 mm were reconstructed using standard filtered-back-projection or iterative reconstruction (IR) technique for previous generations of DSCT and a novel IR algorithm for the third generation DSCT. Radiation dose and image quality were compared between the three DSCT systems. The statistically significantly highest subjective and objective image quality was evaluated for the third generation DSCT when compared to the first or second generation DSCT systems (all p < 0.05). Total effective dose was 63 %/39 % lower for the third generation examination as compared to the first and second generation DSCT. Temporal bone imaging without z-axis-UHR-filter and a novel third generation IR algorithm allows for significantly higher image quality while lowering effective dose when compared to the first two generations of DSCTs. (orig.)

  3. Shifted Linear Interpolation Filter

    Directory of Open Access Journals (Sweden)

    Hannu Olkkonen

    2010-12-01

    Full Text Available Linear interpolation has been adopted in many signal and image processing applications due to its simple implementation and low computational cost. In standard linear interpolation the kernel is the second order B-spline. In this work we show that the interpolation error can be remarkably diminished by using the time-shifted B-spline as an interpolation kernel. We verify by experimental tests that the optimal shift is. In VLSI and microprocessor circuits the shifted linear interpolation (SLI algorithm can be effectively implemented by the z-transform filter. The interpolation error of the SLI filter is comparable to the more elaborate higher order cubic convolution interpolation.
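
    A sketch of the SLI idea: a causal first-order prefilter computes the spline coefficients, after which evaluation is ordinary linear interpolation on a shifted grid. The shift tau is left as a free parameter here, since the optimal value is not reproduced in the abstract.

```python
import math

def sli_coeffs(x, tau):
    """Prefilter: solve x[n] = (1-tau)*c[n] + tau*c[n-1] causally (c[-1]=0).
    This is the z-transform filter mentioned in the abstract."""
    c, prev = [], 0.0
    for s in x:
        prev = (s - tau * prev) / (1.0 - tau)
        c.append(prev)
    return c

def sli_eval(x, t, tau):
    """Evaluate the shifted linear spline sum_k c[k]*tri(t - k - tau)."""
    c = sli_coeffs(x, tau)
    k = int(math.floor(t - tau))          # t lies in [k+tau, k+1+tau)
    frac = t - tau - k
    left = c[k] if 0 <= k < len(c) else 0.0
    right = c[k + 1] if 0 <= k + 1 < len(c) else 0.0
    return left * (1.0 - frac) + right * frac

samples = [0.0, 1.0, 4.0, 9.0, 16.0]
# The scheme still interpolates the samples exactly, for any shift tau:
print(sli_eval(samples, 3.0, 0.2))        # ≈ 9.0 (up to float rounding)
```

    With tau = 0 the prefilter is the identity and the scheme reduces to standard linear interpolation.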

  4. Digital filters in spectrometry

    International Nuclear Information System (INIS)

    This work presents the development and application of digital signal processing for different multichannel analysis spectra. The use of classic smoothing methods in signal processing applications is illustrated by a discussion of filters: autoregressive, moving average and ARMA filters. In general, simple linear smoothing routines do not provide appropriate smoothing of data that show local ruggedness such as strong discontinuities; however, the algorithms developed here have proven adequate for this task. Four algorithms were tested: autoregressive, moving average, ARMA and binomial methods for 5, 7 and 9 data points, all in the time domain and programmed in Matlab. (Author)
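
    The binomial smoother mentioned above is easy to sketch: the kernel is a normalized row of Pascal's triangle, applied by convolution. The edge-replication padding is an illustrative choice.

```python
def binomial_kernel(n):
    """Row n of Pascal's triangle, normalized: n=5 gives [1,4,6,4,1]/16."""
    row = [1]
    for _ in range(n - 1):
        row = [a + b for a, b in zip([0] + row, row + [0])]
    s = sum(row)
    return [v / s for v in row]

def smooth(data, n=5):
    """Zero-phase binomial smoothing with edge replication."""
    k = binomial_kernel(n)
    r = n // 2
    padded = [data[0]] * r + list(data) + [data[-1]] * r
    return [sum(k[j] * padded[i + j] for j in range(n))
            for i in range(len(data))]

spectrum = [0, 0, 1, 10, 1, 0, 0]        # a narrow peak on a flat background
print([round(v, 3) for v in smooth(spectrum)])
```

    Because the binomial kernel approaches a Gaussian as n grows, it smooths gently without the ringing that sharper kernels can introduce near discontinuities.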

  5. Efficient Iterated Filtering

    DEFF Research Database (Denmark)

    Lindström, Erik; Ionides, Edward

    2012-01-01

    Parameter estimation in general state space models is not trivial as the likelihood is unknown. We propose a recursive estimator for general state space models, and show that the estimates converge to the true parameters with probability one. The estimates are also asymptotically Cramer-Rao efficient. The proposed estimator is easy to implement as it only relies on non-linear filtering. This makes the framework flexible as it is easy to tune the implementation to achieve computational efficiency. This is done by using the approximation of the score function derived from the theory of iterated filtering as a building block within the recursive maximum likelihood estimator.

  6. Filters in topology optimization

    DEFF Research Database (Denmark)

    Bourdin, Blaise

    1999-01-01

    In this article, a modified (``filtered'') version of the minimum compliance topology optimization problem is studied. The direct dependence of the material properties on its pointwise density is replaced by a regularization of the density field using a convolution operator. In this setting it is possible to establish the existence of solutions. Moreover, convergence of an approximation by means of finite elements can be obtained. This is illustrated through some numerical experiments. The ``filtering'' technique is also shown to cope with two important numerical problems in topology optimization, \\emph{checkerboards} and \\emph{mesh dependent} designs.
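
    The convolution-based density regularization can be sketched in one dimension with the common cone (linearly decaying) weighting; the kernel choice is illustrative, not taken from the article.

```python
def density_filter(rho, radius):
    """Convolution-type density filter: each filtered density is a
    cone-weighted average of the densities within `radius` (1-D sketch)."""
    n = len(rho)
    out = []
    for i in range(n):
        num = den = 0.0
        for j in range(max(0, i - radius), min(n, i + radius + 1)):
            w = radius + 1 - abs(i - j)     # linearly decaying (cone) weight
            num += w * rho[j]
            den += w
        out.append(num / den)
    return out

# A checkerboard-like 0/1 density pattern is smeared toward 0.5,
# which is how filtering suppresses checkerboard designs.
rho = [1.0, 0.0] * 5
print([round(v, 2) for v in density_filter(rho, 2)])
```

    Because the filter radius is fixed in physical units, refining the mesh no longer changes the admissible designs, which is why the same device also removes mesh dependence.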

  7. Electronically tuned optical filters

    Science.gov (United States)

    Castellano, J. A.; Pasierb, E. F.; Oh, C. S.; Mccaffrey, M. T.

    1972-01-01

    A detailed account is given of efforts to develop a three layer, polychromic filter that can be tuned electronically. The operation of the filter is based on the cooperative alignment of pleochroic dye molecules by nematic liquid crystals activated by electric fields. This orientation produces changes in the optical density of the material and thus changes in the color of light transmitted through the medium. In addition, attempts to improve materials and devices which employ field induced changes of a cholesteric to a nematic liquid crystal are presented.

  8. Design of Wideband Microstrip Filters with Non-Equiripple Responses and Low Sensitivity

    CERN Document Server

    Gao, S S

    2013-01-01

    This paper presents a novel design procedure for wideband microstrip bandpass filters with non-equiripple filtering frequency responses and low sensitivity. Different from the traditional Chebyshev transfer function filters, the return loss zeros of the proposed non-equiripple filters can be redistributed within the operating passband. For the industrial applications, the proposed filters have a reduced sensitivity to manufacturing errors and exhibit good tolerance control for both specified bandwidth and maximum in-band reflection loss. By deriving the transfer functions, a synthesis approach with a set of non-linear equations can be established according to the specifications such as the bandwidth and predetermined reflection lobes. Without performing any post optimization in the full-wave simulation, the non-equiripple synthesized results have less sensitivity and fractional bandwidth (delta) error in comparison with those obtained from traditional Chebyshev transfer functions with equiripple frequency res...

  9. TRADITIONAL FERMENTED FOODS OF LESOTHO

    Directory of Open Access Journals (Sweden)

    Tendekayi H. Gadaga

    2013-06-01

    Full Text Available This paper describes the traditional methods of preparing fermented foods and beverages of Lesotho. Information on the preparation methods was obtained through a combination of literature review and face-to-face interviews with respondents from Roma in Lesotho. An unstructured questionnaire was used to capture information on the processes, raw materials and utensils used. Four products: motoho (a fermented porridge), Sesotho (a sorghum based alcoholic beverage), hopose (sorghum fermented beer with added hops) and mafi (spontaneously fermented milk), were found to be the main fermented foods prepared and consumed at household level in Lesotho. Motoho is a thin gruel, popular as a refreshing beverage as well as a weaning food. Sesotho is a sorghum based alcoholic beverage prepared for household consumption as well as for sale. It is consumed in the actively fermenting state. Mafi is the name given to spontaneously fermented milk with a thick consistency. Little research has been done on the technological aspects, including the microbiological and biochemical characteristics, of fermented foods in Lesotho. Some of the traditional aspects of the preparation methods, such as the use of earthenware pots, are being replaced, and modern equipment including plastic utensils is being used. There is need for further systematic studies on the microbiological and biochemical characteristics of these products.

  10. Traditional and Modern Morphometrics: Review

    Directory of Open Access Journals (Sweden)

    Gökhan OCAKOĞLU

    2013-01-01

    Full Text Available Morphometrics, a branch of morphology, is the study of the size and shape components of biological forms and their variation in the population. In biological and medical sciences, there is a long history of attempts to quantitatively express the diversity of the size and shape of biological forms. On the basis of historical developments in morphometry, we address several questions related to the shape of organs or organisms that are considered in biological and medical studies. In the field of morphometrics, multivariate statistical analysis is used to rigorously address such questions. Historically, these methods have involved the analysis of collections of distances or angles, but recent theoretical, computational, and other advances have shifted the focus of morphometric procedures to the Cartesian coordinates of anatomical points. In recent years, in biology and medicine, the traditional morphometric studies that aim to analyze shape variation have been replaced by modern morphometric studies. In the biological and medical sciences, morphometric methods are frequently preferred for examining the morphologic structures of organs or organisms with regard to diseases or environmental factors. These methods are also preferred for evaluating and classifying the variation of organs or organisms with respect to growth or allometry time dependently. Geometric morphometric methods are more valid than traditional morphometric methods in protecting more morphological information and in permitting analysis of this information.

  11. Manufacturing a low-cost ceramic water filter and filter system for the elimination of common pathogenic bacteria

    Science.gov (United States)

    Simonis, J. J.; Basson, A. K.

    Africa is one of the most water-scarce continents in the world but it is the lack of potable water which results in diarrhoea being the leading cause of death amongst children under the age of five in Africa (696 million children under 5 years old in Africa contract diarrhoea resulting in 2000 deaths per day: WHO and UNICEF, 2009). Most potable water treatment methods use bulk water treatment not suitable or available to the majority of rural poor in Sub-Saharan Africa. One simple but effective way of making sure that water is of good quality is by purifying it by means of a household ceramic water filter. The making and supply of water filters suitable for the removal of suspended solids, pathogenic bacteria and other toxins from drinking water is therefore critical. A micro-porous ceramic water filter with micron-sized pores was developed using the traditional slip casting process. This locally produced filter has the advantage of making use of less raw materials, cost, labour, energy and expertise and being more effective and efficient than other low cost produced filters. The filter is fitted with a silicone tube inserted into a collapsible bag that acts as container and protection for the filter. Enhanced flow is obtained through this filter system. The product was tested using water inoculated with high concentrations of different bacterial cultures as well as with locally polluted stream water. The filter is highly effective (log10 > 4 with 99.99% reduction efficiency) in providing protection from bacteria and suspended solids found in natural water. With correct cleaning and basic maintenance this filter technology can effectively provide drinking water to rural families affected by polluted surface water sources. This is an African solution for the more than 340 million people in Africa without access to clean drinking water (WHO and UNICEF, 2008).

  12. A New Stateless Packet Classification and Filter against DoS Attacks

    Directory of Open Access Journals (Sweden)

    Guang Jin

    2014-02-01

Full Text Available Capabilities are a typical scheme for stateless filtering. To classify and filter packets effectively, a novel capability-based packet classification and filtering scheme is proposed in this paper. In our scheme, a new classifier module is added and a new filter structure is designed. We employ capabilities for verification and introduce a new authorization step into the communications. These innovations give packet classification good performance in attack scenarios. Experimental results based on large-scale topology datasets and NS2 show that our scheme outperforms traditional packet classification algorithms, especially in complex network environments.

  13. Design and Implementation for a Non Linear State Filter for LEO Micro Satellite

    Directory of Open Access Journals (Sweden)

    S. Chouraqui

    2009-01-01

Full Text Available This study preliminarily investigates the numerical application of both the Extended Kalman Filter (EKF), which has traditionally been used for nonlinear estimation, and a relatively new filter, the Unscented Kalman Filter (UKF), to the nonlinear estimation problem. The new method can be applied to nonlinear systems without the linearization step the EKF requires, and it does not demand a Gaussian noise distribution; moreover, its ease of implementation and more accurate estimates demonstrate its good performance. The experimental results and analysis presented here indicate that the UKF performs better in the presence of severe nonlinearity in the state equations.
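The key mechanism that lets the UKF avoid linearization is the unscented transform: a small set of deterministically chosen sigma points is propagated through the nonlinearity and the output mean and covariance are reconstructed from weighted sums. A minimal sketch, assuming the commonly used scaling parameters alpha, beta, kappa (not taken from this study):

```python
import numpy as np

def unscented_transform(mean, cov, f, alpha=1e-3, beta=2.0, kappa=0.0):
    """Propagate (mean, cov) through a nonlinear function f via sigma points."""
    n = mean.size
    lam = alpha**2 * (n + kappa) - n
    S = np.linalg.cholesky((n + lam) * cov)          # matrix square root
    sigma = np.vstack([mean, mean + S.T, mean - S.T])  # 2n+1 sigma points
    Wm = np.full(2 * n + 1, 1.0 / (2.0 * (n + lam)))   # mean weights
    Wc = Wm.copy()                                     # covariance weights
    Wm[0] = lam / (n + lam)
    Wc[0] = lam / (n + lam) + (1.0 - alpha**2 + beta)
    Y = np.array([f(s) for s in sigma])              # push points through f
    y_mean = Wm @ Y
    d = Y - y_mean
    y_cov = (Wc[:, None] * d).T @ d
    return y_mean, y_cov
```

For a linear map the transform is exact, which is a useful sanity check: propagating N([1, 2], I) through x ↦ 2x recovers mean [2, 4] and covariance 4I.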

  14. Toward Green Cloud Computing: An Attribute Clustering Based Collaborative Filtering Method for Virtual Machine Migration

    Directory of Open Access Journals (Sweden)

    Zhang Liu-Mei

    2013-01-01

Full Text Available In this study, an attribute-clustering-based collaborative filtering algorithm is described for virtual machine migration towards green cloud computing. The algorithm exploits similarity among virtual machine task-related attributes, especially CPU-related attributes, and filters redundant data by feature selection. It then applies K-means clustering to address the rating-scale problems of the traditional collaborative filtering recommendation algorithm. Experiments cluster the data using virtual machine task-related information. By integrating a scaled rating scheme on task-related properties with the collaborative filtering philosophy, the method provides migration recommendations for system administrators.
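The clustering step the abstract references can be sketched with plain K-means over VM attribute vectors. Everything below is a hypothetical illustration (the attribute names and data are invented; the paper's actual feature set is not given here):

```python
import numpy as np

def kmeans(X, k, iters=50, seed=0):
    """Plain K-means: cluster the rows of X into k groups."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(iters):
        # assign each point to its nearest center
        dists = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=-1)
        labels = dists.argmin(axis=1)
        # move each center to the mean of its assigned points
        new = np.array([X[labels == j].mean(axis=0) if np.any(labels == j)
                        else centers[j] for j in range(k)])
        if np.allclose(new, centers):
            break
        centers = new
    return labels, centers

# hypothetical VM attribute vectors: (CPU utilisation %, CPU-ready %)
X = np.array([[5.0, 1.0], [6.0, 2.0], [4.0, 1.5],
              [90.0, 20.0], [85.0, 22.0], [92.0, 18.0]])
labels, centers = kmeans(X, k=2)   # separates lightly- and heavily-loaded VMs
```

Migration recommendations would then be computed per cluster rather than per VM, which is the rating-scale simplification the abstract alludes to.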

  15. The Rao-Blackwellized Particle Filter : A Filter Bank Implementation

    OpenAIRE

    Hendeby, Gustaf; Karlsson, Rickard; Gustafsson, Fredrik

    2010-01-01

    For computational efficiency, it is important to utilize model structure in particle filtering. One of the most important cases occurs when there exists a linear Gaussian substructure, which can be efficiently handled by Kalman filters. This is the standard formulation of the Rao-Blackwellized particle filter (RBPF). This contribution suggests an alternative formulation of this well-known result that facilitates reuse of standard filtering components and which is also suitable for object-orie...

  16. Generalized Selection Weighted Vector Filters

    Directory of Open Access Journals (Sweden)

    Rastislav Lukac

    2004-09-01

Full Text Available This paper introduces a class of nonlinear multichannel filters capable of removing impulsive noise in color images. The proposed generalized selection weighted vector filter class constitutes a powerful filtering framework for multichannel signal processing. Previously defined multichannel filters such as the vector median filter, basic vector directional filter, directional-distance filter, weighted vector median filters, and weighted vector directional filters are treated from a global viewpoint within the proposed framework. Robust order-statistic concepts and an increased degree of freedom in filter design make the proposed method attractive for a variety of applications. The introduced multichannel sigmoidal adaptation of the filter parameters and its modifications allow the filter parameters to accommodate varying signal and noise statistics. Simulation studies reported in this paper indicate that the proposed filter class is computationally attractive, yields excellent performance, and is able to preserve fine details and color information while efficiently suppressing impulsive noise. This paper is an extended version of the paper by Lukac et al. presented at the 2003 IEEE-EURASIP Workshop on Nonlinear Signal and Image Processing (NSIP '03) in Grado, Italy.
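The vector median filter that this framework generalizes outputs, for each sliding window, the input color vector that minimizes the aggregate distance to all other vectors in the window. A minimal sketch of that base operation (this is the classical definition, not the paper's weighted generalization):

```python
import numpy as np

def vector_median(window):
    """Vector median of a window of color vectors: returns the input vector
    with the smallest sum of Euclidean distances to all others."""
    window = np.asarray(window, dtype=float)   # shape (N, channels)
    # pairwise L2 distances between all vectors in the window
    d = np.linalg.norm(window[:, None, :] - window[None, :, :], axis=-1)
    return window[d.sum(axis=1).argmin()]

# a mostly-uniform RGB neighbourhood with one impulsive (salt) outlier
result = vector_median([[10, 10, 10], [11, 10, 10], [10, 11, 10], [255, 0, 0]])
```

Because the output is always one of the inputs, impulsive outliers are rejected rather than smeared, which is why these filters preserve edges and color information.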

  17. Downhole sand filter

    Energy Technology Data Exchange (ETDEWEB)

    Agalarov, D.M.; Aliev, N.I.; Aliev, R.M.; Aliev, Sh.N.; Guseynov, V.G.; Kurbanov, M.A.

    1981-01-01

A downhole sand filter is proposed consisting of two concentrically placed branch pipes. The space between the two pipes is filled with metallic beads. The outside pipe is composed of a diamagnetic material. Operational filter cleaning is made possible with a mechanical packer that is placed in the upper section of the inside branch pipe. Magnets with the same diameter as the branch pipe are placed so that their magnetic fields radiate away from the filter axis. These magnets are deployed in both the upper and lower annular spaces between the branch pipes and the two sleeves. Gasket seals are used between the sleeves and the perforated inside branch pipe. One of the sleeves is connected directly to the branch pipe. Ringed magnets are situated upon the spring-loaded upper face of the other sleeve. The inside branch pipe is also made of a diamagnetic material. Radial channels connect the sleeves, the inside branch pipe, and the filter chamber to the space between the sleeve and the inside branch pipe.

  18. Spectral Ensemble Kalman Filters.

    Czech Academy of Sciences Publication Activity Database

Mandel, Jan; Kasanický, Ivan; Vejmelka, Martin; Fuglík, Viktor; Turčičová, Marie; Eben, Kryštof; Resler, Jaroslav; Juruš, Pavel

    2014-01-01

Roč. 11, - (2014), EMS2014-446. [EMS Annual Meeting /14./ & European Conference on Applied Climatology (ECAC) /10./. 06.10.2014-10.10.2014, Prague] R&D Projects: GA ČR GA13-34856S Other grants: NSF DMS-1216481 Institutional support: RVO:67985807 Keywords: data assimilation * spectral filter Subject RIV: DG - Atmosphere Sciences, Meteorology

  19. Ceramic HEPA Filter Program

    Energy Technology Data Exchange (ETDEWEB)

    Mitchell, M A; Bergman, W; Haslam, J; Brown, E P; Sawyer, S; Beaulieu, R; Althouse, P; Meike, A

    2012-04-30

    Potential benefits of ceramic filters in nuclear facilities: (1) Short term benefit for DOE, NRC, and industry - (a) CalPoly HTTU provides unique testing capability to answer questions for DOE - High temperature testing of materials, components, filter, (b) Several DNFSB correspondences and presentations by DNFSB members have highlighted the need for HEPA filter R and D - DNFSB Recommendation 2009-2 highlighted a nuclear facility response to an evaluation basis earthquake followed by a fire (aka shake-n-bake) and CalPoly has capability for a shake-n-bake test; (2) Intermediate term benefit for DOE and industry - (a) Filtration for specialty applications, e.g., explosive applications at Nevada, (b) Spin-off technologies applicable to other commercial industries; and (3) Long term benefit for DOE, NRC, and industry - (a) Across industry, strong desire for better performance filter, (b) Engineering solution to safety problem will improve facility safety and decrease dependence on associated support systems, (c) Large potential life-cycle cost savings, and (d) Facilitates development and deployment of LLNL process innovations to allow continuous ventilation system operation during a fire.

  20. Ozone decomposing filter

    Science.gov (United States)

    Simandl, Ronald F. (Farragut, TN); Brown, John D. (Harriman, TN); Whinnery, Jr., LeRoy L. (Dublin, CA)

    1999-01-01

    In an improved ozone decomposing air filter carbon fibers are held together with a carbonized binder in a perforated structure. The structure is made by combining rayon fibers with gelatin, forming the mixture in a mold, freeze-drying, and vacuum baking.

  1. Magnetic-Optical Filter

    CERN Document Server

    Formicola, I; Pinto, C; Cerulo, P

    2007-01-01

The Magnetic-Optical Filter (MOF) is an instrument suited to high-precision spectral measurements thanks to its peculiar characteristics. It is employed in astronomy and in telecommunications (where it is called a FADOF). In this brief paper we summarize its fundamental structure and operation.

  2. Ceramic HEPA Filter Program

    International Nuclear Information System (INIS)

    Potential benefits of ceramic filters in nuclear facilities: (1) Short term benefit for DOE, NRC, and industry - (a) CalPoly HTTU provides unique testing capability to answer questions for DOE - High temperature testing of materials, components, filter, (b) Several DNFSB correspondences and presentations by DNFSB members have highlighted the need for HEPA filter R and D - DNFSB Recommendation 2009-2 highlighted a nuclear facility response to an evaluation basis earthquake followed by a fire (aka shake-n-bake) and CalPoly has capability for a shake-n-bake test; (2) Intermediate term benefit for DOE and industry - (a) Filtration for specialty applications, e.g., explosive applications at Nevada, (b) Spin-off technologies applicable to other commercial industries; and (3) Long term benefit for DOE, NRC, and industry - (a) Across industry, strong desire for better performance filter, (b) Engineering solution to safety problem will improve facility safety and decrease dependence on associated support systems, (c) Large potential life-cycle cost savings, and (d) Facilitates development and deployment of LLNL process innovations to allow continuous ventilation system operation during a fire.

  3. Spot- Zombie Filtering System

    Directory of Open Access Journals (Sweden)

    Arathy Rajagopal

    2014-01-01

Full Text Available A major security challenge on the Internet is the existence of a large number of compromised machines. Such machines have been increasingly used to launch various security attacks, including spamming, spreading malware, DDoS, and identity theft. These compromised machines are called "zombies". In general, e-mail applications and providers use spam filters to filter out spam messages. Spam filtering is a technique for discriminating genuine messages from spam. Attackers send spam messages to targeted machines while evading the filters, which increases false positives and false negatives. We develop an effective spam zombie detection system named SPOT by monitoring the outgoing messages of a network. SPOT focuses on the number of outgoing messages that are originated or forwarded by each computer on a network to identify the presence of zombies. SPOT is designed based on a powerful statistical tool called the Sequential Probability Ratio Test, which has bounded false positive and false negative error rates.
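The Sequential Probability Ratio Test underlying SPOT can be sketched for Bernoulli observations (each outgoing message is spam or not). This is Wald's generic SPRT; the spam-rate parameters p0, p1 and the error bounds alpha, beta below are illustrative assumptions, not SPOT's actual settings:

```python
import math

def sprt(observations, p0, p1, alpha=0.01, beta=0.01):
    """Wald's SPRT on a stream of 0/1 observations.
    H0: spam rate p0 (normal machine); H1: spam rate p1 (zombie).
    alpha/beta bound the false positive / false negative rates."""
    A = math.log((1 - beta) / alpha)   # accept H1 when the LLR crosses A
    B = math.log(beta / (1 - alpha))   # accept H0 when the LLR crosses B
    llr = 0.0
    for n, x in enumerate(observations, 1):
        # accumulate the log-likelihood ratio for this observation
        llr += math.log(p1 / p0) if x else math.log((1 - p1) / (1 - p0))
        if llr >= A:
            return "H1", n
        if llr <= B:
            return "H0", n
    return "undecided", len(observations)

print(sprt([1] * 20, 0.2, 0.8))  # flags a "zombie" after only 4 spam messages
```

The appeal for SPOT is exactly this sequential property: a decision with bounded error rates is usually reached after observing only a handful of messages per machine.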

  4. OPTIMIZATION OF ADVANCED FILTER SYSTEMS

    Energy Technology Data Exchange (ETDEWEB)

    R.A. Newby; M.A. Alvin; G.J. Bruck; T.E. Lippert; E.E. Smeltzer; M.E. Stampahar

    2002-06-30

    Two advanced, hot gas, barrier filter system concepts have been proposed by the Siemens Westinghouse Power Corporation to improve the reliability and availability of barrier filter systems in applications such as PFBC and IGCC power generation. The two hot gas, barrier filter system concepts, the inverted candle filter system and the sheet filter system, were the focus of bench-scale testing, data evaluations, and commercial cost evaluations to assess their feasibility as viable barrier filter systems. The program results show that the inverted candle filter system has high potential to be a highly reliable, commercially successful, hot gas, barrier filter system. Some types of thin-walled, standard candle filter elements can be used directly as inverted candle filter elements, and the development of a new type of filter element is not a requirement of this technology. Six types of inverted candle filter elements were procured and assessed in the program in cold flow and high-temperature test campaigns. The thin-walled McDermott 610 CFCC inverted candle filter elements, and the thin-walled Pall iron aluminide inverted candle filter elements are the best candidates for demonstration of the technology. Although the capital cost of the inverted candle filter system is estimated to range from about 0 to 15% greater than the capital cost of the standard candle filter system, the operating cost and life-cycle cost of the inverted candle filter system is expected to be superior to that of the standard candle filter system. Improved hot gas, barrier filter system availability will result in improved overall power plant economics. The inverted candle filter system is recommended for continued development through larger-scale testing in a coal-fueled test facility, and inverted candle containment equipment has been fabricated and shipped to a gasifier development site for potential future testing. 
Two types of sheet filter elements were procured and assessed in the program through cold flow and high-temperature testing. The Blasch, mullite-bonded alumina sheet filter element is the only candidate currently approaching qualification for demonstration, although this oxide-based, monolithic sheet filter element may be restricted to operating temperatures of 538 C (1000 F) or less. Many other types of ceramic and intermetallic sheet filter elements could be fabricated. The estimated capital cost of the sheet filter system is comparable to the capital cost of the standard candle filter system, although this cost estimate is very uncertain because the commercial price of sheet filter element manufacturing has not been established. The development of the sheet filter system could result in a higher reliability and availability than the standard candle filter system, but not as high as that of the inverted candle filter system. The sheet filter system has not reached the same level of development as the inverted candle filter system, and it will require more design development, filter element fabrication development, small-scale testing and evaluation before larger-scale testing could be recommended.

  5. Filtered containment venting

    International Nuclear Information System (INIS)

After the TMI accident in the USA, the Swedish Government in 1981 decided that all Swedish nuclear power plants must be upgraded to mitigate the consequences of severe accidents. Mitigating measures were required to be in place by 1985 for the Barsebaeck plants and by 1988 for the remaining plants. The technical solution selected for accident mitigation was filtered venting of the reactor containment. The filtering requirement was that the release of radioactivity should be less than 0.1% of the core inventory. In Barsebaeck the filter consists of a gravel bed with a volume of 10,000 cubic metres. For the other Swedish plants (7 BWRs and 3 PWRs) a wet scrubber system of significantly smaller volume (300-400 cubic metres) has been selected. This filter system, which is called FILTRA-MVSS (Multi Venturi Scrubber System), has been jointly developed by ASEA-ATOM and FLAEKT Industri, two companies belonging to the ASEA BROWN BOVERI Group of Companies. The FILTRA-MVSS, which can accommodate a wide range of flow rates using an automatic passive technique, consists of a number of venturi nozzles submerged in a pool of water. The venturi separation technique has been employed in the field of industrial air pollution control for several decades. The technique has now been further developed and adapted for the cleaning of contaminated radioactive off-gases that might result from a severe reactor accident. After the Chernobyl accident the discussion on filtered containment venting intensified also in other European countries, and several countries, for example France, the Federal Republic of Germany and Finland, are now planning filtered venting systems. The FILTRA-MVSS system can be designed to meet a wide range of hypothetical design basis events for both BWR and PWR plants.
The system can be optimized with respect to its size depending on various pressure requirements and it can be optimized for specified decontamination factors. (author). Poster presentation. 3 figs

  6. The Promise of Traditional Medicines

    Directory of Open Access Journals (Sweden)

    N.N. Inamdar

    2010-01-01

Full Text Available The use of plants, plant extracts or plant-derived pure chemicals to treat disease has become a therapeutic modality that has stood the test of time. Today several pharmacological classes of drugs include a natural-product prototype. Aspirin, atropine, ephedrine, digoxin, morphine, quinine, reserpine and tubocurarine are a few examples of modern drugs that were originally discovered through the study of traditional cures and the folk knowledge of indigenous people. Teamwork amongst ethnobotanists, ethnopharmacologists, physicians and phytochemists is essential for a fruitful outcome in medicinal plant research. While the ethnopharmacologists have a greater role in rationalizing combinations of activities, the phytochemist's role will shift slightly towards the standardization of herbal medicines.

  7. Traditional Grammar: An Interactive Book

    Science.gov (United States)

    Is traditional grammar dead? Donald Hardy, a professor of English at Northern Illinois University, doesn't think so. He recently posted this "e-grammar" on the Web to help teach users how to distinguish their nouns from their verbs, their nominative cases from their subjunctives, and their present perfect from their past. The descriptions are clear and concise, while quizzes at the end of each chapter as well as five practice exams allow readers to test their retention and keep track electronically of their score. (We were not convinced, however, that the typical exemplifications of the rules that are the core of each chapter truly constitute an "interactive" aspect of the text as the introduction claims.)

  8. HIV thrives in ancient traditions.

    Science.gov (United States)

    Shreedhar, J

    1995-01-01

    Participation in ancient traditions is facilitating the current spread of HIV through India. For most of the year, Koovagam is a typical Indian village. Each April on the night of the full moon, however, the Chittirai-Pournami festival is held in Koovagam, a celebration in homage to Aravan during which up to 2000 pilgrims from across the country engage in thousands of acts of unprotected sexual intercourse. Aravan is a man depicted in a Hindu tale who asked to experience sexual bliss before being sacrificed to the gods. To fulfill this last wish, the god Krishna is said to have assumed the form of a beautiful woman and had sexual intercourse with Aravan. Many of the festival participants are hijras, eunuchs and transsexuals who sell sex for a living. Hijras may be accompanied by men who serve as their sex partners and bodyguards. Surveys suggest that one-third of the 10,000 hijras in New Delhi may be infected with HIV. Other participants are known as dangas, men who are either married or single and appear to lead strictly heterosexual lives throughout the year except during the Chittirai-Pournami festival when they dress as women and sell sex to other men attending the festival. The panthis comprise another group of participants and tend to be either single or married men who attend the festival to have sex with the hijras and dangas for fees up to ten rupees, approximately US$0.50, per sexual encounter. Prostitution within the devadasi sect and the sale of young, virgin girls in the state of Andhra Pradesh to the highest male bidders are other examples of how ancient traditions are facilitating the current spread of HIV in India. PMID:12319989

  9. The magnetic centrifugal mass filter

    Science.gov (United States)

    Fetterman, Abraham J.; Fisch, Nathaniel J.

    2011-09-01

    Mass filters using rotating plasmas have been considered for separating nuclear waste and spent nuclear fuel. We propose a new mass filter that utilizes centrifugal and magnetic confinement of ions in a way similar to the asymmetric centrifugal trap. This magnetic centrifugal mass filter is shown to be more proliferation resistant than present technology. This filter is collisional and produces well confined output streams, among other advantages.

  10. Text Classification for Web Filtering

    OpenAIRE

    Sebastiani, Fabrizio

    2004-01-01

This talk deals with the application of text classification techniques to Web filtering, with particular emphasis on filtering unsuitable content such as pornographic or violent content. The talk introduces text classification, its tools, techniques, and applications, and then discusses how filtering unsuitable Web content can be seen as the intersection of three different applications: Web page classification, filtering, and detection of unsuitable content. The characteristics of these three ap...

  11. Application of a west Eurasian-specific filter for quasi-median network analysis: Sharpening the blade for mtDNA error detection

    OpenAIRE

Zimmermann, Bettina; Röck, Alexander; Huber, Gabriela; Krämer, Tanja; Schneider, Peter M.; Parson, Walther

    2011-01-01

    The application of quasi-median networks provides an effective tool to check the quality of mtDNA data. Filtering of highly recurrent mutations prior to network analysis is required to simplify the data set and reduce the complexity of the network. The phylogenetic background determines those mutations that need to be filtered. While the traditional EMPOPspeedy filter was based on the worldwide mtDNA phylogeny, haplogroup-specific filters can more effectively highlight potential errors in dat...

  12. Quick-change filter cartridge

    Science.gov (United States)

    Rodgers, John C. (Santa Fe, NM); McFarland, Andrew R. (College Station, TX); Ortiz, Carlos A. (Bryan, TX)

    1995-01-01

    A quick-change filter cartridge. In sampling systems for measurement of airborne materials, a filter element is introduced into the sampled airstream such that the aerosol constituents are removed and deposited on the filter. Fragile sampling media often require support in order to prevent rupture during sampling, and careful mounting and sealing to prevent misalignment, tearing, or creasing which would allow the sampled air to bypass the filter. Additionally, handling of filter elements may introduce cross-contamination or exposure of operators to toxic materials. Moreover, it is desirable to enable the preloading of filter media into quick-change cartridges in clean laboratory environments, thereby simplifying and expediting the filter-changing process in the field. The quick-change filter cartridge of the present invention permits the application of a variety of filter media in many types of instruments and may also be used in automated systems. The cartridge includes a base through which a vacuum can be applied to draw air through the filter medium which is located on a porous filter support and held there by means of a cap which forms an airtight seal with the base. The base is also adapted for receiving absorbing media so that both particulates and gas-phase samples may be trapped for investigation, the latter downstream of the aerosol filter.

  13. Assessment of ceramic membrane filters

    Energy Technology Data Exchange (ETDEWEB)

    Ahluwalia, R.K.; Geyer, H.K.; Im, K.H. [and others

    1995-08-01

    The objectives of this project include the development of analytical models for evaluating the fluid mechanics of membrane coated, dead-end ceramic filters, and to determine the effects of thermal and thermo-chemical aging on the material properties of emerging ceramic hot gas filters. A honeycomb cordierite monolith with a thin ceramic coating and a rigid candle filter were evaluated.

  14. Infusing Qualitative Traditions in Counseling Research Designs

    Science.gov (United States)

    Hays, Danica G.; Wood, Chris

    2011-01-01

    Research traditions serve as a blueprint or guide for a variety of design decisions throughout qualitative inquiry. This article presents 6 qualitative research traditions: grounded theory, phenomenology, consensual qualitative research, ethnography, narratology, and participatory action research. For each tradition, the authors describe its…

  15. Inorganic UV filters

    Scientific Electronic Library Online (English)

    Eloísa Berbel, Manaia; Renata Cristina Kiatkoski, Kaminski; Marcos Antonio, Corrêa; Leila Aparecida, Chiavacci.

    2013-06-01

Full Text Available SciELO Brazil | Language: English. Nowadays, concern over skin cancer has been growing more and more, especially in tropical countries where the incidence of UVA/B radiation is higher. The correct use of sunscreen is the most efficient way to prevent the development of this disease. The ingredients of sunscreen can be organic and/or inorganic sun filters. Inorganic filters present some advantages over organic filters, such as photostability, non-irritability and broad-spectrum protection. Nevertheless, inorganic filters have a whitening effect in sunscreen formulations owing to their high refractive index, decreasing their esthetic appeal.
Many techniques have been developed to overcome this problem and among them, the use of nanotechnology stands out. The estimated amount of nanomaterial in use must increase from 2000 tons in 2004 to a projected 58000 tons in 2020. In this context, this article aims to analyze critically both the different features of the production of inorganic filters (synthesis routes proposed in recent years) and the permeability, the safety and other characteristics of the new generation of inorganic filters.

  16. Controlling flow conditions of test filters in iodine filters

    International Nuclear Information System (INIS)

Several different iodine filter and test filter designs, and experience gained from their operation, are presented. For the flow experiments, an iodine filter system equipped with flow regulating and measuring devices was built. The experiments studied the influence of the packing method of the iodine sorption material, and of the flow regulating and measuring devices, on the flow conditions in the test filters. On the basis of the experiments it has been shown that the flows through the test filters can always be adjusted to the correct value provided a sufficiently high pressure difference is available across the test filter ducting. As a result of the research, several different methods are presented with which the flows through the test filters, in both operating and future iodine sorption systems, can easily be measured and adjusted to their correct values. (author)

  17. Maneuvering target tracking using fuzzy logic-based recursive least squares filter

    Science.gov (United States)

    Fan, En; Xie, Wei-xin; Liu, Zong-xiang

    2014-12-01

In this paper, a fuzzy logic-based recursive least squares filter (FLRLSF) is presented for maneuvering target tracking (MTT) in situations where observations have unknown random characteristics. In the proposed filter, fuzzy logic is applied to the standard recursive least squares filter (RLSF) through a set of fuzzy if-then rules. Given the observation residual and the heading change in the current prediction, these rules determine the magnitude of the fading factor of the RLSF. The proposed filter has the advantage that the restrictive assumptions of statistical models for process noise, measurement noise, and motion models are relaxed. Moreover, it does not need a maneuver detector when tracking a maneuvering target. The performance of FLRLSF is evaluated using simulation and a real test experiment, and it is found to be better than those of the traditional RLSF, the fuzzy adaptive α-β filter (FAα-βF), and the hybrid Kalman filter in tracking accuracy.
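The baseline the paper builds on is recursive least squares with a fading (forgetting) factor that discounts old data. A minimal sketch with a fixed factor lam (the paper's contribution is precisely to tune this factor with fuzzy rules; that part is not reproduced here, and the names and data are illustrative):

```python
import numpy as np

def rls_step(theta, P, x, y, lam=0.98):
    """One recursive least squares update.
    lam < 1 is the fading factor: smaller values forget old data faster."""
    x = x.reshape(-1, 1)
    Px = P @ x
    k = Px / (lam + float(x.T @ Px))   # gain vector
    err = y - float(x.T @ theta)       # prediction residual
    theta = theta + k * err            # parameter update
    P = (P - k @ Px.T) / lam           # covariance update, inflated by 1/lam
    return theta, P

# fit y = 3*x from streaming noiseless samples
theta = np.zeros((1, 1))
P = np.eye(1) * 1e3                    # large initial uncertainty
for xv in range(1, 21):
    theta, P = rls_step(theta, P, np.array([float(xv)]), 3.0 * xv)
print(theta.item())                    # converges to ~3.0
```

Dividing P by lam each step keeps the gain from collapsing to zero, which is what lets the filter keep adapting when the target maneuvers.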

  18. Spatial Filter with Volume Gratings for High-peak-power Multistage Laser Amplifiers

    CERN Document Server

    Tan, Yi-zhou; Zheng, Guang-wei; Shen, Ben-jian; Pan, Heng-yue; Li, Liu

    2012-01-01

The regular spatial filters comprising a lens and pinhole are essential components in high-power laser systems, such as lasers for inertial confinement fusion, nonlinear optical technology and directed-energy weapons. On the other hand, the pinhole is treated as a bottleneck of high-power lasers because of the harmful plasma created by the focusing beam. In this paper we present a spatial filter based on the angular selectivity of a Bragg diffraction grating that avoids the harmful focusing effect of the traditional pinhole filter. A spatial filter consisting of volume phase gratings in a two-pass amplifier cavity is reported. A two-dimensional filter is proposed using a single Pi-phase-shifted Bragg grating; numerical simulation results show that its angular spectrum bandwidth can be less than 160 µrad. The angular selectivity of photo-thermo-refractive glass and RUGATE film filters, construction stability, thermal stability and the effects of misalignments of gratings on the diffraction efficiencies under high-pulse-energy laser...

  19. Kazakh Traditional Dance Gesture Recognition

    Science.gov (United States)

    Nussipbekov, A. K.; Amirgaliyev, E. N.; Hahn, Minsoo

    2014-04-01

Full body gesture recognition is an important and interdisciplinary research field which is widely used in many application areas, including dance gesture recognition. The rapid growth of technology in recent years has contributed much to this domain; however, it is still a challenging task. In this paper we implement Kazakh traditional dance gesture recognition. We use a Microsoft Kinect camera to obtain human skeleton and depth information. We then apply a tree-structured Bayesian network and the Expectation Maximization algorithm with K-means clustering to calculate conditional linear Gaussians for classifying poses, and finally use a Hidden Markov Model to detect dance gestures. Our main contribution is that we extend the Kinect skeleton by adding headwear as a new skeleton joint calculated from the depth image. This novelty allows us to significantly improve the accuracy of head gesture recognition of a dancer, which in turn plays a considerable role in whole-body gesture recognition. Experimental results show the efficiency of the proposed method and that its performance is comparable to state-of-the-art system performances.

  20. Drilling fluid filter

    Science.gov (United States)

    Hall, David R.; Fox, Joe; Garner, Kory

    2007-01-23

    A drilling fluid filter for placement within a bore wall of a tubular drill string component comprises a perforated receptacle with an open end and a closed end. A hanger for engagement with the bore wall is mounted at the open end of the perforated receptacle. A mandrel is adjacent and attached to the open end of the perforated receptacle. A linkage connects the mandrel to the hanger. The linkage may be selected from the group consisting of struts, articulated struts and cams. The mandrel operates on the hanger through the linkage to engage and disengage the drilling fluid filter from the tubular drill string component. The mandrel may have a stationary portion comprising a first attachment to the open end of the perforated receptacle and a telescoping adjustable portion comprising a second attachment to the linkage. The mandrel may also comprise a top-hole interface for top-hole equipment.

  1. The Geometry of Filtering

    CERN Document Server

    Elworthy, K D; Li, Xue-Mei

    2008-01-01

    Geometry arising from two diffusion operators (smooth semi-elliptic, second order differential operators) on different spaces but intertwined by a smooth map is described. Particular cases arise from Riemannian submersions when the operators are Laplace-Beltrami operators, from equivariant operators on the total space of a principal bundle, and for the operators on the diffeomorphism group arising from stochastic flows. Classical non-linear filtering problems also lead to such configurations. A basic tool is the, possibly, non-linear "semi-connection" induced by this set up, leading to a canonical decomposition of the operator on the domain space. Topics discussed include: generalised Weitzenböck curvatures arising in the equivariant case, skew-product decompositions of diffusion processes, conditioned processes, classical filtering, decomposition of stochastic flows, and connections determined by stochastic differential equations.

  2. Simon nitinol filter

    International Nuclear Information System (INIS)

    The incidence of inferior vena cava (IVC) thrombosis associated with the Simon Nitinol filter was reviewed at two institutions in 20 patients with lower- extremity deep venous thrombosis within 12 months. Ten patients had underlying malignancy. Clinical findings were correlated with findings of color-flow US, spin-echo (SE) MR imaging, and gradient-echo (GE) MR imaging of the IVC. Four of the 20 patients developed symptoms of IVC thrombosis (bilateral leg edema) within 5 weeks. MR imaging in all four documented IVC thrombosis (absence of intracaval flow void on SE images, absence of increased signal on GE images), and each had underlying malignancy (three pelvic tumors, one renal cell carcinoma). The authors conclude that patients with tumors, especially genitourinary primary tumors, may be at increased risk of IVC thrombosis in the presence of this filter, possibly because of pelvic venous compression or reduced renal vein inflow with prior nephrectomy; MR imaging is very useful for following IVC patency

  3. Filtered cathodic arc source

    Science.gov (United States)

    Falabella, Steven (Livermore, CA); Sanders, David M. (Livermore, CA)

    1994-01-01

    A continuous, cathodic arc ion source coupled to a macro-particle filter capable of separation or elimination of macro-particles from the ion flux produced by cathodic arc discharge. The ion source employs an axial magnetic field on a cathode (target) having tapered sides to confine the arc, thereby providing high target material utilization. A bent magnetic field is used to guide the metal ions from the target to the part to be coated. The macro-particle filter consists of two straight solenoids, end to end, but placed at 45° to one another, which prevents line-of-sight from the arc spot on the target to the parts to be coated, yet provides a path for ions and electrons to flow, and includes a series of baffles for trapping the macro-particles.

  4. Condensate filtering device

    International Nuclear Information System (INIS)

    In a condensate filtering device of a nuclear power plant, a water collecting pipe is disposed over the entire length, and an end of a hollow thread is in communication with the water collecting pipe and secured. If the length of the water collecting pipe is extended, a filtering device of any desired length can be obtained irrespective of the length of the hollow thread. Therefore, since there is no need to connect units upon constituting a module, the flow of cleaning gases is not restricted at connection portions. Accordingly, even if the volume of the device is increased by the extension of the module, the working life of the module is not degraded. (T.M.)

  5. Bloomier Filters: A second look

    CERN Document Server

    Charles, Denis

    2008-01-01

    A Bloom filter is a space efficient structure for storing static sets, where the space efficiency is gained at the expense of a small probability of false-positives. A Bloomier filter generalizes a Bloom filter to compactly store a function with a static support. In this article we give a simple construction of a Bloomier filter. The construction is linear in space and requires constant time to evaluate. The creation of our Bloomier filter takes linear time which is faster than the existing construction. We show how one can improve the space utilization further at the cost of increasing the time for creating the data structure.
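    For context, the underlying Bloom filter is only a few lines; the double-hashing scheme below (deriving k indexes from one SHA-256 digest) is a common illustrative choice, not the construction analyzed in the paper:

```python
import hashlib

class BloomFilter:
    """Minimal Bloom filter: k bit positions per item in an m-bit array."""
    def __init__(self, m=1024, k=3):
        self.m, self.k = m, k
        self.bits = bytearray(m)          # one byte per bit, for simplicity

    def _indexes(self, item):
        # Double hashing: k indexes from two 64-bit halves of one digest.
        d = hashlib.sha256(item.encode()).digest()
        h1 = int.from_bytes(d[:8], "big")
        h2 = int.from_bytes(d[8:16], "big")
        return [(h1 + i * h2) % self.m for i in range(self.k)]

    def add(self, item):
        for i in self._indexes(item):
            self.bits[i] = 1

    def __contains__(self, item):
        # May return false positives, never false negatives.
        return all(self.bits[i] for i in self._indexes(item))
```

Membership tests cost k hash evaluations regardless of how many items were added, which is the space/time trade the Bloomier generalization inherits.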

  6. Sharpening minimum-phase filters

    International Nuclear Information System (INIS)

    The minimum-phase requirement demands that the filter have all its zeros on or inside the unit circle. As a result, the filter does not have linear phase. It is well known that the sharpening technique can be used to simultaneously improve both the pass-band and the stop-band of linear-phase FIR filters, and that it cannot be used for other types of filters. In this paper we demonstrate that, after a small modification, the sharpening technique can also be applied to minimum-phase filters. The method is illustrated with a practical design example.
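    For the linear-phase case that the paper builds on, sharpening replaces the amplitude response A with 3A² − 2A³ by cascading delayed copies of the filter (the classic Kaiser-Hamming construction). A sketch of that baseline, before the minimum-phase modification:

```python
import numpy as np

def sharpen(h):
    """Kaiser-Hamming sharpening of an odd-length, linear-phase FIR filter:
    realizes the amplitude mapping A -> 3A^2 - 2A^3 as H^2(z) * (3z^-d - 2H(z))."""
    h = np.asarray(h, dtype=float)
    d = (len(h) - 1) // 2             # group delay of h
    h2 = np.convolve(h, h)            # H^2(z), delay 2d
    g = -2.0 * h                      # 3z^-d - 2H(z), aligned to delay d
    g[d] += 3.0
    return np.convolve(h2, g)         # overall delay 3d
```

At a passband point with A = 0.9 the sharpened amplitude is 3(0.81) − 2(0.729) = 0.972, closer to 1, which is the simultaneous passband/stopband improvement the abstract refers to.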

  7. Wiener-like correlation filters

    Science.gov (United States)

    Khoury, Jehad; Gianino, Peter D.; Woods, Charles L.

    2000-01-01

    We introduce a new, to our knowledge, design for a Wiener-like correlation filter, which consists of cascading a phase-only filter (POF) with a photorefractive Wiener-like filter. Its performance is compared with that of the POF and the Wiener correlation filter (WCF). Correlation results show that for intermediate and higher levels of noise this correlation filter has a peak-to-noise ratio that is larger than that of either the POF or the WCF while still preserving a correlation peak that is almost as high as that of the POF.
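    The phase-only-filter half of this cascade is simple to sketch with FFTs: keep the phase of the reference spectrum and discard its magnitude (the photorefractive Wiener-like stage is omitted here):

```python
import numpy as np

def phase_only_correlate(scene, ref):
    """Correlate a scene against a phase-only filter built from a reference."""
    S = np.fft.fft2(scene)
    R = np.fft.fft2(ref)
    pof = np.conj(R) / np.maximum(np.abs(R), 1e-12)   # unit-magnitude filter
    return np.real(np.fft.ifft2(S * pof))

# A shifted copy of the reference produces a sharp peak at the shift.
rng = np.random.default_rng(0)
ref = rng.random((32, 32))
scene = np.roll(ref, (5, 7), axis=(0, 1))
corr = phase_only_correlate(scene, ref)
peak = np.unravel_index(np.argmax(corr), corr.shape)  # (5, 7)
```

Whitening the reference spectrum is what gives the POF its sharp, narrow correlation peak compared with a matched filter.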

  8. Multimodal microwave filters

    OpenAIRE

    Contreras Lizarraga, Adria?n Arturo

    2013-01-01

    This thesis presents the conception, design and implementation of new topologies of multimodal microwave resonators and filters, using a combination of uniplanar technologies such as coplanar waveguide (CPW), coplanar strips (CPS) and slotlines. The term "multimodal" refers to uniplanar circuits in which the two fundamental modes of the CPW propagate (the even and the odd mode). By using both modes of the CPW, it is possible to achieve added functions, such as additional transmission zeros to...

  9. Wetland Filter Model

    Science.gov (United States)

    Twin Cities Public Television, Inc.

    2007-01-01

    In this quick activity (located on page 2 of the PDF), learners will model how wetlands act as natural filters for the environment. Learners prepare a mixture of water, soil, gravel, and leaves and then pour it down a piece of artificial grass, observing how much gets trapped in the fake grass and comparing water at the bottom with the initial “polluted” sample. Relates to the linked video, DragonflyTV GPS: Wetlands.

  10. Subjective Collaborative Filtering

    OpenAIRE

    Caruso, Fabrizio; Giuffrida, Giovanni; Zarba, Calogero

    2011-01-01

    We present an item-based approach for collaborative filtering. We determine a list of recommended items for a user by considering their previous purchases. Additionally, other features of the users could be considered, such as page views, search queries, etc. In particular we address the problem of efficiently comparing items. Our algorithm can efficiently approximate an estimate of the similarity between two items. As measure of similarity we use an approximation of the Jac...
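    One standard way to cheaply approximate set similarity between items, when each item is represented by the set of users who bought it, is MinHash; the sketch below is illustrative and not necessarily the authors' algorithm:

```python
import hashlib

def minhash_signature(items, num_hashes=64):
    """MinHash signature: for each of num_hashes seeded hash functions,
    keep the minimum hash value over the set."""
    return [min(int(hashlib.md5(f"{seed}:{x}".encode()).hexdigest(), 16)
                for x in items)
            for seed in range(num_hashes)]

def estimated_similarity(sig_a, sig_b):
    """Fraction of matching signature slots estimates |A ∩ B| / |A ∪ B|."""
    return sum(a == b for a, b in zip(sig_a, sig_b)) / len(sig_a)
```

Two items can then be compared in O(num_hashes) time, independent of how many users bought them.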

  11. Active Collaborative Filtering

    OpenAIRE

    Boutilier, Craig; Zemel, Richard S.; Marlin, Benjamin

    2012-01-01

    Collaborative filtering (CF) allows the preferences of multiple users to be pooled to make recommendations regarding unseen products. We consider in this paper the problem of online and interactive CF: given the current ratings associated with a user, what queries (new ratings) would most improve the quality of the recommendations made? We cast this in terms of expected value of information (EVOI); but the online computational cost of computing optimal queries is prohibitive. W...

  12. Trust based collaborative filtering

    OpenAIRE

    Lathia, N.; Hailes, S.; Capra, L.

    2008-01-01

    k-nearest neighbour (kNN) collaborative filtering (CF), the widely successful algorithm supporting recommender systems, attempts to relieve the problem of information overload by generating predicted ratings for items users have not expressed their opinions about; to do so, each predicted rating is computed based on ratings given by like-minded individuals. Like-mindedness, or similarity-based recommendation, is the cause of a variety of problems that plague recommender syst...
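    The kNN scheme being critiqued can be summarized in a few lines: find the most similar raters and take a similarity-weighted average of their ratings. A toy sketch (hypothetical ratings, plain cosine similarity over co-rated items):

```python
import math

def cosine_sim(a, b):
    """Cosine similarity between two users' rating dicts, over co-rated items."""
    common = set(a) & set(b)
    if not common:
        return 0.0
    num = sum(a[i] * b[i] for i in common)
    den = (math.sqrt(sum(a[i] ** 2 for i in common))
           * math.sqrt(sum(b[i] ** 2 for i in common)))
    return num / den

def predict(user, others, item, k=2):
    """Similarity-weighted average of the top-k neighbours' ratings for item."""
    neighbours = sorted(((cosine_sim(user, o), o[item])
                         for o in others if item in o), reverse=True)[:k]
    wsum = sum(abs(s) for s, _ in neighbours)
    return sum(s * r for s, r in neighbours) / wsum if wsum else None
```

Every prediction leans entirely on like-minded neighbours, which is exactly the dependence the trust-based variant sets out to temper.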

  13. Regenerable particulate filter

    Science.gov (United States)

    Stuecker, John N. (Albuquerque, NM); Cesarano, III, Joseph (Albuquerque, NM); Miller, James E. (Albuquerque, NM)

    2009-05-05

    A method of making a three-dimensional lattice structure, such as a filter used to remove particulates from a gas stream, where the physical lattice structure is designed utilizing software simulation from pre-defined mass transfer and flow characteristics and the designed lattice structure is fabricated using a free-form fabrication manufacturing technique, where the periodic lattice structure is comprised of individual geometric elements.

  14. Applied Filtered Density Function

    OpenAIRE

    Levent Yilmaz, S.; Ansari, N.; Pisciuneri, P. H.; Nik, M. B.; Otis, C. C.; Givi, P.

    2013-01-01

    An overview is presented of recent advances in the filtered density function (FDF) modeling and simulation of turbulent combustion. The review is focused on the developments that have facilitated the FDF to be broadly applied in large eddy simulation (LES) of practical flows. These are primarily the development of a new Lagrangian Monte Carlo solver for the FDF, and the implementation of this solver on Eulerian domains portrayed by unstructured grids. With these developmen...

  15. Modified repeated median filters

    OpenAIRE

    Bernholt, Thorsten; Fried, Roland; Gather, Ursula; Wegner, Ingo

    2004-01-01

    We discuss moving window techniques for fast extraction of a signal comprising monotonic trends and abrupt shifts from a noisy time series with irrelevant spikes. Running medians remove spikes and preserve shifts, but they deteriorate in trend periods. Modified trimmed mean filters use a robust scale estimate such as the median absolute deviation about the median (MAD) to select an adaptive amount of trimming. Application of robust regression, particularly of the repeated median, has been sug...
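    The baseline these modifications start from, the running median, is easy to state precisely; a plain-Python sketch:

```python
def running_median(x, w=3):
    """Moving-window median (w odd). Removes isolated spikes and preserves
    level shifts, but flattens the signal in trend periods."""
    h = w // 2
    out = []
    for i in range(len(x)):
        window = sorted(x[max(0, i - h):i + h + 1])  # truncated at the edges
        out.append(window[len(window) // 2])
    return out
```

An isolated spike is removed while a genuine shift passes through unchanged, which is the spike/shift behaviour described above; the trimmed-mean and repeated-median variants address the trend-period deterioration.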

  16. Software filter strategies

    International Nuclear Information System (INIS)

    At an SSC luminosity of 10^33, the hardware triggers should lower the event rate to a few kHz. A farm of microprocessors can then be used to further reduce the rate to the 1 Hz level. This paper uses the proposed D0 filter strategy as an example of what an SSC software filter could look like. A sketch of the current D0 software filter design is given, with a detailed example of the muon trigger. The hardware trigger will have imposed cuts on various trigger 'components' (e.g. muon ρ_t, total E_T, number of jets). The first step is to recalculate each of the components which passed the hardware level, but with an improved resolution due to utilizing more information (calibration constants, corrections, muon drift tube times). The various trigger components can then be correlated. Next, specified regions in other detector subsystems can be inspected. For example, an electron trigger should have a singly ionizing track approximately pointing to it. The processing time to do a complete central tracking analysis is prohibitive, but tracking only around the 'electron' can be done. Finally, all components can be used to classify and reject events. Those events which pass this point can then have a complete analysis performed on them.

  17. Carbon nanotube filters

    Science.gov (United States)

    Srivastava, A.; Srivastava, O. N.; Talapatra, S.; Vajtai, R.; Ajayan, P. M.

    2004-09-01

    Over the past decade of nanotube research, a variety of organized nanotube architectures have been fabricated using chemical vapour deposition. The idea of using nanotube structures in separation technology has been proposed, but building macroscopic structures that have controlled geometric shapes, density and dimensions for specific applications still remains a challenge. Here we report the fabrication of freestanding monolithic uniform macroscopic hollow cylinders having radially aligned carbon nanotube walls, with diameters and lengths up to several centimetres. These cylindrical membranes are used as filters to demonstrate their utility in two important settings: the elimination of multiple components of heavy hydrocarbons from petroleum-a crucial step in post-distillation of crude oil-with a single-step filtering process, and the filtration of bacterial contaminants such as Escherichia coli or the nanometre-sized poliovirus (~25 nm) from water. These macro filters can be cleaned for repeated filtration through ultrasonication and autoclaving. The exceptional thermal and mechanical stability of nanotubes, and the high surface area, ease and cost-effective fabrication of the nanotube membranes may allow them to compete with ceramic- and polymer-based separation membranes used commercially.

  18. Optimization of HEPA filter design

    International Nuclear Information System (INIS)

    High efficiency filters are often associated with high pressure drop, particularly at high airflow velocity. With design optimization, a cost-effective clean room design can be achieved by lowering energy cost or reducing the size of the filter housing. Through mathematical analysis, optimum filter designs are obtained for the separator-type HEPA filter. In the analysis, a similarity solution obtained from the Navier-Stokes equations for airflow between the filter pleat spacing, with uniform mass addition and extraction, is applied to each finite element along the pleat channel. The optimum pleat aspect ratio is obtained by combining the expressions for the axial pressure gradient of the upstream channel with mass extraction, the axial pressure gradient of the downstream channel with mass addition, and the filter media flow characteristics, for finite elements along the filter pleat channel with varying wall airflow rate.

  19. A New Approach for Cluster Based Collaborative Filters

    Directory of Open Access Journals (Sweden)

    R. Venu Babu,

    2010-11-01

    Full Text Available In modern E-Commerce it is not easy for customers to find the goods best suited to their interests, as more and more information is placed online (like movies, audios, books, documents, etc.). So in order to provide the most suitable information of high value to customers of an e-commerce business system, a customized recommender system is required. Collaborative Filtering has become a popular technique for reducing this information overload. While traditional collaborative filtering systems have been a substantial success, there are several problems that researchers and commercial applications have identified: the early rater problem, the sparsity problem, and the scalability problem. This paper contains a literature survey on cluster-based collaborative filters and an approach to construct one.

  20. DCT based interpolation filter for motion compensation in HEVC

    Science.gov (United States)

    Alshin, Alexander; Alshina, Elena; Park, Jeong Hoon; Han, Woo-Jin

    2012-10-01

    The High Efficiency Video Coding (HEVC) draft standard has a challenging goal: to double coding efficiency compared with H.264/AVC. Many aspects of the traditional hybrid coding framework were improved during the new standard's development. Motion compensated prediction, in particular the interpolation filter, is one area that was improved significantly over H.264/AVC. This paper presents the details of the interpolation filter design of the draft HEVC standard. The coding efficiency improvement over the H.264/AVC interpolation filter is studied, and experimental results are presented which show a 4.0% average bitrate reduction for the luma component and an 11.3% average bitrate reduction for the chroma components. The coding efficiency gains are significant for some video sequences and can reach up to 21.7%.

  1. Kalman Filter Based Tracking in a Video Surveillance System

    Directory of Open Access Journals (Sweden)

    SULIMAN, C.

    2010-05-01

    Full Text Available In this paper we have developed a Matlab/Simulink based model for monitoring a contact in a video surveillance sequence. For the segmentation process and correct identification of a contact in a surveillance video, we have used the Horn-Schunck optical flow algorithm. The position and the behavior of the correctly detected contact were monitored with the help of the traditional Kalman filter. We then compared the results obtained from the optical flow method with the ones obtained from the Kalman filter, and we show the correct functionality of the Kalman filter based tracking. The tests were performed using video data taken with the help of a fixed camera. The tested algorithm has shown promising results.
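    The "traditional Kalman filter" used here for position tracking follows the usual predict/update cycle; a 1-D constant-velocity sketch (the noise parameters below are illustrative, not the paper's values):

```python
import numpy as np

def kalman_track(measurements, dt=1.0, q=1e-3, r=0.25):
    """Kalman filter for a constant-velocity model with position-only measurements."""
    F = np.array([[1.0, dt], [0.0, 1.0]])   # state transition: [position, velocity]
    H = np.array([[1.0, 0.0]])              # we observe position only
    Q = q * np.eye(2)                       # process noise covariance
    R = np.array([[r]])                     # measurement noise covariance
    x = np.array([[measurements[0]], [0.0]])
    P = np.eye(2)
    estimates = []
    for z in measurements:
        x = F @ x                           # predict
        P = F @ P @ F.T + Q
        K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)   # Kalman gain
        x = x + K @ (np.array([[z]]) - H @ x)          # update with innovation
        P = (np.eye(2) - K @ H) @ P
        estimates.append(float(x[0, 0]))
    return estimates
```

On a clean linear track the filter quickly learns the velocity and follows the target closely; a 2-D tracker doubles the state with the second coordinate.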

  2. Graphene valley pseudospin filter using an extended line defect

    Science.gov (United States)

    Gunlycke, Daniel; White, Carter

    2011-03-01

    Although graphene exhibits excellent electron and thermal transport properties, it does not have an intrinsic band gap, required to use graphene as a replacement material for silicon and other semiconductors in conventional electronics. The band structure of graphene with its two cones near the Fermi level, however, offers opportunities to develop non-traditional applications. One such avenue is to exploit the valley degeneracy in graphene to develop valleytronics. A central component in valleytronics is the valley filter, just as the spin filter is central in spintronics. Herein, we present a two-dimensional valley filter based on scattering of electrons and holes off a recently observed extended line defect [Nat. Nanotech. 5, 326 (2010)] within graphene. The transmission probability depends strongly on the valley pseudospin and the angle of incidence of the incident quasiparticles. Quasiparticles arriving at the line defect at a high angle of incidence lead to a valley polarization of the transmitted beam that is near 100 percent.

  3. Efficient Feature Selection Methods in Chinese Spam Filtering

    Directory of Open Access Journals (Sweden)

    Xu Yan

    2013-01-01

    Full Text Available This study, from the perspective of Chinese spam filtering, focuses on efficient feature selection methods. It expounds the traditional feature selection algorithms, including Document Frequency (DF), Information Gain (IG), Mutual Information (MI), Chi-square (CHI), and Knowledge Gain (KG), which is proposed in my previous study. Testing these methods on a Chinese spam data set, the results show that in the Chinese spam corpus CHI and KG can efficiently extract valid features for spam classification.
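    Of the criteria listed, CHI is the most compact to illustrate: for a 2×2 term/class contingency table it is n·(n11·n00 − n10·n01)² divided by the product of the marginals. A direct sketch:

```python
def chi_square(n11, n10, n01, n00):
    """Chi-square term/class association score.
    n11: class docs containing the term;   n10: other docs containing it;
    n01: class docs without the term;      n00: other docs without it."""
    n = n11 + n10 + n01 + n00
    den = (n11 + n10) * (n01 + n00) * (n11 + n01) * (n10 + n00)
    return n * (n11 * n00 - n10 * n01) ** 2 / den if den else 0.0
```

A term occurring in every spam document and in no ham document scores maximally, while a term spread evenly across classes scores zero, which is why CHI ranks class-discriminative features highly.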

  4. Improving Recommendation Quality by Merging Collaborative Filtering and Social Relationships

    OpenAIRE

    Meo, Pasquale; Ferrara, Emilio; Fiumara, Giacomo; Provetti, Alessandro

    2011-01-01

    Matrix Factorization techniques have been successfully applied to raise the quality of suggestions generated by Collaborative Filtering Systems (CFSs). Traditional CFSs based on Matrix Factorization operate on the ratings provided by users and have been recently extended to incorporate demographic aspects such as age and gender. In this paper we propose to merge CFS based on Matrix Factorization and information regarding social friendships in order to provide users with more...

  5. Efficient Feature Selection Methods in Chinese Spam Filtering

    OpenAIRE

    Xu Yan

    2013-01-01

    This study, from the perspective of Chinese Spam Filtering, focuses on efficient feature selection methods. It expounds the traditional feature selection algorithms including Document Frequency (DF), Information Gain (IG), the Mutual Information (MI), Chi-square (CHI) and Knowledge Gain (KG) which is proposed in my previous study. Testing these methods on exposing Chinese spam data set, the results show that in Chinese spam corpus CHI and KG can efficiently extract valid features for spam cla...

  6. A cloud filtering method for microwave upper tropospheric humidity measurements

    Directory of Open Access Journals (Sweden)

    S. A. Buehler

    2007-05-01

    Full Text Available The paper presents a cloud filtering method for upper tropospheric humidity (UTH) measurements at 183.31±1.00 GHz. The method uses two criteria: the difference between the brightness temperatures at 183.31±7.00 and 183.31±1.00 GHz, and a threshold for the brightness temperature at 183.31±1.00 GHz. The robustness of this cloud filter is demonstrated by a mid-latitude winter case study.

    The paper then studies different biases on UTH climatologies. Clouds are associated with high humidity; therefore the dry bias introduced by cloud filtering is discussed and compared to the wet bias introduced by the clouds' radiative effect if no filtering is done. This is done by means of a case study, and by means of a stochastic cloud database with representative statistics for midlatitude conditions.

    The consistent result is that both the cloud wet bias (0.8% RH) and the cloud filtering dry bias (–2.4% RH) are modest for microwave data, where the numbers given are for the stochastic cloud dataset. This indicates that for microwave data cloud-filtered UTH and unfiltered UTH can be taken as error bounds for errors due to clouds. This is not possible for the more traditional infrared data, since the radiative effect of clouds is much stronger there.

    The focus of the paper is on midlatitude data, since atmospheric data to test the filter for that case were readily available. The filter is expected to be applicable also to subtropical and tropical data, but should be further validated with case studies similar to the one presented here for those cases.
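    The two filtering criteria reduce to a few comparisons per scene; a sketch with placeholder thresholds (the paper's calibrated values are not reproduced here):

```python
def cloud_free(tb_183pm1, tb_183pm7, delta_min=0.0, tb_min=240.0):
    """Two-criteria microwave cloud filter: the 183.31±7.00 minus 183.31±1.00 GHz
    brightness-temperature difference must exceed delta_min, and the ±1.00 GHz
    channel must be warmer than tb_min. Threshold values here are illustrative."""
    return (tb_183pm7 - tb_183pm1) > delta_min and tb_183pm1 > tb_min
```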

  7. Implementational Aspects of the Contourlet Filter Bank and Application in Image Coding

    Directory of Open Access Journals (Sweden)

    Truong T. Nguyen

    2009-02-01

    Full Text Available This paper analyzes the implementational aspects of the contourlet filter bank (or the pyramidal directional filter bank, PDFB) and considers its application in image coding. First, details of the binary tree-structured directional filter bank (DFB) are presented, including a modification to minimize the phase delay factor and the necessary steps for handling rectangular images. The PDFB is viewed as an overcomplete filter bank, and the directional filters are expressed in terms of polyphase components of the pyramidal filter bank and the conventional DFB. The aliasing effect of the conventional DFB and the Laplacian pyramid on the directional filters is then considered, and the conditions for reducing this effect are presented. The new filters obtained by redesigning the PDFBs to satisfy these requirements have much better frequency responses. A hybrid multiscale filter bank, consisting of the PDFB at higher scales and the traditional maximally decimated wavelet filter bank at lower scales, is constructed to provide a sparse image representation. A novel embedded image coding system based on this image decomposition and a morphological dilation algorithm is then presented. The coding algorithm efficiently clusters the significant coefficients using progressive morphological operations. Context models for arithmetic coding are designed to exploit the intraband dependency and the correlation existing among the neighboring directional subbands. Experimental results show that the proposed coding algorithm outperforms current state-of-the-art wavelet-based coders, such as JPEG2000, for images with directional features.

  8. RESEARCH ON SPATIAL FILTERS AND HOMOMORPHIC FILTERING METHODS

    Directory of Open Access Journals (Sweden)

    Abhinash Singla

    2012-12-01

    Full Text Available In image processing, denoising is one of the important tasks. Despite the significant research conducted on this topic, the development of efficient denoising methods is still a compelling challenge. In this paper, spatial filter methods are compared with homomorphic filtering methods. The spatial filter methods, such as the Median Filter and the Wiener Filter, are based on simple formulas proposed by different authors. Among the homomorphic filtering methods, NormalShrink and BayesShrink are used. The basic idea of the homomorphic methods is to denoise the image by applying a wavelet transform to the noisy image, thresholding the detail wavelet coefficients, and inverse transforming the set of thresholded coefficients to obtain the denoised image. Here the soft thresholding technique is applied.
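    The soft-thresholding step at the heart of the wavelet-shrinkage methods is a one-liner; the wavelet transform itself and the NormalShrink/BayesShrink threshold estimates are omitted from this sketch:

```python
import numpy as np

def soft_threshold(coeffs, t):
    """Soft thresholding: shrink coefficients toward zero by t;
    coefficients with magnitude below t are set exactly to zero."""
    return np.sign(coeffs) * np.maximum(np.abs(coeffs) - t, 0.0)
```

In the full pipeline this is applied to the detail subbands only, with t chosen per subband from a noise estimate (e.g. the BayesShrink rule), before the inverse transform.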

  9. Folktale Narration: A Retreating Tradition

    Directory of Open Access Journals (Sweden)

    Tandin Dorji

    2002-01-01

    Full Text Available To talk of folktales in the Bhutanese context is to discuss a literary genre popularly known as khaju or 'oral transmission'. It serves as an important tool of communication between one generation and another. Among others, the folktales comprise an indispensable portion of oral literature. In it is seen the manifestation of the popular imagination and creativity representing the Bhutanese patrimony which has been passed down from mouth to ear since time immemorial. The role that it plays in the transmission of moral values, philosophy, beliefs, humour, etiquette, and many other traits specific to Bhutanese society holds an inescapably eminent place. Despite this importance, the documentation of folktales in Bhutan is still in its infancy. Till the mid-twentieth century, education was imparted through the monasteries and not all the people had access to it. Furthermore, the scarcity of writing and printing facilities compounded the difficulty, and consequently the larger section of the population remained illiterate. Even after schools were opened and facilities provided free of cost, the documentation of folktales took quite some time to jump from the springboard. It was only in 1984 that Dasho Sherab Thaye published his first volume of the collection of folktales, followed by another two in 1986. This was the debut, and now we have authors like Kunzang Choden, Kinley Wangmo, Françoise Pommaret and a few others who have followed in the footsteps of Dasho Sherab Thaye. However, the collections made until today are just a drop considering the vast reservoir of folktales that lies recorded in the memories of the Bhutanese. It requires the efforts of many Bhutanese even to document a part of this inexhaustible patrimony. This, however, is not to indicate that Bhutanese folktales are different from the rest. In fact, '…folktales are the same all over, for they tell of people. Not ordinary people like those we meet on our journey through life, but the whole secret and exciting society of one-eyed sorcerers, evil giants, handsome princes and dancing fairies….' All the same, what is special about the Bhutanese folktales is that they still are a living tradition in many pockets of rural Bhutan. In the villages which are far flung from motor roads, the narration of folktales in the pastures and in the evenings is even today very much alive. However, the question is, how long will it continue to survive? Will the development process engulf this beautiful tradition? And what could be done to keep this heritage alive?

  10. From Microwave Filter to Digital Filter and Back Again

    OpenAIRE

    Dalby, Arne Brejning

    2010-01-01

    A new very simple state variable flow graph representation for interdigital transmission line bandpass filters is presented, which has led to two important results: 1) A new type of digital filter with properties, that surpass the properties of most other (all pole) digital filtertypes. 2) The study of the new digital filtertype has led to design formulas for interdigital transmission line filters that are very simple compared to the hitherto known formulas. The accuracy is the same or better.

  11. From Microwave Filter to Digital Filter and Back Again

    DEFF Research Database (Denmark)

    Dalby, Arne Brejning

    1989-01-01

    A new very simple state variable flow graph representation for interdigital transmission line bandpass filters is presented, which has led to two important results: 1) A new type of digital filter with properties, that surpass the properties of most other (all pole) digital filtertypes. 2) The study of the new digital filtertype has led to design formulas for interdigital transmission line filters that are very simple compared to the hitherto known formulas. The accuracy is the same or better.

  12. DSP Control of Line Hybrid Active Filter

    DEFF Research Database (Denmark)

    Dan, Stan George; Benjamin, Doniga Daniel

    2005-01-01

    Active Power Filters have been intensively explored in the past decade. Hybrid active filters inherit the efficiency of passive filters and the improved performance of active filters, and thus constitute a viable improved approach for harmonic compensation. In this paper a parallel hybrid filter is studied for current harmonic compensation. The hybrid filter is formed by a single tuned LC filter and a small-rated power active filter, which are directly connected in series without any matching transformer. Thus the required rating of the active filter is much smaller than that of a conventional standalone active filter. Simulation and experimental results obtained in the laboratory confirmed the validity and effectiveness of the control.

  13. Spatial filters for high average power lasers

    Energy Technology Data Exchange (ETDEWEB)

    Erlandson, Alvin C

    2012-11-27

    A spatial filter includes a first filter element and a second filter element overlapping with the first filter element. The first filter element includes a first pair of cylindrical lenses separated by a first distance. Each of the first pair of cylindrical lenses has a first focal length. The first filter element also includes a first slit filter positioned between the first pair of cylindrical lenses. The second filter element includes a second pair of cylindrical lenses separated by a second distance. Each of the second pair of cylindrical lenses has a second focal length. The second filter element also includes a second slit filter positioned between the second pair of cylindrical lenses.

  14. Time Weight Update Model Based on the Memory Principle in Collaborative Filtering

    Directory of Open Access Journals (Sweden)

    Dan Li

    2013-11-01

    Full Text Available Collaborative filtering is the most widely used technology in recommender systems. Existing collaborative filtering algorithms do not take the time factor into account. However, users’ interests always change with time, and traditional collaborative filtering cannot reflect these changes. In this paper, the change of users’ interests is treated as a memory process, and a time weight iteration model is designed based on the memory principle. For a given user, the proposed model introduces a time weight for each item, and updates the weight by computing the similarity with the items chosen in a recent period. In the recommendation process, the weight is applied to the prediction algorithm. Experimental results show that the modified algorithm can optimize the result of the recommendation to a certain extent, and performs better than traditional collaborative filtering.
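
    The time-weighting idea above can be sketched in a few lines. This is not the paper's exact iteration model: the exponential half-life decay, the variable names, and the toy data are all assumptions, and the neighbour-selection step is omitted.

```python
def time_weight(age_days, half_life=30.0):
    """Exponential forgetting: the weight halves every `half_life` days."""
    return 0.5 ** (age_days / half_life)

def predict(ratings, age, item, neighbours):
    """Time-weighted average of the user's ratings on neighbour items.

    ratings: {item: rating}; age: {item: days since the rating};
    neighbours: items assumed similar to `item` (similarity search omitted).
    """
    num = den = 0.0
    for j in neighbours:
        if j in ratings:
            w = time_weight(age[j])
            num += w * ratings[j]
            den += w
    return num / den if den else None

r = {"a": 5.0, "b": 1.0}   # "a" rated today, "b" rated 90 days ago
t = {"a": 0.0, "b": 90.0}
print(round(predict(r, t, "x", ["a", "b"]), 2))  # → 4.56: recent rating dominates
```

    The 90-day-old rating carries weight 0.5³ = 0.125, so the prediction sits close to the recent rating of 5.0 rather than the plain average of 3.0.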

  15. A double-filter method for nitrocellulose-filter binding: application to protein-nucleic acid interactions.

    OpenAIRE

    Wong, I.; Lohman, T. M.

    1993-01-01

    Nitrocellulose-filter binding is a powerful technique commonly used to study protein-nucleic acid interactions; however, its utility in quantitative studies is often compromised by its lack of precision. To improve precision and accuracy, we have introduced two modifications to the traditional technique: the use of a 96-well dot-blot apparatus and the addition of a DEAE membrane beneath the nitrocellulose membrane. Using the dot-blot apparatus, an entire triplicate set of data spanning 20-24 ...

  16. Software Analysis Unifying Particle Filtering and Marginalized Particle Filtering.

    Czech Academy of Sciences Publication Activity Database

    Šmídl, Václav

    Edinburgh : IET, 2010, s. 1-7. ISBN 978-0-9824438-1-1. [13th International Conference on Information Fusion. Edinburgh (GB), 26.07.2010-29.07.2010] R&D Projects: GA MŠk 1M0572; GA ČR GP102/08/P250 Institutional research plan: CEZ:AV0Z10750506 Keywords : marginalized particle filter * software analysis * Bayesian filtering Subject RIV: BC - Control Systems Theory http://library.utia.cas.cz/separaty/2010/AS/smidl-software analysis unifying particle filtering and marginalized particle filtering.pdf

  17. The Geometry of Filtering

    CERN Document Server

    Elworthy, K David; Li, Xue-Mei

    2010-01-01

    The geometry which is the topic of this book is that determined by a map of one space N onto another, M, mapping a diffusion process, or operator, on N to one on M. Filtering theory is the science of obtaining or estimating information about a system from partial and possibly flawed observations of it. The system itself may be random, and the flaws in the observations can be caused by additional noise. In this volume the randomness and noises will be of Gaussian white noise type so that the system can be modelled by a diffusion process; that is, it evolves continuously in time in a Markovian way.

  18. Charcoal filter testing

    Energy Technology Data Exchange (ETDEWEB)

    Lyons, J. [Nuclear Regulatory Commission, Washington, DC (United States)

    1997-08-01

    In this very brief, informal presentation, a representative of the US Nuclear Regulatory Commission outlines some problems with charcoal filter testing procedures and actions being taken to correct the problems. Two primary concerns are addressed: (1) the process to find the test method is confusing, and (2) the requirements of the reference test procedures result in condensation on the charcoal, causing the test to fail. To address these problems, emergency technical specifications were processed for three nuclear plants. A generic or an administrative letter is proposed as a more permanent solution. 1 fig.

  19. Charcoal filter testing

    International Nuclear Information System (INIS)

    In this very brief, informal presentation, a representative of the US Nuclear Regulatory Commission outlines some problems with charcoal filter testing procedures and actions being taken to correct the problems. Two primary concerns are addressed: (1) the process to find the test method is confusing, and (2) the requirements of the reference test procedures result in condensation on the charcoal, causing the test to fail. To address these problems, emergency technical specifications were processed for three nuclear plants. A generic or an administrative letter is proposed as a more permanent solution. 1 fig

  20. Aboriginal Oral Traditions of Australian Impact Craters

    CERN Document Server

    Hamacher, Duane W

    2013-01-01

    We explore Aboriginal oral traditions that relate to Australian meteorite craters. Using the literature, first-hand ethnographic records, and fieldtrip data, we identify oral traditions and artworks associated with four impact sites: Gosses Bluff, Henbury, Liverpool, and Wolfe Creek. Oral traditions describe impact origins for Gosses Bluff and Wolfe Creek craters and non-impact origins of Liverpool and Henbury craters, with Wolfe Creek stories having both impact and non-impact origins. Three impact sites that are believed to have formed during human habitation of Australia - Dalgaranga, Veevers, and Boxhole - do not have associated oral traditions that are reported in the literature.

  1. A taxonomy fuzzy filtering approach

    Directory of Open Access Journals (Sweden)

    Vrettos S.

    2003-01-01

    Full Text Available Our work proposes the use of topic taxonomies as part of a filtering language. Given a taxonomy, a classifier is trained for each one of its topics. The user is able to formulate logical rules combining the available topics, e.g. (Topic1 AND Topic2) OR Topic3, in order to filter related documents in a stream. Using the trained classifiers, every document in the stream is assigned a belief value of belonging to the topics of the filter. These belief values are then aggregated using logical operators to yield the belief in the filter. In our study, Support Vector Machines and Naïve Bayes classifiers were used to provide topic probabilities. Aggregation of topic probabilities based on fuzzy logic operators was found to improve filtering performance on the Reuters text corpus, as compared to the use of their Boolean counterparts. Finally, we deployed a filtering system on the web using a sample taxonomy of the Open Directory Project.
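
    The aggregation step described above can be illustrated with the min/max pair of fuzzy operators. This is only one common choice of t-norm/t-conorm, not necessarily the one the paper used, and the topic names and belief values here are invented.

```python
# Belief in the filter "(Topic1 AND Topic2) OR Topic3", computed from
# per-topic classifier probabilities with min/max fuzzy operators.
def f_and(a, b):
    return min(a, b)

def f_or(a, b):
    return max(a, b)

beliefs = {"Topic1": 0.9, "Topic2": 0.6, "Topic3": 0.2}
score = f_or(f_and(beliefs["Topic1"], beliefs["Topic2"]), beliefs["Topic3"])
print(score)  # 0.6: the document matches the filter with belief 0.6
```

    A Boolean counterpart would threshold each belief first and lose the graded information that the fuzzy aggregation preserves.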

  2. Dynamically Adaptive Count Bloom Filter for Handling Duplicates in Data Stream

    OpenAIRE

    Senthamilarasu.S; Hemalatha, M.

    2012-01-01

    Identifying and removing duplicates is one of the primary challenges for traditional duplicate elimination techniques in data stream applications. It is not feasible in many streaming scenarios to eliminate precisely the occurrence of duplicates in an unbounded data stream. However, existing variants of the Bloom filter cannot support dynamic adjustment of both the filter and the counters together. In this paper we focus on eliminating the duplicates by introducing a dynamic approach on both the size of the coun...
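
    A fixed-size counting Bloom filter illustrates the duplicate-detection core that the record's dynamic variant builds on. The dynamic resizing itself is not reproduced here, and the sizes and hashing scheme are arbitrary choices for the sketch.

```python
import hashlib

class CountingBloom:
    """Fixed-size counting Bloom filter for approximate duplicate detection."""

    def __init__(self, m=1024, k=3):
        self.m, self.k = m, k
        self.counts = [0] * m        # counters instead of single bits

    def _slots(self, item):
        # Derive k slot indices from k salted SHA-256 digests.
        for i in range(self.k):
            h = hashlib.sha256(f"{i}:{item}".encode()).hexdigest()
            yield int(h, 16) % self.m

    def seen_then_add(self, item):
        """True if item was (probably) seen before; records it either way."""
        present = all(self.counts[s] > 0 for s in self._slots(item))
        for s in self._slots(item):
            self.counts[s] += 1
        return present

cbf = CountingBloom()
print(cbf.seen_then_add("pkt-42"))  # False (first occurrence)
print(cbf.seen_then_add("pkt-42"))  # True  (duplicate)
```

    Counters, unlike plain bits, also allow deletions (decrementing), which is what makes the counting variant attractive for streams with expiring items.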

  3. Employing User Attribute and Item Attribute to Enhance the Collaborative Filtering Recommendation

    OpenAIRE

    SongJie Gong

    2009-01-01

    Recommender systems are web based systems that aim at predicting a customer's interest on available products and services by relying on previously rated products and dealing with the problem of information and product overload. Collaborative filtering is the most popular recommendation technique nowadays and it mainly employs the user item rating data set. Traditional collaborative filtering approaches compute a similarity value between the target user and each other user by computing the rel...

  4. Experimental results of a single-phase shunt active filter prototype with different switching techniques

    OpenAIRE

    Neves, Pedro; Pinto, J. G.; Pregitzer, Ricardo G.; Monteiro, Luís F. C.; Afonso, João L.; Sepúlveda, João

    2007-01-01

    This paper presents experimental results obtained with a developed single-phase shunt active power filter laboratory prototype operating with different switching techniques. This active filter can compensate harmonic currents and power factor in single-phase electric installations. Its power circuit is based on a two-leg IGBT inverter, with a single capacitor in the dc side, and an inductor in the ac side. Its control system is based on a simple stratagem that enables the use of the tradition...

  5. FILT - Filtering Indexed Lucene Triples

    OpenAIRE

    Stuhr, Magnus

    2012-01-01

    The Resource Description Framework (RDF) is the W3C recommended standard for data on the semantic web, while the SPARQL Protocol and RDF Query Language (SPARQL) is the query language that retrieves RDF triples by subject, predicate, or object. RDF data often contain valuable information that can only be queried through filter functions. The SPARQL query language for RDF can include filter clauses in order to define specific data criteria, such as full-text searches, numerical filtering, and c...

  6. Note: Cryogenic coaxial microwave filters

    International Nuclear Information System (INIS)

    The careful filtering of microwave electromagnetic radiation is critical for controlling the electromagnetic environment for experiments in solid-state quantum information processing and quantum metrology at millikelvin temperatures. We describe the design and fabrication of a coaxial filter assembly and demonstrate that its performance is in excellent agreement with theoretical modelling. We further perform an indicative test of the operation of the filters by making current-voltage measurements of small, underdamped Josephson junctions at 15 mK

  7. Tunable Imaging Filters in Astronomy

    OpenAIRE

    Bland-hawthorn, J.

    2000-01-01

    While tunable filters are a recent development in night time astronomy, they have long been used in other physical sciences, e.g. solar physics, remote sensing and underwater communications. With their ability to tune precisely to a given wavelength using a bandpass optimized for the experiment, tunable filters are already producing some of the deepest narrowband images to date of astrophysical sources. Furthermore, some classes of tunable filters can be used in fast telesco...

  8. Robust filtering for TDR traces

    OpenAIRE

    Neveux, Philippe; Blanco, Eric

    2008-01-01

    Time-domain reflectometry (TDR) is a valuable technique for the measurement of dielectric properties of soils. The filtering of noisy TDR traces is treated in this paper. The problem of biased estimation occurs when the permittivity of the soil varies with the depth. In practical issues, only the apparent permittivity of the soil along the TDR line is available. Hence, robust optimal filtering has to be developed to provide a robust and reliable estimation. This filtering step is essential fo...

  9. HEPA filter performance comparative study

    International Nuclear Information System (INIS)

    Current products such as HEPA filters made without separators, with tapered separators and with mini separators have raised many questions for the Nuclear Ventilation System Design Engineer and/or the end user. The principal objective of this investigation is to report HEPA filter performance data and to compare the effectiveness of the various type HEPA filters for use in Nuclear Ventilation Systems with all tests run on the same equipment and under the same controlled conditions

  10. Game-theoretic Kalman Filter

    Science.gov (United States)

    Colburn, Christopher; Bewley, Thomas

    2010-11-01

    The Kalman Filter (KF) is celebrated as the optimal estimator for systems with linear dynamics and Gaussian uncertainty. Although most systems of interest do not have linear dynamics and are not forced by Gaussian noise, the KF is used ubiquitously within industry. Thus, we present a novel estimation algorithm, the Game-theoretic Kalman Filter (GKF), which intelligently hedges between competing sequential filters and does not require the assumption of Gaussian statistics to provide a "best" estimate.
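
    The standard KF that the GKF hedges over can be shown in its simplest scalar form. This is the textbook filter under a random-walk state model with assumed noise variances, not the GKF itself; the numbers are a toy example.

```python
def kalman_step(x, p, z, q=1e-4, r=0.25):
    """One predict/update cycle of a scalar Kalman filter.

    x, p: state estimate and its variance; z: new measurement;
    q, r: process and measurement noise variances (assumed known).
    """
    p = p + q                # predict: random-walk state model
    k = p / (p + r)          # Kalman gain
    x = x + k * (z - x)      # update with the innovation z - x
    p = (1 - k) * p          # posterior variance shrinks
    return x, p

x, p = 0.0, 1.0
for z in [1.1, 0.9, 1.05, 0.97]:   # noisy readings of a value near 1.0
    x, p = kalman_step(x, p, z)
print(round(x, 2), round(p, 3))
```

    The estimate converges toward the true value while the variance p falls, which is exactly the optimality the abstract refers to in the linear-Gaussian case.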

  11. The Marginalized Auxiliary Particle Filter

    OpenAIRE

    Fritsche, Carsten; Schön, Thomas; Klein, Anja

    2010-01-01

    In this paper we are concerned with nonlinear systems subject to a conditionally linear, Gaussian sub-structure. This structure is often exploited in high-dimensional state estimation problems using the marginalized (aka Rao-Blackwellized) particle filter. The main contribution in the present work is to show how an efficient filter can be derived by exploiting this structure within the auxiliary particle filter. Based on a multisensor aircraft tracking example, the superior performance of the...

  12. Derived category of filtered objects

    OpenAIRE

    Schapira, Pierre; Schneiders, Jean-pierre

    2013-01-01

    For an abelian category C and a filtrant preordered set Lambda, we prove that the derived category of the quasi-abelian category of filtered objects in C indexed by Lambda is equivalent to the derived category of the abelian category of functors from Lambda to C. We apply this result to the study of the category of filtered modules over a filtered ring in a tensor category.

  13. Konstruktion av en FIR filter generator

    OpenAIRE

    Broddfelt, Michel

    2003-01-01

    In this thesis a FIR filter generator has been designed. The program generates FIR filters in the form of VHDL files. Four different filter structures have been implemented in the generator: Direct Form (DF), Differential Coefficients Method (DCM), polyphase filters and (2-by-2) filters. The focus of the thesis was to implement filter structures that create FIR filters with as low power consumption and area as possible. The generator has been implemented in C++. The C++ program creates t...

  14. Anisotropic Total Variation Filtering

    International Nuclear Information System (INIS)

    Total variation regularization and anisotropic filtering have been established as standard methods for image denoising because of their ability to detect and keep prominent edges in the data. Both methods, however, introduce artifacts: In the case of anisotropic filtering, the preservation of edges comes at the cost of the creation of additional structures out of noise; total variation regularization, on the other hand, suffers from the stair-casing effect, which leads to gradual contrast changes in homogeneous objects, especially near curved edges and corners. In order to circumvent these drawbacks, we propose to combine the two regularization techniques. To that end we replace the isotropic TV semi-norm by an anisotropic term that mirrors the directional structure of either the noisy original data or the smoothed image. We provide a detailed existence theory for our regularization method by using the concept of relaxation. The numerical examples concluding the paper show that the proposed introduction of an anisotropy to TV regularization indeed leads to improved denoising: the stair-casing effect is reduced while at the same time the creation of artifacts is suppressed.
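
    The isotropic TV baseline that the record's anisotropic term replaces can be sketched as gradient descent on a smoothed 1-D TV energy. The step size, the smoothing parameter eps, the regularization weight, and the toy signal are all assumptions for the sketch; the paper's anisotropic extension is not reproduced.

```python
def tv_denoise_1d(f, lam=0.5, tau=0.05, eps=1e-2, iters=500):
    """Gradient descent on 0.5*||u-f||^2 + lam*sum_i sqrt((u[i+1]-u[i])^2 + eps),
    a smoothed 1-D total-variation energy."""
    u = list(f)
    n = len(u)
    for _ in range(iters):
        g = [u[i] - f[i] for i in range(n)]        # data-fidelity gradient
        for i in range(n - 1):
            d = u[i + 1] - u[i]
            t = lam * d / (d * d + eps) ** 0.5     # derivative of smoothed |d|
            g[i] -= t
            g[i + 1] += t
        u = [u[i] - tau * g[i] for i in range(n)]
    return u

noisy = [0.1, -0.05, 0.08, 1.1, 0.95, 1.02]        # step edge plus noise
out = tv_denoise_1d(noisy)
print([round(v, 2) for v in out])  # flattened plateaus, edge preserved
```

    The within-plateau wiggles are flattened while the large jump between samples 2 and 3 survives, which is the edge-preserving behavior (and, at corners, the stair-casing) that motivates the anisotropic modification.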

  15. Archimedes Plasma Mass Filter

    International Nuclear Information System (INIS)

    The Archimedes' Plasma Mass Filter is a novel plasma-based mass separation device. The basic physics of the Filter concept and a description of its primary application for nuclear waste separation at Hanford will be presented along with initial experimental results from a Demo device. The Demo is a 3.89 m long cylindrical device with a plasma radius of 0.4 m and an axial magnetic field up to 1600 Gauss. The plasma is produced by helicon waves launched by two four-strap antennas placed symmetrically either side of a central source region. One strap of each antenna is powered by one of four phase controlled 1 MW transmitters operating in the frequency range from 3.9 - 26 MHz. Each end of the device has ten concentric ring electrodes used to apply an electric field to rotate the plasma. Application of a parabolic voltage profile results in a rigid body rotation. Heavy ions above the cut-off mass number are extracted radially and collected by a heavy ion collector surrounding the source injection region while light ions are collected at the ends of the cylinder. Initial experiments will use noble gas and trace metals to demonstrate separation before attempting to operate with complex waste characteristic of Hanford

  16. Archimedes Plasma Mass Filter

    Science.gov (United States)

    Freeman, Richard; Agnew, Steve; Anderegg, Francois; Cluggish, Brian; Gilleland, John; Isler, Ralph; Litvak, Andrei; Miller, Robert; O'Neill, Ray; Ohkawa, Tihiro; Pronko, Steve; Putvinski, Sergei; Sevier, Leigh; Sibley, Andy; Umstadter, Karl; Wade, Terry; Winslow, David

    2003-12-01

    The Archimedes' Plasma Mass Filter is a novel plasma-based mass separation device. The basic physics of the Filter concept and a description of its primary application for nuclear waste separation at Hanford will be presented along with initial experimental results from a Demo device. The Demo is a 3.89 m long cylindrical device with a plasma radius of 0.4 m and an axial magnetic field up to 1600 Gauss. The plasma is produced by helicon waves launched by two four-strap antennas placed symmetrically either side of a central source region. One strap of each antenna is powered by one of four phase controlled 1 MW transmitters operating in the frequency range from 3.9 - 26 MHz. Each end of the device has ten concentric ring electrodes used to apply an electric field to rotate the plasma. Application of a parabolic voltage profile results in a rigid body rotation. Heavy ions above the cut-off mass number are extracted radially and collected by a heavy ion collector surrounding the source injection region while light ions are collected at the ends of the cylinder. Initial experiments will use noble gas and trace metals to demonstrate separation before attempting to operate with complex waste characteristic of Hanford.

  17. Spatial filtering with photonic crystals

    Science.gov (United States)

    Maigyte, Lina; Staliunas, Kestutis

    2015-03-01

    Photonic crystals are well known for their celebrated photonic band-gaps—the forbidden frequency ranges, for which the light waves cannot propagate through the structure. The frequency (or chromatic) band-gaps of photonic crystals can be utilized for frequency filtering. In analogy to the chromatic band-gaps and the frequency filtering, the angular band-gaps and the angular (spatial) filtering are also possible in photonic crystals. In this article, we review the recent advances of the spatial filtering using the photonic crystals in different propagation regimes and for different geometries. We review the most evident configuration of filtering in Bragg regime (with the back-reflection—i.e., in the configuration with band-gaps) as well as in Laue regime (with forward deflection—i.e., in the configuration without band-gaps). We explore the spatial filtering in crystals with different symmetries, including axisymmetric crystals; we discuss the role of chirping, i.e., the dependence of the longitudinal period along the structure. We also review the experimental techniques to fabricate the photonic crystals and numerical techniques to explore the spatial filtering. Finally, we discuss several implementations of such filters for intracavity spatial filtering.

  18. Design of Optimal Digital Filters

    Science.gov (United States)

    Chen, Xiangkun

    Four methods for designing digital filters optimal in the Chebyshev sense are developed. The properties of these filters are investigated and compared. An analytic method for designing narrow-band FIR filters using Zolotarev polynomials, which are extensions of Chebyshev polynomials, is proposed. Bandpass and bandstop narrow-band filters as well as lowpass and highpass filters can be designed by this method. The design procedure, related formulae and examples are presented. An improved method of designing optimal minimum phase FIR filters by directly finding zeros is proposed. The zeros off the unit circle are found by an efficient special purpose root-finding algorithm without deflation. The proposed algorithm utilizes the passband minimum ripple frequencies to establish the initial points, and employs a modified Newton's iteration to find the accurate initial points for a standard Newton's iteration. The proposed algorithm can be used to design very long filters (L = 325) with very high stopband attenuations. The design of FIR digital filters in the complex domain is investigated. The complex approximation problem is converted into a near equivalent real approximation problem. A standard linear programming algorithm is used to solve the real approximation problem. Additional constraints are introduced which allow weighting of the phase and/or group delay of the approximation. Digital filters are designed which have nearly constant group delay in the passbands. The desired constant group delay which gives the minimum Chebyshev error is found to be smaller than that of a linear phase filter of the same length. These filters, in addition to having a smaller, approximately constant group delay, have better magnitude characteristics than exactly linear phase filters with the same length. The filters have nearly equiripple magnitude and group delay. 
The problem of IIR digital filter design in the complex domain is formulated such that the existence of best approximation is guaranteed. An efficient and numerically stable algorithm for the design is proposed. The methods to establish a good initial point are investigated. Digital filters are designed which have nearly constant group delay in the passbands. The magnitudes of the filter poles near the passband edge are larger than of those far from the passband edge. A delay overshooting may occur in the transition band (don't care region), and it can be reduced by decreasing the maximum allowed pole magnitude of the design problem at the expense of increasing the approximation error.
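
    For contrast with the Chebyshev-optimal designs described above, a windowed-sinc lowpass FIR shows the basic linear-phase structure such designs improve upon. This is a deliberately non-optimal baseline, not any of the methods in the record; the tap count and cutoff are arbitrary.

```python
import math

def lowpass_fir(num_taps=21, cutoff=0.2):
    """Hamming-windowed-sinc lowpass FIR. `cutoff` is in cycles/sample.
    The symmetric taps give exactly linear phase, but the stopband is
    far from equiripple-optimal."""
    m = num_taps - 1
    h = []
    for n in range(num_taps):
        x = n - m / 2
        ideal = 2 * cutoff if x == 0 else math.sin(2 * math.pi * cutoff * x) / (math.pi * x)
        w = 0.54 - 0.46 * math.cos(2 * math.pi * n / m)   # Hamming window
        h.append(ideal * w)
    return h

h = lowpass_fir()
print(round(sum(h), 3))   # DC gain close to 1
```

    A Chebyshev-optimal (equiripple) design of the same length would trade this window's smooth taper for the minimax stopband error the abstract's methods compute.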

  19. Cryptographically Secure Bloom-Filters

    OpenAIRE

    Ryo Nojima; Youki Kadobayashi

    2009-01-01

    In this paper, we propose a privacy-preserving variant of Bloom-filters. The Bloom-filter has many applications such as hash-based IP-traceback systems and Web cache sharing. In some of those applications, equipping the Bloom-filter with the privacy-preserving mechanism is crucial for the deployment. In this paper, we propose a cryptographically secure privacy-preserving Bloom-filter protocol. We propose such two protocols based on blind signatures and oblivious pseudorandom functions, respe...

  20. Cryptographically Secure Bloom-Filters

    Directory of Open Access Journals (Sweden)

    Ryo Nojima

    2009-08-01

    Full Text Available In this paper, we propose a privacy-preserving variant of Bloom-filters. The Bloom-filter has many applications such as hash-based IP-traceback systems and Web cache sharing. In some of those applications, equipping the Bloom-filter with the privacy-preserving mechanism is crucial for the deployment. In this paper, we propose a cryptographically secure privacy-preserving Bloom-filter protocol. We propose such two protocols based on blind signatures and oblivious pseudorandom functions, respectively. To show that the proposed protocols are secure, we provide a reasonable security definition and prove the security.
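
    As a much weaker stand-in for the blind-signature and oblivious-PRF protocols proposed in the record, a Bloom filter whose positions are derived with a secret HMAC key illustrates one ingredient: without the key, an observer cannot recompute which items map to which bits. This is explicitly not the paper's protocol, and the key, sizes, and items are invented.

```python
import hashlib
import hmac

class KeyedBloom:
    """Bloom filter whose hash positions are keyed with HMAC-SHA256."""

    def __init__(self, key, m=512, k=4):
        self.key, self.m, self.k = key, m, k
        self.bits = [0] * m

    def _slots(self, item):
        for i in range(self.k):
            d = hmac.new(self.key, f"{i}:{item}".encode(), hashlib.sha256).digest()
            yield int.from_bytes(d[:8], "big") % self.m

    def add(self, item):
        for s in self._slots(item):
            self.bits[s] = 1

    def query(self, item):
        return all(self.bits[s] for s in self._slots(item))

bf = KeyedBloom(b"secret-key")
bf.add("alice@example.org")
print(bf.query("alice@example.org"), bf.query("bob@example.org"))
```

    The paper's constructions go further: they let a party insert or query without the key holder learning the items, which simple keyed hashing cannot provide.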

  1. Central difference predictive filter for attitude determination with low precision sensors and model errors

    Science.gov (United States)

    Cao, Lu; Chen, Xiaoqian; Misra, Arun K.

    2014-12-01

    Attitude determination is one of the key technologies for the Attitude Determination and Control System (ADCS) of a satellite. However, serious model errors may exist which will affect the estimation accuracy of the ADCS, especially for a small satellite with low precision sensors. In this paper, a central difference predictive filter (CDPF) is proposed for attitude determination of small satellites with model errors and low precision sensors. The new filter is derived by introducing Stirling's polynomial interpolation formula to extend the traditional predictive filter (PF). It is shown that the proposed filter has higher accuracy in the estimation of system states than the traditional PF. It is known that the unscented Kalman filter (UKF) has also been used in the ADCS of small satellites with low precision sensors. In order to evaluate the performance of the proposed filter, the UKF is also employed to compare it with the CDPF. Numerical simulations show that the proposed CDPF is more effective and robust in dealing with model errors and low precision sensors than the UKF or the traditional PF.
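
    The Stirling-interpolation idea behind the CDPF reduces, in one dimension, to the central difference, which replaces an analytic derivative with two function evaluations. The filter applies this to multivariate attitude dynamics; the sketch below shows only the scalar kernel, with a test function of my choosing.

```python
def central_diff(f, x, h=1e-3):
    """Second-order central (Stirling) difference approximation of f'(x):
    the derivative-free substitute for an analytic Jacobian."""
    return (f(x + h) - f(x - h)) / (2 * h)

f = lambda x: x ** 3
print(round(central_diff(f, 2.0), 4))  # ~12.0 (the exact derivative is 12)
```

    The error of this approximation is O(h²), which is why central-difference filters typically match or beat first-order linearization when the model is poorly known.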

  2. Toward reliable ensemble Kalman filter estimates of CO2 fluxes

    Science.gov (United States)

    Chatterjee, Abhishek; Michalak, Anna M.; Anderson, Jeffrey L.; Mueller, Kim L.; Yadav, Vineet

    2012-11-01

    The use of ensemble filters for estimating sources and sinks of carbon dioxide (CO2) is becoming increasingly common, because they provide a relatively computationally efficient framework for assimilating high-density observations of CO2. Their applicability for estimating fluxes at high resolutions and the equivalence of their estimates to those from more traditional "batch" inversion methods have not been demonstrated, however. In this study, we introduce a Geostatistical Ensemble Square Root Filter (GEnSRF) as a prototypical filter and examine its performance using a synthetic data study over North America at a high spatial (1° × 1°) and temporal (3-hourly) resolution. The ensemble performance, both in terms of estimates and associated uncertainties, is benchmarked against a batch inverse modeling setup in order to isolate and quantify the degradation in the estimates due to the numerical approximations and parameter choices in the ensemble filter. The examined case studies demonstrate that adopting state-of-the-art covariance inflation and localization schemes is a necessary but not sufficient condition for ensuring good filter performance, as defined by its ability to yield reliable flux estimates and uncertainties across a range of resolutions. Observational density is found to be another critical factor for stabilizing the ensemble performance, which is attributed to the lack of a dynamical model for evolving the ensemble between assimilation times. This and other results point to key differences in the applicability of ensemble approaches to carbon cycle science relative to their use in meteorological applications, where these tools were originally developed.
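
    A toy scalar analysis step shows the ensemble mechanics that filters like GEnSRF build on. This perturbed-observation form is a simpler cousin of the square-root filter named in the record, not the GEnSRF itself; the ensemble values, noise variance, and seed are invented.

```python
import random

def enkf_update(ensemble, z, r=0.25, seed=0):
    """Scalar ensemble Kalman filter analysis step with perturbed
    observations: the gain is estimated from the ensemble spread."""
    rng = random.Random(seed)
    n = len(ensemble)
    xm = sum(ensemble) / n
    pxx = sum((x - xm) ** 2 for x in ensemble) / (n - 1)   # sample variance
    k = pxx / (pxx + r)                                    # Kalman gain
    # Each member assimilates the observation plus its own noise draw.
    return [x + k * (z + rng.gauss(0, r ** 0.5) - x) for x in ensemble]

ens = [0.8, 1.0, 1.2, 0.9, 1.1]
post = enkf_update(ens, 2.0)
print(round(sum(post) / len(post), 2))  # mean pulled toward the observation
```

    With only five members the sample variance (and hence the gain) is noisy, which is why the inflation and localization schemes discussed in the abstract matter in practice.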

  3. Instructional Design Processes and Traditional Colleges

    Science.gov (United States)

    Vasser, Nichole

    2010-01-01

    Traditional colleges that have implemented distance education programs would benefit from using instructional design processes to develop their courses. Instructional design processes provide the framework for designing and delivering quality online learning programs in a highly competitive educational market. Traditional college leaders play a…

  4. Dene Schoolchildren Benefit from Traditional Knowledge.

    Science.gov (United States)

    Latta, Maureen

    1995-01-01

    Describes a unique program that combines research of traditional Dene knowledge with school curricula in an effort to preserve knowledge of traditional Dene medicine. Using participatory action research methodology, researchers documented a wealth of medical knowledge from Dogrib community elders and cross-referenced the database with school…

  5. Comet and Meteorite Traditions of Aboriginal Australians

    CERN Document Server

    Hamacher, Duane W

    2014-01-01

    Of the hundreds of distinct Aboriginal cultures of Australia, many have oral traditions rich in descriptions and explanations of comets, meteors, meteorites, airbursts, impact events, and impact craters. These views generally attribute these phenomena to spirits, death, and bad omens. There are also many traditions that describe the formation of meteorite craters as well as impact events that are not known to Western science.

  6. Librarianship and Oral Tradition in Africa.

    Science.gov (United States)

    Iwuji, H. O. M.

    1990-01-01

    Discusses the oral tradition of Africa as history, as literature, and as an ongoing characteristic of African culture. It is argued that libraries should accommodate this tradition by actively gathering oral history and by providing community services based on oral, rather than written, materials. (Nine references) (CLB)

  7. Tradition and Revolution in ESL Teaching.

    Science.gov (United States)

    Raimes, Ann

    1983-01-01

    Explores the development of language teaching in light of Thomas Kuhn's theory of scientific revolution and briefly defines the positivist tradition in language teaching. Argues that the current emphasis on communication does not mark the emergence of a new paradigm, as it still operates in the positivist tradition, but rather a paradigm shift.…

  8. Testing Dual Rotary Filters - 12373

    International Nuclear Information System (INIS)

    The Savannah River National Laboratory (SRNL) installed and tested two hydraulically connected SpinTek® Rotary Micro-filter units to determine the behavior of a multiple filter system and develop a multi-filter automated control scheme. Developing and testing the control of multiple filters was the next step in the development of the rotary filter for deployment. The test stand was assembled using as much of the hardware planned for use in the field, including instrumentation and valving. The control scheme developed will serve as the basis for the scheme used in deployment. The multi-filter setup was controlled via an Emerson DeltaV control system running version 10.3 software. Emerson model MD controllers were installed to run the control algorithms developed during this test. Savannah River Remediation (SRR) Process Control Engineering personnel developed the software used to operate the process test model. While a variety of control schemes were tested, two primary algorithms provided extremely stable control as well as significant resistance to process upsets that could lead to equipment interlock conditions. The control system was tuned to provide satisfactory response to changing conditions during the operation of the multi-filter system. Stability was maintained through the startup and shutdown of one of the filter units while the second was still in operation. The equipment selected for deployment, including the concentrate discharge control valve, the pressure transmitters, and flow meters, performed well. Automation of the valve control integrated well with the control scheme and, when used in concert with the other control variables, allowed automated control of the dual rotary filter system. Experience acquired with multi-filter system behavior and with the system layout during this test helped to identify areas where the current deployment rotary filter installation design could be improved.
Completion of this testing provides the necessary information on the control and system behavior that will be used in deployment on actual waste. (authors)

  9. Why breast cancer patients seek traditional healers.

    Science.gov (United States)

    Muhamad, Mazanah; Merriam, Sharan; Suhami, Norhasmilia

    2012-01-01

    Traditional healing is a common practice in low and middle income countries such as Malaysia. Eighty percent of Malaysians consult traditional healers or "bomoh" at some time in their life for health-related issues. The purpose of our study was to explore why breast cancer patients visit traditional healers. This is a qualitative study utilizing in-depth interviews with 11 cancer survivors who sought both traditional and Western medicine. The findings revealed the following reasons for which patients seek traditional healers: (1) recommendation from family and friends, (2) sanction from family, (3) perceived benefit and compatibility, (4) healer credibility, and (5) reservation with Western medicine and system delay. These factors work together and are strongly influenced by the Malaysian cultural context. The issue with the Western health system is common in a developing country with limited health facilities. PMID:22295249

  10. Identifying seasonal stars in Kaurna astronomical traditions

    CERN Document Server

    Hamacher, Duane W

    2015-01-01

    Early ethnographers and missionaries recorded Aboriginal languages and oral traditions across Australia. Their general lack of astronomical training resulted in misidentifications, transcription errors, and omissions in these records. Additionally, many of these early records are fragmented. In western Victoria and southeast South Australia, many astronomical traditions were recorded, but curiously, some of the brightest stars in the sky were omitted. Scholars claimed these stars did not feature in Aboriginal traditions. This under-representation continues to be repeated in the literature, but current research shows that some of these stars may in fact feature in Aboriginal traditions and could be seasonal calendar markers. This paper uses established techniques in cultural astronomy to identify seasonal stars in the traditions of the Kaurna Aboriginal people of the Adelaide Plains, South Australia.

  11. Are Supernovae Recorded in Indigenous Astronomical Traditions?

    CERN Document Server

    Hamacher, Duane W

    2014-01-01

    Novae and supernovae are rare astronomical events that would have had an influence on the sky-watching peoples who witnessed them. Although several bright novae/supernovae have been visible during recorded human history, there are many proposed but no confirmed accounts of supernovae in oral traditions or material culture. Criteria are established for confirming novae/supernovae in oral and material culture, and claims from around the world are discussed to determine if they meet these criteria. Australian Aboriginal traditions are explored for possible descriptions of novae/supernovae. Although representations of supernovae may exist in Indigenous traditions, and an account of a nova in Aboriginal traditions has been confirmed, there are currently no confirmed accounts of supernovae in Indigenous oral or material traditions.

  12. Why Breast Cancer Patients Seek Traditional Healers

    International Nuclear Information System (INIS)

    Traditional healing is a common practice in low and middle income countries such as Malaysia. Eighty percent of Malaysians consult traditional healers or bomoh at some time in their life for health-related issues. The purpose of our study was to explore why breast cancer patients visit traditional healers. This is a qualitative study utilizing in-depth interviews with 11 cancer survivors who sought both traditional and Western medicine. The findings revealed the following reasons for which patients seek traditional healers: (1) recommendation from family and friends, (2) sanction from family, (3) perceived benefit and compatibility, (4) healer credibility, and (5) reservation with Western medicine and system delay. These factors work together and are strongly influenced by the Malaysian cultural context. The issue with the Western health system is common in a developing country with limited health facilities.

  13. Reduction of Data Sparsity in Collaborative Filtering based on Fuzzy Inference Rules

    Directory of Open Access Journals (Sweden)

    Atisha Sachan

    2013-06-01

    Full Text Available Collaborative filtering recommender systems play a very demanding and significant role in this era of internet information and e-commerce. Collaborative filtering predicts user preferences from past user behaviour or user-item relationships. Though it has many advantages, it also has some limitations, such as sparsity, scalability, accuracy, and the cold-start problem. In this paper we propose a method that helps in reducing sparsity to enhance recommendation accuracy. We developed fuzzy inference rules which are easy to implement and also give better results. A comparison experiment is also performed with two previous methods, Traditional Collaborative Filtering (TCF) and the Hybrid User Model technique (HUMCF).
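    As a concrete point of reference, the neighbourhood-based collaborative filtering that the paper builds on can be sketched in a few lines. The toy ratings, the cosine similarity over co-rated items, and the weighting below are illustrative assumptions; the paper's fuzzy inference rules are not reproduced here.

```python
import math

# Toy sparse user-item ratings; names and values are invented.
ratings = {
    "alice": {"film1": 5, "film2": 3, "film3": 4},
    "bob":   {"film1": 5, "film2": 3, "film4": 2},
    "carol": {"film1": 1, "film2": 5, "film4": 5},
}

def cosine_sim(u, v):
    # Cosine similarity restricted to items both users rated.
    common = set(ratings[u]) & set(ratings[v])
    if not common:
        return 0.0
    num = sum(ratings[u][i] * ratings[v][i] for i in common)
    du = math.sqrt(sum(ratings[u][i] ** 2 for i in common))
    dv = math.sqrt(sum(ratings[v][i] ** 2 for i in common))
    return num / (du * dv)

def predict(user, item):
    # Similarity-weighted average over neighbours who rated the item.
    num = den = 0.0
    for other in ratings:
        if other != user and item in ratings[other]:
            s = cosine_sim(user, other)
            num += s * ratings[other][item]
            den += abs(s)
    return num / den if den else None

print(round(predict("alice", "film4"), 2))  # prints 3.21
```

    Sparsity shows up directly in this sketch: predict returns None whenever no neighbour has rated the item, which is the gap the paper's fuzzy rules aim to fill.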

  14. Application of Archimedes Filter for Reduction of Hanford HLW

    International Nuclear Information System (INIS)

    Archimedes Technology Group, Inc., is developing a plasma mass separator called the Archimedes Filter that separates waste oxide mixtures ion by ion into two mass groups: light and heavy. For the first time, it is feasible to separate large amounts of material atom by atom in a single pass device. Although vacuum ion based electromagnetic separations have been around for many decades, they have traditionally depended on ion beam manipulation. Neutral plasma devices, on the other hand, are much easier, less costly, and permit several orders of magnitude greater throughput. The Filter has many potential applications in areas where separation of species is otherwise difficult or expensive. In particular, radioactive waste sludges at Hanford have been a particularly difficult issue for pretreatment and immobilization. Over 75% of Hanford HLW oxide mass (excluding water, carbon, and nitrogen) has mass less than 59 g/mol. On the other hand, 99.9% of radionuclide activity has mass greater than 89 g/mol. Therefore, Filter mass separation tuned to this cutoff would have a dramatic effect on the amount of IHLW produced--in fact IHLW would be reduced by a factor of at least four. The Archimedes Filter is a brand new tool for the separations specialist's toolbox. In this paper, we show results that describe the extent to which the Filter separates ionized material. Such results provide estimates for the potential advantages of Filter tunability, both in cutoff mass (electric and magnetic fields) and in degree of ionization (plasma power). Archimedes is now engaged in design and fabrication of its Demonstration Filter separator and intends to perform a full-scale treatment of Hanford high-level waste surrogates. The status of the Demo project will be described.
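    The factor-of-four claim follows from simple mass bookkeeping. In the sketch below, the sludge composition and the cutoff are hypothetical numbers chosen only to mirror the percentages quoted in the abstract (75% of oxide mass below 59 g/mol, activity above 89 g/mol); they are not Hanford data.

```python
CUTOFF = 74.0  # g/mol; hypothetical tuning point between 59 and 89

# Hypothetical composition: (species, molar mass in g/mol, mass fraction).
sludge = [("Na", 23.0, 0.25), ("Al", 27.0, 0.40), ("Fe", 55.8, 0.10),
          ("Sr-90", 89.9, 0.01), ("Cs-137", 136.9, 0.01), ("U-238", 238.0, 0.23)]

light = sum(frac for _, mass, frac in sludge if mass < CUTOFF)  # removed fraction
heavy = 1.0 - light                                             # immobilized as IHLW
print(round(1.0 / heavy, 2))  # volume-reduction factor: 4.0
```

    Shifting CUTOFF in this sketch shows why tunability matters: the reduction factor is set entirely by how much mass lands in the light group.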

  15. Application of Archimedes Filter for Reduction of Hanford HLW

    Energy Technology Data Exchange (ETDEWEB)

    Gilleland, J.; Agnew, S.; Cluggish, B.; Freeman, R.; Miller, R.; Putvinski, S.; Sevier, L.; Umstadter, K.

    2002-02-26

    Archimedes Technology Group, Inc., is developing a plasma mass separator called the Archimedes Filter that separates waste oxide mixtures ion by ion into two mass groups: light and heavy. For the first time, it is feasible to separate large amounts of material atom by atom in a single pass device. Although vacuum ion based electromagnetic separations have been around for many decades, they have traditionally depended on ion beam manipulation. Neutral plasma devices, on the other hand, are much easier, less costly, and permit several orders of magnitude greater throughput. The Filter has many potential applications in areas where separation of species is otherwise difficult or expensive. In particular, radioactive waste sludges at Hanford have been a particularly difficult issue for pretreatment and immobilization. Over 75% of Hanford HLW oxide mass (excluding water, carbon, and nitrogen) has mass less than 59 g/mol. On the other hand, 99.9% of radionuclide activity has mass greater than 89 g/mol. Therefore, Filter mass separation tuned to this cutoff would have a dramatic effect on the amount of IHLW produced--in fact IHLW would be reduced by a factor of at least four. The Archimedes Filter is a brand new tool for the separations specialist's toolbox. In this paper, we show results that describe the extent to which the Filter separates ionized material. Such results provide estimates for the potential advantages of Filter tunability, both in cutoff mass (electric and magnetic fields) and in degree of ionization (plasma power). Archimedes is now engaged in design and fabrication of its Demonstration Filter separator and intends to perform a full-scale treatment of Hanford high-level waste surrogates. The status of the Demo project will be described.

  16. Position USBL/DVL Sensor-based Navigation Filter in the presence of Unknown Ocean Currents

    CERN Document Server

    Morgado, M; Oliveira, P; Silvestre, C

    2010-01-01

    This paper presents a novel approach to the design of globally asymptotically stable (GAS) position filters for Autonomous Underwater Vehicles (AUVs) based directly on the nonlinear sensor readings of an Ultra-short Baseline (USBL) and a Doppler Velocity Log (DVL). Central to the proposed solution is the derivation of a linear time-varying (LTV) system that fully captures the dynamics of the nonlinear system, allowing for the use of powerful linear system analysis and filtering design tools that yield GAS filter error dynamics. Simulation results reveal that the proposed filter is able to achieve the same level of performance as more traditional solutions, such as the Extended Kalman Filter (EKF), while providing, at the same time, GAS guarantees, which are absent for the EKF.
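    For contrast with the paper's LTV design, the snippet below is a plain linear Kalman filter fusing a noisy position fix (USBL-like) with a noisy velocity measurement (DVL-like) in one dimension. The constant-velocity model and all noise levels are invented for illustration; this is the "traditional solution" baseline, not the GAS filter of the paper.

```python
import random

# 1-D constant-velocity Kalman filter with sequential scalar updates.
random.seed(0)
dt, q, r_pos, r_vel = 0.1, 0.01, 1.0, 0.1   # step, process/measurement variances

x = [0.0, 0.0]                    # state estimate [position, velocity]
P = [[10.0, 0.0], [0.0, 10.0]]    # estimate covariance

true_pos, true_vel = 0.0, 1.0
for _ in range(300):
    true_pos += true_vel * dt
    z_pos = true_pos + random.gauss(0, r_pos ** 0.5)   # USBL-like fix
    z_vel = true_vel + random.gauss(0, r_vel ** 0.5)   # DVL-like measurement

    # Predict with x' = F x, F = [[1, dt], [0, 1]], Q = diag(q, q).
    x = [x[0] + dt * x[1], x[1]]
    P = [[P[0][0] + dt * (P[1][0] + P[0][1]) + dt * dt * P[1][1] + q,
          P[0][1] + dt * P[1][1]],
         [P[1][0] + dt * P[1][1], P[1][1] + q]]

    # Sequential updates for H = [1, 0] (position) then H = [0, 1] (velocity).
    for idx, (z, r) in enumerate([(z_pos, r_pos), (z_vel, r_vel)]):
        s = P[idx][idx] + r                       # innovation variance
        k = [P[0][idx] / s, P[1][idx] / s]        # Kalman gain column
        innov = z - x[idx]
        x = [x[0] + k[0] * innov, x[1] + k[1] * innov]
        P = [[P[0][0] - k[0] * P[idx][0], P[0][1] - k[0] * P[idx][1]],
             [P[1][0] - k[1] * P[idx][0], P[1][1] - k[1] * P[idx][1]]]

print(round(x[1], 2))  # velocity estimate, near the true 1.0
```

    The sequential-update trick (one scalar update per sensor) is a standard equivalent of the vector update when measurement noises are independent.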

  17. Filters For Chest Radiography

    Science.gov (United States)

    Ramanathan, N.; Paron, J.

    1980-08-01

    The objective of low dose radiography is achieved by a judicious combination of proper kV selection, fast film-screen systems and beam filtration. A systematic study of filters was undertaken to evaluate the improvements that can be realized in terms of patient Entrance Skin Exposures (ESE) for chest radiographs. The Picker CD 135 Generator and the Automatic Chest Filmer with dynamic phototiming were used for the study. The kV dependence of ESE with various amounts of zinc and aluminum filtration is presented. The effect of filtration on image contrast is discussed. The variations of ESE with phantom thickness under different filtration conditions are also considered. It was found that the ESE can be reduced by as much as a factor of 1.8 ± .1 with no significant increase in tube loading.

  18. Latent common origin of bilateral filter and non-local means filter

    Science.gov (United States)

    Tanaka, Masayuki; Okutomi, Masatoshi

    2010-01-01

    The bilateral filter and the non-local means (NL-means) filter are known as very powerful nonlinear filters. The first contribution of this paper is to give a general framework which involves the bilateral filter and the NL-means filter. The general framework is derived based on Bayesian inference. Our analysis reveals that the range weight in the bilateral filter and the similarity measure in the NL-means filter are associated with a noise model or a likelihood distribution. The second contribution is to extend the bilateral filter and the NL-means filter for a general noise model. We also provide a filter classification. The filter classification framework clarifies the differences among existing filters and helps us to develop new filters. As an example of future directions, we extend the bilateral filter and the NL-means filter for a general noise model. Both extended filters are theoretically and experimentally justified.
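    A minimal 1-D version makes the range-weight idea concrete. The Gaussian forms and parameter values below are standard textbook choices, assumed rather than taken from the paper:

```python
import math

# 1-D bilateral filter: each sample is replaced by an average of its
# neighbours, weighted by spatial distance AND intensity difference
# (the "range weight"). Radius and sigmas are arbitrary choices.
def bilateral_1d(signal, radius=2, sigma_s=1.5, sigma_r=10.0):
    out = []
    for i, v in enumerate(signal):
        num = den = 0.0
        for j in range(max(0, i - radius), min(len(signal), i + radius + 1)):
            w = math.exp(-((i - j) ** 2) / (2 * sigma_s ** 2)
                         - ((v - signal[j]) ** 2) / (2 * sigma_r ** 2))
            num += w * signal[j]
            den += w
        out.append(num / den)
    return out

# A noisy step edge: each plateau is smoothed, but the jump survives,
# unlike with a plain Gaussian blur.
step = [10, 12, 9, 11, 10, 90, 91, 89, 92, 90]
print([round(x) for x in bilateral_1d(step)])
```

    Neighbours across the edge get a near-zero range weight, which is exactly the behavior the paper reinterprets as a likelihood: swapping the range Gaussian for another noise model yields the generalized filters it studies.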

  19. Compressed sensing & sparse filtering

    CERN Document Server

    Carmi, Avishy Y; Godsill, Simon J

    2013-01-01

    This book is aimed at presenting concepts, methods and algorithms able to cope with undersampled and limited data. One such trend that recently gained popularity and to some extent revolutionised signal processing is compressed sensing. Compressed sensing builds upon the observation that many signals in nature are nearly sparse (or compressible, as they are normally referred to) in some domain, and consequently they can be reconstructed to within high accuracy from far fewer observations than traditionally held to be necessary. Apart from compressed sensing this book contains other related app
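    The "fewer observations than traditionally necessary" point can be seen in a tiny example: a 1-sparse length-4 signal recovered exactly from only 2 linear measurements by one step of matching pursuit. The measurement matrix and the signal are arbitrary illustrative choices.

```python
# 2 x 4 measurement matrix (rows = measurements, unit-norm columns).
A = [[1.0, 0.6, -0.8, 0.0],
     [0.0, 0.8,  0.6, 1.0]]
x_true = [0.0, 0.0, 3.0, 0.0]   # sparse unknown: one nonzero entry

# y = A x: only 2 numbers are observed for a length-4 signal.
y = [sum(A[r][c] * x_true[c] for c in range(4)) for r in range(2)]

# Matching pursuit, one iteration: pick the column most correlated with y,
# then project y onto it to get the coefficient.
scores = [abs(sum(A[r][c] * y[r] for r in range(2))) for c in range(4)]
best = max(range(4), key=lambda c: scores[c])
norm2 = sum(A[r][best] ** 2 for r in range(2))
coeff = sum(A[r][best] * y[r] for r in range(2)) / norm2

print(best, round(coeff, 6))  # prints: 2 3.0
```

    With dense (non-sparse) signals this underdetermined system has infinitely many solutions; sparsity is what makes the recovery well posed.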

  20. AER image filtering

    Science.gov (United States)

    Gómez-Rodríguez, F.; Linares-Barranco, A.; Paz, R.; Miró-Amarante, L.; Jiménez, G.; Civit, A.

    2007-05-01

    Address Event Representation (AER) is an emergent neuromorphic interchip communication protocol that allows real-time virtual massive connectivity among a huge number of neurons located on different chips.[1] By exploiting high speed digital communication circuits (with nano-second timing), synaptic neural connections can be time multiplexed, while neural activity signals (with milli-second timings) are sampled at low frequencies. Neurons generate "events" according to their activity levels. That is, more active neurons generate more events per unit time and access the interchip communication channel more frequently than neurons with low activity. In neuromorphic system development, AER brings some advantages for developing real-time image processing systems: (1) AER represents the information as a time-continuous stream, not as a frame; (2) AER sends the most important information first (although this depends on the sender); (3) AER allows information to be processed as soon as it is received. When AER is used in the artificial vision field, each pixel is treated as a neuron, so a pixel's intensity is represented as a sequence of events; by modifying the number and the frequency of these events, it is possible to perform image filtering. In this paper we present four image filters using AER: (a) noise addition and suppression, (b) brightness modification, (c) single moving object tracking and (d) geometrical transformations (rotation, translation, reduction and magnification). For testing and debugging, we use the USB-AER board developed by the Robotic and Technology of Computers Applied to Rehabilitation (RTCAR) research group. This board is based on an FPGA devoted to managing the AER functionality. The board also includes a micro-controller for USB communication, 2 Mbytes of RAM and 2 AER ports (one for input and one for output).
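    The pixel-intensity-as-event-frequency idea is easy to sketch. In the toy below, the frame values, the time window, and the event coding are invented (intensities are chosen to divide the window evenly so counts reconstruct exactly); real AER hardware timestamps spikes asynchronously, but the counting logic is the same in spirit.

```python
def to_events(frame, window=100):
    """Emit (time, pixel) events; brighter pixels fire more often."""
    events = []
    for pix, intensity in enumerate(frame):
        if intensity <= 0:
            continue
        period = window // intensity               # inter-event interval
        events += [(t, pix) for t in range(0, window, max(1, period))]
    return sorted(events)

def reconstruct(events, n_pixels):
    """Recover intensities by counting events per pixel."""
    counts = [0] * n_pixels
    for _, pix in events:
        counts[pix] += 1
    return counts

frame = [0, 5, 20, 50]           # toy 4-pixel "image"
ev = to_events(frame)
print(reconstruct(ev, 4))        # prints: [0, 5, 20, 50]

# Brightness modification in the event domain: keeping every second event
# halves each pixel's event rate, i.e. dims the image (filter (b) above).
dimmed = [e for i, e in enumerate(ev) if i % 2 == 0]
```
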

  1. X-band preamplifier filter

    Science.gov (United States)

    Manshadi, F.

    1986-01-01

    A low-loss bandstop filter designed and developed for the Deep Space Network's 34-meter high-efficiency antennas is described. The filter is used for protection of the X-band traveling wave masers from the 20-kW transmitter signal. A combination of empirical and theoretical techniques was employed as well as computer simulation to verify the design before fabrication.

  2. Chopped filter for nuclear spectroscopy

    International Nuclear Information System (INIS)

    Some of the theoretical and practical factors affecting the energy resolution of a spectrometry system are considered, especially those related to the signal-to-noise ratio, and a time-variant filter with the transfer function of the theoretical optimum filter, during its active time, is proposed. A prototype has been tested and experimental results are presented. (Author)

  3. The double well mass filter

    Science.gov (United States)

    Gueroult, Renaud; Rax, Jean-Marcel; Fisch, Nathaniel J.

    2014-02-01

    Various mass filter concepts based on rotating plasmas have been suggested with the specific purpose of nuclear waste remediation. We report on a new rotating mass filter combining radial separation with axial extraction. The radial separation of the masses is the result of a "double-well" in effective radial potential in rotating plasma with a sheared rotation profile.

  4. The double well mass filter

    International Nuclear Information System (INIS)

    Various mass filter concepts based on rotating plasmas have been suggested with the specific purpose of nuclear waste remediation. We report on a new rotating mass filter combining radial separation with axial extraction. The radial separation of the masses is the result of a “double-well” in effective radial potential in rotating plasma with a sheared rotation profile

  5. The double well mass filter

    Energy Technology Data Exchange (ETDEWEB)

    Gueroult, Renaud; Fisch, Nathaniel J. [Princeton Plasma Physics Laboratory, Princeton University, Princeton, New Jersey 08543 (United States)]; Rax, Jean-Marcel [Laboratoire d'optique appliquée - LOA, Ecole Polytechnique, Chemin de la Hunière, 91761 Palaiseau Cedex (France)]

    2014-02-15

    Various mass filter concepts based on rotating plasmas have been suggested with the specific purpose of nuclear waste remediation. We report on a new rotating mass filter combining radial separation with axial extraction. The radial separation of the masses is the result of a “double-well” in effective radial potential in rotating plasma with a sheared rotation profile.

  6. UV-filtered overlap fermions

    OpenAIRE

    Dürr, Stephan; Hoelbling, Christian; Wenger, Urs

    2005-01-01

    We discuss the kernel spectrum, locality properties and the axial-vector renormalization constant of UV-filtered overlap fermions. We find that UV-filtered overlap fermions have a better conditioned kernel, better locality and an axial-vector renormalization constant closer to 1 than their unfiltered counterparts, even if the shift parameter $\\rho$ is simply set to 1.

  7. Identification Filtering with fuzzy estimations

    Directory of Open Access Journals (Sweden)

    J.J Medel J

    2012-10-01

    Full Text Available A digital identification filter interacts with an output reference model signal known as a black-box output system. The identification technique commonly needs the transition and gain matrices. Both estimations are based on a mean-square criterion, obtaining the minimum output error as the best filtering estimate. The evolution system exhibits adaptive properties, and the identification mechanism incorporates fuzzy logic strategies that affect, in a probabilistic sense, the evolution of the identification filter. The fuzzy estimation filter describes the transition and gain matrices in two forms, applying actions that affect the identification structure. Basically, the adaptive criterion comprises the set of inference mechanisms and the Knowledge and Rule bases, selecting the optimal coefficients in distribution form. This paper describes the fuzzy strategies applied to the Kalman filter transition function and gain matrices. The simulation results were developed using Matlab©.
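    The flavour of fuzzy gain scheduling can be shown with a scalar filter: the innovation magnitude is fuzzified into "small"/"large" memberships, and the defuzzified gain interpolates between a smoothing gain and a fast-tracking gain. The membership shape, thresholds, and gains below are arbitrary illustrative choices, not the paper's Knowledge and Rule bases.

```python
def membership_large(innov, low=0.5, high=2.0):
    """Ramp membership: 0 below low, 1 above high, linear between."""
    a = abs(innov)
    if a <= low:
        return 0.0
    if a >= high:
        return 1.0
    return (a - low) / (high - low)

def fuzzy_filter(measurements, k_small=0.05, k_large=0.6):
    est, out = measurements[0], []
    for z in measurements:
        innov = z - est
        mu = membership_large(innov)
        k = (1 - mu) * k_small + mu * k_large   # defuzzified (blended) gain
        est += k * innov
        out.append(est)
    return out

# Small noise is heavily smoothed; a genuine jump is tracked quickly.
sig = [0.1, -0.1, 0.05, 0.0, 5.0, 5.1, 4.9, 5.0]
print([round(v, 2) for v in fuzzy_filter(sig)])
```

    A fixed-gain filter must trade smoothing against tracking; the fuzzy rule gives both regimes at once, which is the appeal of letting fuzzy logic drive the Kalman gain.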

  8. Apparatus for filtering radioactive fluids

    International Nuclear Information System (INIS)

    Apparatus is provided for filtering radioactive particles from the cooling and/or auxiliary process water of a nuclear reactor, or nuclear fuel processing plant, or other installations wherein radioactive fluid systems are known to exist. The apparatus affords disposal of the captured particles in a manner which minimizes the exposure of operating personnel to radioactivity. The apparatus comprises a housing adapted to contain a removable filter cartridge assembly, a valve normally closing the lower end of the housing, an upwardly-open shipping cask located below the valve, and an elongated operating rod assembly projecting upwardly from the filter cartridge assembly and through the upper end of the housing to enable a workman to dismount the filter cartridge assembly from its housing and to lower the filter cartridge assembly through the valve and into the cask from a remote location above the housing. (U.S.)

  9. Traditional and Non-Traditional Educational Outcomes: Trade-Off or Complementarity?

    Science.gov (United States)

    van der Wal, Marieke; Waslander, Sietske

    2007-01-01

    Recently, schools have increasingly been charged with enhancing non-traditional academic competencies, in addition to traditional academic competencies. This article raises the question whether schools can implement these new educational goals in their curricula and simultaneously realise the traditional ones or whether a trade-off between…

  10. Kurban (sacrifice) tradition in Western Thracian Turks

    Directory of Open Access Journals (Sweden)

    Füsun Aşkar

    2011-07-01

    Full Text Available This article was prepared within project 04-DPT-007, Preparing a Museum and Archive on Folk Music Instruments, Folk Dances, Traditional Clothes and Folk Music in Anatolia and the Balkans, begun with the research of the Ege University State Turkish Music Conservatory, Turkish Folk Dance Department, in Macedonia on 18 April 2004. During the trip to Greece within the project, on-site monitoring was conducted in the villages of Komotini and Xanthi where the Western Thracian Turks under study live. This work showed that traditions, as cultural remnants carried from past to present and maintained through cultural sanctions, are still kept alive, even if with different applications. One of the traditions of the Western Thracian Turks, the Kurban (Sacrifice) Tradition, attracts attention as a tradition of acting collectively and joining forces, based on cooperation. This article addresses this still-living tradition.

  11. POLITICAL TRADITIONS: THE CONCEPT AND STRUCTURE

    Directory of Open Access Journals (Sweden)

    ??????? ?????????? ??????

    2013-05-01

    Full Text Available The article refers to the theoretical aspects of the study of the political traditions phenomenon. The influence of traditional components of the political culture on the current political process is recognized in contemporary literature, but political traditions rarely become the original subject of scientific research, which explains the vagueness of their interpretation and the need for their systematic understanding. The author analyzes existing interpretations of the concept "tradition", on the basis of which the definition of "political traditions" is formulated as (1) a form of fixation of the meaningful content of the nation's socio-political experience and as (2) a mechanism of political-cultural continuity. The author identifies mental, behavioral and institutional levels in the structure of political traditions. The mental level consists of political symbols, myths and stereotypes, which form the image of political reality and authority, and values and norms, which affect the motivation of political behavior. The behavioral level includes models of behavior and patterns of action, such as political habits and rituals. The institutional level reflects historical features of interaction between branches of power and relations between the state and society. The author pays attention to the influence of structural elements of political traditions on the political consciousness and behavior of individuals and social groups. DOI: http://dx.doi.org/10.12731/2218-7405-2013-4-25

  12. The Decline of Traditional Banking Activities

    Directory of Open Access Journals (Sweden)

    Gabriela Cornelia Piciu

    2011-05-01

    Full Text Available The decline of traditional banking activities raises the issue of the efficiency of financial stability, in terms of quantitative and qualitative aspects: the increasing danger of banking failures, as well as susceptibility due to the increased propensity of banking institutions to assume additional risks, either in the form of riskier loan offers or by engaging in other "non-traditional" financial activities which promise greater profitability but also carry higher risks. Non-traditional banking activities, such as dealing in financial products (financial derivatives), generate increasing risks and vulnerabilities in the form of moral hazard issues. That is the reason why these activities should be regulated just as the traditional activities are. The challenge posed by the decline of traditional banking activities is twofold: the stability of the banking system must be maintained, while the banking system needs to be restructured to achieve financial stability in the long run. One possible way is an appropriate regulatory framework to encourage a transition period of changing the structure of banking activity (reduction of traditional activities and expansion of non-traditional activities) to enable banking institutions to perform a deep, methodical analysis of non-traditional activities, oriented toward banking efficiency.

  13. ‘Traditional’ justice? - The role of traditional authority in traditional justice mechanisms in post-conflict Sierra Leone

    OpenAIRE

    Meesenburg, Anna Welin; Dolberg, Lisbeth Carstensen

    2012-01-01

    The research field of this present thesis is the intersection between traditional authorities and traditional justice mechanisms in the context of post-conflict Sierra Leone. With an increase in civil wars in past decades, questions of how to deal with issues of post-conflict justice and reconciliation have moved to the forefront of the international development agenda. In recent years this field, known as transitional justice, has begun to move away from standardised approaches towar...

  14. Comet and meteorite traditions of Aboriginal Australians

    Science.gov (United States)

    Hamacher, Duane W.

    2014-06-01

    This research contributes to the disciplines of cultural astronomy (the academic study of how past and present cultures understand and utilise celestial objects and phenomena) and geomythology (the study of geological events and the formation of geological features described in oral traditions). Of the hundreds of distinct Aboriginal cultures of Australia, many have oral traditions rich in descriptions and explanations of comets, meteors, meteorites, airbursts, impact events, and impact craters. These views generally attribute these phenomena to spirits, death, and bad omens. There are also many traditions that describe the formation of meteorite craters as well as impact events that are not known to Western science.

  15. Characterizing a Tune-all bandstop filter

    OpenAIRE

    Musoll, Carles; Llamas Garro, Ignacio; Brito Brito, Zabdiel; Pradell I Cara, Lluís; Corona, Alfonso

    2009-01-01

    In this paper a reconfigurable bandstop filter able to reconfigure central frequency, bandwidth and selectivity for fine tuning applications is presented. The reconfigurable filter topology has four poles and a quasielliptic bandstop filter response. The filter is tuned by varactor diodes placed at different locations on the filter topology. The varactors are voltage controlled in pairs due to filter symmetry for central frequency and bandwidth control. An additional v...

  16. Performance data and limits of aerosol filters

    International Nuclear Information System (INIS)

    Under suitable conditions and with an optimum mode of operation, aerosol filters as high-quality mechanical air filters can retain submicron particles, which makes them nearly absolute filters at the present state of the art of air filter technology. This cautious statement shows, however, that the term absolute filter, which is often used, is not correct, as an absolute retention of 100% is never reached under the parameters valid for the various classes of filters. (orig.)
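    The "never exactly 100%" point is naturally expressed through penetration (1 - efficiency): filter stages in series multiply penetrations, which shrink rapidly but never reach zero. The 99.97% figure below is the common HEPA rating at 0.3 µm, used here as an assumed example rather than a value from the abstract.

```python
# Overall penetration of filter stages in series is the product of the
# per-stage penetrations; efficiency can approach but never equal 100%.
def series_penetration(efficiencies):
    p = 1.0
    for e in efficiencies:
        p *= (1.0 - e)
    return p

two_stage = series_penetration([0.9997, 0.9997])
print(f"{two_stage:.2e}")  # prints: 9.00e-08 (overall efficiency 99.999991%)
```
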

  17. LMS order statistic filters adaptation by backpropagation

    OpenAIRE

    Pitas, I.; Vougioukas, S.

    2010-01-01

    A novel class of nonlinear adaptive filters based on order statistics is presented. An LMS algorithm for their adaptation is proposed. This algorithm is essentially a backpropagation algorithm for the adaptation of coefficients that are used before data sorting. The nonlinear filters that can become adaptive by the techniques presented in this paper are the median hybrid filter, the general nonlinear filter structure, the L-filters and the Ll-filters.
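    Of the structures listed, the L-filter is the quickest to sketch: the sliding window is sorted and a weighted sum of the order statistics is taken, with the weights adapted by LMS against a reference. This is the plain L-filter adaptation, not the paper's backpropagation through the sorting step; window length, step size, and the training signal below are arbitrary.

```python
import random

random.seed(1)
N = 5                       # window length
w = [1.0 / N] * N           # start from the moving-average filter

clean = [1.0] * 200         # reference: a constant signal
noisy = [c + random.gauss(0, 0.3) for c in clean]

mu = 0.01                   # LMS step size
for i in range(N - 1, len(noisy)):
    window = sorted(noisy[i - N + 1 : i + 1])       # order statistics
    y = sum(wi * xi for wi, xi in zip(w, window))   # L-filter output
    err = clean[i] - y
    w = [wi + mu * err * xi for wi, xi in zip(w, window)]  # LMS update

print([round(wi, 2) for wi in w])
```

    With Gaussian noise the adapted weights stay close to the moving average; with heavy-tailed noise they shift mass toward the central (median-like) order statistics, which is the appeal of the L-filter family.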

  18. Development of Test Protocols for International Space Station Particulate Filters

    Science.gov (United States)

    Green, Robert D.; Vijayakumar, R.; Agui, Juan H.

    2014-01-01

    Air quality control on the International Space Station (ISS) is a vital requirement for maintaining a clean environment for the crew and the hardware. This becomes a serious challenge in pressurized space compartments since no outside air ventilation is possible, and a larger particulate load is imposed on the filtration system due to lack of gravitational settling. The ISS Environmental Control and Life Support System (ECLSS) uses a filtration system that has been in use for over 14 years and has proven to meet this challenge. The heart of this system is a traditional High-Efficiency Particulate Air (HEPA) filter configured to interface with the rest of the life support elements and provide effective cabin filtration. Over the years, the service life of these filters has been re-evaluated based on limited post-flight tests of returned filters and risk factors. On earth, a well designed and installed HEPA filter will last for several years, e.g. in industrial and research clean room applications. Test methods for evaluating these filters are being developed on the basis of established test protocols used by the industry and the military. This paper will discuss the test methods adopted and test results on prototypes of the ISS filters. The results will assist in establishing whether the service life can be extended for these filters. Results from unused filters that have been in storage will also be presented to ascertain the shelf life and performance deterioration, if any, and determine if the shelf life may be extended. Presently, the inventory of ISS bacterial filters for the ISS Air Revitalization System is nearing the end of its specified shelf life, and a means of testing to confirm whether the shelf life can be extended is of interest to the ISS Program. A filter test setup was designed and built to meet industry testing standards. A CFD analysis was performed to initially determine the optimal duct geometry and flow configuration.
Both a screen and a flow straightener were added to the test duct design to improve flow uniformity, and face velocity profiles were subsequently measured to confirm this. Flow quality and aerosol mixing assessments show that the duct flow is satisfactory for the intended leak testing. Preliminary leak testing was performed on two different ISS filters, one with known perforations and one with limited use, and results confirmed that the testing methods and photometer instrument are sensitive enough to detect and locate compromised sections of an ISS Bacteria Filter Element (BFE). This work is focused on developing test protocols for the ISS BFE filters, but the methodology is general enough to be extended to other present and future spacecraft filters. These techniques for characterizing the test duct and performing leak testing can be applied to acceptance testing and inventory testing for future manned exploration programs with air revitalization filtration needs, possibly even for in-situ filter element integrity testing for extensively long-duration missions. We plan to address the unique needs for test protocols for crewed spacecraft particulate filters by preparing the initial version of a standard, to be documented as a NASA Technical Memorandum (TM), that can potentially be submitted to IEST and ASHRAE for consideration as a new standard for spacecraft applications.

  19. Biodiversity: The benefits of traditional knowledge

    Science.gov (United States)

    Pardo-de-Santayana, Manuel; Macía, Manuel J.

    2015-02-01

    A study of two Balkan ethnic groups living in close proximity finds that traditional knowledge about local plant resources helps communities to cope with periods of famine, and can promote the conservation of biodiversity.

  20. Sustainable architecture in the traditional Iranian homes

    Energy Technology Data Exchange (ETDEWEB)

    Rezaei, Davood; Niloufari, Morteza; Sadegh Falahat, Mohammad [Zanjan University (Iran, Islamic Republic of)], email: d_rezaei@znu.ac.ir, email: mortezagharibeh@yahoo.com, email: safalahat@yahoo.com

    2011-07-01

    With the coming shortage of fossil fuels it is important to develop energy efficient buildings to reduce both energy consumption and pollution at the same time. In Iran, traditional homes have been built in a sustainable manner to withstand the high climate diversity of the country. The aim of this paper is to present the different methods used in Iranian traditional architecture. Among the architectural principles is appropriate orientation of the building to allow the capture of solar energy and at the same time protect against the cold wind. In addition, indigenous materials were used in the constructions to provide the highest degree of comfort possible with minimal damage to the environment. Finally, Iranian traditional architecture took advantage of the soil's constant temperature by building a Shvadan, which is an underground space beneath the house. This article highlights the different Iranian traditional methods which can create a sustainable architecture.

  1. Protecting traditional knowledge from the grassroots up

    Energy Technology Data Exchange (ETDEWEB)

    Arugomedo, Alejandro [ANDES Association (Peru); Pant, Ruchi [Ecoserve (India); Vedavathy, S. [Herbal Folklore Reseach Centre (India); Munyi, Peter [International Centre of Insect Physiology and Ecology (Kenya); Mutta, Doris [Kenya Forestry Research Institute (Kenya); Herrera, Heracilo [Dobbo Yala Foundation (Panama); Song, Yinching; Li, Jingsong [Centre of Chinese Agricultural Policy (China); Swiderska, Krystyna

    2009-06-15

    For indigenous peoples around the world, traditional knowledge based on natural resources such as medicinal herbs forms the core of culture and identity. But this wealth of knowledge is under pressure. Indigenous communities are increasingly vulnerable to eviction, environmental degradation and outside interests eager to monopolise control over their traditional resources. Intellectual property rights such as patents, however, sit uneasily with traditional knowledge: their commercial focus conflicts with fundamental indigenous principles such as resource access and sharing. Local customary law offers a better fit, and findings in China, India, Kenya, Panama and Peru show how this pairing can work in practice. The research has identified common elements, and key differences, in customary law that should inform policy on traditional knowledge and genetic resources.

  2. Intrusions of Modernity on a Traditional Culture.

    Science.gov (United States)

    Thomas, Anne Horsfall

    1991-01-01

    Presents a teacher's impressions of India, gathered during a Fulbright-sponsored study tour. Examines modernizing influences in the midst of traditional culture, religious cultural groups and potential religious conflict, women's status, and problems due to overpopulation. (CH)

  3. Two traditional African settlements - context and configuration

    OpenAIRE

    Steyn, Gerald; Roodt, André

    2003-01-01

    Vernacular African settlements and buildings are widely appreciated for their human scale, aesthetic clarity and harmony with nature. But this appreciation appears to be limited to their iconic and picturesque qualities, and there seems to be little understanding of the value of these architectural traditions as products of historical, ecological, cultural and economic circumstances. This study compares a traditional Tonga compound at Siamundela, southern Zambia with a Banoka village near Khw...

  4. Cultural anthropology of traditional Chinese medicine

    OpenAIRE

    Wan, Xia; Liu, Jian-ping

    2008-01-01

    Abstract: The biological, psychological and sociological model of medicine enriches the old model, which lacked social and humane attributes. The new medical model leads researchers to bring medical anthropology into their work and to value traditional medical systems highly. Cultural anthropology of traditional Chinese medicine (TCM) is part of medical anthropology with three major characteristics: wide research scope, specificity, and integration. It has developed its own research methods, such as field i...

  5. Quality Lessons in Traditional and Electronic Textbook

    Directory of Open Access Journals (Sweden)

    Snežana Laketa

    2015-01-01

    Full Text Available The aim of this study is to verify and assess the quality of lessons in a traditional and an electronic textbook against general standards of textbook quality. The methods of theoretical analysis and content analysis were used. For the content analysis, the teaching unit Measures and Measurement was sampled from two textbooks: one traditional and one electronic. The electronic textbook examined proved to be of high quality and meets the standards of textbook quality.

  6. Milk-based traditional Turkish desserts

    OpenAIRE

    Tulay Ozcan; Lutfiye Yilmaz-Ersan; Arzu Akpinar-Bayizit

    2009-01-01

    Traditional foods are the reflection of cultural inheritance and affect the lifestyle habits. Culture can be viewed as a system of socially transmitted patterns of behaviour that characterises a particular group. Despite the fact of globalisation, these are key elements to accurately estimate a population’s dietary patterns and how these have been shaped through time. In Turkey, a meal with family or friends traditionally ends with a dessert, which is a testimony to the hosts’ hospitality...

  7. Memory, History and the Classical Tradition

    OpenAIRE

    Whitling, Frederick

    2009-01-01

    'Memory' is often confused and conflated with myth; this is in turn connected with the widespread practice of mistaking collective mythology and common myth for the idea of a 'collective memory'. This essay discusses memory and history terminology in the context of the generic concept 'classical tradition'. The case study explored here - the nineteenth-century Walhalla 'temple' near Regensburg in Southern Germany - is an attempt to discuss the classical tradition, focusing on archaeology and architec...

  8. Traditional local communities in international law

    OpenAIRE

    Bessa Da Costa Antunes Rodrigues, Adriana Aparecida

    2013-01-01

    One of the most important innovations of the 1992 Rio Summit was the consolidation of a synergetic approach between human rights and environmental conservation and the introduction of traditional local communities as new subjects of rights in international law. By proclaiming traditional local communities - together with indigenous peoples - as 'custodians of biodiversity', the documents adopted during the meeting called upon States to protect their cultures and lifestyles by, inter alia, enh...

  9. Factors Influencing HEPA Filter Performance

    International Nuclear Information System (INIS)

    Properly functioning HEPA air filtration systems depend on a variety of factors, starting with the use of fully characterized challenge conditions for system design and continuing with process control during operation. This paper addresses factors that should be considered during the design phase as well as operating parameters that can be monitored to ensure filter function and lifetime. HEPA filters used in nuclear applications are expected to meet design, fabrication, and performance requirements set forth in the ASME AG-1 standard. The DOE publication Nuclear Air Cleaning Handbook (NACH) is an additional guidance document for the design and operation of HEPA filter systems in DOE facilities. These two guidelines establish basic maximum operating parameters for temperature, maximum aerosol particle size, maximum particulate matter mass concentration, acceptable differential pressure range, and filter media velocity. Each of these parameters is discussed along with data linking the variability of each parameter to filter function and lifetime. Temporal uncertainty associated with gas composition, temperature, and absolute pressure of the air flow can have a direct impact on the volumetric flow rate of the system, with a corresponding impact on filter media velocity. Correlations between standard units of flow rate (standard cubic meters per minute or cubic feet per minute) and actual volumetric flow rate are shown for variations in relative humidity over a 70 deg. C to 200 deg. C temperature range, as an example of gas composition that, uncorrected, will influence media velocity. The AG-1 standard establishes a 2.5 cm/s (5 feet per minute) ceiling for media velocities of nuclear grade HEPA filters. Data are presented that show the impact of media velocities from 2.0 to 4.0 cm/s (4 to 8 fpm) on differential pressure, filter efficiency, and filter lifetime.
Data will also be presented correlating media velocity effects with two different particle size distributions. (authors)

  10. Image Edges Strengthening Filter Based Color Filter Array Interpolation

    Directory of Open Access Journals (Sweden)

    Mohammed Abdul Malik, B. N. Nagaveni

    2013-09-01

    Full Text Available Most of the digital cameras use color filter arrays instead of beam splitters to capture image data so as to reduce the cost and to gain more efficiency. As a result of this, only one of the required three color samples becomes available at each pixel location and the other two need to be interpolated. This procedure is called Color Filter Array (CFA interpolation or demosaicing. To improve subjective and objective interpolation quality, many demosaicing algorithms have been introduced. We propose an orientation-free edge strength filter and apply it to the demosaicing problem. The output of the edge strength filter is utilized both to improve the initial green channel interpolation and to apply the constant color difference rule adaptively. This simple edge method yields visually pleasing results with high CPSNR.
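
    The edge-directed principle behind such demosaicing can be illustrated with a minimal sketch; the function and data below are hypothetical, showing generic edge-directed green interpolation rather than the authors' exact edge strength filter:

    ```python
    import numpy as np

    def interpolate_green(mosaic, row, col):
        """Estimate the missing green value at a red/blue site of a Bayer mosaic.

        Generic edge-directed sketch: local gradients serve as a simple
        edge-strength measure, and interpolation follows the weaker gradient
        (i.e., along the edge rather than across it).
        """
        dh = abs(mosaic[row, col - 1] - mosaic[row, col + 1])  # horizontal gradient
        dv = abs(mosaic[row - 1, col] - mosaic[row + 1, col])  # vertical gradient
        h_avg = (mosaic[row, col - 1] + mosaic[row, col + 1]) / 2.0
        v_avg = (mosaic[row - 1, col] + mosaic[row + 1, col]) / 2.0
        if dh < dv:
            return h_avg      # edge runs horizontally: average along the row
        if dv < dh:
            return v_avg      # edge runs vertically: average along the column
        return (h_avg + v_avg) / 2.0

    # A vertical edge (dark left, bright right): the filter interpolates along it.
    mosaic = np.array([[10., 30., 50.],
                       [10.,  0., 50.],
                       [10., 30., 50.]])
    green_est = interpolate_green(mosaic, 1, 1)  # uses the vertical neighbors
    ```

    Averaging across the edge here would blur it (yielding 30 from values 10 and 50 by accident of symmetry in general it would not); following the weaker gradient preserves the edge, which is what the edge strength filter in the paper generalizes in an orientation-free way.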

  11. Collaborative Filtering Recommender Systems

    Directory of Open Access Journals (Sweden)

    Mehrbakhsh Nilashi

    2013-04-01

    Full Text Available Recommender Systems are software tools and techniques for suggesting items to users by considering their preferences in an automated fashion. The suggestions provided are aimed at supporting users in various decision-making processes. Technically, recommender systems have their origins in different fields such as Information Retrieval (IR, text classification, machine learning and Decision Support Systems (DSS. Recommender systems are used to address the Information Overload (IO problem by recommending potentially interesting or useful items to users. They have proven to be worthy tools for online users to deal with the IO and have become one of the most popular and powerful tools in E-commerce. Many existing recommender systems rely on Collaborative Filtering (CF and have been extensively used in E-commerce. They have proven to be very effective, with powerful techniques deployed in many well-known E-commerce companies. This study presents an overview of the field of recommender systems with the current generation of recommendation methods and comprehensively examines CF systems and their algorithms.
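
    As a concrete (if simplified) illustration of the CF approach this survey covers, here is a minimal user-based collaborative filter using cosine similarity over co-rated items; the matrix, names, and neighborhood scheme are illustrative assumptions, not any specific system from the paper:

    ```python
    import numpy as np

    def predict_rating(ratings, user, item):
        """Predict ratings[user, item] from similar users' ratings.

        `ratings` is a dense user-by-item matrix where 0 means 'not rated'.
        Each neighbor's rating of `item` is weighted by cosine similarity
        computed over the items both users have rated.
        """
        target = ratings[user]
        num = den = 0.0
        for other in range(ratings.shape[0]):
            if other == user or ratings[other, item] == 0:
                continue
            both = (target > 0) & (ratings[other] > 0)  # co-rated items
            if not both.any():
                continue
            a, b = target[both], ratings[other][both]
            sim = (a @ b) / (np.linalg.norm(a) * np.linalg.norm(b))
            num += sim * ratings[other, item]
            den += abs(sim)
        return num / den if den else 0.0

    R = np.array([[5, 3, 0],     # user 0 has not rated item 2
                  [4, 3, 4],
                  [1, 1, 5]], dtype=float)
    pred = predict_rating(R, user=0, item=2)
    ```

    The prediction is pulled toward the rating of user 1, whose taste profile is closest to user 0's; production systems add mean-centering, significance weighting, and neighborhood truncation on top of this core.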

  12. On-line filtering

    International Nuclear Information System (INIS)

    Present day electronic detectors used in high energy physics make it possible to obtain high event rates and it is likely that future experiments will face even higher data rates than at present. The complexity of the apparatus increases very rapidly with time and also the criteria for selecting desired events become more and more complex. So complex in fact that the fast trigger system cannot be designed to fully cope with it. The interesting events become thus contaminated with multitudes of uninteresting ones. To distinguish the 'good' events from the often overwhelming background of other events one has to resort to computing techniques. Normally this selection is made in the first part of the analysis of the events, analysis normally performed on a powerful scientific computer. This implies however that many uninteresting or background events have to be recorded during the experiment for subsequent analysis. A number of undesired consequences result; and these constitute a sufficient reason for trying to perform the selection at an earlier stage, in fact ideally before the events are recorded on magnetic tape. This early selection is called 'on-line filtering' and it is the topic of the present lectures. (Auth.)

  13. Simplified design of filter circuits

    CERN Document Server

    Lenk, John

    1999-01-01

    Simplified Design of Filter Circuits, the eighth book in this popular series, is a step-by-step guide to designing filters using off-the-shelf ICs. The book starts with the basic operating principles of filters and common applications, then moves on to describe how to design circuits by using and modifying chips available on the market today. Lenk's emphasis is on practical, simplified approaches to solving design problems.Contains practical designs using off-the-shelf ICsStraightforward, no-nonsense approachHighly illustrated with manufacturer's data sheets

  14. Pragmatic circuits signals and filters

    CERN Document Server

    Eccles, William

    2006-01-01

    Pragmatic Circuits: Signals and Filters is built around the processing of signals. Topics include spectra, a short introduction to the Fourier series, design of filters, and the properties of the Fourier transform. The focus is on signals rather than power. But the treatment is still pragmatic. For example, the author accepts the work of Butterworth and uses his results to design filters in a fairly methodical fashion. This third of three volumes finishes with a look at spectra by showing how to get a spectrum even if a signal is not periodic. The Fourier transform provides a way of dealing wi

  15. ADVANCED HOT GAS FILTER DEVELOPMENT

    Energy Technology Data Exchange (ETDEWEB)

    E.S. Connolly; G.D. Forsythe

    2000-09-30

    DuPont Lanxide Composites, Inc. undertook a sixty-month program, under DOE Contract DEAC21-94MC31214, in order to develop hot gas candle filters from a patented material technology known as PRD-66. The goal of this program was to extend the development of this material as a filter element and fully assess the capability of this technology to meet the needs of Pressurized Fluidized Bed Combustion (PFBC) and Integrated Gasification Combined Cycle (IGCC) power generation systems at commercial scale. The principal objective of Task 3 was to build on the initial PRD-66 filter development, optimize its structure, and evaluate basic material properties relevant to the hot gas filter application. Initially, this consisted of an evaluation of an advanced filament-wound core structure that had been designed to produce an effective bulk filter underneath the barrier filter formed by the outer membrane. The basic material properties to be evaluated (as established by the DOE/METC materials working group) would include mechanical, thermal, and fracture toughness parameters for both new and used material, for the purpose of building a material database consistent with what is being done for the alternative candle filter systems. Task 3 was later expanded to include analysis of PRD-66 candle filters, which had been exposed to actual PFBC conditions, development of an improved membrane, and installation of equipment necessary for the processing of a modified composition. Task 4 would address essential technical issues involving the scale-up of PRD-66 candle filter manufacturing from prototype production to commercial scale manufacturing. The focus would be on capacity (as it affects the ability to deliver commercial order quantities), process specification (as it affects yields, quality, and costs), and manufacturing systems (e.g. QA/QC, materials handling, parts flow, and cost data acquisition).
Any filters fabricated during this task would be used for product qualification tests being conducted by Westinghouse at Foster-Wheeler's Pressurized Circulating Fluidized Bed (PCFBC) test facility in Karhula, Finland. Task 5 was designed to demonstrate the improvements implemented in Task 4 by fabricating fifty 1.5-meter hot gas filters. These filters were to be made available for DOE-sponsored field trials at the Power Systems Development Facility (PSDF), operated by Southern Company Services in Wilsonville, Alabama.

  16. Determination of 'dynamic filter factor'

    International Nuclear Information System (INIS)

    The aim of this work was to experimentally determine the 'dynamic filter factor' in symmetrical fields for energies of 6 MV and 15 MV and to compare these results with values obtained by the treatment planning system (Cad Plan) under the same conditions. In this way it becomes possible to attribute, for treatment calculations with irradiation fields that use the dynamic filter, the same percentage depth dose used for the open field without filter, thereby validating the Cad Plan calculation system used in this Institution.

  17. Properties of ceramic candle filters

    Energy Technology Data Exchange (ETDEWEB)

    Pontius, D.H.

    1995-06-01

    The mechanical integrity of ceramic filter elements is a key issue for hot gas cleanup systems. To meet the demands of the advanced power systems, the filter components must sustain the thermal stresses of normal operations (pulse cleaning), of start-up and shut-down conditions, and of unanticipated process upsets such as excessive ash accumulation without catastrophic failure. They must also survive the various mechanical loads associated with handling and assembly, normal operation, and process upsets. For near-term filter systems, these elements must survive at operating temperatures of 1650 deg. F for three years.

  18. Gas cleaning with Granular Filters

    OpenAIRE

    Natvig, Ingunn Roald

    2007-01-01

    The panel bed filter (PBF) is a granular filter patented by A. M. Squires in the late sixties. PBFs consist of louvers with stationary, granular beds. Dust is deposited in the top layers and on the bed surface when gas flows through. PBFs are resistant to high temperatures, variations in the gas flow and hot particles. The filter is cleaned by releasing a pressure pulse in the opposite direction of the bulk flow (a puff back pulse). A new louver geometry patented by A. M. Squires is the...

  19. Filtered Brownian motions as weak limit of filtered Poisson processes

    OpenAIRE

    Decreusefond, L.; Savy, N.

    2005-01-01

    The main result of this paper is a limit theorem which shows the convergence in law, on a Hölderian space, of filtered Poisson processes (a class of processes which contains the shot noise process) to filtered Brownian motions (a class of processes which contains fractional Brownian motion) as the intensity of the underlying Poisson process increases. We apply the theory of convergence of Hilbert space valued semi-martingales and use some results on radonification.

  20. Automatic Keyword Identification by Artificial Neural Networks Compared to Manual Identification by Users of Filtering Systems.

    Science.gov (United States)

    Boger, Zvi; Kuflik, Tsvi; Shoval, Peretz; Shapira, Bracha

    2001-01-01

    Discussion of information filtering (IF) and information retrieval focuses on the use of an artificial neural network (ANN) as an alternative method for both IF and term selection and compares its effectiveness to that of traditional methods. Results show that the ANN relevance prediction out-performs the prediction of an IF system. (Author/LRW)

  1. Particle Filter Improved by Genetic Algorithm and Particle Swarm Optimization Algorithm

    Directory of Open Access Journals (Sweden)

    Ming Li

    2013-03-01

    Full Text Available The particle filter is a filtering method that applies Monte Carlo ideas within the framework of Bayesian estimation theory. It approximates the probability distribution by a discrete random measure consisting of particles and their weights, and updates this measure recursively. When the sample is large enough, the discrete random measure approximates the true posterior probability density function of the state variable. The particle filter algorithm is applicable to any non-linear, non-Gaussian system. However, the standard particle filter does not consider the current measured value, so after some iterations few particles retain non-zero weights; this is particle degeneracy. Re-sampling inhibits degeneracy, but it reduces particle diversity and leads to particle impoverishment. To overcome these problems, this paper proposes a new particle filter that incorporates a genetic algorithm and particle swarm optimization, called the intelligent particle filter (IPF). Particles are driven toward optimal positions by particle swarm optimization, which increases the number of effective particles, improves particle diversity, and inhibits degeneracy. The re-sampling step of the traditional particle filter is replaced by the selection, crossover, and mutation operations of the genetic algorithm, avoiding impoverishment. Simulation results show that the new algorithm significantly improves estimation accuracy compared with the standard particle filter.
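
    For reference, the standard (bootstrap) particle filter that the proposed IPF modifies can be sketched for a 1-D random-walk model; the model, noise levels, and multinomial resampling step here are illustrative assumptions, and the paper's GA/PSO enhancements are not implemented:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def bootstrap_particle_filter(observations, n_particles=500,
                                  process_std=1.0, obs_std=1.0):
        """Bootstrap filter for x_t = x_{t-1} + v_t, y_t = x_t + w_t (Gaussian noise)."""
        particles = rng.normal(0.0, 1.0, n_particles)  # samples from the prior
        estimates = []
        for y in observations:
            # Propagate each particle through the transition model.
            particles = particles + rng.normal(0.0, process_std, n_particles)
            # Weight particles by the likelihood of the current observation.
            weights = np.exp(-0.5 * ((y - particles) / obs_std) ** 2)
            weights /= weights.sum()
            estimates.append(float(np.sum(weights * particles)))
            # Multinomial resampling curbs weight degeneracy, at the cost of
            # the diversity loss (impoverishment) the abstract's GA step targets.
            particles = particles[rng.choice(n_particles, n_particles, p=weights)]
        return np.array(estimates)

    # Track a hidden random walk from noisy observations.
    true_x = np.cumsum(rng.normal(0.0, 1.0, 50))
    obs = true_x + rng.normal(0.0, 1.0, 50)
    est = bootstrap_particle_filter(obs)
    ```

    The IPF replaces the `rng.choice` resampling line with GA selection/crossover/mutation and adds a PSO move that pushes particles toward high-likelihood regions before weighting.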

  2. Stability of Sunscreens Containing CePO4: Proposal for a New Inorganic UV Filter

    Directory of Open Access Journals (Sweden)

    Vitor C. Seixas

    2014-07-01

    Full Text Available Inorganic UV filters have become attractive because of their role in protecting the skin from the damage caused by continuous exposure to the sun. However, their large refractive index and high photocatalytic activity have led to the development of alternative inorganic materials such as CePO4 for application as UV filters. This compound leaves a low amount of white residue on the skin and is highly stable. The aim of this study was to evaluate the physical and chemical stability of a cosmetic formulation containing ordinary organic UV filters combined with 5% CePO4, and to compare it with other formulations containing the same vehicle with 5% TiO2 or ZnO as inorganic materials. The rheological behavior and chemical stability of the formulations containing these different UV filters were investigated. Results showed that the formulation containing CePO4 is a promising innovative UV filter due to its low interaction with organic filters, which culminates in longer shelf life when compared with traditional formulations containing ZnO or TiO2 filters. Moreover, the recognized ability of CePO4 to leave a low amount of white residue on the skin, combined with great stability, suggests that CePO4 can be used as an inorganic filter in high concentrations, affording formulations with high SPF values.

  3. Stability of sunscreens containing CePO4: proposal for a new inorganic UV filter.

    Science.gov (United States)

    Seixas, Vitor C; Serra, Osvaldo A

    2014-01-01

    Inorganic UV filters have become attractive because of their role in protecting the skin from the damage caused by continuous exposure to the sun. However, their large refractive index and high photocatalytic activity have led to the development of alternative inorganic materials such as CePO4 for application as UV filters. This compound leaves a low amount of white residue on the skin and is highly stable. The aim of this study was to evaluate the physical and chemical stability of a cosmetic formulation containing ordinary organic UV filters combined with 5% CePO4, and to compare it with other formulations containing the same vehicle with 5% TiO2 or ZnO as inorganic materials. The rheological behavior and chemical stability of the formulations containing these different UV filters were investigated. Results showed that the formulation containing CePO4 is a promising innovative UV filter due to its low interaction with organic filters, which culminates in longer shelf life when compared with traditional formulations containing ZnO or TiO2 filters. Moreover, the recognized ability of CePO4 to leave a low amount of white residue on the skin, combined with great stability, suggests that CePO4 can be used as an inorganic filter in high concentrations, affording formulations with high SPF values. PMID:25010465

  4. Sensory pollution from bag-type fiberglass ventilation filters: Conventional filter compared with filters containing various amounts of activated carbon

    Energy Technology Data Exchange (ETDEWEB)

    Bekoe, Gabriel; Clausen, Geo [International Centre for Indoor Environment and Energy, Dept. of Civil Engineering, Technical University of Denmark, Nils Koppels Alle 402, 2800-Lyngby (Denmark); Fadeyi, Moshood Olawale [International Centre for Indoor Environment and Energy, Dept. of Civil Engineering, Technical University of Denmark, Nils Koppels Alle 402, 2800-Lyngby (Denmark); Department of Building, School of Design and Environment, National University of Singapore, 4 Architecture Drive, Singapore 117566 (Singapore); Weschler, Charles J. [International Centre for Indoor Environment and Energy, Dept. of Civil Engineering, Technical University of Denmark, Nils Koppels Alle 402, 2800-Lyngby (Denmark); EOHSI (UMDNJ-RW Johnson Medical School and Rutgers Univ.), Piscataway, NJ 08854 (United States)

    2009-10-15

    As ventilation filters accumulate particles removed from the airstream, they become emitters of sensory pollutants that degrade indoor air quality. Previously we demonstrated that an F7 bag-type filter that incorporates activated carbon (a "combination filter") reduces this adverse effect compared to an equivalent filter without carbon. The aim of the present study was to examine how the amount of activated carbon (AC) used in combination filters affects their ability to remove both sensory offending pollutants and ozone. A panel evaluated the air downstream of four different filters after each had continuously filtered outdoor suburban air over a period of 6 months. Interim assessments (mid-term evaluation) were performed after 3 months. During both assessments, four unused filters, identical in type to the loaded filters, were also evaluated. The evaluated filters included a conventional F7 fiberglass filter and three modifications of a bag-type fiberglass combination filter: the "Heavy" corresponded to a commercially available filter containing 400 g of carbon per square meter of filter area, the "Medium" contained half as much carbon (200 g/m2), and the "Light" contained a quarter as much carbon (100 g/m2). Each filter was weighed at the beginning of the soiling period and after 3 and 6 months of service. Additionally, up- and down-stream ozone concentrations and filter pressure drops were measured monthly. Following 6 months of service, the air downstream of each of the combination filters was judged to be significantly better than the air downstream of the 6-month-old F7 filter, and was comparable to that from an unused F7 filter. Additionally, the combination filters removed more ozone from the air than the F7 filter, with their respective fractional removal efficiencies roughly scaling with their carbon content. (author)

  5. Sensory pollution from bag-type fiberglass ventilation filters: Conventional filter compared with filters containing various amounts of activated carbon

    DEFF Research Database (Denmark)

    Bekö, Gabriel; Fadeyi, M.O.

    2009-01-01

    As ventilation filters accumulate particles removed from the airstream, they become emitters of sensory pollutants that degrade indoor air quality. Previously we demonstrated that an F7 bag-type filter that incorporates activated carbon (a "combination filter") reduces this adverse effect compared to an equivalent filter without carbon. The aim of the present study was to examine how the amount of activated carbon (AC) used in combination filters affects their ability to remove both sensory offending pollutants and ozone. A panel evaluated the air downstream of four different filters after each had continuously filtered outdoor suburban air over a period of 6 months. Interim assessments (mid-term evaluation) were performed after 3 months. During both assessments, four unused filters, identical in type to the loaded filters, were also evaluated. The evaluated filters included a conventional F7 fiberglass filter and three modifications of a bag-type fiberglass combination filter: the "Heavy" corresponded to a commercially available filter containing 400 g of carbon per square meter of filter area, the "Medium" contained half as much carbon (200 g/m(2)), and the "Light" contained a quarter as much carbon (100 g/m(2)). Each filter was weighed at the beginning of the soiling period and after 3 and 6 months of service. Additionally, up- and down-stream ozone concentrations and filter pressure drops were measured monthly. Following 6 months of service, the air downstream of each of the combination filters was judged to be significantly better than the air downstream of the 6-month-old F7 filter, and was comparable to that from an unused F7 filter. Additionally, the combination filters removed more ozone from the air than the F7 filter, with their respective fractional removal efficiencies roughly scaling with their carbon content.

  6. Bloom Filters in Adversarial Environments

    OpenAIRE

    Naor, Moni; Yogev, Eylon

    2014-01-01

    A Bloom filter represents a set $S$ of elements approximately, by using fewer bits than a precise representation. The price for succinctness is allowing some errors: for any $x \\in S$ it should always answer 'Yes', and for any $x \\notin S$ it should answer 'Yes' only with some small probability (a false positive).
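
    The membership contract described above can be made concrete with a minimal sketch; the bit count, hash count, and double-hashing scheme are illustrative choices, not the paper's construction:

    ```python
    import hashlib

    class BloomFilter:
        """Approximate set membership: no false negatives, rare false positives."""

        def __init__(self, num_bits=1024, num_hashes=4):
            self.num_bits = num_bits
            self.num_hashes = num_hashes
            self.bits = 0  # a Python int used as a bit array

        def _positions(self, item):
            # Derive k indices from one digest via double hashing (illustrative).
            d = hashlib.sha256(item.encode()).digest()
            h1 = int.from_bytes(d[:8], "big")
            h2 = int.from_bytes(d[8:16], "big")
            return [(h1 + i * h2) % self.num_bits for i in range(self.num_hashes)]

        def add(self, item):
            for pos in self._positions(item):
                self.bits |= 1 << pos

        def __contains__(self, item):
            # All k bits set: answer 'Yes' (possibly a false positive).
            return all(self.bits & (1 << pos) for pos in self._positions(item))

    bf = BloomFilter()
    for word in ("alpha", "beta", "gamma"):
        bf.add(word)
    # Members always answer 'Yes'; non-members answer 'No' except with small
    # probability -- the error an adversary who can probe the filter exploits.
    ```

    The adversarial setting the paper studies asks what happens when the queried elements are chosen by an adversary who observes past answers, rather than fixed in advance as the usual false-positive analysis assumes.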

  7. Multi-Domain Collaborative Filtering

    CERN Document Server

    Zhang, Yu; Yeung, Dit-Yan

    2012-01-01

    Collaborative filtering is an effective recommendation approach in which the preference of a user on an item is predicted based on the preferences of other users with similar interests. A big challenge in using collaborative filtering methods is the data sparsity problem which often arises because each user typically only rates very few items and hence the rating matrix is extremely sparse. In this paper, we address this problem by considering multiple collaborative filtering tasks in different domains simultaneously and exploiting the relationships between domains. We refer to it as a multi-domain collaborative filtering (MCF) problem. To solve the MCF problem, we propose a probabilistic framework which uses probabilistic matrix factorization to model the rating problem in each domain and allows the knowledge to be adaptively transferred across different domains by automatically learning the correlation between domains. We also introduce the link function for different domains to correct their biases. Experi...

  8. Biometric verification with correlation filters

    Science.gov (United States)

    Vijaya Kumar, B. V. K.; Savvides, Marios; Xie, Chunyan; Venkataramani, Krithika; Thornton, Jason; Mahalanobis, Abhijit

    2004-01-01

    Using biometrics for subject verification can significantly improve security over that of approaches based on passwords and personal identification numbers, both of which people tend to lose or forget. In biometric verification the system tries to match an input biometric (such as a fingerprint, face image, or iris image) to a stored biometric template. Thus correlation filter techniques are attractive candidates for the matching precision needed in biometric verification. In particular, advanced correlation filters, such as synthetic discriminant function filters, can offer very good matching performance in the presence of variability in these biometric images (e.g., facial expressions, illumination changes, etc.). We investigate the performance of advanced correlation filters for face, fingerprint, and iris biometric verification.
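
    The matching step common to these correlation filter designs can be sketched as a frequency-domain matched filter; the synthetic data and function name below are illustrative assumptions, and the advanced SDF/MACE filter designs themselves are not implemented:

    ```python
    import numpy as np

    def correlation_peak(template, image):
        """Locate `template` in `image` via circular cross-correlation computed
        in the frequency domain; a sharp, isolated peak indicates a match."""
        F = np.fft.fft2(image)
        H = np.conj(np.fft.fft2(template, s=image.shape))  # matched filter
        corr = np.real(np.fft.ifft2(F * H))
        return np.unravel_index(np.argmax(corr), corr.shape)

    rng = np.random.default_rng(1)
    image = rng.normal(0.0, 0.1, (32, 32))   # background 'sensor noise'
    patch = rng.normal(0.0, 1.0, (8, 8))     # stand-in for a stored template
    image[10:18, 5:13] += patch              # embed the pattern at offset (10, 5)
    peak = correlation_peak(patch, image)    # the peak recovers that offset
    ```

    Advanced designs such as synthetic discriminant function filters replace the single conjugated template `H` with a filter synthesized from many training images, which is what gives them tolerance to expression and illumination variability.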

  9. Wiener Chaos and Nonlinear Filtering

    International Nuclear Information System (INIS)

    The paper discusses two algorithms for solving the Zakai equation in the time-homogeneous diffusion filtering model with possible correlation between the state process and the observation noise. Both algorithms rely on the Cameron-Martin version of the Wiener chaos expansion, so that the approximate filter is a finite linear combination of the chaos elements generated by the observation process. The coefficients in the expansion depend only on the deterministic dynamics of the state and observation processes. For real-time applications, computing the coefficients in advance improves the performance of the algorithms in comparison with most other existing methods of nonlinear filtering. The paper summarizes the main existing results about these Wiener chaos algorithms and resolves some open questions concerning the convergence of the algorithms in the noise-correlated setting. The presentation includes the necessary background on the Wiener chaos and optimal nonlinear filtering

  10. A Performance Weighted Collaborative Filtering algorithm for personalized radiology education.

    Science.gov (United States)

    Lin, Hongli; Yang, Xuedong; Wang, Weisheng; Luo, Jiawei

    2014-10-01

    Devising an accurate prediction algorithm that can predict the difficulty level of cases for individuals and then selects suitable cases for them is essential to the development of a personalized training system. In this paper, we propose a novel approach, called Performance Weighted Collaborative Filtering (PWCF), to predict the difficulty level of each case for individuals. The main idea of PWCF is to assign an optimal weight to each rating used for predicting the difficulty level of a target case for a trainee, rather than using an equal weight for all ratings as in traditional collaborative filtering methods. The assigned weight is a function of the performance level of the trainee at which the rating was made. The PWCF method and the traditional method are compared using two datasets. The experimental data are then evaluated by means of the MAE metric. Our experimental results show that PWCF outperforms the traditional methods by 8.12% and 17.05%, respectively, over the two datasets, in terms of prediction precision. This suggests that PWCF is a viable method for the development of personalized training systems in radiology education. PMID:24842564
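The core idea — replacing the equal weights of traditional collaborative filtering with performance-dependent weights — can be sketched as follows; the `weight` function and the data are illustrative stand-ins, not the paper's actual model.

```python
# Sketch of the PWCF idea: each peer rating is weighted by the performance
# level of the trainee at the time the rating was made, instead of the
# equal weights of traditional collaborative filtering. Data and the
# weight function are invented for illustration.

def predict_difficulty(ratings, performances, weight):
    """Weighted average of peer difficulty ratings for one target case."""
    ws = [weight(p) for p in performances]
    return sum(w * r for w, r in zip(ws, ratings)) / sum(ws)

ratings = [3.0, 4.0, 5.0]          # difficulty ratings from three peers
performances = [0.9, 0.5, 0.1]     # each peer's performance level (0..1)

# trust high performers more, vs. weighting every rating equally
pwcf = predict_difficulty(ratings, performances, weight=lambda p: p)
plain = predict_difficulty(ratings, performances, weight=lambda p: 1.0)
print(round(pwcf, 2), round(plain, 2))  # prints 3.47 4.0
```

The weighted prediction leans toward the ratings of the stronger performers, which is exactly the behavior the MAE comparison in the abstract credits for the precision gain.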

  11. PRIMARY QUALITIES IN PHYTOTHERAPY AND TRADITIONAL MEDICINES

    Directory of Open Access Journals (Sweden)

    Farid Ramezany

    2013-05-01

    Full Text Available Objectives: The significance of the principles of traditional medicines in research protocols is emphasized by the World Health Organization. Primary qualities, traditionally referred to as “hot”, “cold”, “dry” and “wet”, are fundamental concepts of many medical traditions of antiquity, such as the Persian, Chinese, Greek, and Indian. In humoral-based traditional medicines, these qualities are regulating factors and act in dynamic balance to maintain health. Therefore, understanding the primary qualities of body humors and drugs is decisive for treatment, self-care and prevention of diseases in many traditional medicines. The main goal of this study is to consider the relationships between primary qualities and the botanical or phytochemical profiles of traditional Iranian medicinal herbs. Method: A total of 489 medicinal plants were matched with proposed scientific names, and the corresponding primary qualities were extracted from old Persian pharmacopeias. Based on the literature, two data sets were screened for statistical study. To ensure consistency and similarity of the screened samples, they were examined by a chi-square (χ²) test. The influence of botanical family on primary qualities was studied by screening 339 plants in 29 botanical families, tested with the χ² test. In the second stage, the major phytochemicals of 192 herbs were categorized based on the presence of 23 groups of phytochemicals, and a model based on traditional medicine concepts was built using logistic regression. Results: Statistical outcomes revealed that although a few botanical families tend to correlate with specific primary qualities, most others displayed no significant relationship. The proposed phytochemical model was able to estimate the relationship between primary qualities and phytochemical classes in more than 77% of the cases. The findings were in accordance with the literature. Conclusion: The botanical family classification is not an empirically acceptable indicator of primary qualities in medicinal plants. On the other hand, the phytochemical profile of a plant is an authentic indicator of primary qualities.

  12. Identification Filtering with fuzzy estimations

    OpenAIRE

    J. J. Medel J.; J. C. Garcia I.; J. C. Sanchez G.

    2012-01-01

    A digital identification filter interacts with an output reference model signal, known as the output of a black-box system. The identification technique commonly needs the transition and gain matrices. Both estimation cases are based on a mean-square criterion, obtaining the minimum output error as the best estimation filtering. The evolution system exhibits adaptive properties that the identification mechanism includes by considering fuzzy logic strategies affecting, in a probability sense, the evolu...

  13. Process for washing electromagnetic filters

    International Nuclear Information System (INIS)

    This process concerns the washing of an electromagnetic filter used, inter alia, for filtering the drain-off waters of nuclear power station steam generators, by means of a washing water used in a closed circuit and freed, after each cleaning, of the solids in suspension it contains, by settlement of these solids. This invention enables the volume of water to be evaporated to be divided by 50, thereby providing assurance of better safety, in addition to a very significant saving.

  14. Multi-Domain Collaborative Filtering

    OpenAIRE

    Zhang, Yu; Cao, Bin; Yeung, Dit-yan

    2012-01-01

    Collaborative filtering is an effective recommendation approach in which the preference of a user on an item is predicted based on the preferences of other users with similar interests. A big challenge in using collaborative filtering methods is the data sparsity problem which often arises because each user typically only rates very few items and hence the rating matrix is extremely sparse. In this paper, we address this problem by considering multiple collaborative filterin...

  15. Yellow intraocular filters in fishes.

    Science.gov (United States)

    Heinermann, P H

    1984-01-01

    Yellow intraocular filters are common among the teleosts, especially highly diurnal species. This yellow pigmentation may be uniform, more dense dorsally, or localized to a narrow dorsal ring near the limbus. Certain species possess occlusable yellow corneas and can vary the corneal colour in response to the level of illumination. Yellow lenses and corneas function as high-pass filters, with the cutoff points varying depending on species. Thus, the amount of short-wavelength light reaching the retina can be regulated. Three distinct yellow pigments may be present in each of the lens, cornea and the retina of certain South American cichlids. The spectral absorbance of the yellow corneal pigment bears a close resemblance to that of beta-carotene. Possible functions of these yellow filters are: a reduction in chromatic aberration, the reduction of glare and dazzle, the improvement of detail by the absorption of "blue haze", the improvement of contrast vision, and the rendering of bioluminescence more conspicuous. Yellow intraocular filters may result in a loss of scotopic sensitivity due to absorption of short wavelengths. Various adaptations in diurnal teleosts to avoid the loss of sensitivity resulting from a yellow filter are presented. Normally, bottom-dwelling fishes lack yellow filters. These filters cause the effective absorbance maximum of scotopic visual pigments to be shifted to longer wavelengths. No correlation has been found between the presence of such filters and the water colour, diet or spectral absorbance of the visual pigment. A possible explanation for the lack of correlation with visual pigments is discussed. Investigation of cone spectral sensitivities may possibly reveal such a correlation. PMID:6398222

  16. Westinghouse advanced particle filter system

    Energy Technology Data Exchange (ETDEWEB)

    Lippert, T.E.; Bruck, G.J.; Sanjana, Z.N.; Newby, R.A.

    1995-11-01

    Integrated Gasification Combined Cycles (IGCC), Pressurized Fluidized Bed Combustion (PFBC) and Advanced PFBC (APFB) are being developed and demonstrated for commercial power generation application. Hot gas particulate filters are key components for the successful implementation of IGCC, PFBC and APFB in power generation gas turbine cycles. The objective of this work is to develop and qualify through analysis and testing a practical hot gas ceramic barrier filter system that meets the performance and operational requirements of these advanced, solid fuel power generation cycles.

  17. Narrow-Band Microwave Filters

    Directory of Open Access Journals (Sweden)

    A.V. Strizhachenko

    2010-01-01

    Full Text Available An original design of narrow-band compact filters based on a high-quality waveguide-dielectric resonator with anisotropic materials is presented in this work. The designed filters satisfy contradictory requirements: they provide a narrow frequency band (0.05–0.1 % of the main frequency f0) and low initial losses (≤ 1 dB).

  18. Stochastic processes and filtering theory

    CERN Document Server

    Jazwinski, Andrew H

    2013-01-01

    This unified treatment of linear and nonlinear filtering theory presents material previously available only in journals, and in terms accessible to engineering students. Its sole prerequisites are advanced calculus, the theory of ordinary differential equations, and matrix analysis. Although theory is emphasized, the text discusses numerous practical applications as well. Taking the state-space approach to filtering, this text models dynamical systems by finite-dimensional Markov processes, outputs of stochastic difference and differential equations. Starting with background material on probab

  19. An introduction to quantum filtering

    OpenAIRE

    Bouten, Luc; Handel, Ramon; James, Matthew

    2006-01-01

    This paper provides an introduction to quantum filtering theory. An introduction to quantum probability theory is given, focusing on the spectral theorem and the conditional expectation as a least squares estimate, and culminating in the construction of Wiener and Poisson processes on the Fock space. We describe the quantum Itô calculus and its use in the modelling of physical systems. We use both reference probability and innovations methods to obtain quantum filtering eq...

  20. Optimization of the filter technique

    International Nuclear Information System (INIS)

    A technique of resonant filtering is used in Mössbauer spectroscopy for obtaining polarized radiation. Two main contributions degrade the filter quality: an unwanted small absorption line located at the resonance energy and a large unwanted absorption line located far from the resonance energy. Both contributions can be parametrized by a single small parameter r. The optimal effective thickness is proportional to 1/r², while the departure of the maximum polarization degree from unity is proportional to r.

  1. Repeated median and hybrid filters

    OpenAIRE

    Fried, Roland; Bernholt, Thorsten; Gather, Ursula

    2004-01-01

    Standard median filters preserve abrupt shifts (edges) and remove impulsive noise (outliers) from a constant signal but they deteriorate in trend periods. FIR median hybrid (FMH) filters are more flexible and also preserve shifts, but they are much more vulnerable to outliers. Application of robust regression methods, in particular of the repeated median, has been suggested for removing subsequent outliers from a signal with trends. A fast algorithm for updating the repeated median in linear ...

  2. Enhancing collaborative filtering by user interest expansion via personalized ranking.

    Science.gov (United States)

    Liu, Qi; Chen, Enhong; Xiong, Hui; Ding, Chris H Q; Chen, Jian

    2012-02-01

    Recommender systems suggest a few items from many possible choices to the users by understanding their past behaviors. In these systems, the user behaviors are influenced by the hidden interests of the users. Learning to leverage the information about user interests is often critical for making better recommendations. However, existing collaborative-filtering-based recommender systems are usually focused on exploiting the information about the user's interaction with the systems; the information about latent user interests is largely underexplored. To that end, inspired by the topic models, in this paper, we propose a novel collaborative-filtering-based recommender system by user interest expansion via personalized ranking, named iExpand. The goal is to build an item-oriented model-based collaborative-filtering framework. The iExpand method introduces a three-layer, user-interests-item, representation scheme, which leads to more accurate ranking recommendation results with less computation cost and helps the understanding of the interactions among users, items, and user interests. Moreover, iExpand strategically deals with many issues that exist in traditional collaborative-filtering approaches, such as the overspecialization problem and the cold-start problem. Finally, we evaluate iExpand on three benchmark data sets, and experimental results show that iExpand can lead to better ranking performance than state-of-the-art methods with a significant margin. PMID:21880572

  3. Pansharpening of multispectral images using filtering in Fourier domain

    Science.gov (United States)

    Akoguz, Alper; Kurt, Burak; Pinar, Sedef K.

    2014-10-01

    In this study, filtering-based pansharpening methods are examined, using several 2D FIR filters in the Fourier domain: the filters are applied after taking the 2D Discrete Fourier Transform of both the multispectral and the panchromatic image, and after the pansharpening process in the Fourier domain the resulting pansharpened image is obtained with an inverse 2D DFT. In addition, these methods are compared with commonly used fusion methods, grouped into modulation-based and component-substitution-based methods. The algorithms are applied to SPOT 6 co-registered image couples that were acquired simultaneously. Couples are chosen from three different regions, a city image (Gebze/Turkey), a forest image (Istanbul/Turkey) and an agricultural field image (Sanliurfa/Turkey), in order to analyse the methods under different regional characteristics. The methods are compared using fusion quality assessments that have common acceptance in the community. The results of these quality assessments show that the filtering-based methods achieved the best scores when compared with the traditional methods.
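A minimal sketch of the Fourier-domain fusion idea, assuming the simplest possible filter pair — an ideal low-pass for the multispectral band and its complement for the panchromatic band; the paper's 2D FIR filters and satellite data differ.

```python
# Minimal Fourier-domain pansharpening sketch: take the 2D DFT of both
# images, keep the multispectral (MS) band's low frequencies and the
# panchromatic (PAN) band's high frequencies, and invert. An ideal
# brick-wall filter stands in for the paper's 2D FIR filters.
import cmath

def dft2(img):
    n = len(img)
    return [[sum(img[x][y] * cmath.exp(-2j * cmath.pi * (u * x + v * y) / n)
                 for x in range(n) for y in range(n))
             for v in range(n)] for u in range(n)]

def idft2(freq):
    n = len(freq)
    return [[(sum(freq[u][v] * cmath.exp(2j * cmath.pi * (u * x + v * y) / n)
                  for u in range(n) for v in range(n)) / n ** 2).real
             for y in range(n)] for x in range(n)]

def lowpass(u, v, n, cutoff):
    du, dv = min(u, n - u), min(v, n - v)   # wrapped distance from DC
    return 1.0 if (du * du + dv * dv) ** 0.5 <= cutoff else 0.0

def pansharpen(ms, pan, cutoff=1.0):
    n = len(ms)
    f_ms, f_pan = dft2(ms), dft2(pan)
    fused = [[f_ms[u][v] * lowpass(u, v, n, cutoff)
              + f_pan[u][v] * (1.0 - lowpass(u, v, n, cutoff))
              for v in range(n)] for u in range(n)]
    return idft2(fused)

ms = [[4.0] * 4 for _ in range(4)]                                # smooth colour band
pan = [[float((x + y) % 2) for y in range(4)] for x in range(4)]  # fine detail
out = pansharpen(ms, pan)
mean = sum(map(sum, out)) / 16
print(round(mean, 6))  # prints 4.0 -- MS brightness kept, PAN detail added
```

The fused image preserves the multispectral DC level (mean 4.0) while inheriting the panchromatic checkerboard variation, which is the division of labour the abstract describes.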

  4. Local geometry variable conductance diffusion for post-reconstruction filtering

    International Nuclear Information System (INIS)

    Variable conductance diffusion (VCD) filtering can preserve edges while smoothing noise in an image. The threshold of the conductance function determines the degree to which a part of the image is smoothed. Traditionally, a constant threshold has been used. The use of a global threshold does not allow for adaptation to local variations within the image. The approach presented in this paper exploits the local geometry of the image and derives the threshold from the variations that are more likely caused by noise than by structural changes. The authors apply it to simulated noisy reconstructed single-photon emission computed tomographic (SPECT) image sets. For a particular voxel, if a consistent gradient direction is found within its neighborhood, then the variations on the plane perpendicular to the gradient direction are considered as noise and used to derive the threshold. The results show that, for the same average noise level in the liver, the image contrast from both the local geometry and constant threshold VCD filters is higher than that from Butterworth filtering. The local geometry VCD filtering provides images with smoother boundaries than the constant threshold method. Moreover, the contrast loss is less sensitive to the tumor size for the local geometry method.
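The underlying variable conductance diffusion scheme can be sketched in 1D. Here the conductance threshold k is a global constant — the "traditional" choice the paper improves on by deriving k per voxel from local geometry — and all values are illustrative.

```python
# Illustrative 1D variable conductance diffusion (VCD): the conductance
# falls off where the local gradient exceeds the threshold k, so the large
# edge survives while small noise is smoothed. Here k is the traditional
# global constant; the paper derives it per voxel from local geometry.
import math

def vcd_step(sig, k, dt=0.2):
    out = list(sig)
    for i in range(1, len(sig) - 1):
        g_right = sig[i + 1] - sig[i]
        g_left = sig[i - 1] - sig[i]
        c_right = math.exp(-((g_right / k) ** 2))  # conductance function
        c_left = math.exp(-((g_left / k) ** 2))
        out[i] = sig[i] + dt * (c_right * g_right + c_left * g_left)
    return out

# noisy step edge: flat-ish at 0, jump to flat-ish at 10
sig = [0.0, 0.1, -0.1, 0.05, 10.0, 9.9, 10.1, 10.0]
for _ in range(20):
    sig = vcd_step(sig, k=1.0)

print(round(sig[4] - sig[3], 1))  # the 0 -> 10 edge height is preserved
```

After twenty iterations the small fluctuations on the flat regions are smoothed away, while the conductance across the large edge is effectively zero, so the edge is untouched — the edge-preserving behavior the abstract relies on.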

  5. Combined tunable filters based swept laser source for optical coherence tomography

    Science.gov (United States)

    Chen, Minghui; Ding, Zhihua; Wang, Cheng; Huang, Yimei; Chen, Rong; Song, Chengli

    2013-03-01

    We demonstrate a novel swept laser source with ultra-broad tunable bandwidth and narrow instantaneous line-width, using combined tunable filters working at a 1290 nm center wavelength, for application in optical coherence tomography. The combined filters consist of a fiber Fabry-Perot tunable filter (FFP-TF) and a polygon-mirror scanning-grating-based filter. The FFP-TF has a narrow free spectral range (FSR) but ultra-high spectral resolution (narrow instantaneous bandwidth), driven at a high frequency far from its resonant frequency. The polygon filter in the Littrow configuration is composed of a fiber collimator, a polygon mirror driven by a function generator, and a diffraction grating with a low groove density. The polygon filter tunes coarsely over a wide tuning range, and the FFP-TF then tunes finely with narrow band-pass filtering. In contrast to the traditional method using a single tunable filter, the trade-off between bandwidth and instantaneous line-width is alleviated: the combined filters can realize an ultra-wide scan range and a fairly narrow instantaneous bandwidth simultaneously. Two semiconductor optical amplifiers (SOAs) in parallel are used as the gain medium. A wide bandwidth can be obtained from these parallel SOAs, sufficient for the wide range of the polygon filter's FSR, because each SOA generates its own spectrum independently. The proposed swept laser source provides an edge-to-edge scanning range of 180 nm, covering 1220 to 1400 nm, with an instantaneous line-width of about 0.03 nm at a sweeping rate of 23.3 kHz. The swept laser source with combined filters offers a broadband tunable range with narrow instantaneous line-width, which especially benefits high-resolution, deep-imaging-depth optical frequency domain imaging.

  6. The Kalman-Levy filter

    CERN Document Server

    Sornette, D

    2001-01-01

    The Kalman filter combines forecasts and new observations to obtain an estimation which is optimal in the sense of a minimum average quadratic error. The Kalman filter has two main restrictions: (i) the dynamical system is assumed linear and (ii) forecasting errors and observational noises are taken Gaussian. Here, we offer an important generalization to the case where errors and noises have heavy-tail distributions such as power laws and Lévy laws. The main tool needed to solve this "Kalman-Lévy" filter is the "tail-covariance" matrix, which generalizes the covariance matrix in the case where it is mathematically ill-defined (i.e., for power-law tail exponents μ ≤ 2). We present the general solution and discuss its properties on pedagogical examples. The standard Kalman-Gaussian filter is recovered for the case μ = 2. The optimal Kalman-Lévy filter is found to deviate substantially from the standard Kalman-Gaussian filter as μ deviates from 2. As μ decreases, novel observations are ...
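The μ = 2 special case that the Kalman-Lévy filter generalizes is the ordinary Gaussian Kalman filter; a minimal scalar version makes the forecast/observation blend and the variance recursion (the thing the tail-covariance replaces) explicit. All parameter values are illustrative.

```python
# A minimal scalar Kalman filter -- the Gaussian (mu = 2) special case of
# the Kalman-Levy filter. With heavy-tailed noises (mu < 2) the variance
# recursion below is replaced by a tail-covariance recursion. Parameters
# are illustrative.
import random

def kalman_1d(observations, a=1.0, q=0.01, r=0.25, x0=0.0, p0=1.0):
    """a: state transition; q, r: process / observation noise variances."""
    x, p = x0, p0
    estimates = []
    for z in observations:
        x, p = a * x, a * a * p + q          # forecast step
        gain = p / (p + r)                   # Kalman gain
        x = x + gain * (z - x)               # blend forecast and observation
        p = (1.0 - gain) * p
        estimates.append(x)
    return estimates

random.seed(0)
truth = 5.0                                   # constant hidden state
obs = [truth + random.gauss(0.0, 0.5) for _ in range(200)]
est = kalman_1d(obs)

# filtered estimates track the truth more tightly than raw observations
err_est = sum(abs(e - truth) for e in est[50:]) / 150
err_obs = sum(abs(z - truth) for z in obs[50:]) / 150
print(round(err_est, 3), round(err_obs, 3))
```

With Gaussian noise the gain settles to a steady state and the average filtering error falls well below the raw observation error; the abstract's point is that this optimality breaks down once the noises become heavy-tailed.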

  7. The Archimedes Plasma Mass Filter

    Science.gov (United States)

    Miller, R. L.; Ohkawa, T.; Agnew, S. F.; Cluggish, B. P.; Freeman, R. L.; Gilleland, J.; Putvinski, S.; Sevier, L.; Umstadter, K. R.

    2001-10-01

    Archimedes Technology Group is developing a plasma technology, called the Archimedes Plasma Mass Filter, which can separate a waste mixture ion by ion into mass groups and as such represents a major advance in waste separations technology. The filter is a plasma device employing a magnetic and electric field configuration that acts as a low-mass-pass filter for ions. Ions with mass above a tunable “cutoff mass” are expelled from the plasma. The Archimedes Plasma Mass Filter satisfies all of the requirements of an economic mass separator system: good single-pass separation, acceptable energy cost per ion, and high material throughput. This technology could significantly reduce the volume of radioactive waste at the Hanford Site in Richland, Washington, which is storing sixty percent of the nation’s defense nuclear waste. The potential waste reduction is dramatic because 82 wt% of the waste presently scheduled to be vitrified (immobilized and stored in glass) at Hanford is below mass number 60, while 99.9% of the radioactivity comes from atoms above mass number 89. We will present the plasma physics basis for the filter effect, the fundamental parameter constraints, and modeling results of filter operation.

  8. Welfare of non-traditional pets.

    Science.gov (United States)

    Schuppli, C A; Fraser, D; Bacon, H J

    2014-04-01

    The keeping of non-traditional or 'exotic' pets has been growing in popularity worldwide. In addition to the typical welfare challenges of keeping more traditional pet species like dogs and cats, ensuring the welfare of non-traditional pets is complicated by factors such as lack of knowledge, difficulties meeting requirements in the home and where and how animals are obtained. This paper uses examples of different species to highlight three major welfare concerns: ensuring that pets under our care i) function well biologically, ii) are free from negative psychological states and able to experience normal pleasures, and iii) lead reasonably natural lives. The keeping of non-traditional pets also raises ethical concerns about whether the animal poses any danger to others (e.g. transmission of zoonotic diseases) and whether the animal might cause environmental damage (e.g. invading non-native habitats when released). The authors used these considerations to create a checklist, which identifies and organises the various concerns that may arise over keeping non-traditional species as pets. An inability to address these concerns raises questions about how to mitigate them or even whether or not certain species should be kept as pets at all. Thus, the authors propose five categories, which range from relatively unproblematic pet species to species whose keeping poses unacceptable risks to the animals, to humans, or to the environment. This approach to the evaluation and categorisation of species could provide a constructive basis for advocacy and regulatory actions. PMID:25000795

  9. Traditional Chinese food technology and cuisine.

    Science.gov (United States)

    Li, Jian-rong; Hsieh, Yun-Hwa P

    2004-01-01

    From ancient wisdom to modern science and technology, Chinese cuisine has been established over the long history of the country and has gained a global reputation for its sophistication. Traditional Chinese foods and cuisine that exhibit Chinese culture, art and reality play an essential role in Chinese people's everyday lives. Recently, traditional Chinese foods have drawn a great degree of attention from food scientists and technologists, the food industry, and health promotion institutions worldwide due to the extensive values they offer beyond being merely another ethnic food. These traditional foods comprise a wide variety of products, such as pickled vegetables, salted fish and jellyfish, tofu and tofu-derived products, rice and rice snack foods, fermented sauces, fish balls and thousand-year-old eggs. An overview of selected popular traditional Chinese foods and their processing techniques is included in this paper. Further development of the traditional techniques for formulation and production of these foods is expected to produce economic, social and health benefits. PMID:15228981

  10. An epistemological reflection on the relevance of monastic traditions for retreat in the Dutch Reformed tradition

    OpenAIRE

    Schutte, C. H.; Yolanda Dreyer

    2006-01-01

    The article focuses on retreat as a relatively new phenomenon in the Dutch Reformed tradition. Retreat is viewed as "communicative action". The aim of the article is firstly to explore epistemological theories in the postmodern paradigm. These theories provide a mental framework for the identification of a research model and a related methodology by means of which the relevance of monastic traditions for retreat in the Reformed tradition can be discovered. The identification o...

  11. Traditional and Non-traditional Collective Bargaining: Strategies to Improve the Patient Care Environment

    OpenAIRE

    Budd, K.

    2004-01-01

    Acquiring organizational autonomy and control over nursing practice, through a combination of traditional and non-traditional collective bargaining (CB) strategies, is emerging as an important solution to the nursing shortage crisis. For the past 60 years, nurses have improved their economic and general welfare by organizing through traditional CB, particularly during periods of nursing shortages. During the past decade, however, the downsizing of nursing staffs, systems redesign, and oppress...

  12. FREQUENCY FILTERING OF TORSIONAL ALFVEN WAVES BY CHROMOSPHERIC MAGNETIC FIELD

    International Nuclear Information System (INIS)

    In this Letter, we demonstrate how the observation of broadband frequency propagating torsional Alfven waves in chromospheric magnetic flux tubes can provide valuable insight into their magnetic field structure. By implementing a full nonlinear three-dimensional magnetohydrodynamic numerical simulation with a realistic vortex driver, we demonstrate how the plasma structure of chromospheric magnetic flux tubes can act as a spatially dependent frequency filter for torsional Alfven waves. Importantly, for solar magnetoseismology applications, this frequency filtering is found to be strongly dependent on magnetic field structure. With reference to an observational case study of propagating torsional Alfven waves using spectroscopic data from the Swedish Solar Telescope, we demonstrate how the observed two-dimensional spatial distribution of maximum power Fourier frequency shows a strong correlation with our forward model. This opens the possibility of beginning an era of chromospheric magnetoseismology, to complement the more traditional methods of mapping the magnetic field structure of the solar chromosphere.

  13. A Contextual Item-Based Collaborative Filtering Technology

    Directory of Open Access Journals (Sweden)

    Pan Pan

    2012-05-01

    Full Text Available This paper proposes a contextual item-based collaborative filtering technology, based on the traditional item-based collaborative filtering technology. In the recommendation process, the user's important mobile contextual information is taken into account: the technology combines ratings on items made by users whose historical contextual information is similar to the user's current context, in order to predict which items the user will prefer in his or her current context. Finally, an experiment is used to show that the proposed technology can predict the user's preference in his or her mobile environment more accurately.
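One common way to realize such context awareness is contextual pre-filtering of an item-based recommender: only ratings made in a context matching the user's current one enter the item-item similarities. The sketch below is illustrative (data and names invented), not the paper's system.

```python
# Illustrative contextual pre-filtering for item-based CF: only ratings made
# in a context matching the user's current context enter the item-item
# similarities and the prediction. Data and names are invented.

def cosine(u, v):
    shared = set(u) & set(v)
    num = sum(u[k] * v[k] for k in shared)
    den = (sum(a * a for a in u.values()) ** 0.5
           * sum(b * b for b in v.values()) ** 0.5)
    return num / den if den else 0.0

# (user, item, context) -> rating
ratings = {
    ("u1", "cafe", "rainy"): 5, ("u1", "park", "sunny"): 5,
    ("u2", "cafe", "rainy"): 4, ("u2", "museum", "rainy"): 5,
    ("u3", "cafe", "sunny"): 2, ("u3", "museum", "sunny"): 1,
}

def item_vectors(context):
    """Per-item {user: rating} vectors, restricted to the given context."""
    vecs = {}
    for (user, item, ctx), r in ratings.items():
        if ctx == context:
            vecs.setdefault(item, {})[user] = r
    return vecs

def predict(user, item, context):
    """Similarity-weighted average of the user's ratings on similar items."""
    vecs = item_vectors(context)
    target = vecs.get(item, {})
    num = den = 0.0
    for other, vec in vecs.items():
        if other == item or user not in vec:
            continue
        sim = cosine(target, vec)
        num += sim * vec[user]
        den += abs(sim)
    return num / den if den else None

print(predict("u1", "museum", "rainy"))
```

Because only "rainy" ratings are used, u3's sunny-day dislike of the museum never dilutes the prediction — the same pre-filtering effect the abstract attributes to matching on the user's current context.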

  14. Filter for separating solids from gaseous media

    International Nuclear Information System (INIS)

    An improvement of the configuration of a filter is proposed, which should ensure that foreign bodies do not settle in the flow channels, but can easily be removed. The filters for gas cleaning known from 23 52 335 receive a drainage opening at the bottom of the filter, and the slope of the filter cone is changed somewhat. (RW)

  15. Processing of scintigrams using factor filters

    International Nuclear Information System (INIS)

    Factor filters are calculated from the scintigram itself by means of factor analytical methods. They effect a smoothing fitted to the image structure. The only filter parameter is the number of factors used to calculate the filter. By varying the factor number, it is possible to have an interactive filtering. (orig./LH)

  16. Water washable stainless steel HEPA filter

    Science.gov (United States)

    Phillips, Terrance D. (617 Chestnut Ct., Aiken, SC 29803)

    2001-01-01

    The invention is a high efficiency particulate (HEPA) filter apparatus and system, and method for assaying particulates. The HEPA filter provides for capture of 99.99% or greater of particulates from a gas stream, with collection of particulates on the surface of the filter media. The invention provides a filter system that can be cleaned and regenerated in situ.

  17. Recursive approximation of the bilateral filter.

    Science.gov (United States)

    Yang, Qingxiong

    2015-06-01

    This paper presents a complete proof that the bilateral filter can be implemented recursively, as long as: 1) the spatial filter can be implemented recursively and 2) the range filter can be decomposed into a recursive product. As a result, an O(ND) solution can be obtained for bilateral filtering, where N is the image size and D is the dimensionality. PMID:25700449
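Condition 1) can be illustrated for the exponential spatial kernel, which admits an exact O(N) recursive implementation as one causal plus one anti-causal pass; this is a generic sketch of a recursive spatial filter, not the paper's algorithm.

```python
# Condition 1) illustrated: the symmetric exponential kernel
# y[j] = sum_i x[i] * a**|i-j| can be computed exactly in O(N) with one
# causal and one anti-causal recursive pass, independent of kernel width.
# Generic sketch, not the paper's implementation.

def recursive_exp_filter(x, a):
    n = len(x)
    causal = [0.0] * n
    anti = [0.0] * n
    for j in range(n):                       # causal pass: i <= j
        causal[j] = x[j] + a * (causal[j - 1] if j else 0.0)
    for j in range(n - 1, -1, -1):           # anti-causal pass: i >= j
        anti[j] = x[j] + a * (anti[j + 1] if j < n - 1 else 0.0)
    # x[j] was counted in both passes, so subtract it once
    return [causal[j] + anti[j] - x[j] for j in range(n)]

def brute_force(x, a):
    n = len(x)
    return [sum(x[i] * a ** abs(i - j) for i in range(n)) for j in range(n)]

x = [1.0, 3.0, 2.0, 5.0, 4.0]
rec = recursive_exp_filter(x, 0.5)
ref = brute_force(x, 0.5)
print(all(abs(u - v) < 1e-9 for u, v in zip(rec, ref)))  # prints True
```

The two recursive passes cost two multiply-adds per sample regardless of the effective kernel width, which is what makes the overall O(ND) bilateral complexity possible once the range kernel also decomposes recursively (condition 2).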

  18. Analysis and Simulation of Reduced FIR Filters

    Science.gov (United States)

    Radic, Lj.; Mathis, W.

    2005-05-01

    High-order FIR filters employ model reduction techniques in order to decrease power consumption and time delay. During reduction, high-order FIR filters are converted into low-order IIR filters, preserving stability and phase linearity as the main features. Matlab simulations of an audio system with these reduced filters are presented. Furthermore, the influence of the filter order on power consumption is discussed.

  19. The ethics of improving African traditional medical practice: scientific or African traditional research methods?

    Science.gov (United States)

    Nyika, Aceme

    2009-11-01

    The disease burden in Africa, which is relatively very large compared with developed countries, has been attributed to various factors that include poverty, food shortages, inadequate access to health care and unaffordability of Western medicines to the majority of African populations. Although for 'old diseases' knowledge about the right African traditional medicines to treat or cure the diseases has been passed from generation to generation, knowledge about traditional medicines to treat newly emerging diseases has to be generated in one way or another. In addition, the existing traditional medicines have to be continuously improved, which is also the case with Western scientific medicines. Whereas one school of thought supports the idea of improving medicines, be they traditional or Western, through scientific research, an opposing school of thought argues that subjecting African traditional medicines to scientific research would be tantamount to some form of colonization and imperialism. This paper argues that continuing to use African traditional medicines for old and new diseases without making concerted efforts to improve their efficacy and safety is unethical since the disease burden affecting Africa may continue to rise in spite of the availability and accessibility of the traditional medicines. Most importantly, the paper commends efforts being made in some African countries to improve African traditional medicine through a combination of different mechanisms that include the controversial approach of scientific research on traditional medicines. PMID:19682966

  20. Data-mining of potential antitubercular activities from molecular ingredients of traditional Chinese medicines

    Directory of Open Access Journals (Sweden)

    Salma Jamal

    2014-07-01

    Full Text Available Background. Traditional Chinese medicine encompasses a well-established alternative system of medicine based on a broad range of herbal formulations and is practiced extensively in the region for the treatment of a wide variety of diseases. In recent years, several reports have described in-depth studies of the molecular ingredients of traditional Chinese medicines and their biological activities, including anti-bacterial activities. The availability of a well-curated dataset of molecular ingredients of traditional Chinese medicines, accurate in-silico cheminformatics models for mining for antitubercular agents, and computational filters to prioritize molecules prompted us to search for potential hits in these datasets. Results. We used a consensus approach to predict molecules with potential antitubercular activities from a large dataset of molecular ingredients of traditional Chinese medicines available in the public domain. We further prioritized 160 molecules based on five computational filters (SMARTSfilter) so as to avoid potentially undesirable molecules. We further examined the molecules for permeability across the Mycobacterial cell wall and for potential activities against non-replicating and drug-tolerant Mycobacteria. Additional in-depth literature surveys of the reported antitubercular activities of the molecular ingredients and their sources were used to support the prioritization. Conclusions. Our analysis suggests that datasets of molecular ingredients of traditional Chinese medicines offer a new opportunity to mine for potential biological activities. In this report, we suggest a proof-of-concept methodology to prioritize molecules for further experimental assays using a variety of computational tools. We further suggest that a subset of the prioritized molecules could be evaluated for tuberculosis, owing to their additional effect against non-replicating Mycobacteria as well as the hepato-protection offered by the sources of these ingredients.

  1. Risk Sensitive Filtering with Poisson Process Observations

    International Nuclear Information System (INIS)

    In this paper we consider risk-sensitive filtering for Poisson process observations. Risk-sensitive filtering is a type of robust filtering which offers performance benefits in the presence of uncertainties. We derive a risk-sensitive filter for a stochastic system in which the signal variable has dynamics described by a diffusion equation and determines the rate function for an observation process. The filtering equations are stochastic integral equations. Computer simulations are presented to demonstrate the performance gain of the risk-sensitive filter compared with the risk-neutral filter.

  2. Numerical simulation of large fabric filter

    Science.gov (United States)

    Sedláček, Jan; Kovařík, Petr

    2012-04-01

    Fabric filters are used in a wide range of industrial technologies for cleaning incoming or exhaust gases. To achieve maximal efficiency of discrete phase separation and a long lifetime of the filter hoses, it is necessary to ensure a uniform load on the filter surface and to avoid impacts of heavy particles with high velocities on the filter hoses. The paper deals with numerical simulation of the two-phase flow field in a large fabric filter. The filter is composed of six chambers with approx. 1600 filter hoses in total. The model was simplified to one half of the filter, and the filter hose walls were substituted by porous zones. The model settings were based on experimental data, especially the filter pressure drop. Unsteady simulations with different turbulence models were performed. The flow field and particle trajectories were analyzed, and the results were compared with experimental observations.

  3. Numerical simulation of large fabric filter

    Directory of Open Access Journals (Sweden)

    Kovařík Petr

    2012-04-01

    Full Text Available Fabric filters are used in a wide range of industrial technologies for cleaning incoming or exhaust gases. To achieve maximal efficiency of discrete phase separation and a long lifetime of the filter hoses, it is necessary to ensure a uniform load on the filter surface and to avoid impacts of heavy particles with high velocities on the filter hoses. The paper deals with numerical simulation of the two-phase flow field in a large fabric filter. The filter is composed of six chambers with approx. 1600 filter hoses in total. The model was simplified to one half of the filter, and the filter hose walls were substituted by porous zones. The model settings were based on experimental data, especially the filter pressure drop. Unsteady simulations with different turbulence models were performed. The flow field and particle trajectories were analyzed, and the results were compared with experimental observations.

  4. Spatial filters for high power lasers

    Energy Technology Data Exchange (ETDEWEB)

    Erlandson, Alvin Charles; Bayramian, Andrew James

    2014-12-02

    A spatial filter includes a first filter element and a second filter element overlapping with the first filter element. The first filter element includes a first pair of cylindrical lenses separated by a first distance. Each of the first pair of cylindrical lenses has a first focal length. The first filter element also includes a first longitudinal slit filter positioned between the first pair of cylindrical lenses. The second filter element includes a second pair of cylindrical lenses separated by a second distance. Each of the second pair of cylindrical lenses has a second focal length. The second filter element also includes a second longitudinal slit filter positioned between the second pair of cylindrical lenses.

  5. Confucianism and the Asian Martial Traditions

    Directory of Open Access Journals (Sweden)

    C. Alexander Simpkins

    2012-07-01

    Full Text Available Confucianism has been foundational in the political and social life of many Asian countries. Its influence pervades institutions and practices at every level of human activity. Martial arts have also benefited from this philosophy, as the traditional Confucian legacy continues to influence modern practices. This article briefly highlights some key figures and events, describes relevant core concepts of Confucian philosophy, and then shows exemplary applications to martial arts today. Modern martial artists can gain understanding of the traditional Confucian insights that deepen the significance of contemporary martial arts.

  6. Religious Tradition and the Archaic Man

    Directory of Open Access Journals (Sweden)

    Veress Károly

    2005-04-01

    Full Text Available My article, as a first step in a comprehensive research program, attempts to verify the hypothesis that M. Eliade's morphological and historical investigations of archaic religiousness reveal the outlines of an archaic ontology. For this purpose, the article focuses on Eliade's conception of religious tradition as the carrier of the indivisible unity of sacred existence and religious experience. The ontological difference found in religious existence and revealed by religious experience is rooted in the essentially hermeneutic nature of religious tradition. Therefore, the perspective of philosophical hermeneutics proves very productive in the investigation of this problem.

  7. Historical tradition in Serbian genre literature

    Directory of Open Access Journals (Sweden)

    Đorđević Ivan

    2008-01-01

    Full Text Available This paper discusses two Serbian science-fiction stories with a special emphasis on the motifs in their narrative structure; the motif analysis is focused on those motifs that represent a transposition of elements of 'historical tradition'. The key words connecting the images appearing in this context are: fear of losing (national) identity and a strategy of resistance towards those who, presumably, want to 'take over' that identity. In this sense, a return to 'the historical tradition' in the analyzed texts aims to reassess certain past models, indicating at the same time those that have successfully served and endured as historical models in this discourse.

  8. Filter-adsorber aging assessment

    International Nuclear Information System (INIS)

    An aging assessment of high-efficiency particulate air (HEPA) filters and activated carbon gas adsorption units was performed by the Pacific Northwest Laboratory as part of the U.S. Nuclear Regulatory Commission's (USNRC) Nuclear Plant Aging Research (NPAR) Program. This evaluation of the general process in which characteristics of these two components gradually change with time or use included the compilation of information concerning failure experience, stressors, aging mechanisms and effects, and inspection, surveillance, and monitoring methods (ISMM). Stressors, the agents or stimuli that can produce aging degradation, include heat, radiation, volatile contaminants, and even normal concentrations of aerosol particles and gases. In an experimental evaluation of degradation in terms of the tensile breaking strength of aged filter media specimens, over forty percent of the samples did not meet specifications for new material. Chemical and physical reactions can gradually embrittle sealants and gaskets as well as filter media. Mechanisms that can lead to impaired adsorber performance are associated with the loss of potentially available active sites as a result of the exposure of the carbon to airborne moisture or volatile organic compounds. Inspection, surveillance, and monitoring methods have been established to observe filter pressure drop buildup, check HEPA filters and adsorbers for bypass, and determine the retention effectiveness of aged carbon. These evaluations of installed filters do not reveal degradation in terms of reduced media strength, but indicate that under normal conditions aged media can continue to effectively retain particles. However, this degradation may be important in view of the likelihood of moisture, steam, and higher particle loadings during severe accidents and the probability that the filters have been in use for an extended period.

  9. Parallel association of shunt active power filters

    OpenAIRE

    Pregitzer, Ricardo G.; Pinto, J. G.; Sepúlveda, João; Afonso, João L.

    2007-01-01

    This paper describes different types of parallel association of Shunt Active Power Filters, presents computer simulation models developed in PSCAD/EMTDC, and shows simulated results for each case. The control system used in the active filters is based on the Instantaneous Reactive Power Theory (p-q Theory). The active filters use periodic sampling as switching technique. Two types of parallel associations are presented: two active filters in different parallel feeders; and both active filters...

  10. The Design of Compressive Sensing Filter

    OpenAIRE

    Li, Lianlin; Zhang, Wenji; Xiang, Yin; Li, Fang

    2008-01-01

    In this paper, the design of a universal compressive sensing filter based on normal filters, including lowpass, highpass, bandpass, and bandstop filters with different cutoff frequencies (or bandwidths), has been developed to enable signal acquisition with sub-Nyquist sampling. Moreover, to flexibly control the size and the coherence of the compressive sensing filter, as an example, the microstrip filter based on defected ground structure (DGS) has been employed to realize th...

  11. Review Of Parameter Estimation Using Adaptive Filtering

    OpenAIRE

    Lalita Rani, Shaloo Kikan

    2013-01-01

    In this paper, a comparative study of different adaptive filter algorithms for channel parameter estimation is described. We present different parameter estimation approaches based on adaptive filtering. An extended Kalman filter is then applied as a near-optimal solution to the adaptive channel parameter estimation problem. Kalman filtering is applied to motion parameters, resulting in optimal pose estimation. A parallel Kalman filter is applied for joint estimation of code delay, multipath gains...
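
    The extended and parallel Kalman filters surveyed above all build on the same predict/update recursion. As a minimal sketch (not taken from the paper), a scalar Kalman filter estimating a constant channel parameter observed in noise can model the parameter as a random walk; the process noise `q`, measurement noise `r`, and the simulated true value are illustrative assumptions:

```python
import random

def kalman_1d(measurements, q=1e-4, r=0.25, x0=0.0, p0=1.0):
    """Scalar Kalman filter for a random-walk parameter observed in noise."""
    x, p = x0, p0
    estimates = []
    for z in measurements:
        p += q                # predict: parameter follows a random walk
        k = p / (p + r)       # Kalman gain
        x += k * (z - x)      # update the estimate with the innovation
        p *= (1 - k)          # posterior variance shrinks after the update
        estimates.append(x)
    return estimates

random.seed(0)
true_value = 3.0
zs = [true_value + random.gauss(0, 0.5) for _ in range(200)]
est = kalman_1d(zs)
print(round(est[-1], 2))
```

Because the gain `k` decays as the posterior variance shrinks, the filter behaves like an exponentially weighted average whose window is set automatically by the `q`/`r` ratio rather than hand-tuned.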

  12. Multiplier-free filters for wideband SAR

    DEFF Research Database (Denmark)

    Dall, Jørgen; Christensen, Erik Lintz

    2001-01-01

    This paper derives a set of parameters to be optimized when designing filters for digital demodulation and range prefiltering in SAR systems. Aiming at an implementation in field programmable gate arrays (FPGAs), an approach for the design of multiplier-free filters is outlined. Design results are presented in terms of filter complexity and performance. One filter has been coded in VHDL and preliminary results indicate that the filter can meet a 2 GHz input sample rate.
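
    The abstract does not give the filter coefficients, but the core idea of a multiplier-free filter can be sketched: restrict each tap to a sum of signed powers of two (canonical signed digit form) so that every product reduces to shifts and adds, which map cheaply to FPGA logic. The coefficient set below is hypothetical, chosen only to make the arithmetic visible:

```python
# Hypothetical taps, each written as a sum of signed powers of two (CSD form),
# so every "multiply" becomes shifts and adds -- the idea behind
# multiplier-free FPGA filters.
COEFFS_CSD = [
    [(0, +1)],           # 1 = +2^0
    [(2, +1), (0, -1)],  # 3 = +2^2 - 2^0
    [(3, +1)],           # 8 = +2^3
    [(2, +1), (0, -1)],  # 3
    [(0, +1)],           # 1
]

def fir_multiplier_free(samples):
    """FIR filter computed with only shifts and adds (integer arithmetic)."""
    out = []
    for i in range(len(samples)):
        acc = 0
        for t, terms in enumerate(COEFFS_CSD):
            if i - t < 0:
                continue
            x = samples[i - t]
            for shift, sign in terms:
                acc += sign * (x << shift)  # shift-and-add instead of multiply
        out.append(acc)
    return out

# The impulse response recovers the tap values.
print(fir_multiplier_free([1, 0, 0, 0, 0]))  # → [1, 3, 8, 3, 1]
```

In hardware the inner loop becomes a fixed wiring of shifters and adders per tap, which is why lookup-table resources and routing, rather than multipliers, dominate the design.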

  13. Multivariate ordering in color image filtering

    OpenAIRE

    Pitas, I.; Tsakalides, P.

    2010-01-01

    Multivariate data ordering and its use in color image filtering are presented. Several of the filters presented are extensions of the single-channel filters based on order statistics. The statistical analysis of the marginal order statistics is presented for the p-dimensional case. A family of multichannel filters based on multivariate data ordering, such as the marginal median, the vector median, the marginal α-trimmed mean, and the multichannel modified trimmed mean filter, is described in...
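
    As an illustration of the marginal (component-wise) ordering the abstract refers to, a marginal median over a window of RGB pixels takes the median independently in each channel; the window values below are made up, with one impulse-corrupted pixel:

```python
import statistics

def marginal_median(window):
    """Marginal median of a list of RGB tuples: median taken per channel."""
    return tuple(statistics.median(p[c] for p in window) for c in range(3))

window = [(10, 200, 30), (12, 198, 33), (11, 0, 31),  # (11, 0, 31) is corrupted
          (13, 201, 29), (9, 199, 32)]
print(marginal_median(window))  # → (11, 199, 31)
```

Note that the result need not equal any input pixel, which can introduce new colors; the vector median mentioned in the abstract avoids this by always selecting one of the input vectors.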

  14. Neural Filtering Technique for Enhancing Digital Images

    OpenAIRE

    Pushpavalli, R.; Sivarajde, G.

    2013-01-01

    A new image enhancement technique is proposed in this paper. The proposed neural filter operates in two stages. In the first stage, the corrupted image is filtered by applying a special class of multistate switching median filter. The output of the multistate switching median filter is then suitably combined with a feed-forward neural network in the second stage. The internal parameters of the feed-forward neural network are adaptively optimized by training on three well-known images. This ...

  15. Adaptive Marginal Median Filter for Colour Images

    OpenAIRE

    Almanzor Sapena; Valentín Gregori; Samuel Morillas

    2011-01-01

    This paper describes a new filter for impulse noise reduction in colour images which is aimed at improving the noise reduction capability of the classical vector median filter. The filter is inspired by the application of a vector marginal median filtering process over a selected group of pixels in each filtering window. This selection, which is based on the vector median, along with the application of the marginal median operation constitutes an adaptive process that leads to a more robust f...

  16. A strain energy filter for 3D vessel enhancement.

    Science.gov (United States)

    Xiao, Changyan; Staring, Marius; Shamonin, Denis; Reiber, Johan H C; Stolk, Jan; Stoel, Berend C

    2010-01-01

    The traditional Hessian-related vessel filters often suffer from the problem of handling non-cylindrical objects. To remedy this shortcoming, we present a shape-tuned strain energy density function to measure vessel likelihood in 3D images. Based on the tensor invariants and the stress-strain principle in mechanics, a new shape-discriminating and vessel strength measure function is formulated. Experiments on synthetic and clinical data verify the performance of our method in enhancing complex vascular structures including branches, bifurcations, and feature details. PMID:20879421

  17. Improving Recommendation Quality by Merging Collaborative Filtering and Social Relationships

    CERN Document Server

    De Meo, Pasquale; Fiumara, Giacomo; Provetti, Alessandro

    2011-01-01

    Matrix Factorization techniques have been successfully applied to raise the quality of suggestions generated by Collaborative Filtering Systems (CFSs). Traditional CFSs based on Matrix Factorization operate on the ratings provided by users and have recently been extended to incorporate demographic aspects such as age and gender. In this paper we propose to merge CFSs based on Matrix Factorization with information regarding social friendships in order to provide users with more accurate suggestions and rankings on items of their interest. The proposed approach has been evaluated on a real-life online social network; the experimental results show an improvement over existing CFSs. A detailed comparison with related literature is also presented.
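
    The matrix factorization component that such systems build on can be sketched in a few lines: learn low-rank user and item vectors by stochastic gradient descent so that their dot product approximates observed ratings, then predict unseen ratings from the learned vectors. This is a generic baseline, not the paper's social-aware method, and the ratings, rank `k`, and hyperparameters below are illustrative:

```python
import random

def factorize(ratings, n_users, n_items, k=2, lr=0.02, reg=0.05, epochs=300):
    """Plain SGD matrix factorization: rating(u, i) ~ dot(P[u], Q[i])."""
    random.seed(42)
    P = [[random.uniform(0.01, 0.1) for _ in range(k)] for _ in range(n_users)]
    Q = [[random.uniform(0.01, 0.1) for _ in range(k)] for _ in range(n_items)]
    for _ in range(epochs):
        for u, i, r in ratings:
            pred = sum(P[u][f] * Q[i][f] for f in range(k))
            err = r - pred
            for f in range(k):  # gradient step with L2 regularization
                pu, qi = P[u][f], Q[i][f]
                P[u][f] += lr * (err * qi - reg * pu)
                Q[i][f] += lr * (err * pu - reg * qi)
    return P, Q

# (user, item, rating) triples; user 1 has not rated item 2.
data = [(0, 0, 5), (0, 1, 3), (0, 2, 4), (1, 0, 5), (1, 1, 3)]
P, Q = factorize(data, n_users=2, n_items=3)
predicted = sum(P[1][f] * Q[2][f] for f in range(2))
print(round(predicted, 1))  # users 0 and 1 agree elsewhere, so near rating 4
```

The social extension described in the abstract would add further terms to the loss that pull friends' latent vectors together; the SGD skeleton stays the same.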

  18. Nonlinear filtering in oil/gas reservoir simulation: filter design

    Energy Technology Data Exchange (ETDEWEB)

    Arnold, E.M.; Voss, D.A.; Mayer, D.W.

    1980-10-01

    In order to provide an additional mode of utility to the USGS reservoir model VARGOW, a nonlinear filter was designed and incorporated into the system. As a result, optimal (in the least squares sense) estimates of reservoir pressure, liquid mass, and gas cap plus free gas mass are obtained from an input of reservoir initial condition estimates and pressure history. These optimal estimates are provided continuously for each time after the initial time, and the input pressure history is allowed to be corrupted by measurement error. Preliminary testing of the VARGOW filter was begun and the results show promise. Synthetic data which could be readily manipulated during testing was used in tracking tests. The results were positive when the initial estimates of the reservoir initial conditions were reasonably close. Further testing is necessary to investigate the filter performance with real reservoir data.

  19. Dry Filter Method for Filtered Venting of the Containment

    Energy Technology Data Exchange (ETDEWEB)

    Obenland, Ralf; Freis, Daniel; Tietsch, Wolfgang [Westinghouse Electric Germany GmbH, Mannheim (Germany); Kroes, Albertus [Westinghouse Electric Belgium, Brussels (Belgium)

    2012-03-15

    In the course of severe accident scenarios pressure builds up in the containment. In these cases, reliable pressure release of the containment will be required to maintain containment integrity. To effectively perform such a containment depressurization while minimizing radioactive release to the environment, a Filtered Containment Venting System (FCVS) is needed. The Dry Filter Method (DFM) is a FCVS with very high Decontamination Factors (DF) for radioactive aerosols and elemental and organic iodine. It was designed for the back fit of German nuclear power stations after the Chernobyl accident in 1986. This paper describes a passive, flexible and modular dry filter system which can cope with a number of hypothetical scenarios and can be very flexibly adapted to plant-specific requirements.

  20. Dry Filter Method for Filtered Venting of the Containment

    International Nuclear Information System (INIS)

    In the course of severe accident scenarios pressure builds up in the containment. In these cases, reliable pressure release of the containment will be required to maintain containment integrity. To effectively perform such a containment depressurization while minimizing radioactive release to the environment, a Filtered Containment Venting System (FCVS) is needed. The Dry Filter Method (DFM) is a FCVS with very high Decontamination Factors (DF) for radioactive aerosols and elemental and organic iodine. It was designed for the back fit of German nuclear power stations after the Chernobyl accident in 1986. This paper describes a passive, flexible and modular dry filter system which can cope with a number of hypothetical scenarios and can be very flexibly adapted to plant-specific requirements

  1. Alternative approaches implementing high-performance FIR filters on lookup-table-based FPGAs: a comparison

    Science.gov (United States)

    Do, Tien-Toan; Kropp, Holger; Reuter, Carsten; Pirsch, Peter

    1998-10-01

    Finite impulse response (FIR) filters are very commonly used in digital signal processing (DSP) applications and are traditionally implemented using ASICs or DSP processors. For FPGA implementation, due to the high throughput rate and large computational power required under real-time constraints, they are a challenging subject. Indeed, the limited resources on FPGAs, i.e., logic blocks and flip-flops, and furthermore the high routing delays, require compact implementations of the circuits. Three approaches for the implementation of high-performance symmetric FIR filters on lookup-table-based FPGAs are considered in this paper: fully parallel distributed arithmetic, table lookup multiplication, and conventional hardware multiplication. Implementation results are illustrated by an 8-tap, 8-bit symmetric FIR filter, and comparative considerations of the above approaches for Xilinx FPGAs are also shown.
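
    The coefficient symmetry that all three approaches exploit is that taps satisfy h[k] = h[N-1-k], so a pre-adder can sum the two samples sharing a coefficient and only half the products are needed. A behavioural sketch of an 8-tap symmetric filter (hypothetical coefficients, not from the paper):

```python
def symmetric_fir(samples, half_coeffs):
    """8-tap symmetric FIR: coefficients h[0..3] mirrored to h[7..4].
    Symmetry lets hardware pre-add x[n-k] + x[n-(N-1)+k], halving multipliers."""
    n = len(half_coeffs) * 2          # total number of taps (8 here)
    buf = [0] * n                     # shift register of the last n samples
    out = []
    for x in samples:
        buf = [x] + buf[:-1]          # newest sample at index 0
        acc = 0
        for k, h in enumerate(half_coeffs):
            acc += h * (buf[k] + buf[n - 1 - k])  # pre-adder + one multiply
        out.append(acc)
    return out

# The impulse response reproduces the full symmetric tap set 1,2,3,4,4,3,2,1.
print(symmetric_fir([1] + [0] * 8, [1, 2, 3, 4]))
```

In the distributed-arithmetic variant, even these remaining multiplies disappear: the pre-added sample bits index a lookup table of precomputed coefficient partial sums, which is why LUT-based FPGAs suit the approach well.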

  2. Color filters including infrared cut-off integrated on CMOS image sensor.

    Science.gov (United States)

    Frey, Laurent; Parrein, Pascale; Raby, Jacques; Pellé, Catherine; Hérault, Didier; Marty, Michel; Michailos, Jean

    2011-07-01

    A color image was taken with a CMOS image sensor without any infrared cut-off filter, using red, green and blue metal/dielectric filters arranged in Bayer pattern with 1.75 µm pixel pitch. The three colors were obtained by a thickness variation of only two layers in the 7-layer stack, with a technological process including four photolithography levels. The thickness of the filter stack was only half of the traditional color resists, potentially enabling a reduction of optical crosstalk for smaller pixels. Both color errors and signal to noise ratio derived from optimized spectral responses are expected to be similar to color resists associated with infrared filter. PMID:21747459

  3. Ethnic Label Use in Adolescents from Traditional and Non-Traditional Immigrant Communities

    Science.gov (United States)

    Kiang, Lisa; Perreira, Krista M.; Fuligni, Andrew J.

    2011-01-01

    Understanding adolescents' use of ethnic labels is a key developmental issue, particularly given the practical significance of identity and self-definition in adolescents' lives. Ethnic labeling was examined among adolescents in the traditional immigrant receiving area of Los Angeles (Asian n = 258, Latino n = 279) and the non-traditional

  4. A biological oil adsorption filter

    Energy Technology Data Exchange (ETDEWEB)

    Pasila, A. [University of Helsinki (Finland). Dept. of Agricultural Engineering and Household Technology

    2005-12-01

    A new oil adsorption method called adsorption filtration (AF) has been developed. It is a technology whereby oil residues can be cleaned from water by running it through a simple filter made from freeze-treated, dried, milled and then fragmented plant material. By choosing suitable plants and fragmentation sizes it is possible to produce filters which pass water but adsorb oil. The aim of this study was to investigate the possibilities of manufacturing oil-adsorbing filter materials from reed canary grass (Phalaris arundinacea), flax (Linum usitatissimum L.) or hemp fibre (Cannabis sativa L.). The oil (80 ml) was mixed with de-ionised water (200 ml) and this mixture was filtered through 10 or 20 g adsorption filters. Fine spring-harvested hemp fibre (diameter less than 1 mm) and reed canary grass fragments adsorb 2-4 g of oil per gram of adsorption material, compared to 1-3 g of water. Adsorption filtration is thus a novel way of gathering spilled oil in shallow coastal waters before the oil reaches the shore. (author)

  5. Iodine filters in nuclear installations

    International Nuclear Information System (INIS)

    The present report discusses the significance for environmental exposure of the iodine released with the gaseous effluents of nuclear power stations and reprocessing plants in relation to releases of other airborne radionuclides. Iodine filtration processes are described. The release pathways and the composition of airborne fission product iodine mixtures and their bearing on environmental exposure are discussed on the basis of measured fission product iodine emissions. The sorbents which can be used for iodine filtration, their removal efficiencies and range of applications are dealt with in detail. The particular conditions governing iodine removal, which are determined by the various gaseous iodine species, are illustrated on the basis of experimentally determined retention profiles. Particular attention is given to the limitations imposed by temperature, humidity, radiation and filter poisoning. The types of filter normally used are described, their advantages and drawbacks discussed, the principles underlying their design are outlined and the sources of error indicated. The methods normally applied to test the efficiency of various iodine sorbents are described and assessed. Operating experience with iodine filters, gathered from surveillance periods of many years, is supplemented by a large number of test results and the findings of extensive experiments. Possible ways of prolonging the permissible service lives of iodine filters are discussed and information is given on protective measures. The various iodine removal processes applied in reprocessing plants are described and compared with reference to efficiency and cost. The latest developments in filter technology in reprocessing plants are briefly outlined

  6. A Twisted Pair Cryogenic Filter

    CERN Document Server

    Spietz, L; Schölkopf, R J; Spietz, Lafe; Teufel, John

    2006-01-01

    In low temperature transport measurements, there is frequently a need to protect a device at cryogenic temperatures from thermal noise originating in warmer parts of the experiment. There are also a wide range of experiments, such as high precision transport measurements on low impedance devices, in which a twisted-pair wiring configuration is useful to eliminate magnetic pickup. Furthermore, with the rapid growth in complexity of cryogenic experiments, as in the field of quantum computing, there is a need for more filtered lines into a cryostat than are often available using the bulky low temperature filters in use today. We describe a low cost filter that provides the needed RF attenuation while allowing for tens of wires in a twisted pair configuration with an RF-tight connection to the sample holder. Our filter consists of manganin twisted pairs wrapped in copper tape with a light-tight connection to the shield of the sample holder. We demonstrate agreement of our filter with a theoretical model up to the...

  7. A biological oil adsorption filter

    International Nuclear Information System (INIS)

    A new oil adsorption method called adsorption filtration (AF) has been developed. It is a technology whereby oil residues can be cleaned from water by running it through a simple filter made from freeze-treated, dried, milled and then fragmented plant material. By choosing suitable plants and fragmentation sizes it is possible to produce filters which pass water but adsorb oil. The aim of this study was to investigate the possibilities of manufacturing oil-adsorbing filter materials from reed canary grass (Phalaris arundinacea), flax (Linum usitatissimum L.) or hemp fibre (Cannabis sativa L.). The oil (80 ml) was mixed with de-ionised water (200 ml) and this mixture was filtered through 10 or 20 g adsorption filters. Fine spring-harvested hemp fibre (diameter less than 1 mm) and reed canary grass fragments adsorb 2-4 g of oil per gram of adsorption material, compared to 1-3 g of water. Adsorption filtration is thus a novel way of gathering spilled oil in shallow coastal waters before the oil reaches the shore. (author)

  8. A biological oil adsorption filter.

    Science.gov (United States)

    Pasila, Antti

    2004-12-01

    A new oil adsorption method called adsorption filtration (AF) has been developed. It is a technology whereby oil residues can be cleaned from water by running it through a simple filter made from freeze-treated, dried, milled and then fragmented plant material. By choosing suitable plants and fragmentation sizes it is possible to produce filters which pass water but adsorb oil. The aim of this study was to investigate the possibilities of manufacturing oil-adsorbing filter materials from reed canary grass (Phalaris arundinacea), flax (Linum usitatissimum L.) or hemp fibre (Cannabis sativa L.). The oil (80 ml) was mixed with de-ionised water (200 ml) and this mixture was filtered through 10 or 20 g adsorption filters. Fine spring-harvested hemp fibre (diameter less than 1 mm) and reed canary grass fragments adsorb 2-4 g of oil per gram of adsorption material, compared to 1-3 g of water. Adsorption filtration is thus a novel way of gathering spilled oil in shallow coastal waters before the oil reaches the shore. PMID:15556187

  9. Allegory in Chesnutt's Marrow of Tradition.

    Science.gov (United States)

    Giles, James R.; Lally, Thomas P.

    1984-01-01

    Recognizes allegorical patterns controlling Chesnutt's novel, "The Marrow of Tradition," in response to criticisms concerning stereotyped characters and overplotting. Feels that Chesnutt is stating that, given the death of benevolence in White society and the suicidal nature of violent Black retaliation, separatism alone offers Black Americans…

  10. Traditional festivities in Bohemia: Continuity and Revitalisation.

    Czech Academy of Sciences Publication Activity Database

    Stavělová, Daniela

    Limerick : The Irish World Academy of Music and Dance, University of Limerick, 2014 - (Dunin, E.; Foley, C.), s. 278-286 ISBN 9781905952533. [Symposium of the International Council for Traditional Music (ICTM) /27./. Limerick (IE), 22.07.2012-29.07.2012] Institutional support: RVO:68378076 Keywords : shrovetide * carnival * archetype * identity * collective memory Subject RIV: AC - Archeology, Anthropology, Ethnology

  11. Traditions of Moral Education in Iraq

    Science.gov (United States)

    Al-Khaizaran, Huda

    2007-01-01

    This article suggests three ideas. First, under the pressures of the Ottoman and Iraqi state modernity projects, two types of cultural traditions in Iraq, namely Islam and Arab tribal values, were negotiated and re-negotiated. Second, the concepts of merit based on these values changed over time and were institutionalised in education. Third,…

  12. Alternative Approaches to Traditional Topics in Algebra

    Science.gov (United States)

    Coburn, John W.

    2010-01-01

    Students who otherwise seem unreachable through traditional approaches to algebra require some alternative teaching methods. So do teachers who seek to add elements of freshness and innovation to their classrooms or who simply appreciate variety. This article offers some unconventional techniques for teaching a few conventional algebra topics.…

  13. Milk-based traditional Turkish desserts

    Directory of Open Access Journals (Sweden)

    Tulay Ozcan

    2009-12-01

    Full Text Available Traditional foods are a reflection of cultural inheritance and affect lifestyle habits. Culture can be viewed as a system of socially transmitted patterns of behaviour that characterises a particular group. Despite globalisation, these are key elements for accurately estimating a population’s dietary patterns and how these have been shaped through time. In Turkey, a meal with family or friends traditionally ends with a dessert, which is a testimony to the hosts’ hospitality or to the housewife’s love and affection for her husband and children, since sweets and desserts are important elements of Turkish cuisine. However, the growing consciousness of nutrition and healthy eating, due to rapid changes in lifestyle and dietary patterns, has contributed to increased interest in traditional foods with potential health benefits, along with increased uncertainty about dessert consumption. Dairy desserts are extensively consumed owing to their nutritive and sensory characteristics. Some traditional dairy desserts are Mustafakemalpasa, Gullac, Kazandibi, Hosmerim and Tavukgogsu, which are mainly made from milk or fresh cheese; the current paper discusses their manufacturing processes and composition.

  14. Are Traditional and Modern Education Incompatible?

    Science.gov (United States)

    Campbell, Anne

    1991-01-01

    Traditional American Indian education reflects beliefs that no single absolute truth exists, and that children learn by observing and interacting with others who know bits of the truth. Mainstream education sees teachers as the most important dispensers of universal truth learned in linear fashion. These views may be incompatible. (SV)

  15. ODL and Traditional Universities: Dichotomy or Convergence?

    Science.gov (United States)

    Curran, Chris

    1997-01-01

    Examines the evolution of open distance learning (ODL) in the United Kingdom, particularly in relation to traditional universities and to recent developments in telematics. Argues that a major test of the new technologies in distance education will be their capacity to support real academic communities, allowing the university to maintain the best…

  16. Industrial redesign of traditional valencian tiles

    Directory of Open Access Journals (Sweden)

    Lucas, F.

    2000-02-01

    Full Text Available The idea behind this project was to recover a type of Traditional Valencian Ceramics, by adapting its own particular production technology to present-day systems, installations and materials.


  17. Two Traditions in Afro-American Literature.

    Science.gov (United States)

    Haslam, Gerald W.

    1970-01-01

    Afro-American poets, dramatists, and prose writers have been affected by the tension between traditional African oral modes and various European-American written genres, as well as by the merger of "white taste and black need," which can be seen by examining the styles and themes of black literature in America from Lucy Terry's 1746 poem "Bar's…

  18. Inferior vena cava filters in cancer patients: to filter or not to filter

    Directory of Open Access Journals (Sweden)

    Hikmat Abdel-Razeq

    2011-03-01

    Full Text Available Hikmat Abdel-Razeq1, Asem Mansour2, Yousef Ismael1, Hazem Abdulelah1
    1Department of Internal Medicine, 2Department of Radiology, King Hussein Cancer Center, Amman, Jordan
    Purpose: Cancer and its treatment are recognized risk factors for venous thromboembolism (VTE); active cancer accounts for almost 20% of all newly diagnosed VTE. Inferior vena cava (IVC) filters are utilized to provide mechanical thromboprophylaxis to prevent pulmonary embolism (PE) or to avoid bleeding from systemic anticoagulation in high-risk situations. In this report, and utilizing a case study, we will address the appropriate utilization of such filters in cancer patients.
    Methods: The case of a 43-year-old female patient with rectal cancer, who developed deep vein thrombosis following a complicated medical course, will be presented. The patient was anticoagulated with a low molecular weight heparin, but a few months later and following an episode of bleeding, an IVC filter was planned. Using the PubMed database, articles published in the English language addressing issues related to IVC filters in cancer patients were accessed and will be presented.
    Results: Many recent studies questioned the need to insert IVC filters in advanced-stage cancer patients, particularly those whose anticipated survival is short and for whom prevention of PE may be of little clinical benefit and could be a poor utilization of resources.
    Conclusion: Systemic anticoagulation can be safely offered for the majority of cancer patients. When the risk of bleeding or pulmonary embolism is high, IVC filters can be utilized. However, placement of such filters should take into consideration the stage of disease and life expectancy of such patients.
    Keywords: anticoagulation, bleeding, chemotherapy

  19. The Ecological Ideology in Traditional Chinese Culture

    Directory of Open Access Journals (Sweden)

    Jing-wei LIU

    2006-03-01

    Full Text Available
    Traditional Chinese Culture includes many ecological ideas. This paper examines some of their characteristics in three aspects. First, the view that man is an integral part of nature in Traditional Chinese Culture provides a potential foundation for the harmonious coexistence of man and nature. Second, the concept of “the axes and bills enter the hills and forests only at the proper time” in Traditional Chinese Culture provides a rational basis for the harmonious coexistence of man and nature. Third, the orientation of “taking the greatest law so you can live” in Traditional Chinese Culture provides an emotional foundation for the harmonious coexistence of man and nature.
    Key words: Traditional Chinese Culture, Ecology, Ideology

  20. A Sensor Fusion Algorithm for Filtering Pyrometer Measurement Noise in the Czochralski Crystallization Process

    Directory of Open Access Journals (Sweden)

    M. Komperød

    2011-01-01

    Full Text Available The Czochralski (CZ) crystallization process is used to produce monocrystalline silicon for solar cell wafers and electronics. Tight temperature control of the molten silicon is essential for achieving high crystal quality. SINTEF Materials and Chemistry operates a CZ process. During one CZ batch, two pyrometers were used for temperature measurement. The silicon pyrometer measures the temperature of the molten silicon; it is assumed to be accurate, but exhibits considerable high-frequency measurement noise. The graphite pyrometer measures the temperature of a graphite material and has little measurement noise. The two pyrometer measurements are well correlated. This paper presents a sensor fusion algorithm that merges the two pyrometer signals to produce a temperature estimate with little measurement noise, while having significantly less phase lag than traditional lowpass-filtering of the silicon pyrometer. The algorithm consists of two sub-algorithms: (i) a dynamic model is used to estimate the silicon temperature based on the graphite pyrometer, and (ii) a lowpass filter and a highpass filter are designed as complementary filters. The complementary filters are used to lowpass-filter the silicon pyrometer, highpass-filter the dynamic model output, and merge these filtered signals. Hence, the lowpass filter attenuates noise from the silicon pyrometer, while the graphite pyrometer and the dynamic model supply those frequency components of the silicon temperature that are lost when lowpass-filtering the silicon pyrometer. The algorithm works well within a limited temperature range. To handle a larger temperature range, more research is needed to understand the process's nonlinear dynamics and build this into the dynamic model.
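    The complementary-filter fusion described above can be sketched in a few lines of Python. This is a minimal illustration, not the paper's implementation: the first-order IIR filters, the coefficient `alpha`, and the use of the clean signal itself as a stand-in for the dynamic-model output are all assumptions made here for demonstration.

    ```python
    # Minimal sketch of complementary-filter sensor fusion:
    # lowpass the noisy-but-accurate sensor, highpass the model-based
    # estimate derived from the clean sensor, and sum the two.
    import numpy as np

    def lowpass(x, alpha):
        """First-order IIR lowpass: y[n] = alpha*y[n-1] + (1-alpha)*x[n]."""
        y = np.empty(len(x), dtype=float)
        acc = float(x[0])
        for n, xn in enumerate(x):
            acc = alpha * acc + (1.0 - alpha) * xn
            y[n] = acc
        return y

    def highpass(x, alpha):
        """Complementary highpass: exactly what the lowpass removes,
        so lowpass(x) + highpass(x) reconstructs x."""
        return np.asarray(x, dtype=float) - lowpass(x, alpha)

    def fuse(noisy_accurate, model_estimate, alpha=0.9):
        """Merge the lowpass-filtered noisy sensor with the
        highpass-filtered model estimate."""
        return lowpass(noisy_accurate, alpha) + highpass(model_estimate, alpha)

    # Hypothetical data: a slowly varying melt temperature plus sensor noise.
    t = np.linspace(0.0, 1.0, 200)
    true_temp = 1400.0 + 5.0 * np.sin(2 * np.pi * 3 * t)
    rng = np.random.default_rng(0)
    silicon_pyro = true_temp + rng.normal(0.0, 2.0, t.size)  # noisy, accurate
    model_est = true_temp                 # stand-in for the dynamic-model output
    fused = fuse(silicon_pyro, model_est)
    ```

    Because the two filters are complementary by construction, the fused signal keeps the full bandwidth of the model estimate while the noise on the silicon pyrometer passes only through the lowpass branch and is attenuated.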