WorldWideScience

Sample records for computerized image analysis

  1. Computerized analysis of brain perfusion parameter images

    International Nuclear Information System (INIS)

    Turowski, B.; Haenggi, D.; Wittsack, H.J.; Beck, A.; Aurich, V.

    2007-01-01

    Purpose: The development of a computerized method which allows a direct quantitative comparison of perfusion parameters. The display should allow a clear, direct comparison of brain perfusion parameters in different vascular territories and over the course of time. The analysis is intended to be the basis for further evaluation of cerebral vasospasm after subarachnoid hemorrhage (SAH). The method should permit early diagnosis of cerebral vasospasm. Materials and Methods: The Angiotux 2D-ECCET software was developed in close cooperation between computer scientists and clinicians. Starting from parameter images of brain perfusion, the cortex was marked, segmented and assigned to definite vascular territories. The underlying values were averaged for each segment and displayed in a graph. If a follow-up was available, the mean values of the perfusion parameters were displayed in relation to time. The method was developed with CT perfusion values in mind but is applicable to other methods of perfusion imaging. Results: Computerized analysis of brain perfusion parameter images allows an immediate comparison of these parameters and follow-up of mean values in a clear and concise manner. Values are related to definite vascular territories. The tabular output facilitates further statistical evaluations. The computerized analysis is precisely reproducible, i.e., repetitions result in exactly the same output. (orig.)
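    As a rough sketch of the segment-averaging step described in this abstract, the snippet below averages a perfusion parameter map over labeled vascular territories and tracks the means over follow-up time points. The function names, array layout and label convention are illustrative assumptions, not the Angiotux 2D-ECCET implementation.

```python
import numpy as np

def territory_means(param_map: np.ndarray, territory_labels: np.ndarray) -> dict:
    """Average a perfusion parameter (e.g. MTT or CBF) over each labeled
    vascular territory; label 0 is treated as background."""
    means = {}
    for label in np.unique(territory_labels):
        if label == 0:
            continue
        means[int(label)] = float(param_map[territory_labels == label].mean())
    return means

def follow_up(maps_over_time, territory_labels):
    """Mean value per territory for each (timestamp, parameter_map) pair,
    suitable for plotting the time course of each vascular territory."""
    return [(t, territory_means(m, territory_labels)) for t, m in maps_over_time]
```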

  2. Morphological analysis of the vestibular aqueduct by computerized tomography images

    International Nuclear Information System (INIS)

    Marques, Sergio Ricardo; Smith, Ricardo Luiz; Isotani, Sadao; Alonso, Luis Garcia; Anadao, Carlos Augusto; Prates, Jose Carlos; Lederman, Henrique Manoel

    2007-01-01

    Objective: In the last two decades, advances in the computerized tomography (CT) field have transformed the evaluation of the inner and middle ear. Therefore, the aim of this study is to analyze the morphology and morphometric aspects of the vestibular aqueduct on the basis of computerized tomography images (CTI). Material and method: Computerized tomography images of vestibular aqueducts were acquired from patients (n = 110) with an age range of 1-92 years. A morphometric analysis was then performed on the vestibular aqueduct images. Using a computerized image processing system, the vestibular aqueduct measurements comprised its area, external opening, length and the distance from the vestibular aqueduct to the internal acoustic meatus. Results: The morphology of the vestibular aqueduct may be funnel-shaped, filiform or tubular, with respective proportions of 44%, 33% and 22% in children and 21.7%, 53.3% and 25% in adults. The morphometric data were an area of 4.86 mm², an external opening of 2.24 mm, a length of 4.73 mm and a distance from the vestibular aqueduct to the internal acoustic meatus of 11.88 mm in children; in adults the values were 4.93 mm², 2.09 mm, 4.44 mm and 11.35 mm, respectively. Conclusions: Computerized tomography showed that the vestibular aqueduct presents high morphological variability. The morphometric analysis showed that the differences found between groups of children and adults or between groups of both genders were not statistically significant.

  3. Computerized comprehensive data analysis of Lung Imaging Database Consortium (LIDC)

    International Nuclear Information System (INIS)

    Tan Jun; Pu Jiantao; Zheng Bin; Wang Xingwei; Leader, Joseph K.

    2010-01-01

    Purpose: The Lung Image Database Consortium (LIDC) is the largest public CT image database of lung nodules. In this study, the authors present a comprehensive and up-to-date analysis of this dynamically growing database with the help of a computerized tool, aiming to help researchers make optimal use of this database for lung cancer related investigations. Methods: The authors developed a computer scheme to automatically match the nodule outlines marked manually by radiologists on CT images. A large variety of characteristics of the annotated nodules in the database, including volume, spiculation level, elongation, interobserver variability, as well as the intersection of delineated nodule voxels and the overlapping ratio between the same nodules marked by different radiologists, are automatically calculated and summarized. The scheme was applied to analyze all 157 examinations with complete annotation data currently available in the LIDC dataset. Results: The scheme summarizes the statistical distributions of the abovementioned geometric and diagnosis features. Among the 391 nodules, (1) 365 (93.35%) have a principal axis length ≤20 mm; (2) 120, 75, 76, and 120 were marked by one, two, three, and four radiologists, respectively; and (3) 122 (32.48%) have maximum volume overlapping ratios ≥80% for the delineations of two radiologists, while 198 (50.64%) have maximum volume overlapping ratios <60%. The results also showed that 72.89% of the nodules were assessed with a malignancy score between 2 and 4, and only 7.93% of these nodules were considered severely malignant (malignancy ≥4). Conclusions: This study demonstrates that LIDC contains examinations covering a diverse distribution of nodule characteristics and can be a useful resource for assessing the performance of nodule detection and/or segmentation schemes.
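    The abstract does not give the exact definition of the "overlapping ratio" between two radiologists' delineations; a minimal sketch under the common choice of intersection volume divided by the smaller of the two nodule volumes (the definition itself and the names are assumptions) could look like this:

```python
import numpy as np

def volume_overlap_ratio(mask_a: np.ndarray, mask_b: np.ndarray) -> float:
    """Overlap between two binary nodule delineations on the same CT grid,
    here defined as intersection volume over the smaller single volume."""
    a, b = mask_a.astype(bool), mask_b.astype(bool)
    intersection = np.logical_and(a, b).sum()
    smaller = min(a.sum(), b.sum())
    return float(intersection) / smaller if smaller else 0.0
```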

  4. Computerized image analysis: estimation of breast density on mammograms

    Science.gov (United States)

    Zhou, Chuan; Chan, Heang-Ping; Petrick, Nicholas; Sahiner, Berkman; Helvie, Mark A.; Roubidoux, Marilyn A.; Hadjiiski, Lubomir M.; Goodsitt, Mitchell M.

    2000-06-01

    An automated image analysis tool is being developed for the estimation of mammographic breast density, which may be useful for risk estimation or for monitoring breast density change in a prevention or intervention program. A mammogram is digitized using a laser scanner and the resolution is reduced to a pixel size of 0.8 mm × 0.8 mm. Breast density analysis is performed in three stages. First, the breast region is segmented from the surrounding background by an automated breast boundary-tracking algorithm. Second, an adaptive dynamic range compression technique is applied to the breast image to reduce the range of the gray level distribution in the low frequency background and to enhance the differences in the characteristic features of the gray level histogram for breasts of different densities. Third, rule-based classification is used to classify the breast images into several classes according to the characteristic features of their gray level histogram. For each image, a gray level threshold is automatically determined to segment the dense tissue from the breast region. The area of segmented dense tissue as a percentage of the breast area is then estimated. In this preliminary study, we analyzed the interobserver variation of breast density estimation by two experienced radiologists using the BI-RADS lexicon. The radiologists' visually estimated percent breast densities were compared with the computer's calculation. The results demonstrate the feasibility of estimating mammographic breast density using computer vision techniques and its potential to improve the accuracy and reproducibility of density assessment in comparison with subjective visual assessment by radiologists.
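    A minimal sketch of the final percent-density step, assuming the breast mask and the automatically determined gray-level threshold are already available (function and variable names are illustrative, not the authors' code):

```python
import numpy as np

def percent_dense_area(image: np.ndarray, breast_mask: np.ndarray, threshold: float) -> float:
    """Area of dense tissue (pixels above the threshold) as a percentage
    of the segmented breast area."""
    breast_pixels = image[breast_mask.astype(bool)]
    return 100.0 * float((breast_pixels > threshold).sum()) / breast_pixels.size
```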

  5. Computerized Analysis of MR and Ultrasound Images of Breast Lesions

    National Research Council Canada - National Science Library

    Giger, Maryellen Lissak

    2000-01-01

    ...) images of breast lesions to aid radiologists in their workup of suspect lesions. We currently have retrospectively collected over 400 ultrasound cases of mass lesions, all that had gone on to either biopsy or cyst aspiration...

  6. SU-E-J-275: Review - Computerized PET/CT Image Analysis in the Evaluation of Tumor Response to Therapy

    International Nuclear Information System (INIS)

    Lu, W; Wang, J; Zhang, H

    2015-01-01

    Purpose: To review the literature on the use of computerized PET/CT image analysis for the evaluation of tumor response to therapy. Methods: We reviewed and summarized more than 100 papers that used computerized image analysis techniques for the evaluation of tumor response with PET/CT. This review mainly covered four aspects: image registration, tumor segmentation, image feature extraction, and response evaluation. Results: Although rigid image registration is straightforward, it has been shown to achieve good alignment between baseline and evaluation scans. Deformable image registration has been shown to improve the alignment when complex deformable distortions occur due to tumor shrinkage, weight loss or gain, and motion. Many semi-automatic tumor segmentation methods have been developed for PET. A comparative study revealed benefits of high levels of user interaction with simultaneous visualization of CT images and PET gradients. On CT, semi-automatic methods have been developed only for tumors that show a marked difference in CT attenuation between the tumor and the surrounding normal tissues. Quite a few multi-modality segmentation methods have been shown to improve accuracy compared to single-modality algorithms. Advanced PET image features that consider spatial information, such as tumor volume, tumor shape, total glycolytic volume, histogram distance, and texture features, have been found more informative than the traditional SUVmax for the prediction of tumor response. Advanced CT features, including volumetric, attenuation, morphologic, structure, and texture descriptors, have also been found advantageous over the traditional RECIST and WHO criteria in certain tumor types. Predictive models based on machine learning techniques have been constructed for correlating selected image features to response. These models showed improved performance compared to current methods that use a cutoff value of a single measurement for tumor response. Conclusion: This review showed that
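    As an illustration of the PET response features mentioned above, the sketch below computes SUVmax, SUVmean, tumor volume and a total-lesion-glycolysis value from an SUV image and a binary tumor segmentation. Treating "total glycolytic volume" as SUVmean × volume is one common convention and an assumption here, as are the function and argument names.

```python
import numpy as np

def pet_response_features(suv_image: np.ndarray, tumor_mask: np.ndarray,
                          voxel_volume_ml: float) -> dict:
    """A few PET features used for response assessment, computed from an
    SUV image and a binary tumor mask defined on the same voxel grid."""
    suv = suv_image[tumor_mask.astype(bool)]
    volume_ml = suv.size * voxel_volume_ml
    return {
        "SUVmax": float(suv.max()),
        "SUVmean": float(suv.mean()),
        "tumor_volume_ml": float(volume_ml),
        "TLG": float(suv.mean() * volume_ml),  # total lesion glycolysis
    }
```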

  7. SU-E-J-275: Review - Computerized PET/CT Image Analysis in the Evaluation of Tumor Response to Therapy

    Energy Technology Data Exchange (ETDEWEB)

    Lu, W; Wang, J; Zhang, H [University of Maryland School of Medicine, Baltimore, MD (United States)

    2015-06-15

    Purpose: To review the literature on the use of computerized PET/CT image analysis for the evaluation of tumor response to therapy. Methods: We reviewed and summarized more than 100 papers that used computerized image analysis techniques for the evaluation of tumor response with PET/CT. This review mainly covered four aspects: image registration, tumor segmentation, image feature extraction, and response evaluation. Results: Although rigid image registration is straightforward, it has been shown to achieve good alignment between baseline and evaluation scans. Deformable image registration has been shown to improve the alignment when complex deformable distortions occur due to tumor shrinkage, weight loss or gain, and motion. Many semi-automatic tumor segmentation methods have been developed for PET. A comparative study revealed benefits of high levels of user interaction with simultaneous visualization of CT images and PET gradients. On CT, semi-automatic methods have been developed only for tumors that show a marked difference in CT attenuation between the tumor and the surrounding normal tissues. Quite a few multi-modality segmentation methods have been shown to improve accuracy compared to single-modality algorithms. Advanced PET image features that consider spatial information, such as tumor volume, tumor shape, total glycolytic volume, histogram distance, and texture features, have been found more informative than the traditional SUVmax for the prediction of tumor response. Advanced CT features, including volumetric, attenuation, morphologic, structure, and texture descriptors, have also been found advantageous over the traditional RECIST and WHO criteria in certain tumor types. Predictive models based on machine learning techniques have been constructed for correlating selected image features to response. These models showed improved performance compared to current methods that use a cutoff value of a single measurement for tumor response. Conclusion: This review showed that

  8. Computerized nipple identification for multiple image analysis in computer-aided diagnosis

    International Nuclear Information System (INIS)

    Zhou Chuan; Chan Heangping; Paramagul, Chintana; Roubidoux, Marilyn A.; Sahiner, Berkman; Hadjiiski, Labomir M.; Petrick, Nicholas

    2004-01-01

    Correlation of information from multiple-view mammograms (e.g., MLO and CC views, bilateral views, or current and prior mammograms) can improve the performance of breast cancer diagnosis by radiologists or by computer. The nipple is a reliable and stable landmark on mammograms for the registration of multiple mammograms. However, accurate identification of nipple location on mammograms is challenging because of the variations in image quality and in the nipple projections, resulting in some nipples being nearly invisible on the mammograms. In this study, we developed a computerized method to automatically identify the nipple location on digitized mammograms. First, the breast boundary was obtained using a gradient-based boundary tracking algorithm, and then the gray level profiles along the inside and outside of the boundary were identified. A geometric convergence analysis was used to limit the nipple search to a region of the breast boundary. A two-stage nipple detection method was developed to identify the nipple location using the gray level information around the nipple, the geometric characteristics of nipple shapes, and the texture features of glandular tissue or ducts which converge toward the nipple. At the first stage, a rule-based method was designed to identify the nipple location by detecting significant changes of intensity along the gray level profiles inside and outside the breast boundary and the changes in the boundary direction. At the second stage, a texture orientation-field analysis was developed to estimate the nipple location based on the convergence of the texture pattern of glandular tissue or ducts towards the nipple. The nipple location was finally determined from the detected nipple candidates by a rule-based confidence analysis. In this study, 377 and 367 randomly selected digitized mammograms were used for training and testing the nipple detection algorithm, respectively. Two experienced radiologists identified the nipple locations
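    The second-stage texture orientation field can be illustrated with a simple gradient-based estimate: smooth the image, take directional derivatives, and rotate the gradient direction by 90° to obtain the local texture orientation. This is only a generic sketch of an orientation field, not the specific gradient-based analysis used in the paper, and the smoothing scale is an assumed parameter.

```python
import numpy as np
from scipy import ndimage

def texture_orientation_field(image: np.ndarray, sigma: float = 2.0) -> np.ndarray:
    """Per-pixel texture orientation (radians) from Gaussian-smoothed
    derivatives; linear structures lie roughly perpendicular to the gradient."""
    gx = ndimage.gaussian_filter(image, sigma, order=[0, 1])  # d/dx (columns)
    gy = ndimage.gaussian_filter(image, sigma, order=[1, 0])  # d/dy (rows)
    return np.arctan2(gy, gx) + np.pi / 2.0
```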

  9. Methods for processing and analysis functional and anatomical brain images: computerized tomography, emission tomography and nuclear resonance imaging

    International Nuclear Information System (INIS)

    Mazoyer, B.M.

    1988-01-01

    The various methods for brain image processing and analysis are presented and compared. The following topics are developed: the physical basis of brain image comparison (nature and formation of signals, intrinsic performance of the methods, image characteristics); mathematical methods for image processing and analysis (filtering, functional parameter extraction, morphological analysis, robotics and artificial intelligence); methods for anatomical localization (neuro-anatomy atlas, proportional stereotaxic atlas, digitized atlas); methodology of cerebral image superposition (normalization, registration); image networks. (original in French)

  10. Reduction of false-positive recalls using a computerized mammographic image feature analysis scheme

    Science.gov (United States)

    Tan, Maxine; Pu, Jiantao; Zheng, Bin

    2014-08-01

    The high false-positive recall rate is one of the major dilemmas that significantly reduce the efficacy of screening mammography, which harms a large fraction of women and increases healthcare cost. This study aims to investigate the feasibility of helping reduce false-positive recalls by developing a new computer-aided diagnosis (CAD) scheme based on the analysis of global mammographic texture and density features computed from four-view images. Our database includes full-field digital mammography (FFDM) images acquired from 1052 recalled women (669 positive for cancer and 383 benign). Each case has four images: two craniocaudal (CC) and two mediolateral oblique (MLO) views. Our CAD scheme first computed global texture features related to the mammographic density distribution on the segmented breast regions of the four images. Second, the computed features were given to two artificial neural network (ANN) classifiers that were separately trained and tested in a ten-fold cross-validation scheme on CC and MLO view images, respectively. Finally, the two ANN classification scores were combined using a new adaptive scoring fusion method that automatically determined the optimal weights to assign to both views. CAD performance was tested using the area under a receiver operating characteristic curve (AUC). An AUC of 0.793 ± 0.026 was obtained for this four-view CAD scheme, which was significantly higher at the 5% significance level than the AUCs achieved when using only CC (p = 0.025) or MLO (p = 0.0004) view images, respectively. This study demonstrates that a quantitative assessment of global mammographic image texture and density features could provide useful and/or supplementary information to distinguish between malignant and benign cases among recalled women, which may eventually help reduce the false-positive recall rate in screening mammography.
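    The abstract does not spell out how the adaptive scoring fusion determines the optimal view weights; a simple stand-in is a grid search for the CC/MLO weight that maximizes the AUC on the training folds (the search itself, and the names, are assumptions for illustration):

```python
import numpy as np
from sklearn.metrics import roc_auc_score

def fuse_view_scores(score_cc: np.ndarray, score_mlo: np.ndarray, labels: np.ndarray):
    """Search a weight w in [0, 1] so that w*CC + (1-w)*MLO maximizes the
    area under the ROC curve; returns the weight and the resulting AUC."""
    best_w, best_auc = 0.5, 0.0
    for w in np.linspace(0.0, 1.0, 101):
        auc = roc_auc_score(labels, w * score_cc + (1.0 - w) * score_mlo)
        if auc > best_auc:
            best_w, best_auc = w, auc
    return best_w, best_auc
```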

  11. Computerized ECT data analysis system

    International Nuclear Information System (INIS)

    Miyake, Y.; Fukui, S.; Iwahashi, Y.; Matsumoto, M.; Koyama, K.

    1988-01-01

    To analyze eddy current testing (ECT) data from steam generator tubes in nuclear power plants, the authors have developed a computerized ECT data analysis system using a large-scale computer with a high-resolution color graphic display. This system can store acquired ECT data for up to 15 steam generators, and ECT data can be analyzed immediately on the monitor in interactive communication with the computer. Analyzed ECT results are stored and registered in a database. The system enables an analyst to sort and collect data under various conditions and obtain the results automatically, and also to plan tube repair work. The system has completed its test run and has been used for data analysis in the annual inspection of domestic plants. This paper describes an outline, features and examples of the computerized eddy current data analysis system for steam generator tubes in PWR nuclear power plants.

  12. Optimization and objective and subjective analysis of thorax image for computerized radiology

    International Nuclear Information System (INIS)

    Velo, Alexandre F.; Miranda, Jose Ricardo A.

    2013-01-01

    This research aimed at optimizing computational chest radiographic images in the posteroanterior (PA) projection. To this end, we used a homogeneous patient-equivalent phantom to calibrate the computational imaging system, in order to obtain a signal-to-noise ratio satisfactory for diagnosis while keeping the dose received by the patient to a minimum. The techniques were then applied to an anthropomorphic phantom (RANDO). The images obtained were evaluated by a radiologist, who identified the best image for determining possible pathologies (fracture or pneumonia). The techniques were quantified objectively using the detective quantum efficiency (DQE), modulation transfer function (MTF) and noise power spectrum (NPS). Comparing the optimized techniques with the clinical routine, it is concluded that all provide doses below reference levels. However, the best technique for visualizing possible pneumonia and/or fracture was determined based on the 3D criterion (Dose, Diagnostic, Dollar) and regarded as the gold standard. This image presented reductions in dose and tube loading of around 70.5% and 80%, respectively, compared with the clinical routine.

  13. Thrombin effectuates therapeutic arteriogenesis in the rabbit hindlimb ischemia model: A quantitative analysis by computerized in vivo imaging

    International Nuclear Information System (INIS)

    Kagadis, George C.; Karnabatidis, Dimitrios; Katsanos, Konstantinos; Diamantopoulos, Athanassios; Samaras, Nikolaos; Maroulis, John; Siablis, Dimitrios; Nikiforidis, George C.

    2006-01-01

    We report on an experimental mammalian controlled study that documents arteriogenic capacity of thrombin and utilizes computerized algorithms to quantify the newly formed vessels. Hindlimb ischemia was surgically invoked in 10 New Zealand white rabbits. After quiescence of endogenous angiogenesis heterologous bovine thrombin was intramuscularly injected (1500 units) in one hindlimb per rabbit (Group T). Contralateral limbs were infused with normal saline (Group C). Digital subtraction angiography (DSA) of both limbs was performed after thrombin infusion by selective cannulation of the abdominal aorta and digital images were post-processed with computerized algorithms in order to enhance newly formed vessels. Total vessel area and total vessel length were quantified. In vivo functional evaluation included measurements of blood flow volume at the level of the external iliac artery by Doppler ultrasonography both at baseline and at 20 days after thrombin infusion. Total vessel area and length (in pixels) were 14,713+/-1023 and 5466+/-1327 in group T versus 12,015+/-2557 and 4598+/-1269 in group C (p=0.0062 and 0.1526, respectively). Blood flow volumes (ml/min) at baseline and at 20 days after thrombin infusion were 25.87+/-11.09 and 38.06+/-11.72 in group T versus 26.57+/-11.19 and 20.35+/-7.20 in group C (p=0.8898 and 0.0007, respectively). Intramuscular thrombin effectuates an arteriogenic response in the rabbit hindlimb ischemia model. Computerized algorithms may enable accurate quantification of the neovascularization outcome

  14. Measurement and analysis of noise power spectrum of computerized tomography in images

    International Nuclear Information System (INIS)

    Castro Tejero, P.; Garayoa Roca, J.

    2013-01-01

    This paper examines the use of the noise power spectrum (NPS) as a metric to characterize noise, in both magnitude and texture, in CT images. The measured NPS show that for convolution filters that apply stronger smoothing to the reconstructed image the spectrum is concentrated at low frequencies, while for sharp filters the spectrum extends to high frequencies. In the analyzed cases there is a low-frequency component, largely due to structured noise, which can have a potentially negative effect on the detectability of lesions. (Author)
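    A minimal sketch of a 2D NPS estimate from uniform-region ROIs, following the usual ensemble-average definition NPS(u,v) = (Δx·Δy / N²)·⟨|DFT of the detrended ROI|²⟩. Square ROIs, square pixels and a simple mean-subtraction detrend are assumptions; published procedures often subtract a fitted low-order 2D polynomial instead.

```python
import numpy as np

def nps_2d(rois, pixel_spacing_mm: float) -> np.ndarray:
    """2D noise power spectrum (HU^2 * mm^2) from a list of N x N ROIs
    taken from uniform regions of repeated CT scans."""
    n = rois[0].shape[0]
    accumulator = np.zeros((n, n))
    for roi in rois:
        detrended = roi - roi.mean()               # crude detrending
        accumulator += np.abs(np.fft.fft2(detrended)) ** 2
    return (pixel_spacing_mm ** 2 / (n * n)) * accumulator / len(rois)
```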

  15. Stratification of mammographic computerized analysis by BI-RADS categories

    Energy Technology Data Exchange (ETDEWEB)

    Lederman, Richard [Department of Radiology, Hadassah University Hospital, Ein Kerem, Jerusalem (Israel); Leichter, Isaac [Department of Electro-Optics, Jerusalem College of Technology, P.O.B. 16031, Jerusalem (Israel); Buchbinder, Shalom [Department of Radiology of The Montefiore Medical Center, The University Hospital for the Albert Einstein College of Medicine, Bronx, New York (United States); Novak, Boris [Department of Applied Mathematics, Jerusalem College of Technology, P.O.B. 16031, Jerusalem 91160 (Israel); Bamberger, Philippe [Department of Electronics, Jerusalem College of Technology, POB 16031, Jerusalem (Israel); Fields, Scott [Department of Radiology, Hadassah University Hospital, Mt. Scopus, Jerusalem (Israel)

    2003-02-01

    The Breast Imaging Reporting and Data System (BI-RADS) was implemented to standardize the characterization of mammographic findings. The purpose of the present study was to evaluate in which BI-RADS categories the changes recommended by computerized mammographic analysis are most beneficial. Archival cases, including 170 masses (101 malignant, 69 benign) and 63 clusters of microcalcifications (MCs; 36 malignant, 27 benign), were evaluated retrospectively, using the BI-RADS categories, by several radiologists blinded to the pathology results. A computerized system then automatically extracted from the digitized mammogram features characterizing mammographic lesions, which were used to classify the lesions. The results of the computerized classification scheme were compared, by receiver operating characteristic (ROC) analysis, to the conventional interpretation. In the "low probability of malignancy" group (excluding BI-RADS categories 4 and 5), computerized analysis improved the A_z of the ROC curve significantly, from 0.57 to 0.89. In the "high probability of malignancy" group (mostly category 5) the computerized analysis yielded an ROC curve with an A_z of 0.99. In the "intermediate probability of malignancy" group computerized analysis improved the A_z significantly, from 0.66 to 0.83. Pair-wise analysis showed that in the latter group the modifications resulting from computerized analysis were correct in 83% of cases. Computerized analysis has the ability to improve the performance of the radiologists exactly in the BI-RADS categories with the greatest difficulties in arriving at a correct diagnosis. It increased the performance significantly in the problematic "intermediate probability of malignancy" group and pinpointed all the cases with missed cancers in the "low probability" group. (orig.)

  16. Stratification of mammographic computerized analysis by BI-RADS categories

    International Nuclear Information System (INIS)

    Lederman, Richard; Leichter, Isaac; Buchbinder, Shalom; Novak, Boris; Bamberger, Philippe; Fields, Scott

    2003-01-01

    The Breast Imaging Reporting and Data System (BI-RADS) was implemented to standardize the characterization of mammographic findings. The purpose of the present study was to evaluate in which BI-RADS categories the changes recommended by computerized mammographic analysis are most beneficial. Archival cases, including 170 masses (101 malignant, 69 benign) and 63 clusters of microcalcifications (MCs; 36 malignant, 27 benign), were evaluated retrospectively, using the BI-RADS categories, by several radiologists blinded to the pathology results. A computerized system then automatically extracted from the digitized mammogram features characterizing mammographic lesions, which were used to classify the lesions. The results of the computerized classification scheme were compared, by receiver operating characteristic (ROC) analysis, to the conventional interpretation. In the "low probability of malignancy" group (excluding BI-RADS categories 4 and 5), computerized analysis improved the A_z of the ROC curve significantly, from 0.57 to 0.89. In the "high probability of malignancy" group (mostly category 5) the computerized analysis yielded an ROC curve with an A_z of 0.99. In the "intermediate probability of malignancy" group computerized analysis improved the A_z significantly, from 0.66 to 0.83. Pair-wise analysis showed that in the latter group the modifications resulting from computerized analysis were correct in 83% of cases. Computerized analysis has the ability to improve the performance of the radiologists exactly in the BI-RADS categories with the greatest difficulties in arriving at a correct diagnosis. It increased the performance significantly in the problematic "intermediate probability of malignancy" group and pinpointed all the cases with missed cancers in the "low probability" group. (orig.)

  17. Human Reliability Analysis For Computerized Procedures

    International Nuclear Information System (INIS)

    Boring, Ronald L.; Gertman, David I.; Le Blanc, Katya

    2011-01-01

    This paper provides a characterization of human reliability analysis (HRA) issues for computerized procedures in nuclear power plant control rooms. It is beyond the scope of this paper to propose a new HRA approach or to recommend specific methods or refinements to those methods. Rather, this paper provides a review of HRA as applied to traditional paper-based procedures, followed by a discussion of what specific factors should additionally be considered in HRAs for computerized procedures. Performance shaping factors and failure modes unique to computerized procedures are highlighted. Since there is no definitive guide to HRA for paper-based procedures, this paper also serves to clarify the existing guidance on paper-based procedures before delving into the unique aspects of computerized procedures.

  18. Cost-effectiveness analysis of 3-D computerized tomography colonography versus optical colonoscopy for imaging symptomatic gastroenterology patients.

    Science.gov (United States)

    Gomes, Manuel; Aldridge, Robert W; Wylie, Peter; Bell, James; Epstein, Owen

    2013-04-01

    When symptomatic gastroenterology patients have an indication for colonic imaging, clinicians have a choice between optical colonoscopy (OC) and computerized tomography colonography with three-dimensional reconstruction (3-D CTC). 3-D CTC provides a minimally invasive and rapid evaluation of the entire colon, and it can be an efficient modality for diagnosing symptoms. It allows a more targeted use of OC, which is associated with a higher risk of major adverse events and higher procedural costs. A case can be made for 3-D CTC as a primary test for colonic imaging, followed if necessary by targeted therapeutic OC; however, the relative long-term costs and benefits of introducing 3-D CTC as a first-line investigation are unknown. The aim of this study was to assess the cost effectiveness of 3-D CTC versus OC for colonic imaging of symptomatic gastroenterology patients in the UK NHS. We used a Markov model to follow a cohort of 100,000 symptomatic gastroenterology patients, aged 50 years or older, and estimate the expected lifetime outcomes, life years (LYs) and quality-adjusted life years (QALYs), and costs (£, 2010-2011) associated with 3-D CTC and OC. Sensitivity analyses were performed to assess the robustness of the base-case cost-effectiveness results to variation in input parameters and methodological assumptions. 3-D CTC provided a similar number of LYs (7.737 vs 7.739) and QALYs (7.013 vs 7.018) per individual compared with OC, and it was associated with substantially lower mean costs per patient (£467 vs £583), leading to a positive incremental net benefit. After accounting for the overall uncertainty, the probability of 3-D CTC being cost effective was around 60%, at typical willingness-to-pay values of £20,000-£30,000 per QALY gained. 3-D CTC is a cost-saving and cost-effective option for colonic imaging of symptomatic gastroenterology patients compared with OC.
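    The structure of such a Markov cohort evaluation can be sketched in a few lines: propagate a cohort distribution through a per-cycle transition matrix and accumulate discounted costs and QALYs. The states, transition probabilities and the 3.5% discount rate below are placeholders, not the published model inputs.

```python
import numpy as np

def markov_cohort(transition: np.ndarray, cycle_costs: np.ndarray,
                  cycle_utilities: np.ndarray, cycles: int,
                  discount: float = 0.035):
    """Discounted expected cost and QALYs per patient for a simple Markov
    cohort model; `transition` is an S x S per-cycle transition matrix."""
    dist = np.zeros(transition.shape[0])
    dist[0] = 1.0                         # whole cohort starts in state 0
    total_cost = total_qaly = 0.0
    for t in range(cycles):
        d = 1.0 / (1.0 + discount) ** t
        total_cost += d * dist @ cycle_costs
        total_qaly += d * dist @ cycle_utilities
        dist = dist @ transition          # advance the cohort one cycle
    return total_cost, total_qaly
```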

  19. Computerization of the safeguards analysis decision process

    International Nuclear Information System (INIS)

    Ehinger, M.H.

    1990-01-01

    This paper reports that safeguards regulations are evolving to meet new demands for timeliness and sensitivity in detecting the loss or unauthorized use of sensitive nuclear materials. The opportunities to meet new rules, particularly in bulk processing plants, involve developing techniques which use modern, computerized process control and information systems. Using these computerized systems in the safeguards analysis involves all the challenges of the man-machine interface experienced in the typical process control application and adds new dimensions to accuracy requirements, data analysis, and alarm resolution in the regulatory environment

  20. Some aspects of evaluation of image quality in computerized tomography

    International Nuclear Information System (INIS)

    Travassos, Paulo Cesar Baptista; Peixoto, Jose Guilherme; Almeida, Carlos Eduardo Veloso de; Campos, Luciana Tourinho; Magalhaes, Luis Alexandre

    2016-01-01

    The evaluation of CT scanner image quality includes measuring Hounsfield unit (HU) values against a table of limit values. This table does not take into account that different devices have different effective energies, which may lead to false results. The evaluation of 90 computerized tomography scanners, following the American College of Radiology methodology, showed that some devices that failed the evaluation nevertheless presented an excellent linear fit between the linear attenuation coefficients calculated for the actual energy used and the HU values. The analysis of the coefficient of determination suggests that 10 of these devices could have been approved. (author)
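    The linearity check described here amounts to fitting HU against the attenuation coefficients calculated at the scanner's effective energy and inspecting the coefficient of determination. A minimal sketch (input names are illustrative):

```python
import numpy as np

def hu_linearity(mu, hu):
    """Fit HU = a*mu + b for a set of phantom inserts, where mu is the
    linear attenuation coefficient at the effective energy; returns the
    slope, intercept and coefficient of determination R^2."""
    mu = np.asarray(mu, dtype=float)
    hu = np.asarray(hu, dtype=float)
    a, b = np.polyfit(mu, hu, 1)
    residuals = hu - (a * mu + b)
    r2 = 1.0 - (residuals ** 2).sum() / ((hu - hu.mean()) ** 2).sum()
    return a, b, r2
```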

  1. Computerized occlusal analysis in bruxism

    Directory of Open Access Journals (Sweden)

    Lazić Vojkan

    2006-01-01

    Introduction. Sleep bruxism as a nocturnal parafunction, also known as tooth grinding, is the most common parasomnia (sleep disorder). Most tooth grinding occurs during rapid eye movement (REM) sleep. Sleep bruxism is an oral habit characterized by rhythmic activity of the masticatory muscles (m. masseter) that causes forced contact between dental surfaces during sleep. Sleep bruxism has been associated with craniomandibular disorders including temporomandibular joint discomfort, pulpalgia, premature loss of teeth due to excessive attrition and mobility, headache, muscle ache, sleep interruption and problems with removable and fixed dentures. Basically, two groups of etiological factors can be distinguished: peripheral (occlusal) factors and central (pathophysiological and psychological) factors. The role of occlusion (occlusal discrepancies) as a causative factor is not sufficiently addressed in relation to bruxism. Objective. The main objective of this paper was to evaluate the connection between occlusal factors and nocturnal parafunctional activities (occlusal disharmonies and bruxism). Method. Two groups were formed: an experimental group of 15 persons with signs and symptoms of nocturnal parafunctional activity of the mandible (mean age 26.6 years) and a control group of 42 persons with no signs and symptoms of bruxism (mean age 26.3 years). The computerized occlusal analyses were performed using the T-Scan II system (Tekscan, Boston, USA). 2D occlusograms were analyzed showing the occlusal force, the center of the occlusal force with its trajectory, and the number of antagonistic tooth contacts. Results. A statistically significant difference in force distribution was found between the left and the right side of the arch (L%-R%) (t=2.773; p<0.02) in the group with bruxism. The difference in the centre of occlusal force (COF) trajectory between the experimental and control group was not significant, but the trajectory of COF was longer in the group of

  2. Her-2/neu expression in node-negative breast cancer: direct tissue quantitation by computerized image analysis and association of overexpression with increased risk of recurrent disease.

    Science.gov (United States)

    Press, M F; Pike, M C; Chazin, V R; Hung, G; Udove, J A; Markowicz, M; Danyluk, J; Godolphin, W; Sliwkowski, M; Akita, R

    1993-10-15

    levels. By using cells with defined expression levels as calibration material, computerized image analysis of immunohistochemical staining could be used to determine the amount of oncoprotein product in these cell lines as well as in human breast cancer specimens. Quantitation of the amount of HER-2/neu protein product determined by computerized image analysis of immunohistochemical assays correlated very closely with quantitative analysis of a series of molecularly characterized breast cancer cell lines and breast cancer tissue specimens.(ABSTRACT TRUNCATED AT 400 WORDS)

  3. Computerized image analysis: Texture-field orientation method for pectoral muscle identification on MLO-view mammograms

    International Nuclear Information System (INIS)

    Zhou Chuan; Wei Jun; Chan, Heang-Ping; Paramagul, Chintana; Hadjiiski, Lubomir M.; Sahiner, Berkman; Douglas, Julie A.

    2010-01-01

    Purpose: To develop a new texture-field orientation (TFO) method that combines a priori knowledge, local and global information for the automated identification of pectoral muscle on mammograms. Methods: The authors designed a gradient-based directional kernel (GDK) filter to enhance the linear texture structures, and a gradient-based texture analysis to extract a texture orientation image that represented the dominant texture orientation at each pixel. The texture orientation image was enhanced by a second GDK filter for ridge point extraction. The extracted ridge points were validated and the ridges that were less likely to lie on the pectoral boundary were removed automatically. A shortest-path finding method was used to generate a probability image that represented the likelihood that each remaining ridge point lay on the true pectoral boundary. Finally, the pectoral boundary was tracked by searching for the ridge points with the highest probability lying on the pectoral boundary. A data set of 130 MLO-view digitized film mammograms (DFMs) from 65 patients was used to train the TFO algorithm. An independent data set of 637 MLO-view DFMs from 562 patients was used to evaluate its performance. Another independent data set of 92 MLO-view full field digital mammograms (FFDMs) from 92 patients was used to assess the adaptability of the TFO algorithm to FFDMs. The pectoral boundary detection accuracy of the TFO method was quantified by comparison with an experienced radiologist's manually drawn pectoral boundary using three performance metrics: The percent overlap area (POA), the Hausdorff distance (Hdist), and the average distance (AvgDist). Results: The mean and standard deviation of POA, Hdist, and AvgDist were 95.0±3.6%, 3.45±2.16 mm, and 1.12±0.82 mm, respectively. For the POA measure, 91.5%, 97.3%, and 98.9% of the computer detected pectoral muscles had POA larger than 90%, 85%, and 80%, respectively. For the distance measures, 85.4% and 98.0% of the
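    The evaluation metrics reported above can be reproduced with a few lines of NumPy/SciPy. POA is computed here as intersection over union, which is one common definition and an assumption (the paper may define it relative to the manual region); the boundary distance uses SciPy's directed Hausdorff distance and an assumed isotropic pixel size.

```python
import numpy as np
from scipy.spatial.distance import directed_hausdorff

def percent_overlap_area(mask_auto: np.ndarray, mask_manual: np.ndarray) -> float:
    """Percent overlap between computer-detected and manually drawn
    pectoral regions (intersection over union)."""
    a, b = mask_auto.astype(bool), mask_manual.astype(bool)
    return 100.0 * float(np.logical_and(a, b).sum()) / float(np.logical_or(a, b).sum())

def hausdorff_mm(boundary_auto: np.ndarray, boundary_manual: np.ndarray,
                 pixel_size_mm: float) -> float:
    """Symmetric Hausdorff distance (mm) between two boundaries given as
    (N, 2) arrays of pixel coordinates."""
    d_ab = directed_hausdorff(boundary_auto, boundary_manual)[0]
    d_ba = directed_hausdorff(boundary_manual, boundary_auto)[0]
    return pixel_size_mm * max(d_ab, d_ba)
```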

  4. Computerized Buckling Analysis of Shells

    Science.gov (United States)

    1981-06-01

    Simple Examples to Illustrate Various Types of Buckling. Column Buckling: In order to make the discussion of the basic concepts introduced in connec... the optimum design of a square box column obtained from an analysis in which the effective width concept is used and collapse is assumed to occur... nology, Delft, pp. 335-344 (1969). Save, M., "Experimental verification of the plastic limit analysis of mild steel plates and shells" (original title in French).

  5. Radiographic analysis of body composition by computerized axial tomography

    International Nuclear Information System (INIS)

    Heymsfield, S.B.

    1986-01-01

    Radiographic methods of evaluating body composition have been applied for over five decades. A marked improvement in this approach occurred in the mid-nineteen-seventies with the introduction of computerized axial tomography. High image contrast, cross-sectional imaging and rapid computerized data processing make this technique a sophisticated clinically applicable tool. (author)

  6. Use of computerized tomography and magnetic resonance imaging

    International Nuclear Information System (INIS)

    Tjoerstad, K.

    1992-01-01

    This is a neurologist's opinion on how computerized tomography and magnetic resonance imaging have improved the doctor's diagnostic possibilities, changed patient/doctor relationship and increased the patients' expectations from diagnostic tests. How should the often conflicting interests of patients, society and doctors be handled? 15 refs., 1 fig., 1 tab

  7. Computerized radionuclidic analysis in production facilities

    International Nuclear Information System (INIS)

    Gibbs, A.

    1978-03-01

    The Savannah River Plant Laboratories Department has been using a dual computer system to control all radionuclidic pulse height analyses since 1971. This computerized system analyzes 7000 to 8000 samples per month and has allowed the counting room staff to be reduced from three persons to one. More reliable process information is being returned to the production facilities and for environmental evaluations, and it is being returned faster, even though the sample load has more than tripled. This information is now more easily retrievable for other evaluations. The computer is also used for mass spectrometer data reduction and for quality control data analysis. The basic system is being expanded by interfacing microcomputers which provide data input from all of the laboratory modules for quality assurance programs.

  8. Computerized Analysis and Detection of Missed Cancer in Screening Mammogram

    National Research Council Canada - National Science Library

    Li, Lihua

    2005-01-01

    This project is to explore an innovative CAD strategy for improving early detection of breast cancer in screening mammograms by focusing on computerized analysis and detection of cancers missed by radiologists...

  9. Computerized Analysis and Detection of Missed Cancer in Screening Mammogram

    National Research Council Canada - National Science Library

    Li, Lihua

    2004-01-01

    This project is to explore an innovative CAD strategy for improving early detection of breast cancer in screening mammograms by focusing on computerized analysis and detection of cancers missed by radiologists...

  10. Analysis of concrete material through gamma ray computerized tomography

    International Nuclear Information System (INIS)

    Oliveira Junior, J.M. de

    2004-01-01

    Computerized tomography (CT) refers to the cross-sectional imaging of an object from either transmission or reflection data collected by illuminating the object from many different directions. The most important contribution of CT is to greatly improve the ability to distinguish regions with different gamma-ray transmittance and to separate overlying structures. The mathematical problem of CT imaging is that of estimating an image from its projections. These projections can represent, for example, the linear attenuation coefficient of γ-rays along the path of the ray. In this work we present some new results obtained by using tomographic techniques to analyze column samples of concrete in order to check the distribution of the various materials and to detect structural problems. These concrete samples were made using different proportions of stone, sand and cement. Another set of samples with different proportions of sand and cement was also used to verify the outcome of the CT analysis and the differences between them. The samples were prepared at the Material Laboratory of Faculdade de Engenharia de Sorocaba, following the same procedures used in real concrete tests. The projections used in this work were obtained with the Mini Computerized Tomograph of Uniso (MTCU), located at the Experimental Nuclear Physics Laboratory of the University of Sorocaba. This tomograph operates with a ²⁴¹Am gamma-ray source (60 keV photons, 100 mCi activity) and a NaI(Tl) solid state detector. The system features translation and rotation scanning modes, a 100 mm effective field of view, and 1 mm spatial resolution. The image reconstruction problem is solved using discrete filtered backprojection (FBP). (author)
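    The filtered back-projection step can be illustrated with scikit-image on a synthetic slice; the phantom values and the use of skimage's radon/iradon pair are stand-ins for the MTCU projection data and the paper's own FBP implementation.

```python
import numpy as np
from skimage.transform import radon, iradon

# Synthetic "concrete" slice: a matrix of moderate attenuation containing a
# denser stone inclusion (relative attenuation values are arbitrary).
phantom = np.zeros((128, 128))
phantom[32:96, 32:96] = 0.2
phantom[50:70, 50:70] = 0.7

angles = np.linspace(0.0, 180.0, 180, endpoint=False)
sinogram = radon(phantom, theta=angles)                  # simulated projections
reconstruction = iradon(sinogram, theta=angles, filter_name="ramp")
```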

  11. Computerized detection of lacunar infarcts in brain MR images

    International Nuclear Information System (INIS)

    Uchiyama, Yoshikazu; Matsui, Atsushi; Yokoyama, Ryujiro

    2007-01-01

    Asymptomatic lacunar infarcts are often found in the Brain Dock. The presence of asymptomatic lacunar infarcts increases the risk of serious cerebral infarction. Thus, it is an important task for radiologists and/or neurosurgeons to detect asymptomatic lacunar infarcts in MRI images. However, it is difficult for radiologists and/or neurosurgeons to identify lacunar infarcts correctly in MRI images, because it is hard to distinguish between lacunar infarcts and enlarged Virchow-Robin spaces. Therefore, the purpose of our study was to develop a computer-aided diagnosis scheme for the detection of lacunar infarcts in order to assist radiologists' and/or neurosurgeons' interpretation as a "second opinion." Our database consisted of 1143 T2-weighted MR images and 1143 T1-weighted MR images, which were selected from 132 patients. First, we segmented the cerebral parenchyma region by use of a region growing technique. The white top-hat transformation was then applied for enhancement of lacunar infarcts. Multiple-phase binarization was used for identifying initial candidates of lacunar infarcts. For removal of false positives (FPs), 12 features were determined in each of the initial candidates in the T2- and T1-weighted MR images. Rule-based schemes and an artificial neural network with these features were used for distinguishing between lacunar infarcts and FPs. The sensitivity of detection of lacunar infarcts was 96.8% (90/93) with 0.69 (737/1063) FP per image. This computerized method may be useful for radiologists and/or neurosurgeons in detecting lacunar infarcts in MRI images. (author)
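    The enhancement step named in the abstract, the white top-hat transform, is available directly in scikit-image; the structuring-element radius below is an assumed value chosen to be somewhat larger than a typical lacune, not the parameter used by the authors.

```python
import numpy as np
from skimage.morphology import white_tophat, disk

def enhance_lacunar_candidates(t2_slice: np.ndarray, radius: int = 5) -> np.ndarray:
    """White top-hat transform: keeps small bright structures (candidate
    lacunar infarcts) and suppresses the larger-scale background."""
    return white_tophat(t2_slice, footprint=disk(radius))
```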

  12. Image reconstruction in computerized tomography using the convolution method

    International Nuclear Information System (INIS)

    Oliveira Rebelo, A.M. de.

    1984-03-01

    In the present work an algorithm was derived, using the analytical convolution method (filtered back-projection), for two-dimensional or three-dimensional image reconstruction in computerized tomography applied to non-destructive testing and to medical use. This mathematical model is based on the analytical Fourier transform method for image reconstruction. The model consists of a discrete system formed by an N×N array of cells (pixels). The attenuation of a collimated gamma-ray beam in the object under study was determined for various positions and incidence angles (projections) in terms of the interaction of the beam with the intercepted pixels. The contribution of each pixel to beam attenuation was determined using the weight function W_ij, which was used for the simulated tests. Simulated tests using standard objects with attenuation coefficients in the range of 0.2 to 0.7 cm⁻¹ were carried out using cell arrays of up to 25×25. One application was carried out in the medical area, simulating image reconstruction of an arm phantom with attenuation coefficients in the range of 0.2 to 0.5 cm⁻¹ using cell arrays of 41×41. The simulated results show that, in objects with a great number of interfaces and large variations of attenuation coefficients at these interfaces, a good reconstruction is obtained when the number of projections equals the reconstruction matrix dimension. Otherwise, a good reconstruction is obtained with fewer projections. (author) (original in Portuguese)

  13. Spiral Computed Tomographic Imaging Related to Computerized Ultrasonographic Images of Carotid Plaque Morphology and Histology

    DEFF Research Database (Denmark)

    Grønholdt, Marie-Louise M.; Wagner, Aase; Wiebe, Britt M.

    2001-01-01

    Echolucency of carotid atherosclerotic plaques, as evaluated by computerized B-mode ultrasonographic images, has been associated with an increased incidence of brain infarcts on cerebral computed tomographic scans. We tested the hypotheses that characterization of carotid plaques on spiral comput...

  14. Current Human Reliability Analysis Methods Applied to Computerized Procedures

    Energy Technology Data Exchange (ETDEWEB)

    Ronald L. Boring

    2012-06-01

    Computerized procedures (CPs) are an emerging technology within nuclear power plant control rooms. While CPs have been implemented internationally in advanced control rooms, to date no US nuclear power plant has implemented CPs in its main control room (Fink et al., 2009). Yet, CPs are a reality of new plant builds and are an area of considerable interest to existing plants, which see advantages in terms of enhanced ease of use and easier records management by omitting the need for updating hardcopy procedures. The overall intent of this paper is to provide a characterization of human reliability analysis (HRA) issues for computerized procedures. It is beyond the scope of this document to propose a new HRA approach or to recommend specific methods or refinements to those methods. Rather, this paper serves as a review of current HRA as it may be used for the analysis and review of computerized procedures.

  15. Measurement and analysis of noise power spectrum of computerized tomography in images; Medida y analysis del espectro de potencias del ruido en imagenes de tomografia computarizada

    Energy Technology Data Exchange (ETDEWEB)

    Castro Tejero, P.; Garayoa Roca, J.

    2013-07-01

    This paper examines the use of the noise power spectrum (NPS) as a metric to characterize noise, in both magnitude and texture, in CT images. The measured NPS show that for convolution filters that apply stronger smoothing to the reconstructed image the spectrum is concentrated at low frequencies, while for sharp filters the spectrum extends to high frequencies. In the analyzed cases there is a low-frequency component, largely due to structured noise, which can have a potentially negative effect on the detectability of lesions. (Author)

  16. Cognitive task analysis and the design of computerized operator aids

    International Nuclear Information System (INIS)

    Andersson, H.

    1985-01-01

    The new technological possibilities have led to the initiation of many projects for the design and evaluation of computerized operator support systems to be implemented in nuclear power plant control rooms. A typical finding so far has been that operators often have a positive attitude towards such systems but still do not use them very much, mostly because they find almost the same information on the conventional control boards which they are accustomed to using. Still, there is a widely shared belief that conventional control rooms have shortcomings that make the use of computerized aids necessary. One reason for the limited success so far is that the new systems are often poorly integrated with the existing conventional instrumentation and with the working procedures. The reluctance to use new computer-based aids, despite their attractive features, is therefore probably caused by an inadequate task analysis made prior to the design of these computerized operator support systems.

  17. Positron transaxial emission tomograph with computerized image reconstruction

    International Nuclear Information System (INIS)

    Jatteau, Michel.

    1981-01-01

    This invention concerns a positron transaxial emission tomography apparatus with computerized image reconstruction, like those used in nuclear medicine for studying the metabolism of organs, in physiological examinations and as a diagnosis aid. The operation is based on the principle of the detection of photons emitted when the positrons are annihilated by impact with an electron. The appliance is mainly composed of: (a) - a set of gamma ray detectors distributed on a polygonal arrangement around the body area to be examined, (b) - circuits for amplifying the signals delivered by the gamma ray detectors, (c) - computers essentially comprising energy integration and discrimination circuits and provided at the output of the detectors for calculating and delivering, as from the amplified signals, information on the position and energy relative to each occurrence constituted by the detections of photons, (d) - time coincidence circuits for selecting by emission of detector validation signals, only those occurrences, among the ensemble of those detected, which effectively result from the annihilation of positrons inside the area examined, (e) - a data processing system [fr

  18. ECAT: a new computerized tomographic imaging system for positron-emitting radiopharmaceuticals

    International Nuclear Information System (INIS)

    Phelps, M.E.; Hoffman, E.J.; Huang, S.C.; Kuhl, D.E.

    1977-01-01

    The ECAT was designed and developed as a complete computerized positron radionuclide imaging system capable of providing high-contrast, high-resolution, quantitative images in two-dimensional and tomographic formats. Flexibility in its various image mode options allows it to be used for a wide variety of imaging problems.

  19. Automated computerized scheme for distinction between benign and malignant solitary pulmonary nodules on chest images

    International Nuclear Information System (INIS)

    Aoyama, Masahito; Li Qiang; Katsuragawa, Shigehiko; MacMahon, Heber; Doi, Kunio

    2002-01-01

    A novel automated computerized scheme has been developed to assist radiologists in distinguishing between benign and malignant solitary pulmonary nodules on chest images. Our database consisted of 55 chest radiographs (33 primary lung cancers and 22 benign nodules). In this method, the location of a nodule was first indicated by a radiologist. The difference image with a nodule was produced by use of filters and then represented in a polar coordinate system. The nodule was segmented automatically by analysis of contour lines of the gray-level distribution based on the polar-coordinate representation. Two clinical parameters (age and sex) and 75 image features were determined from the outline, the image, and histogram analysis for the inside and outside regions of the segmented nodule. Linear discriminant analysis (LDA) and knowledge about benign and malignant nodules were used to select initial feature combinations. Many combinations of subgroups of the 77 features were evaluated as input to artificial neural networks (ANNs). The performance of ANNs with the 7 selected features by use of the round-robin test showed Az=0.872, which was greater than the Az=0.854 obtained previously with the manual method (P=0.53). The performance of LDA (Az=0.886) was slightly improved compared to that of the ANNs (P=0.59) and was greater than that of the manual method (Az=0.854) reported previously (P=0.40). The high level of its performance indicates the potential usefulness of this automated computerized scheme in assisting radiologists, as a second opinion, in distinguishing between benign and malignant solitary pulmonary nodules on chest images.
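    The round-robin (leave-one-out) evaluation of a classifier on a selected feature set can be sketched with scikit-learn; this LDA example mirrors the kind of test described above but is not the authors' code, and the feature matrix is assumed to already hold the selected features.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import LeaveOneOut, cross_val_predict
from sklearn.metrics import roc_auc_score

def round_robin_az(features: np.ndarray, labels: np.ndarray) -> float:
    """Leave-one-out (round-robin) test of an LDA classifier; returns the
    area under the ROC curve (Az) of the held-out decision scores."""
    scores = cross_val_predict(LinearDiscriminantAnalysis(), features, labels,
                               cv=LeaveOneOut(), method="decision_function")
    return roc_auc_score(labels, scores)
```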

  20. Global plastic models for computerized structural analysis

    International Nuclear Information System (INIS)

    Roche, R.L.; Hoffmann, A.

    1977-01-01

    In many types of structures, it is possible to use generalized stresses (such as membrane forces, bending moments, torsion moments...) to define a yield surface for a part of the structure. Analysis can be achieved by using Hill's principle and a hardening rule. This formulation is called a 'global plastic model'. Two different global models are used in the CEASEMT system for structural analysis, one for shell analysis and the other for piping analysis (in the plastic or creep field). In shell analysis the generalized stresses chosen are the membrane forces and bending (including torsion) moments. There is only one yield condition per normal to the middle surface, and no integration along the thickness is required. In piping analysis, the generalized stresses chosen are the bending moments, torsional moment, hoop stress and tension stress. There is only one set of stresses per cross section, and no integration over the cross-section area is needed. The associated strains are axis curvature, torsion and uniform strains. The definition of the yield surface is the most important item. A practical way is to use a diagonal quadratic function of the stress components, but the coefficients depend on the shape of the pipe element, especially for curved segments. Indications are given on the yield functions used. Some examples of applications in structural analysis are added to the text.

  1. Laboratory Computerization: The Case for a Prospective Analysis

    OpenAIRE

    Hurdle, John F.; Schwamm, Harry A.

    1982-01-01

    The argument is made that computerization of a laboratory should be preceded by a thorough prospective analysis of laboratory operations. Points to be pondered include complementation of retrospective data, system cost justification, system performance justification, post-installation personnel adjustments, improved system utilization, improved manual performance, and insight into "how much" system to buy. A brief, general outline is offered describing how to approach such a study.

  2. Global plastic models for computerized structural analysis

    International Nuclear Information System (INIS)

    Roche, R.; Hoffmann, A.

    1977-01-01

    Two different global models are used in the CEASEMT system for structural analysis, one for shell analysis and the other for piping analysis (in the plastic or creep field). In shell analysis the generalized stresses chosen are the membrane forces Nsub(ij) and the bending (including torsion) moments Msub(ij). There is only one yield condition for a normal (to the middle surface) and no integration along the thickness is required. In piping analysis, the generalized stresses chosen are the bending moments, torsional moments, hoop stress and tension stress. There is only one set of stresses for a cross section, and no integration over the cross-section area is needed. The associated strains are axis curvature, torsion and uniform strains. The definition of the yield surface is the most important item. A practical choice is a diagonal quadratic function of the stress components, but the coefficients depend on the shape of the pipe element, especially for curved segments. Indications are given on the yield functions used. Some examples of applications in structural analysis are added to the text

  3. Accurately Diagnosing Uric Acid Stones from Conventional Computerized Tomography Imaging: Development and Preliminary Assessment of a Pixel Mapping Software.

    Science.gov (United States)

    Ganesan, Vishnu; De, Shubha; Shkumat, Nicholas; Marchini, Giovanni; Monga, Manoj

    2018-02-01

    Preoperative determination of uric acid stones from computerized tomography imaging would be of tremendous clinical use. We sought to design a software algorithm that could apply data from noncontrast computerized tomography to predict the presence of uric acid stones. Patients with pure uric acid and calcium oxalate stones were identified from our stone registry. Only stones greater than 4 mm which were clearly traceable from initial computerized tomography to final composition were included in analysis. A semiautomated computer algorithm was used to process image data. Average and maximum HU, eccentricity (deviation from a circle) and kurtosis (peakedness vs flatness) were automatically generated. These parameters were examined in several mathematical models to predict the presence of uric acid stones. A total of 100 patients, of whom 52 had calcium oxalate and 48 had uric acid stones, were included in the final analysis. Uric acid stones were significantly larger (12.2 vs 9.0 mm, p = 0.03) but calcium oxalate stones had higher mean attenuation (457 vs 315 HU, p = 0.001) and maximum attenuation (918 vs 553 HU, p uric acid stones. A combination of stone size, attenuation intensity and attenuation pattern from conventional computerized tomography can distinguish uric acid stones from calcium oxalate stones with high sensitivity and specificity. Copyright © 2018 American Urological Association Education and Research, Inc. Published by Elsevier Inc. All rights reserved.
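
    A hedged sketch of the kind of classification model described (not the study's algorithm): a logistic regression on stone size, attenuation and shape features, trained on simulated values that only mimic the direction of the reported group differences.

    ```python
    # Hypothetical model, not the study's algorithm: logistic regression on stone size,
    # mean/maximum attenuation (HU), eccentricity and kurtosis. Feature values are simulated
    # so that they only mimic the direction of the reported group differences.
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.metrics import roc_auc_score

    rng = np.random.default_rng(1)
    y = np.r_[np.ones(48, dtype=int), np.zeros(52, dtype=int)]   # 48 uric acid, 52 calcium oxalate
    X = np.column_stack([
        rng.normal(np.where(y == 1, 12.2, 9.0), 3.0),    # size (mm): uric acid larger
        rng.normal(np.where(y == 1, 315, 457), 80.0),    # mean HU: uric acid lower
        rng.normal(np.where(y == 1, 553, 918), 150.0),   # max HU: uric acid lower
        rng.uniform(0.0, 1.0, y.size),                   # eccentricity (placeholder)
        rng.normal(3.0, 1.0, y.size),                    # kurtosis (placeholder)
    ])

    model = make_pipeline(StandardScaler(), LogisticRegression()).fit(X, y)
    print("training ROC area:", roc_auc_score(y, model.predict_proba(X)[:, 1]))
    ```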

  4. Diagnostic Value of Manual and Computerized Methods of Dental Casts Analysis

    Directory of Open Access Journals (Sweden)

    H. Rahimi

    2009-06-01

    Objective: The aim of this study was to evaluate the validity of computerized and manual methods of dental cast analysis. Materials and Methods: Twenty set-ups of upper and lower casts using artificial teeth corresponding to various malocclusions were created for a diagnostic in vitro study. Tooth sizes were measured on the isolated artificial teeth taken out of the set-ups, and these results were considered the gold standard for tooth size. Arch width was calculated from the existing set-ups. Impressions were taken of the casts with alginate and duplicated with dental stone. The models were measured manually with a digital caliper. Images of the occlusal views of the casts were then taken with a digital camera, and measurements were made on the digital images with the AutoCAD software. The results of the computerized and manual methods were compared with the gold standard. The intraclass correlation coefficient of reliability was used to measure the accuracy of the methods, and the Friedman test was used to evaluate the significance of differences. Results: All measurements were highly correlated, e.g. gold standard and manual (0.9613-0.9991), gold standard and computerized (0.7118-0.9883), manual and computerized (0.6734-0.9914). Statistically significant differences were present between these methods (P<0.05), but they proved not to be clinically significant. Conclusion: Manual measurement is still the most accurate method when compared to the computerized measurements, and the results of computerized measurement should be interpreted with caution.

  5. The industrial computerized tomography applied to the rock analysis

    International Nuclear Information System (INIS)

    Tetzner, Guaraciaba de Campos

    2008-01-01

    This work studies possible technical applications of computerized tomography (CT) using a device developed at the Radiation Technology Center (CTR), Institute for Energy and Nuclear Research (IPEN-CNEN/SP). The equipment consists of a gamma radiation source (60Co), a scintillation detector of sodium iodide doped with thallium (NaI(Tl)), a mechanical system to move the object (rotation and translation) and a computer system. The operating system was designed and developed by the CTR-IPEN-CNEN/SP team using national resources and technology. The first validation test of the equipment was carried out using a cylindrical sample of polypropylene (phantom) with two cylindrical cavities (holes) of 5 x 25 cm (diameter and length). In these tests, the holes were filled with materials of different density (air, oil and metal), whose attenuation coefficients are well known. The goal of this first test was to assess the response quality of the equipment. The present report compares the CTR-IPEN-CNEN/SP computerized tomography equipment, which uses a gamma radiation source (60Co), with equipment at the Department of Geosciences of the University of Texas (CTUT), which uses an X-ray source (450 kV and 3.2 mA). The images obtained and the comprehensive study of the usefulness of the equipment developed here strengthened the proposition that the development of industrial computerized tomography is an important step toward consolidating the national technology. (author)

  6. Castem 2000: a modern approach of computerized structural analysis

    International Nuclear Information System (INIS)

    Verpeaux, P.; Millard, A.; Hoffmann, A.; Ebersolt, L.

    1988-01-01

    Since the early days of computerized structural analysis, many general-purpose programs have been developed. Their complexity has increased rapidly and they have become difficult to use and to maintain. The need for improved user-friendliness and for the treatment of complex coupled problems has led to a modern tool: CASTEM 2000. It has profited from the general progress in computer technology and from twenty years' experience with large finite element codes. Its basic principles as well as examples of applications are presented in this paper

  7. Manual and computerized measurement of coronal vertebral inclination on MRI images: A pilot study

    International Nuclear Information System (INIS)

    Vrtovec, T.; Likar, B.; Pernuš, F.

    2013-01-01

    Aim: A pilot study that presents a systematic approach for evaluating the variability of manual and computerized measurements of coronal vertebral inclination (CVI) on images acquired by magnetic resonance imaging (MRI). Materials and methods: Three observers identified the vertebral body corners of 28 vertebrae on two occasions on two-dimensional (2D) coronal MRI cross-sections, which served to evaluate CVI using six manual measurements (superior and inferior tangents, left and right tangents, mid-endplate and mid-wall lines). Computerized measurements were performed by evaluating CVI from the symmetry of vertebral anatomical structures of the same 28 vertebrae in 2D coronal MRI cross-sections and in three-dimensional (3D) MRI images. Results: In terms of standard deviation (SD), the mid-endplate lines proved to be the manual measurements with the lowest intra- (1.0° SD) and interobserver (1.4° SD) variability. The computerized measurements in 3D yielded even lower intra- (0.8° SD) and interobserver (1.3° SD) variability. The strongest inter-method agreement (1.2° SD) was found among lines parallel to vertebral endplates (superior tangents, inferior tangents, mid-endplate lines). The computerized measurements in 3D were most in agreement with the mid-endplate lines (1.9° SD). The estimated intra- and interobserver variabilities of standard Cobb angle measurements were equal to 1.6° SD and 2.5° SD, respectively, for manual measurements, and to 1.1° SD and 1.8° SD, respectively, for computerized measurements. Conclusion: The mid-endplate lines proved to be the most reproducible and reliable manual CVI measurements. Computerized CVI measurements based on the evaluation of the symmetry of vertebral anatomical structures in 3D were more reproducible and reliable than manual measurements

  8. VARIABILITY OF MANUAL AND COMPUTERIZED METHODS FOR MEASURING CORONAL VERTEBRAL INCLINATION IN COMPUTED TOMOGRAPHY IMAGES

    Directory of Open Access Journals (Sweden)

    Tomaž Vrtovec

    2015-06-01

    Objective measurement of coronal vertebral inclination (CVI) is of significant importance for evaluating spinal deformities in the coronal plane. The purpose of this study is to systematically analyze and compare manual and computerized measurements of CVI in cross-sectional and volumetric computed tomography (CT) images. Three observers independently measured CVI in 14 CT images of normal and 14 CT images of scoliotic vertebrae by using six manual and two computerized measurements. Manual measurements were obtained in coronal cross-sections by manually identifying the vertebral body corners, which served to measure CVI according to the superior and inferior tangents, left and right tangents, and mid-endplate and mid-wall lines. Computerized measurements were obtained in two dimensions (2D) and in three dimensions (3D) by manually initializing an automated method in vertebral centroids and then searching for the planes of maximal symmetry of vertebral anatomical structures. The mid-endplate lines were the most reproducible and reliable manual measurements (intra- and inter-observer variability of 0.7° and 1.2° standard deviation (SD), respectively). The computerized measurements in 3D were more reproducible and reliable (intra- and inter-observer variability of 0.5° and 0.7° SD, respectively), but were most consistent with the mid-wall lines (2.0° SD and 1.4° mean absolute difference). The manual CVI measurements based on mid-endplate lines and the computerized CVI measurements in 3D resulted in the lowest intra-observer and inter-observer variability; however, computerized CVI measurements reduce observer interaction.

  9. Clinical study on primary epilepsy by computerized analysis of CT

    International Nuclear Information System (INIS)

    Tominaga, Hidefumi; Ueyama, Kenichi; Mizutani, Hiroshi; Imamura, Keisuke; Yoshidome, Kazushi; Matsumoto, Kei

    1985-01-01

    CT scans were examined by a conventional linear measurement method and by computerized analysis in 17 patients with primary epilepsy (Group A). The results were compared with those in healthy volunteers (Group B). The relationship between CT and EEG findings was also examined. The maximum width of the third cerebral ventricle (TCV) was narrower in Group A than in Group B, with a statistically significant difference. Low density rates (LDR) in Group A tended to be lower than those in Group B. There was a significant difference in the narrowed maximum width of the TCV between the Group A patients presenting with sudden dysrhythmia and Group B. For this subgroup of Group A, the LDR was significantly lower than that for Group B. These results suggest some changes in the brain in young epilepsy patients, especially those presenting with sudden dysrhythmia. (Namekawa, K.)

  10. Computerized detection method for asymptomatic white matter lesions in brain screening MR images using a clustering technique

    International Nuclear Information System (INIS)

    Kunieda, Takuya; Uchiyama, Yoshikazu; Hara, Takeshi

    2008-01-01

    Asymptomatic white matter lesions are frequently identified by the screening system known as Brain Dock, which is intended for the detection of asymptomatic brain diseases. The detection of asymptomatic white matter lesions is important because their presence is associated with an increased risk of stroke. Therefore, we have developed a computerized method for the detection of asymptomatic white matter lesions in order to assist radiologists in image interpretation as a "second opinion". Our database consisted of T1- and T2-weighted images obtained from 73 patients. The locations of the white matter lesions were determined by an experienced neuroradiologist. In order to restrict the area to be searched for white matter lesions, we first segmented the cerebral region in T1-weighted images by applying thresholding and region-growing techniques. To identify the initial candidate lesions, k-means clustering with pixel values in T1- and T2-weighted images was applied to the segmented cerebral region. To eliminate false positives (FPs), we determined the features, such as location, size, and circularity, of each of the initial candidate lesions. Finally, a rule-based scheme and a quadratic discriminant analysis with these features were employed to distinguish between white matter lesions and FPs. The results showed that the sensitivity for the detection of white matter lesions was 93.2%, with 4.3 FPs per image, suggesting that our computerized method may be useful for the detection of asymptomatic white matter lesions in T1- and T2-weighted images. (author)
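
    The candidate-detection step can be sketched as follows, under assumptions: k-means clustering of paired T1/T2 pixel values within an already segmented cerebral region, with a simple size rule standing in for the paper's rule-based and discriminant false-positive reduction. The image data are synthetic placeholders.

    ```python
    # Sketch under assumptions, not the published implementation: k-means on paired T1/T2
    # pixel values inside a brain mask, taking the most T2-hyperintense cluster as initial
    # lesion candidates and keeping only connected components above a size threshold.
    import numpy as np
    from sklearn.cluster import KMeans
    from scipy import ndimage

    rng = np.random.default_rng(0)
    t1 = rng.normal(100.0, 10.0, size=(128, 128))   # placeholder T1-weighted slice
    t2 = rng.normal(100.0, 10.0, size=(128, 128))   # placeholder T2-weighted slice
    t2[60:66, 60:66] += 60.0                        # synthetic T2-hyperintense "lesion"
    brain_mask = np.ones_like(t1, dtype=bool)       # assume the cerebrum is already segmented

    features = np.column_stack([t1[brain_mask], t2[brain_mask]])
    labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(features)

    lesion_cluster = np.argmax([features[labels == k, 1].mean() for k in range(3)])
    candidates = np.zeros_like(t1, dtype=bool)
    candidates[brain_mask] = labels == lesion_cluster

    cc, n_cc = ndimage.label(candidates)            # rule-based FP reduction: size > 10 pixels
    sizes = ndimage.sum(candidates, cc, index=range(1, n_cc + 1))
    kept = np.isin(cc, np.flatnonzero(sizes > 10) + 1)
    print("candidate lesion pixels kept:", int(kept.sum()))
    ```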

  11. Vector entropy imaging theory with application to computerized tomography

    International Nuclear Information System (INIS)

    Wang Yuanmei; Cheng Jianping; Heng, Pheng Ann

    2002-01-01

    Medical imaging theory for x-ray CT and PET is based on image reconstruction from projections. In this paper a novel vector entropy imaging theory under the framework of multiple criteria decision making is presented. We also study the most frequently used image reconstruction methods, namely, least squares, maximum entropy, and filtered back-projection, under the framework of single performance criterion optimization. Finally, we introduce some of the results obtained by various reconstruction algorithms using computer-generated noisy projection data from the Hoffman phantom and real CT scanner data. Comparison of the reconstructed images indicates that the vector entropy method performs best in terms of error (difference between the original phantom data and the reconstruction), smoothness (suppression of noise) and grey-value resolution, and is free of ghost images. (author)

  12. Computerized digital image processing on radiographs of canine filariosis

    International Nuclear Information System (INIS)

    Miyatake, K.; Okamoto, Y.; Minami, S.

    1999-01-01

    For objective evaluation of lung arterial lesions, density histograms obtained from survey thoracic radiographs of fifteen dogs with filariosis and five normal dogs were digitally analyzed, and the changes in the histograms were compared with the pulmonary arterial lesions revealed by soft X-ray examination of pulmonary artery angiograms prepared from inflated-fixed lungs. In the lung areas affected by filariosis, the density histogram showed an increase at the white level and a decrease at the black level in each part compared to a normal lung. In comparison with the normal parameters, those of the filariosis cases were significantly increased in the minimum grey level (Min), maximum grey level (Max) and the grey level of maximum frequency (Mode), and significantly decreased in the maximum frequency value (MaF). The pulmonary arterial lesions in filariosis showed obvious morphological changes such as indistinctness, pruning, angiectasis, and meandering. The parameters Min, Max, Mode and MaF changed significantly with the grade of pulmonary arterial lesion. These results confirmed that digital analysis of X-ray images is highly beneficial for the objective diagnosis of lung arterial lesions
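
    The histogram parameters named above (Min, Max, Mode, MaF) can be computed along the following lines; the region of interest is a synthetic placeholder and the exact feature definitions are assumed.

    ```python
    # Assumed feature definitions on a placeholder region of interest: the lowest and highest
    # occupied grey levels (Min, Max), the grey level with maximum frequency (Mode) and the
    # maximum frequency itself (MaF).
    import numpy as np

    rng = np.random.default_rng(0)
    lung_roi = rng.normal(140.0, 25.0, size=(200, 200)).clip(0, 255).astype(np.uint8)

    hist = np.bincount(lung_roi.ravel(), minlength=256)
    occupied = np.flatnonzero(hist)
    features = {
        "Min": int(occupied.min()),   # minimum grey level present
        "Max": int(occupied.max()),   # maximum grey level present
        "Mode": int(hist.argmax()),   # grey level of maximum frequency
        "MaF": int(hist.max()),       # maximum frequency value
    }
    print(features)
    ```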

  13. Region-of-interest imaging in cone beam computerized tomography

    International Nuclear Information System (INIS)

    Tam, K.C.

    1996-01-01

    Imaging a sectional region within an object with a detector just big enough to cover the sectional region-of-interest is analyzed. We show that with some suitable choice of scanning configuration and with an innovative method of data combination, all the Radon data can be obtained accurately. The algorithm is mathematically exact, and requires no iterations and no additional measurements. The method can be applied to inspect portions of large industrial objects in industrial imaging, as well as to image portions of human bodies in medical diagnosis

  14. A computerized tomography system for transcranial ultrasound imaging.

    Science.gov (United States)

    Tang, Sai Chun; Clement, Gregory T

    Hardware for tomographic imaging presents both challenge and opportunity for simplification when compared with traditional pulse-echo imaging systems. Specifically, point diffraction tomography does not require simultaneous powering of elements, in theory allowing just a single transmit channel and a single receive channel to be coupled with a switching or multiplexing network. In our ongoing work on transcranial imaging, we have developed a 512-channel system designed to transmit and/or receive a high voltage signal from/to arbitrary elements of an imaging array. The overall design follows a hierarchy of modules including a software interface, microcontroller, pulse generator, pulse amplifier, high-voltage power converter, switching mother board, switching daughter board, receiver amplifier, analog-to-digital converter, peak detector, memory, and USB communication. Two pulse amplifiers are included, each capable of producing up to 400Vpp via power MOSFETS. Switching is based around mechanical relays that allow passage of 200V, while still achieving switching times of under 2ms, with an operating frequency ranging from below 100kHz to 10MHz. The system is demonstrated through ex vivo human skulls using 1MHz transducers. The overall system design is applicable to planned human studies in transcranial image acquisition, and may have additional tomographic applications for other materials necessitating a high signal output.

  15. Use of morphologic filters in the computerized detection of lung nodules in digital chest images

    International Nuclear Information System (INIS)

    Yoshimura, H.; Giger, M.L.; Doi, K.; Ahn, N.; MacMahon, H.

    1989-01-01

    The authors have previously described a computerized scheme for the detection of lung nodules based on a difference-image approach, which had a detection accuracy of 70% with 7-8 false positives per image. Currently, they are investigating morphologic filters for the further enhancement/suppression of nodule signals and the removal of false positives. Gray-level morphologic filtering is performed on clinical chest radiographs digitized with an optical drum scanner. Various shapes and sequences of erosion and dilation filters (i.e., determination of the minimum and maximum gray levels, respectively) were examined for signal enhancement and suppression for use in the difference-image approach
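
    A minimal sketch of gray-level morphologic filtering in this spirit, using SciPy's grey openings on a synthetic image: a small structuring element preserves nodule-sized structures while a large one suppresses them, and their difference enhances the nodule. This is an illustration, not the authors' filter design.

    ```python
    # Illustration only (not the authors' filter design): gray-level openings with small and
    # large structuring elements; their difference enhances roughly nodule-sized structures
    # while suppressing larger background structures.
    import numpy as np
    from scipy import ndimage

    rng = np.random.default_rng(0)
    chest = ndimage.gaussian_filter(rng.normal(size=(256, 256)), 8)   # placeholder chest image
    chest[120:130, 120:130] += 1.0                                    # synthetic nodule-like blob

    small = ndimage.grey_opening(chest, size=(5, 5))     # keeps nodule-sized structures
    large = ndimage.grey_opening(chest, size=(25, 25))   # suppresses nodule-sized structures
    difference = small - large                           # nodule-enhanced difference image
    print("maximum of the difference image:", float(difference.max()))
    ```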

  16. Computerized detection of acute ischemic stroke in brain computed tomography images

    International Nuclear Information System (INIS)

    Nagashima, Hiroyuki; Shiraishi, Akihisa; Harakawa, Tetsumi; Shiraishi, Junji; Doi, Kunio; Sunaga, Shinichi

    2009-01-01

    The interpretation of acute ischemic stroke (AIS) in computed tomography (CT) images is a very difficult challenge for radiologists. To assist radiologists in CT image interpretation, we have developed a computerized method for the detection of AIS using 100 training cases and 60 testing cases. In our computerized method, the inclination of the isotropic brain CT volume data is corrected by rotation and shifting. The subtraction data for the contralateral volume is then derived by subtraction from the mirrored (right-left reversed) volume data. Initial candidates suspected to have experienced AIS were identified using multiple-thresholding and filtering techniques. Twenty-one image features of these candidates were extracted and applied to a rule-based test to identify final candidates for AIS. The detection sensitivity values for the training cases and for the testing cases were 95.0% with 3.1 false positives per case and 85.7% with 3.4 false positives per case, respectively. Our computerized method showed good performance in the detection of AIS by CT and is expected to be useful in decision-making by radiologists. (author)
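
    The contralateral-subtraction step can be sketched roughly as below, assuming the volume has already been aligned to the mid-sagittal plane; the data and threshold are placeholders, and only a single threshold stands in for the multiple-thresholding and rule-based steps.

    ```python
    # Simplified sketch (placeholder data): subtract the right-left mirrored volume from the
    # aligned brain CT volume and threshold the smoothed result to obtain initial candidates
    # for unilateral hypodense regions.
    import numpy as np
    from scipy import ndimage

    rng = np.random.default_rng(0)
    volume = rng.normal(35.0, 2.0, size=(64, 128, 128))   # placeholder brain CT volume (HU)
    volume[30:34, 60:70, 20:35] -= 6.0                    # synthetic unilateral hypodense region

    mirrored = volume[:, :, ::-1]                         # right-left reversed volume
    contralateral_diff = volume - mirrored                # subtraction data for the contralateral side

    smoothed = ndimage.gaussian_filter(contralateral_diff, sigma=2)
    candidates = smoothed < -3.0                          # single threshold stands in for multiple thresholds
    labels, n_regions = ndimage.label(candidates)
    print("initial AIS candidate regions:", n_regions)
    ```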

  17. Cardiac imaging systems and methods employing computerized tomographic scanning

    International Nuclear Information System (INIS)

    Richey, J.B.; Wake, R.H.; Walters, R.G.; Hunt, W.F.; Cool, S.L.

    1980-01-01

    The invention relates to cardiac imaging systems and methods employing computerised tomographic scanning. Apparatus is described which allows an image of the radiation attenuation of the heart to be obtained at a desired phase of the cardiac cycle. The patient's ECG signal can be used in a transverse-and-rotate type CT scanner as a time base, so that the beam reaches the heart at a desired phase of the cardiac cycle, or, in a purely rotational type CT scanner, continuously generated scan data are stored only for corresponding phases of successive cardiac cycles. Alternatively, gating of the beams themselves, by shuttering or by switching the power supply, can be controlled by the ECG signal. A pacemaker is used to stabilize the cardiac period. Also used is a system for recognising unacceptable variations in the cardiac period and discarding the corresponding scan data. In a transverse-and-rotate type fan-beam CT scanner, the effective beam width is narrowed to reduce the duration of the traverse of the heart. (U.K.)

  18. Computerized spiral analysis using the iPad.

    Science.gov (United States)

    Sisti, Jonathan A; Christophe, Brandon; Seville, Audrey Rakovich; Garton, Andrew L A; Gupta, Vivek P; Bandin, Alexander J; Yu, Qiping; Pullman, Seth L

    2017-01-01

    Digital analysis of writing and drawing has become a valuable research and clinical tool for the study of upper limb motor dysfunction in patients with essential tremor, Parkinson's disease, dystonia, and related disorders. We developed a validated method of computerized spiral analysis of hand-drawn Archimedean spirals that provides insight into movement dynamics beyond subjective visual assessment using a Wacom graphics tablet. While the Wacom tablet method provides robust data, more widely available mobile technology platforms exist. We introduce a novel adaptation of the Wacom-based method for the collection of hand-drawn kinematic data using an Apple iPad. This iPad-based system is stand-alone, easy-to-use, can capture drawing data with either a finger or capacitive stylus, is precise, and potentially ubiquitous. The iPad-based system acquires position and time data that is fully compatible with our original spiral analysis program. All of the important indices including degree of severity, speed, presence of tremor, tremor amplitude, tremor frequency, variability of pressure, and tightness are calculated from the digital spiral data, which the application is able to transmit. While the iPad method is limited by current touch screen technology, it does collect data with acceptable congruence compared to the current Wacom-based method while providing the advantages of accessibility and ease of use. The iPad is capable of capturing precise digital spiral data for analysis of motor dysfunction while also providing a convenient, easy-to-use modality in clinics and potentially at home. Copyright © 2016 Elsevier B.V. All rights reserved.

  19. A Computerized Tablet with Visual Feedback of Hand Position for Functional Magnetic Resonance Imaging

    Directory of Open Access Journals (Sweden)

    Mahta Karimpoor

    2015-03-01

    Neuropsychological tests - behavioral tasks that very commonly involve handwriting and drawing - are widely used in the clinic to detect abnormal brain function. Functional magnetic resonance imaging (fMRI) may be useful in increasing the specificity of such tests. However, performing complex pen-and-paper tests during fMRI involves engineering challenges. Previously, we developed an fMRI-compatible, computerized tablet system to address this issue. However, the tablet did not include visual feedback of hand position (VFHP), a human factors component that may be important for fMRI of certain patient populations. A real-time system was thus developed to provide VFHP and integrated with the tablet in an augmented reality display. The effectiveness of the system was initially tested in young healthy adults who performed various handwriting tasks in front of a computer display with and without VFHP. Pilot fMRI of writing tasks were performed by two representative individuals with and without VFHP. Quantitative analysis of the behavioral results indicated improved writing performance with VFHP. The pilot fMRI results suggest that writing with VFHP requires less neural resources compared to the without VFHP condition, to maintain similar behavior. Thus, the tablet system with VFHP is recommended for future fMRI studies involving patients with impaired brain function and where ecologically valid behavior is important.

  20. A computerized tablet with visual feedback of hand position for functional magnetic resonance imaging

    Science.gov (United States)

    Karimpoor, Mahta; Tam, Fred; Strother, Stephen C.; Fischer, Corinne E.; Schweizer, Tom A.; Graham, Simon J.

    2015-01-01

    Neuropsychological tests (behavioral tasks that very commonly involve handwriting and drawing) are widely used in the clinic to detect abnormal brain function. Functional magnetic resonance imaging (fMRI) may be useful in increasing the specificity of such tests. However, performing complex pen-and-paper tests during fMRI involves engineering challenges. Previously, we developed an fMRI-compatible, computerized tablet system to address this issue. However, the tablet did not include visual feedback of hand position (VFHP), a human factors component that may be important for fMRI of certain patient populations. A real-time system was thus developed to provide VFHP and integrated with the tablet in an augmented reality display. The effectiveness of the system was initially tested in young healthy adults who performed various handwriting tasks in front of a computer display with and without VFHP. Pilot fMRI of writing tasks were performed by two representative individuals with and without VFHP. Quantitative analysis of the behavioral results indicated improved writing performance with VFHP. The pilot fMRI results suggest that writing with VFHP requires less neural resources compared to the without VFHP condition, to maintain similar behavior. Thus, the tablet system with VFHP is recommended for future fMRI studies involving patients with impaired brain function and where ecologically valid behavior is important. PMID:25859201

  1. Image analysis

    International Nuclear Information System (INIS)

    Berman, M.; Bischof, L.M.; Breen, E.J.; Peden, G.M.

    1994-01-01

    This paper provides an overview of modern image analysis techniques pertinent to materials science. The usual approach in image analysis contains two basic steps: first, the image is segmented into its constituent components (e.g. individual grains), and second, measurement and quantitative analysis are performed. Usually, the segmentation part of the process is the harder of the two. Consequently, much of the paper concentrates on this aspect, reviewing both fundamental segmentation tools (commonly found in commercial image analysis packages) and more advanced segmentation tools. There is also a review of the most widely used quantitative analysis methods for measuring the size, shape and spatial arrangements of objects. Many of the segmentation and analysis methods are demonstrated using complex real-world examples. Finally, there is a discussion of hardware and software issues. 42 refs., 17 figs

  2. The Deference Due the Oracle: Computerized Text Analysis in a Basic Writing Class.

    Science.gov (United States)

    Otte, George

    1989-01-01

    Describes how a computerized text analysis program can help students discover error patterns in their writing, and notes how students' responses to analyses can reduce errors and improve their writing. (MM)

  3. Computerized summary scoring: crowdsourcing-based latent semantic analysis.

    Science.gov (United States)

    Li, Haiying; Cai, Zhiqiang; Graesser, Arthur C

    2017-11-03

    In this study we developed and evaluated a crowdsourcing-based latent semantic analysis (LSA) approach to computerized summary scoring (CSS). LSA is a frequently used mathematical component in CSS, where LSA similarity represents the extent to which the to-be-graded target summary is similar to a model summary or a set of exemplar summaries. Researchers have proposed different formulations of the model summary in previous studies, such as pregraded summaries, expert-generated summaries, or source texts. The former two methods, however, require substantial human time, effort, and costs in order to either grade or generate summaries. Using source texts does not require human effort, but it also does not predict human summary scores well. With human summary scores as the gold standard, in this study we evaluated the crowdsourcing LSA method by comparing it with seven other LSA methods that used sets of summaries from different sources (either experts or crowdsourced) of differing quality, along with source texts. Results showed that crowdsourcing LSA predicted human summary scores as well as expert-good and crowdsourcing-good summaries, and better than the other methods. A series of analyses with different numbers of crowdsourcing summaries demonstrated that the number (from 10 to 100) did not significantly affect performance. These findings imply that crowdsourcing LSA is a promising approach to CSS, because it saves human effort in generating the model summary while still yielding comparable performance. This approach to small-scale CSS provides a practical solution for instructors in courses, and also advances research on automated assessments in which student responses are expected to semantically converge on subject matter content.
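
    A minimal sketch of LSA-style similarity scoring (not the authors' pipeline): TF-IDF vectors reduced by truncated SVD, with the score taken as the cosine similarity between a target summary and a set of model summaries. The texts are invented examples.

    ```python
    # Invented example texts; illustrates the scoring idea, not the authors' pipeline:
    # TF-IDF -> truncated SVD (LSA space) -> cosine similarity to a set of model summaries.
    import numpy as np
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.decomposition import TruncatedSVD
    from sklearn.metrics.pairwise import cosine_similarity

    model_summaries = [
        "The passage explains how plants convert sunlight into chemical energy.",
        "Photosynthesis turns light, water and carbon dioxide into sugars and oxygen.",
        "Plants use chlorophyll to capture light and produce glucose and oxygen.",
    ]
    target_summary = ["Plants make sugar and oxygen from light, water and CO2."]

    tfidf = TfidfVectorizer().fit_transform(model_summaries + target_summary)
    lsa_space = TruncatedSVD(n_components=2, random_state=0).fit_transform(tfidf)

    score = cosine_similarity(lsa_space[-1:], lsa_space[:-1]).mean()   # mean similarity to models
    print("LSA similarity score:", round(float(score), 3))
    ```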

  4. Computerized analysis of mammographic parenchymal patterns for assessing breast cancer risk: Effect of ROI size and location

    International Nuclear Information System (INIS)

    Li Hui; Giger, Maryellen L.; Huo Zhimin; Olopade, Olufunmilayo I.; Lan Li; Weber, Barbara L.; Bonta, Ioana

    2004-01-01

    The long-term goal of our research is to develop computerized radiographic markers for assessing breast density and parenchymal patterns that may be used together with clinical measures for determining the risk of breast cancer and assessing the response to preventive treatment. In our earlier studies, we found that women at high risk tended to have dense breasts with mammographic patterns that were coarse and low in contrast. With our method, computerized texture analysis is performed on a region of interest (ROI) within the mammographic image. In our current study, we investigate the effect of ROI size and ROI location on the computerized texture features obtained from 90 subjects (30 BRCA1/BRCA2 gene-mutation carriers and 60 age-matched women deemed to be at low risk for breast cancer). Mammograms were digitized at 0.1 mm pixel size and various ROI sizes were extracted from different breast regions in the craniocaudal (CC) view. Seventeen features, which characterize the density and texture of the parenchymal patterns, were extracted from the ROIs on these digitized mammograms. Stepwise feature selection and linear discriminant analysis were applied to identify features that differentiate between the low-risk women and the BRCA1/BRCA2 gene-mutation carriers. ROC analysis was used to assess the performance of the features in the task of distinguishing between these two groups. Our results show that there was a statistically significant decrease in the performance of the computerized texture features, as the ROI location was varied from the central region behind the nipple. However, we failed to show a statistically significant decrease in the performance of the computerized texture features with decreasing ROI size for the range studied
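
    The feature-extraction and classification pipeline can be sketched, under assumptions, as follows: a few simple density/texture statistics from a fixed ROI, fed to linear discriminant analysis and ROC analysis. The images, ROI position and features are placeholders and do not reproduce the study's 17 features.

    ```python
    # Placeholder images and features (the study's 17 features are not reproduced): simple
    # density/texture statistics from a fixed ROI, classified with LDA and scored by ROC area.
    import numpy as np
    from scipy import stats
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
    from sklearn.metrics import roc_auc_score

    def roi_features(image, row, col, size=256):
        roi = image[row:row + size, col:col + size]
        return np.array([
            roi.mean(),                           # average gray level (density surrogate)
            roi.std(),                            # spread of gray levels
            stats.skew(roi.ravel()),              # histogram skewness (balance)
            np.abs(np.diff(roi, axis=1)).mean(),  # simple contrast/coarseness measure
        ])

    rng = np.random.default_rng(0)
    images = rng.normal(120.0, 20.0, size=(90, 512, 512))        # placeholder digitized mammograms
    y = np.r_[np.ones(30, dtype=int), np.zeros(60, dtype=int)]   # 30 carriers, 60 low-risk
    X = np.array([roi_features(im, 128, 128) for im in images])

    lda = LinearDiscriminantAnalysis().fit(X, y)
    print("ROC area on training data:", roc_auc_score(y, lda.decision_function(X)))
    ```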

  5. Computerized diagnostic data analysis and 3-D visualization

    International Nuclear Information System (INIS)

    Schuhmann, D.; Haubner, M.; Krapichler, C.; Englmeier, K.H.; Seemann, M.; Schoepf, U.J.; Gebicke, K.; Reiser, M.

    1998-01-01

    Purpose: To survey methods for 3D data visualization and image analysis which can be used for computer-based diagnostics. Material and methods: The available methods are briefly explained and references to the literature are presented. Methods which allow basic manipulation of 3D data are windowing, rotation and clipping. More complex methods for visualization of 3D data are multiplanar reformation, volume projections (MIP, semi-transparent projections) and surface projections. Methods for image analysis comprise local data transformation (e.g. filtering) and the definition and application of complex models (e.g. deformable models). Results: Volume projections produce an impression of the 3D data set without reducing the data amount. This supports the interpretation of the 3D data set and saves time in comparison to any investigation which requires examination of all slice images. More advanced techniques for visualization, e.g. surface projections and hybrid rendering, visualize anatomical information in great detail, but both techniques require segmentation of the structures of interest. Image analysis methods can be used to extract these structures (e.g. an organ) from the image data. Discussion: At the present time volume projections are robust and fast enough to be used routinely. Surface projections can be used to visualize complex and presegmented anatomical features. (orig.)
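
    Two of the volume-projection methods mentioned (MIP and a crude semi-transparent projection) reduce to simple per-pixel operations along the viewing axis, as in this sketch on a placeholder volume.

    ```python
    # Placeholder volume; MIP and a crude semi-transparent (average) projection along one axis.
    import numpy as np

    rng = np.random.default_rng(0)
    volume = rng.normal(0.0, 1.0, size=(64, 128, 128))   # placeholder CT/MR volume
    volume[30, 40:60, 40:60] += 10.0                     # bright structure that stands out in the MIP

    mip = volume.max(axis=0)         # maximum intensity projection along the viewing axis
    average = volume.mean(axis=0)    # simple semi-transparent (average) projection
    print(mip.shape, float(mip.max()), float(average.max()))
    ```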

  6. Non-Conventional Applications of Computerized Tomography: Analysis of Solid Dosage Forms Produced by Pharmaceutical Industry

    International Nuclear Information System (INIS)

    Martins de Oliveira, Jose Jr.; Germano Martins, Antonio Cesar

    2010-01-01

    X-ray computed tomography (CT) refers to the cross-sectional imaging of an object by measuring the transmitted radiation in different directions. In this work, we describe a non-conventional application of computerized tomography: visualization and improved understanding of some internal structural features of solid dosage forms. A micro-CT X-ray scanner with a minimum resolution of 30 μm was used to characterize pharmaceutical tablets, granules, a controlled-release osmotic tablet and liquid-filled soft-gelatin capsules. The analyses presented in this work are essentially qualitative, but quantitative parameters, such as porosity, density distribution, tablet dimensions, etc., could also be obtained using the related CT techniques.

  7. Effect of The Measuring Parameters on The Reconstructed Images by Computerized Tomography

    International Nuclear Information System (INIS)

    Ali, A.M.; Ali, A.M.; Megahid, R.M.

    2011-01-01

    In this paper, the potential of computerized tomography with neutrons and gamma rays as a precise technique for the nondestructive assay of materials and components of prime importance in the nuclear and general industries is presented and discussed. Both the Fast Fourier Transform (FFT) and convolution techniques are introduced. The Shepp-Logan human head phantom is used for theoretical testing and for studying the effect of the translation step for both techniques. Moreover, the effect of the number of projections is discussed. A comparison between the two reconstruction techniques was performed for the examined object. In addition, some experimentally scanned images obtained using a slit beam of gamma rays emitted from the ETRR-1 reactor are presented and discussed.
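
    The kind of numerical experiment described, varying the number of projections for filtered back-projection of the Shepp-Logan phantom, can be reproduced in outline with scikit-image (an illustration, not the authors' code):

    ```python
    # Illustration with scikit-image, not the authors' code: filtered back-projection of the
    # Shepp-Logan phantom for different numbers of projections.
    import numpy as np
    from skimage.data import shepp_logan_phantom
    from skimage.transform import radon, iradon, rescale

    phantom = rescale(shepp_logan_phantom(), 0.5)        # 200 x 200 test object
    for n_projections in (30, 90, 180):
        angles = np.linspace(0.0, 180.0, n_projections, endpoint=False)
        sinogram = radon(phantom, theta=angles)
        reconstruction = iradon(sinogram, theta=angles)  # ramp-filtered back-projection
        rms_error = np.sqrt(np.mean((reconstruction - phantom) ** 2))
        print(n_projections, "projections -> RMS error", round(float(rms_error), 4))
    ```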

  8. A high-speed computerized tomography image reconstruction using direct two-dimensional Fourier transform method

    International Nuclear Information System (INIS)

    Niki, Noboru; Mizutani, Toshio; Takahashi, Yoshizo; Inouye, Tamon.

    1983-01-01

    The necessity of developing real-time computerized tomography (CT) aimed at the dynamic observation of organs such as the heart has lately been advocated. Its realization requires image reconstruction that is markedly faster than in present CT systems. Although various reconstruction methods have been proposed so far, the method practically employed at present is only the filtered back-projection (FBP) method, which gives high-quality image reconstruction but takes much computing time. In the past, the two-dimensional Fourier transform (TFT) method was regarded as unsuitable for practical use because the quality of the images obtained was not good, despite being a promising method for high-speed reconstruction because of its lower computing time. However, since it was revealed that the image quality of the TFT method depends greatly on the interpolation accuracy in two-dimensional Fourier space, the authors have developed a high-speed calculation algorithm that can obtain high-quality images by pursuing the relationship between image quality and the interpolation method. In this case, the radial data sampling points in Fourier space are increased by a factor of 2 to the power β, and linear or spline interpolation is used. Comparison of this method with the present FBP method led to the conclusion that the image quality is almost the same for practical image matrices, the computation time of the TFT method is about 1/10 that of the FBP method, and the memory requirement is also reduced by about 20%. (Wakatsuki, Y.)
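
    The direct Fourier (TFT) approach rests on the Fourier slice theorem: the 1D FFT of a parallel projection equals a central radial line of the object's 2D FFT, so reconstruction reduces to interpolating these radial samples onto a Cartesian grid before an inverse 2D FFT. The sketch below verifies the theorem numerically for one projection; the interpolation step that the abstract focuses on is deliberately omitted.

    ```python
    # Numerical check of the Fourier slice theorem on the Shepp-Logan phantom (scikit-image
    # and NumPy); the interpolation onto a Cartesian grid, the critical step discussed in the
    # abstract, is deliberately omitted here.
    import numpy as np
    from skimage.data import shepp_logan_phantom
    from skimage.transform import rescale, radon

    image = rescale(shepp_logan_phantom(), 0.25)          # 100 x 100 test object
    projection = radon(image, theta=[0.0])[:, 0]          # parallel projection at 0 degrees

    fft_of_projection = np.fft.fftshift(np.fft.fft(np.fft.ifftshift(projection)))
    fft2_of_image = np.fft.fftshift(np.fft.fft2(np.fft.ifftshift(image)))
    central_slice = fft2_of_image[fft2_of_image.shape[0] // 2, :]   # radial line through the origin

    difference = np.abs(fft_of_projection - central_slice).max() / np.abs(central_slice).max()
    print("relative max difference (should be near zero):", difference)
    ```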

  9. Computerized tomography. Fundamentals, equipment, image quality, applications. 2. rev. and enl. ed.

    International Nuclear Information System (INIS)

    Kalender, W.A.

    2006-01-01

    The book provides a clear and user-oriented outline of the theoretical and technical fundamentals of computerized tomography for a wide range of readers, from conventional CT to volume CT with conical beams. Image quality, its relevant influencing parameters and the performance factors to be observed for clinical application are discussed in detail, as are dose requirements, dose metering and dose reduction measures in CT. The second, revised edition contains updated information and also presents new technologies. A CD-ROM with attractive exemplary applications, animations and interactive exercises in image interpretation and manipulation is included. The book addresses everyone concerned with CT, either in their daily routine or even only every now and then, i.e. physicians, medical assistants, engineers, technicians and physicists. The book also contains a glossary of technical terms. (orig.)

  10. Computerized method for evaluating diagnostic image quality of calcified plaque images in cardiac CT: Validation on a physical dynamic cardiac phantom

    International Nuclear Information System (INIS)

    King, Martin; Rodgers, Zachary; Giger, Maryellen L.; Bardo, Dianna M. E.; Patel, Amit R.

    2010-01-01

    Purpose: In cardiac computed tomography (CT), important clinical indices, such as the coronary calcium score and the percentage of coronary artery stenosis, are often adversely affected by motion artifacts. As a result, the expert observer must decide whether or not to use these indices during image interpretation. Computerized methods potentially can be used to assist in these decisions. In a previous study, an artificial neural network (ANN) regression model provided assessability (image quality) indices of calcified plaque images from the software NCAT phantom that were highly agreeable with those provided by expert observers. The method predicted assessability indices based on computer-extracted features of the plaque. In the current study, the ANN-predicted assessability indices were used to identify calcified plaque images with diagnostic calcium scores (based on mass) from a physical dynamic cardiac phantom. The basic assumption was that better quality images were associated with more accurate calcium scores. Methods: A 64-channel CT scanner was used to obtain 500 calcified plaque images from a physical dynamic cardiac phantom at different heart rates, cardiac phases, and plaque locations. Two expert observers independently provided separate sets of assessability indices for each of these images. Separate sets of ANN-predicted assessability indices tailored to each observer were then generated within the framework of a bootstrap resampling scheme. For each resampling iteration, the absolute calcium score error between the calcium scores of the motion-contaminated plaque image and its corresponding stationary image served as the ground truth in terms of indicating images with diagnostic calcium scores. The performances of the ANN-predicted and observer-assigned indices in identifying images with diagnostic calcium scores were then evaluated using ROC analysis. Results: Assessability indices provided by the first observer and the corresponding ANN performed

  11. Computerized method for evaluating diagnostic image quality of calcified plaque images in cardiac CT: Validation on a physical dynamic cardiac phantom

    Energy Technology Data Exchange (ETDEWEB)

    King, Martin; Rodgers, Zachary; Giger, Maryellen L.; Bardo, Dianna M. E.; Patel, Amit R. [Department of Radiology, Committee on Medical Physics, University of Chicago, 5841 South Maryland Avenue, MC 2026, Chicago, Illinois 60637 (United States); Department of Diagnostic Radiology, Oregon Health and Science University, 3181 Southwest Sam Jackson Park Road, Portland, Oregon 97239 (United States); Department of Medicine, University of Chicago, 5841 South Maryland Avenue, MC 5084, Chicago, Illinois 60637 (United States)

    2010-11-15

    Purpose: In cardiac computed tomography (CT), important clinical indices, such as the coronary calcium score and the percentage of coronary artery stenosis, are often adversely affected by motion artifacts. As a result, the expert observer must decide whether or not to use these indices during image interpretation. Computerized methods potentially can be used to assist in these decisions. In a previous study, an artificial neural network (ANN) regression model provided assessability (image quality) indices of calcified plaque images from the software NCAT phantom that were highly agreeable with those provided by expert observers. The method predicted assessability indices based on computer-extracted features of the plaque. In the current study, the ANN-predicted assessability indices were used to identify calcified plaque images with diagnostic calcium scores (based on mass) from a physical dynamic cardiac phantom. The basic assumption was that better quality images were associated with more accurate calcium scores. Methods: A 64-channel CT scanner was used to obtain 500 calcified plaque images from a physical dynamic cardiac phantom at different heart rates, cardiac phases, and plaque locations. Two expert observers independently provided separate sets of assessability indices for each of these images. Separate sets of ANN-predicted assessability indices tailored to each observer were then generated within the framework of a bootstrap resampling scheme. For each resampling iteration, the absolute calcium score error between the calcium scores of the motion-contaminated plaque image and its corresponding stationary image served as the ground truth in terms of indicating images with diagnostic calcium scores. The performances of the ANN-predicted and observer-assigned indices in identifying images with diagnostic calcium scores were then evaluated using ROC analysis. Results: Assessability indices provided by the first observer and the corresponding ANN performed

  12. Image Analysis

    DEFF Research Database (Denmark)

    The 19th Scandinavian Conference on Image Analysis was held at the IT University of Copenhagen in Denmark during June 15-17, 2015. The SCIA conference series has been an ongoing biannual event for more than 30 years and over the years it has nurtured a world-class regional research and development...... area within the four participating Nordic countries. It is a regional meeting of the International Association for Pattern Recognition (IAPR). We would like to thank all authors who submitted works to this year’s SCIA, the invited speakers, and our Program Committee. In total 67 papers were submitted....... The topics of the accepted papers range from novel applications of vision systems, pattern recognition, machine learning, feature extraction, segmentation, 3D vision, to medical and biomedical image analysis. The papers originate from all the Scandinavian countries and several other European countries...

  13. Computerized assessment of body image in anorexia nervosa and bulimia nervosa: comparison with standardized body image assessment tool.

    Science.gov (United States)

    Caspi, Asaf; Amiaz, Revital; Davidson, Noa; Czerniak, Efrat; Gur, Eitan; Kiryati, Nahum; Harari, Daniel; Furst, Miriam; Stein, Daniel

    2017-02-01

    Body image disturbances are a prominent feature of eating disorders (EDs). Our aim was to test and evaluate a computerized assessment of body image (CABI), to compare the body image disturbances in different ED types, and to assess the factors affecting body image. The body image of 22 individuals undergoing inpatient treatment with restricting anorexia nervosa (AN-R), 22 with binge/purge AN (AN-B/P), 20 with bulimia nervosa (BN), and 41 healthy controls was assessed using the Contour Drawing Rating Scale (CDRS), the CABI, which simulated the participants' self-image at different levels of weight change, and the Eating Disorder Inventory-2-Body Dissatisfaction (EDI-2-BD) scale. Severity of depression and anxiety was also assessed. Significant differences were found among the three scales assessing body image, although most of their dimensions differentiated between patients with EDs and controls. Our findings support the use of the CABI in the comparison of body image disturbances in patients with EDs vs. controls. Moreover, the use of different assessment tools allows for a better understanding of the differences in body image disturbances in different ED types.

  14. CONSIDERATIONS FOR THE TREATMENT OF COMPUTERIZED PROCEDURES IN HUMAN RELIABILITY ANALYSIS

    Energy Technology Data Exchange (ETDEWEB)

    Ronald L. Boring; David I. Gertman

    2012-07-01

    Computerized procedures (CPs) are an emerging technology within nuclear power plant control rooms. While CPs have been implemented internationally in advanced control rooms, to date no US nuclear power plant has implemented CPs in its main control room. Yet, CPs are a reality of new plant builds and are an area of considerable interest to existing plants, which see advantages in terms of easier records management by omitting the need for updating hardcopy procedures. The overall intent of this paper is to provide a characterization of human reliability analysis (HRA) issues for computerized procedures. It is beyond the scope of this document to propose a new HRA approach or to recommend specific methods or refinements to those methods. Rather, this paper serves as a review of current HRA as it may be used for the analysis and review of computerized procedures.

  15. HUMAN RELIABILITY ANALYSIS FOR COMPUTERIZED PROCEDURES, PART TWO: APPLICABILITY OF CURRENT METHODS

    Energy Technology Data Exchange (ETDEWEB)

    Ronald L. Boring; David I. Gertman

    2012-10-01

    Computerized procedures (CPs) are an emerging technology within nuclear power plant control rooms. While CPs have been implemented internationally in advanced control rooms, to date no U.S. nuclear power plant has implemented CPs in its main control room. Yet, CPs are a reality of new plant builds and are an area of considerable interest to existing plants, which see advantages in terms of easier records management by omitting the need for updating hardcopy procedures. The overall intent of this paper is to provide a characterization of human reliability analysis (HRA) issues for computerized procedures. It is beyond the scope of this document to propose a new HRA approach or to recommend specific methods or refinements to those methods. Rather, this paper serves as a review of current HRA as it may be used for the analysis and review of computerized procedures.

  16. Manual versus Automated Narrative Analysis of Agrammatic Production Patterns: The Northwestern Narrative Language Analysis and Computerized Language Analysis

    Science.gov (United States)

    Hsu, Chien-Ju; Thompson, Cynthia K.

    2018-01-01

    Purpose: The purpose of this study is to compare the outcomes of the manually coded Northwestern Narrative Language Analysis (NNLA) system, which was developed for characterizing agrammatic production patterns, and the automated Computerized Language Analysis (CLAN) system, which has recently been adopted to analyze speech samples of individuals…

  17. Low gray scale values of computerized images of carotid plaques associated with increased levels of triglyceride-rich lipoproteins and with increased plaque lipid content

    DEFF Research Database (Denmark)

    Grønholdt, Marie-Louise M.; Nordestgaard, Børge; Weibe, Britt M.

    1997-01-01

    Relation between low gray scale values in computerized images of carotid plaques and 1) plasma levels of triglyceride-rich lipoproteins and 2) plaque lipid content.

  18. Anatomical Variations of Carotid Artery and Optic Nerve in Sphenoid Sinus Using Computerized Tomographic Imaging

    Directory of Open Access Journals (Sweden)

    Nikakhlagh

    2014-12-01

    Background: The sphenoid sinus is surrounded by many vital vascular and nervous structures. In more than 20% of patients with chronic sinusitis, involvement of the sphenoid sinus has been observed. Besides, the sphenoid sinus is an appropriate route to access the anterior and middle cranial fossae in surgery. Therefore, it is important to have adequate knowledge about the contents of the sphenoid sinus and its proximity to other structures for nasal endoscopy, sinus surgeries and neurosurgeries. Objectives: The aim of this study was to examine the proximity of the sphenoid sinus to the carotid artery and the optic nerve using computerized tomographic imaging. Materials and Methods: In this prospective study, computerized tomographic images of the sphenoid sinus of patients referred to Imam Khomeini and Apadana hospitals were studied. The images were examined for any bulging of the internal carotid artery or optic nerve into the sphenoid sinus, for the absence of a bone covering over these structures, and for whether these relationships were unilateral or bilateral. Results: Among 468 coronal and axial CT scan images of the sphenoid sinus, 365 (78%) showed post-sellar pneumatization and 103 (22%) pre-sellar pneumatization. Regarding the existence of internal septa, 346 (74%) cases showed multiple septation, and the remaining images were reported to have a single septum. According to the CT scan images, bulging caused by the internal carotid artery and an uncovered artery were found in 4.22% and 5.8% of cases in the right sinus, 4.9% and 5.4% in the left sinus, and 4.34% and 4.6% in both sinuses, respectively. Bulging caused by the optic nerve and an uncovered nerve were found in 5.7% and 4.3% of cases in the right sinus, 6% and 5.4% in the left sinus, and 12% and 3.2% in both sinuses, respectively. Conclusions: Due to the variability of sphenoid sinus pneumatization and of the septum separating the two sinus cavities, careful attention is required during sinus surgery to avoid

  19. Analysis of the of bones through 3D computerized tomography

    International Nuclear Information System (INIS)

    Lima, I.; Lopes, R.T.; Oliveira, L.F.; Alves, J.M.

    2009-01-01

    This work presents the analysis of the internal structure of bone samples using the 3D microtomography technique (3D-μTC). Understanding bone structure is particularly important for osteoporosis diagnosis, because osteoporosis implies a deterioration of the trabecular bone architecture, which increases fragility and the likelihood of bone fractures. Two bone samples (a human calcaneus and a Wistar rat femur) were used, and the measurements were performed with a real-time radiographic system with a microfocus X-ray tube. The quantification parameters are based on stereological principles and are five: bone volume fraction, trabecular number, the ratio of bone surface to bone volume, trabecular thickness and trabecular separation. The quantifications were done with a program developed especially for this purpose at the Nuclear Instrumentation Laboratory - COPPE/UFRJ. This program uses the 3D reconstruction images as input and generates a table with the quantifications. The results of the human calcaneus quantification are presented in Tables 1 and 2, and the 3D reconstructions are illustrated in Figure 5. Figure 6 illustrates the 2D reconstructed image and Figure 7 the 3D visualization of the Wistar femur sample. The obtained results show that 3D-μTC is a powerful technique that can be used to analyze bone microstructures. (author)
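
    Two of the stereological parameters listed, bone volume fraction (BV/TV) and the bone surface-to-volume ratio, can be approximated from a binarized reconstruction roughly as follows; the volume is a synthetic placeholder and this is not the program used in the study.

    ```python
    # Synthetic binary volume; approximates bone volume fraction (BV/TV) and a crude bone
    # surface-to-volume ratio (BS/BV) by counting voxels and exposed voxel faces.
    import numpy as np

    rng = np.random.default_rng(0)
    bone = rng.random((64, 64, 64)) > 0.7   # placeholder binarized trabecular volume

    bv_tv = bone.mean()                     # bone volume fraction, BV/TV

    faces = 0                               # internal bone/background voxel faces along each axis
    for axis in range(3):
        faces += np.count_nonzero(np.diff(bone.astype(np.int8), axis=axis))
    bs_bv = faces / bone.sum()              # bone surface to bone volume, in units of 1/voxel

    print("BV/TV =", round(float(bv_tv), 3), " BS/BV =", round(float(bs_bv), 3), "per voxel")
    ```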

  20. Computerized Analysis of Digital Photographs for Evaluation of Tooth Movement.

    Science.gov (United States)

    Toodehzaeim, Mohammad Hossein; Karandish, Maryam; Karandish, Mohammad Nabi

    2015-03-01

    Various methods have been introduced for the evaluation of tooth movement in orthodontics. The challenge is to adopt the most accurate and most beneficial method for patients. This study was designed to introduce the analysis of digital photographs with AutoCAD software as a method to evaluate tooth movement and to assess the reliability of this method. Eighteen patients were evaluated in this study. Three intraoral digital images from the buccal view were captured from each patient within a half-hour interval. All photos were imported into AutoCAD 2011 software and calibrated, and the distance between the canine and molar hooks was measured. The data were analyzed using the intraclass correlation coefficient. The photographs were found to have a high reliability coefficient (P > 0.05). The introduced method is an accurate, efficient and reliable method for the evaluation of tooth movement.
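
    The measurement idea reduces to calibrating pixel distances against an object of known length and converting the canine-to-molar hook distance to millimetres, as in this sketch with invented coordinates (it does not reproduce the AutoCAD workflow):

    ```python
    # Invented coordinates; shows only the calibration arithmetic, not the AutoCAD workflow.
    import math

    def pixel_distance(p1, p2):
        return math.hypot(p1[0] - p2[0], p1[1] - p2[1])

    reference_length_mm = 10.0                            # e.g. a ruler segment visible in the photo
    reference_px = pixel_distance((120, 310), (245, 312))
    mm_per_pixel = reference_length_mm / reference_px

    canine_hook, molar_hook = (402, 295), (788, 305)      # hypothetical landmark positions (pixels)
    distance_mm = pixel_distance(canine_hook, molar_hook) * mm_per_pixel
    print(f"canine-to-molar hook distance: {distance_mm:.2f} mm")
    ```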

  1. A Procedure for the Computerized Analysis of Cleft Palate Speech Transcription

    Science.gov (United States)

    Fitzsimons, David A.; Jones, David L.; Barton, Belinda; North, Kathryn N.

    2012-01-01

    The phonetic symbols used by speech-language pathologists to transcribe speech contain underlying hexadecimal values used by computers to correctly display and process transcription data. This study aimed to develop a procedure to utilise these values as the basis for subsequent computerized analysis of cleft palate speech. A computer keyboard…

  2. Computerized tomography using high resolution X-ray imaging system with a microfocus source

    International Nuclear Information System (INIS)

    Zaprazny, Z.; Korytar, D.; Konopka, P.; Ac, V.; Bielecki, J.

    2011-01-01

    In recent years there has been an effort to image the internal structure of an object not only by conventional 2D X-ray radiography but also by high-resolution 3D tomography, which is based on the reconstruction of multiple 2D projections at various angular positions of the object. We have previously reported [1] the development and basic parameters of a high-resolution X-ray imaging system with a microfocus source. In this work we report recent progress using this high-resolution X-ray laboratory system. These first findings show that our system is particularly suitable for lightweight and nonmetallic objects such as biological objects, plastics, wood, paper, etc., where phase contrast helps to increase the visibility of the finest structures of the object. Phase-contrast X-ray computerized tomography is of special interest to us because it is an emerging imaging technique that can be implemented at third-generation synchrotron radiation sources and also in laboratory conditions using a microfocus X-ray tube or beam-conditioning optics. (authors)

  3. New applications to computerized tomography: analysis of solid dosage forms produced by pharmaceutical industry

    International Nuclear Information System (INIS)

    Oliveira Junior, Jose Martins de; Martins, Antonio Cesar Germano

    2009-01-01

    In recent years, computerized tomography (CT) has been used as a new probe to study solid dosage forms (tablets) produced by the pharmaceutical industry. This new approach to studying the properties of tablets and of powders or granulations used in the pharmaceutical industry is very suitable: first, because CT can generate information that the traditional technologies used in this kind of analysis cannot, such as the density distribution of internal structures, tablet dimensions, pore size distribution, particle shape information, and the investigation of official and unofficial (counterfeit) copies of solid dosage forms; second, because CT is a nondestructive technique, allowing the tablets or granules to be used in other analyses. In this work we discuss how CT can be used to acquire and reconstruct the internal microstructure of tablets and granules. CT is a technique based on the attenuation of X-rays passing through matter; attenuation depends on the density and atomic number of the material that is scanned. In this work, a micro-CT X-ray scanner (manufactured by the Applied Nuclear Physics group at the University of Sorocaba) was used to obtain three-dimensional images of the tablets and granules for nondestructive analysis. These images showed a non-uniform density distribution of material inside some tablets, the morphology of some of the granules analyzed, the integrity of a liquid-filled soft-gelatin capsule, and so on. The distribution of the different constituents of an osmotic controlled-release dosage form could also be observed. The present work shows that it is possible to use X-ray microtomography to obtain useful qualitative and quantitative information on the structure of pharmaceutical dosage forms. (author)

  4. Wound areas by computerized planimetry of digital images: accuracy and reliability.

    Science.gov (United States)

    Mayrovitz, Harvey N; Soontupe, Lisa B

    2009-05-01

    Tracking wound size is an essential part of treatment. Because a wound's initial size may affect apparent healing rates, its surface area (S) and its surface area-to-perimeter (S/P) ratio are useful to document healing. Assessments of these parameters can be made by computerized planimetry of digital images using suitable software. Because different caregivers often evaluate wounds and because measurement time is important, the objective of this study was to determine the accuracy, repeatability, and measurement time of S and S/P from measurements of images recorded by digital photography. Six wound images of various complexities with known areas were measured in triplicate by 20 senior nursing students during 2 sessions 1 week apart. Images included an ellipse, 2 traced venous ulcers, and photographs of a pressure, diabetic plantar, and venous ulcer. Area error was determined as the percentage difference between known and planimetry-measured areas. Reliability was assessed from test-retest coefficients of variation (CV%), from which the smallest meaningful percentage change (SMPC) was determined. Area errors (mean +/- SD) ranged from -2.95% +/- 7.01% to +2.32% +/- 6.04%. For well-defined image margins, area and S/P SMPC values were all less than 3.2%. For borders that were not as well defined, SMPCs were larger, ranging between 6.2% and 10.8%. Wound measurement time decreased from 93.4 +/- 35.1 seconds at session 1 to 67.7 +/- 24.4 seconds at session 2. Computerized planimetry of digital images provides reliable estimates of wound area and S/P ratios.
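
    For readers who want to reproduce the reliability figures, the sketch below shows one way to compute a test-retest coefficient of variation and derive a smallest meaningful percentage change from it. The 1.96·√2 factor is a common convention rather than necessarily the one used in the study, and the wound areas in the example are hypothetical.

```python
import numpy as np

def test_retest_cv_percent(session1, session2):
    """Root-mean-square within-subject CV% for paired test-retest areas."""
    s1, s2 = np.asarray(session1, float), np.asarray(session2, float)
    means = (s1 + s2) / 2.0
    sds = np.abs(s1 - s2) / np.sqrt(2.0)   # within-subject SD for two trials
    cvs = sds / means
    return 100.0 * np.sqrt(np.mean(cvs ** 2))

def smallest_meaningful_change(cv_percent):
    """One common convention: SMPC = 1.96 * sqrt(2) * CV%."""
    return 1.96 * np.sqrt(2.0) * cv_percent

# Hypothetical wound areas (cm^2) measured in two sessions one week apart.
areas_week1 = [4.1, 12.3, 7.8]
areas_week2 = [4.3, 12.0, 7.6]
cv = test_retest_cv_percent(areas_week1, areas_week2)
print(f"CV% = {cv:.2f}, SMPC = {smallest_meaningful_change(cv):.2f}%")
```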

  5. Analysis of the percentage voids of test and field specimens using computerized tomography

    International Nuclear Information System (INIS)

    Braz, D.; Lopes, R.T.; Motta, L.M.G. da

    1999-01-01

    Computerized tomography has been an excellent tool for the analysis of asphaltic mixtures, because it allows comparison of the quality and integrity of test and field specimens. It was used to detect and follow the evolution of cracks when these mixtures were submitted to fatigue tests, and also to help interpret the distribution of stresses and strains that occur under the several types of loading imposed on the mixtures. Comparing the mean values of percentage voids obtained from the tomographic images with the design values, it can be observed that the values of the test and field specimens for the wearing course are closer to the design values than those of the binder course. It can be verified that the wearing course specimens always present a fairly homogeneous distribution of aggregate and voids over the whole profile of the sample, while the binder course specimens show an accentuated variation of the same factors at the several heights of the sample. Therefore, when choosing a slice for tomography, these considerations should be taken into account.
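
    As a rough illustration of how a percentage-voids value can be extracted from a reconstructed slice, the sketch below simply thresholds CT numbers inside a specimen mask; the threshold value and the masking strategy are assumptions for the example, not details taken from the study.

```python
import numpy as np

def percent_voids(slice_hu, air_threshold=-500.0, mask=None):
    """Percentage of void (air) pixels inside a specimen mask of one CT slice.

    `slice_hu` holds CT numbers (e.g. Hounsfield units); pixels below
    `air_threshold` are counted as voids. The threshold is illustrative and
    must be calibrated for the scanner and the mixture being examined.
    """
    slice_hu = np.asarray(slice_hu, float)
    if mask is None:
        mask = np.ones_like(slice_hu, dtype=bool)   # whole slice by default
    voids = np.count_nonzero((slice_hu < air_threshold) & mask)
    return 100.0 * voids / np.count_nonzero(mask)
```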

  6. Age influence on attitudes of office workers faced with new computerized technologies: a questionnaire analysis.

    Science.gov (United States)

    Marquié, J C; Thon, B; Baracat, B

    1994-06-01

    The study of Bue and Gollac (1988) provided evidence that a significantly lower proportion of workers aged 45 years and over make use of computer technology compared with younger ones. The aim of the present survey was to explain this fact by a more intensive analysis of the older workers' attitude with respect to the computerization of work situations in relation to other individual and organizational factors. Six hundred and twenty office workers from 18 to 70 years old, either users or non-users of computerized devices, were asked to complete a questionnaire. The questions allowed the assessment of various aspects of the workers' current situation, such as the computer training they had received, the degree of consultation they were subjected to during the computerization process, their representation of the effects of these new technologies on working conditions and employment, the rate of use of new technologies outside the work context, and the perceived usefulness of computers for their own work. The analysis of the questionnaire revealed that as long as the step towards using computer tools, even minimally, has not been taken, then attitudes with respect to computerization are on the whole not very positive and are a source of anxiety for many workers. Age, and even more, seniority in the department, increase such negative representations. The effects of age and seniority were also found among users, as well as the effects of other factors such as qualification, education level, type and rate of computer use, and size of the firm. For the older workers, the expectation of less positive consequences for their career, or even the fear that computerization might be accompanied by threats to their own employment and the less clear knowledge of how computers operate, appeared to account for a significant part of the observed age and seniority differences in attitudes. Although the difference in the amount of computer training between age groups was smaller than

  7. Computerized follow-up of discrepancies in image interpretation between emergency and radiology departments.

    Science.gov (United States)

    Siegel, E; Groleau, G; Reiner, B; Stair, T

    1998-08-01

    Radiographs are ordered and interpreted for immediate clinical decisions 24 hours a day by emergency physicians (EP's). The Joint Commission for Accreditation of Health Care Organizations requires that all these images be reviewed by radiologists and that there be some mechanism for quality improvement (QI) for discrepant readings. There must be a log of discrepancies and documentation of follow-up activities, but this alone does not guarantee effective QI. Radiologists reviewing images from the previous day and night often must guess at the preliminary interpretation of the EP and whether follow-up action is necessary. EP's may remain ignorant of the final reading and falsely assume the initial diagnosis and treatment were correct. Some hospitals use a paper system in which the EP writes a preliminary interpretation on the requisition slip, which will be available when the radiologist dictates the final reading. Some hospitals use a classification of discrepancies based on clinical import and urgency, communicated to the EP on duty at the time of the official reading, but they may not communicate discrepancies to the EP's who initially read the images. Our computerized radiology department and picture archiving and communications system have increased technologist and radiologist productivity, and decreased retakes and lost films. There are fewer face-to-face consultations between radiologists and clinicians, but more communication by telephone and electronic annotation of PACS images. We have integrated the QI process for emergency department (ED) images into the PACS, and gained advantages over the traditional discrepancy log. Requisitions including clinical indications are entered into the Hospital Information System and then appear on the PACS along with the images for reading. The initial impression, time of review, and the initials of the EP are available to the radiologist dictating the official report. The radiologist decides if there is a discrepancy, and whether it

  8. Non-destructive analysis and detection of internal characteristics of spruce logs through X computerized tomography

    International Nuclear Information System (INIS)

    Longuetaud, F.

    2005-10-01

    Computerized tomography allows direct access to internal features of scanned logs on the basis of density and moisture content variations. The objective of this work is to assess the feasibility of automatic detection of internal characteristics with the final aim of conducting scientific analyses. The database consists of CT images of 24 spruces obtained with a medical CT scanner. The studied trees are representative of several social statuses and come from four stands located in North-Eastern France, themselves representative of several age, density and fertility classes. The automatic processing steps developed are the following. First, pith detection in logs, dealing with the problems of knot presence and ring eccentricity; the accuracy of the localisation was less than one mm. Second, detection of the sapwood/heartwood limit in logs, dealing with the problem of knot presence (the main source of difficulty); the error on the diameter was 1.8 mm, which corresponds to a relative error of 1.3 per cent. Third, detection of the whorl locations and comparison with an optical method. Fourth, detection of individualized knots; this process makes it possible to count knots and to locate them in a log (longitudinal position and azimuth), although the validation of the method and the extraction of branch diameter and inclination are still to be developed. An application of this work was a variability analysis of the sapwood content in the trunk: at the within-tree level, the sapwood width was found to be constant under the living crown; at the between-tree level, a strong correlation was found with the amount of living branches. A great number of analyses are possible from our results, among others: architectural analysis with pith tracking and apex death occurrence; analysis of radial variations of the heartwood shape; and analysis of the knot distribution in logs. (author)

  9. Computerized tomography

    International Nuclear Information System (INIS)

    Rubashov, I.B.

    1985-01-01

    The operating principle of computerized tomography devices used in medicine for the diagnosis of brain diseases is described. Computerized tomography is considered as a part of computerized diagnosis and as a part of information science. It is shown that computerized tomography is an established field of investigation in medicine and in industrial production

  10. Computerized tomography and its diagnostic value in the imaging of limbs

    International Nuclear Information System (INIS)

    Myllylae, V.; Tervonen, O.; Paeivaensalo, M.; Jalovaara, P.; Merikanto, J.; Maekaeraeinen, H.; Oulu Univ.

    1987-01-01

    The application of computerized tomography in skeletal diagnostics has many advantages over plain film radiography. Its use as a supplementary technique in traumatology and oncology is strongly advised. (orig.) [de

  11. Computerized statistical analysis with bootstrap method in nuclear medicine

    International Nuclear Information System (INIS)

    Zoccarato, O.; Sardina, M.; Zatta, G.; De Agostini, A.; Barbesti, S.; Mana, O.; Tarolo, G.L.

    1988-01-01

    Statistical analysis of data samples involves some hypotheses about the features of the data themselves. The accuracy of these hypotheses can influence the results of statistical inference. Among the new methods of computer-aided statistical analysis, the bootstrap method appears to be one of the most powerful, thanks to its ability to reproduce many artificial samples starting from a single original sample and because it works without hypotheses about the data distribution. The authors applied the bootstrap method to two typical situations of a Nuclear Medicine Department. The determination of the normal range of serum ferritin, as assessed by radioimmunoassay and defined by the mean value ±2 standard deviations, starting from an experimental sample of small size, gives an unacceptable lower limit (ferritin plasma levels below zero). On the contrary, the results obtained by elaborating 5000 bootstrap samples give an interval of values (10.95 ng/ml - 72.87 ng/ml) corresponding to the normal ranges commonly reported. Moreover, the authors applied the bootstrap method in evaluating the possible error associated with the correlation coefficient determined between left ventricular ejection fraction (LVEF) values obtained by first-pass radionuclide angiocardiography with 99mTc and 195mAu. The results obtained indicate a high degree of statistical correlation and give the range of r² values to be considered acceptable for this type of study
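
    The resampling idea is easy to reproduce. The sketch below estimates a normal range by percentile bootstrap, avoiding the Gaussian mean ± 2 SD assumption that produced the negative lower limit; the ferritin values in the example are hypothetical and the percentile scheme is only one of several bootstrap variants.

```python
import numpy as np

rng = np.random.default_rng(0)

def bootstrap_normal_range(sample, n_boot=5000, coverage=0.95):
    """Percentile-bootstrap limits for a 'normal range', without assuming
    the data are Gaussian. `sample` is the original set of assay values."""
    sample = np.asarray(sample, float)
    lo_q, hi_q = (1 - coverage) / 2, 1 - (1 - coverage) / 2
    lows, highs = [], []
    for _ in range(n_boot):
        # Resample with replacement from the single original sample.
        resample = rng.choice(sample, size=sample.size, replace=True)
        lows.append(np.quantile(resample, lo_q))
        highs.append(np.quantile(resample, hi_q))
    # Report the median bootstrap estimate of each limit.
    return float(np.median(lows)), float(np.median(highs))

# Hypothetical serum ferritin values (ng/ml) from a small sample.
ferritin = [12.0, 18.5, 25.0, 31.2, 40.7, 22.3, 55.1, 15.8, 64.0, 29.9]
print(bootstrap_normal_range(ferritin))
```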

  12. [Failure mode and effects analysis on computerized drug prescriptions].

    Science.gov (United States)

    Paredes-Atenciano, J A; Roldán-Aviña, J P; González-García, Mercedes; Blanco-Sánchez, M C; Pinto-Melero, M A; Pérez-Ramírez, C; Calvo Rubio-Burgos, Miguel; Osuna-Navarro, F J; Jurado-Carmona, A M

    2015-01-01

    To identify and analyze errors in the drug prescriptions of patients treated in a "high resolution" hospital by applying a Failure Mode and Effects Analysis (FMEA). Material and methods: A multidisciplinary group from different medical specialties and nursing analyzed medical records in which drug prescriptions were held in free-text format. An FMEA was developed in which the risk priority index (RPI) was obtained from a cross-sectional observational study using an audit of the medical records, carried out in two phases: (1) pre-intervention testing, and (2) evaluation of improvement actions after the first analysis. An audit sample size of 679 medical records from a total of 2,096 patients was calculated using stratified sampling and random selection of clinical events. Prescription errors decreased by 22.2% in the second phase. The FMEA showed a greater RPI for "unspecified route of administration" and "dosage unspecified", with no significant decreases observed in the second phase, although it did detect "incorrect dosing time", "contraindication due to drug allergy", "wrong patient" and "duplicate prescription", which resulted in the improvement of prescriptions. Drug prescription errors have been identified and analyzed by the FMEA methodology, improving the clinical safety of these prescriptions. This tool allows updates of electronic prescribing to be monitored. Avoiding such errors would require the mandatory completion of all sections of a prescription. Copyright © 2014 SECA. Published by Elsevier Espana. All rights reserved.
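
    For readers unfamiliar with how an FMEA ranks failure modes, the sketch below shows the conventional risk priority index as the product of severity, occurrence and detectability scores. The failure-mode names are taken from the abstract, but the numerical scores are purely hypothetical.

```python
from dataclasses import dataclass

@dataclass
class FailureMode:
    description: str
    severity: int      # 1 (negligible) .. 10 (catastrophic)
    occurrence: int    # 1 (rare)       .. 10 (very frequent)
    detection: int     # 1 (always detected) .. 10 (undetectable)

    @property
    def rpi(self) -> int:
        # Conventional FMEA risk priority index (often called RPN).
        return self.severity * self.occurrence * self.detection

# Hypothetical scores for illustration only.
modes = [
    FailureMode("Unspecified route of administration", 6, 7, 5),
    FailureMode("Dosage unspecified", 7, 6, 5),
    FailureMode("Duplicate prescription", 5, 3, 4),
]
for m in sorted(modes, key=lambda m: m.rpi, reverse=True):
    print(f"{m.rpi:4d}  {m.description}")
```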

  13. A basic framework for the analysis of the human error potential due to the computerization in nuclear power plants

    International Nuclear Information System (INIS)

    Lee, Y. H.

    1999-01-01

    Computerization and the benefits expected from it in nuclear power plant design cannot be realized without verifying the inherent safety problems. The human error aspect is also included among the verification issues. The verification spans from the perception of the changes in operating functions, such as automation, to the unfamiliar experience of operators due to the interface change. Therefore, a new framework for human error analysis should capture both the positive and the negative effects of computerization. This paper suggests a basic framework for error identification based on a review of the existing human error studies and the experience of computerization in nuclear power plants

  14. Design and development of computerized local and overall country's environmental data analysis network system

    International Nuclear Information System (INIS)

    Kim, Chang Gyu; Kang, Jong Gyu; Han, H.; Han, J. S.; Lee, Y. D.; Lee, S. R.; Kang, D. J.; Cho, Y. G.; Yun, S. H.

    2001-03-01

    In this development, we designed an integrated database for efficient processing of radiation-environment data and developed the CLEAN (Computerized Local and overall country's Environmental data Analysis Network) system. The CLEAN system consists of a local radiation-environment network, a data analysis system and a data open system. We focused the development of the CLEAN system on building an integrated database, a data mart, and a CLEAN web site. It is expected that the developed system, which organizes the information related to environmental radiation data systematically, can be utilized for accurate interpretation, analysis and evaluation

  15. Spinal imaging and image analysis

    CERN Document Server

    Yao, Jianhua

    2015-01-01

    This book is instrumental to building a bridge between scientists and clinicians in the field of spine imaging by introducing state-of-the-art computational methods in the context of clinical applications.  Spine imaging via computed tomography, magnetic resonance imaging, and other radiologic imaging modalities, is essential for noninvasively visualizing and assessing spinal pathology. Computational methods support and enhance the physician’s ability to utilize these imaging techniques for diagnosis, non-invasive treatment, and intervention in clinical practice. Chapters cover a broad range of topics encompassing radiological imaging modalities, clinical imaging applications for common spine diseases, image processing, computer-aided diagnosis, quantitative analysis, data reconstruction and visualization, statistical modeling, image-guided spine intervention, and robotic surgery. This volume serves a broad audience as  contributions were written by both clinicians and researchers, which reflects the inte...

  16. Retinal Imaging and Image Analysis

    Science.gov (United States)

    Abràmoff, Michael D.; Garvin, Mona K.; Sonka, Milan

    2011-01-01

    Many important eye diseases as well as systemic diseases manifest themselves in the retina. While a number of other anatomical structures contribute to the process of vision, this review focuses on retinal imaging and image analysis. Following a brief overview of the most prevalent causes of blindness in the industrialized world that includes age-related macular degeneration, diabetic retinopathy, and glaucoma, the review is devoted to retinal imaging and image analysis methods and their clinical implications. Methods for 2-D fundus imaging and techniques for 3-D optical coherence tomography (OCT) imaging are reviewed. Special attention is given to quantitative techniques for analysis of fundus photographs with a focus on clinically relevant assessment of retinal vasculature, identification of retinal lesions, assessment of optic nerve head (ONH) shape, building retinal atlases, and to automated methods for population screening for retinal diseases. A separate section is devoted to 3-D analysis of OCT images, describing methods for segmentation and analysis of retinal layers, retinal vasculature, and 2-D/3-D detection of symptomatic exudate-associated derangements, as well as to OCT-based analysis of ONH morphology and shape. Throughout the paper, aspects of image acquisition, image analysis, and clinical relevance are treated together considering their mutually interlinked relationships. PMID:22275207

  17. Usefulness of computerized method for lung nodule detection on digital chest radiographs using similar subtraction images from different patients

    International Nuclear Information System (INIS)

    Aoki, Takatoshi; Oda, Nobuhiro; Yamashita, Yoshiko; Yamamoto, Keiji; Korogi, Yukunori

    2012-01-01

    Purpose: The purpose of this study is to evaluate the usefulness of a novel computerized method to select automatically the similar chest radiograph for image subtraction in the patients who have no previous chest radiographs and to assist the radiologists’ interpretation by presenting the “similar subtraction image” from different patients. Materials and methods: Institutional review board approval was obtained, and the requirement for informed patient consent was waived. A large database of approximately 15,000 normal chest radiographs was used for searching similar images of different patients. One hundred images of candidates were selected according to two clinical parameters and similarity of the lung field in the target image. We used the correlation value of chest region in the 100 images for searching the most similar image. The similar subtraction images were obtained by subtracting the similar image selected from the target image. Thirty cases with lung nodules and 30 cases without lung nodules were used for an observer performance test. Four attending radiologists and four radiology residents participated in this observer performance test. Results: The AUC for all radiologists increased significantly from 0.925 to 0.974 with the CAD (P = .004). When the computer output images were available, the average AUC for the residents was more improved (0.960 vs. 0.890) than for the attending radiologists (0.987 vs. 0.960). Conclusion: The novel computerized method for lung nodule detection using similar subtraction images from different patients would be useful to detect lung nodules on digital chest radiographs, especially for less experienced readers.
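
    The core of the method, choosing the most similar normal radiograph by correlation and subtracting it, can be sketched as follows. This is a simplified illustration: it assumes the candidate images are already registered and cropped to the same chest region, and it uses plain normalized cross-correlation as the similarity measure.

```python
import numpy as np

def most_similar_image(target, candidates):
    """Pick the candidate whose chest region correlates best with the target.

    `target` is a 2D array; `candidates` is a list of 2D arrays already
    registered and cropped to the same chest region (registration omitted).
    """
    t = (target - target.mean()) / (target.std() + 1e-9)
    best_idx, best_corr = -1, -np.inf
    for i, c in enumerate(candidates):
        cn = (c - c.mean()) / (c.std() + 1e-9)
        corr = float(np.mean(t * cn))        # normalized cross-correlation
        if corr > best_corr:
            best_idx, best_corr = i, corr
    return best_idx, best_corr

def similar_subtraction(target, similar):
    """Temporal-subtraction-style difference image from a different patient."""
    return target.astype(float) - similar.astype(float)
```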

  18. Soft tissue segmentation and 3D display from computerized tomography and magnetic resonance imaging

    International Nuclear Information System (INIS)

    Fan, R.T.; Trivedi, S.S.; Fellingham, L.L.; Gamboa-Aldeco, A.; Hedgcock, M.W.

    1987-01-01

    Volume calculation and 3D display of human anatomy facilitate a physician's diagnosis, treatment, and evaluation. Accurate segmentation of soft tissue structures is a prerequisite for such volume calculations and 3D displays, but segmentation by hand-outlining structures is often tedious and time-consuming. In this paper, methods based on analysis of statistics of image gray level are applied to segmentation of soft tissue in medical images, with the goal of making segmentation automatic or semi-automatic. The resulting segmented images, volume calculations, and 3D displays are analyzed and compared with results based on physician-drawn outlines as well as actual volume measurements

  19. Using computerized text analysis to assess communication within an Italian type 1 diabetes Facebook group

    Directory of Open Access Journals (Sweden)

    Alda Troncone

    2015-11-01

    The purpose of this study was to assess messages posted by mothers of children with type 1 diabetes in the Italian Facebook group “Mamme e diabete” using computerized text analysis. The data suggest that these mothers use online discussion boards as a place to seek and provide information to better manage the disease’s daily demands—especially those tasks linked to insulin correction and administration, control of food intake, and bureaucratic duties, as well as to seek and give encouragement and to share experiences regarding diabetes and related impact on their life. The implications of these findings for the management of diabetes are discussed.

  20. Feasibility analysis of CNP 1000 computerized I and C system design objectives

    International Nuclear Information System (INIS)

    Zhang Mingguang; Xu Jijun; Zhang Qinshen

    2000-01-01

    The author states the design objectives of the computerized I and C (CIC) system and the advanced main control room (AMCR) that could and should be achieved in CNP 1000, based on the national 1E computer production technology, including software and hardware, and on current instrumentation and control design techniques for nuclear power plants. A feasibility analysis of the design objectives and the reasons for, and necessity of, carrying out the design research projects are described. The objectives of the design research on the CIC and AMCR, as well as the expected self-design proficiency after the design research, are given

  1. Retinal imaging and image analysis

    NARCIS (Netherlands)

    Abramoff, M.D.; Garvin, Mona K.; Sonka, Milan

    2010-01-01

    Many important eye diseases as well as systemic diseases manifest themselves in the retina. While a number of other anatomical structures contribute to the process of vision, this review focuses on retinal imaging and image analysis. Following a brief overview of the most prevalent causes of

  2. Vascular fluorescence casting and imaging cryomicrotomy for computerized three-dimensional renal arterial reconstruction

    NARCIS (Netherlands)

    Lagerveld, B.W.; Wee, ter R.; Rosette, de la J.J.M.C.H.; Spaan, J.A.; Wijkstra, H.

    2010-01-01

    OBJECTIVE To assess the combined use of a casting technique, cryomicrotomy imaging, and three-dimensional (3D) computer analysis as a method for visualizing and reconstructing the arterial vascular tree in a porcine renal model. MATERIAL AND METHODS The arterial branches of two porcine kidneys were

  3. Vascular fluorescence casting and imaging cryomicrotomy for computerized three-dimensional renal arterial reconstruction

    NARCIS (Netherlands)

    Lagerveld, Brunolf W.; ter Wee, Rene D.; de La Rosette, Jean J. M. C. H.; Spaan, Jos A. E.; Wijkstra, Hessel

    2007-01-01

    OBJECTIVES To assess the combined use of a casting technique, cryomicrotomy imaging, and three-dimensional (3D) computer analysis as a method for visualizing and reconstructing the arterial vascular tree in a porcine renal model. MATERIAL AND METHODS The arterial branches of two porcine kidneys were

  4. Computerized tomography with X-rays: an instrument in the analysis physico-chemical between formations and drilling fluids interactions

    International Nuclear Information System (INIS)

    Coelho, Marcus Vinicius Cavalcante

    1998-01-01

    This study demonstrates the applicability of the X-ray computerized tomography technique for evaluating the degree of reactivity between various drilling fluids and argillaceous sediments (shales and sandstones). The research was conducted in the Rock-Fluid Interaction Pressure Simulator (RFIPS), where possible physico-chemical alterations can be observed through successive tomography images obtained during the flow of the fluid through the samples. In addition, the formation of mud cake in Berea sandstone samples in the RFIPS was observed through X-ray computerized tomography when drilling fluids weighted with barite were used. (author)

  5. Computerized analysis of fetal heart rate variability signal during the stages of labor.

    Science.gov (United States)

    Annunziata, Maria Laura; Tagliaferri, Salvatore; Esposito, Francesca Giovanna; Giuliano, Natascia; Mereghini, Flavia; Di Lieto, Andrea; Campanile, Marta

    2016-03-01

    To analyze computerized cardiotocographic (cCTG) parameters (baseline fetal heart rate, baseline FHR; short term variability, STV; approximate entropy, ApEn; low frequency, LF; movement frequency, MF; high frequency, HF) in physiological pregnancy in order to correlate them with the stages of labor. This could provide more information for understanding the mechanisms of nervous-system control of FHR during labor progression. A total of 534 pregnant women were monitored with cCTG from the 37th week, before the onset of spontaneous labor and during the first and the second stages of labor. Statistical analysis was performed using the Kruskal-Wallis test and the Wilcoxon rank-sum test with a Bonferroni-adjusted α, comparing the periods before labor and during the first and second stages of labor. Differences between some of the stages were found for ApEn, LF and LF/(HF + MF), where the first and the third were reduced and the second was increased. cCTG modifications during labor may reflect the physiological increase in activation of the autonomic nervous system. Using computerized fetal heart rate analysis during labor, it may be possible to obtain more information from the fetal cardiac signal in comparison with the traditional tracing. © 2016 Japan Society of Obstetrics and Gynecology.
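
    Of the cCTG indices listed, approximate entropy is the least self-explanatory; the sketch below implements Pincus' ApEn(m, r) for a 1-D heart-rate series. The default tolerance r = 0.2·SD is a common choice in the heart-rate-variability literature, not necessarily the setting used in this study.

```python
import numpy as np

def approximate_entropy(x, m=2, r=None):
    """Pincus' ApEn(m, r) of a 1-D signal (e.g. beat-to-beat FHR samples)."""
    x = np.asarray(x, float)
    n = x.size
    if r is None:
        r = 0.2 * np.std(x)   # common convention: 20% of the signal SD

    def phi(m):
        # Embed the signal into overlapping vectors of length m.
        emb = np.array([x[i:i + m] for i in range(n - m + 1)])
        # Chebyshev distance between every pair of vectors (self-matches kept).
        dist = np.max(np.abs(emb[:, None, :] - emb[None, :, :]), axis=2)
        c = np.mean(dist <= r, axis=1)
        return np.mean(np.log(c))

    return phi(m) - phi(m + 1)
```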

  6. Design of aerosol face masks for children using computerized 3D face analysis.

    Science.gov (United States)

    Amirav, Israel; Luder, Anthony S; Halamish, Asaf; Raviv, Dan; Kimmel, Ron; Waisman, Dan; Newhouse, Michael T

    2014-08-01

    Aerosol masks were originally developed for adults and downsized for children. Overall fit to minimize dead space and a tight seal are problematic, because children's faces undergo rapid and marked topographic and internal anthropometric changes in their first few months/years of life. Facial three-dimensional (3D) anthropometric data were used to design an optimized pediatric mask. Children's faces (n=271, aged 1 month to 4 years) were scanned with 3D technology. Data for the distance from the bridge of the nose to the tip of the chin (H) and the width of the mouth opening (W) were used to categorize the scans into "small," "medium," and "large" "clusters." "Average" masks were developed from each cluster to provide an optimal seal with minimal dead space. The resulting computerized contour, W and H, were used to develop the SootherMask® that enables children, "suckling" on their own pacifier, to keep the mask on their face, mainly by means of subatmospheric pressure. The relatively wide and flexible rim of the mask accommodates variations in facial size within and between clusters. Unique pediatric face masks were developed based on anthropometric data obtained through computerized 3D face analysis. These masks follow facial contours and gently seal to the child's face, and thus may minimize aerosol leakage and dead space.

  7. A computerized glow curve analysis (GCA) method for WinREMS thermoluminescent dosimeter data using MATLAB

    International Nuclear Information System (INIS)

    Harvey, John A.; Rodrigues, Miesher L.; Kearfott, Kimberlee J.

    2011-01-01

    A computerized glow curve analysis (GCA) program for handling thermoluminescence data originating from WinREMS is presented. The MATLAB program fits the glow peaks using the first-order kinetics model. Tested materials are LiF:Mg,Ti, CaF2:Dy, CaF2:Tm, CaF2:Mn, LiF:Mg,Cu,P, and CaSO4:Dy, with most having an average figure of merit (FOM) of 1.3% or less, and CaSO4:Dy 2.2% or less. Output is a list of fit parameters, peak areas, and graphs for each fit, evaluating each glow curve in 1.5 s or less. - Highlights: → Robust algorithm for performing thermoluminescent dosimeter glow curve analysis. → Written in MATLAB so readily implemented on a variety of computers. → Usage of figure of merit demonstrated for six different materials.
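
    The abstract does not give the program's fitting equation, so the sketch below uses one widely cited single-peak approximation for first-order kinetics (the Kitis analytical expression) together with a least-squares fit and a common figure-of-merit definition; treat the exact formula and the FOM convention as assumptions rather than details of the WinREMS program.

```python
import numpy as np
from scipy.optimize import curve_fit

K_BOLTZ = 8.617333e-5  # Boltzmann constant, eV/K

def first_order_peak(T, Im, E, Tm):
    """Kitis single-peak approximation for first-order TL kinetics.
    T, Tm in kelvin; E (activation energy) in eV; Im is the peak height."""
    d = 2.0 * K_BOLTZ * T / E
    dm = 2.0 * K_BOLTZ * Tm / E
    arg = (E / (K_BOLTZ * T)) * (T - Tm) / Tm
    return Im * np.exp(1.0 + arg - (T ** 2 / Tm ** 2) * np.exp(arg) * (1.0 - d) - dm)

def fit_single_peak(T, counts, p0):
    """Least-squares fit of one glow peak; p0 = (Im, E, Tm) initial guess.
    Returns the fitted parameters and a figure of merit (%)."""
    popt, _ = curve_fit(first_order_peak, T, counts, p0=p0, maxfev=20000)
    fit = first_order_peak(T, *popt)
    fom = 100.0 * np.sum(np.abs(counts - fit)) / np.sum(fit)
    return popt, fom
```

    A multi-peak curve would be fitted as a sum of such terms plus a background, with one (Im, E, Tm) triplet per peak.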

  8. Study on beam geometry and image reconstruction algorithm in fast neutron computerized tomography at NECTAR facility

    Science.gov (United States)

    Guo, J.; Bücherl, T.; Zou, Y.; Guo, Z.

    2011-09-01

    Investigations of the fast neutron beam geometry for the NECTAR facility are presented. The results of MCNP simulations and experimental measurements of the beam distributions at NECTAR are compared. Boltzmann functions are used to describe the beam profile in the detection plane, assuming the area source to be made up of a large number of single neutron point sources. An iterative algebraic reconstruction algorithm is developed, realized and verified with both simulated and measured projection data. The feasibility of improved reconstruction in fast neutron computerized tomography at the NECTAR facility is demonstrated.
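
    The paper's reconstruction algorithm is described only as "iterative algebraic"; a generic additive ART (Kaczmarz) update over a precomputed system matrix, as sketched below, conveys the idea. The relaxation factor, the non-negativity clip and the dense system matrix are illustrative simplifications, not details of the NECTAR implementation.

```python
import numpy as np

def art_reconstruct(A, p, n_iter=20, relax=0.5):
    """Additive ART (Kaczmarz) reconstruction.

    A      : (n_rays, n_pixels) system matrix of ray path lengths
    p      : (n_rays,) measured projection data
    relax  : relaxation factor in (0, 2)
    Returns the reconstructed image as a flat vector.
    """
    x = np.zeros(A.shape[1])
    row_norms = np.einsum('ij,ij->i', A, A)          # ||a_i||^2 per ray
    for _ in range(n_iter):
        for i in range(A.shape[0]):
            if row_norms[i] == 0.0:
                continue
            residual = p[i] - A[i] @ x               # mismatch for this ray
            x += relax * residual / row_norms[i] * A[i]
        np.clip(x, 0.0, None, out=x)                 # keep attenuation non-negative
    return x
```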

  9. Study on beam geometry and image reconstruction algorithm in fast neutron computerized tomography at NECTAR facility

    International Nuclear Information System (INIS)

    Guo, J.; Buecherl, T.; Zou, Y.; Guo, Z.

    2011-01-01

    Investigations of the fast neutron beam geometry for the NECTAR facility are presented. The results of MCNP simulations and experimental measurements of the beam distributions at NECTAR are compared. Boltzmann functions are used to describe the beam profile in the detection plane, assuming the area source to be made up of a large number of single neutron point sources. An iterative algebraic reconstruction algorithm is developed, realized and verified with both simulated and measured projection data. The feasibility of improved reconstruction in fast neutron computerized tomography at the NECTAR facility is demonstrated.

  10. Study on beam geometry and image reconstruction algorithm in fast neutron computerized tomography at NECTAR facility

    Energy Technology Data Exchange (ETDEWEB)

    Guo, J. [State Key Laboratory of Nuclear Physics and Technology and School of Physics, Peking University, 5 Yiheyuan Lu, Beijing 100871 (China); Lehrstuhl fuer Radiochemie, Technische Universitaet Muenchen, Garching 80748 (Germany); Buecherl, T. [Lehrstuhl fuer Radiochemie, Technische Universitaet Muenchen, Garching 80748 (Germany); Zou, Y., E-mail: zouyubin@pku.edu.cn [State Key Laboratory of Nuclear Physics and Technology and School of Physics, Peking University, 5 Yiheyuan Lu, Beijing 100871 (China); Guo, Z. [State Key Laboratory of Nuclear Physics and Technology and School of Physics, Peking University, 5 Yiheyuan Lu, Beijing 100871 (China)

    2011-09-21

    Investigations of the fast neutron beam geometry for the NECTAR facility are presented. The results of MCNP simulations and experimental measurements of the beam distributions at NECTAR are compared. Boltzmann functions are used to describe the beam profile in the detection plane, assuming the area source to be made up of a large number of single neutron point sources. An iterative algebraic reconstruction algorithm is developed, realized and verified with both simulated and measured projection data. The feasibility of improved reconstruction in fast neutron computerized tomography at the NECTAR facility is demonstrated.

  11. Computerized gait analysis in Legg Calvé Perthes disease--analysis of the frontal plane.

    Science.gov (United States)

    Westhoff, Bettina; Petermann, Andrea; Hirsch, Mark A; Willers, Reinhart; Krauspe, Rüdiger

    2006-10-01

    Current follow-up and outcome studies of Legg Calvé Perthes disease (LCPD) are based on subjective measures of function, clinical parameters and radiological changes [Herring JA, Kim HT, Browne RH. Legg-Calvé-Perthes disease. Part II: prospective multicenter study of the effect of treatment on outcome. J Bone Joint Surg 2004;86A:2121-34; Aksoy MC, Cankus MC, Alanay A, Yazici M, Caglar O, Alpaslan AM. Radiological outcome of proximal femoral varus osteotomy for the treatment of lateral pillar group-C. J Pediatr Orthop 2005;14B:88-91; Kitakoji T, Hattori T, Kitoh H, Katho M, Ishiguro N. Which is a better method for Perthes' disease: femoral varus or Salter osteotomy? Clin Orthop 2005;430:163-170; Joseph B, Rao N, Mulpuri K, Varghese G, Nair S. How does femoral varus osteotomy alter the natural evolution of Perthes' disease. J Pediatr Orthop 2005;14B:10-5; Ishida A, Kuwajima SS, Laredo FJ, Milani C. Salter innominate osteotomy in the treatment of severe Legg-Calvé-Perthes disease: clinical and radiographic results in 32 patients (37 hips) at skeletal maturity. J Pediatr Orthop 2004;24:257-64.]. The objective of this study was to evaluate the frontal plane kinematics and the effect on hip joint loading on the affected side in children with a radiographic diagnosis of LCPD. Computerized, three-dimensional gait analysis was performed in 33 individuals aged ≥5 years (mean 8.0+/-2 years) with unilateral LCPD and no history of previous surgery to the hip or any disorder leading to gait abnormality. Frontal plane kinematics and kinetics were compared to a group of healthy children (n=30, mean age 8.1+/-1.2 years). Hip joint loading was estimated as a function of the hip abductor moment. Subjects with LCPD demonstrated two distinct frontal plane gait patterns, both deviating from normal. Type 1 (n=3) was characterized by a pelvic drop of the swinging limb, a trunk lean in relation to the pelvis towards the stance limb and hip adduction during stance phase and

  12. Determining Women’s Sexual Self-Schemas Through Advanced Computerized Text Analysis

    Science.gov (United States)

    Stanton, Amelia M.; Boyd, Ryan L.; Pulverman, Carey S.; Meston, Cindy M.

    2015-01-01

    The meaning extraction method (MEM), an advanced computerized text analysis technique, was used to analyze women’s sexual self-schemas. Participants (n = 239) completed open-ended essays about their personal feelings associated with sex and sexuality. These essays were analyzed using the MEM, a procedure designed to extract common themes from natural language. Using the MEM procedure, we extracted seven unique themes germane to sexual self-schemas: family and development, virginity, abuse, relationship, sexual activity, attraction, and existentialism. Each of these themes is comprised of frequently used words across the participants’ descriptions of their sexual selves. Significant differences in sexual self-schemas were observed to covary with age, relationship status, and sexual abuse history. PMID:26146161

  13. Determining women's sexual self-schemas through advanced computerized text analysis.

    Science.gov (United States)

    Stanton, Amelia M; Boyd, Ryan L; Pulverman, Carey S; Meston, Cindy M

    2015-08-01

    The meaning extraction method (MEM), an advanced computerized text analysis technique, was used to analyze women's sexual self-schemas. Participants (n=239) completed open-ended essays about their personal feelings associated with sex and sexuality. These essays were analyzed using the MEM, a procedure designed to extract common themes from natural language. Using the MEM procedure, we extracted seven unique themes germane to sexual self-schemas: family and development, virginity, abuse, relationship, sexual activity, attraction, and existentialism. Each of these themes is comprised of frequently used words across the participants' descriptions of their sexual selves. Significant differences in sexual self-schemas were observed to covary with age, relationship status, and sexual abuse history. Copyright © 2015 Elsevier Ltd. All rights reserved.

  14. Computerized Modeling and Loaded Tooth Contact Analysis of Hypoid Gears Manufactured by Face Hobbing Process

    Science.gov (United States)

    Nishino, Takayuki

    The face hobbing process has been widely applied in the automotive industry, but so far few analytical tools have been developed for it, which makes it difficult to optimize gear design. To address this situation, this study aims at developing a computerized tool to predict running performances such as the loaded tooth contact pattern, static transmission error and so on. First, based upon a kinematical analysis of the cutting machine, a mathematical description of tooth surface generation is given. Second, based upon the theory of gearing and differential geometry, the conjugate tooth surfaces are studied and contact lines are generated. Third, the load distribution along the contact lines is formulated. Last, the numerical model is validated by measuring the loaded transmission error and the loaded tooth contact pattern.

  15. Reliability of Computerized Neurocognitive Tests for Concussion Assessment: A Meta-Analysis.

    Science.gov (United States)

    Farnsworth, James L; Dargo, Lucas; Ragan, Brian G; Kang, Minsoo

    2017-09-01

      Although widely used, computerized neurocognitive tests (CNTs) have been criticized because of low reliability and poor sensitivity. A systematic review was published summarizing the reliability of Immediate Post-Concussion Assessment and Cognitive Testing (ImPACT) scores; however, this was limited to a single CNT. Expansion of the previous review to include additional CNTs and a meta-analysis is needed. Therefore, our purpose was to analyze reliability data for CNTs using meta-analysis and examine moderating factors that may influence reliability.   A systematic literature search (key terms: reliability, computerized neurocognitive test, concussion) of electronic databases (MEDLINE, PubMed, Google Scholar, and SPORTDiscus) was conducted to identify relevant studies.   Studies were included if they met all of the following criteria: used a test-retest design, involved at least 1 CNT, provided sufficient statistical data to allow for effect-size calculation, and were published in English.   Two independent reviewers investigated each article to assess inclusion criteria. Eighteen studies involving 2674 participants were retained. Intraclass correlation coefficients were extracted to calculate effect sizes and determine overall reliability. The Fisher Z transformation adjusted for sampling error associated with averaging correlations. Moderator analyses were conducted to evaluate the effects of the length of the test-retest interval, intraclass correlation coefficient model selection, participant demographics, and study design on reliability. Heterogeneity was evaluated using the Cochran Q statistic.   The proportion of acceptable outcomes was greatest for the Axon Sports CogState Test (75%) and lowest for the ImPACT (25%). Moderator analyses indicated that the type of intraclass correlation coefficient model used significantly influenced effect-size estimates, accounting for 17% of the variation in reliability.   The Axon Sports CogState Test, which
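
    The pooling step mentioned above, averaging correlations through Fisher's z transformation, is easy to sketch; the ICC values and sample sizes in the example are hypothetical, and the n − 3 weights correspond to the usual inverse-variance weighting of Fisher-z values.

```python
import numpy as np

def pooled_reliability(iccs, weights=None):
    """Average test-retest correlations via Fisher's z transformation.

    Each ICC is transformed with z = atanh(r), averaged (optionally weighted,
    e.g. by n - 3), and transformed back with tanh, which reduces the bias
    introduced by averaging correlations directly.
    """
    z = np.arctanh(np.asarray(iccs, float))
    z_bar = np.average(z, weights=weights)
    return float(np.tanh(z_bar))

# Hypothetical ICCs from three test-retest studies with different sample sizes.
iccs = [0.61, 0.74, 0.55]
ns = [40, 120, 60]
weights = [n - 3 for n in ns]   # inverse-variance weights for Fisher z
print(pooled_reliability(iccs, weights))
```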

  16. A Computerized QC Analysis of TLD Glow Curves for Personal Dosimetry Measurements Using Tag QC Program

    International Nuclear Information System (INIS)

    Primo, S.; Datz, H.; Dar, A.

    2014-01-01

    The External Dosimetry Lab (EDL) of the Radiation Safety Division at the Soreq Nuclear Research Center (SNRC) is ISO 17025 certified and provides its services to approximately 13,000 users throughout the country from various sectors such as medical, industrial and academic. About 95% of the users are monitored monthly for X-ray radiation using Thermoluminescence Dosimeter (TLD) cards that contain three LiF:Mg,Ti elements; the other users, who also work with thermal neutrons, use TLD cards that contain four LiF:Mg,Ti elements. All TLD cards are read with the Thermo 8800pc reader. A suspicious TLD glow curve (GC) can cause a wrong dose estimation, so the EDL makes great efforts to ensure that each GC undergoes a careful QC procedure. The current QC procedure is performed manually in several steps using different software tools and databases, in a long and complicated procedure: EDL staff need to export all the results/GCs to be checked to an Excel file, then find the suspicious GCs in a different program (WinREMS) and, according to the GC shapes (Figure 1 illustrates suitable and suspicious GC shapes) and the ratios between the element result values, the inspecting technician corrects the data. The motivation for developing the new program is the complicated and time-consuming nature of the manual procedure given the large number of TLDs each month (13,000), similarly to other dosimetry services that use computerized QC GC analysis. It is important to note that only ~25% of the results are above the EDL recording level (0.10 mSv) and need to be inspected. Thus, the purpose of this paper is to describe a new program, TagQC, which allows a computerized QC GC analysis that identifies suspicious TLD GCs automatically, swiftly, and accurately

  17. Renal calyceal anatomy characterization with 3-dimensional in vivo computerized tomography imaging.

    Science.gov (United States)

    Miller, Joe; Durack, Jeremy C; Sorensen, Mathew D; Wang, James H; Stoller, Marshall L

    2013-02-01

    Calyceal selection for percutaneous renal access is critical for safe, effective performance of percutaneous nephrolithotomy. Available anatomical evidence is contradictory and incomplete. We present detailed renal calyceal anatomy obtained from in vivo 3-dimensional computerized tomography renderings. A total of 60 computerized tomography urograms were randomly selected. The renal collecting system was isolated and 3-dimensional renderings were constructed. The primary plane of each calyceal group of 100 kidneys was determined. A coronal maximum intensity projection was used for simulated percutaneous access. The most inferior calyx was designated calyx 1. Moving superiorly, the subsequent calyces were designated calyx 2 and, when present, calyx 3. The surface rendering was rotated to assess the primary plane of the calyceal group and the orientation of the selected calyx. The primary plane of the upper pole calyceal group was mediolateral in 95% of kidneys and the primary plane of the lower pole calyceal group was anteroposterior in 95%. Calyx 2 was chosen in 90 of 97 simulations and it was appropriate in 92%. Calyx 3 was chosen in 7 simulations but it was appropriate in only 57%. Calyx 1 was not selected in any simulation and it was anteriorly oriented in 75% of kidneys. Appropriate lower pole calyceal access can be reliably accomplished with an understanding of the anatomical relationship between individual calyceal orientation and the primary plane of the calyceal group. Calyx 2 is most often appropriate for accessing the anteroposterior primary plane of the lower pole. Calyx 1 is most commonly oriented anteriorly. Copyright © 2013 American Urological Association Education and Research, Inc. Published by Elsevier Inc. All rights reserved.

  18. Computerized detection of breast cancer on automated breast ultrasound imaging of women with dense breasts

    International Nuclear Information System (INIS)

    Drukker, Karen; Sennett, Charlene A.; Giger, Maryellen L.

    2014-01-01

    Purpose: Develop a computer-aided detection method and investigate its feasibility for detection of breast cancer in automated 3D ultrasound images of women with dense breasts. Methods: The HIPAA compliant study involved a dataset of volumetric ultrasound image data, “views,” acquired with an automated U-Systems Somo•V ® ABUS system for 185 asymptomatic women with dense breasts (BI-RADS Composition/Density 3 or 4). For each patient, three whole-breast views (3D image volumes) per breast were acquired. A total of 52 patients had breast cancer (61 cancers), diagnosed through any follow-up at most 365 days after the original screening mammogram. Thirty-one of these patients (32 cancers) had a screening-mammogram with a clinically assigned BI-RADS Assessment Category 1 or 2, i.e., were mammographically negative. All software used for analysis was developed in-house and involved 3 steps: (1) detection of initial tumor candidates, (2) characterization of candidates, and (3) elimination of false-positive candidates. Performance was assessed by calculating the cancer detection sensitivity as a function of the number of “marks” (detections) per view. Results: At a single mark per view, i.e., six marks per patient, the median detection sensitivity by cancer was 50.0% (16/32) ± 6% for patients with a screening mammogram-assigned BI-RADS category 1 or 2—similar to radiologists’ performance sensitivity (49.9%) for this dataset from a prior reader study—and 45.9% (28/61) ± 4% for all patients. Conclusions: Promising detection sensitivity was obtained for the computer on a 3D ultrasound dataset of women with dense breasts at a rate of false-positive detections that may be acceptable for clinical implementation

  19. Impact of a computerized provider radiography order entry system without clinical decision support on emergency department medical imaging requests.

    Science.gov (United States)

    Claret, Pierre-Géraud; Bobbia, Xavier; Macri, Francesco; Stowell, Andrew; Motté, Antony; Landais, Paul; Beregi, Jean-Paul; de La Coussaye, Jean-Emmanuel

    2016-06-01

    The adoption of computerized physician order entry is an important cornerstone of using health information technology (HIT) in health care. The transition from paper to computer forms presents a change in physicians' practices. The main objective of this study was to investigate the impact of implementing a computer-based order entry (CPOE) system without clinical decision support on the number of radiographs ordered for patients admitted to the emergency department. This single-center pre-/post-intervention study was conducted in January 2013 (before-CPOE period) and January 2014 (after-CPOE period) at the emergency department of Nîmes University Hospital. All patients admitted to the emergency department who had undergone medical imaging were included in the study. Emergency department admissions increased after the implementation of CPOE (5388 patients in the period before CPOE implementation vs. 5808 patients after CPOE implementation, p=.008). In the period before CPOE implementation, 2345 patients (44%) had undergone medical imaging; in the period after CPOE implementation, 2306 patients (40%) had undergone medical imaging (p=.008). In the period before CPOE, 2916 medical imaging procedures were ordered; in the period after CPOE, 2876 medical imaging procedures were ordered (p=.006). In the period before CPOE, 1885 radiographs were ordered; in the period after CPOE, 1776 radiographs were ordered, a significant decrease; other medical imaging did not vary between the two periods. Our results show a decrease in the number of radiograph requests after a CPOE system without clinical decision support was implemented in our emergency department. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  20. COST ANALYSIS OF ALTERNATIVE COMPUTERIZED SYSTEMS FOR THE MARKETING AND DISTRIBUTION OF MULTIPLE FOOD COMMODITIES

    OpenAIRE

    Epperson, James E.; Helmreich, D.P.; Moon, Leonard C.; Carley, Dale H.; Huang, Chung L.; Fletcher, Stanley M.

    1981-01-01

    The authors make cost comparisons among alternative computerized marketing systems. The systems described could encompass any number of commodities and stages of distribution involving cash and/or futures transactions.

  1. Analysis of errors during medical and computerized diagnostics of spherical lung neoplasms

    International Nuclear Information System (INIS)

    Pozmogov, A.I.; Petruk, D.A.

    1985-01-01

    Reasons for errors in the medical and computerized diagnostics of spherical lung neoplasms are studied on the basis of 212 case records and clinico-roentgenological data; this should promote improvement of their diagnosis

  2. The use of transport and diffusion equations in the three-dimensional reconstruction of computerized tomographic images

    Energy Technology Data Exchange (ETDEWEB)

    Pires, Sandrerley Ramos, E-mail: sandrerley@eee.ufg.br [Escola de Engenharia Eletrica e de Computacao - EEEC, Universidade Federal de Goias - UFG, Goiania, GO (Brazil); Flores, Edna Lucia; Pires, Dulcineia Goncalves F.; Carrijo, Gilberto Arantes; Veiga, Antonio Claudio Paschoarelli [Faculdade de Engenharia Eletrica - FEELT, Universidade Federal de Uberlandia - UFU, Uberlandia, MG (Brazil); Barcelos, Celia Aparecida Z. [Faculdade de Matematica, Universidade Federal de Uberlandia - UFU, Uberlandia, MG (Brazil)

    2012-09-15

    The visualization of a computerized tomography (CT) exam in 3D increases the quality of the medical diagnosis and, consequently, the probability of success of the treatment. To obtain a high quality image it is necessary to obtain slices which are close to one another. Motivated by the goal of reaching an improved balance between the number of slices and the visualization quality, this research work presents a digital inpainting technique of 3D interpolation for CT slices used in the visualization of human body structures. The inpainting is carried out via non-linear partial differential equations (PDEs). PDEs have been used in the image-processing context to fill in damaged regions of a digital 2D image. Inspired by this idea, this article proposes an interpolation method for filling in the empty regions between CT slices. To do this, considering the high similarity between two consecutive real slices, the first step of the proposed method is to create virtual slices. The virtual slices contain all the similarities between the intercalated real slices and, where there is no similarity between the real slices, the virtual slices contain undefined portions. In the second step of the proposed method, the created virtual slices are used together with the real slice images in the three-dimensional reconstruction of the structure mapped in the exam. The proposed method is capable of reconstructing the curvatures of the patient's internal structures without using slices that are close to one another. The experiments carried out show the proposed method's efficiency. (author)
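
    A much-simplified sketch of the idea follows: build a virtual slice from two neighbouring real slices, keep only the pixels where they agree, and let a heat-equation (isotropic diffusion) update fill the undefined regions, with the known pixels acting as boundary conditions. The agreement tolerance, the initialization and the use of plain isotropic diffusion instead of the paper's non-linear PDE are assumptions made for brevity.

```python
import numpy as np

def diffusion_inpaint(img, undefined_mask, n_iter=500, dt=0.2):
    """Fill undefined pixels of a virtual slice by isotropic diffusion.

    img            : 2D array; values under `undefined_mask` are ignored as input
    undefined_mask : boolean array, True where the virtual slice is undefined
    The Laplacian update is applied only inside the mask, so the known pixels
    act as fixed boundary conditions (np.roll gives periodic borders, which is
    good enough for a sketch).
    """
    u = img.astype(float).copy()
    u[undefined_mask] = np.mean(img[~undefined_mask])   # crude initial guess
    for _ in range(n_iter):
        lap = (np.roll(u, 1, 0) + np.roll(u, -1, 0) +
               np.roll(u, 1, 1) + np.roll(u, -1, 1) - 4.0 * u)
        u[undefined_mask] += dt * lap[undefined_mask]
    return u

def virtual_slice(slice_a, slice_b, tol=50.0):
    """Virtual slice between two real CT slices: keep pixels where the
    neighbours agree within `tol` (in CT numbers), mark the rest undefined."""
    a, b = slice_a.astype(float), slice_b.astype(float)
    similar = np.abs(a - b) <= tol
    return diffusion_inpaint(0.5 * (a + b), ~similar)
```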

  3. Computerized video interaction self-instruction of MR imaging fundamentals utilizing laser disk technology

    International Nuclear Information System (INIS)

    Genberg, R.W.; Javitt, M.C.; Popky, G.L.; Parker, J.A.; Pinkney, M.N.

    1986-01-01

    Interactive computer-assisted self-instruction is emerging as a recognized didactic modality and is now being introduced to teach physicians the physics of MR imaging. The interactive system consists of a PC-compatible computer, a 12-inch laser disk drive, and a high-resolution monitor. The laser disk, capable of storing 54,000 images, is pressed from a previously edited video tape of MR and video images. The interactive approach is achieved through the use of the computer and appropriate software. The software is written to include computer graphics overlays of the laser disk images, to select interactive branching paths (depending on the user's response to directives or questions), and to provide feedback to the user so that he can assess his performance. One of their systems is available for use in the scientific exhibit area

  4. Glandular dose and image quality control in mammography facilities with computerized radiography systems

    International Nuclear Information System (INIS)

    Dantas, Marcelino Vicente de Almeida

    2010-01-01

    Breast cancer is the most common cancer among women, and early detection is critical to its diagnosis and treatment. To date, the most effective method for early detection of breast cancer has been x-ray mammography, for which the screen/film (SF) technique has been the gold standard. However, even though SF combinations have been improved and optimized over the years for breast imaging, there are some critical limitations, including a narrow exposure range, image artifacts, film processing problems, and inflexibility in image processing and film management. In recent years, digital mammography has been introduced in cancer screening programmes, with the screen/film techniques gradually being phased out. Computed radiography (CR), also commonly known as photostimulable phosphor (PSP) imaging or storage phosphor imaging, employs reusable imaging plates and associated hardware and software to acquire and display digital projection radiographs. In this work, a protocol model was tested for performing image quality control and average glandular dose (AGD) evaluation in 19 institutions with computed radiography systems for mammography. The protocol was validated through tests at the Laboratorio de Radioprotecao Aplicada a Mamografia (LARAM) of the Centro de Desenvolvimento da Tecnologia Nuclear (CDTN). The visual evaluation of image quality with the CDMAM phantom showed that 53% of the facilities were able to produce images of excellent quality. Furthermore, the automated evaluation of image quality, using the analysis software cdcom.exe, showed that 57% of the images were considered to be of good quality. The detector linearity test showed that the CR response is very linear, with 95% of the facilities evaluated considered compliant. For image noise, it was found that only 20% of the facilities complied with the parameters established for this test. The average glandular doses that patients may receive during an examination were below the action levels

  5. Computerized follow-up of discrepancies in image interpretation between emergency and radiology departments

    OpenAIRE

    Siegel, Eliot; Groleau, Georgina; Reiner, Bruce; Stair, Thomas

    1998-01-01

    Radiographs are ordered and interpreted for immediate clinical decisions 24 hours a day by emergency physicians (EPs). The Joint Commission for Accreditation of Health Care Organizations requires that all these images be reviewed by radiologists and that there be some mechanism for quality improvement (QI) for discrepant readings. There must be a log of discrepancies and documentation of follow-up activities, but this alone does not guarantee effective QI. Radiologists reviewing images from...

  6. Incomplete-data image reconstructions in industrial x-ray computerized tomography

    International Nuclear Information System (INIS)

    Tam, K.C.; Eberhard, J.W.; Mitchell, K.W.

    1989-01-01

    In earlier works it was concluded that image reconstruction from incomplete data can be achieved through an iterative transform algorithm which utilizes the a priori information on the object to compensate for the missing data. The image is transformed back and forth between the object space and the projection space, being corrected by the a priori information on the object in the object space, and by the known projections in the projection space. The a priori information in the object space includes a boundary enclosing the object, and an upper bound and a lower bound of the object density. In this paper we report the results of testing the iterative transform algorithm on experimental data. X-ray sinogram data of the cross section of a F404 high-pressure turbine blade made of Ni-based superalloy were supplied to us by the Aircraft Engine Business Group of General Electric Company at Cincinnati, Ohio. From the data set we simulated two kinds of incomplete data situations, incomplete projection and limited-angle scanning, and applied the iterative transform algorithm to reconstruct the images. The results validated the practical value of the iterative transform algorithm in reconstructing images from incomplete x-ray data, both incomplete projections and limited-angle data. In all the cases tested there were significant improvements in the appearance of the images after iterations. The visual improvements are substantiated in a quantitative manner by the plots of errors in wall thickness measurements which in general decrease in magnitude with iterations
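    The back-and-forth correction loop described above can be sketched compactly. The snippet below is a generic Gerchberg/Papoulis-style implementation under stated assumptions (scikit-image ≥ 0.19 for radon/iradon with the filter_name argument; a square support mask matching the reconstruction size; density bounds lo and hi); it is not the implementation used on the turbine-blade data.

```python
import numpy as np
from skimage.transform import radon, iradon

def iterative_limited_angle(sinogram, measured_theta, full_theta,
                            support, lo=0.0, hi=1.0, n_iter=20):
    """Iterative transform reconstruction from incomplete projections (sketch).

    sinogram:       measured projections, shape (n_det, len(measured_theta))
    measured_theta: angles (degrees) actually measured (limited range)
    full_theta:     full angular sampling used internally
    support:        boolean (n_det, n_det) mask of the known object boundary
    lo, hi:         a priori lower/upper bounds on the object density
    """
    idx = [int(np.argmin(np.abs(full_theta - t))) for t in measured_theta]
    full_sino = np.zeros((sinogram.shape[0], len(full_theta)))
    full_sino[:, idx] = sinogram

    image = iradon(full_sino, theta=full_theta, filter_name='ramp')
    for _ in range(n_iter):
        # object-space constraints: support boundary and density bounds
        image = np.clip(image, lo, hi)
        image[~support] = 0.0
        # projection-space constraint: reinsert the measured projections
        full_sino = radon(image, theta=full_theta)
        full_sino[:, idx] = sinogram
        image = iradon(full_sino, theta=full_theta, filter_name='ramp')
    return image
```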

  7. Roentgen and X-ray computerized tomographic (CT) imaging of cysts in the maxilla

    International Nuclear Information System (INIS)

    Rahmatulla, M

    1999-01-01

    Two cysts in the maxilla were subjected to routine roentgen imaging followed by CT scanning. Roentgen investigation included periapical, occlusal, and panoramic views. CT imaging included axial and coronal scans. While the roentgen views were adequate for establishing the diagnosis of the cystic lesions, CT scanning was useful in understanding the precise antero-posterior expansion and depth of the lesion. Interpretation of a CT scan of cystic jaw lesions without conventional radiographs can be misleading. Hence, the CT procedure may be used only as a supplement to the routine radiographic investigations, particularly in cystic lesions of the jaws. (author)

  8. Human factor engineering analysis for computerized human machine interface design issues

    International Nuclear Information System (INIS)

    Wang Zhifang; Gu Pengfei; Zhang Jianbo

    2010-01-01

    The application of digital I&C (instrumentation and control) technology in nuclear power plants is a significant improvement in terms of functional performance and flexibility, and it also poses a challenge to operational safety. Most of the new NPPs under construction adopt an advanced control room design which utilizes the computerized human machine interface (HMI) as the main operating means. Thus, it greatly changes the way the operators interact with the plant. This paper introduces the main challenges brought about by computerized technology from the human factors engineering perspective and addresses the main issues to be dealt with in the computerized HMI design process. Based on an operator task-resources-cognitive model, it states that the root cause of human errors is the mismatch between resource demand and supply. A task-oriented HMI design principle is also discussed. (authors)

  9. Trends in computerized structural analysis and synthesis; Proceedings of the Symposium, Washington, D.C., October 30-November 1, 1978

    Science.gov (United States)

    Noor, A. K. (Editor); Mccomb, H. G., Jr.

    1978-01-01

    The subjects considered are related to future directions of structural applications and potential of new computing systems, advances and trends in data management and engineering software development, advances in applied mathematics and symbolic computing, computer-aided instruction and interactive computer graphics, nonlinear analysis, dynamic analysis and transient response, structural synthesis, structural analysis and design systems, advanced structural applications, supercomputers, numerical analysis, and trends in software systems. Attention is given to the reliability and optimality of the finite element method, computerized symbolic manipulation in structural mechanics, a standard computer graphics subroutine package, and a drag method as a finite element mesh generation scheme.

  10. Computerized breast cancer analysis system using three stage semi-supervised learning method.

    Science.gov (United States)

    Sun, Wenqing; Tseng, Tzu-Liang Bill; Zhang, Jianying; Qian, Wei

    2016-10-01

    A large amount of labeled medical image data is usually required to train a well-performing computer-aided detection (CAD) system. But the process of data labeling is time consuming, and potential ethical and logistical problems may also present complications. As a result, incorporating unlabeled data into a CAD system can be a feasible way to combat these obstacles. In this study we developed a three-stage semi-supervised learning (SSL) scheme that combines a small amount of labeled data and a larger amount of unlabeled data. Our existing CAD system was modified with the following three stages: data weighing, feature selection, and a newly proposed dividing co-training data labeling algorithm. Global density asymmetry features were incorporated into the feature pool to reduce the false positive rate. Area under the curve (AUC) and accuracy were computed using 10-fold cross-validation to evaluate the performance of our CAD system. The image dataset includes mammograms from 400 women who underwent routine screening examinations, and each pair contains either two cranio-caudal (CC) or two mediolateral-oblique (MLO) view mammograms from the right and the left breasts. From these mammograms 512 regions were extracted and used in this study; among them 90 regions were treated as labeled while the rest were treated as unlabeled. Using our proposed scheme, the highest AUC observed in our research was 0.841, obtained with the 90 labeled data and all the unlabeled data. It was 7.4% higher than using labeled data only. With an increasing amount of labeled data, the AUC difference between using mixed data and using labeled data only reached its peak when the amount of labeled data was around 60. This study demonstrated that our proposed three-stage semi-supervised learning can improve CAD performance by incorporating unlabeled data. Using unlabeled data is promising in computerized cancer research and may have a significant impact on future CAD systems
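    As a rough illustration of how unlabeled regions can be folded into training, the sketch below uses a plain self-training loop with a random forest; it is not the paper's dividing co-training algorithm, and the confidence threshold and classifier are arbitrary choices for the example.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def self_training(X_lab, y_lab, X_unlab, conf_thresh=0.9, max_rounds=10):
    """Generic self-training: repeatedly pseudo-label the unlabeled regions the
    current model is confident about and add them to the training set."""
    X_train, y_train, pool = X_lab.copy(), y_lab.copy(), X_unlab.copy()
    clf = RandomForestClassifier(n_estimators=200, random_state=0)
    for _ in range(max_rounds):
        if len(pool) == 0:
            break
        clf.fit(X_train, y_train)
        proba = clf.predict_proba(pool)
        confident = proba.max(axis=1) >= conf_thresh
        if not confident.any():
            break
        pseudo = clf.classes_[proba[confident].argmax(axis=1)]
        X_train = np.vstack([X_train, pool[confident]])
        y_train = np.concatenate([y_train, pseudo])
        pool = pool[~confident]
    return clf.fit(X_train, y_train)
```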

  11. A gender-based analysis of high school athletes using computerized electrocardiogram measurements.

    Directory of Open Access Journals (Sweden)

    Nikhil Kumar

    Full Text Available BACKGROUND: The addition of the ECG to the preparticipation examination (PPE) of high school athletes has been a topic for debate. Defining the difference between the high school male and female ECG is crucial to help initiate its implementation in the high school PPE. Establishing the different parameter sets for the male and female ECG would help to reduce false positives. We examined the effect of gender on the high school athlete ECG by obtaining and analyzing ECG measurements of high school athletes from Henry M. Gunn High School. METHODS: In 2011 and 2012, computerized electrocardiograms were recorded and analyzed on 181 athletes (52.5% male; mean age 16.1 ± 1.1 years) who participated in 17 different sports. ECG statistics included intervals and durations in all 3 axes (X, Y, Z) to calculate 12-lead voltage sums, QRS amplitude, QT interval, QRS duration, and the sum of the R wave in V5 and the S wave in V2 (RS sum). RESULTS: By computer analysis, we demonstrated that male athletes had significantly greater QRS duration, Q-wave duration, and T-wave amplitude (P<0.05). By contrast, female athletes had a significantly greater QTc interval (P<0.05). CONCLUSION: The differences in ECG measurements in high school athletes are strongly associated with gender. However, body size does not correlate with the aforementioned ECG measurements. Our tables of gender-specific parameters can help facilitate the development of a larger-scale and more in-depth ECG analysis for screening high school athletes in the future.

  12. Computerized tomography and morphological findings in brain infarcts and intracerebral haematomas for identical image planes

    Energy Technology Data Exchange (ETDEWEB)

    Clar, H E; Bock, W J; Hahse, H C; Gerhard, L; Flossdorf, R [Essen Univ. (Gesamthochschule) (Germany, F.R.). Neurochirurgische Klinik; Duesseldorf Univ. (Germany, F.R.). Neurochirurgische Klinik; Essen Univ. (Gesamthochschule) (Germany, F.R.). Roentgendiagnostisches Zentralinstitut; Essen Univ. (Gesamthochschule) (Germany, F.R.). Neuropathologisches Inst.)

    1979-01-01

    Contrary to earlier, more optimistic publications, CT findings do not always agree with brain sections of the same image plane. For example, in spite of a clinically proven infarct anamnesis, Huber was unable to detect a pathological CT finding in 20% of the cases. Still, CT is the method that yields the best information on cerebral ischaemias, haemorrhagic infarcts, and haemorrhages if purposefully applied.

  13. Technologies in computerized lexicography | Kruyt | Lexikos

    African Journals Online (AJOL)

    Although the topic of this paper is technology, focus is on functional rather than technical aspects of computerized lexicography. Keywords: computerized lexicography, electronic dictionary, electronic text corpus, lexicographer's workbench, integrated language database, automatic linguistic analysis, information retrieval, ...

  14. Brain metastasis of small cell lung carcinoma. Comparison of Gd-DTPA enhanced magnetic resonance imaging and enhanced computerized tomography

    International Nuclear Information System (INIS)

    Nomoto, Yasushi; Yamaguchi, Yutaka; Miyamoto, Tadaaki.

    1994-01-01

    Small cell carcinoma of the lung (SCLC) frequently metastasizes into the brain, resulting in serious influences upon prognosis. Delayed brain damage caused by prophylactic cranial irradiation (PCI) is also problematic. Gadolinium diethylene triamine pentaacetic acid (Gd-DTPA) enhanced magnetic resonance imaging (MRI) was performed to detect early brain metastasis from SCLC, and its usefulness was compared with contrast computerized tomography (CT). Among 25 SCLC patients, brain metastasis was detected in 11 by MRI and in 10 by CT, although six of them were completely asymptomatic. In the 11 patients, 6.3 and 2.4 lesions were respectively detected on average by MRI and CT. The ability of MRI to detect metastatic lesions of ≥15 mm diameter did not differ from that of CT, but became different as lesions became smaller (P<0.002), and MRI had a decided advantage over CT because as many as 30 lesions of ≤5 mm diameter were detected by MRI, whereas such lesions visualized on CT numbered only one (P<0.0001). MRI was incomparably superior to CT (P<0.0004) for subtentorial lesions since 18 lesions were detected on MRI, but only three, measuring ≥25 mm in diameter, were demonstrated on CT. Gd-DTPA enhanced MRI was determined to be extremely useful in the early diagnosis of SCLC brain metastasis. MRI was thought to reduce delayed brain damage caused by PCI if performed according to an adequate schedule. (author)

  15. Direct image reconstruction with limited angle projection data for computerized tomography

    International Nuclear Information System (INIS)

    Inouye, T.

    1980-01-01

    Discussions are made on the minimum angular range of projection data necessary to reconstruct a complete CT image. As is easily shown from the image reconstruction theorem, missing projection angles provide no data for the Fourier transform of the object along the corresponding angular directions. In a normal situation, the Fourier transform of an object image is analytic with respect to the two-dimensional orthogonal parameters. This characteristic enables the function to be uniquely extended outside the measured region by a form of analytic continuation with respect to both parameters. In the method reported here, an object pattern, which is confined within a finite range, is shifted to a specified region so as to have complete orthogonal function expansions without changing the projection angle directions. These orthogonal functions are analytically extended to the missing projection angle range and the whole function is determined. This method does not include any estimation process, whose effectiveness would often be seriously jeopardized by the presence of slight fluctuation components. Computer simulations were carried out to demonstrate the effectiveness of the method

  16. A three-dimensional dose-distribution estimation system using computerized image reconstruction

    International Nuclear Information System (INIS)

    Nishijima, Akihiko; Kidoya, Eiji; Komuro, Hiroyuki; Tanaka, Masato; Asada, Naoki.

    1990-01-01

    In radiotherapy planning, three-dimensional (3-D) estimation of dose distribution has been very troublesome and time-consuming. To solve this problem, a simple and fast 3-D dose distribution imaging method using a computer and a charge-coupled device (CCD) camera was developed. A series of X-ray films inserted in a phantom was exposed using a linear accelerator unit. The film density was digitized with a CCD camera and a minicomputer (VAX 11-750). These results were then compared with the depth dose obtained by a JARP-type dosimeter, the dose error being less than 2%. The 3-D dose distribution image could accurately depict the density changes created by aluminum and air placed in the phantom. The contrast resolution of the CCD camera seemed to be superior to the conventional densitometer in the low-to-intermediate contrast range. In conclusion, our method seems to be very fast and simple for obtaining 3-D dose distribution images and is very effective when compared with the conventional method. (author)

  17. Oncological image analysis.

    Science.gov (United States)

    Brady, Sir Michael; Highnam, Ralph; Irving, Benjamin; Schnabel, Julia A

    2016-10-01

    Cancer is one of the world's major healthcare challenges and, as such, an important application of medical image analysis. After a brief introduction to cancer, we summarise some of the major developments in oncological image analysis over the past 20 years, concentrating on those in the authors' laboratories, and then outline opportunities and challenges for the next decade. Copyright © 2016 Elsevier B.V. All rights reserved.

  18. Design and development of computerized local and overall country's environmental data analysis network system

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Chang Gyu; Kang, Jong Gyu; Han, H.; Han, J. S.; Lee, Y. D.; Lee, S. R.; Kang, D. J.; Cho, Y. G.; Yun, S. H. [Daedeok College, Taejon (Korea, Republic of)

    2001-03-15

    In this development, we designed an integrated database for efficient processing of radiation-environment data and developed the CLEAN (Computerized Local and overall country's Environmental data Analysis Network) system. The CLEAN system consists of a local radiation-environment network, a data analysis system, and a data open system. We developed the CLEAN system focusing on building an integrated database, a data mart, and a CLEAN web site. It is expected that the developed system, which organizes the information related to environmental radiation data systematically, can be utilized for accurate interpretation, analysis and evaluation.

  19. Computerized Games and Simulations in Computer-Assisted Language Learning: A Meta-Analysis of Research

    Science.gov (United States)

    Peterson, Mark

    2010-01-01

    This article explores research on the use of computerized games and simulations in language education. The author examined the psycholinguistic and sociocultural constructs proposed as a basis for the use of games and simulations in computer-assisted language learning. Research in this area is expanding rapidly. However, to date, few studies have…

  20. Image seedling analysis to evaluate tomato seed physiological potential

    Directory of Open Access Journals (Sweden)

    Vanessa Neumann Silva

    Full Text Available Computerized seedling image analysis is one of the most recent techniques to detect differences in vigor between seed lots. The aim of this study was to verify the ability of computerized seedling image analysis by SVIS® to detect differences in vigor between tomato seed lots, compared with the information provided by traditional vigor tests. Ten lots of tomato seeds, cultivar Santa Clara, were stored for 12 months in a controlled environment at 20 ± 1 ºC and 45-50% relative air humidity. The moisture content of the seeds was monitored and the physiological potential was tested at 0, 6 and 12 months of storage, with the germination test, first count of germination, accelerated ageing (traditional and with saturated salt solution), electrical conductivity, seedling emergence and the seed vigor imaging system (SVIS®). A completely randomized experimental design was used with four replications. The parameters obtained by the computerized seedling analysis (seedling length and the indexes of vigor and seedling growth) with the SVIS® software are efficient to detect differences between tomato seed lots of high and low vigor.

  1. Gabor Analysis for Imaging

    DEFF Research Database (Denmark)

    Christensen, Ole; Feichtinger, Hans G.; Paukner, Stephan

    2015-01-01

    …it characterizes a function by its transform over phase space, which is the time–frequency plane (TF-plane) in a musical context or the location–wave-number domain in the context of image processing. Since the transition from the signal domain to the phase space domain introduces an enormous amount of data… of the generalities relevant for an understanding of Gabor analysis of functions on Rd. We pay special attention to the case d = 2, which is the most important case for image processing and image analysis applications. The chapter is organized as follows. Section 2 presents central tools from functional analysis…; the application of Gabor expansions to image representation is considered in Sect. 6.

  2. Interfraction Prostate Rotation Determined from In-Room Computerized Tomography Images

    International Nuclear Information System (INIS)

    Owen, Rebecca; Kron, Tomas; Foroudi, Farshad; Milner, Alvin; Cox, Jennifer; Duchesne, Gillian

    2011-01-01

    Fiducial markers (FMs) are commonly used as a correction technique for interfraction translations of the prostate. The aim of this investigation was to determine the magnitude of prostate rotations using 2 methods: FM coordinates and the anatomical border of the prostate and rectum. Daily computed tomography (CT) scans (n = 346) of 10 prostate cancer patients with 3 implanted FMs were acquired using the CT on rails. FM coordinates were used to determine rotation in the sagittal, transverse, and coronal planes, and CT contours of the prostate and rectum were used to determine rotation along the sagittal plane. An adaptive technique based on a subset of images (n = 6; planning and first 5 treatment CTs) to reduce systematic rotation errors in the sagittal plane was tested. The standard deviation (SD) of systematic rotation from FM coordinates was 7.6°, 7.7°, and 5.0° in the sagittal, transverse and coronal planes. The corresponding SD of random error was 10.2°, 15.8°, and 6.5°. Errors in the sagittal plane, determined from prostate and rectal contours, were 10.1° (systematic) and 7.7° (random). These results did not correlate with rotation computed from FM coordinates (r = -0.017; p = 0.753, n = 337). The systematic error could be reduced by 43% to 5.6° when the mean prostate position was estimated from 6 CT scans. Prostate rotation is a significant source of error that appears to be more accurately determined using the anatomical border of the prostate and rectum rather than FMs, thus highlighting the utility of CT image guidance.

  3. Paperback atlas of anatomical sectional images: Computerized tomography and NMR imaging. Vol. 1. Head, neck, vertebral column, joints

    International Nuclear Information System (INIS)

    Moeller, T.B.; Reif, E.

    1993-01-01

    Using the nomenclature relating to X-ray findings, the paperback atlas provides a concise, yet accurate description of fine anatomical structures visualized by sectional imaging procedures. Each of the approx. 250 sample images shown for the regions of the head (including neurocranium), vertebral column, neck, thorax, abdomen and musculoskeletal system (including joints) is supplemented with a drawing that permits an immediate identification of any structure of interest. (orig.)

  4. Digital image analysis

    DEFF Research Database (Denmark)

    Riber-Hansen, Rikke; Vainer, Ben; Steiniche, Torben

    2012-01-01

    Digital image analysis (DIA) is increasingly implemented in histopathological research to facilitate truly quantitative measurements, decrease inter-observer variation and reduce hands-on time. Originally, efforts were made to enable DIA to reproduce manually obtained results on histological slides… reproducibility, application of stereology-based quantitative measurements, time consumption, optimization of histological slides, regions of interest selection and recent developments in staining and imaging techniques.

  5. Echo-lucency of computerized ultrasound images of carotid atherosclerotic plaques are associated with increased levels of triglyceride-rich lipoproteins as well as increased plaque lipid content

    DEFF Research Database (Denmark)

    Grønholdt, Marie-Louise Moes; Nordestgaard, Børge G.; Weibe, Brit M.

    1998-01-01

    Background: Echo-lucency of carotid atherosclerotic plaques on computerized ultrasound B-mode images has been associated with a high incidence of brain infarcts as evaluated on CT scans. We tested the hypotheses that triglyceride-rich lipoproteins in the fasting and postprandial state predict carotid plaque echo-lucency and that echo-lucency predicts a high plaque lipid content. Methods and Results: The study included 137 patients with neurological symptoms and greater than or equal to 50% stenosis of the relevant carotid artery. High-resolution B-mode ultrasound images of carotid plaques were…

  6. Echolucency of computerized ultrasound images of carotid atherosclerotic plaques are associated with increased levels of triglyceride-rich lipoproteins as well as increased plaque lipid content

    DEFF Research Database (Denmark)

    Grønholdt, Marie-Louise M.; Nordestgaard, Børge; Wiebe, Britt M.

    1998-01-01

    Background: Echo-lucency of carotid atherosclerotic plaques on computerized ultrasound B-mode images has been associated with a high incidence of brain infarcts as evaluated on CT scans. We tested the hypotheses that triglyceride-rich lipoproteins in the fasting and postprandial state predict carotid plaque echo-lucency and that echo-lucency predicts a high plaque lipid content. Methods and Results: The study included 137 patients with neurological symptoms and greater than or equal to 50% stenosis of the relevant carotid artery. High-resolution B-mode ultrasound images of carotid plaques were…

  7. Analysis of internal and external validity criteria for a computerized visual search task: A pilot study.

    Science.gov (United States)

    Richard's, María M; Introzzi, Isabel; Zamora, Eliana; Vernucci, Santiago

    2017-01-01

    Inhibition is one of the main executive functions, because of its fundamental role in cognitive and social development. Given the importance of reliable, computerized measurements for assessing inhibitory performance, this research intends to analyze the internal and external validity criteria of a computerized conjunction search task designed to evaluate the role of perceptual inhibition. A sample of 41 children (21 females and 20 males), aged between 6 and 11 years old (M = 8.49, SD = 1.47), intentionally selected from a private school in Mar del Plata (Argentina) and of middle socio-economic level, was assessed. The Conjunction Search Task from the TAC Battery and the Coding and Symbol Search tasks from the Wechsler Intelligence Scale for Children were used. Overall, the results allow us to confirm that the perceptual inhibition task from the TAC presents solid indices of internal and external validity, which make it a valid instrument for measuring this process.

  8. Computerized analysis of the 12-lead electrocardiogram to identify epicardial ventricular tachycardia exit sites.

    Science.gov (United States)

    Yokokawa, Miki; Jung, Dae Yon; Joseph, Kim K; Hero, Alfred O; Morady, Fred; Bogun, Frank

    2014-11-01

    Twelve-lead electrocardiogram (ECG) criteria for epicardial ventricular tachycardia (VT) origins have been described. In patients with structural heart disease, the ability to predict an epicardial origin based on QRS morphology is limited and has been investigated only for limited regions in the heart. The purpose of this study was to determine whether a computerized algorithm is able to accurately differentiate epicardial vs endocardial origins of ventricular arrhythmias. Endocardial and epicardial pace-mapping were performed in 43 patients at 3277 sites. The 12-lead ECGs were digitized and analyzed using a mixture of gaussian model (MoG) to assess whether the algorithm was able to identify an epicardial vs endocardial origin of the paced rhythm. The MoG computerized algorithm was compared to algorithms published in prior reports. The computerized algorithm correctly differentiated epicardial vs endocardial pacing sites for 80% of the sites compared to an accuracy of 42% to 66% of other described criteria. The accuracy was higher in patients without structural heart disease than in those with structural heart disease (94% vs 80%, P = .0004) and for right bundle branch block (82%) compared to left bundle branch block morphologies (79%, P = .001). Validation studies showed the accuracy for VT exit sites to be 84%. A computerized algorithm was able to accurately differentiate the majority of epicardial vs endocardial pace-mapping sites. The algorithm is not region specific and performed best in patients without structural heart disease and with VTs having a right bundle branch block morphology. Copyright © 2014 Heart Rhythm Society. Published by Elsevier Inc. All rights reserved.
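    The MoG idea can be illustrated with a class-conditional Gaussian mixture classifier: fit one mixture to feature vectors from epicardial pacing sites and one to endocardial sites, then assign a new QRS to whichever class gives the higher likelihood. This is a generic sketch (scikit-learn, three components per class, hypothetical ECG feature vectors), not the authors' trained model.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

class MoGSiteClassifier:
    """Class-conditional mixture-of-Gaussians classifier (illustrative)."""

    def __init__(self, n_components=3):
        self.n_components = n_components
        self.models = {}

    def fit(self, X, y):
        # one Gaussian mixture per class (requires enough samples per class)
        for label in np.unique(y):
            gm = GaussianMixture(n_components=self.n_components,
                                 covariance_type='full', random_state=0)
            self.models[label] = gm.fit(X[y == label])
        return self

    def predict(self, X):
        labels = list(self.models)
        scores = np.column_stack([self.models[l].score_samples(X) for l in labels])
        return np.array(labels)[scores.argmax(axis=1)]
```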

  9. Computerized videodefecography versus defecography: do we need radiographs?

    Directory of Open Access Journals (Sweden)

    Carlos Walter Sobrado

    Full Text Available CONTEXT AND OBJECTIVE: Defecography has been recognized as a valuable method for evaluating patients with evacuation disorders. It consists of the use of static radiography and fluoroscopy to record different situations within anorectal dynamics. Conventionally, rectal parameters are measured using radiograms. It is rare for fluoroscopy alone to be used. Computer software has been developed with the specific aim of calculating these measurements from digitized videotaped images obtained during fluoroscopy, without the need for radiographic film, thereby developing a computerized videodefecography method. The objective was thus to compare measurements obtained via computerized videodefecography with conventional measurements and to discuss the advantages of the new method. DESIGN AND SETTING: Prospective study at the radiology service of Hospital das Clínicas, Universidade de São Paulo. METHOD: Ten consecutive normal subjects underwent videodefecography. The anorectal angle, anorectal junction, puborectalis muscle length, anal canal length and degree of anal relaxation were obtained via the conventional method (using radiography film and via computerized videodefecography using the ANGDIST software. Measurement and analysis of these parameters was performed by two independent physicians. RESULTS: Statistical analysis confirmed that the measurements obtained through direct radiography film assessment and using digital image analysis (computerized videodefecography were equivalent. CONCLUSIONS: Computerized videodefecography is equivalent to the traditional defecography examination. It has the advantage of offering reduced radiation exposure through saving on the use of radiography.

  10. Computerized tomography

    International Nuclear Information System (INIS)

    1980-01-01

    Improvements in the design of computerized tomographic X-ray equipment are described which lead to improvements in the mechanical properties, speed and size of the scanning areas. The method envisages the body being scanned as a two-dimensional matrix of elements arising from a plurality of concentric rings, whose common centre need not coincide with the axis of rotation. The procedures for rotating the X-ray beam and detectors around the patient and for translating the measured information into attenuation coefficients for each matrix element of the body are described in detail. Explicit derivations are given for the mathematical formulae used. (U.K.)

  11. Elastic fibers in human skin: quantitation of elastic fibers by computerized digital image analyses and determination of elastin by radioimmunoassay of desmosine.

    Science.gov (United States)

    Uitto, J; Paul, J L; Brockley, K; Pearce, R H; Clark, J G

    1983-10-01

    The elastic fibers in the skin and other organs can be affected in several disease processes. In this study, we have developed morphometric techniques that allow accurate quantitation of the elastic fibers in punch biopsy specimens of skin. In this procedure, the elastic fibers, visualized by elastin-specific stains, are examined through a camera unit attached to the microscope. The black-and-white images, recorded at various gray levels, are then converted to binary images after a threshold is selected with an analog threshold selection device. The binary images are digitized and the data analyzed by a computer program designed to express the properties of the image, thus allowing determination of the volume fraction occupied by the elastic fibers. As an independent measure of the elastic fibers, alternate tissue sections were used for assay of desmosine, an elastin-specific cross-link compound, by radioimmunoassay. The clinical applicability of the computerized morphometric analyses was tested by examining the elastic fibers in the skin of five patients with pseudoxanthoma elasticum or Buschke-Ollendorff syndrome. In the skin of 10 healthy control subjects, the elastic fibers occupied 2.1 +/- 1.1% (mean +/- SD) of the dermis. The volume fractions occupied by the elastic fibers in the lesions of pseudoxanthoma elasticum or Buschke-Ollendorff syndrome were increased as much as 6-fold, whereas the values in the unaffected areas of the skin in the same patients were within normal limits. A significant correlation between the volume fraction of elastic fibers, determined by computerized morphometric analyses, and the concentration of desmosine, quantitated by radioimmunoassay, was noted in the total material. These results demonstrate that computerized morphometric techniques are helpful in characterizing disease processes affecting skin. This methodology should also be applicable to other tissues that contain elastic fibers and that are affected in various heritable and
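    The core measurement, the dermal area fraction occupied by stained fibers, reduces to thresholding and pixel counting. The sketch below uses Otsu's method in place of the analog threshold selection device described above and assumes the fibers are the darker phase; both are assumptions for the illustration.

```python
import numpy as np
from skimage.filters import threshold_otsu

def elastic_fiber_volume_fraction(gray_image, dermis_mask):
    """Fraction of the dermis occupied by elastin-stained fibers (sketch).

    gray_image:  2-D grayscale image of the stained section
    dermis_mask: boolean mask of the dermal region of interest
    """
    threshold = threshold_otsu(gray_image[dermis_mask])
    fibers = (gray_image < threshold) & dermis_mask   # fibers assumed darker than background
    return fibers.sum() / dermis_mask.sum()
```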

  12. Computer analysis of the X-ray images of renal concretions

    International Nuclear Information System (INIS)

    Naumov, N.; Zozikov, B.; Yanakiev, I.; Varlev, H.; Dimitrov, I.; Baltadjiev, D.; Dimitrova, St.; Shimanov, V.; Sultanov, A.; Pazderov, R.

    1997-01-01

    An investigation aimed at assessing the possibilities of computerized analysis of renal concretions is described. The results of a comparative study of digitized X-ray images of concretions and data retrieved from radio-spectral microprobe analysis are presented. The obtained data confirm the authors' hypothesis that it is possible to define the composition and structure of renal concretions using specially developed software (Videoexpert 2.0). Excellent results are obtained even from native X-rays where the concretion is still within the patient's body. Computerized roentgen analysis is recommended when deciding on the therapeutic approach towards calculi in urological and radiographic practice. 5 refs., 5 figs

  13. An automated computerized methodology for the segmentation of in vivo acquired DSA images: application in the New Zealand hindlimb ischemia model

    International Nuclear Information System (INIS)

    Kagadis, G C; Daskalakis, A; Spyridonos, P; Nikiforidis, G C; Diamantopoulos, A; Samaras, N; Katsanos, K; Karnabatidis, D; Siablis, D; Sourgiadaki, E; Cavouras, D

    2009-01-01

    In-vivo dynamic visualization and accurate quantification of vascular networks is a prerequisite of crucial importance in both therapeutic angiogenesis and tumor anti-angiogenesis studies. A user-independent computerized tool was developed for the automated segmentation and quantitative assessment of in-vivo acquired DSA images. Automatic vessel assessment was performed employing the concept of the image structural tensor. Initially, the vasculature was estimated according to the largest eigenvalue of the structural tensor. The resulting eigenvalue matrix was treated as a gray-level image from which the vessels were gradually segmented and then categorized into three main sub-groups: large, medium and small-size vessels. The histogram percentiles corresponding to 85%, 65% and 47% of the prime-eigenvalue gray-level image were found to give optimal thresholds T1, T2 and T3, respectively, for extracting vessels of different size. The proposed methodology was tested on a series of DSA images in both normal rabbits (group A) and rabbits with experimentally induced chronic hindlimb ischemia (group B). As a result, an automated computerized tool was developed to process images without any user intervention in either experimental or clinical studies. Specifically, a higher total vascular area and length were calculated in group B compared to group A (p=0.0242 and p=0.0322 respectively), which is in accordance with the fact that significantly more collateral arteries are developed during the physiological response to the stimulus of ischemia.
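    A minimal version of the tensor-based vesselness step can be written directly from image gradients. The snippet below computes the largest eigenvalue of the smoothed 2-D structure tensor and splits it at the reported 85th/65th/47th percentiles; the smoothing scale and the exact way the three thresholds delimit the size classes are assumptions for the sketch.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def largest_structure_tensor_eigenvalue(image, sigma=2.0):
    """Largest eigenvalue of the 2-D structure tensor at every pixel."""
    img = image.astype(float)
    gy, gx = np.gradient(img)
    jxx = gaussian_filter(gx * gx, sigma)
    jxy = gaussian_filter(gx * gy, sigma)
    jyy = gaussian_filter(gy * gy, sigma)
    # closed-form largest eigenvalue of [[jxx, jxy], [jxy, jyy]]
    return 0.5 * (jxx + jyy + np.sqrt((jxx - jyy) ** 2 + 4.0 * jxy ** 2))

def split_vessel_sizes(eig, percentiles=(85, 65, 47)):
    """Threshold the eigenvalue map at T1 > T2 > T3 to separate vessel sizes."""
    t1, t2, t3 = (np.percentile(eig, p) for p in percentiles)
    large = eig >= t1
    medium = (eig >= t2) & ~large
    small = (eig >= t3) & ~large & ~medium
    return large, medium, small
```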

  14. Computerized analysis of isometric tension studies provides important additional information about vasomotor activity

    Directory of Open Access Journals (Sweden)

    Vincent M.B.

    1997-01-01

    Full Text Available Concentration-response curves of isometric tension studies on isolated blood vessels are obtained traditionally. Although parameters such as Imax, EC50 and pA2 may be readily calculated, this method does not provide information on the temporal profile of the responses or the actual nature of the reaction curves. Computerized data acquisition systems can be used to obtain average data that represent a new source of otherwise inaccessible information, since early and late responses may be observed separately in detail

  15. Computerized industrial tomography

    International Nuclear Information System (INIS)

    Ashraf, M.M.

    1999-01-01

    Computerized tomography (CT) has been used for a number of applications in the fields of medicine and industry. For the last couple of years, the technique has been applied to material characterization and to the detection of defects and flaws inside industrial components of the nuclear, aerospace and missile industries. A first-generation CT scanner was developed at the institute. The scanner has been used to demonstrate a couple of applications of CT in the field of non-destructive testing of materials. The data, acquired by placing the test objects at various angles and scanning each object with a source-detector assembly, were processed on a Pentium computer for image reconstruction using a filtered back-projection method. The technique can be modified and improved to study various other applications in materials science, and a modern computerized tomographic facility can be established. (author)

  16. Computerized method for estimation of the location of a lung tumor on EPID cine images without implanted markers in stereotactic body radiotherapy

    International Nuclear Information System (INIS)

    Arimura, H; Toyofuku, F; Higashida, Y; Onizuka, Y; Terashima, H; Egashira, Y; Shioyama, Y; Nomoto, S; Honda, H; Nakamura, K; Yoshidome, S; Anai, S

    2009-01-01

    The purpose of this study was to develop a computerized method for estimating the location of a lung tumor in cine images on an electronic portal imaging device (EPID) without implanted markers during stereotactic body radiotherapy (SBRT). Each tumor region was segmented in the first EPID cine image, i.e., the reference portal image, based on a multiple-gray-level thresholding technique and a region growing technique, and the image including the tumor region was then cropped as a 'tumor template' image. The tumor location was determined as the position at which the tumor template image took the maximum cross-correlation value within each consecutive portal image, acquired in cine mode on the EPID during treatment. EPID images with 512 x 384 pixels (pixel size: 0.56 mm) were acquired at a sampling rate of 0.5 frames/s using energies of 4, 6 or 10 MV on linear accelerators. We applied our proposed method to EPID cine images (226 frames) of 12 clinical cases (ages: 51-83, mean: 72) with non-small cell lung cancer. As a result, the average location error between tumor points obtained by our method and the manual method was 1.47 ± 0.60 mm. This preliminary study suggests that our method based on the tumor template matching technique might be feasible for tracking the location of a lung tumor without implanted markers in SBRT.
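    The matching step itself amounts to locating the peak of a normalized cross-correlation map. A minimal sketch with scikit-image (match_template) is shown below; peak interpolation, gray-level preprocessing and the segmentation that produces the template are left out.

```python
import numpy as np
from skimage.feature import match_template

def track_tumor(tumor_template, portal_image):
    """Locate the tumor template in an EPID cine frame (sketch).

    Returns the (row, col) of the template centre at the maximum of the
    normalized cross-correlation map, plus the peak correlation value.
    """
    corr = match_template(portal_image, tumor_template, pad_input=True)
    row, col = np.unravel_index(np.argmax(corr), corr.shape)
    return int(row), int(col), float(corr[row, col])
```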

  17. On two methods of statistical image analysis

    NARCIS (Netherlands)

    Missimer, J; Knorr, U; Maguire, RP; Herzog, H; Seitz, RJ; Tellman, L; Leenders, K.L.

    1999-01-01

    The computerized brain atlas (CBA) and statistical parametric mapping (SPM) are two procedures for voxel-based statistical evaluation of PET activation studies. Each includes spatial standardization of image volumes, computation of a statistic, and evaluation of its significance. In addition,

  18. Influence of ASIR (Adaptive Statistical Iterative Reconstruction) variation on the image noise of computerized tomography for high voltages

    International Nuclear Information System (INIS)

    Mendes, L.M.M.; Pereira, W.B.R.; Vieira, J.G.; Lamounier, C.S.; Gonçalves, D.A.; Carvalho, G.N.P.; Santana, P.C.; Oliveira, P.M.C.; Reis, L.P.

    2017-01-01

    Computed tomography has seen great advances in the equipment used in diagnostic practice, directly influencing the radiation levels delivered to the patient. It is essential to optimize the techniques employed so as to comply with the ALARA (As Low As Reasonably Achievable) principle of radioprotection. The relationship of ASIR (Adaptive Statistical Iterative Reconstruction) with image noise was studied. Central images of a homogeneous water phantom were obtained in a 20 mm scan using a 64-channel General Electric Lightspeed VCT tomograph in helical acquisitions with a rotation time of 0.5 seconds, pitch of 0.984:1, and slice thickness of 0.625 mm. All these parameters were kept constant while the voltage was set to two distinct values, 120 and 140 kV, with the tube current selected automatically by the CAE (Automatic Exposure Control), ranging from 50 to 675 mA (120 kV) and from 50 to 610 mA (140 kV), the minimum and maximum values allowed for each voltage. Image noise was determined with the free software ImageJ. The analysis of the obtained data compared the percentage variation of the noise in the image against the baseline ASIR value of 10%, concluding that there is a variation of approximately 50% when compared to an ASIR of 100% at both voltages. Dose evaluation is required in future studies to better characterize the relationship between dose and image quality
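    The noise figure itself is simply the standard deviation of pixel values in a region of interest of the homogeneous phantom image, which is what an ImageJ box-ROI measurement reports. A minimal sketch follows (central square ROI with an assumed half-width of 20 pixels):

```python
import numpy as np

def image_noise(ct_slice, roi_half=20):
    """Noise as the standard deviation of values in a central square ROI."""
    r, c = np.asarray(ct_slice.shape) // 2
    roi = ct_slice[r - roi_half:r + roi_half, c - roi_half:c + roi_half]
    return float(np.std(roi))

def noise_variation_percent(noise, baseline_noise):
    """Percentage variation of noise relative to a baseline (e.g. ASIR 10%)."""
    return 100.0 * (noise - baseline_noise) / baseline_noise
```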

  19. Computerized content analysis of some adolescent writings of Napoleon Bonaparte: a test of the validity of the method.

    Science.gov (United States)

    Gottschalk, Louis A; DeFrancisco, Don; Bechtel, Robert J

    2002-08-01

    The aim of this study was to test the validity of a computer software program previously demonstrated to be capable of making DSM-IV neuropsychiatric diagnoses from the content analysis of speech or verbal texts. In this report, the computer program was applied to three personal writings of Napoleon Bonaparte when he was 12 to 16 years of age. The accuracy of the neuropsychiatric evaluations derived from the computerized content analysis of these writings of Napoleon was independently corroborated by two biographers who have described pertinent details concerning his life situations, moods, and other emotional reactions during this adolescent period of his life. The relevance of this type of computer technology to psychohistorical research and clinical psychiatry is suggested.

  20. Analysis of risk in computerized tomography and other diagnostic radiology procedures

    International Nuclear Information System (INIS)

    Mossman, K.L.

    1982-01-01

    Medical practice entails continuous risks to the patient, taken in good faith by the physician for the benefit of the patient. The risk of radiation-induced cancer death is approximately 10⁻⁴ per cGy (rad). Assuming an average whole-body dose of 0.1 cGy for many diagnostic X-ray procedures, the probability of radiation-induced cancer death is about 10⁻⁵. The purpose of this paper is to compare the risks of common diagnostic X-ray procedures, including computerized tomography (CT), with the risks of smoking or automobile travel. Such comparisons should be constructive in putting radiation in perspective and facilitating the explanation of risk/benefit to patients

  1. RADTRAN II: a computerized model for risk analysis of transportation of radioactive material

    International Nuclear Information System (INIS)

    Taylor, J.M.; Daniel, S.L.; Biringer, B.E.

    1980-01-01

    The RADTRAN computer code, which formed the basis for the 1977 US generic transportation risk assessment, has been extensively updated. The updated version of the code, denoted RADTRAN II, includes changes based on findings from other transportation risk studies as well as changes based on a reevaluation of earlier assumptions, analyses, and computerization techniques. The environmental impact of the transportation of radioactive material can be envisioned as consisting of five components: incident-free transport, non-radiological impacts, vehicular accidents, breaches of security/safeguards, and failures of quality assurance. RADTRAN II is designed to evaluate both the incident-free and the accident contributions directly and can be used to evaluate the contributions of breaches of security and quality assurance deviations if some alterations in coding are made. Non-radiological impacts are not addressed

  2. Image sequence analysis

    CERN Document Server

    1981-01-01

    The processing of image sequences has a broad spectrum of important applications including target tracking, robot navigation, bandwidth compression of TV conferencing video signals, studying the motion of biological cells using microcinematography, cloud tracking, and highway traffic monitoring. Image sequence processing involves a large amount of data. However, because of the progress in computer, LSI, and VLSI technologies, we have now reached a stage when many useful processing tasks can be done in a reasonable amount of time. As a result, research and development activities in image sequence analysis have recently been growing at a rapid pace. An IEEE Computer Society Workshop on Computer Analysis of Time-Varying Imagery was held in Philadelphia, April 5-6, 1979. A related special issue of the IEEE Transactions on Pattern Analysis and Machine Intelligence was published in November 1980. The IEEE Computer magazine has also published a special issue on the subject in 1981. The purpose of this book ...

  3. Development Of An Atherothrombotic Occlusion In The Rabbit Carotid Artery: Assessed By New Computerized B-Mode Ultrasound Image Processing Technology And Histopathology

    Directory of Open Access Journals (Sweden)

    Hossein Mehrad

    2017-02-01

    Full Text Available Introduction: Thrombus formation on a disrupted atherosclerotic soft plaque is a key event that leads to atherothrombosis. Atherothrombosis is one of the leading causes of acute coronary syndrome and ischemic stroke. Our ability to test new protocols for the treatment of atherothrombotic stenosis in humans is limited for obvious ethical reasons; therefore, a precise understanding of the mechanism of atherothrombotic occlusion in the human carotid artery, which gives rise to thrombosis, emboli and stroke, requires a suitable animal model that mimics the same characteristics well. Aims: The aim of this study was to generate an easily reproducible and inexpensive experimental rabbit carotid model of atherothrombotic occlusion with morphological similarities to the human disease, and subsequently to assess the reliability of a new computerized B-mode ultrasound image processing technology in the study of lumen area stenosis in this model. Methods: Briefly, male New Zealand white rabbits were submitted to common carotid artery atherothrombotic occlusion by primary balloon injury, followed by a 1.5% cholesterol-rich diet injury for eight weeks and finally a severe perivascular cold injury. All of the rabbits' arteries were imaged by B-mode ultrasound weekly, after which the rabbits were sacrificed and their vessels were processed for histopathology. Ultrasound longitudinal-view images from three cardiac cycles were processed by a new computerized analysis method, based on dynamic programming and a maximum-gradient algorithm, for measurement of instantaneous changes in arterial wall thickness and lumen diameter in sequential ultrasound images. Results: Histopathology results showed progressive changes, from lipid-laden cells and fibrous connective tissue proliferation to fibrolipid plaque formation, resulting in vessel wall thickening, remodeling, neovascularization and lumen narrowing (before the severe perivascular cold injury using liquid nitrogen up

  4. The Use of Statistical Process Control-Charts for Person-Fit Analysis on Computerized Adaptive Testing. LSAC Research Report Series.

    Science.gov (United States)

    Meijer, Rob R.; van Krimpen-Stoop, Edith M. L. A.

    In this study a cumulative-sum (CUSUM) procedure from the theory of Statistical Process Control was modified and applied in the context of person-fit analysis in a computerized adaptive testing (CAT) environment. Six person-fit statistics were proposed using the CUSUM procedure, and three of them could be used to investigate the CAT in online test…
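    A stripped-down version of the CUSUM idea, for a dichotomous Rasch model rather than the full CAT setting of the report, accumulates standardized item residuals in administration order and tracks upper and lower sums; the reference value k is an assumption of this sketch.

```python
import numpy as np

def rasch_prob(theta, b):
    """Dichotomous Rasch probability of a positive response."""
    return 1.0 / (1.0 + np.exp(-(theta - b)))

def cusum_person_fit(responses, difficulties, theta, k=0.0):
    """Upper/lower CUSUM statistics on standardized residuals (simplified)."""
    p = rasch_prob(theta, np.asarray(difficulties, dtype=float))
    resid = (np.asarray(responses, dtype=float) - p) / np.sqrt(p * (1.0 - p))
    upper, lower = np.zeros(len(resid)), np.zeros(len(resid))
    for i, r in enumerate(resid):
        prev_u = upper[i - 1] if i else 0.0
        prev_l = lower[i - 1] if i else 0.0
        upper[i] = max(0.0, prev_u + r - k)   # drifts up if responses are 'too good'
        lower[i] = min(0.0, prev_l + r + k)   # drifts down if responses are 'too poor'
    return upper, lower
```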

  5. Development of a computerized intervertebral motion analysis of the cervical spine for clinical application.

    Science.gov (United States)

    Piché, Mathieu; Benoît, Pierre; Lambert, Julie; Barrette, Virginie; Grondin, Emmanuelle; Martel, Julie; Paré, Amélie; Cardin, André

    2007-01-01

    The objective of this study was to develop a measurement method that could be implemented in chiropractic for the evaluation of angular and translational intervertebral motion of the cervical spine. Flexion-extension radiographs were digitized with a scanner at a ratio of 1:1 and imported into a software, allowing segmental motion measurements. The measurements were obtained by selecting the most anteroinferior point and the most posteroinferior point of a vertebral body (anterior and posterior arch, respectively, for C1), with the origin of the reference frame set at the most posteroinferior point of the vertebral body below. The same procedure was performed for both the flexion and extension radiographs, and the coordinates of the 2 points were used to calculate the angular movement and the translation between the 2 vertebrae. This method provides a measure of intervertebral angular and translational movement. It uses a different reference frame for each joint instead of the same reference frame for all joints and thus provides a measure of motion in the plane of each articulation. The calculated values obtained are comparable to other studies on intervertebral motion and support further development to validate the method. The present study proposes a computerized procedure to evaluate intervertebral motion of the cervical spine. This procedure needs to be validated with a reliability study but could provide a valuable tool for doctors of chiropractic and further spinal research.
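    The geometry of the measurement is simple enough to express in a few lines: each vertebral body contributes two digitized points, the frame origin is the postero-inferior point of the vertebra below, and rotation and translation are differences between the flexion and extension films. The sketch below is a simplified reading of that procedure (coordinates as (x, y) pairs; translation reported as the displacement of the postero-inferior point in the lower vertebra's frame), not the validated software.

```python
import numpy as np

def endplate_angle(antero_inf, postero_inf):
    """Orientation (degrees) of the line through the two digitized points."""
    dx, dy = np.subtract(antero_inf, postero_inf)
    return float(np.degrees(np.arctan2(dy, dx)))

def intervertebral_motion(flex_points, flex_origin, ext_points, ext_origin):
    """Angular and translational motion between flexion and extension (sketch).

    *_points: (antero_inf, postero_inf) of the upper vertebral body
    *_origin: postero-inferior point of the vertebra below (frame origin)
    """
    rotation = endplate_angle(*ext_points) - endplate_angle(*flex_points)
    flex_vec = np.subtract(flex_points[1], flex_origin)
    ext_vec = np.subtract(ext_points[1], ext_origin)
    translation = float(np.linalg.norm(ext_vec - flex_vec))
    return rotation, translation
```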

  6. A powerful, yet easy to use, computerized analysis of competitive protein binding and radioimmunoassay data

    International Nuclear Information System (INIS)

    English, J.E.

    1981-01-01

    A computerized method has been developed and tested for the automatic computation of data obtained from competitive radio-binding assays that is easily used by computer- and non-computer-oriented individuals. The program requires a series of only eleven parameter lines coded from the assay protocol, followed by the data exactly as they are produced from a standard sample counter. From the set of parameters the program is able to find standard curves at scattered locations throughout an assay, check their 'log dose-logit response' least squares linear regression equations statistically for homogeneity of slopes and elevations, pool the standard curves and check the pooled least squares linear regression equation statistically for linearity and non-linearity. The results of the pooled standard curve are presented graphically and in tabular form. Using the linear equation for the pooled standard curve, the concentration of each unknown sample is predicted with its corresponding 95% confidence interval and presented in a table of unknowns. Also provided in the table of unknowns is a mean and standard error of the mean for all biological replicates, including footnote flags to warn the user when an unknown concentration: (i) is outside the 0-100% bound range; (ii) is estimated from a range outside that covered by the standards; or (iii) was estimated from the unusable tail regions of the standard curve. (Auth.)
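    The 'log dose-logit response' fit and the inverse prediction of unknowns can be illustrated with a few lines of SciPy; the confidence-interval bookkeeping, pooling checks and quality-control flags of the original program are omitted, and B/B0 is assumed to be expressed as a fraction strictly between 0 and 1.

```python
import numpy as np
from scipy import stats

def logit(p):
    return np.log(p / (1.0 - p))

def fit_standard_curve(dose, bound_fraction):
    """Least-squares fit of logit(B/B0) against log(dose) for the standards."""
    slope, intercept, r_value, _, _ = stats.linregress(np.log(dose),
                                                       logit(bound_fraction))
    return slope, intercept, r_value

def predict_unknown(bound_fraction, slope, intercept):
    """Inverse prediction: concentration of an unknown from its bound fraction."""
    return np.exp((logit(bound_fraction) - intercept) / slope)
```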

  7. Cost-effectiveness analysis of computerized ECG interpretation system in an ambulatory health care organization.

    Science.gov (United States)

    Carel, R S

    1982-04-01

    The cost-effectiveness of a computerized ECG interpretation system in an ambulatory health care organization has been evaluated in comparison with a conventional (manual) system. The automated system was shown to be more cost-effective at a minimum load of 2,500 patients/month. At larger monthly loads an even greater cost-effectiveness was found, the average cost/ECG being about $2. In the manual system the cost/unit is practically independent of patient load. This is primarily due to the fact that 87% of the cost/ECG is attributable to wages and fees of highly trained personnel. In the automated system, on the other hand, the cost/ECG is heavily dependent on examinee load. This is due to the relatively large impact of equipment depreciation on fixed (and total) cost. Utilization of a computer-assisted system leads to marked reduction in cardiologists' interpretation time, substantially shorter turnaround time (of unconfirmed reports), and potential provision of simultaneous service at several remotely located "heart stations."

  8. Applying computerized adaptive testing to the Negative Acts Questionnaire-Revised: Rasch analysis of workplace bullying.

    Science.gov (United States)

    Ma, Shu-Ching; Chien, Tsair-Wei; Wang, Hsiu-Hung; Li, Yu-Chi; Yui, Mei-Shu

    2014-02-17

    Workplace bullying is a prevalent problem in contemporary workplaces that has adverse effects on both the victims of bullying and organizations. With the rapid development of computer technology in recent years, there is an urgent need to establish whether item response theory-based computerized adaptive testing (CAT) can be applied to measure exposure to workplace bullying. The purpose of this study was to evaluate the relative efficiency and measurement precision of a CAT-based test for hospital nurses compared to traditional nonadaptive testing (NAT). Under the preliminary condition of a single domain derived from the scale, a CAT module bullying scale model with polytomously scored items is provided as an example for evaluation purposes. A total of 300 nurses were recruited and responded to the 22-item Negative Acts Questionnaire-Revised (NAQ-R). All NAT (or CAT-selected) items were calibrated with the Rasch rating scale model, and all respondents were randomly selected for a comparison of the advantages of CAT and NAT in efficiency and precision by paired t tests and the area under the receiver operating characteristic curve (AUROC). The NAQ-R is a unidimensional construct that can be applied to measure exposure to workplace bullying through CAT-based administration. Nursing measures derived from both tests (CAT and NAT) were highly correlated (r=.97) and their measurement precisions were not statistically different (P=.49), as expected. CAT required fewer items than NAT (an efficiency gain of 32%), suggesting a reduced burden for respondents. There were significant differences in work tenure between the 2 groups (bullied and nonbullied) at a cutoff point of 6 years at 1 worksite. An AUROC of 0.75 (95% CI 0.68-0.79) was obtained, and a measure greater than -4.2 logits (or >30 in summed score) was taken to indicate a high likelihood of being bullied in the workplace. With CAT-based administration of the NAQ-R for nurses, respondent burden was substantially reduced without compromising measurement precision.
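    The core of any IRT-based CAT is selecting, at each step, the unused item that is most informative at the current person estimate. The sketch below does this for a dichotomous Rasch model; the NAQ-R CAT described above uses the polytomous rating scale model, so this is a simplification for illustration only.

```python
import numpy as np

def rasch_item_information(theta, b):
    """Fisher information of a dichotomous Rasch item at ability theta."""
    p = 1.0 / (1.0 + np.exp(-(theta - np.asarray(b, dtype=float))))
    return p * (1.0 - p)

def next_item(theta_hat, item_difficulties, administered):
    """Maximum-information selection among items not yet administered."""
    info = rasch_item_information(theta_hat, item_difficulties)
    info[list(administered)] = -np.inf   # exclude already-used items
    return int(np.argmax(info))
```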

  9. Medical image registration for analysis

    International Nuclear Information System (INIS)

    Petrovic, V.

    2006-01-01

    Full text: Image registration techniques represent a rich family of image processing and analysis tools that aim to provide spatial correspondences across sets of medical images of similar and disparate anatomies and modalities. Image registration is a fundamental and usually the first step in medical image analysis and this paper presents a number of advanced techniques as well as demonstrates some of the advanced medical image analysis techniques they make possible. A number of both rigid and non-rigid medical image alignment algorithms of equivalent and merely consistent anatomical structures respectively are presented. The algorithms are compared in terms of their practical aims, inputs, computational complexity and level of operator (e.g. diagnostician) interaction. In particular, the focus of the methods discussion is placed on the applications and practical benefits of medical image registration. Results of medical image registration on a number of different imaging modalities and anatomies are presented demonstrating the accuracy and robustness of their application. Medical image registration is quickly becoming ubiquitous in medical imaging departments with the results of such algorithms increasingly used in complex medical image analysis and diagnostics. This paper aims to demonstrate at least part of the reason why

  10. Evaluation of the pattern and usage trends of diagnostic imaging exams in Brazil, with emphasis on pediatric computerized tomography

    International Nuclear Information System (INIS)

    Dovales, Ana Cristina Murta

    2016-01-01

    There is little information in developing countries about the use of diagnostic imaging procedures and the doses associated with radiological examinations. This study assessed the pattern and trend of diagnostic imaging usage in outpatients of the Brazilian Unified Health System (SUS) by modality and examined body part. Emphasis was given to computed tomography (CT) scans, for which the analysis was extended to the private health care sector and included evaluation of the distribution of age at examination and dose estimation for children and young adults. Information on the use of diagnostic imaging procedures among SUS outpatients was obtained from the Outpatient Information System (SIA) of the Department of Information Technology of SUS (DATASUS). Data on the use of CT in the private health care sector were extracted from the Radiological Information Systems (RIS) of 25 private radiology services in 8 Brazilian cities. Effective doses and absorbed doses in organs of interest were estimated individually for 4,497 patients younger than 20 years of age using CT scan technical parameters and Monte Carlo simulations of radiation transport. Between 2002 and 2012 it was observed that conventional radiology was the most frequent modality of diagnostic imaging in SUS outpatients, but more sophisticated modalities, such as CT and magnetic resonance imaging, had the highest growth rates over the study period. The most frequent CT scan in SUS outpatients between 2001 and 2011 was the head/neck exam, but abdomen/pelvis examinations were the ones that grew the most. Patients up to 20 years of age accounted for approximately 13% and 9% of the CT examinations between 2008 and 2014 in the public and private health care systems, respectively. About one-third of the private-sector patients had more than one CT scan in this period. There was great variation in doses, even for the same type of procedure in patients of the same age group. The highest mean effective dose estimated was 13.5 mSv
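
    For context, the effective dose quoted above is by definition a tissue-weighted sum of organ equivalent doses, E = Σ w_T·H_T. The snippet below illustrates only that bookkeeping step; the organ doses and weighting factors are placeholder values (the actual factors are tabulated in ICRP Publication 103), and the Monte Carlo transport that produces the organ doses is outside the scope of this sketch.

        # Illustrative only: organ equivalent doses (mSv) from a hypothetical paediatric head CT,
        # with placeholder tissue weighting factors w_T (consult ICRP Publication 103 for real values).
        organ_dose_mSv = {"brain": 28.0, "thyroid": 1.9, "red_bone_marrow": 2.4, "skin": 3.1}
        tissue_weight = {"brain": 0.01, "thyroid": 0.04, "red_bone_marrow": 0.12, "skin": 0.01}

        effective_dose = sum(tissue_weight[t] * organ_dose_mSv[t] for t in organ_dose_mSv)
        print(f"partial effective dose over the listed organs: {effective_dose:.2f} mSv")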

  11. Computerized Interpretation of Dynamic Breast MRI

    National Research Council Canada - National Science Library

    Chen, Weijie; Giger, Maryellen Lissak

    2005-01-01

    ... and prognosis of breast cancer. The research involves investigation of automatic methods for image artifacts correction, tumor segmentation, and extraction of computerized features that help distinguish between benign and malignant lesions...

  12. Implementation of a Computerized Order Entry Tool to Reduce the Inappropriate and Unnecessary Use of Cardiac Stress Tests With Imaging in Hospitalized Patients.

    Science.gov (United States)

    Gertz, Zachary M; O'Donnell, William; Raina, Amresh; Balderston, Jessica R; Litwack, Andrew J; Goldberg, Lee R

    2016-10-15

    The rising use of imaging cardiac stress tests has led to potentially unnecessary testing. Interventions designed to reduce inappropriate stress testing have focused on the ambulatory setting. We developed a computerized order entry tool intended to reduce the use of imaging cardiac stress tests and improve appropriate use in hospitalized patients. The tool was evaluated using preimplementation and postimplementation cohorts at a single urban academic teaching hospital. All hospitalized patients referred for testing were included. The co-primary outcomes were the use of imaging stress tests as a percentage of all stress tests and the percentage of inappropriate tests, compared between the 2 cohorts. There were 478 patients in the precohort and 463 in the postcohort. The indication was chest pain in 66% and preoperative in 18% and was not significantly different between groups. The use of nonimaging stress tests increased from 4% in the pregroup to 15% in the postgroup (p nonimaging stress tests increased from 7% to 25% (p nonimaging cardiac stress tests and reduced the use of imaging tests yet was not able to reduce inappropriate use. Our study highlights the differences in cardiac stress testing between hospitalized and ambulatory patients. Copyright © 2016 Elsevier Inc. All rights reserved.

  13. Three-dimensional thoracic aorta principal strain analysis from routine ECG-gated computerized tomography: feasibility in patients undergoing transcatheter aortic valve replacement.

    Science.gov (United States)

    Satriano, Alessandro; Guenther, Zachary; White, James A; Merchant, Naeem; Di Martino, Elena S; Al-Qoofi, Faisal; Lydell, Carmen P; Fine, Nowell M

    2018-05-02

    Functional impairment of the aorta is a recognized complication of aortic and aortic valve disease. Aortic strain measurement provides effective quantification of mechanical aortic function, and 3-dimensional (3D) approaches may be desirable for serial evaluation. Computerized tomographic angiography (CTA) is routinely performed for various clinical indications, and offers the unique potential to study 3D aortic deformation. We sought to investigate the feasibility of performing 3D aortic strain analysis in a candidate population of patients undergoing transcatheter aortic valve replacement (TAVR). Twenty-one patients with severe aortic valve stenosis (AS) referred for TAVR underwent ECG-gated CTA and echocardiography. CTA images were analyzed using a 3D feature-tracking based technique to construct a dynamic aortic mesh model to perform peak principal strain amplitude (PPSA) analysis. Segmental strain values were correlated against clinical, hemodynamic and echocardiographic variables. Reproducibility analysis was performed. The mean patient age was 81±6 years. Mean left ventricular ejection fraction was 52±14%, aortic valve area (AVA) 0.6±0.3 cm² and mean AS pressure gradient (MG) 44±11 mmHg. CTA-based 3D PPSA analysis was feasible in all subjects. Mean PPSA values for the global thoracic aorta, ascending aorta, aortic arch and descending aorta segments were 6.5±3.0, 10.2±6.0, 6.1±2.9 and 3.3±1.7%, respectively. 3D PPSA values demonstrated significantly more impairment with measures of worsening AS severity, including AVA and MG, for the global thoracic aorta and ascending segment. 3D PPSA analysis is clinically feasible from routine ECG-gated CTA. Appropriate reductions in PPSA were identified with increasing AS hemodynamic severity. Expanded study of 3D aortic PPSA for patients with various forms of aortic disease is warranted.
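
    In general terms, the principal strain values reported here come from an eigen-decomposition of a strain tensor built from the tracked deformation of the aortic mesh. The sketch below shows that core computation for a single element with a hypothetical deformation gradient; it is a schematic of the underlying mechanics, not the feature-tracking pipeline used in the study.

        import numpy as np

        def peak_principal_strain(F):
            """Largest principal Green-Lagrange strain for a 3x3 deformation gradient F."""
            E = 0.5 * (F.T @ F - np.eye(3))       # Green-Lagrange strain tensor
            return np.linalg.eigvalsh(E).max()    # principal strains = eigenvalues of E

        # Hypothetical deformation gradient for one element between two cardiac phases:
        # about 8% stretch along one axis, slight shear, mild contraction elsewhere.
        F = np.array([[1.08, 0.02, 0.00],
                      [0.00, 0.97, 0.01],
                      [0.00, 0.00, 0.96]])
        print(f"peak principal strain: {100 * peak_principal_strain(F):.1f} %")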

  14. Diagnostic medical imaging systems. X-ray radiography and angiography, computerized tomography, nuclear medicine, NMR imaging, sonography, integrated image information systems. 3. rev. and enl. ed.

    International Nuclear Information System (INIS)

    Morneburg, H.

    1995-01-01

    This third edition is based on major review and updating work. Many recent developments have been included, as for instance novel systems for fluoroscopy and mammography, spiral CT and electron beam CT, nuclear medical tomography (SPECT and PET), novel techniques for fast NMR imaging, spectral and colour coded duplex sonography, as well as a new chapter on integrated image information systems, including network installations. (orig.) [de

  15. Cumulative Effects of Concussion History on Baseline Computerized Neurocognitive Test Scores: Systematic Review and Meta-analysis.

    Science.gov (United States)

    Alsalaheen, Bara; Stockdale, Kayla; Pechumer, Dana; Giessing, Alexander; He, Xuming; Broglio, Steven P

    It is unclear whether individuals with a history of single or multiple clinically recovered concussions exhibit worse cognitive performance on baseline testing compared with individuals with no concussion history. To analyze the effects of concussion history on baseline neurocognitive performance using a computerized neurocognitive test. PubMed, CINAHL, and PsycINFO were searched in November 2015. The search was supplemented by a hand search of references. Studies were included if participants completed the Immediate Post-concussion Assessment and Cognitive Test (ImPACT) at baseline (ie, preseason) and if performance was stratified by previous history of single or multiple concussions. Systematic review and meta-analysis. Level 2. Sample size, demographic characteristics of participants, as well as performance of participants on verbal memory, visual memory, visual-motor processing speed, and reaction time were extracted from each study. A random-effects pooled meta-analysis revealed that, with the exception of worsened visual memory for those with 1 previous concussion (Hedges g = 0.10), no differences were observed between participants with 1 or multiple concussions compared with participants without previous concussions. With the exception of decreased visual memory based on history of 1 concussion, history of 1 or multiple concussions was not associated with worse baseline cognitive performance.
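
    For reference, the Hedges g values pooled in such a meta-analysis are standardized mean differences with a small-sample bias correction. The sketch below computes g for one hypothetical pair of groups (e.g., previously concussed versus non-concussed baseline scores); it is a generic illustration, not the authors' analysis code.

        import numpy as np

        def hedges_g(x1, x2):
            """Standardized mean difference with Hedges' small-sample correction."""
            n1, n2 = len(x1), len(x2)
            s_pooled = np.sqrt(((n1 - 1) * np.var(x1, ddof=1) + (n2 - 1) * np.var(x2, ddof=1))
                               / (n1 + n2 - 2))
            d = (np.mean(x1) - np.mean(x2)) / s_pooled          # Cohen's d
            correction = 1 - 3 / (4 * (n1 + n2) - 9)            # Hedges' bias correction
            return d * correction

        rng = np.random.default_rng(1)
        concussed = rng.normal(83, 10, size=60)       # hypothetical visual-memory scores
        controls = rng.normal(85, 10, size=120)
        print(f"Hedges g = {hedges_g(concussed, controls):.2f}")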

  16. Quantitative analysis of receptor imaging

    International Nuclear Information System (INIS)

    Fu Zhanli; Wang Rongfu

    2004-01-01

    Model-based methods for quantitative analysis of receptor imaging, including kinetic, graphical and equilibrium methods, are introduced in detail. Some technical problems facing quantitative analysis of receptor imaging, such as the correction for in vivo metabolism of the tracer, the radioactivity contribution from blood volume within the ROI, and the estimation of the nondisplaceable ligand concentration, are also reviewed briefly
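
    As one concrete example of the graphical methods mentioned, the Logan plot estimates the total distribution volume V_T of a reversible tracer as the late-time slope of ∫C_T dt / C_T(t) versus ∫C_p dt / C_T(t). The sketch below applies that transformation to hypothetical tissue and plasma time-activity curves; it is a schematic of the method, not software from the review.

        import numpy as np
        from scipy.integrate import cumulative_trapezoid

        def logan_vt(t, ct, cp, t_star=30.0):
            """Logan graphical analysis: the late-time slope estimates V_T."""
            int_ct = cumulative_trapezoid(ct, t, initial=0.0)
            int_cp = cumulative_trapezoid(cp, t, initial=0.0)
            late = t >= t_star                              # quasi-linear portion only
            x = int_cp[late] / ct[late]                     # integral of plasma / tissue
            y = int_ct[late] / ct[late]                     # integral of tissue / tissue
            slope, _intercept = np.polyfit(x, y, 1)
            return slope

        # Hypothetical data: plasma input and a one-tissue-compartment tissue curve with
        # K1 = 0.1 and k2 = 0.05, so the nominal V_T = K1/k2 = 2.0 (minutes, arbitrary units).
        t = np.arange(0.0, 90.0, 0.5)
        cp = 12.0 * np.exp(-0.15 * t) + 0.8 * np.exp(-0.01 * t)
        dt = t[1] - t[0]
        ct = 0.1 * np.convolve(cp, np.exp(-0.05 * t))[:t.size] * dt
        print(f"estimated V_T: {logan_vt(t, ct, cp):.2f}")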

  17. Introduction to Medical Image Analysis

    DEFF Research Database (Denmark)

    Paulsen, Rasmus Reinhold; Moeslund, Thomas B.

    The aim of the book is to present the fascinating world of medical image analysis in an easy and interesting way. Compared to many standard books on image analysis, the approach we have chosen is less mathematical and more casual. Some of the key algorithms are exemplified in C-code. Please note that the code...

  18. Hyperspectral image analysis. A tutorial

    DEFF Research Database (Denmark)

    Amigo Rubio, Jose Manuel; Babamoradi, Hamid; Elcoroaristizabal Martin, Saioa

    2015-01-01

    This tutorial aims at providing guidelines and practical tools to assist with the analysis of hyperspectral images. Topics like hyperspectral image acquisition, image pre-processing, multivariate exploratory analysis, hyperspectral image resolution, classification and final digital image processing will be exposed, and some guidelines given and discussed. Due to the broad character of current applications and the vast number of multivariate methods available, this paper has focused on an industrial chemical framework to explain, in a step-wise manner, how to develop a classification methodology to differentiate between several types of plastics by using Near infrared hyperspectral imaging and Partial Least Squares - Discriminant Analysis. Thus, the reader is guided through every single step and oriented in order to adapt those strategies to the user's case.

  19. Hyperspectral image analysis. A tutorial

    International Nuclear Information System (INIS)

    Amigo, José Manuel; Babamoradi, Hamid; Elcoroaristizabal, Saioa

    2015-01-01

    This tutorial aims at providing guidelines and practical tools to assist with the analysis of hyperspectral images. Topics like hyperspectral image acquisition, image pre-processing, multivariate exploratory analysis, hyperspectral image resolution, classification and final digital image processing will be exposed, and some guidelines given and discussed. Due to the broad character of current applications and the vast number of multivariate methods available, this paper has focused on an industrial chemical framework to explain, in a step-wise manner, how to develop a classification methodology to differentiate between several types of plastics by using Near infrared hyperspectral imaging and Partial Least Squares – Discriminant Analysis. Thus, the reader is guided through every single step and oriented in order to adapt those strategies to the user's case. - Highlights: • Comprehensive tutorial of Hyperspectral Image analysis. • Hierarchical discrimination of six classes of plastics containing flame retardant. • Step by step guidelines to perform class-modeling on hyperspectral images. • Fusion of multivariate data analysis and digital image processing methods. • Promising methodology for real-time detection of plastics containing flame retardant.
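
    A minimal version of the classification step described above, PLS-DA on unfolded hyperspectral pixels, can be sketched as PLS regression on dummy-coded class labels followed by an argmax over the predicted columns. The example assumes scikit-learn and synthetic spectra; it is not the tutorial's own code or data.

        import numpy as np
        from sklearn.cross_decomposition import PLSRegression

        rng = np.random.default_rng(0)
        n_per_class, n_bands, n_classes = 200, 100, 3

        # Synthetic NIR-like spectra: each class has its own smooth mean spectrum plus noise.
        means = rng.normal(0.0, 1.0, size=(n_classes, n_bands)).cumsum(axis=1) / 10
        X = np.vstack([means[c] + rng.normal(0, 0.2, size=(n_per_class, n_bands))
                       for c in range(n_classes)])
        labels = np.repeat(np.arange(n_classes), n_per_class)
        Y = np.eye(n_classes)[labels]                      # dummy (one-hot) coding for PLS-DA

        pls = PLSRegression(n_components=5).fit(X, Y)      # PLS2 on the dummy matrix
        pred = pls.predict(X).argmax(axis=1)               # assign each pixel to the highest response
        print("training accuracy:", (pred == labels).mean())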

  20. Stochastic geometry for image analysis

    CERN Document Server

    Descombes, Xavier

    2013-01-01

    This book develops the stochastic geometry framework for image analysis purposes. Two main frameworks are described: marked point process and random closed sets models. We derive the main issues for defining an appropriate model. The algorithms for sampling and optimizing the models as well as for estimating parameters are reviewed. Numerous applications, covering remote sensing images, biological and medical imaging, are detailed. This book provides all the necessary tools for developing an image analysis application based on modern stochastic modeling.

  1. Computerized tomographic system

    International Nuclear Information System (INIS)

    Godbarsen, R.; Barrett, D.M.; Garrott, P.M.; Foley, L.E.; Redington, R.W.; Lambert, T.W.; Edelheit, L.S.

    1981-01-01

    A computerized tomographic system for examining human breasts is described in detail. Conventional X-ray scanning apparatus has difficulty in achieving the levels of image definition and examination speeds required for mass screening. A novel method of scanning successive layers of the breast with a rotating X-ray beam is discussed and details of the control circuitry and sequence steps are given. The method involves immersing the breast in an inner fluid (e.g. water) filled container which is stationary during an examination and is surrounded by a large outer container which is also filled with the fluid; the inner and outer containers are always maintained at a constant height and the X-ray absorption across the fan-shaped beam is as close as possible to constant. (U.K.)

  2. Computerized Analysis of Acoustic Characteristics of Patients with Internal Nasal Valve Collapse Before and After Functional Rhinoplasty

    Science.gov (United States)

    Rezaei, Fariba; Omrani, Mohammad Reza; Abnavi, Fateme; Mojiri, Fariba; Golabbakhsh, Marzieh; Barati, Sohrab; Mahaki, Behzad

    2015-01-01

    Acoustic analysis of sounds produced during speech provides significant information about the physiology of the larynx and the vocal tract. The analysis of the voice power spectrum is a fundamental and sensitive method of acoustic assessment that provides valuable information about the voice source and the characteristics of the vocal tract resonance cavities. Changes in long-term average spectrum (LTAS) spectral tilt and harmonics-to-noise ratio (HNR) were analyzed to assess voice quality before and after functional rhinoplasty in patients with internal nasal valve collapse. Before and 3 months after functional rhinoplasty, 12 participants were evaluated, and HNR and LTAS spectral tilt for the /a/ and /i/ vowels were estimated. An increase in HNR and a decrease in LTAS spectral tilt were observed after surgery. Mean LTAS spectral tilt in vowel /a/ decreased from 2.37 ± 1.04 to 2.28 ± 1.17 (P = 0.388), and it decreased from 4.16 ± 1.65 to 2.73 ± 0.69 in vowel /i/ (P = 0.008). Mean HNR in the vowel /a/ increased from 20.71 ± 3.93 to 25.06 ± 2.67 (P = 0.002), and it increased from 21.28 ± 4.11 to 25.26 ± 3.94 in vowel /i/ (P = 0.002). Modification of the vocal tract caused the vocal cords to close sufficiently, showing that although rhinoplasty did not affect the larynx directly, it changed the structure of the vocal tract and consequently the resonance of voice production. The aim of this study was to investigate the changes in voice parameters after functional rhinoplasty in patients with internal nasal valve collapse by computerized analysis of acoustic characteristics. PMID:26955564
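
    HNR is commonly derived from the normalized autocorrelation of the voiced signal: if r is the autocorrelation peak at the pitch period, HNR is approximately 10·log10(r / (1 - r)). The sketch below applies this to a synthetic sustained vowel; it is a simplified illustration of the measure, not the acoustic-analysis software used in the study.

        import numpy as np

        def hnr_db(signal, fs, fmin=75.0, fmax=500.0):
            """Rough harmonics-to-noise ratio from the normalized autocorrelation peak."""
            x = signal - signal.mean()
            ac = np.correlate(x, x, mode='full')[len(x) - 1:]   # one-sided autocorrelation
            ac /= ac[0]                                         # normalize by lag-0 energy
            lo, hi = int(fs / fmax), int(fs / fmin)             # plausible pitch-period lags
            r = min(ac[lo:hi].max(), 0.999999)                  # guard against log of zero
            return 10 * np.log10(r / (1 - r))

        fs = 16000
        t = np.arange(0, 0.5, 1 / fs)
        voiced = np.sin(2 * np.pi * 120 * t) + 0.4 * np.sin(2 * np.pi * 240 * t)  # harmonics
        noisy = voiced + 0.1 * np.random.default_rng(2).normal(size=t.size)       # added noise
        print(f"HNR = {hnr_db(noisy, fs):.1f} dB (approx.)")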

  3. Data engineering systems: Computerized modeling and data bank capabilities for engineering analysis

    Science.gov (United States)

    Kopp, H.; Trettau, R.; Zolotar, B.

    1984-01-01

    The Data Engineering System (DES) is a computer-based system that organizes technical data and provides automated mechanisms for storage, retrieval, and engineering analysis. The DES combines the benefits of a structured data base system with automated links to large-scale analysis codes. While the DES provides the user with many of the capabilities of a computer-aided design (CAD) system, the systems are actually quite different in several respects. A typical CAD system emphasizes interactive graphics capabilities and organizes data in a manner that optimizes these graphics. On the other hand, the DES is a computer-aided engineering system intended for the engineer who must operationally understand an existing or planned design or who desires to carry out additional technical analysis based on a particular design. The DES emphasizes data retrieval in a form that not only provides the engineer access to search and display the data but also links the data automatically with the computer analysis codes.

  4. Multimodality image analysis work station

    International Nuclear Information System (INIS)

    Ratib, O.; Huang, H.K.

    1989-01-01

    The goal of this project is to design and implement a PACS (picture archiving and communication system) workstation for quantitative analysis of multimodality images. The Macintosh II personal computer was selected for its friendly user interface, its popularity among the academic and medical community, and its low cost. The Macintosh operates as a stand alone workstation where images are imported from a central PACS server through a standard Ethernet network and saved on a local magnetic or optical disk. A video digitizer board allows for direct acquisition of images from sonograms or from digitized cine angiograms. The authors have focused their project on the exploration of new means of communicating quantitative data and information through the use of an interactive and symbolic user interface. The software developed includes a variety of image analysis, algorithms for digitized angiograms, sonograms, scintigraphic images, MR images, and CT scans

  5. CONTEXT BASED FOOD IMAGE ANALYSIS

    OpenAIRE

    He, Ye; Xu, Chang; Khanna, Nitin; Boushey, Carol J.; Delp, Edward J.

    2013-01-01

    We are developing a dietary assessment system that records daily food intake through the use of food images. Recognizing food in an image is difficult due to large visual variance with respect to eating or preparation conditions. This task becomes even more challenging when different foods have similar visual appearance. In this paper we propose to incorporate two types of contextual dietary information, food co-occurrence patterns and personalized learning models, in food image analysis to r...

  6. Multispectral analysis of multimodal images

    Energy Technology Data Exchange (ETDEWEB)

    Kvinnsland, Yngve; Brekke, Njaal (Dept. of Surgical Sciences, Univ. of Bergen, Bergen (Norway)); Taxt, Torfinn M.; Gruener, Renate (Dept. of Biomedicine, Univ. of Bergen, Bergen (Norway))

    2009-02-15

    An increasing number of multimodal images represent a valuable increase in available image information, but at the same time complicate the extraction of diagnostic information across the images. Multispectral analysis (MSA) has the potential to simplify this problem substantially, as an unlimited number of images can be combined and tissue properties across the images can be extracted automatically. Materials and methods. We have developed a software solution for MSA containing two algorithms for unsupervised classification, an EM-algorithm finding multinormal class descriptions and the k-means clustering algorithm, and two for supervised classification, a Bayesian classifier using multinormal class descriptions and a kNN-algorithm. The software has an efficient user interface for the creation and manipulation of class descriptions, and it has proper tools for displaying the results. Results. The software has been tested on different sets of images. One application is to segment cross-sectional images of brain tissue (T1- and T2-weighted MR images) into its main normal tissues and brain tumors. Another interesting set of images are the perfusion maps and diffusion maps derived from raw MR images. The software returns segmentations that seem to be sensible. Discussion. The MSA software appears to be a valuable tool for image analysis with multimodal images at hand. It readily gives a segmentation of image volumes that visually seems sensible. However, to really learn how to use MSA, it will be necessary to gain more insight into what tissues the different segments contain, and the upcoming work will therefore be focused on examining the tissues through, for example, histological sections.
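
    The unsupervised part of such a workflow can be sketched generically: stack the co-registered modalities into one feature vector per voxel and cluster, either with k-means or with a Gaussian mixture fitted by EM. The example below uses scikit-learn on synthetic two-channel data; it only illustrates the idea and is not the MSA software described in the article.

        import numpy as np
        from sklearn.cluster import KMeans
        from sklearn.mixture import GaussianMixture

        rng = np.random.default_rng(3)

        # Synthetic "T1" and "T2" intensities for three tissue classes (multinormal clusters).
        centers = np.array([[0.2, 0.8], [0.5, 0.4], [0.9, 0.2]])
        voxels = np.vstack([rng.normal(c, 0.05, size=(1000, 2)) for c in centers])

        kmeans_labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(voxels)
        gmm = GaussianMixture(n_components=3, covariance_type='full', random_state=0)
        gmm_labels = gmm.fit_predict(voxels)                 # EM with multinormal class models

        # The labelled feature vectors map back onto the image grid to give a segmentation.
        print("k-means cluster sizes:", np.bincount(kmeans_labels))
        print("GMM cluster sizes:    ", np.bincount(gmm_labels))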

  7. Cognitive impairment and its relation to imaging measures in multiple sclerosis: a study using a computerized battery.

    Science.gov (United States)

    Pellicano, Clelia; Kane, Robert L; Gallo, Antonio; Xiaobai, Li; Stern, Susan K; Ikonomidou, Vasiliki N; Evangelou, Iordanis E; Ohayon, Joan M; Ehrmantraut, Mary; Cantor, Fredric K; Bagnato, Francesca

    2013-07-01

    Cognitive impairment (CI) is an important component of multiple sclerosis (MS) disability. A complex biological interplay between white matter (WM) and gray matter (GM) disease likely sustains CI. This study aims to address this issue by exploring the association between the extent of normal WM and GM disease and CI. Cognitive function of 24 MS patients and 24 healthy volunteers (HVs) was studied using the Automated Neuropsychological Assessment Metrics (ANAM) battery. WM focal lesions and normal appearing WM (NAWM) volume in patients, cortical thickness (CTh) and deep GM structure volumes in both patients and HVs were measured by high field strength (3.0-Tesla; 3T) imaging. An analysis of covariance showed that patients performed worse than HVs on Code Substitution Delayed Memory (P = .04) and Procedural Reaction Time (P = .05) indicative of reduced performance in memory, cognitive flexibility, and processing speed. A summary score (Index of Cognitive Efficiency) indicating global test battery performance was also lower for the patient group (P = .04). Significant associations, as determined by the Spearman rank correlation tests, were noted between each of these 3 cognitive scores and measures of NAWM volume [CDD-TP1(r = .609; P = .0035), PRO-TP1 (r = .456; P = .029) and ICE (r = .489; P = .0129)], CTh (r = .5; P ≤ .05) and volume of subcortical normal appearing GM (NAGM) structures (r = .4; P≤ .04), but not WM lesions. Both NAWM and NAGM volumes are related to CI in MS. The results highlight once again the urgent need to develop pharmacological strategies protecting patients from widespread neurodegeneration as possible preventive strategies of CI development. Copyright © 2012 by the American Society of Neuroimaging.

  8. Imaging mass spectrometry statistical analysis.

    Science.gov (United States)

    Jones, Emrys A; Deininger, Sören-Oliver; Hogendoorn, Pancras C W; Deelder, André M; McDonnell, Liam A

    2012-08-30

    Imaging mass spectrometry is increasingly used to identify new candidate biomarkers. This clinical application of imaging mass spectrometry is highly multidisciplinary: expertise in mass spectrometry is necessary to acquire high quality data, histology is required to accurately label the origin of each pixel's mass spectrum, disease biology is necessary to understand the potential meaning of the imaging mass spectrometry results, and statistics to assess the confidence of any findings. Imaging mass spectrometry data analysis is further complicated because of the unique nature of the data (within the mass spectrometry field); several of the assumptions implicit in the analysis of LC-MS/profiling datasets are not applicable to imaging. The very large size of imaging datasets and the reporting of many data analysis routines, combined with inadequate training and accessible reviews, have exacerbated this problem. In this paper we provide an accessible review of the nature of imaging data and the different strategies by which the data may be analyzed. Particular attention is paid to the assumptions of the data analysis routines to ensure that the reader is apprised of their correct usage in imaging mass spectrometry research. Copyright © 2012 Elsevier B.V. All rights reserved.

  9. UV imaging in pharmaceutical analysis

    DEFF Research Database (Denmark)

    Østergaard, Jesper

    2018-01-01

    UV imaging provides spatially and temporally resolved absorbance measurements, which are highly useful in pharmaceutical analysis. Commercial UV imaging instrumentation was originally developed as a detector for separation sciences, but the main use is in the area of in vitro dissolution...

  10. Methods in quantitative image analysis.

    Science.gov (United States)

    Oberholzer, M; Ostreicher, M; Christen, H; Brühlmann, M

    1996-05-01

    The main steps of image analysis are image capturing, image storage (compression), correcting imaging defects (e.g. non-uniform illumination, electronic noise, glare effect), image enhancement, segmentation of objects in the image and image measurements. Digitisation is performed by a camera. The most modern types include a frame-grabber, converting the analog signal into digital (numerical) information. The numerical information consists of the grey values describing the brightness of every point within the image, named a pixel. The information is stored in bits; eight bits are summarised in one byte. Therefore, grey values can range from 0 to 255 (2^8 = 256 levels). The human eye seems to be quite content with a display of about 64 different grey values (6 bits). In a digitised image, the pixel grey values can vary within regions that are uniform in the original scene: the image is noisy. The noise is mainly manifested in the background of the image. For an optimal discrimination between different objects or features in an image, uniformity of illumination in the whole image is required. These defects can be minimised by shading correction [subtraction of a background (white) image from the original image, pixel per pixel, or division of the original image by the background image]. The brightness of an image represented by its grey values can be analysed for every single pixel or for a group of pixels. The most frequently used pixel-based image descriptors are optical density, integrated optical density, the histogram of the grey values, mean grey value and entropy. The distribution of the grey values existing within an image is one of the most important characteristics of the image. However, the histogram gives no information about the texture of the image. The simplest way to improve the contrast of an image is to expand the brightness scale by spreading the histogram out to the full available range. Rules for transforming the grey value
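
    The shading correction described above can be written in a few lines: either subtract a background (white) image pixel by pixel, or divide by it and rescale. The snippet below shows the division variant on synthetic data, as a generic illustration rather than code from the review.

        import numpy as np

        def shading_correct(image, background, eps=1e-6):
            """Flat-field correction: divide by the background image and restore brightness."""
            background = np.maximum(background.astype(float), eps)   # avoid division by zero
            corrected = image.astype(float) / background
            return corrected * background.mean()                     # keep the original scale

        # Synthetic example: a uniform scene multiplied by a non-uniform illumination field.
        yy, xx = np.mgrid[0:256, 0:256]
        illumination = 0.6 + 0.4 * np.exp(-((xx - 128) ** 2 + (yy - 128) ** 2) / (2 * 80 ** 2))
        scene = np.full((256, 256), 100.0)
        observed = scene * illumination
        flat = shading_correct(observed, illumination * 255.0)       # background (white) image
        print("corrected image std:", flat.std().round(3))           # close to 0 after correction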

  11. Computerized bone density analysis of the proximal phalanx of the horse

    International Nuclear Information System (INIS)

    Thompson, K.N.; Cheung, T.K.; Putnam, M.

    1996-01-01

    This study utilized computed tomography to determine the density patterns and the subchondral bone thickness of the first phalanx of the horse. An image processing system and commercially available software were used to process the computed tomographic slices obtained from the first phalanges of a 2-year-old Thoroughbred horse. The thickness and density of the medial and lateral cortices in the mid-shaft of the bone were similar; however, the cortex on the dorsal aspect was more dense and extended farther toward the proximal and distal aspects of the bone than the cortex on the palmar aspect. Density of the cortical bone was highest at the region of the bone with the smallest diameter. The cortical bone density at mid-shaft was approximately 3.5 times the cancellous bone density at the proximal aspect and 2.5 times that at the distal aspect of the bone. A moderate correlation (r = 0.53, p < 0.01) was found between the subchondral bone density and thickness. Despite limited numbers of specimens used, this study demonstrated the potential applications of computed tomography for investigating equine joint mechanics and diseases

  12. Suitability of PCR fingerprinting, infrequent-restriction-site PCR, and pulsed-field gel electrophoresis, combined with computerized gel analysis, in library typing of Salmonella enterica serovar enteritidis

    DEFF Research Database (Denmark)

    Garaizar, J.; Lopez-Molina, N.; Laconcha, I.

    2000-01-01

    Strains of Salmonella enterica (n = 212) of different serovars and phage types were used to establish a computerized library typing system for serovar Enteritidis on the basis of PCR fingerprinting, infrequent-restriction-site PCR (IRS-PCR), or pulsed-field gel electrophoresis (PFGE). The rate... showed an intercenter reproducibility value of 93.3%. The high reproducibility of PFGE, combined with the previously determined high discrimination, directed its use for library typing. The use of PFGE with the enzymes XbaI, BlnI, and SpeI for library typing of serovar Enteritidis was assessed with GelCompar 4.0 software. Three computer libraries of PFGE DNA profiles were constructed, and their ability to recognize new DNA profiles was analyzed. The results obtained pointed out that the combination of PFGE with computerized analysis could be suitable for long-term epidemiological comparison and surveillance...

  13. Quantitative inspection by computerized tomography

    International Nuclear Information System (INIS)

    Lopes, R.T.; Assis, J.T. de; Jesus, E.F.O. de

    1989-01-01

    Computerized tomography (CT) is a nondestructive testing method that furnishes quantitative information, permitting the detection and accurate localization of defects, measurement of internal dimensions, and measurement and mapping of the density distribution. CT technology is very versatile, presenting no restrictions with respect to the form, size or composition of the object. A tomographic system, designed and constructed in our laboratory, is presented. The applications and limitations of this system, illustrated by tomographic images, are shown. (V.R.B.)

  14. A Study on the computerization of power system analysis of local control center

    Energy Technology Data Exchange (ETDEWEB)

    Kwon, Tae Won; Hwang, Jong Young; Jeon, Young Soo; Park, Yong Bae [Korea Electric Power Corp. (KEPCO), Taejon (Korea, Republic of). Research Center; Kim, Kun Jung; Kim, Yong Bae [Electrical Engineering and Science Research Institute (Korea, Republic of)

    1995-12-31

    After investigating several programs, the PC version of PTI's PSS/E software was introduced. For convenient use, the application manual and the operational manual were translated into Korean. Bus numbering was unified for managing and sharing the data used in the PSS/E program. A quick-reference manual on load flow theory was prepared, and PSS/E users were trained to promote power system analysis and power system planning (author). 12 refs., 43 figs.

  15. Simplified diagnostic coding sheet for computerized data storage and analysis in ophthalmology.

    Science.gov (United States)

    Tauber, J; Lahav, M

    1987-11-01

    A review of currently available diagnostic coding systems revealed that most are either too abbreviated or too detailed. We have compiled a simplified diagnostic coding sheet based on the International Classification of Diseases, 9th revision (ICD-9), which is both complete and easy to use in a general practice. The information is transferred to a computer, which uses the relevant ICD-9 diagnoses as a database and can be retrieved later for display of patients' problems or analysis of clinical data.

  16. A Study on the computerization of power system analysis of local control center

    Energy Technology Data Exchange (ETDEWEB)

    Kwon, Tae Won; Hwang, Jong Young; Jeon, Young Soo; Park, Yong Bae [Korea Electric Power Corp. (KEPCO), Taejon (Korea, Republic of). Research Center; Kim, Kun Jung; Kim, Yong Bae [Electrical Engineering and Science Research Institute (Korea, Republic of)

    1996-12-31

    After investigating several programs, the PC version of PTI's PSS/E software was introduced. For convenient use, the application manual and the operational manual were translated into Korean. Bus numbering was unified for managing and sharing the data used in the PSS/E program. A quick-reference manual on load flow theory was prepared, and PSS/E users were trained to promote power system analysis and power system planning (author). 12 refs., 43 figs.

  17. Comparison of computerized digital and film-screen radiography: response to variation in imaging kVp

    Energy Technology Data Exchange (ETDEWEB)

    Broderick, N J; Long, B; Dreesen, R G; Cohen, M D; Cory, D A [Riley Hospital for Children, Indiana Univ. School of Medicine, Indianapolis, IN (United States). Dept. of Radiology; Katz, B P; Kalasinski, L A [Regenstreif Inst., Indiana Univ. School of Medicine, Indianapolis, IN (United States). Dept. of Medicine

    1992-09-01

    A controlled prospective study, in an animal model chosen to simulate portable neonatal radiography, was performed to compare the response of the Philips Computed Radiography (CR) system and conventional 200 speed film-screen (FS) to variation in imaging kVp. Acceptable images were obtained on the CR system over a very wide kVp range. In contrast the FS system produced acceptable images over a narrow kVp range. This ability suggests that the CR system should eliminate the need for repeat examinations in cases where a suboptimal kVp setting would have resulted in an unacceptable FS image. CR technology should therefore be ideally suited to portable radiography especially in situations where selection of correct exposure factors is difficult as in the neonatal nursery. (orig.).

  18. Comparison of computerized digital and film-screen radiography: response to variation in imaging kVp

    International Nuclear Information System (INIS)

    Broderick, N.J.; Long, B.; Dreesen, R.G.; Cohen, M.D.; Cory, D.A.; Katz, B.P.; Kalasinski, L.A.

    1992-01-01

    A controlled prospective study, in an animal model chosen to simulate portable neonatal radiography, was performed to compare the response of the Philips Computed Radiography (CR) system and conventional 200 speed film-screen (FS) to variation in imaging kVp. Acceptable images were obtained on the CR system over a very wide kVp range. In contrast the FS system produced acceptable images over a narrow kVp range. This ability suggests that the CR system should eliminate the need for repeat examinations in cases where a suboptimal kVp setting would have resulted in an unacceptable FS image. CR technology should therefore be ideally suited to portable radiography especially in situations where selection of correct exposure factors is difficult as in the neonatal nursery. (orig.)

  19. Analysis of absorbed dose in cervical spine scanning by computerized tomography using simulator objects

    International Nuclear Information System (INIS)

    Lyra, Maria Henriqueta Freire

    2015-01-01

    Computed tomography (CT) has become an important diagnostic tool after the continued development of multidetector CT (MDCT), which allows faster acquisition of images with better quality than the previous technology. However, there is increased radiation exposure, especially in examinations that require more than one acquisition, such as dynamic exams and enhancement studies intended to discriminate low-contrast soft tissue injury from normal tissue. Cervical spine MDCT examinations are used for the diagnosis of soft tissue and vascular changes, fractures, dysplasia and other diseases with instability, which guide patient treatment and rehabilitation. This study aims at checking the range of absorbed dose in the thyroid and other organs during MDCT scanning of the cervical spine, with and without a bismuth thyroid shield. In this experiment a cervical spine MDCT scan was performed on anthropomorphic phantoms, from the occipital bone to the first thoracic vertebra, using 64- and 16-channel CT scanners. Thermoluminescent dosimeters were used to obtain the absorbed dose in the thyroid, lenses, foramen magnum and breasts of the phantom. The results show that the thyroid received the highest dose, 60.0 mGy, in the female phantom, according to the incidence of the primary X-ray beam. The absorbed doses in these tests showed significant differences in the evaluated organs, p value < 0.005, except for the foramen magnum and breasts. With the bismuth thyroid shield applied on the female phantom, the doses in the thyroid and in the lenses were reduced by 27% and 52%, respectively. On the other hand, a reduction of 23.3% in the thyroid and an increase of 49.0% in the lenses were measured on the male phantom. (author)

  20. Analysis of progression of cervical OPLL using computerized tomography: typical sign of maturation of OPLL mass.

    Science.gov (United States)

    Choi, Byung-Wan; Baek, Dong-Hoon; Sheffler, Lindsey C; Chang, Han

    2015-07-17

    OBJECT The progression of cervical ossification of the posterior longitudinal ligament (OPLL) can lead to an increase in the size of the OPLL mass and aggravation of neurological symptoms. In the present study, the authors aimed to analyze the progression of cervical OPLL by using CT imaging, elucidate the morphology of OPLL masses, and evaluate the factors associated with the progression of cervical OPLL. METHODS Sixty patients with cervical OPLL were included. All underwent an initial CT examination and had at least 24 months' follow-up with CT. The mean duration of follow-up was 29.6 months. Fourteen patients (Group A) had CT evidence of OPLL progression, and 46 (Group B) did not show evidence of progression on CT. The 2 groups were compared with respect to the following variables: sex, age, number of involved segments, type of OPLL, and treatment methods. The CT findings, such as the connection of an OPLL mass with the vertebral body and formation of trabeculation in the mass, were evaluated. RESULTS Sex and treatment modality were not associated with OPLL progression. The mean age of the patients in Group A was significantly lower than that in Group B (p = 0.03). The mean number of involved segments was 5.3 in Group A and 3.6 in Group B (p = 0.002). Group A had a higher proportion of cases with the mixed type of OPLL, whereas Group B had a higher proportion of cases with the segmental type (p = 0.02). A connection between the vertebral body and the OPLL mass and trabeculation formation were more common in Group B. CONCLUSIONS Progression of cervical OPLL is associated with younger age, involvement of multiple levels, and mixed-type morphology. OPLL masses that are contiguous with the vertebral body and have trabecular formation are useful findings for identifying masses that are less likely to progress.

  1. Image formation and image analysis in electron microscopy

    International Nuclear Information System (INIS)

    Heel, M. van.

    1981-01-01

    This thesis covers various aspects of image formation and image analysis in electron microscopy. The imaging of relatively strong objects in partially coherent illumination, the coherence properties of thermionic emission sources and the detection of objects in quantum noise limited images are considered. IMAGIC, a fast, flexible and friendly image analysis software package is described. Intelligent averaging of molecular images is discussed. (C.F.)

  2. Image analysis enhancement and interpretation

    International Nuclear Information System (INIS)

    Glauert, A.M.

    1978-01-01

    The necessary practical and mathematical background is provided for the analysis of an electron microscope image in order to extract the maximum amount of structural information. Instrumental methods of image enhancement are described, including the use of the energy-selecting electron microscope and the scanning transmission electron microscope. The problems of image interpretation are considered with particular reference to the limitations imposed by radiation damage and specimen thickness. A brief survey is given of the methods for producing a three-dimensional structure from a series of two-dimensional projections, although the emphasis is placed on the analysis, processing and interpretation of the two-dimensional projection of a structure. (Auth.)

  3. Image Analysis of Eccentric Photorefraction

    Directory of Open Access Journals (Sweden)

    J. Dušek

    2004-01-01

    This article deals with image and data analysis of recorded video-sequences of strabismic infants. It describes a unique noninvasive measuring system for infants based on two measuring methods (the position of the first Purkinje image relative to the centre of the lens, and eccentric photorefraction). The whole process is divided into three steps. The aim of the first step is to obtain video sequences on our special system (Eye Movement Analyser). Image analysis of the recorded sequences is performed in order to obtain curves of basic eye reactions (accommodation and convergence). The last step is to calibrate these curves to the corresponding units (diopters and degrees of movement).

  4. Introduction to Medical Image Analysis

    DEFF Research Database (Denmark)

    Paulsen, Rasmus Reinhold; Moeslund, Thomas B.

    This book is a result of a collaboration between DTU Informatics at the Technical University of Denmark and the Laboratory of Computer Vision and Media Technology at Aalborg University. It is partly based on the book "Image and Video Processing", second edition, by Thomas Moeslund. The aim of the book is to present the fascinating world of medical image analysis in an easy and interesting way. Compared to many standard books on image analysis, the approach we have chosen is less mathematical and more casual. Some of the key algorithms are exemplified in C-code. Please note that the code...

  5. Shape analysis in medical image analysis

    CERN Document Server

    Tavares, João

    2014-01-01

    This book contains thirteen contributions from invited experts of international recognition addressing important issues in shape analysis in medical image analysis, including techniques for image segmentation, registration, modelling and classification, and applications in biology, as well as in cardiac, brain, spine, chest, lung and clinical practice. This volume treats topics such as anatomic and functional shape representation and matching; shape-based medical image segmentation; shape registration; statistical shape analysis; shape deformation; shape-based abnormity detection; shape tracking and longitudinal shape analysis; machine learning for shape modeling and analysis; shape-based computer-aided-diagnosis; shape-based medical navigation; benchmark and validation of shape representation, analysis and modeling algorithms. This work will be of interest to researchers, students, and manufacturers in the fields of artificial intelligence, bioengineering, biomechanics, computational mechanics, computationa...

  6. Automation of an ion chromatograph for precipitation analysis with computerized data reduction

    Science.gov (United States)

    Hedley, Arthur G.; Fishman, Marvin J.

    1982-01-01

    Interconnection of an ion chromatograph, an autosampler, and a computing integrator to form an analytical system for the simultaneous determination of fluoride, chloride, orthophosphate, bromide, nitrate, and sulfate in precipitation samples is described. Computer programs provided with the integrator are modified to implement ion-chromatographic data reduction and data storage. The liquid-flow scheme for the ion chromatograph is changed by the addition of a second suppressor column for greater analytical capacity. An additional valve enables selection of either suppressor column for analysis, as the other column is regenerated and stabilized with concentrated eluent. Minimum limits of detection and quantitation for each anion are calculated; these limits are a function of suppressor exhaustion. Precision for replicate analyses of six precipitation samples for fluoride, chloride, orthophosphate, nitrate, and sulfate ranged from 0.003 to 0.027 milligrams per liter. To determine the accuracy of results, the same samples were spiked with known concentrations of the above-mentioned anions. Average recovery was 108 percent.

  7. Processing and refinement of steel microstructure images for assisting in computerized heat treatment of plain carbon steel

    Science.gov (United States)

    Gupta, Shubhank; Panda, Aditi; Naskar, Ruchira; Mishra, Dinesh Kumar; Pal, Snehanshu

    2017-11-01

    Steels are alloys of iron and carbon, widely used in construction and other applications. The evolution of steel microstructure through various heat treatment processes is an important factor in controlling the properties and performance of steel. Extensive experimentation has been performed to enhance the properties of steel by customizing heat treatment processes. However, experimental analyses are always associated with high resource requirements in terms of cost and time. As an alternative solution, we propose an image processing-based technique for refinement of raw plain carbon steel microstructure images into a digital form usable in experiments related to heat treatment processes of steel in diverse applications. The proposed work follows the conventional steps practiced by materials engineers in manual refinement of steel images and appropriately utilizes basic image processing techniques (including filtering, segmentation, opening, and clustering) to automate the whole process. The proposed refinement of steel microstructure images is aimed at enabling computer-aided simulations of heat treatment of plain carbon steel in a timely and cost-efficient manner; hence it is beneficial for the materials and metallurgy industry. Our experimental results prove the efficiency and effectiveness of the proposed technique.
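
    A bare-bones version of the filtering, segmentation and opening stages of such a pipeline can be sketched with scikit-image on a synthetic micrograph-like image; the steps and parameter values below are illustrative assumptions, not the authors' published procedure.

        import numpy as np
        from skimage import filters, morphology, measure

        rng = np.random.default_rng(4)

        # Synthetic "micrograph": bright grains on a darker matrix, plus acquisition noise.
        image = rng.normal(0.3, 0.05, size=(256, 256))
        image[60:120, 40:110] += 0.4
        image[150:210, 130:220] += 0.5

        smoothed = filters.gaussian(image, sigma=2)              # denoise / filter
        threshold = filters.threshold_otsu(smoothed)             # global segmentation threshold
        binary = smoothed > threshold
        cleaned = morphology.opening(binary, morphology.disk(3)) # remove small artefacts
        labels = measure.label(cleaned)                          # label connected regions

        props = measure.regionprops(labels)
        print("regions found:", len(props), "| areas:", sorted(p.area for p in props))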

  8. Artificial intelligence and medical imaging. Expert systems and image analysis

    International Nuclear Information System (INIS)

    Wackenheim, A.; Zoellner, G.; Horviller, S.; Jacqmain, T.

    1987-01-01

    This paper gives an overview of existing systems for automated image analysis and interpretation in medical imaging, especially in radiology. The example of ORFEVRE, a system for the analysis of CAT-scan images of the cervical triplet (C3-C5) by image analysis and a subsequent expert system, is given and discussed in detail. Possible extensions are described [fr

  9. Computerized systems analysis and optimization of aircraft engine performance, weight, and life cycle costs

    Science.gov (United States)

    Fishbach, L. H.

    1979-01-01

    The computational techniques utilized to determine the optimum propulsion systems for future aircraft applications and to identify system tradeoffs and technology requirements are described. The characteristics and use of the following computer codes are discussed: (1) NNEP - a very general cycle analysis code that can assemble an arbitrary matrix of fans, turbines, ducts, shafts, etc., into a complete gas turbine engine and compute on- and off-design thermodynamic performance; (2) WATE - a preliminary design procedure for calculating engine weight using the component characteristics determined by NNEP; (3) POD DRG - a table look-up program to calculate wave and friction drag of nacelles; (4) LIFCYC - a computer code developed to calculate life cycle costs of engines based on the output from WATE; and (5) INSTAL - a computer code developed to calculate installation effects, inlet performance and inlet weight. Examples are given to illustrate how these computer techniques can be applied to analyze and optimize propulsion system fuel consumption, weight, and cost for representative types of aircraft and missions.

  10. Errors from Image Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Wood, William Monford [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2015-02-23

    A systematic study is presented of the standard analysis of rod-pinch radiographs for obtaining quantitative measurements of areal mass densities, with suggestions for improving the methodology of obtaining quantitative information from radiographed objects.

  11. Pocket pumped image analysis

    Energy Technology Data Exchange (ETDEWEB)

    Kotov, I.V., E-mail: kotov@bnl.gov [Brookhaven National Laboratory, Upton, NY 11973 (United States); O' Connor, P. [Brookhaven National Laboratory, Upton, NY 11973 (United States); Murray, N. [Centre for Electronic Imaging, Open University, Milton Keynes, MK7 6AA (United Kingdom)

    2015-07-01

    The pocket pumping technique is used to detect small electron trap sites. These traps, if present, degrade CCD charge transfer efficiency. To reveal traps in the active area, a CCD is illuminated with a flat field and, before the image is read out, the accumulated charges are moved back and forth a number of times in the parallel direction. As charges are moved over a trap, an electron is removed from the original pocket and re-emitted in the following pocket. As the process repeats, one pocket becomes depleted and the neighboring pocket accumulates an excess of charge. As a result a “dipole” signal appears on the otherwise flat background level. The amplitude of the dipole signal depends on the trap pumping efficiency. This paper is focused on the trap identification technique and particularly on new methods developed for this purpose. A sensor with bad segments was deliberately chosen for algorithm development and to demonstrate the sensitivity and power of the new methods in uncovering sensor defects.
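
    One simple way to locate such dipoles, sketched below under the assumption of a roughly flat background, is to scan the pumped image for adjacent pixel pairs along the transfer direction that deviate from the background by similar amounts in opposite directions. This is a generic illustration, not the detection algorithms developed in the paper.

        import numpy as np

        def find_dipoles(pumped, threshold):
            """Return (row, col) of pixels where a deficit is followed by a matching excess
            along the parallel-transfer (row) direction."""
            background = np.median(pumped)
            dev = pumped - background
            deficit = dev[:-1, :] < -threshold          # charge removed from one pocket...
            excess = dev[1:, :] > threshold             # ...and re-emitted into the next one
            rows, cols = np.nonzero(deficit & excess)
            return list(zip(rows, cols))

        # Synthetic flat field with one injected dipole pair at rows 100/101, column 40.
        rng = np.random.default_rng(5)
        image = rng.normal(1000.0, 5.0, size=(200, 80))
        image[100, 40] -= 60.0
        image[101, 40] += 60.0
        print("dipole candidates:", find_dipoles(image, threshold=30.0))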

  12. Quantitative image analysis in sonograms of the thyroid gland

    Energy Technology Data Exchange (ETDEWEB)

    Catherine, Skouroliakou [A' Department of Radiology, University of Athens, Vas.Sophias Ave, Athens 11528 (Greece); Maria, Lyra [A' Department of Radiology, University of Athens, Vas.Sophias Ave, Athens 11528 (Greece)]. E-mail: mlyra@pindos.uoa.gr; Aristides, Antoniou [A' Department of Radiology, University of Athens, Vas.Sophias Ave, Athens 11528 (Greece); Lambros, Vlahos [A' Department of Radiology, University of Athens, Vas.Sophias Ave, Athens 11528 (Greece)

    2006-12-20

    High-resolution, real-time ultrasound is a routine examination for assessing disorders of the thyroid gland. However, current diagnostic practice is based mainly on qualitative evaluation of the resulting sonograms, and therefore depends on the physician's experience. Computerized texture analysis is widely employed on sonographic images of various organs (liver, breast), and it has been proven to increase the sensitivity of diagnosis by providing a better tissue characterization. The present study attempts to characterize thyroid tissue by automatic texture analysis. The texture features that are calculated are based on co-occurrence matrices as proposed by Haralick. The sample consists of 40 patients. For each patient two sonographic images (one for each lobe) are recorded in DICOM format. The lobe is manually delineated in each sonogram, and the co-occurrence matrices for 52 separation vectors are calculated. The texture features extracted from each one of these matrices are: contrast, correlation, energy and homogeneity. Principal component analysis is used to select the optimal set of features. The statistical analysis resulted in the extraction of 21 optimal descriptors. The optimal descriptors are all co-occurrence parameters, as the first-order statistics did not prove to be representative of the image characteristics. The larger number of components is due mainly to correlation at very close or very far separation distances. The results indicate that quantitative analysis of thyroid sonograms can provide an objective characterization of thyroid tissue.
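
    The co-occurrence features used here (contrast, correlation, energy, homogeneity) can be computed, for example, with scikit-image; the snippet below does so for one synthetic region of interest and a handful of separation vectors. The function names are those of recent scikit-image releases (older versions spell them greycomatrix and greycoprops), and the example is an illustration rather than the study's actual pipeline.

        import numpy as np
        from skimage.feature import graycomatrix, graycoprops

        rng = np.random.default_rng(6)
        roi = rng.normal(120, 25, size=(64, 64)).clip(0, 255).astype(np.uint8)  # fake ROI

        # A few separation vectors: distances 1, 2, 4 pixels at 0, 45, 90, 135 degrees.
        glcm = graycomatrix(roi, distances=[1, 2, 4],
                            angles=[0, np.pi / 4, np.pi / 2, 3 * np.pi / 4],
                            levels=256, symmetric=True, normed=True)

        for feature in ("contrast", "correlation", "energy", "homogeneity"):
            values = graycoprops(glcm, feature)          # one value per (distance, angle) pair
            print(f"{feature:12s} mean = {values.mean():.4f}")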

  13. Use of Order Sets in Inpatient Computerized Provider Order Entry Systems: A Comparative Analysis of Usage Patterns at Seven Sites

    Science.gov (United States)

    Wright, Adam; Feblowitz, Joshua C.; Pang, Justine E.; Carpenter, James D.; Krall, Michael A.; Middleton, Blackford; Sittig, Dean F.

    2012-01-01

    Background Many computerized provider order entry (CPOE) systems include the ability to create electronic order sets: collections of clinically-related orders grouped by purpose. Order sets promise to make CPOE systems more efficient, improve care quality and increase adherence to evidence-based guidelines. However, the development and implementation of order sets can be expensive and time-consuming and limited literature exists about their utilization. Methods Based on analysis of order set usage logs from a diverse purposive sample of seven sites with commercially- and internally-developed inpatient CPOE systems, we developed an original order set classification system. Order sets were categorized across seven non-mutually exclusive axes: admission/discharge/transfer (ADT), perioperative, condition-specific, task-specific, service-specific, convenience, and personal. In addition, 731 unique subtypes were identified within five axes: four in ADT (S=4), three in perioperative, 144 in condition-specific, 513 in task-specific, and 67 in service-specific. Results Order sets (n=1,914) were used a total of 676,142 times at the participating sites during a one-year period. ADT and perioperative order sets accounted for 27.6% and 24.2% of usage respectively. Peripartum/labor, chest pain/Acute Coronary Syndrome/Myocardial Infarction and diabetes order sets accounted for 51.6% of condition-specific usage. Insulin, angiography/angioplasty and arthroplasty order sets accounted for 19.4% of task-specific usage. Emergency/trauma, Obstetrics/Gynecology/Labor Delivery and anesthesia accounted for 32.4% of service-specific usage. Overall, the top 20% of order sets accounted for 90.1% of all usage. Additional salient patterns are identified and described. Conclusion We observed recurrent patterns in order set usage across multiple sites as well as meaningful variations between sites. Vendors and institutional developers should identify high-value order set types through concrete

  14. Computerized three-dimensional normal atlas

    International Nuclear Information System (INIS)

    Mano, Isamu; Suto, Yasuzo; Suzuki, Masataka; Iio, Masahiro.

    1990-01-01

    This paper presents our ongoing project in which normal human anatomy and its quantitative data are systematically arranged in a computer. The final product, the Computerized Three-Dimensional Normal Atlas, will be able to supply tomographic images in any direction, 3-D images, and coded information on organs, e.g., anatomical names, CT numbers, and T1 and T2 values. (author)

  15. Analysis of the gammaholographic image formation

    International Nuclear Information System (INIS)

    Fonroget, J.; Roucayrol, J.C.; Perrin, J.; Belvaux, Y.; Paris-11 Univ., 91 - Orsay

    1975-01-01

    Gammaholography, or coded aperture gammagraphy, is a new gammagraphic method in which the standard collimators are replaced by one or more modulator screens placed between the detector and the radioactive object. The recording obtained is a coded image or incoherent hologram which contains three-dimensional information on the object and can be decoded analogically in a very short time. The formation of the image has been analyzed in the coding and optical decoding phases in the case of a single coding screen modulated according to a Fresnel zoned lattice. The analytical expression established for the modulation transfer function (MTF) of the system can be used to study, by computerized simulation, the influence of the number of zones on the quality of the image [fr

  16. Digital image analysis in breast pathology-from image processing techniques to artificial intelligence.

    Science.gov (United States)

    Robertson, Stephanie; Azizpour, Hossein; Smith, Kevin; Hartman, Johan

    2018-04-01

    Breast cancer is the most common malignant disease in women worldwide. In recent decades, earlier diagnosis and better adjuvant therapy have substantially improved patient outcome. Diagnosis by histopathology has proven to be instrumental to guide breast cancer treatment, but new challenges have emerged as our increasing understanding of cancer over the years has revealed its complex nature. As patient demand for personalized breast cancer therapy grows, we face an urgent need for more precise biomarker assessment and more accurate histopathologic breast cancer diagnosis to make better therapy decisions. The digitization of pathology data has opened the door to faster, more reproducible, and more precise diagnoses through computerized image analysis. Software to assist diagnostic breast pathology through image processing techniques have been around for years. But recent breakthroughs in artificial intelligence (AI) promise to fundamentally change the way we detect and treat breast cancer in the near future. Machine learning, a subfield of AI that applies statistical methods to learn from data, has seen an explosion of interest in recent years because of its ability to recognize patterns in data with less need for human instruction. One technique in particular, known as deep learning, has produced groundbreaking results in many important problems including image classification and speech recognition. In this review, we will cover the use of AI and deep learning in diagnostic breast pathology, and other recent developments in digital image analysis. Copyright © 2017 Elsevier Inc. All rights reserved.

  17. Three-dimensional analysis and display of medical images

    International Nuclear Information System (INIS)

    Bajcsy, R.

    1985-01-01

    Until recently, the most common medical images were X-rays on film analyzed by an expert, usually a radiologist, who used, in addition to his/her visual perceptual abilities, knowledge obtained through medical studies, and experience. Today, however, with the advent of various imaging techniques, X-ray computerized axial tomographs (CAT), positron emission tomographs (PET), ultrasound tomographs, nuclear magnetic resonance tomographs (NMR), just to mention a few, the images are generated by computers and displayed on computer-controlled devices; so it is appropriate to think about more quantitative and perhaps automated ways of data analysis. Furthermore, since the data are generated by computer, it is only natural to take advantage of the computer for analysis purposes. In addition, using the computer, one can analyze more data and relate different modalities from the same subject, such as, for example, comparing the CAT images with PET images from the same subject. In the next section (The PET Scanner) the authors shall only briefly mention, with appropriate references, the modeling of the positron emission tomographic scanner, since this imaging technique is not as widely described in the literature as the CAT scanner. The modeling of the interpreter is not going to be mentioned, since it is a topic that by itself deserves a full paper; see, for example, Pizer [1981]. The thrust of this chapter is on modeling the organs that are being imaged and the matching techniques between the model and the data. The image data are from CAT and PET scans. Although the authors believe that their techniques are applicable to any organ of the human body, the examples are only from the brain

  18. Comparison of preoperative computerized tomography scan imaging of the temporal bone with the intra-operative findings in patients undergoing mastoidectomy

    International Nuclear Information System (INIS)

    Gerami, H.; Naghavi, E.; Wahabi-Moghadam, M.; Forghanparast, K.; Akbar, Manzar H.

    2009-01-01

    The objective was to compare the consistency rates of pre- and intra-operative radiological findings in patients with chronic suppurative otitis media (CSOM). In a cross-sectional study, 80 patients with CSOM underwent pre-operative CT scanning, and the results were compared with intra-operative clinical findings during mastoidectomy from 2000-2004 in the Otology Department, Amiralmomenin Hospital of Guilan Medical University, Rasht, Iran. The sensitivity, specificity, and positive and negative predictive values of CT for tympanic and mastoid cholesteatoma, ossicular chain erosion, tegmen tympani erosion, dehiscence of the facial canal, and lateral semicircular canal (LSCC) fistula were assessed. The correlation between radiological and intra-operative findings was then calculated. The mean age of patients was 27.9±16.3 years; most were male (n=57 [71.3%]). Correlation of preoperative radiological images with intra-operative clinical findings was moderate to good for tympanic cholesteatoma, mastoid cholesteatoma and ossicular chain erosion, but weak and insignificant for tegmen erosion, facial canal dehiscence and LSCC fistulae. Preoperative CT may be helpful in decision-making for surgery in cases of cholesteatoma and ossicular erosion. Despite its limitations, radiological scanning is a useful adjunct to the management of CSOM. (author)

  19. Signal and image multiresolution analysis

    CERN Document Server

    Ouahabi, Abdelialil

    2012-01-01

    Multiresolution analysis using the wavelet transform has received considerable attention in recent years by researchers in various fields. It is a powerful tool for efficiently representing signals and images at multiple levels of detail with many inherent advantages, including compression, level-of-detail display, progressive transmission, level-of-detail editing, filtering, modeling, fractals and multifractals, etc. This book aims to provide a simple formalization and new clarity on multiresolution analysis, rendering accessible obscure techniques, and merging, unifying or completing

  20. Teaching image analysis at DIKU

    DEFF Research Database (Denmark)

    Johansen, Peter

    2010-01-01

    The early development of computer vision at Department of Computer Science at University of Copenhagen (DIKU) is briefly described. The different disciplines in computer vision are introduced, and the principles for teaching two courses, an image analysis course, and a robot lab class are outlined....

  1. SU-D-BRA-04: Computerized Framework for Marker-Less Localization of Anatomical Feature Points in Range Images Based On Differential Geometry Features for Image-Guided Radiation Therapy

    International Nuclear Information System (INIS)

    Soufi, M; Arimura, H; Toyofuku, F; Nakamura, K; Hirose, T; Umezu, Y; Shioyama, Y

    2016-01-01

    Purpose: To propose a computerized framework for localization of anatomical feature points on the patient surface in infrared-ray based range images by using differential geometry (curvature) features. Methods: The general concept was to reconstruct the patient surface by using a mathematical modeling technique for the computation of differential geometry features that characterize the local shapes of the patient surfaces. A region of interest (ROI) was firstly extracted based on a template matching technique applied on amplitude (grayscale) images. The extracted ROI was preprocessed for reducing temporal and spatial noises by using Kalman and bilateral filters, respectively. Next, a smooth patient surface was reconstructed by using a non-uniform rational basis spline (NURBS) model. Finally, differential geometry features, i.e. the shape index and curvedness features were computed for localizing the anatomical feature points. The proposed framework was trained for optimizing shape index and curvedness thresholds and tested on range images of an anthropomorphic head phantom. The range images were acquired by an infrared ray-based time-of-flight (TOF) camera. The localization accuracy was evaluated by measuring the mean of minimum Euclidean distances (MMED) between reference (ground truth) points and the feature points localized by the proposed framework. The evaluation was performed for points localized on convex regions (e.g. apex of nose) and concave regions (e.g. nasofacial sulcus). Results: The proposed framework has localized anatomical feature points on convex and concave anatomical landmarks with MMEDs of 1.91±0.50 mm and 3.70±0.92 mm, respectively. A statistically significant difference was obtained between the feature points on the convex and concave regions (P<0.001). Conclusion: Our study has shown the feasibility of differential geometry features for localization of anatomical feature points on the patient surface in range images. The proposed
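
    As an illustration of the curvature descriptors and the evaluation metric named above, the following Python sketch computes the Koenderink shape index and curvedness from principal curvatures, plus the mean minimum Euclidean distance (MMED); the sign convention and the toy inputs are assumptions for illustration, not part of the proposed framework.

        import numpy as np

        def shape_index_and_curvedness(k1, k2):
            """Shape index and curvedness from principal curvatures k1 >= k2.

            Sign conventions vary with the choice of surface normal; here
            cap-like points approach +1 and cup-like points approach -1.
            """
            k1 = np.asarray(k1, dtype=float)
            k2 = np.asarray(k2, dtype=float)
            s = (2.0 / np.pi) * np.arctan2(k1 + k2, k1 - k2)
            c = np.sqrt((k1**2 + k2**2) / 2.0)
            return s, c

        def mean_min_euclidean_distance(detected, reference):
            """MMED: mean, over reference points, of the distance to the closest detection."""
            detected = np.asarray(detected, dtype=float)
            reference = np.asarray(reference, dtype=float)
            d = np.linalg.norm(reference[:, None, :] - detected[None, :, :], axis=2)
            return d.min(axis=1).mean()

        # Toy usage with made-up curvatures (1/mm) and 3-D points (mm).
        print(shape_index_and_curvedness([0.02, 0.01], [0.01, -0.01]))
        print(mean_min_euclidean_distance([[0, 0, 0], [10, 0, 0]], [[1, 0, 0], [9, 1, 0]]))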

  2. SU-D-BRA-04: Computerized Framework for Marker-Less Localization of Anatomical Feature Points in Range Images Based On Differential Geometry Features for Image-Guided Radiation Therapy

    Energy Technology Data Exchange (ETDEWEB)

    Soufi, M; Arimura, H; Toyofuku, F [Kyushu University, Fukuoka, Fukuoka (Japan); Nakamura, K [Hamamatsu University School of Medicine, Hamamatsu, Shizuoka (Japan); Hirose, T; Umezu, Y [Kyushu University Hospital, Fukuoka, Fukuoka (Japan); Shioyama, Y [Saga Heavy Ion Medical Accelerator in Tosu, Tosu, Saga (Japan)

    2016-06-15

    Purpose: To propose a computerized framework for localization of anatomical feature points on the patient surface in infrared-ray based range images by using differential geometry (curvature) features. Methods: The general concept was to reconstruct the patient surface by using a mathematical modeling technique for the computation of differential geometry features that characterize the local shapes of the patient surfaces. A region of interest (ROI) was firstly extracted based on a template matching technique applied on amplitude (grayscale) images. The extracted ROI was preprocessed for reducing temporal and spatial noises by using Kalman and bilateral filters, respectively. Next, a smooth patient surface was reconstructed by using a non-uniform rational basis spline (NURBS) model. Finally, differential geometry features, i.e. the shape index and curvedness features were computed for localizing the anatomical feature points. The proposed framework was trained for optimizing shape index and curvedness thresholds and tested on range images of an anthropomorphic head phantom. The range images were acquired by an infrared ray-based time-of-flight (TOF) camera. The localization accuracy was evaluated by measuring the mean of minimum Euclidean distances (MMED) between reference (ground truth) points and the feature points localized by the proposed framework. The evaluation was performed for points localized on convex regions (e.g. apex of nose) and concave regions (e.g. nasofacial sulcus). Results: The proposed framework has localized anatomical feature points on convex and concave anatomical landmarks with MMEDs of 1.91±0.50 mm and 3.70±0.92 mm, respectively. A statistically significant difference was obtained between the feature points on the convex and concave regions (P<0.001). Conclusion: Our study has shown the feasibility of differential geometry features for localization of anatomical feature points on the patient surface in range images. The proposed

  3. Computerized Analysis of Verbal Fluency: Normative Data and the Effects of Repeated Testing, Simulated Malingering, and Traumatic Brain Injury.

    Directory of Open Access Journals (Sweden)

    David L Woods

    In verbal fluency (VF) tests, subjects articulate words in a specified category during a short test period (typically 60 s). Verbal fluency tests are widely used to study language development and to evaluate memory retrieval in neuropsychiatric disorders. Performance is usually measured as the total number of correct words retrieved. Here, we describe the properties of a computerized VF (C-VF) test that tallies correct words and repetitions while providing additional lexical measures of word frequency, syllable count, and typicality. In addition, the C-VF permits (1) the analysis of the rate of responding over time, and (2) the analysis of the semantic relationships between words using a new method, Explicit Semantic Analysis (ESA), as well as the established semantic clustering and switching measures developed by Troyer et al. (1997). In Experiment 1, we gathered normative data from 180 subjects ranging in age from 18 to 82 years in semantic ("animals") and phonemic (letter "F") conditions. The number of words retrieved in 90 s correlated with education and daily hours of computer-use. The rate of word production declined sharply over time during both tests. In semantic conditions, correct-word scores correlated strongly with the number of ESA and Troyer-defined semantic switches as well as with an ESA-defined semantic organization index (SOI). In phonemic conditions, ESA revealed significant semantic influences in the sequence of words retrieved. In Experiment 2, we examined the test-retest reliability of different measures across three weekly tests in 40 young subjects. Different categories were used for each semantic ("animals", "parts of the body", and "foods") and phonemic (letters "F", "A", and "S") condition. After regressing out the influences of education and computer-use, we found that correct-word z-scores in the first session did not differ from those of the subjects in Experiment 1. Word production was uniformly greater in semantic than
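
    A minimal sketch of the kind of switch analysis described above, assuming word vectors from any semantic model (ESA or otherwise); the similarity threshold and the toy vectors are illustrative assumptions, not the measures implemented in the C-VF software.

        import numpy as np

        def cosine(u, v):
            """Cosine similarity between two vectors."""
            return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

        def count_semantic_switches(words, vectors, threshold=0.3):
            """Count switches between consecutive responses in a fluency list.

            `vectors` maps each word to a semantic vector; the 0.3 threshold is an
            illustrative assumption, not the cut-off used in the paper.
            """
            sims = [cosine(vectors[a], vectors[b]) for a, b in zip(words, words[1:])]
            switches = sum(s < threshold for s in sims)
            return switches, float(np.mean(sims)) if sims else 0.0

        # Toy usage with made-up 3-D "semantic" vectors.
        vecs = {"cat": np.array([1.0, 0.1, 0.0]), "dog": np.array([0.9, 0.2, 0.1]),
                "shark": np.array([0.0, 1.0, 0.2])}
        print(count_semantic_switches(["cat", "dog", "shark"], vecs))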

  4. MDCT for computerized volumetry of pneumothoraces in pediatric patients.

    Science.gov (United States)

    Cai, Wenli; Lee, Edward Y; Vij, Abhinav; Mahmood, Soran A; Yoshida, Hiroyuki

    2011-03-01

    Our purpose in this study was to develop an automated computer-aided volumetry (CAV) scheme for quantifying pneumothorax in multidetector computed tomography (MDCT) images for pediatric patients and to investigate the imaging parameters that may affect its accuracy. Fifty-eight consecutive pediatric patients (mean age 12 ± 6 years) with pneumothorax who underwent MDCT for evaluation were collected retrospectively for this study. All cases were imaged by a 16- or 64-MDCT scanner with weight-based kilovoltage, low-dose tube current, 1.0-1.5 pitch, 0.6-5.0 mm slice thickness, and a B70f (sharp) or B31f (soft) reconstruction kernel. Sixty-three pneumothoraces ≥1 mL were visually identified in the left (n = 30) and right (n = 33) lungs. Each identified pneumothorax was contoured manually on an Amira workstation V4.1.1 (Mercury Computer Systems, Chelmsford, MA) by two radiologists in consensus. The computerized volumes of the pneumothoraces were determined by application of our CAV scheme. The accuracy of our automated CAV scheme was evaluated by comparison between computerized volumetry and manual volumetry, for the total volume of pneumothoraces in the left and right lungs. The mean difference between the computerized volumetry and the manual volumetry for all 63 pneumothoraces ≥1 mL was 8.2%. For pneumothoraces ≥10 mL, ≥50 mL, and ≥200 mL, the mean differences were 7.7% (n = 57), 7.3% (n = 33), and 6.4% (n = 13), respectively. The correlation coefficient was 0.99 between the computerized volume and the manual volume of pneumothoraces. Bland-Altman analysis showed that computerized volumetry has a mean difference of -5.1% compared to manual volumetry. For all pneumothoraces ≥10 mL, the mean differences for slice thickness ≤1.25 mm, = 1.5 mm, and = 5.0 mm were 6.1% (n = 28), 3.5% (n = 10), and 12.2% (n = 19), respectively. For the two reconstruction kernels, B70f and B31f, the mean differences were 6.3% (n = 42, B70f) and 11.7% (n = 15, B31f
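
    The agreement statistics reported here (correlation, mean percentage difference, Bland-Altman limits) can be reproduced with a few lines of numpy; the toy volumes below are made up, and the 1.96·SD limits follow the usual Bland-Altman convention rather than the exact analysis of the paper.

        import numpy as np

        def compare_volumetry(auto_ml, manual_ml):
            """Agreement between computerized and manual volumes (mL).

            Returns the Pearson correlation, the mean percentage difference
            relative to the manual volumes, and Bland-Altman style limits of
            agreement on the percentage differences.
            """
            auto_ml = np.asarray(auto_ml, dtype=float)
            manual_ml = np.asarray(manual_ml, dtype=float)
            pct_diff = 100.0 * (auto_ml - manual_ml) / manual_ml
            r = np.corrcoef(auto_ml, manual_ml)[0, 1]
            bias = pct_diff.mean()
            spread = 1.96 * pct_diff.std(ddof=1)
            return r, bias, (bias - spread, bias + spread)

        # Toy example with made-up pneumothorax volumes.
        print(compare_volumetry([12.0, 48.0, 210.0], [11.0, 52.0, 200.0]))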

  5. Is hybridic positron emission tomography/computerized tomography the only option? The future of nuclear medicine and molecular imaging.

    Science.gov (United States)

    Grammaticos, Philip; Zerva, Cherry; Asteriadis, Ioannis; Trontzos, Christos; Hatziioannou, Kostas

    2007-01-01

    sources of radiation" b) nuclear radiation and c) molecular nuclear medicine. The "European Journal of Nuclear Medicine and Molecular Imaging" shall have to erase the last three words of its title and be renamed. As Professor Abass Alavi et al. (2007) have mentioned: "Is PET/CT the only option?" In favor of PET/CT are the following: attenuation correction (AC) and better anatomical localization of lesions visualized with PET. Also, PET/CT can be used as a diagnostic CT scanner (dCT). Against using the PET/CT scanners are the following arguments: a) This equipment is not necessary because we can always ask the Radiologists for a dCT scan; many patients have already undergone a dCT scan by the time they are referred to the Nuclear Medicine Department for a PET scan. b) The absolute clinical indications for PET/CT with the use of a contrast agent are still under investigation. c) Although there is at present a list of indications suggested for the PET/CT scanner, there are studies disputing some of these indications, as for example in metastatic colon cancer, where a high diagnostic accuracy for the PET study alone has been reported. d) The option of AC performed by the PET/CT scanner has also been questioned. Artifacts may be up to 84%. e) PET/CT is expensive, time-consuming, space-occupying, and needs additional medical and technical personnel. f) Not to mention the extra radiation dose to the patients. g) Shall we advise those young medical students who wish to become nuclear medicine physicians to hold their decision until the content of future Nuclear Medicine is clarified? We may suggest that our specialty could be renamed "Clinical Nuclear Medicine" and include additional "proper certified education" on the PET/CT equipment. The PET/CT scanner should remain in the Nuclear Medicine Department, where Radiologists could act as advisors.

  6. Human factors analysis and design methods for nuclear waste retrieval systems. Volume III. User's guide for the computerized event-tree analysis technique

    International Nuclear Information System (INIS)

    Casey, S.M.; Deretsky, Z.

    1980-08-01

    This document provides detailed instructions for using the Computerized Event-Tree Analysis Technique (CETAT), a program designed to assist a human factors analyst in predicting event probabilities in complex man-machine configurations found in waste retrieval systems. The instructions contained herein describe how to (a) identify the scope of a CETAT analysis, (b) develop operator performance data, (c) enter an event-tree structure, (d) modify a data base, and (e) analyze event paths and man-machine system configurations. Designed to serve as a tool for developing, organizing, and analyzing operator-initiated event probabilities, CETAT simplifies the tasks of the experienced systems analyst by organizing large amounts of data and performing cumbersome and time consuming arithmetic calculations. The principal uses of CETAT in the waste retrieval development project will be to develop models of system reliability and evaluate alternative equipment designs and operator tasks. As with any automated technique, however, the value of the output will be a function of the knowledge and skill of the analyst using the program
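
    CETAT itself is not reproduced here, but the arithmetic it automates is simple: each end state of an event tree has a probability equal to the product of its branch probabilities. The sketch below enumerates the paths of a small tree with hypothetical event names and probabilities, assuming independent binary branch points (a simplification of what CETAT models).

        from itertools import product

        def event_tree_paths(branches):
            """Enumerate end states of a simple event tree.

            `branches` is an ordered list of (event_name, p_success); each path is
            a tuple of outcomes with its joint probability under independence.
            """
            names = [name for name, _ in branches]
            probs = [(p, 1.0 - p) for _, p in branches]
            paths = []
            for outcomes in product((0, 1), repeat=len(branches)):
                p = 1.0
                labels = []
                for (success_p, fail_p), o, name in zip(probs, outcomes, names):
                    p *= success_p if o == 0 else fail_p
                    labels.append(f"{name}:{'ok' if o == 0 else 'fail'}")
                paths.append((tuple(labels), p))
            return paths

        # Hypothetical three-event retrieval sequence.
        for path, p in event_tree_paths([("detect", 0.95), ("grasp", 0.90), ("hoist", 0.99)]):
            print(round(p, 5), path)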

  7. Astronomical Image and Data Analysis

    CERN Document Server

    Starck, J.-L

    2006-01-01

    With information and scale as central themes, this comprehensive survey explains how to handle real problems in astronomical data analysis using a modern arsenal of powerful techniques. It treats those innovative methods of image, signal, and data processing that are proving to be both effective and widely relevant. The authors are leaders in this rapidly developing field and draw upon decades of experience. They have been playing leading roles in international projects such as the Virtual Observatory and the Grid. The book addresses not only students and professional astronomers and astrophysicists, but also serious amateur astronomers and specialists in earth observation, medical imaging, and data mining. The coverage includes chapters or appendices on: detection and filtering; image compression; multichannel, multiscale, and catalog data analytical methods; wavelet transforms, Picard iteration, and software tools. This second edition of Starck and Murtagh's highly appreciated reference again deals with to...

  8. Computerized tomography of orbital lesions

    International Nuclear Information System (INIS)

    Kuroiwa, Mayumi

    1981-01-01

    Two different types of computerized tomography scanners (CT scanners), i.e. a whole-body CT scanner (GE-CT/T8800) and a cerebral CT scanner (EMI-1010), were compared in the assessment and diagnosis of various orbital lesions. The whole-body CT scanner was found to be advantageous over the cerebral CT scanner for the following reasons: (1) CT images were more informative due to thinner slices associated with smaller-sized and larger-numbered matrices; (2) fewer artifacts derived from motion of the head or eyeball were produced because of the shorter scanning time; (3) with a devised gantry, coronal sections could be obtained whenever required. (author)

  9. Image analysis for material characterisation

    Science.gov (United States)

    Livens, Stefan

    In this thesis, a number of image analysis methods are presented as solutions to two applications concerning the characterisation of materials. Firstly, we deal with the characterisation of corrosion images, which is handled using a multiscale texture analysis method based on wavelets. We propose a feature transformation that deals with the problem of rotation invariance. Classification is performed with a Learning Vector Quantisation neural network and with a combination of outputs. In an experiment, 86.2% of the images showing either pit formation or cracking are correctly classified. Secondly, we develop an automatic system for the characterisation of silver halide microcrystals. These are flat crystals with a triangular or hexagonal base and a thickness in the 100 to 200 nm range. A light microscope is used to image them. A novel segmentation method is proposed, which makes it possible to separate agglomerated crystals. For the measurement of shape, the ratio between the largest and the smallest radius yields the best results. The thickness measurement is based on the interference colours that appear in light reflected from the crystals. The mean colour of different thickness populations is determined, from which a calibration curve is derived. With this, the thickness of new populations can be determined accurately.
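
    As a sketch of the wavelet texture step described above (the rotation-invariance transform and the LVQ classifier are not reproduced), the following Python fragment computes relative subband energies with the PyWavelets package; the wavelet family and decomposition level are illustrative assumptions.

        import numpy as np
        import pywt  # PyWavelets (third-party package)

        def wavelet_energy_features(image, wavelet="db2", level=3):
            """Multiscale texture features: relative energy per wavelet subband."""
            coeffs = pywt.wavedec2(np.asarray(image, dtype=float), wavelet, level=level)
            energies = [np.sum(coeffs[0] ** 2)]                  # approximation band
            for cH, cV, cD in coeffs[1:]:                        # detail bands per level
                energies.extend([np.sum(cH ** 2), np.sum(cV ** 2), np.sum(cD ** 2)])
            energies = np.array(energies)
            return energies / energies.sum()

        # Toy usage on a random 64x64 patch standing in for a corrosion image.
        print(wavelet_energy_features(np.random.rand(64, 64)).round(3))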

  10. Planning applications in image analysis

    Science.gov (United States)

    Boddy, Mark; White, Jim; Goldman, Robert; Short, Nick, Jr.

    1994-01-01

    We describe two interim results from an ongoing effort to automate the acquisition, analysis, archiving, and distribution of satellite earth science data. Both results are applications of Artificial Intelligence planning research to the automatic generation of processing steps for image analysis tasks. First, we have constructed a linear conditional planner (CPed), used to generate conditional processing plans. Second, we have extended an existing hierarchical planning system to make use of durations, resources, and deadlines, thus supporting the automatic generation of processing steps in time and resource-constrained environments.

  11. Quantitative image analysis of synovial tissue

    NARCIS (Netherlands)

    van der Hall, Pascal O.; Kraan, Maarten C.; Tak, Paul Peter

    2007-01-01

    Quantitative image analysis is a form of imaging that includes microscopic histological quantification, video microscopy, image analysis, and image processing. Hallmarks are the generation of reliable, reproducible, and efficient measurements via strict calibration and step-by-step control of the

  12. Geometrical efficiency in computerized tomography: generalized model

    International Nuclear Information System (INIS)

    Costa, P.R.; Robilotta, C.C.

    1992-01-01

    A simplified model for producing sensitivity and exposure profiles in a computerized tomographic system was recently developed, allowing the behaviour of the profiles at the rotation center of the system to be predicted. The generalization of this model to an arbitrary point of the image plane was described, and the geometrical efficiency could then be evaluated. (C.G.C.)

  13. Mesh Processing in Medical Image Analysis

    DEFF Research Database (Denmark)

    The following topics are dealt with: mesh processing; medical image analysis; interactive freeform modeling; statistical shape analysis; clinical CT images; statistical surface recovery; automated segmentation; cerebral aneurysms; and real-time particle-based representation....

  14. Effective Heart Disease Detection Based on Quantitative Computerized Traditional Chinese Medicine Using Representation Based Classifiers

    Directory of Open Access Journals (Sweden)

    Ting Shu

    2017-01-01

    At present, heart disease is the number one cause of death worldwide. Traditionally, heart disease is commonly detected using blood tests, electrocardiogram, cardiac computerized tomography scan, cardiac magnetic resonance imaging, and so on. However, these traditional diagnostic methods are time-consuming and/or invasive. In this paper, we propose an effective noninvasive computerized method based on facial images to quantitatively detect heart disease. Specifically, facial key block color features are extracted from facial images and analyzed using the Probabilistic Collaborative Representation Based Classifier. The idea of facial key block color analysis is founded in Traditional Chinese Medicine. A new dataset consisting of 581 heart disease and 581 healthy samples was used to evaluate the proposed method. In order to optimize the Probabilistic Collaborative Representation Based Classifier, an analysis of its parameters was performed. According to the experimental results, the proposed method obtains the highest accuracy compared with other classifiers and is proven to be effective at heart disease detection.

  15. Automated image analysis of uterine cervical images

    Science.gov (United States)

    Li, Wenjing; Gu, Jia; Ferris, Daron; Poirson, Allen

    2007-03-01

    Cervical Cancer is the second most common cancer among women worldwide and the leading cause of cancer mortality of women in developing countries. If detected early and treated adequately, cervical cancer can be virtually prevented. Cervical precursor lesions and invasive cancer exhibit certain morphologic features that can be identified during a visual inspection exam. Digital imaging technologies allow us to assist the physician with a Computer-Aided Diagnosis (CAD) system. In colposcopy, epithelium that turns white after application of acetic acid is called acetowhite epithelium. Acetowhite epithelium is one of the major diagnostic features observed in detecting cancer and pre-cancerous regions. Automatic extraction of acetowhite regions from cervical images has been a challenging task due to specular reflection, various illumination conditions, and most importantly, large intra-patient variation. This paper presents a multi-step acetowhite region detection system to analyze the acetowhite lesions in cervical images automatically. First, the system calibrates the color of the cervical images to be independent of screening devices. Second, the anatomy of the uterine cervix is analyzed in terms of cervix region, external os region, columnar region, and squamous region. Third, the squamous region is further analyzed and subregions based on three levels of acetowhite are identified. The extracted acetowhite regions are accompanied by color scores to indicate the different levels of acetowhite. The system has been evaluated by 40 human subjects' data and demonstrates high correlation with experts' annotations.

  16. Computerized detection of mass lesions in digital mammograms

    International Nuclear Information System (INIS)

    Yin, F.F.; Giger, M.L.; Doi, K.; Metz, C.E.; Vyborny, C.J.; Schmidt, R.A.

    1989-01-01

    Early detection of breast cancer from the periodic screening of asymptomatic women could reduce breast cancer mortality by at least 40%. The authors are developing a computerized scheme for the detection of mass lesions in digital mammograms as an aid to radiologists in such high-volume screening programs. Based on left-right architectural symmetry and gray-level histogram analysis, bilateral subtraction of left and right breast images is performed. False-positive detections included in bilateral-difference images are reduced with various image feature-extraction techniques. The database involves clinical film mammograms digitized by a TV camera and analyzed on a Micro-VAX workstation. Among five different bilateral subtraction techniques investigated, a nonlinear approach provided superior lesion enhancement. Feature-extraction techniques substantially reduced the remaining false positives. Preliminary results, for 32 pairs of clinical mammograms, yielded a true-positive rate of approximately 95% with a false-positive rate of about 2 per image
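
    A minimal sketch of the bilateral-subtraction idea, assuming only a horizontal flip for left-right alignment and a simple statistical threshold; the registration, nonlinear subtraction and feature-based false-positive reduction used in the actual scheme are omitted.

        import numpy as np

        def bilateral_difference(left, right):
            """Bilateral-subtraction prescreening for mass candidates.

            The right-breast image is mirrored so that corresponding anatomy
            roughly aligns with the left, then subtracted; bright blobs in the
            difference image are mass candidates.
            """
            left = np.asarray(left, dtype=float)
            mirrored_right = np.asarray(right, dtype=float)[:, ::-1]   # horizontal flip
            diff = left - mirrored_right
            threshold = diff.mean() + 2.0 * diff.std()                 # assumed cut-off
            return diff, diff > threshold

        # Toy usage with random arrays standing in for digitized mammograms.
        left_img, right_img = np.random.rand(128, 128), np.random.rand(128, 128)
        diff, candidates = bilateral_difference(left_img, right_img)
        print(candidates.sum(), "candidate pixels")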

  17. Computerized data treatment technology

    International Nuclear Information System (INIS)

    Ferguson, R.B.; Maddox, J.H.; Wren, H.F.

    1977-01-01

    The Savannah River Laboratory (SRL) has accepted responsibility for a hydrogeochemical and stream-sediment reconnaissance in 25 eastern states as part of the National Uranium Resource Evaluation (NURE). SRL has developed a computerized program for recording, processing, updating, retrieving, and analyzing hydrogeochemical data from this reconnaissance. This program will handle an expected 150 million bytes of hydrogeochemical data from 150,000 to 200,000 sample sites over the next four years. The SRL--NURE hydrogeochemical data management system is written in FORTRAN IV for an IBM System 360/195 computer and is designed to easily accommodate changes in types of collected data and input format. As the data become available, they are accepted and combined with relevant data already in the system. SRL also developed a sample inventory and control system and a graphics and analysis system. The sample inventory and control system accounts for the movements of all samples and forms from initial receipt through final storage. Approximately six million sample movements are expected. The graphics and analysis system provides easily usable programs for reporting and interpreting data. Because of the large volume of data to be interpreted, the graphics and analysis system plays a central role in the hydrogeochemical program. Programs developed to provide two- and three-dimensional plots of sampled geographic areas show concentrations and locations of individual variables which are displayed and reproduced photographically. Pattern recognition techniques are also available, and they allow multivariate data to be categorized into ''clusters,'' which may indicate sites favorable for uranium exploration

  18. Computerized tomography in the diagnosis of hyperparathyroidism

    International Nuclear Information System (INIS)

    Sobota, J.; Girl, J.; Sotornik, I.; Kocandrle, V.

    1990-01-01

    Long-term experience in the application of computerized tomography to the diagnosis of hyperparathyroidism is summarized. Based on a large number of examinations (164) of parathyroid glands associated with the possibility of verification and comparison with the results of ultrasonography and other imaging methods, the potential of computerized tomography in the diagnosis of hyperparathyroidism and its advantages and limitations are summarized. It is concluded that owing to its high diagnostic precision, this technique can be regarded reliable in detecting enlarged parathyroid glands. (author). 11 figs., 1 tab., 19 refs

  19. AUTOMATIC ESTIMATION OF SIZE PARAMETERS USING VERIFIED COMPUTERIZED STEREOANALYSIS

    Directory of Open Access Journals (Sweden)

    Peter R Mouton

    2011-05-01

    State-of-the-art computerized stereology systems combine high-resolution video microscopy and hardware-software integration with stereological methods to assist users in quantifying multidimensional parameters of importance to biomedical research, including volume, surface area, length, number, their variation and spatial distribution. The requirement for constant interactions between a trained, non-expert user and the targeted features of interest currently limits the throughput efficiency of these systems. To address this issue we developed a novel approach for automatic stereological analysis of 2-D images, Verified Computerized Stereoanalysis (VCS). The VCS approach minimizes the need for user interactions with high-contrast [high signal-to-noise ratio (S:N)] biological objects of interest. Performance testing of the VCS approach confirmed dramatic increases in the efficiency of total object volume (size) estimation, without a loss of accuracy or precision compared to conventional computerized stereology. The broad application of high-efficiency VCS to high-contrast biological objects on tissue sections could reduce labor costs, enhance hypothesis testing, and accelerate the progress of biomedical research focused on improvements in health and the management of disease.
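
    The size estimation that such systems automate rests on the Cavalieri principle; a minimal sketch of that estimator, with assumed units and made-up counts, is given below.

        def cavalieri_volume(point_counts, area_per_point_mm2, section_spacing_mm):
            """Cavalieri estimate of object volume from systematic sections.

            V is approximately t * (a/p) * sum(P), where sum(P) is the total number
            of grid points hitting the object across sections, a/p the area
            associated with each grid point, and t the distance between sections.
            Units here are assumed to be mm and mm^2.
            """
            return section_spacing_mm * area_per_point_mm2 * sum(point_counts)

        # Toy example: five sections 0.05 mm apart, 0.01 mm^2 per grid point.
        print(cavalieri_volume([12, 20, 25, 18, 9], 0.01, 0.05), "mm^3")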

  20. Analysis of the structure of bones through 3D computerized microtomography; Analise de estrutura ossea atraves de microtomografia computadorizada 3D

    Energy Technology Data Exchange (ETDEWEB)

    Lima, I.; Lopes, R.T. [Coordenacao dos Programas de Pos-Graduacao de Engenharia (COPPE), Rio de Janeiro, RJ (Brazil). Lab. de Instrumentacao Nuclear; Oliveira, L.F. [Universidade do Estado do Rio de Janeiro (UERJ), RJ (Brazil). Inst. de Fisica. Dept. de Fisica Aplicada e Termodinamica; Alves, J.M. [Universidade de Sao Paulo (USP), Sao Carlos, SP (Brazil). Escola de Engenharia

    2009-03-15

    This work presents the analysis of the internal structure of bone samples using the 3D micro-tomography technique (3D-µTC). Understanding bone structure is particularly important for osteoporosis diagnosis, because the disease implies a deterioration of the trabecular bone architecture, which increases fragility and the likelihood of bone fractures. Two bone samples (a human calcaneus and a Wistar rat femur) were used, and the acquisitions were made with a real-time radiographic system with an X-ray microfocus tube. The quantification parameters are based on stereological principles and are five: the bone volume fraction, the trabecular number, the ratio between bone surface and bone volume, the trabecular thickness and the trabecular separation. The quantifications were done with a program developed especially for this purpose at the Nuclear Instrumentation Laboratory - COPPE/UFRJ; it takes the 3D reconstructed images as input and generates a table with the quantifications. The results of the human calcaneus quantifications are presented in Tables 1 and 2, and the corresponding 3D reconstructions are illustrated in Figure 5; Figure 6 shows the 2D reconstructed image and Figure 7 the 3D visualization of the Wistar femur sample. The obtained results show that 3D-µTC is a powerful technique that can be used to analyze bone microstructures. (author)
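
    Two of the five stereological indices mentioned can be sketched directly from a binary micro-CT volume; the voxel-face surface estimate below is a simplification for illustration, not the laboratory's own quantification program.

        import numpy as np

        def bone_morphometry(binary_volume, voxel_mm):
            """Basic indices from a binary 3D micro-CT volume (True = bone voxel).

            Returns the bone volume fraction BV/TV and a voxel-face estimate of
            the bone surface-to-volume ratio BS/BV (1/mm).
            """
            b = np.asarray(binary_volume, dtype=bool)
            bv_tv = b.mean()                                  # bone volume fraction
            faces = 0
            for axis in range(3):                             # count bone/background face transitions
                faces += np.count_nonzero(np.diff(b.astype(np.int8), axis=axis))
            bs = faces * voxel_mm**2                          # exposed face area
            bv = b.sum() * voxel_mm**3
            return bv_tv, (bs / bv if bv > 0 else float("nan"))

        # Toy usage on a random "trabecular" volume with 36 micron voxels.
        vol = np.random.rand(64, 64, 64) > 0.8
        print(bone_morphometry(vol, 0.036))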

  1. Image Analysis for X-ray Imaging of Food

    DEFF Research Database (Denmark)

    Einarsdottir, Hildur

    X-ray imaging systems are increasingly used for quality and safety evaluation both within food science and production. They offer non-invasive and nondestructive penetration capabilities to image the inside of food. This thesis presents applications of a novel grating-based X-ray imaging technique ... for quality and safety evaluation of food products. In this effort the fields of statistics, image analysis and statistical learning are combined, to provide analytical tools for determining the aforementioned food traits. The work demonstrated includes a quantitative analysis of heat induced changes ... and defect detection in food. Compared to the complex three dimensional analysis of microstructure, here two dimensional images are considered, making the method applicable for an industrial setting. The advantages obtained by grating-based imaging are compared to conventional X-ray imaging, for both foreign...

  2. Computerized radioautographic grain counting

    International Nuclear Information System (INIS)

    McKanna, J.A.; Casagrande, V.A.

    1985-01-01

    In recent years, radiolabeling techniques have become fundamental assays in physiology and biochemistry experiments. They also have assumed increasingly important roles in morphologic studies. Characteristically, radioautographic analysis of structure has been qualitative rather than quantitative; however, microcomputers have opened the door to several methods for quantifying grain counts and density. The overall goal of this chapter is to describe grain counting using the Bioquant, an image analysis package based originally on the Apple II+, and now available for several popular microcomputers. The authors discuss their image analysis procedures by applying them to a study of development in the central nervous system
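
    A minimal sketch of automated grain counting, assuming grains appear as dark blobs and using a simple threshold plus connected-component labelling; the threshold and minimum size are illustrative choices, not Bioquant settings.

        import numpy as np
        from scipy import ndimage

        def count_grains(image, threshold=None, min_size=3):
            """Count radioautographic grains in a grayscale field.

            Pixels below a threshold are labelled as connected components, and
            components smaller than `min_size` pixels are discarded as noise.
            """
            img = np.asarray(image, dtype=float)
            if threshold is None:
                threshold = img.mean() - 2.0 * img.std()
            mask = img < threshold
            labels, n = ndimage.label(mask)
            sizes = ndimage.sum(mask, labels, index=np.arange(1, n + 1))
            return int(np.count_nonzero(sizes >= min_size))

        # Toy usage on a synthetic field with two dark spots.
        field = np.ones((100, 100))
        field[10:13, 10:13] = 0.0
        field[50:54, 60:63] = 0.0
        print(count_grains(field), "grains")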

  3. Russian system of computerized analysis for licensing at atomic industry (SCALA) and its validation on ICSBEP handbook data and some burnup calculations

    International Nuclear Information System (INIS)

    Ivanova, T.; Nikolaev, M.; Polyakov, A.; Saraeva, T.; Tsiboulia, A.

    2000-01-01

    The System of Computerized Analysis for Licensing at Atomic industry (SCALA) is a Russian analogue of the well-known SCALE system. For criticality evaluations the ABBN-93 system is used with TWODANT and with MMKKENO, a code joining the American KENO and the Russian MMK Monte-Carlo codes. Using the same cross sections and input models, all these codes give results that coincide within the statistical uncertainties (for the Monte-Carlo codes). Validation of criticality calculations using SCALA was performed using data presented in the International Handbook of Evaluated Criticality Safety Benchmark Experiments. Another task of the work was to test the burnup capability of the SCALA system in complex geometry in comparison with other codes. Benchmark models of VVER-type reactor assemblies with UO2 and MOX fuel, including cases with burnable gadolinium absorbers, were calculated. The KENO-VI and MMK codes were used for power distribution calculations, and the ORIGEN code was used for the isotopic kinetics calculations. (authors)

  4. Thirteen-Year Evaluation of Highly Cross-Linked Polyethylene Articulating With Either 28-mm or 36-mm Femoral Heads Using Radiostereometric Analysis and Computerized Tomography

    DEFF Research Database (Denmark)

    Nebergall, Audrey K; Greene, Meridith E; Rubash, Harry E

    2016-01-01

    BACKGROUND: The objective of this 13-year prospective evaluation of highly cross-linked ultra high molecular weight polyethylene (HXLPE) was to (1) assess the long-term wear of HXLPE articulating with 2 femoral head sizes using radiostereometric analysis (RSA) and to (2) determine if osteolysis...... is a concern with this material through the use of plain radiographs and computerized tomography (CT). METHODS: All patients received a Longevity HXLPE liner with tantalum beads and either a 28-mm or 36-mm femoral head. Twelve patients (6 in each head size group) agreed to return for 13-year RSA, plain...... scan revealed areas of remodeling of this graft. One patient's 13-year plain radiographs showed evidence of cup loosening and linear radiolucencies in zones 2 and 3. CONCLUSION: There was no evidence of significant wear over time using RSA. The CT scans did not show evidence of osteolysis due to wear...

  5. Ultrasonic image analysis and image-guided interventions.

    Science.gov (United States)

    Noble, J Alison; Navab, Nassir; Becher, H

    2011-08-06

    The fields of medical image analysis and computer-aided interventions deal with reducing the large volume of digital images (X-ray, computed tomography, magnetic resonance imaging (MRI), positron emission tomography and ultrasound (US)) to more meaningful clinical information using software algorithms. US is a core imaging modality employed in these areas, both in its own right and used in conjunction with the other imaging modalities. It is receiving increased interest owing to the recent introduction of three-dimensional US, significant improvements in US image quality, and better understanding of how to design algorithms which exploit the unique strengths and properties of this real-time imaging modality. This article reviews the current state of the art in US image analysis and its application in image-guided interventions. The article concludes by giving a perspective from clinical cardiology, which is one of the most advanced areas of clinical application of US image analysis, and by describing some probable future trends in this important area of ultrasonic imaging research.

  6. IMAGE ANALYSIS OF BREAD CRUMB STRUCTURE IN RELATION TO GLUTEN STRENGTH OF WHEAT

    Directory of Open Access Journals (Sweden)

    D. Magdić

    2006-06-01

    The objective of this study was to determine the properties of the central part of the bread slice in relation to quality parameters, with a focus on gluten strength. Since sensory evaluation of bread is time-consuming, expensive and subjective in nature, computerized image analysis was applied as an objective method of bread crumb quality evaluation. The Gluten Index method was applied as a fast and reliable tool for defining the gluten strength of wheat. Significant differences related to gluten strength were observed: cultivars with a very high Gluten Index (GI > 90), namely Ana, Demetra, Klara, Srpanjka and Divana, showed a trend toward unequal and bigger crumb grains, while cultivars Golubica, Barbara, Žitarka, Kata and Sana, with optimal gluten strength (GI = 60-90), showed finer and more uniform crumb grain.

  7. Cerebral computerized tomography

    International Nuclear Information System (INIS)

    Lofteroed, B.; Sortland, O.

    1985-01-01

    Indications for cerebral computerized tomography (CT) and the diagnostic results from this examination are evaluated in 127 children. Pathological changes were found in 31 children, mostly based on such indications as increasing head size, suspicion of brain tumor, cerebral paresis, delayed psychomotor development and epileptic seizures. A list of indications for CT in children is given

  8. Digital image analysis applied to industrial nondestructive evaluation and automated parts assembly

    International Nuclear Information System (INIS)

    Janney, D.H.; Kruger, R.P.

    1979-01-01

    Many ideas of image enhancement and analysis are relevant to the needs of the nondestructive testing engineer. These ideas not only aid the engineer in the performance of his current responsibilities, they also open to him new areas of industrial development and automation which are logical extensions of classical testing problems. The paper begins with a tutorial on the fundamentals of computerized image enhancement as applied to nondestructive testing, then progresses through pattern recognition and automated inspection to automated, or robotic, assembly procedures. It is believed that such procedures are cost-effective in many instances, and are but the logical extension of those techniques now commonly used, but often limited to analysis of data from quality-assurance images. Many references are given in order to help the reader who wishes to pursue a given idea further

  9. Vaccine Images on Twitter: Analysis of What Images are Shared.

    Science.gov (United States)

    Chen, Tao; Dredze, Mark

    2018-04-03

    Visual imagery plays a key role in health communication; however, there is little understanding of what aspects of vaccine-related images make them effective communication aids. Twitter, a popular venue for discussions related to vaccination, provides numerous images that are shared with tweets. The objectives of this study were to understand how images are used in vaccine-related tweets and provide guidance with respect to the characteristics of vaccine-related images that correlate with the higher likelihood of being retweeted. We collected more than one million vaccine image messages from Twitter and characterized various properties of these images using automated image analytics. We fit a logistic regression model to predict whether or not a vaccine image tweet was retweeted, thus identifying characteristics that correlate with a higher likelihood of being shared. For comparison, we built similar models for the sharing of vaccine news on Facebook and for general image tweets. Most vaccine-related images are duplicates (125,916/237,478; 53.02%) or taken from other sources, not necessarily created by the author of the tweet. Almost half of the images contain embedded text, and many include images of people and syringes. The visual content is highly correlated with a tweet's textual topics. Vaccine image tweets are twice as likely to be shared as nonimage tweets. The sentiment of an image and the objects shown in the image were the predictive factors in determining whether an image was retweeted. We are the first to study vaccine images on Twitter. Our findings suggest future directions for the study and use of vaccine imagery and may inform communication strategies around vaccination. Furthermore, our study demonstrates an effective study methodology for image analysis.
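
    The modelling step described above can be sketched with scikit-learn; the features and synthetic labels below are hypothetical stand-ins for the image properties the study extracted (sentiment, embedded text, people, syringes, and so on), not the authors' data or exact model settings.

        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.model_selection import train_test_split
        from sklearn.metrics import roc_auc_score

        # Synthetic per-image features and retweet labels, for illustration only.
        rng = np.random.default_rng(0)
        X = rng.random((500, 4))
        y = (X[:, 0] + 0.5 * X[:, 2] + 0.2 * rng.standard_normal(500) > 0.8).astype(int)

        X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
        model = LogisticRegression().fit(X_tr, y_tr)
        print("AUC:", round(roc_auc_score(y_te, model.predict_proba(X_te)[:, 1]), 3))
        print("coefficients:", model.coef_.round(2))   # which features drive retweeting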

  10. Vaccine Images on Twitter: Analysis of What Images are Shared

    Science.gov (United States)

    Dredze, Mark

    2018-01-01

    Background Visual imagery plays a key role in health communication; however, there is little understanding of what aspects of vaccine-related images make them effective communication aids. Twitter, a popular venue for discussions related to vaccination, provides numerous images that are shared with tweets. Objective The objectives of this study were to understand how images are used in vaccine-related tweets and provide guidance with respect to the characteristics of vaccine-related images that correlate with the higher likelihood of being retweeted. Methods We collected more than one million vaccine image messages from Twitter and characterized various properties of these images using automated image analytics. We fit a logistic regression model to predict whether or not a vaccine image tweet was retweeted, thus identifying characteristics that correlate with a higher likelihood of being shared. For comparison, we built similar models for the sharing of vaccine news on Facebook and for general image tweets. Results Most vaccine-related images are duplicates (125,916/237,478; 53.02%) or taken from other sources, not necessarily created by the author of the tweet. Almost half of the images contain embedded text, and many include images of people and syringes. The visual content is highly correlated with a tweet’s textual topics. Vaccine image tweets are twice as likely to be shared as nonimage tweets. The sentiment of an image and the objects shown in the image were the predictive factors in determining whether an image was retweeted. Conclusions We are the first to study vaccine images on Twitter. Our findings suggest future directions for the study and use of vaccine imagery and may inform communication strategies around vaccination. Furthermore, our study demonstrates an effective study methodology for image analysis. PMID:29615386

  11. Clinical neuroanatomy and diagnostic imaging of the skull. Computerized tomography and nmr imaging. 2. rev. and enlarged ed. Klinische Neuroanatomie und kranielle Bilddiagnostik. Computertomographie und Magnetresonanztomographie

    Energy Technology Data Exchange (ETDEWEB)

    Kretschmann, H.J. (Medizinische Hochschule Hannover (Germany). Abt. Neuroanatomie); Weinrich, W. (Krankenhaus Nordstadt, Hannover (Germany). Neurologische Klinik)

    1991-01-01

    In the last few years, the techniques of CT, MRI, PET and ultrasonography have been improved in their diagnostic efficiency, but in spite of the much enhanced resolution achievable with the new techniques, a large part of the neurofunctional systems that are of great significance to clinical diagnostic evaluation remains in the dark. The local structure of the neurofunctional systems is derived from the data describing the position of the conductive structures such as cerebral ventricles or typical cerebral sulci or gyri. A good knowledge of the three-dimensional topography of the neuroanatomy in the skull is required for this purpose, and this is what the book at hand is intended to convey. The arteries and their vascular environment are shown in the frontal, sagittal and axial planes and are compared with drawings produced from angiographies. The information given covers the cranium viscerale, the craniocervical neighbouring areas, and the motor innervation areas of the skull. The term 'diagnostic imaging of the skull' comprises the whole head and the transitional zones. The illustration of the neurofunctional system anatomy in the tomogram presented in this book is a source of information that may serve as a guide for pathfinding in CT, MRI, PET and ultrasonography. In addition, the knowledge found in the book will help to assign clinical data to pathologic findings revealed by CT or MRI. (orig.) With 596 mostly coloured figs.

  12. Introduction to the Multifractal Analysis of Images

    OpenAIRE

    Lévy Véhel , Jacques

    1998-01-01

    After a brief review of some classical approaches in image segmentation, the basics of multifractal theory and its application to image analysis are presented. Practical methods for multifractal spectrum estimation are discussed and some experimental results are given.

  13. Tolerance analysis through computational imaging simulations

    Science.gov (United States)

    Birch, Gabriel C.; LaCasse, Charles F.; Stubbs, Jaclynn J.; Dagel, Amber L.; Bradley, Jon

    2017-11-01

    The modeling and simulation of non-traditional imaging systems require holistic consideration of the end-to-end system. We demonstrate this approach through a tolerance analysis of a random scattering lensless imaging system.

  14. Algorithms for the Analysis of 3D Magnetic Resonance Angiography Images

    International Nuclear Information System (INIS)

    Tizon, Xavier

    2004-01-01

    Atherosclerosis is a disease of the arterial wall, progressively impairing blood flow as it spreads throughout the body. The heart attacks and strokes that result of this condition cause more deaths than cancer in industrial countries. Angiography refers to the group of imaging techniques used through the diagnosis, treatment planning and follow-up of atherosclerosis. In recent years, Magnetic Resonance Angiography (MRA) has shown promising abilities to supplant conventional, invasive, X-ray-based angiography. In order to fully benefit from this modality, there is a need for more objective and reproducible methods. This thesis shows, in two applications, how computerized image analysis can help define and implement these methods. First, by using segmentation to improve visualization of blood-pool contrast enhanced (CE)-MRA, with an additional application in coronary Computerized Tomographic Angiography. We show that, using a limited amount of user interaction and an algorithmic framework borrowed from graph theory and fuzzy logic theory, we can simplify the display of complex 3D structures like vessels. Second, by proposing a methodology to analyze the geometry of arteries in whole-body CE-MRA. The vessel centreline is extracted, and geometrical properties of this 3D curve are measured, to improve interpretation of the angiograms. It represents a more global approach than the conventional evaluation of atherosclerosis, as a first step towards screening for vascular diseases. We have developed the methods presented in this thesis with clinical practice in mind. However, they have the potential to be useful to other applications of computerized image analysis
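
    One simple centerline descriptor of the kind such a geometric analysis can use is the distance-factor tortuosity; a sketch under that assumption (not necessarily the exact measure used in the thesis) is shown below.

        import numpy as np

        def tortuosity(centerline_xyz):
            """Distance-factor tortuosity of a vessel centerline.

            centerline_xyz: (N, 3) array of ordered points along the vessel.
            Returns path length divided by the straight-line distance between
            the endpoints; 1.0 means a perfectly straight vessel.
            """
            pts = np.asarray(centerline_xyz, dtype=float)
            segments = np.diff(pts, axis=0)
            path_length = np.linalg.norm(segments, axis=1).sum()
            chord = np.linalg.norm(pts[-1] - pts[0])
            return path_length / chord if chord > 0 else float("inf")

        # Toy helix-like centerline standing in for an extracted vessel.
        t = np.linspace(0, 4 * np.pi, 200)
        curve = np.c_[np.cos(t), np.sin(t), 0.3 * t]
        print(round(tortuosity(curve), 2))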

  15. Computerized system for measuring cerebral metabolism

    International Nuclear Information System (INIS)

    McGlone, J.S.; Hibbard, L.S.; Hawkins, R.A.; Kasturi, R.

    1987-01-01

    A computerized stereotactic measurement system for evaluating rat brain metabolism was developed to utilize the large amount of data generated by quantitative autoradiography. Conventional methods of measurement analyze only a small percentage of these data because they are limited by instrument design and the subjectiveness of the investigator. However, a computerized system allows digital images to be analyzed by placing data at their appropriate three-dimensional stereotactic coordinates. The system automatically registers experimental data to a standard three-dimensional image using alignment, scaling, and matching operations. Metabolic activity in different neuronal structures is then measured by generating digital masks and superimposing them onto experimental data. Several experimental data sets were evaluated, and it was noticed that the structures measured by the computerized system had, in general, lower metabolic activity than manual measurements had indicated. This was expected because the computerized system measured each structure over its volume while the manual readings were taken from the most metabolically active area of a particular structure

  16. Similarity analysis between quantum images

    Science.gov (United States)

    Zhou, Ri-Gui; Liu, XingAo; Zhu, Changming; Wei, Lai; Zhang, Xiafen; Ian, Hou

    2018-06-01

    Similarity analysis between quantum images is essential in quantum image processing, as it provides the foundation for other fields such as quantum image matching and quantum pattern recognition. In this paper, a quantum scheme based on a novel quantum image representation and the quantum amplitude amplification algorithm is proposed. At the end of the paper, three examples and simulation experiments show that the measurement result must be 0 when two images are the same, and has a high probability of being 1 when two images are different.

  17. Computerized detection of noncalcified plaques in coronary CT angiography: Evaluation of topological soft gradient prescreening method and luminal analysis

    Energy Technology Data Exchange (ETDEWEB)

    Wei, Jun, E-mail: jvwei@umich.edu; Zhou, Chuan; Chan, Heang-Ping; Chughtai, Aamer; Agarwal, Prachi; Kuriakose, Jean; Hadjiiski, Lubomir; Patel, Smita; Kazerooni, Ella [Department of Radiology, University of Michigan, Ann Arbor, Michigan 48109 (United States)

    2014-08-15

    Purpose: The buildup of noncalcified plaques (NCPs) that are vulnerable to rupture in coronary arteries is a risk for myocardial infarction. Interpretation of coronary CT angiography (cCTA) to search for NCP is a challenging task for radiologists due to the low CT number of NCP, the large number of coronary arteries, and multiple phase CT acquisition. The authors conducted a preliminary study to develop machine learning method for automated detection of NCPs in cCTA. Methods: With IRB approval, a data set of 83 ECG-gated contrast enhanced cCTA scans with 120 NCPs was collected retrospectively from patient files. A multiscale coronary artery response and rolling balloon region growing (MSCAR-RBG) method was applied to each cCTA volume to extract the coronary arterial trees. Each extracted vessel was reformatted to a straightened volume composed of cCTA slices perpendicular to the vessel centerline. A topological soft-gradient (TSG) detection method was developed to prescreen for NCP candidates by analyzing the 2D topological features of the radial gradient field surface along the vessel wall. The NCP candidates were then characterized by a luminal analysis that used 3D geometric features to quantify the shape information and gray-level features to evaluate the density of the NCP candidates. With machine learning techniques, useful features were identified and combined into an NCP score to differentiate true NCPs from false positives (FPs). To evaluate the effectiveness of the image analysis methods, the authors performed tenfold cross-validation with the available data set. Receiver operating characteristic (ROC) analysis was used to assess the classification performance of individual features and the NCP score. The overall detection performance was estimated by free response ROC (FROC) analysis. Results: With our TSG prescreening method, a prescreening sensitivity of 92.5% (111/120) was achieved with a total of 1181 FPs (14.2 FPs/scan). On average, six features
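
    The feature-combination and validation steps can be sketched with scikit-learn; the synthetic features and labels below merely stand in for the geometric and gray-level features of the NCP candidates, and logistic regression is an assumed stand-in for the paper's machine learning classifier.

        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.model_selection import cross_val_score

        # Synthetic candidate features (1 = true NCP, 0 = false positive), for illustration only.
        rng = np.random.default_rng(1)
        X = rng.random((1300, 6))
        y = (X[:, 0] - X[:, 3] + 0.3 * rng.standard_normal(1300) > 0.2).astype(int)

        clf = LogisticRegression(max_iter=1000)
        auc = cross_val_score(clf, X, y, cv=10, scoring="roc_auc")   # tenfold cross-validation
        print("mean AUC over 10 folds:", round(auc.mean(), 3))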

  18. Computerized detection of noncalcified plaques in coronary CT angiography: Evaluation of topological soft gradient prescreening method and luminal analysis

    International Nuclear Information System (INIS)

    Wei, Jun; Zhou, Chuan; Chan, Heang-Ping; Chughtai, Aamer; Agarwal, Prachi; Kuriakose, Jean; Hadjiiski, Lubomir; Patel, Smita; Kazerooni, Ella

    2014-01-01

    Purpose: The buildup of noncalcified plaques (NCPs) that are vulnerable to rupture in coronary arteries is a risk for myocardial infarction. Interpretation of coronary CT angiography (cCTA) to search for NCP is a challenging task for radiologists due to the low CT number of NCP, the large number of coronary arteries, and multiple phase CT acquisition. The authors conducted a preliminary study to develop machine learning method for automated detection of NCPs in cCTA. Methods: With IRB approval, a data set of 83 ECG-gated contrast enhanced cCTA scans with 120 NCPs was collected retrospectively from patient files. A multiscale coronary artery response and rolling balloon region growing (MSCAR-RBG) method was applied to each cCTA volume to extract the coronary arterial trees. Each extracted vessel was reformatted to a straightened volume composed of cCTA slices perpendicular to the vessel centerline. A topological soft-gradient (TSG) detection method was developed to prescreen for NCP candidates by analyzing the 2D topological features of the radial gradient field surface along the vessel wall. The NCP candidates were then characterized by a luminal analysis that used 3D geometric features to quantify the shape information and gray-level features to evaluate the density of the NCP candidates. With machine learning techniques, useful features were identified and combined into an NCP score to differentiate true NCPs from false positives (FPs). To evaluate the effectiveness of the image analysis methods, the authors performed tenfold cross-validation with the available data set. Receiver operating characteristic (ROC) analysis was used to assess the classification performance of individual features and the NCP score. The overall detection performance was estimated by free response ROC (FROC) analysis. Results: With our TSG prescreening method, a prescreening sensitivity of 92.5% (111/120) was achieved with a total of 1181 FPs (14.2 FPs/scan). On average, six features

  19. Image registration with uncertainty analysis

    Science.gov (United States)

    Simonson, Katherine M [Cedar Crest, NM

    2011-03-22

    In an image registration method, edges are detected in a first image and a second image. A percentage of edge pixels in a subset of the second image that are also edges in the first image shifted by a translation is calculated. A best registration point is calculated based on a maximum percentage of edges matched. In a predefined search region, all registration points other than the best registration point are identified that are not significantly worse than the best registration point according to a predetermined statistical criterion.
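
    A minimal sketch of the edge-matching search described above, assuming grayscale NumPy images and integer translations over a small window; the Sobel threshold, the wrap-around shift, and the fixed tolerance used to collect registration points "not significantly worse" than the best one are illustrative assumptions, not the patented statistical criterion.

        import numpy as np
        from scipy import ndimage

        def edge_map(img, thresh=50.0):
            """Crude boolean edge image from the Sobel gradient magnitude."""
            gx = ndimage.sobel(img.astype(float), axis=0)
            gy = ndimage.sobel(img.astype(float), axis=1)
            return np.hypot(gx, gy) > thresh

        def match_fraction(edges_ref, edges_mov, dy, dx):
            """Fraction of shifted moving-image edge pixels that land on reference edges."""
            shifted = np.roll(np.roll(edges_mov, dy, axis=0), dx, axis=1)  # wrap-around ignored for brevity
            n = shifted.sum()
            return 0.0 if n == 0 else (shifted & edges_ref).sum() / n

        def register(img_ref, img_mov, search=10, tol=0.02):
            e_ref, e_mov = edge_map(img_ref), edge_map(img_mov)
            scores = {(dy, dx): match_fraction(e_ref, e_mov, dy, dx)
                      for dy in range(-search, search + 1)
                      for dx in range(-search, search + 1)}
            best = max(scores, key=scores.get)
            near_best = [p for p, s in scores.items() if scores[best] - s < tol]
            return best, near_best  # best registration point plus its uncertainty set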

  20. Highly resolving computerized tomography

    International Nuclear Information System (INIS)

    Kurtz, B.; Petersen, D.; Walter, E.

    1984-01-01

    With the development of high-resolution devices for computerized tomography, CT diagnosis of the lumbar vertebral column has gained increasing importance. As an ambulatory, non-invasive method it has proved in comparative studies to be at least equivalent to myelography in the detection of dislocations of intervertebral disks (4,6,7,15). Because modern devices depict not only the bones but in particular the soft-tissue structures of the spine clearly and precisely, with a resolution well below 1 mm, a further improvement of the results is expected as experience increases. The authors report on the diagnosis of the lumbar vertebral column with a modern computerized tomography device, and wish to draw particular attention to the possibility of performing this examination routinely and to the diagnostic value of secondary reconstructions. (BWU) [de

  1. Highly resolving computerized tomography

    Energy Technology Data Exchange (ETDEWEB)

    Kurtz, B.; Petersen, D.; Walter, E.

    1984-01-01

    With the development of high-resolution devices for computerized tomography, CT diagnosis of the lumbar vertebral column has gained increasing importance. As an ambulatory, non-invasive method it has proved in comparative studies to be at least equivalent to myelography in the detection of dislocations of intervertebral disks (4,6,7,15). Because modern devices depict not only the bones but in particular the soft-tissue structures of the spine clearly and precisely, with a resolution well below 1 mm, a further improvement of the results is expected as experience increases. The authors report on the diagnosis of the lumbar vertebral column with a modern computerized tomography device, and wish to draw particular attention to the possibility of performing this examination routinely and to the diagnostic value of secondary reconstructions.

  2. Transfer function analysis of radiographic imaging systems

    International Nuclear Information System (INIS)

    Metz, C.E.; Doi, K.

    1979-01-01

    The theoretical and experimental aspects of the techniques of transfer function analysis used in radiographic imaging systems are reviewed. The mathematical principles of transfer function analysis are developed for linear, shift-invariant imaging systems, for the relation between object and image and for the image due to a sinusoidal plane wave object. The other basic mathematical principle discussed is 'Fourier analysis' and its application to an input function. Other aspects of transfer function analysis included are alternative expressions for the 'optical transfer function' of imaging systems and expressions are derived for both serial and parallel transfer image sub-systems. The applications of transfer function analysis to radiographic imaging systems are discussed in relation to the linearisation of the radiographic imaging system, the object, the geometrical unsharpness, the screen-film system unsharpness, other unsharpness effects and finally noise analysis. It is concluded that extensive theoretical, computer simulation and experimental studies have demonstrated that the techniques of transfer function analysis provide an accurate and reliable means for predicting and understanding the effects of various radiographic imaging system components in most practical diagnostic medical imaging situations. (U.K.)
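
    As a compact reminder of the serial-subsystem result mentioned above, the modulation transfer function of a cascade of independent linear, shift-invariant stages is the product of the component MTFs (generic notation, not taken from the paper):

        \mathrm{MTF}_{\text{system}}(f) \;=\; \prod_{k=1}^{N} \mathrm{MTF}_{k}(f)

    where f is spatial frequency and the individual factors describe, for example, focal-spot, screen-film and motion unsharpness.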

  3. Computerized plant maintenance management

    International Nuclear Information System (INIS)

    Kozusko, A.M.

    1986-01-01

    The evolution of the computer has had, and continues to have, a great impact on industry. We are in an adjustment cycle with the current computer evolution and will need to adapt to the changes of the coming decade. Hardware and software are continually being enhanced. Computers are becoming more powerful and will eventually provide an effective man-machine interface. This paper shares experiences encountered during implementations of computerized maintenance systems

  4. Computerized medical convocations

    International Nuclear Information System (INIS)

    Roche, Annie; Gilbert, Jean-Francois; Chiadot, Pierre; Vanzetto, Rene; Darnault, Jean

    1969-06-01

    Thanks to a close collaboration between the Medical and Social Department and the Numerical Calculation Laboratory, a computerized convocation system has been implemented to reduce the administrative workload and to introduce more rigor into medical management, patient histories and statistics. This work comprises: - a preliminary study of the data that generate medical convocations and of the related practical requirements; - the programming work based on these data; - the creation of the mechanographic (punched-card) file covering all personnel [fr

  5. Microscopy image segmentation tool: Robust image data analysis

    Energy Technology Data Exchange (ETDEWEB)

    Valmianski, Ilya, E-mail: ivalmian@ucsd.edu; Monton, Carlos; Schuller, Ivan K. [Department of Physics and Center for Advanced Nanoscience, University of California San Diego, 9500 Gilman Drive, La Jolla, California 92093 (United States)

    2014-03-15

    We present a software package called Microscopy Image Segmentation Tool (MIST). MIST is designed for analysis of microscopy images which contain large collections of small regions of interest (ROIs). Originally developed for analysis of porous anodic alumina scanning electron images, MIST capabilities have been expanded to allow use in a large variety of problems including analysis of biological tissue, inorganic and organic film grain structure, as well as nano- and meso-scopic structures. MIST provides a robust segmentation algorithm for the ROIs, includes many useful analysis capabilities, and is highly flexible allowing incorporation of specialized user developed analysis. We describe the unique advantages MIST has over existing analysis software. In addition, we present a number of diverse applications to scanning electron microscopy, atomic force microscopy, magnetic force microscopy, scanning tunneling microscopy, and fluorescent confocal laser scanning microscopy.

  6. Microscopy image segmentation tool: Robust image data analysis

    Science.gov (United States)

    Valmianski, Ilya; Monton, Carlos; Schuller, Ivan K.

    2014-03-01

    We present a software package called Microscopy Image Segmentation Tool (MIST). MIST is designed for analysis of microscopy images which contain large collections of small regions of interest (ROIs). Originally developed for analysis of porous anodic alumina scanning electron images, MIST capabilities have been expanded to allow use in a large variety of problems including analysis of biological tissue, inorganic and organic film grain structure, as well as nano- and meso-scopic structures. MIST provides a robust segmentation algorithm for the ROIs, includes many useful analysis capabilities, and is highly flexible allowing incorporation of specialized user developed analysis. We describe the unique advantages MIST has over existing analysis software. In addition, we present a number of diverse applications to scanning electron microscopy, atomic force microscopy, magnetic force microscopy, scanning tunneling microscopy, and fluorescent confocal laser scanning microscopy.

  7. Microscopy image segmentation tool: Robust image data analysis

    International Nuclear Information System (INIS)

    Valmianski, Ilya; Monton, Carlos; Schuller, Ivan K.

    2014-01-01

    We present a software package called Microscopy Image Segmentation Tool (MIST). MIST is designed for analysis of microscopy images which contain large collections of small regions of interest (ROIs). Originally developed for analysis of porous anodic alumina scanning electron images, MIST capabilities have been expanded to allow use in a large variety of problems including analysis of biological tissue, inorganic and organic film grain structure, as well as nano- and meso-scopic structures. MIST provides a robust segmentation algorithm for the ROIs, includes many useful analysis capabilities, and is highly flexible allowing incorporation of specialized user developed analysis. We describe the unique advantages MIST has over existing analysis software. In addition, we present a number of diverse applications to scanning electron microscopy, atomic force microscopy, magnetic force microscopy, scanning tunneling microscopy, and fluorescent confocal laser scanning microscopy

  8. Computerized cognitive training in cognitively healthy older adults: a systematic review and meta-analysis of effect modifiers.

    Directory of Open Access Journals (Sweden)

    Amit Lampit

    2014-11-01

    Full Text Available BACKGROUND: New effective interventions to attenuate age-related cognitive decline are a global priority. Computerized cognitive training (CCT) is believed to be safe and can be inexpensive, but neither its efficacy in enhancing cognitive performance in healthy older adults nor the impact of design factors on such efficacy has been systematically analyzed. Our aim therefore was to quantitatively assess whether CCT programs can enhance cognition in healthy older adults, discriminate responsive from nonresponsive cognitive domains, and identify the most salient design factors. METHODS AND FINDINGS: We systematically searched Medline, Embase, and PsycINFO for relevant studies from the databases' inception to 9 July 2014. Eligible studies were randomized controlled trials investigating the effects of ≥ 4 h of CCT on performance in neuropsychological tests in older adults without dementia or other cognitive impairment. Fifty-two studies encompassing 4,885 participants were eligible. Intervention designs varied considerably, but after removal of one outlier, heterogeneity across studies was small (I² = 29.92%). There was no systematic evidence of publication bias. The overall effect size (Hedges' g, random effects model) for CCT versus control was small and statistically significant, g = 0.22 (95% CI 0.15 to 0.29). Small to moderate effect sizes were found for nonverbal memory, g = 0.24 (95% CI 0.09 to 0.38); verbal memory, g = 0.08 (95% CI 0.01 to 0.15); working memory (WM), g = 0.22 (95% CI 0.09 to 0.35); processing speed, g = 0.31 (95% CI 0.11 to 0.50); and visuospatial skills, g = 0.30 (95% CI 0.07 to 0.54). No significant effects were found for executive functions and attention. Moderator analyses revealed that home-based administration was ineffective compared to group-based training, and that more than three training sessions per week was ineffective versus three or fewer. There was no evidence for the effectiveness of WM training, and only weak

  9. Information granules in image histogram analysis.

    Science.gov (United States)

    Wieclawek, Wojciech

    2018-04-01

    A concept of granular computing employed in intensity-based image enhancement is discussed. First, a weighted granular computing idea is introduced. Then, the implementation of this term in the image processing area is presented. Finally, multidimensional granular histogram analysis is introduced. The proposed approach is dedicated to digital images, especially to medical images acquired by Computed Tomography (CT). Like the histogram equalization approach, this method is based on image histogram analysis. Yet, unlike the histogram equalization technique, it works on a selected range of the pixel intensity and is controlled by two parameters. Performance is tested on anonymized clinical CT series. Copyright © 2017 Elsevier Ltd. All rights reserved.
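
    For comparison, a minimal sketch of classical histogram equalization restricted to a user-selected intensity window is given below; this is the baseline technique the abstract refers to, not the granular method itself, and the two window parameters lo and hi are illustrative.

        import numpy as np

        def windowed_hist_eq(img, lo, hi, bins=256):
            """Equalize only the intensities inside [lo, hi]; pixels outside the window are left unchanged."""
            out = img.astype(float).copy()
            mask = (img >= lo) & (img <= hi)
            hist, edges = np.histogram(img[mask], bins=bins, range=(lo, hi))
            cdf = hist.cumsum() / max(hist.sum(), 1)  # normalized cumulative histogram
            out[mask] = np.interp(img[mask], edges[:-1], lo + cdf * (hi - lo))
            return out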

  10. Diagnostics of neuromuscular diseases with the aid of computerized tomography

    Energy Technology Data Exchange (ETDEWEB)

    Visser, M de; Verbeeten, Jr, B J

    1988-06-04

    This article discusses the diagnosis of neuromuscular diseases with the aid of computerized tomography. Computerized tomography of skeletal muscles gives no information that is pathognomonic for particular diseases, but the technique is useful in the following respects: to choose a muscle for biopsy; to visualize morphological deviations when the function of a muscle cannot be examined; and to differentiate muscle hypertrophy from pseudo-hypertrophy. In some conditions, such as Becker-type muscular dystrophy, facioscapulohumeral dystrophy and Kugelberg-Welander-type spinal muscular atrophy, computerized tomography gives characteristic images. 10 refs.; 6 figs.

  11. Diagnostics of neuromuscular diseases with the aid of computerized tomography

    International Nuclear Information System (INIS)

    Visser, M. de; Verbeeten, B.J. Jr.

    1988-01-01

    This article discusses the diagnosis of neuromuscular diseases with the aid of computerized tomography. Computerized tomography of skeletal muscles gives no information that is pathognomonic for particular diseases, but the technique is useful in the following respects: to choose a muscle for biopsy; to visualize morphological deviations when the function of a muscle cannot be examined; and to differentiate muscle hypertrophy from pseudo-hypertrophy. In some conditions, such as Becker-type muscular dystrophy, facioscapulohumeral dystrophy and Kugelberg-Welander-type spinal muscular atrophy, computerized tomography gives characteristic images. 10 refs.; 6 figs

  12. Analysis of 3-D images

    Science.gov (United States)

    Wani, M. Arif; Batchelor, Bruce G.

    1992-03-01

    Deriving a generalized representation of 3-D objects for analysis and recognition is a very difficult task. Three types of representation, chosen according to the type of object, are used in this paper. Objects which have well-defined geometrical shapes are segmented using a fast edge-region-based segmentation technique. The segmented image is represented by the plan and elevation of each part of the object if the object parts are symmetrical about their central axis. The plan and elevation concept enables such objects to be represented and analyzed quickly and efficiently. The second type of representation is used for objects having parts which are not symmetrical about their central axis. The segmented surface patches of such objects are represented by the 3-D boundary and the surface features of each segmented surface. Finally, the third type of representation is used for objects which do not have well-defined geometrical shapes (for example, a loaf of bread). These objects are represented and analyzed from features which are derived using a multiscale contour-based technique. An anisotropic Gaussian smoothing technique is introduced to segment the contours at various scales of smoothing. A new merging technique is used which yields the current best estimate of break points at each scale. This new technique eliminates the loss of localization accuracy at coarser scales without using a scale-space tracking approach.

  13. Non-destructive analysis and detection of internal characteristics of spruce logs through X computerized tomography; Detection et analyse non destructive de caracteristiques internes de billons d'epicea commun (PICEA ABIES (L.) KARST) par tomographie a rayons X

    Energy Technology Data Exchange (ETDEWEB)

    Longuetaud, F

    2005-10-15

    Computerized tomography allows direct access to the internal features of scanned logs on the basis of density and moisture content variations. The objective of this work is to assess the feasibility of automatic detection of internal characteristics, with the final aim of conducting scientific analyses. The database consists of CT images of 24 spruces obtained with a medical CT scanner. The studied trees represent several social statuses and come from four stands located in north-eastern France, which in turn represent several age, density and fertility classes. The automatic processing steps developed are the following. First, pith detection in logs, dealing with the problems of knot presence and ring eccentricity; the accuracy of the localisation was less than one mm. Secondly, detection of the sapwood/heartwood limit in logs, dealing with the problem of knot presence (the main source of difficulty); the error on the diameter was 1.8 mm, which corresponds to a relative error of 1.3 per cent. Thirdly, detection of the whorl locations and comparison with an optical method. Fourthly, detection of individualized knots; this process makes it possible to count knots and to locate them in a log (longitudinal position and azimuth), although the validation of the method and the extraction of branch diameter and inclination are still to be developed. An application of this work was a variability analysis of the sapwood content in the trunk: at the within-tree level, the sapwood width was found to be constant under the living crown; at the between-tree level, a strong correlation was found with the amount of living branches. A great number of analyses are possible from our work results, among others: architectural analysis with the pith tracking and the apex death occurrence; analysis of radial variations of the heartwood shape; and analysis of the knot distribution in logs. (author)

  14. Non-destructive analysis and detection of internal characteristics of spruce logs through X computerized tomography; Detection et analyse non destructive de caracteristiques internes de billons d'epicea commun (PICEA ABIES (L.) KARST) par tomographie a rayons X

    Energy Technology Data Exchange (ETDEWEB)

    Longuetaud, F

    2005-10-15

    Computerized tomography allows direct access to the internal features of scanned logs on the basis of density and moisture content variations. The objective of this work is to assess the feasibility of automatic detection of internal characteristics, with the final aim of conducting scientific analyses. The database consists of CT images of 24 spruces obtained with a medical CT scanner. The studied trees represent several social statuses and come from four stands located in north-eastern France, which in turn represent several age, density and fertility classes. The automatic processing steps developed are the following. First, pith detection in logs, dealing with the problems of knot presence and ring eccentricity; the accuracy of the localisation was less than one mm. Secondly, detection of the sapwood/heartwood limit in logs, dealing with the problem of knot presence (the main source of difficulty); the error on the diameter was 1.8 mm, which corresponds to a relative error of 1.3 per cent. Thirdly, detection of the whorl locations and comparison with an optical method. Fourthly, detection of individualized knots; this process makes it possible to count knots and to locate them in a log (longitudinal position and azimuth), although the validation of the method and the extraction of branch diameter and inclination are still to be developed. An application of this work was a variability analysis of the sapwood content in the trunk: at the within-tree level, the sapwood width was found to be constant under the living crown; at the between-tree level, a strong correlation was found with the amount of living branches. A great number of analyses are possible from our work results, among others: architectural analysis with the pith tracking and the apex death occurrence; analysis of radial variations of the heartwood shape; and analysis of the knot distribution in logs. (author)

  15. Multi-Resolution Wavelet-Transformed Image Analysis of Histological Sections of Breast Carcinomas

    Directory of Open Access Journals (Sweden)

    Hae-Gil Hwang

    2005-01-01

    Full Text Available Multi-resolution images of histological sections of breast cancer tissue were analyzed using texture features of Haar- and Daubechies-transform wavelets. Tissue samples analyzed were from ductal regions of the breast and included benign ductal hyperplasia, ductal carcinoma in situ (DCIS), and invasive ductal carcinoma (CA). To assess the correlation between computerized image analysis and visual analysis by a pathologist, we created a two-step classification system based on feature extraction and classification. In the feature extraction step, we extracted texture features from wavelet-transformed images at 10× magnification. In the classification step, we applied two types of classifiers to the extracted features, namely a statistics-based multivariate (discriminant) analysis and a neural network. Using features from second-level Haar transform wavelet images in combination with discriminant analysis, we obtained classification accuracies of 96.67 and 87.78% for the training and testing sets (90 images each), respectively. We conclude that the best classifier of carcinomas in histological sections of breast tissue uses the texture features from the second-level Haar transform wavelet images in a discriminant function.
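
    A minimal sketch of the two-step scheme described above (texture features from a second-level Haar decomposition followed by a discriminant classifier), assuming images arrive as 2-D NumPy arrays; the particular texture statistics (energy and entropy) and the use of scikit-learn's LDA are illustrative assumptions rather than the authors' exact feature set.

        import numpy as np
        import pywt
        from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

        def haar_texture_features(img, level=2):
            """Energy and entropy of each detail subband at the deepest (second) Haar level."""
            coeffs = pywt.wavedec2(img.astype(float), 'haar', level=level)
            feats = []
            for band in coeffs[1]:  # (cH, cV, cD) of the deepest decomposition level
                feats.append(np.mean(band ** 2))  # subband energy
                p = np.abs(band).ravel()
                p = p / (p.sum() + 1e-12)
                feats.append(-(p * np.log2(p + 1e-12)).sum())  # subband entropy
            return np.array(feats)

        # X: list of grayscale images, y: labels (hyperplasia / DCIS / CA)
        # clf = LinearDiscriminantAnalysis().fit([haar_texture_features(im) for im in X], y)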

  16. Applications of stochastic geometry in image analysis

    NARCIS (Netherlands)

    Lieshout, van M.N.M.; Kendall, W.S.; Molchanov, I.S.

    2009-01-01

    A discussion is given of various stochastic geometry models (random fields, sequential object processes, polygonal field models) which can be used in intermediate and high-level image analysis. Two examples are presented of actual image analysis problems (motion tracking in video,

  17. Solar Image Analysis and Visualization

    CERN Document Server

    Ireland, J

    2009-01-01

    This volume presents a selection of papers on the state of the art of image enhancement, automated feature detection, machine learning, and visualization tools in support of solar physics, with a focus on the challenges presented by new ground-based and space-based instrumentation. The articles and topics were inspired by the Third Solar Image Processing Workshop, held at Trinity College Dublin, Ireland, but contributions from other experts have been included as well. This book is mainly aimed at researchers and graduate students working on image processing and computer vision in astronomy and solar physics.

  18. Impact of renal anatomy on shock wave lithotripsy outcomes for lower pole kidney stones: results of a prospective multifactorial analysis controlled by computerized tomography.

    Science.gov (United States)

    Torricelli, Fabio C M; Marchini, Giovanni S; Yamauchi, Fernando I; Danilovic, Alexandre; Vicentini, Fabio C; Srougi, Miguel; Monga, Manoj; Mazzucchi, Eduardo

    2015-06-01

    We evaluated which variables impact fragmentation and clearance of lower pole calculi after shock wave lithotripsy. We prospectively evaluated patients undergoing shock wave lithotripsy for a solitary 5 to 20 mm lower pole kidney stone between June 2012 and August 2014. Patient body mass index and abdominal waist circumference were recorded. One radiologist blinded to shock wave lithotripsy outcomes measured stone size, area and density, stone-to-skin distance, infundibular length, width and height, and infundibulopelvic angle based on baseline noncontrast computerized tomography. Fragmentation, success (defined as residual fragments less than 4 mm in asymptomatic patients) and the stone-free rate were evaluated by noncontrast computerized tomography 12 weeks postoperatively. Univariate and multivariate analysis was performed. A total of 100 patients were enrolled in the study. Mean stone size was 9.1 mm. Overall fragmentation, success and stone-free rates were 76%, 54% and 37%, respectively. On logistic regression, body mass index (OR 1.27, 95% CI 1.11-1.49, p = 0.004) and stone density (OR 1.0026, 95% CI 1.0008-1.0046, p = 0.005) significantly impacted fragmentation. Stone size (OR 1.24, 95% CI 1.07-1.48, p = 0.039) and stone density (OR 1.0021, 95% CI 1.0007-1.0037, p = 0.012) impacted the success rate, while stone size (OR 1.24, 95% CI 1.04-1.50, p = 0.029), stone density (OR 1.0015, 95% CI 1.0001-1.0032, p = 0.046) and infundibular length (OR 1.1035, 95% CI 1.015-1.217, p = 0.015) impacted the stone-free rate. The best outcomes were found in patients with a body mass index of 30 kg/m² or less, stones of 10 mm or less and 900 HU or less, and an infundibular length of 25 mm or less. The coexistence of significant unfavorable variables led to a stone-free rate of less than 20%. Obese patients with lower pole stones larger than 10 mm and denser than 900 HU, and with an infundibular length greater than 25 mm, should be discouraged from

  19. Computerized Italian criticality guide, description and validation

    International Nuclear Information System (INIS)

    Carotenuto, M.; Landeyro, P.A.

    1988-10-01

    Our group is developing an 'expert system' for collecting engineering know-how on back-end nuclear plant design. An expert system is the most suitable software tool for our problem. During the analysis, the design process was divided into different branches. To each branch of the design process the expert system relates a computerized design procedure. Each design procedure is composed of a set of design methods, together with their conditions of application and reliability limits. In the framework of this expert system, the nuclear criticality safety analysis procedure was developed in the form of a computerized criticality guide, attempting to reproduce the designer's normal 'reasoning' process. The criticality guide is composed of two parts: a computerized text, including theory, a description of the accidents that occurred in the past and a description of the Italian design experience; and an interactive computer-aided calculation module, containing a graphical facility for critical-parameter curves. The present report presents the criticality guide (the computerized Italian Criticality Guide) and its validation test. (author)

  20. Computerized Italian criticality guide, description and validation

    Energy Technology Data Exchange (ETDEWEB)

    Carotenuto, M; Landeyro, P A [ENEA - Dipartimento Ciclo del Combustibile, Centro Ricerche Energia, Casaccia (Italy)

    1988-10-15

    Our group is developing an 'expert system' for collecting engineering know-how on back-end nuclear plant design. An expert system is the most suitable software tool for our problem. During the analysis, the design process was divided into different branches. To each branch of the design process the expert system relates a computerized design procedure. Each design procedure is composed of a set of design methods, together with their conditions of application and reliability limits. In the framework of this expert system, the nuclear criticality safety analysis procedure was developed in the form of a computerized criticality guide, attempting to reproduce the designer's normal 'reasoning' process. The criticality guide is composed of two parts: a computerized text, including theory, a description of the accidents that occurred in the past and a description of the Italian design experience; and an interactive computer-aided calculation module, containing a graphical facility for critical-parameter curves. The present report presents the criticality guide (the computerized Italian Criticality Guide) and its validation test. (author)

  1. Multi-Source Image Analysis.

    Science.gov (United States)

    1979-12-01

    These collections were taken to show the advantages made available to the interpreter. In a military operation, however, often little or no in-situ ... The large body of water labeled "W" on each image represents the Agua Hedionda lagoon. East of the lagoon the area is primarily agricultural with a ... power plant located in the southeast corner of the image. West of the Agua Hedionda lagoon is Carlsbad, California. Damp ground is labelled "Dg" on the

  2. Objective analysis of image quality of video image capture systems

    Science.gov (United States)

    Rowberg, Alan H.

    1990-07-01

    As Picture Archiving and Communication System (PACS) technology has matured, video image capture has become a common way of capturing digital images from many modalities. While digital interfaces, such as those which use the ACR/NEMA standard, will become more common in the future, and are preferred because of the accuracy of image transfer, video image capture will be the dominant method in the short term, and may continue to be used for some time because of the low cost and high speed often associated with such devices. Currently, virtually all installed systems use methods of digitizing the video signal that is produced for display on the scanner viewing console itself. A series of digital test images has been developed for display on either a GE CT9800 or a GE Signa MRI scanner. These images have been captured with each of five commercially available image capture systems, and the resultant images digitally transferred on floppy disk to a PC1286 computer containing Optimast' image analysis software. Here the images can be displayed in a comparative manner for visual evaluation, in addition to being analyzed statistically. Each of the images has been designed to support certain tests, including noise, accuracy, linearity, gray scale range, stability, slew rate, and pixel alignment. These image capture systems vary widely in these characteristics, in addition to the presence or absence of other artifacts, such as shading and moire patterns. Other accessories such as video distribution amplifiers and noise filters can also add or modify artifacts seen in the captured images, often giving unusual results. Each image is described, together with the tests which were performed using it. One image contains alternating black and white lines, each one pixel wide, after equilibration strips ten pixels wide. While some systems have a slew rate fast enough to track this correctly, others blur it to an average shade of gray, and do not resolve the lines, or give

  3. Computerized operator decision aids

    International Nuclear Information System (INIS)

    Long, A.B.

    1984-01-01

    This article explores the potential benefits associated with the use of computers in nuclear plants by the operating crew as an aid in making decisions. Pertinent findings are presented from recently completed projects to establish the context in which operating decisions have to be made. Key factors influencing the decision-making process itself are also identified. Safety parameter display systems, which are being implemented in various forms by the nuclear industry, are described within the context of decision making. In addition, relevant worldwide research and development activities are examined as potential enhancements to computerized operator decision aids to further improve plant safety and availability

  4. Computerized procedures system

    Science.gov (United States)

    Lipner, Melvin H.; Mundy, Roger A.; Franusich, Michael D.

    2010-10-12

    An online, data-driven computerized procedures system that guides an operator through a complex process facility's operating procedures. The system monitors plant data, processes the data and then, based upon this processing, presents the status of the current procedure step and/or substep to the operator. The system supports multiple users, and a single procedure definition supports several interface formats that can be tailored to the individual user. Layered security controls access privileges, and revisions are version controlled. The procedures run on a server that is platform-independent of the user workstations it interfaces with, and the user interface supports diverse procedural views.

  5. Computerized spleen volumetry

    International Nuclear Information System (INIS)

    Jahnke, T.; Mohring, R.; Schertel, L.

    1981-01-01

    In experimental studies and clinical investigations of 34 patients, we examined to what extent volumetry of the spleen can be carried out with a commonly available program, a whole-body computerized tomograph (SOMATOM) and an evaluation unit (EVALUSKOP). In this connection the authors also tried to find other ways of performing spleen volumetry with this combination of units. Our final result was that the supplied program based on labelled areas is the best-suited technique for spleen volumetry and is also applicable in practice. (orig./MG) [de
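
    The summation-of-areas principle behind such volumetry can be stated compactly (a generic formula, not the SOMATOM/EVALUSKOP implementation itself): with A_i the labelled splenic cross-sectional area on slice i and Δz the slice spacing,

        V_{\text{spleen}} \;\approx\; \sum_{i=1}^{N} A_{i}\,\Delta z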

  6. Design-related influencing factors of the computerized procedure system for inclusion into human reliability analysis of the advanced control room

    International Nuclear Information System (INIS)

    Kim, Jaewhan; Lee, Seung Jun; Jang, Seung Cheol; Ahn, Kwang-Il; Shin, Yeong Cheol

    2013-01-01

    This paper presents major design factors of the computerized procedure system (CPS) by task characteristics/requirements, with individual relative weight evaluated by the analytic hierarchy process (AHP) technique, for inclusion into human reliability analysis (HRA) of the advanced control rooms. Task characteristics/requirements of an individual procedural step are classified into four categories according to the dynamic characteristics of an emergency situation: (1) a single-static step, (2) a single-dynamic and single-checking step, (3) a single-dynamic and continuous-monitoring step, and (4) a multiple-dynamic and continuous-monitoring step. According to the importance ranking evaluation by the AHP technique, ‘clearness of the instruction for taking action’, ‘clearness of the instruction and its structure for rule interpretation’, and ‘adequate provision of requisite information’ were rated as of being higher importance for all the task classifications. Importance of ‘adequacy of the monitoring function’ and ‘adequacy of representation of the dynamic link or relationship between procedural steps’ is dependent upon task characteristics. The result of the present study gives a valuable insight on which design factors of the CPS should be incorporated, with relative importance or weight between design factors, into HRA of the advanced control rooms. (author)
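
    A minimal sketch of how AHP-style relative weights, as used above to rank the CPS design factors, can be obtained from a reciprocal pairwise-comparison matrix; the 3 x 3 example matrix and its Saaty-scale entries are invented purely for illustration.

        import numpy as np

        def ahp_weights(pairwise):
            """Principal-eigenvector weights of a reciprocal pairwise-comparison matrix."""
            vals, vecs = np.linalg.eig(np.asarray(pairwise, dtype=float))
            principal = np.real(vecs[:, np.argmax(np.real(vals))])
            w = np.abs(principal)
            return w / w.sum()

        # e.g. three hypothetical design factors compared pairwise on Saaty's 1-9 scale
        w = ahp_weights([[1.0, 3.0, 5.0],
                         [1/3, 1.0, 2.0],
                         [1/5, 1/2, 1.0]])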

  7. Computerized adaptive testing item selection in computerized adaptive learning systems

    NARCIS (Netherlands)

    Eggen, Theodorus Johannes Hendrikus Maria; Eggen, T.J.H.M.; Veldkamp, B.P.

    2012-01-01

    Item selection methods traditionally developed for computerized adaptive testing (CAT) are explored for their usefulness in item-based computerized adaptive learning (CAL) systems. While in CAT Fisher information-based selection is optimal, for recovering learning populations in CAL systems item

  8. Forensic Analysis of Digital Image Tampering

    Science.gov (United States)

    2004-12-01

    analysis of when each method fails, which Chapter 4 discusses. Finally, a test image containing an invisible watermark using LSB steganography is ... Figure 2.2 – Example of invisible watermark using Steganography Software F5 ... Figure 2.3 – Example of copy-move image forgery [12] ... used to embed the hidden watermark is Steganography Software F5 version 11+ discussed in Section 2.2. Original JPEG Image – 580 x 435 – 17.4
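
    For readers unfamiliar with the LSB technique named in these snippets, a minimal sketch of least-significant-bit embedding and extraction in an 8-bit grayscale array follows; it illustrates the generic spatial-domain idea only and is not a description of the F5 software mentioned above.

        import numpy as np

        def lsb_embed(img, bits):
            """Write a bit sequence into the least-significant bits of a grayscale uint8 image."""
            flat = img.astype(np.uint8).ravel().copy()
            bits = np.asarray(bits, dtype=np.uint8)
            flat[:bits.size] = (flat[:bits.size] & 0xFE) | bits  # clear the LSB, then set it
            return flat.reshape(img.shape)

        def lsb_extract(img, n_bits):
            """Read back the first n_bits least-significant bits."""
            return img.astype(np.uint8).ravel()[:n_bits] & 1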

  9. Wavefront analysis for plenoptic camera imaging

    International Nuclear Information System (INIS)

    Luan Yin-Sen; Xu Bing; Yang Ping; Tang Guo-Mao

    2017-01-01

    The plenoptic camera is a single-lens stereo camera which can retrieve the direction of light rays while detecting their intensity distribution. In this paper, to reveal more of the truth of plenoptic camera imaging, we present a wavefront analysis of plenoptic camera imaging from the standpoint of physical optics rather than the ray-tracing model of geometric optics. Specifically, the wavefront imaging model of a plenoptic camera is analyzed and simulated by scalar diffraction theory, and the depth estimation is redescribed on the basis of physical optics. We simulate a set of raw plenoptic images of an object scene, thereby validating the analysis and derivations, and the difference between the imaging analysis methods based on geometric optics and physical optics is also shown in the simulations. (paper)

  10. Breast cancer histopathology image analysis : a review

    NARCIS (Netherlands)

    Veta, M.; Pluim, J.P.W.; Diest, van P.J.; Viergever, M.A.

    2014-01-01

    This paper presents an overview of methods that have been proposed for the analysis of breast cancer histopathology images. This research area has become particularly relevant with the advent of whole slide imaging (WSI) scanners, which can perform cost-effective and high-throughput histopathology

  11. Multiplicative calculus in biomedical image analysis

    NARCIS (Netherlands)

    Florack, L.M.J.; Assen, van H.C.

    2011-01-01

    We advocate the use of an alternative calculus in biomedical image analysis, known as multiplicative (a.k.a. non-Newtonian) calculus. It provides a natural framework in problems in which positive images or positive definite matrix fields and positivity preserving operators are of interest. Indeed,

  12. Image analysis in x-ray cinefluorography

    Energy Technology Data Exchange (ETDEWEB)

    Ikuse, J; Yasuhara, H; Sugimoto, H [Toshiba Corp., Kawasaki, Kanagawa (Japan)

    1979-02-01

    For the cinefluorographic image in the cardiovascular diagnostic system, the image quality is evaluated by means of MTF (Modulation Transfer Function), and object contrast by introducing the concept of x-ray spectrum analysis. On the basis of these results, further investigation is made of optimum X-ray exposure factors set for cinefluorography and the cardiovascular diagnostic system.

  13. An Imaging And Graphics Workstation For Image Sequence Analysis

    Science.gov (United States)

    Mostafavi, Hassan

    1990-01-01

    This paper describes an application-specific engineering workstation designed and developed to analyze imagery sequences from a variety of sources. The system combines the software and hardware environment of the modern graphic-oriented workstations with digital image acquisition, processing and display techniques. The objective is to achieve automation and high throughput for many data reduction tasks involving metric studies of image sequences. The applications of such an automated data reduction tool include analysis of the trajectory and attitude of aircraft, missiles, stores and other flying objects in various flight regimes including launch and separation as well as regular flight maneuvers. The workstation can also be used in an on-line or off-line mode to study three-dimensional motion of aircraft models in simulated flight conditions such as wind tunnels. The system's key features are: 1) acquisition and storage of image sequences by digitizing real-time video or frames from a film strip; 2) computer-controlled movie loop playback, slow motion and freeze frame display combined with digital image sharpening, noise reduction, contrast enhancement and interactive image magnification; 3) multiple leading edge tracking in addition to object centroids at up to 60 fields per second from either live input video or a stored image sequence; 4) automatic and manual field-of-view and spatial calibration; 5) image sequence data base generation and management, including the measurement data products; 6) off-line analysis software for trajectory plotting and statistical analysis; 7) model-based estimation and tracking of object attitude angles; and 8) interface to a variety of video players and film transport sub-systems.

  14. Facial Image Analysis in Anthropology: A Review

    Czech Academy of Sciences Publication Activity Database

    Kalina, Jan

    2011-01-01

    Vol. 49, No. 2 (2011), pp. 141-153 ISSN 0323-1119 Institutional support: RVO:67985807 Keywords: face * computer-assisted methods * template matching * geometric morphometrics * robust image analysis Subject RIV: IN - Informatics, Computer Science

  15. Optimization of shearography image quality analysis

    International Nuclear Information System (INIS)

    Rafhayudi Jamro

    2005-01-01

    Shearography is an optical technique based on speckle patterns to measure the deformation of an object surface, in which the fringe pattern is obtained through correlation analysis of the speckle patterns. Analysis of the fringe pattern for engineering applications is limited to qualitative measurement; therefore, for further analysis leading to qualitative data, a series of image processing steps is involved. In this paper, the fringe pattern for qualitative analysis is discussed. The principal field of application is qualitative non-destructive testing, such as detecting discontinuities and defects in the material structure, locating fatigue zones, etc., and all of these require image processing. In order to perform image optimisation successfully, the noise in the fringe pattern must be minimised and the fringe pattern itself must be maximised. This can be achieved by applying a filtering method with a kernel size ranging from 2 × 2 to 7 × 7 pixels and also by applying an equalizer in the image processing. (Author)
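
    A minimal sketch of the kernel-based smoothing step described above; the choice of a median filter is an assumption made here for illustration, and only the 2 x 2 to 7 x 7 kernel-size range comes from the abstract.

        import numpy as np
        from scipy.ndimage import median_filter

        def denoise_fringes(fringe_img, kernel=3):
            """Suppress speckle noise in a fringe pattern with a square median filter (kernel size 2-7)."""
            return median_filter(np.asarray(fringe_img, dtype=float), size=kernel)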

  16. Structural analysis in medical imaging

    International Nuclear Information System (INIS)

    Dellepiane, S.; Serpico, S.B.; Venzano, L.; Vernazza, G.

    1987-01-01

    The conventional techniques in Pattern Recognition (PR) have been greatly improved by the introduction of Artificial Intelligence (AI) approaches, in particular for knowledge representation, inference mechanism and control structure. The purpose of this paper is to describe an image understanding system, based on the integrated approach (AI - PR), developed in the author's Department to interpret Nuclear Magnetic Resonance (NMR) images. The system is characterized by a heterarchical control structure and a blackboard model for the global data-base. The major aspects of the system are pointed out, with particular reference to segmentation, knowledge representation and error recovery (backtracking). The eye slices obtained in the case of two patients have been analyzed and the related results are discussed

  17. Quantitative analysis of selected minor and trace elements through use of a computerized automatic x-ray spectrograph

    International Nuclear Information System (INIS)

    Fabbi, B.P.; Elsheimer, H.N.; Espos, L.F.

    1976-01-01

    Upgrading a manual X-ray spectrograph, interfacing with an 8K computer, and employment of interelement correction programs have resulted in a several-fold increase in productivity for routine quantitative analysis and an accompanying decrease in operator bias both in measurement procedures and in calculations. Factors such as dead time and self-absorption also are now computer corrected, resulting in improved accuracy. All conditions of analysis except for the X-ray tube voltage are controlled by the computer, which enhances precision of analysis. Elemental intensities are corrected for matrix effects, and from these the percent concentrations are calculated and printed via teletype. Interelement correction programs utilizing multiple linear regression are employed for the determination of the following minor and trace elements: K, S, Rb, Sr, Y, and Zr in silicate rocks, and Ba, As, Sb, and Zn in both silicate and carbonate rock samples. The last named elements use the same regression curves for both rock types. All these elements are determined in concentrations generally ranging from 0.0025 percent to 4.00 percent. The sensitivities obtainable range from 0.0001 percent for barium to 0.001 percent for antimony. The accuracy, as measured by the percent relative error for a variety of silicate and carbonate rocks, is on the order of 1-7 percent. The exception is yttrium
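
    The dead-time correction mentioned above can be stated, in the standard non-paralyzable counting model, as follows (quoted as the generic formula; the exact correction coded for this instrument is not given in the abstract): with N_m the measured count rate and τ the dead time per counted event,

        N_{\text{true}} \;=\; \frac{N_{m}}{1 - N_{m}\,\tau}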

  18. A computerized system to conduct the Tweed-Merrifield analysis in orthodontics Sistema computadorizado para conduzir a análise de Tweed-Merrifield na ortodontia

    Directory of Open Access Journals (Sweden)

    Maximino Brandão Barreto

    2006-04-01

    Full Text Available Precision in orthodontic diagnosis can increase the chance of therapeutic success. The objective of this study was to describe the development of a computerized system (prototype), created from a printed table of the Cranial Facial Analysis and Total Dentition Space Analysis with Difficulty Index - Tweed-Merrifield Analysis - in order to aid orthodontic diagnosis. The analysis was transposed from the manual format to the digital format. A clear and logical user interface, consisting of tables and graphs and including automatic, fast and accurate calculations, was sought for the development of the prototype. The result was the immediate visualization of the resolution of the analysis after filling out the fields on the computer. This technological innovation can be a helpful instrument for the orthodontist: it favors a more accurate dental-cranial-facial analysis, increases patient safety, guides clinical conduct and may contribute to teaching and research.

  19. Malware Analysis Using Visualized Image Matrices

    Directory of Open Access Journals (Sweden)

    KyoungSoo Han

    2014-01-01

    Full Text Available This paper proposes a novel malware visual analysis method that contains not only a visualization method to convert binary files into images, but also a similarity calculation method between these images. The proposed method generates RGB-colored pixels on image matrices using the opcode sequences extracted from malware samples and calculates the similarities for the image matrices. Particularly, our proposed methods are available for packed malware samples by applying them to the execution traces extracted through dynamic analysis. When the images are generated, we can reduce the overheads by extracting the opcode sequences only from the blocks that include the instructions related to staple behaviors such as functions and application programming interface (API) calls. In addition, we propose a technique that generates a representative image for each malware family in order to reduce the number of comparisons for the classification of unknown samples and the colored pixel information in the image matrices is used to calculate the similarities between the images. Our experimental results show that the image matrices of malware can effectively be used to classify malware families both statically and dynamically with accuracy of 0.9896 and 0.9732, respectively.

  20. Malware analysis using visualized image matrices.

    Science.gov (United States)

    Han, KyoungSoo; Kang, BooJoong; Im, Eul Gyu

    2014-01-01

    This paper proposes a novel malware visual analysis method that contains not only a visualization method to convert binary files into images, but also a similarity calculation method between these images. The proposed method generates RGB-colored pixels on image matrices using the opcode sequences extracted from malware samples and calculates the similarities for the image matrices. Particularly, our proposed methods are available for packed malware samples by applying them to the execution traces extracted through dynamic analysis. When the images are generated, we can reduce the overheads by extracting the opcode sequences only from the blocks that include the instructions related to staple behaviors such as functions and application programming interface (API) calls. In addition, we propose a technique that generates a representative image for each malware family in order to reduce the number of comparisons for the classification of unknown samples and the colored pixel information in the image matrices is used to calculate the similarities between the images. Our experimental results show that the image matrices of malware can effectively be used to classify malware families both statically and dynamically with accuracy of 0.9896 and 0.9732, respectively.
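
    A minimal sketch of the visualization idea described in these two records, laying an opcode sequence out as an RGB image matrix; the hash-based color mapping, the fixed matrix width and the pixel-wise similarity measure are illustrative assumptions rather than the authors' exact encoding.

        import hashlib
        import numpy as np

        def opcodes_to_image(opcodes, width=64):
            """Map each opcode to an RGB pixel via a hash and lay the sequence out row by row."""
            pixels = [list(hashlib.md5(op.encode()).digest()[:3]) for op in opcodes]
            height = -(-len(pixels) // width) if pixels else 1  # ceiling division
            img = np.zeros((height * width, 3), dtype=np.uint8)
            if pixels:
                img[:len(pixels)] = pixels
            return img.reshape(height, width, 3)

        def pixel_similarity(img_a, img_b):
            """Fraction of identical pixels between two equally shaped image matrices."""
            return float((img_a == img_b).all(axis=2).mean())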

  1. Analysis of Variance in Statistical Image Processing

    Science.gov (United States)

    Kurz, Ludwik; Hafed Benteftifa, M.

    1997-04-01

    A key problem in practical image processing is the detection of specific features in a noisy image. Analysis of variance (ANOVA) techniques can be very effective in such situations, and this book gives a detailed account of the use of ANOVA in statistical image processing. The book begins by describing the statistical representation of images in the various ANOVA models. The authors present a number of computationally efficient algorithms and techniques to deal with such problems as line, edge, and object detection, as well as image restoration and enhancement. By describing the basic principles of these techniques, and showing their use in specific situations, the book will facilitate the design of new algorithms for particular applications. It will be of great interest to graduate students and engineers in the field of image processing and pattern recognition.
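
    A minimal sketch of the ANOVA-based feature-detection idea the book covers: testing whether pixel means differ across column groups flags a vertical line or edge. The synthetic image and the use of SciPy's one-way ANOVA are assumptions made for illustration.

        import numpy as np
        from scipy.stats import f_oneway

        rng = np.random.default_rng(0)
        img = rng.normal(100.0, 5.0, size=(64, 64))
        img[:, 32] += 12.0  # a faint vertical line buried in noise

        # one-way ANOVA across columns: a large F statistic suggests a column-dependent feature
        f_stat, p_value = f_oneway(*[img[:, j] for j in range(img.shape[1])])
        print(f"F = {f_stat:.2f}, p = {p_value:.3g}")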

  2. Image processing and analysis software development

    International Nuclear Information System (INIS)

    Shahnaz, R.

    1999-01-01

    The work presented in this project is aimed at developing a software package, 'IMAGE GALLERY', to investigate various image processing and analysis techniques. The work was divided into two parts, namely image processing techniques and pattern recognition, the latter comprising character and face recognition. Various image enhancement techniques including negative imaging, contrast stretching, compression of dynamic range, neon, diffuse, emboss, etc. have been studied. Segmentation techniques including point detection, line detection and edge detection have been studied. Also some of the smoothing and sharpening filters have been investigated. All these imaging techniques have been implemented in a window-based computer program written in Visual Basic. Neural network techniques based on the perceptron model have been applied for face and character recognition. (author)
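
    A minimal sketch of two of the enhancement operations listed above, negative imaging and linear contrast stretching, for an 8-bit grayscale array; the percentile clipping limits are an illustrative choice, not taken from the abstract.

        import numpy as np

        def negative(img):
            """Photographic negative of an 8-bit image."""
            return 255 - np.asarray(img, dtype=np.uint8)

        def contrast_stretch(img, low_pct=2, high_pct=98):
            """Linearly map the [low, high] percentile range onto the full 0-255 range."""
            img = np.asarray(img, dtype=float)
            lo, hi = np.percentile(img, [low_pct, high_pct])
            out = np.clip((img - lo) / max(hi - lo, 1e-9), 0.0, 1.0)
            return (out * 255).astype(np.uint8)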

  3. HVAC modifications and computerized energy analysis for the Operations Support Building at the Mars Deep Space Station at Goldstone

    Science.gov (United States)

    Halperin, A.; Stelzmuller, P.

    1986-01-01

    The key heating, ventilation, and air-conditioning (HVAC) modifications implemented by the Jet Propulsion Laboratory (JPL) at the Mars Deep Space Station's Operations Support Building in order to reduce energy consumption and decrease operating costs are described. An energy analysis comparing the computer-simulated model of the building with the actual meter data was presented. The measured performance data showed that the cumulative energy savings were about 21% for the period 1979 to 1981. The deviation between the simulated data and the measured performance data was only about 3%.

  4. From Digital Imaging to Computer Image Analysis of Fine Art

    Science.gov (United States)

    Stork, David G.

    An expanding range of techniques from computer vision, pattern recognition, image analysis, and computer graphics are being applied to problems in the history of art. The success of these efforts is enabled by the growing corpus of high-resolution multi-spectral digital images of art (primarily paintings and drawings), sophisticated computer vision methods, and most importantly the engagement of some art scholars who bring questions that may be addressed through computer methods. This paper outlines some general problem areas and opportunities in this new inter-disciplinary research program.

  5. Low osmolar (non-ionic) contrast media versus high osmolar (ionic) contrast media in intravenous urography and enhanced computerized tomography: a cost-effectiveness analysis.

    Science.gov (United States)

    Wangsuphachart, S

    1991-12-01

    The cost-effectiveness of three alternative policies for the use of intravenous contrast media for urography and enhanced computerized tomography (CT) are analyzed. Alternative #1 is to use high osmolar contrast media (HOCM) in all patients, the historical policy. Alternative #2 is to replace it with low osmolar contrast media (LOCM) in all patients. Alternative #3 is to use LOCM only in the high risk patients. Data on the 6,242 patients who underwent intravenous urography and enhanced CT at the Department of Radiology, Chulalongkorn Hospital in 1989 were used. Both societal and hospital viewpoints were analyzed. The incremental cost-effectiveness (ICE) between #2 and #1 was 26,739 Baht (US$1,070) per healthy day saved (HDS), while the ICE between #3 and #1 was 12,057 Baht (US$482) per HDS. For fatal cases only, ICE between #2 and #1 was 35,111 Baht (US$1,404) per HDS, while the ICE between #3 and #1 was 18,266 Baht (US$731) per HDS. The incremental cost (IC) per patient was 2,341 Baht (US$94) and 681 Baht (US$27) respectively. For the hospital viewpoint the ICE between #2 and #1 was 13,744 (US$550) and between #3 and #1 was 6,127 Baht (US$245) per HDS. The IC per patient was 1,203 Baht (US$48) and 346 Baht (US$14), respectively. From the sensitivity analysis, #3 should be used if the LOCM price is reduced more than 75% (equal to 626 Baht or less) and more than 80% of the patients are able to pay for the contrast media.
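
    For reference, the incremental cost-effectiveness figures quoted above follow the standard definition (stated generically; C and E denote the cost and the effectiveness, in healthy days saved, of the two policies being compared):

        \mathrm{ICE}_{2\,\text{vs}\,1} \;=\; \frac{C_{2} - C_{1}}{E_{2} - E_{1}}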

  6. Breast cancer histopathology image analysis: a review.

    Science.gov (United States)

    Veta, Mitko; Pluim, Josien P W; van Diest, Paul J; Viergever, Max A

    2014-05-01

    This paper presents an overview of methods that have been proposed for the analysis of breast cancer histopathology images. This research area has become particularly relevant with the advent of whole slide imaging (WSI) scanners, which can perform cost-effective and high-throughput histopathology slide digitization, and which aim at replacing the optical microscope as the primary tool used by pathologists. Breast cancer is the most prevalent form of cancer among women, and image analysis methods that target this disease have a huge potential to reduce the workload in a typical pathology lab and to improve the quality of the interpretation. This paper is meant as an introduction for nonexperts. It starts with an overview of the tissue preparation, staining and slide digitization processes, followed by a discussion of the different image processing techniques and applications, ranging from analysis of tissue staining to computer-aided diagnosis and prognosis of breast cancer patients.

  7. Some developments in multivariate image analysis

    DEFF Research Database (Denmark)

    Kucheryavskiy, Sergey

    Multivariate image analysis (MIA), one of the successful chemometric applications, is now used widely in different areas of science and industry. Introduced in the late 80s, it became very popular with hyperspectral imaging, where MIA is one of the most efficient tools for exploratory analysis... be up to several million. The main MIA tool for exploratory analysis is the score density plot – all pixels are projected into principal component space and the corresponding score plots are colorized according to their density (how many pixels are crowded in the unit area of the plot). Looking for and analyzing patterns on these plots and on the original image allows interactive analysis, extraction of hidden information, building of a supervised classification model, and much more. In the present work several alternative methods to the original principal component analysis (PCA) for building the projection...

  8. Cardiothoracic ratio and vertebral heart size (VHS) to standardize the heart size of the tufted capuchin (Cebus apella Linnaeus, 1758) in computerized radiographic images

    Directory of Open Access Journals (Sweden)

    Hermínio J. Rocha-Neto

    2015-10-01

    Full Text Available Abstract: The VHS and CTR were assessed using computerized thoracic radiographs of ten clinically healthy tufted capuchin monkeys (five males and five females) from the Wild Animal Screening Center in São Luís (Centro de Triagem de Animais Silvestres de São Luís-MA-CETAS). Radiographs were taken in laterolateral and dorsoventral projections to calculate the vertebral heart size (VHS) and the cardiothoracic ratio (CTR). The VHS showed mean values of 9.34±0.32v (males) and 9.16±0.34v (females) and there was no statistical difference between males and females (p>0.05). The CTR showed mean values of 0.55±0.04 (males) and 0.52±0.03 (females) and there was no statistical difference between the sexes (p>0.05). There was a positive correlation between VHS and CTR (r=0.78). The thoracic and heart diameters showed mean values of 5.70±0.48cm and 2.16±0.40cm in the males, respectively. In the females they measured 5.32±0.39cm and 2.94±0.32cm. There was no statistical difference between the sexes. Our results show that the high correlation found between VHS and CTR indicates that the two methods estimate alterations in the heart silhouette on radiographic examination of the tufted capuchin with similar clinical precision, making it an easy technique to apply that can be considered in the investigation of heart problems in this wild species.

  9. DARE: Unesco Computerized Data Retrieval System for Documentation in the Social and Human Sciences (Including an Analysis of the Present System).

    Science.gov (United States)

    Vasarhelyi, Paul

    The new data retrieval system for the social sciences which has recently been installed in the UNESCO Secretariat in Paris is described in this comprehensive report. The computerized system is designed to facilitate the existing storage systems in the circulation of information, data retrieval, and indexing services. Basically, this report…

  10. Computerized analysis of coronary artery disease: Performance evaluation of segmentation and tracking of coronary arteries in CT angiograms

    Energy Technology Data Exchange (ETDEWEB)

    Zhou, Chuan, E-mail: chuan@umich.edu; Chan, Heang-Ping; Chughtai, Aamer; Kuriakose, Jean; Agarwal, Prachi; Kazerooni, Ella A.; Hadjiiski, Lubomir M.; Patel, Smita; Wei, Jun [Department of Radiology, University of Michigan, Ann Arbor, Michigan 48109 (United States)

    2014-08-15

    Purpose: The authors are developing a computer-aided detection system to assist radiologists in the analysis of coronary artery disease in coronary CT angiograms (cCTA). This study evaluated the accuracy of the authors’ coronary artery segmentation and tracking method, which provides the essential steps for defining the search space for the detection of atherosclerotic plaques. Methods: The heart region in cCTA is segmented and the vascular structures are enhanced using the authors’ multiscale coronary artery response (MSCAR) method, which performs 3D multiscale filtering and analysis of the eigenvalues of Hessian matrices. Starting from seed points at the origins of the left and right coronary arteries, a 3D rolling balloon region growing (RBG) method that adapts to the local vessel size segments and tracks each of the coronary arteries and identifies the branches along the tracked vessels. The branches are queued and subsequently tracked until the queue is exhausted. With Institutional Review Board approval, 62 cCTA were collected retrospectively from the authors’ patient files. Three experienced cardiothoracic radiologists manually tracked and marked center points of the coronary arteries as the reference standard, following the 17-segment model that includes clinically significant coronary arteries. Two radiologists visually examined the computer-segmented vessels and marked the mistakenly tracked veins and noisy structures as false positives (FPs). For the 62 cases, the radiologists marked a total of 10191 center points on 865 visible coronary artery segments. Results: The computer-segmented vessels overlapped with 83.6% (8520/10191) of the center points. Relative to the 865 radiologist-marked segments, the sensitivity reached 91.9% (795/865) if a true positive is defined as a computer-segmented vessel that overlapped with at least 10% of the reference center points marked on the segment. When the overlap threshold is increased to 50% and 100%, the sensitivities were 86
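
    The abstract does not give the implementation details of MSCAR or the rolling balloon region growing. The following sketch only illustrates the underlying idea of Hessian-eigenvalue vessel enhancement on a synthetic 2D image, with generic Frangi-style parameters chosen for illustration; it is not the authors' 3D method.

```python
import numpy as np
from scipy import ndimage as ndi

def vesselness_2d(image, sigmas=(1.0, 2.0, 4.0), beta=0.5, c=15.0):
    """Multiscale Hessian-eigenvalue vessel enhancement (Frangi-style, 2D)."""
    image = image.astype(float)
    response = np.zeros_like(image)
    for sigma in sigmas:
        # Scale-normalised second derivatives of the Gaussian-smoothed image.
        Hxx = sigma**2 * ndi.gaussian_filter(image, sigma, order=(0, 2))
        Hyy = sigma**2 * ndi.gaussian_filter(image, sigma, order=(2, 0))
        Hxy = sigma**2 * ndi.gaussian_filter(image, sigma, order=(1, 1))
        # Eigenvalues of the 2x2 Hessian at every pixel, ordered so |l1| <= |l2|.
        tmp = np.sqrt((Hxx - Hyy) ** 2 + 4 * Hxy**2)
        l1 = 0.5 * (Hxx + Hyy + tmp)
        l2 = 0.5 * (Hxx + Hyy - tmp)
        swap = np.abs(l1) > np.abs(l2)
        l1, l2 = np.where(swap, l2, l1), np.where(swap, l1, l2)
        # Bright tubular structures: l2 strongly negative, l1 close to zero.
        Rb = np.abs(l1) / (np.abs(l2) + 1e-10)
        S = np.sqrt(l1**2 + l2**2)
        v = np.exp(-Rb**2 / (2 * beta**2)) * (1 - np.exp(-S**2 / (2 * c**2)))
        v[l2 > 0] = 0.0
        response = np.maximum(response, v)
    return response

# Toy example: a bright curved "vessel" on a noisy background.
y, x = np.mgrid[0:128, 0:128]
img = np.exp(-((y - 64 - 20 * np.sin(x / 20.0)) ** 2) / 8.0)
img += 0.1 * np.random.default_rng(1).normal(size=img.shape)
v = vesselness_2d(img)
print(f"vesselness response range: {v.min():.3f} to {v.max():.3f}")
```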

  11. Document image analysis: A primer

    Indian Academy of Sciences (India)

    (1) Typical documents in today's office are computer-generated, but even so, inevitably by different computers and ... different sizes, from a business card to a large engineering drawing. Document analysis ... Whether global or adaptive ...

  12. Motion Estimation and Compensation Strategies in Dynamic Computerized Tomography

    Science.gov (United States)

    Hahn, Bernadette N.

    2017-12-01

    A main challenge in computerized tomography consists in imaging moving objects. Temporal changes during the measuring process lead to inconsistent data sets, and applying standard reconstruction techniques causes motion artefacts which can severely impede a reliable diagnosis. Therefore, novel reconstruction techniques are required which compensate for the dynamic behavior. This article builds on recent results from a microlocal analysis of the dynamic setting, which enable us to formulate efficient analytic motion compensation algorithms for contour extraction. Since these methods require information about the dynamic behavior, we further introduce a motion estimation approach which determines parameters of affine and certain non-affine deformations directly from measured motion-corrupted Radon-data. Our methods are illustrated with numerical examples for both types of motion.

  13. Traffic analysis and control using image processing

    Science.gov (United States)

    Senthilkumar, K.; Ellappan, Vijayan; Arun, A. R.

    2017-11-01

    This paper presents the work on traffic analysis and control to date. It shows an approach to regulating traffic with the use of image processing and MATLAB. The concept uses computational images that are compared with original images of the street in order to determine the traffic level percentage and to set the timing for the traffic signal accordingly, which reduces the stoppage time at traffic lights. The concept proposes to solve real-life scenarios in the streets, thus enriching the traffic lights by adding image receivers like HD cameras and image processors. The input is then imported into MATLAB to be used as a method for calculating the traffic on roads. The results are computed in order to adjust the traffic light timings on a particular street, also with respect to other similar proposals, but with the added value of solving a real, large instance.
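
    A minimal illustration of the image-differencing idea, not the authors' MATLAB code: the "traffic level percentage" is taken here as the fraction of pixels that differ from an empty-road reference frame, and the signal timing is an assumed linear mapping.

```python
import numpy as np

def traffic_level(reference, current, diff_threshold=30):
    """Percentage of pixels that differ noticeably from the empty-road image."""
    diff = np.abs(current.astype(int) - reference.astype(int))
    occupied = diff > diff_threshold
    return 100.0 * occupied.mean()

def green_time(level_percent, t_min=10, t_max=60):
    """Map traffic level (0-100 %) linearly to a green-signal duration in seconds."""
    return t_min + (t_max - t_min) * level_percent / 100.0

# Toy example: an "empty road" frame and a frame with some bright "vehicles".
rng = np.random.default_rng(0)
empty = rng.integers(90, 110, size=(240, 320), dtype=np.int32)
busy = empty.copy()
busy[100:140, 40:120] = 220      # vehicle 1
busy[60:90, 200:260] = 230       # vehicle 2

level = traffic_level(empty, busy)
print(f"traffic level: {level:.1f} %, green time: {green_time(level):.1f} s")
```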

  14. Development of Image Analysis Software of MAXI

    Science.gov (United States)

    Eguchi, S.; Ueda, Y.; Hiroi, K.; Isobe, N.; Sugizaki, M.; Suzuki, M.; Tomida, H.; Maxi Team

    2010-12-01

    Monitor of All-sky X-ray Image (MAXI) is an X-ray all-sky monitor, attached to the Japanese experiment module Kibo on the International Space Station. The main scientific goals of the MAXI mission include the discovery of X-ray novae followed by prompt alerts to the community (Negoro et al., in this conference), and production of X-ray all-sky maps and new source catalogs with unprecedented sensitivities. To extract the best capabilities of the MAXI mission, we are working on the development of detailed image analysis tools. We utilize maximum likelihood fitting to a projected sky image, where we take account of the complicated detector responses, such as the background and point spread functions (PSFs). The modeling of PSFs, which strongly depend on the orbit and attitude of MAXI, is a key element in the image analysis. In this paper, we present the status of our software development.
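
    The maximum likelihood fitting mentioned in this record can be illustrated, in a heavily simplified form, by fitting the amplitude of a single point source on a flat background with a Poisson likelihood. The Gaussian PSF and all numbers below are stand-ins for illustration, not the MAXI detector response.

```python
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(42)

# Assumed instrument model: a Gaussian PSF and a flat background rate.
y, x = np.mgrid[0:64, 0:64]
psf = np.exp(-((x - 32) ** 2 + (y - 32) ** 2) / (2 * 3.0**2))
psf /= psf.sum()
background = 0.2                                  # counts / pixel

# Simulated observed counts for a source of true amplitude 500 counts.
true_amp = 500.0
observed = rng.poisson(true_amp * psf + background)

def neg_log_likelihood(amp):
    """Poisson negative log-likelihood for a given source amplitude."""
    model = amp * psf + background
    return np.sum(model - observed * np.log(model))

res = minimize_scalar(neg_log_likelihood, bounds=(0.0, 5000.0), method="bounded")
print(f"true amplitude: {true_amp:.0f}, ML estimate: {res.x:.1f}")
```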

  15. Digital image analysis of NDT radiographs

    International Nuclear Information System (INIS)

    Graeme, W.A. Jr.; Eizember, A.C.; Douglass, J.

    1989-01-01

    Prior to the introduction of Charge Coupled Device (CCD) detectors the majority of image analysis performed on NDT radiographic images was done visually in the analog domain. While some film digitization was being performed, the process was often unable to capture all the usable information on the radiograph or was too time consuming. CCD technology now provides a method to digitize radiographic film images without losing the useful information captured in the original radiograph in a timely process. Incorporating that technology into a complete digital radiographic workstation allows analog radiographic information to be processed, providing additional information to the radiographer. Once in the digital domain, that data can be stored, and fused with radioscopic and other forms of digital data. The result is more productive analysis and management of radiographic inspection data. The principal function of the NDT Scan IV digital radiography system is the digitization, enhancement and storage of radiographic images

  16. Mathematical foundations of image processing and analysis

    CERN Document Server

    Pinoli, Jean-Charles

    2014-01-01

    Mathematical Imaging is currently a rapidly growing field in applied mathematics, with an increasing need for theoretical mathematics. This book, the second of two volumes, emphasizes the role of mathematics as a rigorous basis for imaging sciences. It provides a comprehensive and convenient overview of the key mathematical concepts, notions, tools and frameworks involved in the various fields of gray-tone and binary image processing and analysis, by proposing a large, but coherent, set of symbols and notations, a complete list of subjects and a detailed bibliography. It establishes a bridge ...

  17. Chromatic Image Analysis For Quantitative Thermal Mapping

    Science.gov (United States)

    Buck, Gregory M.

    1995-01-01

    Chromatic image analysis system (CIAS) developed for use in noncontact measurements of temperatures on aerothermodynamic models in hypersonic wind tunnels. Based on concept of temperature coupled to shift in color spectrum for optical measurement. Video camera images fluorescence emitted by phosphor-coated model at two wavelengths. Temperature map of model then computed from relative brightnesses in video images of model at those wavelengths. Eliminates need for intrusive, time-consuming, contact temperature measurements by gauges, making it possible to map temperatures on complex surfaces in timely manner and at reduced cost.
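
    A sketch of the two-wavelength ratio idea described above, assuming a hypothetical monotonic calibration curve relating the intensity ratio to temperature; the actual CIAS calibration, wavelengths and phosphor response are not given in the record.

```python
import numpy as np

# Assumed calibration: intensity ratio (I_550nm / I_620nm) versus temperature (K),
# such as would be obtained from a separate calibration of the phosphor coating.
calib_ratio = np.array([0.20, 0.35, 0.55, 0.80, 1.10, 1.45])
calib_temp = np.array([300.0, 350.0, 400.0, 450.0, 500.0, 550.0])

def temperature_map(img_550, img_620, eps=1e-6):
    """Per-pixel temperature from the brightness ratio of two wavelength images."""
    ratio = img_550 / (img_620 + eps)
    # np.interp needs increasing x values; calib_ratio is monotonic by construction.
    return np.interp(ratio, calib_ratio, calib_temp)

# Toy example: synthetic fluorescence images of a heated model surface.
rng = np.random.default_rng(3)
img_620 = 1000.0 + 50.0 * rng.normal(size=(128, 128))
img_550 = img_620 * np.linspace(0.3, 1.2, 128)[None, :]   # ratio grows left to right

T = temperature_map(img_550, img_620)
print("temperature range on the surface: %.0f K to %.0f K" % (T.min(), T.max()))
```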

  18. New method of scoliosis assessment: preliminary results using computerized photogrammetry.

    Science.gov (United States)

    Aroeira, Rozilene Maria Cota; Leal, Jefferson Soares; de Melo Pertence, Antônio Eustáquio

    2011-09-01

    A new method for nonradiographic evaluation of scoliosis was independently compared with the Cobb radiographic method for the quantification of scoliotic curvature. The aims were to develop a protocol for computerized photogrammetry, as a nonradiographic method, for the quantification of scoliosis, and to mathematically relate this proposed method to the Cobb radiographic method. Repeated exposure of children to radiation can be harmful to their health. Nevertheless, no nonradiographic method proposed until now has gained popularity as a routine method of evaluation, mainly due to low correspondence with the Cobb radiographic method. Patients undergoing standing posteroanterior full-length spine radiographs, who were willing to participate in this study, were submitted to dorsal digital photography in the orthostatic position with special surface markers over the spinous processes, specifically of vertebrae C7 to L5. The radiographic and photographic images were sent separately for independent analysis to two examiners, trained in quantification of scoliosis for the types of images received. The scoliosis curvature angles obtained through computerized photogrammetry (the new method) were compared to those obtained through the Cobb radiographic method. Sixteen individuals were evaluated (14 female and 2 male). All presented idiopathic scoliosis and had a mean age of 21.4 ± 6.1 years, weight of 52.9 ± 5.8 kg, height of 1.63 ± 0.05 m, and body mass index of 19.8 ± 0.2. There was no statistically significant difference between the scoliosis angle measurements obtained in the comparative analysis of both methods, and a mathematical relationship was formulated between the two methods. The preliminary results demonstrate equivalence between the two methods. More studies are needed to firmly assess the potential of this new method as a coadjuvant tool in the routine follow-up of scoliosis treatment.
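
    The record does not specify how the photogrammetric angle is computed. The sketch below shows one generic way to obtain a Cobb-like angle from digitized surface-marker coordinates, as the angle between the most inclined tangents of a fitted curve; the coordinates are hypothetical and this is not the authors' protocol.

```python
import numpy as np

def cobb_like_angle(x, y):
    """Angle (deg) between the most-inclined tangents of a smoothed marker curve.

    x, y: coordinates (e.g. in cm) of surface markers over the spinous
    processes, ordered from C7 down to L5.
    """
    # Fit a smooth polynomial to the lateral deviation x as a function of height y.
    coeffs = np.polyfit(y, x, deg=4)
    dx_dy = np.polyval(np.polyder(coeffs), np.linspace(y.min(), y.max(), 200))
    # Tangent inclinations relative to the vertical axis, in degrees.
    incl = np.degrees(np.arctan(dx_dy))
    return incl.max() - incl.min()

# Toy example: a gently S-shaped marker line (hypothetical coordinates, cm).
y = np.linspace(0, 40, 17)                       # C7 ... L5 heights
x = 1.5 * np.sin(2 * np.pi * y / 40.0)           # lateral deviation
print(f"curvature angle estimate: {cobb_like_angle(x, y):.1f} degrees")
```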

  19. Laue image analysis. Pt. 2

    International Nuclear Information System (INIS)

    Greenhough, T.J.; Shrive, A.K.

    1994-01-01

    Many Laue diffraction patterns from crystals of particular biological or chemical interest are of insufficient quality for their analysis to be feasible. In many cases, this is because of pronounced streaking of the spots owing to either large mosaic spread or disorder introduced during reactions in the crystal. Methods for the analysis of exposures exhibiting radial or near-radial streaking are described, along with their application in Laue diffraction studies of form-II crystals of Met-tRNA synthetase and a photosynthetic reaction centre from Rhodobacter sphaeroides. In both cases, variable elliptical radial masking has led to significant improvements in data quality and quantity and exposures that previously were too streaked to process may now be analysed. These masks can also provide circular profiles as a special case for processing high-quality Laue exposures and spatial-overlap deconvolution may be performed using the elliptical or circular masks. (orig.)

  20. Multisource Images Analysis Using Collaborative Clustering

    Directory of Open Access Journals (Sweden)

    Pierre Gançarski

    2008-04-01

    Full Text Available The development of very high-resolution (VHR) satellite imagery has produced a huge amount of data. The multiplication of satellites which embed different types of sensors provides a lot of heterogeneous images. Consequently, the image analyst often has many different images available, representing the same area of the Earth surface. These images can be from different dates, produced by different sensors, or even at different resolutions. The lack of machine learning tools using all these representations in an overall process constrains the analyst to a sequential analysis of these various images. In order to use all the information available simultaneously, we propose a framework where different algorithms can use different views of the scene. Each one works on a different remotely sensed image and, thus, produces different and useful information. These algorithms work together in a collaborative way through an automatic and mutual refinement of their results, so that all the results have almost the same number of clusters, which are statistically similar. Finally, a unique result is produced, representing a consensus among the information obtained by each clustering method on its own image. The unified result and the complementarity of the single results (i.e., the agreement between the clustering methods as well as the disagreement) lead to a better understanding of the scene. The experiments carried out on multispectral remote sensing images have shown that this method is efficient at extracting relevant information and improving the scene understanding.

  1. Parameter analysis of radiography film for CT use

    International Nuclear Information System (INIS)

    Souza, L.S. de; Lopes, R.T.

    1990-01-01

    The possibilities of using industrial X-ray films for projection acquisition and image processing, together with an analysis of film noise when the films are used as radiation detectors in computerized tomography, are studied. (C.G.C.)

  2. Computerized tomography in myotonic dystrophy

    International Nuclear Information System (INIS)

    Gellerich, I.; Mueller, D.; Koch, R.D.

    1986-01-01

    In addition to clinical symptoms, disease course and electromyography, computerized tomography improves the diagnosis of myotonic dystrophy. Even small changes in muscular structure are detectable, and the musculus soleus in particular exhibits early and pronounced alterations. By means of the density distribution pattern an improved characterization of the disease is possible. Additional information is obtained by cerebral computerized tomography. Atrophy of brain tissue is to be expected in all patients with myotonic dystrophy. (author)

  3. Computerized radiation treatment planning

    International Nuclear Information System (INIS)

    Laarse, R. van der.

    1981-01-01

    Following a general introduction, a chain consisting of three computer programs which has been developed for treatment planning of external beam radiotherapy without manual intervention is described. New score functions used for determination of optimal incidence directions are presented and the calculation of the position of the isocentre for each optimum combination of incidence directions is explained. A description of how a set of applicators, covering fields with dimensions of 4 to 20 cm, for the 6 to 20 MeV electron beams of a MEL SL75-20 linear accelerator was developed, is given. A computer program for three dimensional electron beam treatment planning is presented. A microprocessor based treatment planning system for the Selectron remote controlled afterloading system for intracavitary radiotherapy is described. The main differences in treatment planning procedures for external beam therapy with neutrons instead of photons is discussed. A microprocessor based densitometer for plotting isodensity lines in film dosimetry is described. A computer program for dose planning of brachytherapy is presented. Finally a general discussion about the different aspects of computerized treatment planning as presented in this thesis is given. (Auth.)

  4. Radioactive waste computerized management

    International Nuclear Information System (INIS)

    Communaux, M.; Lantes, B.

    1993-01-01

    Since December 31, 1990, the management of the nuclear wastes for all the power stations has been computerized, using the DRA module of the Power Generation and Transmission Group's data processing master plan. So now EDF has a software package which centralizes all the data, enabling it to declare the characteristics of the nuclear wastes which are to be stored on the sites operated by the National Radioactive Waste Management Agency (ANDRA). Among other uses, this application makes it possible for EDF, by real time data exchange with ANDRA, to constitute an inventory of validated, shippable packs. It also constitutes a data base for all the wastes produced on the various sites. This application was developed to meet the following requirements: give the producers of radioactive waste a means to fully manage all the characteristics and materials that are necessary to condition their waste correctly; guarantee the traceability and safety of data and automatically assure the transmission of this data in real time between the producers and the ANDRA; give the Central Services of EDF an operational and statistical tool permitting experience feedback based on the complete national production (single, centralized data base); and integrate the application within the products of the processing master plan in order to assure its maintenance and evolution.

  5. Development of a computerized tomographic system

    International Nuclear Information System (INIS)

    Borges, J.C.; Santos, C.A.C.

    1986-01-01

    The Nuclear Instrumentation Laboratory at COPPE/UFRJ has been developing techniques for detection and applications of nuclear radiations. A lot of research work has been done and resulted in several M.Sc. and D.Sc. thesis, concerning subjects like neutrongraphy, gammagraphy, image reconstruction, special detectors, etc. Recent progress and multiple applications of the computerized tomography to medical and industrial non-destructive tests, pushed the Laboratory to a vast program in this field of research. In this paper, we report what has been done and the results obtained. (Author) [pt

  6. Applications Of Binary Image Analysis Techniques

    Science.gov (United States)

    Tropf, H.; Enderle, E.; Kammerer, H. P.

    1983-10-01

    After discussing the conditions where binary image analysis techniques can be used, three new applications of the fast binary image analysis system S.A.M. (Sensorsystem for Automation and Measurement) are reported: (1) The human view direction is measured at TV frame rate while the subject's head remains freely movable. (2) Industrial parts hanging on a moving conveyor are classified prior to spray painting by robot. (3) In automotive wheel assembly, the eccentricity of the wheel is minimized by turning the tyre relative to the rim in order to balance the eccentricity of the components.

  7. Computerized tomographic in non-destructive testing

    International Nuclear Information System (INIS)

    Lopes, R.T.

    1988-01-01

    The process of computerized tomography has been developed for medical imaging purposes using X-ray tomographs, and little attention has been given to other possible applications of the technique because of its cost. As an alternative to this problem, we constructed a Tomographic System (STAC-1), using gamma-rays, for nonmedical applications. In this work we summarize the basic theory of image reconstruction using computerized tomography and describe the considerations that led to the development of the experimental system. The image reconstruction method implemented in the system is filtered backprojection (convolution), with a digital filter stage that pre-filters the projections. The experimental system is described, with details of the control and data processing. An alternative and complementary system, using film as a detector, is shown in preliminary form. This thesis discusses the theoretical and practical aspects considered in the construction of the STAC-1, as well as its limitations and applications [pt
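
    Filtered backprojection itself is a standard algorithm and can be demonstrated with off-the-shelf tools. The sketch below simulates parallel-beam projections of a phantom and reconstructs them with ramp-filtered backprojection; it does not reproduce the STAC-1 gamma-ray geometry or its pre-filtering scheme.

```python
import numpy as np
from skimage.data import shepp_logan_phantom
from skimage.transform import radon, iradon, rescale

# Test object and simulated parallel-beam projections (sinogram).
image = rescale(shepp_logan_phantom(), 0.5)           # 200x200 phantom
theta = np.linspace(0.0, 180.0, 180, endpoint=False)  # projection angles (deg)
sinogram = radon(image, theta=theta)

# Filtered backprojection: ramp ("convolution") filtering of each projection
# followed by backprojection over all angles; iradon's default filter is the ramp.
reconstruction = iradon(sinogram, theta=theta)

err = np.sqrt(np.mean((reconstruction - image) ** 2))
print(f"sinogram shape: {sinogram.shape}, reconstruction RMS error: {err:.4f}")
```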

  8. Fourier analysis: from cloaking to imaging

    Science.gov (United States)

    Wu, Kedi; Cheng, Qiluan; Wang, Guo Ping

    2016-04-01

    Regarding invisibility cloaks as an optical imaging system, we present a Fourier approach to analytically unify both Pendry cloaks and complementary media-based invisibility cloaks into one kind of cloak. By synthesizing different transfer functions, we can construct different devices to realize a series of interesting functions such as hiding objects (events), creating illusions, and performing perfect imaging. In this article, we give a brief review of recent works applying the Fourier approach to analyze invisibility cloaks and optical imaging through scattering layers. We show that, to construct devices to conceal an object, no constructive materials with extreme properties are required, making most, if not all, of the above functions realizable by using naturally occurring materials. As instances, we experimentally verify a method of directionally hiding distant objects and create illusions by using all-dielectric materials, and further demonstrate a non-invasive method of imaging objects completely hidden by scattering layers.
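
    The general idea of describing an optical element by a transfer function applied in the Fourier domain can be sketched as follows. The low-pass and identity "devices" below are arbitrary illustrations, not the cloaking transfer functions of the cited work.

```python
import numpy as np

def apply_transfer_function(field, transfer):
    """Propagate an input field through a device described by a transfer function."""
    spectrum = np.fft.fftshift(np.fft.fft2(field))
    return np.fft.ifft2(np.fft.ifftshift(spectrum * transfer))

# Toy input field: a small bright square on a dark background.
field = np.zeros((256, 256))
field[120:136, 120:136] = 1.0

# Assumed transfer functions: an ideal low-pass "imaging" element and an
# all-pass element (identity), each applied by multiplication in Fourier space.
fy, fx = np.mgrid[-128:128, -128:128]
low_pass = (np.hypot(fx, fy) < 20).astype(float)
identity = np.ones_like(low_pass)

blurred = np.abs(apply_transfer_function(field, low_pass))
restored = np.abs(apply_transfer_function(field, identity))

energy_kept = (blurred ** 2).sum() / (field ** 2).sum()
print(f"energy kept by the low-pass device: {100 * energy_kept:.1f} %")
print("identity device reproduces the input:", np.allclose(restored, field, atol=1e-9))
```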

  9. Fourier analysis: from cloaking to imaging

    International Nuclear Information System (INIS)

    Wu, Kedi; Ping Wang, Guo; Cheng, Qiluan

    2016-01-01

    Regarding invisibility cloaks as an optical imaging system, we present a Fourier approach to analytically unify both Pendry cloaks and complementary media-based invisibility cloaks into one kind of cloak. By synthesizing different transfer functions, we can construct different devices to realize a series of interesting functions such as hiding objects (events), creating illusions, and performing perfect imaging. In this article, we give a brief review of recent works applying the Fourier approach to analyze invisibility cloaks and optical imaging through scattering layers. We show that, to construct devices to conceal an object, no constructive materials with extreme properties are required, making most, if not all, of the above functions realizable by using naturally occurring materials. As instances, we experimentally verify a method of directionally hiding distant objects and create illusions by using all-dielectric materials, and further demonstrate a non-invasive method of imaging objects completely hidden by scattering layers. (review)

  10. Quantitative Image Simulation and Analysis of Nanoparticles

    DEFF Research Database (Denmark)

    Madsen, Jacob; Hansen, Thomas Willum

    High Resolution Transmission Electron Microscopy (HRTEM) has become a routine analysis tool for structural characterization at atomic resolution, and with the recent development of in-situ TEMs, it is now possible to study catalytic nanoparticles under reaction conditions. However, the connection between an experimental image and the underlying physical phenomena or structure is not always straightforward. The aim of this thesis is to use image simulation to better understand observations from HRTEM images. Surface strain is known to be important for the performance of nanoparticles. Using simulation, we estimate the precision and accuracy of strain measurements from TEM images, and investigate the stability of these measurements to microscope parameters. This is followed by our efforts toward simulating metal nanoparticles on a metal-oxide support using the Charge Optimized Many Body (COMB) interatomic potential. The simulated interface

  11. Hyperspectral Image Analysis of Food Quality

    DEFF Research Database (Denmark)

    Arngren, Morten

    Assessing the quality of food is a vital step in any food processing line to ensure the best food quality and maximum profit for the farmer and food manufacturer. Traditional quality evaluation methods are often destructive and labour-intensive procedures relying on wet chemistry or subjective human inspection. Near-infrared spectroscopy can address these issues by offering a fast and objective analysis of the food quality. A natural extension to these single spectrum NIR systems is to include image information such that each pixel holds a NIR spectrum. This augmented image information offers several extensions to the analysis of food quality. This dissertation is concerned with hyperspectral image analysis used to assess the quality of single grain kernels. The focus is to highlight the benefits and challenges of using hyperspectral imaging for food quality, presented in two research directions. Initially ...

  12. Deep Learning in Medical Image Analysis.

    Science.gov (United States)

    Shen, Dinggang; Wu, Guorong; Suk, Heung-Il

    2017-06-21

    This review covers computer-assisted analysis of images in the field of medical imaging. Recent advances in machine learning, especially with regard to deep learning, are helping to identify, classify, and quantify patterns in medical images. At the core of these advances is the ability to exploit hierarchical feature representations learned solely from data, instead of features designed by hand according to domain-specific knowledge. Deep learning is rapidly becoming the state of the art, leading to enhanced performance in various medical applications. We introduce the fundamentals of deep learning methods and review their successes in image registration, detection of anatomical and cellular structures, tissue segmentation, computer-aided disease diagnosis and prognosis, and so on. We conclude by discussing research issues and suggesting future directions for further improvement.

  13. Data Analysis Strategies in Medical Imaging.

    Science.gov (United States)

    Parmar, Chintan; Barry, Joseph D; Hosny, Ahmed; Quackenbush, John; Aerts, Hugo Jwl

    2018-03-26

    Radiographic imaging continues to be one of the most effective and clinically useful tools within oncology. Sophistication of artificial intelligence (AI) has allowed for detailed quantification of radiographic characteristics of tissues using predefined engineered algorithms or deep learning methods. Precedents in radiology as well as a wealth of research studies hint at the clinical relevance of these characteristics. However, there are critical challenges associated with the analysis of medical imaging data. While some of these challenges are specific to the imaging field, many others like reproducibility and batch effects are generic and have already been addressed in other quantitative fields such as genomics. Here, we identify these pitfalls and provide recommendations for analysis strategies of medical imaging data including data normalization, development of robust models, and rigorous statistical analyses. Adhering to these recommendations will not only improve analysis quality, but will also enhance precision medicine by allowing better integration of imaging data with other biomedical data sources. Copyright ©2018, American Association for Cancer Research.

  14. Diversity in computerized reactor protection systems

    International Nuclear Information System (INIS)

    Fischer, H.D.; Piel, L.

    1999-01-01

    Based on engineering judgement, the most important measures to increase the independence of redundant trains of a computerized safety instrumentation and control (I and C) system in a nuclear power plant are evaluated with respect to practical applications. This paper contributes to an objective discussion of the necessary and justifiable degree of diversity in a computerized safety I and C system. Important conclusions are: (i) diverse equipment may be used to control dependent failures only if the measures necessary for designing, licensing, and operating a computerized safety I and C system homogeneous in equipment are neither technically nor economically feasible; (ii) the considerable operating experience in France with a digital reactor protection system using non-diverse equipment does not call for equipment diversity; although there are no generally accepted methods, the licensing authority is still required to take dependent failures into account in a probabilistic safety analysis; (iii) the frequency of postulated initiating events implies which I and C functionality should be implemented on diverse equipment. Using non-safety I and C equipment in addition to safety I and C equipment is attractive because its necessary unavailability to control an initiating event in teamwork with the safety I and C equipment is estimated to range from 0.01 to 0.1. This can be achieved by operational experience.

  15. Multispectral Image Analysis for Astaxanthin Coating Classification

    DEFF Research Database (Denmark)

    Ljungqvist, Martin Georg; Ersbøll, Bjarne Kjær; Nielsen, Michael Engelbrecht

    2012-01-01

    Industrial quality inspection using image analysis on astaxanthin coating in aquaculture feed pellets is of great importance for automatic production control. The pellets were divided into two groups: one with pellets coated using synthetic astaxanthin in fish oil and the other with pellets coated...

  16. A virtual laboratory for medical image analysis

    NARCIS (Netherlands)

    Olabarriaga, Sílvia D.; Glatard, Tristan; de Boer, Piter T.

    2010-01-01

    This paper presents the design, implementation, and usage of a virtual laboratory for medical image analysis. It is fully based on the Dutch grid, which is part of the Enabling Grids for E-sciencE (EGEE) production infrastructure and driven by the gLite middleware. The adopted service-oriented

  17. Scanning transmission electron microscopy imaging and analysis

    CERN Document Server

    Pennycook, Stephen J

    2011-01-01

    Provides the first comprehensive treatment of the physics and applications of this mainstream technique for imaging and analysis at the atomic level Presents applications of STEM in condensed matter physics, materials science, catalysis, and nanoscience Suitable for graduate students learning microscopy, researchers wishing to utilize STEM, as well as for specialists in other areas of microscopy Edited and written by leading researchers and practitioners

  18. Serial assessment of doxorubicin cardiomyopathy with the computerized scintillation probe

    International Nuclear Information System (INIS)

    Strashun, A.; Goldsmith, S.J.; Horowitz, S.F.

    1982-01-01

    Cardiac function was serially monitored in 55 patients receiving adriamycin chemotherapy over 18 months with quantitative radionuclide assessment by both a nonimaging computerized scintillation probe and gamma camera-computer imaging. Interval ejection fraction change was comparable with both techniques and predicted incipient cardiotoxicity. Probe data revealed that the decline in ejection fraction was antedated by a decline in left ventricular emptying and filling rates.

  19. The Pattern of Significant Lesions Found in Computerized ...

    African Journals Online (AJOL)

    Introduction: Seizures are common reasons for neurologic consultations and investigations. In the absence of magnetic resonance imaging, computerized tomography scanning of the brain is a reliable and cheaper alternative. Little is known about the pattern of brain lesions in patients with recurrent seizures in Nigeria.

  20. The Pattern of Significant Lesions Found in Computerized ...

    African Journals Online (AJOL)

    2017-12-05

    Introduction: Seizures are common reasons for neurologic consultations and investigations. In the absence of magnetic resonance imaging, computerized tomography scanning of the brain is a reliable and cheaper alternative. Little is known about the pattern of brain lesions in patients with recurrent ...

  1. Flame analysis using image processing techniques

    Science.gov (United States)

    Her Jie, Albert Chang; Zamli, Ahmad Faizal Ahmad; Zulazlan Shah Zulkifli, Ahmad; Yee, Joanne Lim Mun; Lim, Mooktzeng

    2018-04-01

    This paper presents image processing techniques with the use of a fuzzy logic and neural network approach to perform flame analysis. Flame diagnostics is important in industry to extract relevant information from flame images. Experimental tests were carried out in a model industrial burner with different flow rates. Flame features such as luminous and spectral parameters are extracted using image processing and the Fast Fourier Transform (FFT). Flame images are acquired using a FLIR infrared camera. Non-linearities such as thermal acoustic oscillations and background noise affect the stability of the flame. Flame velocity is one of the important characteristics that determine the stability of the flame. In this paper, an image processing method is proposed to determine flame velocity. A power spectral density (PSD) graph is a good tool for vibration analysis, from which flame stability can be approximated. However, a more intelligent diagnostic system is needed to automatically determine flame stability. In this paper, flame features for different flow rates are compared and analyzed. The selected flame features are used as inputs to the proposed fuzzy inference system to determine flame stability. A neural network is used to test the performance of the fuzzy inference system.
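
    A simplified stand-in for the PSD step only (the fuzzy inference and neural network stages are not reproduced): compute the Welch power spectral density of a frame-averaged luminosity signal and report the dominant oscillation. The sampling rate, oscillation frequency and noise level below are assumptions for illustration.

```python
import numpy as np
from scipy.signal import welch

# Synthetic flame luminosity signal: mean brightness of each frame at 1 kHz,
# with a 120 Hz thermo-acoustic oscillation buried in noise (assumed values).
fs = 1000.0
t = np.arange(0, 2.0, 1.0 / fs)
rng = np.random.default_rng(7)
luminosity = 1.0 + 0.2 * np.sin(2 * np.pi * 120.0 * t) + 0.3 * rng.normal(size=t.size)

# Power spectral density of the luminosity signal (Welch's method).
freqs, psd = welch(luminosity, fs=fs, nperseg=512)
dominant = freqs[np.argmax(psd[1:]) + 1]          # skip the DC bin

# A crude stability indicator: fraction of spectral power concentrated at the peak.
stability_index = psd.max() / psd.sum()
print(f"dominant oscillation: {dominant:.1f} Hz, peak-power fraction: {stability_index:.3f}")
```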

  2. Frequency domain analysis of knock images

    Science.gov (United States)

    Qi, Yunliang; He, Xin; Wang, Zhi; Wang, Jianxin

    2014-12-01

    High speed imaging-based knock analysis has mainly focused on time domain information, e.g. the spark triggered flame speed, the time when end gas auto-ignition occurs and the end gas flame speed after auto-ignition. This study presents a frequency domain analysis on the knock images recorded using a high speed camera with direct photography in a rapid compression machine (RCM). To clearly visualize the pressure wave oscillation in the combustion chamber, the images were high-pass-filtered to extract the luminosity oscillation. The luminosity spectrum was then obtained by applying fast Fourier transform (FFT) to three basic colour components (red, green and blue) of the high-pass-filtered images. Compared to the pressure spectrum, the luminosity spectra better identify the resonant modes of pressure wave oscillation. More importantly, the resonant mode shapes can be clearly visualized by reconstructing the images based on the amplitudes of luminosity spectra at the corresponding resonant frequencies, which agree well with the analytical solutions for mode shapes of gas vibration in a cylindrical cavity.
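
    The per-pixel temporal FFT and mode-shape visualisation described in this record can be illustrated on a synthetic image sequence. The frame rate, resonance frequency and mode shape below are assumptions for illustration, not RCM data.

```python
import numpy as np

# Synthetic high-speed image sequence: a standing pressure-wave "mode" that
# modulates luminosity at 6 kHz inside a cylindrical chamber (assumed values).
fs = 50000.0                                    # frame rate, frames / s
n_frames, n = 512, 64
t = np.arange(n_frames) / fs
y, x = np.mgrid[0:n, 0:n]
r = np.hypot(x - n / 2, y - n / 2)
mode_shape = np.cos(np.pi * (x - n / 2) / n) * (r < n / 2)   # first transverse mode

rng = np.random.default_rng(5)
frames = (1.0
          + 0.1 * mode_shape[None, :, :] * np.sin(2 * np.pi * 6000.0 * t)[:, None, None]
          + 0.05 * rng.normal(size=(n_frames, n, n)))

# Temporal FFT of every pixel's luminosity; amplitude at each frequency bin.
spectrum = np.fft.rfft(frames - frames.mean(axis=0), axis=0)
freqs = np.fft.rfftfreq(n_frames, d=1.0 / fs)

# The resonant frequency shows up as a peak of the spatially averaged spectrum,
# and the amplitude image at that bin visualises the mode shape.
mean_amp = np.abs(spectrum).mean(axis=(1, 2))
k = np.argmax(mean_amp[1:]) + 1
mode_image = np.abs(spectrum[k])
print(f"detected resonance: {freqs[k]:.0f} Hz (true value 6000 Hz)")
```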

  3. Computerized tomography and conventional radiography: A comparison from the standpoint of X-ray physics and technology

    Energy Technology Data Exchange (ETDEWEB)

    Pfeiler, M; Linke, G [Siemens A.G., Erlangen (Germany, F.R.). Unternehmensbereich Medizinische Technik

    1979-08-01

    After a short explanation of the technical foundations of computerized tomography (CT), starting from terms used in conventional X-ray technique and in CT, the differences (dose distribution, image character) and similarities (quantum noise, beam quality) of both methods are discussed. Finally, possible methods for the quantitative evaluation of CT images and for the computation of longitudinal layers from a series of computerized tomograms are described. (author).

  4. Computed image analysis of neutron radiographs

    International Nuclear Information System (INIS)

    Dinca, M.; Anghel, E.; Preda, M.; Pavelescu, M.

    2008-01-01

    Similar to X-ray radiography, there is a nondestructive technique, named neutron radiology, that uses neutrons as the penetrating particles. When the information is registered on a film with the help of a conversion foil (with a high cross section for neutrons) that emits secondary radiation (β, γ) and creates a latent image, the technique is named neutron radiography. A radiographic industrial film that contains the image of the internal structure of an object, obtained by neutron radiography, must subsequently be analyzed to obtain qualitative and quantitative information about the structural integrity of that object. A computerized analysis of a film is possible using a facility with the following main components: an illuminator for the film, a CCD video camera and a computer (PC) with suitable software. The qualitative analysis aims to reveal possible anomalies of the structure caused by manufacturing processes or induced by working processes (for example, the irradiation activity in the case of nuclear fuel). The quantitative determination is based on measurements of some image parameters: dimensions and optical densities. The illuminator was built specially for this application but can also be used for simple visual observation. The illuminated area is 9x40 cm. The frame of the system is an Abbe comparator of Carl Zeiss Jena type, which was adapted for this application. The video camera captures the image, which is stored and processed by the computer. A special program, SIMAG-NG, was developed at INR Pitesti and, together with the program SMTV II of the special acquisition module SM 5010, can analyze the images of a film. The major application of the system was the quantitative analysis of a film containing the images of some nuclear fuel pins alongside a dimensional standard. The system was used to measure the length of the pellets of the TRIGA nuclear fuel. (authors)

  5. Patient surface doses in computerized tomography examinations

    International Nuclear Information System (INIS)

    Vekic, B; Kovacevic, S.; Ranogajec-Komor, M.; Duvnjak, N.; Marusic, P.; Anic, P.; Dolencic, P.

    1996-01-01

    The diagnostic value of computerized tomography has increased due to very rapid technical advances in both equipment and techniques. When the CT scanners were introduced, a significant problem for the specification of the radiation dose imparted to the patient undergoing CT examination has been created. In CT, the conditions of exposure are quite different from those in conventional X-ray imaging. CT procedure involves the continuous tomography of thin layers. Some of these layers touch each other while others overlap. The radiation doses received by patients can vary considerably. In addition to the radiation from the collimated primary beam, patients are exposed to significant scattered doses in unpredictable amounts. Every effort should be made to keep these doses to a reasonable minimum, without sacrificing the image quality. The aims of this work were to determine the surface doses delivered to various organs of patients during various computerized tomography examinations (head, thorax, kidney, abdomen and pelvis). Particular attention was directed to the precise determination of doses received by the eyes (during CT of head) and gonads (during CT of pelvis and lower abdomen) since these organs can be near or even in the primary X-ray beam

  6. Web Based Distributed Coastal Image Analysis System, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — This project develops a Web-based distributed image analysis system that processes Moderate Resolution Imaging Spectroradiometer (MODIS) data to provide decision...

  7. Digital image sequence processing, compression, and analysis

    CERN Document Server

    Reed, Todd R

    2004-01-01

    Introduction (Todd R. Reed); Content-Based Image Sequence Representation (Pedro M. Q. Aguiar, Radu S. Jasinschi, José M. F. Moura, and Charnchai Pluempitiwiriyawej); The Computation of Motion (Christoph Stiller, Sören Kammel, Jan Horn, and Thao Dang); Motion Analysis and Displacement Estimation in the Frequency Domain (Luca Lucchese and Guido Maria Cortelazzo); Quality of Service Assessment in New Generation Wireless Video Communications (Gaetano Giunta); Error Concealment in Digital Video (Francesco G.B. De Natale); Image Sequence Restoration: A Wider Perspective (Anil Kokaram); Video Summarization (Cuneyt M. Taskiran and Edward ...

  8. Study of TCP densification via image analysis

    International Nuclear Information System (INIS)

    Silva, R.C.; Alencastro, F.S.; Oliveira, R.N.; Soares, G.A.

    2011-01-01

    Among ceramic materials that mimic human bone, β-type tri-calcium phosphate (β-TCP) has shown appropriate chemical stability and superior resorption rate when compared to hydroxyapatite. In order to increase its mechanical strength, the material is sintered, under controlled time and temperature conditions, to obtain densification without phase change. In the present work, tablets were produced via uniaxial compression and then sintered at 1150°C for 2h. The analysis via XRD and FTIR showed that the sintered tablets were composed only by β-TCP. The SEM images were used for quantification of grain size and volume fraction of pores, via digital image analysis. The tablets showed small pore fraction (between 0.67% and 6.38%) and homogeneous grain size distribution (∼2μm). Therefore, the analysis method seems viable to quantify porosity and grain size. (author)
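
    A generic sketch of the kind of quantification described (pore fraction and pore sizes from a segmented micrograph), using Otsu thresholding and labelled regions on a synthetic image; it is not the analysis pipeline of the cited work, and real grain-size measurement would additionally require grain-boundary segmentation.

```python
import numpy as np
from skimage.filters import threshold_otsu
from skimage.measure import label, regionprops

# Synthetic "SEM" image: bright sintered matrix with a few dark pores.
rng = np.random.default_rng(11)
img = 200 + 10 * rng.normal(size=(256, 256))
yy, xx = np.mgrid[0:256, 0:256]
for cy, cx, rad in [(60, 80, 8), (150, 40, 5), (200, 190, 10), (90, 210, 6)]:
    img[(yy - cy) ** 2 + (xx - cx) ** 2 < rad**2] = 60   # dark pores

# Segment the pores (dark phase) with Otsu's threshold and label them.
pores = img < threshold_otsu(img)
labels = label(pores)
props = regionprops(labels)

pore_fraction = 100.0 * pores.mean()
eq_diameters = [np.sqrt(4 * p.area / np.pi) for p in props]
print(f"pore fraction: {pore_fraction:.2f} %")
print("equivalent pore diameters (px):", np.round(eq_diameters, 1))
```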

  9. Using MATLAB software with Tomcat server and Java platform for remote image analysis in pathology.

    Science.gov (United States)

    Markiewicz, Tomasz

    2011-03-30

    Matlab is one of the most advanced development tools for applications in engineering practice. From our point of view, the most important part is the Image Processing Toolbox, offering many built-in functions, including mathematical morphology, and implementations of many artificial neural networks. It is a very popular platform for creating specialized programs for image analysis, also in pathology. Based on the latest version of the Matlab Builder Java toolbox, it is possible to create software serving as a remote system for image analysis in pathology via internet communication. The internet platform can be realized based on JavaServer Pages (JSP) with a Tomcat server as the servlet container. In the presented software implementation we propose remote image analysis realized by Matlab algorithms. These algorithms can be compiled to an executable jar file with the help of the Matlab Builder Java toolbox. The Matlab function must be declared with the set of input data, an output structure with numerical results and a Matlab web figure. Any function prepared in that manner can be used as a Java function in JSP. The graphical user interface providing the input data and displaying the results (also in graphical form) must be implemented in JSP. Additionally, data storage to a database can be implemented within the algorithm written in Matlab with the help of the Matlab Database Toolbox, directly alongside the image processing. The complete JSP page can be run by the Tomcat server. The proposed tool for remote image analysis was tested on the Computerized Analysis of Medical Images (CAMI) software developed by the author. The user provides the image and case information (diagnosis, staining, image parameters, etc.). When the analysis is initialized, the input data and image are sent to the servlet on Tomcat. When the analysis is done, the client obtains the graphical results as an image with marked recognized cells and also the quantitative output. Additionally, the results are stored in a server

  10. Analysis of renal nuclear medicine images

    International Nuclear Information System (INIS)

    Jose, R.M.J.

    2000-01-01

    Nuclear medicine imaging of the renal system involves producing time-sequential images showing the distribution of a radiopharmaceutical in the renal system. Producing numerical and graphical data from nuclear medicine studies requires defining regions of interest (ROIs) around various organs within the field of view, such as the left kidney, right kidney and bladder. Automating this process has several advantages: a saving of a clinician's time; enhanced objectivity and reproducibility. This thesis describes the design, implementation and assessment of an automatic ROI generation system. The performance of the system described in this work is assessed by comparing the results to those obtained using manual techniques. Since nuclear medicine images are inherently noisy, the sequence of images is reconstructed using the first few components of a principal components analysis in order to reduce the noise in the images. An image of the summed reconstructed sequence is then formed. This summed image is segmented by using an edge co-occurrence matrix as a feature space for simultaneously classifying regions and locating boundaries. Two methods for assigning the regions of a segmented image to organ class labels are assessed. The first method is based on using Dempster-Shafer theory to combine uncertain evidence from several sources into a single evidence; the second method makes use of a neural network classifier. The use of each technique in classifying the regions of a segmented image are assessed in separate experiments using 40 real patient-studies. A comparative assessment of the two techniques shows that the neural network produces more accurate region labels for the kidneys. The optimum neural system is determined experimentally. Results indicate that combining temporal and spatial information with a priori clinical knowledge produces reasonable ROIs. Consistency in the neural network assignment of regions is enhanced by taking account of the contextual
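
    The noise-reduction step described in this record, reconstructing the time-sequential images from the first few principal components, can be sketched as follows on a synthetic renogram-like sequence. The number of retained components and all data below are assumptions for illustration.

```python
import numpy as np

def pca_denoise_sequence(frames, n_components=3):
    """Reconstruct a noisy image sequence from its first few principal components."""
    n_frames = frames.shape[0]
    X = frames.reshape(n_frames, -1).astype(float)     # one row per time frame
    mean = X.mean(axis=0)
    U, s, Vt = np.linalg.svd(X - mean, full_matrices=False)
    X_rec = (U[:, :n_components] * s[:n_components]) @ Vt[:n_components] + mean
    return X_rec.reshape(frames.shape)

# Toy renogram-like sequence: two "kidneys" whose activity rises and falls.
t = np.linspace(0, 1, 60)
uptake = np.exp(-((t - 0.4) / 0.25) ** 2)                     # smooth time curve
y, x = np.mgrid[0:64, 0:64]
kidneys = (np.hypot(x - 20, y - 32) < 8) | (np.hypot(x - 44, y - 32) < 8)
clean = uptake[:, None, None] * kidneys[None, :, :] * 100.0
noisy = np.random.default_rng(2).poisson(clean + 5.0).astype(float)

denoised = pca_denoise_sequence(noisy, n_components=3)
rms_before = np.sqrt(np.mean((noisy - clean - 5.0) ** 2))
rms_after = np.sqrt(np.mean((denoised - clean - 5.0) ** 2))
print(f"RMS deviation from the clean sequence: {rms_before:.2f} -> {rms_after:.2f}")
```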

  11. Coupling image processing and stress analysis for damage identification in a human premolar tooth.

    Science.gov (United States)

    Andreaus, U; Colloca, M; Iacoviello, D

    2011-08-01

    Non-carious cervical lesions are characterized by the loss of dental hard tissue at the cement-enamel junction (CEJ). Excessive stresses are therefore generated in the cervical region of the tooth that cause disruption of the bonds between the hydroxyapatite crystals, leading to crack formation and eventual loss of enamel and the underlying dentine. Damage identification was performed by image analysis techniques and allowed changes in the teeth to be quantitatively assessed. A computerized two-step procedure was generated and applied to the first left maxillary human premolar. In the first step, dental images were digitally processed by a segmentation method in order to identify the damage. The considered morphological properties were the enamel thickness, the total area, and the number of fragments into which the enamel is chipped. The information retrieved by the data processing of the section images allowed the stress investigation to be oriented toward selected portions of the tooth. In the second step, a three-dimensional finite element model based on CT images of both the tooth and the periodontal ligament was employed to compare the changes occurring in the stress distributions in normal occlusion and malocclusion. The stress states were analyzed exclusively in the critical zones designated in the first step. The risk of failure at the CEJ and of crack initiation at the dentin-enamel junction was also estimated through the quantification of the first and third principal stresses, the von Mises stress, and the normal and tangential stresses. Copyright © 2010 Elsevier Ireland Ltd. All rights reserved.

  12. Rapid Analysis and Exploration of Fluorescence Microscopy Images

    OpenAIRE

    Pavie, Benjamin; Rajaram, Satwik; Ouyang, Austin; Altschuler, Jason; Steininger, Robert J; Wu, Lani; Altschuler, Steven

    2014-01-01

    Despite rapid advances in high-throughput microscopy, quantitative image-based assays still pose significant challenges. While a variety of specialized image analysis tools are available, most traditional image-analysis-based workflows have steep learning curves (for fine tuning of analysis parameters) and result in long turnaround times between imaging and analysis. In particular, cell segmentation, the process of identifying individual cells in an image, is a major bottleneck in this regard.

  13. Is 18F-fluorocholine-positron emission tomography/computerized tomography a new imaging tool for detecting hyperfunctioning parathyroid glands in primary or secondary hyperparathyroidism?

    Science.gov (United States)

    Michaud, Laure; Burgess, Alice; Huchet, Virginie; Lefèvre, Marine; Tassart, Marc; Ohnona, Jessica; Kerrou, Khaldoun; Balogova, Sona; Talbot, Jean-Noël; Périé, Sophie

    2014-12-01

    Preoperative ultrasonography and scintigraphy using (99m)Tc-sestamibi are commonly used to localize abnormal parathyroid glands. In cases of discrepant results between scintigraphy and ultrasonography, it is important to rely on another diagnostic imaging modality. (18)F-fluorodeoxyglucose (FDG) and (11)C-methionine positron emission tomography (PET) have been studied, but are imperfect for detecting abnormal parathyroid glands. Recently, first cases of abnormal parathyroid glands taking up radiolabelled choline were discovered incidentally in men referred to (11)C-choline or (18)F-fluorocholine (FCH)-PET/CT for prostate cancer. We checked if FCH uptake was a general feature of adenomatous or hyperplastic parathyroid glands. FCH-PET/CT was performed in 12 patients with primary (n = 8) or secondary hyperparathyroidism (1 dialyzed, 3 grafted) and with discordant or equivocal results on preoperative ultrasonography (US) and/or (123)I/(99m)Tc-sestamibi dual-phase scintigraphy. The results of the FCH-PET/CT were evaluated, with surgical exploration and histopathologic examination as the standard of truth. On a per-patient level, the detection rate of FCH-PET/CT (at least one FCH focus corresponding to an abnormal parathyroid gland in a given patient) was 11/12 = 92%. FCH-PET/CT detected 18 foci interpreted as parathyroid glands and correctly localized 17 abnormal parathyroid glands (7 adenomas and 10 hyperplasias). On a per-lesion level, FCH-PET/CT results were 17 true positives (TP) and 2 false negatives, i.e., a lesion-based sensitivity of 89%, and 1 false positive. As the main result of this pilot study, we show that in patients with hyperparathyroidism and with discordant or equivocal results on scintigraphy or on ultrasonography, adenomatous or hyperplastic parathyroid glands can be localized by FCH-PET/CT with good accuracy. Furthermore, FCH-PET/CT can solve discrepant results between preoperative ultrasonography and scintigraphy and has thus a potential as a functional imaging modality in

  14. Image analysis for ophthalmological diagnosis image processing of Corvis ST images using Matlab

    CERN Document Server

    Koprowski, Robert

    2016-01-01

    This monograph focuses on the use of analysis and processing methods for images from the Corvis® ST tonometer. The presented analysis is associated with the quantitative, repeatable and fully automatic evaluation of the response of the eye, eyeball and cornea to an air-puff. All the described algorithms were practically implemented in MATLAB®. The monograph also describes and provides the full source code designed to perform the discussed calculations. As a result, this monograph is intended for scientists, graduate students and students of computer science and bioengineering as well as doctors wishing to expand their knowledge of modern diagnostic methods assisted by various image analysis and processing methods.

  15. Interictal "patchy" regional cerebral blood flow patterns in migraine patients. A single photon emission computerized tomographic study

    DEFF Research Database (Denmark)

    Friberg, L; Olesen, J; Iversen, Helle Klingenberg

    1994-01-01

    In 92 migraine patients and 44 healthy control subjects we recorded regional cerebral blood flow (rCBF) with single photon emission computerized tomography and (133) Xe inhalation or with i.v. (99m) Tc-HMPAO. Migraine patients were studied interictally. A quantitated analysis of right-left asymmetries ... rCBF images is insufficient to pick up abnormalities; (2) almost 50% of the migraine sufferers had abnormal rCBF asymmetries. However, these are discrete compared with those typically seen during the aura phase of a migraine attack. One explanation of the patchy rCBF patterns might ...

  16. Image sequence analysis workstation for multipoint motion analysis

    Science.gov (United States)

    Mostafavi, Hassan

    1990-08-01

    This paper describes an application-specific engineering workstation designed and developed to analyze the motion of objects from video sequences. The system combines the software and hardware environment of a modern graphics-oriented workstation with digital image acquisition, processing and display techniques. In addition to automation and increased throughput of data reduction tasks, the objective of the system is to provide less invasive methods of measurement by offering the ability to track objects that are more complex than reflective markers. Grey-level image processing and spatial/temporal adaptation of the processing parameters are used for location and tracking of more complex features of objects under uncontrolled lighting and background conditions. The applications of such an automated and noninvasive measurement tool include analysis of the trajectory and attitude of rigid bodies such as human limbs, robots, aircraft in flight, etc. The system's key features are: 1) acquisition and storage of image sequences by digitizing and storing real-time video; 2) computer-controlled movie loop playback, freeze-frame display, and digital image enhancement; 3) multiple leading-edge tracking in addition to object centroids at up to 60 fields per second from either live input video or a stored image sequence; 4) model-based estimation and tracking of the six degrees of freedom of a rigid body; 5) field-of-view and spatial calibration; 6) image sequence and measurement data base management; and 7) offline analysis software for trajectory plotting and statistical analysis.
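
    Centroid tracking of a bright feature across a stored image sequence, one of the key features listed above, can be illustrated with a simple threshold-and-centroid loop on synthetic frames. The grey-level feature tracking and the 60 fields/s hardware path of the workstation are beyond this sketch.

```python
import numpy as np
from scipy import ndimage as ndi

def track_centroid(frames, threshold):
    """Centroid (row, col) of the bright object in every frame of a sequence."""
    return np.array([ndi.center_of_mass(frame > threshold) for frame in frames])

# Toy sequence: a bright marker moving along a parabolic path.
n_frames, h, w = 30, 120, 160
frames = np.zeros((n_frames, h, w))
for k in range(n_frames):
    cx = 10 + 4 * k
    cy = int(100 - 0.08 * (cx - 70) ** 2 / 4)
    frames[k, cy - 3:cy + 3, cx - 3:cx + 3] = 255.0

trajectory = track_centroid(frames, threshold=128)
print("first three centroids (row, col):")
print(np.round(trajectory[:3], 1))
```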

  17. Quantitative Analysis in Nuclear Medicine Imaging

    CERN Document Server

    2006-01-01

    This book provides a review of image analysis techniques as they are applied in the field of diagnostic and therapeutic nuclear medicine. Driven in part by the remarkable increase in computing power and its ready and inexpensive availability, this is a relatively new yet rapidly expanding field. Likewise, although the use of radionuclides for diagnosis and therapy has origins dating back almost to the discovery of natural radioactivity itself, radionuclide therapy and, in particular, targeted radionuclide therapy has only recently emerged as a promising approach for therapy of cancer and, to a lesser extent, other diseases. An effort has, therefore, been made to place the reviews provided in this book in a broader context. The effort to do this is reflected by the inclusion of introductory chapters that address basic principles of nuclear medicine imaging, followed by an overview of issues that are closely related to quantitative nuclear imaging and its potential role in diagnostic and therapeutic applications. ...

  18. Computerized tomography with X-rays: a tool in the analysis of physico-chemical interactions between rock formations and drilling fluids; Tomografia computadorizada com raios-X: uma ferramenta na analise das interacoes fisico-quimicas entre as formacoes rochosas e fluidos de perfuracao

    Energy Technology Data Exchange (ETDEWEB)

    Coelho, Marcus Vinicius Cavalcante

    1998-12-31

    This study demonstrates the applicability of X-ray computerized tomography to evaluating the degree of reactivity between various drilling fluids and argillaceous sediments (shales and sandstones). The research was conducted in the Rock-Fluid Interaction Pressure Simulator (RFIPS), where possible physico-chemical alterations can be observed through successive tomography images obtained during the flow of the fluid through the samples. In addition, the formation of mud cake in Berea Sandstone samples in the RFIPS was observed through X-ray computerized tomography when using drilling fluids weighted with barite. (author) 35 refs., 38 figs., 5 tabs.

  20. Multimodal Imaging Brain Connectivity Analysis (MIBCA) toolbox

    Directory of Open Access Journals (Sweden)

    Andre Santos Ribeiro

    2015-07-01

    Full Text Available Aim. In recent years, connectivity studies using neuroimaging data have increased the understanding of the organization of large-scale structural and functional brain networks. However, data analysis is time consuming, as rigorous procedures must be assured, from structuring data and pre-processing to modality-specific data procedures. Until now, no single toolbox was able to perform such investigations on truly multimodal image data from beginning to end, including the combination of different connectivity analyses. Thus, we have developed the Multimodal Imaging Brain Connectivity Analysis (MIBCA) toolbox with the goal of reducing the time spent on data processing and of allowing an innovative and comprehensive approach to brain connectivity. Materials and Methods. The MIBCA toolbox is a fully automated all-in-one connectivity toolbox that offers pre-processing, connectivity and graph-theoretical analyses of multimodal image data such as diffusion-weighted imaging, functional magnetic resonance imaging (fMRI) and positron emission tomography (PET). It was developed in the MATLAB environment and pipelines well-known neuroimaging software such as FreeSurfer, SPM, FSL, and Diffusion Toolkit. It further implements routines for the construction of structural, functional and effective or combined connectivity matrices, as well as routines for the extraction and calculation of imaging and graph-theory metrics, the latter also using functions from the Brain Connectivity Toolbox. Finally, the toolbox performs group statistical analysis and enables data visualization in the form of matrices, 3D brain graphs and connectograms. In this paper the MIBCA toolbox is presented by illustrating its capabilities using multimodal image data from a group of 35 healthy subjects (19–73 years old) with volumetric T1-weighted, diffusion tensor imaging, and resting-state fMRI data, and 10 subjects with 18F-Altanserin PET data. Results. It was observed both a high inter...
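
    As a rough illustration of the kind of output such a pipeline produces, the sketch below builds a functional connectivity matrix from ROI time series and derives a simple graph metric. The ROI count, the threshold value and the use of Pearson correlation are illustrative assumptions, not details of the MIBCA implementation.

```python
import numpy as np

def connectivity_and_degree(roi_timeseries, threshold=0.3):
    """roi_timeseries: array (n_timepoints, n_rois) of fMRI signals per region.

    Returns the Pearson correlation matrix and the node degree after
    thresholding. The 0.3 threshold is an arbitrary illustrative choice.
    """
    corr = np.corrcoef(roi_timeseries.T)          # (n_rois, n_rois) connectivity
    np.fill_diagonal(corr, 0.0)                   # ignore self-connections
    adjacency = (np.abs(corr) > threshold).astype(int)
    degree = adjacency.sum(axis=1)                # graph-theory metric: node degree
    return corr, degree

# Example with synthetic data: 200 time points, 10 regions.
rng = np.random.default_rng(0)
ts = rng.standard_normal((200, 10))
ts[:, 1] = ts[:, 0] + 0.1 * rng.standard_normal(200)   # two correlated regions
corr, degree = connectivity_and_degree(ts)
print(degree)
```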

  1. Transformations and algorithms in a computerized brain atlas

    International Nuclear Information System (INIS)

    Thurfjell, L.; Bohm, C.; Eriksson, L.; Karolinska Institute/Hospital, Stockholm

    1993-01-01

    The computerized brain atlas constructed at the Karolinska Hospital, Stockholm, Sweden, has been further developed. This atlas was designed to be employed in different fields of neuroimaging such as positron emission tomography (PET), single photon emission tomography (SPECT), computerized tomography (CT) and magnetic resonance imaging (MR). The main objectives of the atlas are to aid the interpretation of functional images by introducing anatomical information, to serve as a tool for merging data from different imaging modalities, and to facilitate comparisons of data from different individuals by allowing anatomical standardization of individual data. The purpose of this paper is to describe the algorithms and transformations used in the implementation of the atlas software.
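
    Anatomical standardization of the kind mentioned above is commonly expressed as a spatial transformation applied to image coordinates. The sketch below applies a single affine transform to landmark coordinates; the matrix values are illustrative assumptions and not the atlas transformations described in the paper.

```python
import numpy as np

def apply_affine(points, matrix, translation):
    """Map (N, 3) subject-space coordinates into atlas space.

    points: N x 3 array of (x, y, z) positions in millimetres.
    matrix: 3 x 3 linear part (rotation, scaling, shear).
    translation: length-3 offset added after the linear part.
    """
    return points @ matrix.T + translation

# Illustrative transform: 10% isotropic scaling plus a small translation.
scale = np.diag([1.1, 1.1, 1.1])
offset = np.array([2.0, -1.5, 0.5])
landmarks = np.array([[10.0, 20.0, 30.0],
                      [-5.0, 12.0, 8.0]])
print(apply_affine(landmarks, scale, offset))
```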

  2. Semiautomatic digital imaging system for cytogenetic analysis

    International Nuclear Information System (INIS)

    Chaubey, R.C.; Chauhan, P.C.; Bannur, S.V.; Kulgod, S.V.; Chadda, V.K.; Nigam, R.K.

    1999-08-01

    The paper describes a digital image processing system, developed indigenously at BARC, for size measurement of microscopic biological objects such as the cell, nucleus and micronucleus in mouse bone marrow; cytochalasin-B blocked human lymphocytes in vitro; and numerical counting and karyotyping of metaphase chromosomes of human lymphocytes. Errors in karyotyping of chromosomes by the imaging system may creep in due to the lack of a well-defined centromere position or extensive bending of chromosomes, which may result from poor preparation quality. Good metaphase preparations are mandatory for precise and accurate analysis by the system. Additional morphological parameters for each chromosome have to be incorporated to improve the accuracy of karyotyping. Although the experienced cytogeneticist remains the final judge, the system assists him/her in carrying out the analysis much faster than manual scoring. Further experimental studies are in progress to validate the different software packages developed for various cytogenetic applications. (author)

  3. Morphometric image analysis of giant vesicles

    DEFF Research Database (Denmark)

    Husen, Peter Rasmussen; Arriaga, Laura; Monroy, Francisco

    2012-01-01

    We have developed a strategy to determine lengths and orientations of tie lines in the coexistence region of liquid-ordered and liquid-disordered phases of cholesterol-containing ternary lipid mixtures. The method combines confocal-fluorescence-microscopy image stacks of giant unilamellar vesicles (GUVs), a dedicated 3D-image analysis, and a quantitative analysis based on equilibrium thermodynamic considerations. This approach was tested in GUVs composed of 1,2-dioleoyl-sn-glycero-3-phosphocholine/1,2-palmitoyl-sn-glycero-3-phosphocholine/cholesterol. In general, our results show a reasonable agreement with previously reported data obtained by other methods. For example, our computed tie lines were found to be nonhorizontal, indicating a difference in cholesterol content in the coexisting phases. This new, to our knowledge, analytical strategy offers a way to further exploit fluorescence...

  4. Image Analysis for Nail-fold Capillaroscopy

    OpenAIRE

    Vucic, Vladimir

    2015-01-01

    Detection of diseases at an early stage is very important since it can make the treatment of patients easier, safer and more efficient. For the detection of rheumatic diseases, and even prediction of tendencies towards such diseases, capillaroscopy is becoming an increasingly recognized method. Nail-fold capillaroscopy is a non-invasive imaging technique used for the analysis of microcirculation abnormalities that may lead to diseases like systemic sclerosis, Raynaud's phenomenon and others. ...

  5. A novel computerized surgeon-machine interface for robot-assisted laser phonomicrosurgery.

    Science.gov (United States)

    Mattos, Leonardo S; Deshpande, Nikhil; Barresi, Giacinto; Guastini, Luca; Peretti, Giorgio

    2014-08-01

    To introduce a novel computerized surgical system for improved usability, intuitiveness, accuracy, and controllability in robot-assisted laser phonomicrosurgery. Pilot technology assessment. The novel system was developed involving a newly designed motorized laser micromanipulator, a touch-screen display, and a graphics stylus. The system allows the control of a CO2 laser through interaction between the stylus and the live video of the surgical area. This empowers the stylus with the ability to have actual effect on the surgical site. Surgical enhancements afforded by this system were established through a pilot technology assessment using randomized trials comparing its performance with a state-of-the-art laser microsurgery system. Resident surgeons and medical students were chosen as subjects in performing sets of trajectory-following exercises. Image processing-based techniques were used for an objective performance assessment. A System Usability Scale-based questionnaire was used for the qualitative assessment. The computerized interface demonstrated superiority in usability, accuracy, and controllability over the state-of-the-art system. Significant ease of use and learning experienced by the subjects were demonstrated by the usability score assigned to the two compared interfaces: computerized interface = 83.96% versus state-of-the-art = 68.02%. The objective analysis showed a significant enhancement in accuracy and controllability: computerized interface = 90.02% versus state-of-the-art = 75.59%. The novel system significantly enhances the accuracy, usability, and controllability in laser phonomicrosurgery. The design provides an opportunity to improve the ergonomics and safety of current surgical setups. © 2014 The American Laryngological, Rhinological and Otological Society, Inc.

  6. Image analysis and modeling in medical image computing. Recent developments and advances.

    Science.gov (United States)

    Handels, H; Deserno, T M; Meinzer, H-P; Tolxdorff, T

    2012-01-01

    Medical image computing is of growing importance in medical diagnostics and image-guided therapy. Nowadays, image analysis systems integrating advanced image computing methods are used in practice e.g. to extract quantitative image parameters or to support the surgeon during a navigated intervention. However, the grade of automation, accuracy, reproducibility and robustness of medical image computing methods has to be increased to meet the requirements in clinical routine. In the focus theme, recent developments and advances in the field of modeling and model-based image analysis are described. The introduction of models in the image analysis process enables improvements of image analysis algorithms in terms of automation, accuracy, reproducibility and robustness. Furthermore, model-based image computing techniques open up new perspectives for prediction of organ changes and risk analysis of patients. Selected contributions are assembled to present latest advances in the field. The authors were invited to present their recent work and results based on their outstanding contributions to the Conference on Medical Image Computing BVM 2011 held at the University of Lübeck, Germany. All manuscripts had to pass a comprehensive peer review. Modeling approaches and model-based image analysis methods showing new trends and perspectives in model-based medical image computing are described. Complex models are used in different medical applications and medical images like radiographic images, dual-energy CT images, MR images, diffusion tensor images as well as microscopic images are analyzed. The applications emphasize the high potential and the wide application range of these methods. The use of model-based image analysis methods can improve segmentation quality as well as the accuracy and reproducibility of quantitative image analysis. Furthermore, image-based models enable new insights and can lead to a deeper understanding of complex dynamic mechanisms in the human body

  7. The Digital Image Processing And Quantitative Analysis In Microscopic Image Characterization

    International Nuclear Information System (INIS)

    Ardisasmita, M. Syamsa

    2000-01-01

    Although many electron microscopes produce digital images, not all of them are equipped with a supporting unit to process and analyse image data quantitatively. Generally the analysis of the image has to be made visually and measurements are made manually. The development of mathematical methods for geometric analysis and pattern recognition allows automatic microscopic image analysis with a computer. Image processing programs can be used for the analysis of periodic image texture and structure through the application of the Fourier transform. Because of the development of composite materials, Fourier analysis in the frequency domain has become important for measuring crystallographic orientation. Periodic structure analysis and crystal orientation are the key to understanding many material properties such as mechanical strength, stress, heat conductivity, resistance, capacitance and other electric and magnetic properties. This paper presents the application of digital image processing to microscopic image characterization and analysis.
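
    To illustrate the Fourier-domain analysis of periodic structure mentioned above, the sketch below locates the dominant spatial frequency and its orientation in a micrograph. The synthetic test image and the choice of the single strongest spectral peak are illustrative assumptions.

```python
import numpy as np

def dominant_orientation(image):
    """Return (spatial_frequency, orientation_deg) of the strongest periodic component.

    image: 2D grey-level array. The DC term is suppressed so that only the
    periodic structure contributes to the peak search. The conjugate peak
    introduces a 180-degree ambiguity in the orientation, which is acceptable
    for a lattice direction.
    """
    spectrum = np.abs(np.fft.fftshift(np.fft.fft2(image)))
    cy, cx = np.array(spectrum.shape) // 2
    spectrum[cy, cx] = 0.0                         # remove the DC component
    py, px = np.unravel_index(np.argmax(spectrum), spectrum.shape)
    fy, fx = (py - cy) / image.shape[0], (px - cx) / image.shape[1]
    frequency = np.hypot(fx, fy)                   # cycles per pixel
    orientation = np.degrees(np.arctan2(fy, fx))   # direction of the lattice vector
    return frequency, orientation

# Synthetic micrograph: stripes with a period of 8 pixels, tilted 30 degrees.
y, x = np.mgrid[0:256, 0:256]
angle = np.radians(30)
image = np.sin(2 * np.pi * (x * np.cos(angle) + y * np.sin(angle)) / 8.0)
print(dominant_orientation(image))
```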

  8. Computerized radiology instruction

    International Nuclear Information System (INIS)

    Goldberg, M.E.; Drake, D.G.; Day, D.L.

    1987-01-01

    The storage and display capabilities of a teleradiology system are used in an educational setting to allow the student (radiologist, resident, or medical student) to interactively respond to questions regarding a series of displayed images (radiographs, CT and US scans, and so forth). The computer prompts the respondent for the correct response, and both correct and incorrect responses are made known to the user. The immediate feedback to the user is a key to the student's acceptance of this form of teaching

  9. Analysis of fetal movements by Doppler actocardiogram and fetal B-mode imaging.

    Science.gov (United States)

    Maeda, K; Tatsumura, M; Utsu, M

    1999-12-01

    We have shown that fetal surveillance may be enhanced by use of the fetal actocardiogram, by computerized processing of fetal motion, and by fetal B-mode ultrasound imaging. The ultrasonic Doppler fetal actogram is a sensitive and objective method for detecting and recording fetal movements. Computer processing of the actograph output signals enables powerful, detailed, and convenient analysis of fetal physiologic phenomena. The actocardiogram is a useful measurement tool not only in fetal behavioral studies but also in the evaluation of fetal well-being. It reduces false-positive nonreactive NSTs and false-positive sinusoidal FHR patterns, and it is a valuable tool for predicting fetal distress. The results of intrapartum fetal monitoring are further improved by the antepartum application of the actocardiogram. Quantified fetal motion analysis is a useful, objective evaluation of the embryo and fetus. This method allows monitoring of changes in fetal movement, including its frequency, amplitude, and duration. Furthermore, quantification of fetal motion enables evaluation of fetal behavior states and how these states relate to other measurements, such as changes in FHR. Numeric analysis of both the fetal actogram and fetal motion from B-mode images is a promising approach for correlating fetal activity or behavior with other fetal physiologic measurements.

  10. Automatic image processing as a means of safeguarding nuclear material

    International Nuclear Information System (INIS)

    Kahnmeyer, W.; Willuhn, K.; Uebel, W.

    1985-01-01

    Problems involved in computerized analysis of pictures taken by automatic film or video cameras in the context of international safeguards implementation are described. They include technical ones as well as the need to establish objective criteria for assessing image information. In the near future automatic image processing systems will be useful in verifying the identity and integrity of IAEA seals. (author)

  11. Computerized Proof Techniques for Undergraduates

    Science.gov (United States)

    Smith, Christopher J.; Tefera, Akalu; Zeleke, Aklilu

    2012-01-01

    The use of computer algebra systems such as Maple and Mathematica is becoming increasingly important and widespread in mathematics learning, teaching and research. In this article, we present computerized proof techniques of Gosper, Wilf-Zeilberger and Zeilberger that can be used for enhancing the teaching and learning of topics in discrete…

  12. Computerized adaptive testing in computer assisted learning?

    NARCIS (Netherlands)

    Veldkamp, Bernard P.; Matteucci, Mariagiulia; Eggen, Theodorus Johannes Hendrikus Maria; De Wannemacker, Stefan; Clarebout, Geraldine; De Causmaecker, Patrick

    2011-01-01

    A major goal in computerized learning systems is to optimize learning, while in computerized adaptive tests (CAT) efficient measurement of the proficiency of students is the main focus. There seems to be a common interest to integrate computerized adaptive item selection in learning systems and

  13. Automatic dirt trail analysis in dermoscopy images.

    Science.gov (United States)

    Cheng, Beibei; Joe Stanley, R; Stoecker, William V; Osterwise, Christopher T P; Stricklin, Sherea M; Hinton, Kristen A; Moss, Randy H; Oliviero, Margaret; Rabinovitz, Harold S

    2013-02-01

    Basal cell carcinoma (BCC) is the most common cancer in the US. Dermatoscopes are devices used by physicians to facilitate the early detection of these cancers based on the identification of skin lesion structures often specific to BCCs. One new lesion structure, referred to as dirt trails, has the appearance of dark gray, brown or black dots and clods of varying sizes distributed in elongated clusters with indistinct borders, often appearing as curvilinear trails. In this research, we explore a dirt trail detection and analysis algorithm for extracting, measuring, and characterizing dirt trails based on size, distribution, and color in dermoscopic skin lesion images. These dirt trails are then used to automatically discriminate BCC from benign skin lesions. For an experimental data set of 35 BCC images with dirt trails and 79 benign lesion images, a neural network-based classifier achieved a 0.902 area under a receiver operating characteristic curve using a leave-one-out approach. Results obtained from this study show that automatic detection of dirt trails in dermoscopic images of BCC is feasible. This is important because of the large number of these skin cancers seen every year and the challenge of discovering them earlier with instrumentation. © 2011 John Wiley & Sons A/S.
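
    The leave-one-out evaluation reported above can be sketched as follows. The feature matrix, network size and use of scikit-learn are assumptions made here for illustration; the paper's actual dirt-trail features and classifier configuration are not reproduced.

```python
import numpy as np
from sklearn.model_selection import LeaveOneOut
from sklearn.neural_network import MLPClassifier
from sklearn.metrics import roc_auc_score

def leave_one_out_auc(features, labels):
    """Train on all-but-one lesion, score the held-out one, then compute AUC."""
    scores = np.zeros(len(labels), dtype=float)
    for train_idx, test_idx in LeaveOneOut().split(features):
        clf = MLPClassifier(hidden_layer_sizes=(10,), max_iter=2000, random_state=0)
        clf.fit(features[train_idx], labels[train_idx])
        scores[test_idx] = clf.predict_proba(features[test_idx])[:, 1]
    return roc_auc_score(labels, scores)

# Synthetic stand-in for dirt-trail size/distribution/color features:
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 1, (35, 5)), rng.normal(1, 1, (79, 5))])
y = np.array([1] * 35 + [0] * 79)   # 1 = BCC with dirt trails, 0 = benign
print(round(leave_one_out_auc(X, y), 3))
```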

  14. A prototype of a computerized patient record.

    Science.gov (United States)

    Adelhard, K; Eckel, R; Hölzel, D; Tretter, W

    1995-01-01

    Computerized medical record systems (CPRS) should present user- and problem-oriented views of the patient file. Problem lists, clinical course, medication profiles and results of examinations have to be recorded in a computerized patient record. Patient review screens should give a synopsis of the patient data to inform the user whenever the patient record is opened. Several different types of data have to be stored in a patient record; qualitative and quantitative measurements, narratives and images are examples. Therefore, a CPR must also be able to handle these different data types. New methods and concepts appear frequently in medicine, so a CPRS must be flexible enough to cope with coming demands. We developed a prototype of a computer-based patient record with a graphical user interface on a SUN workstation. The basis of the system is a dynamic data dictionary, an interpreter language and a large set of basic functions. This approach gives optimal flexibility to the system. Many different data types are already supported, and extensions are easily possible. There is also almost no limit on the number of medical concepts that can be handled by our prototype. Several applications were built on this platform; some of them are presented to exemplify the patient- and problem-oriented handling of the CPR.

  15. Remote Sensing Digital Image Analysis An Introduction

    CERN Document Server

    Richards, John A

    2013-01-01

    Remote Sensing Digital Image Analysis provides the non-specialist with a treatment of the quantitative analysis of satellite and aircraft derived remotely sensed data. Since the first edition of the book there have been significant developments in the algorithms used for the processing and analysis of remote sensing imagery; nevertheless many of the fundamentals have substantially remained the same.  This new edition presents material that has retained value since those early days, along with new techniques that can be incorporated into an operational framework for the analysis of remote sensing data. The book is designed as a teaching text for the senior undergraduate and postgraduate student, and as a fundamental treatment for those engaged in research using digital image processing in remote sensing.  The presentation level is for the mathematical non-specialist.  Since the very great number of operational users of remote sensing come from the earth sciences communities, the text is pitched at a leve...

  16. [Imaging Mass Spectrometry in Histopathologic Analysis].

    Science.gov (United States)

    Yamazaki, Fumiyoshi; Seto, Mitsutoshi

    2015-04-01

    Matrix-assisted laser desorption/ionization (MALDI)-imaging mass spectrometry (IMS) enables visualization of the distribution of a range of biomolecules by integrating biochemical information from mass spectrometry with positional information from microscopy. IMS identifies a target molecule. In addition, IMS enables global analysis of biomolecules containing unknown molecules by detecting the ratio of the molecular weight to electric charge without any target, which makes it possible to identify novel molecules. IMS generates data on the distribution of lipids and small molecules in tissues, which is difficult to visualize with either conventional counter-staining or immunohistochemistry. In this review, we firstly introduce the principle of imaging mass spectrometry and recent advances in the sample preparation method. Secondly, we present findings regarding biological samples, especially pathological ones. Finally, we discuss the limitations and problems of the IMS technique and clinical application, such as in drug development.

  17. Machine Learning Interface for Medical Image Analysis.

    Science.gov (United States)

    Zhang, Yi C; Kagen, Alexander C

    2017-10-01

    TensorFlow is a second-generation open-source machine learning software library with a built-in framework for implementing neural networks in wide variety of perceptual tasks. Although TensorFlow usage is well established with computer vision datasets, the TensorFlow interface with DICOM formats for medical imaging remains to be established. Our goal is to extend the TensorFlow API to accept raw DICOM images as input; 1513 DaTscan DICOM images were obtained from the Parkinson's Progression Markers Initiative (PPMI) database. DICOM pixel intensities were extracted and shaped into tensors, or n-dimensional arrays, to populate the training, validation, and test input datasets for machine learning. A simple neural network was constructed in TensorFlow to classify images into normal or Parkinson's disease groups. Training was executed over 1000 iterations for each cross-validation set. The gradient descent optimization and Adagrad optimization algorithms were used to minimize cross-entropy between the predicted and ground-truth labels. Cross-validation was performed ten times to produce a mean accuracy of 0.938 ± 0.047 (95 % CI 0.908-0.967). The mean sensitivity was 0.974 ± 0.043 (95 % CI 0.947-1.00) and mean specificity was 0.822 ± 0.207 (95 % CI 0.694-0.950). We extended the TensorFlow API to enable DICOM compatibility in the context of DaTscan image analysis. We implemented a neural network classifier that produces diagnostic accuracies on par with excellent results from previous machine learning models. These results indicate the potential role of TensorFlow as a useful adjunct diagnostic tool in the clinical setting.
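
    A minimal sketch of the pipeline described above, reading DICOM pixel data and feeding it to a small neural network classifier, is shown below. It assumes pydicom and TensorFlow/Keras are available, assumes single-frame 2D DICOM images, and uses an illustrative layer layout; it is not the authors' PPMI model.

```python
import numpy as np
import pydicom
import tensorflow as tf

def load_dicom_as_array(path, size=(64, 64)):
    """Read one single-frame DICOM file; return a normalized, resized float array."""
    pixels = pydicom.dcmread(path).pixel_array.astype("float32")
    pixels = (pixels - pixels.min()) / (pixels.max() - pixels.min() + 1e-8)
    return tf.image.resize(pixels[..., np.newaxis], size).numpy()

def build_classifier(input_shape=(64, 64, 1)):
    """Small dense network for a two-class (normal vs Parkinson's) decision."""
    model = tf.keras.Sequential([
        tf.keras.layers.Flatten(input_shape=input_shape),
        tf.keras.layers.Dense(128, activation="relu"),
        tf.keras.layers.Dense(2, activation="softmax"),
    ])
    model.compile(optimizer=tf.keras.optimizers.Adagrad(learning_rate=0.01),
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model

# Hypothetical usage (file paths and labels are placeholders):
# x = np.stack([load_dicom_as_array(p) for p in dicom_paths])
# model = build_classifier()
# model.fit(x, labels, epochs=10, validation_split=0.2)
```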

  18. Phase Image Analysis in Conduction Disturbance Patients

    International Nuclear Information System (INIS)

    Kwark, Byeng Su; Choi, Si Wan; Kang, Seung Sik; Park, Ki Nam; Lee, Kang Wook; Jeon, Eun Seok; Park, Chong Hun

    1994-01-01

    It is known that the normal His-Purkinje system provides for nearly synchronous activation of the right (RV) and left (LV) ventricles. When His-Purkinje conduction is abnormal, the resulting sequence of ventricular contraction must be correspondingly abnormal. These abnormal mechanical consequences were difficult to demonstrate because of the complexity and rapidity of the events. To determine the relationship between phase changes and abnormalities of ventricular conduction, we performed phase image analysis of Tc-RBC gated blood pool scintigrams in patients with intraventricular conduction disturbances (24 complete left bundle branch block (C-LBBB), 15 complete right bundle branch block (C-RBBB), 13 Wolff-Parkinson-White syndrome (WPW), 10 controls). The results were as follows: 1) The ejection fraction (EF), peak ejection rate (PER), and peak filling rate (PFR) of the LV in gated blood pool scintigraphy (GBPS) were significantly lower in patients with C-LBBB than in controls (44.4 ± 13.9% vs 69.9 ± 4.2%, 2.48 ± 0.98 vs 3.51 ± 0.62, and 1.76 ± 0.71 vs 3.38 ± 0.92, respectively, p<0.05). 2) In the phase angle analysis of the LV, the standard deviation (SD), full width at half maximum of the phase angle (FWHM), and range of the phase angle were significantly increased in patients with C-LBBB compared with controls (20.6 ± 18.1 vs 8.6 ± 1.8, 22.5 ± 9.2 vs 16.0 ± 3.9, and 95.7 ± 31.7 vs 51.3 ± 5.4, respectively, p<0.05). 3) There was no significant difference in EF, PER, or PFR between patients with the Wolff-Parkinson-White syndrome and controls. 4) Standard deviation and range of the phase angle were significantly higher in patients with WPW syndrome than in controls (10.6 ± 2.6 vs 8.6 ± 1.8, p<0.05; 69.8 ± 11.7 vs 51.3 ± 5.4, p<0.001, respectively); however, there was no difference between the two groups in full width at half maximum. 5) Phase image analysis revealed a relatively uniform phase across both ventricles in patients with normal conduction, but a markedly delayed phase in the left ventricle

  19. Phase Image Analysis in Conduction Disturbance Patients

    Energy Technology Data Exchange (ETDEWEB)

    Kwark, Byeng Su; Choi, Si Wan; Kang, Seung Sik; Park, Ki Nam; Lee, Kang Wook; Jeon, Eun Seok; Park, Chong Hun [Chung Nam University Hospital, Daejeon (Korea, Republic of)

    1994-03-15

    It is known that the normal His-Purkinje system provides for nearly synchronous activation of the right (RV) and left (LV) ventricles. When His-Purkinje conduction is abnormal, the resulting sequence of ventricular contraction must be correspondingly abnormal. These abnormal mechanical consequences were difficult to demonstrate because of the complexity and rapidity of the events. To determine the relationship between phase changes and abnormalities of ventricular conduction, we performed phase image analysis of Tc-RBC gated blood pool scintigrams in patients with intraventricular conduction disturbances (24 complete left bundle branch block (C-LBBB), 15 complete right bundle branch block (C-RBBB), 13 Wolff-Parkinson-White syndrome (WPW), 10 controls). The results were as follows: 1) The ejection fraction (EF), peak ejection rate (PER), and peak filling rate (PFR) of the LV in gated blood pool scintigraphy (GBPS) were significantly lower in patients with C-LBBB than in controls (44.4 ± 13.9% vs 69.9 ± 4.2%, 2.48 ± 0.98 vs 3.51 ± 0.62, and 1.76 ± 0.71 vs 3.38 ± 0.92, respectively, p<0.05). 2) In the phase angle analysis of the LV, the standard deviation (SD), full width at half maximum of the phase angle (FWHM), and range of the phase angle were significantly increased in patients with C-LBBB compared with controls (20.6 ± 18.1 vs 8.6 ± 1.8, 22.5 ± 9.2 vs 16.0 ± 3.9, and 95.7 ± 31.7 vs 51.3 ± 5.4, respectively, p<0.05). 3) There was no significant difference in EF, PER, or PFR between patients with the Wolff-Parkinson-White syndrome and controls. 4) Standard deviation and range of the phase angle were significantly higher in patients with WPW syndrome than in controls (10.6 ± 2.6 vs 8.6 ± 1.8, p<0.05; 69.8 ± 11.7 vs 51.3 ± 5.4, p<0.001, respectively); however, there was no difference between the two groups in full width at half maximum. 5) Phase image analysis revealed a relatively uniform phase across both ventricles in patients with normal conduction, but a markedly delayed phase in the left ventricle
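
    Phase images of this kind are conventionally derived from the first Fourier harmonic of each pixel's time-activity curve over the cardiac cycle. The sketch below shows that computation and the summary statistics (SD and range) discussed above; the synthetic data, frame count and rectangular ROI are illustrative assumptions.

```python
import numpy as np

def phase_image(frames):
    """frames: array (n_frames, h, w) of gated blood pool counts over one cycle.

    Returns per-pixel phase angles in degrees, taken from the first Fourier
    harmonic of each pixel's time-activity curve.
    """
    harmonic = np.fft.fft(frames, axis=0)[1]          # first harmonic per pixel
    return np.degrees(np.angle(harmonic))

def phase_statistics(phase, mask):
    """Standard deviation and range of the phase angle inside a ventricular ROI."""
    values = phase[mask]
    return values.std(), values.max() - values.min()

# Synthetic example: 16 frames, 32x32 image, sinusoidal emptying/filling with
# a 40-degree phase delay over the right half of the image.
t = np.arange(16) / 16.0
delay = np.zeros((32, 32))
delay[:, 16:] = np.radians(40)
frames = 100 + 20 * np.cos(2 * np.pi * t[:, None, None] - delay[None])
phase = phase_image(frames)
roi = np.ones((32, 32), dtype=bool)
print(phase_statistics(phase, roi))
```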

  20. INTEGRATION PECULIARITIES OF COMPUTERIZED MEANS OF EDUCATION INTO THE PROCESS OF TEACHER TRAINING AT PEDAGOGICAL COLLEGES

    Directory of Open Access Journals (Sweden)

    Olga M. Naumenko

    2010-08-01

    Full Text Available Important problems of using computerized means of education in the process of teacher training at pedagogical colleges are considered. On the basis of an analysis of the organisation of the educational process in different pedagogical colleges, the general principles for constructing the educational module "Methodology of computerized means of education in the educational process" are outlined.

  1. A methodology for automated CPA extraction using liver biopsy image analysis and machine learning techniques.

    Science.gov (United States)

    Tsipouras, Markos G; Giannakeas, Nikolaos; Tzallas, Alexandros T; Tsianou, Zoe E; Manousou, Pinelopi; Hall, Andrew; Tsoulos, Ioannis; Tsianos, Epameinondas

    2017-03-01

    Collagen proportional area (CPA) extraction in liver biopsy images provides the degree of fibrosis expansion in liver tissue, which is the most characteristic histological alteration in hepatitis C virus (HCV). Assessment of the fibrotic tissue is currently based on semiquantitative staging scores such as Ishak and Metavir. Since its introduction as a fibrotic tissue assessment technique, CPA calculation based on image analysis techniques has proven to be more accurate than semiquantitative scores. However, CPA has yet to reach everyday clinical practice, since the lack of standardized and robust methods for computerized image analysis for CPA assessment have proven to be a major limitation. The current work introduces a three-stage fully automated methodology for CPA extraction based on machine learning techniques. Specifically, clustering algorithms have been employed for background-tissue separation, as well as for fibrosis detection in liver tissue regions, in the first and the third stage of the methodology, respectively. Due to the existence of several types of tissue regions in the image (such as blood clots, muscle tissue, structural collagen, etc.), classification algorithms have been employed to identify liver tissue regions and exclude all other non-liver tissue regions from CPA computation. For the evaluation of the methodology, 79 liver biopsy images have been employed, obtaining 1.31% mean absolute CPA error, with 0.923 concordance correlation coefficient. The proposed methodology is designed to (i) avoid manual threshold-based and region selection processes, widely used in similar approaches presented in the literature, and (ii) minimize CPA calculation time. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
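
    A simplified sketch of CPA computation along the lines described above is given below: k-means clustering separates background from tissue and stained from non-stained tissue, and CPA is the collagen fraction of the tissue area. The cluster counts, the colour-based cluster assignment and the omission of the paper's classification stage for non-liver regions are simplifying assumptions.

```python
import numpy as np
from sklearn.cluster import KMeans

def collagen_proportional_area(rgb_image):
    """Estimate CPA (%) from an RGB biopsy image.

    Stage 1: cluster pixels into background vs tissue (brightest cluster = background).
    Stage 2: within tissue, cluster again and take the cluster closest to an
    assumed stain colour as collagen. Both heuristics are illustrative only.
    """
    pixels = rgb_image.reshape(-1, 3).astype(float)

    bg_model = KMeans(n_clusters=2, n_init=10, random_state=0).fit(pixels)
    background = np.argmax(bg_model.cluster_centers_.mean(axis=1))   # brightest centre
    tissue_pixels = pixels[bg_model.labels_ != background]

    fib_model = KMeans(n_clusters=2, n_init=10, random_state=0).fit(tissue_pixels)
    stain_reference = np.array([60.0, 90.0, 160.0])   # assumed blue-ish collagen stain
    distances = np.linalg.norm(fib_model.cluster_centers_ - stain_reference, axis=1)
    collagen = np.argmin(distances)

    return 100.0 * np.mean(fib_model.labels_ == collagen)

# Hypothetical usage:
# cpa = collagen_proportional_area(imageio.imread("biopsy.png"))
```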

  2. Análise Computadorizada da Cardiotocografia Anteparto em Gestações de Alto Risco Computerized Antepartum Cardiotocography Analysis in High Risk Pregnancies

    Directory of Open Access Journals (Sweden)

    Roseli Mieko Yamamoto Nomura

    2002-01-01

    Full Text Available Objective: to analyze the frequency of the results of computerized cardiotocography performed in high-risk pregnancies and to relate the criteria proposed by the system to perinatal outcomes. Methods: 233 high-risk pregnant women were studied prospectively, undergoing a total of 485 computerized cardiotocography examinations. Cases of fetal anomalies and examinations with signal loss over 20% (the proportion of 3.75-millisecond epochs of the tracing in which no valid pulse interval was detected because the fetal heart beat signal was lost) were excluded. To study the association between cardiotocography and perinatal outcomes, the last examination performed in the week before delivery was analyzed (71 cases), excluding cases with absent or reversed end-diastolic velocities on Doppler velocimetry of the umbilical arteries. Results: after the exclusion of 33 examinations with signal loss over 20%, 404 cardiotocography examinations (83.3%) were classified as normal. Regarding examination duration, 62.1% lasted up to 20 minutes and 79.0% up to 30 minutes. Analysis of the correlations with perinatal outcomes showed a significant association (p...

  3. Analysis of Baseline Computerized Neurocognitive Testing Results among 5–11-Year-Old Male and Female Children Playing Sports in Recreational Leagues in Florida

    Directory of Open Access Journals (Sweden)

    Karen D. Liller

    2017-09-01

    Full Text Available There is a paucity of data related to sports injuries, concussions, and computerized neurocognitive testing (CNT) among very young athletes playing sports in recreational settings. The purpose of this study was to report baseline CNT results among male and female children, ages 5–11, playing sports in Hillsborough County, Florida, using ImPACT Pediatric, which is specifically designed for this population. Data were collected from 2016 to 2017. In total, 657 baseline tests were conducted, and t-tests and linear regression were used to assess significant differences in mean composite scores by sex and age. Results showed that females scored better on visual memory and that, in general, baseline scores improved as age increased. The results can be used to build further studies on the use of CNT in recreational settings and its role in concussion treatment, management, and interventions.
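
    The statistical comparison described above can be sketched as follows, using SciPy for the t-test and linear regression. The synthetic scores, group sizes and the single composite analysed are illustrative assumptions, not the study's data.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

# Synthetic visual-memory composite scores for girls and boys (higher = better).
girls = rng.normal(72, 10, 320)
boys = rng.normal(68, 10, 337)

# Sex difference: two-sample t-test on the composite score.
t_stat, p_value = stats.ttest_ind(girls, boys)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")

# Age effect: simple linear regression of score on age (5-11 years).
ages = rng.integers(5, 12, size=657)
scores = 50 + 2.5 * ages + rng.normal(0, 8, size=657)   # scores improve with age
slope, intercept, r, p, se = stats.linregress(ages, scores)
print(f"slope = {slope:.2f} points/year, p = {p:.4f}")
```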

  4. Anniversary Paper: History and status of CAD and quantitative image analysis: The role of Medical Physics and AAPM

    International Nuclear Information System (INIS)

    Giger, Maryellen L.; Chan, Heang-Ping; Boone, John

    2008-01-01

    The roles of physicists in medical imaging have expanded over the years, from the study of imaging systems (sources and detectors) and dose to the assessment of image quality and perception, the development of image processing techniques, and the development of image analysis methods to assist in detection and diagnosis. The latter is a natural extension of medical physicists' goals in developing imaging techniques to help physicians acquire diagnostic information and improve clinical decisions. Studies indicate that radiologists do not detect all abnormalities on images that are visible on retrospective review, and they do not always correctly characterize abnormalities that are found. Since the 1950s, the potential use of computers had been considered for analysis of radiographic abnormalities. In the mid-1980s, however, medical physicists and radiologists began major research efforts for computer-aided detection or computer-aided diagnosis (CAD), that is, using the computer output as an aid to radiologists--as opposed to a completely automatic computer interpretation--focusing initially on methods for the detection of lesions on chest radiographs and mammograms. Since then, extensive investigations of computerized image analysis for detection or diagnosis of abnormalities in a variety of 2D and 3D medical images have been conducted. The growth of CAD over the past 20 years has been tremendous--from the early days of time-consuming film digitization and CPU-intensive computations on a limited number of cases to its current status in which developed CAD approaches are evaluated rigorously on large clinically relevant databases. CAD research by medical physicists includes many aspects--collecting relevant normal and pathological cases; developing computer algorithms appropriate for the medical interpretation task including those for segmentation, feature extraction, and classifier design; developing methodology for assessing CAD performance; validating the

  5. A report on digital image processing and analysis

    International Nuclear Information System (INIS)

    Singh, B.; Alex, J.; Haridasan, G.

    1989-01-01

    This report presents developments in software connected with digital image processing and analysis in the Centre. In image processing, one resorts either to alteration of grey-level values so as to enhance features in the image or to transform-domain operations for restoration or filtering. Typical transform-domain operations like Karhunen-Loeve transforms are statistical in nature and are used for good registration of images or template matching. Image analysis procedures segment grey-level images into sub-images contained within selectable windows, for the purpose of estimating geometrical features in the image, such as area, perimeter, projections, etc. In short, in image processing both the input and output are images, whereas in image analysis the input is an image and the output is a set of numbers and graphs. (author). 19 refs
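
    As an illustration of the image-analysis side described above (numbers out, not images out), the sketch below measures the area and perimeter of segmented objects with scikit-image; the global threshold and the synthetic image are assumptions made here for illustration.

```python
import numpy as np
from skimage import measure

def measure_objects(grey_image, threshold=128):
    """Segment a grey-level image by a global threshold and report geometry.

    Returns a list of (label, area_in_pixels, perimeter_in_pixels) tuples,
    i.e. numbers rather than another image, as in image analysis proper.
    """
    binary = grey_image > threshold
    labelled = measure.label(binary)
    return [(region.label, region.area, region.perimeter)
            for region in measure.regionprops(labelled)]

# Synthetic test image with two bright rectangles on a dark background.
image = np.zeros((100, 100), dtype=np.uint8)
image[10:30, 10:40] = 200
image[60:80, 50:90] = 220
print(measure_objects(image))
```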

  6. Uses of software in digital image analysis: a forensic report

    Science.gov (United States)

    Sharma, Mukesh; Jha, Shailendra

    2010-02-01

    Forensic image analysis requires expertise to interpret the content of an image, or the image itself, in legal matters. Major sub-disciplines of forensic image analysis with law enforcement applications include photogrammetry, photographic comparison, content analysis and image authentication. It has wide applications in forensic science, ranging from documenting crime scenes to enhancing faint or indistinct patterns such as partial fingerprints. The process of forensic image analysis can involve several different tasks, regardless of the type of image analysis performed. Through this paper the authors explain these tasks, which fall into three categories: Image Compression, Image Enhancement & Restoration, and Measurement Extraction, with the help of examples such as signature comparison, counterfeit currency comparison and footwear sole impressions using the software Canvas and Corel Draw.

  7. Computerized photogrammetry used to calculate the brow position index.

    Science.gov (United States)

    Naif-de-Andrade, Naif Thadeu; Hochman, Bernardo; Naif-de-Andrade, Camila Zirlis; Ferreira, Lydia Masako

    2012-10-01

    The orbital region is of vital importance to facial expression. Brow ptosis, besides having an impact on facial harmony, is a sign of aging. Various surgical techniques have been developed to increase the efficacy of brow-lift surgery. However, no consensus method exists for an objective measurement of the eyebrow position due to the curvature of the face. Therefore, this study aimed to establish a method for measuring the eyebrow position using computerized photogrammetry. For this study, 20 orbital regions of 10 volunteers were measured by direct anthropometry using a digital caliper and by indirect anthropometry (computerized photogrammetry) using standardized digital photographs. Lines, points, and distances were defined based on the position of the anthropometric landmarks endocanthion and exocanthion and then used to calculate the brow position index (BPI). Statistical analysis was performed using Student's t test with a significance level of 5 %. The BPI values obtained by computerized photogrammetric measurements did not differ significantly from those obtained by direct anthropometric measurements (p > 0.05). The mean BPI was 84.89 ± 10.30 for the computerized photogrammetric measurements and 85.27 ± 10.67 for the direct anthropometric measurements. The BPI defined in this study and obtained by computerized photogrammetry is a reproducible and efficient method for measuring the eyebrow position. This journal requires that authors assign a level of evidence to each article.

  8. Illuminance: Computerized simulation

    Energy Technology Data Exchange (ETDEWEB)

    Barlow, A

    1991-03-01

    One of the main objectives of a graphics workstation is to create images that are as realistic as possible. This paper reviews and assesses the state of the art in the field of illuminance simulation. The techniques examined are: ray tracing, in which illuminance in a given environment is calculated approximately by tracing individual rays of light; the 'radiosity' method (a term combining surface radiancy and reflectivity), based on radiative energy-exchange calculations for the environment, which accounts for the effects of different surface colours; and progressive refinement, in which 'radiosity' is calculated step by step at increasing levels of detail. The Gouraud and Phong methods of representing the effects of shading are also compared.

  9. Analysis of image plane's Illumination in Image-forming System

    International Nuclear Information System (INIS)

    Duan Lihua; Zeng Yan'an; Zhang Nanyangsheng; Wang Zhiguo; Yin Shiliang

    2011-01-01

    In the detection of optical radiation, the detection accuracy is affected to a large extent by the optical power distribution over the detector's surface. In addition, in an image-forming system, the quality of the image is largely determined by the uniformity of the image's illumination distribution. However, in a practical optical system, affected by factors such as field of view, stray light, off-axis effects and so on, the distribution of the image's illumination tends to be non-uniform, so it is necessary to discuss the image plane's illumination in image-forming systems. In order to analyze the characteristics of the image-forming system over the full range, formulas to calculate the illumination of the image plane have been derived on the basis of photometry. Moreover, the relationship between the horizontal offset of the light source and the illumination of the image has been discussed in detail. After that, the influence of key factors such as aperture angle, off-axis distance and horizontal offset on the illumination of the image is presented. Through numerical simulation, theoretical curves for those key factors have been obtained. The results of the numerical simulation show that enlarging the diameter of the exit pupil is recommended to increase the illumination of the image. The angle of view plays a negative role in the illumination distribution of the image; that is, the uniformity of the illumination distribution can be enhanced by compressing the angle of view. Lastly, it is shown that a telecentric optical design is an effective way to improve the uniformity of the illumination distribution.
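
    For reference, a standard first-order relation often used in such analyses (added here for context, not a formula quoted from the paper) links the off-axis image-plane illuminance to the field angle for a thin, unvignetted, aberration-free system:

```latex
% Cosine-fourth fall-off of image-plane illuminance with field angle \theta:
\[
  E(\theta) \;=\; E_0 \cos^{4}\theta,
  \qquad
  E_0 \;\propto\; L\,\tau\left(\frac{D}{f}\right)^{2},
\]
% where $L$ is the source luminance, $\tau$ the system transmittance, $D$ the
% entrance-pupil diameter and $f$ the focal length. Compressing the field
% angle $\theta$ therefore flattens $E(\theta)$, consistent with the
% conclusion above about the angle of view.
```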

  10. Computerized tomography findings in nasolacrimal dysfunction

    International Nuclear Information System (INIS)

    Ishida, Toshio; Nakamura, Yasuhisa; Kumagai, Michiasa

    1985-01-01

    We examined 17 cases (22 lesions) with stenosis or obstruction of the lacrimal drainage system using computerized tomography (CT). In idiopathic cases, the site of obstruction was located either in the upper nasolacrimal duct or at the junction of the lacrimal sac and the nasolacrimal duct. In post-traumatic cases, it was located in the lower nasolacrimal duct. The obstructed areas appeared homogeneous on the CT image, with CT values ranging between +60 and +80; these findings were suggestive of granulation tissue. The stenosed areas appeared, on the other hand, as areas of unequal density, and the lacrimal passage appeared to be maintained through the low-density portion. The CT values of the lacrimal sac were around +40 in dacryocystitis and around +20 to +30 in other cases. (author)

  11. Difference Image Analysis of Galactic Microlensing. I. Data Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Alcock, C.; Allsman, R. A.; Alves, D.; Axelrod, T. S.; Becker, A. C.; Bennett, D. P.; Cook, K. H.; Drake, A. J.; Freeman, K. C.; Griest, K. (and others)

    1999-08-20

    This is a preliminary report on the application of Difference Image Analysis (DIA) to Galactic bulge images. The aim of this analysis is to increase the sensitivity to the detection of gravitational microlensing. We discuss how the DIA technique simplifies the process of discovering microlensing events by detecting only objects that have variable flux. We illustrate how the DIA technique is not limited to detection of so-called "pixel lensing" events but can also be used to improve photometry for classical microlensing events by removing the effects of blending. We will present a method whereby DIA can be used to reveal the true unblended colors, positions, and light curves of microlensing events. We discuss the need for a technique to obtain the accurate microlensing timescales from blended sources and present a possible solution to this problem using the existing Hubble Space Telescope color-magnitude diagrams of the Galactic bulge and LMC. The use of such a solution with both classical and pixel microlensing searches is discussed. We show that one of the major causes of systematic noise in DIA is differential refraction. A technique for removing this systematic by effectively registering images to a common air mass is presented. Improvements to commonly used image differencing techniques are discussed. (c) 1999 The American Astronomical Society.
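
    A very reduced sketch of the difference-imaging idea, registering a reference frame to a target frame and then subtracting so that only variable-flux objects remain, is given below. It ignores the PSF matching and differential-refraction corrections discussed in the paper and assumes a pure integer-pixel shift between frames.

```python
import numpy as np

def registration_shift(reference, target):
    """Integer (dy, dx) shift that aligns `reference` with `target`, taken from
    the peak of their FFT-based cross-correlation."""
    corr = np.fft.ifft2(np.fft.fft2(target) * np.conj(np.fft.fft2(reference))).real
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    return tuple(p if p < s // 2 else p - s for p, s in zip(peak, corr.shape))

def difference_image(reference, target):
    """Roll the reference onto the target grid and subtract.

    Constant stars cancel; anything that changed brightness (for example a
    microlensing event) survives in the difference image.
    """
    dy, dx = registration_shift(reference, target)
    aligned = np.roll(reference, (dy, dx), axis=(0, 1))
    return target - aligned

# Synthetic example: the same star field shifted by (2, 3) pixels, one star brightens.
rng = np.random.default_rng(7)
ref = rng.poisson(50, (128, 128)).astype(float)
tgt = np.roll(ref, (2, 3), axis=(0, 1))
tgt[40, 60] += 500.0                      # the "variable" object
diff = difference_image(ref, tgt)
print(np.unravel_index(np.argmax(diff), diff.shape))   # location of the variable
```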

  12. An expert image analysis system for chromosome analysis application

    International Nuclear Information System (INIS)

    Wu, Q.; Suetens, P.; Oosterlinck, A.; Van den Berghe, H.

    1987-01-01

    This paper reports a recent study on applying a knowledge-based system approach as a new attempt to solve the problem of chromosome classification. A theoretical framework of an expert image analysis system is proposed, based on such a study. In this scheme, chromosome classification can be carried out under a hypothesize-and-verify paradigm by integrating a rule-based component, in which the expertise of chromosome karyotyping is formulated, with an existing image analysis system that uses conventional pattern recognition techniques. Results from the existing system can be used to bring in hypotheses, and with the rule-based verification and modification procedures, improvement of the classification performance can be expected

  13. The Scientific Image in Behavior Analysis.

    Science.gov (United States)

    Keenan, Mickey

    2016-05-01

    Throughout the history of science, the scientific image has played a significant role in communication. With recent developments in computing technology, there has been an increase in the kinds of opportunities now available for scientists to communicate in more sophisticated ways. Within behavior analysis, though, we are only just beginning to appreciate the importance of going beyond the printing press to elucidate basic principles of behavior. The aim of this manuscript is to stimulate appreciation of both the role of the scientific image and the opportunities provided by a quick response code (QR code) for enhancing the functionality of the printed page. I discuss the limitations of imagery in behavior analysis ("Introduction"), and I show examples of what can be done with animations and multimedia for teaching philosophical issues that arise when teaching about private events ("Private Events 1 and 2"). Animations are also useful for bypassing ethical issues when showing examples of challenging behavior ("Challenging Behavior"). Each of these topics can be accessed only by scanning the QR code provided. This contingency has been arranged to help the reader embrace this new technology. In so doing, I hope to show its potential for going beyond the limitations of the printing press.

  14. Etching and image analysis of the microstructure in marble

    DEFF Research Database (Denmark)

    Alm, Ditte; Brix, Susanne; Howe-Rasmussen, Helle

    2005-01-01

    of grains exposed on that surface are measured on the microscope images using image analysis by the program Adobe Photoshop 7.0 with Image Processing Toolkit 4.0. The parameters measured by the program on microscope images of thin sections of two marble types are used for calculation of the coefficient...

  15. Role of computerized tomography in diagnosis of atypical gall bladder and common bile duct stones

    International Nuclear Information System (INIS)

    El-Husseni, Tareq K.; Al-Shebrein, Ibrahim A.

    2001-01-01

    The objective was to assess the value of computerized tomography as an adjuvant to ultrasound in the diagnosis of atypical gallbladder and common duct stone disease. Real-time ultrasound scanning of the gallbladder and common duct was performed in the routine manner. High-resolution computerized tomography images were subsequently obtained for the region of interest. Computerized tomography resolved undetermined results as follows: 1. non-shadowing gallbladder debris (6 points); 2. focal gallbladder wall thickening (2 points); 3. stone obscured by a calcified gallbladder wall (3 points); 4. non-visualized gallbladder double-arc shadow (4 points); 5 and 6. impacted gallbladder neck and common duct stones (18 points); computerized tomography gave a false positive diagnosis in 2 points. Computerized tomography provided an effective and reliable means for the diagnosis of atypical gallbladder calculi when ultrasound was imprecise or the findings contradicted the clinical presentation. Finally, if gallbladder neck or common duct stones are suspected, other imaging techniques such as magnetic resonance cholangiopancreatography or endoscopic retrograde cholangiopancreatography may be needed in addition to computerized tomography to avoid a false positive diagnosis prior to surgery. (author)

  16. Use of the software Seed Vigor Imaging System (SVIS®) for assessing vigor of carrot seeds

    Directory of Open Access Journals (Sweden)

    José Luís de Marchi

    Full Text Available ABSTRACT Seed vigor has traditionally been evaluated by physiological, biochemical and stress tolerance tests. More recently, with the use of computerized image analysis, objective information has become accessible in a relatively short period of time, with less human interference. The aim of this study was to verify the efficiency of computerized seedling image analysis by the Seed Vigor Imaging System (SVIS®) to detect differences in vigor between carrot (Daucus carota L.) seed lots as compared to those provided by traditional vigor tests. Seeds from seven lots from the Brasilia cultivar were subjected to a germination test, first count of germination, speed of germination, accelerated aging with saline solution and seedling emergence; furthermore, a vigor index, growth index and uniformity index were determined by the Seed Vigor Imaging System (SVIS®) during four evaluation periods. The results obtained by the computerized seedling analysis (vigor index and growth index) show that SVIS® is efficient in assessing carrot seed vigor.

  17. Computerized accounting methods. Final report

    International Nuclear Information System (INIS)

    1994-01-01

    This report summarizes the results of the research performed under the Task Order on computerized accounting methods in the period from 3 August to 31 December 1994. Computerized nuclear material accounting methods are analyzed and evaluated. Selected methods were implemented in a hardware-software complex developed as a prototype of the local network-based CONMIT system. This complex was put into trial operation for test and evaluation of the selected methods at two selected "Kurchatov Institute" Russian Research Center ("KI" RRC) nuclear facilities. Trial operation has been carried out since the beginning of the Initial Physical Inventory Taking performed at these facilities in November 1994. Operation of the CONMIT prototype system was demonstrated in the middle of December 1994. Results of the evaluation of the CONMIT prototype system's features and functioning under real operating conditions are presented. Conclusions are formulated on the ways of further developing computerized nuclear material accounting methods. The most important conclusion is the need to strengthen the computer and information security features supported by the operating environment. Security provisions, as well as other LANL Client/Server System approaches being developed by Los Alamos National Laboratory, are recommended for the selection of software and hardware components to be integrated into the production version of the CONMIT system for the KI RRC

  18. Application of automatic image analysis in wood science

    Science.gov (United States)

    Charles W. McMillin

    1982-01-01

    In this paper I describe an image analysis system and illustrate with examples the application of automatic quantitative measurement to wood science. Automatic image analysis, a powerful and relatively new technology, uses optical, video, electronic, and computer components to rapidly derive information from images with minimal operator interaction. Such instruments...

  19. Brain-inspired algorithms for retinal image analysis

    NARCIS (Netherlands)

    ter Haar Romeny, B.M.; Bekkers, E.J.; Zhang, J.; Abbasi-Sureshjani, S.; Huang, F.; Duits, R.; Dasht Bozorg, Behdad; Berendschot, T.T.J.M.; Smit-Ockeloen, I.; Eppenhof, K.A.J.; Feng, J.; Hannink, J.; Schouten, J.; Tong, M.; Wu, H.; van Triest, J.W.; Zhu, S.; Chen, D.; He, W.; Xu, L.; Han, P.; Kang, Y.

    2016-01-01

    Retinal image analysis is a challenging problem due to the precise quantification required and the huge numbers of images produced in screening programs. This paper describes a series of innovative brain-inspired algorithms for automated retinal image analysis, recently developed for the RetinaCheck

  20. From Pixels to Geographic Objects in Remote Sensing Image Analysis

    NARCIS (Netherlands)

    Addink, E.A.; Van Coillie, Frieke M.B.; Jong, Steven M. de

    Traditional image analysis methods are mostly pixel-based and use the spectral differences of landscape elements at the Earth surface to classify these elements or to extract element properties from the Earth Observation image. Geographic object-based image analysis (GEOBIA) has received

  1. Using image analysis as a tool for assessment of prognostic and predictive biomarkers for breast cancer: How reliable is it?

    Directory of Open Access Journals (Sweden)

    Mark C Lloyd

    2010-01-01

    Full Text Available Background : Estrogen receptor (ER), progesterone receptor (PR) and human epidermal growth factor receptor-2 (HER2) are important and well-established prognostic and predictive biomarkers for breast cancers and are routinely tested on patients' tumor samples by immunohistochemical (IHC) study. The accuracy of these test results has substantial impact on patient management. A critical factor that contributes to the result is the interpretation (scoring) of IHC. This study investigates how computerized image analysis can play a role in reliable scoring, and identifies potential pitfalls with common methods. Materials and Methods : Whole slide images of 33 invasive ductal carcinomas (IDC) (10 ER and 23 HER2) were scored by a pathologist under the light microscope and confirmed by another pathologist. The HER2 results were additionally confirmed by fluorescence in situ hybridization (FISH). The scoring criteria adhered to the guidelines recommended by the American Society of Clinical Oncology/College of American Pathologists. Whole slide stains were then scored by commercially available image analysis algorithms from Definiens (Munich, Germany) and Aperio Technologies (Vista, CA, USA). Each algorithm was modified specifically for each marker and tissue. The results were compared with the semi-quantitative manual scoring, which was considered the gold standard in this study. Results : For the HER2-positive group, each algorithm scored 23/23 cases within the range established by the pathologist. For ER, both algorithms scored 10/10 cases within range. The performance of each algorithm varies somewhat in the percentage of staining as compared to the pathologist's reading. Conclusions : Commercially available computerized image analysis can be useful in the evaluation of ER and HER2 IHC results. In order to achieve accurate results either manual pathologist region selection is necessary, or an automated region selection tool must be employed. Specificity can

  2. An image scanner for real time analysis of spark chamber images

    International Nuclear Information System (INIS)

    Cesaroni, F.; Penso, G.; Locci, A.M.; Spano, M.A.

    1975-01-01

    This note describes the semiautomatic scanning system at LNF for the analysis of spark chamber images. From the projection of the images on the scanner table, the trajectory in real space is reconstructed

  3. Textural features for radar image analysis

    Science.gov (United States)

    Shanmugan, K. S.; Narayanan, V.; Frost, V. S.; Stiles, J. A.; Holtzman, J. C.

    1981-01-01

    Texture is seen as an important spatial feature useful for identifying objects or regions of interest in an image. While textural features have been widely used in analyzing a variety of photographic images, they have not been used in processing radar images. A procedure for extracting a set of textural features for characterizing small areas in radar images is presented, and it is shown that these features can be used in classifying segments of radar images corresponding to different geological formations.

  4. Computerized detection of vertebral compression fractures on lateral chest radiographs: Preliminary results with a tool for early detection of osteoporosis

    International Nuclear Information System (INIS)

    Kasai, Satoshi; Li Feng; Shiraishi, Junji; Li Qiang; Doi, Kunio

    2006-01-01

    Vertebral fracture (or vertebral deformity) is a very common outcome of osteoporosis, which is one of the major public health concerns in the world. Early detection of vertebral fractures is important because timely pharmacologic intervention can reduce the risk of subsequent additional fractures. Chest radiographs are used routinely for detection of lung and heart diseases, and vertebral fractures can be visible on lateral chest radiographs. However, investigators noted that about 50% of vertebral fractures visible on lateral chest radiographs were underdiagnosed or under-reported, even when the fractures were severe. Therefore, our goal was to develop a computerized method for detection of vertebral fractures on lateral chest radiographs in order to assist radiologists' image interpretation and thus allow the early diagnosis of osteoporosis. The cases used in this study were 20 patients with severe vertebral fractures and 118 patients without fractures, as confirmed by the consensus of two radiologists. Radiologists identified the locations of fractured vertebrae, and they provided morphometric data on the vertebral shape for evaluation of the accuracy of detecting vertebral end plates by computer. In our computerized method, a curved search area, which included a number of vertebral end plates, was first extracted automatically, and was straightened so that vertebral end plates became oriented horizontally. Edge candidates were enhanced by use of a horizontal line-enhancement filter in the straightened image, and a multiple thresholding technique, followed by feature analysis, was used for identification of the vertebral end plates. The height of each vertebra was determined from locations of identified vertebral end plates, and fractured vertebrae were detected by comparison of the measured vertebral height with the expected height. The sensitivity of our computerized method for detection of fracture cases was 95% (19/20), with 1.03 (139/135) false
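
    The abstract above outlines the full pipeline (curved search area extraction, straightening, horizontal line enhancement, multiple thresholding, feature analysis and height comparison). As a hedged, heavily reduced sketch of the two middle steps only, the Python fragment below (NumPy/SciPy assumed) enhances horizontal edges in an already straightened spine strip, picks candidate end-plate rows by thresholding, and flags vertebrae whose measured height falls below an expected value; the function names, thresholds and tolerances are illustrative assumptions, not the published method.

    import numpy as np
    from scipy import ndimage

    def detect_endplate_rows(straightened, threshold_quantile=0.95):
        """Very reduced sketch of the end-plate search on a straightened spine strip.

        straightened : 2-D array, rows roughly parallel to the vertebral end plates
        Returns row indices whose horizontal-edge response is strong.
        """
        # horizontal line enhancement: vertical derivative, averaged along the rows
        edge = ndimage.sobel(straightened.astype(float), axis=0)
        profile = ndimage.uniform_filter1d(np.abs(edge).mean(axis=1), size=3)
        rows = np.where(profile > np.quantile(profile, threshold_quantile))[0]
        # collapse runs of adjacent rows into single end-plate candidates
        return [int(np.mean(g)) for g in np.split(rows, np.where(np.diff(rows) > 2)[0] + 1) if g.size]

    def flag_fractures(endplate_rows, expected_height, tolerance=0.8):
        """Flag vertebrae whose measured height falls below a fraction of the expected height."""
        heights = np.diff(sorted(endplate_rows))
        return [i for i, h in enumerate(heights) if h < tolerance * expected_height]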

  5. Computerizing clinical practice guidelines

    DEFF Research Database (Denmark)

    Lyng, Karen Marie

    It is well described that hospitals have problems with sustaining high quality of care and expedient introduction of new medical knowledge. Clinical practice guidelines (CPGs) have been promoted as a remedy to deal with these problems. It is, however, also well described that application and comp...... is comprised by fieldwork in three oncology departments and a case study of advanced life support. Although close to all patients within oncology are treated according to a CPG, I found limited application of physical CPGs and web-based CPG portals. However, I found comprehensive application of activity...... of the business strategic aims, and 3) analysis and formalization of CPGs. This will imply orchestration of design teams with competencies from a wide array of disciplines such as health practice, business management, knowledge management and information systems....

  6. Analysis of RTM extended images for VTI media

    KAUST Repository

    Li, Vladimir; Tsvankin, Ilya; Alkhalifah, Tariq Ali

    2015-01-01

    velocity analysis remain generally valid in the extended image space for complex media. The dependence of RMO on errors in the anisotropy parameters provides essential insights for anisotropic wavefield tomography using extended images.

  7. Direct identification of fungi using image analysis

    DEFF Research Database (Denmark)

    Dørge, Thorsten Carlheim; Carstensen, Jens Michael; Frisvad, Jens Christian

    1999-01-01

    Filamentous fungi have often been characterized, classified or identified with a major emphasis on macromorphological characters, i.e. the size, texture and color of fungal colonies grown on one or more identification media. This approach has been rejected by several taxonomists because of the subjectivity in the visual evaluation and quantification (if any) of such characters and the apparent large variability of the features. We present an image analysis approach for objective identification and classification of fungi. The approach is exemplified by several isolates of nine different species of the genus Penicillium, known to be very difficult to identify correctly. The fungi were incubated on YES and CYA for one week at 25 C (3 point inoculation) in 9 cm Petri dishes. The cultures are placed under a camera where a digital image of the front of the colonies is acquired under optimal illumination...

  8. Image sequence analysis in nuclear medicine: (1) Parametric imaging using statistical modelling

    International Nuclear Information System (INIS)

    Liehn, J.C.; Hannequin, P.; Valeyre, J.

    1989-01-01

    This is a review of parametric imaging methods in Nuclear Medicine. A Parametric Image is an image in which each pixel value is a function of the values of the same pixel in an image sequence. The Local Model Method fits each pixel's time-activity curve with a model whose parameter values form the Parametric Images. The Global Model Method models the changes between two images and is applied to image comparison. For both methods, the different models, the identification criterion, the optimization methods and the statistical properties of the images are discussed. The analysis of one or more Parametric Images is performed using 1D or 2D histograms. The statistically significant Parametric Images (Images of significant Variances, Amplitudes and Differences) are also proposed [fr]
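
    As an illustration of the Local Model Method described above, the sketch below (Python, NumPy/SciPy assumed) fits each pixel's time-activity curve with an assumed mono-exponential model; the two fitted parameters then form amplitude and rate parametric images. The model choice, starting values and error handling are assumptions for illustration only.

    import numpy as np
    from scipy.optimize import curve_fit

    def monoexp(t, a, k):
        # assumed local model: single-exponential washout A*exp(-k*t)
        return a * np.exp(-k * t)

    def parametric_images(sequence, times):
        """Fit every pixel's time-activity curve; return amplitude and rate images.

        sequence : array of shape (n_frames, ny, nx)
        times    : acquisition times, shape (n_frames,)
        """
        n, ny, nx = sequence.shape
        amp = np.zeros((ny, nx))
        rate = np.zeros((ny, nx))
        for y in range(ny):
            for x in range(nx):
                tac = sequence[:, y, x]
                try:
                    (a, k), _ = curve_fit(monoexp, times, tac, p0=(tac.max() + 1e-6, 0.1), maxfev=200)
                    amp[y, x], rate[y, x] = a, k
                except RuntimeError:
                    pass  # leave zeros where the fit does not converge
        return amp, rate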

  9. Biostatistical analysis of quantitative immunofluorescence microscopy images.

    Science.gov (United States)

    Giles, C; Albrecht, M A; Lam, V; Takechi, R; Mamo, J C

    2016-12-01

    Semiquantitative immunofluorescence microscopy has become a key methodology in biomedical research. Typical statistical workflows are considered in the context of avoiding pseudo-replication and marginalising experimental error. However, immunofluorescence microscopy naturally generates hierarchically structured data that can be leveraged to improve statistical power and enrich biological interpretation. Herein, we describe a robust distribution fitting procedure and compare several statistical tests, outlining their potential advantages/disadvantages in the context of biological interpretation. Further, we describe tractable procedures for power analysis that incorporates the underlying distribution, sample size and number of images captured per sample. The procedures outlined have significant potential for increasing understanding of biological processes and decreasing both ethical and financial burden through experimental optimization. © 2016 The Authors Journal of Microscopy © 2016 Royal Microscopical Society.
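
    The exact tests and distribution fitting used in the paper are not reproduced here; as one hedged illustration of respecting the image-within-animal hierarchy (and so avoiding pseudo-replication), the sketch below fits a random-intercept mixed model with statsmodels, assuming a hypothetical tidy table with one row per image.

    import pandas as pd
    import statsmodels.formula.api as smf

    # hypothetical tidy table: one row per image, images nested within animals
    df = pd.DataFrame({
        "intensity": [102, 98, 110, 95, 120, 125, 118, 130],
        "group":     ["ctrl", "ctrl", "ctrl", "ctrl", "treat", "treat", "treat", "treat"],
        "animal":    ["a1", "a1", "a2", "a2", "b1", "b1", "b2", "b2"],
    })

    # a random intercept per animal keeps images from being treated as independent replicates
    model = smf.mixedlm("intensity ~ group", df, groups=df["animal"]).fit()
    print(model.summary())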

  10. Computerised image analysis of biocrystallograms originating from agricultural products

    DEFF Research Database (Denmark)

    Andersen, Jens-Otto; Henriksen, Christian B.; Laursen, J.

    1999-01-01

    Procedures are presented for computerised image analysis of biocrystallogram images, originating from biocrystallization investigations of agricultural products. The biocrystallization method is based on the crystallographic phenomenon that when adding biological substances, such as plant extracts... on up to eight parameters indicated strong relationships, with R2 up to 0.98. It is concluded that the procedures were able to discriminate the seven groups of images, and are applicable for biocrystallization investigations of agricultural products. Perspectives for the application of image analysis...

  11. Image analysis and microscopy: a useful combination

    Directory of Open Access Journals (Sweden)

    Pinotti L.

    2009-01-01

    Full Text Available The TSE Roadmap published in 2005 (DG for Health and Consumer Protection, 2005) suggests that short and medium term (2005-2009) amendments to control BSE policy should include “a relaxation of certain measures of the current total feed ban when certain conditions are met”. The same document noted “the starting point when revising the current feed ban provisions should be risk-based but at the same time taking into account the control tools in place to evaluate and ensure the proper implementation of this feed ban”. The clear implication is that adequate analytical methods to detect constituents of animal origin in feedstuffs are required. The official analytical method for the detection of constituents of animal origin in feedstuffs is the microscopic examination technique as described in Commission Directive 2003/126/EC of 23 December 2003 [OJ L 339, 24.12.2003, 78]. Although the microscopic method is usually able to distinguish fish from land animal material, it is often unable to distinguish between different terrestrial animals. Fulfillment of the requirements of Regulation 1774/2002/EC, laying down health rules concerning animal by-products not intended for human consumption, clearly implies that it must be possible to identify the origin of animal materials at higher taxonomic levels than in the past. Thus improvements in all methods of detecting constituents of animal origin are required, including the microscopic method. This article will examine the problem of meat and bone meal in animal feeds, and the use of microscopic methods in association with computer image analysis to identify the source species of these feedstuff contaminants. Image processing, integrated with morphometric measurements, can provide accurate and reliable results and can be a very useful aid to the analyst in the characterization, analysis and control of feedstuffs.

  12. Forensic image analysis - CCTV distortion and artefacts.

    Science.gov (United States)

    Seckiner, Dilan; Mallett, Xanthé; Roux, Claude; Meuwly, Didier; Maynard, Philip

    2018-04-01

    As a result of the worldwide deployment of surveillance cameras, authorities have gained a powerful tool that captures footage of activities of people in public areas. Surveillance cameras allow continuous monitoring of the area and allow footage to be obtained for later use, if a criminal or other act of interest occurs. Following this, a forensic practitioner, or expert witness can be required to analyse the footage of the Person of Interest. The examination ultimately aims at evaluating the strength of evidence at source and activity levels. In this paper, both source and activity levels are inferred from the trace, obtained in the form of CCTV footage. The source level alludes to features observed within the anatomy and gait of an individual, whilst the activity level relates to activity undertaken by the individual within the footage. The strength of evidence depends on the value of the information recorded, where the activity level is robust, yet source level requires further development. It is therefore suggested that the camera and the associated distortions should be assessed first and foremost and, where possible, quantified, to determine the level of each type of distortion present within the footage. A review of 'forensic image analysis' is presented here. It will outline the image distortion types and detail the limitations of differing surveillance camera systems. The aim is to highlight various types of distortion present particularly from surveillance footage, as well as address gaps in current literature in relation to assessment of CCTV distortions in tandem with gait analysis. Future work will consider the anatomical assessment from surveillance footage. Copyright © 2018 Elsevier B.V. All rights reserved.

  13. Single photon emission computerized tomography (SPECT)

    International Nuclear Information System (INIS)

    Ganatra, R.D.

    1992-01-01

    Tomography in nuclear medicine did not originate after the introduction of X-ray computerized tomography (CT). Even in the days of the rectilinear scanner, tomography was attempted with multiple detector heads rotating around the patient, but the counts at each plane were never high enough to obtain a satisfactory image. A high resolution focusing collimator can look at different depths, but taking several slices in one projection was a time-consuming process. Rectilinear scanners lose a lot of counts in the collimator to look at one point, at one time, in one plane. It is true that attempts to do tomography with the gamma camera really got a boost after the success of CT. By that time, algorithms for image reconstruction were also highly refined and far advanced. Clinical application of SPECT has become widespread now, because of the development of suitable radiopharmaceuticals and improvements in instrumentation. SPECT provides a direct measure of regional organ function and is performed with nuclides such as 123I and 99mTc that emit a single photon during their decay. SPECT is far less expensive than positron emission tomography

  14. Single photon emission computerized tomography (SPECT)

    Energy Technology Data Exchange (ETDEWEB)

    Ganatra, R D

    1993-12-31

    Tomography in nuclear medicine did not originate after the introduction of X-ray computerized tomography (CT). Even in the days of the rectilinear scanner, tomography was attempted with multiple detector heads rotating around the patient, but the counts at each plane were never high enough to obtain a satisfactory image. A high resolution focusing collimator can look at different depths, but taking several slices in one projection was a time-consuming process. Rectilinear scanners lose a lot of counts in the collimator to look at one point, at one time, in one plane. It is true that attempts to do tomography with the gamma camera really got a boost after the success of CT. By that time, algorithms for image reconstruction were also highly refined and far advanced. Clinical application of SPECT has become widespread now, because of the development of suitable radiopharmaceuticals and improvements in instrumentation. SPECT provides a direct measure of regional organ function and is performed with nuclides such as {sup 123}I and {sup 99}Tc{sup m} that emit a single photon during their decay. SPECT is far less expensive than positron emission tomography

  15. Multi-focus Image Fusion Using Epifluorescence Microscopy for Robust Vascular Segmentation

    OpenAIRE

    Pelapur, Rengarajan; Prasath, Surya; Palaniappan, Kannappan

    2014-01-01

    We are building a computerized image analysis system for Dura Mater vascular network from fluorescence microscopy images. We propose a system that couples a multi-focus image fusion module with a robust adaptive filtering based segmentation. The robust adaptive filtering scheme handles noise without destroying small structures, and the multi focal image fusion considerably improves the overall segmentation quality by integrating information from multiple images. Based on the segmenta...

  16. APPLICATION OF PRINCIPAL COMPONENT ANALYSIS TO RELAXOGRAPHIC IMAGES

    International Nuclear Information System (INIS)

    STOYANOVA, R.S.; OCHS, M.F.; BROWN, T.R.; ROONEY, W.D.; LI, X.; LEE, J.H.; SPRINGER, C.S.

    1999-01-01

    Standard analysis methods for processing inversion recovery MR images traditionally have used single pixel techniques. In these techniques each pixel is independently fit to an exponential recovery, and spatial correlations in the data set are ignored. By analyzing the image as a complete dataset, improved error analysis and automatic segmentation can be achieved. Here, the authors apply principal component analysis (PCA) to a series of relaxographic images. This procedure decomposes the 3-dimensional data set into three separate images and corresponding recovery times. They attribute the 3 images to be spatial representations of gray matter (GM), white matter (WM) and cerebrospinal fluid (CSF) content
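
    A minimal sketch of the general idea, assuming the inversion-recovery series is available as a NumPy array: PCA via an SVD over the (inversion time x pixel) matrix yields component images and their temporal modes. Attributing the leading components to GM, WM and CSF, as in the work above, depends on preprocessing choices not shown here.

    import numpy as np

    def pca_image_series(series, n_components=3):
        """PCA of an image series (n_times, ny, nx) -> component images and temporal modes."""
        n_t, ny, nx = series.shape
        X = series.reshape(n_t, -1)                 # rows = inversion times, cols = pixels
        X = X - X.mean(axis=0, keepdims=True)       # remove the mean recovery curve
        U, S, Vt = np.linalg.svd(X, full_matrices=False)
        comp_images = Vt[:n_components].reshape(n_components, ny, nx)
        temporal_modes = U[:, :n_components] * S[:n_components]
        return comp_images, temporal_modes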

  17. Dynamic Chest Image Analysis: Model-Based Perfusion Analysis in Dynamic Pulmonary Imaging

    Directory of Open Access Journals (Sweden)

    Kiuru Aaro

    2003-01-01

    Full Text Available The "Dynamic Chest Image Analysis" project aims to develop model-based computer analysis and visualization methods for showing focal and general abnormalities of lung ventilation and perfusion based on a sequence of digital chest fluoroscopy frames collected with the dynamic pulmonary imaging technique. We have proposed and evaluated a multiresolutional method with an explicit ventilation model for ventilation analysis. This paper presents a new model-based method for pulmonary perfusion analysis. According to perfusion properties, we first devise a novel mathematical function to form a perfusion model. A simple yet accurate approach is further introduced to extract cardiac systolic and diastolic phases from the heart, so that this cardiac information may be utilized to accelerate the perfusion analysis and improve its sensitivity in detecting pulmonary perfusion abnormalities. This makes perfusion analysis not only fast but also robust in computation; consequently, perfusion analysis becomes computationally feasible without using contrast media. Our clinical case studies with 52 patients show that this technique is effective for pulmonary embolism even without using contrast media, demonstrating consistent correlations with computed tomography (CT and nuclear medicine (NM studies. This fluoroscopical examination takes only about 2 seconds for perfusion study with only low radiation dose to patient, involving no preparation, no radioactive isotopes, and no contrast media.

  18. Image based SAR product simulation for analysis

    Science.gov (United States)

    Domik, G.; Leberl, F.

    1987-01-01

    SAR product simulation serves to predict SAR image gray values for various flight paths. Input typically consists of a digital elevation model and backscatter curves. A new method of product simulation is described that also employs a real SAR input image for image simulation. This can be denoted as 'image-based simulation'. Different methods to perform this SAR prediction are presented and their advantages and disadvantages discussed. Ascending and descending orbit images from NASA's SIR-B experiment were used for verification of the concept: input images from ascending orbits were converted into images from a descending orbit; the results were compared to the available real imagery to verify that the prediction technique produces meaningful image data.

  19. Machine learning based analysis of cardiovascular images

    NARCIS (Netherlands)

    Wolterink, JM

    2017-01-01

    Cardiovascular diseases (CVDs), including coronary artery disease (CAD) and congenital heart disease (CHD) are the global leading cause of death. Computed tomography (CT) and magnetic resonance imaging (MRI) allow non-invasive imaging of cardiovascular structures. This thesis presents machine

  20. Analysis of Pregerminated Barley Using Hyperspectral Image Analysis

    DEFF Research Database (Denmark)

    Arngren, Morten; Hansen, Per Waaben; Eriksen, Birger

    2011-01-01

    imaging system in a mathematical modeling framework to identify pregerminated barley at an early stage of approximately 12 h of pregermination. Our model only assigns pregermination as the cause for a single kernel’s lack of germination and is unable to identify dormancy, kernel damage etc. The analysis...... is based on more than 750 Rosalina barley kernels being pregerminated at 8 different durations between 0 and 60 h based on the BRF method. Regerminating the kernels reveals a grouping of the pregerminated kernels into three categories: normal, delayed and limited germination. Our model employs a supervised...

  1. Image quality analysis of digital mammographic equipments

    Energy Technology Data Exchange (ETDEWEB)

    Mayo, P.; Pascual, A.; Verdu, G. [Valencia Univ. Politecnica, Chemical and Nuclear Engineering Dept. (Spain); Rodenas, F. [Valencia Univ. Politecnica, Applied Mathematical Dept. (Spain); Campayo, J.M. [Valencia Univ. Hospital Clinico, Servicio de Radiofisica y Proteccion Radiologica (Spain); Villaescusa, J.I. [Hospital Clinico La Fe, Servicio de Proteccion Radiologica, Valencia (Spain)

    2006-07-01

    The image quality assessment of a radiographic phantom image is one of the fundamental points in a complete quality control programme. The end result of the whole process must be an image of appropriate quality for a reliable diagnosis. Nowadays, digital radiographic equipment is replacing traditional film-screen equipment, and it is necessary to update the parameters used to guarantee the quality of the process. Contrast-detail phantoms are applied to digital radiography to study the threshold contrast-detail sensitivity at the operating conditions of the equipment. The phantom studied in this work is the C.D.M.A.M. 3.4, which facilitates the evaluation of image contrast and detail resolution. One of the most widely used indexes for measuring image quality in an objective way is the Image Quality Figure (I.Q.F.). This parameter is useful for calculating image quality taking into account the contrast and detail resolution of the analysed image. The contrast-detail curve is also useful as a measure of image quality, because it is a graphical representation in which the hole thickness and diameter are plotted for each contrast-detail combination detected in the radiographic image of the phantom. It is useful for comparing the performance of different radiographic image systems on phantom images acquired under the same exposure conditions. The aim of this work is to study the image quality of different images of the contrast-detail phantom C.D.M.A.M. 3.4, carrying out automatic detection of the contrast-detail combinations and establishing a parameter which characterizes the mammographic image quality in an objective way. This is useful for comparing images obtained with different digital mammographic equipment in order to study the performance of that equipment. (authors)

  2. Image quality analysis of digital mammographic equipments

    International Nuclear Information System (INIS)

    Mayo, P.; Pascual, A.; Verdu, G.; Rodenas, F.; Campayo, J.M.; Villaescusa, J.I.

    2006-01-01

    The image quality assessment of a radiographic phantom image is one of the fundamental points in a complete quality control programme. The end result of the whole process must be an image of appropriate quality for a reliable diagnosis. Nowadays, digital radiographic equipment is replacing traditional film-screen equipment, and it is necessary to update the parameters used to guarantee the quality of the process. Contrast-detail phantoms are applied to digital radiography to study the threshold contrast-detail sensitivity at the operating conditions of the equipment. The phantom studied in this work is the C.D.M.A.M. 3.4, which facilitates the evaluation of image contrast and detail resolution. One of the most widely used indexes for measuring image quality in an objective way is the Image Quality Figure (I.Q.F.). This parameter is useful for calculating image quality taking into account the contrast and detail resolution of the analysed image. The contrast-detail curve is also useful as a measure of image quality, because it is a graphical representation in which the hole thickness and diameter are plotted for each contrast-detail combination detected in the radiographic image of the phantom. It is useful for comparing the performance of different radiographic image systems on phantom images acquired under the same exposure conditions. The aim of this work is to study the image quality of different images of the contrast-detail phantom C.D.M.A.M. 3.4, carrying out automatic detection of the contrast-detail combinations and establishing a parameter which characterizes the mammographic image quality in an objective way. This is useful for comparing images obtained with different digital mammographic equipment in order to study the performance of that equipment. (authors)
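
    For reference, one commonly used convention for the image quality figure of a contrast-detail phantom is shown below; the exact definition (and whether the inverse form is reported) varies between authors and may differ from the one used in this work. Here C_i is the contrast (disc thickness) of column i and D_{i,min} is the smallest disc diameter detected at that contrast.

    \[
      \mathrm{IQF} = \sum_{i=1}^{N} C_i \, D_{i,\min},
      \qquad
      \mathrm{IQF}_{\mathrm{inv}} = \frac{100}{\sum_{i=1}^{N} C_i \, D_{i,\min}}
    \]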

  3. Machine learning approaches in medical image analysis

    DEFF Research Database (Denmark)

    de Bruijne, Marleen

    2016-01-01

    Machine learning approaches are increasingly successful in image-based diagnosis, disease prognosis, and risk assessment. This paper highlights new research directions and discusses three main challenges related to machine learning in medical imaging: coping with variation in imaging protocols......, learning from weak labels, and interpretation and evaluation of results....

  4. Principal component analysis of psoriasis lesions images

    DEFF Research Database (Denmark)

    Maletti, Gabriela Mariel; Ersbøll, Bjarne Kjær

    2003-01-01

    A set of RGB images of psoriasis lesions is used. By visual examination of these images, there seem to be no common pattern that could be used to find and align the lesions within and between sessions. It is expected that the principal components of the original images could be useful during future...

  5. An application of image processing techniques in computed tomography image analysis

    DEFF Research Database (Denmark)

    McEvoy, Fintan

    2007-01-01

    number of animals and image slices, automation of the process was desirable. The open-source and free image analysis program ImageJ was used. A macro procedure was created that provided the required functionality. The macro performs a number of basic image processing procedures. These include an initial...... process designed to remove the scanning table from the image and to center the animal in the image. This is followed by placement of a vertical line segment from the mid point of the upper border of the image to the image center. Measurements are made between automatically detected outer and inner...... boundaries of subcutaneous adipose tissue along this line segment. This process was repeated as the image was rotated (with the line position remaining unchanged) so that measurements around the complete circumference were obtained. Additionally, an image was created showing all detected boundary points so...
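
    The macro itself is written for ImageJ; purely as a hedged Python re-sketch of the central idea (NumPy/SciPy assumed), the fragment below rotates a CT slice in steps while keeping a fixed measurement line, and counts pixels falling in an assumed adipose attenuation window along that line. The HU window, angular step and geometry are illustrative assumptions, not the published procedure.

    import numpy as np
    from scipy import ndimage

    def fat_thickness_profile(ct_slice, fat_range=(-150, -50), n_angles=36, pixel_mm=1.0):
        """Sketch: thickness of the fat layer along radial lines from the image centre.

        ct_slice  : 2-D array of Hounsfield-like values, animal roughly centred
        fat_range : assumed HU window classifying adipose tissue
        Returns a list of (angle_deg, thickness_mm).
        """
        cy, cx = np.array(ct_slice.shape) / 2.0
        results = []
        for angle in np.linspace(0, 360, n_angles, endpoint=False):
            rotated = ndimage.rotate(ct_slice, angle, reshape=False, order=1, cval=-1000)
            # sample the fixed vertical half-line from the top border down to the centre
            column = rotated[: int(cy), int(cx)]
            fat = (column > fat_range[0]) & (column < fat_range[1])
            results.append((angle, float(fat.sum()) * pixel_mm))
        return results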

  6. Development of computerized risk management tool

    International Nuclear Information System (INIS)

    Kil Yoo Kim; Mee Jung Hwang; Seung Cheol Jang; Sang Hoon Han; Tae Woon Kim

    1997-01-01

    The author describes the efforts made in the development of a computerized risk management tool: (1) development of a risk monitor, Risk Monster, (2) improvement of McFarm (Missing Cutsets Finding Algorithm for Risk Monitor) and finally (3) development of a reliability database management system, KwDBMan. Risk Monster supports plant operators and maintenance schedulers in monitoring plant risk and avoiding high peak risk by rearranging the maintenance work schedule. The improved McFarm significantly increased the calculation speed of Risk Monster for cases of supporting system OOS (Out Of Service). KwDBMan manages event data, generic data and CCF (Common Cause Failure) data to support Risk Monster as well as the PSA tool KIRAP (KAERI Integrated Reliability Analysis Package)

  7. Paperback atlas of anatomical sectional images: Computerized tomography and NMR imaging. Vol. 1. Head, neck, vertebral column, joints; Taschenatlas der Schnittbildanatomie: Computertomographie und Kernspintomographie. Bd. 1. Kopf, Hals, Wirbelsaeule, Gelenke

    Energy Technology Data Exchange (ETDEWEB)

    Moeller, T.B. [Caritas-Krankenhaus, Dillingen (Germany); Reif, E. [Caritas-Krankenhaus, Dillingen (Germany)

    1993-12-31

    Using the nomenclature relating to X-ray findings, the paperback atlas provides a concise, yet accurate description of fine anatomical structures visualized by sectional imaging procedures. Each of the approx. 250 sample images shown for the regions of the head (including neurocranium), vertebral column, neck, thorax, abdomen and musculoskeletal system (including joints) is supplemented with a drawing that permits an immediate identification of any structure of interest. (orig.) [German] The pocket atlas describes, in compact form and using X-ray-specific nomenclature, the anatomical details of sectional imaging diagnostics. Each of the roughly 250 exemplary sectional images from the regions of the head (incl. neurocranium), vertebral column, neck, thorax, abdomen and musculoskeletal system (incl. joints) is accompanied by a drawing that enables the structure of interest to be found quickly. (orig.)

  8. An Ibm PC/AT-Based Image Acquisition And Processing System For Quantitative Image Analysis

    Science.gov (United States)

    Kim, Yongmin; Alexander, Thomas

    1986-06-01

    In recent years, a large number of applications have been developed for image processing systems in the area of biological imaging. We have already finished the development of a dedicated microcomputer-based image processing and analysis system for quantitative microscopy. The system's primary function has been to facilitate and ultimately automate quantitative image analysis tasks such as the measurement of cellular DNA contents. We have recognized from this development experience, and interaction with system users, biologists and technicians, that the increasingly widespread use of image processing systems, and the development and application of new techniques for utilizing the capabilities of such systems, would generate a need for some kind of inexpensive general purpose image acquisition and processing system specially tailored for the needs of the medical community. We are currently engaged in the development and testing of hardware and software for a fairly high-performance image processing computer system based on a popular personal computer. In this paper, we describe the design and development of this system. Biological image processing computer systems have now reached a level of hardware and software refinement where they could become convenient image analysis tools for biologists. The development of a general purpose image processing system for quantitative image analysis that is inexpensive, flexible, and easy-to-use represents a significant step towards making the microscopic digital image processing techniques more widely applicable not only in a research environment as a biologist's workstation, but also in clinical environments as a diagnostic tool.

  9. Towards automatic quantitative analysis of cardiac MR perfusion images

    NARCIS (Netherlands)

    Breeuwer, M.; Quist, M.; Spreeuwers, Lieuwe Jan; Paetsch, I.; Al-Saadi, N.; Nagel, E.

    2001-01-01

    Magnetic Resonance Imaging (MRI) is a powerful technique for imaging cardiovascular diseases. The introduction of cardiovascular MRI into clinical practice is however hampered by the lack of efficient and reliable automatic image analysis methods. This paper focuses on the automatic evaluation of

  10. Subsurface offset behaviour in velocity analysis with extended reflectivity images

    NARCIS (Netherlands)

    Mulder, W.A.

    2013-01-01

    Migration velocity analysis with the constant-density acoustic wave equation can be accomplished by the focusing of extended migration images, obtained by introducing a subsurface shift in the imaging condition. A reflector in a wrong velocity model will show up as a curve in the extended image. In

  11. Visual Analytics Applied to Image Analysis : From Segmentation to Classification

    NARCIS (Netherlands)

    Rauber, Paulo

    2017-01-01

    Image analysis is the field of study concerned with extracting information from images. This field is immensely important for commercial and scientific applications, from identifying people in photographs to recognizing diseases in medical images. The goal behind the work presented in this thesis is

  12. Mesh Processing in Medical-Image Analysis-a Tutorial

    DEFF Research Database (Denmark)

    Levine, Joshua A.; Paulsen, Rasmus Reinhold; Zhang, Yongjie

    2012-01-01

    Medical-image analysis requires an understanding of sophisticated scanning modalities, constructing geometric models, building meshes to represent domains, and downstream biological applications. These four steps form an image-to-mesh pipeline. For research in this field to progress, the imaging...

  13. Intrasubject registration for change analysis in medical imaging

    NARCIS (Netherlands)

    Staring, M.

    2008-01-01

    Image matching is important for the comparison of medical images. Comparison is of clinical relevance for the analysis of differences due to changes in the health of a patient. For example, when a disease is imaged at two time points, then one wants to know if it is stable, has regressed, or

  14. DOE transporation programs - computerized techniques

    Energy Technology Data Exchange (ETDEWEB)

    Joy, D.S.; Johnson, P.E.; Fore, C.S.; Peterson, B.E.

    1983-01-01

    One of the major thrusts of the transportation programs at the Oak Ridge National Laboratory has been the development of a number of computerized transportation programs and data bases. The U.S. Department of Energy (DOE) is supporting these efforts through the Transportation Technology Center at Sandia National Laboratories and the Transportation Operations and Traffic Management (TOTM) organization at DOE Headquarters. Initially this project was centered upon research activities. However, since these tools provide traffic managers and key personnel involved in preshipment planning with a unique resource for ensuring that the movement of radioactive materials can be properly accomplished, additional interest and support is coming from the operational side of DOE. The major accomplishments include the development of two routing models (one for rail shipments and the other for highway shipments), an emergency response assistance program, and two data bases containing pertinent legislative and regulatory information. This paper discusses the most recent advances in, and additions to, these computerized techniques and provides examples of how they are used.

  15. Computerized index for teaching files

    International Nuclear Information System (INIS)

    Bramble, J.M.

    1989-01-01

    A computerized index can be used to retrieve cases from a teaching file that have radiographic findings similar to an unknown case. The probability that a user will review cases with a correct diagnosis was estimated with use of radiographic findings of arthritis in hand radiographs of 110 cases from a teaching file. The nearest-neighbor classification algorithm was used as a computer index to the 110 cases of arthritis. Each case was treated as an unknown and input to the computer index. The accuracy of the computer index in retrieving cases with the same diagnosis (including rheumatoid arthritis, gout, psoriatic arthritis, inflammatory osteoarthritis, and pyrophosphate arthropathy) was measured. A Bayes classifier algorithm was also tested on the same database. Results are presented. The estimated accuracy of the nearest-neighbor algorithm was 83%. By comparison, the estimated accuracy of the Bayes classifier algorithm was 78%. Conclusions: A computerized index to a teaching file based on the nearest-neighbor algorithm should allow the user to review cases with the correct diagnosis of an unknown case, by entering the findings of the unknown case
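
    As a hedged sketch of how such a nearest-neighbour index can work (not the author's implementation), the Python fragment below encodes each teaching-file case as a binary vector of radiographic findings and retrieves the closest cases by Hamming distance; the findings, labels and case matrix are hypothetical.

    import numpy as np

    def nearest_cases(findings, case_matrix, diagnoses, k=5):
        """Return the diagnoses of the k teaching-file cases closest to the query.

        findings    : binary vector of radiographic findings for the unknown case
        case_matrix : (n_cases, n_findings) binary matrix for the teaching file
        diagnoses   : list of n_cases diagnosis labels
        """
        query = np.asarray(findings)
        distances = (case_matrix != query).sum(axis=1)   # Hamming distance
        order = np.argsort(distances)[:k]
        return [(diagnoses[i], int(distances[i])) for i in order]

    # hypothetical miniature teaching file with four findings per case
    cases = np.array([[1, 0, 1, 0], [1, 1, 0, 0], [0, 0, 1, 1]])
    labels = ["rheumatoid arthritis", "gout", "osteoarthritis"]
    print(nearest_cases([1, 0, 1, 1], cases, labels, k=2))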

  16. Computerized lateral-shear interferometer

    Science.gov (United States)

    Hasegan, Sorin A.; Jianu, Angela; Vlad, Valentin I.

    1998-07-01

    A lateral-shear interferometer, coupled with a computer for laser wavefront analysis, is described. A CCD camera is used to transfer the fringe images through a frame-grabber into a PC. 3D phase maps are obtained by fringe pattern processing using a new algorithm for direct spatial reconstruction of the optical phase. The program describes phase maps by Zernike polynomials yielding an analytical description of the wavefront aberration. A compact lateral-shear interferometer has been built using a laser diode as light source, a CCD camera and a rechargeable battery supply, which allows measurements in-situ, if necessary.
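
    As a hedged illustration of the Zernike-fitting step (not the instrument's actual algorithm), the Python sketch below fits a few low-order, un-normalised Zernike-like terms to an unwrapped phase map by linear least squares; the basis definitions and pupil handling are simplified assumptions.

    import numpy as np

    def fit_low_order_zernike(phase, mask=None):
        """Least-squares fit of a few low-order Zernike-like terms to a phase map.

        phase : 2-D array of unwrapped phase values sampled over a unit-circle pupil
        Returns coefficients for piston, tilt x, tilt y, defocus, astigmatism 0/45.
        """
        ny, nx = phase.shape
        y, x = np.mgrid[-1:1:ny * 1j, -1:1:nx * 1j]
        r2 = x**2 + y**2
        if mask is None:
            mask = r2 <= 1.0                      # keep the unit pupil only
        # simplified (un-normalised) basis functions
        basis = np.stack([np.ones_like(x), x, y, 2 * r2 - 1, x**2 - y**2, 2 * x * y])
        A = basis[:, mask].T                      # (n_pixels, n_terms)
        coeffs, *_ = np.linalg.lstsq(A, phase[mask], rcond=None)
        return coeffs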

  17. Image quality preferences among radiographers and radiologists. A conjoint analysis

    International Nuclear Information System (INIS)

    Ween, Borgny; Kristoffersen, Doris Tove; Hamilton, Glenys A.; Olsen, Dag Rune

    2005-01-01

    Purpose: The aim of this study was to investigate the image quality preferences among radiographers and radiologists. The radiographers' preferences are mainly related to technical parameters, whereas radiologists assess image quality based on diagnostic value. Methods: A conjoint analysis was undertaken to survey image quality preferences; the study included 37 respondents: 19 radiographers and 18 radiologists. Digital urograms were post-processed into 8 images with different properties of image quality for 3 different patients. The respondents were asked to rank the images according to their personally perceived subjective image quality. Results: Nearly half of the radiographers and radiologists were consistent in their ranking of the image characterised as 'very best image quality'. The analysis showed, moreover, that chosen filtration level and image intensity were responsible for 72% and 28% of the preferences, respectively. The corresponding figures for each of the two professions were 76% and 24% for the radiographers, and 68% and 32% for the radiologists. In addition, there were larger variations in image preferences among the radiologists, as compared to the radiographers. Conclusions: Radiographers revealed a more consistent preference than the radiologists with respect to image quality. There is a potential for image quality improvement by developing sets of image property criteria
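
    Conjoint analysis derives attribute importances from stated preferences; as a rough, hypothetical sketch of that idea (not the study's actual model), the Python fragment below converts one respondent's rankings of eight images into scores, fits dummy-coded part-worths for filtration level and intensity by least squares, and expresses each attribute's range of part-worths as a percentage importance.

    import numpy as np

    # hypothetical design: 8 images = 4 filtration levels x 2 intensity levels,
    # rank 1 = best; convert ranks to scores so that larger = preferred
    filtration = np.repeat([0, 1, 2, 3], 2)
    intensity = np.tile([0, 1], 4)
    ranks = np.array([3, 5, 1, 2, 4, 6, 7, 8])
    scores = 9 - ranks

    # dummy-code the attributes and fit part-worths by ordinary least squares
    X = np.column_stack([np.ones(8),
                         np.eye(4)[filtration][:, 1:],   # filtration levels 1..3 vs level 0
                         intensity])
    beta, *_ = np.linalg.lstsq(X, scores, rcond=None)
    part_worth_range = {"filtration": np.ptp(np.r_[0, beta[1:4]]), "intensity": abs(beta[4])}
    total = sum(part_worth_range.values())
    importance = {k: 100 * v / total for k, v in part_worth_range.items()}
    print(importance)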

  18. Quality of computerized blast load simulation for non-linear dynamic ...

    African Journals Online (AJOL)

    Quality of computerized blast load simulation for non-linear dynamic response ... commercial software system and a special-purpose, blast-specific software product to ... depend both on the analysis model of choice and the stand-off distances.

  19. Study on forefoot by computerized tomography

    Energy Technology Data Exchange (ETDEWEB)

    Machida, Eiichi (Nihon Univ., Tokyo. School of Medicine)

    1983-10-01

    Computerized tomography (CT) was used to study coronal sections of the forefoot in both normal and abnormal human feet. CT images of the transverse arches at the metatarsal head, middle and base of the shaft were classified into five patterns. In the pattern most commonly found in normal feet, the second metatarsus appeared elevated above the other metatarsal bones at all points, and there was a gradual and even reduction in elevation from the second to the fifth metatarsal. In cases of hallux valgus, however, a variety of deformities were noted in the arc of the second to fifth metatarsals, particularly at the head. The rotation of the first metatarsus and shift of the sesamoids were measured from CT images at the head of the first metatarsus. In hallux valgus, both the rotation and the sesamoid shift appeared to have a wider angle than in the case of normal feet. In normal feet, the differences between the rotation of the first metatarsus and shift of the sesamoids were relatively small, whereas in hallux valgus there was a much greater degree of variation. Furthermore, while in normal feet the variation in rotation of the first metatarsus and the sesamoid shift both tended to be either great or small, in hallux valgus a large degree of sesamoid shift was sometimes found in combination with a small degree of rotation of the first metatarsus.

  20. Study on forefoot by computerized tomography

    International Nuclear Information System (INIS)

    Machida, Eiichi

    1983-01-01

    Computerized tomography (CT) was used to study coronal sections of the forefoot in both normal and abnormal human feet. CT images of the transverse arches at the metatarsal head, middle and base of the shaft were classified into five patterns. In the pattern most commonly found in normal feet, the second metatarsus appeared elevated above the other metatarsal bones at all points, and there was a gradual and even reduction in elevation from the second to the fifth metatarsal. In cases of hallux valgus, however, a variety of deformities were noted in the arc of the second to fifth metatarsals, particularly at the head. The rotation of the first metatarsus and shift of the sesamoids were measured from CT images at the head of the first metatarsus. In hallux valgus, both the rotation and the sesamoid shift appeared to have a wider angle than in the case of normal feet. In normal feet, the differences between the rotation of the first metatarsus and shift of the sesamoids were relatively small, whereas in hallux valgus there was a much greater degree of variation. Furthermore, while in normal feet the variation in rotation of the first metatarsus and the sesamoid shift both tended to be either great or small, in hallux valgus a large degree of sesamoid shift was sometimes found in combination with a small degree of rotation of the first metatarsus. (author)

  1. Work improvement by computerizing the process of shielding block production

    International Nuclear Information System (INIS)

    Kang, Dong Hyuk; Jeong, Do Hyeong; Kang, Dong Yoon; Jeon, Young Gung; Hwang, Jae Woong

    2013-01-01

    The introduction of a CR (Computed Radiography) system created a process of printing therapy irradiation images and converting the degree of enlargement. The aim of this work was to increase job efficiency and contribute to work improvement by using a computerized method, with home-grown software, to simplify this process. Microsoft EXCEL (ver. 2007) and VISUAL BASIC (ver. 6.0) were used to make the software. A window for each shield block was designed to enter patients' treatment information. Distances on the digital images were measured, the measured data were entered into the Excel program to calculate the degree of enlargement, and printouts were produced to manufacture shield blocks. By computerizing the existing method with this program, the degree of enlargement can easily be calculated and patients' treatment information can be entered into the printouts by using a macro function. As a result, errors in calculation which may occur during the production process, or errors in which the treatment information is delivered wrongly, can be reduced. In addition, with the simplification of the conversion process for the degree of enlargement, no copy machine was needed, which reduced the use of paper. Work has been improved by computerizing the process of block production and applying it in practice, which simplifies the existing method. This software can be adapted to the actual conditions of each hospital in various ways using the features of EXCEL and VISUAL BASIC, which have already been proven and widely used

  2. Work improvement by computerizing the process of shielding block production

    Energy Technology Data Exchange (ETDEWEB)

    Kang, Dong Hyuk; Jeong, Do Hyeong; Kang, Dong Yoon; Jeon, Young Gung; Hwang, Jae Woong [Proton Therapy Center, National Cancer Center, Goyang (Korea, Republic of)

    2013-09-15

    The introduction of a CR (Computed Radiography) system created a process of printing therapy irradiation images and converting the degree of enlargement. The aim of this work was to increase job efficiency and contribute to work improvement by using a computerized method, with home-grown software, to simplify this process. Microsoft EXCEL (ver. 2007) and VISUAL BASIC (ver. 6.0) were used to make the software. A window for each shield block was designed to enter patients' treatment information. Distances on the digital images were measured, the measured data were entered into the Excel program to calculate the degree of enlargement, and printouts were produced to manufacture shield blocks. By computerizing the existing method with this program, the degree of enlargement can easily be calculated and patients' treatment information can be entered into the printouts by using a macro function. As a result, errors in calculation which may occur during the production process, or errors in which the treatment information is delivered wrongly, can be reduced. In addition, with the simplification of the conversion process for the degree of enlargement, no copy machine was needed, which reduced the use of paper. Work has been improved by computerizing the process of block production and applying it in practice, which simplifies the existing method. This software can be adapted to the actual conditions of each hospital in various ways using the features of EXCEL and VISUAL BASIC, which have already been proven and widely used.
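
    The EXCEL/VISUAL BASIC workbook itself is not reproduced here; as a minimal stand-in for the calculation it automates, the Python sketch below derives the degree of enlargement from a reference distance measured on the digital image and scales the block contour accordingly. The 100 mm reference and the contour points are hypothetical.

    def degree_of_enlargement(measured_on_image_mm, true_distance_mm):
        """Magnification factor from a reference distance measured on the digital image."""
        return measured_on_image_mm / true_distance_mm

    def scale_block_contour(contour_mm, factor):
        """Scale shield-block contour points (x, y) by the enlargement factor."""
        return [(x * factor, y * factor) for x, y in contour_mm]

    # hypothetical example: a 100 mm graticule measures 143 mm on the printed image
    m = degree_of_enlargement(143.0, 100.0)
    print(round(m, 3), scale_block_contour([(10, 0), (10, 25), (-10, 25)], m))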

  3. Convergence analysis in near-field imaging

    International Nuclear Information System (INIS)

    Bao, Gang; Li, Peijun

    2014-01-01

    This paper is devoted to the mathematical analysis of the direct and inverse modeling of the diffraction by a perfectly conducting grating surface in the near-field regime. It is motivated by our effort to analyze recent significant numerical results, in order to solve a class of inverse rough surface scattering problems in near-field imaging. In a model problem, the diffractive grating surface is assumed to be a small and smooth deformation of a plane surface. On the basis of the variational method, the direct problem is shown to have a unique weak solution. An analytical solution is introduced as a convergent power series in the deformation parameter by using the transformed field and Fourier series expansions. A local uniqueness result is proved for the inverse problem where only a single incident field is needed. On the basis of the analytic solution of the direct problem, an explicit reconstruction formula is presented for recovering the grating surface function with resolution beyond the Rayleigh criterion. Error estimates for the reconstructed grating surface are established with fully revealed dependence on such quantities as the surface deformation parameter, measurement distance, noise level of the scattering data, and regularity of the exact grating surface function. (paper)

  4. IMAGE ANALYSIS FOR MODELLING SHEAR BEHAVIOUR

    Directory of Open Access Journals (Sweden)

    Philippe Lopez

    2011-05-01

    Full Text Available Through laboratory research performed over the past ten years, many of the critical links between fracture characteristics and hydromechanical and mechanical behaviour have been made for individual fractures. One of the remaining challenges at the laboratory scale is to directly link fracture morphology and shear behaviour with changes in stress and shear direction. A series of laboratory experiments were performed on cement mortar replicas of a granite sample with a natural fracture perpendicular to the axis of the core. Results show that there is a strong relationship between the fracture's geometry, its mechanical behaviour under shear stress and the resulting damage. Image analysis, geostatistical, stereological and directional data techniques are applied in combination to the experimental data. The results highlight the role of the geometric characteristics of the fracture surfaces (surface roughness, size, shape, locations and orientations of asperities to be damaged) in shear behaviour. A notable improvement in the understanding of shear is that shear behaviour is controlled by the apparent dip in the shear direction of the elementary facets forming the fracture.

  5. Measure by image analysis of industrial radiographs

    International Nuclear Information System (INIS)

    Brillault, B.

    1988-01-01

    A digital radiographic picture processing system for non-destructive testing is intended to provide the expert with a computer tool to precisely quantify radiographic images. The author describes the main problems, from image formation to image characterization. She also insists on the necessity of defining a precise process in order to automate the system. Some examples illustrate the efficiency of digital processing of radiographic images [fr]

  6. MORPHOLOGY BY IMAGE ANALYSIS K. Belaroui and M. N Pons ...

    African Journals Online (AJOL)

    31 Dec. 2012 ... Keywords: Characterization; particle size; morphology; image analysis; porous media. 1. INTRODUCTION. The power of image analysis as ... into a digital image by means of an analogue-to-digital (A/D) converter. The points of the image are arranged on a square grid, ...

  7. PIZZARO: Forensic analysis and restoration of image and video data

    Czech Academy of Sciences Publication Activity Database

    Kamenický, Jan; Bartoš, Michal; Flusser, Jan; Mahdian, Babak; Kotera, Jan; Novozámský, Adam; Saic, Stanislav; Šroubek, Filip; Šorel, Michal; Zita, Aleš; Zitová, Barbara; Šíma, Z.; Švarc, P.; Hořínek, J.

    2016-01-01

    Roč. 264, č. 1 (2016), s. 153-166 ISSN 0379-0738 R&D Projects: GA MV VG20102013064; GA ČR GA13-29225S Institutional support: RVO:67985556 Keywords : Image forensic analysis * Image restoration * Image tampering detection * Image source identification Subject RIV: JD - Computer Applications, Robotics Impact factor: 1.989, year: 2016 http://library.utia.cas.cz/separaty/2016/ZOI/kamenicky-0459504.pdf

  8. Computerized tomography in diffuse diseases of the liver. Pt. 2

    International Nuclear Information System (INIS)

    Helmberger, H.; Vogel, U.; Bautz, W.

    1993-01-01

    Computerized tomography is a first-line method of imaging to confirm diffuse disorders of the liver suggested by preliminary clinical and biochemical findings. If the disease is caused by an obstructed vessel, this is reliably detected. For most types of thesaurismosis, as well as hepatic steatosis and cirrhosis of the liver, approaches to quantitative determination of the spread of disease have been described in theory but have so far failed to show great merit in practice. The transition from hepatic fibrosis to cirrhosis as the final developmental stage common to all these disorders has typical features on computerized tomography. This explains why the use of this method in diffuse hepatic disease offers particular advantages as regards the detection of complications occurring at an advanced stage or the diagnosis of changes developing into malignancies. (orig.) [de]

  9. Computerized tomography of the mandibular joints and masticatory muscles

    International Nuclear Information System (INIS)

    Huels, A.B.

    1981-01-01

    A methodology for computerized tomography of the mandibular joints was developed and applied in 80 test persons. Imaging of the mandibular joints is possible with a tomographic technique with 5 mm overlap, full utilisation of the enlargement capacity of the imaging device, and combined use of transversal and coronal tomography. The method yields full latero-medial, cranio-caudal and anterior-posterior views of the condyle and fossa contours, free of interferences and of distortions caused by the projection. Positional diagnoses are thus possible, as well as diagnoses of pathological structural changes. (orig./MG) [de]

  10. Computerized Classification Testing with the Rasch Model

    Science.gov (United States)

    Eggen, Theo J. H. M.

    2011-01-01

    If classification in a limited number of categories is the purpose of testing, computerized adaptive tests (CATs) with algorithms based on sequential statistical testing perform better than estimation-based CATs (e.g., Eggen & Straetmans, 2000). In these computerized classification tests (CCTs), the Sequential Probability Ratio Test (SPRT) (Wald,…
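
    As a hedged sketch of the SPRT decision rule in such a CCT under the Rasch model (a generic textbook form, not the specific variant studied in the article), the Python fragment below accumulates the log likelihood ratio of the observed item scores at two ability points around the cut score and compares it with Wald's boundaries; the item difficulties and ability points are illustrative.

    import numpy as np

    def rasch_p(theta, b):
        """Rasch model probability of a correct response."""
        return 1.0 / (1.0 + np.exp(-(theta - b)))

    def sprt_decision(responses, difficulties, theta0, theta1, alpha=0.05, beta=0.05):
        """Wald's SPRT for classifying an examinee below theta0 or above theta1.

        responses    : 0/1 item scores administered so far
        difficulties : Rasch item difficulties of those items
        Returns 'pass', 'fail' or 'continue'.
        """
        x = np.asarray(responses, float)
        b = np.asarray(difficulties, float)
        p1, p0 = rasch_p(theta1, b), rasch_p(theta0, b)
        log_lr = np.sum(x * np.log(p1 / p0) + (1 - x) * np.log((1 - p1) / (1 - p0)))
        upper, lower = np.log((1 - beta) / alpha), np.log(beta / (1 - alpha))
        if log_lr >= upper:
            return "pass"
        if log_lr <= lower:
            return "fail"
        return "continue"

    print(sprt_decision([1, 1, 0, 1, 1], [-0.5, 0.0, 0.2, 0.4, 0.8], theta0=-0.2, theta1=0.2))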

  11. New approaches in intelligent image analysis techniques, methodologies and applications

    CERN Document Server

    Nakamatsu, Kazumi

    2016-01-01

    This book presents an Introduction and 11 independent chapters, which are devoted to various new approaches of intelligent image processing and analysis. The book also presents new methods, algorithms and applied systems for intelligent image processing, on the following basic topics: Methods for Hierarchical Image Decomposition; Intelligent Digital Signal Processing and Feature Extraction; Data Clustering and Visualization via Echo State Networks; Clustering of Natural Images in Automatic Image Annotation Systems; Control System for Remote Sensing Image Processing; Tissue Segmentation of MR Brain Images Sequence; Kidney Cysts Segmentation in CT Images; Audio Visual Attention Models in Mobile Robots Navigation; Local Adaptive Image Processing; Learning Techniques for Intelligent Access Control; Resolution Improvement in Acoustic Maps. Each chapter is self-contained with its own references. Some of the chapters are devoted to the theoretical aspects while the others are presenting the practical aspects and the...

  12. Neurosurgical operating computerized tomographic scanner system

    International Nuclear Information System (INIS)

    Okudera, Hiroshi; Sugita, Kenichiro; Kobayashi, Shigeaki; Kimishima, Sakae; Yoshida, Hisashi.

    1988-01-01

    A neurosurgical operating computerized tomography scanner system is presented. This system has been developed for obtaining intra- and postoperative CT images in the operating room. A TCT-300 scanner (manufactured by the Toshiba Co., Tokyo) is placed in the operating room. The realization of a true intraoperative CT image requires certain improvements in the CT scanner and operating table. To adjust the axis of the co-ordinates of the motor system of the MST-7000 microsurgical operating table (manufactured by the Mizuho Ika Co., Tokyo) to the CT scanner, we have designed an interface and a precise motor system so that the computer of the CT scanner can directly control the movement of the operating table. Furthermore, a new head-fixation system has been designed for producing artifact-free intraoperative CT images. The head-pins of the head-fixation system are made of carbon-fiber bars and titanium tips. A simulation study of the total system in the operating room with the CT scanner, operating table, and head holder using a skull model yielded a degree of error similar to that in the phantom testing of the original scanner. Three patients underwent resection of a glial tumor using this system. Intraoperative CT scans taken after dural opening showed a bulging of the cortex, a shift in the central structure, and a displacement of the cortical subarachnoid spaces under the influence of gravity. With a contrast medium the edge of the surrounding brain after resection was enhanced and the residual tumor mass was demonstrated clearly. This system makes it possible to obtain a noninvasive intraoperative image in a situation where structural shifts are taking place. (author)

  13. Analysis of engineering drawings and raster map images

    CERN Document Server

    Henderson, Thomas C

    2013-01-01

    Presents up-to-date methods and algorithms for the automated analysis of engineering drawings and digital cartographic maps Discusses automatic engineering drawing and map analysis techniques Covers detailed accounts of the use of unsupervised segmentation algorithms to map images

  14. Computerizing primary schools in rural kenya

    DEFF Research Database (Denmark)

    Ogembo, J.G.; Ngugi, B.; Pelowski, Matthew John

    2012-01-01

    This paper investigates the outstanding challenges facing primary schools' computerization in rural Kenya. Computerization of schools is often envisaged as a 'magic', or at least a particularly efficient, solution to many of the problems that developing countries face in improving primary school … questions surrounding this endeavour. Specifically: 1.) what problems do rural schools actually want to solve with computerization; 2.) is computerization the most important priority for rural schools; 3.) are schools ready, in terms of infrastructure, for a computer in the classroom; or 4.) might … and protective roofing - posing severe challenges to the outstanding conception of computerization. We consider these results and make recommendations for better adapting programs for computer introduction, and also suggest the use of new innovative devices, such as cell phones, which might already have overcome …

  15. Energy functionals for medical image segmentation: choices and consequences

    OpenAIRE

    McIntosh, Christopher

    2011-01-01

    Medical imaging continues to permeate the practice of medicine, but automated yet accurate segmentation and labeling of anatomical structures continues to be a major obstacle to computerized medical image analysis. Though there exist numerous approaches to medical image segmentation, one in particular has gained increasing popularity: energy minimization-based techniques, and the large set of methods encompassed therein. With these techniques an energy function must be chosen, segmentations...

  16. ANALYSIS OF SST IMAGES BY WEIGHTED ENSEMBLE TRANSFORM KALMAN FILTER

    OpenAIRE

    Sai , Gorthi; Beyou , Sébastien; Memin , Etienne

    2011-01-01

    This paper presents a novel, efficient scheme for the analysis of Sea Surface Temperature (SST) ocean images. We consider the estimation of the velocity fields and vorticity values from a sequence of oceanic images. The contribution of this paper lies in proposing a novel, robust and simple approach based on the Weighted Ensemble Transform Kalman filter (WETKF) data assimilation technique for the analysis of real SST images that may contain coast regions or large areas of ...

  17. An introduction to diffusion tensor image analysis.

    Science.gov (United States)

    O'Donnell, Lauren J; Westin, Carl-Fredrik

    2011-04-01

    Diffusion tensor magnetic resonance imaging (DTI) is a relatively new technology that is popular for imaging the white matter of the brain. This article provides a basic and broad overview of DTI to enable the reader to develop an intuitive understanding of these types of data, and an awareness of their strengths and weaknesses. Copyright © 2011 Elsevier Inc. All rights reserved.
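    As a concrete companion to this overview, the sketch below computes the two scalar measures most commonly derived from a diffusion tensor, fractional anisotropy (FA) and mean diffusivity (MD), from the tensor's eigenvalues. The example tensor values are hypothetical; real tensors would be fitted from diffusion-weighted MRI data.

```python
import numpy as np

def fractional_anisotropy(D):
    """FA and mean diffusivity of a single 3x3 diffusion tensor."""
    evals = np.linalg.eigvalsh(D)                # eigenvalues of the tensor
    md = evals.mean()                            # mean diffusivity
    num = np.sqrt(((evals - md) ** 2).sum())
    den = np.sqrt((evals ** 2).sum())
    fa = np.sqrt(1.5) * num / den if den > 0 else 0.0
    return fa, md

# A tensor with strong diffusion along one axis (e.g. a white-matter-like voxel)
D = np.diag([1.7e-3, 0.3e-3, 0.3e-3])
fa, md = fractional_anisotropy(D)
print(f"FA = {fa:.3f}, MD = {md:.2e} mm^2/s")
```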

  18. Biomedical Image Analysis: Rapid prototyping with Mathematica

    NARCIS (Netherlands)

    Haar Romenij, ter B.M.; Almsick, van M.A.

    2004-01-01

    Digital acquisition techniques have caused an explosion in the production of medical images, especially with the advent of multi-slice CT and volume MRI. One third of the financial investments in a modern hospital's equipment are dedicated to imaging. Emerging screening programs add to this flood of

  19. Multi-spectral Image Analysis for Astaxanthin Coating Classification

    DEFF Research Database (Denmark)

    Ljungqvist, Martin Georg; Ersbøll, Bjarne Kjær; Nielsen, Michael Engelbrecht

    2011-01-01

    Industrial quality inspection using image analysis on astaxanthin coating in aquaculture feed pellets is of great importance for automatic production control. In this study multi-spectral image analysis of pellets was performed using LDA, QDA, SNV and PCA on pixel level and mean value of pixels...
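    The record is truncated, but part of the analysis it names (PCA for dimensionality reduction followed by LDA classification of mean pellet spectra) can be sketched with scikit-learn. The data below are synthetic stand-ins for multi-spectral pellet measurements; the band count, class structure and noise level are assumptions, not values from the study.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline

# Hypothetical data: one mean spectrum per pellet (n_pellets x n_bands)
# and a coating-level label per pellet.
rng = np.random.default_rng(0)
n_bands = 19
X = np.vstack([rng.normal(loc=mu, scale=0.05, size=(50, n_bands))
               for mu in (0.2, 0.4, 0.6)])          # three coating levels
y = np.repeat([0, 1, 2], 50)

# PCA to reduce the spectral dimension, then LDA as the classifier
clf = make_pipeline(PCA(n_components=5), LinearDiscriminantAnalysis())
scores = cross_val_score(clf, X, y, cv=5)
print("cross-validated accuracy:", scores.mean())
```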

  20. Geographic Object-Based Image Analysis: Towards a new paradigm

    NARCIS (Netherlands)

    Blaschke, T.; Hay, G.J.; Kelly, M.; Lang, S.; Hofmann, P.; Addink, E.A.; Queiroz Feitosa, R.; van der Meer, F.D.; van der Werff, H.M.A.; van Coillie, F.; Tiede, A.

    2014-01-01

    The amount of scientific literature on (Geographic) Object-based Image Analysis – GEOBIA has been and still is sharply increasing. These approaches to analysing imagery have antecedents in earlier research on image segmentation and use GIS-like spatial analysis within classification and feature

  1. A short introduction to image analysis - Matlab exercises

    DEFF Research Database (Denmark)

    Hansen, Michael Adsetts Edberg

    2000-01-01

    This document contains a short introduction to image analysis. In addition, small exercises have been prepared in order to support the theoretical understanding.

  2. Analysis of licensed South African diagnostic imaging equipment ...

    African Journals Online (AJOL)

    Objective: To conduct an analysis of all registered South African (SA) diagnostic radiology equipment, assess the number of equipment units per capita by imaging modality, and compare SA figures with published ...

  3. Computerized analysis of snoring in sleep apnea syndrome Análise computadorizada do ronco na síndrome da apneia do sono

    Directory of Open Access Journals (Sweden)

    Fabio Koiti Shiomi

    2011-08-01

    The International Classification of Sleep Disorders lists 90 disorders. Manifestations, such as snoring, are important signs in the diagnosis of the Obstructive Sleep Apnea Syndrome; they are also socially undesirable. OBJECTIVE: The aim of this paper was to present and evaluate a computerized tool that automatically identifies snoring and to highlight the importance of establishing the duration of each snoring event in OSA patients. MATERIAL AND METHODS: The low-sampling-rate (200 Hz) electrical signal that indicates snoring was measured during polysomnography. The snoring sound of 31 patients was automatically classified by the software. The Kappa approach was applied to measure agreement between the automatic detection software and a trained observer. Student's t test was applied to evaluate differences in the duration of snoring episodes among simple snorers and OSA snorers. RESULTS: Of a total of 43,976 snoring episodes, the software sensitivity was 99.26%, the specificity was 97.35%, and Kappa was 0.96. We found a statistically significant difference (p …) in the duration of snoring episodes between simple snorers and OSA snorers.
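    The agreement statistic reported here, Cohen's kappa between the automatic detector and a trained observer, is straightforward to reproduce on labelled episodes. The sketch below is a generic kappa computation on toy binary labels, not the authors' software.

```python
import numpy as np

def cohens_kappa(a, b):
    """Cohen's kappa for two binary raters (e.g. software vs. trained observer)."""
    a, b = np.asarray(a), np.asarray(b)
    po = np.mean(a == b)                         # observed agreement
    p_yes = np.mean(a) * np.mean(b)              # chance agreement on 'snore'
    p_no = (1 - np.mean(a)) * (1 - np.mean(b))   # chance agreement on 'no snore'
    pe = p_yes + p_no
    return (po - pe) / (1 - pe)

# Toy example: 1 = snoring episode detected, 0 = not detected
software = [1, 1, 0, 1, 0, 1, 1, 0, 1, 1]
observer = [1, 1, 0, 1, 0, 1, 0, 0, 1, 1]
print(f"kappa = {cohens_kappa(software, observer):.2f}")
```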

  4. Psychometrics behind Computerized Adaptive Testing.

    Science.gov (United States)

    Chang, Hua-Hua

    2015-03-01

    The paper provides a survey of 18 years' progress that my colleagues, students (both former and current) and I made in a prominent research area in Psychometrics: Computerized Adaptive Testing (CAT). We start with a historical review of the establishment of a large sample foundation for CAT. It is worth noting that the asymptotic results were derived under the framework of Martingale Theory, a very theoretical perspective of Probability Theory, which may seem unrelated to educational and psychological testing. In addition, we address a number of issues that emerged from large scale implementation and show how theoretical work can help solve these problems. Finally, we propose that CAT technology can be very useful to support individualized instruction on a mass scale. We show that even paper-and-pencil-based tests can be made adaptive to support classroom teaching.

  5. Chinese computerized nuclear data library

    International Nuclear Information System (INIS)

    Liang Qichang; Cai Dunjiu

    1996-01-01

    The Second Version of the Chinese Evaluated Nuclear Data Library (CENDL-2) includes complete neutron nuclear data sets for 54 important elements and isotopes used in nuclear science and engineering, with incident neutron energies from 10^-5 eV to 20 MeV; the international universal format ENDF/B-6 was adopted. The Chinese computerized nuclear data library has now been developed and put into operation: users can access on-line the main evaluated neutron reaction data libraries in the world and the EXFOR experimental nuclear data library from a computer terminal through the supporting software system, carry out nuclear engineering calculations or nuclear data evaluation directly, and draw on the resources of our nuclear data libraries for the development of nuclear energy and nuclear technology applications.

  6. Computerized evaluation of flood impact

    International Nuclear Information System (INIS)

    Gagnon, J.; Quach, T.T.; Marche, C.; Lessard, G.

    1998-01-01

    A computerized evaluation process for assessing the economic impacts of a potential dam failure is described. The DOMINO software, which was developed by Hydro-Quebec, takes into account flow data from dam break simulations of floods, the territory involved, plus the economic evaluations of the real estate and infrastructures affected. Some examples of software applications and impact evaluations are presented. The principal elements involved in estimating economic or other types of impacts induced by natural flooding or dam failure, are: (1) flow forecasting, (2) defining the contour of the involved territory, and (3) accounting for the various impacts identified in the affected zone. Owing to its wide range of functions and utilities, DOMINO has proven to be a very useful, user-friendly and portable decision-making tool. 5 refs., 6 tabs

  7. Analysis of sharpness increase by image noise

    Science.gov (United States)

    Kurihara, Takehito; Aoki, Naokazu; Kobayashi, Hiroyuki

    2009-02-01

    Motivated by the reported increase in sharpness by image noise, we investigated how noise affects sharpness perception. We first used natural images of tree bark with different amounts of noise to see whether noise enhances sharpness. Although the result showed sharpness decreased as noise amount increased, some observers seemed to perceive more sharpness with increasing noise, while the others did not. We next used 1D and 2D uni-frequency patterns as stimuli in an attempt to reduce such variability in the judgment. The result showed, for higher frequency stimuli, sharpness decreased as the noise amount increased, while sharpness of the lower frequency stimuli increased at a certain noise level. From this result, we thought image noise might reduce sharpness at edges, but be able to improve sharpness of lower frequency component or texture in image. To prove this prediction, we experimented again with the natural image used in the first experiment. Stimuli were made by applying noise separately to edge or to texture part of the image. The result showed noise, when added to edge region, only decreased sharpness, whereas when added to texture, could improve sharpness. We think it is the interaction between noise and texture that sharpens image.
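    The study measures perceived sharpness psychophysically, which no code snippet can reproduce, but a simple objective proxy illustrates why added noise can masquerade as sharpness: a gradient-energy index rises with noise even when perceived edge quality does not. The synthetic stimulus and the metric below are illustrative assumptions, not the authors' materials.

```python
import numpy as np

def gradient_sharpness(img):
    """Mean gradient magnitude as a crude sharpness index."""
    gy, gx = np.gradient(img.astype(float))
    return np.mean(np.hypot(gx, gy))

rng = np.random.default_rng(1)
# A low-frequency sinusoidal texture standing in for the uni-frequency stimuli
x = np.linspace(0, 4 * np.pi, 256)
texture = np.outer(np.sin(x), np.sin(x)) * 50 + 128

for sigma in (0, 5, 10, 20):
    noisy = texture + rng.normal(0, sigma, texture.shape)
    print(f"noise sigma={sigma:2d}  sharpness index={gradient_sharpness(noisy):.2f}")
```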

  8. IMAGE ANALYSIS BASED ON EDGE DETECTION TECHNIQUES

    Institute of Scientific and Technical Information of China (English)

    纳瑟; 刘重庆

    2002-01-01

    A method that incorporates edge detection, Markov random fields (MRF), watershed segmentation and merging techniques was presented for performing image segmentation and edge detection tasks. It first applies edge detection to obtain a Difference In Strength (DIS) map. An initial segmentation is obtained based on K-means clustering and the minimum distance. The region process is then modeled by an MRF to obtain an image that contains regions of different intensity. The gradient values are calculated and the watershed technique is applied. The DIS value is calculated for each pixel to define all the edges (weak or strong) in the image, yielding the DIS map. This serves as prior knowledge about likely region boundaries for the next step (MRF), which produces an image carrying all edge and region information. In the MRF model, the gray level l at pixel location i in an image X depends on the gray levels of neighboring pixels. The segmentation results are improved by using the watershed algorithm. After all pixels of the segmented regions are processed, a map of primitive regions with edges is generated. The edge map is obtained using a merge process based on averaged intensity mean values. Common edge detectors that work on the MRF-segmented image are used and the results are compared. The segmentation and edge detection result is one closed boundary per actual region in the image.
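    A compressed version of the gradient-plus-markers part of such a pipeline can be written with scikit-image. This sketch substitutes a Sobel gradient for the DIS map and simple intensity thresholds for the K-means initialisation, and it omits the MRF and merging stages, so it illustrates the watershed step only.

```python
import numpy as np
from skimage import data, filters, segmentation, measure

# Gradient map as a stand-in for the "difference in strength" surface
image = data.coins()
gradient = filters.sobel(image)

# Markers from intensity thresholds (a stand-in for the K-means initial labels)
markers = np.zeros_like(image, dtype=int)
markers[image < 30] = 1      # background
markers[image > 150] = 2     # foreground (coins)

# Watershed on the gradient, guided by the markers
labels = segmentation.watershed(gradient, markers)
regions = measure.label(labels == 2)
print("detected foreground regions:", regions.max())
```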

  9. Photoacoustic image reconstruction: a quantitative analysis

    Science.gov (United States)

    Sperl, Jonathan I.; Zell, Karin; Menzenbach, Peter; Haisch, Christoph; Ketzer, Stephan; Marquart, Markus; Koenig, Hartmut; Vogel, Mika W.

    2007-07-01

    Photoacoustic imaging is a promising new way to generate unprecedented contrast in ultrasound diagnostic imaging. It differs from other medical imaging approaches, in that it provides spatially resolved information about optical absorption of targeted tissue structures. Because the data acquisition process deviates from standard clinical ultrasound, choice of the proper image reconstruction method is crucial for successful application of the technique. In the literature, multiple approaches have been advocated, and the purpose of this paper is to compare four reconstruction techniques. Thereby, we focused on resolution limits, stability, reconstruction speed, and SNR. We generated experimental and simulated data and reconstructed images of the pressure distribution using four different methods: delay-and-sum (DnS), circular backprojection (CBP), generalized 2D Hough transform (HTA), and Fourier transform (FTA). All methods were able to depict the point sources properly. DnS and CBP produce blurred images containing typical superposition artifacts. The HTA provides excellent SNR and allows a good point source separation. The FTA is the fastest and shows the best FWHM. In our study, we found the FTA to show the best overall performance. It allows a very fast and theoretically exact reconstruction. Only a hardware-implemented DnS might be faster and enable real-time imaging. A commercial system may also perform several methods to fully utilize the new contrast mechanism and guarantee optimal resolution and fidelity.
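    Of the four reconstruction methods compared, delay-and-sum is the simplest to illustrate: every pixel accumulates the sensor samples whose time of flight matches the pixel-to-sensor distance. The sketch below uses an idealised point-source simulation with assumed sampling rate, sound speed and array geometry; it is a naive reference implementation, not the authors' code.

```python
import numpy as np

def delay_and_sum(signals, sensor_pos, grid_x, grid_y, c, fs):
    """
    Naive delay-and-sum reconstruction on a 2D pixel grid.
    signals: (n_sensors, n_samples) photoacoustic time traces
    sensor_pos: (n_sensors, 2) sensor coordinates in metres
    c: speed of sound [m/s], fs: sampling rate [Hz]
    """
    n_sensors, n_samples = signals.shape
    image = np.zeros((grid_y.size, grid_x.size))
    for iy, y in enumerate(grid_y):
        for ix, x in enumerate(grid_x):
            acc = 0.0
            for s in range(n_sensors):
                r = np.hypot(x - sensor_pos[s, 0], y - sensor_pos[s, 1])
                k = int(round(r / c * fs))       # time-of-flight in samples
                if k < n_samples:
                    acc += signals[s, k]
            image[iy, ix] = acc
    return image

# Toy usage: one point source at the origin, circular sensor array (assumed geometry)
fs, c = 40e6, 1500.0
angles = np.linspace(0, 2 * np.pi, 64, endpoint=False)
sensors = 0.02 * np.column_stack([np.cos(angles), np.sin(angles)])
signals = np.zeros((64, 2048))
for s in range(64):
    k = int(round(np.hypot(*sensors[s]) / c * fs))
    signals[s, k] = 1.0                          # ideal pulse simplified to a spike
grid = np.linspace(-0.01, 0.01, 101)
img = delay_and_sum(signals, sensors, grid, grid, c, fs)
print("peak at pixel:", np.unravel_index(img.argmax(), img.shape))
```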

  10. Rapid, low-cost, image analysis through video processing

    International Nuclear Information System (INIS)

    Levinson, R.A.; Marrs, R.W.; Grantham, D.G.

    1976-01-01

    Remote sensing now provides the data necessary to solve many resource problems. However, many of the complex image processing and analysis functions used in analysis of remotely-sensed data are accomplished using sophisticated image analysis equipment. The high cost of this equipment places many of these techniques beyond the means of most users. A new, more economical, video system capable of performing complex image analysis has now been developed. This report describes the functions, components, and operation of that system. Processing capability of the new video image analysis system includes many of the tasks previously accomplished with optical projectors and digital computers. Video capabilities include: color separation, color addition/subtraction, contrast stretch, dark level adjustment, density analysis, edge enhancement, scale matching, image mixing (addition and subtraction), image ratioing, and construction of false-color composite images. Rapid input of non-digital image data, instantaneous processing and display, relatively low initial cost, and low operating cost give the video system a competitive advantage over digital equipment. Complex pre-processing, pattern recognition, and statistical analyses must still be handled through digital computer systems. The video system at the University of Wyoming has undergone extensive testing and comparison with other systems, and has been used successfully in practical applications ranging from analysis of x-rays and thin sections to production of color composite ratios of multispectral imagery. Potential applications are discussed, including uranium exploration, petroleum exploration, tectonic studies, geologic mapping, hydrology, sedimentology and petrography, anthropology, and studies on vegetation and wildlife habitat

  11. Image Sharing Technologies and Reduction of Imaging Utilization: A Systematic Review and Meta-analysis

    Science.gov (United States)

    Vest, Joshua R.; Jung, Hye-Young; Ostrovsky, Aaron; Das, Lala Tanmoy; McGinty, Geraldine B.

    2016-01-01

    Introduction: Image sharing technologies may reduce unneeded imaging by improving provider access to imaging information. A systematic review and meta-analysis were conducted to summarize the impact of image sharing technologies on patient imaging utilization. Methods: Quantitative evaluations of the effects of PACS, regional image exchange networks, interoperable electronic heath records, tools for importing physical media, and health information exchange systems on utilization were identified through a systematic review of the published and gray English-language literature (2004–2014). Outcomes, standard effect sizes (ESs), settings, technology, populations, and risk of bias were abstracted from each study. The impact of image sharing technologies was summarized with random-effects meta-analysis and meta-regression models. Results: A total of 17 articles were included in the review, with a total of 42 different studies. Image sharing technology was associated with a significant decrease in repeat imaging (pooled effect size [ES] = −0.17; 95% confidence interval [CI] = [−0.25, −0.09]; P …) and … utilization (pooled ES = 0.20; 95% CI = [0.07, 0.32]; P = .002). For all outcomes combined, image sharing technology was not associated with utilization. Most studies were at risk for bias. Conclusions: Image sharing technology was associated with reductions in repeat and unnecessary imaging, in both the overall literature and the most-rigorous studies. Stronger evidence is needed to further explore the role of specific technologies and their potential impact on various modalities, patient populations, and settings. PMID:26614882
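    The pooled effect sizes quoted above come from a random-effects model; the standard DerSimonian-Laird estimator behind such pooling is short enough to sketch. The per-study effects and variances below are hypothetical and are not the values extracted in the review.

```python
import numpy as np

def dersimonian_laird(effects, variances):
    """Random-effects pooled effect size (DerSimonian-Laird estimator)."""
    effects = np.asarray(effects, float)
    v = np.asarray(variances, float)
    w = 1.0 / v                                   # fixed-effect weights
    fixed = np.sum(w * effects) / np.sum(w)
    q = np.sum(w * (effects - fixed) ** 2)        # Cochran's Q
    df = len(effects) - 1
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (q - df) / c)                 # between-study variance
    w_re = 1.0 / (v + tau2)                       # random-effects weights
    pooled = np.sum(w_re * effects) / np.sum(w_re)
    se = np.sqrt(1.0 / np.sum(w_re))
    return pooled, pooled - 1.96 * se, pooled + 1.96 * se

# Hypothetical per-study standardized effects and variances
es = [-0.25, -0.10, -0.20, -0.05, -0.30]
var = [0.010, 0.020, 0.015, 0.030, 0.012]
print("pooled ES and 95% CI:", dersimonian_laird(es, var))
```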

  12. Vector sparse representation of color image using quaternion matrix analysis.

    Science.gov (United States)

    Xu, Yi; Yu, Licheng; Xu, Hongteng; Zhang, Hao; Nguyen, Truong

    2015-04-01

    Traditional sparse image models treat a color image pixel as a scalar, representing the color channels separately or concatenating them as a monochrome image. In this paper, we propose a vector sparse representation model for color images using quaternion matrix analysis. As a new tool for color image representation, its potential applications in several image-processing tasks are presented, including color image reconstruction, denoising, inpainting, and super-resolution. The proposed model represents the color image as a quaternion matrix, where a quaternion-based dictionary learning algorithm is presented using the K-quaternion singular value decomposition (QSVD) (generalized K-means clustering for QSVD) method. It conducts the sparse basis selection in quaternion space, which uniformly transforms the channel images to an orthogonal color space. In this new color space, it is significant that the inherent color structures can be completely preserved during vector reconstruction. Moreover, the proposed sparse model is more efficient compared with current sparse models for image restoration tasks, owing to lower redundancy between the atoms of different color channels. The experimental results demonstrate that the proposed sparse image model avoids the hue bias issue successfully and shows its potential as a general and powerful tool in the color image analysis and processing domain.

  13. Interpretation of medical images by model guided analysis

    International Nuclear Information System (INIS)

    Karssemeijer, N.

    1989-01-01

    Progress in the development of digital pictorial information systems stimulates a growing interest in the use of image analysis techniques in medicine. Especially when precise quantitative information is required, the use of fast and reproducible computer analysis may be more appropriate than relying on visual judgement only. Such quantitative information can be valuable, for instance, in diagnostics or in irradiation therapy planning. As medical images are mostly recorded in a prescribed way, human anatomy guarantees a common image structure for each particular type of exam. In this thesis it is investigated how to make use of this a priori knowledge to guide image analysis. For that purpose, models are developed which are suited to capture common image structure. The first part of this study is devoted to an analysis of nuclear medicine images of myocardial perfusion. In ch. 2 a model of these images is designed in order to represent characteristic image properties. It is shown that for these relatively simple images a compact symbolic description can be achieved without significant loss of diagnostically important image properties. Possibilities for automatic interpretation of more complex images are investigated in the following chapters. The central topic is segmentation of organs. Two methods are proposed and tested on a set of abdominal X-ray CT scans. Ch. 3 describes a serial approach based on a semantic network and the use of search areas. Relational constraints are used to guide the image processing and to classify detected image segments. In chs. 4 and 5 a more general parallel approach is utilized, based on a Markov random field image model. A stochastic model used to represent prior knowledge about the spatial arrangement of organs is implemented as an external field. (author). 66 refs.; 27 figs.; 6 tabs

  14. Multifractal analysis of three-dimensional histogram from color images

    International Nuclear Information System (INIS)

    Chauveau, Julien; Rousseau, David; Richard, Paul; Chapeau-Blondeau, Francois

    2010-01-01

    Natural images, especially color or multicomponent images, are complex information-carrying signals. To contribute to the characterization of this complexity, we investigate the possibility of multiscale organization in the colorimetric structure of natural images. This is realized by means of a multifractal analysis applied to the three-dimensional histogram from natural color images. The observed behaviors are compared with those of reference models with known multifractal properties. We use for this purpose synthetic random images with trivial monofractal behavior, and multidimensional multiplicative cascades known for their actual multifractal behavior. The behaviors observed on natural images exhibit similarities with those of the multifractal multiplicative cascades and display the signature of elaborate multiscale organizations stemming from the histograms of natural color images. This type of characterization of colorimetric properties can be helpful to various tasks of digital image processing, as for instance modeling, classification, indexing.
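    A minimal version of the analysis described, box counting on the three-dimensional colour histogram to estimate generalized (Renyi) dimensions D_q, can be sketched as follows. The scales, moments and the uniform-noise reference image are illustrative choices; they stand in for the natural images and multiplicative cascades used in the paper.

```python
import numpy as np

def generalized_dimensions(pixels, qs=(0, 1, 2), sizes=(4, 8, 16, 32)):
    """
    Crude estimate of the generalized (Renyi) dimensions D_q of the colour
    distribution of an RGB image, via box counting on its 3D colour histogram.
    pixels: (N, 3) array of RGB values in [0, 255].
    """
    results = {}
    for q in qs:
        logZ, logeps = [], []
        for n_bins in sizes:
            hist, _ = np.histogramdd(pixels, bins=(n_bins,) * 3,
                                     range=[(0, 256)] * 3)
            p = hist[hist > 0] / pixels.shape[0]   # occupied-box probabilities
            eps = 256.0 / n_bins                   # box edge length
            if q == 1:
                logZ.append(np.sum(p * np.log(p)))          # information dimension
            else:
                logZ.append(np.log(np.sum(p ** q)) / (q - 1))
            logeps.append(np.log(eps))
        results[q] = np.polyfit(logeps, logZ, 1)[0]          # slope ~ D_q
    return results

rng = np.random.default_rng(0)
pixels = rng.integers(0, 256, size=(100_000, 3))  # monofractal reference: uniform noise
print(generalized_dimensions(pixels))
```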

  15. Knowledge-based image analysis: some aspects on the analysis of images using other types of information

    Energy Technology Data Exchange (ETDEWEB)

    Eklundh, J O

    1982-01-01

    The computer vision approach to image analysis is discussed from two aspects. First, this approach is contrasted with the pattern recognition approach. Second, it is discussed how external knowledge, information and models from other fields of science and engineering can be used for image and scene analysis. In particular, the connections between computer vision and computer graphics are pointed out.

  16. Introducing PLIA: Planetary Laboratory for Image Analysis

    Science.gov (United States)

    Peralta, J.; Hueso, R.; Barrado, N.; Sánchez-Lavega, A.

    2005-08-01

    We present a graphical software tool developed under IDL software to navigate, process and analyze planetary images. The software has a complete Graphical User Interface and is cross-platform. It can also run under the IDL Virtual Machine without the need to own an IDL license. The set of tools included allow image navigation (orientation, centring and automatic limb determination), dynamical and photometric atmospheric measurements (winds and cloud albedos), cylindrical and polar projections, as well as image treatment under several procedures. Being written in IDL, it is modular and easy to modify and grow for adding new capabilities. We show several examples of the software capabilities with Galileo-Venus observations: Image navigation, photometrical corrections, wind profiles obtained by cloud tracking, cylindrical projections and cloud photometric measurements. Acknowledgements: This work has been funded by Spanish MCYT PNAYA2003-03216, fondos FEDER and Grupos UPV 15946/2004. R. Hueso acknowledges a post-doc fellowship from Gobierno Vasco.

  17. Applying Image Matching to Video Analysis

    Science.gov (United States)

    2010-09-01

    The image groups, classified by the background scene, are the flag, the kitchen, the telephone, the bookshelf, the title screen, the … Image counts per scene: Kitchen 136, Telephone 3, Bookshelf 81, Title Screen 10, Map 1 24, Map 2 16 … command line. This implementation of a Bloom filter uses two arbitrary … with the Bookshelf images. This scene is a much closer shot than the Kitchen scene, so the host occupies much of the background. Algorithms for face …
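    The report fragment mentions a Bloom filter used during matching, but no implementation details survive in the record, so the following is only a generic Bloom filter sketch with hypothetical scene-fingerprint keys.

```python
import hashlib

class BloomFilter:
    """Minimal Bloom filter: fast membership tests with no false negatives."""
    def __init__(self, size=8192, n_hashes=4):
        self.size = size
        self.n_hashes = n_hashes
        self.bits = bytearray(size)

    def _positions(self, item):
        # Derive several hash positions from one SHA-256 digest
        digest = hashlib.sha256(item.encode()).digest()
        for i in range(self.n_hashes):
            chunk = digest[4 * i: 4 * i + 4]
            yield int.from_bytes(chunk, "big") % self.size

    def add(self, item):
        for pos in self._positions(item):
            self.bits[pos] = 1

    def __contains__(self, item):
        return all(self.bits[pos] for pos in self._positions(item))

# Index fingerprints of known background scenes, then test a query frame
bf = BloomFilter()
for fingerprint in ("kitchen:001", "bookshelf:017", "telephone:002"):
    bf.add(fingerprint)
print("kitchen:001" in bf, "map:005" in bf)
```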

  18. Plant phenomics: an overview of image acquisition technologies and image data analysis algorithms.

    Science.gov (United States)

    Perez-Sanz, Fernando; Navarro, Pedro J; Egea-Cortines, Marcos

    2017-11-01

    The study of phenomes or phenomics has been a central part of biology. The field of automatic phenotype acquisition technologies based on images has seen important advances in recent years. As with other high-throughput technologies, it addresses a common set of problems, including data acquisition and analysis. In this review, we give an overview of the main systems developed to acquire images. We give an in-depth analysis of image processing with its major issues and the algorithms that are being used or are emerging as useful for obtaining data from images in an automatic fashion. © The Author 2017. Published by Oxford University Press.
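    As a small, concrete example of the kind of image-processing step such reviews cover, the sketch below segments plant pixels with the excess-green index (ExG = 2G − R − B), a common baseline in plant phenotyping, and reports projected area as a digital trait. The toy image and threshold are assumptions.

```python
import numpy as np

def excess_green_mask(rgb, threshold=20):
    """
    Segment plant material with the excess-green index ExG = 2G - R - B,
    a common first step in image-based phenotyping pipelines.
    rgb: (H, W, 3) uint8 image.
    """
    r, g, b = [rgb[..., i].astype(float) for i in range(3)]
    exg = 2 * g - r - b
    return exg > threshold, exg

# Toy image: a green square on a brown background
img = np.full((64, 64, 3), (120, 90, 60), dtype=np.uint8)
img[16:48, 16:48] = (40, 160, 40)
mask, _ = excess_green_mask(img)
print("plant pixel fraction:", mask.mean())       # digital trait: projected area
```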

  19. Diagnostic imaging analysis of the impacted mesiodens

    Energy Technology Data Exchange (ETDEWEB)

    Noh, Jeong Jun; Choi, Bo Ram; Jeong, Hwan Seok; Huh, Kyung Hoe; Yi, Won Jin; Heo, Min Suk; Lee, Sam Sun; Choi, Soon Chul [School of Dentistry, Seoul National University, Seoul (Korea, Republic of)

    2010-06-15

    The research was performed to predict the three-dimensional relationship between the impacted mesiodens and the maxillary central incisors and the proximity to the anatomic structures by comparing their panoramic images with the CT images. Among the patients visiting Seoul National University Dental Hospital from April 2003 to July 2007, those with mesiodens were selected (154 mesiodens in 120 patients). The numbers, shapes, orientation and positional relationship of the mesiodens with the maxillary central incisors were investigated in the panoramic images. The proximity to the anatomical structures and complications were investigated in the CT images as well. The sex ratio (M : F) was 2.28 : 1 and the mean number of mesiodens per patient was 1.28. Conical shape was 84.4% and inverted orientation was 51.9%. There were more cases of encroachment on anatomical structures, especially on the nasal floor and nasopalatine duct, when the mesiodens was not superimposed with the central incisor. There were, however, many cases of nasopalatine duct encroachment when the mesiodens was superimposed with the apical 1/3 of the central incisor (52.6%). Delayed eruption (55.6%), crown rotation (66.7%) and crown resorption (100%) were observed when the mesiodens was superimposed with the crown of the central incisor. It is possible to predict the three-dimensional relationship between the impacted mesiodens and the maxillary central incisors in the panoramic images, but more details should be confirmed by the CT images when necessary.

  20. An image analysis system for near-infrared (NIR) fluorescence lymph imaging

    Science.gov (United States)

    Zhang, Jingdan; Zhou, Shaohua Kevin; Xiang, Xiaoyan; Rasmussen, John C.; Sevick-Muraca, Eva M.

    2011-03-01

    Quantitative analysis of lymphatic function is crucial for understanding the lymphatic system and diagnosing the associated diseases. Recently, a near-infrared (NIR) fluorescence imaging system was developed for real-time imaging of lymphatic propulsion by intradermal injection of a microdose of an NIR fluorophore distal to the lymphatics of interest. However, the previous analysis software [3, 4] is underdeveloped, requiring extensive time and effort to analyze an NIR image sequence. In this paper, we develop a number of image processing techniques to automate the data analysis workflow, including an object tracking algorithm to stabilize the subject and remove the motion artifacts, an image representation named flow map to characterize lymphatic flow more reliably, and an automatic algorithm to compute lymph velocity and frequency of propulsion. By integrating all these techniques into a system, the analysis workflow significantly reduces the amount of required user interaction and improves the reliability of the measurement.
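    One of the quantities the system computes, propulsion frequency, can be estimated from the fluorescence intensity of a region of interest by peak counting. The sketch below is a generic peak-detection approach on a synthetic trace, with assumed frame rate, prominence and minimum-separation settings; it is not the authors' flow-map algorithm.

```python
import numpy as np
from scipy.signal import find_peaks

def propulsion_frequency(intensity, fps):
    """
    Estimate propulsion frequency (events/min) from the fluorescence
    intensity measured in one region of interest over time.
    """
    x = (intensity - intensity.mean()) / intensity.std()    # z-score the trace
    peaks, _ = find_peaks(x, prominence=1.0, distance=int(fps))  # >= 1 s apart
    duration_min = len(intensity) / fps / 60.0
    return len(peaks) / duration_min

# Synthetic trace: ~6 propulsion events per minute, 10 fps, 2 minutes
fps = 10
t = np.arange(0, 120, 1 / fps)
trace = 1 + 0.5 * (np.sin(2 * np.pi * 0.1 * t) > 0.95) + 0.02 * np.random.randn(t.size)
print("propulsions per minute:", propulsion_frequency(trace, fps))
```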

  1. Theoretical analysis of radiographic images by nonstationary Poisson processes

    International Nuclear Information System (INIS)

    Tanaka, Kazuo; Uchida, Suguru; Yamada, Isao.

    1980-01-01

    This paper deals with the noise analysis of radiographic images obtained in the usual fluorescent screen-film system. The theory of nonstationary Poisson processes is applied to the analysis of radiographic images containing the object information. The ensemble averages, the autocorrelation functions, and the Wiener spectrum densities of the light-energy distribution at the fluorescent screen and of the film optical-density distribution are obtained. The detection characteristics of the system are evaluated theoretically. Numerical examples for a one-dimensional image are shown and the results are compared with those obtained under the assumption that the object image is related to the background noise by an additive process. (author)

  2. Automated thermal mapping techniques using chromatic image analysis

    Science.gov (United States)

    Buck, Gregory M.

    1989-01-01

    Thermal imaging techniques are introduced using a chromatic image analysis system and temperature sensitive coatings. These techniques are used for thermal mapping and surface heat transfer measurements on aerothermodynamic test models in hypersonic wind tunnels. Measurements are made on complex vehicle configurations in a timely manner and at minimal expense. The image analysis system uses separate wavelength filtered images to analyze surface spectral intensity data. The system was initially developed for quantitative surface temperature mapping using two-color thermographic phosphors but was found useful in interpreting phase change paint and liquid crystal data as well.

  3. Quantitative methods for the analysis of electron microscope images

    DEFF Research Database (Denmark)

    Skands, Peter Ulrik Vallø

    1996-01-01

    The topic of this thesis is a general introduction to quantitative methods for the analysis of digital microscope images. The images presented were primarily acquired from scanning electron microscopes (SEM) and interferometer microscopes (IFM). The topic is approached through several examples … foundation of the thesis fall in the areas of: 1) Mathematical Morphology; 2) Distance transforms and applications; and 3) Fractal geometry. Image analysis opens in general the possibility of a quantitative and statistically well-founded measurement of digital microscope images. Herein lie also the conditions …
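    One of the listed foundations, distance transforms, lends itself to a short illustration: the Euclidean distance transform of a binary particle mask gives, among other things, the largest inscribed-circle radius, and labelling gives per-object areas. The synthetic mask below merely stands in for a thresholded SEM image.

```python
import numpy as np
from scipy import ndimage as ndi

# Synthetic binary particle mask (two discs); in the thesis setting this
# would be a thresholded microscope image.
mask = np.zeros((128, 128), bool)
yy, xx = np.ogrid[:128, :128]
mask |= (yy - 40) ** 2 + (xx - 40) ** 2 < 20 ** 2
mask |= (yy - 85) ** 2 + (xx - 90) ** 2 < 12 ** 2

# Euclidean distance transform: distance of every object pixel to the background
dist = ndi.distance_transform_edt(mask)
print("largest inscribed-circle radius:", dist.max())

# Per-object areas via connected-component labelling
labels, n = ndi.label(mask)
sizes = ndi.sum(mask, labels, index=range(1, n + 1))
print("object areas (pixels):", sizes)
```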

  4. Computerized reactor monitor and control for nuclear reactors

    International Nuclear Information System (INIS)

    Buerger, L.

    1982-01-01

    The analysis of a computerized process control system developed by Transelektro-KFKI-Videoton (Hungary) for a twenty-year-old research reactor in Budapest and for a new one in Tajura (Libya) is given. The paper describes the computer hardware (R-10) and the implemented software (PROCESS-24K) as well as their applications at nuclear reactors. The computer program provides for man-machine communication, data acquisition and processing, trend and alarm analysis, the control of the reactor power, reactor physical calculations and additional operational functions. The reliability and the possible further development of the computerized systems, which are suitable for application at reactors of different design, are also discussed. (Sz.J.)

  5. Algorithms for Computerized Fetal Heart Rate Diagnosis with Direct Reporting

    Directory of Open Access Journals (Sweden)

    Kazuo Maeda

    2015-06-01

    Aims: Since pattern classification of fetal heart rate (FHR) was subjective and increased interobserver differences, objective FHR analysis was achieved with computerized FHR diagnosis. Methods: The computer algorithm was composed of an experts' knowledge system, including FHR analysis and FHR score calculation, and also of an objective artificial neural network system with software. In addition, an FHR frequency spectrum was studied to detect ominous sinusoidal FHR and the loss of baseline variability related to fetal brain damage. The algorithms were installed in a central-computerized automatic FHR monitoring system, which gave the diagnosis rapidly and directly to the attending doctor. Results: Clinically, perinatal mortality decreased significantly and no cerebral palsy developed after introduction of the centralized system. Conclusion: The automatic multichannel FHR monitoring system improved the monitoring, increased the objectivity of FHR diagnosis and improved clinical results.

  6. Image Harvest: an open-source platform for high-throughput plant image processing and analysis.

    Science.gov (United States)

    Knecht, Avi C; Campbell, Malachy T; Caprez, Adam; Swanson, David R; Walia, Harkamal

    2016-05-01

    High-throughput plant phenotyping is an effective approach to bridge the genotype-to-phenotype gap in crops. Phenomics experiments typically result in large-scale image datasets, which are not amenable to processing on desktop computers, thus creating a bottleneck in the image-analysis pipeline. Here, we present an open-source, flexible image-analysis framework, called Image Harvest (IH), for processing images originating from high-throughput plant phenotyping platforms. Image Harvest is developed to perform parallel processing on computing grids and provides an integrated feature for metadata extraction from large-scale file organization. Moreover, the integration of IH with the Open Science Grid provides academic researchers with the computational resources required for processing large image datasets at no cost. Image Harvest also offers functionalities to extract digital traits from images to interpret plant architecture-related characteristics. To demonstrate the applications of these digital traits, a rice (Oryza sativa) diversity panel was phenotyped and genome-wide association mapping was performed using digital traits that are used to describe different plant ideotypes. Three major quantitative trait loci were identified on rice chromosomes 4 and 6, which co-localize with quantitative trait loci known to regulate agronomically important traits in rice. Image Harvest is an open-source software for high-throughput image processing that requires a minimal learning curve for plant biologists to analyze phenomics datasets. © The Author 2016. Published by Oxford University Press on behalf of the Society for Experimental Biology.

  7. Image Harvest: an open-source platform for high-throughput plant image processing and analysis

    Science.gov (United States)

    Knecht, Avi C.; Campbell, Malachy T.; Caprez, Adam; Swanson, David R.; Walia, Harkamal

    2016-01-01

    High-throughput plant phenotyping is an effective approach to bridge the genotype-to-phenotype gap in crops. Phenomics experiments typically result in large-scale image datasets, which are not amenable to processing on desktop computers, thus creating a bottleneck in the image-analysis pipeline. Here, we present an open-source, flexible image-analysis framework, called Image Harvest (IH), for processing images originating from high-throughput plant phenotyping platforms. Image Harvest is developed to perform parallel processing on computing grids and provides an integrated feature for metadata extraction from large-scale file organization. Moreover, the integration of IH with the Open Science Grid provides academic researchers with the computational resources required for processing large image datasets at no cost. Image Harvest also offers functionalities to extract digital traits from images to interpret plant architecture-related characteristics. To demonstrate the applications of these digital traits, a rice (Oryza sativa) diversity panel was phenotyped and genome-wide association mapping was performed using digital traits that are used to describe different plant ideotypes. Three major quantitative trait loci were identified on rice chromosomes 4 and 6, which co-localize with quantitative trait loci known to regulate agronomically important traits in rice. Image Harvest is an open-source software for high-throughput image processing that requires a minimal learning curve for plant biologists to analyze phenomics datasets. PMID:27141917

  8. 5-ALA induced fluorescent image analysis of actinic keratosis

    Science.gov (United States)

    Cho, Yong-Jin; Bae, Youngwoo; Choi, Eung-Ho; Jung, Byungjo

    2010-02-01

    In this study, we quantitatively analyzed 5-ALA-induced fluorescent images of actinic keratosis using digital fluorescent color and hyperspectral imaging modalities. UV-A was utilized to induce fluorescent images, and actinic keratosis (AK) lesions were demarcated from the surrounding normal region with different methods. Eight subjects with AK lesions participated in this study. In the hyperspectral imaging modality, a spectral analysis method was applied to the hyperspectral cube image and AK lesions were demarcated from the normal region. Before image acquisition, we designated biopsy positions for histopathology of the AK lesion and the surrounding normal region. Erythema index (E.I.) values on both regions were calculated from the spectral cube data. Image analysis of the subjects resulted in two different groups: the first group with higher fluorescence signal and E.I. on the AK lesion than on the normal region; the second group with lower fluorescence signal and without a large difference in E.I. between the two regions. In fluorescent color image analysis of facial AK, E.I. images were calculated on both normal and AK lesions and compared with the results of the hyperspectral imaging modality. The results might indicate that the different intensities of fluorescence and E.I. among the subjects with AK can be interpreted as different phases of morphological and metabolic changes of AK lesions.

  9. Rapid analysis and exploration of fluorescence microscopy images.

    Science.gov (United States)

    Pavie, Benjamin; Rajaram, Satwik; Ouyang, Austin; Altschuler, Jason M; Steininger, Robert J; Wu, Lani F; Altschuler, Steven J

    2014-03-19

    Despite rapid advances in high-throughput microscopy, quantitative image-based assays still pose significant challenges. While a variety of specialized image analysis tools are available, most traditional image-analysis-based workflows have steep learning curves (for fine tuning of analysis parameters) and result in long turnaround times between imaging and analysis. In particular, cell segmentation, the process of identifying individual cells in an image, is a major bottleneck in this regard. Here we present an alternate, cell-segmentation-free workflow based on PhenoRipper, an open-source software platform designed for the rapid analysis and exploration of microscopy images. The pipeline presented here is optimized for immunofluorescence microscopy images of cell cultures and requires minimal user intervention. Within half an hour, PhenoRipper can analyze data from a typical 96-well experiment and generate image profiles. Users can then visually explore their data, perform quality control on their experiment, ensure response to perturbations and check reproducibility of replicates. This facilitates a rapid feedback cycle between analysis and experiment, which is crucial during assay optimization. This protocol is useful not just as a first pass analysis for quality control, but also may be used as an end-to-end solution, especially for screening. The workflow described here scales to large data sets such as those generated by high-throughput screens, and has been shown to group experimental conditions by phenotype accurately over a wide range of biological systems. The PhenoBrowser interface provides an intuitive framework to explore the phenotypic space and relate image properties to biological annotations. Taken together, the protocol described here will lower the barriers to adopting quantitative analysis of image based screens.

  10. Quality criteria for abdominal computerized tomography

    International Nuclear Information System (INIS)

    Huebener, K.H.; Kurtz, B.; Metzger, H.O.F.

    1985-01-01

    Quality, not only in abdominal computerized tomography, is determined by the measurable technical parameters and, to an important extent, also by individual factors, among which the diagnostic skill and experience of the examiner is one of the most decisive. These individual factors and the part they play with regard to the quality of CT-assisted diagnosis may well equal the technical parameters, as they significantly influence the course of examinations, the resulting indications for contrast medium application, and the sensitivity of the diagnosis. The authors are convinced that, especially for abdominal CT, standardized examination techniques would inevitably bring down the diagnostic quality. The technical parameters are of equal significance in achieving the diagnostic optimum, and among these parameters one has to count equipment characteristics as well as the data given by the examiner. Exposure time, spatial resolution and density differentiation are given by the equipment specifications but have to be adapted and optimised to the clinical problems involved in every case. Another important task is that of routine imaging of given anatomic structures, for adequate evaluation of individual conditions. (orig./MG) [de

  11. Research of second harmonic generation images based on texture analysis

    Science.gov (United States)

    Liu, Yao; Li, Yan; Gong, Haiming; Zhu, Xiaoqin; Huang, Zufang; Chen, Guannan

    2014-09-01

    Texture analysis plays a crucial role in identifying objects or regions of interest in an image. It has been applied to a variety of medical image processing tasks, ranging from the detection of disease and the segmentation of specific anatomical structures to differentiation between healthy and pathological tissues. Second harmonic generation (SHG) microscopy, as a potential noninvasive tool for imaging biological tissues, has been widely used in medicine, with reduced phototoxicity and photobleaching. In this paper, we clarified the principles of texture analysis, including statistical, transform, structural and model-based methods, and gave examples of its applications, reviewing studies of the technique. Moreover, we applied texture analysis to SHG images for the differentiation of human skin scar tissues. A texture analysis method based on local binary patterns (LBP) and the wavelet transform was used to extract texture features of SHG images from collagen in normal and abnormal scars, and the scar SHG images were then classified into normal or abnormal ones. Compared with other texture analysis methods with respect to the receiver operating characteristic analysis, LBP combined with the wavelet transform was demonstrated to achieve higher accuracy. It can provide a new way for clinical diagnosis of scar types. Finally, future developments of texture analysis of SHG images are discussed.
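    A minimal version of the feature-extraction step described, a uniform LBP histogram concatenated with wavelet sub-band energies, can be sketched with scikit-image and PyWavelets. The two test images (a sample photograph and a blurred copy) merely stand in for normal and abnormal scar SHG images; the wavelet, decomposition level and LBP settings are assumptions.

```python
import numpy as np
import pywt
from scipy.ndimage import gaussian_filter
from skimage import data
from skimage.feature import local_binary_pattern

def texture_features(img, P=8, R=1.0, wavelet="db2"):
    """LBP histogram + wavelet sub-band energies as one feature vector."""
    lbp = local_binary_pattern(img, P, R, method="uniform")
    hist, _ = np.histogram(lbp, bins=P + 2, range=(0, P + 2), density=True)
    coeffs = pywt.wavedec2(img.astype(float), wavelet, level=2)
    energies = [np.mean(np.abs(c)) for level in coeffs[1:] for c in level]
    return np.concatenate([hist, energies])

# Two textures: an image and a smoothed copy, standing in for two tissue classes
f1 = texture_features(data.camera())
f2 = texture_features(gaussian_filter(data.camera(), sigma=2))
print("feature vector length:", f1.size)
print("L1 distance between the two textures:", np.abs(f1 - f2).sum())
```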

  12. [Computerized medical record: deontology and legislation].

    Science.gov (United States)

    Allaert, F A; Dusserre, L

    1996-02-01

    Computerization of medical records is making headway for patients' follow-up, scientific research, and health expenses control, but it must not alter the guarantees provided to patients by the medical code of ethics and the law of January 6, 1978. This law, modified on July 1, 1994, requires the registration of all computerized records of personal data and establishes rights protecting privacy against computer misuse. All medical practitioners using computerized medical records must be aware that infringement of this law may lead to proceedings before professional, civil or criminal courts.

  13. Microcomputer Network for Computerized Adaptive Testing (CAT)

    Science.gov (United States)

    1984-03-01

    NPRDC TR 84-33: Microcomputer Network for Computerized Adaptive Testing (CAT). Baldwin Quan, Thomas A. Park, Gary Sandahl, John H. … Keywords: computerized adaptive testing (CAT), Bayesian sequential testing.

  14. Uncooled LWIR imaging: applications and market analysis

    Science.gov (United States)

    Takasawa, Satomi

    2015-05-01

    The evolution of infrared (IR) imaging sensor technology for the defense market has played an important role in developing the commercial market, as dual use of the technology has expanded. In particular, technologies for both reduction in pixel pitch and vacuum packaging have evolved drastically in the area of uncooled long-wave IR (LWIR; 8-14 μm wavelength region) imaging sensors, increasing the opportunity to create new applications. From the macroscopic point of view, the uncooled LWIR imaging market is divided into two areas. One is a high-end market where uncooled LWIR imaging sensors with sensitivity as close to that of cooled ones as possible are required, while the other is a low-end market which is driven by miniaturization and reduction in price. Especially in the latter case, approaches towards the consumer market have recently appeared, such as applications of uncooled LWIR imaging sensors to night vision for automobiles and smartphones. The appearance of this kind of commodity product is changing existing business models. Further technological innovation is necessary for creating a consumer market, and there will be room for other companies dealing in components and materials, such as lens and getter materials, to enter the consumer market.

  15. Digital image processing and analysis human and computer vision applications with CVIPtools

    CERN Document Server

    Umbaugh, Scott E

    2010-01-01

    Section I, Introduction to Digital Image Processing and Analysis: Digital Image Processing and Analysis (Overview; Image Analysis and Computer Vision; Image Processing and Human Vision; Key Points; Exercises; References; Further Reading); Computer Imaging Systems (Imaging Systems Overview; Image Formation and Sensing; CVIPtools Software; Image Representation; Key Points; Exercises; Supplementary Exercises; References; Further Reading). Section II, Digital Image Analysis and Computer Vision: Introduction to Digital Image Analysis (Introduction; Preprocessing; Binary Image Analysis; Key Points; Exercises; Supplementary Exercises; References; Further Read…

  16. Comparison of computerized tomography to sonography, applied in diseases of the pancreas

    International Nuclear Information System (INIS)

    Kluge, K.

    1982-01-01

    The examination results of 418 patients whose epigastria had been examined by both computerized tomography and sonography within one week, in the period from the beginning of January 1978 until the end of July 1979, were compared with regard to the imaging of the pancreas, reliability, and the specificity and sensitivity in establishing the diagnosis. For the sonographic examination, a compound and a real-time unit were used; the computerized tomography was carried out by means of equipment of the 3rd generation with a scan time of 4 sec. The screening of the pancreas was significantly better using computerized tomography (99.3% vs. 84% with US). As for accuracy, computerized tomography had 92.5% exact diagnoses versus 79.9% obtained by sonography. If, however, we look at the cases in which the pancreas could be screened with both methods, the accuracy was almost the same (93.7% CT and 93.3% US). Specificity was of almost the same quality; however, computerized tomography, with 0.963, was slightly better than ultrasound, with 0.943. As for sensitivity, sonography, with 0.838, was better than CT, with 0.721. The reason for this is the fact that a considerable proportion of the chronic pancreatitis cases (30.3%) was not recognized by means of computerized tomography. (orig.) [de

  17. Analysis of live cell images: Methods, tools and opportunities.

    Science.gov (United States)

    Nketia, Thomas A; Sailem, Heba; Rohde, Gustavo; Machiraju, Raghu; Rittscher, Jens

    2017-02-15

    Advances in optical microscopy, biosensors and cell culturing technologies have transformed live cell imaging. Thanks to these advances, live cell imaging plays an increasingly important role in basic biology research as well as at all stages of drug development. Image analysis methods are needed to extract quantitative information from these vast and complex data sets. The aim of this review is to provide an overview of available image analysis methods for live cell imaging, in particular the required preprocessing, image segmentation, cell tracking and data visualisation methods. The potential opportunities provided by recent advances in machine learning, especially deep learning, and computer vision are discussed. This review includes an overview of the different available software packages and toolkits. Copyright © 2017. Published by Elsevier Inc.

  18. Characterization of filters and filtration process using X-ray computerized tomography

    International Nuclear Information System (INIS)

    Maschio, Celio; Arruda, Antonio Celso Fonseca de

    1999-01-01

    The objective of this work is to present the potential of X-ray computerized tomography as a tool for the internal characterization of filters used in solid-liquid separation, mainly water filters. Cartridge filters (for industrial and domestic applications) contaminated with glass beads were used. The scanning process was carried out both with and without contaminant in the filter in order to compare the attenuation coefficients of the clean and the contaminated filter. The images showed that it is possible to map the internal structure of the filters and the distribution of the contaminant, permitting a local analysis that is not possible with the standard tests used by the manufacturers, which reveal only global characteristics of the filter media. The possibility of application to manufacturing process control was also shown; the non-invasive nature of the technique is an important advantage, and it also permitted damage detection in filters submitted to severe operational conditions. (author)

  19. Imaging analysis of dedifferentiated chondrosarcoma of bone

    International Nuclear Information System (INIS)

    Xie Yuanzhong; Kong Qingkui; Wang Xia; Li Changqing

    2004-01-01

    Objective: To analyze the radiological findings of dedifferentiated chondrosarcoma, and to explore the imaging features of the dedifferentiated tissue. Methods: The X-ray and CT findings of 13 cases of dedifferentiated chondrosarcoma of bone were analyzed retrospectively and correlated with the clinical findings and the corresponding histological changes. Results: The dedifferentiated chondrosarcoma not only had the radiological findings of typical chondrosarcoma but also had the imaging features of dedifferentiated tissues. In the 13 patients, periosteal reactions were found in 11 cases, ossifications in 8 cases, soft tissue masses in 12 cases, and calcifications in 10 cases; the site of the calcifications in 8 cases was in the center of the focus. Conclusion: The dedifferentiated chondrosarcoma showed special imaging features, which include ossification, calcification, periosteal reaction, and soft tissue mass. These features are not found in typical chondrosarcoma. Recognizing these specific features is helpful for the diagnosis of dedifferentiated chondrosarcoma. (author)

  20. System Matrix Analysis for Computed Tomography Imaging

    Science.gov (United States)

    Flores, Liubov; Vidal, Vicent; Verdú, Gumersindo

    2015-01-01

    In practical applications of computed tomography imaging (CT), it is often the case that the set of projection data is incomplete owing to the physical conditions of the data acquisition process. On the other hand, the high radiation dose imposed on patients is also undesired. These issues demand that high quality CT images can be reconstructed from limited projection data. For this reason, iterative methods of image reconstruction have become a topic of increased research interest. Several algorithms have been proposed for few-view CT. We consider that the accurate solution of the reconstruction problem also depends on the system matrix that simulates the scanning process. In this work, we analyze the application of the Siddon method to generate elements of the matrix and we present results based on real projection data. PMID:26575482
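    The system matrix the authors analyse encodes, for every ray, its intersection length with every pixel. The sketch below approximates one matrix row by dense sampling along the ray; it is a simple stand-in for exact ray tracers such as Siddon's algorithm, with unit pixel size and an illustrative geometry.

```python
import numpy as np

def system_matrix_row(src, det, n=64, samples_per_pixel=4):
    """
    Approximate one row of the CT system matrix: the intersection length of
    the ray src->det with every pixel of an n x n image (unit pixel size),
    estimated by dense sampling along the ray.
    """
    src, det = np.asarray(src, float), np.asarray(det, float)
    length = np.linalg.norm(det - src)
    n_samples = int(length * samples_per_pixel)
    ts = (np.arange(n_samples) + 0.5) / n_samples
    points = src + ts[:, None] * (det - src)      # sample points along the ray
    row = np.zeros(n * n)
    ij = np.floor(points).astype(int)             # pixel index of each sample
    inside = (ij >= 0).all(1) & (ij < n).all(1)
    for i, j in ij[inside]:
        row[i * n + j] += length / n_samples      # each sample contributes dl
    return row

# One horizontal ray through the middle of a 64 x 64 image
row = system_matrix_row(src=(-10.0, 32.5), det=(74.0, 32.5))
print("nonzero pixels hit:", np.count_nonzero(row),
      "total intersection length:", row.sum().round(2))
```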