WorldWideScience

Sample records for image quality simulations

  1. Simulation of High Quality Ultrasound Imaging

    DEFF Research Database (Denmark)

    Hemmsen, Martin Christian; Kortbek, Jacob; Nikolov, Svetoslav Ivanov

    2010-01-01

    ), and at Full Width at One-Hundredth Maximum (FWOHM) of 9 point spread functions resulting from evenly distributed point targets at depths ranging from 10 mm to 90 mm. The results are documented for a 64 channel system, using a 192 element linear array transducer model. A physical BK Medical 8804 transducer...... amplitude and phase compensation, the LR at FWOHM improves from 6.3 mm to 4.7 mm and is a factor of 2.2 better than DRF. This study has shown that individual element impulse response, phase, and amplitude deviations are important to include in simulated system performance evaluations. Furthermore...

  2. Design and development of a computer based simulator to support learning of radiographic image quality

    Energy Technology Data Exchange (ETDEWEB)

    Costaridou, L; Pitoura, T; Panayiotakis, G; Pallikarakis, N [Department of Medical Physics, School of Medicine, University of Patras, 265 00 Patras (Greece); Hatzis, K [Institute of Biomedical Technology, Ellinos Stratiotou 50A, 264 41 Patras (Greece)

    1994-12-31

    A training simulator has been developed to offer a structured and functional approach to radiographic imaging procedures and a comprehensive understanding of the interrelations between physical and technical input parameters of a radiographic imaging system and characteristics of image quality. The system addresses training needs of radiographers and radiology clinicians. The simulator is based on procedural simulation enhanced by a hypertextual model of information organization. It is supported by an image data base, which supplies and enriches the simulator. The simulation is controlled by a browsing facility which corresponds to several hierarchical levels of use of the underlying multimodal data base, organized as imaging tasks. Representative tasks are: production of a single radiograph or production of functional sets of radiographs exhibiting parameter effects on image characteristics. System parameters such as patient positioning, focus to patient distance, magnification, field dimensions, focal spot size, tube voltage, tube current and exposure time are under user control. (authors). 7 refs, 2 figs.

  3. Design and development of a computer based simulator to support learning of radiographic image quality

    International Nuclear Information System (INIS)

    Costaridou, L.; Pitoura, T.; Panayiotakis, G.; Pallikarakis, N.; Hatzis, K.

    1994-01-01

    A training simulator has been developed to offer a structured and functional approach to radiographic imaging procedures and a comprehensive understanding of the interrelations between physical and technical input parameters of a radiographic imaging system and characteristics of image quality. The system addresses training needs of radiographers and radiology clinicians. The simulator is based on procedural simulation enhanced by a hypertextual model of information organization. It is supported by an image data base, which supplies and enriches the simulator. The simulation is controlled by a browsing facility which corresponds to several hierarchical levels of use of the underlying multimodal data base, organized as imaging tasks. Representative tasks are: production of a single radiograph or production of functional sets of radiographs exhibiting parameter effects on image characteristics. System parameters such as patient positioning, focus to patient distance, magnification, field dimensions, focal spot size, tube voltage, tube current and exposure time are under user control. (authors)

  4. Texture Based Quality Analysis of Simulated Synthetic Ultrasound Images Using Local Binary Patterns †

    Directory of Open Access Journals (Sweden)

    Prerna Singh

    2017-12-01

    Speckle noise reduction is an important area of research in the field of ultrasound image processing. Several algorithms for speckle noise characterization and analysis have been recently proposed in the area. Synthetic ultrasound images can play a key role in noise evaluation methods as they can be used to generate a variety of speckle noise models under different interpolation and sampling schemes, and can also provide valuable ground truth data for estimating the accuracy of the chosen methods. However, not much work has been done in the area of modeling synthetic ultrasound images, and in simulating speckle noise generation to get images that are as close as possible to real ultrasound images. An important aspect of simulated synthetic ultrasound images is the requirement for extensive quality assessment for ensuring that they have the texture characteristics and gray-tone features of real images. This paper presents texture feature analysis of synthetic ultrasound images using local binary patterns (LBP) and demonstrates the usefulness of a set of LBP features for image quality assessment. Experimental results presented in the paper clearly show how these features could provide an accurate quality metric that correlates very well with subjective evaluations performed by clinical experts.

  5. Dose-image quality study in digital chest radiography using Monte Carlo simulation

    International Nuclear Information System (INIS)

    Correa, S.C.A.; Souza, E.M.; Silva, A.X.; Lopes, R.T.; Yoriyaz, H.

    2008-01-01

    One of the main concerns of diagnostic radiology is to guarantee a good image while sparing dose to the patient. In the present study, Monte Carlo simulations with the MCNPX code, coupled with an adult female voxel model (FAX), were performed to investigate how image quality and dose in digital chest radiography vary with tube voltage (80-150 kV) using the air-gap technique and a computed radiography system. Calculated quantities were normalized to a fixed value of entrance skin exposure (ESE) of 0.0136 R. The results of the present analysis show that, for chest radiography with an imaging plate, image quality is improved and dose is reduced at lower tube voltage.

  6. Dynamic simulation of the effect of soft toric contact lenses movement on retinal image quality.

    Science.gov (United States)

    Niu, Yafei; Sarver, Edwin J; Stevenson, Scott B; Marsack, Jason D; Parker, Katrina E; Applegate, Raymond A

    2008-04-01

    To report the development of a tool designed to dynamically simulate the effect of soft toric contact lens movement on retinal image quality, initial findings on three eyes, and the next steps to be taken to improve the utility of the tool. Three eyes of two subjects wearing soft toric contact lenses were cyclopleged with 1% cyclopentolate and 2.5% phenylephrine. Four hundred wavefront aberration measurements over a 5-mm pupil were recorded during soft contact lens wear at 30 Hz using a complete ophthalmic analysis system aberrometer. Each wavefront error measurement was input into Visual Optics Laboratory (version 7.15, Sarver and Associates, Inc.) to generate a retinal simulation of a high-contrast logMAR visual acuity chart. The individual simulations were combined into a single dynamic movie using a custom MATLAB PsychToolbox program. Visual acuity was measured for each eye reading the movie with the best cycloplegic spectacle correction through a 3-mm artificial pupil to minimize the influence of the eyes' uncorrected aberrations. The simulated acuity was compared to values recorded while the subject read unaberrated charts with contact lenses through a 5-mm artificial pupil. For one study eye, average acuity was the same as in the natural contact lens viewing condition. For the other two study eyes, visual acuity for the best simulation was more than one line worse than under natural viewing conditions. Dynamic simulation of retinal image quality, although not yet perfect, is a promising technique for visually illustrating the optical effects on image quality caused by the movement of alignment-sensitive corrections.
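    The retinal-simulation step described above (wavefront error to blurred chart) can be sketched with a standard Fourier-optics recipe: build the pupil function from the measured wavefront, Fourier transform it to get the PSF, and convolve the chart with that PSF. The sketch below is a generic rendering of that recipe, not the Visual Optics Laboratory software; the grid, pupil, wavefront and chart are toy assumptions.

```python
import numpy as np
from scipy.signal import fftconvolve

def psf_from_wavefront(wavefront_um, pupil_mask, wavelength_um=0.552):
    """Incoherent PSF from a wavefront error map sampled over the pupil grid."""
    pupil = pupil_mask * np.exp(1j * 2 * np.pi * wavefront_um / wavelength_um)
    psf = np.abs(np.fft.fftshift(np.fft.fft2(pupil))) ** 2
    return psf / psf.sum()

def simulate_retinal_image(chart, psf):
    """Blur a grayscale acuity chart with the PSF via FFT-based convolution."""
    return fftconvolve(chart, psf, mode="same")

# Toy usage: circular pupil on a 256x256 grid with a small defocus-like error.
n = 256
y, x = (np.mgrid[:n, :n] - n / 2) / (n / 2)
mask = x**2 + y**2 <= 1.0
wavefront = 0.25 * (2 * (x**2 + y**2) - 1) * mask      # ~Zernike defocus, in micrometres
chart = np.ones((n, n)); chart[100:160, 120:136] = 0.0  # crude dark bar as a "letter"
retina = simulate_retinal_image(chart, psf_from_wavefront(wavefront, mask))
```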

  7. Use of different simulators for the evaluation of image quality in digital mammography

    Energy Technology Data Exchange (ETDEWEB)

    Pereira, Leslie S.; Coutinho, Celia M.C., E-mail: leslie@ird.gov.br, E-mail: celia@ird.gov.br [Instituto de Radioprotecao e Dosimetria (IRD/CNEN-RJ), Rio de Janeiro, RJ (Brazil); Magalhaes, Luis A.G.; Almeida, Carlos Eduardo de, E-mail: luisalexandregm@hotmail.com, E-mail: cea71@yahoo.com.br [Universidade do Estado do Rio de Janeiro (LCR/UERJ), Rio de Janeiro, RJ (Brazil). Laboratorio de Ciencias Radiologicas

    2013-11-01

    In this study, digital images were acquired with different phantoms (simulators) and exposures to evaluate image quality, observing the detection of tumor masses, microcalcifications and fibers representing regions of interest in mammography. The technical parameters of exposure depend on the thickness and composition of the breast, thus affecting the dose and image quality. The simulators used were the ACR, SBP 1054 and CIRS BREAST PHANTOM, for the evaluation of image quality as well as for measuring the incident kerma at the entrance surface (Ki) and calculating the mean glandular dose (MGD).

  8. SU-F-J-178: A Computer Simulation Model Observer for Task-Based Image Quality Assessment in Radiation Therapy

    International Nuclear Information System (INIS)

    Dolly, S; Mutic, S; Anastasio, M; Li, H; Yu, L

    2016-01-01

    Purpose: Traditionally, image quality in radiation therapy is assessed subjectively or by utilizing physically-based metrics. Some model observers exist for task-based medical image quality assessment, but almost exclusively for diagnostic imaging tasks. As opposed to disease diagnosis, the task for image observers in radiation therapy is to utilize the available images to design and deliver a radiation dose which maximizes patient disease control while minimizing normal tissue damage. The purpose of this study was to design and implement a new computer simulation model observer to enable task-based image quality assessment in radiation therapy. Methods: A modular computer simulation framework was developed to resemble the radiotherapy observer by simulating an end-to-end radiation therapy treatment. Given images and the ground-truth organ boundaries from a numerical phantom as inputs, the framework simulates an external beam radiation therapy treatment and quantifies patient treatment outcomes using the previously defined therapeutic operating characteristic (TOC) curve. As a preliminary demonstration, TOC curves were calculated for various CT acquisition and reconstruction parameters, with the goal of assessing and optimizing simulation CT image quality for radiation therapy. Sources of randomness and bias within the system were analyzed. Results: The relationship between CT imaging dose and patient treatment outcome was objectively quantified in terms of a singular value, the area under the TOC (AUTOC) curve. The AUTOC decreases more rapidly for low-dose imaging protocols. AUTOC variation introduced by the dose optimization algorithm was approximately 0.02%, at the 95% confidence interval. Conclusion: A model observer has been developed and implemented to assess image quality based on radiation therapy treatment efficacy. It enables objective determination of appropriate imaging parameter values (e.g. imaging dose). Framework flexibility allows for incorporation

  9. A Monte-Carlo simulation framework for joint optimisation of image quality and patient dose in digital paediatric radiography

    International Nuclear Information System (INIS)

    Menser, Bernd; Manke, Dirk; Mentrup, Detlef; Neitzel, Ulrich

    2016-01-01

    In paediatric radiography, according to the as low as reasonably achievable (ALARA) principle, the imaging task should be performed with the lowest possible radiation dose. This paper describes a Monte-Carlo simulation framework for dose optimisation of imaging parameters in digital paediatric radiography. Patient models with high spatial resolution and organ segmentation enable the simultaneous evaluation of image quality and patient dose on the same simulated radiographic examination. The accuracy of the image simulation is analysed by comparing simulated and acquired images of technical phantoms. As a first application example, the framework is applied to optimise tube voltage and pre-filtration in newborn chest radiography. At equal patient dose, the highest CNR is obtained with low-kV settings in combination with copper filtration. (authors)

  10. Quality comparison between DEF-10 digital images from the simulation technique and the computed radiography (CR) technique in industrial radiography

    International Nuclear Information System (INIS)

    Siti Nur Syatirah Ismail

    2012-01-01

    The study was conducted to compare the quality of DEF-10 digital images produced by the simulation and computed radiography (CR) techniques. The sample used is steel DEF-10 with a thickness of 15.28 mm. In this study, the sample is exposed to radiation from an X-ray machine (ISOVOLT Titan E) with specified parameters. The parameters used in this study, such as current, voltage, exposure time and distance, are specified. The current and distance are fixed at 3 mA and 700 mm respectively, while the applied voltage varies at 140, 160, 180 and 200 kV. The exposure time is reduced at rates of 0, 20, 40, 60 and 80 % for each sample exposure. The simulated digital image is produced with the aRTist software, whereas the computed radiography digital image is produced from an imaging plate. Both images were then compared qualitatively (sensitivity) and quantitatively (Signal-to-Noise Ratio, SNR; Basic Spatial Resolution, SRb; and LOP size) using the Isee software. Radiographic sensitivity is indicated by the Image Quality Indicator (IQI), that is, the ability of the CR system and the aRTist software to identify a wire-type IQI when the exposure time is reduced by up to 80% according to the exposure chart (D7; ISOVOLT Titan E). The thinnest wire resolved in both the simulated and CR radiographs was wire number 7, rather than wire number 8 as required by the standard. In the quantitative comparison, this study shows that the SNR values decrease with reducing exposure time. SRb values increase for the simulation and decrease for CR when the exposure time decreases, and good image quality can still be achieved at 80% reduced exposure time. High SNR and SRb values produced good image quality in the CR and simulation techniques respectively. (author)

  11. SU-E-J-89: Comparative Analysis of MIM and Velocity’s Image Deformation Algorithm Using Simulated KV-CBCT Images for Quality Assurance

    Energy Technology Data Exchange (ETDEWEB)

    Cline, K; Narayanasamy, G; Obediat, M; Stanley, D; Stathakis, S; Kirby, N [University of Texas Health Science Center at San Antonio, Cancer Therapy and Research Center, San Antonio, TX (United States); Kim, H [University of California San Francisco, San Francisco, CA (United States)

    2015-06-15

    Purpose: Deformable image registration (DIR) is used routinely in the clinic without a formalized quality assurance (QA) process. Using simulated deformations to digitally deform images in a known way and comparing to DIR algorithm predictions is a powerful technique for DIR QA. This technique must also simulate realistic image noise and artifacts, especially between modalities. This study developed an algorithm to create simulated daily kV cone-beam computed-tomography (CBCT) images from CT images for DIR QA between these modalities. Methods: A Catphan and physical head-and-neck phantom, with known deformations, were used. CT and kV-CBCT images of the Catphan were utilized to characterize the changes in Hounsfield units, noise, and image cupping that occur between these imaging modalities. The algorithm then imprinted these changes onto a CT image of the deformed head-and-neck phantom, thereby creating a simulated-CBCT image. CT and kV-CBCT images of the undeformed and deformed head-and-neck phantom were also acquired. The Velocity and MIM DIR algorithms were applied between the undeformed CT image and each of the deformed CT, CBCT, and simulated-CBCT images to obtain predicted deformations. The error between the known and predicted deformations was used as a metric to evaluate the quality of the simulated-CBCT image. Ideally, the simulated-CBCT image registration would produce the same accuracy as the deformed CBCT image registration. Results: For Velocity, the mean error was 1.4 mm for the CT-CT registration, 1.7 mm for the CT-CBCT registration, and 1.4 mm for the CT-simulated-CBCT registration. These same numbers were 1.5, 4.5, and 5.9 mm, respectively, for MIM. Conclusion: All cases produced similar accuracy for Velocity. MIM produced similar values of accuracy for CT-CT registration, but was not as accurate for CT-CBCT registrations. The MIM simulated-CBCT registration followed this same trend, but overestimated MIM DIR errors relative to the CT

  12. SU-E-J-89: Comparative Analysis of MIM and Velocity’s Image Deformation Algorithm Using Simulated KV-CBCT Images for Quality Assurance

    International Nuclear Information System (INIS)

    Cline, K; Narayanasamy, G; Obediat, M; Stanley, D; Stathakis, S; Kirby, N; Kim, H

    2015-01-01

    Purpose: Deformable image registration (DIR) is used routinely in the clinic without a formalized quality assurance (QA) process. Using simulated deformations to digitally deform images in a known way and comparing to DIR algorithm predictions is a powerful technique for DIR QA. This technique must also simulate realistic image noise and artifacts, especially between modalities. This study developed an algorithm to create simulated daily kV cone-beam computed-tomography (CBCT) images from CT images for DIR QA between these modalities. Methods: A Catphan and physical head-and-neck phantom, with known deformations, were used. CT and kV-CBCT images of the Catphan were utilized to characterize the changes in Hounsfield units, noise, and image cupping that occur between these imaging modalities. The algorithm then imprinted these changes onto a CT image of the deformed head-and-neck phantom, thereby creating a simulated-CBCT image. CT and kV-CBCT images of the undeformed and deformed head-and-neck phantom were also acquired. The Velocity and MIM DIR algorithms were applied between the undeformed CT image and each of the deformed CT, CBCT, and simulated-CBCT images to obtain predicted deformations. The error between the known and predicted deformations was used as a metric to evaluate the quality of the simulated-CBCT image. Ideally, the simulated-CBCT image registration would produce the same accuracy as the deformed CBCT image registration. Results: For Velocity, the mean error was 1.4 mm for the CT-CT registration, 1.7 mm for the CT-CBCT registration, and 1.4 mm for the CT-simulated-CBCT registration. These same numbers were 1.5, 4.5, and 5.9 mm, respectively, for MIM. Conclusion: All cases produced similar accuracy for Velocity. MIM produced similar values of accuracy for CT-CT registration, but was not as accurate for CT-CBCT registrations. The MIM simulated-CBCT registration followed this same trend, but overestimated MIM DIR errors relative to the CT

  13. Image simulation and a model of noise power spectra across a range of mammographic beam qualities

    Energy Technology Data Exchange (ETDEWEB)

    Mackenzie, Alistair, E-mail: alistairmackenzie@nhs.net; Dance, David R.; Young, Kenneth C. [National Coordinating Centre for the Physics of Mammography, Royal Surrey County Hospital, Guildford GU2 7XX, United Kingdom and Department of Physics, University of Surrey, Guildford GU2 7XH (United Kingdom); Diaz, Oliver [Centre for Vision, Speech and Signal Processing, Faculty of Engineering and Physical Sciences, University of Surrey, Guildford GU2 7XH, United Kingdom and Computer Vision and Robotics Research Institute, University of Girona, Girona 17071 (Spain)

    2014-12-15

    Purpose: The aim of this work is to create a model to predict the noise power spectra (NPS) for a range of mammographic radiographic factors. The noise model was necessary to degrade images acquired on one system to match the image quality of different systems for a range of beam qualities. Methods: Five detectors and x-ray systems [Hologic Selenia (ASEh), Carestream computed radiography CR900 (CRc), GE Essential (CSI), Carestream NIP (NIPc), and Siemens Inspiration (ASEs)] were characterized for this study. The signal transfer property was measured as the pixel value against absorbed energy per unit area (E) at a reference beam quality of 28 kV, Mo/Mo or 29 kV, W/Rh with 45 mm polymethyl methacrylate (PMMA) at the tube head. The contributions of the three noise sources (electronic, quantum, and structure) to the NPS were calculated by fitting a quadratic at each spatial frequency of the NPS against E. A quantum noise correction factor which was dependent on beam quality was quantified using a set of images acquired over a range of radiographic factors with different thicknesses of PMMA. The noise model was tested for images acquired at 26 kV, Mo/Mo with 20 mm PMMA and 34 kV, Mo/Rh with 70 mm PMMA for three detectors (ASEh, CRc, and CSI) over a range of exposures. The NPS were modeled with and without the noise correction factor and compared with the measured NPS. A previous method for adapting an image to appear as if acquired on a different system was modified to allow the reference beam quality to be different from the beam quality of the image. The method was validated by adapting the ASEh flat field images with two thicknesses of PMMA (20 and 70 mm) to appear with the imaging characteristics of the CSI and CRc systems. Results: The quantum noise correction factor rises with higher beam qualities, except for CR systems at high spatial frequencies, where a flat response was found against mean photon energy. This is due to the dominance of secondary quantum noise

  14. Improving Conductivity Image Quality Using Block Matrix-based Multiple Regularization (BMMR) Technique in EIT: A Simulation Study

    Directory of Open Access Journals (Sweden)

    Tushar Kanti Bera

    2011-06-01

    A Block Matrix-based Multiple Regularization (BMMR) technique is proposed for improving conductivity image quality in EIT. The response matrix (JᵀJ) has been partitioned into several sub-block matrices, and the highest eigenvalue of each sub-block matrix has been chosen as the regularization parameter for the nodes contained in that sub-block. Simulated boundary data are generated for a circular domain with circular inhomogeneity, and the conductivity images are reconstructed with a Model Based Iterative Image Reconstruction (MoBIIR) algorithm. Conductivity images are reconstructed with the BMMR technique and the results are compared with the Single-step Tikhonov Regularization (STR) and modified Levenberg-Marquardt Regularization (LMR) methods. It is observed that the BMMR technique reduces the projection error and solution error and improves the conductivity reconstruction in EIT. Results show that the BMMR method also improves the image contrast and inhomogeneity conductivity profile, and hence the reconstructed image quality is enhanced. doi:10.5617/jeb.170 J Electr Bioimp, vol. 2, pp. 33-47, 2011
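    As an illustration of the block-partitioning idea described above, the sketch below chooses one regularization parameter per diagonal sub-block of JᵀJ as the largest eigenvalue of that sub-block and uses it in a single regularized Gauss-Newton style update. It is a minimal, hypothetical rendering of the BMMR idea, not the authors' MoBIIR implementation; matrix sizes and block size are arbitrary.

```python
import numpy as np

def bmmr_system_matrix(J, block_size):
    """Build J^T J + diag(lambda), with lambda chosen block-wise (BMMR sketch).

    Each diagonal sub-block of J^T J contributes its largest eigenvalue as the
    regularization parameter for the nodes belonging to that block.
    """
    JtJ = J.T @ J
    n = JtJ.shape[0]
    lam = np.empty(n)
    for start in range(0, n, block_size):
        stop = min(start + block_size, n)
        sub = JtJ[start:stop, start:stop]
        lam[start:stop] = np.linalg.eigvalsh(sub)[-1]  # largest eigenvalue of the sub-block
    return JtJ + np.diag(lam)

# Usage sketch: one regularized update for the conductivity change d_sigma
# from a boundary-voltage mismatch dv (toy sizes, random Jacobian).
rng = np.random.default_rng(0)
J = rng.normal(size=(208, 64))          # 208 boundary measurements, 64 mesh nodes
dv = rng.normal(size=208)
A = bmmr_system_matrix(J, block_size=16)
d_sigma = np.linalg.solve(A, J.T @ dv)
```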

  15. Improving image quality in Electrical Impedance Tomography (EIT) using Projection Error Propagation-based Regularization (PEPR) technique: A simulation study

    Directory of Open Access Journals (Sweden)

    Tushar Kanti Bera

    2011-03-01

    A Projection Error Propagation-based Regularization (PEPR) method is proposed to improve the reconstructed image quality in Electrical Impedance Tomography (EIT). A projection error is produced due to the misfit between the calculated and measured data in the reconstruction process. The variation of the projection error is integrated with the response matrix in each iteration and the reconstruction is carried out in EIDORS. The PEPR method is studied with simulated boundary data for different inhomogeneity geometries. Simulated results demonstrate that the PEPR technique improves image reconstruction precision in EIDORS and hence can be successfully implemented to increase the reconstruction accuracy in EIT. doi:10.5617/jeb.158 J Electr Bioimp, vol. 2, pp. 2-12, 2011

  16. Investigation of influence of 16-slice spiral CT electrocardiogram-controlled dose modulation on exposure dosage and image quality of cardiac CT imaging under simulated fluctuant heart rate

    International Nuclear Information System (INIS)

    Yin Yan; Chen Jie; Chai Weiming; Hua Jia; Gao Na; Xu Jianrong; Shen Yun

    2008-01-01

    Objective: To investigate the influence of electrocardiogram (ECG)-controlled dose modulation on exposure dosage and image quality of cardiac CT imaging in a cardiac phantom with a simulated fluctuant heart rate. Methods: The basal heart rate of the cardiac pulsating phantom was set at 60 bpm, and the experimental situations were divided into 6 groups according to different heart rates. Cardiac imaging was first performed on the phantom with the ECG-controlled dose modulation turned off. The exposure dosage of each scan sequence was documented. The standard deviation of the CT values of the phantom was measured on the central slice after coronal reformation of the raw data. The quality of the 2D and 3D images was scored. Cardiac imaging was then performed with ECG modulation turned on, in four groups according to different modulation parameters. All data were documented as before. The results from the five groups with and without ECG modulation current were analyzed by the F test and comparative rank sum test using the statistical software SPSS 10.0. Results: Statistical analysis showed no significant difference (P>0.05) between the SNR of images (SD values of 27.78 and 26.30) from the groups with full mA output at a wide reconstruction phase (69%-99%) when the heart rate was fluctuant (≥7.5 bpm). There was also no significant difference (P>0.05) between the quality of the 2D and 3D images. However, a significant difference (P<0.05) was found; when the heart rate fluctuation exceeded 12.5 bpm, the exposure dosage increased markedly (from 0.6 to 1.7 mSv). Conclusion: For cardiac imaging with 16-slice row CT, the application of ECG-modulated current can effectively reduce the exposure dosage without compromising the image quality, even if the heart rate is fluctuant. (authors)

  17. Water Quality Analysis Simulation

    Science.gov (United States)

    The Water Quality Analysis Simulation Program (WASP) is an enhancement of the original WASP. This model helps users interpret and predict water quality responses to natural phenomena and man-made pollution for various pollution management decisions.

  18. Water Quality Analysis Simulation

    Data.gov (United States)

    U.S. Environmental Protection Agency — The Water Quality Analysis Simulation Program, an enhancement of the original WASP. This model helps users interpret and predict water quality responses to natural...

  19. Retinal image quality and visual stimuli processing by simulation of partial eye cataract

    Science.gov (United States)

    Ozolinsh, Maris; Danilenko, Olga; Zavjalova, Varvara

    2016-10-01

    Visual stimuli were presented on a 4.3'' mobile phone screen inside a "Virtual Reality" adapter that allowed separation of the left and right eye visual fields. The contrast of the retinal image can thus be controlled by the image on the phone screen and, in parallel at appropriate geometry, by the AC voltage applied to a scattering PDLC cell inside the adapter. Such optical pathway separation allows spatially variant images to be shown to the two eyes, which after binocular fusion acquire their characteristic appearance. As visual stimuli we used grey and differently coloured (red-green, two opponent components of vision in L*a*b* color space) spatially periodic stimuli for the left and right eyes, with spatial content that by addition or subtraction resulted in clockwise or counterclockwise slanted Gabor gratings. We performed computer modelling with numerical addition or subtraction of signals, similar to processing in the brain via decomposition of the stimulus input into luminance and colour-opponency components. It revealed the dependence of the psychophysical equilibrium point between clockwise and counterclockwise perception of the summation on the image contrast and colour saturation of one eye, and on the strength of retinal aftereffects. A psychophysical equilibrium point in the perception of the summation exists only after prior adaptation to a slanted periodic grating, at the appropriate slant orientation of the adaptation grating and/or at the appropriate spatial phase of the grating pattern relative to the grating nodes. Observer perception experiments, in which the images for one eye were deteriorated by simulated cataract, confirmed a shift of this psychophysical equilibrium point with the degree of artificial cataract. We also analysed the emission spectra of mobile-device stimuli, paying attention to spectral regions around the absorption maxima of the macular pigments and to blue regions where intense irradiation can cause abnormalities in periodic melatonin

  20. Task-based image quality evaluation of iterative reconstruction methods for low dose CT using computer simulations

    Science.gov (United States)

    Xu, Jingyan; Fuld, Matthew K.; Fung, George S. K.; Tsui, Benjamin M. W.

    2015-04-01

    Iterative reconstruction (IR) methods for x-ray CT are a promising approach to improve image quality or reduce radiation dose to patients. The goal of this work was to use task-based image quality measures and the channelized Hotelling observer (CHO) to evaluate both analytic and IR methods for clinical x-ray CT applications. We performed realistic computer simulations at five radiation dose levels, from a clinical reference low dose D0 to 25% D0. A lesion of fixed size and contrast was inserted at different locations into the liver of the XCAT phantom to simulate a weak signal. The simulated data were reconstructed on a commercial CT scanner (SOMATOM Definition Flash; Siemens, Forchheim, Germany) using the vendor-provided analytic (WFBP) and IR (SAFIRE) methods. The reconstructed images were analyzed by CHOs with both rotationally symmetric (RS) and rotationally oriented (RO) channels, and with different numbers of lesion locations (5, 10, and 20), in a signal known exactly (SKE), background known exactly but variable (BKEV) detection task. The area under the receiver operating characteristic curve (AUC) was used as a summary measure to compare the IR and analytic methods; the AUC was also used as the equal-performance criterion to derive the potential dose reduction factor of IR. In general, there was good agreement in the relative AUC values of different reconstruction methods using CHOs with RS and RO channels, although the CHO with RO channels achieved higher AUCs than with RS channels. The improvement of IR over analytic methods depends on the dose level. The reference dose level D0 was based on a clinical low dose protocol, lower than the standard dose due to the use of IR methods. At 75% D0, the performance improvement was statistically significant (p < 0.05). The potential dose reduction factor also depended on the detection task. For the SKE/BKEV task involving 10 lesion locations, a dose reduction of at least 25% from D0 was achieved.
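    The detection-task machinery referred to above (channelized Hotelling observer plus AUC) can be sketched compactly. The example below uses rotationally symmetric difference-of-Gaussians channels and a Mann-Whitney AUC estimate; the channel definitions, lesion, and noise model are illustrative assumptions, not the study's RS/RO channels or XCAT data.

```python
import numpy as np

def dog_channels(size, n_channels=5, sigma0=2.0, ratio=1.67):
    """Rotationally symmetric difference-of-Gaussians channels (illustrative choice)."""
    y, x = np.mgrid[:size, :size] - size // 2
    r2 = x**2 + y**2
    chans = []
    for j in range(n_channels):
        s1, s2 = sigma0 * ratio**j, sigma0 * ratio**(j + 1)
        ch = np.exp(-r2 / (2 * s1**2)) - np.exp(-r2 / (2 * s2**2))
        chans.append(ch.ravel() / np.linalg.norm(ch))
    return np.array(chans).T                      # shape: (pixels, channels)

def cho_auc(present_imgs, absent_imgs, U):
    """Train a channelized Hotelling observer and return its detection AUC."""
    v1 = present_imgs.reshape(len(present_imgs), -1) @ U   # channel outputs, signal present
    v0 = absent_imgs.reshape(len(absent_imgs), -1) @ U     # channel outputs, signal absent
    S = 0.5 * (np.cov(v1, rowvar=False) + np.cov(v0, rowvar=False))
    w = np.linalg.solve(S, v1.mean(0) - v0.mean(0))        # Hotelling template in channel space
    t1, t0 = v1 @ w, v0 @ w                                # observer test statistics
    # AUC via the Mann-Whitney statistic (ties counted as 0.5)
    return (t1[:, None] > t0[None, :]).mean() + 0.5 * (t1[:, None] == t0[None, :]).mean()

# Toy usage: weak square "lesion" in white noise, 100 images per class.
rng = np.random.default_rng(0)
size = 64
lesion = np.zeros((size, size)); lesion[28:36, 28:36] = 0.5
absent = rng.normal(size=(100, size, size))
present = rng.normal(size=(100, size, size)) + lesion
print("CHO AUC:", cho_auc(present, absent, dog_channels(size)))
```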

  1. Four corneal presbyopia corrections: simulations of optical consequences on retinal image quality.

    Science.gov (United States)

    Koller, Tobias; Seiler, Theo

    2006-12-01

    To investigate the possibility of multifocal or aspherical treatment of the cornea with optical ray tracing. Institute for Refractive and Ophthalmic Surgery, Zurich, Switzerland. The optical consequences for presbyopia correction of 4 corneal shapes (global optimum (GO) for curvature and asphericity, central steep island (CSI), decentered steep island (DSI), and centered steep annulus (CSA)) were analyzed using a modified Liou-Brennan eye model and ray tracing with commercial optical design software (Zemax, Zemax Development Corp.). The ocular optical configuration for far vision was a point light source at a distance of 5 m, 1 degree up, and a pupil diameter of 5.0 mm; for near vision, 0.4 m distance, 1 degree up, and a pupil diameter of 2.5 mm. The curvature radius (R) of the cornea and its asphericity (Q) were used as operands to optimize (simultaneously for near and far vision) the quality of the retinal image, described by means of the minimum spot diameter or the root-mean-square (RMS) wavefront error. Starting from an emmetropic eye optimized for R and Q, the RMS wavefront error at the retina was 0.07 microm (far) and 1.42 microm (near). The GO resulted in a wavefront error of 1.42 microm (far) and 0.52 microm (near); improvement of near vision using reading glasses is possible. The CSI yielded 0.91 microm (far) and 0.13 microm (near); spectacles did not improve far or near vision. The DSI and CSA had significantly worse results for near and far vision. Of the options studied, GO and CSI seemed the most promising alternatives for corneal presbyopia correction. Although reading glasses can improve near vision in GO, reading glasses did not improve near vision in CSI-treated eyes. The CSI treatment is critically dependent on centration and a reverse treatment is difficult to achieve.

  2. Imaging Food Quality

    DEFF Research Database (Denmark)

    Møller, Flemming

    Imaging and spectroscopy have long been established methods for food quality control, both in the laboratory and online. An ever increasing number of analytical techniques are being developed into imaging methods, and existing imaging methods are being extended to contain spectral information. Images and especially...... spectral images contain large amounts of data which should be analysed appropriately by techniques combining structural and spectral information. This dissertation deals with how different types of food quality can be measured by imaging techniques, analysed with appropriate image analysis techniques...... and finally how the image data can be used to predict or visualise food quality. A range of different food quality parameters was addressed, i.e. water distribution in bread throughout storage, time series analysis of chocolate milk stability, yoghurt glossiness, graininess and dullness, and finally structure and meat...

  3. Image simulation using LOCUS

    International Nuclear Information System (INIS)

    Strachan, J.D.; Roberts, J.A.

    1989-09-01

    The LOCUS data base program has been used to simulate images and to solve simple equations. This has been accomplished by making each record (which normally would represent a data entry) represent sequenced or random number pairs.

  4. Dynamic 99mTc-MAG3 renography: images for quality control obtained by combining pharmacokinetic modelling, an anthropomorphic computer phantom and Monte Carlo simulated scintillation camera imaging

    Science.gov (United States)

    Brolin, Gustav; Sjögreen Gleisner, Katarina; Ljungberg, Michael

    2013-05-01

    In dynamic renal scintigraphy, the main interest is the radiopharmaceutical redistribution as a function of time. Quality control (QC) of renal procedures often relies on phantom experiments to compare image-based results with the measurement setup. A phantom with a realistic anatomy and time-varying activity distribution is therefore desirable. This work describes a pharmacokinetic (PK) compartment model for 99mTc-MAG3, used for defining a dynamic whole-body activity distribution within a digital phantom (XCAT) for accurate Monte Carlo (MC)-based images for QC. Each phantom structure is assigned a time-activity curve provided by the PK model, employing parameter values consistent with MAG3 pharmacokinetics. This approach ensures that the total amount of tracer in the phantom is preserved between time points, and it allows for modifications of the pharmacokinetics in a controlled fashion. By adjusting parameter values in the PK model, different clinically realistic scenarios can be mimicked, regarding, e.g., the relative renal uptake and renal transit time. Using the MC code SIMIND, a complete set of renography images including effects of photon attenuation, scattering, limited spatial resolution and noise, are simulated. The obtained image data can be used to evaluate quantitative techniques and computer software in clinical renography.
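    A minimal sketch of the compartment idea, under assumed first-order kinetics, is shown below: a hypothetical plasma-kidney-bladder chain is integrated to give time-activity curves whose sum is conserved, mirroring the tracer-conservation property described above. The rate constants and compartment structure are illustrative, not the published MAG3 model.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Hypothetical first-order rate constants (1/min); not the published MAG3 model.
K_PLASMA_TO_KIDNEY = 0.25
K_KIDNEY_TO_BLADDER = 0.20

def rhs(t, y):
    """Plasma -> kidney -> bladder chain with first-order transfer."""
    plasma, kidney, bladder = y
    return [-K_PLASMA_TO_KIDNEY * plasma,
            K_PLASMA_TO_KIDNEY * plasma - K_KIDNEY_TO_BLADDER * kidney,
            K_KIDNEY_TO_BLADDER * kidney]

# 20-minute renogram sampled every 30 s, starting with all activity in plasma.
t_eval = np.arange(0.0, 20.5, 0.5)
sol = solve_ivp(rhs, (0.0, 20.5), [1.0, 0.0, 0.0], t_eval=t_eval)
plasma_tac, kidney_tac, bladder_tac = sol.y   # time-activity curves (fraction of injected activity)

# Total activity is conserved at every time point, as required for the phantom.
assert np.allclose(sol.y.sum(axis=0), 1.0)
```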

  5. Social image quality

    Science.gov (United States)

    Qiu, Guoping; Kheiri, Ahmed

    2011-01-01

    Current subjective image quality assessments have been developed in laboratory environments, under controlled conditions, and are dependent on the participation of limited numbers of observers. In this research, with the help of Web 2.0 and social media technology, a new method for building a subjective image quality metric has been developed in which the observers are Internet users. A website with a simple user interface that enables Internet users from anywhere at any time to vote for the better quality version of a pair of the same image has been constructed. Users' votes are recorded and used to rank the images according to their perceived visual qualities. We have developed three rank aggregation algorithms to process the recorded pair comparison data: the first uses a naive approach, the second employs a Condorcet method, and the third uses Dykstra's extension of the Bradley-Terry method. The website has been collecting data for about three months and had accumulated over 10,000 votes at the time of writing this paper. Results show that the Internet and its allied technologies such as crowdsourcing offer a promising new paradigm for image and video quality assessment, where hundreds of thousands of Internet users can contribute to building more robust image quality metrics. We have made the Internet user generated social image quality (SIQ) data of a public image database available online (http://www.hdri.cs.nott.ac.uk/siq/) to provide the image quality research community with a new source of ground truth data. The website continues to collect votes and will include more public image databases; it will also be extended to include videos to collect social video quality (SVQ) data. All data will be made publicly available on the website in due course.
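    For the pairwise-vote aggregation step, a plain Bradley-Terry fit (the classic Zermelo/MM iteration) can be sketched as below. It is a simplified stand-in for the paper's Dykstra extension and Condorcet variants, with a made-up vote matrix for illustration.

```python
import numpy as np

def bradley_terry(wins, n_iter=200):
    """Fit Bradley-Terry strengths from a pairwise win-count matrix.

    wins[i, j] = number of votes preferring image i over image j.
    Returns one strength score per image (higher = better perceived quality),
    using the classic Zermelo/MM fixed-point iteration.
    """
    n = wins.shape[0]
    games = wins + wins.T                 # total comparisons per pair
    p = np.ones(n)
    for _ in range(n_iter):
        W = wins.sum(axis=1)              # total wins of each image
        denom = (games / (p[:, None] + p[None, :] + 1e-12)).sum(axis=1)
        p = W / np.maximum(denom, 1e-12)
        p /= p.sum()                      # fix the overall scale
    return p

# Toy usage: 4 images; image 0 is preferred most often.
votes = np.array([[0, 9, 8, 7],
                  [1, 0, 6, 5],
                  [2, 4, 0, 6],
                  [3, 5, 4, 0]])
print(np.argsort(-bradley_terry(votes)))   # ranking, best first
```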

  6. Image quality in mammography

    International Nuclear Information System (INIS)

    Haus, A.G.; Doi, K.; Metz, C.E.; Bernstein, J.

    1976-01-01

    In mammography, image quality is a function of the shape, size, and x-ray absorption properties of the anatomic part to be radiographed and of the lesion to be detected; it also depends on geometric unsharpness, and the resolution, characteristic curve and noise properties of the recording system. X-ray energy spectra, modulation transfer functions, Wiener spectra, characteristic and gradient curves, and radiographs of a breast phantom and of a resected breast specimen containing microcalcifications are used in a review of some current considerations of the factors, and the complex relationship among factors, that affect image quality in mammography. Image quality and patient radiation exposure in mammography are interrelated. An approach to the problem of evaluating the trade-off between diagnostic certainty and the cost or risk of performing a breast imaging procedure is discussed

  7. Producing quality radiographic images

    International Nuclear Information System (INIS)

    Cullinan, A.M.

    1987-01-01

    This book gives an overview of physics, equipment, imaging, and quality assurance in the radiology department. The chapters are laid out with generous use of subheads to allow for quick reference. Points are illustrated with clear, uncluttered line diagrams and well-produced images. The accompanying explanations are miniature lessons by themselves. Inserted at various points throughout the text are important notes that highlight key concepts. The chapter ''Image Evaluation and Application of Radiographic Principles'' presents a systematic approach to evaluating radiographs and contains several sample radiographs to illustrate the points made.

  8. Fast simulation of ultrasound images

    DEFF Research Database (Denmark)

    Jensen, Jørgen Arendt; Nikolov, Svetoslav

    2000-01-01

    , and a whole image can take a full day. Simulating 3D images and 3D flow takes even more time. A 3D image of 64 by 64 lines can take 21 days, which is not practical for iterative work. This paper presents a new fast simulation method based on the Field II program. In imaging the same spatial impulse response...

  9. Simulation of Hyperspectral Images

    Science.gov (United States)

    Richsmeier, Steven C.; Singer-Berk, Alexander; Bernstein, Lawrence S.

    2004-01-01

    A software package generates simulated hyperspectral imagery for use in validating algorithms that generate estimates of Earth-surface spectral reflectance from hyperspectral images acquired by airborne and spaceborne instruments. This software is based on a direct simulation Monte Carlo approach for modeling three-dimensional atmospheric radiative transport, as well as reflections from surfaces characterized by spatially inhomogeneous bidirectional reflectance distribution functions. In this approach, "ground truth" is accurately known through input specification of surface and atmospheric properties, and it is practical to consider wide variations of these properties. The software can treat both land and ocean surfaces, as well as the effects of finite clouds with surface shadowing. The spectral/spatial data cubes computed by use of this software can serve both as a substitute for, and a supplement to, field validation data.

  10. Visual air quality simulation techniques

    Science.gov (United States)

    Molenar, John V.; Malm, William C.; Johnson, Christopher E.

    Visual air quality is primarily a human perceptual phenomenon beginning with the transfer of image-forming information through an illuminated, scattering and absorbing atmosphere. Visibility, especially the visual appearance of industrial emissions or the degradation of a scenic view, is the principal atmospheric characteristic through which humans perceive air pollution, and is more sensitive to changing pollution levels than any other air pollution effect. Every attempt to quantify economic costs and benefits of air pollution has indicated that good visibility is a highly valued and desired environmental condition. Measurement programs can at best approximate the state of the ambient atmosphere at a few points in a scenic vista viewed by an observer. To fully understand the visual effect of various changes in the concentration and distribution of optically important atmospheric pollutants requires the use of aerosol and radiative transfer models. Communication of the output of these models to scientists, decision makers and the public is best done by applying modern image-processing systems to generate synthetic images representing the modeled air quality conditions. This combination of modeling techniques has been under development for the past 15 yr. Initially, for lack of computational power, visual air quality simulations were limited to simplified models depicting Gaussian plumes or uniform haze conditions. Recent explosive growth in low cost, high powered computer technology has allowed the development of sophisticated aerosol and radiative transfer models that incorporate realistic terrain, multiple scattering, non-uniform illumination, varying spatial distribution, concentration and optical properties of atmospheric constituents, and relative humidity effects on aerosol scattering properties. This paper discusses these improved models and image-processing techniques in detail. Results addressing uniform and non-uniform layered haze conditions in both

  11. Quality assurance: image production and film quality

    International Nuclear Information System (INIS)

    Abd Aziz Mhd Ramli

    2004-01-01

    The contents of this chapter are as follows - Factors Affecting Image Quality and Patient Dose: Quality Control in Diagnostic Radiology, Mechanical Safety, Electrical Safety, Radiation Protection, Performance and Safety Standard, Calibration of QC Test Tools

  12. High-quality compressive ghost imaging

    Science.gov (United States)

    Huang, Heyan; Zhou, Cheng; Tian, Tian; Liu, Dongqi; Song, Lijun

    2018-04-01

    We propose a high-quality compressive ghost imaging method based on projected Landweber regularization and a guided filter, which effectively reduces the undersampling noise and improves the resolution. In our scheme, the original object is reconstructed by decomposing the compressive reconstruction process into regularization and denoising steps instead of solving a minimization problem. The simulation and experimental results show that our method can obtain high ghost imaging quality in terms of PSNR and visual observation.
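    The projected Landweber part of the scheme can be sketched as a plain gradient step on the data-fidelity term followed by projection onto non-negativity; the guided-filter denoising stage mentioned above is omitted, and the speckle patterns and object below are toy assumptions.

```python
import numpy as np

def projected_landweber(A, y, n_iter=200, step=None):
    """Reconstruct a non-negative image x from bucket measurements y = A @ x.

    A: measurement matrix (each row is one flattened speckle pattern).
    The projection step enforces the non-negativity constraint every iteration.
    """
    if step is None:
        # Convergent Landweber step size: 0 < step < 2 / ||A||_2^2
        step = 1.0 / np.linalg.norm(A, 2) ** 2
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        x = x + step * (A.T @ (y - A @ x))   # gradient step on 0.5 * ||y - Ax||^2
        x = np.clip(x, 0.0, None)            # projection onto x >= 0
    return x

# Toy usage: 32x32 object sampled with 400 random speckle patterns (~40% sampling).
rng = np.random.default_rng(1)
obj = np.zeros((32, 32)); obj[10:22, 12:20] = 1.0
A = rng.random((400, 32 * 32))
y = A @ obj.ravel()
rec = projected_landweber(A, y).reshape(32, 32)
```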

  13. REMOTE SENSING IMAGE QUALITY ASSESSMENT EXPERIMENT WITH POST-PROCESSING

    Directory of Open Access Journals (Sweden)

    W. Jiang

    2018-04-01

    This paper briefly describes a post-processing influence assessment experiment. The experiment includes three steps: physical simulation, image processing, and image quality assessment. The physical simulation models a sampled imaging system in the laboratory; the imaging system parameters are tested, and the digital images serving as image-processing input are produced by this imaging system with the same imaging system parameters. The gathered optically sampled images with the tested imaging parameters are processed by three digital image processes, including calibration pre-processing, lossy compression with different compression ratios, and image post-processing with different cores. The image quality assessment method used is just-noticeable-difference (JND) subjective assessment based on ISO 20462; through subjective assessment of the gathered and processed images, the influence of different imaging parameters and of post-processing on image quality can be found. The six JND subjective assessment experimental data sets can be validated against each other. Main conclusions include: image post-processing can improve image quality; image post-processing can improve image quality even with lossy compression, although image quality with a higher compression ratio improves less than with a lower ratio; and with our image post-processing method, image quality is better when the camera MTF is within a small range.

  14. Evaluation of tomographic image quality of extended and conventional parallel hole collimators using maximum likelihood expectation maximization algorithm by Monte Carlo simulations.

    Science.gov (United States)

    Moslemi, Vahid; Ashoor, Mansour

    2017-10-01

    One of the major problems associated with parallel hole collimators (PCs) is the trade-off between their resolution and sensitivity. To solve this problem, a novel PC, namely the extended parallel hole collimator (EPC), was proposed, in which particular trapezoidal denticles were added onto the septa on the side of the detector. In this study, an EPC was designed and its performance was compared with that of two PCs, PC35 and PC41, with a hole size of 1.5 mm and hole lengths of 35 and 41 mm, respectively. The Monte Carlo method was used to calculate important parameters such as resolution, sensitivity, scattering, and penetration ratio. A Jaszczak phantom was also simulated to evaluate the resolution and contrast of tomographic images, which were produced by the EPC6, PC35, and PC41 using the Monte Carlo N-particle version 5 code, and the tomographic images were reconstructed using the maximum likelihood expectation maximization algorithm. Sensitivity of the EPC6 was increased by 20.3% in comparison with that of the PC41 at identical spatial resolution and full-width at tenth maximum. Moreover, the penetration and scattering ratio of the EPC6 was 1.2% less than that of the PC41. The simulated phantom images show that the EPC6 increases contrast-resolution and contrast-to-noise ratio compared with those of PC41 and PC35. When compared with PC41 and PC35, EPC6 improved the trade-off between resolution and sensitivity, reduced penetration and scattering ratios, and produced images with higher quality. EPC6 can be used to increase the detectability of more details in nuclear medicine images.

  15. Quality measures in applications of image restoration.

    Science.gov (United States)

    Kriete, A; Naim, M; Schafer, L

    2001-01-01

    We describe a new method for the estimation of image quality in image restoration applications. We demonstrate this technique on a simulated data set of fluorescent beads, in comparison with restoration by three different deconvolution methods. Both the number of iterations and a regularisation factor are varied to enforce changes in the resulting image quality. First, the data sets are directly compared by an accuracy measure. These values serve to validate the image quality descriptor, which is developed on the basis of optical information theory. This most general measure takes into account the spectral energies and the noise, weighted in a logarithmic fashion. It is demonstrated that this method is particularly helpful as a user-oriented method to control the output of iterative image restorations and to eliminate the guesswork in choosing a suitable number of iterations.

  16. Medical Imaging Image Quality Assessment with Monte Carlo Methods

    International Nuclear Information System (INIS)

    Michail, C M; Fountos, G P; Kalyvas, N I; Valais, I G; Kandarakis, I S; Karpetas, G E; Martini, Niki; Koukou, Vaia

    2015-01-01

    The aim of the present study was to assess the image quality of PET scanners through a thin layer chromatography (TLC) plane source. The source was simulated using a previously validated Monte Carlo model. The model was developed using the GATE MC package, and reconstructed images were obtained with the STIR software for tomographic image reconstruction, with cluster computing. The PET scanner simulated in this study was the GE DiscoveryST. A plane source consisting of a TLC plate was simulated as a layer of silica gel on an aluminum (Al) foil substrate, immersed in an 18F-FDG bath solution (1 MBq). Image quality was assessed in terms of the Modulation Transfer Function (MTF). MTF curves were estimated from transverse reconstructed images of the plane source. Images were reconstructed by the maximum likelihood estimation (MLE)-OSMAPOSL algorithm. OSMAPOSL reconstruction was assessed using various subsets (3 to 21) and iterations (1 to 20), as well as various beta (hyper) parameter values. MTF values were found to increase up to the 12th iteration and to remain almost constant thereafter. MTF improves with lower beta values. The simulated PET evaluation method based on the TLC plane source can also be useful in research for the further development of PET and SPECT scanners through GATE simulations. (paper)
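    A minimal sketch of estimating an MTF from a reconstructed plane-source image is given below: profiles across the source are averaged into a line spread function and Fourier transformed. This mirrors the general procedure described above but is not the GATE/STIR pipeline itself; the background handling and source orientation are assumptions.

```python
import numpy as np

def mtf_from_plane_source(image, pixel_mm):
    """Estimate an MTF from a transverse image of a thin plane (line) source.

    Assumes the source runs along axis 0. The image is averaged along the
    source direction to obtain a line spread function (LSF), and the MTF is
    the modulus of its Fourier transform, normalized to unity at zero frequency.
    Returns (spatial frequencies in cycles/mm, MTF values).
    """
    lsf = image.mean(axis=0)                 # profile perpendicular to the source
    lsf = lsf - lsf.min()                    # crude background removal
    lsf = lsf / lsf.sum()
    mtf = np.abs(np.fft.rfft(lsf))
    mtf = mtf / mtf[0]
    freqs = np.fft.rfftfreq(lsf.size, d=pixel_mm)
    return freqs, mtf
```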

  17. Retinal image quality during accommodation.

    Science.gov (United States)

    López-Gil, Norberto; Martin, Jesson; Liu, Tao; Bradley, Arthur; Díaz-Muñoz, David; Thibos, Larry N

    2013-07-01

    We asked if retinal image quality is maximal during accommodation, or sub-optimal due to accommodative error, when subjects perform an acuity task. Subjects viewed a monochromatic (552 nm), high-contrast letter target placed at various viewing distances. Wavefront aberrations of the accommodating eye were measured near the endpoint of an acuity staircase paradigm. Refractive state, defined as the optimum target vergence for maximising retinal image quality, was computed by through-focus wavefront analysis to find the power of the virtual correcting lens that maximizes the visual Strehl ratio. Despite changes in ocular aberrations and pupil size during binocular viewing, retinal image quality and visual acuity typically remain high for all target vergences. When accommodative errors lead to sub-optimal retinal image quality, acuity and measured image quality both decline. However, the effects of accommodative errors on visual acuity are mitigated by pupillary constriction associated with accommodation and binocular convergence, and also by binocular summation of dissimilar retinal image blur. Under monocular viewing conditions some subjects displayed significant accommodative lag that reduced visual performance, an effect that was exacerbated by pharmacological dilation of the pupil. Spurious measurement of accommodative error can be avoided when the image quality metric used to determine refractive state is compatible with the focusing criteria used by the visual system to control accommodation. Real focusing errors of the accommodating eye do not necessarily produce a reliably measurable loss of image quality or clinically significant loss of visual performance, probably because of the increased depth-of-focus due to pupil constriction. When retinal image quality is close to the maximum achievable (given the eye's higher-order aberrations), acuity is also near maximum. A combination of accommodative lag, reduced image quality, and reduced visual function may be a useful

  18. Reconstructed image quality analysis of an industrial instant non-scanning tomography system with different types of collimators by the Monte Carlo simulation

    Energy Technology Data Exchange (ETDEWEB)

    Velo, Alexandre F.; Carvalho, Diego V.; Alvarez, Alexandre G.; Hamada, Margarida M.; Mesquita, Carlos H., E-mail: afvelo@usp.br [Instituto de Pesquisas Energéticas e Nucleares (IPEN/CNEN-SP), São Paulo, SP (Brazil)

    2017-07-01

    The greatest impact of tomography technology application currently occurs in medicine. The great success of medical tomography is due to the fact that the human body presents reasonably standardized dimensions with a well established chemical composition. Generally, these favorable conditions are not found in large industrial objects. In industry there is much interest in using the information from the tomograph in order to know the interior of: (1) manufactured industrial objects or (2) machines and their means of production. In these cases, the purpose of the tomograph is to: (a) control the quality of the final product and (b) optimize production, contributing to the pilot phase of projects and analyzing the quality of the means of production. In different industrial processes, e.g. in chemical reactors and distillation columns, the phenomena related to multiphase processes are usually fast, requiring high temporal resolution of the computed tomography (CT) data acquisition. In this context, instant non-scanning tomographs and fifth-generation tomographs meet these requirements. An instant non-scanning tomography system is being developed at IPEN/CNEN. In this work, in order to optimize the system, this tomograph comprising different collimators was simulated with the Monte Carlo method using MCNP4C. The image quality was evaluated with MATLAB® 2013b by analysis of the following parameters: contrast-to-noise ratio (CNR), root mean square error (RMSE), signal-to-noise ratio (SNR) and spatial resolution via the Modulation Transfer Function (MTF(f)), to analyze which collimator fits the instant non-scanning tomograph best. Three situations were simulated: (1) no collimator; (2) a ø25 mm x 50 mm cylindrical collimator with a ø5.0 mm x 50 mm septum; (3) a ø25 mm x 50 mm cylindrical collimator with a 24 mm x 5.0 mm x 50 mm slit septum. (author)

  19. Reconstructed image quality analysis of an industrial instant non-scanning tomography system with different types of collimators by the Monte Carlo simulation

    International Nuclear Information System (INIS)

    Velo, Alexandre F.; Carvalho, Diego V.; Alvarez, Alexandre G.; Hamada, Margarida M.; Mesquita, Carlos H.

    2017-01-01

    The greatest impact of tomography technology application currently occurs in medicine. The great success of medical tomography is due to the fact that the human body presents reasonably standardized dimensions with a well established chemical composition. Generally, these favorable conditions are not found in large industrial objects. In industry there is much interest in using the information from the tomograph in order to know the interior of: (1) manufactured industrial objects or (2) machines and their means of production. In these cases, the purpose of the tomograph is to: (a) control the quality of the final product and (b) optimize production, contributing to the pilot phase of projects and analyzing the quality of the means of production. In different industrial processes, e.g. in chemical reactors and distillation columns, the phenomena related to multiphase processes are usually fast, requiring high temporal resolution of the computed tomography (CT) data acquisition. In this context, instant non-scanning tomographs and fifth-generation tomographs meet these requirements. An instant non-scanning tomography system is being developed at IPEN/CNEN. In this work, in order to optimize the system, this tomograph comprising different collimators was simulated with the Monte Carlo method using MCNP4C. The image quality was evaluated with MATLAB® 2013b by analysis of the following parameters: contrast-to-noise ratio (CNR), root mean square error (RMSE), signal-to-noise ratio (SNR) and spatial resolution via the Modulation Transfer Function (MTF(f)), to analyze which collimator fits the instant non-scanning tomograph best. Three situations were simulated: (1) no collimator; (2) a ø25 mm x 50 mm cylindrical collimator with a ø5.0 mm x 50 mm septum; (3) a ø25 mm x 50 mm cylindrical collimator with a 24 mm x 5.0 mm x 50 mm slit septum. (author)

  20. SU-E-J-154: Image Quality Assessment of Contrast-Enhanced 4D-CT for Pancreatic Adenocarcinoma in Radiotherapy Simulation

    Energy Technology Data Exchange (ETDEWEB)

    Choi, W; Xue, M; Patel, K; Regine, W; Wang, J; D’Souza, W; Lu, W [University of Maryland School of Medicine, Baltimore, MD (United States); Kang, M [University of Maryland School of Medicine, Baltimore, MD (United States); Yeungnam University Medical Center, Daegu, Daegu (Korea, Republic of); Klahr, P [Philips Healthcare, Highland Heights, OH (United States)

    2015-06-15

    Purpose: This study presents quantitative and qualitative assessment of the image quality of contrast-enhanced (CE) 3D-CT, 4D-CT and CE 4D-CT to identify the feasibility of replacing the clinical standard simulation with a single CE 4D-CT for pancreatic adenocarcinoma (PDA) in radiotherapy simulation. Methods: Ten PDA patients were enrolled and underwent three CT scans: a clinical standard pair of CE 3D-CT immediately followed by a 4D-CT, and a CE 4D-CT one week later. Physicians qualitatively evaluated the general image quality and regional vessel definitions and gave a score from 1 to 5. Next, physicians delineated the contours of the tumor (T) and the normal pancreatic parenchyma (P) on the three CTs (CE 3D-CT, the 50% phase of 4D-CT, and CE 4D-CT); then high density areas were automatically removed by thresholding at 500 HU and morphological operations. The pancreatic tumor contrast-to-noise ratio (CNR), signal-to-noise ratio (SNR) and conspicuity (C, the absolute difference of mean enhancement levels in P and T) were computed to quantitatively assess image quality. The Wilcoxon rank sum test was used to compare these quantities. Results: In qualitative evaluations, CE 3D-CT and CE 4D-CT scored equivalently (4.4±0.4 and 4.3±0.4) and both were significantly better than 4D-CT (3.1±0.6). In quantitative evaluations, the C values were higher in CE 4D-CT (28±19 HU, p=0.19 and 0.17) than in the clinical standard pair of CE 3D-CT and 4D-CT (17±12 and 16±17 HU, p=0.65). In CE 3D-CT and CE 4D-CT, mean CNR (1.8±1.4 and 1.8±1.7, p=0.94) and mean SNR (5.8±2.6 and 5.5±3.2, p=0.71) were both higher than in 4D-CT (CNR: 1.1±1.3, p<0.3; SNR: 3.3±2.1, p<0.1). The absolute enhancement levels for T and P were higher in CE 4D-CT (87, 82 HU) than in CE 3D-CT (60, 56) and 4D-CT (53, 70). Conclusions: The individually optimized CE 4D-CT is feasible and achieved image quality comparable to the clinical standard simulation. This study was supported in part by Philips Healthcare.
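
    As a hedged illustration of the statistical comparison used above, the snippet below applies SciPy's Wilcoxon rank-sum test to per-patient CNR values from two protocols. The numbers are invented placeholders and the variable names are assumptions; only the ranksums call itself is standard SciPy.

```python
import numpy as np
from scipy.stats import ranksums

# Hypothetical per-patient CNR values for two simulation protocols (placeholder data).
cnr_ce_4dct = np.array([1.2, 2.5, 0.9, 3.1, 1.8, 2.2, 0.7, 2.9, 1.5, 1.1])
cnr_4dct    = np.array([0.8, 1.9, 0.5, 2.2, 1.1, 1.6, 0.4, 2.0, 1.0, 0.6])

statistic, p_value = ranksums(cnr_ce_4dct, cnr_4dct)
print(f"Wilcoxon rank-sum statistic = {statistic:.2f}, p = {p_value:.3f}")
```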

  1. Image quality in digital radiography

    International Nuclear Information System (INIS)

    Kuhn, H.

    1986-01-01

    The contribution deals with the potential of digital radiography and critically evaluates the advantages and drawbacks of the image intensifier-TV-digital system, digitization of the X-ray film, and scanning of luminescent storage foils. The evaluation is done in comparison with the image quality of the traditional, large-size X-ray picture. (orig.) [de

  2. Diagnostic image quality of video-digitized chest images

    International Nuclear Information System (INIS)

    Winter, L.H.; Butler, R.B.; Becking, W.B.; Warnars, G.A.O.; Haar Romeny, B. ter; Ottes, F.P.; Valk, J.-P.J. de

    1989-01-01

    The diagnostic accuracy obtained with the Philips picture archiving and communications subsystem was investigated by means of an observer performance study using receiver operating characteristic (ROC) analysis. The image quality of conventional films and video-digitized images was compared. The scanner had a 1024 x 1024 x 8 bit memory. The digitized images were displayed on a 60 Hz interlaced display monitor with 1024 lines. Posteroanterior roentgenograms of a chest phantom with superimposed simulated interstitial pattern disease (IPD) were produced; there were 28 normal and 40 abnormal films. Normal films were produced by the chest phantom alone. Abnormal films were taken of the chest phantom with varying degrees of superimposed simulated interstitial disease; simulated interstitial pattern disease was chosen for the observer performance study because its results are less likely to be influenced by perceptual capabilities. The conventional films and the video-digitized images were viewed by five experienced observers during four separate sessions. Conventional films were presented on a viewing box; the digital images were displayed on the monitor described above. The presence of simulated interstitial disease was indicated on a 5-point ROC certainty scale by each observer. We analyzed the differences between ROC curves derived from correlated data statistically. The mean time required to evaluate 68 digitized images was approximately four times the mean time needed to read the conventional films. The diagnostic quality of the video-digitized images was significantly lower (at the 5% level) than that of the conventional films (median area under the curve (AUC) of 0.71 and 0.94, respectively). (author). 25 refs.; 2 figs.; 4 tabs

  3. Water Quality Analysis Simulation Program (WASP)

    Science.gov (United States)

    The Water Quality Analysis Simulation Program (WASP) model helps users interpret and predict water quality responses to natural phenomena and manmade pollution for various pollution management decisions.

  4. Image quality (IQ) guided multispectral image compression

    Science.gov (United States)

    Zheng, Yufeng; Chen, Genshe; Wang, Zhonghai; Blasch, Erik

    2016-05-01

    Image compression is necessary for data transportation, as it saves both transfer time and storage space. In this paper, we focus our discussion on lossy compression. There are many standard image formats and corresponding compression algorithms, for example, JPEG (DCT -- discrete cosine transform), JPEG 2000 (DWT -- discrete wavelet transform), BPG (better portable graphics) and TIFF (LZW -- Lempel-Ziv-Welch). The image quality (IQ) of the decompressed image will be measured by numerical metrics such as root mean square error (RMSE), peak signal-to-noise ratio (PSNR), and structural similarity (SSIM) index. Given an image and a specified IQ, we investigate how to select a compression method and its parameters to achieve the expected compression. Our scenario consists of 3 steps. The first step is to compress a set of images of interest with varying parameters and compute their IQs for each compression method. The second step is to create several regression models per compression method after analyzing the IQ measurement versus compression parameter for a number of compressed images. The third step is to compress the given image at the specified IQ using the selected compression method (JPEG, JPEG2000, BPG, or TIFF) according to the regression models. If the IQ is specified by a compression ratio (e.g., 100), we select the compression method with the highest IQ (SSIM or PSNR); if the IQ is specified by an IQ metric (e.g., SSIM = 0.8, or PSNR = 50), we select the compression method with the highest compression ratio. Our experiments on thermal (long-wave infrared) grayscale images showed very promising results.
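
    To make the measurement step concrete, the sketch below compresses a grayscale array to JPEG in memory and reports PSNR, SSIM and the achieved compression ratio using Pillow and scikit-image. It illustrates only the quality-measurement part of the workflow, not the authors' regression models, and the synthetic test image is a stand-in for a long-wave infrared frame.

```python
import io
import numpy as np
from PIL import Image
from skimage.metrics import peak_signal_noise_ratio, structural_similarity

def jpeg_quality_metrics(gray_image, jpeg_quality):
    """Compress an 8-bit grayscale array to JPEG in memory; return (PSNR, SSIM, ratio)."""
    buffer = io.BytesIO()
    Image.fromarray(gray_image).save(buffer, format="JPEG", quality=jpeg_quality)
    buffer.seek(0)
    decompressed = np.asarray(Image.open(buffer))
    psnr = peak_signal_noise_ratio(gray_image, decompressed, data_range=255)
    ssim = structural_similarity(gray_image, decompressed, data_range=255)
    ratio = gray_image.nbytes / buffer.getbuffer().nbytes
    return psnr, ssim, ratio

rng = np.random.default_rng(1)
test = rng.normal(128, 30, (256, 256)).clip(0, 255).astype(np.uint8)
for quality in (90, 50, 10):
    print(quality, jpeg_quality_metrics(test, quality))
```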

  5. Image quality of cone beam CT on respiratory motion

    International Nuclear Information System (INIS)

    Zhang Ke; Li Minghui; Dai Jianrong; Wang Shi

    2011-01-01

    In this study, the influence of respiratory motion on Cone Beam CT (CBCT) image quality was investigated using a motion-simulating platform, an image quality phantom, and a kV X-ray CBCT. A total of 21 motion states in the superior-inferior (SI) direction and the anterior-posterior (AP) direction, separately or together, were simulated by considering different respiration amplitudes, periods and hysteresis. The influence of motion on CBCT image quality was evaluated with the quality indexes of low contrast visibility, geometric accuracy, spatial resolution and uniformity of CT values. The results showed that the quality indexes were affected more prominently by motion in the AP direction than in the SI direction, and that image quality was affected more prominently by the respiration amplitude than by the respiration period or hysteresis. These characteristics of how respiratory motion influences CBCT image quality may be exploited in finding solutions. (authors)

  6. Can quantum imaging be classically simulated?

    OpenAIRE

    D'Angelo, Milena; Shih, Yanhua

    2003-01-01

    Quantum imaging has been demonstrated since 1995 by using entangled photon pairs. The physics community named these experiments "ghost image", "quantum crypto-FAX", "ghost interference", etc. Recently, Bennink et al. simulated the "ghost" imaging experiment by two co-rotating k-vector correlated lasers. Did the classical simulation simulate the quantum aspect of the "ghost" image? We wish to provide an answer. In fact, the simulation is very similar to a historical model of local realism. The...

  7. Simulated annealing image reconstruction for positron emission tomography

    Energy Technology Data Exchange (ETDEWEB)

    Sundermann, E; Lemahieu, I; Desmedt, P [Department of Electronics and Information Systems, University of Ghent, St. Pietersnieuwstraat 41, B-9000 Ghent, Belgium (Belgium)

    1994-12-31

    In Positron Emission Tomography (PET), images have to be reconstructed from noisy projection data. The noise on the PET data can be modeled by a Poisson distribution. In this paper, we present the results of using the simulated annealing technique to reconstruct PET images. Various parameter settings of the simulated annealing algorithm are discussed and optimized. The reconstructed images are of good quality and high contrast, in comparison to other reconstruction techniques. (authors). 11 refs., 2 figs.
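
    The sketch below illustrates the general simulated-annealing idea on a toy 1D reconstruction: propose random non-negative pixel perturbations and accept them with a temperature-dependent probability based on the projection misfit. It uses a quadratic misfit and a geometric cooling schedule as stated assumptions; it is not the authors' PET implementation or cost function.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy forward model: a small random system matrix maps a 1D "image" to projection bins.
n_pixels, n_bins = 16, 24
system = rng.uniform(0.0, 1.0, (n_bins, n_pixels))
true_image = rng.uniform(0.0, 10.0, n_pixels)
data = rng.poisson(system @ true_image).astype(float)    # noisy (Poisson) projection data

def misfit(x):
    return np.sum((system @ x - data) ** 2)

x = np.full(n_pixels, data.mean() / system.sum(axis=1).mean())   # flat starting estimate
energy, temperature = misfit(x), 1000.0
for step in range(20000):
    candidate = x.copy()
    i = rng.integers(n_pixels)
    candidate[i] = max(0.0, candidate[i] + rng.normal(0.0, 0.5))
    delta = misfit(candidate) - energy
    if delta < 0 or rng.random() < np.exp(-delta / temperature):
        x, energy = candidate, energy + delta
    temperature *= 0.9995                                 # geometric cooling schedule
print("final projection misfit:", round(energy, 1))
```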

  8. Simulated annealing image reconstruction for positron emission tomography

    International Nuclear Information System (INIS)

    Sundermann, E.; Lemahieu, I.; Desmedt, P.

    1994-01-01

    In Positron Emission Tomography (PET), images have to be reconstructed from noisy projection data. The noise on the PET data can be modeled by a Poisson distribution. In this paper, we present the results of using the simulated annealing technique to reconstruct PET images. Various parameter settings of the simulated annealing algorithm are discussed and optimized. The reconstructed images are of good quality and high contrast, in comparison to other reconstruction techniques. (authors)

  9. Synthetic aperture radar imaging simulator for pulse envelope evaluation

    Science.gov (United States)

    Balster, Eric J.; Scarpino, Frank A.; Kordik, Andrew M.; Hill, Kerry L.

    2017-10-01

    A simulator for spotlight synthetic aperture radar (SAR) image formation is presented. The simulator produces radar returns from a virtual radar positioned at an arbitrary distance and altitude. The radar returns are produced from a source image, where the return is a weighted summation of linear frequency-modulated (LFM) pulse signals delayed by the distance of each pixel in the image to the radar. The imagery is resampled into polar format to ensure consistent range profiles to the position of the radar. The SAR simulator provides a capability enabling the objective analysis of formed SAR imagery, comparing it to an original source image. This capability allows for analysis of various SAR signal processing techniques previously determined by impulse response function (IPF) analysis. The results suggest that IPF analysis provides results that may not be directly related to formed SAR image quality. Instead, the SAR simulator uses image quality metrics, such as peak signal-to-noise ratio (PSNR) and structural similarity index (SSIM), for formed SAR image quality analysis. To showcase the capability of the SAR simulator, it is used to investigate the performance of various envelopes applied to LFM pulses. A power-raised cosine window with a power p=0.35 and roll-off factor of β=0.15 is shown to maximize the quality of the formed SAR images by improving PSNR by 0.84 dB and SSIM by 0.06 from images formed utilizing a rectangular pulse, on average.
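
    To make the pulse-envelope experiment concrete, the sketch below generates a complex baseband linear frequency-modulated (LFM) chirp and applies a power-raised cosine taper. The window here is one plausible reading of "power-raised cosine" (a raised-cosine edge taper raised to a power p) with β=0.15 and p=0.35; the exact envelope definition used by the authors may differ.

```python
import numpy as np

def lfm_pulse(bandwidth_hz, duration_s, sample_rate_hz):
    """Complex baseband LFM chirp sweeping -B/2 .. +B/2 over the pulse duration."""
    t = np.arange(0.0, duration_s, 1.0 / sample_rate_hz)
    chirp_rate = bandwidth_hz / duration_s
    return t, np.exp(1j * np.pi * chirp_rate * (t - duration_s / 2.0) ** 2)

def power_raised_cosine(n_samples, beta, p):
    """Raised-cosine (Tukey-style) edge taper with roll-off beta, raised to power p."""
    window = np.ones(n_samples)
    edge = int(beta * n_samples / 2)
    if edge > 0:
        ramp = 0.5 * (1.0 - np.cos(np.pi * np.arange(edge) / edge))
        window[:edge] = ramp
        window[-edge:] = ramp[::-1]
    return window ** p

t, chirp = lfm_pulse(bandwidth_hz=50e6, duration_s=10e-6, sample_rate_hz=200e6)
envelope = power_raised_cosine(len(chirp), beta=0.15, p=0.35)
shaped = envelope * chirp
print(len(shaped), float(np.abs(shaped).max()))
```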

  10. Quality of radiation field imaging

    International Nuclear Information System (INIS)

    Petr, I.

    1988-01-01

    The quality of imaging the gamma radiation field, and the limits of that quality in directional detector scanning, were studied. A resolution angle was introduced to quantify the imaging quality, and its relation to the effective detection half-angle of the directional detector was sought. The resolution angle was defined for the simplest configuration of the radiation field, consisting of two monoenergetic gamma beams in one plane. It was shown that the resolution angle decreases, i.e., resolution in imaging the radiation field improves, with the effective half-angle of the directional detector. It was also found that resolution of the two gamma beams deteriorated when the beams were surrounded by an isotropic background field. If the beams are surrounded by a background field with a general distribution, the resolution angle will be affected not only by the properties of the detector but also by the distribution of the ambient radiation field and the method of its scanning. The method described can be applied in designing a directional detector necessary for imaging the presumed radiation field with the required quality. (Z.M.). 4 figs., 3 refs

  11. The semantics of image quality

    NARCIS (Netherlands)

    Janssen, T.J.W.M.; Blommaert, F.J.J.

    1996-01-01

    In this contribution we will discuss image quality in the context of the visuo-cognitive system as an information-processing system. To this end, we subdivide the information-processing as performed by the visuo-cognitive system into three distinct processes: (1) the construction of a visual

  12. A universal color image quality metric

    NARCIS (Netherlands)

    Toet, A.; Lucassen, M.P.

    2003-01-01

    We extend a recently introduced universal grayscale image quality index to a newly developed perceptually decorrelated color space. The resulting color image quality index quantifies the distortion of a processed color image relative to its original version. We evaluated the new color image quality

  13. Review of Image Quality Measures for Solar Imaging

    Science.gov (United States)

    Popowicz, Adam; Radlak, Krystian; Bernacki, Krzysztof; Orlov, Valeri

    2017-12-01

    Observations of the solar photosphere from the ground encounter significant problems caused by Earth's turbulent atmosphere. Before image reconstruction techniques can be applied, the frames obtained in the most favorable atmospheric conditions (the so-called lucky frames) have to be carefully selected. However, estimating the quality of images containing complex photospheric structures is not a trivial task, and the standard routines applied in nighttime lucky imaging observations are not applicable. In this paper we evaluate 36 methods dedicated to the assessment of image quality, which were presented in the literature over the past 40 years. We compare their effectiveness on simulated solar observations of both active regions and granulation patches, using reference data obtained by the Solar Optical Telescope on the Hinode satellite. To create images that are affected by a known degree of atmospheric degradation, we employed the random wave vector method, which faithfully models all the seeing characteristics. The results provide useful information about the method performances, depending on the average seeing conditions expressed by the ratio of the telescope's aperture to the Fried parameter, D/r0. The comparison identifies three methods for consideration by observers: Helmli and Scherer's mean, the median filter gradient similarity, and the discrete cosine transform energy ratio. While the first method requires less computational effort and can be used effectively in virtually any atmospheric conditions, the second method shows its superiority at good seeing (D/r0<4). The third method should mainly be considered for the post-processing of strongly blurred images.
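
    As one hedged example of such a no-reference sharpness measure, the sketch below implements a DCT energy ratio in the spirit of the third method named above: the AC-coefficient energy of the 2D discrete cosine transform divided by the DC energy, so that blurring lowers the score. The exact definition evaluated in the cited comparison may differ from this simple form.

```python
import numpy as np
from scipy.fft import dctn

def dct_energy_ratio(image):
    """Ratio of AC energy to DC energy in the 2D DCT; higher suggests a sharper frame."""
    coeffs = dctn(np.asarray(image, float), norm="ortho")
    dc_energy = coeffs[0, 0] ** 2
    ac_energy = np.sum(coeffs ** 2) - dc_energy
    return ac_energy / dc_energy

# A high-contrast checkerboard versus a 2x2-averaged (blurred) copy of it.
sharp = (np.indices((64, 64)).sum(axis=0) % 2).astype(float)
blurred = 0.25 * (sharp + np.roll(sharp, 1, axis=0) + np.roll(sharp, 1, axis=1)
                  + np.roll(sharp, (1, 1), axis=(0, 1)))
print(dct_energy_ratio(sharp), dct_energy_ratio(blurred))
```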

  14. Featured Image: Simulating Planetary Gaps

    Science.gov (United States)

    Kohler, Susanna

    2017-03-01

    The authors' model of how the disk described here would look as we observe it in a scattered-light image. The morphology of the gap can be used to estimate the mass of the planet that caused it. [Dong & Fung 2017] The image from a computer simulation reveals the dust structure of a protoplanetary disk (with the star obscured in the center) as a newly formed planet orbits within it. A recent study by Ruobing Dong (Steward Observatory, University of Arizona) and Jeffrey Fung (University of California, Berkeley) examines how we can determine the mass of such a planet based on our observations of the gap that the planet opens in the disk as it orbits. The authors' models help us to better understand how our observations of gaps might change if the disk is inclined relative to our line of sight, and how we can still constrain the mass of the gap-opening planet and the viscosity of the disk from the scattered-light images we have recently begun to obtain of distant protoplanetary disks. For more information, check out the paper below! Citation: Ruobing Dong and Jeffrey Fung 2017 ApJ 835 146. doi:10.3847/1538-4357/835/2/146

  15. Mathematical determination of image quality

    Energy Technology Data Exchange (ETDEWEB)

    Barrett, H H

    1982-01-01

    The subjective term "image quality" is generally not easy to define and to measure. If, however, we limit ourselves to detecting certain anomalies in blurred images, then the task can be done more easily. The efficiency can in fact be measured and the results can be presented as receiver operating characteristics (ROC). One can determine a relation between this characteristic and the noise distance of the imaging system, and in this way the efficiency of a hypothetical ideal observer can be predicted. Furthermore, one can compute the noise distance and other statistical parameters of X-ray images degraded by quantum noise using special techniques founded on the so-called "blur core". The technique proved to be very successful in nuclear medicine, but is also valid in computed tomography and X-ray diagnostics. The technique is explained without mathematical details. Finally, the question of what role mathematical analysis will play in the determination and optimization of the quality of diagnostic exposures is addressed.

  16. MATLAB-based Applications for Image Processing and Image Quality Assessment – Part II: Experimental Results

    Directory of Open Access Journals (Sweden)

    L. Krasula

    2012-04-01

    Full Text Available The paper provides an overview of some possible uses of the software described in Part I. It contains real examples of image quality improvement, distortion simulation, objective and subjective quality assessment and other kinds of image processing that can be carried out with the individual applications.

  17. Quality assessment for online iris images

    CSIR Research Space (South Africa)

    Makinana, S

    2015-01-01

    Full Text Available Iris recognition systems have attracted much attention for their uniqueness, stability and reliability. However, the performance of such a system depends on the quality of the iris image. Therefore there is a need to select good quality images before features can...

  18. Image quality dependence on image processing software in ...

    African Journals Online (AJOL)

    Image quality dependence on image processing software in computed radiography. ... Agfa CR readers use MUSICA software, and an upgrade with significantly different image ...

  19. Image based SAR product simulation for analysis

    Science.gov (United States)

    Domik, G.; Leberl, F.

    1987-01-01

    SAR product simulation serves to predict SAR image gray values for various flight paths. Input typically consists of a digital elevation model and backscatter curves. A new method of product simulation is described that also employs a real SAR input image; this can be denoted 'image-based simulation'. Different methods of performing this SAR prediction are presented and their advantages and disadvantages discussed. Ascending and descending orbit images from NASA's SIR-B experiment were used for verification of the concept: input images from ascending orbits were converted into images from a descending orbit; the results are compared to the available real imagery to verify that the prediction technique produces meaningful image data.

  20. Tolerance analysis through computational imaging simulations

    Science.gov (United States)

    Birch, Gabriel C.; LaCasse, Charles F.; Stubbs, Jaclynn J.; Dagel, Amber L.; Bradley, Jon

    2017-11-01

    The modeling and simulation of non-traditional imaging systems require holistic consideration of the end-to-end system. We demonstrate this approach through a tolerance analysis of a random scattering lensless imaging system.

  1. Fundamental factors influencing portal image quality

    International Nuclear Information System (INIS)

    Jaffray, D.A.

    1995-01-01

    It has been recognized that improved methods of verifying radiation field placement in external beam radiotherapy are required in order to make frequent checks of field placement feasible. As a result, a large number of electronic portal imaging systems have been developed as possible replacements for film. These developments have produced digital systems with faster acquisition and improved display contrast; however, the quality of the images acquired with such systems is still disappointing. This presentation examines many of the fundamental factors which limit the quality of radiographs obtained with a megavoltage radiotherapy beam. The size and shape of the radiation sources (focal and extra-focal) in radiotherapy machines and their influence on the spatial resolution of portal images are examined. Monte Carlo simulations of x-ray interactions within the patient determined that a significant fraction of the x-ray scatter generated in the patient is due to bremsstrahlung and positron annihilation. Depending on the detector, the scatter signal can reduce the differential signal-to-noise by 20%. Furthermore, a Monte Carlo study of the interaction of x-rays within typical fluoroscopic imaging detectors (metal plate/phosphor screen) demonstrates the degrading effect of energy absorption noise on the detective quantum efficiency of fluoroscopy-based imaging systems. Finally, the spatial frequency content in the x-ray shadowgram is demonstrated to change with x-ray energy, resulting in images that appear to have reduced spatial resolution at megavoltage energies. The relative magnitude of each of these factors will be presented and recommendations for the next generation of portal imaging systems will be made

  2. Assessing product image quality for online shopping

    Science.gov (United States)

    Goswami, Anjan; Chung, Sung H.; Chittar, Naren; Islam, Atiq

    2012-01-01

    Assessing product-image quality is important in the context of online shopping. A high quality image that conveys more information about a product can boost the buyer's confidence and can get more attention. However, the notion of image quality for product-images is not the same as that in other domains. The perception of quality of product-images depends not only on various photographic quality features but also on various high-level features such as clarity of the foreground or goodness of the background. In this paper, we define a notion of product-image quality based on various such features. We conduct a crowd-sourced experiment to collect user judgments on thousands of eBay's images. We formulate a multi-class classification problem for modeling image quality by classifying images into good, fair and poor quality based on the guided perceptual notions from the judges. We also conduct experiments with regression using average crowd-sourced human judgments as the target. We compute a pseudo-regression score as the expected average of predicted classes and also compute a score from the regression technique. We design many experiments with various sampling and voting schemes on the crowd-sourced data and construct various experimental image quality models. Most of our models have reasonable accuracies (greater than or equal to 70%) on the test data set. We observe that our computed image quality score has a high (0.66) rank correlation with the average votes from the crowd-sourced human judgments.

  3. Simulation of Optical and Synthetic Imaging using Microwave Reflectometry

    Energy Technology Data Exchange (ETDEWEB)

    G.J. Kramer; R. Nazikian; E. Valeo

    2004-01-16

    Two-dimensional full-wave time-dependent simulations in full plasma geometry are presented which show that conventional reflectometry (without a lens) can be used to synthetically image density fluctuations in fusion plasmas under conditions where the parallel correlation length greatly exceeds the poloidal correlation length of the turbulence. The advantage of synthetic imaging is that the image can be produced without the need for a large lens of high optical quality, and each frequency that is launched can be independently imaged. A particularly simple arrangement, consisting of a single receiver located at the midpoint of a microwave beam propagating along the plasma midplane is shown to suffice for imaging purposes. However, as the ratio of the parallel to poloidal correlation length decreases, a poloidal array of receivers needs to be used to synthesize the image with high accuracy. Simulations using DIII-D relevant parameters show the similarity of synthetic and optical imaging in present-day experiments.

  4. Simulation of Optical and Synthetic Imaging using Microwave Reflectometry

    International Nuclear Information System (INIS)

    Kramer, G.J.; Nazikian, R.; Valeo, E.

    2004-01-01

    Two-dimensional full-wave time-dependent simulations in full plasma geometry are presented which show that conventional reflectometry (without a lens) can be used to synthetically image density fluctuations in fusion plasmas under conditions where the parallel correlation length greatly exceeds the poloidal correlation length of the turbulence. The advantage of synthetic imaging is that the image can be produced without the need for a large lens of high optical quality, and each frequency that is launched can be independently imaged. A particularly simple arrangement, consisting of a single receiver located at the midpoint of a microwave beam propagating along the plasma midplane is shown to suffice for imaging purposes. However, as the ratio of the parallel to poloidal correlation length decreases, a poloidal array of receivers needs to be used to synthesize the image with high accuracy. Simulations using DIII-D relevant parameters show the similarity of synthetic and optical imaging in present-day experiments

  5. Noise simulation in cone beam CT imaging with parallel computing

    International Nuclear Information System (INIS)

    Tu, S.-J.; Shaw, Chris C; Chen, Lingyun

    2006-01-01

    We developed a computer noise simulation model for cone beam computed tomography imaging using a general purpose PC cluster. This model uses a mono-energetic x-ray approximation and allows us to investigate three primary performance components, specifically quantum noise, detector blurring and additive system noise. A parallel random number generator based on the Weyl sequence was implemented in the noise simulation and a visualization technique was accordingly developed to validate the quality of the parallel random number generator. In our computer simulation model, three-dimensional (3D) phantoms were mathematically modelled and used to create 450 analytical projections, which were then sampled into digital image data. Quantum noise was simulated and added to the analytical projection image data, which were then filtered to incorporate flat panel detector blurring. Additive system noise was generated and added to form the final projection images. The Feldkamp algorithm was implemented and used to reconstruct the 3D images of the phantoms. A 24 dual-Xeon PC cluster was used to compute the projections and reconstructed images in parallel with each CPU processing 10 projection views for a total of 450 views. Based on this computer simulation system, simulated cone beam CT images were generated for various phantoms and technique settings. Noise power spectra for the flat panel x-ray detector and reconstructed images were then computed to characterize the noise properties. As an example among the potential applications of our noise simulation model, we showed that images of low contrast objects can be produced and used for image quality evaluation
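
    A hedged sketch of the central noise-injection step described above: scale an ideal transmission image to an expected photon count, draw Poisson counts (quantum noise), blur to mimic the flat-panel detector, and add Gaussian electronic noise. The photon fluence, blur width and noise level are illustrative placeholders, not the parameters of the cluster simulation.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

rng = np.random.default_rng(42)

def noisy_projection(transmission, photons_per_pixel, detector_sigma_px, electronic_sigma):
    """Simulate one noisy detector frame from an ideal transmission image (values in 0..1)."""
    expected_counts = photons_per_pixel * transmission
    quantum = rng.poisson(expected_counts).astype(float)               # quantum (Poisson) noise
    blurred = gaussian_filter(quantum, detector_sigma_px)              # detector blurring
    return blurred + rng.normal(0.0, electronic_sigma, blurred.shape)  # additive system noise

# Ideal transmission through a centered cylinder (parallel-beam approximation).
x = np.linspace(-1.0, 1.0, 256)
chord = 2.0 * np.sqrt(np.clip(0.6 ** 2 - x ** 2, 0.0, None))           # path length through cylinder
transmission = np.tile(np.exp(-1.5 * chord), (256, 1))                 # mu = 1.5 (arbitrary units)
frame = noisy_projection(transmission, 1e4, detector_sigma_px=1.0, electronic_sigma=20.0)
print(frame.shape, round(float(frame.mean()), 1))
```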

  6. Studies oriented to optimize the image quality of the small animal PET: Clear PET, modifying some of the parameters of the reconstruction algorithm IMF-OSEM 3D on the data acquisition simulated with GAMOS

    International Nuclear Information System (INIS)

    Canadas, M.; Mendoza, J.; Embid, M.

    2007-01-01

    This report presents studies aimed at optimizing the image quality of the small animal PET scanner ClearPET. Certain figures of merit (FOM) were used to assign a quantitative value to the contrast and detectability of lesions. The optimization was carried out by modifying some of the parameters of the reconstruction software of the scanner, imaging a mini-Derenzo phantom and a cylinder phantom with background activity and two hot spheres. Specifically, the effect of the inter-update Metz filter (IMF) within the iterative 3D OSEM reconstruction algorithm was evaluated. The data acquisition was simulated using the GAMOS framework (Monte Carlo simulation). Integrating the GAMOS output with the reconstruction software of the scanner was an additional novelty of this work; to achieve this, data sets were written in the list-mode format (LMF) of ClearPET. In order to verify the optimum values obtained, we plan to make real acquisitions on the ClearPET at CIEMAT. (Author) 17 refs

  7. Image quality of digital mammography images produced using wet and dry laser imaging systems

    International Nuclear Information System (INIS)

    Al Khalifah, K.; Brindhaban, A.; AlArfaj, R.; Jassim, O.

    2006-01-01

    Introduction: A study was carried out to compare the quality of digital mammographic images printed or processed by a wet laser imaging system and a dedicated mammographic dry laser imaging system. Material and methods: Digital images of a tissue equivalent breast phantom were obtained using a GE Senographe 2000D digital mammography system and different target/filter combinations of the X-ray tube. These images were printed on films using the Fuji FL-IM D wet laser imaging system and the Kodak DryView 8600 dry laser imaging system. The quality of images was assessed in terms of detectability of microcalcifications and simulated tumour masses by five radiologists. In addition, the contrast index and speed index of the two systems were measured using the step wedge in the phantom. The unpaired, unequal variance t-test was used to test any statistically significant differences. Results: There were no significant (p < 0.05) differences between the images printed using the two systems in terms of microcalcification and tumour mass detectability. The wet system resulted in slightly higher contrast index while the dry system showed significantly higher speed index. Conclusion: Both wet and dry laser imaging systems can produce mammography images of good quality on which 0.2 mm microcalcifications and 2 mm tumour masses can be detected. Dry systems are preferable due to the absence of wet chemical processing and solid or liquid chemical waste. The wet laser imaging systems, however, still represent a useful alternative to dry laser imaging systems for mammography studies

  8. Optimization of Synthetic Aperture Image Quality

    DEFF Research Database (Denmark)

    Moshavegh, Ramin; Jensen, Jonas; Villagómez Hoyos, Carlos Armando

    2016-01-01

    Synthetic Aperture (SA) imaging produces high-quality images and velocity estimates of both slow and fast flow at high frame rates. However, grating lobe artifacts can appear both in transmission and reception. These affect the image quality and the frame rate. Therefore optimization of parameter...

  9. Image quality analysis of digital mammographic equipments

    Energy Technology Data Exchange (ETDEWEB)

    Mayo, P.; Pascual, A.; Verdu, G. [Valencia Univ. Politecnica, Chemical and Nuclear Engineering Dept. (Spain); Rodenas, F. [Valencia Univ. Politecnica, Applied Mathematical Dept. (Spain); Campayo, J.M. [Valencia Univ. Hospital Clinico, Servicio de Radiofisica y Proteccion Radiologica (Spain); Villaescusa, J.I. [Hospital Clinico La Fe, Servicio de Proteccion Radiologica, Valencia (Spain)

    2006-07-01

    The image quality assessment of a radiographic phantom image is one of the fundamental points in a complete quality control programme. The end result of a properly functioning process must be an image of appropriate quality for a suitable diagnosis. Nowadays, digital radiographic equipment is replacing the traditional film-screen equipment and it is necessary to update the parameters that guarantee the quality of the process. Contrast-detail phantoms are applied to digital radiography to study the threshold contrast-detail sensitivity at the operating conditions of the equipment. The phantom studied in this work is the C.D.M.A.M. 3.4, which facilitates the evaluation of image contrast and detail resolution. One of the most widely used indexes for measuring image quality in an objective way is the Image Quality Figure (I.Q.F.). This parameter is useful for calculating the image quality taking into account the contrast and detail resolution of the image analysed. The contrast-detail curve is useful as a measure of image quality too, because it is a graphical representation in which the hole thickness and diameter are plotted for each contrast-detail combination detected in the radiographic image of the phantom. It is useful for comparing the performance of different radiographic image systems, for phantom images acquired under the same exposure conditions. The aim of this work is to study the image quality of different contrast-detail phantom C.D.M.A.M. 3.4 images, carrying out the automatic detection of the contrast-detail combinations and establishing a parameter which characterizes the mammographic image quality in an objective way. This is useful for comparing images obtained on different digital mammographic units in order to study the performance of the equipment. (authors)

  10. Image quality analysis of digital mammographic equipments

    International Nuclear Information System (INIS)

    Mayo, P.; Pascual, A.; Verdu, G.; Rodenas, F.; Campayo, J.M.; Villaescusa, J.I.

    2006-01-01

    The image quality assessment of a radiographic phantom image is one of the fundamental points in a complete quality control programme. The end result of a properly functioning process must be an image of appropriate quality for a suitable diagnosis. Nowadays, digital radiographic equipment is replacing the traditional film-screen equipment and it is necessary to update the parameters that guarantee the quality of the process. Contrast-detail phantoms are applied to digital radiography to study the threshold contrast-detail sensitivity at the operating conditions of the equipment. The phantom studied in this work is the C.D.M.A.M. 3.4, which facilitates the evaluation of image contrast and detail resolution. One of the most widely used indexes for measuring image quality in an objective way is the Image Quality Figure (I.Q.F.). This parameter is useful for calculating the image quality taking into account the contrast and detail resolution of the image analysed. The contrast-detail curve is useful as a measure of image quality too, because it is a graphical representation in which the hole thickness and diameter are plotted for each contrast-detail combination detected in the radiographic image of the phantom. It is useful for comparing the performance of different radiographic image systems, for phantom images acquired under the same exposure conditions. The aim of this work is to study the image quality of different contrast-detail phantom C.D.M.A.M. 3.4 images, carrying out the automatic detection of the contrast-detail combinations and establishing a parameter which characterizes the mammographic image quality in an objective way. This is useful for comparing images obtained on different digital mammographic units in order to study the performance of the equipment. (authors)

  11. Quality assurance for image-guided radiotherapy

    International Nuclear Information System (INIS)

    Marinello, Ginette

    2008-01-01

    The topics discussed include, among others, the following: Quality assurance program; Image guided radiotherapy; Commissioning and quality assurance; Check of agreement between visual and displayed scales; quality controls: electronic portal imaging device (EPID), MV-kV and kV-kV, cone-beam CT (CBCT), patient doses. (P.A.)

  12. Improving the quality of brain CT image from Wavelet filters

    International Nuclear Information System (INIS)

    Pita Machado, Reinaldo; Perez Diaz, Marlen; Bravo Pino, Rolando

    2012-01-01

    An algorithm to reduce Poisson noise using Wavelet filters is described. Five tomographic images of patients and a head anthropomorphic phantom were used. They were acquired with two different CT machines. Because the original images already contain acquisition noise, some simulated noise-free lesions were added to the images, and after that the whole images were contaminated with noise. The contaminated images were filtered with 9 Wavelet filters at different decomposition levels and thresholds. Image quality of filtered and unfiltered images was graded using the signal-to-noise ratio, normalized mean square error and the structural similarity index, as well as by the subjective JAFROC method with 5 observers. Some filters, such as Bior 3.7 and dB45, improved head CT image quality in a significant way (p<0.05), producing an increase in SNR without visible structural distortions
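
    A minimal sketch of the wavelet filtering idea, using PyWavelets with a biorthogonal 3.7 wavelet and soft thresholding of the detail coefficients. The decomposition level and the universal (VisuShrink-style) threshold are illustrative assumptions, not the exact filters, levels or thresholds evaluated by the authors.

```python
import numpy as np
import pywt

def wavelet_denoise(image, wavelet="bior3.7", level=3):
    """Soft-threshold all detail coefficients with a universal (VisuShrink-style) threshold."""
    coeffs = pywt.wavedec2(np.asarray(image, float), wavelet, level=level)
    # Estimate the noise standard deviation from the finest diagonal detail band.
    sigma = np.median(np.abs(coeffs[-1][-1])) / 0.6745
    threshold = sigma * np.sqrt(2.0 * np.log(image.size))
    denoised = [coeffs[0]] + [
        tuple(pywt.threshold(band, threshold, mode="soft") for band in detail)
        for detail in coeffs[1:]
    ]
    return pywt.waverec2(denoised, wavelet)

# Noisy synthetic "slice": a brighter square lesion on a uniform background.
rng = np.random.default_rng(3)
clean = np.full((128, 128), 40.0)
clean[48:80, 48:80] = 70.0
noisy = rng.poisson(clean).astype(float)
filtered = wavelet_denoise(noisy)[:128, :128]
print(round(float(np.std(noisy - clean)), 2), round(float(np.std(filtered - clean)), 2))
```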

  13. Medical imaging informatics simulators: a tutorial.

    Science.gov (United States)

    Huang, H K; Deshpande, Ruchi; Documet, Jorge; Le, Anh H; Lee, Jasper; Ma, Kevin; Liu, Brent J

    2014-05-01

    A medical imaging informatics infrastructure (MIII) platform is an organized method of selecting tools and synthesizing data from HIS/RIS/PACS/ePR systems with the aim of developing an imaging-based diagnosis or treatment system. Evaluation and analysis of these systems can be made more efficient by designing and implementing imaging informatics simulators. This tutorial introduces the MIII platform and provides the definition of treatment/diagnosis systems, while primarily focusing on the development of the related simulators. A medical imaging informatics (MII) simulator in this context is defined as a system integration of many selected imaging and data components from the MIII platform and clinical treatment protocols, which can be used to simulate patient workflow and data flow starting from diagnostic procedures to the completion of treatment. In these processes, DICOM and HL-7 standards, IHE workflow profiles, and Web-based tools are emphasized. From the information collected in the database of a specific simulator, evidence-based medicine can be hypothesized to choose and integrate optimal clinical decision support components. Other relevant, selected clinical resources in addition to data and tools from the HIS/RIS/PACS and ePRs platform may also be tailored to develop the simulator. These resources can include image content indexing, 3D rendering with visualization, data grid and cloud computing, computer-aided diagnosis (CAD) methods, specialized image-assisted surgical, and radiation therapy technologies. Five simulators will be discussed in this tutorial. The PACS-ePR simulator with image distribution is the cradle of the other simulators. It supplies the necessary PACS-based ingredients and data security for the development of four other simulators: the data grid simulator for molecular imaging, CAD-PACS, radiation therapy simulator, and image-assisted surgery simulator. The purpose and benefits of each simulator with respect to its clinical relevance

  14. Intelligent medical image processing by simulated annealing

    International Nuclear Information System (INIS)

    Ohyama, Nagaaki

    1992-01-01

    Image processing is being widely used in the medical field and has already become very important, especially when used for image reconstruction purposes. In this paper, it is shown that image processing can be classified into 4 categories: passive, active, intelligent and visual image processing. These 4 classes are first explained through the use of several examples. The results show that passive image processing does not give better results than the others. Intelligent image processing is then addressed, and the simulated annealing method is introduced. Due to the flexibility of simulated annealing, formulated intelligence is shown to be easily introduced into an image reconstruction problem. As a practical example, 3D blood vessel reconstruction from a small number of projections, which is insufficient for conventional methods to give a good reconstruction, is proposed, and computer simulation clearly shows the effectiveness of the simulated annealing method. Prior to the conclusion, medical file systems such as IS and C (Image Save and Carry) are pointed out to have potential for formulating knowledge, which is indispensable for intelligent image processing. This paper concludes by summarizing the advantages of simulated annealing. (author)

  15. Improving image quality in portal venography with spectral CT imaging

    International Nuclear Information System (INIS)

    Zhao, Li-qin; He, Wen; Li, Jian-ying; Chen, Jiang-hong; Wang, Ke-yang; Tan, Li

    2012-01-01

    Objective: To investigate the effect of energy spectral CT on the image quality of CT portal venography in cirrhosis patients. Materials and methods: 30 portal hypertension patients underwent spectral CT examination using a single-tube, fast dual tube voltage switching technique. 101 sets of monochromatic images were generated from 40 keV to 140 keV. Image noise and contrast-to-noise ratio (CNR) for portal veins from the monochromatic images were measured. An optimal monochromatic image set was selected for obtaining the best CNR for portal veins. The image noise and CNR of the intra-hepatic portal vein and extra-hepatic main stem at the selected monochromatic level were compared with those from the conventional polychromatic images. Image quality was also assessed and compared. Results: The monochromatic images at 51 keV were found to provide the best CNR for both the intra-hepatic and extra-hepatic portal veins. At this energy level, the monochromatic images had about 100% higher CNR than the polychromatic images with a moderate 30% noise increase. The qualitative image quality assessment was also statistically higher with monochromatic images at 51 keV. Conclusion: Monochromatic images at 51 keV for CT portal venography could improve CNR for displaying hepatic portal veins and improve the overall image quality.
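
    The keV-selection step can be illustrated by sweeping a stack of monochromatic levels and picking the energy that maximizes portal-vein CNR. The HU and noise curves below are smooth synthetic stand-ins for ROI measurements, not spectral CT data, so the resulting optimum is only illustrative.

```python
import numpy as np

# Hypothetical ROI statistics per monochromatic level from 40 to 140 keV (placeholder curves):
# mean HU in the contrast-enhanced portal vein, mean HU in liver parenchyma, and image noise (SD).
kev = np.arange(40, 141)
vein_hu  = 400.0 * np.exp(-(kev - 40) / 45.0) + 60.0
liver_hu = 140.0 * np.exp(-(kev - 40) / 120.0) + 40.0
noise_sd = 45.0 * np.exp(-(kev - 40) / 25.0) + 12.0

cnr = np.abs(vein_hu - liver_hu) / noise_sd
best_kev = kev[np.argmax(cnr)]
print(f"monochromatic level with best portal-vein CNR: {best_kev} keV (CNR = {cnr.max():.2f})")
```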

  16. Improving image quality in portal venography with spectral CT imaging

    Energy Technology Data Exchange (ETDEWEB)

    Zhao, Li-qin, E-mail: zhaolqzr@sohu.com [Department of Radiology, Beijing Friendship Hospital Affiliated to Capital Medical University, Beijing,100050 (China); He, Wen, E-mail: hewen1724@sina.com [Department of Radiology, Beijing Friendship Hospital Affiliated to Capital Medical University, Beijing,100050 (China); Li, Jian-ying, E-mail: jianying.li@med.ge.com [CT Advanced Application and Research, GE Healthcare, 100176 China (China); Chen, Jiang-hong, E-mail: chenjianghong1973@hotmail.com [Department of Radiology, Beijing Friendship Hospital Affiliated to Capital Medical University, Beijing,100050 (China); Wang, Ke-yang, E-mail: ke7ke@sina.com [Department of Radiology, Beijing Friendship Hospital Affiliated to Capital Medical University, Beijing,100050 (China); Tan, Li, E-mail: Litan@ge.com [CT product, GE Healthcare, 100176 China (China)

    2012-08-15

    Objective: To investigate the effect of energy spectral CT on the image quality of CT portal venography in cirrhosis patients. Materials and methods: 30 portal hypertension patients underwent spectral CT examination using a single-tube, fast dual tube voltage switching technique. 101 sets of monochromatic images were generated from 40 keV to 140 keV. Image noise and contrast-to-noise ratio (CNR) for portal veins from the monochromatic images were measured. An optimal monochromatic image set was selected for obtaining the best CNR for portal veins. The image noise and CNR of the intra-hepatic portal vein and extra-hepatic main stem at the selected monochromatic level were compared with those from the conventional polychromatic images. Image quality was also assessed and compared. Results: The monochromatic images at 51 keV were found to provide the best CNR for both the intra-hepatic and extra-hepatic portal veins. At this energy level, the monochromatic images had about 100% higher CNR than the polychromatic images with a moderate 30% noise increase. The qualitative image quality assessment was also statistically higher with monochromatic images at 51 keV. Conclusion: Monochromatic images at 51 keV for CT portal venography could improve CNR for displaying hepatic portal veins and improve the overall image quality.

  17. Process perspective on image quality evaluation

    Science.gov (United States)

    Leisti, Tuomas; Halonen, Raisa; Kokkonen, Anna; Weckman, Hanna; Mettänen, Marja; Lensu, Lasse; Ritala, Risto; Oittinen, Pirkko; Nyman, Göte

    2008-01-01

    The psychological complexity of multivariate image quality evaluation makes it difficult to develop general image quality metrics. Quality evaluation includes several mental processes, and ignoring these processes and using only a few test images can lead to biased results. Using a qualitative/quantitative (Interpretation Based Quality, IBQ) methodology, we examined the process of pair-wise comparison in a setting where the quality of images printed by a laser printer on different paper grades was evaluated. The test image consisted of a picture of a table covered with several objects. Three other images were also used: photographs of a woman, a cityscape and a countryside. In addition to the pair-wise comparisons, observers (N=10) were interviewed about the subjective quality attributes they used in making their quality decisions. An examination of the individual pair-wise comparisons revealed serious inconsistencies in observers' evaluations for the test image content, but not for the other contents. The qualitative analysis showed that this inconsistency was due to the observers' focus of attention. The lack of easily recognizable context in the test image may have contributed to this inconsistency. To obtain reliable knowledge of the effect of image context or attention on subjective image quality, a qualitative methodology is needed.

  18. Balancing patient dose and image quality

    International Nuclear Information System (INIS)

    Martin, C.J.; Sutton, D.G.; Sharp, P.F.

    1999-01-01

    The formation of images in diagnostic radiology involves a complex interdependence of many factors. The ideal balance is to obtain an image which is adequate for the clinical purpose with the minimum radiation dose. Factors which affect radiation dose and image quality can be grouped under three headings; radiation quality, photon fluence and removal of scattered radiation. If optimal performance is to be achieved, it is necessary to understand how these factors influence image formation and affect radiation dose, and apply methodology for image quality and dose analysis at each stage in the development and use of X-ray equipment

  19. Image quality assessment using deep convolutional networks

    Science.gov (United States)

    Li, Yezhou; Ye, Xiang; Li, Yong

    2017-12-01

    This paper proposes a method of accurately assessing image quality without a reference image by using a deep convolutional neural network. Existing training-based methods usually utilize a compact set of linear filters for learning features of images captured by different sensors to assess their quality. These methods may not be able to learn the semantic features that are intimately related to the features used in human subjective assessment. Observing this drawback, this work proposes training a deep convolutional neural network (CNN) with labelled images for image quality assessment. The ReLU in the CNN allows non-linear transformations for extracting high-level image features, providing a more reliable assessment of image quality than linear filters. To enable the neural network to take images of any arbitrary size as input, spatial pyramid pooling (SPP) is introduced to connect the top convolutional layer and the fully-connected layer. In addition, the SPP makes the CNN robust to object deformations to a certain extent. The proposed method takes an image as input, carries out an end-to-end learning process, and outputs the quality of the image. It is tested on public datasets. Experimental results show that it outperforms existing methods by a large margin and can accurately assess the quality of images taken by different sensors and of varying sizes.
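
    A minimal PyTorch sketch of the architecture idea: convolutional feature extraction with ReLU, a spatial pyramid pooling layer so inputs of arbitrary size map to a fixed-length vector, and a fully connected regressor that outputs a single quality score. The layer sizes and pyramid levels are assumptions for illustration, not the configuration reported in the paper.

```python
import torch
import torch.nn as nn

class SPP(nn.Module):
    """Spatial pyramid pooling: pool feature maps to fixed grids and concatenate."""
    def __init__(self, levels=(1, 2, 4)):
        super().__init__()
        self.pools = nn.ModuleList(nn.AdaptiveMaxPool2d(s) for s in levels)

    def forward(self, x):
        return torch.cat([pool(x).flatten(1) for pool in self.pools], dim=1)

class NoReferenceIQA(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.spp = SPP((1, 2, 4))
        # 64 channels x (1 + 4 + 16) pooled cells = 1344 features regardless of input size.
        self.regressor = nn.Sequential(nn.Linear(64 * 21, 128), nn.ReLU(), nn.Linear(128, 1))

    def forward(self, x):
        return self.regressor(self.spp(self.features(x)))

model = NoReferenceIQA()
for h, w in [(128, 128), (96, 160)]:          # arbitrary input sizes map to one score each
    score = model(torch.randn(1, 1, h, w))
    print((h, w), score.shape)
```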

  20. Simulation of Profiles Data For Computed Tomography Using Object Images

    International Nuclear Information System (INIS)

    Srisatit, Somyot

    2007-08-01

    Full text: It is necessary to use a scanning system to obtain the profile data for computed tomographic images. Good profile data can give good contrast and resolution. However, a real scanning system requires high-efficiency, high-priced radiation equipment. Therefore, simulated profile data that yield CT image quality equivalent to that of real acquisitions can be used for demonstration purposes
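
    The idea of producing profile (projection) data directly from an object image can be sketched with scikit-image's parallel-beam Radon transform: each column of the sinogram is one simulated profile, and a CT image can be reconstructed from it by filtered back projection. This is a generic illustration, not the simulation method of the paper.

```python
import numpy as np
from skimage.data import shepp_logan_phantom
from skimage.transform import radon, iradon, rescale

# Object image standing in for the scanned cross-section.
image = rescale(shepp_logan_phantom(), 0.5)            # 200x200 phantom slice

# Simulate profile data: one projection (profile) per view angle.
angles = np.linspace(0.0, 180.0, 180, endpoint=False)
sinogram = radon(image, theta=angles)

# Reconstruct a tomographic image from the simulated profiles.
reconstruction = iradon(sinogram, theta=angles, filter_name="ramp")
error = np.sqrt(np.mean((reconstruction - image) ** 2))
print(sinogram.shape, reconstruction.shape, f"RMSE = {error:.4f}")
```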

  1. Quantitative Image Simulation and Analysis of Nanoparticles

    DEFF Research Database (Denmark)

    Madsen, Jacob; Hansen, Thomas Willum

    Microscopy (HRTEM) has become a routine analysis tool for structural characterization at atomic resolution, and with the recent development of in-situ TEMs, it is now possible to study catalytic nanoparticles under reaction conditions. However, the connection between an experimental image, and the underlying...... physical phenomena or structure is not always straightforward. The aim of this thesis is to use image simulation to better understand observations from HRTEM images. Surface strain is known to be important for the performance of nanoparticles. Using simulation, we estimate of the precision and accuracy...... of strain measurements from TEM images, and investigate the stability of these measurements to microscope parameters. This is followed by our efforts toward simulating metal nanoparticles on a metal-oxide support using the Charge Optimized Many Body (COMB) interatomic potential. The simulated interface...

  2. Hyperspectral Image Analysis of Food Quality

    DEFF Research Database (Denmark)

    Arngren, Morten

    inspection.Near-infrared spectroscopy can address these issues by offering a fast and objectiveanalysis of the food quality. A natural extension to these single spectrumNIR systems is to include image information such that each pixel holds a NIRspectrum. This augmented image information offers several......Assessing the quality of food is a vital step in any food processing line to ensurethe best food quality and maximum profit for the farmer and food manufacturer.Traditional quality evaluation methods are often destructive and labourintensive procedures relying on wet chemistry or subjective human...... extensions to the analysis offood quality. This dissertation is concerned with hyperspectral image analysisused to assess the quality of single grain kernels. The focus is to highlight thebenefits and challenges of using hyperspectral imaging for food quality presentedin two research directions. Initially...

  3. Effect of image quality on calcification detection in digital mammography

    Energy Technology Data Exchange (ETDEWEB)

    Warren, Lucy M.; Mackenzie, Alistair; Cooke, Julie; Given-Wilson, Rosalind M.; Wallis, Matthew G.; Chakraborty, Dev P.; Dance, David R.; Bosmans, Hilde; Young, Kenneth C. [National Co-ordinating Centre for the Physics of Mammography, Royal Surrey County Hospital NHS Foundation Trust, Guildford GU2 7XX, United Kingdom and Department of Physics, Faculty of Engineering and Physical Sciences, University of Surrey, Guildford, GU2 7XH (United Kingdom); Jarvis Breast Screening and Diagnostic Centre, Guildford GU1 1LJ (United Kingdom); Department of Radiology, St. George' s Healthcare NHS Trust, Tooting, London SW17 0QT (United Kingdom); Cambridge Breast Unit, Cambridge University Hospitals NHS Foundation Trust, Cambridge CB2 0QQ, United Kingdom and NIHR Cambridge Biomedical Research Centre, Cambridge CB2 0QQ (United Kingdom); Department of Radiology, University of Pittsburgh, Pittsburgh, Pennsylvania 15210 (United States); National Co-ordinating Centre for the Physics of Mammography, Royal Surrey County Hospital NHS Foundation Trust, Guildford GU2 7XX, United Kingdom and Department of Physics, Faculty of Engineering and Physical Sciences, University of Surrey, Guildford GU2 7XH (United Kingdom); University Hospitals Leuven, Herestraat 49, 3000 Leuven (Belgium); National Co-ordinating Centre for the Physics of Mammography, Royal Surrey County Hospital NHS Foundation Trust, Guildford GU2 7XX, United Kingdom and Department of Physics, Faculty of Engineering and Physical Sciences, University of Surrey, Guildford GU2 7XH (United Kingdom)

    2012-06-15

    Purpose: This study aims to investigate if microcalcification detection varies significantly when mammographic images are acquired using different image qualities, including: different detectors, dose levels, and different image processing algorithms. An additional aim was to determine how the standard European method of measuring image quality using threshold gold thickness measured with a CDMAM phantom and the associated limits in current EU guidelines relate to calcification detection. Methods: One hundred and sixty two normal breast images were acquired on an amorphous selenium direct digital (DR) system. Microcalcification clusters extracted from magnified images of slices of mastectomies were electronically inserted into half of the images. The calcification clusters had a subtle appearance. All images were adjusted using a validated mathematical method to simulate the appearance of images from a computed radiography (CR) imaging system at the same dose, from both systems at half this dose, and from the DR system at quarter this dose. The original 162 images were processed with both Hologic and Agfa (Musica-2) image processing. All other image qualities were processed with Agfa (Musica-2) image processing only. Seven experienced observers marked and rated any identified suspicious regions. Free response operating characteristic (FROC) and ROC analyses were performed on the data. The lesion sensitivity at a nonlesion localization fraction (NLF) of 0.1 was also calculated. Images of the CDMAM mammographic test phantom were acquired using the automatic setting on the DR system. These images were modified to the additional image qualities used in the observer study. The images were analyzed using automated software. In order to assess the relationship between threshold gold thickness and calcification detection a power law was fitted to the data. Results: There was a significant reduction in calcification detection using CR compared with DR: the alternative FROC
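
    The power-law fit mentioned above can be reproduced in outline with an ordinary least-squares line in log-log space. The thickness/sensitivity pairs and the 1.68 μm limit below are hypothetical placeholders for illustration, not the study's measurements or the exact EU guideline value.

```python
import numpy as np

# Hypothetical pairs of (threshold gold thickness from CDMAM, observed lesion sensitivity).
gold_thickness_um = np.array([0.8, 1.0, 1.2, 1.5, 1.9, 2.4])
lesion_sensitivity = np.array([0.78, 0.71, 0.66, 0.58, 0.50, 0.44])

# Fit sensitivity = a * thickness^b  <=>  log(sens) = log(a) + b * log(thickness).
b, log_a = np.polyfit(np.log(gold_thickness_um), np.log(lesion_sensitivity), 1)
a = np.exp(log_a)
print(f"sensitivity ~ {a:.2f} * thickness^({b:.2f})")

# Predicted sensitivity at an illustrative guideline-style limit of 1.68 um.
print("predicted sensitivity at 1.68 um:", a * 1.68 ** b)
```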

  4. The use of the general image quality equation in the design and evaluation of imaging systems

    Science.gov (United States)

    Cota, Steve A.; Florio, Christopher J.; Duvall, David J.; Leon, Michael A.

    2009-08-01

    The design of any modern imaging system is the end result of many trade studies, each seeking to optimize image quality within real world constraints such as cost, schedule and overall risk. The National Imagery Interpretability Rating Scale (NIIRS) is a useful measure of image quality, because, by characterizing the overall interpretability of an image, it combines into one metric those contributors to image quality to which a human interpreter is most sensitive. The main drawback to using a NIIRS rating as a measure of image quality in engineering trade studies is the fact that it is tied to the human observer and cannot be predicted from physical principles and engineering parameters alone. The General Image Quality Equation (GIQE) of Leachtenauer et al. 1997 [Appl. Opt. 36, 8322-8328 (1997)] is a regression of actual image analyst NIIRS ratings vs. readily calculable engineering metrics, and provides a mechanism for using the expected NIIRS rating of an imaging system in the design and evaluation process. In this paper, we will discuss how we use the GIQE in conjunction with The Aerospace Corporation's Parameterized Image Chain Analysis & Simulation SOftware (PICASSO) to evaluate imager designs, taking a hypothetical high resolution commercial imaging system as an example.
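
    For orientation, the sketch below evaluates a GIQE 4.0-style regression for predicted NIIRS from ground sample distance (GSD, in inches), relative edge response (RER), edge overshoot (H) and noise gain over SNR. The coefficients follow the commonly quoted form of Leachtenauer et al. (1997), but they should be checked against the original paper before being relied on.

```python
import math

def giqe4_niirs(gsd_inches, rer, overshoot_h, noise_gain, snr):
    """GIQE 4.0-style predicted NIIRS (coefficients as commonly quoted; verify before use)."""
    if rer >= 0.9:
        a, b = 3.32, 1.559
    else:
        a, b = 3.16, 2.817
    return (10.251
            - a * math.log10(gsd_inches)
            + b * math.log10(rer)
            - 0.656 * overshoot_h
            - 0.344 * noise_gain / snr)

# Example: a hypothetical imager with ~0.5 m GSD (about 19.7 in), RER 0.92, mild overshoot, SNR 50.
print(round(giqe4_niirs(gsd_inches=19.7, rer=0.92, overshoot_h=1.05, noise_gain=1.0, snr=50), 2))
```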

  5. Image quality at synthetic brain magnetic resonance imaging in children

    Energy Technology Data Exchange (ETDEWEB)

    Lee, So Mi; Cho, Seung Hyun; Kim, Won Hwa; Kim, Hye Jung [Kyungpook National University Hospital, Department of Radiology, Daegu (Korea, Republic of); Choi, Young Hun; Cheon, Jung-Eun; Kim, In-One [Seoul National University College of Medicine, Department of Radiology and Institute of Radiation Medicine, Seoul (Korea, Republic of); Cho, Hyun-Hae [Ewha Womans University Mokdong Hospital, Department of Radiology, Seoul (Korea, Republic of); You, Sun-Kyoung [Chungnam National University Hospital, Department of Radiology, Daejeon (Korea, Republic of); Park, Sook-Hyun [Kyungpook National University Hospital, Department of Pediatrics, Daegu (Korea, Republic of); Hwang, Moon Jung [GE Healthcare, MR Applications and Workflow, Seoul (Korea, Republic of)

    2017-11-15

    The clinical application of the multi-echo, multi-delay technique of synthetic magnetic resonance imaging (MRI) generates multiple sequences in a single acquisition but has mainly been used in adults. To evaluate the image quality of synthetic brain MR in children compared with that of conventional images. Twenty-nine children (median age: 6 years, range: 0-16 years) underwent synthetic and conventional imaging. Synthetic (T2-weighted, T1-weighted and fluid-attenuated inversion recovery [FLAIR]) images with settings matching those of the conventional images were generated. The overall image quality, gray/white matter differentiation, lesion conspicuity and image degradations were rated on a 5-point scale. The relative contrasts were assessed quantitatively and acquisition times for the two imaging techniques were compared. Synthetic images were inferior due to more pronounced image degradations; however, there were no significant differences for T1- and T2-weighted images in children <2 years old. The quality of T1- and T2-weighted images was within the diagnostically acceptable range. FLAIR images showed greatly reduced quality. Gray/white matter differentiation was comparable or better in synthetic T1- and T2-weighted images, but poorer in FLAIR images. There was no effect on lesion conspicuity. Synthetic images had equal or greater relative contrast. Acquisition time was approximately two-thirds of that for conventional sequences. Synthetic T1- and T2-weighted images were diagnostically acceptable, but synthetic FLAIR images were not. Lesion conspicuity and gray/white matter differentiation were comparable to conventional MRI. (orig.)

  6. IMAGE QUALITY FORECASTING FOR SPACE OBJECTS

    Directory of Open Access Journals (Sweden)

    A. I. Altukhov

    2013-05-01

    The article deals with an approach to predicting the quality of space object images, which can be used to plan the work programs of optoelectronic systems on remote sensing satellites. The proposed approach is based on evaluation of the transfer properties of the optoelectronic equipment and calculation of image quality indexes, taking into account the influence of the orbital imaging conditions.

  7. Image quality and dose in computed tomography

    International Nuclear Information System (INIS)

    Jurik, A.G.; Jessen, K.A.; Hansen, J.

    1997-01-01

    Radiation exposure to the patient during CT is relatively high, and it is therefore important to optimize the dose so that it is as low as possible but still consistent with required diagnostic image quality. There is no established method for measuring diagnostic image quality; therefore, a set of image quality criteria which must be fulfilled for optimal image quality was defined for the retroperitoneal space and the mediastinum. The use of these criteria for assessment of image quality was tested based on 113 retroperitoneal and 68 mediastinal examinations performed in seven different CT units. All the criteria, except one, were found to be usable for measuring diagnostic image quality. The fulfilment of criteria was related to the radiation dose given in the different departments. For examination of the retroperitoneal space, the effective dose varied between 5.1 and 20.0 mSv (millisievert), and there was a slight correlation between dose and a high percentage of 'yes' scores for the image quality criteria. For examination of the mediastinum the dose range was 4.4-26.5 mSv, and there was no significant increment of image quality at high doses. The great variation of dose at different CT units was due partly to differences regarding the examination procedure, especially the number of slices and the mAs (milliampere-seconds), but inherent dose variation between different scanners also played a part. (orig.). With 6 figs., 6 tabs

  8. Effect of image quality on calcification detection in digital mammography.

    Science.gov (United States)

    Warren, Lucy M; Mackenzie, Alistair; Cooke, Julie; Given-Wilson, Rosalind M; Wallis, Matthew G; Chakraborty, Dev P; Dance, David R; Bosmans, Hilde; Young, Kenneth C

    2012-06-01

    This study aims to investigate if microcalcification detection varies significantly when mammographic images are acquired using different image qualities, including different detectors, dose levels, and image processing algorithms. An additional aim was to determine how the standard European method of measuring image quality using threshold gold thickness measured with a CDMAM phantom and the associated limits in current EU guidelines relate to calcification detection. One hundred and sixty-two normal breast images were acquired on an amorphous selenium direct digital (DR) system. Microcalcification clusters extracted from magnified images of slices of mastectomies were electronically inserted into half of the images. The calcification clusters had a subtle appearance. All images were adjusted using a validated mathematical method to simulate the appearance of images from a computed radiography (CR) imaging system at the same dose, from both systems at half of this dose, and from the DR system at a quarter of this dose. The original 162 images were processed with both Hologic and Agfa (Musica-2) image processing. All other image qualities were processed with Agfa (Musica-2) image processing only. Seven experienced observers marked and rated any identified suspicious regions. Free response operating characteristic (FROC) and ROC analyses were performed on the data. The lesion sensitivity at a nonlesion localization fraction (NLF) of 0.1 was also calculated. Images of the CDMAM mammographic test phantom were acquired using the automatic setting on the DR system. These images were modified to the additional image qualities used in the observer study. The images were analyzed using automated software. In order to assess the relationship between threshold gold thickness and calcification detection a power law was fitted to the data. There was a significant reduction in calcification detection using CR compared with DR: the alternative FROC (AFROC) area decreased from

  9. Effect of image quality on calcification detection in digital mammography

    International Nuclear Information System (INIS)

    Warren, Lucy M.; Mackenzie, Alistair; Cooke, Julie; Given-Wilson, Rosalind M.; Wallis, Matthew G.; Chakraborty, Dev P.; Dance, David R.; Bosmans, Hilde; Young, Kenneth C.

    2012-01-01

    Purpose: This study aims to investigate if microcalcification detection varies significantly when mammographic images are acquired using different image qualities, including different detectors, dose levels, and image processing algorithms. An additional aim was to determine how the standard European method of measuring image quality using threshold gold thickness measured with a CDMAM phantom and the associated limits in current EU guidelines relate to calcification detection. Methods: One hundred and sixty-two normal breast images were acquired on an amorphous selenium direct digital (DR) system. Microcalcification clusters extracted from magnified images of slices of mastectomies were electronically inserted into half of the images. The calcification clusters had a subtle appearance. All images were adjusted using a validated mathematical method to simulate the appearance of images from a computed radiography (CR) imaging system at the same dose, from both systems at half of this dose, and from the DR system at a quarter of this dose. The original 162 images were processed with both Hologic and Agfa (Musica-2) image processing. All other image qualities were processed with Agfa (Musica-2) image processing only. Seven experienced observers marked and rated any identified suspicious regions. Free response operating characteristic (FROC) and ROC analyses were performed on the data. The lesion sensitivity at a nonlesion localization fraction (NLF) of 0.1 was also calculated. Images of the CDMAM mammographic test phantom were acquired using the automatic setting on the DR system. These images were modified to the additional image qualities used in the observer study. The images were analyzed using automated software. In order to assess the relationship between threshold gold thickness and calcification detection a power law was fitted to the data. Results: There was a significant reduction in calcification detection using CR compared with DR: the alternative FROC
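    The final step of the methods, fitting a power law between CDMAM threshold gold thickness and calcification detection, is a small curve-fitting exercise. The sketch below shows one way to do such a fit with SciPy; the thickness and sensitivity values are invented placeholders, not the study's data, and the functional form y = a * x^b is simply the power law named in the abstract.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    # Hypothetical data: CDMAM threshold gold thickness (um) for each image
    # quality, and the corresponding lesion sensitivity from an observer study.
    threshold_thickness = np.array([1.10, 1.45, 1.70, 2.10, 2.60])
    lesion_sensitivity = np.array([0.78, 0.70, 0.65, 0.57, 0.50])

    def power_law(x, a, b):
        return a * np.power(x, b)

    params, _ = curve_fit(power_law, threshold_thickness, lesion_sensitivity,
                          p0=(1.0, -0.5))
    a, b = params
    print(f"fitted power law: sensitivity ~ {a:.3f} * thickness^{b:.3f}")
    ```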

  10. Quality Control in Mammography: Image Quality and Patient Doses

    International Nuclear Information System (INIS)

    Ciraj Bjelac, O.; Arandjic, D.; Boris Loncar, B.; Kosutic, D.

    2008-01-01

    Mammography is the method of choice for early detection of breast cancer. The purpose of this paper is a preliminary evaluation of mammography practice in Serbia in terms of both quality control indicators, i.e. image quality and patient doses. The survey demonstrated considerable variations in the technical parameters that affect image quality and patient doses. Mean glandular doses ranged from 0.12 to 2.8 mGy, while reference optical density ranged from 1.2 to 2.8. A correlation between image contrast and mean glandular dose was demonstrated. Systematic implementation of a quality control protocol should ensure satisfactory performance of mammography units, maintain satisfactory image quality and keep patient doses as low as reasonably practicable. (author)

  11. Clinical evaluation of a new set of image quality criteria for mammography

    International Nuclear Information System (INIS)

    Grahn, A.; Hemdal, B.; Andersson, I.; Ruschin, M.; Thilander-Klang, A.; Boerjesson, S.; Tingberg, A.; Mattsson, S.; Haakansson, M.; Baath, M.; Maansson, L. G.; Medin, J.; Wanninger, F.; Panzer, W.

    2005-01-01

    The European Commission (EC) quality criteria for screen-film mammography are used as a tool to assess image quality. A new set of criteria was developed and initially tested in a previous study. In the present study, these criteria are further evaluated using screen-film mammograms that have been digitised, manipulated to simulate different image quality levels, and reprinted on film. Expert radiologists have evaluated these manipulated images using both the original (EC) and the new criteria. A comparison at three different simulated dose levels showed that the new criteria yield a larger separation of image criteria scores than the old ones. These results indicate that the new set of image quality criteria has a higher discriminative power than the old set and thus seems to be more suitable for evaluation of image quality in mammography. (authors)

  12. Monte-Carlo simulations and image reconstruction for novel imaging scenarios in emission tomography

    International Nuclear Information System (INIS)

    Gillam, John E.; Rafecas, Magdalena

    2016-01-01

    Emission imaging incorporates both the development of dedicated devices for data acquisition as well as algorithms for recovering images from that data. Emission tomography is an indirect approach to imaging. The effect of device modification on the final image can be understood through both the way in which data are gathered, using simulation, and the way in which the image is formed from that data, or image reconstruction. When developing novel devices, systems and imaging tasks, accurate simulation and image reconstruction allow performance to be estimated, and in some cases optimized, using computational methods before or during the process of physical construction. However, there are a vast range of approaches, algorithms and pre-existing computational tools that can be exploited and the choices made will affect the accuracy of the in silico results and quality of the reconstructed images. On the one hand, should important physical effects be neglected in either the simulation or reconstruction steps, specific enhancements provided by novel devices may not be represented in the results. On the other hand, over-modeling of device characteristics in either step leads to large computational overheads that can confound timely results. Here, a range of simulation methodologies and toolkits are discussed, as well as reconstruction algorithms that may be employed in emission imaging. The relative advantages and disadvantages of a range of options are highlighted using specific examples from current research scenarios.
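    As a concrete illustration of the reconstruction side of this discussion, the following sketch implements a bare-bones MLEM (maximum-likelihood expectation-maximization) update for a toy emission tomography problem. The random system matrix stands in for the detector model that an accurate simulation would supply; it is illustrative only and not tied to any particular toolkit mentioned in the record.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Toy system matrix A (n_detectors x n_voxels): A[i, j] is the probability
    # that a decay in voxel j is recorded in detector bin i. Illustrative only.
    n_det, n_vox = 200, 64
    A = rng.random((n_det, n_vox))
    A /= A.sum(axis=0, keepdims=True)

    true_image = rng.random(n_vox) * 100.0
    counts = rng.poisson(A @ true_image)          # noisy projection data

    sensitivity = A.sum(axis=0)                   # sum over detectors for each voxel
    image = np.ones(n_vox)                        # uniform initial estimate
    for _ in range(50):
        expected = A @ image                      # forward projection
        ratio = counts / np.maximum(expected, 1e-12)
        image *= (A.T @ ratio) / sensitivity      # multiplicative MLEM update

    print("relative error:",
          np.linalg.norm(image - true_image) / np.linalg.norm(true_image))
    ```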

  13. Monte-Carlo simulations and image reconstruction for novel imaging scenarios in emission tomography

    Energy Technology Data Exchange (ETDEWEB)

    Gillam, John E. [The University of Sydney, Faculty of Health Sciences and The Brain and Mind Centre, Camperdown (Australia); Rafecas, Magdalena, E-mail: rafecas@imt.uni-luebeck.de [University of Lubeck, Institute of Medical Engineering, Ratzeburger Allee 160, 23538 Lübeck (Germany)

    2016-02-11

    Emission imaging incorporates both the development of dedicated devices for data acquisition as well as algorithms for recovering images from that data. Emission tomography is an indirect approach to imaging. The effect of device modification on the final image can be understood through both the way in which data are gathered, using simulation, and the way in which the image is formed from that data, or image reconstruction. When developing novel devices, systems and imaging tasks, accurate simulation and image reconstruction allow performance to be estimated, and in some cases optimized, using computational methods before or during the process of physical construction. However, there are a vast range of approaches, algorithms and pre-existing computational tools that can be exploited and the choices made will affect the accuracy of the in silico results and quality of the reconstructed images. On the one hand, should important physical effects be neglected in either the simulation or reconstruction steps, specific enhancements provided by novel devices may not be represented in the results. On the other hand, over-modeling of device characteristics in either step leads to large computational overheads that can confound timely results. Here, a range of simulation methodologies and toolkits are discussed, as well as reconstruction algorithms that may be employed in emission imaging. The relative advantages and disadvantages of a range of options are highlighted using specific examples from current research scenarios.

  14. Simulation of ultrasound backscatter images from fish

    DEFF Research Database (Denmark)

    Pham, An Hoai

    2011-01-01

    The objective of this work is to investigate ultrasound (US) backscatter in the MHz range from fish to develop a realistic and reliable simulation model. The long term objective of the work is to develop the needed signal processing for fish species differentiation using US. In in-vitro experiments, a cod (Gadus morhua) was scanned with both a BK Medical ProFocus 2202 ultrasound scanner and a Toshiba Aquilion ONE computed tomography (CT) scanner. The US images of the fish were compared with US images created using the ultrasound simulation program Field II. The center frequency of the transducer is 10 MHz and the Full Width at Half Maximum (FWHM) at the focus point is 0.54 mm in the lateral direction. The transducer model in Field II was calibrated using a wire phantom to validate the simulated point spread function. The inputs to the simulation were the CT image data of the fish converted ...

  15. Analog and digital image quality:

    OpenAIRE

    Sardo, Alberto

    2004-01-01

    Background. Lately, X-ray facilities have been moving through a slow but continuous process of digitalization. Dry laser printers allow hardcopy images with optimum resolution and contrast for all modalities. In breast imaging, the delay of digitalization is due to the high cost of digital systems and, at times, to doubts about the diagnostic accuracy of reading digital breast images. Conclusions. Screen-film mammography (SFM) is the most efficient diagnostic modality to detect the b...

  16. Hyperspectral imaging simulation of object under sea-sky background

    Science.gov (United States)

    Wang, Biao; Lin, Jia-xuan; Gao, Wei; Yue, Hui

    2016-10-01

    Remote sensing image simulation plays an important role in spaceborne/airborne payload demonstration and algorithm development. Hyperspectral imaging is valuable in marine monitoring and search and rescue. To meet the demand for spectral imaging of objects in complex sea scenes, a physics-based method for simulating spectral images of objects under a sea scene is proposed. By developing an imaging simulation model that considers the object, background, atmospheric conditions and sensor, it is possible to examine the influence of wind speed, atmospheric conditions and other environmental factors on spectral image quality in a complex sea scene. Firstly, the sea scattering model is established based on the Phillips sea spectral model, rough surface scattering theory and the water volume scattering characteristics. Measured bidirectional reflectance distribution function (BRDF) data of the objects are fitted to a statistical model. MODTRAN software is used to obtain the solar illumination on the sea, the sky brightness, the atmospheric transmittance from sea to sensor and the atmospheric backscattered radiance, and a Monte Carlo ray tracing method is used to calculate the composite scattering of the sea surface and object and the resulting spectral image. Finally, the object spectrum is obtained through spatial transformation, radiometric degradation and the addition of noise. The model connects the spectral image with the environmental parameters, the object parameters and the sensor parameters, providing a tool for payload demonstration and algorithm development.
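    To make the radiative-transfer bookkeeping in such a simulation explicit, the sketch below combines surface reflectance with the atmospheric terms (surface irradiance, surface-to-sensor transmittance, path radiance) into an at-sensor radiance, assuming a Lambertian facet for simplicity. In the record these terms come from MODTRAN and the sea BRDF model; the numbers below are placeholders, not MODTRAN output.

    ```python
    import numpy as np

    def at_sensor_radiance(reflectance, e_surface, tau_up, l_path):
        """Per-band at-sensor radiance for a Lambertian surface facet.

        reflectance : surface spectral reflectance (0..1), per band
        e_surface   : total solar + sky irradiance at the surface [W/m^2/um]
        tau_up      : surface-to-sensor atmospheric transmittance, per band
        l_path      : atmospheric path radiance [W/m^2/sr/um], per band
        """
        surface_leaving = reflectance * e_surface / np.pi   # Lambertian assumption
        return tau_up * surface_leaving + l_path

    # Placeholder three-band example (illustrative values only).
    rho   = np.array([0.05, 0.08, 0.12])
    e_sfc = np.array([900.0, 800.0, 650.0])
    tau   = np.array([0.80, 0.85, 0.88])
    lp    = np.array([25.0, 15.0, 8.0])
    print(at_sensor_radiance(rho, e_sfc, tau, lp))
    ```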

  17. MRI quality control: six imagers studied using eleven unified image quality parameters

    International Nuclear Information System (INIS)

    Ihalainen, T.; Sipilae, O.; Savolainen, S.

    2004-01-01

    Quality control of the magnetic resonance imagers of different vendors in the clinical environment is non-harmonised, and comparing the performance is difficult. The purpose of this study was to develop and apply a harmonised long-term quality control protocol for the six imagers in our organisation in order to assure that they fulfil the same basic image quality requirements. The same Eurospin phantom set and identical imaging parameters were used with each imager. Values of 11 comparable parameters describing the image quality were measured. Automatic image analysis software was developed to objectively analyse the images. The results proved that the imagers were operating at a performance level adequate for clinical imaging. Some deficiencies were detected in image uniformity and geometry. The automated analysis of the Eurospin phantom images was successful. The measurements were successfully repeated after 2 weeks on one imager and after half a year on all imagers. As an objective way of examining the image quality, this kind of comparable and objective quality control of different imagers is considered as an essential step towards harmonisation of the clinical MRI studies through a large hospital organisation. (orig.)

  18. No-reference image quality assessment for horizontal-path imaging scenarios

    Science.gov (United States)

    Rios, Carlos; Gladysz, Szymon

    2013-05-01

    There exist several image-enhancement algorithms and tasks associated with imaging through turbulence that depend on defining the quality of an image. Examples include: "lucky imaging", choosing the width of the inverse filter for image reconstruction, or stopping iterative deconvolution. We collected a number of image quality metrics found in the literature. Particularly interesting are the blind, "no-reference" metrics. We discuss ways of evaluating the usefulness of these metrics, even when a fully objective comparison is impossible because of the lack of a reference image. Metrics are tested on simulated and real data. Field data comes from experiments performed by the NATO SET 165 research group over a 7 km distance in Dayton, Ohio.
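    As a small, concrete example of the kind of blind metric discussed here, the sketch below scores sharpness as the variance of the Laplacian, a common no-reference proxy for blur that is often used, for instance, to rank frames in lucky imaging. It is only one of many candidate metrics, is not necessarily one of those collected by the authors, and is demonstrated on synthetic data.

    ```python
    import numpy as np
    from scipy import ndimage

    def laplacian_variance(image):
        """No-reference sharpness score: variance of the Laplacian response."""
        return ndimage.laplace(image.astype(float)).var()

    # Synthetic demonstration: a random 'scene' and a blurred copy of it.
    rng = np.random.default_rng(1)
    scene = rng.random((256, 256))
    blurred = ndimage.gaussian_filter(scene, sigma=2.0)

    print("sharp  :", laplacian_variance(scene))
    print("blurred:", laplacian_variance(blurred))   # noticeably lower score
    ```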

  19. [Bone drilling simulation by three-dimensional imaging].

    Science.gov (United States)

    Suto, Y; Furuhata, K; Kojima, T; Kurokawa, T; Kobayashi, M

    1989-06-01

    The three-dimensional display technique has a wide range of medical applications. Pre-operative planning is one typical application: in orthopedic surgery, three-dimensional image processing has been used very successfully. We have employed this technique in pre-operative planning for orthopedic surgery, and have developed a simulation system for bone-drilling. Positive results were obtained by pre-operative rehearsal; when a region of interest is indicated by means of a mouse on the three-dimensional image displayed on the CRT, the corresponding region appears on the slice image which is displayed simultaneously. Consequently, the status of the bone-drilling is constantly monitored. In developing this system, we have placed emphasis on the quality of the reconstructed three-dimensional images, on fast processing, and on the easy operation of the surgical planning simulation.

  20. PLEIADES-HR IMAGE QUALITY COMMISSIONING

    Directory of Open Access Journals (Sweden)

    L. Lebègue

    2012-07-01

    PLEIADES is the highest resolution civilian earth observing system ever developed in Europe. This imagery program is conducted by the French National Space Agency, CNES. Since 2012 it has operated a first satellite, PLEIADES-HR, launched on 17 December 2011; a second one should be launched by the end of the year. Each satellite is designed to provide optical 70 cm resolution coloured images to civilian and defence users. The image quality requirements were defined from user studies covering the different spatial imaging applications, taking into account the trade-off between on-board technological complexity and ground processing capacity. The assessment of the image quality and the calibration operations were performed by the CNES Image Quality team during the 6 month commissioning phase that followed the satellite launch. These activities cover many topics gathered in two families: radiometric and geometric image quality. The new capabilities offered by the agility of PLEIADES-HR made it possible to devise new methods of image calibration and performance assessment. Starting from an overview of the satellite characteristics, this paper presents all the calibration operations that were conducted during the commissioning phase and also gives the main results for each image quality performance.

  1. Parameters related to the image quality in computed tomography -CT

    International Nuclear Information System (INIS)

    Alonso, T.C.; Silva, T.A.; Mourão, A.P.; Silva, T.A.

    2015-01-01

    Quality control programs in computed tomography (CT) should be continuously reviewed to ensure the best image quality with the lowest possible dose to the patient in the diagnostic process. Quality control in CT aims to design and implement a set of procedures that allows verification of the operating conditions of the equipment within the requirements specified for its use. In Brazil, the Ministry of Health (MOH) Technical Rules (Resolution NE in 1016) - Medical Radiology - 'Equipment and Safety Performance' establish a reference for the analysis of tests on CT. A large number of factors such as image noise, slice thickness (resolution along the Z axis), low contrast resolution, high contrast resolution and the radiation dose can be affected by the selection of technical parameters in examinations. The purpose of this study was to investigate how changes in image acquisition protocols modify image quality and to determine the advantages and disadvantages among the different aspects of image quality, especially the reduction of patient radiation dose. A preliminary procedure was to check the operating conditions of the CT scanner. Measurements were performed on a 64-MDCT scanner (GE Healthcare BrightSpeed) in the service of the Molecular Imaging Center (Cimol) of the Federal University of Minas Gerais (UFMG). For the image quality tests we used the Catphan-600 phantom; this device has five modules, and in each a series of tests can be performed. Different medical imaging practices have different requirements for acceptable image quality. The results of the quality control tests showed that the analyzed equipment is in accordance with the requirements established by current regulations. [pt]

  2. Image quality in coronary computed tomography angiography

    DEFF Research Database (Denmark)

    Precht, Helle; Gerke, Oke; Thygesen, Jesper

    2018-01-01

    Background Computed tomography (CT) technology is rapidly evolving, and software solutions are developed to optimize image quality and/or lower radiation dose. Purpose To investigate the influence of adaptive statistical iterative reconstruction (ASIR) at different radiation doses in coronary CT...

  3. Quality control of imaging devices

    International Nuclear Information System (INIS)

    Soni, P.S.

    1992-01-01

    Quality assurance in nuclear medicine refers collectively to all aspects of a nuclear medicine service. It includes patient scheduling, radiopharmaceutical preparation and dispensing, radiation protection of patients, staff and the general public, preventive maintenance and care of instruments, methodology, data interpretation and record keeping, and many other small things which contribute directly or indirectly to the overall quality of a nuclear medicine service in a hospital. Quality Control, on the other hand, refers to a single component of the system and is usually applied in relation to a specific instrument and its performance.

  4. Tradeoffs between image quality and dose

    International Nuclear Information System (INIS)

    Seibert, J.A.

    2004-01-01

    Image quality takes on different perspectives and meanings when associated with the concept of as low as reasonably achievable (ALARA), which is chiefly focused on radiation dose delivered as a result of a medical imaging procedure. ALARA is important because of the increased radiosensitivity of children to ionizing radiation and the desire to keep the radiation dose low. By the same token, however, image quality is also important because of the need to provide the necessary information in a radiograph in order to make an accurate diagnosis. Thus, there are tradeoffs to be considered between image quality and radiation dose, which is the main topic of this article. ALARA does not necessarily mean the lowest radiation dose, nor, when implemented, does it result in the least desirable radiographic images. With the recent widespread implementation of digital radiographic detectors and displays, a new level of flexibility and complexity confronts the technologist, physicist, and radiologist in optimizing the pediatric radiography exam. This is due to the separation of the acquisition, display, and archiving events that were previously combined by the screen-film detector, which allows for compensation for under- and overexposures, image processing, and on-line image manipulation. As explained in the article, different concepts must be introduced for a better understanding of the tradeoffs encountered when dealing with digital radiography and ALARA. In addition, there are many instances during the image acquisition/display/interpretation process in which image quality and associated dose can be compromised. This requires continuous diligence to quality control and feedback mechanisms to verify that the goals of image quality, dose and ALARA are achieved. (orig.)

  5. Synthetic tracked aperture ultrasound imaging: design, simulation, and experimental evaluation.

    Science.gov (United States)

    Zhang, Haichong K; Cheng, Alexis; Bottenus, Nick; Guo, Xiaoyu; Trahey, Gregg E; Boctor, Emad M

    2016-04-01

    Ultrasonography is a widely used imaging modality to visualize anatomical structures due to its low cost and ease of use; however, it is challenging to acquire acceptable image quality in deep tissue. Synthetic aperture (SA) is a technique used to increase image resolution by synthesizing information from multiple subapertures, but the resolution improvement is limited by the physical size of the array transducer. With a large F-number, it is difficult to achieve high resolution in deep regions without extending the effective aperture size. We propose a method to extend the available aperture size for SA-called synthetic tracked aperture ultrasound (STRATUS) imaging-by sweeping an ultrasound transducer while tracking its orientation and location. Tracking information of the ultrasound probe is used to synthesize the signals received at different positions. Considering the practical implementation, we estimated the effect of tracking and ultrasound calibration error to the quality of the final beamformed image through simulation. In addition, to experimentally validate this approach, a 6 degree-of-freedom robot arm was used as a mechanical tracker to hold an ultrasound transducer and to apply in-plane lateral translational motion. Results indicate that STRATUS imaging with robotic tracking has the potential to improve ultrasound image quality.
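    For readers unfamiliar with how tracked element positions feed into synthetic aperture reconstruction, the sketch below shows a minimal delay-and-sum beamformer: echoes from each (tracked) element position are time-aligned to an image point and summed. The geometry, sampling parameters, monostatic assumption, and synthetic RF data are all illustrative placeholders rather than the authors' STRATUS pipeline.

    ```python
    import numpy as np

    c = 1540.0          # assumed speed of sound in tissue [m/s]
    fs = 40e6           # assumed RF sampling frequency [Hz]

    # Tracked element positions along x (e.g. from sweeping the probe), in metres.
    element_x = np.linspace(-0.02, 0.02, 16)

    # Synthetic RF data: one channel per tracked element position.
    n_samples = 4000
    rng = np.random.default_rng(0)
    rf = rng.standard_normal((element_x.size, n_samples)) * 0.01
    # Embed an echo from a scatterer at (x = 0, z = 30 mm) into each channel.
    scatterer = np.array([0.0, 0.03])
    for k, ex in enumerate(element_x):
        dist = np.hypot(scatterer[0] - ex, scatterer[1])      # element to point
        sample = int(round(2 * dist / c * fs))                # two-way travel time
        rf[k, sample] += 1.0

    def das_pixel(x, z):
        """Delay-and-sum value at image point (x, z) using all tracked elements."""
        value = 0.0
        for k, ex in enumerate(element_x):
            delay = 2 * np.hypot(x - ex, z) / c               # monostatic assumption
            idx = int(round(delay * fs))
            if idx < n_samples:
                value += rf[k, idx]
        return value

    print("on scatterer :", das_pixel(0.0, 0.03))
    print("off scatterer:", das_pixel(0.005, 0.025))
    ```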

  6. Psychophysical evaluation of image quality : from judgment to impression

    NARCIS (Netherlands)

    Ridder, de H.; Rogowitz, B.E.; Pappas, T.N.

    1998-01-01

    Designs of imaging systems, image processing algorithms etc. usually take for granted that methods for assessing perceived image quality produce unbiased estimates of the viewers' quality impression. Quality judgments, however, are affected by the judgment strategies induced by the experimental

  7. Image Quality Assessment via Quality-aware Group Sparse Coding

    Directory of Open Access Journals (Sweden)

    Minglei Tong

    2014-12-01

    Image quality assessment has been attracting growing attention at an accelerated pace over the past decade in the fields of image processing, vision and machine learning. In particular, general purpose blind image quality assessment is technically challenging, and many state-of-the-art approaches have been developed to solve this problem, most under the supervised learning framework where human-scored samples are needed to train a regression model. In this paper, we propose an unsupervised learning approach that works without human labels. In the off-line stage, our method trains a dictionary of patch atoms covering different levels of image quality across the training samples without knowing the human scores, where each atom is associated with a quality score induced from the reference image; in the on-line stage, given each image patch, our method performs group sparse coding to encode the sample, such that the sample quality can be estimated from the few labeled atoms whose encoding coefficients are nonzero. Experimental results on the public dataset show the promising performance of our approach, and future research directions are also discussed.
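    A rough sense of the on-line stage can be given with ordinary sparse coding as a simplified stand-in for the group sparse coding used in the paper: each patch is encoded over a dictionary whose atoms carry quality scores, and the patch quality is estimated as a coefficient-weighted average of the scores of the selected atoms. The dictionary, scores, and patches below are random placeholders, not a trained model.

    ```python
    import numpy as np
    from sklearn.decomposition import sparse_encode

    rng = np.random.default_rng(0)

    # Placeholder dictionary: 64 atoms of 8x8 patches, each with a quality score
    # in [0, 1] that a real system would induce from reference images.
    n_atoms, patch_dim = 64, 64
    dictionary = rng.standard_normal((n_atoms, patch_dim))
    dictionary /= np.linalg.norm(dictionary, axis=1, keepdims=True)
    atom_quality = rng.random(n_atoms)

    def estimate_patch_quality(patches):
        """Weight atom quality scores by the magnitude of the sparse codes."""
        codes = sparse_encode(patches, dictionary,
                              algorithm="omp", n_nonzero_coefs=5)
        weights = np.abs(codes)
        return (weights @ atom_quality) / np.maximum(weights.sum(axis=1), 1e-12)

    test_patches = rng.standard_normal((10, patch_dim))
    print(estimate_patch_quality(test_patches))
    ```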

  8. An investigation of the trade-off between the count level and image quality in myocardial perfusion SPECT using simulated images: the effects of statistical noise and object variability on defect detectability

    International Nuclear Information System (INIS)

    He Xin; Links, Jonathan M; Frey, Eric C

    2010-01-01

    Quantum noise as well as anatomic and uptake variability in patient populations limits observer performance on a defect detection task in myocardial perfusion SPECT (MPS). The goal of this study was to investigate the relative importance of these two effects by varying acquisition time, which determines the count level, and assessing the change in performance on a myocardial perfusion (MP) defect detection task using both mathematical and human observers. We generated ten sets of projections of a simulated patient population with count levels ranging from 1/128 to around 15 times a typical clinical count level to simulate different levels of quantum noise. For the simulated population we modeled variations in patient, heart and defect size, heart orientation and shape, defect location, organ uptake ratio, etc. The projection data were reconstructed using the OS-EM algorithm with no compensation or with attenuation, detector response and scatter compensation (ADS). The images were then post-filtered and reoriented to generate short-axis slices. A channelized Hotelling observer (CHO) was applied to the short-axis images, and the area under the receiver operating characteristics (ROC) curve (AUC) was computed. For each noise level and reconstruction method, we optimized the number of iterations and cutoff frequencies of the Butterworth filter to maximize the AUC. Using the images obtained with the optimal iteration and cutoff frequency and ADS compensation, we performed human observer studies for four count levels to validate the CHO results. Both CHO and human observer studies demonstrated that observer performance was dependent on the relative magnitude of the quantum noise and the patient variation. When the count level was high, the patient variation dominated, and the AUC increased very slowly with changes in the count level for the same level of anatomic variability. When the count level was low, however, quantum noise dominated, and changes in the count level
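    For readers unfamiliar with the CHO, the sketch below shows the basic computation on synthetic data: images are reduced to a few channel outputs, a Hotelling template is built from the class means and the pooled channel covariance, and an AUC is computed from the resulting test statistics. The Gaussian channel profiles, noise model, and defect model are illustrative stand-ins, not the study's actual channels or simulated patient population.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n = 64                                        # image side length in pixels

    # Illustrative channels: four rotationally symmetric Gaussian profiles.
    yy, xx = np.mgrid[:n, :n] - n / 2
    r = np.hypot(xx, yy)
    U = np.stack([np.exp(-(r / s) ** 2).ravel() for s in (2, 4, 8, 16)], axis=1)

    # Synthetic image ensembles: white noise, plus a faint Gaussian 'defect'
    # added to the signal-present class (purely illustrative).
    defect = 0.3 * np.exp(-(r / 3) ** 2).ravel()
    absent = rng.standard_normal((200, n * n))
    present = rng.standard_normal((200, n * n)) + defect

    va, vp = absent @ U, present @ U              # channel outputs
    S = 0.5 * (np.cov(va, rowvar=False) + np.cov(vp, rowvar=False))
    w = np.linalg.solve(S, vp.mean(0) - va.mean(0))   # Hotelling template

    ta, tp = va @ w, vp @ w                       # observer test statistics
    # AUC from the Mann-Whitney statistic.
    auc = (tp[:, None] > ta[None, :]).mean() + 0.5 * (tp[:, None] == ta[None, :]).mean()
    print("CHO AUC:", auc)
    ```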

  9. Quality assurance in diagnostic radiology - assessing the fluoroscopic image quality

    International Nuclear Information System (INIS)

    Tabakov, S.

    1995-01-01

    The X-ray fluoroscopic image has a considerably lower resolution than the radiographic one. This requires a careful quality control aiming at optimal use of the fluoroscopic equipment. The basic procedures for image quality assessment of Image Intensifier/TV image are described. Test objects from Leeds University (UK) are used as prototypes. The results from examining 50 various fluoroscopic devices are shown. Their limiting spatial resolution varies between 0.8 lp/mm (at maximum II field size) and 2.24 lp/mm (at minimum field size). The mean value of the limiting spatial resolution for a 23 cm Image Intensifier is about 1.24 lp/mm. The mean limits of variation of the contrast/detail diagram for various fluoroscopic equipment are graphically expressed. 14 refs., 1 fig. (author)

  10. Image and Dose Simulation in Support of New Imaging Modalities

    International Nuclear Information System (INIS)

    Kuruvilla Verghese

    2002-01-01

    This report summarizes the highlights of the research performed under the 2-year NEER grant from the Department of Energy. The primary outcome of the work was a new Monte Carlo code, MCMIS-DS (Monte Carlo for Mammography Image Simulation including Differential Sampling). The code was written to generate simulated images and dose distributions for two different new digital x-ray imaging modalities, namely synchrotron imaging (SI) and a slot geometry digital mammography system called Fisher Senoscan. A differential sampling scheme was added to the code to generate multiple images that included variations in the parameters of the measurement system and the object in a single execution of the code. The code serves multiple purposes: (1) to answer questions regarding the contribution of scattered photons to images, (2) for use in design optimization studies, and (3) to perform up to second-order perturbation studies to assess the effects of variations in design parameters and/or physical parameters of the object (the breast) without having to re-run the code for each set of varied parameters. The accuracy and fidelity of the code were validated by a large variety of benchmark studies using published data and also using experimental results from mammography phantoms on both imaging modalities.

  11. Improvement of material decomposition and image quality in dual-energy radiography by reducing image noise

    International Nuclear Information System (INIS)

    Lee, D.; Choi, S.; Kim, H.; Kim, H.-J.; Kim, Y.-S.; Choi, S.; Lee, H.; Jo, B.D.; Jeon, P.-H.; Kim, H.; Kim, D.

    2016-01-01

    Although digital radiography has been widely used for screening human anatomical structures in clinical situations, it has several limitations due to anatomical overlapping. To resolve this problem, dual-energy imaging techniques, which provide a method for decomposing overlying anatomical structures, have been suggested as alternative imaging techniques. Previous studies have reported several dual-energy techniques, each resulting in different image qualities. In this study, we compared three dual-energy techniques: simple log subtraction (SLS), simple smoothing of a high-energy image (SSH), and anti-correlated noise reduction (ACNR) with respect to material thickness quantification and image quality. To evaluate dual-energy radiography, we conducted Monte Carlo simulation and experimental phantom studies. The Geant4 Application for Tomographic Emission (GATE) v 6.0 and tungsten anode spectral model using interpolation polynomials (TASMIP) codes were used for the simulation studies, and digital radiography and human chest phantoms were used for the experimental studies. The results of the simulation study showed improved image contrast-to-noise ratio (CNR) and coefficient of variation (COV) values and bone thickness estimation accuracy by applying the ACNR and SSH methods. Furthermore, the chest phantom images showed better image quality with the SSH and ACNR methods compared to the SLS method. In particular, the bone texture characteristics were well-described by applying the SSH and ACNR methods. In conclusion, the SSH and ACNR methods improved the accuracy of material quantification and image quality in dual-energy radiography compared to SLS. Our results can contribute to better diagnostic capabilities of dual-energy images and accurate material quantification in various clinical situations.
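    To make the simplest of the three techniques concrete, the sketch below implements simple log subtraction (SLS): the high- and low-energy images are log-transformed and subtracted with a weighting factor chosen to cancel the soft-tissue signal, ideally leaving a bone-only image. The attenuation coefficients, phantom, and weighting factor are illustrative assumptions, not values from the study.

    ```python
    import numpy as np

    def simple_log_subtraction(i_low, i_high, w):
        """Dual-energy simple log subtraction: suppress tissue, keep bone."""
        return np.log(i_high) - w * np.log(i_low)

    # Tiny illustrative phantom: tissue everywhere, a bone insert in the middle.
    tissue = np.ones((64, 64)) * 5.0                      # tissue thickness [cm]
    bone = np.zeros((64, 64))
    bone[24:40, 24:40] = 1.0                              # bone thickness [cm]

    # Illustrative effective linear attenuation coefficients [1/cm].
    mu_t_low, mu_t_high = 0.25, 0.20
    mu_b_low, mu_b_high = 0.60, 0.35

    i_low = np.exp(-(mu_t_low * tissue + mu_b_low * bone))
    i_high = np.exp(-(mu_t_high * tissue + mu_b_high * bone))

    w = mu_t_high / mu_t_low          # cancels the tissue term exactly in this model
    de = simple_log_subtraction(i_low, i_high, w)
    print("tissue region:", de[0, 0], " bone region:", de[32, 32])
    ```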

  12. Retinal image quality assessment based on image clarity and content

    Science.gov (United States)

    Abdel-Hamid, Lamiaa; El-Rafei, Ahmed; El-Ramly, Salwa; Michelson, Georg; Hornegger, Joachim

    2016-09-01

    Retinal image quality assessment (RIQA) is an essential step in automated screening systems to avoid misdiagnosis caused by processing poor quality retinal images. A no-reference transform-based RIQA algorithm is introduced that assesses images based on five clarity and content quality issues: sharpness, illumination, homogeneity, field definition, and content. Transform-based RIQA algorithms have the advantage of considering retinal structures while being computationally inexpensive. Wavelet-based features are proposed to evaluate the sharpness and overall illumination of the images. A retinal saturation channel is designed and used along with wavelet-based features for homogeneity assessment. The presented sharpness and illumination features are utilized to assure adequate field definition, whereas color information is used to exclude nonretinal images. Several publicly available datasets of varying quality grades are utilized to evaluate the feature sets resulting in area under the receiver operating characteristic curve above 0.99 for each of the individual feature sets. The overall quality is assessed by a classifier that uses the collective features as an input vector. The classification results show superior performance of the algorithm in comparison to other methods from literature. Moreover, the algorithm addresses efficiently and comprehensively various quality issues and is suitable for automatic screening systems.
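    As an illustration of the wavelet-based sharpness features described here, the sketch below decomposes an image with a 2-D discrete wavelet transform and uses the mean energy of the detail subbands as a simple sharpness score. The wavelet choice, decomposition level, and test data are assumptions for illustration rather than the paper's exact feature definitions.

    ```python
    import numpy as np
    import pywt
    from scipy import ndimage

    def wavelet_sharpness(image, wavelet="db2", level=3):
        """Sharpness score: mean detail-subband energy of a 2-D DWT."""
        coeffs = pywt.wavedec2(image.astype(float), wavelet, level=level)
        detail_energy = [np.mean(np.square(band))
                         for detail in coeffs[1:]          # skip approximation
                         for band in detail]               # (cH, cV, cD) per level
        return float(np.mean(detail_energy))

    rng = np.random.default_rng(2)
    fundus_like = rng.random((256, 256))      # synthetic stand-in for a retinal image
    blurred = ndimage.gaussian_filter(fundus_like, sigma=3.0)

    print("sharp  :", wavelet_sharpness(fundus_like))
    print("blurred:", wavelet_sharpness(blurred))
    ```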

  13. Quality Improvement With Discrete Event Simulation: A Primer for Radiologists.

    Science.gov (United States)

    Booker, Michael T; O'Connell, Ryan J; Desai, Bhushan; Duddalwar, Vinay A

    2016-04-01

    The application of simulation software in health care has transformed quality and process improvement. Specifically, software based on discrete-event simulation (DES) has shown the ability to improve radiology workflows and systems. Nevertheless, despite the successful application of DES in the medical literature, the power and value of simulation remains underutilized. For this reason, the basics of DES modeling are introduced, with specific attention to medical imaging. In an effort to provide readers with the tools necessary to begin their own DES analyses, the practical steps of choosing a software package and building a basic radiology model are discussed. In addition, three radiology system examples are presented, with accompanying DES models that assist in analysis and decision making. Through these simulations, we provide readers with an understanding of the theory, requirements, and benefits of implementing DES in their own radiology practices. Copyright © 2016 American College of Radiology. All rights reserved.
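    For readers who want to see what a DES model actually looks like in code, here is a minimal sketch of a single-scanner radiology queue using the open-source SimPy library; the arrival and scan-time distributions are invented for illustration and are not taken from the article, which discusses choosing among dedicated DES packages.

    ```python
    import random
    import simpy

    SIM_MINUTES = 8 * 60          # one working day
    MEAN_ARRIVAL = 18.0           # minutes between patient arrivals (assumed)
    MEAN_SCAN = 15.0              # minutes per scan (assumed)

    waits = []

    def patient(env, scanner):
        arrival = env.now
        with scanner.request() as req:          # queue for the single scanner
            yield req
            waits.append(env.now - arrival)     # time spent waiting
            yield env.timeout(random.expovariate(1.0 / MEAN_SCAN))

    def arrivals(env, scanner):
        while True:
            yield env.timeout(random.expovariate(1.0 / MEAN_ARRIVAL))
            env.process(patient(env, scanner))

    random.seed(42)
    env = simpy.Environment()
    scanner = simpy.Resource(env, capacity=1)
    env.process(arrivals(env, scanner))
    env.run(until=SIM_MINUTES)

    print(f"patients who reached the scanner: {len(waits)}")
    print(f"mean wait: {sum(waits) / len(waits):.1f} min")
    ```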

  14. Monte Carlo simulation of PET images for injection dose optimization

    Czech Academy of Sciences Publication Activity Database

    Boldyš, Jiří; Dvořák, Jiří; Skopalová, M.; Bělohlávek, O.

    2013-01-01

    Vol. 29, No. 9 (2013), pp. 988-999 ISSN 2040-7939 R&D Projects: GA MŠk 1M0572 Institutional support: RVO:67985556 Keywords: positron emission tomography * Monte Carlo simulation * biological system modeling * image quality Subject RIV: FD - Oncology; Hematology Impact factor: 1.542, year: 2013 http://library.utia.cas.cz/separaty/2013/ZOI/boldys-0397175.pdf

  15. Quality control of a virtual simulation installation. SFPM report nr 25, August 2009

    International Nuclear Information System (INIS)

    Foulquier, Jean-Noel; Allieres, Norbert; Batalla, Alain; Beaumont, Stephane; Di Bartolo, Cristelle; Khodri, Mustapha; Tauziede, Jean-Marc; Dedieu, Veronique; Bramoule, Celine; Caselles, Olivier; Lacaze, Brigitte; Mazurier, Jocelyne

    2009-08-01

    This report is a notably comprehensive guide to the different controls which can be performed on the devices present in a virtual simulation installation. After a brief historical overview, the authors present the definition and organisation of virtual simulation, the different components of a virtual simulation installation, and the different steps of the virtual simulation process. They then address the quality control of the scanner-simulator (linearity and periodicity of controls, patient table or support, tolerance levels and periodicity of controls of this support). They also address tracking systems (quality control of laser systems), the quality control of virtual simulation tools (iso-centre contouring and positioning, ballistic tools, tolerance levels, control periodicity), the quality control of data or object transfer (elements to be analysed during an image transfer, tolerance levels and control periodicity), the imager quality control, and test phantoms (physical and digital phantoms)

  16. Medical Image Registration and Surgery Simulation

    DEFF Research Database (Denmark)

    Bro-Nielsen, Morten

    1996-01-01

    This thesis explores the application of physical models in medical image registration and surgery simulation. The continuum models of elasticity and viscous fluids are described in detail, and this knowledge is used as a basis for most of the methods described here. Real-time deformable models ... and the use of selective matrix vector multiplication. Fluid medical image registration: A new and faster algorithm for non-rigid registration using viscous fluid models is presented. This algorithm replaces the core part of the original algorithm with multi-resolution convolution using a new filter, which ... growth is also presented. Using medical knowledge about the growth processes of the mandibular bone, a registration algorithm for time sequence images of the mandible is developed. Since this registration algorithm models the actual development of the mandible, it is possible to simulate the development...

  17. Image quality evaluation of full reference algorithm

    Science.gov (United States)

    He, Nannan; Xie, Kai; Li, Tong; Ye, Yushan

    2018-03-01

    Image quality evaluation is a classic research topic; the goal is to design algorithms whose evaluation values are consistent with subjective perception. This paper mainly introduces several typical full-reference objective evaluation methods: Mean Squared Error (MSE), Peak Signal-to-Noise Ratio (PSNR), Structural Similarity Image Metric (SSIM) and Feature Similarity (FSIM). The different evaluation methods are tested in Matlab, and their advantages and disadvantages are obtained by analysing and comparing them. MSE and PSNR are simple, but they do not take human visual system (HVS) characteristics into account in image quality evaluation, so their evaluation results are not ideal. SSIM has good correlation with subjective scores and is simple to compute because it incorporates the human visual response into image quality evaluation; however, the SSIM method is based on a hypothesis, so its evaluation results are limited. The FSIM method can be used to test both grayscale and color images, and its results are better. Experimental results show that the new image quality evaluation algorithm based on FSIM is more accurate.
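    Since MSE and PSNR are defined by short closed-form expressions, a minimal implementation is easy to give; the sketch below computes both for an 8-bit image and a noisy copy (SSIM and FSIM need windowed statistics and are better left to dedicated libraries such as scikit-image). The test data are synthetic and the sketch is in Python rather than the Matlab used in the paper.

    ```python
    import numpy as np

    def mse(ref, test):
        return float(np.mean((ref.astype(float) - test.astype(float)) ** 2))

    def psnr(ref, test, max_val=255.0):
        m = mse(ref, test)
        return float("inf") if m == 0 else 10.0 * np.log10(max_val ** 2 / m)

    rng = np.random.default_rng(3)
    reference = rng.integers(0, 256, size=(128, 128)).astype(np.uint8)
    noisy = np.clip(reference + rng.normal(0, 10, reference.shape), 0, 255).astype(np.uint8)

    print("MSE :", mse(reference, noisy))
    print("PSNR:", psnr(reference, noisy), "dB")
    ```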

  18. Quality assurance for electronic portal imaging devices

    International Nuclear Information System (INIS)

    Shalev, S.; Rajapakshe, R.; Gluhchev, G.; Luchka, K.

    1997-01-01

    Electronic portal imaging devices (EPIDs) are assuming an ever-increasing role in the verification of radiation treatment accuracy. They are used both in a passive capacity, for the determination of field displacement distributions ('setup errors'), and also in an active role whereby the patient setup is corrected on the basis of electronic portal images. In spite of their potential impact on the precision of patient treatment, there are few quality assurance procedures available, and most of the EPIDs in clinical use are subject, at best, to only perfunctory quality assurance. The goals of this work are (a) to develop an objective and reproducible test for EPID image quality on the factory floor and during installation of the EPID on site; (b) to provide the user with a simple and accurate tool for acceptance, commissioning, and routine quality control; and (c) to initiate regional, national and international collaboration in the implementation of standardized, objective, and automated quality assurance procedures. To this end we have developed an automated test in which a simple test object is imaged daily, and the spatial and contrast resolution of the EPID are automatically evaluated in terms of 'acceptable', 'warning' and 'stop' criteria. Our experience over two years shows the test to be highly sensitive, reproducible, and inexpensive in time and effort. Inter-institutional trials are under way in Canada, US and Europe which indicate large variations in EPID image quality from one EPID to another, and from one center to another. We expect the new standardized quality assurance procedure to lead to improved and consistent image quality, increased operator acceptance of the technology, and agreement on uniform standards by equipment suppliers and health care agencies. (author)

  19. Measures of Image Quality. Chapter 4

    Energy Technology Data Exchange (ETDEWEB)

    Maidment, A. D.A. [University of Pennsylvania, Philadelphia (United States)

    2014-09-15

    A medical image is a pictorial representation of a measurement of an object or function of the body. This information can be acquired in one to three spatial dimensions. It can be static or dynamic, meaning that it can also be measured as a function of time. Certain fundamental properties can be associated with all of these data. Firstly, no image can exactly represent the object or function; at best, one has a measurement with an associated error equal to the difference between the true object and the measured image. Secondly, no two images will be identical, even if acquired with the same imaging system of the same anatomic region; this variability is generally referred to as noise. There are many different ways to acquire medical image data; the various mechanisms of acquisition are described in detail in the subsequent chapters. However, regardless of the method of image formation, one must be able to judge the fidelity of the image in an attempt to answer the question “How accurately does the image portray the body or the bodily function?” This judgement falls under the rubric of ‘image quality’. In this chapter, methods of quantifying image quality are described.

  20. Quality Control of Mega Voltage Portal Imaging System

    International Nuclear Information System (INIS)

    Diklic, A.; Dundara Debeljuh, D.; Jurkovic, S.; Smilovic Radojcic, D.; Svabic Kolacio; Kasabasic, M.; Faj, D.

    2013-01-01

    The Electronic Portal Imaging Device (EPID) is a system used to verify either the correct positioning of the patient during radiotherapy treatment or the linear accelerator beam parameters. The correct position of the patient corresponds to the position at which the patient was scanned at the CT simulator and according to which the therapy plan was made and optimized. Regarding this, besides the advanced treatment planning system and optimized treatment planning techniques, the day-to-day reproduction of simulated conditions is of great importance for the treatment outcome. Therefore, to verify the patient set-up portal imaging should be applied prior to the first treatment session and repeated according to treatment prescriptions during the treatment. In order to achieve full functionality and precision of the EPID, it must be included in radiotherapy Quality Control (QC) programme. The QC of the Mega Voltage portal imaging system was separated in two parts. In the first, the QC of the detector parameters should be performed. For this purpose, the FC2 and QC3 phantoms should be used, along with the Portal Image Processing System program (PIPSpro) package for data analysis. The second part of the QC of the linear accelerator's portal imaging system should include the QC of the CBCT. In this part a set of predefined manufacturer's tests using two different phantoms, one for the geometry calibration and the other for the image quality evaluation, should be performed. Also, the treatment conditions should be simulated using anthropomorphic phantoms and dose distributions for particular EPID protocols should be measured. Procedures for quality control of the portal imaging system developed and implemented at University Hospital Rijeka are presented in this paper.(author)

  1. Subjective matters: from image quality to image psychology

    Science.gov (United States)

    Fedorovskaya, Elena A.; De Ridder, Huib

    2013-03-01

    From the advent of digital imaging through several decades of studies, the human vision research community systematically focused on perceived image quality and digital artifacts due to resolution, compression, gamma, dynamic range, capture and reproduction noise, blur, etc., to help overcome existing technological challenges and shortcomings. Technological advances, which have made digital images and digital multimedia nearly flawless in quality and ubiquitous and pervasive in usage, provide us with the exciting but at the same time demanding possibility of turning to the domain of human experience, including higher psychological functions such as cognition, emotion, awareness, social interaction, consciousness and Self. In this paper we outline the evolution of human-centered multidisciplinary studies related to imaging and propose steps and potential foci for future research.

  2. Quality criteria for cardiac images: An update

    International Nuclear Information System (INIS)

    Bernardi, G.; Bar, O.; Jezewski, T.; Vano, E.; Maccia, C.; Trianni, A.; Padovani, R.

    2008-01-01

    The DIMOND II and III Cardiology Groups have agreed on quality criteria for cardiac images and developed a scoring system, to provide a tool to test quality of coronary angiograms, which was demonstrated to be of value in clinical practice. In the last years, digital flat panel technology has been introduced in cardiac angiographic systems and the radiological technique may have been influenced by the better performance of these new detectors. This advance in digital imaging, together with the lesson learned from previous studies, warranted the revision of the quality criteria for cardiac angiographic images as formerly defined. DIMOND criteria were reassessed to allow a simpler evaluation of angiograms. Clinical criteria were simplified and separated from technical criteria. Furthermore, the characteristics of an optimised angiographic technique have been outlined. (authors)

  3. Image Quality in Vascular Radiology

    International Nuclear Information System (INIS)

    Vanhavere, F.; Struelens, L.

    2005-01-01

    In vascular radiology, radiologists use the radiological image to diagnose or treat a specific vascular structure. From the literature, we know that the related doses are high and that large dose variability exists between different hospitals. The application of the optimization principle is therefore necessary and is required by the new legislation. So far, very little fieldwork has been performed and no practical instructions are available to do the necessary work. It is indisputable that obtaining quantitative data is of great interest for optimization purposes. In order to gain insight into these doses and the possible measures for dose reduction, we performed a comparative study in 7 hospitals. Patient doses will be measured and calculated for specific procedures in vascular radiology and evaluated against their most influential parameters. For optimization purposes, a protocol for dose audits will be set up. From the results and conclusions of this study, experimentally based guidelines will be proposed in order to improve clinical practice in vascular radiology.

  4. Software for Simulation of Hyperspectral Images

    Science.gov (United States)

    Richtsmeier, Steven C.; Singer-Berk, Alexander; Bernstein, Lawrence S.

    2002-01-01

    A package of software generates simulated hyperspectral images for use in validating algorithms that generate estimates of Earth-surface spectral reflectance from hyperspectral images acquired by airborne and spaceborne instruments. This software is based on a direct simulation Monte Carlo approach for modeling three-dimensional atmospheric radiative transport as well as surfaces characterized by spatially inhomogeneous bidirectional reflectance distribution functions. In this approach, 'ground truth' is accurately known through input specification of surface and atmospheric properties, and it is practical to consider wide variations of these properties. The software can treat both land and ocean surfaces and the effects of finite clouds with surface shadowing. The spectral/spatial data cubes computed by use of this software can serve both as a substitute for and a supplement to field validation data.

  5. Image Acquisition and Quality in Digital Radiography.

    Science.gov (United States)

    Alexander, Shannon

    2016-09-01

    Medical imaging has undergone dramatic changes and technological breakthroughs since the introduction of digital radiography. This article presents information on the development of digital radiography and types of digital radiography systems. Aspects of image quality and radiation exposure control are highlighted as well. In addition, the article includes related workplace changes and medicolegal considerations in the digital radiography environment. ©2016 American Society of Radiologic Technologists.

  6. Ultrasound Image Quality Assessment: A framework for evaluation of clinical image quality

    DEFF Research Database (Denmark)

    Hemmsen, Martin Christian; Pedersen, Mads Møller; Nikolov, Svetoslav Ivanov

    2010-01-01

    Improvement of ultrasound images should be guided by their diagnostic value. Evaluation of clinical image quality is generally performed subjectively, because objective criteria have not yet been fully developed and accepted for the evaluation of clinical image quality. Based on recommendation 50...... information, which is fast enough to get sufficient number of scans under realistic operating conditions, so that statistical evaluation is valid and reliable....

  7. Comparison of dose and image quality of a Flat-panel detector and an image intensifier

    International Nuclear Information System (INIS)

    Lazzaro, M.; Friedrich, B.Q.; Luz, R.M. da; Silva, A.M.M. da

    2016-01-01

    With the development of new technologies, new X-ray image conversion methods have emerged, such as flat-panel detectors. The aim of this work is the comparison of entrance surface air kerma (ESAK) and image quality between an image intensifier type of detector (A) and a flat panel (B). The ESAK was obtained by placing an ionization chamber under PMMA phantoms of 10, 20 and 30 cm, and image quality was assessed using the TOR 18FG phantom. The ESAK for equipment A is higher than for equipment B. The high-contrast resolution is better for equipment A for all phantom thicknesses. Equipment A has a better low-contrast detection threshold for thicknesses of 10 and 20 cm, and a worse performance for 30 cm. It is concluded that equipment B has a smaller ESAK and, despite having lower resolution in almost all cases, provides image quality appropriate for diagnosis. (author)

  8. Subjective evaluation of compressed image quality

    Science.gov (United States)

    Lee, Heesub; Rowberg, Alan H.; Frank, Mark S.; Choi, Hyung-Sik; Kim, Yongmin

    1992-05-01

    Lossy data compression generates distortion or error in the reconstructed image, and the distortion becomes visible as the compression ratio increases. Even at the same compression ratio, the distortion appears differently depending on the compression method used. Because of the nonlinearity of the human visual system and of lossy data compression methods, we subjectively evaluated the quality of medical images compressed with two different methods: an intraframe and an interframe coding algorithm. The raw evaluation data were analyzed statistically to measure interrater reliability and the reliability of an individual reader. Analysis of variance was also used to identify which compression method is statistically better, and from what compression ratio the quality of a compressed image is rated as poorer than that of the original. Nine X-ray CT head images from three patients were used as test cases. Six radiologists participated in reading the 99 images (some were duplicates) at four compression levels: original, 5:1, 10:1, and 15:1. The six readers agreed more than by chance alone and their agreement was statistically significant, but there were large variations among readers as well as within a reader. The displacement-estimated interframe coding algorithm is significantly better in quality than the 2-D block DCT at significance level 0.05. Also, the 10:1 compressed images produced with the interframe coding algorithm do not show any significant differences from the original at level 0.05.

  9. REQUIREMENTS FOR IMAGE QUALITY OF EMERGENCY SPACECRAFTS

    Directory of Open Access Journals (Sweden)

    A. I. Altukhov

    2015-05-01

    The paper deals with a method for forming quality requirements for images of emergency spacecraft. The images are obtained by means of remote sensing, in the visible range of the electromagnetic spectrum, of spacecraft deployed in near-Earth orbit. The method is based on jointly taking into account the conditions of the space survey, the characteristics of the surveillance equipment, the main design features of the observed spacecraft, and the orbital inspection tasks. Method. The quality score is the predicted linear resolution of the image, which makes it possible to form a complete view of the pictorial properties of the space image obtained by the electro-optical system of the observing satellite. Requirements on the numerical value of this indicator are formulated on the basis of the properties of the remote sensing system that forms images under outer-space conditions and the properties of the observed emergency spacecraft: dimensions, satellite platform construction, and on-board equipment placement. To implement the method, the authors have developed a predictive model of the linear-resolution requirements for images of emergency spacecraft, making it possible to select the intervals for space imaging and to obtain the satellite images required for reliable interpretation. Main results. To verify the functionality of the proposed model, we calculated the numerical values of the linear image resolution that ensure the successful determination of gross structural damage to spacecraft and the identification of changes in their spatial orientation. Dimensions and geometric primitives corresponding to the shapes of the inspected spacecraft "Resurs-P", "Canopus-B" and "Electro-L" were used as input data. Numerical values of the linear image resolution have been obtained that ensure the successful determination of gross structural damage to spacecraft.

  10. Image simulation for automatic license plate recognition

    Science.gov (United States)

    Bala, Raja; Zhao, Yonghui; Burry, Aaron; Kozitsky, Vladimir; Fillion, Claude; Saunders, Craig; Rodríguez-Serrano, José

    2012-01-01

    Automatic license plate recognition (ALPR) is an important capability for traffic surveillance applications, including toll monitoring and detection of different types of traffic violations. ALPR is a multi-stage process comprising plate localization, character segmentation, optical character recognition (OCR), and identification of originating jurisdiction (i.e. state or province). Training of an ALPR system for a new jurisdiction typically involves gathering vast amounts of license plate images and associated ground truth data, followed by iterative tuning and optimization of the ALPR algorithms. The substantial time and effort required to train and optimize the ALPR system can result in excessive operational cost and overhead. In this paper we propose a framework to create an artificial set of license plate images for accelerated training and optimization of ALPR algorithms. The framework comprises two steps: the synthesis of license plate images according to the design and layout for a jurisdiction of interest; and the modeling of imaging transformations and distortions typically encountered in the image capture process. Distortion parameters are estimated by measurements of real plate images. The simulation methodology is successfully demonstrated for training of OCR.
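
    The second step of such a framework, modeling capture distortions, can be sketched with a few standard operations. The snippet below is a minimal illustration using numpy and scipy rather than the authors' pipeline; the blur, resampling, noise and gamma parameters are illustrative placeholders, not values estimated from real plate images.

```python
import numpy as np
from scipy import ndimage

def degrade_plate(plate, blur_sigma=1.5, downscale=0.5, noise_std=8.0,
                  gamma=1.2, rng=None):
    """Apply a simple camera-capture degradation chain to a synthetic
    license-plate image (2-D float array in [0, 255]).

    The chain models optical blur, limited sensor resolution, sensor noise
    and a nonlinear tone response.  Parameter values are illustrative.
    """
    rng = np.random.default_rng(rng)
    img = plate.astype(float)

    # 1. Optical blur (lens and motion approximated by a Gaussian PSF).
    img = ndimage.gaussian_filter(img, blur_sigma)

    # 2. Limited sensor resolution: downsample, then upsample back.
    img = ndimage.zoom(img, downscale, order=1)
    img = ndimage.zoom(img, 1.0 / downscale, order=1)

    # 3. Additive Gaussian noise as a stand-in for photon/readout noise.
    img = img + rng.normal(0.0, noise_std, img.shape)

    # 4. Nonlinear tone response (simple gamma curve).
    img = 255.0 * (np.clip(img, 0, 255) / 255.0) ** gamma
    return img

# Usage: degrade a synthetic white-on-black "plate" placeholder.
synthetic = np.zeros((60, 180))
synthetic[20:40, 20:160] = 255.0          # stands in for rendered characters
captured = degrade_plate(synthetic, blur_sigma=2.0, noise_std=12.0)
```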

  11. Saliency image of feature building for image quality assessment

    Science.gov (United States)

    Ju, Xinuo; Sun, Jiyin; Wang, Peng

    2011-11-01

    The purpose and method of image quality assessment are quite different for automatic target recognition (ATR) and for traditional applications. Local invariant feature detectors, mainly including corner detectors, blob detectors and region detectors, are widely applied for ATR. In this paper, a feature-saliency model is proposed to evaluate the feasibility of ATR. The first step consists of computing the first-order derivatives in the horizontal and vertical orientations and computing DoG maps at different scales. Next, feature saliency images are built from the auto-correlation matrix at each scale, and the saliency images from the different scales are then merged. Experiments were performed on a large test set, including infrared images and optical images, and the results showed that the salient regions computed by this model are consistent with the real feature regions computed by most local invariant feature extraction algorithms.
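
    A generic version of this idea, multi-scale saliency derived from first-order derivatives and the local auto-correlation (structure-tensor) matrix, can be sketched as follows. This is a Harris-style formulation assumed for illustration, not the authors' exact model; the scales and the constant k are arbitrary choices.

```python
import numpy as np
from scipy import ndimage

def multiscale_feature_saliency(image, scales=(1.0, 2.0, 4.0), k=0.04):
    """Build a feature-saliency map from the local auto-correlation
    (structure-tensor) matrix at several scales and merge the scales."""
    img = image.astype(float)
    saliency = np.zeros_like(img)
    for sigma in scales:
        smoothed = ndimage.gaussian_filter(img, sigma)
        # First-order derivatives in horizontal and vertical orientation.
        ix = ndimage.sobel(smoothed, axis=1)
        iy = ndimage.sobel(smoothed, axis=0)
        # Auto-correlation (structure-tensor) components, locally averaged.
        ixx = ndimage.gaussian_filter(ix * ix, sigma)
        iyy = ndimage.gaussian_filter(iy * iy, sigma)
        ixy = ndimage.gaussian_filter(ix * iy, sigma)
        # Harris-style cornerness: det(M) - k * trace(M)^2
        response = (ixx * iyy - ixy ** 2) - k * (ixx + iyy) ** 2
        # Merge scales by keeping the strongest normalized response.
        r = response - response.min()
        if r.max() > 0:
            r = r / r.max()
        saliency = np.maximum(saliency, r)
    return saliency
```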

  12. Improving image quality of parallel phase-shifting digital holography

    International Nuclear Information System (INIS)

    Awatsuji, Yasuhiro; Tahara, Tatsuki; Kaneko, Atsushi; Koyama, Takamasa; Nishio, Kenzo; Ura, Shogo; Kubota, Toshihiro; Matoba, Osamu

    2008-01-01

    The authors propose parallel two-step phase-shifting digital holography to improve the image quality of parallel phase-shifting digital holography. The proposed technique increases the effective number of hologram pixels twofold in comparison to the conventional parallel four-step technique. This increase in the number of pixels makes it possible to improve the image quality of the reconstructed image in parallel phase-shifting digital holography. A numerical simulation and a preliminary experiment of the proposed technique were conducted and the effectiveness of the technique was confirmed. The proposed technique is more practical than conventional parallel phase-shifting digital holography, because the composition of a digital holographic system based on the proposed technique is simpler.

  13. Image quality in digital radiographic systems

    Directory of Open Access Journals (Sweden)

    Almeida Solange Maria de

    2003-01-01

    The aim of the present study was to evaluate the image quality of four direct digital radiographic systems. Radiographs were made of the maxillary central incisor and mandibular left molar regions of a dry skull, and of an aluminum step-wedge. The X-ray generator operated at 10 mA, 60 and 70 kVp, and images were acquired with 3, 5, 8, 12, 24 and 48 exposure pulses. Six well-trained observers classified the images by means of scores from 1 to 3. Collected data were submitted to nonparametric statistical analysis using Fisher's exact test. Statistical analysis showed significant differences (p<0.01) in image quality among the four systems. Based on the results, it was possible to conclude that: (1) all of the digital systems performed well in producing images acceptable for diagnosis, if the exposures of the step-wedge and the maxillary central incisor region were made at 5 pulses, and at 8 pulses for the mandibular left molar region, selecting 60 or 70 kVp; (2) higher percentages of acceptable images were obtained with the administration of lower radiation doses in CCD (charge-coupled device) sensors; (3) the storage phosphor systems produced acceptable images over a large range of exposure settings, including low, intermediate and high radiation doses.

  14. Medical image archive node simulation and architecture

    Science.gov (United States)

    Chiang, Ted T.; Tang, Yau-Kuo

    1996-05-01

    It is a well known fact that managed care and new treatment technologies are revolutionizing the health care provider world. Community Health Information Network and Computer-based Patient Record projects are underway throughout the United States. More and more hospitals are installing digital, `filmless' radiology (and other imagery) systems. They generate a staggering amount of information around the clock. For example, a typical 500-bed hospital might accumulate more than 5 terabytes of image data in a period of 30 years for conventional x-ray images and digital images such as Magnetic Resonance Imaging and Computer Tomography images. With several hospitals contributing to the archive, the storage required will be in the hundreds of terabytes. Systems for reliable, secure, and inexpensive storage and retrieval of digital medical information do not exist today. In this paper, we present a Medical Image Archive and Distribution Service (MIADS) concept. MIADS is a system shared by individual and community hospitals, laboratories, and doctors' offices that need to store and retrieve medical images. Due to the large volume and complexity of the data, as well as the diversified user access requirement, implementation of the MIADS will be a complex procedure. One of the key challenges to implementing a MIADS is to select a cost-effective, scalable system architecture to meet the ingest/retrieval performance requirements. We have performed an in-depth system engineering study, and developed a sophisticated simulation model to address this key challenge. This paper describes the overall system architecture based on our system engineering study and simulation results. In particular, we will emphasize system scalability and upgradability issues. Furthermore, we will discuss our simulation results in detail. The simulations study the ingest/retrieval performance requirements based on different system configurations and architectures for variables such as workload, tape

  15. Large-field image intensifiers versus conventional chest radiography: ROC study with simulated interstitial disease

    International Nuclear Information System (INIS)

    Winter, L.H.L.; Chakraborty, D.P.; Waes, P.F.G.M.

    1988-01-01

    Two image intensifier tubes have recently been introduced whose large imaging area makes them suitable for chest imaging (Philips Pulmodiagnost TLX slit II and Siemens TX 57 large entrance field II). Both modalities present a 10 x 10-cm hard-copy image to the radiologist. A receiver operating characteristic (ROC) study with simulated interstitial disease was performed to compare the image quality of these image intensifiers with conventional chest images. The relative ranking in terms of decreasing ROC area was Siemens, conventional, and Philips. Compared with conventional imaging, none of the differences in ROC curve area was statistically significant at the 5% level.

  16. Image quality comparison between single energy and dual energy CT protocols for hepatic imaging

    International Nuclear Information System (INIS)

    Yao, Yuan; Pelc, Norbert J.; Ng, Joshua M.; Megibow, Alec J.

    2016-01-01

    Purpose: Multi-detector computed tomography (MDCT) enables volumetric scans in a single breath hold and is clinically useful for hepatic imaging. For simple tasks, conventional single energy (SE) computed tomography (CT) images acquired at the optimal tube potential are known to have better quality than dual energy (DE) blended images. However, liver imaging is complex and often requires imaging of both structures containing iodinated contrast media, where atomic number differences are the primary contrast mechanism, and other structures, where density differences are the primary contrast mechanism. Hence it is conceivable that the broad spectrum used in a dual energy acquisition may be an advantage. In this work we are interested in comparing these two imaging strategies at equal-dose and more complex settings. Methods: We developed numerical anthropomorphic phantoms to mimic realistic clinical CT scans for medium size and large size patients. MDCT images based on the defined phantoms were simulated using various SE and DE protocols at pre- and post-contrast stages. For SE CT, images from 60 kVp through 140 with 10 kVp steps were considered; for DE CT, both 80/140 and 100/140 kVp scans were simulated and linearly blended at the optimal weights. To make a fair comparison, the mAs of each scan was adjusted to match the reference radiation dose (120 kVp, 200 mAs for medium size patients and 140 kVp, 400 mAs for large size patients). Contrast-to-noise ratio (CNR) of liver against other soft tissues was used to evaluate and compare the SE and DE protocols, and multiple pre- and post-contrasted liver-tissue pairs were used to define a composite CNR. To help validate the simulation results, we conducted a small clinical study. Eighty-five 120 kVp images and 81 blended 80/140 kVp images were collected and compared through both quantitative image quality analysis and an observer study. Results: In the simulation study, we found that the CNR of pre-contrast SE image mostly
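
    The figure of merit described above can be reproduced in a few lines. The sketch below assumes that CNR is computed from ROI means and a pooled ROI standard deviation, and that several liver/tissue pairs are combined by a root-mean-square rule; both conventions are illustrative assumptions rather than the exact pooling used in the study.

```python
import numpy as np

def cnr(roi_a, roi_b):
    """Contrast-to-noise ratio between two regions of interest (e.g. liver
    and an adjacent soft tissue) drawn on the same CT image."""
    noise = np.sqrt(0.5 * (np.var(roi_a) + np.var(roi_b)))
    return abs(np.mean(roi_a) - np.mean(roi_b)) / noise

def composite_cnr(roi_pairs):
    """Combine several liver/tissue CNR values (pre- and post-contrast) into
    a single figure of merit (root-mean-square combination, an assumption)."""
    values = np.array([cnr(a, b) for a, b in roi_pairs])
    return np.sqrt(np.mean(values ** 2))

# Usage with synthetic ROIs (HU values): liver vs. muscle, pre- and post-contrast.
rng = np.random.default_rng(0)
pre  = (rng.normal(55, 12, 500), rng.normal(45, 12, 500))
post = (rng.normal(110, 14, 500), rng.normal(60, 14, 500))
print(composite_cnr([pre, post]))
```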

  17. Quality control: comparison of image quality with screen film system and digital mammography CR

    International Nuclear Information System (INIS)

    Alvarenga, Frederico L.; Nogueira, Maria do Socorro

    2008-01-01

    The mammography screen-film system comprises the processing chemicals, the film development process and the equipment, and it has been progressively replaced by digital technologies: Full-Field Digital Mammography (FFDM), Computed Radiography (CR) mammography and hardcopy printing. This new way of acquiring medical images has improved the radiology department; however, efficient means of evaluating the quality parameters are needed. The adaptation of the existing equipment and of the procedures adopted for the examination should be taken into account, as well as the adaptation to the new mammography films; the radiologist's view box is also part of the quality control program. This program aims at obtaining radiographs of good quality that provide more information for the diagnosis and decrease the patient dose. For the evaluation of image quality, this article focuses on the differences between the images of a PMMA (polymethyl methacrylate) mammography phantom acquired with a CR mammography system and with a screen-film system. The tests were carried out on the same mammography unit with Automatic Exposure Control, using a tube voltage of 28 kV for both systems. The quality tests evaluated the spatial resolution, the phantom's own structures, artifacts, optical density and contrast, with conventional and laser films for the mammography system. The installation where the tests were performed has a quality control program. The evaluation was based on the standard developed by the competent body of the State of Minas Gerais. In this study, it was verified that the Phantom Mama used by the Brazilian School of Radiology for conventional mammography did not give a satisfactory result for spatial resolution in the digital CR mammography system. The final aim of this work is to obtain parameters that characterize the reference phantom image quality in an objective way. These parameters will be used to compare

  18. Fourier transform based scalable image quality measure.

    Science.gov (United States)

    Narwaria, Manish; Lin, Weisi; McLoughlin, Ian; Emmanuel, Sabu; Chia, Liang-Tien

    2012-08-01

    We present a new image quality assessment (IQA) algorithm based on the phase and magnitude of the 2D (two-dimensional) Discrete Fourier Transform (DFT). The basic idea is to compare the phase and magnitude of the reference and distorted images to compute the quality score. However, it is well known that the Human Visual System's (HVS) sensitivity to different frequency components is not the same. We accommodate this fact via a simple yet effective strategy of nonuniform binning of the frequency components. This process also leads to a reduced-space representation of the image, thereby enabling the reduced-reference (RR) prospects of the proposed scheme. We employ linear regression to integrate the effects of the changes in phase and magnitude. In this way, the required weights are determined via proper training and are hence more convincing and effective. Lastly, using the fact that phase usually conveys more information than magnitude, we use only the phase for RR quality assessment. This provides the crucial advantage of a further reduction in the required amount of reference image information. The proposed method is therefore further scalable for RR scenarios. We report extensive experimental results using a total of 9 publicly available databases: 7 image databases (with a total of 3832 distorted images with diverse distortions) and 2 video databases (228 distorted videos in total). These show that the proposed method is overall better than several of the existing full-reference (FR) algorithms and two RR algorithms. Additionally, there is a graceful degradation in prediction performance as the amount of reference image information is reduced, thereby confirming its scalability prospects. To enable comparisons and future study, a Matlab implementation of the proposed algorithm is available at http://www.ntu.edu.sg/home/wslin/reduced_phase.rar.
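
    The core of the approach, comparing radially binned DFT phase and magnitude between a reference and a distorted image, can be sketched as below. Uniform radial bins and fixed weights are used here purely for illustration; the paper uses a nonuniform binning scheme and learns the weights by regression against subjective scores.

```python
import numpy as np

def dft_phase_magnitude_features(image, n_bins=8):
    """Radially binned magnitude and phase of the 2-D DFT.  Binning is
    uniform in radius here for simplicity."""
    F = np.fft.fftshift(np.fft.fft2(image.astype(float)))
    mag, phase = np.abs(F), np.angle(F)
    h, w = image.shape
    y, x = np.indices((h, w))
    r = np.hypot(y - h / 2, x - w / 2)
    edges = np.linspace(0, r.max() + 1e-9, n_bins + 1)
    mag_feat, phase_feat = [], []
    for lo, hi in zip(edges[:-1], edges[1:]):
        mask = (r >= lo) & (r < hi)
        mag_feat.append(mag[mask].mean())
        phase_feat.append(phase[mask].mean())
    return np.array(mag_feat), np.array(phase_feat)

def rr_quality_score(reference, distorted, w_phase=0.7, w_mag=0.3):
    """Toy reduced-reference score: weighted change in binned phase and
    magnitude (larger = more distortion).  Fixed weights are an assumption."""
    m_ref, p_ref = dft_phase_magnitude_features(reference)
    m_dst, p_dst = dft_phase_magnitude_features(distorted)
    d_phase = np.mean(np.abs(p_ref - p_dst))
    d_mag = np.mean(np.abs(m_ref - m_dst) / (m_ref + 1e-9))
    return w_phase * d_phase + w_mag * d_mag
```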

  19. Simulation Study of Effects of the Blind Deconvolution on Ultrasound Image

    Science.gov (United States)

    He, Xingwu; You, Junchen

    2018-03-01

    Ultrasound image restoration is an essential subject in medical ultrasound imaging. However, without sufficient and precise system knowledge, traditional image restoration methods based on prior knowledge of the system often fail to improve image quality. In this paper, we use simulated ultrasound images to study the effectiveness of blind deconvolution for ultrasound image restoration. Experimental results demonstrate that blind deconvolution can be applied to ultrasound image restoration and achieves satisfactory results without precise prior knowledge, compared with the traditional image restoration method. Even with an inaccurate, small initial PSF, the results show that blind deconvolution can improve the overall image quality of ultrasound images, with much better SNR and image resolution. The time consumption of the methods was also examined and shows no significant increase on a GPU platform.
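
    A common way to realize blind deconvolution without precise prior knowledge is alternating Richardson-Lucy updates of the image and the PSF. The sketch below is a generic formulation of that idea in the spirit of Fish et al. (1995), not necessarily the algorithm used in the paper; the initial PSF is deliberately crude and the iteration counts are illustrative.

```python
import numpy as np
from scipy.signal import fftconvolve

def _center_crop(arr, shape):
    """Central crop of arr to the given shape."""
    slices = tuple(slice((a - s) // 2, (a - s) // 2 + s)
                   for a, s in zip(arr.shape, shape))
    return arr[slices]

def blind_richardson_lucy(observed, psf_size=7, n_outer=15, n_inner=5):
    """Alternating Richardson-Lucy blind deconvolution: the image and the
    PSF estimates are updated in turn, starting from a crude uniform PSF.
    The observed image is assumed nonnegative (e.g. an envelope image)."""
    g = np.clip(observed.astype(float), 1e-6, None)          # observed image
    f = g.copy()                                              # image estimate
    h = np.full((psf_size, psf_size), 1.0 / psf_size ** 2)    # initial PSF
    eps = 1e-12
    for _ in range(n_outer):
        # Update the PSF with the image estimate held fixed.
        for _ in range(n_inner):
            ratio = g / (fftconvolve(f, h, mode="same") + eps)
            corr = fftconvolve(ratio, f[::-1, ::-1], mode="same")
            h *= _center_crop(corr, h.shape)
            h = np.clip(h, 0.0, None)
            h /= h.sum() + eps
        # Update the image with the PSF estimate held fixed.
        for _ in range(n_inner):
            ratio = g / (fftconvolve(f, h, mode="same") + eps)
            f *= fftconvolve(ratio, h[::-1, ::-1], mode="same")
            f = np.clip(f, 0.0, None)
    return f, h

# Usage: restored, psf_estimate = blind_richardson_lucy(blurred_image)
```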

  20. Image quality and stability of image-guided radiotherapy (IGRT) devices: A comparative study

    International Nuclear Information System (INIS)

    Stock, Markus; Pasler, Marlies; Birkfellner, Wolfgang; Homolka, Peter; Poetter, Richard; Georg, Dietmar

    2009-01-01

    Introduction: Our aim was to implement standards for quality assurance of IGRT devices used in our department and to compare their performances with that of a CT simulator. Materials and methods: We investigated image quality parameters for three devices over a period of 16 months. A multislice CT was used as a benchmark and results related to noise, spatial resolution, low contrast visibility (LCV) and uniformity were compared with a cone beam CT (CBCT) at a linac and simulator. Results: All devices performed well in terms of LCV and, in fact, exceeded vendor specifications. MTF was comparable between CT and linac CBCT. Integral nonuniformity was, on average, 0.002 for the CT and 0.006 for the linac CBCT. Uniformity, LCV and MTF varied depending on the protocols used for the linac CBCT. Contrast-to-noise ratio was an average of 51% higher for the CT than for the linac and simulator CBCT. No significant time trend was observed and tolerance limits were implemented. Discussion: Reasonable differences in image quality between CT and CBCT were observed. Further research and development are necessary to increase image quality of commercially available CBCT devices in order for them to serve the needs for adaptive and/or online planning.

  1. Image quality and stability of image-guided radiotherapy (IGRT) devices: A comparative study.

    Science.gov (United States)

    Stock, Markus; Pasler, Marlies; Birkfellner, Wolfgang; Homolka, Peter; Poetter, Richard; Georg, Dietmar

    2009-10-01

    Our aim was to implement standards for quality assurance of IGRT devices used in our department and to compare their performances with that of a CT simulator. We investigated image quality parameters for three devices over a period of 16 months. A multislice CT was used as a benchmark and results related to noise, spatial resolution, low contrast visibility (LCV) and uniformity were compared with a cone beam CT (CBCT) at a linac and simulator. All devices performed well in terms of LCV and, in fact, exceeded vendor specifications. MTF was comparable between CT and linac CBCT. Integral nonuniformity was, on average, 0.002 for the CT and 0.006 for the linac CBCT. Uniformity, LCV and MTF varied depending on the protocols used for the linac CBCT. Contrast-to-noise ratio was an average of 51% higher for the CT than for the linac and simulator CBCT. No significant time trend was observed and tolerance limits were implemented. Reasonable differences in image quality between CT and CBCT were observed. Further research and development are necessary to increase image quality of commercially available CBCT devices in order for them to serve the needs for adaptive and/or online planning.
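
    Of the metrics listed, integral nonuniformity is the simplest to automate. The sketch below assumes the common five-ROI definition NU = (max - min)/(max + min) on a uniform-phantom slice with positive pixel values; ROI size and placement are illustrative and may differ from the departmental protocol.

```python
import numpy as np

def integral_nonuniformity(image, roi_size=20):
    """Integral nonuniformity from the mean values of a central ROI and four
    peripheral ROIs in a uniform-phantom slice: NU = (max - min)/(max + min).
    Positive pixel values (e.g. offset CT numbers) are assumed."""
    h, w = image.shape
    half = roi_size // 2
    centres = [(h // 2, w // 2),                            # centre
               (h // 4, w // 2), (3 * h // 4, w // 2),      # top / bottom
               (h // 2, w // 4), (h // 2, 3 * w // 4)]      # left / right
    means = [image[r - half:r + half, c - half:c + half].mean()
             for r, c in centres]
    return (max(means) - min(means)) / (max(means) + min(means))

# Usage with a synthetic uniform-phantom slice (values around 1000).
rng = np.random.default_rng(3)
phantom = rng.normal(1000.0, 5.0, size=(256, 256))
print(integral_nonuniformity(phantom))
```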

  2. Correlation of the clinical and physical image quality in chest radiography for average adults with a computed radiography imaging system.

    Science.gov (United States)

    Moore, C S; Wood, T J; Beavis, A W; Saunderson, J R

    2013-07-01

    The purpose of this study was to examine the correlation between the quality of visually graded patient (clinical) chest images and a quantitative assessment of chest phantom (physical) images acquired with a computed radiography (CR) imaging system. The results of a previously published study, in which four experienced image evaluators graded computer-simulated postero-anterior chest images using a visual grading analysis scoring (VGAS) scheme, were used for the clinical image quality measurement. Contrast-to-noise ratio (CNR) and effective dose efficiency (eDE) were used as physical image quality metrics measured in a uniform chest phantom. Although optimal values of these physical metrics for chest radiography were not derived in this work, their correlation with VGAS in images acquired without an antiscatter grid across the diagnostic range of X-ray tube voltages was determined using Pearson's correlation coefficient. Clinical and physical image quality metrics increased with decreasing tube voltage. Statistically significant correlations between VGAS and CNR (R=0.87) were found for chest CR images acquired without an antiscatter grid. A statistically significant correlation has been found between the clinical and physical image quality in CR chest imaging. The results support the value of using CNR and eDE in the evaluation of quality in clinical thorax radiography.
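
    The statistical step, correlating observer scores with a phantom metric across tube voltages, is a one-liner with scipy. The paired values below are hypothetical placeholders, not the data of the study.

```python
import numpy as np
from scipy.stats import pearsonr

# Hypothetical paired measurements across tube voltages (values illustrative):
# visual grading analysis score (VGAS) and contrast-to-noise ratio (CNR)
# measured in a uniform chest phantom at the same settings.
kvp  = np.array([  70,   80,   90,  100,  110,  120,  125])
vgas = np.array([ 3.6,  3.4,  3.1,  2.9,  2.7,  2.5,  2.4])
cnr  = np.array([11.2, 10.1,  9.0,  8.2,  7.5,  6.9,  6.6])

r, p = pearsonr(vgas, cnr)
print(f"Pearson R = {r:.2f}, p = {p:.4f}")
```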

  3. Objective assessment of image quality VI: imaging in radiation therapy

    International Nuclear Information System (INIS)

    Barrett, Harrison H; Kupinski, Matthew A; Müeller, Stefan; Halpern, Howard J; Morris, John C III; Dwyer, Roisin

    2013-01-01

    Earlier work on objective assessment of image quality (OAIQ) focused largely on estimation or classification tasks in which the desired outcome of imaging is accurate diagnosis. This paper develops a general framework for assessing imaging quality on the basis of therapeutic outcomes rather than diagnostic performance. By analogy to receiver operating characteristic (ROC) curves and their variants as used in diagnostic OAIQ, the method proposed here utilizes the therapy operating characteristic or TOC curves, which are plots of the probability of tumor control versus the probability of normal-tissue complications as the overall dose level of a radiotherapy treatment is varied. The proposed figure of merit is the area under the TOC curve, denoted AUTOC. This paper reviews an earlier exposition of the theory of TOC and AUTOC, which was specific to the assessment of image-segmentation algorithms, and extends it to other applications of imaging in external-beam radiation treatment as well as in treatment with internal radioactive sources. For each application, a methodology for computing the TOC is presented. A key difference between ROC and TOC is that the latter can be defined for a single patient rather than a population of patients. (paper)
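
    A TOC curve and its AUTOC can be computed from any pair of dose-response models for tumor control and normal-tissue complication. The sketch below uses simple logistic dose-response curves with made-up parameters; it only illustrates the construction of the figure of merit, not the estimation procedures developed in the paper.

```python
import numpy as np

def logistic(dose, d50, gamma50):
    """Phenomenological sigmoid dose-response curve; gamma50 is the
    normalized slope at the 50% response dose d50."""
    return 1.0 / (1.0 + (d50 / np.maximum(dose, 1e-9)) ** (4.0 * gamma50))

def toc_curve(doses, d50_tumor=60.0, g_tumor=2.0, d50_oar=75.0, g_oar=3.0):
    """Therapy operating characteristic: TCP versus NTCP as the overall dose
    level is swept.  The parameter values are illustrative assumptions."""
    tcp = logistic(doses, d50_tumor, g_tumor)    # tumor control probability
    ntcp = logistic(doses, d50_oar, g_oar)       # normal-tissue complication
    return ntcp, tcp

doses = np.linspace(0.0, 150.0, 600)
ntcp, tcp = toc_curve(doses)
# Area under the TOC curve (AUTOC), integrating TCP over NTCP.
autoc = np.trapz(tcp, ntcp)
print(f"AUTOC = {autoc:.3f}")
```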

  4. Simulated Thin-Film Growth and Imaging

    Science.gov (United States)

    Schillaci, Michael

    2001-06-01

    Thin films have become the cornerstone of the electronics, telecommunications, and broadband markets. A list of potential products includes: computer boards and chips, satellites, cell phones, fuel cells, superconductors, flat panel displays, optical waveguides, building and automotive windows, food and beverage plastic containers, metal foils, pipe plating, vision ware, manufacturing equipment and turbine engines. For all of these reasons, a basic understanding of the physical processes involved in both growing and imaging thin films can provide a wonderful research project for advanced undergraduate and first-year graduate students. After producing rudimentary two- and three-dimensional thin-film models incorporating ballistic deposition and nearest-neighbor Coulomb-type interactions, the quantum-mechanical tunneling equations are used to produce simulated scanning tunneling microscope (SSTM) images of the films. A discussion of computational platforms, languages, and software packages that may be used to accomplish similar results is also given.

  5. Digital imaging in diagnostic radiology. Image quality - radiation exposure

    International Nuclear Information System (INIS)

    Schmidt, T.; Stieve, F.E.

    1996-01-01

    The publication contains the 37 lectures of the symposium on digital imaging in diagnostic radiology, held in November 1995 at Kloster Seeon, as well as contributions enhancing the information presented in the lectures. The publication reflects the state of the art in this subject field, discusses future trends and gives recommendations and information relating to current practice in radiology. In-depth information is given about R and D activities for the digitalisation of X-ray pictures and the image quality required to meet the purposes of modern diagnostics. Further aspects encompass radiological protection and dose optimization as well as optimization of examination methods. (vhe) [de

  6. Dried fruits quality assessment by hyperspectral imaging

    Science.gov (United States)

    Serranti, Silvia; Gargiulo, Aldo; Bonifazi, Giuseppe

    2012-05-01

    Dried fruit products have different market values according to their quality. Such quality is usually quantified in terms of the freshness of the products, as well as the presence of contaminants (pieces of shell, husk, and small stones), defects, mould and decay. The combination of these parameters, in terms of their relative presence, represents a fundamental set of attributes conditioning the human-senses-detectable attributes of dried fruits (visual appearance, organoleptic properties, etc.) and their overall quality as marketable products. Sorting-selection strategies exist, but they sometimes fail when a higher degree of detection is required, especially when discriminating between dried fruits of relatively small dimensions or when aiming to perform an early detection of the pathogen agents responsible for future mould and decay development. Surface characteristics of dried fruits can be investigated by hyperspectral imaging (HSI). In this paper, specific, ad hoc applications aimed at proposing quality detection logics based on a hyperspectral imaging (HSI) approach are described, compared and critically evaluated. Reflectance spectra of selected dried fruits (hazelnuts) of different quality, characterized by the presence of different contaminants and defects, were acquired by a laboratory device equipped with two HSI systems working in two different spectral ranges: the visible-near infrared field (400-1000 nm) and the near infrared field (1000-1700 nm). The spectra were processed and the results evaluated adopting both a simple and fast wavelength band ratio approach and a more sophisticated classification logic based on principal component analysis (PCA).
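
    Both analysis routes mentioned above, a wavelength band ratio and a PCA-based classification, reduce to a few array operations on the hyperspectral cube. The band indices, cube dimensions and use of scikit-learn below are illustrative assumptions; in practice the discriminating bands would be chosen from the measured hazelnut spectra.

```python
import numpy as np
from sklearn.decomposition import PCA

def band_ratio(cube, band_a, band_b):
    """Simple wavelength band ratio for a hyperspectral cube of shape
    (rows, cols, bands).  Band indices are illustrative placeholders."""
    return cube[:, :, band_a] / (cube[:, :, band_b] + 1e-9)

def pca_scores(cube, n_components=3):
    """Project every pixel spectrum onto its first principal components,
    returning score images that can be thresholded to separate sound kernels,
    shell fragments and defective surfaces."""
    rows, cols, bands = cube.shape
    spectra = cube.reshape(-1, bands)
    scores = PCA(n_components=n_components).fit_transform(spectra)
    return scores.reshape(rows, cols, n_components)

# Usage with a synthetic NIR cube (e.g. 1000-1700 nm sampled in 64 bands).
rng = np.random.default_rng(1)
cube = rng.random((120, 160, 64))
ratio_img = band_ratio(cube, band_a=40, band_b=10)
score_img = pca_scores(cube)
```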

  7. Improvement of Fuzzy Image Contrast Enhancement Using Simulated Ergodic Fuzzy Markov Chains

    Directory of Open Access Journals (Sweden)

    Behrouz Fathi-Vajargah

    2014-01-01

    This paper presents a novel fuzzy enhancement technique using simulated ergodic fuzzy Markov chains for low-contrast brain magnetic resonance imaging (MRI). The fuzzy image contrast enhancement is based on the weighted fuzzy expected value. The membership values are then modified to enhance the image using ergodic fuzzy Markov chains. The qualitative performance of the proposed method is compared to another method in which ergodic fuzzy Markov chains are not considered. The proposed method produces an image of better quality.

  8. Automated Quality Assurance Applied to Mammographic Imaging

    Directory of Open Access Journals (Sweden)

    Anne Davis

    2002-07-01

    Quality control in mammography is based upon subjective interpretation of the image quality of a test phantom. In order to suppress the subjectivity due to the human observer, automated computer analysis of the Leeds TOR(MAM) test phantom is investigated. Texture analysis via grey-level co-occurrence matrices is used to detect structures in the test object. Scoring of the substructures in the phantom is based on grey-level differences between regions and on information from the grey-level co-occurrence matrices. The results from scoring groups of particles within the phantom are presented.
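
    The texture step can be reproduced with the grey-level co-occurrence utilities in scikit-image. The distances, angles and properties below are illustrative choices, not the settings used in the paper.

```python
import numpy as np
from skimage.feature import graycomatrix, graycoprops

def glcm_region_features(region, distances=(1, 2), angles=(0, np.pi / 2)):
    """Grey-level co-occurrence features for one region of a digitized
    phantom image (8-bit grey levels assumed)."""
    glcm = graycomatrix(region, distances=distances, angles=angles,
                        levels=256, symmetric=True, normed=True)
    return {prop: float(graycoprops(glcm, prop).mean())
            for prop in ("contrast", "homogeneity", "energy", "correlation")}

# Usage: compare a region containing simulated particles with plain background.
rng = np.random.default_rng(2)
background = rng.integers(118, 138, size=(64, 64), dtype=np.uint8)
particles = background.copy()
particles[::8, ::8] = 200                 # sprinkle bright "particles"
print(glcm_region_features(background))
print(glcm_region_features(particles))
```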

  9. Meteosat third generation imager: simulation of the flexible combined imager instrument chain

    Science.gov (United States)

    Just, Dieter; Gutiérrez, Rebeca; Roveda, Fausto; Steenbergen, Theo

    2014-10-01

    The Meteosat Third Generation (MTG) Programme is the next generation of European geostationary meteorological systems. The first MTG satellite, MTG-I1, which is scheduled for launch at the end of 2018, will host two imaging instruments: the Flexible Combined Imager (FCI) and the Lightning Imager. The FCI will provide continuation of the SEVIRI imager operations on the current Meteosat Second Generation (MSG) satellites, but with improved spatial, temporal and spectral resolution, not dissimilar to GOES-R (of NASA/NOAA). Unlike SEVIRI on the spinning MSG spacecraft, the FCI will be mounted on a 3-axis stabilised platform and a 2-axis tapered scan will provide full coverage of the Earth in 10-minute repeat cycles. Alternatively, a rapid scanning mode can cover smaller areas, but with a better temporal resolution of up to 2.5 minutes. In order to assess some of the data acquisition and processing aspects which will apply to the FCI, a simplified end-to-end imaging chain prototype was set up. The simulation prototype consists of four different functional blocks: - A function for the generation of FCI-like reference images - An image acquisition simulation function for the FCI line-of-sight calculation and swath generation - A processing function that reverses the swath generation process by rectifying the swath data - An evaluation function for assessing the quality of the processed data with respect to the reference images This paper presents an overview of the FCI instrument chain prototype, covering instrument characteristics, reference image generation, image acquisition simulation, and processing aspects. In particular, it provides a detailed description of the generation of reference images, highlighting innovative features but also limitations. This is followed by a description of the image acquisition simulation process and of the rectification and evaluation functions. The latter two are described in more detail in a separate paper. Finally, results

  10. Photometric Modeling of Simulated Surface-Resolved Bennu Images

    Science.gov (United States)

    Golish, D.; DellaGiustina, D. N.; Clark, B.; Li, J. Y.; Zou, X. D.; Bennett, C. A.; Lauretta, D. S.

    2017-12-01

    The Origins, Spectral Interpretation, Resource Identification, Security, Regolith Explorer (OSIRIS-REx) is a NASA mission to study and return a sample of asteroid (101955) Bennu. Imaging data from the mission will be used to develop empirical surface-resolved photometric models of Bennu at a series of wavelengths. These models will be used to photometrically correct panchromatic and color base maps of Bennu, compensating for variations due to shadows and photometric angle differences, thereby minimizing seams in mosaicked images. Well-corrected mosaics are critical to the generation of a global hazard map and a global 1064-nm reflectance map which predicts LIDAR response. These data products directly feed into the selection of a site from which to safely acquire a sample. We also require photometric correction for the creation of color ratio maps of Bennu. Color ratios maps provide insight into the composition and geological history of the surface and allow for comparison to other Solar System small bodies. In advance of OSIRIS-REx's arrival at Bennu, we use simulated images to judge the efficacy of both the photometric modeling software and the mission observation plan. Our simulation software is based on USGS's Integrated Software for Imagers and Spectrometers (ISIS) and uses a synthetic shape model, a camera model, and an empirical photometric model to generate simulated images. This approach gives us the flexibility to create simulated images of Bennu based on analog surfaces from other small Solar System bodies and to test our modeling software under those conditions. Our photometric modeling software fits image data to several conventional empirical photometric models and produces the best fit model parameters. The process is largely automated, which is crucial to the efficient production of data products during proximity operations. The software also produces several metrics on the quality of the observations themselves, such as surface coverage and the

  11. Study on the influence of several factors on the quality of SR-XFMT image

    International Nuclear Information System (INIS)

    Deng Biao; Yu Xiaohan; Xu Hongjie

    2007-01-01

    Synchrotron Radiation based X-ray Fluorescent Microtomography (SR-XFMT) is a novel non-destructive technique with the ability to reconstruct elemental distributions within a specimen. This paper studies, by computer simulation, the influence of several factors, such as the sampling interval and the number of projections, the image reconstruction algorithm and the fluorescence signals, on the quality of the SR-XFMT image. Some useful conclusions on the quality of the SR-XFMT image are drawn. (authors)

  12. Research on simulated infrared image utility evaluation using deep representation

    Science.gov (United States)

    Zhang, Ruiheng; Mu, Chengpo; Yang, Yu; Xu, Lixin

    2018-01-01

    Infrared (IR) image simulation is an important data source for various target recognition systems. However, whether simulated IR images can be used as training data for classifiers depends on the fidelity and authenticity of the simulated images. For the evaluation of IR image features, a deep-representation-based algorithm is proposed. Unlike conventional methods, which usually adopt a priori knowledge or manually designed features, the proposed method can extract essential features and quantitatively evaluate the utility of simulated IR images. First, for data preparation, we employ our IR image simulation system to generate large amounts of IR images. Then, we present the evaluation model for simulated IR images, for which an end-to-end IR feature extraction and target detection model based on a deep convolutional neural network is designed. Finally, the experiments illustrate that our proposed method outperforms other verification algorithms in evaluating simulated IR images. Cross-validation, variable-proportion mixed-data validation, and simulation process contrast experiments are carried out to evaluate the utility and objectivity of the images generated by our simulation system. The optimum mixing ratio between simulated and real data is 0.2≤γ≤0.3, which provides an effective data augmentation method for real IR images.

  13. A virtual image chain for perceived image quality of medical display

    Science.gov (United States)

    Marchessoux, Cédric; Jung, Jürgen

    2006-03-01

    This paper describes a virtual image chain for medical display (project VICTOR: granted in the 5th framework program by European commission). The chain starts from raw data of an image digitizer (CR, DR) or synthetic patterns and covers image enhancement (MUSICA by Agfa) and both display possibilities, hardcopy (film on viewing box) and softcopy (monitor). Key feature of the chain is a complete image wise approach. A first prototype is implemented in an object-oriented software platform. The display chain consists of several modules. Raw images are either taken from scanners (CR-DR) or from a pattern generator, in which characteristics of DR- CR systems are introduced by their MTF and their dose-dependent Poisson noise. The image undergoes image enhancement and comes to display. For soft display, color and monochrome monitors are used in the simulation. The image is down-sampled. The non-linear response of a color monitor is taken into account by the GOG or S-curve model, whereas the Standard Gray-Scale-Display-Function (DICOM) is used for monochrome display. The MTF of the monitor is applied on the image in intensity levels. For hardcopy display, the combination of film, printer, lightbox and viewing condition is modeled. The image is up-sampled and the DICOM-GSDF or a Kanamori Look-Up-Table is applied. An anisotropic model for the MTF of the printer is applied on the image in intensity levels. The density-dependent color (XYZ) of the hardcopy film is introduced by Look-Up-tables. Finally a Human Visual System Model is applied to the intensity images (XYZ in terms of cd/m2) in order to eliminate nonvisible differences. Comparison leads to visible differences, which are quantified by higher order image quality metrics. A specific image viewer is used for the visualization of the intensity image and the visual difference maps.
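
    One building block of such a chain, applying a display or printer MTF to an intensity image in the frequency domain, can be sketched as follows. An isotropic Gaussian MTF and the value at the Nyquist frequency are assumptions made for illustration; the VICTOR chain would use measured (and, for the printer, anisotropic) MTFs.

```python
import numpy as np

def apply_mtf(image, mtf_at_nyquist=0.35, pixel_pitch_mm=0.2):
    """Apply an isotropic Gaussian MTF to an intensity image in the
    frequency domain, as one step of a virtual display chain."""
    ny, nx = image.shape
    fy = np.fft.fftfreq(ny, d=pixel_pitch_mm)     # cycles / mm
    fx = np.fft.fftfreq(nx, d=pixel_pitch_mm)
    f = np.hypot(*np.meshgrid(fy, fx, indexing="ij"))
    nyquist = 0.5 / pixel_pitch_mm
    # Gaussian MTF chosen so that MTF(nyquist) equals the requested value.
    sigma_f = nyquist / np.sqrt(-2.0 * np.log(mtf_at_nyquist))
    mtf = np.exp(-0.5 * (f / sigma_f) ** 2)
    return np.real(np.fft.ifft2(np.fft.fft2(image) * mtf))

# Usage: softcopy = apply_mtf(intensity_image, mtf_at_nyquist=0.3)
```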

  14. Image quality and dose differences caused by vendor-specific image processing of neonatal radiographs

    International Nuclear Information System (INIS)

    Sensakovic, William F.; O'Dell, M.C.; Letter, Haley; Kohler, Nathan; Rop, Baiywo; Cook, Jane; Logsdon, Gregory; Varich, Laura

    2016-01-01

    Image processing plays an important role in optimizing image quality and radiation dose in projection radiography. Unfortunately commercial algorithms are black boxes that are often left at or near vendor default settings rather than being optimized. We hypothesize that different commercial image-processing systems, when left at or near default settings, create significant differences in image quality. We further hypothesize that image-quality differences can be exploited to produce images of equivalent quality but lower radiation dose. We used a portable radiography system to acquire images on a neonatal chest phantom and recorded the entrance surface air kerma (ESAK). We applied two image-processing systems (Optima XR220amx, by GE Healthcare, Waukesha, WI; and MUSICA"2 by Agfa HealthCare, Mortsel, Belgium) to the images. Seven observers (attending pediatric radiologists and radiology residents) independently assessed image quality using two methods: rating and matching. Image-quality ratings were independently assessed by each observer on a 10-point scale. Matching consisted of each observer matching GE-processed images and Agfa-processed images with equivalent image quality. A total of 210 rating tasks and 42 matching tasks were performed and effective dose was estimated. Median Agfa-processed image-quality ratings were higher than GE-processed ratings. Non-diagnostic ratings were seen over a wider range of doses for GE-processed images than for Agfa-processed images. During matching tasks, observers matched image quality between GE-processed images and Agfa-processed images acquired at a lower effective dose (11 ± 9 μSv; P < 0.0001). Image-processing methods significantly impact perceived image quality. These image-quality differences can be exploited to alter protocols and produce images of equivalent image quality but lower doses. Those purchasing projection radiography systems or third-party image-processing software should be aware that image processing

  15. Image quality and dose differences caused by vendor-specific image processing of neonatal radiographs

    Energy Technology Data Exchange (ETDEWEB)

    Sensakovic, William F.; O'Dell, M.C.; Letter, Haley; Kohler, Nathan; Rop, Baiywo; Cook, Jane; Logsdon, Gregory; Varich, Laura [Florida Hospital, Imaging Administration, Orlando, FL (United States)

    2016-10-15

    Image processing plays an important role in optimizing image quality and radiation dose in projection radiography. Unfortunately commercial algorithms are black boxes that are often left at or near vendor default settings rather than being optimized. We hypothesize that different commercial image-processing systems, when left at or near default settings, create significant differences in image quality. We further hypothesize that image-quality differences can be exploited to produce images of equivalent quality but lower radiation dose. We used a portable radiography system to acquire images on a neonatal chest phantom and recorded the entrance surface air kerma (ESAK). We applied two image-processing systems (Optima XR220amx, by GE Healthcare, Waukesha, WI; and MUSICA{sup 2} by Agfa HealthCare, Mortsel, Belgium) to the images. Seven observers (attending pediatric radiologists and radiology residents) independently assessed image quality using two methods: rating and matching. Image-quality ratings were independently assessed by each observer on a 10-point scale. Matching consisted of each observer matching GE-processed images and Agfa-processed images with equivalent image quality. A total of 210 rating tasks and 42 matching tasks were performed and effective dose was estimated. Median Agfa-processed image-quality ratings were higher than GE-processed ratings. Non-diagnostic ratings were seen over a wider range of doses for GE-processed images than for Agfa-processed images. During matching tasks, observers matched image quality between GE-processed images and Agfa-processed images acquired at a lower effective dose (11 ± 9 μSv; P < 0.0001). Image-processing methods significantly impact perceived image quality. These image-quality differences can be exploited to alter protocols and produce images of equivalent image quality but lower doses. Those purchasing projection radiography systems or third-party image-processing software should be aware that image

  16. Effects of characteristics of image quality in an immersive environment

    Science.gov (United States)

    Duh, Henry Been-Lirn; Lin, James J W.; Kenyon, Robert V.; Parker, Donald E.; Furness, Thomas A.

    2002-01-01

    Image quality issues such as field of view (FOV) and resolution are important for evaluating "presence" and simulator sickness (SS) in virtual environments (VEs). This research examined effects on postural stability of varying FOV, image resolution, and scene content in an immersive visual display. Two different scenes (a photograph of a fountain and a simple radial pattern) at two different resolutions were tested using six FOVs (30, 60, 90, 120, 150, and 180 deg.). Both postural stability, recorded by force plates, and subjective difficulty ratings varied as a function of FOV, scene content, and image resolution. Subjects exhibited more balance disturbance and reported more difficulty in maintaining posture in the wide-FOV, high-resolution, and natural scene conditions.

  17. INDOOR AIR QUALITY AND INHALATION EXPOSURE - SIMULATION TOOL KIT

    Science.gov (United States)

    A Microsoft Windows-based indoor air quality (IAQ) simulation software package is presented. Named Simulation Tool Kit for Indoor Air Quality and Inhalation Exposure, or IAQX for short, this package complements and supplements existing IAQ simulation programs and is desi...

  18. Cost Analysis of Poor Quality Using a Software Simulation

    Directory of Open Access Journals (Sweden)

    Jana Fabianová

    2017-02-01

    The issues of quality, the cost of poor quality and the factors affecting quality are crucial to maintaining competitiveness in business. The use of software applications and computer simulation enables more effective quality management. Simulation tools make it possible to incorporate the variability of several variables in experiments and to evaluate their combined impact on the final output. The article presents a case study focused on the possibility of using Monte Carlo computer simulation in the field of quality management. Two approaches for determining the cost of poor quality are introduced. The first takes a retrospective view, in which the cost of poor quality and the production process are calculated on the basis of historical data. The second approach uses the probabilistic characteristics of the input variables by means of simulation and provides a prospective view of the costs of poor quality. Simulation outputs in the form of tornado and sensitivity charts complement the risk analysis.
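
    The prospective approach can be illustrated with a small Monte Carlo run in which the input variables are sampled from assumed distributions and the cost of poor quality is evaluated for each trial. All distributions and cost figures below are invented placeholders standing in for the case-study data.

```python
import numpy as np

rng = np.random.default_rng(42)
n_trials = 100_000

# Input variables with assumed (illustrative) probability distributions.
production  = rng.normal(10_000, 800, n_trials)            # units per month
defect_rate = rng.triangular(0.01, 0.02, 0.05, n_trials)   # fraction defective
rework_cost = rng.uniform(8.0, 14.0, n_trials)             # EUR per reworked unit
scrap_share = rng.uniform(0.1, 0.3, n_trials)              # defects that cannot be reworked
scrap_cost  = 35.0                                         # EUR per scrapped unit

defects = production * defect_rate
cost_of_poor_quality = (defects * (1 - scrap_share) * rework_cost
                        + defects * scrap_share * scrap_cost)

print(f"mean COPQ: {cost_of_poor_quality.mean():,.0f} EUR/month")
print(f"5th-95th percentile: {np.percentile(cost_of_poor_quality, [5, 95]).round(0)}")
```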

  19. Using a web-based image quality assurance reporting system to improve image quality.

    Science.gov (United States)

    Czuczman, Gregory J; Pomerantz, Stuart R; Alkasab, Tarik K; Huang, Ambrose J

    2013-08-01

    The purpose of this study is to show the impact of a web-based image quality assurance reporting system on the rates of three common image quality errors at our institution. A web-based image quality assurance reporting system was developed and used beginning in April 2009. Image quality endpoints were assessed immediately before deployment (period 1), approximately 18 months after deployment of a prototype reporting system (period 2), and approximately 12 months after deployment of a subsequent upgraded department-wide reporting system (period 3). A total of 3067 axillary shoulder radiographs were reviewed for correct orientation, 355 shoulder CT scans were reviewed for correct reformatting of coronal and sagittal images, and 346 sacral MRI scans were reviewed for correct acquisition plane of axial images. Error rates for each review period were calculated and compared using the Fisher exact test. Error rates of axillary shoulder radiograph orientation were 35.9%, 7.2%, and 10.0%, respectively, for the three review periods. The decrease in error rate between periods 1 and 2 was statistically significant (p < 0.0001). Error rates of shoulder CT reformats were 9.8%, 2.7%, and 5.8%, respectively, for the three review periods. The decrease in error rate between periods 1 and 2 was statistically significant (p = 0.03). Error rates for sacral MRI axial sequences were 96.5%, 32.5%, and 3.4%, respectively, for the three review periods. The decrease in error rates between periods 1 and 2 and between periods 2 and 3 was statistically significant (p < 0.0001). A web-based system for reporting image quality errors may be effective for improving image quality.
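
    The comparison of error rates between review periods uses the Fisher exact test, which is available directly in scipy. The per-period totals below are assumed for illustration, since only the overall number of radiographs and the error rates are quoted in the abstract.

```python
from scipy.stats import fisher_exact

# Error counts for axillary shoulder radiograph orientation, period 1 vs. 2
# (counts reconstructed approximately from the reported rates; the totals per
# period are assumptions).
errors_p1, total_p1 = 359, 1000     # ~35.9 %
errors_p2, total_p2 = 72, 1000      # ~7.2 %

table = [[errors_p1, total_p1 - errors_p1],
         [errors_p2, total_p2 - errors_p2]]
odds_ratio, p_value = fisher_exact(table)
print(f"odds ratio = {odds_ratio:.2f}, p = {p_value:.2e}")
```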

  20. Fundamental image quality limits for microcomputed tomography in small animals

    International Nuclear Information System (INIS)

    Ford, N.L.; Thornton, M.M.; Holdsworth, D.W.

    2003-01-01

    Small-animal imaging has become increasingly more important as transgenic and knockout mice are produced to model human diseases. One imaging technique that has emerged is microcomputed tomography (micro-CT). For live-animal imaging, the precision in the images will be determined by the x-ray dose given to the animal. As a result, we propose a simple method to predict the noise performance of an x-ray micro-CT system as a function of dose and image resolution. An ideal, quantum-noise limited micro-CT scanner, assumed to have perfect resolution and ideal efficiency, was modeled. Using a simplified model, the coefficient of variation (COV) of the linear attenuation coefficient was calculated for a range of entrance doses and isotropic voxel sizes. COV calculations were performed for the ideal case and with simulated imperfections in efficiency and resolution. Our model was validated in phantom studies and mouse images were acquired with a specimen scanner to illustrate the results. A simplified model of noise propagation in the case of isotropic resolution indicates that the COV in the linear attenuation coefficient is proportional to (dose)^(-1/2) and to (isotropic voxel size)^(-2) in the reconstructed volume. Therefore an improvement in the precision can be achieved only by increasing the isotropic voxel size (thereby decreasing the resolution of the image) or by increasing the x-ray dose. For the ideal scanner, a COV of 1% in the linear attenuation coefficient for an image of a mouse exposed to 0.25 Gy is obtained with a minimum isotropic voxel size of 135 μm. However, the same COV is achieved at a dose of 5.0 Gy with a 65 μm isotropic voxel size. Conversely, for a 68 mm diameter rat, a COV of 1% obtained from an image at 5.0 Gy would require an isotropic voxel size of 100 μm. These results indicate that short-term, potentially lethal, effects of ionizing radiation will limit high-resolution live animal imaging. As improvements in detector technology allow the
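
    The quoted numbers follow directly from the stated scaling law, as the short check below shows (input values taken from the abstract; the computed voxel size of about 64 μm matches the quoted 65 μm to rounding).

```python
# For an ideal, quantum-noise-limited micro-CT scanner the abstract states
#   COV ∝ dose**(-1/2) * voxel**(-2)
# so, at constant COV, the achievable isotropic voxel size scales as
#   voxel ∝ dose**(-1/4).
# Reproducing the numerical example quoted in the abstract:

dose_ref, voxel_ref_um = 0.25, 135.0    # Gy, micrometres (COV = 1 %)
dose_new = 5.0                          # Gy

voxel_new_um = voxel_ref_um * (dose_new / dose_ref) ** -0.25
print(f"voxel size at {dose_new} Gy for the same COV: {voxel_new_um:.0f} um")
# prints ~64 um, matching the ~65 um quoted in the abstract to rounding
```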

  1. Daily quality controls analysis of a CT scanner simulator

    International Nuclear Information System (INIS)

    Vasques, Maira Milanelo; Santos, Gabriela R.; Furnari, Laura

    2016-01-01

    With increasing technological development, radiotherapy techniques that allow better coverage of the tumor with the required therapeutic dose and minimize complications in normal tissues have become a reality in many radiotherapy services. The use of these resources, in turn, was only made possible by the progress achieved in treatment planning based on good-quality digital volumetric images, such as computed tomography (CT), which allow the correct delineation of the tumor volume and critical structures. Specific quality control tests for a CT scanner used in radiotherapy, known as a CT simulator, should be applied as part of the institutional Quality Assurance Program. This study presents the methodology used at the Instituto de Radiologia do Hospital das Clinicas da Faculdade de Medicina da Universidade de Sao Paulo (HCFMUSP) for the daily testing of the CT simulator and the results obtained over more than two years. The experience gained during this period showed that the tests are easy to perform and can be done in a few minutes by a trained professional. Data analysis showed good reproducibility, which allowed the tests to be performed less frequently after 16 months of data collection. (author)

  2. A fourier transform quality measure for iris images

    CSIR Research Space (South Africa)

    Makinana, S

    2014-08-01

    The aim is to ensure that good quality images are selected for feature extraction, in order to improve the iris recognition system. In addition, this research proposes a measure of iris image quality using a Fourier Transform. The experimental results demonstrate...

  3. Digitalization and networking of analog simulators and portal images

    Energy Technology Data Exchange (ETDEWEB)

    Pesznyak, C.; Zarand, P.; Mayer, A. [Uzsoki Hospital, Budapest (Hungary). Inst. of Oncoradiology

    2007-03-15

    Background: Many departments have analog simulators and irradiation facilities (especially cobalt units) without electronic portal imaging. Import of the images into the R and V (Record and Verify) system is required. Material and Methods: Simulator images are grabbed while portal films scanned by using a laser scanner and both converted into DICOM RT (Digital Imaging and Communications in Medicine Radiotherapy) images. Results: Image intensifier output of a simulator and portal films are converted to DICOM RT images and used in clinical practice. The simulator software was developed in cooperation at the authors' hospital. Conclusion: The digitalization of analog simulators is a valuable updating in clinical use replacing screen-film technique. Film scanning and digitalization permit the electronic archiving of films. Conversion into DICOM RT images is a precondition of importing to the R and V system. (orig.)

  4. Digitalization and networking of analog simulators and portal images.

    Science.gov (United States)

    Pesznyák, Csilla; Zaránd, Pál; Mayer, Arpád

    2007-03-01

    Many departments have analog simulators and irradiation facilities (especially cobalt units) without electronic portal imaging. Import of the images into the R&V (Record & Verify) system is required. Simulator images are grabbed while portal films scanned by using a laser scanner and both converted into DICOM RT (Digital Imaging and Communications in Medicine Radiotherapy) images. Image intensifier output of a simulator and portal films are converted to DICOM RT images and used in clinical practice. The simulator software was developed in cooperation at the authors' hospital. The digitalization of analog simulators is a valuable updating in clinical use replacing screen-film technique. Film scanning and digitalization permit the electronic archiving of films. Conversion into DICOM RT images is a precondition of importing to the R&V system.

  5. Image quality assessment based on multiscale geometric analysis.

    Science.gov (United States)

    Gao, Xinbo; Lu, Wen; Tao, Dacheng; Li, Xuelong

    2009-07-01

    Reduced-reference (RR) image quality assessment (IQA) has been recognized as an effective and efficient way to predict the visual quality of distorted images. The current standard is the wavelet-domain natural image statistics model (WNISM), which applies the Kullback-Leibler divergence between the marginal distributions of wavelet coefficients of the reference and distorted images to measure the image distortion. However, WNISM fails to consider the statistical correlations of wavelet coefficients in different subbands and the visual response characteristics of the mammalian cortical simple cells. In addition, wavelet transforms are optimal greedy approximations to extract singularity structures, so they fail to explicitly extract the image geometric information, e.g., lines and curves. Finally, wavelet coefficients are dense for smooth image edge contours. In this paper, to target the aforementioned problems in IQA, we develop a novel framework for IQA to mimic the human visual system (HVS) by incorporating the merits from multiscale geometric analysis (MGA), the contrast sensitivity function (CSF), and Weber's law of just noticeable difference (JND). In the proposed framework, MGA is utilized to decompose images and then extract features to mimic the multichannel structure of HVS. Additionally, MGA offers a series of transforms including wavelet, curvelet, bandelet, contourlet, wavelet-based contourlet transform (WBCT), and hybrid wavelets and directional filter banks (HWD), and different transforms capture different types of image geometric information. CSF is applied to weight coefficients obtained by MGA to simulate the appearance of images to observers by taking into account many of the nonlinearities inherent in HVS. JND is finally introduced to produce a noticeable variation in sensory experience. Thorough empirical studies are carried out upon the LIVE database against subjective mean opinion score (MOS) and demonstrate that 1) the proposed framework has
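
    For orientation, the wavelet-domain baseline (WNISM) that this entry criticizes reduces to comparing marginal distributions of wavelet coefficients with the Kullback-Leibler divergence. The sketch below, assuming PyWavelets is available and that the two images have the same size, is a simplified illustration of that baseline idea only; it is not the authors' MGA/CSF/JND framework, and the wavelet, level and bin choices are arbitrary.

```python
import numpy as np
import pywt

def wnism_style_distortion(reference: np.ndarray, distorted: np.ndarray,
                           wavelet: str = "db2", levels: int = 3,
                           bins: int = 128) -> float:
    """Average Kullback-Leibler divergence between histograms of matching
    wavelet detail subbands of a reference and a distorted image."""
    eps = 1e-10
    ref = pywt.wavedec2(reference.astype(float), wavelet, level=levels)
    dis = pywt.wavedec2(distorted.astype(float), wavelet, level=levels)

    divergences = []
    for ref_details, dis_details in zip(ref[1:], dis[1:]):      # skip approximation band
        for ref_band, dis_band in zip(ref_details, dis_details):
            lo = min(ref_band.min(), dis_band.min())
            hi = max(ref_band.max(), dis_band.max())
            p, _ = np.histogram(ref_band, bins=bins, range=(lo, hi))
            q, _ = np.histogram(dis_band, bins=bins, range=(lo, hi))
            p = p / p.sum() + eps
            q = q / q.sum() + eps
            divergences.append(float(np.sum(p * np.log(p / q))))
    return float(np.mean(divergences))
```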

  6. The mobile image quality survey game

    Science.gov (United States)

    Rasmussen, D. René

    2012-01-01

    In this paper we discuss human assessment of the quality of photographic still images that are degraded in various ways relative to an original, for example due to compression or noise. In particular, we examine and present results from a technique where observers view images on a mobile device, perform pairwise comparisons, identify defects in the images, and interact with the display to indicate the location of the defects. The technique measures the response time and accuracy of the responses. By posing the survey in a form similar to a game and providing performance feedback to the observer, the technique attempts to increase the engagement of the observers and to avoid exhausting them, a factor that is often a problem for subjective surveys. The results are compared with the known physical magnitudes of the defects and with results from similar web-based surveys. The strengths and weaknesses of the technique are discussed. Possible extensions of the technique to video quality assessment are also discussed.

  7. Arabidopsis Growth Simulation Using Image Processing Technology

    Directory of Open Access Journals (Sweden)

    Junmei Zhang

    2014-01-01

    Full Text Available This paper aims to provide a method to represent the virtual Arabidopsis plant at each growth stage. It includes simulating the shape and providing growth parameters. The shape is described with elliptic Fourier descriptors. First, the plant is segmented from the background using chromatic coordinates. From the segmentation result, the outer boundary series is obtained using a boundary-tracking algorithm. Elliptic Fourier analysis is then carried out to extract the coefficients of the contour. The coefficients require less storage than the original contour points and can be used to simulate the shape of the plant. The growth parameters include the total area and the number of leaves of the plant. The total area is obtained from the number of plant pixels and the image calibration result. The number of leaves is derived by detecting the apex of each leaf, which is achieved by using the wavelet transform to identify the local maxima of the distance signal between the contour points and the region centroid. Experimental results show that this method can record the growth stage of the Arabidopsis plant with less data and provide a visual platform for plant growth research.
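
    The elliptic Fourier description mentioned above follows the standard Kuhl-Giardina formulation; the sketch below computes the first harmonics of a closed boundary, under the assumption that the contour has already been extracted as an (N, 2) array of (x, y) points. Function and variable names are illustrative, not taken from the paper.

```python
import numpy as np

def elliptic_fourier_descriptors(contour: np.ndarray, order: int = 10) -> np.ndarray:
    """Elliptic Fourier coefficients (a_n, b_n, c_n, d_n) of a closed
    contour given as an (N, 2) array of (x, y) boundary points."""
    d_xy = np.diff(np.vstack([contour, contour[:1]]), axis=0)   # close the contour
    dt = np.hypot(d_xy[:, 0], d_xy[:, 1])
    dt[dt == 0] = 1e-12                                         # guard repeated points
    t = np.concatenate([[0.0], np.cumsum(dt)])
    T = t[-1]

    coeffs = np.zeros((order, 4))
    for n in range(1, order + 1):
        const = T / (2.0 * (n * np.pi) ** 2)
        phi = 2.0 * n * np.pi * t / T
        d_cos = np.cos(phi[1:]) - np.cos(phi[:-1])
        d_sin = np.sin(phi[1:]) - np.sin(phi[:-1])
        coeffs[n - 1, 0] = const * np.sum(d_xy[:, 0] / dt * d_cos)   # a_n
        coeffs[n - 1, 1] = const * np.sum(d_xy[:, 0] / dt * d_sin)   # b_n
        coeffs[n - 1, 2] = const * np.sum(d_xy[:, 1] / dt * d_cos)   # c_n
        coeffs[n - 1, 3] = const * np.sum(d_xy[:, 1] / dt * d_sin)   # d_n
    return coeffs
```

    Truncating to a modest number of harmonics is what gives the storage saving noted in the abstract: order x 4 coefficients replace the full list of boundary points.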

  8. Clinical efficiency, image quality and dosimetric considerations

    Energy Technology Data Exchange (ETDEWEB)

    Arreola, M. [Director of Clinical Radiological Physics, Shands Hospital at the University of Florida College of Medicine, Gainesville, FL (United States)

    2000-07-01

    Three decades have passed since the first clinical use of the famous EMI Computed Axial Tomography (CAT) scanner. At the time, the prospects for clinical success of this innovative idea were not very good. Time, however, has proven otherwise, as what is now simply known as computed tomography (CT) has been boosted in each one of these decades for different reasons. In the 1970s, technological progress augmented by the realization of the importance of tomographic imaging got everything started; in the 1980s, the boom in health care demand in the US solidified its position; and in the 1990s, the technological explosion in computers and the imperative need to lower costs in the health care industry have prompted the most dramatic changes in the way CT is utilized in the year 2000. Thus, different motivations have led the way of progress in CT at various times, and in spite of amazing developments in other crucial imaging modalities, such as ultrasound, Doppler ultrasound, digital subtraction angiography and magnetic resonance imaging, CT maintains its rightful place as the premier imaging modality in the modern radiology department. This work covers the basic principles of tomographic image reconstruction, and how axial CT scanners progressed historically in the first two decades. Developments in X-ray tubes and detection systems are highlighted, as well as the impact on clinical efficiency, image quality and patient doses. The basic construction of translate-rotate (1st and 2nd generation), rotate-rotate (3rd generation) and detector ring (4th generation) scanners is described. The so-called 5th generation scanner, the electron beam scanner, is also described, with its clinical and technical advantages and its inherent financial and maintenance disadvantages, which brought the advent of spiral and multi-slice scanners. These most recent developments in CT technology have opened a new era in the clinical use of CT; and although image quality has reached an expected

  9. Clinical efficiency, image quality and dosimetric considerations

    International Nuclear Information System (INIS)

    Arreola, M.

    2000-01-01

    Three decades have passed since the first clinical use of the famous EMI Computed Axial Tomography (CAT) scanner. At the time, the prospects for clinical success of this innovative idea were not very good. Time, however, has proven otherwise, as what is now simply known as computed tomography (CT) has been boosted in each one of these decades for different reasons. In the 1970s, technological progress augmented by the realization of the importance of tomographic imaging got everything started; in the 1980s, the boom in health care demand in the US solidified its position; and in the 1990s, the technological explosion in computers and the imperative need to lower costs in the health care industry have prompted the most dramatic changes in the way CT is utilized in the year 2000. Thus, different motivations have led the way of progress in CT at various times, and in spite of amazing developments in other crucial imaging modalities, such as ultrasound, Doppler ultrasound, digital subtraction angiography and magnetic resonance imaging, CT maintains its rightful place as the premier imaging modality in the modern radiology department. This work covers the basic principles of tomographic image reconstruction, and how axial CT scanners progressed historically in the first two decades. Developments in X-ray tubes and detection systems are highlighted, as well as the impact on clinical efficiency, image quality and patient doses. The basic construction of translate-rotate (1st and 2nd generation), rotate-rotate (3rd generation) and detector ring (4th generation) scanners is described. The so-called 5th generation scanner, the electron beam scanner, is also described, with its clinical and technical advantages and its inherent financial and maintenance disadvantages, which brought the advent of spiral and multi-slice scanners. These most recent developments in CT technology have opened a new era in the clinical use of CT; and although image quality has reached an expected

  10. Nonlinear filtering for character recognition in low quality document images

    Science.gov (United States)

    Diaz-Escobar, Julia; Kober, Vitaly

    2014-09-01

    Optical character recognition in scanned printed documents is a well-studied task, where the capture conditions such as sheet position, illumination, contrast and resolution are controlled. Nowadays, it is often more practical to use mobile devices for document capture than a scanner. As a consequence, the quality of document images is often poor owing to the presence of geometric distortions, nonhomogeneous illumination, low resolution, etc. In this work we propose to use multiple adaptive nonlinear composite filters for the detection and classification of characters. Computer simulation results obtained with the proposed system are presented and discussed.
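
    The record does not define its nonlinear composite filters, so the sketch below only illustrates the simpler linear building block they generalize: locating a character template by normalized cross-correlation, here via scikit-image's match_template. The threshold and function names are placeholders chosen for illustration.

```python
import numpy as np
from skimage.feature import match_template

def locate_character(page: np.ndarray, glyph: np.ndarray, threshold: float = 0.6):
    """Locate candidate occurrences of a character template in a document
    image using normalized cross-correlation (a linear stand-in for the
    nonlinear composite filters discussed above)."""
    response = match_template(page.astype(float), glyph.astype(float), pad_input=True)
    peaks = np.argwhere(response > threshold)          # (row, col) of strong matches
    scores = response[response > threshold]
    return peaks, scores
```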

  11. Influence of radiation dose on image quality

    Energy Technology Data Exchange (ETDEWEB)

    Reichmann, S; Aastrand, K [Sahlgrenska Sjukhuset, Goeteborg (Sweden)

    1979-01-01

    When the speed of a recording medium is doubled the background quantum mottle is increased by a factor √2. However, the signal/noise ratio is changed not in proportion to the square root of the exposure, but in a linear fashion, i.e. by a factor 2. The change in the depiction of objects with a very high attenuation difference in relation to their surroundings appears not to be linear, but proportional to the square root of the exposure. Such objects (metal wire meshes, lead bar grids) should thus be avoided in routine evaluation of image quality since they give incomplete information as to image impairment when high-speed recording media are used.

  12. Influence of radiation dose on image quality

    International Nuclear Information System (INIS)

    Reichmann, S.; Aastrand, K.

    1979-01-01

    When the speed of a recording medium is doubled the background quantum mottle is increased by a factor √2. However, the signal/noise ratio is changed not in proportion to the square root of the exposure, but in a linear fashion, i.e. by a factor 2. The change in the depiction of objects with a very high attenuation difference in relation to their surroundings appears not to be linear, but proportional to the square root of the exposure. Such objects (metal wire meshes, lead bar grids) should thus be avoided in routine evaluation of image quality since they give incomplete information as to image impairment when high-speed recording media are used. (Auth.)

  13. Image Quality in Screening Mammography in Croatia

    International Nuclear Information System (INIS)

    Brnic, Z.; Klasic, B.; Popic-Ramac, J.; Ljevar, A.

    2011-01-01

    Mortality reduction through screening mammography (SMG) is possible only with examinations of high image quality (IQ), which should be performed with an acceptable patient breast radiation dose (BRD). Besides film processing control, equipment assessment with a breast phantom and dosimetry, periodic external mammographic IQ assessment (MIQA) is needed, including assessment of image labelling (L), breast positioning (BP), exposure (EX) and artefacts (AR). The nationwide breast cancer screening program (NBSP) was introduced in Croatia in 2006, and MIQA was initiated as the first step in establishing a quality assurance/quality control (QA/QC) framework in breast imaging in Croatia. The current study aimed: (1) to provide objective evidence about the technical MIQ in the NBSP in Croatia, (2) to compare MIQ between different types of mammographic units (MUs), (3) to identify the common deficiencies, and (4) to propose corrective activities. Mammograms (MGs) for IQA were collected from a total of 84 MUs participating in the NBSP, representing 70% of all MUs nationwide. A total of 420 MG examinations were reviewed. Each MU was requested to submit 'what they consider to be their five best representative MGs, each one performed on one of five consecutive workdays'. The mean age of the MG machines was 7.76 years (range 2 - 21), with no difference between the four MU types. This very first study of MIQ in Croatia corroborated our intuitive impression of inadequate IQ, staff training and equipment in many MUs nationwide. As MIQ strongly influences BC detection rate, suboptimal QA/QC always carries a risk of compromising the success of the NBSP. Deficiencies in SMG, especially in ID and BP, reflect different levels of competency of radiological staff in Croatia. Differences in MIQ among MU types are determined by their organization, equipment, education, working habits and motivation. More efforts are needed to train both RTs and radiologists to implement and maintain QA/QC in their

  14. Data simulation for the Associated Particle Imaging system

    International Nuclear Information System (INIS)

    Tunnell, L.N.

    1994-01-01

    A data simulation procedure for the Associated Particle Imaging (API) system has been developed by postprocessing output from the Monte Carlo Neutron Photon (MCNP) code. This paper compares the simulated results to our experimental data

  15. Image quality assessment for video stream recognition systems

    Science.gov (United States)

    Chernov, Timofey S.; Razumnuy, Nikita P.; Kozharinov, Alexander S.; Nikolaev, Dmitry P.; Arlazarov, Vladimir V.

    2018-04-01

    Recognition and machine vision systems have long been widely used in many disciplines to automate various processes of life and industry. Input images of optical recognition systems can be subjected to a large number of different distortions, especially in uncontrolled or natural shooting conditions, which leads to unpredictable results of recognition systems, making it impossible to assess their reliability. For this reason, it is necessary to perform quality control of the input data of recognition systems, which is facilitated by modern progress in the field of image quality evaluation. In this paper, we investigate the approach to designing optical recognition systems with built-in input image quality estimation modules and feedback, for which the necessary definitions are introduced and a model for describing such systems is constructed. The efficiency of this approach is illustrated by the example of solving the problem of selecting the best frames for recognition in a video stream for a system with limited resources. Experimental results are presented for the system for identity documents recognition, showing a significant increase in the accuracy and speed of the system under simulated conditions of automatic camera focusing, leading to blurring of frames.
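
    The paper's quality-estimation module is not specified in the record; as a hedged illustration of the general idea, the sketch below gates frames of a video stream on a cheap no-reference focus measure (variance of the Laplacian), so that only the sharpest frame in each window is passed to a recognizer. The window size and the use of sharpness alone are assumptions for this example.

```python
import numpy as np
from scipy import ndimage

def sharpness(frame: np.ndarray) -> float:
    """Cheap focus measure: variance of the Laplacian. Blurred frames
    (e.g. during autofocus hunting) give low values."""
    return float(ndimage.laplace(frame.astype(float)).var())

def best_frames(frames, window: int = 10):
    """Yield the sharpest frame out of every `window` consecutive frames,
    so only the most promising frames reach the recognition stage."""
    buffer = []
    for frame in frames:
        buffer.append((sharpness(frame), frame))
        if len(buffer) == window:
            yield max(buffer, key=lambda item: item[0])[1]
            buffer.clear()
    if buffer:
        yield max(buffer, key=lambda item: item[0])[1]
```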

  16. Quality of intensive care chest imaging

    International Nuclear Information System (INIS)

    Adam, G.; Wein, B.; Keulers, P.; Stargardt, A.; Guenther, R.W.

    1989-01-01

    The authors have evaluated the image quality of a stimulable phosphor plate system in intensive care chest radiography. Four radiologists examined 308 chest radiographs (200 conventional, 108 digital) according to the following criteria: visibility of catheters, tubes (artificial objects), bronchi, central and peripheral vessels, diaphragm, trachea, and retrocardiac lung parenchyma. Detectability of these structures was classified as good, poor, or impossible to see. In addition, optical density was measured in the region of the liver, heart, and lung. Results were evaluated with Student's t-test and the U-test

  17. Acceptance testing and quality assurance of Simulix evolution radiotherapy simulator

    International Nuclear Information System (INIS)

    Sinha, Ashutosh; Singh, Navin; Gurjar, Om Prakash; Bagdare, Priyusha

    2015-01-01

    The success of radiotherapy depends on precise treatment simulation and proper patient positioning. The simulator is a conventional radiographic and fluoroscopic system which emulates the geometrical positions of a radiotherapy treatment unit. Hence, acceptance tests and quality assurance (QA) of the simulator are important prior to its commissioning for safe and precise clinical use. Verification of mechanical and optical readouts, field size, isocenter, and optical and radiation field congruence was performed. The X-ray beam parameters were tested for kVp, mAs and consistency of radiation output. The flat panel detector performance was checked with respect to resolution, low contrast sensitivity (LCS), automatic dose rate control (ADRC), and gray image resolution (GIR). The possibility of collision between the gantry, table, and imaging system was checked. A radiation survey around the room was also performed. The field size test on digital readout and on graph paper, the results of the isocenter check for rotation of the gantry, collimator, and couch, and the deviations observed in auto-stop for various movements were found to be within the tolerance limits. The optical and radiation fields were found to be congruent. All the lasers were aligned with the established isocenter. The maximum deviation between set and measured kV was found to be 3% in fluoro mode. The maximum deviation observed in mAs was 1.5% in 3-point as well as in 2-point film-exposed mode. The X-ray output was found to be consistent. The results of tests for resolution, LCS, ADRC, and GIR of the flat panel detector were within tolerance limits. All six safety interlocks were found to be working. The radiation level around the room was within acceptable limits. All the tests carried out were within the tolerance limits. The data taken in this study will provide basic support for the routine QA of the simulator. (author)

  18. Improvements in image quality with pseudo-parallel imaging in the phase-scrambling Fourier transform technique

    International Nuclear Information System (INIS)

    Ito, Satoshi; Kawawa, Yasuhiro; Yamada, Yoshifumi

    2010-01-01

    The signal obtained in the phase-scrambling Fourier transform (PSFT) imaging technique can be transformed to the signal described by the Fresnel transform of the objects, in which the amplitude of the PSFT presents some kind of blurred image of the objects. Therefore, the signal can be considered to exist in the object domain as well as the Fourier domain of the object. This notable feature makes it possible to assign weights to the reconstructed images by applying a weighting function to the PSFT signal after data acquisition, and as a result, pseudo-parallel image reconstruction using these aliased image data with different weights on the images is feasible. In this study, the improvements in image quality with such pseudo-parallel imaging were examined and demonstrated. The weighting function of the PSFT signal that provides a given weight on the image is estimated using the obtained image data and is iteratively updated after sensitivity encoding (SENSE)-based image reconstruction. Simulation studies showed that reconstruction errors were dramatically reduced and that the spatial resolution was also improved in almost all image spaces. The proposed method was applied to signals synthesized from MR image data with phase variations to verify its effectiveness. It was found that the image quality was improved and that images almost entirely free of aliasing artifacts could be obtained. (author)

  19. Monte Carlo simulation of PET and SPECT imaging of {sup 90}Y

    Energy Technology Data Exchange (ETDEWEB)

    Takahashi, Akihiko, E-mail: takahsr@hs.med.kyushu-u.ac.jp; Sasaki, Masayuki [Department of Health Sciences, Faculty of Medical Sciences, Kyushu University, 3-1-1 Maidashi, Higashi-ku, Fukuoka 812-8582 (Japan); Himuro, Kazuhiko; Yamashita, Yasuo; Komiya, Isao [Division of Radiology, Department of Medical Technology, Kyushu University Hospital, 3-1-1 Maidashi, Higashi-ku, Fukuoka 812-8582 (Japan); Baba, Shingo [Department of Clinical Radiology, Kyushu University Hospital, 3-1-1 Maidashi, Higashi-ku, Fukuoka 812-8582 (Japan)

    2015-04-15

    Purpose: Yttrium-90 ({sup 90}Y) is traditionally thought of as a pure beta emitter, and is used in targeted radionuclide therapy, with imaging performed using bremsstrahlung single-photon emission computed tomography (SPECT). However, because {sup 90}Y also emits positrons through internal pair production with a very small branching ratio, positron emission tomography (PET) imaging is also available. Because of the insufficient image quality of {sup 90}Y bremsstrahlung SPECT, PET imaging has been suggested as an alternative. In this paper, the authors present the Monte Carlo-based simulation–reconstruction framework for {sup 90}Y to comprehensively analyze the PET and SPECT imaging techniques and to quantitatively consider the disadvantages associated with them. Methods: Our PET and SPECT simulation modules were developed using Monte Carlo simulation of Electrons and Photons (MCEP), developed by Dr. S. Uehara. The PET code (MCEP-PET) generates a sinogram and reconstructs the tomographic image using a time-of-flight ordered subset expectation maximization (TOF-OSEM) algorithm with attenuation compensation. To evaluate MCEP-PET, simulated results of {sup 18}F PET imaging were compared with the experimental results, which confirmed that MCEP-PET can simulate the experimental results very well. The SPECT code (MCEP-SPECT) models the collimator and NaI detector system, and generates the projection images and projection data. To save computation time, the authors adopt prerecorded {sup 90}Y bremsstrahlung photon data calculated by MCEP. The projection data are also reconstructed using the OSEM algorithm. The authors simulated PET and SPECT images of a water phantom containing six hot spheres filled with different concentrations of {sup 90}Y without background activity. The amount of activity was 163 MBq, with an acquisition time of 40 min. Results: The simulated {sup 90}Y-PET image accurately reproduced the experimental results. PET image is visually
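
    The TOF-OSEM reconstruction mentioned above builds on the basic MLEM update; the sketch below shows that core update for a generic (dense) system matrix, without time-of-flight weighting, attenuation compensation or subsets, none of which are taken from MCEP-PET itself. All names are illustrative.

```python
import numpy as np

def mlem(system_matrix: np.ndarray, sinogram: np.ndarray,
         iterations: int = 20) -> np.ndarray:
    """Basic MLEM reconstruction: `system_matrix` maps image voxels to
    sinogram bins; the image is updated multiplicatively each iteration."""
    n_bins, n_voxels = system_matrix.shape
    image = np.ones(n_voxels)
    sensitivity = system_matrix.sum(axis=0)             # backprojection of ones
    sensitivity[sensitivity == 0] = 1e-12
    for _ in range(iterations):
        expected = system_matrix @ image                # forward projection
        ratio = sinogram / np.maximum(expected, 1e-12)
        image *= (system_matrix.T @ ratio) / sensitivity
    return image
```

    OSEM applies the same multiplicative update cyclically to subsets of the projection data, which is what accelerates convergence in the reconstructions described above.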

  20. Image quality preferences among radiographers and radiologists. A conjoint analysis

    International Nuclear Information System (INIS)

    Ween, Borgny; Kristoffersen, Doris Tove; Hamilton, Glenys A.; Olsen, Dag Rune

    2005-01-01

    Purpose: The aim of this study was to investigate the image quality preferences among radiographers and radiologists. The radiographers' preferences are mainly related to technical parameters, whereas radiologists assess image quality based on diagnostic value. Methods: A conjoint analysis was undertaken to survey image quality preferences; the study included 37 respondents: 19 radiographers and 18 radiologists. Digital urograms were post-processed into 8 images with different properties of image quality for 3 different patients. The respondents were asked to rank the images according to their personally perceived subjective image quality. Results: Nearly half of the radiographers and radiologists were consistent in their ranking of the image characterised as 'very best image quality'. The analysis showed, moreover, that chosen filtration level and image intensity were responsible for 72% and 28% of the preferences, respectively. The corresponding figures for each of the two professions were 76% and 24% for the radiographers, and 68% and 32% for the radiologists. In addition, there were larger variations in image preferences among the radiologists, as compared to the radiographers. Conclusions: Radiographers revealed a more consistent preference than the radiologists with respect to image quality. There is a potential for image quality improvement by developing sets of image property criteria

  1. Computer-simulated images of icosahedral, pentagonal and decagonal clusters of atoms

    International Nuclear Information System (INIS)

    Peng JuLin; Bursill, L.A.

    1989-01-01

    The aim of this work was to assess, by computer simulation, the sensitivity of high-resolution electron microscopy (HREM) images for a set of icosahedral and decagonal clusters containing 50-400 atoms. An experimental study of both crystalline and quasi-crystalline alloys of Al(Si)Mn is presented, in which carefully chosen electron optical conditions were established by computer simulation and then used to obtain high-quality images. It was concluded that while there is a very significant degree of model sensitivity available, direct inversion from image to structure is not a realistic possibility. A reasonable procedure would be to record experimental images of known complex icosahedral alloys in a crystalline phase, then use the computer simulations to identify fingerprint imaging conditions whereby certain structural elements could be identified in images of quasi-crystalline or amorphous specimens. 27 refs., 12 figs., 1 tab

  2. Fingerprint matching algorithm for poor quality images

    Directory of Open Access Journals (Sweden)

    Vedpal Singh

    2015-04-01

    Full Text Available The main aim of this study is to establish an efficient platform for fingerprint matching for low-quality images. Generally, fingerprint matching approaches use minutiae points for authentication. However, this is not a reliable authentication method for low-quality images. To overcome this problem, the current study proposes a fingerprint matching methodology based on normalised cross-correlation, which improves performance, reduces miscalculations during authentication, and decreases computational complexity. The error rate of the proposed method is 5.4%, which is less than the 5.6% error rate of two-dimensional (2D) dynamic programming (DP), while Lee's method produces 5.9% and the combined method has a 6.1% error rate. The genuine accept rate at a 1% false accept rate is 89.3%, but at 0.1% it is 96.7%, which is higher. The outcome of this study suggests that the proposed methodology has a low error rate with minimal computational effort compared with existing methods such as Lee's method, 2D DP and the combined method.
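
    As a rough sketch of the normalised cross-correlation idea in the abstract (not the authors' full pipeline), the snippet below computes a zero-mean NCC score between two equally sized fingerprint images and applies an illustrative threshold; alignment, preprocessing and the threshold value are all assumptions made here.

```python
import numpy as np

def normalized_cross_correlation(probe: np.ndarray, gallery: np.ndarray) -> float:
    """Zero-mean normalized cross-correlation between two equally sized
    fingerprint images; values near 1 indicate a likely match even when
    minutiae extraction is unreliable in low-quality prints."""
    a = probe.astype(float) - probe.mean()
    b = gallery.astype(float) - gallery.mean()
    denom = np.sqrt((a ** 2).sum() * (b ** 2).sum())
    return float((a * b).sum() / denom) if denom > 0 else 0.0

def is_match(probe: np.ndarray, gallery: np.ndarray, threshold: float = 0.5) -> bool:
    # Threshold chosen for illustration only; a real system would tune it
    # on a development set to balance false accepts and false rejects.
    return normalized_cross_correlation(probe, gallery) >= threshold
```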

  3. Image Quality Characteristics of Handheld Display Devices for Medical Imaging

    Science.gov (United States)

    Yamazaki, Asumi; Liu, Peter; Cheng, Wei-Chung; Badano, Aldo

    2013-01-01

    Handheld devices such as mobile phones and tablet computers have become widespread with thousands of available software applications. Recently, handhelds are being proposed as part of medical imaging solutions, especially in emergency medicine, where immediate consultation is required. However, handheld devices differ significantly from medical workstation displays in terms of display characteristics. Moreover, the characteristics vary significantly among device types. We investigate the image quality characteristics of various handheld devices with respect to luminance response, spatial resolution, spatial noise, and reflectance. We show that the luminance characteristics of the handheld displays are different from those of workstation displays complying with the grayscale standard target response, suggesting that luminance calibration might be needed. Our results also demonstrate that the spatial characteristics of handhelds can surpass those of medical workstation displays, particularly for recent-generation devices. While a 5 mega-pixel monochrome workstation display has horizontal and vertical modulation transfer factors of 0.52 and 0.47 at the Nyquist frequency, handheld displays released after 2011 can have values higher than 0.63 at the respective Nyquist frequencies. The noise power spectra for workstation displays are higher than 1.2×10⁻⁵ mm² at 1 mm⁻¹, while handheld displays have values lower than 3.7×10⁻⁶ mm². Reflectance measurements on some of the handheld displays are consistent with measurements for workstation displays with, in some cases, low specular and diffuse reflectance coefficients. The variability of the characterization results among devices due to the different technological features indicates that image quality varies greatly among handheld display devices. PMID:24236113

  4. Practical evaluation of clinical image quality (4). Determination of image quality in digital radiography system

    International Nuclear Information System (INIS)

    Katayama, Reiji

    2016-01-01

    Recently, digital radiography systems have become widely used in clinical practice for medical imaging. However, a previous study reported that patient radiation exposure with digital radiography is in fact not lower than with analog radiography systems. With conventional analog screen-film radiography, over-exposure demands attention because it results in a high film density. With digital radiography systems, however, the automatic adjustment of image density means that over-exposure is not apparent in the image, so technologists tend to become careless and the chance of over-exposure increases. Current digital radiography systems offer high-performance image properties and are capable of reducing patient dose. In particular, the image quality of flat panel detector systems is recognized to be higher than that of computed radiography systems using imaging plates, in both objective and subjective evaluations. Therefore, we technologists are responsible for optimizing the balance between the image quality of the digital radiogram and the radiation dose required for each case. Moreover, as medical technologists we are also required to make effective use of such image quality evaluation results for the benefit of patients. (author)

  5. Multi-scale imaging and elastic simulation of carbonates

    Science.gov (United States)

    Faisal, Titly Farhana; Awedalkarim, Ahmed; Jouini, Mohamed Soufiane; Jouiad, Mustapha; Chevalier, Sylvie; Sassi, Mohamed

    2016-05-01

    Digital Rock Physics (DRP) is an emerging technology that can be used to generate high-quality, fast and cost-effective special core analysis (SCAL) properties compared to conventional experimental and modeling techniques. The primary workflow of DRP consists of three elements: 1) image the rock sample using high-resolution 3D scanning techniques (e.g. micro-CT, FIB/SEM), 2) process and digitize the images by segmenting the pore and matrix phases, 3) simulate the desired physical properties of the rocks, such as elastic moduli and wave propagation velocities. A Finite Element Method based algorithm developed by Garboczi and Day [1], which discretizes the basic Hooke's law equation of linear elasticity and solves it numerically using a fast conjugate gradient solver, is used for mechanical and elastic property simulations. This elastic algorithm works directly on the digital images by treating each pixel as an element. The images are assumed to have periodic constant-strain boundary conditions. The bulk and shear moduli of the different phases are required inputs. For standard 1.5" diameter cores, however, the micro-CT scanning resolution (around 40 μm) does not reveal the smaller micro- and nano-pores below the resolution limit. This results in an unresolved "microporous" phase, the moduli of which are uncertain. Knackstedt et al. [2] assigned effective elastic moduli to the microporous phase based on self-consistent theory (which gives good estimates of velocities for well-cemented granular media). Jouini et al. [3] segmented the core plug CT scan image into three phases and assumed that the microporous phase is represented by a sub-extracted micro-plug (which was also scanned using micro-CT). Currently, elastic numerical simulations based on CT images alone largely overpredict the bulk, shear and Young's moduli when compared to laboratory acoustic tests of the same rocks. For greater accuracy of numerical simulation predictions, better estimates of moduli inputs

  6. Implementation of dictionary pair learning algorithm for image quality improvement

    Science.gov (United States)

    Vimala, C.; Aruna Priya, P.

    2018-04-01

    This paper proposes an image denoising method based on a dictionary pair learning algorithm. Visual information transmitted in the form of digital images is becoming a major method of communication in the modern age, but the image obtained after transmission is often corrupted with noise. The received image needs processing before it can be used in applications. Image denoising involves the manipulation of the image data to produce a visually high-quality image.

  7. Blind CT image quality assessment via deep learning strategy: initial study

    Science.gov (United States)

    Li, Sui; He, Ji; Wang, Yongbo; Liao, Yuting; Zeng, Dong; Bian, Zhaoying; Ma, Jianhua

    2018-03-01

    Computed Tomography (CT) is one of the most important medical imaging modalities. CT images can be used to assist in the detection and diagnosis of lesions and to facilitate follow-up treatment. However, CT images are vulnerable to noise. There are two major sources that intrinsically cause noise in CT data, i.e., X-ray photon statistics and the electronic noise background. Therefore, it is necessary to perform image quality assessment (IQA) in CT imaging before diagnosis and treatment. Most existing CT image IQA methods are based on human observer studies. However, these methods are impractical in the clinic because they are complex and time-consuming. In this paper, we present a blind CT image quality assessment method based on a deep learning strategy. A database of 1500 CT images was constructed, containing 300 high-quality images and 1200 corresponding noisy images. Specifically, the high-quality images were used to simulate the corresponding noisy images at four different doses. The images were then scored by experienced radiologists on the following attributes: image noise, artifacts, edge and structure, overall image quality, and tumor size and boundary estimation, on a five-point scale. We trained a network to learn the nonlinear map from CT images to subjective evaluation scores, and then loaded the pre-trained model to yield a predicted score for each test image. To demonstrate the performance of the deep learning network in IQA, two correlation coefficients are utilized: the Pearson Linear Correlation Coefficient (PLCC) and the Spearman Rank Order Correlation Coefficient (SROCC). The experimental results demonstrate that the presented deep learning based IQA strategy can be used in CT image quality assessment.
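
    PLCC and SROCC, the agreement measures used above, can be computed directly with SciPy; the snippet below is a minimal illustration with made-up scores, not data from the study.

```python
import numpy as np
from scipy import stats

def iqa_agreement(predicted: np.ndarray, subjective: np.ndarray):
    """Agreement between model-predicted quality scores and subjective
    (radiologist) scores, as used to validate blind IQA models."""
    plcc, _ = stats.pearsonr(predicted, subjective)     # linearity of prediction
    srocc, _ = stats.spearmanr(predicted, subjective)   # monotonicity of ranking
    return plcc, srocc

# Example with made-up scores for five test images:
print(iqa_agreement(np.array([3.1, 4.2, 2.0, 4.8, 3.6]),
                    np.array([3.0, 4.0, 2.5, 5.0, 3.5])))
```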

  8. Numerical simulation and optimal design of Segmented Planar Imaging Detector for Electro-Optical Reconnaissance

    Science.gov (United States)

    Chu, Qiuhui; Shen, Yijie; Yuan, Meng; Gong, Mali

    2017-12-01

    Segmented Planar Imaging Detector for Electro-Optical Reconnaissance (SPIDER) is a cutting-edge electro-optical imaging technology intended to realize miniaturized, planar imaging systems. In this paper, the principle of SPIDER is numerically demonstrated based on partially coherent light theory, and a novel concept of an adjustable baseline-pairing SPIDER system is further proposed. Based on the simulation results, it is verified that the imaging quality can be effectively improved by adjusting the Nyquist sampling density, optimizing the baseline-pairing method and increasing the number of spectral channels of the demultiplexer. An adjustable baseline-pairing algorithm is therefore established for further enhancing image quality, and the optimal design procedure in SPIDER for arbitrary targets is also summarized. The SPIDER system with the adjustable baseline-pairing method can broaden its applications and reduce cost for the same imaging quality.

  9. Developing and validating a psychometric scale for image quality assessment

    International Nuclear Information System (INIS)

    Mraity, H.; England, A.; Hogg, P.

    2014-01-01

    Purpose: Using AP pelvis as a catalyst, this paper explains how a psychometric scale for image quality assessment can be created using Bandura's theory for self-efficacy. Background: Establishing an accurate diagnosis is highly dependent upon the quality of the radiographic image. Image quality, as a construct (i.e. set of attributes that makes up the image quality), continues to play an essential role in the field of diagnostic radiography. The process of assessing image quality can be facilitated by using criteria, such as the European Commission (EC) guidelines for quality criteria as published in 1996. However, with the advent of new technology (Computed Radiography and Digital Radiography), some of the EC criteria may no longer be suitable for assessing the visual quality of a digital radiographic image. Moreover, a lack of validated visual image quality scales in the literature can also lead to significant variations in image quality evaluation. Creating and validating visual image quality scales, using a robust methodology, could reduce variability and improve the validity and reliability of perceptual image quality evaluations

  10. Numerical simulation for neutron pinhole imaging in ICF

    International Nuclear Information System (INIS)

    Chen Faxin; Yang Jianlun; Wen Shuhuai

    2005-01-01

    Pinhole imaging of the neutron production in laser-driven inertial confinement fusion experiments can provide important information about the performance of various capsule designs. In order to obtain good experimental results, the performance of various pinhole designs needs to be judged qualitatively or quantitatively before the experiment. The imaging calculation can be simply separated into pinhole imaging and image spectral analysis. In this paper, pinhole imaging is discussed, and codes for neutron pinhole imaging and image display are programmed. The codes can be used to provide a theoretical foundation for pinhole design and simulated data for image analysis. (authors)

  11. Perceived image quality for autostereoscopic holograms in healthcare training

    Science.gov (United States)

    Goldiez, Brian; Abich, Julian; Carter, Austin; Hackett, Matthew

    2017-03-01

    The current state of dynamic light field holography requires further empirical investigation to ultimately advance this developing technology. This paper describes a user-centered design approach for gaining insight into the features most important to clinical personnel using emerging dynamic holographic displays. The approach describes the generation of a high quality holographic model of a simulated traumatic amputation above the knee using 3D scanning. Using that model, a set of static holographic prints will be created varying in color or monochrome, contrast ratio, and polygon density. Leveraging methods from image quality research, the goal for this paper is to describe an experimental approach wherein participants are asked to provide feedback regarding the elements previously mentioned in order to guide the ongoing evolution of holographic displays.

  12. Effect of quality control implementation on image quality of radiographic films and irradiation doses to patients

    International Nuclear Information System (INIS)

    Cheng Yuxi; Zhou Qipu; Ge Lijuan; Hou Changsong; Qi Xuesong; Yue Baorong; Wang Zuoling; Wei Kedao

    1999-01-01

    Objective: To study the changes in the image quality of radiographic films and the irradiation doses to patients after quality control (QC) implementation. Methods: The entrance surface doses (ESD) to patients were measured with TLD, and the image quality of radiographic films was evaluated on the basis of the CEC image quality criteria. Results: The ESD to patients were significantly reduced after QC implementation (P < 0.05), and the post-QC image quality was significantly improved in chest PA, lumbar spine AP and pelvis AP (P < 0.01 or P < 0.05). Conclusion: A significantly reduced irradiation dose with improved image quality can be obtained by QC implementation

  13. Bio-imaging and visualization for patient-customized simulations

    CERN Document Server

    Luo, Xiongbiao; Li, Shuo

    2014-01-01

    This book contains the full papers presented at the MICCAI 2013 workshop Bio-Imaging and Visualization for Patient-Customized Simulations (MWBIVPCS 2013). MWBIVPCS 2013 brought together researchers representing several fields, such as Biomechanics, Engineering, Medicine, Mathematics, Physics and Statistics. The contributions included in this book present and discuss new trends in those fields, using several methods and techniques, including the finite element method, similarity metrics, optimization processes, graphs, hidden Markov models, sensor calibration, fuzzy logic, data mining, cellular automata, active shape models, template matching and level sets. These serve as tools to address more efficiently different and timely applications involving signal and image acquisition, image processing and analysis, image segmentation, image registration and fusion, computer simulation, image-based modelling, simulation and surgical planning, image-guided robot-assisted surgery and image-based diagnosis.  This boo...

  14. Noise Estimation and Quality Assessment of Gaussian Noise Corrupted Images

    Science.gov (United States)

    Kamble, V. M.; Bhurchandi, K.

    2018-03-01

    Evaluating the exact amount of noise present in an image, and the quality of an image in the absence of a reference image, is a challenging task. We propose a near-perfect noise estimation method and a no-reference image quality assessment method for images corrupted by Gaussian noise. The proposed methods obtain an initial estimate of the noise standard deviation present in an image using the median of the wavelet transform coefficients and then obtain a near-exact estimate using curve fitting. The proposed noise estimation method provides an estimate of the noise within an average error of +/-4%. For quality assessment, this noise estimate is mapped to fit the Differential Mean Opinion Score (DMOS) using a nonlinear function. The proposed methods require minimal training and yield the noise estimate and an image quality score. Images from the Laboratory for Image and Video Engineering (LIVE) database and the Computational Perception and Image Quality (CSIQ) database are used for validation of the proposed quality assessment method. Experimental results show that the performance of the proposed quality assessment method is on par with existing no-reference image quality assessment metrics for Gaussian noise corrupted images.
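
    The initial estimate described above (median of wavelet coefficients) is commonly implemented as the robust median estimator applied to the finest diagonal subband; the sketch below shows that step only, assuming PyWavelets, and leaves out the curve-fitting refinement and the DMOS mapping described in the record.

```python
import numpy as np
import pywt

def estimate_noise_sigma(image: np.ndarray) -> float:
    """Initial estimate of the Gaussian noise standard deviation from the
    median absolute value of the finest diagonal wavelet coefficients
    (the classic robust median estimator)."""
    _, (_, _, diagonal) = pywt.dwt2(image.astype(float), "db1")
    return float(np.median(np.abs(diagonal)) / 0.6745)
```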

  15. Image quality transfer and applications in diffusion MRI

    DEFF Research Database (Denmark)

    Alexander, Daniel C.; Zikic, Darko; Ghosh, Aurobrata

    2017-01-01

    This paper introduces a new computational imaging technique called image quality transfer (IQT). IQT uses machine learning to transfer the rich information available from one-off experimental medical imaging devices to the abundant but lower-quality data from routine acquisitions. The procedure u...

  16. How the task of evaluating image quality influences viewing behavior

    NARCIS (Netherlands)

    Alers, H.; Bos, Lennart; Heynderickx, I.E.J.

    2011-01-01

    Image quality scores collected in subjective experiments are widely used in image quality research, particularly in the design of objective quality assessment algorithms. It is therefore of vital importance to make sure that the collected scores reflect viewers' opinions in real-life situations.

  17. Development of breast phantom for quality assessment of mammographic images

    Energy Technology Data Exchange (ETDEWEB)

    Arvelos, Jeniffer Miranda; Flores, Mabel Bustos; Amaral, Fernando; Rio, Margarita Chevalier del; Mourao, Arnaldo Prata, E-mail: jenifferarvelos00@gmail.com [Centro Federal de Educação Tecnológica de Minas Gerais (CEFET-MG), Belo Horizonte, MG (Brazil). Centro de Engenharia Biomedica; Universidade Federal de Minas Gerais (UFMG), Belo Horizonte, MG (Brazil). Departamento de Engenharia Nuclear; Universidad Complutense de Madrid (UCM), Madrid (Spain). Faculdad de Medicina. Departmento de Radiologia

    2017-11-01

    Diagnosis of breast cancer in young women may be impaired by the tissue composition of the breast in this age group, as fibroglandular tissue is present in greater amounts in young women and has a higher density than the fibrous and fatty tissues that predominate in women older than 40 years. The higher density of breast tissue makes it difficult to identify nodules in two-dimensional techniques, due to the overlapping of dense layers. Breast phantoms are used in the evaluation and quality control of clinical images, and therefore it is important to develop non-homogeneous phantoms that better simulate a real breast. Grouped microcalcifications are often the earliest changes associated with malignant neoplasm of the breast. In this work, a phantom was developed in the form of a compressed breast using an acrylic resin blend. The resin blend used to fill the interior of the phantom has a mammographic density similar to that of fibroglandular tissue, representing a dense breast. The lesions were made of an acrylic resin blend and calcium compounds that simulate breast abnormalities, representing nodules, macrocalcifications and microcalcifications of different dimensions and densities. They were distributed within the material representing fibroglandular tissue. The developed phantom has a thickness of 1 cm and may be combined with other plates to represent a dense breast of thickness between 5 and 6 cm. The main goal of the project is to evaluate the sensitivity of detection of these calcifications in relation to their density and location in the breast in two-dimensional images generated by mammography equipment. Mammographic images allow the visualization of the changes implemented in the phantom. The developed phantom may be used in the evaluation of diagnostic images generated by two-dimensional and three-dimensional imaging techniques. (author)

  18. Development of breast phantom for quality assessment of mammographic images

    International Nuclear Information System (INIS)

    Arvelos, Jeniffer Miranda; Flores, Mabel Bustos; Amaral, Fernando; Rio, Margarita Chevalier del; Mourao, Arnaldo Prata; Universidade Federal de Minas Gerais; Universidad Complutense de Madrid

    2017-01-01

    Diagnosis of breast cancer in young women may be impaired by the tissue composition of the breast in this age group, as fibroglandular tissue is present in greater amounts in young women and has a higher density than the fibrous and fatty tissues that predominate in women older than 40 years. The higher density of breast tissue makes it difficult to identify nodules in two-dimensional techniques, due to the overlapping of dense layers. Breast phantoms are used in the evaluation and quality control of clinical images, and therefore it is important to develop non-homogeneous phantoms that better simulate a real breast. Grouped microcalcifications are often the earliest changes associated with malignant neoplasm of the breast. In this work, a phantom was developed in the form of a compressed breast using an acrylic resin blend. The resin blend used to fill the interior of the phantom has a mammographic density similar to that of fibroglandular tissue, representing a dense breast. The lesions were made of an acrylic resin blend and calcium compounds that simulate breast abnormalities, representing nodules, macrocalcifications and microcalcifications of different dimensions and densities. They were distributed within the material representing fibroglandular tissue. The developed phantom has a thickness of 1 cm and may be combined with other plates to represent a dense breast of thickness between 5 and 6 cm. The main goal of the project is to evaluate the sensitivity of detection of these calcifications in relation to their density and location in the breast in two-dimensional images generated by mammography equipment. Mammographic images allow the visualization of the changes implemented in the phantom. The developed phantom may be used in the evaluation of diagnostic images generated by two-dimensional and three-dimensional imaging techniques. (author)

  19. Quality Improvement of Liver Ultrasound Images Using Fuzzy Techniques.

    Science.gov (United States)

    Bayani, Azadeh; Langarizadeh, Mostafa; Radmard, Amir Reza; Nejad, Ahmadreza Farzaneh

    2016-12-01

    Liver ultrasound images are very common and are often used to diagnose diffuse liver diseases such as fatty liver. However, the low quality of such images makes it difficult to analyze them and diagnose diseases. The purpose of this study, therefore, is to improve the contrast and quality of liver ultrasound images. In this study, a number of fuzzy-logic-based image contrast enhancement algorithms were applied, using Matlab 2013b, to liver ultrasound images in which the kidney is visible: contrast improvement using a fuzzy intensification operator, contrast improvement using fuzzy image histogram hyperbolization, and contrast improvement using fuzzy IF-THEN rules. Based on the Mean Squared Error and Peak Signal to Noise Ratio measured on different images, the fuzzy methods provided better results, and their implementation, compared with the histogram equalization method, improved both the contrast and visual quality of the images and the results of liver segmentation algorithms. Comparison of the four algorithms revealed the power of fuzzy logic in improving image contrast compared with traditional image processing algorithms. Moreover, the contrast improvement algorithm based on a fuzzy intensification operator was selected as the strongest algorithm according to the measured indicators. This method can also be used in future studies on other ultrasound images for quality improvement and other image processing and analysis applications.
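
    Of the three fuzzy methods compared, the intensification (INT) operator is the simplest to sketch: gray levels are mapped to memberships, pushed away from 0.5 to increase contrast, and mapped back. The snippet below is a generic textbook version of that operator, not the exact implementation evaluated in the study.

```python
import numpy as np

def fuzzy_intensification(image: np.ndarray, iterations: int = 1) -> np.ndarray:
    """Classic fuzzy INT operator: map gray levels to memberships in [0, 1],
    push memberships away from 0.5, then map back to the original range."""
    img = image.astype(float)
    g_min, g_max = img.min(), img.max()
    mu = (img - g_min) / max(g_max - g_min, 1e-12)
    for _ in range(iterations):
        mu = np.where(mu <= 0.5, 2.0 * mu ** 2, 1.0 - 2.0 * (1.0 - mu) ** 2)
    return (mu * (g_max - g_min) + g_min).astype(image.dtype)
```

    Repeating the operator increases the contrast stretch, which is why the iteration count is usually chosen per modality rather than fixed.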

  20. Optical Imaging and Radiometric Modeling and Simulation

    Science.gov (United States)

    Ha, Kong Q.; Fitzmaurice, Michael W.; Moiser, Gary E.; Howard, Joseph M.; Le, Chi M.

    2010-01-01

    OPTOOL software is a general-purpose optical systems analysis tool that was developed to offer a solution to problems associated with computational programs written for the James Webb Space Telescope optical system. It integrates existing routines into coherent processes, and provides a structure with reusable capabilities that allow additional processes to be quickly developed and integrated. It has an extensive graphical user interface, which makes the tool more intuitive and friendly. OPTOOL is implemented using MATLAB with a Fourier optics-based approach for point spread function (PSF) calculations. It features parametric and Monte Carlo simulation capabilities, and uses a direct integration calculation to permit high spatial sampling of the PSF. Exit pupil optical path difference (OPD) maps can be generated using combinations of Zernike polynomials or shaped power spectral densities. The graphical user interface allows rapid creation of arbitrary pupil geometries, and entry of all other modeling parameters to support basic imaging and radiometric analyses. OPTOOL provides the capability to generate wavefront-error (WFE) maps for arbitrary grid sizes. These maps are 2D arrays containing digitally sampled versions of functions ranging from Zernike polynomials, to combinations of sinusoidal wave functions in 2D, to functions generated from a spatial frequency power spectral distribution (PSD). It also can generate optical transfer functions (OTFs), which are incorporated into the PSF calculation. The user can specify radiometrics for the target and sky background, and key performance parameters for the instrument's focal plane array (FPA). This radiometric and detector model setup is fairly extensive, and includes parameters such as zodiacal background, thermal emission noise, read noise, and dark current. The setup also includes target spectral energy distribution as a function of wavelength for polychromatic sources, detector pixel size, and the FPA's charge
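
    The Fourier-optics step at the heart of such PSF calculations can be sketched in a few lines: the PSF is the squared modulus of the Fourier transform of the complex pupil function, whose phase is set by the exit-pupil OPD map. The snippet below is a generic illustration with an assumed pupil grid and padding factor; it is not OPTOOL's interface or its direct-integration method.

```python
import numpy as np

def psf_from_pupil(opd_waves: np.ndarray, pupil_mask: np.ndarray,
                   pad_factor: int = 4) -> np.ndarray:
    """Fourier-optics point spread function: squared modulus of the FFT of
    the complex pupil, with phase given by the OPD map (in waves)."""
    n = opd_waves.shape[0] * pad_factor                   # zero-pad for finer PSF sampling
    pupil = pupil_mask * np.exp(2j * np.pi * opd_waves)
    field = np.zeros((n, n), dtype=complex)
    field[:opd_waves.shape[0], :opd_waves.shape[1]] = pupil
    psf = np.abs(np.fft.fftshift(np.fft.fft2(field))) ** 2
    return psf / psf.sum()                                # normalize to unit energy

# Example: unaberrated circular pupil on a 128 x 128 grid (an assumption,
# not an OPTOOL configuration) gives an Airy-like PSF.
y, x = np.mgrid[-64:64, -64:64] / 64.0
mask = ((x ** 2 + y ** 2) <= 1.0).astype(float)
airy_like = psf_from_pupil(np.zeros((128, 128)), mask)
```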

  1. Inter-observer variation in masked and unmasked images for quality evaluation of clinical radiographs

    International Nuclear Information System (INIS)

    Tingberg, A.; Eriksson, F.; Medin, J.; Besjakov, J.; Baarth, M.; Haakansson, M.; Sandborg, M.; Almen, A.; Lanhede, B.; Alm-Carlsson, G.; Mattsson, S.; Maansson, L. G.

    2005-01-01

    Purpose: To investigate the influence of masking on the inter-observer variation in image quality evaluation of clinical radiographs of the chest and lumbar spine. Background: Inter-observer variation is a major problem in image quality evaluation, since this variation is often much larger than the variation in image quality between, for example, two radiographic systems. In this study, we have evaluated the effect of masking on the inter-observer variation. The idea of the masking was to force every observer to view exactly the same part of the image and to avoid the effect of the overall 'first impression' of the image. A discussion with a group of European expert radiologists before the study indicated that masking might be a good way to reduce the inter-observer variation. Methods: Five chest and five lumbar spine radiographs were collected together with detailed information regarding exposure conditions. The radiographs were digitised with a high-performance scanner and five different manipulations were performed, simulating five different exposure conditions. The contrast, noise and spatial resolution were manipulated by this method. The images were printed onto film, and individual masks were produced for each film, showing only the parts of the images that were necessary for the image quality evaluation. The quality of the images was evaluated on ordinary viewing boxes by a large group of experienced radiologists. The images were examined with and without the masks using a set of image criteria (fulfilled, 1 point; not fulfilled, 0 points), and the mean score was calculated for each simulated exposure condition. Results: The results of this study indicate that - contrary to what was supposed - the inter-observer variation increased when the images were masked. In some cases, especially for chest, this increase was statistically significant. Conclusions: Based on the results of this study, image masking in studies of fulfilment of image criteria cannot

  2. RADARSAT-1 Image Quality Excellence in the Extended Mission

    National Research Council Canada - National Science Library

    Srivastava, S. K; Cote, S; Le Dantec, P; Hawkins, R. K

    2005-01-01

    ... after its launch on November 4, 1995. Both single beams and ScanSAR imagery are still monitored routinely for radiometric calibration performance based on images of the Amazon Rainforest, and for image quality performance using imagery...

  3. Image quality influences the assessment of left ventricular function

    DEFF Research Database (Denmark)

    Grossgasteiger, Manuel; Hien, Maximilian D; Graser, Bastian

    2014-01-01

    divided by the total endocardial border. These ratings were used to generate groups of poor (0%-40%), fair (41%-70%), and good (71%-100%) image quality. The ejection fraction (EF), end-diastolic volume, and end-systolic volume were analyzed by the Simpson method of disks (biplane and monoplane), eyeball...... method yield better correlations with poor image quality. The eyeball method was unaffected by image quality....
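
    For reference, the quantities abbreviated in this record follow the standard definitions (not values or formulas specific to this study): the method of disks sums N elliptical disks along the long axis, and the ejection fraction is then computed from the resulting volumes,

    $$ V \approx \frac{\pi}{4}\,\frac{L}{N}\sum_{i=1}^{N} a_i\, b_i, \qquad \mathrm{EF} = \frac{\mathrm{EDV}-\mathrm{ESV}}{\mathrm{EDV}} \times 100\% $$

    where a_i and b_i are the disk diameters measured in the two orthogonal apical planes (the monoplane variant uses a_i = b_i), L is the long-axis length, and EDV and ESV are the end-diastolic and end-systolic volumes.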

  4. Image quality and dose differences caused by vendor-specific image processing of neonatal radiographs.

    Science.gov (United States)

    Sensakovic, William F; O'Dell, M Cody; Letter, Haley; Kohler, Nathan; Rop, Baiywo; Cook, Jane; Logsdon, Gregory; Varich, Laura

    2016-10-01

    Image processing plays an important role in optimizing image quality and radiation dose in projection radiography. Unfortunately commercial algorithms are black boxes that are often left at or near vendor default settings rather than being optimized. We hypothesize that different commercial image-processing systems, when left at or near default settings, create significant differences in image quality. We further hypothesize that image-quality differences can be exploited to produce images of equivalent quality but lower radiation dose. We used a portable radiography system to acquire images on a neonatal chest phantom and recorded the entrance surface air kerma (ESAK). We applied two image-processing systems (Optima XR220amx, by GE Healthcare, Waukesha, WI; and MUSICA(2) by Agfa HealthCare, Mortsel, Belgium) to the images. Seven observers (attending pediatric radiologists and radiology residents) independently assessed image quality using two methods: rating and matching. Image-quality ratings were independently assessed by each observer on a 10-point scale. Matching consisted of each observer matching GE-processed images and Agfa-processed images with equivalent image quality. A total of 210 rating tasks and 42 matching tasks were performed and effective dose was estimated. Median Agfa-processed image-quality ratings were higher than GE-processed ratings. Non-diagnostic ratings were seen over a wider range of doses for GE-processed images than for Agfa-processed images. During matching tasks, observers matched image quality between GE-processed images and Agfa-processed images acquired at a lower effective dose (11 ± 9 μSv; P < 0.0001). Image-processing methods significantly impact perceived image quality. These image-quality differences can be exploited to alter protocols and produce images of equivalent image quality but lower doses. Those purchasing projection radiography systems or third-party image-processing software should be aware that image

  5. Design of a practical model-observer-based image quality assessment method for CT imaging systems

    Science.gov (United States)

    Tseng, Hsin-Wu; Fan, Jiahua; Cao, Guangzhi; Kupinski, Matthew A.; Sainath, Paavana

    2014-03-01

    The channelized Hotelling observer (CHO) is a powerful method for quantitative image quality evaluations of CT systems and their image reconstruction algorithms. It has recently been used to validate the dose reduction capability of iterative image-reconstruction algorithms implemented on CT imaging systems. The use of the CHO for routine and frequent system evaluations is desirable both for quality assurance evaluations as well as further system optimizations. The use of channels substantially reduces the amount of data required to achieve accurate estimates of observer performance. However, the number of scans required is still large even with the use of channels. This work explores different data reduction schemes and designs a new approach that requires only a few CT scans of a phantom. For this work, the leave-one-out likelihood (LOOL) method developed by Hoffbeck and Landgrebe is studied as an efficient method of estimating the covariance matrices needed to compute CHO performance. Three different kinds of approaches are included in the study: a conventional CHO estimation technique with a large sample size, a conventional technique with fewer samples, and the new LOOL-based approach with fewer samples. The mean value and standard deviation of area under ROC curve (AUC) is estimated by shuffle method. Both simulation and real data results indicate that an 80% data reduction can be achieved without loss of accuracy. This data reduction makes the proposed approach a practical tool for routine CT system assessment.
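
    As a rough illustration of the observer computation underlying the study, a minimal channelized Hotelling observer sketch is given below; the random channel matrix, image sizes and uniform "lesion" signal are hypothetical stand-ins, not the channels or data-reduction schemes compared in the paper.

        import numpy as np

        def cho_snr(signal_imgs, noise_imgs, channels):
            # signal_imgs, noise_imgs: (n_images, n_pixels) arrays of signal-present
            # and signal-absent images; channels: (n_pixels, n_channels) matrix.
            v_sig = signal_imgs @ channels                      # channel outputs, signal class
            v_noi = noise_imgs @ channels                       # channel outputs, noise class
            dv = v_sig.mean(axis=0) - v_noi.mean(axis=0)        # mean channel-output difference
            # Pooled intra-class covariance of the channel outputs
            S = 0.5 * (np.cov(v_sig, rowvar=False) + np.cov(v_noi, rowvar=False))
            w = np.linalg.solve(S, dv)                          # Hotelling template in channel space
            snr = np.sqrt(dv @ w)                               # observer SNR; AUC ~ Phi(snr / sqrt(2))
            return snr

        # Hypothetical example: 64x64 images, 10 random channels, weak uniform "lesion"
        rng = np.random.default_rng(0)
        channels = rng.standard_normal((64 * 64, 10))
        noise_imgs = rng.standard_normal((100, 64 * 64))
        signal_imgs = rng.standard_normal((100, 64 * 64)) + 0.05
        print(cho_snr(signal_imgs, noise_imgs, channels))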

  6. Research on hyperspectral dynamic scene and image sequence simulation

    Science.gov (United States)

    Sun, Dandan; Liu, Fang; Gao, Jiaobo; Sun, Kefeng; Hu, Yu; Li, Yu; Xie, Junhu; Zhang, Lei

    2016-10-01

    This paper presents a simulation method for hyperspectral dynamic scenes and image sequences, intended for hyperspectral equipment evaluation and target detection algorithms. Because of its high spectral resolution, strong band continuity, anti-interference capability and other advantages, hyperspectral imaging technology has developed rapidly in recent years and is widely used in many areas such as optoelectronic target detection, military defense and remote sensing systems. Digital imaging simulation, as a crucial part of hardware-in-the-loop simulation, can be applied to testing and evaluating hyperspectral imaging equipment with lower development cost and a shorter development period. Meanwhile, visual simulation can produce large amounts of original image data under various conditions for hyperspectral image feature extraction and classification algorithms. Based on a radiation physics model and material characteristic parameters, this paper proposes a generation method for digital scenes. By building multiple sensor models for different bands and bandwidths, hyperspectral scenes in the visible, MWIR and LWIR bands, with spectral resolutions of 0.01 μm, 0.05 μm and 0.1 μm, have been simulated in this paper. The final dynamic scenes are highly realistic and run in real time, at frame rates up to 100 Hz. By saving all the scene gray-level data from the same viewpoint, an image sequence is obtained. The analysis results show that, in both the infrared and visible bands, the grayscale variations of the simulated hyperspectral images are consistent with the theoretical analysis results.

  7. Efficient scatter model for simulation of ultrasound images from computed tomography data

    Science.gov (United States)

    D'Amato, J. P.; Lo Vercio, L.; Rubi, P.; Fernandez Vera, E.; Barbuzza, R.; Del Fresno, M.; Larrabide, I.

    2015-12-01

    Background and motivation: Real-time ultrasound simulation refers to the process of computationally creating fully synthetic ultrasound images instantly. Due to the high value of specialized low cost training for healthcare professionals, there is a growing interest in the use of this technology and the development of high fidelity systems that simulate the acquisition of echographic images. The objective is to create an efficient and reproducible simulator that can run either on notebooks or desktops using low cost devices. Materials and methods: We present an interactive ultrasound simulator based on CT data. This simulator is based on ray-casting and provides real-time interaction capabilities. The simulation of scattering that is coherent with the transducer position in real time is also introduced. Such noise is produced using a simplified model of multiplicative noise and convolution with point spread functions (PSF) tailored for this purpose. Results: The computational efficiency of scattering map generation was revised, with improved performance. This allowed a more efficient simulation of coherent scattering in the synthetic echographic images while providing highly realistic results. We describe some quality and performance metrics to validate these results, where a performance of up to 55 fps was achieved. Conclusion: The proposed technique for real-time scattering modeling provides realistic yet computationally efficient scatter distributions. The error between the original image and the simulated scattering image was compared for the proposed method and the state-of-the-art, showing negligible differences in their distributions.
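
    The scatter model described above (multiplicative noise followed by convolution with a point spread function) can be sketched roughly as follows; the Rayleigh-distributed noise and the separable Gaussian PSF are illustrative assumptions, not the simulator's actual kernels.

        import numpy as np
        from scipy.signal import fftconvolve

        def speckle_like(echo_map, sigma_axial=1.5, sigma_lateral=3.0, seed=0):
            # Simplified scatter model: multiplicative noise followed by PSF convolution.
            rng = np.random.default_rng(seed)
            noisy = echo_map * rng.rayleigh(scale=1.0, size=echo_map.shape)
            # Separable Gaussian stands in for the axial/lateral point spread function
            t = np.arange(-8, 9)
            psf = np.outer(np.exp(-0.5 * (t / sigma_axial) ** 2),
                           np.exp(-0.5 * (t / sigma_lateral) ** 2))
            psf /= psf.sum()
            return fftconvolve(noisy, psf, mode="same")

        # echo_map would be an echogenicity map derived, e.g., from a rescaled CT slice
        simulated = speckle_like(np.ones((256, 256)))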

  8. Recognizable or Not: Towards Image Semantic Quality Assessment for Compression

    Science.gov (United States)

    Liu, Dong; Wang, Dandan; Li, Houqiang

    2017-12-01

    Traditionally, image compression was optimized for the pixel-wise fidelity or the perceptual quality of the compressed images given a bit-rate budget. But recently, compressed images are more and more utilized for automatic semantic analysis tasks such as recognition and retrieval. For these tasks, we argue that the optimization target of compression is no longer perceptual quality, but the utility of the compressed images in the given automatic semantic analysis task. Accordingly, we propose to evaluate the quality of the compressed images neither at pixel level nor at perceptual level, but at semantic level. In this paper, we make preliminary efforts towards image semantic quality assessment (ISQA), focusing on the task of optical character recognition (OCR) from compressed images. We propose a full-reference ISQA measure by comparing the features extracted from text regions of original and compressed images. We then propose to integrate the ISQA measure into an image compression scheme. Experimental results show that our proposed ISQA measure is much better than PSNR and SSIM in evaluating the semantic quality of compressed images; accordingly, adopting our ISQA measure to optimize compression for OCR leads to significant bit-rate saving compared to using PSNR or SSIM. Moreover, we perform subjective test about text recognition from compressed images, and observe that our ISQA measure has high consistency with subjective recognizability. Our work explores new dimensions in image quality assessment, and demonstrates promising direction to achieve higher compression ratio for specific semantic analysis tasks.

  9. Objective and Subjective Assessment of Digital Pathology Image Quality

    Directory of Open Access Journals (Sweden)

    Prarthana Shrestha

    2015-03-01

    Full Text Available The quality of an image produced by the Whole Slide Imaging (WSI) scanners is of critical importance for using the image in clinical diagnosis. Therefore, it is very important to monitor and ensure the quality of images. Since subjective image quality assessments by pathologists are very time-consuming, expensive and difficult to reproduce, we propose a method for objective assessment based on clinically relevant and perceptual image parameters: sharpness, contrast, brightness, uniform illumination and color separation; derived from a survey of pathologists. We developed techniques to quantify the parameters based on content-dependent absolute pixel performance and to manipulate the parameters in a predefined range resulting in images with content-independent relative quality measures. The method does not require a prior reference model. A subjective assessment of the image quality is performed involving 69 pathologists and 372 images (including 12 optimal quality images and their distorted versions per parameter at 6 different levels). To address the inter-reader variability, a representative rating is determined as a one-tailed 95% confidence interval of the mean rating. The results of the subjective assessment support the validity of the proposed objective image quality assessment method to model the readers’ perception of image quality. The subjective assessment also provides thresholds for determining the acceptable level of objective quality per parameter. The images for both the subjective and objective quality assessment are based on the HercepTestTM slides scanned by the Philips Ultra Fast Scanners, developed at Philips Digital Pathology Solutions. However, the method is applicable also to other types of slides and scanners.
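
    Simple content-based proxies for a few of the listed parameters can be computed as below; these generic measures (mean luminance, RMS contrast, mean gradient magnitude) are only illustrative and are not the content-dependent measures developed in the paper.

        import numpy as np

        def basic_quality_metrics(rgb):
            # rgb: float array of shape (H, W, 3) with values in [0, 1]
            gray = rgb @ np.array([0.299, 0.587, 0.114])     # luminance
            gy, gx = np.gradient(gray)
            return {
                "brightness": float(gray.mean()),            # mean luminance
                "contrast": float(gray.std()),               # RMS contrast
                "sharpness": float(np.hypot(gx, gy).mean()), # mean gradient magnitude
            }

        print(basic_quality_metrics(np.random.default_rng(1).random((512, 512, 3))))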

  10. Protocol for quality control of scanners used in the simulation of radiotherapy treatment planning

    International Nuclear Information System (INIS)

    Yanes, Yaima; Alfonso, Rodolfo; Silvestre, Ileana

    2009-01-01

    Computed Tomography (CT) has become the fundamental imaging tool of modern radiation therapy, used to locate targets and critical organs and for dose planning. Scanners used for these purposes require a strict quality assurance program, which differs in many aspects from the monitoring required for purely diagnostic use. The aim of this work has been the design and validation of a quality control protocol applicable to any CT scanner used for simulation and radiotherapy treatment planning. (author)

  11. Simulation Study of Real Time 3-D Synthetic Aperture Sequential Beamforming for Ultrasound Imaging

    DEFF Research Database (Denmark)

    Hemmsen, Martin Christian; Rasmussen, Morten Fischer; Stuart, Matthias Bo

    2014-01-01

    in the main system. The real-time imaging capability is achieved using a synthetic aperture beamforming technique, utilizing the transmit events to generate a set of virtual elements that in combination can generate an image. The two core capabilities in combination is named Synthetic Aperture Sequential......This paper presents a new beamforming method for real-time three-dimensional (3-D) ultrasound imaging using a 2-D matrix transducer. To obtain images with sufficient resolution and contrast, several thousand elements are needed. The proposed method reduces the required channel count from...... Beamforming (SASB). Simulations are performed to evaluate the image quality of the presented method in comparison to Parallel beamforming utilizing 16 receive beamformers. As indicators for image quality the detail resolution and Cystic resolution are determined for a set of scatterers at a depth of 90mm...

  12. Automated image quality assessment for chest CT scans.

    Science.gov (United States)

    Reeves, Anthony P; Xie, Yiting; Liu, Shuang

    2018-02-01

    Medical image quality needs to be maintained at standards sufficient for effective clinical reading. Automated computer analytic methods may be applied to medical images for quality assessment. For chest CT scans in a lung cancer screening context, an automated quality assessment method is presented that characterizes image noise and image intensity calibration. This is achieved by image measurements in three automatically segmented homogeneous regions of the scan: external air, trachea lumen air, and descending aorta blood. Profiles of CT scanner behavior are also computed. The method has been evaluated on both phantom and real low-dose chest CT scans and results show that repeatable noise and calibration measures may be realized by automated computer algorithms. Noise and calibration profiles show relevant differences between different scanners and protocols. Automated image quality assessment may be useful for quality control for lung cancer screening and may enable performance improvements to automated computer analysis methods. © 2017 American Association of Physicists in Medicine.
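
    The underlying measurements can be illustrated with a small sketch: given boolean masks for the three homogeneous regions, noise is the standard deviation and the calibration error is the deviation of the mean from a nominal CT number. The nominal values and the availability of ready-made masks are assumptions here, not the paper's segmentation method.

        import numpy as np

        # Nominal CT numbers used as calibration references (assumed values)
        NOMINAL_HU = {"external_air": -1000.0, "trachea_air": -1000.0, "aorta_blood": 40.0}

        def region_quality(ct_volume, masks):
            # ct_volume: 3-D array of HU values; masks: dict of boolean arrays
            # (same shape as ct_volume), one per region named in NOMINAL_HU.
            report = {}
            for name, mask in masks.items():
                values = ct_volume[mask]
                report[name] = {
                    "noise_hu": float(values.std()),
                    "calibration_error_hu": float(values.mean() - NOMINAL_HU[name]),
                }
            return report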

  13. Improving the Image Quality of Synthetic Transmit Aperture Ultrasound Images - Achieving Real-Time In-Vivo Imaging

    DEFF Research Database (Denmark)

    Gammelmark, Kim

    in-vivo experiments, showed, that TMS imaging can increase the SNR by as much as 17 dB compared to the traditional imaging techniques, which improves the in-vivo image quality to a highly competitive level. An in-vivo evaluation of convex array TMS imaging for abdominal imaging applications......-vivo imaging, and that the obtained image quality is highly competitive with the techniques applied in current medical ultrasound scanners. Hereby, the goals of the PhD have been successfully achieved.......Synthetic transmit aperture (STA) imaging has the potential to increase the image quality of medical ultrasound images beyond the levels obtained by conventional imaging techniques (linear, phased, and convex array imaging). Currently, however, in-vivo applications of STA imaging is limited...

  14. Dynamic flat panel detector versus image intensifier in cardiac imaging: dose and image quality

    Science.gov (United States)

    Vano, E.; Geiger, B.; Schreiner, A.; Back, C.; Beissel, J.

    2005-12-01

    The practical aspects of the dosimetric and imaging performance of a digital x-ray system for cardiology procedures were evaluated. The system was configured with an image intensifier (II) and later upgraded to a dynamic flat panel detector (FD). Entrance surface air kerma (ESAK) to phantoms of 16, 20, 24 and 28 cm of polymethyl methacrylate (PMMA) and the image quality of a test object were measured. Images were evaluated directly on the monitor and with numerical methods (noise and signal-to-noise ratio). Information contained in the DICOM header for dosimetry audit purposes was also tested. ESAK values per frame (or kerma rate) for the most commonly used cine and fluoroscopy modes for different PMMA thicknesses and for field sizes of 17 and 23 cm for II, and 20 and 25 cm for FD, produced similar results in the evaluated system with both technologies, ranging between 19 and 589 µGy/frame (cine) and 5 and 95 mGy min-1 (fluoroscopy). Image quality for these dose settings was better for the FD version. The 'study dosimetric report' is comprehensive, and its numerical content is sufficiently accurate. There is potential in the future to set those systems with dynamic FD to lower doses than are possible in the current II versions, especially for digital cine runs, or to benefit from improved image quality.

  15. Effects on MR images compression in tissue classification quality

    International Nuclear Information System (INIS)

    Santalla, H; Meschino, G; Ballarin, V

    2007-01-01

    It is known that image compression is required to optimize storage in memory. Moreover, transmission speed can be significantly improved. Lossless compression is used without controversy in medicine, though its benefits are limited. With lossy compression the image cannot be fully recovered; we can only recover an approximation. At this point the definition of 'quality' is essential. What do we understand by 'quality'? How can we evaluate a compressed image? Quality in images is an attribute with several definitions and interpretations, which ultimately depend on the subsequent use we want to give them. This work proposes a quantitative analysis of quality for lossy compressed Magnetic Resonance (MR) images and of its influence on automatic tissue classification performed with these images

  16. The influence of environment temperature on SEM image quality

    International Nuclear Information System (INIS)

    Chen, Li; Liu, Junshan

    2015-01-01

    As structure dimensions go down to the nano-scale, a scanning electron microscope (SEM) is often required to provide image magnifications up to 100 000 ×. However, SEM images at such high magnification usually suffer from a high resolution value and a low signal-to-noise ratio, which results in low SEM image quality. In this paper, the quality of the SEM image is improved by optimizing the environment temperature. The experimental results indicate that at 100 000 × the quality of the SEM image is influenced by the environment temperature, whereas at 50 000 × it is not. At 100 000 × the best SEM image quality is achieved with the environment temperature ranging from 292 to 294 K, and the SEM image quality evaluated by the double stimulus continuous quality scale method can increase from grade 1 to grade 5. It is expected that this image quality improvement method can be used in routine measurements with ordinary SEMs to obtain high quality images by optimizing the environment temperature. (paper)

  17. Quality assurance in digital dental imaging: a systematic review.

    Science.gov (United States)

    Metsälä, Eija; Henner, Anja; Ekholm, Marja

    2014-07-01

    Doses induced by individual dental examinations are low. However, dental radiography accounts for nearly one third of the total number of radiological examinations in the European Union. Therefore, special attention is needed with regard to radiation protection. In order to lower patient doses, the staff performing dental examinations must have competence in imaging as well as in radiation protection issues. This paper presents a systematic review of the core competencies needed by healthcare staff performing quality assurance in digital dental radiological imaging. The following databases were searched: PubMed, CINAHL, ProQuest and the IEEE Xplore digital library. Volumes of some dental imaging journals and doctoral theses of the Finnish universities educating dentists were also searched. The search was performed using both MeSH terms and keywords, with the option 'search all text'. The original keywords were: dental imaging, digital, x-ray, panoramic, quality, assurance, competence, competency, skills, knowledge, radiographer, radiologist technician, dentist, oral hygienist, radiation protection and their Finnish synonyms. The core competencies needed by healthcare staff performing digital dental radiological imaging quality assurance described in the selected studies were: management of dental imaging equipment, competence in image quality and factors associated with it, dose optimization and quality assurance. In the future there will be higher doses in dental imaging due to the increasing use of CBCT and digital imaging. The staff performing dental imaging must have competence in the dental imaging quality assurance issues found in this review. They also have to practice an ethical radiation safety culture in clinical practice.

  18. Optimising cardiac/angiographic digital images using a Butternut as the image quality phantom

    International Nuclear Information System (INIS)

    Bibbo, G.; Balman, D.

    2008-01-01

    Full text: Digital images, whether produced by image intensifiers, flat panels or computed radiography imaging plates, have a broad dynamic range and, thus, there is a need to adjust the exposure parameters of the imaging protocols to obtain diagnostic images without overexposing patients. The default exposure techniques of protocols delivered with the imaging equipment are in general set to produce high quality images at the expense of high radiation doses to patients. Ideally, these protocols should be optimised for the best possible image quality at the lowest possible patient dose, particularly for paediatric patients. Manufacturers of equipment do not generally supply paediatric protocols and, thus, the default settings of the adult protocols have to be adjusted for paediatric patients. Optimising imaging protocols is not a trivial matter and, without a suitable phantom, it is difficult and time-consuming. Commercial phantoms are commonly used to optimise adult protocols, but these are made of dry materials such as perspex, Teflon, aluminium, dry bone as in dry skulls, or a combination of these materials. The problem with these phantoms is that the features on their images are artificial, not simulating any characteristics of patients' anatomic details. In optimising paediatric protocols for our new cardiac/angiographic Siemens Biplane Digital Imaging System, we searched for a paediatric phantom with moisture content, and found that the humble butternut pumpkin (Cucurbita moschata) from the squash family makes a good paediatric phantom, particularly when it is injected with contrast. The part of the butternut that is useful as a phantom is the pulp, i.e., the part that contains the seeds. This is also the part where the contrast is injected. The image of the pulp contains structures that are natural as the butternut is the fruit of a living plant. The image of the seeds is suitable for low-level contrast detectability while fine structures enhanced by the

  19. Mass imbalances in EPANET water-quality simulations

    Science.gov (United States)

    Davis, Michael J.; Janke, Robert; Taxon, Thomas N.

    2018-04-01

    EPANET is widely employed to simulate water quality in water distribution systems. However, in general, the time-driven simulation approach used to determine concentrations of water-quality constituents provides accurate results only for short water-quality time steps. Overly long time steps can yield errors in concentration estimates and can result in situations in which constituent mass is not conserved. The use of a time step that is sufficiently short to avoid these problems may not always be feasible. The absence of EPANET errors or warnings does not ensure conservation of mass. This paper provides examples illustrating mass imbalances and explains how such imbalances can occur because of fundamental limitations in the water-quality routing algorithm used in EPANET. In general, these limitations cannot be overcome by the use of improved water-quality modeling practices. This paper also presents a preliminary event-driven approach that conserves mass with a water-quality time step that is as long as the hydraulic time step. Results obtained using the current approach converge, or tend to converge, toward those obtained using the preliminary event-driven approach as the water-quality time step decreases. Improving the water-quality routing algorithm used in EPANET could eliminate mass imbalances and related errors in estimated concentrations. The results presented in this paper should be of value to those who perform water-quality simulations using EPANET or use the results of such simulations, including utility managers and engineers.
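
    The kind of mass-balance audit the paper argues for can be written independently of any particular solver; the sketch below assumes hypothetical flow and concentration time series as inputs, not EPANET's internal data structures, and simply checks whether constituent mass entering the network equals the mass leaving plus the change in stored mass.

        import numpy as np

        def mass_balance_error(dt, inflow, c_in, outflow, c_out, stored_mass):
            # dt: time step [s]; inflow/outflow: source/demand flows [m^3/s];
            # c_in/c_out: matching concentrations [g/m^3], all of shape (n_steps, n_nodes);
            # stored_mass: total constituent mass held in pipes and tanks per step [g].
            mass_in = np.sum(inflow * c_in) * dt
            mass_out = np.sum(outflow * c_out) * dt
            delta_storage = stored_mass[-1] - stored_mass[0]
            imbalance = mass_in - mass_out - delta_storage
            return imbalance / max(mass_in, 1e-12)   # relative imbalance; ~0 if mass is conserved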

  20. Simulation of FIB-SEM images for analysis of porous microstructures.

    Science.gov (United States)

    Prill, Torben; Schladitz, Katja

    2013-01-01

    Focused ion beam nanotomography-scanning electron microscopy tomography yields high-quality three-dimensional images of materials microstructures at the nanometer scale combining serial sectioning using a focused ion beam with SEM. However, FIB-SEM tomography of highly porous media leads to shine-through artifacts preventing automatic segmentation of the solid component. We simulate the SEM process in order to generate synthetic FIB-SEM image data for developing and validating segmentation methods. Monte-Carlo techniques yield accurate results, but are too slow for the simulation of FIB-SEM tomography requiring hundreds of SEM images for one dataset alone. Nevertheless, a quasi-analytic description of the specimen and various acceleration techniques, including a track compression algorithm and an acceleration for the simulation of secondary electrons, cut down the computing time by orders of magnitude, allowing for the first time to simulate FIB-SEM tomography. © Wiley Periodicals, Inc.

  1. Software quality and process improvement in scientific simulation codes

    Energy Technology Data Exchange (ETDEWEB)

    Ambrosiano, J.; Webster, R. [Los Alamos National Lab., NM (United States)

    1997-11-01

    This report contains viewgraphs on the quest to develop better simulation code quality through process modeling and improvement. This study is based on the experience of the authors and interviews with ten subjects chosen from simulation code development teams at LANL. This study is descriptive rather than scientific.

  2. QUALITY AWARDS: AN IMAGE OF BUSINESS EXCELLENCE

    Directory of Open Access Journals (Sweden)

    Ilies Liviu

    2015-07-01

    Full Text Available Across the world, more and more governmental and industrial organizations are doing everything possible to promote quality in order to survive; the basic principle remains customer satisfaction and, beyond that, the principle of customer delight. In this sense, quality has become the source of sustained competitive advantage that gives organizations supremacy in global markets characterized by ever more intense competition. Juran, one of the foremost quality gurus, says that "just as the twentieth century was the century of productivity, the twenty-first century will be the quality century", which is both a relevant and comprehensive statement of past economic reality and a profound forecast for business in the twenty-first century. In order to achieve this competitive advantage, quality must be managed, and this is accomplished through Total Quality Management (TQM). Quality award models are instruments of total quality management through which quality can be assessed and improved; knowing these models is therefore critical for finding new ways to improve the quality and performance of organizations. The present paper aims to illustrate best practices in quality improvement; to this end, we present the general framework of the quality awards for business excellence. We present the most important international quality awards, namely the "Malcolm Baldrige National Quality Award", the "European Quality Award" and the "Romanian Quality Award J. M. Juran". As the main sources for analyzing the structure and operation of these three awards we used Juran's work (probably the most important work in the field of quality) and other relevant sources in total quality management dealing with quality awards, and as sources of updated information we used the official

  3. The relationship between compression force, image quality and ...

    African Journals Online (AJOL)

    Theoretically, an increase in breast compression gives a reduction in thickness without changing the density, resulting in improved image quality and reduced radiation dose. Aim. This study investigates the relationship between compression force, phantom thickness, image quality and radiation dose. The existence of a ...

  4. Monte Carlo simulations in small animal PET imaging

    Energy Technology Data Exchange (ETDEWEB)

    Branco, Susana [Universidade de Lisboa, Faculdade de Ciencias, Instituto de Biofisica e Engenharia Biomedica, Lisbon (Portugal)], E-mail: susana.silva@fc.ul.pt; Jan, Sebastien [Service Hospitalier Frederic Joliot, CEA/DSV/DRM, Orsay (France); Almeida, Pedro [Universidade de Lisboa, Faculdade de Ciencias, Instituto de Biofisica e Engenharia Biomedica, Lisbon (Portugal)

    2007-10-01

    This work is based on the use of an implemented Positron Emission Tomography (PET) simulation system dedicated to small animal PET imaging. Geant4 Application for Tomographic Emission (GATE), a Monte Carlo simulation platform based on the Geant4 libraries, is well suited to modeling the microPET FOCUS system and to implementing realistic phantoms, such as the MOBY phantom, and data maps from real examinations. The use of a microPET FOCUS simulation model with GATE has been validated for spatial resolution, counting rate performance, imaging contrast recovery and quantitative analysis. Results from realistic studies of the mouse body using ¹⁸F⁻ and [¹⁸F]FDG imaging protocols are presented. These simulations include the injection of realistic doses into the animal and realistic time framing. The results have shown that it is possible to simulate small animal PET acquisitions under realistic conditions, and are expected to be useful to improve the quantitative analysis in PET mouse body studies.

  5. Simulation modeling for quality and productivity in steel cord manufacturing

    OpenAIRE

    Türkseven, Can Hulusi; Ertek, Gürdal

    2003-01-01

    We describe the application of simulation modeling to estimate and improve quality and productivity performance of a steel cord manufacturing system. We describe the typical steel cord manufacturing plant, emphasize its distinguishing characteristics, identify various production settings and discuss applicability of simulation as a management decision support tool. Besides presenting the general structure of the developed simulation model, we focus on wire fractures, which can be an important...

  6. Comparison of quality of ultrasonographic image of the pancreas: Tissue harmonic image vs. Fundamental image

    International Nuclear Information System (INIS)

    Seo, Young Lan; Choi, Chul Soon; Kim, Ho Chul; Yoon, Dae Young; Han, Dae Hee; Bae, Sang Hoon

    2002-01-01

    To compare the quality of ultrasonographic (US) images of the pancreas obtained with tissue harmonic imaging (THI) versus fundamental imaging (FI). During a recent 2-month period, forty-one patients with a normal pancreas on US were included. All were free of abnormal clinical and laboratory findings suggestive of pancreatic disease. US was performed by an abdominal radiologist with a 2.5-5 MHz convex-array transducer (Sequoia 512; Acuson, Mountain View, Calif., U.S.A.). THI and FI of the pancreas were compared for the following parameters: conspicuity, internal architecture, and delineation range. Grading was made by the consensus of two abdominal radiologists with a three-point scale. Statistical analysis was done using the Wilcoxon signed rank test. THI showed better conspicuity (p=0.0130), clearer internal architecture (p=0.0029) and a superior delineation range (p=0.0191) than FI. THI appears to provide superior image quality to FI in evaluation of the pancreas.

  7. Using image quality measures and features to choose good images for classification of ISAR imagery

    CSIR Research Space (South Africa)

    Steyn, JM

    2014-10-01

    Full Text Available the quality measures and to determine the minimum dwell-time for ISAR image formation. Keywords—ISAR (inverse synthetic aperture radar), Dwell-time, Quality Measure, Image Contrast, Image Entropy, SNR (signal-to-noise ratio), Maritime Vessels ...

  8. The study of surgical image quality evaluation system by subjective quality factor method

    Science.gov (United States)

    Zhang, Jian J.; Xuan, Jason R.; Yang, Xirong; Yu, Honggang; Koullick, Edouard

    2016-03-01

    The GreenLightTM procedure is an effective and economical treatment for benign prostatic hyperplasia (BPH); almost a million patients have been treated with GreenLightTM worldwide. During the surgical procedure, the surgeon or physician relies on the monitoring video system to survey and confirm the surgical progress. Several obstructions can greatly affect the image quality of the monitoring video: laser glare from tissue and body fluid, air bubbles and debris generated by tissue evaporation, and bleeding, to name a few. In order to improve the physician's visual experience of a laser surgical procedure, the system performance parameters related to image quality need to be well defined. However, since image quality is the integrated set of perceptions of the overall degree of excellence of an image, or in other words the perceptually weighted combination of significant attributes (contrast, graininess …) of an image when considered in its marketplace or application, there is no standard definition of overall image or video quality, especially for the no-reference case (without a standard chart as reference). In this study, the Subjective Quality Factor (SQF) and acutance are used for no-reference image quality evaluation. Basic image quality parameters, such as sharpness, color accuracy, size of obstruction and transmission of obstruction, are used as subparameters to define the rating scale for image quality evaluation or comparison. Sample image groups were evaluated by human observers according to the rating scale. Surveys of physician groups were also conducted with lab-generated sample videos. The study shows that human subjective perception is a trustworthy way of evaluating image quality. A more systematic investigation of the relationship between video quality and the image quality of each frame will be conducted as a future study.

  9. Simulation modeling of quality assurance processes in an industrial plant

    Directory of Open Access Journals (Sweden)

    Gumerov Anwar Vazykhovich

    2013-11-01

    Full Text Available Quality management and the need for continuous improvement require the development of methods for the analysis and diagnostics of process parameters. The use of simulation techniques and statistical quality control methods provides the basis for process control in industrial enterprises.

  10. Image quality assessment for CT used on small animals

    Energy Technology Data Exchange (ETDEWEB)

    Cisneros, Isabela Paredes, E-mail: iparedesc@unal.edu.co; Agulles-Pedrós, Luis, E-mail: lagullesp@unal.edu.co [Universidad Nacional de Colombia, Departamento de Física, Grupo de Física Médica (Colombia)

    2016-07-07

    Image acquisition on a CT scanner is nowadays necessary in almost any kind of medical study. Its purpose, producing anatomical images with the best achievable quality, implies the highest diagnostic radiation exposure to patients. Image quality can be measured quantitatively based on parameters such as noise, uniformity and resolution. These measures allow the determination of optimal operating parameters for the scanner in order to obtain the best diagnostic image. A human Philips CT scanner is the first in Colombia intended exclusively for veterinary use. The aim of this study was to measure the CT image quality parameters using an acrylic phantom and then, using the computational tool MATLAB, determine these parameters as a function of tube current and visualization window, in order to reduce the delivered dose while keeping appropriate image quality.

  11. The effect of image sharpness on quantitative eye movement data and on image quality evaluation while viewing natural images

    Science.gov (United States)

    Vuori, Tero; Olkkonen, Maria

    2006-01-01

    The aim of the study is to test both customer image quality rating (subjective image quality) and physical measurement of user behavior (eye movements tracking) to find customer satisfaction differences in imaging technologies. Methodological aim is to find out whether eye movements could be quantitatively used in image quality preference studies. In general, we want to map objective or physically measurable image quality to subjective evaluations and eye movement data. We conducted a series of image quality tests, in which the test subjects evaluated image quality while we recorded their eye movements. Results show that eye movement parameters consistently change according to the instructions given to the user, and according to physical image quality, e.g. saccade duration increased with increasing blur. Results indicate that eye movement tracking could be used to differentiate image quality evaluation strategies that the users have. Results also show that eye movements would help mapping between technological and subjective image quality. Furthermore, these results give some empirical emphasis to top-down perception processes in image quality perception and evaluation by showing differences between perceptual processes in situations when cognitive task varies.

  12. Investigation of Collimator Influential Parameter on SPECT Image Quality: a Monte Carlo Study

    Directory of Open Access Journals (Sweden)

    Banari Bahnamiri Sh

    2015-03-01

    Full Text Available Background: Obtaining high quality images with a Single Photon Emission Computed Tomography (SPECT) device is one of the most important goals in nuclear medicine, because if image quality is low, the likelihood of mistakes in diagnosing and treating the patient rises. Studying the factors affecting the spatial resolution of imaging systems is therefore vital. One of the most important factors in SPECT imaging in nuclear medicine is the use of a collimator appropriate for the characteristics of a given radiopharmaceutical in order to create the best image, as it directly affects the Full Width at Half Maximum (FWHM), the main parameter describing spatial resolution. Method: In this research, the detector and collimator of a SPECT imaging device (Model HD3, made by Philips Co.) were simulated and the important collimator parameters investigated using the MCNP-4c code. Results: The experimental measurements and simulation calculations showed a relative difference of less than 5%, confirming the accuracy of the MCNP simulation. Conclusion: This is the first essential step in the design and modelling of new collimators used for creating high quality images in nuclear medicine
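
    Since FWHM is the quantity used here to characterize spatial resolution, a small sketch of extracting it from a simulated point-spread profile is shown below; the Gaussian test profile and linear interpolation at the half-maximum crossings are illustrative choices, not part of the MCNP model.

        import numpy as np

        def fwhm(x, profile):
            # FWHM of a single-peaked profile whose peak lies well inside the sampled range.
            profile = np.asarray(profile, dtype=float)
            half = profile.max() / 2.0
            above = np.where(profile >= half)[0]
            left, right = above[0], above[-1]
            # Linear interpolation of the half-maximum crossings on both edges
            x_left = np.interp(half, [profile[left - 1], profile[left]], [x[left - 1], x[left]])
            x_right = np.interp(half, [profile[right + 1], profile[right]], [x[right + 1], x[right]])
            return x_right - x_left

        x = np.linspace(-20, 20, 401)            # mm
        psf = np.exp(-0.5 * (x / 4.0) ** 2)      # Gaussian, sigma = 4 mm
        print(fwhm(x, psf))                      # ~9.42 mm (2.355 * sigma)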

  13. New developments in simulating X-ray phase contrast imaging

    International Nuclear Information System (INIS)

    Peterzol, A.; Berthier, J.; Duvauchelle, P.; Babot, D.; Ferrero, C.

    2007-01-01

    A deterministic algorithm simulating phase contrast (PC) x-ray images for complex 3-dimensional (3D) objects is presented. This algorithm has been implemented in a simulation code named VXI (Virtual X-ray Imaging). The physical model chosen to account for the PC technique is based on the Fresnel-Kirchhoff diffraction theory. The algorithm consists mainly of two parts. The first one exploits the VXI ray-tracing approach to compute the object transmission function. The second part simulates the PC image due to the wave front distortion introduced by the sample. In the first part, the use of computer-aided drawing (CAD) models enables simulations to be carried out with complex 3D objects. Unlike the original VXI version, which makes use of an object description via triangular facets, the new code requires a more 'sophisticated' object representation based on Non-Uniform Rational B-Splines (NURBS). As a first step we produce a high spatial resolution image using a monochromatic point source and an ideal detector. To simulate the polychromatic case, the intensity image is integrated over the considered x-ray energy spectrum. Then, in order to account for the system spatial resolution properties, the high spatial resolution image (mono or polychromatic) is convolved with the total point spread function of the imaging system under consideration. The results supplied by the presented algorithm are examined with the help of some relevant examples. (authors)
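
    The last two steps of the image-formation chain described above (weighting monochromatic intensity images by the spectrum and convolving with the system point spread function) can be sketched as follows; the spectrum weights and the PSF passed in are assumed inputs, not VXI's internal implementation.

        import numpy as np
        from scipy.signal import fftconvolve

        def polychromatic_image(mono_images, spectrum_weights, system_psf):
            # mono_images: (n_energies, H, W) high-resolution monochromatic intensities;
            # spectrum_weights: (n_energies,) relative photon weights; system_psf: 2-D kernel.
            w = np.asarray(spectrum_weights, dtype=float)
            w /= w.sum()
            poly = np.tensordot(w, mono_images, axes=1)        # weighted sum over energy
            return fftconvolve(poly, system_psf, mode="same")  # finite system resolution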

  14. Deep inspiration breath-hold radiotherapy for lung cancer: impact on image quality and registration uncertainty in cone beam CT image guidance

    DEFF Research Database (Denmark)

    Josipovic, Mirjana; Persson, Gitte F; Bangsgaard, Jens Peter

    2016-01-01

    OBJECTIVE: We investigated the impact of deep inspiration breath-hold (DIBH) and tumour baseline shifts on image quality and registration uncertainty in image-guided DIBH radiotherapy (RT) for locally advanced lung cancer. METHODS: Patients treated with daily cone beam CT (CBCT)-guided free...... for the craniocaudal direction in FB, where it was >3 mm. On the 31st fraction, the intraobserver uncertainty increased compared with the second fraction. This increase was more pronounced in FB. Image quality scores improved in DIBH compared with FB for all parameters in all patients. Simulated tumour baseline shifts...... ≤2 mm did not affect the CBCT image quality considerably. CONCLUSION: DIBH CBCT improved image quality and reduced registration uncertainty in the craniocaudal direction in image-guided RT of locally advanced lung cancer. Baseline shifts ≤2 mm in DIBH during CBCT acquisition did not affect image...

  15. STANDARDIZING QUALITY ASSESSMENT OF FUSED REMOTELY SENSED IMAGES

    Directory of Open Access Journals (Sweden)

    C. Pohl

    2017-09-01

    Full Text Available The multitude of available operational remote sensing satellites led to the development of many image fusion techniques to provide high spatial, spectral and temporal resolution images. The comparison of different techniques is necessary to obtain an optimized image for the different applications of remote sensing. There are two approaches in assessing image quality: 1. Quantitatively by visual interpretation and 2. Quantitatively using image quality indices. However an objective comparison is difficult due to the fact that a visual assessment is always subject and a quantitative assessment is done by different criteria. Depending on the criteria and indices the result varies. Therefore it is necessary to standardize both processes (qualitative and quantitative assessment in order to allow an objective image fusion quality evaluation. Various studies have been conducted at the University of Osnabrueck (UOS to establish a standardized process to objectively compare fused image quality. First established image fusion quality assessment protocols, i.e. Quality with No Reference (QNR and Khan's protocol, were compared on varies fusion experiments. Second the process of visual quality assessment was structured and standardized with the aim to provide an evaluation protocol. This manuscript reports on the results of the comparison and provides recommendations for future research.

  16. Standardizing Quality Assessment of Fused Remotely Sensed Images

    Science.gov (United States)

    Pohl, C.; Moellmann, J.; Fries, K.

    2017-09-01

    The multitude of available operational remote sensing satellites led to the development of many image fusion techniques to provide high spatial, spectral and temporal resolution images. The comparison of different techniques is necessary to obtain an optimized image for the different applications of remote sensing. There are two approaches in assessing image quality: 1. Quantitatively by visual interpretation and 2. Quantitatively using image quality indices. However an objective comparison is difficult due to the fact that a visual assessment is always subject and a quantitative assessment is done by different criteria. Depending on the criteria and indices the result varies. Therefore it is necessary to standardize both processes (qualitative and quantitative assessment) in order to allow an objective image fusion quality evaluation. Various studies have been conducted at the University of Osnabrueck (UOS) to establish a standardized process to objectively compare fused image quality. First established image fusion quality assessment protocols, i.e. Quality with No Reference (QNR) and Khan's protocol, were compared on varies fusion experiments. Second the process of visual quality assessment was structured and standardized with the aim to provide an evaluation protocol. This manuscript reports on the results of the comparison and provides recommendations for future research.

  17. Evaluation of image quality of lumbar spine images: A comparison between FFE and VGA

    International Nuclear Information System (INIS)

    Tingberg, A.; Baath, M.; Haakansson, M.; Medin, J.; Besjakov, J.; Sandborg, M.; Alm-Carlsson, G.; Mattsson, S.; Maansson, L. G.

    2005-01-01

    Purpose: The aim of the present study is to compare two different methods for evaluation of the quality of clinical X-ray images. Methods: Based on fifteen lumbar spine radiographs, two new sets of images were created. A hybrid image set was created by adding two distributions of artificial lesions to each original image. The image quality parameters spatial resolution and noise were manipulated and a total of 210 hybrid images were created. A set of 105 disease-free images was created by applying the same combinations of spatial resolution and noise to the original images. The hybrid images were evaluated with the free-response forced error experiment (FFE) and the normal images with visual grading analysis (VGA) by nine experienced radiologists. Results: In the VGA study, images with low noise were preferred over images with higher noise levels. The alteration of the MTF had a limited influence on the VGA score. For the FFE study, the visibility of the lesions was independent of the sharpness and the noise level. No correlation was found between the two image quality measures. Conclusions: FFE is a precise method for evaluation of image quality, but the results are only valid for the type of lesion used in the study, whereas VGA is a more general method for clinical image quality assessment. The results of the FFE study indicate that there might be a potential to lower the dose levels in lumbar spine radiography without losing important diagnostic information. (authors)

  18. Improvement of Quality of Reconstructed Images in Multi-Frame Fresnel Digital Holography

    International Nuclear Information System (INIS)

    Xiao-Wei, Lu; Jing-Zhen, Li; Hong-Yi, Chen

    2010-01-01

    A modified reconstruction algorithm to improve the quality of reconstructed images in multi-frame Fresnel digital holography is presented. When the reference beams are plane or spherical waves with azimuth encoding, images can be reconstructed with only a single Fourier transform by introducing two spherical wave factors. In numerical simulations, this algorithm simplifies the reconstruction process and improves the signal-to-noise ratio of the reconstructed images. In single-frame reconstruction experiments, an accurate reconstructed image is obtained with this simplified algorithm
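
    For orientation, a single-FFT Fresnel reconstruction of the kind referred to (hologram multiplied by a quadratic phase factor, one Fourier transform, then an output-plane phase factor) is sketched below; this is the textbook discrete Fresnel transform, not the authors' azimuth-encoded multi-frame algorithm.

        import numpy as np

        def fresnel_reconstruct(hologram, wavelength, distance, pixel_pitch):
            # Single-FFT discrete Fresnel transform of a digital hologram.
            ny, nx = hologram.shape
            k = 2.0 * np.pi / wavelength
            x = (np.arange(nx) - nx / 2) * pixel_pitch
            y = (np.arange(ny) - ny / 2) * pixel_pitch
            X, Y = np.meshgrid(x, y)
            chirp_in = np.exp(1j * k / (2.0 * distance) * (X ** 2 + Y ** 2))
            field = np.fft.fftshift(np.fft.fft2(np.fft.fftshift(hologram * chirp_in)))
            # Output-plane sampling and chirp (constant phase factors omitted)
            dxo = wavelength * distance / (nx * pixel_pitch)
            dyo = wavelength * distance / (ny * pixel_pitch)
            Xo, Yo = np.meshgrid((np.arange(nx) - nx / 2) * dxo,
                                 (np.arange(ny) - ny / 2) * dyo)
            field *= np.exp(1j * k / (2.0 * distance) * (Xo ** 2 + Yo ** 2))
            return field   # np.abs(field)**2 gives the reconstructed intensity image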

  19. Optimization of accelerator target and detector for portal imaging using Monte Carlo simulation and experiment

    International Nuclear Information System (INIS)

    Flampouri, S.; Evans, P.M.; Partridge, M.; Nahum, A.E.; Verhaegen, A.E.; Spezi, E.

    2002-01-01

    Megavoltage portal images suffer from poor quality compared to those produced with kilovoltage x-rays. Several authors have shown that the image quality can be improved by modifying the linear accelerator to generate more low-energy photons. This work addresses the problem of using Monte Carlo simulation and experiment to optimize the beam and detector combination to maximize image quality for a given patient thickness. A simple model of the whole imaging chain was developed for investigation of the effect of the target parameters on the quality of the image. The optimum targets (6 mm thick aluminium and 1.6 mm copper) were installed in an Elekta SL25 accelerator. The first beam will be referred to as Al6 and the second as Cu1.6. A tissue-equivalent contrast phantom was imaged with the 6 MV standard photon beam and the experimental beams with standard radiotherapy and mammography film/screen systems. The arrangement with a thin Al target/mammography system improved the contrast from 1.4 cm bone in 5 cm water to 19% compared with 2% for the standard arrangement of a thick, high-Z target/radiotherapy verification system. The linac/phantom/detector system was simulated with the BEAM/EGS4 Monte Carlo code. Contrast calculated from the predicted images was in good agreement with the experiment (to within 2.5%). The use of MC techniques to predict images accurately, taking into account the whole imaging system, is a powerful new method for portal imaging system design optimization. (author)

  20. Image processing system performance prediction and product quality evaluation

    Science.gov (United States)

    Stein, E. K.; Hammill, H. B. (Principal Investigator)

    1976-01-01

    The author has identified the following significant results. A new technique for image processing system performance prediction and product quality evaluation was developed. It was entirely objective, quantitative, and general, and should prove useful in system design and quality control. The technique and its application to determination of quality control procedures for the Earth Resources Technology Satellite NASA Data Processing Facility are described.

  1. Performance comparison of different graylevel image fusion schemes through a universal image quality index

    NARCIS (Netherlands)

    Toet, A.; Hogervorst, M.A.

    2003-01-01

    We applied a recently introduced universal image quality index Q that quantifies the distortion of a processed image relative to its original version, to assess the performance of different graylevel image fusion schemes. The method is as follows. First, we adopt an original test image as the
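
    The universal image quality index Q referred to above has a closed form; a whole-image sketch is given below (in practice Q is usually computed in sliding windows and averaged, with the processed image compared against its original).

        import numpy as np

        def universal_quality_index(x, y):
            # Wang-Bovik index: Q = 4*cov(x,y)*mx*my / ((vx + vy) * (mx**2 + my**2))
            x = np.asarray(x, dtype=float).ravel()
            y = np.asarray(y, dtype=float).ravel()
            mx, my = x.mean(), y.mean()
            vx, vy = x.var(), y.var()
            cov = np.mean((x - mx) * (y - my))
            return 4.0 * cov * mx * my / ((vx + vy) * (mx ** 2 + my ** 2))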

  2. Electron Holography Image Simulation of Nanoparticles

    NARCIS (Netherlands)

    Keimpema, K.; Raedt, H. De; Hosson, J.Th.M. De

    We discuss a real-space and a Fourier-space technique to compute numerically the phase images observed by electron holography of nanoscale particles. An assessment of the applicability and accuracy of these techniques is made by calculating numerical results for simple geometries for which

  3. Comparison of image deconvolution algorithms on simulated and laboratory infrared images

    Energy Technology Data Exchange (ETDEWEB)

    Proctor, D. [Lawrence Livermore National Lab., CA (United States)

    1994-11-15

    We compare Maximum Likelihood, Maximum Entropy, Accelerated Lucy-Richardson, Weighted Goodness of Fit, and Pixon reconstructions of simple scenes as a function of signal-to-noise ratio for simulated images with randomly generated noise. Reconstruction results of infrared images taken with the TAISIR (Temperature and Imaging System InfraRed) are also discussed.
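
    Of the algorithms compared, the Lucy-Richardson scheme is the most compact to sketch; the unaccelerated multiplicative update is shown below under the usual assumptions of non-negative data and a normalized PSF.

        import numpy as np
        from scipy.signal import fftconvolve

        def richardson_lucy(observed, psf, n_iter=30, eps=1e-12):
            # Unaccelerated Richardson-Lucy deconvolution (non-negative data and PSF).
            psf = psf / psf.sum()
            psf_mirror = psf[::-1, ::-1]
            estimate = np.full_like(observed, observed.mean(), dtype=float)
            for _ in range(n_iter):
                blurred = fftconvolve(estimate, psf, mode="same")
                ratio = observed / np.maximum(blurred, eps)
                estimate *= fftconvolve(ratio, psf_mirror, mode="same")
            return estimate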

  4. Objective analysis of image quality of video image capture systems

    Science.gov (United States)

    Rowberg, Alan H.

    1990-07-01

    As Picture Archiving and Communication System (PACS) technology has matured, video image capture has become a common way of capturing digital images from many modalities. While digital interfaces, such as those which use the ACR/NEMA standard, will become more common in the future, and are preferred because of the accuracy of image transfer, video image capture will be the dominant method in the short term, and may continue to be used for some time because of the low cost and high speed often associated with such devices. Currently, virtually all installed systems use methods of digitizing the video signal that is produced for display on the scanner viewing console itself. A series of digital test images have been developed for display on either a GE CT9800 or a GE Signa MRI scanner. These images have been captured with each of five commercially available image capture systems, and the resultant images digitally transferred on floppy disk to a PC1286 computer containing Optimast' image analysis software. Here the images can be displayed in a comparative manner for visual evaluation, in addition to being analyzed statistically. Each of the images have been designed to support certain tests, including noise, accuracy, linearity, gray scale range, stability, slew rate, and pixel alignment. These image capture systems vary widely in these characteristics, in addition to the presence or absence of other artifacts, such as shading and moire pattern. Other accessories such as video distribution amplifiers and noise filters can also add or modify artifacts seen in the captured images, often giving unusual results. Each image is described, together with the tests which were performed using them. One image contains alternating black and white lines, each one pixel wide, after equilibration strips ten pixels wide. While some systems have a slew rate fast enough to track this correctly, others blur it to an average shade of gray, and do not resolve the lines, or give

  5. Remote Cherenkov imaging-based quality assurance of a magnetic resonance image-guided radiotherapy system.

    Science.gov (United States)

    Andreozzi, Jacqueline M; Mooney, Karen E; Brůža, Petr; Curcuru, Austen; Gladstone, David J; Pogue, Brian W; Green, Olga

    2018-06-01

    Tools to perform regular quality assurance of magnetic resonance image-guided radiotherapy (MRIgRT) systems should ideally be independent of interference from the magnetic fields. Remotely acquired optical Cherenkov imaging-based dosimetry measurements in water were investigated for this purpose, comparing measures of dose accuracy, temporal dynamics, and overall integrated IMRT delivery. A 40 × 30.5 × 37.5 cm 3 water tank doped with 1 g/L of quinine sulfate was imaged using an intensified charge-coupled device (ICCD) to capture the Cherenkov emission while being irradiated by a commercial MRIgRT system (ViewRay™). The ICCD was placed down-bore at the end of the couch, 4 m from treatment isocenter and behind the 5-Gauss line of the 0.35-T MRI. After establishing optimal camera acquisition settings, square beams of increasing size (4.2 × 4.2 cm 2 , 10.5 × 10.5 cm 2 , and 14.7 × 14.7 cm 2 ) were imaged at 0.93 frames per second, from an individual cobalt-60 treatment head, to develop projection measures related to percent depth dose (PDD) curves and cross beam profiles (CPB). These Cherenkov-derived measurements were compared to ionization chamber (IC) and radiographic film dosimetry data, as well as simulation data from the treatment planning system (TPS). An intensity-modulated radiotherapy (IMRT) commissioning plan from AAPM TG-119 (C4:C-Shape) was also imaged at 2.1 frames per second, and the single linear sum image from 509 s of plan delivery was compared to the dose volume prediction generated by the TPS using gamma index analysis. Analysis of standardized test target images (1024 × 1024 pixels) yielded a pixel resolution of 0.37 mm/pixel. The beam width measured from the Cherenkov image-generated projection CBPs was within 1 mm accuracy when compared to film measurements for all beams. The 502 point measurements (i.e., pixels) of the Cherenkov image-based projection percent depth dose curves (pPDDs) were compared to p

  6. Comparing planar image quality of rotating slat and parallel hole collimation: influence of system modeling

    International Nuclear Information System (INIS)

    Holen, Roel van; Vandenberghe, Stefaan; Staelens, Steven; Lemahieu, Ignace

    2008-01-01

    The main remaining challenge for a gamma camera is to overcome the existing trade-off between collimator spatial resolution and system sensitivity. This problem, strongly limiting the performance of parallel hole collimated gamma cameras, can be overcome by applying new collimator designs such as rotating slat (RS) collimators which have a much higher photon collection efficiency. The drawback of a RS collimated gamma camera is that, even for obtaining planar images, image reconstruction is needed, resulting in noise accumulation. However, nowadays iterative reconstruction techniques with accurate system modeling can provide better image quality. Because the impact of this modeling on image quality differs from one system to another, an objective assessment of the image quality obtained with a RS collimator is needed in comparison to classical projection images obtained using a parallel hole (PH) collimator. In this paper, a comparative study of image quality, achieved with system modeling, is presented. RS data are reconstructed to planar images using maximum likelihood expectation maximization (MLEM) with an accurate Monte Carlo derived system matrix while PH projections are deconvolved using a Monte Carlo derived point-spread function. Contrast-to-noise characteristics are used to show image quality for cold and hot spots of varying size. Influence of the object size and contrast is investigated using the optimal contrast-to-noise ratio (CNR_o). For a typical phantom setup, results show that cold spot imaging is slightly better for a PH collimator. For hot spot imaging, the CNR_o of the RS images is found to increase with increasing lesion diameter and lesion contrast while it decreases when background dimensions become larger. Only for very large background dimensions in combination with low contrast lesions, the use of a PH collimator could be beneficial for hot spot imaging. In all other cases, the RS collimator scores better. Finally, the simulation of a
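
    For readers unfamiliar with MLEM, the update used to reconstruct planar images from RS data has a compact form once a system matrix is available. The sketch below is a generic MLEM loop on a random toy system matrix; it is not the Monte Carlo derived matrix of the paper, and the dimensions and iteration count are arbitrary choices for illustration.

```python
import numpy as np

def mlem(system_matrix, projections, n_iter=100):
    """Basic MLEM: system_matrix has shape (n_bins, n_pixels),
    projections has shape (n_bins,). Returns the pixel estimate."""
    A = system_matrix
    sens = A.sum(axis=0)                       # per-pixel sensitivity
    x = np.ones(A.shape[1])                    # uniform initial estimate
    for _ in range(n_iter):
        expected = A @ x                       # forward projection
        ratio = projections / np.clip(expected, 1e-12, None)
        x *= (A.T @ ratio) / np.clip(sens, 1e-12, None)
    return x

# Toy example: random sparse system matrix and a point-like source.
rng = np.random.default_rng(0)
A = rng.random((200, 64)) * (rng.random((200, 64)) < 0.1)
truth = np.zeros(64)
truth[20] = 100.0
recon = mlem(A, A @ truth)
print(int(np.argmax(recon)))   # the reconstruction should peak at index 20
```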

  7. No-reference visual quality assessment for image inpainting

    Science.gov (United States)

    Voronin, V. V.; Frantc, V. A.; Marchuk, V. I.; Sherstobitov, A. I.; Egiazarian, K.

    2015-03-01

    Inpainting has received a lot of attention in recent years, and quality assessment is an important task for evaluating different image reconstruction approaches. In many cases, inpainting methods introduce blur across sharp transitions and image contours when recovering large areas of missing pixels, and they often fail to recover curved boundary edges. Quantitative metrics for inpainting results currently do not exist, and researchers use human comparisons to evaluate their methodologies and techniques. Most objective quality assessment methods rely on a reference image, which is often not available in inpainting applications. Usually researchers rely on subjective quality assessment by human observers, which is a difficult and time-consuming procedure. This paper focuses on a machine learning approach to no-reference visual quality assessment for image inpainting, informed by properties of the human visual system. Our method is based on the observation that Local Binary Patterns describe the local structural information of an image well. We use a support vector regression model, trained on images assessed by human observers, to predict the perceived quality of inpainted images. We demonstrate how the predicted quality values correlate with subjective opinion in a human observer study. Results are shown on a human-scored dataset for different inpainting methods.
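
    The pipeline described, Local Binary Pattern features feeding a support vector regression trained on human scores, can be prototyped in a few lines. The sketch below assumes scikit-image and scikit-learn are available and uses random placeholder images and scores purely to show the plumbing; it is not the authors' feature set or trained model.

```python
import numpy as np
from skimage.feature import local_binary_pattern  # assumed available
from sklearn.svm import SVR                       # assumed available

def lbp_histogram(gray, P=8, R=1.0, bins=10):
    """Uniform-LBP histogram as a compact descriptor of local structure."""
    codes = local_binary_pattern(gray, P, R, method="uniform")
    hist, _ = np.histogram(codes, bins=np.arange(bins + 1), density=True)
    return hist

# Placeholder training set: in practice these would be inpainted images with
# human opinion scores; random data here only demonstrates the workflow.
rng = np.random.default_rng(1)
train_images = [(rng.random((64, 64)) * 255).astype(np.uint8) for _ in range(20)]
train_scores = rng.uniform(1, 5, size=20)

model = SVR(kernel="rbf", C=1.0)
model.fit([lbp_histogram(im) for im in train_images], train_scores)

test_image = (rng.random((64, 64)) * 255).astype(np.uint8)
print(model.predict([lbp_histogram(test_image)]))   # predicted quality score
```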

  8. The art of assessing quality for images and video

    International Nuclear Information System (INIS)

    Deriche, M.

    2011-01-01

    The early years of this century have witnessed a tremendous growth in the use of digital multimedia data for different communication applications. Researchers from around the world are spending substantial research efforts in developing techniques for improving the appearance of images/video. However, as we know, preserving high quality is a challenging task. Images are subject to distortions during acquisition, compression, transmission, analysis, and reconstruction. For this reason, the research area focusing on image and video quality assessment has attracted a lot of attention in recent years. In particular, compression applications and other multimedia applications need powerful techniques for evaluating quality objectively without human interference. This tutorial will cover the different faces of image quality assessment. We will motivate the need for robust image quality assessment techniques, then discuss the main algorithms found in the literature with a critical perspective. We will present the different metrics used for full reference, reduced reference and no reference applications. We will then discuss the difference between image and video quality assessment. In all of the above, we will take a critical approach to explain which metric can be used for which application. Finally we will discuss the different approaches to analyze the performance of image/video quality metrics, and end the tutorial with some perspectives on newly introduced metrics and their potential applications.

  9. Cone beam computed tomography radiation dose and image quality assessments.

    Science.gov (United States)

    Lofthag-Hansen, Sara

    2010-01-01

    Diagnostic radiology has undergone profound changes in the last 30 years. New technologies are available to the dental field, cone beam computed tomography (CBCT) being one of the most important. CBCT is a catch-all term for a technology comprising a variety of machines differing in many respects: patient positioning, volume size (FOV), radiation quality, image capturing and reconstruction, image resolution and radiation dose. When new technology is introduced, one must make sure that diagnostic accuracy is better than, or at least as good as, that of the technology it can be expected to replace. Two versions of one CBCT brand, Accuitomo (Morita, Japan), were tested: the 3D Accuitomo with an image intensifier as detector and a FOV of 3 cm x 4 cm, and the 3D Accuitomo FPD with a flat panel detector and FOVs of 4 cm x 4 cm and 6 cm x 6 cm. The 3D Accuitomo was compared with intra-oral radiography for endodontic diagnosis in 35 patients with 46 teeth analyzed, of which 41 were endodontically treated. Three observers assessed the images by consensus. The result showed that CBCT imaging was superior, with a higher number of teeth diagnosed with periapical lesions (42 vs 32 teeth). When evaluating 3D Accuitomo examinations in the posterior mandible in 30 patients, visibility of the marginal bone crest and the mandibular canal, important anatomic structures for implant planning, was high, with good observer agreement among seven observers. Radiographic techniques have to be evaluated concerning radiation dose, which requires well-defined and easy-to-use methods. Two methods, the CT dose index (CTDI), the prevailing method for CT units, and the dose-area product (DAP), were evaluated for calculating the effective dose (E) for both units. An asymmetric dose distribution was revealed when a clinical situation was simulated. Hence, the CTDI method was not applicable for these units with small FOVs. Based on DAP values from 90 patient examinations, the effective dose was estimated for three diagnostic tasks: implant planning in posterior mandible and
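
    As a side note on the dose-estimation route retained here, effective dose from a DAP reading is obtained by multiplying by a task- and unit-specific conversion coefficient. The one-liner below is only a worked illustration; the coefficient value is a made-up placeholder, not a result from this work.

```python
def effective_dose_msv(dap_mgy_cm2, conversion_msv_per_mgy_cm2):
    """Effective dose estimated from a dose-area product reading. The
    conversion coefficient depends on the FOV, beam quality and diagnostic
    task; the value used below is a placeholder for illustration only."""
    return dap_mgy_cm2 * conversion_msv_per_mgy_cm2

print(round(effective_dose_msv(dap_mgy_cm2=600.0,
                               conversion_msv_per_mgy_cm2=1.5e-4), 3))  # mSv
```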

  10. Increased Frame Rate for Plane Wave Imaging Without Loss of Image Quality

    DEFF Research Database (Denmark)

    Jensen, Jonas; Stuart, Matthias Bo; Jensen, Jørgen Arendt

    2015-01-01

    Clinical applications of plane wave imaging necessitate the creation of high-quality images with the highest possible frame rate for improved blood flow tracking and anatomical imaging. However, linear array transducers create grating lobe artefacts, which degrade the image quality especially...... in the near field for λ-pitch transducers. Artefacts can only partly be suppressed by increasing the number of emissions, and this paper demonstrates how the frame rate can be increased without loss of image quality by using λ/2-pitch transducers. The number of emissions and steering angles are optimized...
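
    The grating-lobe argument can be made concrete with the standard array relation sin(theta_m) = sin(theta_0) + m*lambda/pitch: for a lambda-pitch array a steered plane wave produces a propagating grating lobe, while for a lambda/2-pitch array none falls within the visible region. The short sketch below, with an assumed 20-degree steering angle, illustrates this; it is not the optimization procedure of the paper.

```python
import numpy as np

def grating_lobe_angles(pitch_wavelengths, steer_deg=0.0, orders=(-2, -1, 1, 2)):
    """Return the propagating grating-lobe angles (degrees) of a linear array:
    sin(theta_m) = sin(theta_0) + m * lambda / pitch."""
    s0 = np.sin(np.radians(steer_deg))
    lobes = []
    for m in orders:
        s = s0 + m / pitch_wavelengths
        if abs(s) <= 1.0:                       # only real (propagating) lobes
            lobes.append(np.degrees(np.arcsin(s)))
    return lobes

print(grating_lobe_angles(1.0, steer_deg=20.0))   # lambda-pitch: a lobe appears
print(grating_lobe_angles(0.5, steer_deg=20.0))   # lambda/2-pitch: none
```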

  11. A computer code to simulate X-ray imaging techniques

    International Nuclear Information System (INIS)

    Duvauchelle, Philippe; Freud, Nicolas; Kaftandjian, Valerie; Babot, Daniel

    2000-01-01

    A computer code was developed to simulate the operation of radiographic, radioscopic or tomographic devices. The simulation is based on ray-tracing techniques and on the X-ray attenuation law. The use of computer-aided drawing (CAD) models enables simulations to be carried out with complex three-dimensional (3D) objects and the geometry of every component of the imaging chain, from the source to the detector, can be defined. Geometric unsharpness, for example, can be easily taken into account, even in complex configurations. Automatic translations or rotations of the object can be performed to simulate radioscopic or tomographic image acquisition. Simulations can be carried out with monochromatic or polychromatic beam spectra. This feature enables, for example, the beam hardening phenomenon to be dealt with or dual energy imaging techniques to be studied. The simulation principle is completely deterministic and consequently the computed images present no photon noise. Nevertheless, the variance of the signal associated with each pixel of the detector can be determined, which enables contrast-to-noise ratio (CNR) maps to be computed, in order to predict quantitatively the detectability of defects in the inspected object. The CNR is a relevant indicator for optimizing the experimental parameters. This paper provides several examples of simulated images that illustrate some of the rich possibilities offered by our software. Depending on the simulation type, the computation time order of magnitude can vary from 0.1 s (simple radiographic projection) up to several hours (3D tomography) on a PC, with a 400 MHz microprocessor. Our simulation tool proves to be useful in developing new specific applications, in choosing the most suitable components when designing a new testing chain, and in saving time by reducing the number of experimental tests
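
    The core of such a deterministic ray-tracing simulator is the X-ray attenuation (Beer-Lambert) law applied along each source-to-pixel ray, from which a contrast-to-noise ratio can be predicted by attaching Poisson statistics to the expected counts. The following Python sketch shows that idea for a single ray through a steel part with and without a small void; the attenuation coefficient, thicknesses and photon count are assumed values, and the CNR convention (noise taken from the background signal) is one common choice, not necessarily the one used by the authors.

```python
import numpy as np

def transmitted_intensity(i0, mu_cm, thickness_cm):
    """Beer-Lambert law for a ray crossing a stack of homogeneous slabs."""
    return i0 * np.exp(-np.sum(np.asarray(mu_cm) * np.asarray(thickness_cm)))

def cnr(signal_bg, signal_defect):
    """CNR with Poisson noise estimated from the (deterministic) background counts."""
    return abs(signal_bg - signal_defect) / np.sqrt(signal_bg)

i0 = 1e6                                                    # photons per pixel, no object
steel = transmitted_intensity(i0, [1.2], [2.0])             # 2 cm steel, mu = 1.2 /cm (assumed)
steel_with_void = transmitted_intensity(i0, [1.2], [1.9])   # same ray with a 1 mm void
print(round(cnr(steel, steel_with_void), 2))                # predicted defect detectability
```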

  12. A computer code to simulate X-ray imaging techniques

    Energy Technology Data Exchange (ETDEWEB)

    Duvauchelle, Philippe E-mail: philippe.duvauchelle@insa-lyon.fr; Freud, Nicolas; Kaftandjian, Valerie; Babot, Daniel

    2000-09-01

    A computer code was developed to simulate the operation of radiographic, radioscopic or tomographic devices. The simulation is based on ray-tracing techniques and on the X-ray attenuation law. The use of computer-aided drawing (CAD) models enables simulations to be carried out with complex three-dimensional (3D) objects and the geometry of every component of the imaging chain, from the source to the detector, can be defined. Geometric unsharpness, for example, can be easily taken into account, even in complex configurations. Automatic translations or rotations of the object can be performed to simulate radioscopic or tomographic image acquisition. Simulations can be carried out with monochromatic or polychromatic beam spectra. This feature enables, for example, the beam hardening phenomenon to be dealt with or dual energy imaging techniques to be studied. The simulation principle is completely deterministic and consequently the computed images present no photon noise. Nevertheless, the variance of the signal associated with each pixel of the detector can be determined, which enables contrast-to-noise ratio (CNR) maps to be computed, in order to predict quantitatively the detectability of defects in the inspected object. The CNR is a relevant indicator for optimizing the experimental parameters. This paper provides several examples of simulated images that illustrate some of the rich possibilities offered by our software. Depending on the simulation type, the computation time order of magnitude can vary from 0.1 s (simple radiographic projection) up to several hours (3D tomography) on a PC, with a 400 MHz microprocessor. Our simulation tool proves to be useful in developing new specific applications, in choosing the most suitable components when designing a new testing chain, and in saving time by reducing the number of experimental tests.

  13. Modified-BRISQUE as no reference image quality assessment for structural MR images.

    Science.gov (United States)

    Chow, Li Sze; Rajagopal, Heshalini

    2017-11-01

    An effective and practical Image Quality Assessment (IQA) model is needed to assess the image quality produced from any new hardware or software in MRI. A highly competitive No-Reference IQA (NR-IQA) model called the Blind/Referenceless Image Spatial Quality Evaluator (BRISQUE), initially designed for natural images, was modified to evaluate structural MR images. The BRISQUE model measures image quality using locally normalized luminance coefficients, which are used to calculate the image features. The modified-BRISQUE model trained a new regression model using MR image features and the Difference Mean Opinion Score (DMOS) from 775 MR images. Two types of benchmarks, objective and subjective assessments, were used as performance evaluators for both the original and modified-BRISQUE models. There was a high correlation between the modified-BRISQUE and both benchmarks, and the correlations were higher than those for the original BRISQUE, with a significant percentage improvement in the correlation values. The modified-BRISQUE was statistically better than the original BRISQUE. The modified-BRISQUE model can accurately measure the image quality of MR images. It is a practical NR-IQA model for MR images that does not require reference images. Copyright © 2017 Elsevier Inc. All rights reserved.
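
    The "locally normalized luminance coefficients" at the heart of BRISQUE-style models are the mean-subtracted contrast-normalized (MSCN) field. A minimal Python sketch of that normalization step is given below, assuming SciPy's Gaussian filter as the local weighting; the window width and the stabilizing constant are illustrative choices, and the feature fitting and regression of the paper are not reproduced.

```python
import numpy as np
from scipy.ndimage import gaussian_filter  # assumed available

def mscn(image, sigma=7 / 6, c=1.0):
    """Mean-subtracted contrast-normalized (MSCN) coefficients: the locally
    normalized luminance field that BRISQUE-style features are built on."""
    image = image.astype(float)
    mu = gaussian_filter(image, sigma)
    var = np.clip(gaussian_filter(image ** 2, sigma) - mu ** 2, 0, None)
    return (image - mu) / (np.sqrt(var) + c)

rng = np.random.default_rng(0)
coeffs = mscn(rng.random((128, 128)) * 255)
print(round(coeffs.mean(), 3), round(coeffs.std(), 3))  # roughly zero-mean, unit-ish spread
```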

  14. Sensitometric properties and image quality of radiographic film and paper

    International Nuclear Information System (INIS)

    Domanus, J.C.

    1985-01-01

    When using X-ray film or radiographic paper for industrial applications one is interested in knowing not only their sensitometric properties (such as speed and contrast) but also the image quality obtainable with a particular brand of film or paper. Although standard methods for testing sensitometric properties and image quality separately are available, it is desirable to find a method by the use of which all the relevant properties could be tested together. The sensitometric properties are usually determined at constant kilovoltage and filtration at the X-ray tube, whereas the radiographic image quality is tested at different kilovoltages and for different material thicknesses

  15. Image quality control in radiodiagnostic in an University Hospital

    International Nuclear Information System (INIS)

    Almeida, Claudio Domingues de; Mota, Helvecio C.; Almeida, Carlos Eduardo de

    1996-01-01

    The image quality criteria proposed for European Union (UE) has been used to evaluate the chest x-ray examinations in a typical Department of Radiology of an University Hospital in Rio de Janeiro. The study includes information on x-ray beam parameters, film-screen combination, doses to the patients, film processing and image quality. Lateral and PA chest examinations of 63 patients were investigated. Only 10% of the patients presented entrance doses greater than the reference level proposed for UE and adopted by International Atomic Energy Agency and World Health Organization. The image quality has been approved for 87% of the examinations. (author)

  16. Design and simulation of a totally digital image system for medical image applications

    International Nuclear Information System (INIS)

    Archwamety, C.

    1987-01-01

    The Totally Digital Imaging System (TDIS) is based on system requirements information from the Radiology Department, University of Arizona Health Science Center. This dissertation presents the design of this complex system, the TDIS specification, the system performance requirements, and the evaluation of the system using the computer-simulation programs. Discrete-event simulation models were developed for the TDIS subsystems, including an image network, imaging equipment, storage migration algorithm, data base archive system, and a control and management network. The simulation system uses empirical data generation and retrieval rates measured at the University Medical Center hospital. The entire TDIS system was simulated in Simscript II.5 using a VAX 8600 computer system. Simulation results show the fiber-optical-image network to be suitable; however, the optical-disk-storage system represents a performance bottleneck

  17. Brain imaging with synthetic MR in children: clinical quality assessment

    Energy Technology Data Exchange (ETDEWEB)

    Betts, Aaron M.; Serai, Suraj [Cincinnati Children's Hospital Medical Center, Department of Radiology, Cincinnati, OH (United States); Leach, James L.; Jones, Blaise V. [Cincinnati Children's Hospital Medical Center, Department of Radiology, Cincinnati, OH (United States); University of Cincinnati College of Medicine, Cincinnati, OH (United States); Zhang, Bin [Cincinnati Children's Hospital Medical Center, Biostatistics and Epidemiology, Cincinnati, OH (United States)

    2016-10-15

    Synthetic magnetic resonance imaging is a quantitative imaging technique that measures inherent T1-relaxation, T2-relaxation, and proton density. These inherent tissue properties allow synthesis of various imaging sequences from a single acquisition. Clinical use of synthetic MR imaging has been described in adult populations. However, use of synthetic MR imaging has not been previously reported in children. The purpose of this study is to report our assessment of diagnostic image quality using synthetic MR imaging in children. Synthetic MR acquisition was obtained in a sample of children undergoing brain MR imaging. Image quality assessments were performed on conventional and synthetic T1-weighted, T2-weighted, and FLAIR images. Standardized linear measurements were performed on conventional and synthetic T2 images. Estimates of patient age based upon myelination patterns were also performed. Conventional and synthetic MR images were evaluated on 30 children. Using a 4-point assessment scale, conventional imaging performed better than synthetic imaging for T1-weighted, T2-weighted, and FLAIR images. When the assessment was simplified to a dichotomized scale, the conventional and synthetic T1-weighted and T2-weighted images performed similarly. However, the superiority of conventional FLAIR images persisted in the dichotomized assessment. There were no statistically significant differences between linear measurements made on T2-weighted images. Estimates of patient age based upon pattern of myelination were also similar between conventional and synthetic techniques. Synthetic MR imaging may be acceptable for clinical use in children. However, users should be aware of current limitations that could impact clinical utility in the software version used in this study. (orig.)

  18. Mammogram synthesis using a 3D simulation. I. Breast tissue model and image acquisition simulation

    International Nuclear Information System (INIS)

    Bakic, Predrag R.; Albert, Michael; Brzakovic, Dragana; Maidment, Andrew D. A.

    2002-01-01

    A method is proposed for generating synthetic mammograms based upon simulations of breast tissue and the mammographic imaging process. A computer breast model has been designed with a realistic distribution of large and medium scale tissue structures. Parameters controlling the size and placement of simulated structures (adipose compartments and ducts) provide a method for consistently modeling images of the same simulated breast with modified position or acquisition parameters. The mammographic imaging process is simulated using a compression model and a model of the x-ray image acquisition process. The compression model estimates breast deformation using tissue elasticity parameters found in the literature and clinical force values. The synthetic mammograms were generated by a mammogram acquisition model using a monoenergetic parallel beam approximation applied to the synthetically compressed breast phantom

  19. Effect of area x-ray beam equalization on image quality and dose in digital mammography

    Energy Technology Data Exchange (ETDEWEB)

    Wong, Jerry; Xu Tong; Husain, Adeel; Le, Huy; Molloi, Sabee [Department of Radiological Sciences, University of California, Irvine, CA 92697 (United States)

    2004-08-21

    In mammography, thick or dense breast regions persistently suffer from reduced contrast-to-noise ratio (CNR) because of degraded contrast from large scatter intensities and relatively high noise. Area x-ray beam equalization can improve image quality by increasing the x-ray exposure to under-penetrated regions without increasing the exposure to other breast regions. Optimal equalization parameters with respect to image quality and patient dose were determined through computer simulations and validated with experimental observations on a step phantom and an anthropomorphic breast phantom. Three parameters important in equalized digital mammography were considered: attenuator material (Z = 13-92), beam energy (22-34 kVp) and equalization level. A Mo/Mo digital mammography system was used for image acquisition. A prototype 16 × 16 piston-driven equalization system was used for preparing patient-specific equalization masks. Simulation studies showed that a molybdenum attenuator and an equalization level of 20 were optimal for improving contrast, CNR and the figure of merit (FOM = CNR²/dose). Experimental measurements using these parameters showed significant improvements in contrast, CNR and FOM. Moreover, equalized images of a breast phantom showed improved image quality. These results indicate that area beam equalization can improve image quality in digital mammography.
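
    The figure of merit quoted above, FOM = CNR²/dose, is a one-line calculation; the snippet below simply shows the comparison pattern with hypothetical CNR and dose numbers, not values measured in the study.

```python
def figure_of_merit(cnr, mean_glandular_dose_mgy):
    """FOM = CNR^2 / dose: the dose-normalized image-quality metric used to
    compare unequalized and equalized acquisitions."""
    return cnr ** 2 / mean_glandular_dose_mgy

# Hypothetical numbers purely to illustrate the comparison:
print(figure_of_merit(cnr=3.0, mean_glandular_dose_mgy=1.5))   # unequalized region
print(figure_of_merit(cnr=4.5, mean_glandular_dose_mgy=1.8))   # equalized region
```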

  20. Improvement on image quality of single photon ECT with converging collimator system

    International Nuclear Information System (INIS)

    Murayama, Hideo; Nohara, Norimasa; Tanaka, Eiichi

    1986-01-01

    Single photon emission computed tomography (SPECT) with converging collimator system was proposed to improve quality of reconstructed images. The collimator system was designed to enhance sensitivity at the center region of field-of-view, where the probability photons escape the attenuating medium is smaller than at the off-center region. In order to evaluate efficiency of the improvement on image quality, the weighting function of projection, which is defined as relative sensitivity to the average on the lateral sampling of projection, was adopted to the image reconstruction algorithm of Radial Post Correction method. Statistical mean square noise in a reconstructed image was formulated in this method. Simulation studies using typical weighting function showed that center-enhanced weighting function brings effective improvement on image quality, especially, at the center region of cold area surrounded by annularly distributed activity. A new SPECT system was proposed as one example of the converging collimator systems. The system is composed of four gamma cameras with four fan-beam collimators, which have different focal distances one another. Simple simulation studies showed that the proposed system has reasonable center-enhanced weighting function, and the image quality based on the proposed system was fairly improved as compared with one based on uniform weighting function at the center region of the field-of-view. (author)

  1. SEGMENTATION AND QUALITY ANALYSIS OF LONG RANGE CAPTURED IRIS IMAGE

    Directory of Open Access Journals (Sweden)

    Anand Deshpande

    2016-05-01

    Full Text Available The iris segmentation plays a major role in an iris recognition system to increase the performance of the system. This paper proposes a novel method for segmentation of iris images to extract the iris region from long-range captured eye images, and an approach to select the best iris frame from iris polar image sequences by analyzing the quality of the iris polar images. The quality of an iris image is determined from the frequency components present in the iris polar images. The experiments are carried out on CASIA long-range captured iris image sequences. The proposed segmentation method is compared with Hough transform based segmentation, and it has been determined that the proposed method gives higher segmentation accuracy than the Hough transform.
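
    A frequency-based quality score for selecting the best polar iris frame can be sketched as the fraction of spectral energy above a cutoff frequency. The Python example below uses an assumed cutoff of 0.25 cycles/pixel and random frames as placeholders; it illustrates the idea rather than the authors' exact measure.

```python
import numpy as np

def high_frequency_energy(polar_iris, cutoff=0.25):
    """Fraction of spectral energy above a normalized radial frequency cutoff;
    a sharper, better-focused iris polar image scores higher."""
    spectrum = np.fft.fftshift(np.fft.fft2(polar_iris.astype(float)))
    power = np.abs(spectrum) ** 2
    ny, nx = polar_iris.shape
    fy = np.fft.fftshift(np.fft.fftfreq(ny))[:, None]
    fx = np.fft.fftshift(np.fft.fftfreq(nx))[None, :]
    radius = np.sqrt(fy ** 2 + fx ** 2)
    return power[radius > cutoff].sum() / power.sum()

def best_frame(polar_sequence):
    """Pick the polar iris frame with the largest high-frequency content."""
    return max(range(len(polar_sequence)),
               key=lambda k: high_frequency_energy(polar_sequence[k]))

rng = np.random.default_rng(2)
frames = [rng.random((64, 256)) for _ in range(5)]   # placeholder polar frames
print(best_frame(frames))
```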

  2. CALIBRATED ULTRA FAST IMAGE SIMULATIONS FOR THE DARK ENERGY SURVEY

    Energy Technology Data Exchange (ETDEWEB)

    Bruderer, Claudio; Chang, Chihway; Refregier, Alexandre; Amara, Adam; Bergé, Joel; Gamper, Lukas, E-mail: claudio.bruderer@phys.ethz.ch [Institute for Astronomy, Department of Physics, ETH Zurich, Wolfgang-Pauli-Strasse 27, 8093 Zürich (Switzerland)

    2016-01-20

    Image simulations are becoming increasingly important in understanding the measurement process of the shapes of galaxies for weak lensing and the associated systematic effects. For this purpose we present the first implementation of the Monte Carlo Control Loops (MCCL), a coherent framework for studying systematic effects in weak lensing. It allows us to model and calibrate the shear measurement process using image simulations from the Ultra Fast Image Generator (UFig) and the image analysis software SExtractor. We apply this framework to a subset of the data taken during the Science Verification period (SV) of the Dark Energy Survey (DES). We calibrate the UFig simulations to be statistically consistent with one of the SV images, which covers ∼0.5 square degrees. We then perform tolerance analyses by perturbing six simulation parameters and study their impact on the shear measurement at the one-point level. This allows us to determine the relative importance of different parameters. For spatially constant systematic errors and point-spread function, the calibration of the simulation reaches the weak lensing precision needed for the DES SV survey area. Furthermore, we find a sensitivity of the shear measurement to the intrinsic ellipticity distribution, and an interplay between the magnitude-size and the pixel value diagnostics in constraining the noise model. This work is the first application of the MCCL framework to data and shows how it can be used to methodically study the impact of systematics on the cosmic shear measurement.

  3. Evaluating Picture Quality of Image Plates in Digital CR Systems

    Energy Technology Data Exchange (ETDEWEB)

    Kwak, Byung Joon [Dept. of Radiological Technology, Choonhae College of Health Science, Ulsan (Korea, Republic of); Ji Tae Jeong [Dept. of Radiological Science, Kaya University, Kimhae (Korea, Republic of)

    2011-12-15

    This study examined the effects of outside radiation on image plates during image acquisition in computed radiography (CR) systems, with a view to their effective use in clinical applications. Image plates were grouped by storage place and storage period, and the resulting small dark spots were compared and analyzed; the concentration distribution within the image boundaries was also assessed. Comparing the number of dark spots for plates stored in a light room and in a dark room showed that dark spots increased slightly on a plate stored in a light room on the first and second days and increased in proportion to the storage time, whereas for the plate stored in a dark room the number of dark spots decreased markedly. With regard to picture quality related to position on the image plate, damage to picture quality could be reduced by locating regions of interest in the center. With regard to differences in sharpness with changes in subject thickness, fewer scattered rays occurred and sharpness improved when the thickness of the subject was reduced as much as possible. To obtain medical images of excellent quality, image plates should be managed carefully; it is desirable to keep image plates in dark iron-plate boxes and not to expose them to outside radiation for a long time.

  4. Evaluating Picture Quality of Image Plates in Digital CR Systems

    International Nuclear Information System (INIS)

    Kwak, Byung Joon; Ji Tae Jeong

    2011-01-01

    This study examined the effects of outside radiation on image plates during image acquisition in computed radiography (CR) systems, with a view to their effective use in clinical applications. Image plates were grouped by storage place and storage period, and the resulting small dark spots were compared and analyzed; the concentration distribution within the image boundaries was also assessed. Comparing the number of dark spots for plates stored in a light room and in a dark room showed that dark spots increased slightly on a plate stored in a light room on the first and second days and increased in proportion to the storage time, whereas for the plate stored in a dark room the number of dark spots decreased markedly. With regard to picture quality related to position on the image plate, damage to picture quality could be reduced by locating regions of interest in the center. With regard to differences in sharpness with changes in subject thickness, fewer scattered rays occurred and sharpness improved when the thickness of the subject was reduced as much as possible. To obtain medical images of excellent quality, image plates should be managed carefully; it is desirable to keep image plates in dark iron-plate boxes and not to expose them to outside radiation for a long time.

  5. Tree-structured vector quantization of CT chest scans: Image quality and diagnostic accuracy

    International Nuclear Information System (INIS)

    Cosman, P.C.; Tseng, C.; Gray, R.M.; Olshen, R.A.; Moses, L.E.; Davidson, H.C.; Bergin, C.J.; Riskin, E.A.

    1993-01-01

    The quality of lossy compressed images is often characterized by signal-to-noise ratios, informal tests of subjective quality, or receiver operating characteristic (ROC) curves that include subjective appraisals of the value of an image for a particular application. The authors believe that for medical applications, lossy compressed images should be judged by a more natural and fundamental aspect of relative image quality: their use in making accurate diagnoses. They apply a lossy compression algorithm to medical images, and quantify the quality of the images by the diagnostic performance of radiologists, as well as by traditional signal-to-noise ratios and subjective ratings. The study is unlike previous studies of the effects of lossy compression in that they consider non-binary detection tasks, simulate actual diagnostic practice instead of using paired tests or confidence rankings, use statistical methods that are more appropriate for non-binary clinical data than are the popular ROC curves, and use low-complexity predictive tree-structured vector quantization for compression rather than DCT-based transform codes combined with entropy coding. Their diagnostic tasks are the identification of nodules (tumors) in the lungs and lymphadenopathy in the mediastinum from computerized tomography (CT) chest scans. For the image modality, compression algorithm, and diagnostic tasks they consider, the original 12 bit per pixel (bpp) CT image can be compressed to between 1 bpp and 2 bpp with no significant changes in diagnostic accuracy

  6. Effect of masking phase-only holograms on the quality of reconstructed images.

    Science.gov (United States)

    Deng, Yuanbo; Chu, Daping

    2016-04-20

    A phase-only hologram modulates the phase of the incident light and diffracts it efficiently with low energy loss because of the minimum absorption. Much research attention has been focused on how to generate phase-only holograms, and little work has been done to understand the effect and limitation of their partial implementation, possibly due to physical defects and constraints, in particular as in the practical situations where a phase-only hologram is confined or needs to be sliced or tiled. The present study simulates the effect of masking phase-only holograms on the quality of reconstructed images in three different scenarios with different filling factors, filling positions, and illumination intensity profiles. Quantitative analysis confirms that the width of the image point spread function becomes wider and the image quality decreases, as expected, when the filling factor decreases, and the image quality remains the same for different filling positions as well. The width of the image point spread function as derived from different filling factors shows a consistent behavior to that as measured directly from the reconstructed image, especially as the filling factor becomes small. Finally, mask profiles of different shapes and intensity distributions are shown to have more complicated effects on the image point spread function, which in turn affects the quality and textures of the reconstructed image.
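
    The masking experiment is easy to reproduce numerically for a point target: compute a phase-only hologram, zero out part of it, reconstruct with an inverse FFT and compare the point spread functions. The sketch below assumes a far-field (Fourier) geometry and a centered square aperture with a filling factor of 0.25; these are illustrative simplifications of the scenarios studied in the paper.

```python
import numpy as np

def phase_only_hologram(target):
    """Far-field phase-only hologram: keep only the phase of the target spectrum."""
    return np.angle(np.fft.fft2(np.fft.ifftshift(target)))

def reconstruct(phase, mask=None):
    """Illuminate the (optionally masked) phase hologram with a uniform plane
    wave and propagate back with an inverse FFT."""
    field = np.exp(1j * phase)
    if mask is not None:
        field = field * mask               # masking = partial implementation
    return np.abs(np.fft.fftshift(np.fft.ifft2(field))) ** 2

# Point target; a centered square aperture with filling factor 0.25 widens the PSF.
n = 256
target = np.zeros((n, n))
target[n // 2, n // 2] = 1.0
phase = phase_only_hologram(target)
mask = np.zeros((n, n))
mask[n // 4: 3 * n // 4, n // 4: 3 * n // 4] = 1.0
full, masked = reconstruct(phase), reconstruct(phase, mask)
print(full.max() / masked.max())   # ~16: peak drops as (1/filling factor)^2
```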

  7. Free Energy Adjusted Peak Signal to Noise Ratio (FEA-PSNR) for Image Quality Assessment

    Science.gov (United States)

    Liu, Ning; Zhai, Guangtao

    2017-12-01

    Peak signal to noise ratio (PSNR), the de facto universal image quality metric, has been widely criticized for its poor correlation with human subjective quality ratings. In this paper, it is illustrated that the low performance of PSNR as an image quality metric is partially due to its inability to differentiate image contents. It is revealed that the deviation between the subjective score and PSNR for each type of distortion can be systematically captured by the perceptual complexity of the target image. The free energy modelling technique is then introduced to simulate the human cognitive process and measure the perceptual complexity of an image. It is then shown that the performance of PSNR can be effectively improved using a linear score mapping process that considers image free energy and distortion type. The proposed free energy adjusted peak signal to noise ratio (FEA-PSNR) does not change the computational steps of the ordinary PSNR, and therefore it inherits the merits of being simple, derivable and physically meaningful. FEA-PSNR can thus be easily integrated into existing PSNR-based image processing systems to achieve more visually plausible results. The proposed analysis approach can also be extended to other types of image quality metrics for enhanced performance.
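
    For reference, ordinary PSNR and the kind of linear, content-conditioned remapping that FEA-PSNR applies can be sketched as follows; the complexity value and the mapping coefficients are placeholders, since the paper fits them per distortion type from its free energy model.

```python
import numpy as np

def psnr(reference, test, peak=255.0):
    """Ordinary peak signal-to-noise ratio in dB."""
    mse = np.mean((reference.astype(float) - test.astype(float)) ** 2)
    return float("inf") if mse == 0 else 10.0 * np.log10(peak ** 2 / mse)

def content_adjusted_score(psnr_db, complexity, a=1.0, b=-0.5, c=0.0):
    """Linear remapping of PSNR conditioned on a per-image complexity term,
    in the spirit of FEA-PSNR; a, b, c are placeholder coefficients."""
    return a * psnr_db + b * complexity + c

rng = np.random.default_rng(3)
ref = rng.integers(0, 256, (64, 64))
noisy = np.clip(ref + rng.normal(0, 10, ref.shape), 0, 255)
print(round(psnr(ref, noisy), 2))
print(round(content_adjusted_score(psnr(ref, noisy), complexity=40.0), 2))
```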

  8. Quality assessment in radiological imaging methods

    International Nuclear Information System (INIS)

    Herstel, W.

    1985-01-01

    The equipment used in diagnostic radiology is becoming more and more complicated. In the imaging process four components are distinguished, each of which can introduce loss in essential information: the X-ray source, the human body, the imaging system and the observer. In nearly all imaging methods the X-ray quantum fluctuations are a limitation to observation. But there are also technical factors. As an illustration it is shown how in a television scanning process the resolution is restricted by the system parameters. A short review is given of test devices and the results are given of an image comparison based on regular bar patterns. Although this method has the disadvantage of measuring mainly the limiting resolution, the results of the test correlate reasonably well with the subjective appreciations of radiographs of bony structures made by a group of trained radiologists. Fluoroscopic systems should preferably be tested using moving structures under dynamic conditions. (author)

  9. Optimization of shearography image quality analysis

    International Nuclear Information System (INIS)

    Rafhayudi Jamro

    2005-01-01

    Shearography is an optical technique based on speckle patterns to measure the deformation of an object surface, in which the fringe pattern is obtained through correlation analysis of the speckle patterns. Analysis of the fringe pattern for engineering applications is limited to qualitative measurement. Therefore, for further analysis leading to quantitative data, a series of image processing steps is involved. In this paper, the fringe pattern for qualitative analysis is discussed. The principal field of application is qualitative non-destructive testing, such as detecting discontinuities and defects in the material structure, locating fatigue zones, etc., all of which require image processing. In order to perform image optimization successfully, the noise in the fringe pattern must be minimized and the fringe pattern itself must be maximized. This can be achieved by applying a filtering method with a kernel size ranging from 2 × 2 to 7 × 7 pixels and by applying an equalizer in the image processing. (Author)

  10. Assessing microscope image focus quality with deep learning.

    Science.gov (United States)

    Yang, Samuel J; Berndl, Marc; Michael Ando, D; Barch, Mariya; Narayanaswamy, Arunachalam; Christiansen, Eric; Hoyer, Stephan; Roat, Chris; Hung, Jane; Rueden, Curtis T; Shankar, Asim; Finkbeiner, Steven; Nelson, Philip

    2018-03-15

    Large image datasets acquired on automated microscopes typically have some fraction of low quality, out-of-focus images, despite the use of hardware autofocus systems. Identification of these images using automated image analysis with high accuracy is important for obtaining a clean, unbiased image dataset. Complicating this task is the fact that image focus quality is only well-defined in foreground regions of images, and as a result, most previous approaches only enable a computation of the relative difference in quality between two or more images, rather than an absolute measure of quality. We present a deep neural network model capable of predicting an absolute measure of image focus on a single image in isolation, without any user-specified parameters. The model operates at the image-patch level, and also outputs a measure of prediction certainty, enabling interpretable predictions. The model was trained on only 384 in-focus Hoechst (nuclei) stain images of U2OS cells, which were synthetically defocused to one of 11 absolute defocus levels during training. The trained model can generalize on previously unseen real Hoechst stain images, identifying the absolute image focus to within one defocus level (approximately 3 pixel blur diameter difference) with 95% accuracy. On a simpler binary in/out-of-focus classification task, the trained model outperforms previous approaches on both Hoechst and Phalloidin (actin) stain images (F-scores of 0.89 and 0.86, respectively over 0.84 and 0.83), despite only having been presented Hoechst stain images during training. Lastly, we observe qualitatively that the model generalizes to two additional stains, Hoechst and Tubulin, of an unseen cell type (Human MCF-7) acquired on a different instrument. Our deep neural network enables classification of out-of-focus microscope images with both higher accuracy and greater precision than previous approaches via interpretable patch-level focus and certainty predictions. The use of
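
    For context, the classical reference-free baselines that such deep models are compared against are simple sharpness statistics; a common one is the variance of the Laplacian, sketched below with SciPy. This is a stand-in baseline for illustration, not the network described in the paper.

```python
import numpy as np
from scipy.ndimage import gaussian_filter, laplace  # assumed available

def laplacian_focus_measure(image):
    """Variance of the Laplacian: a classical, reference-free sharpness score.
    This is the kind of baseline the deep model is compared against, not the
    network itself."""
    return float(np.var(laplace(image.astype(float))))

rng = np.random.default_rng(4)
sharp = rng.random((128, 128))
blurred = gaussian_filter(sharp, sigma=3.0)        # synthetic defocus
print(laplacian_focus_measure(sharp) > laplacian_focus_measure(blurred))  # True
```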

  11. X-ray Computed Tomography Image Quality Indicator (IQI) Development

    Data.gov (United States)

    National Aeronautics and Space Administration — Phase one of the program is to identify suitable x-ray Computed Tomography (CT) Image Quality Indicator (IQI) design(s) that can be used to adequately capture CT...

  12. Image quality measurements for X-ray television chains

    International Nuclear Information System (INIS)

    Mohr, M.

    1986-01-01

    Image quality measurements were carried out for 36 television chains during 3 years. For the parameters sensitivity, resolution, contrast-detail diagram, minimal contrast and dose rate average values and experiences on their long-term stability are reported. (author)

  13. TH-B-207B-00: Pediatric Image Quality Optimization

    International Nuclear Information System (INIS)

    2016-01-01

    This imaging educational program will focus on solutions to common pediatric image quality optimization challenges. The speakers will present collective knowledge on best practices in pediatric imaging from their experience at dedicated children’s hospitals. One of the most commonly encountered pediatric imaging requirements for the non-specialist hospital is pediatric CT in the emergency room setting. Thus, this educational program will begin with optimization of pediatric CT in the emergency department. Though pediatric cardiovascular MRI may be less common in the non-specialist hospitals, low pediatric volumes and unique cardiovascular anatomy make optimization of these techniques difficult. Therefore, our second speaker will review best practices in pediatric cardiovascular MRI based on experiences from a children’s hospital with a large volume of cardiac patients. Learning Objectives: To learn techniques for optimizing radiation dose and image quality for CT of children in the emergency room setting. To learn solutions for consistently high quality cardiovascular MRI of children

  14. Dosimetry and image quality assessment in a direct radiography system

    Energy Technology Data Exchange (ETDEWEB)

    Oliveira, Bruno Beraldo; Paixao, Lucas; Nogueira, Maria do Socorro, E-mail: boliveira.mg@gmail.com [Centro de Desenvolvimento da Tecnologia Nuclear (CDTN/CNEN-MG), Belo Horizonte, MG (Brazil); Oliveira, Marcio Alves de [Universidade Federal de Minas Gerais (UFMG), Belo Horizonte, MG (Brazil). Fac. de Medicina. Dept. de Anatomia e Imagem; Teixeira, Maria Helena Araujo [Clinica Dra. Maria Helena Araujo Teixeira, Belo Horizonte, MG (Brazil)

    2014-11-15

    Objective: to evaluate the mean glandular dose with a solid state detector and the image quality in a direct radiography system, utilizing phantoms. Materials and methods: Irradiations were performed with automatic exposure control and polymethyl methacrylate slabs with different thicknesses to calculate glandular dose values. The image quality was evaluated by means of the structures visualized on the images of the phantoms. Results: considering the uncertainty of the measurements, the mean glandular dose results are in agreement with the values provided by the equipment and with internationally adopted reference levels. Results obtained from images of the phantoms were in agreement with the reference values. Conclusion: the present study contributes to verify the equipment conformity as regards dose values and image quality. (author)

  15. TH-B-207B-00: Pediatric Image Quality Optimization

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2016-06-15

    This imaging educational program will focus on solutions to common pediatric image quality optimization challenges. The speakers will present collective knowledge on best practices in pediatric imaging from their experience at dedicated children’s hospitals. One of the most commonly encountered pediatric imaging requirements for the non-specialist hospital is pediatric CT in the emergency room setting. Thus, this educational program will begin with optimization of pediatric CT in the emergency department. Though pediatric cardiovascular MRI may be less common in the non-specialist hospitals, low pediatric volumes and unique cardiovascular anatomy make optimization of these techniques difficult. Therefore, our second speaker will review best practices in pediatric cardiovascular MRI based on experiences from a children’s hospital with a large volume of cardiac patients. Learning Objectives: To learn techniques for optimizing radiation dose and image quality for CT of children in the emergency room setting. To learn solutions for consistently high quality cardiovascular MRI of children.

  16. Simulating Galaxies and Active Galactic Nuclei in the LSST Image Simulation Effort

    NARCIS (Netherlands)

    Pizagno II, Jim; Ahmad, Z.; Bankert, J.; Bard, D.; Connolly, A.; Chang, C.; Gibson, R. R.; Gilmore, K.; Grace, E.; Hannel, M.; Jernigan, J. G.; Jones, L.; Kahn, S. M.; Krughoff, S. K.; Lorenz, S.; Marshall, S.; Shmakova, S. M.; Sylvestri, N.; Todd, N.; Young, M.

    We present an extragalactic source catalog, which includes galaxies and Active Galactic Nuclei, that is used for the Large Synoptic Survey Telescope (LSST) image simulation effort. The galaxies are taken from the De Lucia et al. (2006) semi-analytic modeling (SAM) of the Millennium Simulation. The LSST

  17. Naturalness and image quality : saturation and lightness variation in color images of natural scenes

    NARCIS (Netherlands)

    Ridder, de H.

    1996-01-01

    The relation between perceived image quality and naturalness was investigated by varying the colorfulness of natural images at various lightness levels. At each lightness level, subjects assessed perceived colorfulness, naturalness, and quality as a function of average saturation by means of direct

  18. TH-CD-207A-08: Simulated Real-Time Image Guidance for Lung SBRT Patients Using Scatter Imaging

    Energy Technology Data Exchange (ETDEWEB)

    Redler, G; Cifter, G; Templeton, A; Lee, C; Bernard, D; Liao, Y; Zhen, H; Turian, J; Chu, J [Rush University Medical Center, Chicago, IL (United States)

    2016-06-15

    Purpose: To develop a comprehensive Monte Carlo-based model for the acquisition of scatter images of patient anatomy in real-time, during lung SBRT treatment. Methods: During SBRT treatment, images of patient anatomy can be acquired from scattered radiation. To rigorously examine the utility of scatter images for image guidance, a model is developed using MCNP code to simulate scatter images of phantoms and lung cancer patients. The model is validated by comparing experimental and simulated images of phantoms of different complexity. The differentiation between tissue types is investigated by imaging objects of known compositions (water, lung, and bone equivalent). A lung tumor phantom, simulating materials and geometry encountered during lung SBRT treatments, is used to investigate image noise properties for various quantities of delivered radiation (monitor units(MU)). Patient scatter images are simulated using the validated simulation model. 4DCT patient data is converted to an MCNP input geometry accounting for different tissue composition and densities. Lung tumor phantom images acquired with decreasing imaging time (decreasing MU) are used to model the expected noise amplitude in patient scatter images, producing realistic simulated patient scatter images with varying temporal resolution. Results: Image intensity in simulated and experimental scatter images of tissue equivalent objects (water, lung, bone) match within the uncertainty (∼3%). Lung tumor phantom images agree as well. Specifically, tumor-to-lung contrast matches within the uncertainty. The addition of random noise approximating quantum noise in experimental images to simulated patient images shows that scatter images of lung tumors can provide images in as fast as 0.5 seconds with CNR∼2.7. Conclusions: A scatter imaging simulation model is developed and validated using experimental phantom scatter images. Following validation, lung cancer patient scatter images are simulated. These simulated

  19. TH-CD-207A-08: Simulated Real-Time Image Guidance for Lung SBRT Patients Using Scatter Imaging

    International Nuclear Information System (INIS)

    Redler, G; Cifter, G; Templeton, A; Lee, C; Bernard, D; Liao, Y; Zhen, H; Turian, J; Chu, J

    2016-01-01

    Purpose: To develop a comprehensive Monte Carlo-based model for the acquisition of scatter images of patient anatomy in real-time, during lung SBRT treatment. Methods: During SBRT treatment, images of patient anatomy can be acquired from scattered radiation. To rigorously examine the utility of scatter images for image guidance, a model is developed using MCNP code to simulate scatter images of phantoms and lung cancer patients. The model is validated by comparing experimental and simulated images of phantoms of different complexity. The differentiation between tissue types is investigated by imaging objects of known compositions (water, lung, and bone equivalent). A lung tumor phantom, simulating materials and geometry encountered during lung SBRT treatments, is used to investigate image noise properties for various quantities of delivered radiation (monitor units(MU)). Patient scatter images are simulated using the validated simulation model. 4DCT patient data is converted to an MCNP input geometry accounting for different tissue composition and densities. Lung tumor phantom images acquired with decreasing imaging time (decreasing MU) are used to model the expected noise amplitude in patient scatter images, producing realistic simulated patient scatter images with varying temporal resolution. Results: Image intensity in simulated and experimental scatter images of tissue equivalent objects (water, lung, bone) match within the uncertainty (∼3%). Lung tumor phantom images agree as well. Specifically, tumor-to-lung contrast matches within the uncertainty. The addition of random noise approximating quantum noise in experimental images to simulated patient images shows that scatter images of lung tumors can provide images in as fast as 0.5 seconds with CNR∼2.7. Conclusions: A scatter imaging simulation model is developed and validated using experimental phantom scatter images. Following validation, lung cancer patient scatter images are simulated. These simulated

  20. ANALYSIS OF THE EFFECTS OF IMAGE QUALITY ON DIGITAL MAP GENERATION FROM SATELLITE IMAGES

    Directory of Open Access Journals (Sweden)

    H. Kim

    2012-07-01

    Full Text Available High resolution satellite images are widely used to produce and update digital maps since they became widely available. It is well known that the accuracy of a digital map produced from satellite images is determined largely by the accuracy of geometric modelling. However, digital maps are made through a series of photogrammetric workflows, so their accuracy is also affected by the quality of the satellite images, such as image interpretability. For satellite images, parameters such as the Modulation Transfer Function (MTF), Signal to Noise Ratio (SNR) and Ground Sampling Distance (GSD) are used to describe image quality. Our previous research stressed that such quality parameters may not represent the quality of image products such as digital maps, and that parameters for image interpretability, such as the Ground Resolved Distance (GRD) and the National Imagery Interpretability Rating Scale (NIIRS), need to be considered. In this study, we analyzed the effects of image quality on the accuracy of digital maps produced from satellite images. QuickBird, IKONOS and KOMPSAT-2 imagery were used for the analysis as they have similar GSDs. We measured the various image quality parameters mentioned above from these images. Then we produced digital maps from the images using a digital photogrammetric workstation. We analyzed the accuracy of the digital maps in terms of their location accuracy and their level of detail. Then we compared the correlation between the various image quality parameters and the accuracy of the digital maps. The results of this study showed that GRD and NIIRS were more critical for map production than GSD, MTF or SNR.

  1. Simulation of water quality for Salt Creek in northeastern Illinois

    Science.gov (United States)

    Melching, Charles S.; Chang, T.J.

    1996-01-01

    Water-quality processes in the Salt Creek watershed in northeastern Illinois were simulated with a computer model. Selected waste-load scenarios for 7-day, 10-year low-flow conditions were simulated in the stream system. The model development involved the calibration of the U.S. Environmental Protection Agency QUAL2E model to water-quality constituent concentration data collected by the Illinois Environmental Protection Agency (IEPA) for a diel survey on August 29-30, 1995, and the verification of this model with water-quality constituent concentration data collected by the IEPA for a diel survey on June 27-28, 1995. In-stream measurements of sediment oxygen demand rates and carbonaceous biochemical oxygen demand (CBOD) decay rates by the IEPA and traveltime and reaeration-rate coefficients by the U.S. Geological Survey facilitated the development of a model for simulation of water quality in the Salt Creek watershed. In general, the verification of the calibrated model increased confidence in the utility of the model for water-quality planning in the Salt Creek watershed. However, the model was adjusted to better simulate constituent concentrations measured during the June 27-28, 1995, diel survey. Two versions of the QUAL2E model were utilized to simulate dissolved oxygen (DO) concentrations in the Salt Creek watershed for selected effluent discharge and concentration scenarios for water-quality planning: (1) the QUAL2E model calibrated to the August 29-30, 1995, diel survey, and (2) the QUAL2E model adjusted to the June 27-28, 1995, diel survey. The results of these simulations indicated that the QUAL2E model adjusted to the June 27-28, 1995, diel survey simulates reliable information for water-quality planning. The results of these simulations also indicated that to maintain DO concentrations greater than 5 milligrams per liter (mg/L) throughout most of Salt Creek for 7-day, 10-year low-flow conditions, the sewage-treatment plants (STP's) must discharge
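
    The dissolved-oxygen balance at the core of QUAL2E descends from the classical Streeter-Phelps oxygen-sag formulation, which is compact enough to show directly. The sketch below uses illustrative CBOD, reaeration and saturation values, not the calibrated Salt Creek parameters, and omits the additional processes (sediment oxygen demand, nitrification, algal effects) that QUAL2E layers on top.

```python
import numpy as np

def do_deficit(t_days, l0_mgl, d0_mgl, kd=0.3, ka=0.6):
    """Classical Streeter-Phelps oxygen-sag deficit (mg/L) at travel time t.
    kd: CBOD decay rate (1/day), ka: reaeration rate (1/day); requires ka != kd."""
    t = np.asarray(t_days, dtype=float)
    return ((kd * l0_mgl / (ka - kd)) * (np.exp(-kd * t) - np.exp(-ka * t))
            + d0_mgl * np.exp(-ka * t))

do_sat = 8.0                              # assumed saturation DO, mg/L
t = np.linspace(0, 5, 6)                  # days of travel time along the reach
deficit = do_deficit(t, l0_mgl=12.0, d0_mgl=1.0)
print(np.round(do_sat - deficit, 2))      # simulated DO profile, mg/L
```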

  2. [Accuracy of morphological simulation for orthognathic surgery: assessment of a 3D image fusion software].

    Science.gov (United States)

    Terzic, A; Schouman, T; Scolozzi, P

    2013-08-06

    CT/CBCT data allow for 3D reconstruction of the skeletal and untextured soft tissue volumes. 3D stereophotogrammetry has strongly improved the quality of facial soft tissue surface texture. The combination of these two technologies allows for an accurate and complete reconstruction. The 3D virtual head may be used for orthognathic surgical planning, virtual surgery, and morphological simulation obtained with software dedicated to the fusion of 3D photogrammetric and radiological images. The imaging material includes a multi-slice CT scan or broad-field CBCT scan and a 3D photogrammetric camera. The operative image processing protocol includes the following steps: 1) pre- and postoperative CT/CBCT scan and 3D photogrammetric image acquisition; 2) 3D image segmentation and fusion of the untextured CT/CBCT skin with the preoperative textured facial soft tissue surface of the 3D photogrammetric scan; 3) image fusion of the pre- and postoperative CT/CBCT data sets, virtual osteotomies, and 3D photogrammetric soft tissue virtual simulation; 4) fusion of the virtual simulated 3D photogrammetric and real postoperative images, and assessment of accuracy using a color-coded scale to measure the differences between the two surfaces. Copyright © 2013. Published by Elsevier Masson SAS.

  3. Improved radionuclide bone imaging agent injection needle withdrawal method can improve image quality

    International Nuclear Information System (INIS)

    Qin Yongmei; Wang Laihao; Zhao Lihua; Guo Xiaogang; Kong Qingfeng

    2009-01-01

    Objective: To investigate whether an improved needle withdrawal method for injection of radionuclide bone imaging agent can improve whole-body bone scan image quality. Methods: In the routine group of 117 cases, the bone imaging agent was injected directly through an elbow vein, the needle was withdrawn rapidly, and the puncture point was pressed briefly with a cotton swab. In the improved group of 117 cases, two cotton swabs were placed over the skin and vessel entry points, and both points were pressed for at least 5 min while the needle was withdrawn. Whole-body planar bone SPECT imaging was performed 2 hours later. Results: The rate of imaging-agent uptake at the injection site was 16.24% in the routine group and 2.56% in the improved group. Conclusion: The improved needle withdrawal method for bone imaging agent injection significantly decreases imaging-agent uptake at the injection site and can improve whole-body bone image quality. (authors)

  4. Applying image quality in cell phone cameras: lens distortion

    Science.gov (United States)

    Baxter, Donald; Goma, Sergio R.; Aleksic, Milivoje

    2009-01-01

    This paper describes the framework used in one of the pilot studies run under the I3A CPIQ initiative to quantify overall image quality in cell-phone cameras. The framework is based on a multivariate formalism which tries to predict overall image quality from individual image quality attributes and was validated in a CPIQ pilot program. The pilot study focuses on image quality distortions introduced in the optical path of a cell-phone camera, which may or may not be corrected in the image processing path. The assumption is that the captured image is JPEG compressed and the cell-phone camera is set to 'auto' mode. Because the framework requires the individual attributes to be relatively perceptually orthogonal, the attributes used in the pilot study are lens geometric distortion (LGD) and lateral chromatic aberrations (LCA). The goal of this paper is to present the framework of this pilot project, from the definition of the individual attributes up to their quantification in JNDs of quality, a requirement of the multivariate formalism; therefore, both objective and subjective evaluations were used. A major distinction of the objective part from the 'DSC imaging world' is that the LCA/LGD distortions found in cell-phone cameras rarely exhibit radial behavior; therefore, a radial mapping/model cannot be used in this case.

  5. Relationships of virtual reality neuroendoscopic simulations to actual imaging.

    Science.gov (United States)

    Riegel, T; Alberti, O; Retsch, R; Shiratori, V; Hellwig, D; Bertalanffy, H

    2000-12-01

    Advances in computer technology have made virtual reality images of the ventricular system possible. To determine the relevance of these images, we compared virtual reality simulations of the ventricular system with endoscopic findings in three patients. The virtual fly-through can be simulated after definition of waypoints, and objects of interest along the flight path can be viewed from all sides. Important drawbacks are that filigree structures may be missed and blood vessels cannot be distinguished clearly. However, virtual endoscopy can presently be used as a planning tool or for training and has future potential for neurosurgery.

  6. Fast and Automatic Ultrasound Simulation from CT Images

    Directory of Open Access Journals (Sweden)

    Weijian Cong

    2013-01-01

    Ultrasound is widely used in clinical diagnosis because of its fast and safe imaging principles. However, the anatomical structures present in an ultrasound image are not as clear as in CT or MRI, so physicians usually need advanced clinical knowledge and experience to distinguish diseased tissues. Fast simulation of ultrasound provides a cost-effective way to train this skill and to correlate ultrasound with the underlying anatomic structures. In this paper, a novel method is proposed for fast simulation of ultrasound from a CT image. A multiscale method is developed to enhance tubular structures so as to simulate the blood flow. The acoustic response of common tissues is generated by weighted integration of adjacent regions along the ultrasound propagation path in the CT image, from which parameters including attenuation, reflection, scattering, and noise are estimated simultaneously. The thin-plate spline interpolation method is employed to transform the simulated image between polar and rectangular coordinate systems. The Kaiser window function is utilized to produce the integration and radial blurring effects of multiple transducer elements. Experimental results show that the developed method is fast and effective, allowing realistic ultrasound images to be generated quickly. Given that the developed method is fully automatic, it can be utilized for ultrasound-guided navigation in clinical practice and for training purposes.
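
    The record above maps a simulated image between polar and rectangular coordinates (the authors reportedly use thin-plate-spline interpolation). The sketch below illustrates the general scan-conversion step with a simpler bilinear resampling in NumPy/SciPy; the fan geometry (sector angle, maximum depth) and output size are illustrative assumptions, not values from the paper.

      import numpy as np
      from scipy.ndimage import map_coordinates

      def scan_convert(polar_img, sector_deg=60.0, max_depth_mm=150.0, out_size=512):
          """Resample a (depth x angle) polar image onto a Cartesian grid.

          polar_img: 2-D array, rows = depth samples, cols = beam angles.
          The geometry parameters are illustrative assumptions.
          """
          n_depth, n_angle = polar_img.shape
          half = np.deg2rad(sector_deg) / 2.0

          # Cartesian grid: x spans the sector width, z spans the depth.
          x = np.linspace(-max_depth_mm * np.sin(half), max_depth_mm * np.sin(half), out_size)
          z = np.linspace(0.0, max_depth_mm, out_size)
          xx, zz = np.meshgrid(x, z)

          # Back-project each Cartesian pixel to (radius, angle) coordinates.
          radius = np.sqrt(xx**2 + zz**2)
          angle = np.arctan2(xx, zz)          # 0 rad along the central beam axis

          # Convert physical coordinates to fractional indices of the polar image.
          r_idx = radius / max_depth_mm * (n_depth - 1)
          a_idx = (angle + half) / (2.0 * half) * (n_angle - 1)

          cart = map_coordinates(polar_img, [r_idx, a_idx], order=1, cval=0.0)
          cart[(radius > max_depth_mm) | (np.abs(angle) > half)] = 0.0   # outside the fan
          return cart

      # Example: convert a random "polar" echo map to screen coordinates.
      cart_img = scan_convert(np.random.rand(400, 128))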

  7. Blind image quality assessment based on aesthetic and statistical quality-aware features

    Science.gov (United States)

    Jenadeleh, Mohsen; Masaeli, Mohammad Masood; Moghaddam, Mohsen Ebrahimi

    2017-07-01

    The main goal of image quality assessment (IQA) methods is the emulation of human perceptual image quality judgments. Therefore, the correlation between the objective scores of these methods and human perceptual scores is considered their performance metric. Human judgment of image quality implicitly takes many factors into account, such as aesthetics, semantics, context, and various types of visual distortions. The main idea of this paper is to use a host of features that are commonly employed in image aesthetics assessment in order to improve the accuracy of blind image quality assessment (BIQA) methods. We propose an approach that enriches the features of BIQA methods by integrating a host of image aesthetics features with natural image statistics features derived from multiple domains. The proposed features have been used for augmenting five different state-of-the-art BIQA methods that use natural scene statistics features. Experiments were performed on seven benchmark image quality databases. The experimental results showed a significant improvement in the accuracy of the methods.
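
    The statistical quality-aware features mentioned above are commonly built on mean-subtracted contrast-normalized (MSCN) coefficients, as in BRISQUE-style BIQA. The sketch below computes MSCN coefficients for a grayscale image; it is a generic illustration of that family of natural-scene-statistics features, and the filter width is an assumption, not the exact feature set of this paper.

      import numpy as np
      from scipy.ndimage import gaussian_filter

      def mscn_coefficients(gray, sigma=7.0 / 6.0, eps=1e-8):
          """Mean-subtracted contrast-normalized coefficients of a grayscale image.

          gray: 2-D float array, e.g. scaled to [0, 1]. The Gaussian width follows
          values commonly used in BRISQUE-like implementations (an assumption here).
          """
          gray = gray.astype(np.float64)
          mu = gaussian_filter(gray, sigma)                                      # local mean
          sigma_map = np.sqrt(np.abs(gaussian_filter(gray**2, sigma) - mu**2))   # local std
          return (gray - mu) / (sigma_map + eps)

      # The empirical distribution of these coefficients (and of products of
      # neighboring coefficients) is what gets fitted with generalized Gaussian
      # models to form a statistical quality-aware feature vector.
      coeffs = mscn_coefficients(np.random.rand(256, 256))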

  8. HDR Image Quality Enhancement Based on Spatially Variant Retinal Response

    Directory of Open Access Journals (Sweden)

    Horiuchi Takahiko

    2010-01-01

    There is a growing demand for displaying high dynamic range (HDR) images on low dynamic range (LDR) devices. Tone mapping is a process for enhancing HDR image quality on an LDR device by converting the tonal values of the original image from HDR to LDR. This paper proposes a new tone mapping algorithm for enhancing image quality by deriving a spatially variant operator that imitates the S-potential response in the human retina, which efficiently improves local contrast while conserving a good global appearance. The proposed tone mapping operator is studied from a system construction point of view. The operator can be regarded as a natural extension of the Retinex algorithm obtained by adding a global adaptation process to the local adaptation. The feasibility of the proposed algorithm is examined in detail in experiments using standard HDR images and real HDR scene images, in comparison with conventional tone mapping algorithms.
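
    As a rough illustration of combining global and local (spatially variant) adaptation when compressing HDR luminance, the sketch below applies a Naka-Rushton-style response whose semi-saturation level blends a global log-average luminance with a Gaussian-blurred local surround. It is only a sketch of the general idea, not the authors' operator; the blend weight and blur scale are assumptions.

      import numpy as np
      from scipy.ndimage import gaussian_filter

      def tone_map(luminance, alpha=0.5, blur_sigma=15.0, eps=1e-6):
          """Compress HDR luminance with a spatially variant adaptation level.

          luminance: 2-D array of positive HDR luminance values.
          alpha blends local adaptation (blurred surround) with a global
          adaptation level; both parameter choices are illustrative assumptions.
          """
          L = np.maximum(luminance, eps)
          global_level = np.exp(np.mean(np.log(L)))       # log-average luminance
          local_level = gaussian_filter(L, blur_sigma)    # local surround estimate
          adaptation = alpha * local_level + (1.0 - alpha) * global_level
          return L / (L + adaptation)                     # compressed response in (0, 1)

      ldr = tone_map(np.exp(5.0 * np.random.rand(256, 256)))   # toy HDR input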

  9. Decision theory on the quality evaluation of medical images

    International Nuclear Information System (INIS)

    Lessa, Patricia Silva

    2001-10-01

    The problem of quality has been a constant issue in every organization. One is always seeking to produce more, to do it at a lower cost, and to do it with better quality. However, in this country there is no radiographic film quality control system for radiographic services. The little that actually gets done is essentially ad hoc and superficial. The implications of this gap, along with some other shortcomings that exist in the process as a whole (the state of the X-ray equipment, the adequacy of the technique used to obtain a radiograph, the quality of the film, the processing of the film, the brightness and homogeneity of the viewing boxes, the ability of the radiologist), have a very negative impact on the quality of the medical image and, as a result, on the quality of the medical diagnosis and therapy. It frequently happens that many radiographs have to be repeated, which leads to an increase in the patient's exposure to radiation, as well as in the cost of the procedure for the patient. Low-quality radiographs that are not repeated greatly increase the probability of a wrong diagnosis and, consequently, of inadequate therapeutic procedures, thus producing an increased incidence of bad outcomes and higher costs. The paradigm proposed in order to establish a system for the measurement of image quality is Decision Theory. The problem of image assessment is studied by proposing a Decision Theory approach. The review of the literature reveals great concern with the quality of the image, along with the absence of an adequate paradigm and several essentially empirical procedures. Image parameters are developed in order to formalize the problem in terms of Decision Theory, and various aspects of image digitization are discussed. Finally, a solution is presented, including a protocol for quality control. (author)

  10. Computerized method for radiologic system parameter simulations destined for quality assurance programs

    International Nuclear Information System (INIS)

    Marques, M.A.; Frere, A.F.; Oliveira, H.J.Q.; Marques, P.M.A.; Schiabel, H.

    1999-01-01

    The objective of this work is to develop a computational simulation method that allows for fast radiographical image quality evaluations that are devoid of the problems inherent to the traditional methods used to date. The algorithms implemented take into consideration the focal spot size and intensity distribution, the geometric conditions of exposure, the angular X-ray distribution (heel effect), the Compton effect and the anti-scatter grids (Bucky). (author)

  11. Simulation of scintillating fiber gamma ray detectors for medical imaging

    International Nuclear Information System (INIS)

    Chaney, R.C.; Fenyves, E.J.; Antich, P.P.

    1990-01-01

    This paper reports on plastic scintillating fibers, which have been shown to be effective for detecting gamma rays with high spatial and time resolution. They may be expected to significantly improve the resolution of current medical imaging systems such as PET and SPECT. Monte Carlo simulation of imaging systems using these detectors provides a means to optimize their performance in this application, as well as to demonstrate their resolution and efficiency. Monte Carlo results are presented for PET and SPECT systems constructed using these detectors.

  12. Source position error influence on industry CT image quality

    International Nuclear Information System (INIS)

    Cong Peng; Li Zhipeng; Wu Haifeng

    2004-01-01

    Based on simulation exercises, the influence of source position error on industry CT (ICT) image quality was studied, and parameters valuable for the design of ICT systems were obtained. A clear CT image of a container was also acquired from the CT testing system. (authors)

  13. Evaluation of Effective Parameters on Quality of Magnetic Resonance Imaging-computed Tomography Image Fusion in Head and Neck Tumors for Application in Treatment Planning

    Directory of Open Access Journals (Sweden)

    Atefeh Shirvani

    2017-01-01

    Background: In radiation therapy, computed tomography (CT) simulation is used for treatment planning to define the location of the tumor. Magnetic resonance imaging (MRI)-CT image fusion leads to more efficient tumor contouring. This work tried to identify the practical issues in combining CT and MRI images in real clinical cases, and the effect of various factors on image fusion quality was evaluated. Materials and Methods: In this study, the data of thirty patients with brain tumors were used for image fusion. The effect of several parameters on the feasibility and quality of image fusion was evaluated. These parameters included the angle of the patient's head on the bed, slice thickness, slice gap, and the height of the patient's head. Results: According to the results, the dominant factor affecting the quality of image fusion was the difference in slice gap between the CT and MRI images (cor = 0.86). When the difference in the height of the patient's head exceeded 4 cm, image fusion quality was <25%. Conclusion: The most important problem in image fusion is that MRI images are taken without regard to their use in treatment planning. In general, parameters related to the patient position during MRI imaging should be chosen to be consistent with the CT images of the patient in terms of location and angle.

  14. Image formation simulation for computer-aided inspection planning of machine vision systems

    Science.gov (United States)

    Irgenfried, Stephan; Bergmann, Stephan; Mohammadikaji, Mahsa; Beyerer, Jürgen; Dachsbacher, Carsten; Wörn, Heinz

    2017-06-01

    In this work, a simulation toolset for Computer Aided Inspection Planning (CAIP) of systems for automated optical inspection (AOI) is presented along with a versatile two-robot-setup for verification of simulation and system planning results. The toolset helps to narrow down the large design space of optical inspection systems in interaction with a system expert. The image formation taking place in optical inspection systems is simulated using GPU-based real time graphics and high quality off-line-rendering. The simulation pipeline allows a stepwise optimization of the system, from fast evaluation of surface patch visibility based on real time graphics up to evaluation of image processing results based on off-line global illumination calculation. A focus of this work is on the dependency of simulation quality on measuring, modeling and parameterizing the optical surface properties of the object to be inspected. The applicability to real world problems is demonstrated by taking the example of planning a 3D laser scanner application. Qualitative and quantitative comparison results of synthetic and real images are presented.

  15. Quality control in diagnostic mammography: myths, realities and their importance in the final image quality

    International Nuclear Information System (INIS)

    Mora Rodriguez, Patricia

    2011-01-01

    Mammography is the most widely used tool for early detection of breast cancer and for reducing mortality from this cause. It is important that studies involving ionizing radiation be justified and provide an image of sufficient quality for diagnosis, in order to maximize benefits and minimize risks. The problem is the difficulty of obtaining a good image of the breast. Therefore, the commitment of quality mammography is to maximize contrast, definition, resolution and reliability while minimizing noise and dose. A mammogram performed without quality does not detect early breast cancer, and the study is pointless. Quality mammography requires trained and experienced staff, modern equipment in good condition, correct positioning, correct technical factors and appropriate viewing conditions. In addition, quality programs with control of testing techniques and image quality are required to ensure quality. (author) [es

  16. Detection and optimization of image quality and dose in digital mammography systems

    International Nuclear Information System (INIS)

    Semturs, F.

    2015-01-01

    Background and purpose: During the last few years, mammography institutes have replaced their conventional film-screen mammography systems (FSM) with digital mammography systems (FFDM). This transition was made mainly towards digital computed radiography systems (FFDM-CR), with which the existing mammography device could be kept in operation. Consequently, the AEC parameters were often left unchanged, so the same dose as for FSM was used. Following the main theme of the thesis, "Optimization of image quality and dose", measurements with such CR systems were also performed with respect to image quality and dose behavior. Optimization in this context means - following the ALARA principle - the reduction of dose while ensuring the required clinical image quality. In other words, image quality is valued more highly than dose. With this in mind, the measurements performed for this thesis showed that FFDM-CR systems need considerably more dose to achieve image quality comparable with FSM. On the other hand, the measurements also showed that the newest FFDM-CR technology (needle structure) supports dose reduction (optimization) to a certain degree without compromising image quality. A dose increase, as recommended in this thesis, could also increase the risk of radiation-induced carcinoma. However, several studies (also discussed in this thesis) show that the benefit of not missing cancers dramatically outweighs the health concerns associated with the higher dose. Such an optimization of image quality and dose is described in more detail by comparing the new needle-based CR technology with the older powder-based CR technology. Material and Methods: The image quality and dose behavior of a CR needle crystal detector system for multiple breast thicknesses (simulated with PMMA slabs) is optimized, also considering different beam qualities. Technical image quality is determined with a low contrast phantom (CDMAM phantom) and from

  17. The Image Quality Translator – A Way to Support Specification of Imaging Requirements

    DEFF Research Database (Denmark)

    Kejser, Ulla Bøgvad; Bech, Mogens

    2015-01-01

    Archives, libraries, and museums run numerous imaging projects to digitize physical works and collections of cultural heritage. This study presents a tool called the 'Image Quality Translator' that is being designed at the Royal Library to support the planning of digitization projects and to make the process of specifying and controlling imaging requirements more efficient. The tool seeks to translate between the language used by collection managers and curators to express needs for image quality, and the more technical terms and metrics used by imaging experts and photographers to express...

  18. Image quality analysis of vibration effects in C-arm flat panel X-ray imaging

    NARCIS (Netherlands)

    Snoeren, R.M.; Kroon, J.N.; With, de P.H.N.

    2011-01-01

    The motion of C-arm scanning X-ray systems may result in vibrations of the imaging sub-system. In this paper, we connect C-arm system vibrations to Image Quality (IQ) deterioration for 2D angiography and 3D cone beam X-ray imaging, using large Flat Panel detectors. Vibrations will affect the

  19. Naturalness and image quality : chroma and hue variation in color images of natural scenes

    NARCIS (Netherlands)

    Ridder, de H.; Blommaert, F.J.J.; Fedorovskaya, E.A.; Rogowitz, B.E.; Allebach, J.P.

    1995-01-01

    The relation between perceptual image quality and naturalness was investigated by varying the colorfulness and hue of color images of natural scenes. These variations were created by digitizing the images, subsequently determining their color point distributions in the CIELUV color space and finally

  20. Naturalness and image quality: Chroma and hue variation in color images of natural scenes

    NARCIS (Netherlands)

    Ridder, de H.; Blommaert, F.J.J.; Fedorovskaya, E.A.; Eschbach, R.; Braun, K.

    1997-01-01

    The relation between perceptual image quality and naturalness was investigated by varying the colorfulness and hue of color images of natural scenes. These variations were created by digitizing the images, subsequently determining their color point distributions in the CIELUV color space and

  1. Improving high resolution retinal image quality using speckle illumination HiLo imaging.

    Science.gov (United States)

    Zhou, Xiaolin; Bedggood, Phillip; Metha, Andrew

    2014-08-01

    Retinal image quality from flood illumination adaptive optics (AO) ophthalmoscopes is adversely affected by out-of-focus light scatter due to the lack of confocality. This effect is more pronounced in small eyes, such as those of rodents, because the requisite high optical power confers a large dioptric thickness to the retina. A recently developed structured illumination microscopy (SIM) technique called HiLo imaging has been shown to reduce the effect of out-of-focus light scatter in flood illumination microscopes and to produce pseudo-confocal images with significantly improved image quality. In this work, we applied the HiLo technique to a flood AO ophthalmoscope and performed AO imaging in both (physical) model eyes and live rat eyes. The improvement in image quality from HiLo imaging is shown both qualitatively and quantitatively by using spatial spectral analysis.
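
    HiLo imaging fuses two frames of the same field: one under uniform illumination and one under structured (here, speckle) illumination. In outline, low spatial frequencies are taken from the uniform image weighted by the local speckle contrast (a proxy for in-focus content), and high spatial frequencies come directly from the uniform image. The sketch below is a toy version of that fusion; the window size, cutoff and scaling factor are assumptions, not values from the study.

      import numpy as np
      from scipy.ndimage import gaussian_filter, uniform_filter

      def hilo_combine(uniform_img, speckle_img, sigma=4.0, eta=1.0, eps=1e-8):
          """Toy HiLo fusion of a uniform-illumination and a speckle-illumination frame."""
          U = uniform_img.astype(np.float64)
          S = speckle_img.astype(np.float64)

          # Local speckle contrast: std / mean of the ratio image in a small window.
          ratio = S / (U + eps)
          mean_r = uniform_filter(ratio, size=7)
          mean_r2 = uniform_filter(ratio**2, size=7)
          contrast = np.sqrt(np.maximum(mean_r2 - mean_r**2, 0.0)) / (np.abs(mean_r) + eps)

          lo = gaussian_filter(contrast * U, sigma)   # sectioned low-frequency part
          hi = U - gaussian_filter(U, sigma)          # complementary high-frequency part
          return eta * lo + hi

      fused = hilo_combine(np.random.rand(256, 256), np.random.rand(256, 256))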

  2. Assessment of COTS IR image simulation tools for ATR development

    Science.gov (United States)

    Seidel, Heiko; Stahl, Christoph; Bjerkeli, Frode; Skaaren-Fystro, Paal

    2005-05-01

    Following the tendency of increased use of imaging sensors in military aircraft, future fighter pilots will need onboard artificial intelligence e.g. ATR for aiding them in image interpretation and target designation. The European Aeronautic Defence and Space Company (EADS) in Germany has developed an advanced method for automatic target recognition (ATR) which is based on adaptive neural networks. This ATR method can assist the crew of military aircraft like the Eurofighter in sensor image monitoring and thereby reduce the workload in the cockpit and increase the mission efficiency. The EADS ATR approach can be adapted for imagery of visual, infrared and SAR sensors because of the training-based classifiers of the ATR method. For the optimal adaptation of these classifiers they have to be trained with appropriate and sufficient image data. The training images must show the target objects from different aspect angles, ranges, environmental conditions, etc. Incomplete training sets lead to a degradation of classifier performance. Additionally, ground truth information i.e. scenario conditions like class type and position of targets is necessary for the optimal adaptation of the ATR method. In Summer 2003, EADS started a cooperation with Kongsberg Defence & Aerospace (KDA) from Norway. The EADS/KDA approach is to provide additional image data sets for training-based ATR through IR image simulation. The joint study aims to investigate the benefits of enhancing incomplete training sets for classifier adaptation by simulated synthetic imagery. EADS/KDA identified the requirements of a commercial-off-the-shelf IR simulation tool capable of delivering appropriate synthetic imagery for ATR development. A market study of available IR simulation tools and suppliers was performed. After that the most promising tool was benchmarked according to several criteria e.g. thermal emission model, sensor model, targets model, non-radiometric image features etc., resulting in a

  3. Application of Simulated Three Dimensional CT Image in Orthognathic Surgery

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Hyun Don; Park, Chang Seo [Dept. of Dental Radiology, College of Dentistry, Yensei University, Seoul (Korea, Republic of); Yoo, Sun Kook; Lee, Kyoung Sang [Dept. of Medical Engineering, College of Medicine, Yensei University, Seoul (Korea, Republic of)

    1998-08-15

    In orthodontics and orthognathic surgery, cephalograms have been routine practice in the diagnosis and treatment evaluation of craniofacial deformity. However, their inherent distortion of actual lengths and angles when projecting a three-dimensional object onto a two-dimensional plane may cause errors in quantitative analysis of shape and size. It is therefore desirable that a three-dimensional object be diagnosed and evaluated three-dimensionally, and three-dimensional CT images are best suited for three-dimensional analysis. Clinical development requires evaluation of treatment results and comparison before and after surgery, so it is desirable that a patient diagnosed and planned with three-dimensional computed tomography before surgery also be evaluated with three-dimensional computed tomography after surgery. However, because there are currently no standardized three-dimensional normal values, and because three-dimensional computed tomography requires expensive equipment, is costly and involves a considerable radiation exposure, limitations remain in its application to routine practice. If a postoperative three-dimensional image could be constructed from pre- and postoperative lateral and postero-anterior cephalograms and a preoperative three-dimensional computed tomogram, pre- and postoperative images could be compared and evaluated three-dimensionally without postoperative three-dimensional computed tomography, and this would also contribute to standardizing three-dimensional normal values. This study introduced a new method in which a computer-simulated three-dimensional image was constructed from a preoperative three-dimensional computed tomogram and pre- and postoperative lateral and postero-anterior cephalograms. To validate the new method, computer-simulated three-dimensional images and actual postoperative three-dimensional images were compared in four dry skulls in which the position of the mandible had been displaced and in four orthognathic surgery patients. The results were as follows. 1. In four cases of

  4. Application of Simulated Three Dimensional CT Image in Orthognathic Surgery

    International Nuclear Information System (INIS)

    Kim, Hyun Don; Park, Chang Seo; Yoo, Sun Kook; Lee, Kyoung Sang

    1998-01-01

    In orthodontics and orthognathic surgery, cephalograms have been routine practice in the diagnosis and treatment evaluation of craniofacial deformity. However, their inherent distortion of actual lengths and angles when projecting a three-dimensional object onto a two-dimensional plane may cause errors in quantitative analysis of shape and size. It is therefore desirable that a three-dimensional object be diagnosed and evaluated three-dimensionally, and three-dimensional CT images are best suited for three-dimensional analysis. Clinical development requires evaluation of treatment results and comparison before and after surgery, so it is desirable that a patient diagnosed and planned with three-dimensional computed tomography before surgery also be evaluated with three-dimensional computed tomography after surgery. However, because there are currently no standardized three-dimensional normal values, and because three-dimensional computed tomography requires expensive equipment, is costly and involves a considerable radiation exposure, limitations remain in its application to routine practice. If a postoperative three-dimensional image could be constructed from pre- and postoperative lateral and postero-anterior cephalograms and a preoperative three-dimensional computed tomogram, pre- and postoperative images could be compared and evaluated three-dimensionally without postoperative three-dimensional computed tomography, and this would also contribute to standardizing three-dimensional normal values. This study introduced a new method in which a computer-simulated three-dimensional image was constructed from a preoperative three-dimensional computed tomogram and pre- and postoperative lateral and postero-anterior cephalograms. To validate the new method, computer-simulated three-dimensional images and actual postoperative three-dimensional images were compared in four dry skulls in which the position of the mandible had been displaced and in four orthognathic surgery patients. The results were as follows. 1. In four cases of

  5. Characterization of a computed tomography iterative reconstruction algorithm by image quality evaluations with an anthropomorphic phantom

    International Nuclear Information System (INIS)

    Rampado, O.; Bossi, L.; Garabello, D.; Davini, O.; Ropolo, R.

    2012-01-01

    Objective: This study aims to investigate the consequences on dose and image quality of the choices of different combinations of NI and adaptive statistical iterative reconstruction (ASIR) percentage, the image quality parameters of GE CT equipment. Methods: An anthropomorphic phantom was used to simulate the chest and upper abdomen of a standard weight patient. Images were acquired with tube current modulation and different values of noise index, in the range 10–22 for a slice thickness of 5 mm and a tube voltage of 120 kV. For each selected noise index, several image series were reconstructed using different percentages of ASIR (0, 40, 50, 60, 70, 100). Quantitative noise was assessed at different phantom locations. Computed tomography dose index (CTDI) and dose length products (DLP) were recorded. Three radiologists reviewed the images in a blinded and randomized manner and assessed the subjective image quality by comparing the image series with the one acquired with the reference protocol (noise index 14, ASIR 40%). The perceived noise, contrast, edge sharpness and overall quality were graded on a scale from −2 (much worse) to +2 (much better). Results: A repeatable trend of noise reduction versus the percentage of ASIR was observed for different noise levels and phantom locations. The different combinations of noise index and percentage of ASIR to obtain a desired dose reduction were assessed. The subjective image quality evaluation evidenced a possible dose reduction between 24 and 40% as a consequence of an increment of ASIR percentage to 50 or 70%, respectively. Conclusion: These results highlighted that the same patient dose reduction can be obtained with several combinations of noise index and percentages of ASIR, providing a model with which to choose these acquisition parameters in future optimization studies, with the aim of reducing patient dose by maintaining image quality in diagnostic levels.

  6. Characterization of a computed tomography iterative reconstruction algorithm by image quality evaluations with an anthropomorphic phantom

    Energy Technology Data Exchange (ETDEWEB)

    Rampado, O., E-mail: orampado@molinette.piemonte.it [S.C. Fisica Sanitaria, San Giovanni Battista Hospital of Turin, Corso Bramante 88, Torino 10126 (Italy); Bossi, L., E-mail: laura-bossi@hotmail.it [S.C. Fisica Sanitaria, San Giovanni Battista Hospital of Turin, Corso Bramante 88, Torino 10126 (Italy); Garabello, D., E-mail: dgarabello@molinette.piemonte.it [S.C. Radiodiagnostica DEA, San Giovanni Battista Hospital of Turin, Corso Bramante 88, Torino 10126 (Italy); Davini, O., E-mail: odavini@molinette.piemonte.it [S.C. Radiodiagnostica DEA, San Giovanni Battista Hospital of Turin, Corso Bramante 88, Torino 10126 (Italy); Ropolo, R., E-mail: rropolo@molinette.piemonte.it [S.C. Fisica Sanitaria, San Giovanni Battista Hospital of Turin, Corso Bramante 88, Torino 10126 (Italy)

    2012-11-15

    Objective: This study aims to investigate the consequences on dose and image quality of the choices of different combinations of NI and adaptive statistical iterative reconstruction (ASIR) percentage, the image quality parameters of GE CT equipment. Methods: An anthropomorphic phantom was used to simulate the chest and upper abdomen of a standard weight patient. Images were acquired with tube current modulation and different values of noise index, in the range 10-22 for a slice thickness of 5 mm and a tube voltage of 120 kV. For each selected noise index, several image series were reconstructed using different percentages of ASIR (0, 40, 50, 60, 70, 100). Quantitative noise was assessed at different phantom locations. Computed tomography dose index (CTDI) and dose length products (DLP) were recorded. Three radiologists reviewed the images in a blinded and randomized manner and assessed the subjective image quality by comparing the image series with the one acquired with the reference protocol (noise index 14, ASIR 40%). The perceived noise, contrast, edge sharpness and overall quality were graded on a scale from -2 (much worse) to +2 (much better). Results: A repeatable trend of noise reduction versus the percentage of ASIR was observed for different noise levels and phantom locations. The different combinations of noise index and percentage of ASIR to obtain a desired dose reduction were assessed. The subjective image quality evaluation evidenced a possible dose reduction between 24 and 40% as a consequence of an increment of ASIR percentage to 50 or 70%, respectively. Conclusion: These results highlighted that the same patient dose reduction can be obtained with several combinations of noise index and percentages of ASIR, providing a model with which to choose these acquisition parameters in future optimization studies, with the aim of reducing patient dose by maintaining image quality in diagnostic levels.
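
    The quantitative noise assessment described in the two records above amounts to measuring the standard deviation of CT numbers in fixed regions of interest for each reconstruction. The sketch below shows that measurement; the ASIR percentages are those listed in the record, but the image data and ROI position are hypothetical placeholders.

      import numpy as np

      def roi_noise(image_hu, center_rc, half_size=10):
          """Standard deviation of CT numbers (HU) in a square ROI."""
          r, c = center_rc
          roi = image_hu[r - half_size:r + half_size, c - half_size:c + half_size]
          return float(np.std(roi))

      # Hypothetical example: one axial slice reconstructed at several ASIR levels,
      # here synthesized so that noise decreases as the ASIR percentage increases.
      asir_levels = [0, 40, 50, 60, 70, 100]
      recons = {p: np.random.normal(50.0, 20.0 * (1.0 - 0.006 * p), (512, 512))
                for p in asir_levels}

      for p in asir_levels:
          print(f"ASIR {p:3d}%  ROI noise = {roi_noise(recons[p], (256, 256)):.1f} HU")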

  7. Deep learning for objective quality assessment of 3D images

    NARCIS (Netherlands)

    Mocanu, D.C.; Exarchakos, G.; Liotta, A.

    2014-01-01

    Improving the users' Quality of Experience (QoE) in modern 3D Multimedia Systems is a challenging proposition, mainly due to our limited knowledge of 3D image Quality Assessment algorithms. While subjective QoE methods would better reflect the nature of human perception, these are not suitable in

  8. Radiation doses and some aspects of image quality in mammography facilities in New Zealand

    International Nuclear Information System (INIS)

    Williamson, B.D.P.; Poletti, J.L.

    1990-02-01

    Until recently, mammography in New Zealand was performed largely with adapted conventional x-ray machines with tungsten anode x-ray tubes. Over the last several years these have virtually all been replaced by dedicated mammography machines with molybdenum anode x-ray tubes. To assess current trends in radiation doses to patients and central aspects of image quality, some 37 mammography x-ray machines were surveyed during 1988-89. The mean glandular dose per film for 30 and 45 mm thick breast-equivalent phantoms was determined using thermoluminescent dosimetry. Images of simulated microcalcifications (specks) and of a contrast-detail phantom were assessed. Accuracy of calibration of the x-ray machines and quality of film processing were also tested. Details of the survey results are given. Mean glandular tissue doses per cranio-caudal film were generally well within the recommended guidelines. Mammography facilities differed in their ability to detect simulated calcification specks. Mammographic equipment was found to be generally well adjusted. Speed and contrast of film processing were found to vary widely, implying that this is a major cause of the variations in dose and image quality. An annex outlines a quality assurance programme for maintenance of optimal physical image quality and control of radiation doses. 55 refs., 21 tabs., 17 figs., 2 ills

  9. Correlation of contrast-detail analysis and clinical image quality assessment in chest radiography with a human cadaver study.

    Science.gov (United States)

    De Crop, An; Bacher, Klaus; Van Hoof, Tom; Smeets, Peter V; Smet, Barbara S; Vergauwen, Merel; Kiendys, Urszula; Duyck, Philippe; Verstraete, Koenraad; D'Herde, Katharina; Thierens, Hubert

    2012-01-01

    To determine the correlation between the clinical and physical image quality of chest images by using cadavers embalmed with the Thiel technique and a contrast-detail phantom. The use of human cadavers fulfilled the requirements of the institutional ethics committee. Clinical image quality was assessed by using three human cadavers embalmed with the Thiel technique, which results in excellent preservation of the flexibility and plasticity of organs and tissues. As a result, lungs can be inflated during image acquisition to simulate the pulmonary anatomy seen on a chest radiograph. Both contrast-detail phantom images and chest images of the Thiel-embalmed bodies were acquired with an amorphous silicon flat-panel detector. Tube voltage (70, 81, 90, 100, 113, 125 kVp), copper filtration (0.1, 0.2, 0.3 mm Cu), and exposure settings (200, 280, 400, 560, 800 speed class) were altered to simulate different quality levels. Four experienced radiologists assessed the image quality by using a visual grading analysis (VGA) technique based on European Quality Criteria for Chest Radiology. The phantom images were scored manually and automatically with use of dedicated software, both resulting in an inverse image quality figure (IQF). Spearman rank correlations between inverse IQFs and VGA scores were calculated. A statistically significant correlation (r = 0.80) was found between the physical and clinical image quality measures in chest radiography. © RSNA, 2011.
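
    The correlation analysis described above can be reproduced in outline with scipy.stats.spearmanr, which returns the rank correlation between the physical measure (inverse IQF) and the observer score (VGA). The arrays below are placeholders, not the study data.

      import numpy as np
      from scipy.stats import spearmanr

      # Placeholder data: one inverse IQF and one mean VGA score per exposure setting.
      inverse_iqf = np.array([1.8, 2.1, 2.4, 2.9, 3.3, 3.7])
      vga_score = np.array([-0.6, -0.3, 0.0, 0.4, 0.7, 0.9])

      rho, p_value = spearmanr(inverse_iqf, vga_score)
      print(f"Spearman rho = {rho:.2f}, p = {p_value:.3f}")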

  10. Effect of the glandular composition on digital breast tomosynthesis image quality and dose optimisation

    International Nuclear Information System (INIS)

    Marques, T.; Di Maria, S.; Vaz, P.; Ribeiro, A.; Belchior, A.; Cardoso, J.; Matela, N.; Oliveira, N.; Almeida, P.; Janeiro, L.

    2015-01-01

    In image quality assessment for digital breast tomosynthesis (DBT), a breast phantom with an average glandular tissue percentage of 50 % is often used, which may not be representative of the breast tissue composition of the women undergoing such an examination. This work aims at studying the effect of the glandular composition of the breast on image quality, taking into consideration different lesion sizes. Monte Carlo simulations were performed using the state-of-the-art computer program PENELOPE to validate the image acquisition system of the DBT equipment as well as to calculate the mean glandular dose for each projection image and for different breast compositions. The integrated PENELOPE imaging tool (PenEasy) was used to calculate, in mammography, for each clinical detection task, the X-ray energy that maximises the figure of merit. All the 2D cranio-caudal projections for DBT were simulated and then reconstructed with the Simultaneous Algebraic Reconstruction Technique. Finally, through signal-to-noise ratio analysis, the image quality in DBT was assessed. (authors)
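
    The record does not spell out the figure of merit it optimises; a dose-efficiency figure commonly used in mammography and DBT optimisation studies is FOM = SDNR^2 / MGD (squared signal-difference-to-noise ratio per unit mean glandular dose). The sketch below assumes that definition and uses made-up numbers.

      def figure_of_merit(signal, background, noise, mean_glandular_dose_mgy):
          """Dose-efficiency figure of merit: SDNR squared per unit mean glandular dose.
          This is a commonly used definition, assumed here; it is not necessarily the
          exact figure of merit used in the cited work."""
          sdnr = (signal - background) / noise
          return sdnr**2 / mean_glandular_dose_mgy

      # Illustrative numbers only.
      print(figure_of_merit(signal=1250.0, background=1100.0, noise=40.0,
                            mean_glandular_dose_mgy=1.2))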

  11. AUTOMATIC INTERPRETATION OF HIGH RESOLUTION SAR IMAGES: FIRST RESULTS OF SAR IMAGE SIMULATION FOR SINGLE BUILDINGS

    Directory of Open Access Journals (Sweden)

    J. Tao

    2012-09-01

    Due to its all-weather data acquisition capability, high resolution spaceborne Synthetic Aperture Radar (SAR) plays an important role in remote sensing applications like change detection. However, because of the complex geometric mapping of buildings in urban areas, SAR images are often hard to interpret. SAR simulation techniques ease the visual interpretation of SAR images, while fully automatic interpretation is still a challenge. This paper presents a method for supporting the interpretation of high resolution SAR images with simulated radar images using a LiDAR digital surface model (DSM). Line features are extracted from the simulated and real SAR images and used for matching. A single building model is generated from the DSM and used for building recognition in the SAR image. An application of the concept is presented for the city centre of Munich, where the comparison of the simulation to the TerraSAR-X data shows good similarity. Based on the results of simulation and matching, special features (e.g. double-bounce lines, shadow areas, etc.) can be automatically indicated in the SAR image.

  12. Restoration of polarimetric SAR images using simulated annealing

    DEFF Research Database (Denmark)

    Schou, Jesper; Skriver, Henning

    2001-01-01

    Filtering synthetic aperture radar (SAR) images ideally results in better estimates of the parameters characterizing the distributed targets in the images while preserving the structures of the nondistributed targets. However, these objectives are normally conflicting, often leading to a filtering approach favoring one of the objectives. An algorithm for estimating the radar cross-section (RCS) for intensity SAR images has previously been proposed in the literature based on Markov random fields and the stochastic optimization method simulated annealing. A new version of the algorithm is presented...
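
    A minimal sketch of the kind of Markov-random-field / simulated-annealing estimation mentioned above is given below: a pixel-wise Metropolis update that trades off fidelity to the observed intensity against local smoothness of the estimated cross-section. The quadratic energy terms, their weight and the cooling schedule are assumptions for illustration; the published algorithm uses a speckle-specific likelihood and structure-preserving terms not reproduced here.

      import numpy as np

      rng = np.random.default_rng(0)

      def local_energy(est, obs, r, c, value, beta=0.3):
          """Data term (squared error to the observation) plus a smoothness prior
          over the 4-neighbourhood; both terms are illustrative assumptions."""
          data = (value - obs[r, c])**2
          nbrs = (est[r - 1, c], est[r + 1, c], est[r, c - 1], est[r, c + 1])
          smooth = sum((value - n)**2 for n in nbrs)
          return data + beta * smooth

      def anneal(obs, n_sweeps=50, t0=1.0, cooling=0.9):
          est = obs.copy()
          temperature = t0
          rows, cols = obs.shape
          for _ in range(n_sweeps):
              for r in range(1, rows - 1):
                  for c in range(1, cols - 1):
                      proposal = est[r, c] + rng.normal(0.0, 0.1)
                      d_e = (local_energy(est, obs, r, c, proposal)
                             - local_energy(est, obs, r, c, est[r, c]))
                      # Metropolis rule: always accept downhill moves, sometimes
                      # accept uphill moves while the temperature is still high.
                      if d_e < 0 or rng.random() < np.exp(-d_e / temperature):
                          est[r, c] = proposal
              temperature *= cooling
          return est

      noisy = 1.0 + 0.3 * rng.standard_normal((64, 64))   # toy noisy intensity image
      restored = anneal(noisy)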

  13. Microcomputer simulation of nuclear magnetic resonance imaging contrasts

    International Nuclear Information System (INIS)

    Le Bihan, D.

    1985-01-01

    The high information content of magnetic resonance images is due to the multiplicity of their parameters. However, this advantage introduces a difficulty in the interpretation of contrast: an image is strongly modified according to the visualised parameters. The author proposes a microcomputer simulation program. After recalling the main intrinsic and extrinsic parameters, he shows how the program works and demonstrates its interest as a pedagogic tool and as an aid for optimising image contrast as a function of the suspected pathology [fr

  14. Pre-analytic process control: projecting a quality image.

    Science.gov (United States)

    Serafin, Mark D

    2006-09-26

    Within the health-care system, the term "ancillary department" often describes the laboratory. Thus, laboratories may find it difficult to define their image and, with it, customer perception of department quality. Regulatory requirements give laboratories that so desire an elegant way to address image and perception issues--a comprehensive pre-analytic system solution. Since large laboratories already use such systems--laboratory service manuals--I describe and illustrate the process for the benefit of smaller facilities. Resources exist to help even small laboratories produce a professional service manual--an elegant solution to image and customer perception of quality.

  15. Quality and Reliability of Large-Eddy Simulations II

    CERN Document Server

    Salvetti, Maria Vittoria; Meyers, Johan; Sagaut, Pierre

    2011-01-01

    The second Workshop on "Quality and Reliability of Large-Eddy Simulations", QLES2009, was held at the University of Pisa from September 9 to September 11, 2009. Its predecessor, QLES2007, was organized in 2007 in Leuven (Belgium). The focus of QLES2009 was on issues related to predicting, assessing and assuring the quality of LES. The main goal of QLES2009 was to enhance the knowledge on error sources and on their interaction in LES and to devise criteria for the prediction and optimization of simulation quality, by bringing together mathematicians, physicists and engineers and providing a platform specifically addressing these aspects for LES. Contributions were made by leading experts in the field. The present book contains the written contributions to QLES2009 and is divided into three parts, which reflect the main topics addressed at the workshop: (i) SGS modeling and discretization errors; (ii) Assessment and reduction of computational errors; (iii) Mathematical analysis and foundation for SGS modeling.

  16. Numerical simulation of water quality in Yangtze Estuary

    Directory of Open Access Journals (Sweden)

    Xi Li

    2009-12-01

    In order to monitor water quality in the Yangtze Estuary, water samples were collected and field observation of current and velocity stratification was carried out using a shipboard acoustic Doppler current profiler (ADCP). Results for two representative variables, the temporal and spatial variation of a new point source sewage discharge as manifested by chemical oxygen demand (COD) and the initial water quality distribution as manifested by dissolved oxygen (DO), were obtained by application of the Environmental Fluid Dynamics Code (EFDC) with solutions for hydrodynamics during tides. The numerical results were compared with field data, and the field data provided verification of the numerical application: this numerical model is an effective tool for water quality simulation. For the point source discharge, COD concentration was simulated with an initial value in the river of zero. The simulated increments and distribution of COD in the water show acceptable agreement with field data. The concentration of DO is much higher in the North Branch than in the South Branch due to consumption of oxygen in the South Branch resulting from discharge of sewage from Shanghai. The DO concentration is greater in the surface layer than in the bottom layer. The DO concentration is low in areas with a depth of less than 20 m, and high in areas between the 20-m and 30-m isobaths. It is concluded that the numerical model is valuable in simulation of water quality in the case of specific point source pollutant discharge. The EFDC model is also of satisfactory accuracy in water quality simulation of the Yangtze Estuary.

  17. Imaging Simulations for the Korean VLBI Network (KVN)

    Directory of Open Access Journals (Sweden)

    Tae-Hyun Jung

    2005-03-01

    The Korean VLBI Network (KVN) will open a new field of research in astronomy, geodesy and earth science using the newest three 21 m radio telescopes, expanding our ability to look at the Universe in the millimeter regime. The imaging capability of radio interferometry is highly dependent upon the antenna configuration, the source size, the declination and the shape of the target. In this paper, imaging simulations are carried out with the KVN system configuration. Five test images were used: a point source, multiple point sources, a uniform sphere at two different sizes relative to the synthesized beam of the KVN, and a Very Large Array (VLA) image of Cygnus A. The declination for the full-time simulation was set to +60 degrees and the observation time range was -6 to +6 hours around transit. Simulations were done at 22 GHz, one of the KVN observation frequencies. All these simulations and data reductions were run with the Astronomical Image Processing System (AIPS) software package. As the KVN array has a resolution of about 6 mas (milliarcseconds) at 22 GHz, when the model source is approximately the beam size or smaller, the ratio of peak intensity to RMS is about 10000:1 and 5000:1. When the model source is larger than the beam size, this ratio falls to a very low range of about 115:1 and 34:1. This is due to the lack of short baselines and the small number of antennas. We compared the coordinates of the model images with those of the cleaned images. The result shows nearly perfect correspondence except in the case of the 12 mas uniform sphere. Therefore, the main astronomical targets for the KVN will be compact sources, and the KVN will have excellent astrometric performance for these sources.

  18. Computer simulation of radiographic image sharpness in several image recording systems

    International Nuclear Information System (INIS)

    Silva, Marcia Aparecida; Schiable, Homero; Frere, Annie France; Marques, Paulo M.A.; Oliveira, Henrique J.Q. de; Alves, Fatima F.R.; Medeiros, Regina B.

    1996-01-01

    A method to predict the influence of the recording system on radiographic image sharpness by computer simulation is studied. The method is intended to show in advance the image that would be obtained with each type of film or screen-film combination used during the exposure.

  19. Improving best-phase image quality in cardiac CT by motion correction with MAM optimization

    Energy Technology Data Exchange (ETDEWEB)

    Rohkohl, Christopher; Bruder, Herbert; Stierstorfer, Karl [Siemens AG, Healthcare Sector, Siemensstrasse 1, 91301 Forchheim (Germany); Flohr, Thomas [Siemens AG, Healthcare Sector, Siemensstrasse 1, 91301 Forchheim (Germany); Institute of Diagnostic Radiology, Eberhard Karls University, Hoppe-Seyler-Str. 3, 72076 Tuebingen (Germany)

    2013-03-15

    Purpose: Research in image reconstruction for cardiac CT aims at using motion correction algorithms to improve the image quality of the coronary arteries. The key to those algorithms is motion estimation, which is currently based on 3-D/3-D registration to align the structures of interest in images acquired in multiple heart phases. The need for an extended scan data range covering several heart phases is critical in terms of radiation dose to the patient and limits the clinical potential of the method. Furthermore, literature reports only slight quality improvements of the motion corrected images when compared to the most quiet phase (best-phase) that was actually used for motion estimation. In this paper a motion estimation algorithm is proposed which does not require an extended scan range but works with a short scan data interval, and which markedly improves the best-phase image quality. Methods: Motion estimation is based on the definition of motion artifact metrics (MAM) to quantify motion artifacts in a 3-D reconstructed image volume. The authors use two different MAMs, entropy, and positivity. By adjusting the motion field parameters, the MAM of the resulting motion-compensated reconstruction is optimized using a gradient descent procedure. In this way motion artifacts are minimized. For a fast and practical implementation, only analytical methods are used for motion estimation and compensation. Both the MAM-optimization and a 3-D/3-D registration-based motion estimation algorithm were investigated by means of a computer-simulated vessel with a cardiac motion profile. Image quality was evaluated using normalized cross-correlation (NCC) with the ground truth template and root-mean-square deviation (RMSD). Four coronary CT angiography patient cases were reconstructed to evaluate the clinical performance of the proposed method. Results: For the MAM-approach, the best-phase image quality could be improved for all investigated heart phases, with a maximum

  20. Improving best-phase image quality in cardiac CT by motion correction with MAM optimization

    International Nuclear Information System (INIS)

    Rohkohl, Christopher; Bruder, Herbert; Stierstorfer, Karl; Flohr, Thomas

    2013-01-01

    Purpose: Research in image reconstruction for cardiac CT aims at using motion correction algorithms to improve the image quality of the coronary arteries. The key to those algorithms is motion estimation, which is currently based on 3-D/3-D registration to align the structures of interest in images acquired in multiple heart phases. The need for an extended scan data range covering several heart phases is critical in terms of radiation dose to the patient and limits the clinical potential of the method. Furthermore, literature reports only slight quality improvements of the motion corrected images when compared to the most quiet phase (best-phase) that was actually used for motion estimation. In this paper a motion estimation algorithm is proposed which does not require an extended scan range but works with a short scan data interval, and which markedly improves the best-phase image quality. Methods: Motion estimation is based on the definition of motion artifact metrics (MAM) to quantify motion artifacts in a 3-D reconstructed image volume. The authors use two different MAMs, entropy, and positivity. By adjusting the motion field parameters, the MAM of the resulting motion-compensated reconstruction is optimized using a gradient descent procedure. In this way motion artifacts are minimized. For a fast and practical implementation, only analytical methods are used for motion estimation and compensation. Both the MAM-optimization and a 3-D/3-D registration-based motion estimation algorithm were investigated by means of a computer-simulated vessel with a cardiac motion profile. Image quality was evaluated using normalized cross-correlation (NCC) with the ground truth template and root-mean-square deviation (RMSD). Four coronary CT angiography patient cases were reconstructed to evaluate the clinical performance of the proposed method. Results: For the MAM-approach, the best-phase image quality could be improved for all investigated heart phases, with a maximum
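
    Of the two motion artifact metrics (MAM) named in the two records above, image entropy is the simpler to sketch: motion artifacts tend to spread intensities and raise the entropy of the reconstructed volume, so the motion-estimation loop adjusts the motion-field parameters to lower it. The histogram range and bin count below are assumptions, not values from the paper.

      import numpy as np

      def entropy_mam(volume_hu, bins=256, value_range=(-1000.0, 2000.0)):
          """Histogram entropy of a reconstructed volume, used as a motion artifact
          metric (lower is better). Bin count and HU range are illustrative."""
          hist, _ = np.histogram(volume_hu, bins=bins, range=value_range)
          p = hist.astype(np.float64)
          p = p[p > 0] / p.sum()
          return float(-np.sum(p * np.log(p)))

      # A motion-compensated reconstruction with fewer artifacts should give a
      # lower value than an uncompensated reconstruction of the same anatomy.
      print(entropy_mam(np.random.normal(0.0, 100.0, (64, 64, 64))))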

  1. Selecting optimal monochromatic level with spectral CT imaging for improving imaging quality in hepatic venography

    International Nuclear Information System (INIS)

    Sun Jun; Luo Xianfu; Wang Shou'an; Wang Jun; Sun Jiquan; Wang Zhijun; Wu Jingtao

    2013-01-01

    Objective: To investigate the effect of spectral CT monochromatic images on image quality in hepatic venography. Methods: Thirty patients underwent spectral CT examination on a GE Discovery CT 750 HD scanner. During the portal phase, 1.25 mm slice thickness polychromatic images and optimal monochromatic images were obtained, and volume rendering and maximum intensity projection images were created to show the hepatic veins. The overall imaging quality was evaluated on a five-point scale by two radiologists. Inter-observer agreement in subjective image quality grading was assessed by Kappa statistics. Paired-sample t tests were used to compare hepatic vein attenuation, hepatic parenchyma attenuation, CT value difference between the hepatic vein and the liver parenchyma, image noise, vein-to-liver contrast-to-noise ratio (CNR), and the image quality score of hepatic venography between the two image data sets. Results: The monochromatic images at 50 keV were found to demonstrate the best CNR for the hepatic vein. The hepatic vein attenuation [(329 ± 47) HU], hepatic parenchyma attenuation [(178 ± 33) HU], CT value difference between the hepatic vein and the liver parenchyma [(151 ± 33) HU], image noise (17.33 ± 4.18), CNR (9.13 ± 2.65), and the image quality score (4.2 ± 0.6) of the optimal monochromatic images were significantly higher than those of the polychromatic images [(149 ± 18) HU], [(107 ± 14) HU], [(43 ± 11) HU], 12.55 ± 3.02, 3.53 ± 1.03, 3.1 ± 0.8 (t values were 24.79, 13.95, 18.85, 9.07, 13.25 and 12.04, respectively, P < 0.01). In the comparison of image quality, the Kappa value was 0.81 with optimal monochromatic images and 0.69 with polychromatic images. Conclusion: Monochromatic images of spectral CT could improve the CNR for displaying the hepatic vein and improve the image quality compared to the conventional polychromatic images. (authors)
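
    The vein-to-liver contrast-to-noise ratio used above is the attenuation difference between the hepatic vein and the liver parenchyma divided by the image noise. The sketch below just evaluates that ratio, using the group means reported in the record purely as example inputs (so the results differ slightly from the reported per-patient mean CNRs of 9.13 and 3.53).

      def vein_to_liver_cnr(vein_hu, liver_hu, noise_hu):
          """CNR = (CT value of the vein - CT value of the liver) / image noise."""
          return (vein_hu - liver_hu) / noise_hu

      # Group means from the record, used only as example inputs.
      print(vein_to_liver_cnr(329.0, 178.0, 17.33))   # optimal monochromatic, about 8.7
      print(vein_to_liver_cnr(149.0, 107.0, 12.55))   # polychromatic, about 3.3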

  2. Toward optimal color image quality of television display

    Science.gov (United States)

    MacDonald, Lindsay W.; Endrikhovski, Sergej N.; Bech, Soren; Jensen, Kaj

    1999-12-01

    A general framework and first experimental results are presented for the `OPTimal IMage Appearance' (OPTIMA) project, which aims to develop a computational model for achieving optimal color appearance of natural images on adaptive CRT television displays. To achieve this goal we considered the perceptual constraints determining quality of displayed images and how they could be quantified. The practical value of the notion of optimal image appearance was translated from the high level of the perceptual constraints into a method for setting the display's parameters at the physical level. In general, the whole framework of quality determination includes: (1) evaluation of perceived quality; (2) evaluation of the individual perceptual attributes; and (3) correlation between the physical measurements, psychometric parameters and the subjective responses. We performed a series of psychophysical experiments, with observers viewing a series of color images on a high-end consumer television display, to investigate the relationships between Overall Image Quality and four quality-related attributes: Brightness Rendering, Chromatic Rendering, Visibility of Details and Overall Naturalness. The results of the experiments presented in this paper suggest that these attributes are highly inter-correlated.

  3. Luminance and image quality analysis of an organic electroluminescent panel with a patterned microlens array attachment

    International Nuclear Information System (INIS)

    Lin, Hoang Yan; Chen, Kuan-Yu; Ho, Yu-Hsuan; Fang, Jheng-Hao; Hsu, Sheng-Chih; Lee, Jiun-Haw; Lin, Jia-Rong; Wei, Mao-Kuo

    2010-01-01

    Luminance and image quality observed from the normal direction of a commercial 2.0 inch panel based on organic electroluminescence (OEL) technology attached to regular and patterned microlens array films (MAFs) were studied and analyzed. When the regularly arranged MAF was applied to the panel, a luminance enhancement of 23% was observed, accompanied by a reduction of the image quality index to as low as 74%. By removing the microlenses over the emitting areas, the patterned MAF enhances the luminance efficiency of the OEL by 52% while keeping the image quality index of the display as high as 94%, due to the effective extraction of light in the glass substrate at angles less than the critical angle. A 3D simulation based on a ray-tracing model was also established to investigate the spatial distribution of light rays radiated from an OEL pixel with different microstructures, and its results were consistent with the experimental results

  4. Quality control of the interpretation monitors of digital radiological images

    International Nuclear Information System (INIS)

    Favero, Mariana S.; Goulart, Adriano Oliveira S.

    2016-01-01

    The performance of monitors has great importance for the image quality of digital radiographic systems. In filmless environments, it became necessary to implement acceptance testing and quality control of the monitors used for the interpretation of medical images. Monitors dedicated to radiodiagnosis should display information that represents slight differences in X-ray attenuation or minor differences in some anatomical region of interest, which may correspond to only small differences in the luminance of the displayed image. The factors affecting the quality of medical imaging are contrast, noise, resolution, artifacts and distortions. Therefore, a monitor must have specific characteristics, making it possible for the observer to carry out an assessment that leads to a better diagnosis. Based on the need to evaluate diagnostic monitors in various radiological applications, this paper presents a summary for the implementation and standardization of the tests recommended by the publication AAPM Report 03. (author)

  5. Standardization of Image Quality Analysis – ISO 19264

    DEFF Research Database (Denmark)

    Wüller, Dietmar; Kejser, Ulla Bøgvad

    2016-01-01

    There are a variety of image quality analysis tools available for the archiving world, which are based on different test charts and analysis algorithms. ISO formed a working group in 2012 to harmonize these approaches and create a standard way of analyzing the image quality for archiving...... systems. This has resulted in three documents that have been or are going to be published soon. ISO 19262 defines the terms used in the area of image capture to unify the language. ISO 19263 describes the workflow issues and provides detailed information on how the measurements are done. Last...... but not least ISO 19264 describes the measurements in detail and provides aims and tolerance levels for the different aspects. This paper will present the new ISO 19264 technical specification to analyze image quality based on a single capture of a multi-pattern test chart, and discuss the reasoning behind its...

  6. Myths versus reality in computed radiography image quality

    International Nuclear Information System (INIS)

    Mango, Steve; Castro, Luiz

    2009-01-01

    As NDE operations - particularly radiographic testing - transition from analog to digital technologies such as computed radiography (CR), users are learning that there is more to digital image quality than meets the eye. In fact, there are multiple factors that determine the final perceived image quality of a computed radiograph. Many of these factors are misunderstood, and some are touted as the "key parameter" or "magic bullet" for producing optimum image quality. In reality, such claims are oversimplified and are more marketing hype than reality. The truth? Perceived image quality results from the cascaded effects of many factors - such as sharpness, system noise, spot size and pixel size, subject contrast, bit depth, radiographic technique, and so on. Many of these factors are within the control of radiographers or of the designers of equipment and media. This paper explains some of these key factors, dispels some of the myths surrounding them, and shows that qualities such as bigger, smaller, more, or less are not always better when it comes to CR image quality. (authors)

  7. Quality evaluation of no-reference MR images using multidirectional filters and image statistics.

    Science.gov (United States)

    Jang, Jinseong; Bang, Kihun; Jang, Hanbyol; Hwang, Dosik

    2018-09-01

    This study aimed to develop a fully automatic, no-reference image-quality assessment (IQA) method for MR images. New quality-aware features were obtained by applying multidirectional filters to MR images and examining the feature statistics. A histogram of these features was then fitted to a generalized Gaussian distribution function for which the shape parameters yielded different values depending on the type of distortion in the MR image. Standard feature statistics were established through a training process based on high-quality MR images without distortion. Subsequently, the feature statistics of a test MR image were calculated and compared with the standards. The quality score was calculated as the difference between the shape parameters of the test image and the undistorted standard images. The proposed IQA method showed a >0.99 correlation with the conventional full-reference assessment methods; accordingly, this proposed method yielded the best performance among no-reference IQA methods for images containing six types of synthetic, MR-specific distortions. In addition, for authentically distorted images, the proposed method yielded the highest correlation with subjective assessments by human observers, thus demonstrating its superior performance over other no-reference IQAs. Our proposed IQA was designed to consider MR-specific features and outperformed other no-reference IQAs designed mainly for photographic images. Magn Reson Med 80:914-924, 2018. © 2018 International Society for Magnetic Resonance in Medicine.
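
    A minimal sketch of the core idea, not the authors' implementation: fit a generalized Gaussian to filter-response values and score a test image by the distance of its fitted parameters from reference statistics. The distance form, the use of scipy's gennorm, and the synthetic data are all assumptions for illustration.

```python
import numpy as np
from scipy.stats import gennorm

def ggd_params(feature_map):
    """Fit a generalized Gaussian to flattened feature values; return (shape, scale)."""
    beta, loc, scale = gennorm.fit(feature_map.ravel())
    return beta, scale

def quality_score(test_features, ref_beta, ref_scale):
    """Assumed scoring form: distance of the test GGD parameters from the reference ones."""
    beta, scale = ggd_params(test_features)
    return abs(beta - ref_beta) + abs(scale - ref_scale)

# Synthetic stand-ins for multidirectional filter responses of an MR image.
rng = np.random.default_rng(0)
clean = rng.laplace(size=10_000)                 # pristine-image statistics
noisy = clean + rng.normal(0.0, 2.0, 10_000)     # distortion changes the shape parameter
print(quality_score(noisy, *ggd_params(clean)))
```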

  8. Investigation of Photographic Image Quality Estimators

    Science.gov (United States)

    1980-04-01

    Biberman (1973) describes acutance as being "expressed in terms of the mean square of the gradient of ... density (in a photographic image)". The remaining fragments of the abstract concern computing the density difference for each interval of a smoothed microdensitometer trace (calibrated in density units) and then its gradient, and list rotation-effect test conditions (medium-contrast, variable-aspect target; 250 millisecond shutter speed; frequency; amplitude).

  9. The influence of software filtering in digital mammography image quality

    Science.gov (United States)

    Michail, C.; Spyropoulou, V.; Kalyvas, N.; Valais, I.; Dimitropoulos, N.; Fountos, G.; Kandarakis, I.; Panayiotakis, G.

    2009-05-01

    Breast cancer is one of the most frequently diagnosed cancers among women. Several techniques have been developed to help in the early detection of breast cancer, such as conventional and digital x-ray mammography, positron and single-photon emission mammography, etc. A key advantage of digital mammography is that images can be manipulated as simple computer image files. Thus, non-dedicated, commercially available image manipulation software can be employed to process and store the images. The image processing tools of the Photoshop (CS 2) software incorporate digital filters which may be used to reduce image noise, enhance contrast and increase spatial resolution. However, improving one image quality parameter may result in the degradation of another. The aim of this work was to investigate the influence of three sharpening filters, named hereafter 'sharpen', 'sharpen more' and 'sharpen edges', on image resolution and noise. Image resolution was assessed by means of the Modulation Transfer Function (MTF). In conclusion, it was found that the correct use of commercial non-dedicated software on digital mammograms may improve some aspects of image quality.
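
    The MTF measurement mentioned here can be sketched in a few lines. The full slanted-edge procedure is more involved, so the snippet below assumes an already-extracted line spread function and a purely illustrative pixel pitch.

```python
import numpy as np

def mtf_from_lsf(lsf, pixel_pitch_mm):
    """Return spatial frequencies (cycles/mm) and the normalised MTF of a line spread function."""
    lsf = lsf / lsf.sum()                        # area-normalise the LSF
    mtf = np.abs(np.fft.rfft(lsf))
    mtf /= mtf[0]                                # force MTF(0) = 1
    freqs = np.fft.rfftfreq(lsf.size, d=pixel_pitch_mm)
    return freqs, mtf

# Illustrative Gaussian LSF sampled at a hypothetical 0.05 mm pixel pitch.
x = np.arange(-32, 32) * 0.05
lsf = np.exp(-x**2 / (2 * 0.1**2))
freqs, mtf = mtf_from_lsf(lsf, 0.05)
print(freqs[:4], mtf[:4])
```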

  10. The influence of software filtering in digital mammography image quality

    International Nuclear Information System (INIS)

    Michail, C; Spyropoulou, V; Valais, I; Panayiotakis, G; Kalyvas, N; Fountos, G; Kandarakis, I; Dimitropoulos, N

    2009-01-01

    Breast cancer is one of the most frequently diagnosed cancers among women. Several techniques have been developed to help in the early detection of breast cancer, such as conventional and digital x-ray mammography, positron and single-photon emission mammography, etc. A key advantage of digital mammography is that images can be manipulated as simple computer image files. Thus, non-dedicated, commercially available image manipulation software can be employed to process and store the images. The image processing tools of the Photoshop (CS 2) software incorporate digital filters which may be used to reduce image noise, enhance contrast and increase spatial resolution. However, improving one image quality parameter may result in the degradation of another. The aim of this work was to investigate the influence of three sharpening filters, named hereafter 'sharpen', 'sharpen more' and 'sharpen edges', on image resolution and noise. Image resolution was assessed by means of the Modulation Transfer Function (MTF). In conclusion, it was found that the correct use of commercial non-dedicated software on digital mammograms may improve some aspects of image quality.

  11. Simulation and Efficient Measurements of Intensities for Complex Imaging Sequences

    DEFF Research Database (Denmark)

    Jensen, Jørgen Arendt; Rasmussen, Morten Fischer; Stuart, Matthias Bo

    2014-01-01

    on the sequence to simulate both intensity and mechanical index (MI) according to FDA rules. A 3 MHz BK Medical 8820e convex array transducer is used with the SARUS scanner. An Onda HFL-0400 hydrophone and the Onda AIMS III system measure the pressure field for three imaging schemes: a fixed focus, single...

  12. Correlation of simulated TEM images with irradiation induced damage

    International Nuclear Information System (INIS)

    Schaeublin, R.; Almeida, P. de; Almazouzi, A.; Victoria, M.

    2000-01-01

    Crystal damage induced by irradiation is investigated using transmission electron microscopy (TEM) coupled to molecular dynamics (MD) calculations. The displacement cascades are simulated for energies ranging from 10 to 50 keV in Al, Ni and Cu and for times of up to a few tens of picoseconds. The resulting samples are then used to perform simulations of the TEM images that one could observe experimentally. Diffraction contrast is simulated using a method based on the multislice technique. It appears that the cascade-induced damage in Al imaged in weak beam exhibits little contrast, which is too low to be experimentally visible, while in Ni and Cu a good contrast is observed. The number of visible clusters is always lower than the actual one. Conversely, high resolution TEM (HRTEM) imaging allows most of the defects contained in the sample to be observed, although experimental difficulties arise due to the low contrast intensity of the smallest defects. Single point defects give rise in HRTEM to a contrast that is similar to that of cavities. TEM imaging of the defects is discussed in relation to the actual size of the defects and to the number of clusters deduced from MD simulations.

  13. MO-F-CAMPUS-I-03: GPU Accelerated Monte Carlo Technique for Fast Concurrent Image and Dose Simulation

    Energy Technology Data Exchange (ETDEWEB)

    Becchetti, M; Tian, X; Segars, P; Samei, E [Clinical Imaging Physics Group, Department of Radiology, Duke University Me, Durham, NC (United States)

    2015-06-15

    Purpose: To develop an accurate and fast Monte Carlo (MC) method of simulating CT that is capable of correlating dose with image quality using voxelized phantoms. Methods: A realistic voxelized phantom based on patient CT data, XCAT, was used with a GPU accelerated MC code for helical MDCT. Simulations were done with both uniform density organs and with textured organs. The organ doses were validated using previous experimentally validated simulations of the same phantom under the same conditions. Images acquired by tracking photons through the phantom with MC require lengthy computation times due to the large number of photon histories necessary for accurate representation of noise. A substantial speed up of the process was attained by using a low number of photon histories with kernel denoising of the projections from the scattered photons. These FBP reconstructed images were validated against those that were acquired in simulations using many photon histories by ensuring a minimal normalized root mean square error. Results: Organ doses simulated in the XCAT phantom are within 10% of the reference values. Corresponding images attained using projection kernel smoothing were attained with 3 orders of magnitude less computation time compared to a reference simulation using many photon histories. Conclusion: Combining GPU acceleration with kernel denoising of scattered photon projections in MC simulations allows organ dose and corresponding image quality to be attained with reasonable accuracy and substantially reduced computation time than is possible with standard simulation approaches.
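
    A minimal sketch, under our assumption of the metric's form, of the normalised root-mean-square error used to validate the kernel-denoised, low-history reconstructions against a many-history reference; the images below are synthetic stand-ins.

```python
import numpy as np

def nrmse(test, reference):
    """Normalised root-mean-square error, normalised by the reference dynamic range."""
    rmse = np.sqrt(np.mean((test - reference) ** 2))
    return rmse / (reference.max() - reference.min())

# Synthetic stand-ins for a many-history reference image and a denoised low-history image.
ref = np.random.default_rng(1).normal(100.0, 10.0, (256, 256))
low = ref + np.random.default_rng(2).normal(0.0, 2.0, (256, 256))
print(f"NRMSE = {nrmse(low, ref):.4f}")
```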

  14. MO-F-CAMPUS-I-03: GPU Accelerated Monte Carlo Technique for Fast Concurrent Image and Dose Simulation

    International Nuclear Information System (INIS)

    Becchetti, M; Tian, X; Segars, P; Samei, E

    2015-01-01

    Purpose: To develop an accurate and fast Monte Carlo (MC) method of simulating CT that is capable of correlating dose with image quality using voxelized phantoms. Methods: A realistic voxelized phantom based on patient CT data, XCAT, was used with a GPU accelerated MC code for helical MDCT. Simulations were done with both uniform density organs and with textured organs. The organ doses were validated using previous experimentally validated simulations of the same phantom under the same conditions. Images acquired by tracking photons through the phantom with MC require lengthy computation times due to the large number of photon histories necessary for accurate representation of noise. A substantial speed up of the process was attained by using a low number of photon histories with kernel denoising of the projections from the scattered photons. These FBP reconstructed images were validated against those that were acquired in simulations using many photon histories by ensuring a minimal normalized root mean square error. Results: Organ doses simulated in the XCAT phantom are within 10% of the reference values. Corresponding images attained using projection kernel smoothing were attained with 3 orders of magnitude less computation time compared to a reference simulation using many photon histories. Conclusion: Combining GPU acceleration with kernel denoising of scattered photon projections in MC simulations allows organ dose and corresponding image quality to be attained with reasonable accuracy and substantially reduced computation time than is possible with standard simulation approaches

  15. Mass imbalances in EPANET water-quality simulations

    Energy Technology Data Exchange (ETDEWEB)

    Davis, Michael J.; Janke, Robert; Taxon, Thomas N.

    2018-04-06

    EPANET is widely employed to simulate water quality in water distribution systems. However, the time-driven simulation approach used to determine concentrations of water-quality constituents provides accurate results, in general, only for small water-quality time steps; use of an adequately short time step may not be feasible. Overly long time steps can yield errors in concentrations and result in situations in which constituent mass is not conserved. Mass may not be conserved even when EPANET gives no errors or warnings. This paper explains how such imbalances can occur and provides examples of such cases; it also presents a preliminary event-driven approach that conserves mass with a water-quality time step that is as long as the hydraulic time step. Results obtained using the current approach converge, or tend to converge, to those obtained using the new approach as the water-quality time step decreases. Improving the water-quality routing algorithm used in EPANET could eliminate mass imbalances and related errors in estimated concentrations.
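
    A constituent mass balance of the kind whose violation is described above can be checked with a few lines. This sketch is not part of EPANET; the flows, concentrations, storage change and time step are purely illustrative.

```python
import numpy as np

def mass_imbalance(inflow_lps, c_in_mgpl, outflow_lps, c_out_mgpl, stored_mass_change_g, dt_s):
    """Mass entering minus mass leaving minus change in stored mass (grams); ~0 if mass is conserved."""
    mass_in = np.sum(inflow_lps * c_in_mgpl) * dt_s / 1000.0
    mass_out = np.sum(outflow_lps * c_out_mgpl) * dt_s / 1000.0
    return mass_in - mass_out - stored_mass_change_g

# Illustrative numbers only: two inflowing pipes, one outflowing pipe, one 300 s water-quality step.
print(mass_imbalance(np.array([10.0, 5.0]), np.array([1.0, 1.0]),
                     np.array([14.0]), np.array([0.9]),
                     stored_mass_change_g=0.5, dt_s=300))
```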

  16. Image Quality Assessment of JPEG Compressed Mars Science Laboratory Mastcam Images using Convolutional Neural Networks

    Science.gov (United States)

    Kerner, H. R.; Bell, J. F., III; Ben Amor, H.

    2017-12-01

    The Mastcam color imaging system on the Mars Science Laboratory Curiosity rover acquires images within Gale crater for a variety of geologic and atmospheric studies. Images are often JPEG compressed before being downlinked to Earth. While critical for transmitting images on a low-bandwidth connection, this compression can result in image artifacts most noticeable as anomalous brightness or color changes within or near JPEG compression block boundaries. In images with significant high-frequency detail (e.g., in regions showing fine layering or lamination in sedimentary rocks), the image might need to be re-transmitted losslessly to enable accurate scientific interpretation of the data. The process of identifying which images have been adversely affected by compression artifacts is performed manually by the Mastcam science team, costing significant expert human time. To streamline the tedious process of identifying which images might need to be re-transmitted, we present an input-efficient neural network solution for predicting the perceived quality of a compressed Mastcam image. Most neural network solutions require large amounts of hand-labeled training data for the model to learn the target mapping between input (e.g. distorted images) and output (e.g. quality assessment). We propose an automatic labeling method using joint entropy between a compressed and uncompressed image to avoid the need for domain experts to label thousands of training examples by hand. We use automatically labeled data to train a convolutional neural network to estimate the probability that a Mastcam user would find the quality of a given compressed image acceptable for science analysis. We tested our model on a variety of Mastcam images and found that the proposed method correlates well with image quality perception by science team members. When assisted by our proposed method, we estimate that a Mastcam investigator could reduce the time spent reviewing images by a minimum of 70%.
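
    A minimal sketch of the automatic-labeling idea, assuming joint entropy is computed from a 2-D histogram of co-located pixel values; the bin count and the synthetic "compression" noise are assumptions, not the authors' pipeline.

```python
import numpy as np

def joint_entropy(img_a, img_b, bins=64):
    """Joint entropy (bits) of the 2-D histogram of co-located pixel values."""
    hist, _, _ = np.histogram2d(img_a.ravel(), img_b.ravel(), bins=bins)
    p = hist / hist.sum()
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

rng = np.random.default_rng(0)
original = rng.integers(0, 256, (128, 128)).astype(float)
compressed = original + rng.normal(0.0, 8.0, (128, 128))   # stand-in for JPEG artifacts
print(f"Joint entropy: {joint_entropy(original, compressed):.2f} bits")
```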

  17. Quality and Reliability of Large-Eddy Simulations

    CERN Document Server

    Meyers, Johan; Sagaut, Pierre

    2008-01-01

    Computational resources have developed to the level that, for the first time, it is becoming possible to apply large-eddy simulation (LES) to turbulent flow problems of realistic complexity. Many examples can be found in technology and in a variety of natural flows. This puts issues related to assessing, assuring, and predicting the quality of LES into the spotlight. Several LES studies have been published in the past, demonstrating a high level of accuracy with which turbulent flow predictions can be attained, without having to resort to the excessive requirements on computational resources imposed by direct numerical simulations. However, the setup and use of turbulent flow simulations requires a profound knowledge of fluid mechanics, numerical techniques, and the application under consideration. The susceptibility of large-eddy simulations to errors in modelling, in numerics, and in the treatment of boundary conditions, can be quite large due to nonlinear accumulation of different contributions over time, ...

  18. Image quality and dose in mammographic images obtained in Mexico City hospitals

    International Nuclear Information System (INIS)

    Ruiz-Trejo, C.; Brandan, M.-E.; Verdejo, M.; Flores, A.; Guevara, M.; Martin, J.; Madero-Preciado, L.

    2001-01-01

    The performance of three mammographic systems in large Mexican hospitals has been evaluated, as well as the image quality and associated dose. Quality control tests include examination of X-ray equipment, darkroom conditions, film processor, and viewboxes. The systems referred to as '1', '2', and '3' passed 50%, 75% and 75% of these tests, respectively. Image quality is assessed using five images obtained under similar nominal conditions on each X-ray unit. System 1 generates no image of acceptable quality, while systems 2 and 3 produce one and two, respectively. The mean glandular dose for the best images obtained in each service with an accreditation phantom has been measured, and the values are 1.4 mGy, 1.6 mGy, and 1.0 mGy, respectively. (author)

  19. Physical image quality of computed radiography in mammography system

    International Nuclear Information System (INIS)

    Norriza Mohd Isa; Muhammad Jamal Isa; Wan Muhamad Saridan Wan Hassan; Fatimah Othman

    2013-01-01

    Full-text: Mammography is a screening procedure mostly used for the early detection of breast cancer. Among digital imaging systems, Computed Radiography is a cost-effective technology that applies an indirect-conversion detector. The paper presents measurements of physical image quality parameters, namely the modulation transfer function (MTF), the normalized noise power spectrum (NNPS) and the detective quantum efficiency (DQE), of a Computed Radiography mammography system. The MTF was calculated from slanted images of an edge test device acquired in two different orientations, and the NNPS was estimated from a flat-field image. Both images were acquired using a standard mammography beam quality. The DQE was determined by applying the MTF and NNPS values in our in-house software program. Both orientations showed similar DQE characteristics. (author)
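
    The three metrics can be combined as sketched below, assuming the standard relation DQE(f) = MTF(f)^2 / (q · NNPS(f)) with q the incident photon fluence. The curves and numbers are hypothetical, not the paper's measurements.

```python
import numpy as np

def dqe(mtf, nnps, photon_fluence_per_mm2):
    """DQE(f) = MTF(f)^2 / (q * NNPS(f)), with q the incident photon fluence per mm^2."""
    return mtf**2 / (photon_fluence_per_mm2 * nnps)

freqs = np.linspace(0.0, 5.0, 6)            # cycles/mm (illustrative)
mtf = np.exp(-0.4 * freqs)                  # hypothetical MTF curve
nnps = np.full_like(freqs, 1.0e-5)          # hypothetical NNPS (mm^2)
q = 2.0e5                                   # hypothetical photons per mm^2
print(np.round(dqe(mtf, nnps, q), 3))
```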

  20. Kinetic Simulation and Energetic Neutral Atom Imaging of the Magnetosphere

    Science.gov (United States)

    Fok, Mei-Ching H.

    2011-01-01

    Advanced simulation tools and measurement techniques have been developed to study the dynamic magnetosphere and its response to drivers in the solar wind. The Comprehensive Ring Current Model (CRCM) is a kinetic code that solves the 3D distribution in space, energy and pitch-angle information of energetic ions and electrons. Energetic Neutral Atom (ENA) imagers have been carried in past and current satellite missions. Global morphology of energetic ions were revealed by the observed ENA images. We have combined simulation and ENA analysis techniques to study the development of ring current ions during magnetic storms and substorms. We identify the timing and location of particle injection and loss. We examine the evolution of ion energy and pitch-angle distribution during different phases of a storm. In this talk we will discuss the findings from our ring current studies and how our simulation and ENA analysis tools can be applied to the upcoming TRIO-CINAMA mission.

  1. A Methodology for Anatomic Ultrasound Image Diagnostic Quality Assessment

    DEFF Research Database (Denmark)

    Hemmsen, Martin Christian; Lange, Theis; Brandt, Andreas Hjelm

    2017-01-01

    are presented. Earlier uses of the methodology has shown that it ensures validity of the assessment, as it separates the influences between developer, investigator, and assessor once a research protocol has been established. This separation reduces confounding influences on the result from the developer......This paper discusses methods for assessment of ultrasound image quality based on our experiences with evaluating new methods for anatomic imaging. It presents a methodology to ensure a fair assessment between competing imaging methods using clinically relevant evaluations. The methodology...... to properly reveal the clinical value. The paper exemplifies the methodology using recent studies of Synthetic Aperture Sequential Beamforming tissue harmonic imaging....

  2. ESIM: Edge Similarity for Screen Content Image Quality Assessment.

    Science.gov (United States)

    Ni, Zhangkai; Ma, Lin; Zeng, Huanqiang; Chen, Jing; Cai, Canhui; Ma, Kai-Kuang

    2017-10-01

    In this paper, an accurate full-reference image quality assessment (IQA) model developed for assessing screen content images (SCIs), called the edge similarity (ESIM), is proposed. It is inspired by the fact that the human visual system (HVS) is highly sensitive to edges that are often encountered in SCIs; therefore, essential edge features are extracted and exploited for conducting IQA for the SCIs. The key novelty of the proposed ESIM lies in the extraction and use of three salient edge features, i.e., edge contrast, edge width, and edge direction. The first two attributes are simultaneously generated from the input SCI based on a parametric edge model, while the last one is derived directly from the input SCI. The extraction of these three features is performed for the reference SCI and the distorted SCI individually. The degree of similarity measured for each above-mentioned edge attribute is then computed independently, followed by combining them together using our proposed edge-width pooling strategy to generate the final ESIM score. To conduct the performance evaluation of our proposed ESIM model, a new and the largest SCI database (denoted as SCID) is established in our work and made publicly available for download. Our database contains 1800 distorted SCIs that are generated from 40 reference SCIs. For each SCI, nine distortion types are investigated, and five degradation levels are produced for each distortion type. Extensive simulation results have clearly shown that the proposed ESIM model is more consistent with the perception of the HVS on the evaluation of distorted SCIs than the multiple state-of-the-art IQA methods.
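
    A minimal sketch of the kind of per-attribute similarity map used by SSIM-like models such as ESIM, applied here to one edge attribute. ESIM's exact constants and pooling differ, so treat this as an illustration under stated assumptions only.

```python
import numpy as np

def attribute_similarity(ref_attr, dist_attr, c=1e-3):
    """SSIM-style similarity between one edge attribute of the reference and distorted SCI."""
    return (2.0 * ref_attr * dist_attr + c) / (ref_attr**2 + dist_attr**2 + c)

# Illustrative edge-contrast values at three edge locations.
ref_contrast = np.array([0.8, 0.5, 0.1])
dist_contrast = np.array([0.7, 0.2, 0.1])
print(attribute_similarity(ref_contrast, dist_contrast))
```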

  3. Algorithms of CT value correction for reconstructing a radiotherapy simulation image through axial CT images

    International Nuclear Information System (INIS)

    Ogino, Takashi; Egawa, Sunao

    1991-01-01

    New algorithms of CT value correction for reconstructing a radiotherapy simulation image from axial CT images were developed. One, designated the plane weighting method, corrects the CT value in proportion to the position of the beam element passing through the voxel. The other, designated the solid weighting method, corrects the CT value in proportion to the length of the beam element passing through the voxel and the volume of the voxel. Phantom experiments showed fair spatial resolution in the transverse direction. In the longitudinal direction, however, spatial resolution finer than the slice thickness could not be obtained. Contrast resolution was equivalent for both methods. In patient studies, the reconstructed radiotherapy simulation image was perceived as almost equivalent in density resolution to a simulation film taken with an X-ray simulator. (author)

  4. Real-time computer treatment of THz passive device images with the high image quality

    Science.gov (United States)

    Trofimov, Vyacheslav A.; Trofimov, Vladislav V.

    2012-06-01

    We demonstrate real-time computer code that significantly improves the quality of images captured by a passive THz imaging system. The code is not restricted to one passive THz device: it can be applied to any such device, and to active THz imaging systems as well. We applied the code to images captured by four passive THz imaging devices manufactured by different companies. It should be stressed that processing images produced by different companies usually requires different spatial filters. The performance of the current version of the code exceeds one image per second for a THz image with more than 5000 pixels and 24-bit number representation. Processing a single THz image produces about 20 output images simultaneously, corresponding to the various spatial filters. The code allows the number of pixels of the processed images to be increased without noticeable reduction of image quality, and its performance can be increased many times by using parallel processing algorithms. We developed original spatial filters which allow one to see objects smaller than 2 cm in imagery produced by passive THz devices that captured objects hidden under opaque clothes. For images with high noise, we developed an approach that suppresses the noise during computer processing and yields good-quality images. To illustrate the efficiency of the developed approach, we demonstrate the detection of liquid explosive, ordinary explosive, a knife, a pistol, a metal plate, a CD, ceramics, chocolate and other objects hidden under opaque clothes. The results demonstrate the high efficiency of our approach for the detection of hidden objects and represent a very promising solution to the security problem.

  5. Evaluation of the image quality criteria and study of doses in a mammography department

    International Nuclear Information System (INIS)

    Alcantara, Marcela Costa

    2009-01-01

    The mammographic image quality criteria published by the European Commission were implemented on three mammography units of the same radiology department in a hospital in the city of Sao Paulo. Two of the units use a screen-film system and one uses an indirect digital system. During data collection, the need for an image-rejection study on each unit was noted; this study was carried out, and the image-rejection rates and the percentage of images meeting each quality criterion were then compared across the units. In parallel, the entrance surface dose and the average glandular dose were studied. These doses were estimated using different methods published by different research groups, for all anode/filter combinations available in the equipment. To estimate the entrance surface dose following the methodology published in the Avenue's guide, and the average glandular dose following Wu's methodology, an acrylic phantom of different thicknesses was developed to simulate a breast. Finally, image quality was related to the dose received by the patient. The digital unit showed the best results in the evaluation of the quality criteria, the lowest image-rejection rate, and the lowest values of average glandular dose and entrance surface dose with all the methods studied. However, it is not sufficient on its own, since it is not adequate for patients with large breasts. (author)

  6. Patient-specific scatter correction in clinical cone beam computed tomography imaging made possible by the combination of Monte Carlo simulations and a ray tracing algorithm

    DEFF Research Database (Denmark)

    Slot Thing, Rune; Bernchou, Uffe; Mainegra-Hing, Ernesto

    2013-01-01

    Abstract Purpose. Cone beam computed tomography (CBCT) image quality is limited by scattered photons. Monte Carlo (MC) simulations provide the ability of predicting the patient-specific scatter contamination in clinical CBCT imaging. Lengthy simulations prevent MC-based scatter correction from...

  7. Validation of an image quality index: its correlation with quality control parameters

    International Nuclear Information System (INIS)

    Cabrejas, M.L.C.; Giannone, C.A.; Arashiro, J.G.; Cabrejas, R.C.

    2002-01-01

    Objective and Rationale: To validate a new image quality index (the Performance Index, PI) that assesses the detectability of simulated lesions with a phantom. This index presumably depends markedly on quality control (QC) parameters such as tomographic uniformity (Unif), centre of rotation (COR) and spatial resolution (FWHM). The simultaneous effects of the QC parameters may explain much of the variation in the PIs; i.e. they may be predictors of the PI values. Methods: An overall performance phantom containing 3 sections was used. The first, uniform section was used to determine tomographic uniformity. From the analysis of the slices corresponding to the second section, containing 8 cold cylindrical simulated lesions of different diameters (range 7 mm - 17 mm), the numbers of true and false positives are determined, and from these a new Performance Index (PI) is defined as the ratio between the positive predictive value and the sensitivity (the latter expressed as its complement, with a constant added to avoid a singularity). A point source located on top of the phantom was used to determine the centre of rotation and the spatial resolution, expressed by the FWHM in mm. 40 nuclear medicine labs participated in the survey. Standard multiple regression analysis between the Performance Index, as dependent variable, and FWHM, COR and Unif as independent variables was performed to evaluate the influence of the QC parameters on the PI values. Results: It is shown that resolution and COR are both predictors of the PIs, with statistical significance for the multiple correlation coefficient R. However, the addition of the variable tomographic uniformity to the model does not improve the prediction of PIs; moreover, that regression model lacks overall statistical significance. A regression summary for the dependent variable Performance Index is presented. Conclusions: We confirm that the new Performance Index (PI) depends on QC parameters such as COR and spatial resolution. Those labs whose PIs are out
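
    A minimal sketch of how such a Performance Index could be computed from detection counts. This is our reading of the definition quoted in the abstract, and the value of the constant c is an assumption.

```python
def performance_index(tp, fp, fn, c=0.1):
    """PPV divided by the complement of sensitivity plus a small constant (assumed form)."""
    ppv = tp / (tp + fp)
    sensitivity = tp / (tp + fn)
    return ppv / ((1.0 - sensitivity) + c)

# 8 simulated lesions: 6 detected, 2 missed, 1 false positive (illustrative counts).
print(f"PI = {performance_index(tp=6, fp=1, fn=2):.2f}")
```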

  8. Image Quality Improvement after Implementation of a CT Accreditation Program

    International Nuclear Information System (INIS)

    Kim, You Sung; Jung, Seung Eun; Choi, Byung Gil; Shin, Yu Ri; Hwang, Seong Su; Ku, Young Mi; Lim, Yeon Soo; Lee, Jae Mun

    2010-01-01

    The purpose of this study was to evaluate any improvement in the quality of abdominal CTs after the introduction of the nationally based accreditation program. Approval was obtained from the Institutional Review Board, and informed consent was waived. We retrospectively analyzed 1,011 outside abdominal CTs, from 2003 to 2007. We evaluated images using the fill-in evaluation sheet of the national accreditation program and, subjectively, by grading the overall CT image quality. CT scans were divided into two categories according to time period: before and after the implementation of the accreditation program. We compared CT scans between the two periods according to the parameters used for image evaluation, and determined whether there was a correlation between the results of the subjective assessment of image quality and the clinical image evaluation scores. The following parameters were significantly different after the implementation of the accreditation program: identifying data, display parameters, scan length, spatial and contrast resolution, window width and level, optimal contrast enhancement, slice thickness, and total score. The remaining parameters were not significantly different between scans obtained from the two periods: scan parameters, film quality, and artifacts. After the implementation of the CT accreditation program, the quality of outside abdominal CTs showed marked improvement, especially for the parameters related to the scanning protocol.

  9. Body image and quality of life in a Spanish population

    Directory of Open Access Journals (Sweden)

    Ignacio Jáuregui Lobera

    2011-01-01

    Full Text Available. Ignacio Jáuregui Lobera (1), Patricia Bolaños Ríos (2); (1) Department of Nutrition and Bromatology, Pablo de Olavide University, Seville, Spain; (2) Behavior Science Institute, Seville, Spain. Purpose: The aim of the current study was to analyze the psychometric properties, factor structure, and internal consistency of the Spanish version of the Body Image Quality of Life Inventory (BIQLI-SP), as well as its test–retest reliability. Further objectives were to analyze different relationships with key dimensions of psychosocial functioning (i.e., self-esteem, presence of psychopathological symptoms, eating and body image-related problems, and perceived stress) and to evaluate differences in body image quality of life due to gender. Patients and methods: The sample comprised 417 students without any psychiatric history, recruited from the Pablo de Olavide University and the University of Seville. There were 140 men (33.57%) and 277 women (66.43%), and the mean age was 21.62 years (standard deviation = 5.12). After obtaining informed consent from all participants, the following questionnaires were administered: BIQLI, Eating Disorder Inventory-2 (EDI-2), Perceived Stress Questionnaire (PSQ), Self-Esteem Scale (SES), and Symptom Checklist-90-Revised (SCL-90-R). Results: The BIQLI-SP shows adequate psychometric properties, and it may be useful to determine the body image quality of life in different physical conditions. A more positive body image quality of life is associated with better self-esteem, better psychological wellbeing, and fewer eating-related dysfunctional attitudes, this being more evident among women. Conclusion: The BIQLI-SP may be useful to determine the body image quality of life in different contexts with regard to dermatology, cosmetic and reconstructive surgery, and endocrinology, among others. In these fields of study, a new trend has emerged to assess body image-related quality of life. Keywords: body appreciation, wellbeing, self-esteem, social

  10. Analysis of an image quality assurance program

    International Nuclear Information System (INIS)

    Goethlin, J.H.; Alders, B.

    1985-01-01

    Reject film analysis before and after the introduction of a quality assurance program showed a 45% decrease in rejected films. The main changes in equipment and routines were: 1. Increased control of film processors and X-ray generators. 2. New film casettes and screens. 3. Decreased number of film sizes. 4. Information to and supervision of radiographing personnel. Savings in costs and increased income from an increased amount of out-patients corresponded to about 4.5% of the total cost of operating and maintaining the department. (orig.)

  11. Learning Receptive Fields and Quality Lookups for Blind Quality Assessment of Stereoscopic Images.

    Science.gov (United States)

    Shao, Feng; Lin, Weisi; Wang, Shanshan; Jiang, Gangyi; Yu, Mei; Dai, Qionghai

    2016-03-01

    Blind quality assessment of 3D images encounters more new challenges than its 2D counterparts. In this paper, we propose a blind quality assessment for stereoscopic images by learning the characteristics of receptive fields (RFs) from the perspective of dictionary learning, and constructing quality lookups to replace human opinion scores without performance loss. The important feature of the proposed method is that we do not need a large set of samples of distorted stereoscopic images and the corresponding human opinion scores to learn a regression model. To be more specific, in the training phase, we learn local RFs (LRFs) and global RFs (GRFs) from the reference and distorted stereoscopic images, respectively, and construct their corresponding local quality lookups (LQLs) and global quality lookups (GQLs). In the testing phase, blind quality pooling can be easily achieved by searching optimal GRF and LRF indexes from the learnt LQLs and GQLs, and the quality score is obtained by combining the LRF and GRF indexes together. Experimental results on three publicly available 3D image quality assessment databases demonstrate that, in comparison with the existing methods, the devised algorithm achieves highly consistent alignment with subjective assessment.

  12. Comparison of Model Predictions of Image Quality with Results of Clinical Trials in Chest and Lumbar Spine Screen-film Imaging

    International Nuclear Information System (INIS)

    Sandborg, M.; McVey, G.; Dance, D.R.; Carlsson, G.A.

    2000-01-01

    The ability to predict image quality from known physical and technical parameters is a prerequisite for making successful dose optimisation. In this study, imaging systems have been simulated using a Monte Carlo model of the imaging systems. The model includes a voxelised human anatomy and quantifies image quality in terms of contrast and signal-to-noise ratio for 5-6 anatomical details included in the anatomy. The imaging systems used in clinical trials were simulated and the ranking of the systems by the model and radiologists compared. The model and the results of the trial for chest PA both show that using a high maximum optical density was significantly better than using a low one. The model predicts that a good system is characterised by a large dynamic range and a high contrast of the blood vessels in the retrocardiac area. The ranking by the radiologists and the model agreed for the lumbar spine AP. (author)
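
    A minimal sketch of how contrast and a signal-difference-to-noise ratio for an anatomical detail might be computed from simulated regions of interest; the ROI statistics below are synthetic stand-ins, not outputs of the Monte Carlo model described here.

```python
import numpy as np

def contrast_and_sdnr(detail_roi, background_roi):
    """Relative contrast and signal-difference-to-noise ratio of a detail against its background."""
    diff = background_roi.mean() - detail_roi.mean()
    contrast = diff / background_roi.mean()
    sdnr = abs(diff) / background_roi.std()
    return contrast, sdnr

rng = np.random.default_rng(3)
background = rng.normal(100.0, 5.0, 1000)   # synthetic background pixel values
detail = rng.normal(92.0, 5.0, 1000)        # synthetic detail pixel values
print(contrast_and_sdnr(detail, background))
```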

  13. Two-dimensional pixel image lag simulation and optimization in a 4-T CMOS image sensor

    Energy Technology Data Exchange (ETDEWEB)

    Yu Junting; Li Binqiao; Yu Pingping; Xu Jiangtao [School of Electronics Information Engineering, Tianjin University, Tianjin 300072 (China); Mou Cun, E-mail: xujiangtao@tju.edu.c [Logistics Management Office, Hebei University of Technology, Tianjin 300130 (China)

    2010-09-15

    Pixel image lag in a 4-T CMOS image sensor is analyzed and simulated in a two-dimensional model. Strategies for reducing image lag are discussed in terms of transfer gate channel threshold voltage doping adjustment, PPD N-type doping dose/implant tilt adjustment and transfer gate operation voltage adjustment for signal electron transfer. With the computer analysis tool ISE-TCAD, simulation results show that minimum image lag can be obtained at a pinned photodiode n-type doping dose of 7.0 × 10¹² cm⁻², an implant tilt of -2°, a transfer gate channel doping dose of 3.0 × 10¹² cm⁻² and an operation voltage of 3.4 V. The conclusions of this theoretical analysis can be a guideline for pixel design to improve the performance of 4-T CMOS image sensors. (semiconductor devices)

  14. An approach for quantitative image quality analysis for CT

    Science.gov (United States)

    Rahimi, Amir; Cochran, Joe; Mooney, Doug; Regensburger, Joe

    2016-03-01

    An objective and standardized approach to assess image quality of Computed Tomography (CT) systems is required in a wide variety of imaging processes to identify CT systems appropriate for a given application. We present an overview of the framework we have developed to help standardize and to objectively assess CT image quality for different models of CT scanners used for security applications. Within this framework, we have developed methods to quantitatively measure metrics that should correlate with feature identification, detection accuracy and precision, and image registration capabilities of CT machines and to identify strengths and weaknesses in different CT imaging technologies in transportation security. To that end we have designed, developed and constructed phantoms that allow for systematic and repeatable measurements of roughly 88 image quality metrics, representing modulation transfer function, noise equivalent quanta, noise power spectra, slice sensitivity profiles, streak artifacts, CT number uniformity, CT number consistency, object length accuracy, CT number path length consistency, and object registration. Furthermore, we have developed a sophisticated MATLAB based image analysis tool kit to analyze CT generated images of phantoms and report these metrics in a format that is standardized across the considered models of CT scanners, allowing for comparative image quality analysis within a CT model or between different CT models. In addition, we have developed a modified sparse principal component analysis (SPCA) method to generate a modified set of PCA components as compared to the standard principal component analysis (PCA) with sparse loadings in conjunction with the Hotelling T2 statistical analysis method to compare, qualify, and detect faults in the tested systems.
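
    A minimal sketch of the Hotelling T² comparison mentioned above, applied to a small vector of image-quality metrics. The 4-metric layout, the reference population and the drift values are assumptions for illustration, not the framework's actual data.

```python
import numpy as np

def hotelling_t2(sample_metrics, ref_mean, ref_cov):
    """Hotelling T^2 distance of one metric vector from a reference population."""
    diff = sample_metrics - ref_mean
    return float(diff @ np.linalg.inv(ref_cov) @ diff)

rng = np.random.default_rng(0)
reference = rng.normal(size=(50, 4))                     # 50 scans x 4 metrics (illustrative)
ref_mean = reference.mean(axis=0)
ref_cov = np.cov(reference, rowvar=False)
test_scan = ref_mean + np.array([0.1, 3.0, -0.2, 0.0])   # one metric drifts noticeably
print(f"T^2 = {hotelling_t2(test_scan, ref_mean, ref_cov):.1f}")
```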

  15. Optimage central organised image quality control including statistics and reporting

    International Nuclear Information System (INIS)

    Jahnen, A.; Schilz, C.; Shannoun, F.; Schreiner, A.; Hermen, J.; Moll, C.

    2008-01-01

    Quality control of medical imaging systems is performed using dedicated phantoms. As imaging systems become more and more digital, adequate image processing methods can help to save evaluation time and to obtain objective results. The developed software package OPTIMAGE focuses on this with a centralised approach: on one hand, OPTIMAGE provides a framework which includes functions like database integration, DICOM data sources, a multilingual user interface and image processing functionality. On the other hand, the test methods are implemented using modules which are able to process the images automatically for the common imaging systems. The integration of statistics and reporting into this environment is paramount: this is the only way to provide these functions in an interactive, user-friendly way. These features enable users to discover degradation in performance quickly and to document the performed measurements easily. (authors)

  16. Body image quality of life in eating disorders

    Directory of Open Access Journals (Sweden)

    Ignacio Jáuregui Lobera

    2011-03-01

    Full Text Available. Ignacio Jáuregui Lobera (1), Patricia Bolaños Ríos (2); (1) Department of Nutrition and Bromatology, Pablo de Olavide University, Seville, Spain; (2) Behavior Sciences Institute, Seville, Spain. Purpose: The objective was to examine how body image affects quality of life in an eating-disorder (ED) clinical sample, a non-ED clinical sample, and a nonclinical sample. We hypothesized that ED patients would show the worst body image quality of life. We also hypothesized that body image quality of life would have a stronger negative association with specific ED-related variables than with other psychological and psychopathological variables, mainly among ED patients. On the basis of previous studies, the influence of gender on the results was explored, too. Patients and methods: The final sample comprised 70 ED patients (mean age 22.65 ± 7.76 years; 59 women and 11 men), 106 patients with other psychiatric disorders (mean age 28.20 ± 6.52; 67 women and 39 men), and 135 university students (mean age 21.57 ± 2.58; 81 women and 54 men) with no psychiatric history. After having obtained informed consent, the following questionnaires were administered: Body Image Quality of Life Inventory-Spanish version (BIQLI-SP), Eating Disorders Inventory-2 (EDI-2), Perceived Stress Questionnaire (PSQ), Self-Esteem Scale (SES), and Symptom Checklist-90-Revised (SCL-90-R). Results: The ED patients' ratings on the BIQLI-SP were the lowest and negatively scored (BIQLI-SP means: +20.18, +5.14, and -6.18 in the student group, the non-ED patient group, and the ED group, respectively). The effect of body image on quality of life was more negative in the ED group in all items of the BIQLI-SP. Body image quality of life was negatively associated with specific ED-related variables, more than with other psychological and psychopathological variables, but not especially among ED patients. Conclusion: Body image quality of life was affected not only by specific pathologies related to body

  17. Simulating Visible/Infrared Imager Radiometer Suite Normalized Difference Vegetation Index Data Using Hyperion and MODIS

    Science.gov (United States)

    Ross, Kenton W.; Russell, Jeffrey; Ryan, Robert E.

    2006-01-01

    The success of MODIS (the Moderate Resolution Imaging Spectrometer) in creating unprecedented, timely, high-quality data for vegetation and other studies has created great anticipation for data from VIIRS (the Visible/Infrared Imager Radiometer Suite). VIIRS will be carried onboard the joint NASA/Department of Defense/National Oceanic and Atmospheric Administration NPP (NPOESS (National Polar-orbiting Operational Environmental Satellite System) Preparatory Project). Because the VIIRS instruments will have lower spatial resolution than the current MODIS instruments (400 m versus 250 m at nadir for the channels used to generate Normalized Difference Vegetation Index data), scientists need the answer to this question: how will the change in resolution affect vegetation studies? By using simulated VIIRS measurements, this question may be answered before the VIIRS instruments are deployed in space. Using simulated VIIRS products, the U.S. Department of Agriculture and other operational agencies can then modify their decision support systems appropriately in preparation for receipt of actual VIIRS data. VIIRS simulations and validations will be based on the ART (Application Research Toolbox), an integrated set of algorithms and models developed in MATLAB (Registered Trademark) that enables users to perform a suite of simulations and statistical trade studies on remote sensing systems. Specifically, the ART provides the capability to generate simulated multispectral image products, at various scales, from high spatial resolution hyperspectral and/or multispectral image products. The ART uses acquired ("real") or synthetic datasets, along with sensor specifications, to create simulated datasets. For existing multispectral sensor systems, the simulated data products are used for comparison, verification, and validation of the simulated system's actual products. VIIRS simulations will be performed using Hyperion and MODIS datasets. The hyperspectral and hyperspatial properties of Hyperion

  18. The Study on the Attenuation of X-ray and Imaging Quality by Contents in Stomach

    International Nuclear Information System (INIS)

    Dong, Kyung Rae; Ji, Youn Sang; Kim, Chang Bok; Choi, Seong Kwan; Moon, Sang In; Dieter, Kevin

    2009-01-01

    This study examined the change in the attenuation of X-rays, via ROI (Region of Interest) values in DR (Digital Radiography) images, according to the stomach contents, by manufacturing a tissue-equivalent-material phantom to simulate real stomach tissue, based on the assumption that there is some attenuation of X-rays and a difference in imaging quality according to the stomach contents. The transmitted dose decreased with increasing protein thickness because of the attenuation of X-rays, which altered the average ROI values in the film and DR images. A comparison of the changes in the average ROI values of the film and DR images showed that the film image exhibited larger density changes with varying protein thickness than the DR image. The results indicate that NPO (nothing by mouth) is more important in the film system than in the DR system.

  19. The Study on the Attenuation of X-ray and Imaging Quality by Contents in Stomach

    Energy Technology Data Exchange (ETDEWEB)

    Dong, Kyung Rae; Ji, Youn Sang; Kim, Chang Bok; Choi, Seong Kwan; Moon, Sang In [Dept. of Radiological Technology, Gwangju Health College University, Gwangju (Korea, Republic of); Dieter, Kevin [Dept. of Physical Therapy, Gwangju Health College University, Gwangju (Korea, Republic of)

    2009-03-15

    This study examined the change in the attenuation of X-rays, via ROI (Region of Interest) values in DR (Digital Radiography) images, according to the stomach contents, by manufacturing a tissue-equivalent-material phantom to simulate real stomach tissue, based on the assumption that there is some attenuation of X-rays and a difference in imaging quality according to the stomach contents. The transmitted dose decreased with increasing protein thickness because of the attenuation of X-rays, which altered the average ROI values in the film and DR images. A comparison of the changes in the average ROI values of the film and DR images showed that the film image exhibited larger density changes with varying protein thickness than the DR image. The results indicate that NPO (nothing by mouth) is more important in the film system than in the DR system.

  20. Frameworks for Assessing the Quality of Modeling and Simulation Capabilities

    Science.gov (United States)

    Rider, W. J.

    2012-12-01

    The importance of assuring quality in modeling and simulation has spawned several frameworks for structuring the examination of quality. The format and content of these frameworks provide an emphasis, completeness and flow to assessment activities. I will examine four frameworks that have been developed and describe how they can be improved and applied to a broader set of high-consequence applications. Perhaps the first of these frameworks was known as CSAU [Boyack] (code scaling, applicability and uncertainty), used for nuclear reactor safety and endorsed by the United States Nuclear Regulatory Commission (USNRC). This framework was shaped by nuclear safety practice and the practical structure needed after the Three Mile Island accident. It incorporated the dominant experimental program, the dominant analysis approach, and concerns about the quality of modeling. The USNRC gave it the force of law, which made the nuclear industry take it seriously. After the cessation of nuclear weapons testing, the United States began a program of examining the reliability of these weapons without testing. This program utilizes science including theory, modeling, simulation and experimentation to replace the underground testing. The emphasis on modeling and simulation necessitated attention to the quality of these simulations. Sandia developed the PCMM (predictive capability maturity model) to structure this attention [Oberkampf]. PCMM divides simulation into six core activities to be examined and graded relative to the needs of the modeling activity. NASA [NASA] has built yet another framework in response to the tragedy of the space shuttle accidents. Finally, Ben-Haim and Hemez focus upon modeling robustness and predictive fidelity in another approach. These frameworks are similar and applied in a similar fashion. The adoption of these frameworks at Sandia and NASA has been slow and arduous because the force of law has not assisted acceptance. All existing frameworks are

  1. Antero-posterior (AP) pelvis x-ray imaging on a trolley: Impact of trolley design, mattress design and radiographer practice on image quality and radiation dose

    International Nuclear Information System (INIS)

    Tugwell, J.R.; England, A.; Hogg, P.

    2017-01-01

    Introduction: Physical and technical differences exist between imaging on an x-ray tabletop and imaging on a trolley. This study evaluates how trolley imaging impacts image quality and radiation dose for an antero-posterior (AP) pelvis projection whilst subsequently exploring means of optimising this imaging examination. Methods: An anthropomorphic pelvis phantom was imaged on a commercially available trolley under various conditions. Variables explored included two mattresses, two image receptor holder positions, three source to image distances (SIDs) and four mAs values. Image quality was evaluated using relative visual grading analysis, with the reference image acquired on the x-ray tabletop. Contrast to noise ratio (CNR) was calculated. Effective dose was established using Monte Carlo simulation. Optimisation scores were derived as a figure of merit by dividing effective dose by visual image quality scores. Results: Visual image quality reduced significantly (p < 0.05) whilst effective dose increased significantly (p < 0.05) for images acquired on the trolley using identical acquisition parameters to the reference image. The trolley image with the highest optimisation score was acquired using 130 cm SID, 20 mAs, the standard mattress and the platform not elevated. A difference of 12.8 mm was found between the images with the lowest and highest magnification factors (18%). Conclusion: The acquisition parameters used for AP pelvis on the x-ray tabletop are not transferable to trolley imaging and should be modified to compensate for the differences that exist. Exposure charts should be developed for trolley imaging to ensure optimal image quality at the lowest possible dose. - Highlights: • Acquisition parameters used for AP pelvis imaging on a trolley need adapting from those used on the x-ray tabletop. • Radiation dose significantly increases for trolley imaging. • An increase in SID can reduce radiation dose and magnification for trolley imaging
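
    The figure-of-merit idea can be sketched briefly. The abstract's wording (dose divided by image quality, yet the "highest" score being best) leaves the orientation of the ratio ambiguous, so the direction chosen below, and all of the numbers, are assumptions for illustration only.

```python
def optimisation_score(visual_iq_score, effective_dose_uSv):
    """Figure of merit relating visual image quality to effective dose (higher is better here)."""
    return visual_iq_score / effective_dose_uSv

# Hypothetical trolley set-ups: (relative visual IQ score, effective dose in microsieverts).
setups = {
    "130 cm SID, 20 mAs, standard mattress, platform down": (0.92, 9.0),
    "115 cm SID, 25 mAs, pressure-relieving mattress":      (0.95, 13.5),
}
for name, (iq, dose) in setups.items():
    print(f"{name}: score = {optimisation_score(iq, dose):.3f}")
```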

  2. Multiple-point statistical simulation for hydrogeological models: 3-D training image development and conditioning strategies

    Science.gov (United States)

    Høyer, Anne-Sophie; Vignoli, Giulio; Mejer Hansen, Thomas; Thanh Vu, Le; Keefer, Donald A.; Jørgensen, Flemming

    2017-12-01

    Most studies on the application of geostatistical simulations based on multiple-point statistics (MPS) to hydrogeological modelling focus on relatively fine-scale models and concentrate on the estimation of facies-level structural uncertainty. Much less attention is paid to the use of input data and optimal construction of training images. For instance, even though the training image should capture a set of spatial geological characteristics to guide the simulations, the majority of the research still relies on 2-D or quasi-3-D training images. In the present study, we demonstrate a novel strategy for 3-D MPS modelling characterized by (i) realistic 3-D training images and (ii) an effective workflow for incorporating a diverse group of geological and geophysical data sets. The study covers an area of 2810 km2 in the southern part of Denmark. MPS simulations are performed on a subset of the geological succession (the lower to middle Miocene sediments) which is characterized by relatively uniform structures and dominated by sand and clay. The simulated domain is large and each of the geostatistical realizations contains approximately 45 million voxels with size 100 m × 100 m × 5 m. Data used for the modelling include water well logs, high-resolution seismic data, and a previously published 3-D geological model. We apply a series of different strategies for the simulations based on data quality, and develop a novel method to effectively create observed spatial trends. The training image is constructed as a relatively small 3-D voxel model covering an area of 90 km2. We use an iterative training image development strategy and find that even slight modifications in the training image create significant changes in simulations. Thus, this study shows how to include both the geological environment and the type and quality of input information in order to achieve optimal results from MPS modelling. We present a practical workflow to build the training image and

  3. Indicators of image quality and doses in mammography

    International Nuclear Information System (INIS)

    Gaona, E.; Franco E, J.G.; Azorin N, J.; Diaz G, J.A.I.; Arreola, M.

    2007-01-01

    Full text: The purpose of the study was to determine the values of the image quality indicators, and their relationship with dose, that allow the largest number of objects to be detected in screen-film mammography images of the ACR/FDA-accredited phantom. The study was carried out in four mammography services over a period of 12 months. The image quality indicators are the mean optical density (DOM), the contrast (density difference), the number of objects observed in the images and the dose per image. The minimum values accepted by the ACR/FDA are a mean optical density of 1.4, a contrast of 0.4 and a minimum of 10 objects observed in the image of the mammography phantom (4 fibers, 3 groups of calcifications and 3 masses), with a maximum dose per image of 3 mGy. The results found are a mean optical density of 1.9, a contrast of 0.56 and an average of 12 objects observed in the images, with doses in the interval from 1.6 mGy to 2.4 mGy. Doses were measured by thermoluminescent dosimetry and with an ionization chamber. Analysis of the trends of the image quality indicators and their distributions shows that, for p < 0.05, the largest number of objects is observed in the DOM interval from 1.8 to 1.9. When the two mammography systems are compared, the screen-film system has the lower variability in the distribution of objects associated with a given DOM. (Author)

  4. Dose and diagnostic image quality in digital tomosynthesis imaging of facial bones in pediatrics

    Science.gov (United States)

    King, J. M.; Hickling, S.; Elbakri, I. A.; Reed, M.; Wrogemann, J.

    2011-03-01

    The purpose of this study was to evaluate the use of digital tomosynthesis (DT) for pediatric facial bone imaging. We compared the eye lens dose and diagnostic image quality of DT facial bone exams relative to digital radiography (DR) and computed tomography (CT), and investigated whether we could modify our current DT imaging protocol to reduce patient dose while maintaining sufficient diagnostic image quality. We measured the dose to the eye lens for all three modalities using high-sensitivity thermoluminescent dosimeters (TLDs) and an anthropomorphic skull phantom. To assess the diagnostic image quality of DT compared to the corresponding DR and CT images, we performed an observer study where the visibility of anatomical structures in the DT phantom images was rated on a four-point scale. We then acquired DT images at lower doses and had radiologists indicate whether the visibility of each structure was adequate for diagnostic purposes. For typical facial bone exams, we measured eye lens doses of 0.1-0.4 mGy for DR, 0.3-3.7 mGy for DT, and 26 mGy for CT. In general, facial bone structures were visualized better with DT than with DR, and the majority of structures were visualized well enough to avoid the need for CT. DT imaging provides high quality diagnostic images of the facial bones while delivering significantly lower doses to the lens of the eye compared to CT. In addition, we found that by adjusting the imaging parameters, the DT effective dose can be reduced by up to 50% while maintaining sufficient image quality.

  5. Studying and simulating transformer configuration to improve power quality

    Directory of Open Access Journals (Sweden)

    Oscar J. Peña Huaringa

    2011-06-01

    Full Text Available This paper presents a study and simulation of transformer configurations to improve power quality; it provides theoretical support based on the expansion of the Fourier series and analysis of symmetrical components. A test system was set up in the laboratory, taking measurements and checking configuration effectiveness in reducing the system’s harmonic content. The configurations were modelled with PSCAD / EMTDC software, using two 6 pulse rectifiers as test loads and two variable speed drives.

  6. Image quality and dose assessment in digital breast tomosynthesis: A Monte Carlo study

    International Nuclear Information System (INIS)

    Baptista, M.; Di Maria, S.; Oliveira, N.; Matela, N.; Janeiro, L.; Almeida, P.; Vaz, P.

    2014-01-01

    Mammography is considered a standard technique for the early detection of breast cancer. However, its sensitivity is limited essentially due to the issue of the overlapping breast tissue. This limitation can be partially overcome, with a relatively new technique, called digital breast tomosynthesis (DBT). For this technique, optimization of acquisition parameters which maximize image quality, whilst complying with the ALARA principle, continues to be an area of considerable research. The aim of this work was to study the best quantum energies that optimize the image quality with the lowest achievable dose in DBT and compare these results with the digital mammography (DM) ones. Monte Carlo simulations were performed using the state-of-the-art computer program MCNPX 2.7.0 in order to generate several 2D cranio-caudal (CC) projections obtained during an acquisition of a standard DBT examination. Moreover, glandular absorbed doses and photon flux calculations, for each projection image, were performed. A homogeneous breast computational phantom with 50%/50% glandular/adipose tissue composition was used and two compressed breast thicknesses were evaluated: 4 cm and 8 cm. The simulated projection images were afterwards reconstructed with an algebraic reconstruction tool and the signal difference to noise ratio (SDNR) was calculated in order to evaluate the image quality in DBT and DM. Finally, a thorough comparison between the results obtained in terms of SDNR and dose assessment in DBT and DM was performed. - Highlights: • Optimization of the image quality in digital breast tomosynthesis. • Calculation of photon energies that maximize the signal difference to noise ratio. • Projections images and dose calculations through the Monte Carlo (MC) method. • Tumor masses and microcalcifications included in the MC model. • A dose saving of about 30% can be reached if optimal photon energies are used
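
    As a rough illustration of the signal difference to noise ratio used above, the snippet below computes SDNR from two regions of interest of a reconstructed slice; the synthetic arrays stand in for real reconstructions and are not the study's data.

    ```python
    import numpy as np

    def sdnr(lesion_roi: np.ndarray, background_roi: np.ndarray) -> float:
        """Signal difference to noise ratio between a lesion ROI and the surrounding background."""
        return abs(lesion_roi.mean() - background_roi.mean()) / background_roi.std()

    # Synthetic stand-ins for ROIs taken from a reconstructed DBT or DM slice
    rng = np.random.default_rng(0)
    background = rng.normal(loc=100.0, scale=5.0, size=(20, 20))
    lesion = rng.normal(loc=112.0, scale=5.0, size=(10, 10))
    print(f"SDNR = {sdnr(lesion, background):.2f}")
    ```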

  7. Image quality enhancement for skin cancer optical diagnostics

    Science.gov (United States)

    Bliznuks, Dmitrijs; Kuzmina, Ilona; Bolocko, Katrina; Lihachev, Alexey

    2017-12-01

    The research presents an image quality analysis and enhancement proposals in the biophotonics area. The sources of image problems are reviewed and analyzed, and the problems with the greatest impact are examined in the context of a specific biophotonic task - skin cancer diagnostics. The results indicate that the main problem for skin cancer analysis is uneven skin illumination. Since illumination problems often cannot be prevented at acquisition, the paper proposes an image post-processing algorithm - low-frequency filtering. Practical results show an improvement in diagnostic results after applying the proposed filter. Moreover, the filter does not reduce the diagnostic quality of images without illumination defects. The current filtering algorithm requires empirical tuning of the filter parameters. Further work is needed to test the algorithm in other biophotonic applications and to propose automatic filter parameter selection.
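
    The low-frequency filtering idea can be sketched as a simple illumination-correction step: estimate the slowly varying illumination with a large Gaussian blur and divide it out. This is only a plausible reconstruction of the kind of filter described, not the authors' implementation, and the sigma value would need the empirical tuning mentioned above.

    ```python
    import numpy as np
    from scipy.ndimage import gaussian_filter

    def remove_low_frequency_illumination(image: np.ndarray, sigma: float = 50.0) -> np.ndarray:
        """Suppress slowly varying illumination by dividing by a heavily blurred copy of the image."""
        img = image.astype(np.float64)
        illumination = gaussian_filter(img, sigma=sigma) + 1e-6  # avoid division by zero
        corrected = img / illumination
        # Rescale back to the original mean intensity for display/analysis
        return corrected * (img.mean() / corrected.mean())
    ```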

  8. IMPROVING THE QUALITY OF NEAR-INFRARED IMAGING OF IN VIVO BLOOD VESSELS USING IMAGE FUSION METHODS

    DEFF Research Database (Denmark)

    Jensen, Andreas Kryger; Savarimuthu, Thiusius Rajeeth; Sørensen, Anders Stengaard

    2009-01-01

    We investigate methods for improving the visual quality of in vivo images of blood vessels in the human forearm. Using a near-infrared light source and a dual CCD chip camera system capable of capturing images at visual and near-infrared spectra, we evaluate three fusion methods in terms of their capability of enhancing the blood vessels while preserving the spectral signature of the original color image. Furthermore, we investigate a possibility of removing hair in the images using a fusion rule based on the "a trous" stationary wavelet decomposition. The method with the best overall performance, with both speed and quality in mind, is the Intensity Injection method. Using the developed system and the methods presented in this article, it is possible to create images of high visual quality with highly emphasized blood vessels.

  9. Image Quality in High-resolution and High-cadence Solar Imaging

    Science.gov (United States)

    Denker, C.; Dineva, E.; Balthasar, H.; Verma, M.; Kuckein, C.; Diercke, A.; González Manrique, S. J.

    2018-03-01

    Broad-band imaging and even imaging with a moderate bandpass (about 1 nm) provides a photon-rich environment, where frame selection (lucky imaging) becomes a helpful tool in image restoration, allowing us to perform a cost-benefit analysis on how to design observing sequences for imaging with high spatial resolution in combination with real-time correction provided by an adaptive optics (AO) system. This study presents high-cadence (160 Hz) G-band and blue continuum image sequences obtained with the High-resolution Fast Imager (HiFI) at the 1.5-meter GREGOR solar telescope, where the speckle-masking technique is used to restore images with nearly diffraction-limited resolution. The HiFI employs two synchronized large-format and high-cadence sCMOS detectors. The median filter gradient similarity (MFGS) image-quality metric is applied, among others, to AO-corrected image sequences of a pore and a small sunspot observed on 2017 June 4 and 5. A small region of interest, which was selected for fast-imaging performance, covered these contrast-rich features and their neighborhood, which were part of Active Region NOAA 12661. Modifications of the MFGS algorithm uncover the field- and structure-dependency of this image-quality metric. However, MFGS still remains a good choice for determining image quality without a priori knowledge, which is an important characteristic when classifying the huge number of high-resolution images contained in data archives. In addition, this investigation demonstrates that a fast cadence and millisecond exposure times are still insufficient to reach the coherence time of daytime seeing. Nonetheless, the analysis shows that data acquisition rates exceeding 50 Hz are required to capture a substantial fraction of the best seeing moments, significantly boosting the performance of post-facto image restoration.
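
    A minimal sketch of an MFGS-style computation is given below; it follows the commonly cited formulation (gradient similarity between an image and its median-filtered copy) and is only an approximation of the metric and the modifications used in the paper.

    ```python
    import numpy as np
    from scipy.ndimage import median_filter, sobel

    def _gradient_magnitude(img: np.ndarray) -> np.ndarray:
        return np.hypot(sobel(img, axis=0), sobel(img, axis=1))

    def mfgs(image: np.ndarray, size: int = 3, eps: float = 1e-12) -> float:
        """Median filter gradient similarity: values near 1 mean median filtering barely
        changes the gradients (good seeing / low noise); noisier frames score lower."""
        img = image.astype(np.float64)
        g_img = _gradient_magnitude(img)
        g_med = _gradient_magnitude(median_filter(img, size=size))
        return float(np.mean(2.0 * g_med * g_img / (g_med ** 2 + g_img ** 2 + eps)))
    ```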

  10. Effects of sparse sampling schemes on image quality in low-dose CT

    International Nuclear Information System (INIS)

    Abbas, Sajid; Lee, Taewon; Cho, Seungryong; Shin, Sukyoung; Lee, Rena

    2013-01-01

    Purpose: Various scanning methods and image reconstruction algorithms are actively investigated for low-dose computed tomography (CT) that can potentially reduce a health-risk related to radiation dose. Particularly, compressive-sensing (CS) based algorithms have been successfully developed for reconstructing images from sparsely sampled data. Although these algorithms have shown promises in low-dose CT, it has not been studied how sparse sampling schemes affect image quality in CS-based image reconstruction. In this work, the authors present several sparse-sampling schemes for low-dose CT, quantitatively analyze their data property, and compare effects of the sampling schemes on the image quality. Methods: Data properties of several sampling schemes are analyzed with respect to the CS-based image reconstruction using two measures: sampling density and data incoherence. The authors present five different sparse sampling schemes, and simulated those schemes to achieve a targeted dose reduction. Dose reduction factors of about 75% and 87.5%, compared to a conventional scan, were tested. A fully sampled circular cone-beam CT data set was used as a reference, and sparse sampling has been realized numerically based on the CBCT data. Results: It is found that both sampling density and data incoherence affect the image quality in the CS-based reconstruction. Among the sampling schemes the authors investigated, the sparse-view, many-view undersampling (MVUS)-fine, and MVUS-moving cases have shown promising results. These sampling schemes produced images with similar image quality compared to the reference image and their structure similarity index values were higher than 0.92 in the mouse head scan with 75% dose reduction. Conclusions: The authors found that in CS-based image reconstructions both sampling density and data incoherence affect the image quality, and suggest that a sampling scheme should be devised and optimized by use of these indicators. With this strategic
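
    The structure similarity index values quoted above can in principle be reproduced with off-the-shelf tools; the snippet below compares a sparse-sampling reconstruction against the fully sampled reference (the array names are placeholders).

    ```python
    from skimage.metrics import structural_similarity

    def compare_to_reference(reference, reconstruction):
        """SSIM between a fully sampled reference slice and a sparse-sampling reconstruction."""
        data_range = float(reference.max() - reference.min())
        return structural_similarity(reference, reconstruction, data_range=data_range)

    # Usage sketch: the promising schemes reported SSIM values above ~0.92
    # score = compare_to_reference(full_scan_slice, mvus_fine_slice)
    ```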

  11. Radiation dose and image quality for paediatric interventional cardiology

    Science.gov (United States)

    Vano, E.; Ubeda, C.; Leyton, F.; Miranda, P.

    2008-08-01

    Radiation dose and image quality for paediatric protocols in a biplane x-ray system used for interventional cardiology have been evaluated. Entrance surface air kerma (ESAK) and image quality using a test object and polymethyl methacrylate (PMMA) phantoms have been measured for the typical paediatric patient thicknesses (4-20 cm of PMMA). Images from fluoroscopy (low, medium and high) and cine modes have been archived in digital imaging and communications in medicine (DICOM) format. Signal-to-noise ratio (SNR), figure of merit (FOM), contrast (CO), contrast-to-noise ratio (CNR) and high contrast spatial resolution (HCSR) have been computed from the images. Data on dose transferred to the DICOM header have been used to test the values of the dosimetric display at the interventional reference point. ESAK for fluoroscopy modes ranges from 0.15 to 36.60 µGy/frame when moving from 4 to 20 cm PMMA. For cine, these values range from 2.80 to 161.10 µGy/frame. SNR, FOM, CO, CNR and HCSR are improved for high fluoroscopy and cine modes and maintained roughly constant for the different thicknesses. Cumulative dose at the interventional reference point resulted 25-45% higher than the skin dose for the vertical C-arm (depending of the phantom thickness). ESAK and numerical image quality parameters allow the verification of the proper setting of the x-ray system. Knowing the increases in dose per frame when increasing phantom thicknesses together with the image quality parameters will help cardiologists in the good management of patient dose and allow them to select the best imaging acquisition mode during clinical procedures.

  12. Structural similarity image quality reliability: Determining parameters and window size

    OpenAIRE

    Silvestre-Blanes, Javier

    2011-01-01

    The need to obtain objective values of the quality of distorted images with respect to the original is fundamental in multimedia and image processing applications. It is generally required that this value correlates well with the human vision system (HVS). In spite of the properties and the general use of the mean square error (MSE) measurement, this has a poor correlation with the HVS, which has led to the development of methods such as structural similarity (SSIM). This metric improves the corr...
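
    To make the parameters the abstract refers to concrete, the sketch below computes SSIM for a single local window with the stabilising constants written out; the image-level score is the mean of this quantity over sliding windows of the chosen size, which is exactly the window-size question the study examines. The constants shown are the usual defaults, not necessarily those recommended by the paper.

    ```python
    import numpy as np

    def ssim_window(x: np.ndarray, y: np.ndarray, k1: float = 0.01, k2: float = 0.03,
                    data_range: float = 255.0) -> float:
        """SSIM for one local window:
        ((2*mu_x*mu_y + C1) * (2*cov_xy + C2)) /
        ((mu_x^2 + mu_y^2 + C1) * (var_x + var_y + C2))
        """
        c1, c2 = (k1 * data_range) ** 2, (k2 * data_range) ** 2
        mu_x, mu_y = x.mean(), y.mean()
        var_x, var_y = x.var(), y.var()
        cov_xy = float(((x - mu_x) * (y - mu_y)).mean())
        return ((2 * mu_x * mu_y + c1) * (2 * cov_xy + c2)) / \
               ((mu_x ** 2 + mu_y ** 2 + c1) * (var_x + var_y + c2))
    ```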

  13. Developing 3D Imaging Programmes-Workflow and Quality Control

    OpenAIRE

    Hess, M.; Robson, S.; Serpico, M.; Amati, G.; Pridden, I.; Nelson, T.

    2016-01-01

    This article reports on a successful project for 3D imaging research, digital applications, and use of new technologies in the museum. The article will focus on the development and implementation of a viable workflow for the production of high-quality 3D models of museum objects, based on the 3D laser scanning and photogrammetry of selected ancient Egyptian artefacts. The development of a robust protocol for the complete process chain for imaging cultural heritage artefacts, from the acquisit...

  14. Radiation dose and image quality for paediatric interventional cardiology

    Energy Technology Data Exchange (ETDEWEB)

    Vano, E [Radiology Department, Medicine School, Complutense University and San Carlos University Hospital, 28040 Madrid (Spain); Ubeda, C [Clinical Sciences Department, Faculty of the Science of Health, Tarapaca University, 18 de Septiembre 2222, Arica (Chile); Leyton, F [Institute of Public Health of Chile, Marathon 1000, Nunoa, Santiago (Chile); Miranda, P [Hemodynamic Department, Cardiovascular Service, Luis Calvo Mackenna Hospital, Avenida Antonio Varas 360, Providencia, Santiago (Chile)], E-mail: eliseov@med.ucm.es

    2008-08-07

    Radiation dose and image quality for paediatric protocols in a biplane x-ray system used for interventional cardiology have been evaluated. Entrance surface air kerma (ESAK) and image quality using a test object and polymethyl methacrylate (PMMA) phantoms have been measured for the typical paediatric patient thicknesses (4-20 cm of PMMA). Images from fluoroscopy (low, medium and high) and cine modes have been archived in digital imaging and communications in medicine (DICOM) format. Signal-to-noise ratio (SNR), figure of merit (FOM), contrast (CO), contrast-to-noise ratio (CNR) and high contrast spatial resolution (HCSR) have been computed from the images. Data on dose transferred to the DICOM header have been used to test the values of the dosimetric display at the interventional reference point. ESAK for fluoroscopy modes ranges from 0.15 to 36.60 µGy/frame when moving from 4 to 20 cm PMMA. For cine, these values range from 2.80 to 161.10 µGy/frame. SNR, FOM, CO, CNR and HCSR are improved for high fluoroscopy and cine modes and maintained roughly constant for the different thicknesses. Cumulative dose at the interventional reference point resulted 25-45% higher than the skin dose for the vertical C-arm (depending of the phantom thickness). ESAK and numerical image quality parameters allow the verification of the proper setting of the x-ray system. Knowing the increases in dose per frame when increasing phantom thicknesses together with the image quality parameters will help cardiologists in the good management of patient dose and allow them to select the best imaging acquisition mode during clinical procedures.

  15. Exploratory survey of image quality on CR digital mammography imaging systems in Mexico

    International Nuclear Information System (INIS)

    Gaona, E.; Rivera, T.; Arreola, M.; Franco, J.; Molina, N.; Alvarez, B.; Azorín, C.G.; Casian, G.

    2014-01-01

    The purpose of this study was to assess the current status of image quality and dose in computed radiographic digital mammography (CRDM) systems. The study included CRDM systems of various models and manufacturers for which dose and image quality comparisons were performed. Due to the recent rise in the use of digital radiographic systems in Mexico, CRDM systems are rapidly replacing conventional film-screen systems without any regard to quality control or image quality standards. The study was conducted in 65 mammography facilities that use CRDM systems in Mexico City and the surrounding States. The systems were tested as used clinically. This means that the dose and beam qualities were selected using the automatic beam selection and photo-timed features. All systems surveyed generate laser film hardcopies for the radiologist to read on a scope or a mammographic high-luminance light box. It was found that 51 of the CRDM systems presented a variety of image artefacts and non-uniformities arising from inadequate acquisition and processing, as well as from the laser printer itself. Undisciplined alteration of image processing settings by the technologist was found to be a serious and prevalent problem in 42 facilities. Only four of them had an image QC program periodically monitored by a medical physicist. The average glandular dose (AGD) in the surveyed systems was estimated to have a mean value of 2.4 mGy. New legislation is required to improve image quality in mammography and to make screening mammography more efficient for the early detection of breast cancer. - Highlights: • Radiation dose in CR digital mammography (CRDM) systems was determined. • Image quality related to dose in CR digital mammography (CRDM) systems was analysed. • Image processing artefacts were observed and correlated with dose. • Entrance dose measured with TL phosphors could be a good parameter for radiation protection optimization in patients

  16. PET/CT Atlas on Quality Control and Image Artefacts

    International Nuclear Information System (INIS)

    2014-01-01

    Combined positron emission tomography (PET)/computed tomography (CT) imaging has become a routine procedure in diagnostic radiology and nuclear medicine. The clinical review of both PET and PET/CT images requires a thorough understanding of the basics of image formation as well as an appreciation of variations of inter-patient and intra-patient image appearance. Such variations may be caused by variations in tracer accumulation and metabolism, and, perhaps more importantly, by image artefacts related to methodological pitfalls of the two modalities. This atlas on quality control (QC) and PET/CT artefacts provides guidance on typical image distortions in clinical PET/CT usage scenarios. A number of cases are presented to provide nuclear medicine and radiology professionals with an assortment of examples of possible image distortions and errors in order to support the correct interpretation of images. About 70 typical PET and PET/CT cases, comprised of image sets and cases, have been collected in this book, and all have been catalogued and have explanations as to the causes of and solutions to each individual image problem. This atlas is intended to be used as a guide on how to take proper QC measures, on performing situation and problem analysis, and on problem prevention. This book will be especially useful to medical physicists, physicians, technologists and service engineers in the clinical field

  17. Imaging Performance Analysis of Simbol-X with Simulations

    Science.gov (United States)

    Chauvin, M.; Roques, J. P.

    2009-05-01

    Simbol-X is an X-ray telescope operating in formation flight. This means that its optical performance will strongly depend on the drift between the two spacecraft and on its ability to measure these drifts for image reconstruction. We built a dynamical ray-tracing code to study the impact of these parameters on the optical performance of Simbol-X (see Chauvin et al., these proceedings). Using the simulation tool we have developed, we have conducted detailed analyses of the impact of different parameters on the imaging performance of the Simbol-X telescope.

  18. Imaging Performance Analysis of Simbol-X with Simulations

    International Nuclear Information System (INIS)

    Chauvin, M.; Roques, J. P.

    2009-01-01

    Simbol-X is an X-ray telescope operating in formation flight. This means that its optical performance will strongly depend on the drift between the two spacecraft and on its ability to measure these drifts for image reconstruction. We built a dynamical ray-tracing code to study the impact of these parameters on the optical performance of Simbol-X (see Chauvin et al., these proceedings). Using the simulation tool we have developed, we have conducted detailed analyses of the impact of different parameters on the imaging performance of the Simbol-X telescope.

  19. Application of image simulation in weapon system development

    CSIR Research Space (South Africa)

    Willers, CJ

    2007-09-01

    Full Text Available ... systems. Index Terms: image simulation, scene modelling, weapon evaluation, infrared. I. Introduction: Simulation is used increasingly to support military system development throughout all the product life cycle phases, from concept analysis ... the theoretical models. The signature ... [Figure: atmospheric transmittance versus wavelength (0-14 µm) for a path length of 10 000 m, comparing a sub-arctic summer atmosphere (14 °C ambient, 75% RH, Navy maritime aerosol, 23 km visibility) with a very high humidity case (35 °C ...)]

  20. Dark Energy Studies with LSST Image Simulations, Final Report

    International Nuclear Information System (INIS)

    Peterson, John Russell

    2016-01-01

    This grant funded the development and dissemination of the Photon Simulator (PhoSim) for the purpose of studying dark energy at high precision with the upcoming Large Synoptic Survey Telescope (LSST) astronomical survey. The work was in collaboration with the LSST Dark Energy Science Collaboration (DESC). Several detailed physics improvements were made in the optics, atmosphere, and sensor, a number of validation studies were performed, and a significant number of usability features were implemented. Future work in DESC will use PhoSim as the image simulation tool for data challenges used by the analysis groups.

  1. A factorial experiment on image quality and radiation dose

    International Nuclear Information System (INIS)

    Norrman, E.; Persliden, J.

    2005-01-01

    To find if factorial experiments can be used in the optimisation of diagnostic imaging, a factorial experiment was performed to investigate some of the factors that influence image quality, kerma area product (KAP) and effective dose (E). In a factorial experiment the factors are varied together instead of one at a time, making it possible to discover interactions between the factors as well as major effects. The factors studied were tube potential, tube loading, focus size and filtration. Each factor was set to two levels (low and high). The influence of the factors on the response variables (image quality, KAP and E) was studied using a direct digital detector. The major effects of each factor on the response variables were estimated as well as the interaction effects between factors. The image quality, KAP and E were mainly influenced by tube loading, tube potential and filtration. There were some active interactions, for example, between tube potential and filtration and between tube loading and filtration. The study shows that factorial experiments can be used to predict the influence of various parameters on image quality and radiation dose. (authors)
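
    A coded two-level full factorial design of the kind described can be generated and analysed in a few lines; the factor names below mirror the abstract, while the response values (image quality, KAP or E) are placeholders to be filled with measurements.

    ```python
    from itertools import product
    import numpy as np

    FACTORS = ["tube_potential", "tube_loading", "focus_size", "filtration"]

    # Coded design matrix: -1 = low level, +1 = high level, 2**4 = 16 runs
    design = np.array(list(product([-1, 1], repeat=len(FACTORS))))

    def main_effects(design: np.ndarray, response: np.ndarray) -> dict:
        """Main effect of each factor: mean response at the high level minus at the low level."""
        return {name: float(response[design[:, j] == 1].mean() - response[design[:, j] == -1].mean())
                for j, name in enumerate(FACTORS)}

    def interaction_effect(design: np.ndarray, response: np.ndarray, i: int, j: int) -> float:
        """Two-factor interaction, estimated from the product column of the coded design."""
        prod_col = design[:, i] * design[:, j]
        return float(response[prod_col == 1].mean() - response[prod_col == -1].mean())

    # response = np.array([...])  # e.g. measured KAP for each of the 16 runs
    # print(main_effects(design, response))
    ```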

  2. Self-Organizing Maps for Fingerprint Image Quality Assessment

    DEFF Research Database (Denmark)

    Olsen, Martin Aastrup; Tabassi, Elham; Makarov, Anton

    2013-01-01

    Fingerprint quality assessment is a crucial task which needs to be conducted accurately in various phases in the biometric enrolment and recognition processes. Neglecting quality measurement will adversely impact accuracy and efficiency of biometric recognition systems (e.g. verification and identification) ... machine learning techniques. We train a self-organizing map (SOM) to cluster blocks of fingerprint images based on their spatial information content. The output of the SOM is a high-level representation of the finger image, which forms the input to a Random Forest trained to learn the relationship between...
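
    A rough sketch of a SOM-plus-Random-Forest pipeline of the kind described is shown below, using the third-party MiniSom package and scikit-learn; the map size, block features and the histogram-style image descriptor are assumptions made for illustration, not the authors' configuration.

    ```python
    import numpy as np
    from minisom import MiniSom                       # third-party SOM implementation
    from sklearn.ensemble import RandomForestRegressor

    MAP_W, MAP_H = 8, 8   # SOM grid size (placeholder)

    def train_som(all_block_features: np.ndarray) -> MiniSom:
        """Cluster local fingerprint image blocks by their spatial information content."""
        som = MiniSom(MAP_W, MAP_H, all_block_features.shape[1], sigma=1.0, learning_rate=0.5)
        som.train_random(all_block_features, num_iteration=10000)
        return som

    def image_descriptor(som: MiniSom, image_blocks: np.ndarray) -> np.ndarray:
        """High-level representation of one finger image: histogram of winning SOM nodes."""
        hist = np.zeros(MAP_W * MAP_H)
        for block in image_blocks:
            i, j = som.winner(block)
            hist[i * MAP_H + j] += 1
        return hist / max(len(image_blocks), 1)

    # A Random Forest then learns the relation between this representation and a utility target:
    # forest = RandomForestRegressor(n_estimators=100).fit(descriptors, utility_scores)
    ```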

  3. The dose and image quality in mammography in Macedonia

    International Nuclear Information System (INIS)

    Gershan, V.

    2007-01-01

    Complete text of publication follows. Mean glandular dose (MGD), dose distribution, and phantom and real mammogram image quality were studied for the first time in Macedonia. The study was conducted to review the condition of the mammography equipment and to assess the dose and image quality in mammography practice in Macedonia. The purpose was to find the weak points in mammography practice in order to suggest improvements and establish quality control procedures. Twelve mammography machines were evaluated. MGD was estimated using the entrance surface air kerma at the breast surface, K_f, measured free in air, and appropriate conversion factors. The dose survey was carried out by measuring and calculating the HVLs and the radiation output for 25-32 kVp and by keeping a record of the clinical parameters (breast thickness, kVp, mAs). Image quality was evaluated using the Mammographic Accreditation Phantom Gammex 156, PMMA plates and a test tool for film processing.
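
    The MGD estimate described is simply the measured entrance surface air kerma multiplied by tabulated conversion factors; a minimal sketch follows, with an illustrative factor (the real factors depend on HVL, tube voltage and compressed breast thickness and must be taken from the published tables).

    ```python
    def mean_glandular_dose(entrance_air_kerma_mgy: float, conversion_factor: float) -> float:
        """MGD = K_f x conversion factor (e.g. the combined Dance-type g, c, s factors).

        The conversion factor used below is purely illustrative; it must be looked up
        for the measured HVL and the compressed breast thickness.
        """
        return entrance_air_kerma_mgy * conversion_factor

    print(mean_glandular_dose(entrance_air_kerma_mgy=7.0, conversion_factor=0.2))  # -> 1.4 (mGy)
    ```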

  4. Dual-source CT coronary imaging in heart transplant recipients: image quality and optimal reconstruction interval

    International Nuclear Information System (INIS)

    Bastarrika, Gorka; Arraiza, Maria; Pueyo, Jesus C.; Cecco, Carlo N. de; Ubilla, Matias; Mastrobuoni, Stefano; Rabago, Gregorio

    2008-01-01

    The image quality and optimal reconstruction interval for coronary arteries in heart transplant recipients undergoing non-invasive dual-source computed tomography (DSCT) coronary angiography was evaluated. Twenty consecutive heart transplant recipients who underwent DSCT coronary angiography were included (19 male, one female; mean age 63.1±10.7 years). Data sets were reconstructed in 5% steps from 30% to 80% of the R-R interval. Two blinded independent observers assessed the image quality of each coronary segment using a five-point scale (from 0 = not evaluable to 4 = excellent quality). A total of 289 coronary segments in 20 heart transplant recipients were evaluated. Mean heart rate during the scan was 89.1±10.4 bpm. At the best reconstruction interval, diagnostic image quality (score ≥2) was obtained in 93.4% of the coronary segments (270/289) with a mean image quality score of 3.04±0.63. Systolic reconstruction intervals provided better image quality scores than diastolic reconstruction intervals (overall mean quality scores obtained with the systolic and diastolic reconstructions 3.03±1.06 and 2.73±1.11, respectively; P<0.001). Different systolic reconstruction intervals (35%, 40%, 45% of RR interval) did not yield significant differences in image quality scores for the coronary segments (P=0.74). Reconstructions obtained at the systolic phase of the cardiac cycle allowed excellent diagnostic image quality coronary angiograms in heart transplant recipients undergoing DSCT coronary angiography. (orig.)

  5. Optimization of image quality and patient dose in mammography

    International Nuclear Information System (INIS)

    Shafqat Faaruq; Jaferi, R.A.; Nafeesa Nazlee

    2007-01-01

    Complete text of publication follows. Optimization of patient dose and image quality can be defined as obtaining the best image quality with the minimum possible radiation dose to the patient by setting the various parameters and modes of operation available in mammography machines. The optimization procedures were performed on two mammography units, from M/S GE and Metaltronica, available at NORI, using a standard mammographic accreditation phantom (Model: BR-156) and acrylic sheets of variable thickness. Quality assurance and quality control (QC) tests are an essential part of optimization. The QC tests recommended by the American College of Radiology were first performed on both machines as well as on the X-ray film processor. In the second step, the different parameters affecting image quality and patient radiation dose, such as film-screen combination (FSC), phantom optical density (PD), kVp and mAs, were adjusted for phantom thicknesses ranging from 3 cm to 6.5 cm in the various modes of operation of the machines (semi-auto and manual in the GE unit; auto, semi-auto and manual in the Metaltronica unit). Image quality was assessed for these optimized parameters on the basis of the number of phantom test objects visible in the images. Finally, the linear relationship between mAs and skin entrance dose (mGy) was verified using an ionization chamber with the phantom and with actual patients. Despite some practical limitations, the results of the quality assurance tests were within the acceptable limits defined by the ACR. The dose factor was 68.0 mGy/mAs for the GE unit and 76.0 mGy/mAs for the Metaltronica unit at 25 kVp. Before the start of this study only one mammography unit, the GE, was routinely used at NORI, and its normal mode of operation was the semi-auto mode with a fixed kVp independent of compressed breast thickness; in this study it was concluded that selecting the kVp according to breast thickness results in an appreciable dose reduction (4-5 times less) without any compromise in image quality. The
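
    The linear relationship between mAs and skin entrance dose that was verified here can be checked with a simple least-squares fit; the readings below are hypothetical placeholders, not measurements from this work.

    ```python
    import numpy as np

    # Hypothetical ionisation-chamber readings (mGy) at a fixed kVp for several mAs settings
    mas = np.array([20.0, 40.0, 63.0, 80.0, 100.0])
    entrance_dose_mgy = np.array([1.4, 2.7, 4.3, 5.5, 6.9])

    slope, intercept = np.polyfit(mas, entrance_dose_mgy, deg=1)
    predicted = slope * mas + intercept
    r_squared = 1.0 - np.sum((entrance_dose_mgy - predicted) ** 2) / \
                      np.sum((entrance_dose_mgy - entrance_dose_mgy.mean()) ** 2)

    print(f"dose factor = {slope:.3f} mGy/mAs, intercept = {intercept:.2f} mGy, R^2 = {r_squared:.4f}")
    ```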

  6. CCD Astrophotography High-Quality Imaging from the Suburbs

    CERN Document Server

    Stuart, Adam

    2006-01-01

    This is a reference book for amateur astronomers who have become interested in CCD imaging. Those glorious astronomical images found in astronomy magazines might seem out of reach to newcomers to CCD imaging, but this is not the case. Great pictures are attainable with modest equipment. Adam Stuart’s many beautiful images, reproduced in this book, attest to the quality of – initially – a beginner’s efforts. Chilled-chip astronomical CCD-cameras and software are also wonderful tools for cutting through seemingly impenetrable light-pollution. CCD Astrophotography from the Suburbs describes one man’s successful approach to the problem of getting high-quality astronomical images under some of the most light-polluted conditions. Here is a complete and thoroughly tested program that will help every CCD-beginner to work towards digital imaging of the highest quality. It is equally useful to astronomers who have perfect observing conditions, as to those who have to observe from light-polluted city skies.

  7. Quality assurance of imaging instruments for nuclear medicine

    International Nuclear Information System (INIS)

    Sera, T.; Csernay, L.

    1993-01-01

    Advanced quality control and assurance techniques for imaging instrumentation used in medical diagnosis are overviewed. The measurement systems for the homogeneity, linearity, geometrical resolution, energy resolution, sensitivity and pulse yield output of gamma camera detectors are presented in detail. The two most important quality control standards, the National Electrical Manufacturers' Association (NEMA) and the International Atomic Energy Agency standards and tests are described. Their use in gamma camera calibration is proposed. (R.P.) 22 refs.; 1 tabs

  8. Analysis and Comparison of Objective Methods for Image Quality Assessment

    Directory of Open Access Journals (Sweden)

    P. S. Babkin

    2014-01-01

    Full Text Available The purpose of this work is the investigation and modification of reference objective methods for image quality assessment. The ultimate goal is to obtain a modification of the formal assessments that corresponds more closely to subjective expert estimates (MOS). In reviewing the formal reference objective methods for image quality assessment we used the results of other authors, who offer results and comparative analyses of the most effective algorithms. Based on these investigations we chose the two most successful algorithms, PQS and MSSSIM, for which a further analysis was made in MATLAB 7.8 (R2009a). The publication focuses on features of the algorithms which are of great importance in practical implementation but are insufficiently covered in publications by other authors. In the implemented modification of the PQS algorithm, the Kirsch boundary detector was replaced by the Canny boundary detector. Further experiments were carried out according to the method of ITU-R BT.500-13 (01/2012) using monochrome images treated with different types of filters (it should be emphasized that the objective image quality assessment PQS is applicable only to monochrome images). The images were obtained with a thermal imaging surveillance system. The experimental results proved the effectiveness of this modification. This type of modification has not previously been described in the specialized literature on formal image quality evaluation methods. The method described in the publication can be applied to various practical implementations of digital image processing. The advisability and effectiveness of using the modified PQS method to assess structural differences between images are shown in the article, and this will be used in solving problems of identification and automatic control.
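
    The described modification (replacing the Kirsch boundary detector inside PQS with a Canny detector) can be sketched with OpenCV as below; the Canny thresholds are illustrative, and the surrounding PQS machinery is not reproduced.

    ```python
    import cv2
    import numpy as np

    def kirsch_kernels():
        """All 8 Kirsch compass masks, built by rotating the border coefficients in 45-degree steps."""
        border = np.array([5, 5, 5, -3, -3, -3, -3, -3], dtype=np.float32)
        positions = [(0, 0), (0, 1), (0, 2), (1, 2), (2, 2), (2, 1), (2, 0), (1, 0)]  # clockwise
        kernels = []
        for shift in range(8):
            k = np.zeros((3, 3), dtype=np.float32)
            for (r, c), v in zip(positions, np.roll(border, shift)):
                k[r, c] = v
            kernels.append(k)
        return kernels

    def kirsch_edges(gray: np.ndarray) -> np.ndarray:
        """Original boundary detector: maximum response over the 8 compass masks."""
        img = gray.astype(np.float32)
        return np.max([cv2.filter2D(img, -1, k) for k in kirsch_kernels()], axis=0)

    def canny_edges(gray: np.ndarray) -> np.ndarray:
        """Replacement boundary detector (expects an 8-bit grayscale image)."""
        return cv2.Canny(gray, threshold1=50, threshold2=150)
    ```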

  9. Presence capture cameras - a new challenge to the image quality

    Science.gov (United States)

    Peltoketo, Veli-Tapani

    2016-04-01

    Commercial presence capture cameras are coming to the market and a new era of visual entertainment is starting to take shape. Since true presence capturing is still a very new technology, the actual technical solutions have only just passed the prototyping phase and they vary a lot. Presence capture cameras still face the same quality issues as previous generations of digital imaging, but also numerous new ones. This work concentrates on the quality challenges of presence capture cameras. A camera system which can record 3D audio-visual reality as it is has to have several camera modules, several microphones and, especially, technology which can synchronize the output of several sources into a seamless and smooth virtual reality experience. Several traditional quality features are still valid in presence capture cameras. Features like color fidelity, noise removal, resolution and dynamic range form the basis of virtual reality stream quality. However, the co-operation of several cameras brings a new dimension to these quality factors, and new quality features can be validated. For example, how should the camera streams be stitched together into a 3D experience without noticeable errors, and how should the stitching be validated? The work describes the quality factors which are still valid in presence capture cameras and defines their importance. Moreover, new challenges of presence capture cameras are investigated from an image and video quality point of view. The work contains considerations of how well current measurement methods can be used with presence capture cameras.

  10. Simulation study of secondary electron images in scanning ion microscopy

    CERN Document Server

    Ohya, K

    2003-01-01

    The target atomic number (Z2) dependence of secondary electron yield is simulated by applying a Monte Carlo code for 17 species of metals bombarded by Ga ions and electrons in order to study the contrast difference between scanning ion microscopes (SIM) and scanning electron microscopes (SEM). In addition to the remarkable reversal of the Z2 dependence between the Ga ion and electron bombardment, a fine structure, which is correlated to the density of the conduction band electrons in the metal, is calculated for both. The brightness changes of the secondary electron images in SIM and SEM are simulated using Au and Al surfaces adjacent to each other. The results indicate that the image contrast in SIM is much more sensitive to the material species and is clearer than that for SEM. The origin of the difference between SIM and SEM comes from the difference in the lateral distribution of secondary electrons excited within the escape depth.

  11. Model-based microwave image reconstruction: simulations and experiments

    International Nuclear Information System (INIS)

    Ciocan, Razvan; Jiang Huabei

    2004-01-01

    We describe an integrated microwave imaging system that can provide spatial maps of dielectric properties of heterogeneous media with tomographically collected data. The hardware system (800-1200 MHz) was built based on a lock-in amplifier with 16 fixed antennas. The reconstruction algorithm was implemented using a Newton iterative method with combined Marquardt-Tikhonov regularizations. System performance was evaluated using heterogeneous media mimicking human breast tissue. The finite element method coupled with the Bayliss and Turkel radiation boundary conditions was applied to compute the electric field distribution in the heterogeneous media of interest. The results show that inclusions embedded in a 76 mm diameter background medium can be quantitatively reconstructed from both simulated and experimental data. Quantitative analysis of the microwave images obtained suggests that an inclusion of 14 mm in diameter is the smallest object that can be fully characterized presently using experimental data, while objects as small as 10 mm in diameter can be quantitatively resolved with simulated data

  12. Improve Image Quality of Transversal Relaxation Time PROPELLER and FLAIR on Magnetic Resonance Imaging

    Science.gov (United States)

    Rauf, N.; Alam, D. Y.; Jamaluddin, M.; Samad, B. A.

    2018-03-01

    Magnetic resonance imaging (MRI) is a medical imaging technique that uses the interaction between a magnetic field and nuclear spins. MRI can be used to show pathological differences by means of transversal relaxation time (T2) weighted images. Two techniques for producing T2-weighted images are Periodically Rotated Overlapping Parallel Lines with Enhanced Reconstruction (PROPELLER) and Fluid Attenuated Inversion Recovery (FLAIR). A comparison of T2 PROPELLER and T2 FLAIR parameters in MRI images has been conducted, and image quality was evaluated using RadiAnt DICOM Viewer and ENVI software with image segmentation and region of interest (ROI) methods. Brain images were randomly selected. The results showed that the repetition time (TR) and echo time (TE) values of all image types were not influenced by age. T2 FLAIR images had a longer TR (9000 ms), whereas T2 PROPELLER images had a longer TE (100.75-102.1 ms). Furthermore, areas with low and medium signal intensity appeared clearer in T2 PROPELLER images (average coefficients of variation for low and medium signal intensity were 0.0431 and 0.0705, respectively), whereas areas with high signal intensity appeared clearer in T2 FLAIR images (average coefficient of variation 0.0637).
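
    The coefficient-of-variation figures quoted above are simple ROI statistics; a minimal sketch with hypothetical array names follows.

    ```python
    import numpy as np

    def coefficient_of_variation(roi: np.ndarray) -> float:
        """CV = standard deviation / mean of the pixel values inside a region of interest."""
        return float(roi.std() / roi.mean())

    # Comparing the same anatomical ROI drawn on T2 PROPELLER and T2 FLAIR images
    # cv_propeller = coefficient_of_variation(propeller_roi)
    # cv_flair = coefficient_of_variation(flair_roi)
    ```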

  13. Practical guide to quality assurance in medical imaging

    International Nuclear Information System (INIS)

    Moores, M.; Watkinson, S.; Pearcy, J.; Henshaw, E.T.

    1987-01-01

    This volume forms an important part of the response to a growing need to ensure the same and cost-effective use of ionizing radiations for the benefit of both staff and patients. The authors provide guidance to implementing and running quality assurance programs in medical imaging departments. The treatment provides an overview of all the tests which need to be carried out in medical imaging, and the text contains step-by-step guidance as to how to perform and interpret the results of medical imaging

  14. The Advanced Gamma-ray Imaging System (AGIS): Simulation Studies

    Science.gov (United States)

    Fegan, Stephen; Buckley, J. H.; Bugaev, S.; Funk, S.; Konopelko, A.; Maier, G.; Vassiliev, V. V.; Simulation Studies Working Group; AGIS Collaboration

    2008-03-01

    The Advanced Gamma-ray Imaging System (AGIS) is a concept for the next generation instrument in ground-based very high energy gamma-ray astronomy. It has the goal of achieving significant improvement in sensitivity over current experiments. We present the results of simulation studies of various possible designs for AGIS. The primary characteristics of the array performance, collecting area, angular resolution, background rejection, and sensitivity are discussed.

  15. The Advanced Gamma-ray Imaging System (AGIS): Simulation Studies

    OpenAIRE

    Maier, G.; Collaboration, for the AGIS

    2009-01-01

    The Advanced Gamma-ray Imaging System (AGIS) is a next-generation ground-based gamma-ray observatory being planned in the U.S. The anticipated sensitivity of AGIS is about one order of magnitude better than the sensitivity of current observatories, allowing it to measure gamma-ray emission from a large number of Galactic and extra-galactic sources. We present here results of simulation studies of various possible designs for AGIS. The primary characteristics of the array performance - collect...

  16. Image reconstruction using Monte Carlo simulation and artificial neural networks

    International Nuclear Information System (INIS)

    Emert, F.; Missimner, J.; Blass, W.; Rodriguez, A.

    1997-01-01

    PET data sets are subject to two types of distortions during acquisition: the imperfect response of the scanner and attenuation and scattering in the active distribution. In addition, the reconstruction of voxel images from the line projections composing a data set can introduce artifacts. Monte Carlo simulation provides a means for modeling the distortions and artificial neural networks a method for correcting for them as well as minimizing artifacts. (author) figs., tab., refs

  17. Image size invariant visual cryptography for general access structures subject to display quality constraints.

    Science.gov (United States)

    Lee, Kai-Hui; Chiu, Pei-Ling

    2013-10-01

    Conventional visual cryptography (VC) suffers from a pixel-expansion problem, or an uncontrollable display quality problem for recovered images, and lacks a general approach to construct visual secret sharing schemes for general access structures. We propose a general and systematic approach to address these issues without sophisticated codebook design. This approach can be used for binary secret images in non-computer-aided decryption environments. To avoid pixel expansion, we design a set of column vectors to encrypt secret pixels rather than using the conventional VC-based approach. We begin by formulating a mathematic model for the VC construction problem to find the column vectors for the optimal VC construction, after which we develop a simulated-annealing-based algorithm to solve the problem. The experimental results show that the display quality of the recovered image is superior to that of previous papers.

  18. Fast Simulation of Dynamic Ultrasound Images Using the GPU.

    Science.gov (United States)

    Storve, Sigurd; Torp, Hans

    2017-10-01

    Simulated ultrasound data is a valuable tool for development and validation of quantitative image analysis methods in echocardiography. Unfortunately, simulation time can become prohibitive for phantoms consisting of a large number of point scatterers. The COLE algorithm by Gao et al. is a fast convolution-based simulator that trades simulation accuracy for improved speed. We present highly efficient parallelized CPU and GPU implementations of the COLE algorithm with an emphasis on dynamic simulations involving moving point scatterers. We argue that it is crucial to minimize the amount of data transfers from the CPU to achieve good performance on the GPU. We achieve this by storing the complete trajectories of the dynamic point scatterers as spline curves in the GPU memory. This leads to good efficiency when simulating sequences consisting of a large number of frames, such as B-mode and tissue Doppler data for a full cardiac cycle. In addition, we propose a phase-based subsample delay technique that efficiently eliminates flickering artifacts seen in B-mode sequences when COLE is used without enough temporal oversampling. To assess the performance, we used a laptop computer and a desktop computer, each equipped with a multicore Intel CPU and an NVIDIA GPU. Running the simulator on a high-end TITAN X GPU, we observed two orders of magnitude speedup compared to the parallel CPU version, three orders of magnitude speedup compared to simulation times reported by Gao et al. in their paper on COLE, and a speedup of 27000 times compared to the multithreaded version of Field II, using numbers reported in a paper by Jensen. We hope that by releasing the simulator as an open-source project we will encourage its use and further development.

  19. Registration accuracy and quality of real-life images.

    Directory of Open Access Journals (Sweden)

    Wei-Yen Hsu

    Full Text Available BACKGROUND: A common registration problem in consumer-device applications is to align all the acquired image sequences into a complete scene. Image alignment requires a registration algorithm that will compensate as much as possible for geometric variability among images. However, images capturing different views of a real scene usually exhibit different distortions. Some are derived from the optic characteristics of image sensors, and others are caused by the specific scenes and objects. METHODOLOGY/PRINCIPAL FINDINGS: An image registration algorithm that accounts for perspective projection is proposed for consumer-device applications in this study. It exploits a multiresolution wavelet-based method to extract significant features. An analytic differential approach is then proposed to achieve fast convergence of point matching. Finally, the registration accuracy is further refined to obtain subpixel precision by a feature-based modified Levenberg-Marquardt method. Due to its feature-based and nonlinear characteristic, it converges considerably faster than most other methods. In addition, vignette compensation and color difference adjustment are also performed to further improve the quality of registration results. CONCLUSIONS/SIGNIFICANCE: The performance of the proposed method is evaluated by testing the synthetic and real images acquired by a hand-held digital still camera and in comparison with two registration techniques in terms of the squared sum of intensity differences (SSD and correlation coefficient (CC. The results indicate that the proposed method is promising in registration accuracy and quality, which are statistically significantly better than other two approaches.

  20. Developing optimized CT scan protocols: Phantom measurements of image quality

    International Nuclear Information System (INIS)

    Zarb, Francis; Rainford, Louise; McEntee, Mark F.

    2011-01-01

    Purpose: The increasing frequency of computerized tomography (CT) examinations is well documented, leading to concern about potential radiation risks for patients. However, the consequences of not performing the CT examination and missing injuries and disease are potentially serious, impacting upon correct patient management. The ALARA principle of dose optimization must be employed for all justified CT examinations. Dose indicators displayed on the CT console as either CT dose index (CTDI) and/or dose length product (DLP), are used to indicate dose and can quantify improvements achieved through optimization. Key scan parameters contributing to dose have been identified in previous literature and in previous work by our group. The aim of this study was to optimize the scan parameters of mA; kV and pitch, whilst maintaining image quality and reducing dose. This research was conducted using psychophysical image quality measurements on a CT quality assurance (QA) phantom establishing the impact of dose optimization on image quality parameters. Method: Current CT scan parameters for head (posterior fossa and cerebrum), abdomen and chest examinations were collected from 57% of CT suites available nationally in Malta (n = 4). Current scan protocols were used to image a Catphan 600 CT QA phantom whereby image quality was assessed. Each scan parameter: mA; kV and pitch were systematically reduced until the contrast resolution (CR), spatial resolution (SR) and noise were significantly lowered. The Catphan 600 images, produced by the range of protocols, were evaluated by 2 expert observers assessing CR, SR and noise. The protocol considered as the optimization threshold was just above the setting that resulted in a significant reduction in CR and noise but not affecting SR at the 95% confidence interval. Results: The limit of optimization threshold was determined for each CT suite. Employing optimized parameters, CTDI and DLP were both significantly reduced (p ≤ 0.001) by

  1. DIANE stationary neutron radiography system image quality and industrial applications

    International Nuclear Information System (INIS)

    Cluzeau, S.; Huet, J.; Tourneur, P. le

    1994-01-01

    The SODERN neutron radiography laboratory has operated since February 1993 using a sealed tube generator (GENIE 46). An experimental programme of characterization (dosimetry, spectroscopy) has confirmed the expected performances concerning: neutron flux intensity, neutron energy range, residual gamma flux. Results are given in a specific report [2]. This paper is devoted to the image performance reporting. ASTM and specific indicators have been used to test the image quality with various converters and films. The corresponding modulation transfer functions are to be determined from image processing. Some industrial applications have demonstrated the capabilities of the system: corrosion detection in aircraft parts, ammunitions filling testing, detection of polymer lacks in sandwich steel sheets, detection of moisture in a probe for geophysics, residual ceramic cores imaging in turbine blades. Various computerized electronic imaging systems will be tested to improve the industrial capabilities. (orig.)

  2. Image quality testing of assembled IR camera modules

    Science.gov (United States)

    Winters, Daniel; Erichsen, Patrik

    2013-10-01

    Infrared (IR) camera modules for the LWIR (8-12 µm) that combine IR imaging optics with microbolometer focal plane array (FPA) sensors with readout electronics are becoming more and more a mass market product. At the same time, steady improvements in sensor resolution in the higher priced markets raise the requirement for imaging performance of objectives and the proper alignment between objective and FPA. This puts pressure on camera manufacturers and system integrators to assess the image quality of finished camera modules in a cost-efficient and automated way for quality control or during end-of-line testing. In this paper we present recent development work done in the field of image quality testing of IR camera modules. This technology provides a wealth of additional information in contrast to the more traditional test methods like minimum resolvable temperature difference (MRTD) which give only a subjective overall test result. Parameters that can be measured are image quality via the modulation transfer function (MTF) for broadband or with various bandpass filters on- and off-axis and optical parameters like e.g. effective focal length (EFL) and distortion. If the camera module allows for refocusing the optics, additional parameters like best focus plane, image plane tilt, auto-focus quality, chief ray angle etc. can be characterized. Additionally, the homogeneity and response of the sensor with the optics can be characterized in order to calculate the appropriate tables for non-uniformity correction (NUC). The technology can also be used to control active alignment methods during mechanical assembly of optics to high resolution sensors. Other important points that are discussed are the flexibility of the technology to test IR modules with different form factors, electrical interfaces and last but not least the suitability for fully automated measurements in mass production.

  3. Using simulated fluorescence cell micrographs for the evaluation of cell image segmentation algorithms.

    Science.gov (United States)

    Wiesmann, Veit; Bergler, Matthias; Palmisano, Ralf; Prinzen, Martin; Franz, Daniela; Wittenberg, Thomas

    2017-03-18

    Manual assessment and evaluation of fluorescent micrograph cell experiments is time-consuming and tedious. Automated segmentation pipelines can ensure efficient and reproducible evaluation and analysis with constant high quality for all images of an experiment. Such cell segmentation approaches are usually validated and rated in comparison to manually annotated micrographs. Nevertheless, manual annotations are prone to errors and display inter- and intra-observer variability which influence the validation results of automated cell segmentation pipelines. We present a new approach to simulate fluorescent cell micrographs that provides an objective ground truth for the validation of cell segmentation methods. The cell simulation was evaluated twofold: (1) An expert observer study shows that the proposed approach generates realistic fluorescent cell micrograph simulations. (2) An automated segmentation pipeline on the simulated fluorescent cell micrographs reproduces segmentation performances of that pipeline on real fluorescent cell micrographs. The proposed simulation approach produces realistic fluorescent cell micrographs with corresponding ground truth. The simulated data is suited to evaluate image segmentation pipelines more efficiently and reproducibly than it is possible on manually annotated real micrographs.
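
    Because simulated micrographs come with exact ground-truth masks, a segmentation pipeline can be scored objectively; the sketch below uses the Dice coefficient on binary masks as one such score (this metric choice is an assumption for illustration, not necessarily the one used by the authors).

    ```python
    import numpy as np

    def dice_coefficient(prediction: np.ndarray, ground_truth: np.ndarray) -> float:
        """Dice overlap between a predicted binary mask and the simulated ground-truth mask."""
        pred = prediction.astype(bool)
        truth = ground_truth.astype(bool)
        denom = pred.sum() + truth.sum()
        return 2.0 * np.logical_and(pred, truth).sum() / denom if denom else 1.0

    # score = dice_coefficient(pipeline_mask, simulated_truth_mask)
    ```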

  4. Performance evaluation of no-reference image quality metrics for face biometric images

    Science.gov (United States)

    Liu, Xinwei; Pedersen, Marius; Charrier, Christophe; Bours, Patrick

    2018-03-01

    The accuracy of face recognition systems is significantly affected by the quality of face sample images. The recent established standardization proposed several important aspects for the assessment of face sample quality. There are many existing no-reference image quality metrics (IQMs) that are able to assess natural image quality by taking into account similar image-based quality attributes as introduced in the standardization. However, whether such metrics can assess face sample quality is rarely considered. We evaluate the performance of 13 selected no-reference IQMs on face biometrics. The experimental results show that several of them can assess face sample quality according to the system performance. We also analyze the strengths and weaknesses of different IQMs as well as why some of them failed to assess face sample quality. Retraining an original IQM by using face database can improve the performance of such a metric. In addition, the contribution of this paper can be used for the evaluation of IQMs on other biometric modalities; furthermore, it can be used for the development of multimodality biometric IQMs.
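
    For readers unfamiliar with no-reference IQMs, the snippet below computes one very simple example, the variance of the Laplacian as a blur/sharpness proxy. It is not one of the 13 metrics evaluated in the paper, only a hypothetical stand-in that shows the interface such a metric typically has.

```python
import numpy as np
from scipy import ndimage

def laplacian_sharpness(gray_image):
    """Variance of the Laplacian: a simple, generic no-reference proxy for
    sharpness/blur (illustrative only, not a metric from the paper)."""
    return float(ndimage.laplace(gray_image.astype(float)).var())

# Toy usage: the same sample, sharp versus Gaussian-blurred.
rng = np.random.default_rng(1)
face = rng.random((128, 128))
blurred = ndimage.gaussian_filter(face, sigma=2.0)
print(laplacian_sharpness(face), ">", laplacian_sharpness(blurred))
```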

  5. Adapting protocols of CT imaging in a pediatric emergency department. Evaluation of image quality and dose

    International Nuclear Information System (INIS)

    Batista Arce, A.; Gonzalez Lopez, S.; Catalan Acosta, A.; Casares Magaz, O.; Hernandez Armas, O.; Hernandez Armas, J.

    2011-01-01

    The purpose of this study was to qualitatively assess image quality in relation to the radiation dose delivered in CT studies performed on the computed tomograph of the Pediatric Emergency Department of Hospital Universitario de Canarias (HUC), in order to optimize the technical parameters used in these radiological examinations and obtain optimal image quality at the lowest possible dose.

  6. Image simulation of high-speed imaging by high-pressure gas ionization detector

    International Nuclear Information System (INIS)

    Miao Jichen; Liu Ximing; Wu Zhifang

    2005-01-01

    In a freight train inspection system, the signal of neighboring pixels accumulates because the data read-out time is shorter than the ion drift time. This paper analyzes the correlation between neighboring pixels and designs a computer simulation method to generate emulated images, such as an indicator image. The results indicate that the high-pressure gas ionization detector can be used in the high-speed digital radiography field. (authors)
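
    A toy sketch of the effect described above, under assumed timing values: when the read-out interval is shorter than the ion drift time, each column's signal carries charge from previous columns, which can be modelled as a convolution with an exponential collection kernel; the resulting neighbour-pixel correlation quantifies the smearing.

```python
import numpy as np

# Assumed toy numbers: read-out every 1 ms while ion collection takes ~3 ms.
readout_ms, drift_ms, n_cols = 1.0, 3.0, 200
t = np.arange(0, 5 * drift_ms, readout_ms)
kernel = np.exp(-t / drift_ms)
kernel /= kernel.sum()                       # normalised ion-collection response

true_profile = np.zeros(n_cols)
true_profile[80:120] = 1.0                   # an "indicator" object in the beam
measured = np.convolve(true_profile, kernel)[:n_cols]

# Correlation between neighbouring columns quantifies the smearing.
corr = np.corrcoef(measured[:-1], measured[1:])[0, 1]
print(f"neighbour-pixel correlation: {corr:.3f}")
```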

  7. Image-rotating cavity designs for improved beam quality in nanosecond optical parametric oscillators

    International Nuclear Information System (INIS)

    Smith, Arlee V.; Bowers, Mark S.

    2001-01-01

    We show by computer simulation that high beam quality can be achieved in high-energy, nanosecond optical parametric oscillators by use of image-rotating resonators. Lateral walk-off between the signal and the idler beams in a nonlinear crystal creates correlations across the beams in the walk-off direction, or equivalently, creates a restricted acceptance angle. These correlations can improve the beam quality in the walk-off plane. We show that image rotation or reflection can be used to improve beam quality in both planes. The lateral walk-off can be due to birefringent walk-off in type II mixing or to noncollinear mixing in type I or type II mixing.

  8. Modeling LCD Displays with Local Backlight Dimming for Image Quality Assessment

    DEFF Research Database (Denmark)

    Korhonen, Jari; Burini, Nino; Forchhammer, Søren

    2011-01-01

    for evaluating the signal quality distortion related directly to digital signal processing, such as compression. However, the physical characteristics of the display device also pose a significant impact on the overall perception. In order to facilitate image quality assessment on modern liquid crystal displays...... (LCD) using light emitting diode (LED) backlight with local dimming, we present the essential considerations and guidelines for modeling the characteristics of displays with high dynamic range (HDR) and locally adjustable backlight segments. The representation of the image generated by the model can...... be assessed using the traditional objective metrics, and therefore the proposed approach is useful for assessing the performance of different backlight dimming algorithms in terms of resulting quality and power consumption in a simulated environment. We have implemented the proposed model in C++ and compared...
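
    A very reduced sketch of a local-dimming display model in the spirit described above (the model in the paper is implemented in C++ and is considerably more detailed): the LED backlight of each segment is set from the local image content, the LC layer compensates, and a leakage term limits the achievable contrast. The segment size and leakage value are assumptions.

```python
import numpy as np

def render_local_dimming(image, segment=16, leakage=0.005):
    """Minimal display model: per-segment backlight set to the segment maximum,
    LC transmittance compensates, `leakage` models the limited native contrast
    of the LC panel.  All parameters are illustrative assumptions."""
    img = image.astype(float) / 255.0
    backlight = np.zeros_like(img)
    h, w = img.shape
    for y in range(0, h, segment):
        for x in range(0, w, segment):
            block = img[y:y + segment, x:x + segment]
            backlight[y:y + segment, x:x + segment] = block.max()
    transmittance = np.clip(img / np.maximum(backlight, 1e-6), leakage, 1.0)
    rendered = backlight * transmittance
    return rendered, backlight      # rendered frame + backlight (power proxy)

rng = np.random.default_rng(2)
frame = (rng.random((64, 64)) * 255).astype(np.uint8)
rendered, backlight = render_local_dimming(frame)
print("relative backlight power:", backlight.mean())    # 1.0 = full backlight
```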

  9. Optimization of the alpha image reconstruction. An iterative CT-image reconstruction with well-defined image quality metrics

    International Nuclear Information System (INIS)

    Lebedev, Sergej; Sawall, Stefan; Knaup, Michael; Kachelriess, Marc

    2017-01-01

    Optimization of the AIR-algorithm for improved convergence and performance. The AIR method is an iterative algorithm for CT image reconstruction. As a result of its linearity with respect to the basis images, the AIR algorithm possesses well-defined, regular image quality metrics, e.g. point spread function (PSF) or modulation transfer function (MTF), unlike other iterative reconstruction algorithms. The AIR algorithm computes weighting images α to blend between a set of basis images that preferably have mutually exclusive properties, e.g. high spatial resolution or low noise. The optimized algorithm uses an approach that alternates between the optimization of raw-data fidelity using an OSSART-like update and regularization using gradient descent, as opposed to the initially proposed AIR using a straightforward gradient descent implementation. A regularization strength for a given task is chosen by formulating a requirement for the noise reduction and checking whether it is fulfilled for different regularization strengths, while monitoring the spatial resolution using the voxel-wise defined modulation transfer function for the AIR image. The optimized algorithm computes similar images in a shorter time compared to the initial gradient descent implementation of AIR. The result can be influenced by multiple parameters that can be narrowed down to a relatively simple framework to compute high quality images. The AIR images, for instance, can have at least a 50% lower noise level compared to the sharpest basis image, while the spatial resolution is mostly maintained. The optimization improves performance by a factor of 6, while maintaining image quality. Furthermore, it was demonstrated that the spatial resolution for AIR can be determined using regular image quality metrics, given smooth weighting images. This is not possible for other iterative reconstructions as a result of their non-linearity. A simple set of parameters for the algorithm is discussed that provides
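
    The central blending step can be pictured with the small sketch below: voxel-wise weighting images α mix basis images with complementary properties, and because the mixing is linear the noise of the result follows directly from the weights. This is only the blending arithmetic, not the iterative optimisation of α described in the paper; all images and weights are synthetic examples.

```python
import numpy as np

def air_blend(basis_images, alphas):
    """Blend basis images with voxel-wise weighting images (the linear
    blending step only, not the optimisation of the weights)."""
    alphas = np.clip(np.asarray(alphas, dtype=float), 0.0, None)
    alphas /= alphas.sum(axis=0, keepdims=True)          # weights sum to 1
    return np.sum(alphas * np.asarray(basis_images, dtype=float), axis=0)

# Toy example: a sharp/noisy basis image and a smooth/low-noise surrogate.
rng = np.random.default_rng(3)
truth = np.zeros((64, 64))
truth[24:40, 24:40] = 100.0
sharp = truth + rng.normal(0, 20, truth.shape)           # high noise
smooth = truth + rng.normal(0, 5, truth.shape)           # pre-smoothed surrogate
alpha_sharp = np.full(truth.shape, 0.3)                  # example weighting image
blended = air_blend([sharp, smooth], [alpha_sharp, 1 - alpha_sharp])
print("noise (std in flat region):",
      sharp[:16, :16].std(), "->", blended[:16, :16].std())
```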

  10. Optimization of the alpha image reconstruction. An iterative CT-image reconstruction with well-defined image quality metrics

    Energy Technology Data Exchange (ETDEWEB)

    Lebedev, Sergej; Sawall, Stefan; Knaup, Michael; Kachelriess, Marc [German Cancer Research Center, Heidelberg (Germany)]

    2017-10-01

    Optimization of the AIR-algorithm for improved convergence and performance. The AIR method is an iterative algorithm for CT image reconstruction. As a result of its linearity with respect to the basis images, the AIR algorithm possesses well-defined, regular image quality metrics, e.g. point spread function (PSF) or modulation transfer function (MTF), unlike other iterative reconstruction algorithms. The AIR algorithm computes weighting images α to blend between a set of basis images that preferably have mutually exclusive properties, e.g. high spatial resolution or low noise. The optimized algorithm uses an approach that alternates between the optimization of raw-data fidelity using an OSSART-like update and regularization using gradient descent, as opposed to the initially proposed AIR using a straightforward gradient descent implementation. A regularization strength for a given task is chosen by formulating a requirement for the noise reduction and checking whether it is fulfilled for different regularization strengths, while monitoring the spatial resolution using the voxel-wise defined modulation transfer function for the AIR image. The optimized algorithm computes similar images in a shorter time compared to the initial gradient descent implementation of AIR. The result can be influenced by multiple parameters that can be narrowed down to a relatively simple framework to compute high quality images. The AIR images, for instance, can have at least a 50% lower noise level compared to the sharpest basis image, while the spatial resolution is mostly maintained. The optimization improves performance by a factor of 6, while maintaining image quality. Furthermore, it was demonstrated that the spatial resolution for AIR can be determined using regular image quality metrics, given smooth weighting images. This is not possible for other iterative reconstructions as a result of their non-linearity. A simple set of parameters for the algorithm is discussed that provides

  11. Simulated annealing in adaptive optics for imaging the eye retina

    International Nuclear Information System (INIS)

    Zommer, S.; Adler, J.; Lipson, S. G.; Ribak, E.

    2004-01-01

    Full Text: Adaptive optics is a method designed to correct deformed images in real time. Once the distorted wavefront is known, a deformable mirror is used to compensate the aberrations and return the wavefront to a plane wave. This study concentrates on methods that omit wavefront sensing from the reconstruction process. Such methods use stochastic algorithms to find the extremum of a certain sharpness function, thereby correcting the image without any information on the wavefront. Theoretical work [1] has shown that the optical problem can be mapped onto a model for crystal roughening. The main algorithm applied is simulated annealing. We present a first hardware realization of this algorithm in an adaptive optics system designed to image the retina of the human eye
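
    The following sketch shows the general shape of a sharpness-maximising simulated-annealing loop of the kind referred to above. The sharpness function, the 12-parameter aberration model, the step size and the cooling schedule are all toy assumptions; in the real system the metric is computed from the retinal image and the parameters drive the deformable mirror.

```python
import numpy as np

rng = np.random.default_rng(4)

# Unknown aberration (e.g. actuator offsets of a deformable mirror), assumed.
true_aberration = rng.normal(0, 1, size=12)

def sharpness(correction):
    """Toy sharpness metric: largest when the residual aberration vanishes.
    In a real system this would be computed from the camera image itself."""
    residual = true_aberration - correction
    return 1.0 / (1.0 + np.sum(residual ** 2))

correction = np.zeros_like(true_aberration)
current = sharpness(correction)
T = 1.0
for step in range(5000):
    trial = correction + rng.normal(0, 0.1, size=correction.size)
    s = sharpness(trial)
    # Accept improvements always; accept worse trials with Boltzmann probability.
    if s > current or rng.random() < np.exp((s - current) / T):
        correction, current = trial, s
    T *= 0.999                                 # geometric cooling schedule
print(f"final sharpness: {current:.4f} (1.0 = perfect correction)")
```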

  12. Optical image reconstruction using DC data: simulations and experiments

    International Nuclear Information System (INIS)

    Huabei Jiang; Paulsen, K.D.; Oesterberg, U.L.

    1996-01-01

    In this paper, we explore optical image formation using a diffusion approximation of light propagation in tissue which is modelled with a finite-element method for optically heterogeneous media. We demonstrate successful image reconstruction based on absolute experimental DC data obtained with a continuous wave 633 nm He-Ne laser system and a 751 nm diode laser system in laboratory phantoms having two optically distinct regions. The experimental systems used exploit a tomographic type of data collection scheme that provides information from which a spatially variable optical property map is deduced. Reconstruction of scattering coefficient only and simultaneous reconstruction of both scattering and absorption profiles in tissue-like phantoms are obtained from measured and simulated data. Images with different contrast levels between the heterogeneity and the background are also reported and the results show that although it is possible to obtain qualitative visual information on the location and size of a heterogeneity, it may not be possible to quantitatively resolve contrast levels or optical properties using reconstructions from DC data only. Sensitivity of image reconstruction to noise in the measurement data is investigated through simulations. The application of boundary constraints has also been addressed. (author)

  13. Safety Assessment of Advanced Imaging Sequences II: Simulations

    DEFF Research Database (Denmark)

    Jensen, Jørgen Arendt

    2016-01-01

    .6%, when using the impulse response of the probe estimated from an independent measurement. The accuracy is increased to between -22% to 24.5% for MI and between -33.2% to 27.0% for Ispta.3, when using the pressure response measured at a single point to scale the simulation. The spatial distribution of MI...... Mechanical Index (MI) and Ispta.3 as required by FDA. The method is performed on four different imaging schemes and compared to measurements conducted using the SARUS experimental scanner. The sequences include focused emissions with an F-number of 2 with 64 elements that generate highly non-linear fields....... The simulation time is between 0.67 ms to 2.8 ms per emission and imaging point, making it possible to simulate even complex emission sequences in less than 1 s for a single spatial position. The linear simulations yield a relative accuracy on MI between -12.1% to 52.3% and for Ispta.3 between -38.6% to 62...
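
    For orientation, the regulatory quantities mentioned above can be computed from a simulated or measured pressure field roughly as in the sketch below, which applies the standard 0.3 dB/cm/MHz derating and the MI definition (derated peak rarefactional pressure divided by the square root of the centre frequency). The numerical values are assumptions, not results from the paper.

```python
import numpy as np

def derate_pressure(p_mpa, depth_cm, fc_mhz, alpha_db=0.3):
    """Apply the standard 0.3 dB/cm/MHz amplitude derating to a peak pressure."""
    return p_mpa * 10 ** (-alpha_db * depth_cm * fc_mhz / 20.0)

def mechanical_index(p_neg_mpa, depth_cm, fc_mhz):
    """MI = derated peak rarefactional pressure [MPa] / sqrt(fc [MHz])."""
    return derate_pressure(p_neg_mpa, depth_cm, fc_mhz) / np.sqrt(fc_mhz)

# Assumed example values, not results from the paper:
p_neg, depth, fc = 3.0, 4.0, 5.0          # MPa, cm, MHz
mi = mechanical_index(p_neg, depth, fc)
print(f"MI = {mi:.2f} (FDA track-3 limit: 1.9)")
```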

  14. STEM image simulation with hybrid CPU/GPU programming

    International Nuclear Information System (INIS)

    Yao, Y.; Ge, B.H.; Shen, X.; Wang, Y.G.; Yu, R.C.

    2016-01-01

    STEM image simulation is achieved via hybrid CPU/GPU programming under parallel algorithm architecture to speed up calculation on a personal computer (PC). To utilize the calculation power of a PC fully, the simulation is performed using the GPU core and multi-CPU cores at the same time to significantly improve efficiency. GaSb and an artificial GaSb/InAs interface with atom diffusion have been used to verify the computation. - Highlights: • STEM image simulation is achieved by hybrid CPU/GPU programming under parallel algorithm architecture to speed up the calculation in the personal computer (PC). • In order to fully utilize the calculation power of the PC, the simulation is performed by GPU core and multi-CPU cores at the same time so efficiency is improved significantly. • GaSb and artificial GaSb/InAs interface with atom diffusion have been used to verify the computation. The results reveal some unintuitive phenomena about the contrast variation with the atom numbers.

  15. STEM image simulation with hybrid CPU/GPU programming

    Energy Technology Data Exchange (ETDEWEB)

    Yao, Y., E-mail: yaoyuan@iphy.ac.cn; Ge, B.H.; Shen, X.; Wang, Y.G.; Yu, R.C.

    2016-07-15

    STEM image simulation is achieved via hybrid CPU/GPU programming under parallel algorithm architecture to speed up calculation on a personal computer (PC). To utilize the calculation power of a PC fully, the simulation is performed using the GPU core and multi-CPU cores at the same time to significantly improve efficiency. GaSb and an artificial GaSb/InAs interface with atom diffusion have been used to verify the computation. - Highlights: • STEM image simulation is achieved by hybrid CPU/GPU programming under parallel algorithm architecture to speed up the calculation in the personal computer (PC). • In order to fully utilize the calculation power of the PC, the simulation is performed by GPU core and multi-CPU cores at the same time so efficiency is improved significantly. • GaSb and artificial GaSb/InAs interface with atom diffusion have been used to verify the computation. The results reveal some unintuitive phenomena about the contrast variation with the atom numbers.

  16. Performance simulation of a MRPC-based PET imaging system

    Science.gov (United States)

    Roy, A.; Banerjee, A.; Biswas, S.; Chattopadhyay, S.; Das, G.; Saha, S.

    2014-10-01

    The less expensive and high resolution Multi-gap Resistive Plate Chamber (MRPC) opens up a new possibility to find an efficient alternative detector for the Time of Flight (TOF) based Positron Emission Tomography, where the sensitivity of the system depends largely on the time resolution of the detector. In a layered structure, suitable converters can be used to increase the photon detection efficiency. In this work, we perform a detailed GEANT4 simulation to optimize the converter thickness towards improving the efficiency of photon conversion. A Monte Carlo based procedure has been developed to simulate the time resolution of the MRPC-based system, making it possible to simulate its response for PET imaging application. The results of the test of a six-gap MRPC, operating in avalanche mode, with 22Na source have been discussed.
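
    The link between detector time resolution and the TOF benefit can be sketched as below: the localisation uncertainty along a line of response is Δx = c·Δt/2. The 300 ps coincidence resolution used here is an assumed example value, not the measured result for the six-gap MRPC.

```python
import numpy as np

C_CM_PER_PS = 0.029979                     # speed of light in cm per picosecond

def tof_position_fwhm(timing_fwhm_ps):
    """Localisation uncertainty along the line of response:
    delta_x = c * delta_t / 2 (FWHM in, FWHM out)."""
    return C_CM_PER_PS * timing_fwhm_ps / 2.0

# Monte Carlo check with an assumed Gaussian coincidence timing resolution.
rng = np.random.default_rng(5)
fwhm_ps = 300.0                            # assumed example value
sigma_ps = fwhm_ps / 2.355
dt = rng.normal(0, sigma_ps, 100_000)      # simulated time differences
positions = C_CM_PER_PS * dt / 2.0
print(f"analytic FWHM: {tof_position_fwhm(fwhm_ps):.2f} cm, "
      f"MC FWHM: {2.355 * positions.std():.2f} cm")
```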

  17. Mammography in Norway: Image quality and total performance

    International Nuclear Information System (INIS)

    Olsen, J.B.; Skretting, A.; Widmark, A.

    1997-04-01

    This report describes a method for assessing the total performance in mammography based on Receiver Operating Characteristic (ROC) analysis. In the time period from December 1993 to March 1994 the method was applied to assess the total performance of all the 45 Norwegian mammography laboratories operative at that time. Image quality characteristics in each laboratory were established by use of well-known phantoms
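
    A minimal sketch of the ROC bookkeeping that underlies such a total-performance assessment: the empirical area under the ROC curve equals the Mann-Whitney probability that an abnormal case receives a higher rating than a normal one. The ratings below are made-up illustrative data, not values from the Norwegian survey.

```python
import numpy as np

def roc_auc(scores_abnormal, scores_normal):
    """Empirical ROC area via the Mann-Whitney U statistic: the probability
    that an abnormal case scores higher than a normal one (ties count 0.5)."""
    a = np.asarray(scores_abnormal, dtype=float)[:, None]
    n = np.asarray(scores_normal, dtype=float)[None, :]
    return float((np.sum(a > n) + 0.5 * np.sum(a == n)) / (a.size * n.size))

# Assumed example ratings (e.g. a 5-point confidence scale) from one reader.
abnormal = [5, 4, 4, 3, 5, 2, 4]
normal   = [1, 2, 3, 1, 2, 4, 1]
print(f"AUC = {roc_auc(abnormal, normal):.2f}")
```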

  18. New Image Qualities in Education: A Comparative Study

    Science.gov (United States)

    Çankaya, Ibrahim

    2018-01-01

    The aim of this study is to compare education in Turkey and in European Union countries in terms of new image qualities, such as access to online education, digital access, foreign languages learnt per pupil, research & development investments, human resources employed in science and technology, and the study opportunities offered to…

  19. Beyond image quality : designing engaging interactions with digital products

    NARCIS (Netherlands)

    Ridder, de H.; Rozendaal, M.C.

    2008-01-01

    Ubiquitous computing (or Ambient Intelligence) promises a world in which information is available anytime, anywhere and with which humans can interact in a natural, multimodal way. In such a world, perceptual image quality remains an important criterion since most information will be displayed

  20. Beyond image quality : Designing engaging interactions with digital products

    NARCIS (Netherlands)

    De Ridder, H.; Rozendaal, M.C.

    2008-01-01

    Ubiquitous computing (or Ambient Intelligence) promises a world in which information is available anytime, anywhere and with which humans can interact in a natural, multimodal way. In such a world, perceptual image quality remains an important criterion since most information will be displayed

  1. Quality measures for HRR alignment based ISAR imaging algorithms

    CSIR Research Space (South Africa)

    Janse van Rensburg, V

    2013-05-01

    Full Text Available Some Inverse Synthetic Aperture Radar (ISAR) algorithms form the image in a two-step process of range alignment and phase conjugation. This paper discusses a comprehensive set of measures used to quantify the quality of range alignment, with the aim...

  2. A Reduction in Radiographic Exposure and Image Quality in Film ...

    African Journals Online (AJOL)

    Purpose: To develop a protocol for the optimization of diagnostic chest radiography examination, the effect of radiographic exposure reduction on image quality is investigated. Procedure: Forty-eight adult patients presenting for posterior-anterior (PA) chest radiography in a tertiary health care centre were categorized into 3 ...

  3. Determining storage related egg quality changes via digital image ...

    African Journals Online (AJOL)

    Area and length measurements related to exterior and interior egg quality were determined by digital image analysis. In general, excluding the outer thin albumen area, all of the area measurements such as total egg content area and inner thick albumen area were larger in stored eggs than in fresh eggs (52.28 vs.

  4. Thermoluminescence dosimetry in quality imaging in CR mammography systems

    Energy Technology Data Exchange (ETDEWEB)

    Gaona, E.; Franco E, J.G. [UAM-Xochimilco, 04960 Mexico D.F. (Mexico); Azorin N, J. [UAM-Iztapalapa, 09340 Mexico D.F. (Mexico); Diaz G, J.A.I. [CICATA, Unidad Legaria, Av. Legaria 694, 11599 Mexico D.F. (Mexico); Arreola, M. [Department of Radiology, Shands Hospital at UF, PO Box 100374, Gainesville, FL 32610-0374 (United States)

    2006-07-01

    The aim of this work is to estimate the average glandular dose with Thermoluminescence Dosimetry (TLD) and to compare it with image quality in CR mammography. For measuring dose, the FDA and ACR use a phantom, so that dose and image quality are assessed with the same test object. Mammography is a radiological imaging technique used to visualize early biological manifestations of breast cancer. Digital systems have two types of image-capturing devices, Full Field Digital Mammography (FFDM) and CR mammography. In Mexico, there are several CR mammography systems in clinical use, but only one CR mammography system has been approved for use by the FDA. CR mammography uses a photostimulable phosphor detector (PSP) system. Most CR plates are made of 85% BaFBr and 15% BaFI doped with europium (Eu), commonly called barium fluorohalide. We carried out an exploratory survey of six CR mammography units from three different manufacturers and six dedicated x-ray mammography units with fully automatic exposure. The results show that three CR mammography units (50%) deliver a dose exceeding 3.0 mGy, which does not improve image quality, so the dose to the breast is excessive. The differences between the average doses from the TLD system and from an ionization chamber dosimeter are less than 10%. The TLD system is a good option for average glandular dose measurement. (Author)

  5. Thermoluminescence dosimetry in quality imaging in CR mammography systems

    International Nuclear Information System (INIS)

    Gaona, E.; Franco E, J.G.; Azorin N, J.; Diaz G, J.A.I.; Arreola, M.

    2006-01-01

    The aim of this work is to estimate the average glandular dose with Thermoluminescence Dosimetry (TLD) and to compare it with image quality in CR mammography. For measuring dose, the FDA and ACR use a phantom, so that dose and image quality are assessed with the same test object. Mammography is a radiological imaging technique used to visualize early biological manifestations of breast cancer. Digital systems have two types of image-capturing devices, Full Field Digital Mammography (FFDM) and CR mammography. In Mexico, there are several CR mammography systems in clinical use, but only one CR mammography system has been approved for use by the FDA. CR mammography uses a photostimulable phosphor detector (PSP) system. Most CR plates are made of 85% BaFBr and 15% BaFI doped with europium (Eu), commonly called barium fluorohalide. We carried out an exploratory survey of six CR mammography units from three different manufacturers and six dedicated x-ray mammography units with fully automatic exposure. The results show that three CR mammography units (50%) deliver a dose exceeding 3.0 mGy, which does not improve image quality, so the dose to the breast is excessive. The differences between the average doses from the TLD system and from an ionization chamber dosimeter are less than 10%. The TLD system is a good option for average glandular dose measurement. (Author)

  6. Comparison of quality control software tools for diffusion tensor imaging.

    Science.gov (United States)

    Liu, Bilan; Zhu, Tong; Zhong, Jianhui

    2015-04-01

    Image quality of diffusion tensor imaging (DTI) is critical for image interpretation, diagnostic accuracy and efficiency. However, DTI is susceptible to numerous detrimental artifacts that may impair the reliability and validity of the obtained data. Although many quality control (QC) software tools have been developed, are widely used, and each has its own tradeoffs, there is still no general agreement on an image quality control routine for DTI, and the practical impact of these tradeoffs is not well studied. An objective comparison that identifies the pros and cons of each of the QC tools will help users make the best choice among tools for specific DTI applications. This study aims to quantitatively compare the effectiveness of three popular QC tools: DTI studio (Johns Hopkins University), DTIprep (University of North Carolina at Chapel Hill, University of Iowa and University of Utah) and TORTOISE (National Institutes of Health). Both synthetic and in vivo human brain data were used to quantify the adverse effects of major DTI artifacts on tensor calculation, as well as the effectiveness of the different QC tools in identifying and correcting these artifacts. The technical basis of each tool was discussed, and the ways in which particular techniques affect the output of each of the tools were analyzed. The different functions and I/O formats that the three QC tools provide for building a general DTI processing pipeline and for integration with other popular image processing tools were also discussed. Copyright © 2015 Elsevier Inc. All rights reserved.

  7. Reducing the absorbed dose in analogue radiography of infant chest images by improving the image quality, using image processing techniques

    International Nuclear Information System (INIS)

    Karimian, A.; Yazdani, S.; Askari, M. A.

    2011-01-01

    Radiographic inspection is one of the most widely employed medical imaging techniques. Because of the poor contrast and high unsharpness of radiographic image quality on film, converting radiographs to a digital format and applying further digital image processing is the best way to enhance image quality and assist the interpreter in their evaluation. In this research work, radiographic films of 70 infant chest images with different sizes of defects were selected. To digitise the chest images and apply image processing, two classes of algorithms, (i) spatial-domain and (ii) frequency-domain techniques, were used. The MATLAB environment was selected for processing in the digital format. Our results showed that by using these two techniques, defects with small dimensions are detectable. Therefore, these suggested techniques may help medical specialists to diagnose defects at the primary stages and help to prevent repeated X-ray examinations of paediatric patients. (authors)
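
    Two textbook enhancement operations of the kind the paper groups into spatial-domain and frequency-domain techniques are sketched below: unsharp masking with a Gaussian blur, and an equivalent high-boost filter applied in the Fourier domain. They are generic examples; the parameters and the specific MATLAB routines used in the study are not reproduced here.

```python
import numpy as np
from scipy import ndimage

def unsharp_mask(img, sigma=2.0, amount=1.5):
    """Spatial-domain enhancement: add back a scaled high-pass component."""
    img = img.astype(float)
    return img + amount * (img - ndimage.gaussian_filter(img, sigma))

def high_boost_fft(img, cutoff=0.05, boost=1.5):
    """Frequency-domain variant: Gaussian high-boost filter applied to the FFT.
    `cutoff` is a normalised spatial frequency; both parameters are assumed."""
    img = img.astype(float)
    F = np.fft.fft2(img)
    fy = np.fft.fftfreq(img.shape[0])[:, None]
    fx = np.fft.fftfreq(img.shape[1])[None, :]
    highpass = 1.0 - np.exp(-(fx ** 2 + fy ** 2) / (2 * cutoff ** 2))
    return np.real(np.fft.ifft2(F * (1.0 + boost * highpass)))

# Toy usage on a synthetic low-contrast "radiograph".
rng = np.random.default_rng(6)
radiograph = ndimage.gaussian_filter(rng.random((128, 128)) * 255, 3)
enhanced_spatial = unsharp_mask(radiograph)
enhanced_frequency = high_boost_fft(radiograph)
```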

  8. Physics-based optimization of image quality in 3D X-ray flat-panel cone-beam imaging

    NARCIS (Netherlands)

    Snoeren, R.M.

    2012-01-01

    This thesis describes the techniques for modeling and control of 3D X-ray cardiovascular systems in terms of Image Quality and patient dose, aiming at optimizing the diagnostic quality. When aiming at maximum Image Quality (IQ), a cascaded system constituted from inter-dependent imaging components,

  9. Model-Based Referenceless Quality Metric of 3D Synthesized Images Using Local Image Description.

    Science.gov (United States)

    Gu, Ke; Jakhetiya, Vinit; Qiao, Jun-Fei; Li, Xiaoli; Lin, Weisi; Thalmann, Daniel

    2017-07-28

    New challenges have been brought about by emerging 3D-related technologies such as virtual reality (VR), augmented reality (AR), and mixed reality (MR). Free viewpoint video (FVV), due to its applications in remote surveillance, remote education, etc., based on the flexible selection of direction and viewpoint, has been perceived as the development direction of next-generation video technologies and has drawn a wide range of researchers' attention. Since FVV images are synthesized via a depth image-based rendering (DIBR) procedure in the “blind” environment (without reference images), a reliable real-time blind quality evaluation and monitoring system is urgently required. However, existing assessment metrics do not render human judgments faithfully, mainly because of the geometric distortions generated by DIBR. To this end, this paper proposes a novel referenceless quality metric for DIBR-synthesized images using autoregression (AR)-based local image description. It was found that, after the AR prediction, the reconstruction error between a DIBR-synthesized image and its AR-predicted image can accurately capture the geometric distortion. Visual saliency is then leveraged to improve the proposed blind quality metric by a sizable margin. Experiments validate the superiority of our no-reference quality method as compared with prevailing full-, reduced- and no-reference models.

  10. Latin American image quality survey in digital mammography studies

    International Nuclear Information System (INIS)

    Mora, Patricia; Khoury, Helen; Bitelli, Regina; Quintero, Ana Rosa; Garay, Fernando; Garcia Aguilar, Juan; Gamarra, Mirtha; Ubeda, Carlos

    2017-01-01

    Under the International Atomic Energy Agency regional programme TSA3, Radiological Protection of Patients in Medical Exposures, Latin American countries evaluated the image quality and glandular doses of digital mammography equipment with the purpose of assessing performance and compliance with international recommendations. In total, 24 institutions participated from Brazil, Chile, Costa Rica, El Salvador, Mexico, Paraguay and Venezuela. Signal-difference-to-noise ratio results showed poor compliance with tolerances for CR; better results were obtained for full-field digital mammography equipment. Mean glandular dose results showed that the majority of units have values below the acceptable dose levels. This joint Latin American project identified common problems: difficulty in working with digital images and a lack of specific training of medical physicists in the region. Image quality is a main issue that is not being satisfied in accordance with international recommendations; optimisation processes in which doses are increased should be carried out very carefully in order to improve early detection of any cancer signs. (authors)
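
    The signal-difference-to-noise ratio used in such surveys is a simple ROI statistic; a sketch of the computation on a toy phantom image is given below. The ROI positions, contrast and noise levels are arbitrary assumptions.

```python
import numpy as np

def sdnr(image, signal_roi, background_roi):
    """Signal-difference-to-noise ratio between two rectangular ROIs,
    each given as a tuple (row_slice, col_slice)."""
    sig = image[signal_roi]
    bkg = image[background_roi]
    return float(abs(sig.mean() - bkg.mean()) / bkg.std())

# Toy phantom image: a low-contrast square insert over a noisy background.
rng = np.random.default_rng(7)
img = rng.normal(100.0, 5.0, (128, 128))
img[40:60, 40:60] += 10.0
value = sdnr(img,
             (slice(45, 55), slice(45, 55)),     # signal ROI inside the insert
             (slice(5, 35), slice(5, 35)))       # background ROI
print(f"SDNR = {value:.1f}")
```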

  11. Simultaneous analysis and quality assurance for diffusion tensor imaging.

    Directory of Open Access Journals (Sweden)

    Carolyn B Lauzon

    Full Text Available Diffusion tensor imaging (DTI) enables non-invasive, cyto-architectural mapping of in vivo tissue microarchitecture through voxel-wise mathematical modeling of multiple magnetic resonance imaging (MRI) acquisitions, each differently sensitized to water diffusion. DTI computations are fundamentally estimation processes and are sensitive to noise and artifacts. Despite widespread adoption in the neuroimaging community, maintaining consistent DTI data quality remains challenging given the propensity for patient motion, artifacts associated with fast imaging techniques, and the possibility of hardware changes/failures. Furthermore, the quantity of data acquired per voxel, the non-linear estimation process, and numerous potential use cases complicate traditional visual data inspection approaches. Currently, quality inspection of DTI data has relied on visual inspection and individual processing in DTI analysis software programs (e.g. DTIPrep, DTI-studio). However, recent advances in applied statistical methods have yielded several different metrics to assess noise level, artifact propensity, quality of tensor fit, variance of estimated measures, and bias in estimated measures. To date, these metrics have been largely studied in isolation. Herein, we select complementary metrics for integration into an automatic DTI analysis and quality assurance pipeline. The pipeline completes in 24 hours, stores statistical outputs, and produces a graphical summary quality analysis (QA) report. We assess the utility of this streamlined approach for empirical quality assessment on 608 DTI datasets from pediatric neuroimaging studies. The efficiency and accuracy of quality analysis using the proposed pipeline is compared with quality analysis based on visual inspection. The unified pipeline is found to save a statistically significant amount of time (over 70%) while improving the consistency of QA between a DTI expert and a pool of research associates. Projection of QA

  12. Improvement of the clinical use of computed radiography for mobile chest imaging: Image quality and patient dose

    Science.gov (United States)

    Rill, Lynn Neitzey

    Chest radiography is technically difficult because of the wide variation of tissue attenuations in the chest and limitations of screen-film systems. Mobile chest radiography, performed bedside on hospital inpatients, presents additional difficulties due to geometrical and equipment limitations inherent to mobile x-ray procedures and the severity of illness in patients. Computed radiography (CR) offers a new approach for mobile chest radiography by utilizing a photostimulable phosphor. Photostimulable phosphors are more efficient in absorbing lower-energy x-rays than standard intensifying screens and overcome some image quality limitations of mobile chest imaging, particularly because of the inherent latitude. This study evaluated changes in imaging parameters for CR to take advantage of differences between CR and screen-film radiography. Two chest phantoms, made of acrylic and aluminum, simulated x-ray attenuation for average-sized and large- sized adult chests. The phantoms contained regions representing the lungs, heart and subdiaphragm. Acrylic and aluminum disks (1.9 cm diameter) were positioned in the chest regions to make signal-to-noise ratio (SNR) measurements for different combinations of imaging parameters. Disk thicknesses (contrast) were determined from disk visibility. Effective dose to the phantom was also measured for technique combinations. The results indicated that using an anti-scatter grid and lowering x- ray tube potential improved the SNR significantly; however, the dose to the phantom also increased. An evaluation was performed to examine the clinical applicability of the observed improvements in SNR. Parameter adjustments that improved phantom SNRs by more than 50% resulted in perceived image quality improvements in the lung region of clinical mobile chest radiographs. Parameters that produced smaller improvements in SNR had no apparent effect on clinical image quality. Based on this study, it is recommended that a 3:1 grid be used for

  13. A quality-refinement process for medical imaging applications.

    Science.gov (United States)

    Neuhaus, J; Maleike, D; Nolden, M; Kenngott, H-G; Meinzer, H-P; Wolf, I

    2009-01-01

    To introduce and evaluate a process for the refinement of software quality that is suitable for research groups. In order to avoid constraining researchers too much, the quality improvement process has to be designed carefully. The scope of this paper is to present and evaluate a process to advance quality aspects of existing research prototypes in order to make them ready for initial clinical studies. The proposed process is tailored for research environments and is therefore more lightweight than traditional quality management processes. It focuses on quality criteria that are important at the given stage of the software life cycle, and it emphasizes the usage of tools that automate aspects of the process. To evaluate the additional effort that comes along with the process, it was exemplarily applied to eight prototypical software modules for medical image processing. The introduced process has been applied to improve the quality of all prototypes so that they could be successfully used in clinical studies. The quality refinement required an average of 13 person-days of additional effort per project. Overall, 107 bugs were found and resolved by applying the process. Careful selection of quality criteria and the usage of automated process tools lead to a lightweight quality refinement process suitable for scientific research groups that can be applied to ensure a successful transfer of technical software prototypes into clinical research workflows.

  14. Effects of task and image properties on visual-attention deployment in image-quality assessment

    Science.gov (United States)

    Alers, Hani; Redi, Judith; Liu, Hantao; Heynderickx, Ingrid

    2015-03-01

    It is important to understand how humans view images and how their behavior is affected by changes in the properties of the viewed images and the task they are given, particularly the task of scoring the image quality (IQ). This is a complex behavior that holds great importance for the field of image-quality research. This work builds upon 4 years of research work spanning three databases studying image-viewing behavior. Using eye-tracking equipment, it was possible to collect information on human viewing behavior of different kinds of stimuli and under different experimental settings. This work performs a cross-analysis on the results from all these databases using state-of-the-art similarity measures. The results strongly show that asking the viewers to score the IQ significantly changes their viewing behavior. Also muting the color saturation seems to affect the saliency of the images. However, a change in IQ was not consistently found to modify visual attention deployment, neither under free looking nor during scoring. These results are helpful in gaining a better understanding of image viewing behavior under different conditions. They also have important implications on work that collects subjective image-quality scores from human observers.

  15. Elaboration and implementation of standard operational procedure for quality assurance of cone beam CT image in radiotherapy

    International Nuclear Information System (INIS)

    Bonatto, Larisse N.; Estacio, Daniela R.; Lopes, Juliane S.; Sansson, Angela; Duarte, Lucas O.; Sbaraini, Patricia; Silva, Ana M. Marques da; Streck, Elaine E.

    2016-01-01

    The objective of this article is to present the implementation of quality control of the Cone Beam Computed Tomography (CBCT) image generated by the On-Board Imager integrated with the Trilogy linear accelerator. Standard operating procedures (POPs) were developed based on the literature and on the manuals of the Catphan 504 phantom and of the On-Board Imager. The following POPs were developed: acquisition of the CBCT image; linearity of CT number; uniformity; spatial resolution; low-contrast resolution; spatial linearity; slice thickness. The elaborated procedures were validated through an experimental acquisition of the phantom. The results obtained in the validation of the POPs comply with the parameters established by the phantom manufacturer, as well as with those obtained during acceptance testing of the On-Board Imager.

  16. Investigation of grid performance using simple image quality tests

    Directory of Open Access Journals (Sweden)

    Dogan Bor

    2016-01-01

    Full Text Available Antiscatter grids improve X-ray image contrast at the cost of increased patient radiation dose. The choice of an appropriate grid, or its removal, requires a good knowledge of grid characteristics, especially for pediatric digital imaging. The aim of this work is to understand the relation between grid performance parameters and some numerical image quality metrics for digital radiological examinations. The grid parameters bucky factor (BF), selectivity (Σ), contrast improvement factor (CIF), and signal-to-noise improvement factor (SIF) were determined from measurements of primary, scatter, and total radiation with a digital fluoroscopic system for 5, 10, 15, 20, and 25 cm thick polymethyl methacrylate blocks at tube voltages of 70, 90, and 120 kVp. Image contrast for low- and high-contrast objects and high-contrast spatial resolution were measured with simple phantoms using the same scatter thicknesses and tube voltages. BF and SIF values were also calculated from the images obtained with and without grids. The correlation coefficients between BF values obtained using the two approaches (grid parameters and image quality metrics) indicated good agreement. The proposed approach provides a quick and practical way of estimating grid performance for different digital fluoroscopic examinations.
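
    The grid figures of merit named above follow directly from the primary and scatter measurements; a sketch using the standard definitions (Tp, Ts and Tt the primary, scatter and total transmissions of the grid) is given below. The numeric readings are invented placeholders, not values from the study.

```python
import numpy as np

def grid_figures_of_merit(P0, S0, Pg, Sg):
    """Standard grid figures of merit from primary (P) and scatter (S)
    measurements made without (0) and with (g) the grid in place."""
    Tp = Pg / P0                      # primary transmission
    Ts = Sg / S0                      # scatter transmission
    Tt = (Pg + Sg) / (P0 + S0)        # total transmission
    return {
        "bucky_factor": 1.0 / Tt,                 # BF: dose penalty
        "selectivity": Tp / Ts,                   # Sigma
        "contrast_improvement": Tp / Tt,          # CIF
        "snr_improvement": Tp / np.sqrt(Tt),      # SIF
    }

# Assumed example readings (arbitrary units), e.g. a thick PMMA block at 90 kVp.
print(grid_figures_of_merit(P0=100.0, S0=300.0, Pg=70.0, Sg=30.0))
```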

  17. Objective measurement of image quality in fluoroscopic x-ray equipment FluoroQuality

    CERN Document Server

    Tapiovaara, M

    2003-01-01

    The report describes FluoroQuality, a computer program developed at STUK for measuring image quality in medical fluoroscopic equipment. The method is based on statistical decision theory (SDT) and the main measurement result is given in terms of the accumulation rate of the squared signal-to-noise ratio (SNR²rate). In addition to this quantity, several other quantities are measured, including the SNR of single image frames, the spatio-temporal noise power spectrum and the temporal lag. The measurement method can be used, for example, for specifying the image quality of fluoroscopic images, for optimising image quality and dose rate in fluoroscopy, and for quality control of fluoroscopic equipment. The theory behind the measurement method is reviewed and the measurement of the various quantities is explained. An example of using the method for optimising a specified fluoroscopic procedure is given. The User's Manual of the program is included a...

  18. Cardiovascular CT angiography in neonates and children: Image quality and potential for radiation dose reduction with iterative image reconstruction techniques

    International Nuclear Information System (INIS)

    Tricarico, Francesco; Hlavacek, Anthony M.; Schoepf, U.J.; Ebersberger, Ullrich; Nance, John W.; Vliegenthart, Rozemarijn; Cho, Young Jun; Spears, J.R.; Secchi, Francesco; Savino, Giancarlo; Marano, Riccardo; Bonomo, Lorenzo; Schoenberg, Stefan O.; Apfaltrer, Paul

    2013-01-01

    To evaluate image quality (IQ) of low-radiation-dose paediatric cardiovascular CT angiography (CTA), comparing iterative reconstruction in image space (IRIS) and sinogram-affirmed iterative reconstruction (SAFIRE) with filtered back-projection (FBP), and to estimate the potential for further dose reductions. Forty neonates and children underwent low radiation CTA with or without ECG synchronisation. Data were reconstructed with FBP, IRIS and SAFIRE. For ECG-synchronised studies, half-dose image acquisitions were simulated. Signal noise was measured and IQ graded. Effective dose (ED) was estimated. Mean absolute and relative image noise with IRIS and full-dose SAFIRE was lower than with FBP (P < 0.001), while SNR and CNR were higher (P < 0.001). Image noise was also lower and SNR and CNR higher in half-dose SAFIRE studies compared with full- and half-dose FBP studies (P < 0.001). IQ scores were higher for IRIS, full-dose SAFIRE and half-dose SAFIRE than for full-dose FBP and higher for half-dose SAFIRE than for half-dose FBP (P < 0.05). Median weight-specific ED was 0.3 mSv without and 1.36 mSv with ECG synchronisation. The estimated ED of half-dose SAFIRE studies was 0.68 mSv. IR improves image noise, SNR, CNR and subjective IQ compared with FBP in low-radiation-dose paediatric CTA and allows further dose reductions without compromising diagnostic IQ. (orig.)

  19. Cardiovascular CT angiography in neonates and children: Image quality and potential for radiation dose reduction with iterative image reconstruction techniques

    Energy Technology Data Exchange (ETDEWEB)

    Tricarico, Francesco [Medical University of South Carolina, Ashley River Tower, Department of Radiology and Radiological Science, Charleston, SC (United States); Catholic University of the Sacred Heart, "A. Gemelli" Hospital, Department of Bioimaging and Radiological Sciences, Rome (Italy); Hlavacek, Anthony M. [Medical University of South Carolina, Ashley River Tower, Department of Radiology and Radiological Science, Charleston, SC (United States); Children's Hospital, Medical University of South Carolina, Division of Pediatric Cardiology, Charleston, SC (United States); Schoepf, U.J. [Medical University of South Carolina, Ashley River Tower, Department of Radiology and Radiological Science, Charleston, SC (United States); Children's Hospital, Medical University of South Carolina, Division of Pediatric Cardiology, Charleston, SC (United States); Medical University of South Carolina, Division of Cardiology, Department of Medicine, Charleston, SC (United States); Ebersberger, Ullrich [Medical University of South Carolina, Ashley River Tower, Department of Radiology and Radiological Science, Charleston, SC (United States); Heart Centre Munich-Bogenhausen, Department of Cardiology and Intensive Care Medicine, Munich (Germany); Nance, John W. [Medical University of South Carolina, Ashley River Tower, Department of Radiology and Radiological Science, Charleston, SC (United States); Johns Hopkins Hospital, The Russell H. Morgan Department of Radiology and Radiological Science, Baltimore, MD (United States); Vliegenthart, Rozemarijn [Medical University of South Carolina, Ashley River Tower, Department of Radiology and Radiological Science, Charleston, SC (United States); University Medical Centre Groningen/University of Groningen, Centre for Medical Imaging - North East Netherlands, Department of Radiology, Groningen (Netherlands); Cho, Young Jun [Medical University of South Carolina, Ashley River Tower, Department of Radiology and Radiological Science, Charleston, SC (United States); Konyang University School of Medicine, Department of Radiology, Daejeon (Korea, Republic of); Spears, J.R. [Medical University of South Carolina, Ashley River Tower, Department of Radiology and Radiological Science, Charleston, SC (United States); Secchi, Francesco [Medical University of South Carolina, Ashley River Tower, Department of Radiology and Radiological Science, Charleston, SC (United States); University of Milan School of Medicine IRCCS Policlinico San Donato, Department of Medical and Surgical Sciences, Radiology Unit, Milan (Italy); Savino, Giancarlo; Marano, Riccardo; Bonomo, Lorenzo [Catholic University of the Sacred Heart, "A. Gemelli" Hospital, Department of Bioimaging and Radiological Sciences, Rome (Italy); Schoenberg, Stefan O. [University Medical Centre Mannheim, Medical Faculty Mannheim - Heidelberg University, Institute of Clinical Radiology and Nuclear Medicine, Mannheim (Germany); Apfaltrer, Paul [Medical University of South Carolina, Ashley River Tower, Department of Radiology and Radiological Science, Charleston, SC (United States); University Medical Centre Mannheim, Medical Faculty Mannheim - Heidelberg University, Institute of Clinical Radiology and Nuclear Medicine, Mannheim (Germany)

    2013-05-15

    To evaluate image quality (IQ) of low-radiation-dose paediatric cardiovascular CT angiography (CTA), comparing iterative reconstruction in image space (IRIS) and sinogram-affirmed iterative reconstruction (SAFIRE) with filtered back-projection (FBP), and to estimate the potential for further dose reductions. Forty neonates and children underwent low radiation CTA with or without ECG synchronisation. Data were reconstructed with FBP, IRIS and SAFIRE. For ECG-synchronised studies, half-dose image acquisitions were simulated. Signal noise was measured and IQ graded. Effective dose (ED) was estimated. Mean absolute and relative image noise with IRIS and full-dose SAFIRE was lower than with FBP (P < 0.001), while SNR and CNR were higher (P < 0.001). Image noise was also lower and SNR and CNR higher in half-dose SAFIRE studies compared with full- and half-dose FBP studies (P < 0.001). IQ scores were higher for IRIS, full-dose SAFIRE and half-dose SAFIRE than for full-dose FBP and higher for half-dose SAFIRE than for half-dose FBP (P < 0.05). Median weight-specific ED was 0.3 mSv without and 1.36 mSv with ECG synchronisation. The estimated ED of half-dose SAFIRE studies was 0.68 mSv. IR improves image noise, SNR, CNR and subjective IQ compared with FBP in low-radiation-dose paediatric CTA and allows further dose reductions without compromising diagnostic IQ. (orig.)

  20. Online hyperspectral imaging system for evaluating quality of agricultural products

    Science.gov (United States)

    Mo, Changyeun; Kim, Giyoung; Lim, Jongguk

    2017-06-01

    The consumption of fresh-cut agricultural produce in Korea has been growing. The browning of fresh-cut vegetables that occurs during storage and foreign substances such as worms and slugs are some of the main causes of consumers' concerns with respect to safety and hygiene. The purpose of this study is to develop an online system for evaluating the quality of agricultural products using hyperspectral imaging technology. An online evaluation system with a single visible-near-infrared hyperspectral camera covering the range of 400 nm to 1000 nm was designed to assess the quality of both surfaces of agricultural products such as fresh-cut lettuce. Algorithms to detect browning surfaces were developed for this system. The optimal wavebands for discriminating between browning and sound lettuce, as well as between browning lettuce and the conveyor belt, were investigated using correlation analysis and the one-way analysis of variance method. The imaging algorithms to discriminate browning lettuce were developed using the optimal wavebands. The ratio image (RI) algorithm of the 533 nm and 697 nm images (RI533/697) for abaxial-surface lettuce, and the ratio image algorithm (RI533/697) and subtraction image (SI) algorithm (SI538-697) for adaxial-surface lettuce, had the highest classification accuracies. The classification accuracy for browning and sound lettuce was 100.0% and above 96.0%, respectively, for both surfaces. The overall results show that the online hyperspectral imaging system could potentially be used to assess the quality of agricultural products.
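
    The band-ratio and band-subtraction rules described above reduce to simple per-pixel arithmetic on the hyperspectral cube; a sketch is given below. The decision thresholds and the toy random cube are assumptions; in the study the optimal bands and cut-offs were derived from correlation analysis and ANOVA on labelled lettuce data.

```python
import numpy as np

def browning_masks(cube, wavelengths, thr_ratio=1.2, thr_sub=0.05):
    """Classify pixels with a band-ratio (533/697 nm) and a band-subtraction
    (538 - 697 nm) rule.  The thresholds are placeholders; in practice they
    would be trained on labelled lettuce spectra."""
    wavelengths = np.asarray(wavelengths)

    def band(wl):                        # nearest acquired band to a wavelength
        return cube[..., int(np.argmin(np.abs(wavelengths - wl)))]

    ratio_image = band(533) / np.maximum(band(697), 1e-6)
    subtraction_image = band(538) - band(697)
    return ratio_image > thr_ratio, subtraction_image > thr_sub

# Toy cube: 64x64 pixels, bands every 5 nm from 400 to 1000 nm (assumption).
wavelengths = np.arange(400, 1001, 5)
cube = np.random.default_rng(8).random((64, 64, wavelengths.size))
mask_ratio, mask_sub = browning_masks(cube, wavelengths)
print("pixels flagged by ratio rule:", int(mask_ratio.sum()))
```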

  1. Tool development for organ dose optimization taking into account the image quality in Computed Tomography

    International Nuclear Information System (INIS)

    Adrien-Decoene, Camille

    2015-01-01

    Due to the significant rise of computed tomography (CT) exams in the past few years and the increase of the collective dose due to medical exams, dose estimation in CT imaging has become a major public health issue. However, dose optimization cannot be considered without taking into account the image quality, which has to be good enough for radiologists. In clinical practice, optimization is obtained through empirical indices and image quality measurements performed on specific phantoms such as the CATPHAN. Based on this kind of information, it is thus difficult to correctly optimize protocols regarding organ doses and radiologist criteria. Therefore our goal is to develop a tool allowing the optimization of the patient dose while preserving the image quality needed for diagnosis. The work is divided into two main parts: (i) the development of a Monte Carlo dose simulator based on the PENELOPE code, and (ii) the assessment of an objective image quality criterion. For that purpose, the GE Lightspeed VCT 64 CT tube was modelled with information provided by the manufacturer technical note and by adapting the method proposed by Turner et al (Med. Phys. 36: 2154-2164). The axial and helical movements of the X-ray tube were then implemented into the MC tool. To improve the efficiency of the simulation, two variance reduction techniques were used: a circular and a translational splitting. The splitting algorithms allow a uniform particle distribution along the gantry path to simulate the continuous gantry motion in a discrete way. Validations were performed in homogeneous conditions using a home-made phantom and the well-known CTDI phantoms. Then, dose values were measured in a CIRS ATOM anthropomorphic phantom using both optically stimulated luminescence dosimeters for point doses and XR-QA Gafchromic films for relative dose maps. Comparisons between measured and simulated values enabled us to validate the MC tool used for dosimetric purposes. Finally, organ doses for

  2. Nonlinear image blending for dual-energy MDCT of the abdomen: can image quality be preserved if the contrast medium dose is reduced?

    Science.gov (United States)

    Mileto, Achille; Ramirez-Giraldo, Juan Carlos; Marin, Daniele; Alfaro-Cordoba, Marcela; Eusemann, Christian D; Scribano, Emanuele; Blandino, Alfredo; Mazziotti, Silvio; Ascenti, Giorgio

    2014-10-01

    The objective of this study was to compare the image quality of a dual-energy nonlinear image blending technique at reduced load of contrast medium with a simulated 120-kVp linear blending technique at a full dose during portal venous phase MDCT of the abdomen. Forty-five patients (25 men, 20 women; mean age, 65.6 ± 9.7 [SD] years; mean body weight, 74.9 ± 12.4 kg) underwent contrast-enhanced single-phase dual-energy CT of the abdomen by a random assignment to one of three different contrast medium (iomeprol 400) dose injection protocols: 1.3, 1.0, or 0.65 mL/kg of body weight. The contrast-to-noise ratio (CNR) and noise at the portal vein, liver, aorta, and kidney were compared among the different datasets using the ANOVA. Three readers qualitatively assessed all datasets in a blinded and independent fashion. Nonlinear blended images at a 25% reduced dose allowed a significant improvement in CNR (p < 0.05 for all comparisons), compared with simulated 120-kVp linear blended images at a full dose. No statistically significant difference existed in CNR and noise between the nonlinear blended images at a 50% reduced dose and the simulated 120-kVp linear blended images at a full dose. Nonlinear blended images at a 50% reduced dose were considered in all cases to have acceptable image quality. The dual-energy nonlinear image blending technique allows reducing the dose of contrast medium up to 50% during portal venous phase imaging of the abdomen while preserving image quality.
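
    A common way to realise such nonlinear blending is a sigmoid weighting driven by the low-kV value, so that strongly enhancing voxels take more of the high-contrast low-kV data while the rest of the image keeps the lower noise of the high-kV data; the sketch below shows that idea next to fixed-weight linear blending. The centre/width parameters and the 0.3/0.7 linear weights are illustrative assumptions, not the settings used in the study.

```python
import numpy as np

def sigmoid_blend(low_kv, high_kv, center_hu=150.0, width_hu=100.0):
    """One common family of nonlinear blending: a sigmoid weight favours the
    low-kV data in strongly enhancing voxels and the high-kV data elsewhere.
    `center_hu` and `width_hu` are illustrative parameters only."""
    w = 1.0 / (1.0 + np.exp(-(low_kv - center_hu) / width_hu))
    return w * low_kv + (1.0 - w) * high_kv

def linear_blend(low_kv, high_kv, w=0.3):
    """Fixed-weight linear blending (a 0.3/0.7 mix is often used to
    approximate a 120-kVp image; treat the value as an assumption)."""
    return w * low_kv + (1.0 - w) * high_kv

# Toy usage with synthetic HU maps standing in for 80-kV and 140-kV images.
rng = np.random.default_rng(9)
low_kv = rng.normal(120.0, 25.0, (64, 64))     # noisier, higher iodine contrast
high_kv = rng.normal(90.0, 10.0, (64, 64))     # quieter, lower contrast
nonlinear = sigmoid_blend(low_kv, high_kv)
linear = linear_blend(low_kv, high_kv)
```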

  3. Study on efficiency of time computation in x-ray imaging simulation base on Monte Carlo algorithm using graphics processing unit

    International Nuclear Information System (INIS)

    Setiani, Tia Dwi; Suprijadi; Haryanto, Freddy

    2016-01-01

    Monte Carlo (MC) is one of the powerful techniques for simulation in x-ray imaging. The MC method can simulate radiation transport within matter with high accuracy and provides a natural way to simulate radiation transport in complex systems. One of the codes based on the MC algorithm that is widely used for radiographic image simulation is MC-GPU, a code developed by Andreu Badal. This study aimed to investigate the computation time of x-ray imaging simulation on a GPU (Graphics Processing Unit) compared to a standard CPU (Central Processing Unit). Furthermore, the effect of physical parameters on the quality of the radiographic images and a comparison of the image quality resulting from simulation on the GPU and CPU are evaluated in this paper. The simulations were run on a CPU in serial mode and on two GPUs with 384 cores and 2304 cores. In the GPU simulations, each core calculates one photon, so a large number of photons are calculated simultaneously. Results show that simulation times on the GPU were significantly shorter than on the CPU. The simulations on the 2304-core GPU were about 64-114 times faster than on the CPU, while the simulations on the 384-core GPU were about 20-31 times faster than on a single CPU core. Another result shows that optimum image quality from the simulation was obtained for photon histories starting from 10^8 and energies from 60 keV to 90 keV. Analyzed by a statistical approach, the quality of the GPU and CPU images is relatively the same.

  4. Study on efficiency of time computation in x-ray imaging simulation base on Monte Carlo algorithm using graphics processing unit

    Energy Technology Data Exchange (ETDEWEB)

    Setiani, Tia Dwi, E-mail: tiadwisetiani@gmail.com [Computational Science, Faculty of Mathematics and Natural Sciences, Institut Teknologi Bandung Jalan Ganesha 10 Bandung, 40132 (Indonesia); Suprijadi [Computational Science, Faculty of Mathematics and Natural Sciences, Institut Teknologi Bandung Jalan Ganesha 10 Bandung, 40132 (Indonesia); Nuclear Physics and Biophysics Research Division, Faculty of Mathematics and Natural Sciences, Institut Teknologi Bandung Jalan Ganesha 10 Bandung, 40132 (Indonesia); Haryanto, Freddy [Nuclear Physics and Biophysics Research Division, Faculty of Mathematics and Natural Sciences, Institut Teknologi Bandung Jalan Ganesha 10 Bandung, 40132 (Indonesia)

    2016-03-11

    Monte Carlo (MC) is one of the powerful techniques for simulation in x-ray imaging. The MC method can simulate radiation transport within matter with high accuracy and provides a natural way to simulate radiation transport in complex systems. One of the codes based on the MC algorithm that is widely used for radiographic image simulation is MC-GPU, a code developed by Andreu Badal. This study aimed to investigate the computation time of x-ray imaging simulation on a GPU (Graphics Processing Unit) compared to a standard CPU (Central Processing Unit). Furthermore, the effect of physical parameters on the quality of the radiographic images and a comparison of the image quality resulting from simulation on the GPU and CPU are evaluated in this paper. The simulations were run on a CPU in serial mode and on two GPUs with 384 cores and 2304 cores. In the GPU simulations, each core calculates one photon, so a large number of photons are calculated simultaneously. Results show that simulation times on the GPU were significantly shorter than on the CPU. The simulations on the 2304-core GPU were about 64-114 times faster than on the CPU, while the simulations on the 384-core GPU were about 20-31 times faster than on a single CPU core. Another result shows that optimum image quality from the simulation was obtained for photon histories starting from 10^8 and energies from 60 keV to 90 keV. Analyzed by a statistical approach, the quality of the GPU and CPU images is relatively the same.

  5. Variation in the quality of CT images of the upper abdomen when CT automatic exposure control is employed

    International Nuclear Information System (INIS)

    Aizawa, Isao; Muramatsu, Yoshihisa; Nomura, Keiichi; Shimizu, Fuminori

    2010-01-01

    The aim of this study was to analyze the reasons for variation in image quality in upper abdominal CT when CT automatic exposure control (AEC) is used. The technique investigated was 3D modulation on a 16-row multidetector CT (MDCT), and a lung cancer screening CT (LSCT) phantom was used to simulate the patient. When a phase difference was present, an increase in image noise of up to approximately 15% was observed. It is concluded that the major cause of variation in image quality is respiratory motion, and the importance of respiration control must be recognized. (author)

  6. MO-DE-209-03: Assessing Image Quality

    Energy Technology Data Exchange (ETDEWEB)

    Zhao, W. [Stony Brook Medicine (United States)]

    2016-06-15

    Digital Breast Tomosynthesis (DBT) is rapidly replacing mammography as the standard of care in breast cancer screening and diagnosis. DBT is a form of computed tomography, in which a limited set of projection images are acquired over a small angular range and reconstructed into tomographic data. The angular range varies from 15° to 50° and the number of projections varies between 9 and 25, as determined by the equipment manufacturer. It is equally valid to treat DBT as the digital analog of classical tomography – that is, linear tomography. In fact, the name “tomosynthesis” stands for “synthetic tomography.” DBT shares many common features with classical tomography, including the radiographic appearance, dose, and image quality considerations. As such, both the science and the practical physics of DBT systems are a hybrid between computed tomography and classical tomographic methods. In this lecture, we will explore the continuum from radiography to computed tomography to illustrate the characteristics of DBT. This lecture will consist of four presentations that will provide a complete overview of DBT, including a review of the fundamentals of DBT acquisition, a discussion of DBT reconstruction methods, an overview of dosimetry for DBT systems, and a summary of the underlying image theory of DBT, thereby relating image quality and dose. Learning Objectives: To understand the fundamental principles behind tomosynthesis image acquisition. To understand the fundamentals of tomosynthesis image reconstruction. To learn the determinants of image quality and dose in DBT, including measurement techniques. To learn the image theory underlying tomosynthesis, and the relationship between dose and image quality. ADM is a consultant to, and holds stock in, Real Time Tomography, LLC. ADM receives research support from Hologic Inc., Analogic Inc., and Barco NV; ADM is a member of the Scientific Advisory Board for Gamma Medica Inc.; A. Maidment, Research Support

  7. MO-DE-209-03: Assessing Image Quality

    International Nuclear Information System (INIS)

    Zhao, W.

    2016-01-01

    Digital Breast Tomosynthesis (DBT) is rapidly replacing mammography as the standard of care in breast cancer screening and diagnosis. DBT is a form of computed tomography, in which a limited set of projection images are acquired over a small angular range and reconstructed into tomographic data. The angular range varies from 15° to 50° and the number of projections varies between 9 and 25, as determined by the equipment manufacturer. It is equally valid to treat DBT as the digital analog of classical tomography – that is, linear tomography. In fact, the name “tomosynthesis” stands for “synthetic tomography.” DBT shares many common features with classical tomography, including the radiographic appearance, dose, and image quality considerations. As such, both the science and the practical physics of DBT systems are a hybrid between computed tomography and classical tomographic methods. In this lecture, we will explore the continuum from radiography to computed tomography to illustrate the characteristics of DBT. This lecture will consist of four presentations that will provide a complete overview of DBT, including a review of the fundamentals of DBT acquisition, a discussion of DBT reconstruction methods, an overview of dosimetry for DBT systems, and a summary of the underlying image theory of DBT, thereby relating image quality and dose. Learning Objectives: To understand the fundamental principles behind tomosynthesis image acquisition. To understand the fundamentals of tomosynthesis image reconstruction. To learn the determinants of image quality and dose in DBT, including measurement techniques. To learn the image theory underlying tomosynthesis, and the relationship between dose and image quality. ADM is a consultant to, and holds stock in, Real Time Tomography, LLC. ADM receives research support from Hologic Inc., Analogic Inc., and Barco NV; ADM is a member of the Scientific Advisory Board for Gamma Medica Inc.; A. Maidment, Research Support

  8. Quality assurance for MR stereotactic imaging for three Siemens scanners

    International Nuclear Information System (INIS)

    Kozubikova, P.; Novotny, J. Jr.; Kulhova, K.; Mihalova, P.; Tamasova, J.; Veselsk, T.

    2014-01-01

    Quality assurance of stereotactic imaging, especially with MRI (magnetic resonance imaging), is a complex issue. It can be divided into the basic verification and commissioning of a new scanner or a new MRI scanning protocol being introduced into clinical practice, and the routine quality assurance performed for each individual radiosurgical case. The aim of this study was to assess geometric distortion in MRI with a special PTGR (Physikalisch-Technische Gesellschaft fuer Radiologie GmbH, Tuebingen, Germany) target phantom. The PTGR phantom consists of 21 three-dimensional cross-hairs filled with contrast medium. The cross-hairs are positioned at known Leksell coordinates with a precision better than 0.1 mm and cover the whole stereotactic space. The phantom can be fixed in the Leksell stereotactic frame, so stereotactic imaging procedures can be reproduced following exactly the same steps as for a real patient, including the stereotactic image definition in Leksell GammaPlan. Since the geometric position (stereotactic coordinates) of each cross-hair is known from the construction of the phantom, it can be compared with the coordinates actually measured on the stereotactic MRI. Deviations between expected and measured coordinates provide information about the level of distortion. The measured distortions demonstrated satisfactory accuracy for stereotactic localization on the 1.5 T Siemens Magnetom Avanto, Siemens Magnetom Symphony, and 3 T Siemens Magnetom Skyra scanners (Na Homolce Hospital, Prague). For the standard imaging protocol (T1-weighted 3D images), the mean distortions for these scanners were 0.8 mm, 1.1 mm and 1.1 mm, and the maximum distortions were 1.3 mm, 1.9 mm and 2.2 mm, respectively. A dependence of the distortions on slice orientation and type of imaging protocol was detected. Image distortions are also a property of each particular scanner; the worst distortions were observed for the 3T
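
    The distortion figures quoted above are simply deviations between the cross-hair positions known from the phantom's construction and the positions measured on the stereotactic MRI. A minimal bookkeeping sketch follows; the Leksell coordinates used here are invented for illustration, not taken from the study.

    import numpy as np

    # Known cross-hair positions from the phantom construction (Leksell coordinates, mm).
    expected = np.array([[100.0, 100.0, 100.0],
                         [ 60.0, 140.0,  90.0],
                         [140.0,  60.0, 110.0]])
    # Positions measured on the stereotactic MR images (placeholder values).
    measured = np.array([[100.6, 100.2, 100.1],
                         [ 60.3, 140.9,  89.5],
                         [139.2,  60.4, 110.8]])

    deviation = np.linalg.norm(measured - expected, axis=1)  # per cross-hair [mm]
    print(f"mean distortion {deviation.mean():.1f} mm, max {deviation.max():.1f} mm")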

  9. Correlation of bone quality in radiographic images with clinical bone quality classification

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Hyun Woo; Huh, Kyung Hoe; Kim, Jeong Hwa; Yi, Won Jin; Heo, Min Suk; Lee, Sam Sun; Choi, Soon Chul [Seoul National University, Seoul (Korea, Republic of); Park, Kwan Soo [Inje University, Seoul (Korea, Republic of)

    2006-03-15

    To investigate the validity of digital image processing of panoramic radiographs for estimating bone quality before endosseous dental implant installation, bone quality in radiographic images was correlated with a clinical bone quality classification. An experienced surgeon assessed and classified the bone quality of the implant sites by tactile sensation at the time of implant placement. Eighteen morphologic features of the trabecular pattern, including fractal dimension, were examined at each anatomical site on panoramic radiographs. In total, the bone quality of 67 implant sites in 42 patients was evaluated. Pearson correlation analysis showed that three morphologic parameters had a weak negative linear correlation with the clinical bone quality classification, with correlation coefficients of -0.276, -0.280, and -0.289, respectively (p<0.05), and three other morphologic parameters had a clear negative linear correlation, with correlation coefficients of -0.346, -0.488, and -0.343, respectively (p<0.05). Fractal dimension also correlated linearly with the clinical bone quality classification, with a correlation coefficient of -0.506 (p<0.05). This study suggests that fractal and morphometric analysis of digital panoramic radiographs can be used to evaluate bone quality at implant recipient sites.
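
    The reported coefficients are Pearson correlations between each radiographic feature and the surgeon's bone quality class. A hedged sketch of that calculation for the fractal dimension is shown below; the feature values and class labels are placeholders, not the study data.

    import numpy as np
    from scipy import stats

    # Placeholder data: fractal dimension per implant site and the clinical
    # bone quality class (1 = densest ... 4 = least dense) assigned by the surgeon.
    fractal_dimension = np.array([1.42, 1.38, 1.35, 1.31, 1.29, 1.25, 1.22, 1.18])
    bone_quality_class = np.array([1, 1, 2, 2, 3, 3, 4, 4])

    r, p = stats.pearsonr(fractal_dimension, bone_quality_class)
    print(f"r = {r:.3f}, p = {p:.3f}")  # a negative r mirrors the reported -0.506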

  10. Hybrid simulation using mixed reality for interventional ultrasound imaging training.

    Science.gov (United States)

    Freschi, C; Parrini, S; Dinelli, N; Ferrari, M; Ferrari, V

    2015-07-01

    Ultrasound (US) imaging offers advantages over other imaging modalities and has become the most widespread modality for many diagnostic and interventional procedures. However, traditional 2D US requires a long training period, especially to learn how to manipulate the probe. A hybrid interactive system based on mixed reality was designed, implemented and tested for hand-eye coordination training in diagnostic and interventional US. The hybrid simulator was developed by integrating a physical US phantom with a software application providing a 3D virtual scene. In this scene, a 3D model of the probe with its associated scan plane is displayed coherently alongside a 3D representation of the phantom's internal structures. An evaluation study of the diagnostic module was performed by recruiting thirty-six novices and four experts. The performance of novices trained with the hybrid simulator (HG) was compared with that of novices trained with the physical simulator alone (PG). After the training session, each novice was required to visualize a particular target structure. The four experts completed a 5-point Likert scale questionnaire. Seventy-eight percent of the HG novices successfully visualized the target structure, whereas only 45% of the PG novices reached this goal. The mean scores from the questionnaires were 5.00 for usefulness, 4.25 for ease of use, 4.75 for 3D perception, and 3.25 for phantom realism. The hybrid US training simulator provides ease of use and is effective as a hand-eye coordination teaching tool. Mixed reality can improve US probe manipulation training.
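
    The 78% versus 45% visualization success rates can be compared with a standard test for two proportions. The sketch below assumes an even 18/18 split of the 36 novices between the hybrid (HG) and physical (PG) groups, which the abstract does not state explicitly.

    from scipy.stats import fisher_exact

    hg_success, hg_total = 14, 18   # ~78% of the hybrid group (assumed split)
    pg_success, pg_total = 8, 18    # ~45% of the physical group (assumed split)

    table = [[hg_success, hg_total - hg_success],
             [pg_success, pg_total - pg_success]]
    odds_ratio, p_value = fisher_exact(table)
    print(f"odds ratio {odds_ratio:.2f}, p = {p_value:.3f}")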

  11. Sensitometric properties and image quality of radiographic film and paper

    International Nuclear Information System (INIS)

    Domanus, J.C.

    1985-09-01

    When using X-ray film or radiographic paper for industrial applications, one is interested in knowing not only their sensitometric properties (such as speed and contrast) but also the image quality obtainable with a particular brand of film or paper. Although standard methods for testing both properties separately are available, a method that permits the assessment of all the relevant properties together is desirable. The sensitometric properties are usually determined at constant kilovoltage and filtration at the X-ray tube, whereas radiographic image quality is assessed for different thicknesses of the examined material. The constant exposure technique can be used to compare both the sensitometric properties and the image quality of different radiographic materials. It consists of exposing different film or paper brands at a chosen, constant mA·min exposure when testing radiographic image quality for different thicknesses of a given material. From the results obtained with the constant exposure technique, conclusions are drawn about its applicability as a standard method for assessing radiographic film and paper. (author)

  12. Effect of exercise supplementation on dipyridamole thallium-201 image quality

    International Nuclear Information System (INIS)

    Stern, S.; Greenberg, I.D.; Corne, R.

    1991-01-01

    To determine the effect of different types of exercise supplementation on dipyridamole thallium image quality, 78 patients were prospectively randomized to one of three protocols: dipyridamole infusion alone, dipyridamole supplemented with isometric handgrip, and dipyridamole with low-level treadmill exercise. Heart-to-lung, heart-to-liver, and heart-to-adjacent infradiaphragmatic activity ratios were generated from anterior images acquired immediately following the test. Additionally, heart-to-total infradiaphragmatic activity was graded semiquantitatively. Results showed a significantly higher ratio of heart to subdiaphragmatic activity in the treadmill group as compared with dipyridamole alone (p less than 0.001) and dipyridamole supplemented with isometric handgrip exercise (p less than 0.001). No significant difference was observed between patients receiving the dipyridamole infusion, and dipyridamole supplemented with isometric handgrip exercise. The authors conclude that low-level treadmill exercise supplementation of dipyridamole infusion is an effective means of improving image quality. Supplementation with isometric handgrip does not improve image quality over dipyridamole alone
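
    The ratios in this study are simple region-of-interest (ROI) count ratios taken from the anterior image acquired immediately after the test. A minimal sketch with placeholder mean counts is given below; the count values are invented for illustration.

    # Placeholder mean ROI counts from the anterior thallium-201 image.
    heart_counts = {"dipyridamole": 410.0, "handgrip": 425.0, "treadmill": 450.0}
    infradiaphragmatic_counts = {"dipyridamole": 390.0, "handgrip": 380.0, "treadmill": 290.0}

    for protocol, heart in heart_counts.items():
        ratio = heart / infradiaphragmatic_counts[protocol]
        print(f"{protocol}: heart-to-infradiaphragmatic ratio = {ratio:.2f}")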

  13. TL dosimetry for quality control of CR mammography imaging systems

    Science.gov (United States)

    Gaona, E.; Nieto, J. A.; Góngora, J. A. I. D.; Arreola, M.; Enríquez, J. G. F.

    The aim of this work is to estimate the average glandular dose with thermoluminescent (TL) dosimetry and to compare it with image quality in computed radiography (CR) mammography. For dose measurement, the Food and Drug Administration (FDA) and the American College of Radiology (ACR) use a phantom, so that dose and image quality are assessed with the same test object. Mammography is a radiological imaging technique used to visualize early biological manifestations of breast cancer. Digital systems use two types of image-capturing devices: full-field digital mammography (FFDM) and CR mammography. In Mexico, several CR mammography systems are in clinical use, but only one system has been approved by the FDA. CR mammography uses a photostimulable phosphor (PSP) detector system. Most CR plates are made of 85% BaFBr and 15% BaFI doped with europium (Eu), commonly called barium fluorohalide. We carried out an exploratory survey of six CR mammography units from three different manufacturers and six dedicated X-ray mammography units with fully automatic exposure. The results show that three CR mammography units (50%) deliver a dose greater than 3.0 mGy without demonstrating improved image quality. The differences between the average doses from the TLD system and from an ionization chamber dosimeter are less than 10%. The TLD system is therefore a good option for measuring the average glandular dose for X-ray beams with the HVL (0.35-0.38 mm Al) and kVp (24-26) used in quality control procedures with the ACR Mammography Accreditation Phantom.
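
    The two quantitative checks in this survey, agreement between TLD and ionization chamber doses and the 3.0 mGy reference level, reduce to simple arithmetic per unit. The dose values below are placeholders chosen only to illustrate the bookkeeping, not the study's measurements.

    # Placeholder average glandular doses per CR mammography unit (mGy).
    tld_dose = [2.1, 2.8, 3.4, 3.1, 2.4, 3.6]
    chamber_dose = [2.0, 2.6, 3.2, 3.3, 2.3, 3.4]

    for unit, (d_tld, d_ic) in enumerate(zip(tld_dose, chamber_dose), start=1):
        diff_pct = 100.0 * abs(d_tld - d_ic) / d_ic
        flag = "above 3.0 mGy" if d_tld > 3.0 else "ok"
        print(f"unit {unit}: TLD {d_tld:.1f} mGy, chamber {d_ic:.1f} mGy, "
              f"difference {diff_pct:.1f}% ({flag})")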

  14. The Importance of Store Image and Retail Service Quality in Private Brand Image-Building

    Directory of Open Access Journals (Sweden)

    Adi Alić

    2017-03-01

    Objective: The purpose of this research is to highlight the role which store image and retail service quality can play in private brand image-building in the context of an emerging market in South-Eastern Europe (i.e. Bosnia and Herzegovina). We address this issue by seeking answers to the following questions: (1) Does a ‘halo effect’ take place between the store image and the private brand image? (2) How does the consumer’s evaluation of the quality of the service delivered by a retailer affect the image of its private brand? Research Design & Methods: Data were collected through a field survey using the store-intercept method. The sample consisted of 699 customers of two large retail chains. The data were analysed using structural equation modelling. Findings: The findings of the present study suggest that store image and retail service quality are important factors in the formation of the image of private-branded products. Implications & Recommendations: This study offers some important insights for retailers who intend to develop their private brand. First, the image transfer from the store to the private brand suggests that retailers should consider the introduction of a private brand as a brand extension, with their stores as the parent brand. Second, we recommend that retailers put more emphasis on quality-improvement initiatives related to store environment attributes. Contribution & Value Added: This study enhances the discussion of private branding by analysing the store-level factors which underpin the formation of private brand image in the context of less developed European markets.

  15. METHOD OF IMAGE QUALITY ENHANCEMENT FOR SPACE OBJECTS

    Directory of Open Access Journals (Sweden)

    D. S. Korshunov

    2014-07-01

    The paper deals with an approach for improving the image quality of space objects in the visible range of the electromagnetic spectrum. The proposed method jointly takes into account the motion velocities of the space surveillance apparatus and of the space object observed in near-Earth space when the photodetector exposure time is chosen. The exposure timing is based on the light-signal characteristics, which determine the optimal charge packet formed in the irradiated charge-coupled device. In this way the parameters of the onboard observation equipment can be selected so as to provide space images suitable for interpretation. Linear resolution is used as the quality indicator for space images, as it characterizes both the image contrast and the geometric properties of the object in the photograph. Modeling of an observation scenario in which the space object is imaged by an inspector satellite has shown that the linear resolution can be increased by 10%-20% or by 40%-50%, depending on the non-coplanarity angle of the orbital motion. The proposed approach to improving photograph quality provides sharp, high-contrast images of space objects from the optical-electronic equipment of space-based remote sensing systems. These images make it possible to detect in good time failures of space hardware arising from its operation in near-Earth space. The proposed method can also be applied at the design stage of space systems for optical-electronic surveillance, in computer models used to assess the performance of the imaging equipment's information channel.

  16. Image quality assessment for selfies with and without super resolution

    Science.gov (United States)

    Kubota, Aya; Gohshi, Seiichi

    2018-04-01

    With the advent of cellphone cameras, in particular on smartphones, many people now take photos of themselves, alone or with others in the frame; such photos are popularly known as “selfies”. Most smartphones are equipped with two cameras: the camera located on the back of the smartphone is referred to as the “out-camera,” whereas the one located on the front is called the “in-camera.” In-cameras are mainly used for selfies. Some smartphones feature high-resolution cameras; however, the full image quality cannot be obtained because smartphone cameras often have low-performance lenses. Super resolution (SR) is a recent technological advancement that increases image resolution. We developed a new SR technology that can be processed on smartphones. Smartphones with the new SR technology are currently available in the market and have already registered sales. However, the effectiveness of the new SR technology has not yet been verified. Comparing the image quality with and without SR on the smartphone display is necessary to confirm the usefulness of this new technology. Methods based on both objective and subjective assessment are required to quantitatively measure image quality. It is known that typical objective assessment values, such as the peak signal-to-noise ratio (PSNR), do not correspond well with how we perceive images and video. When digital broadcasting started, the standard was determined using subjective assessment. Although subjective assessment usually comes at a high cost because of personnel expenses for observers, the results are highly reproducible when the tests are conducted under proper conditions and analysed statistically. In this study, the subjective assessment results for selfie images are reported.
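
    PSNR, the objective metric the abstract contrasts with subjective scoring, is defined from the mean squared error between a reference image and a test image. A self-contained sketch is given below; the synthetic images stand in for a selfie before and after processing and are not the study's material.

    import numpy as np

    def psnr(reference: np.ndarray, test: np.ndarray, max_value: float = 255.0) -> float:
        """Peak signal-to-noise ratio in dB between two same-sized images."""
        mse = np.mean((reference.astype(np.float64) - test.astype(np.float64)) ** 2)
        if mse == 0:
            return float("inf")  # identical images
        return 10.0 * np.log10(max_value ** 2 / mse)

    # Synthetic 8-bit greyscale example: an image and a noisy copy of it.
    rng = np.random.default_rng(1)
    original = rng.integers(0, 256, size=(480, 640), dtype=np.uint8)
    noisy = np.clip(original + rng.normal(0, 5, original.shape), 0, 255).astype(np.uint8)
    print(f"PSNR = {psnr(original, noisy):.1f} dB")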

  17. Quality assessment of butter cookies applying multispectral imaging

    Science.gov (United States)

    Andresen, Mette S; Dissing, Bjørn S; Løje, Hanne

    2013-01-01

    A method for characterizing butter cookie quality by assessing surface browning and water content using multispectral images is presented. Based on evaluations of the browning of butter cookies, the cookies were manually divided into groups. From this categorization, reference values were calculated for a statistical prediction model correlating the multispectral images with a browning score. The browning score is modelled as a function of oven temperature and baking time and is presented as a quadratic response surface. The investigated process window covered baking times of 4–16 min and oven temperatures of 160–200°C in a forced-convection electrically heated oven. In addition to the browning score, a model for predicting the average water content based on the same images is presented. This shows how multispectral images of butter cookies may be used for the assessment of different quality parameters. Statistical
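
    The browning-score model described above is a quadratic response surface in baking time and oven temperature. A hedged sketch of such a fit with ordinary least squares follows; the nine time/temperature settings and scores are invented for illustration and do not reproduce the study's measurements.

    import numpy as np

    # Placeholder design points inside the 4-16 min, 160-200 deg C process window.
    time_min = np.array([4, 4, 10, 10, 16, 16, 10, 4, 16], dtype=float)
    temp_C = np.array([160, 200, 160, 200, 160, 200, 180, 180, 180], dtype=float)
    browning = np.array([0.5, 1.8, 1.2, 3.1, 2.0, 4.6, 2.3, 1.0, 3.4])  # invented scores

    # Full quadratic model: b0 + b1*t + b2*T + b3*t*T + b4*t^2 + b5*T^2
    X = np.column_stack([np.ones_like(time_min), time_min, temp_C,
                         time_min * temp_C, time_min**2, temp_C**2])
    coeffs, *_ = np.linalg.lstsq(X, browning, rcond=None)
    print("fitted coefficients:", np.round(coeffs, 4))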