WorldWideScience

Sample records for carlo tomographic reconstruction

  1. Fully 3D tomographic reconstruction by Monte Carlo simulation of the system matrix in preclinical PET with iodine 124

    International Nuclear Information System (INIS)

    Moreau, Matthieu

    2014-01-01

    Immuno-PET imaging can be used to assess pharmacokinetics in radioimmunotherapy. When using iodine-124, quantitative PET imaging is limited by physics-based degrading factors within the detection system and the object, such as the long positron range in water and the complex gamma-photon spectrum. The objective of this thesis was to develop a fully 3D tomographic reconstruction method (S(MC)2PET) using Monte Carlo simulations to estimate the system matrix, in the context of preclinical imaging with iodine-124. The Monte Carlo simulation platform GATE was used for this purpose. System matrices of several levels of complexity were calculated, each including at least a model of the PET system response function. Physics processes in the object were either neglected or taken into account using a precise or a simplified object description. The impact of modelling refinement and of the statistical variance of the system matrix elements was evaluated on the final reconstructed images. These studies showed that a high level of complexity did not always improve qualitative and quantitative results, owing to the high variance of the associated system matrices. (author)
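
    The core idea of a Monte Carlo-estimated system matrix can be illustrated very roughly: simulate many emissions from each voxel, count how often each line of response records them, and normalize. The sketch below uses a toy, hypothetical detection model in place of the GATE physics described in the abstract; all names and parameters are illustrative.

```python
import numpy as np

def mc_system_matrix(n_voxels, n_lors, detect_model, n_events=100_000, seed=0):
    """Estimate system-matrix elements a[i, j] = P(event emitted in voxel j
    is detected in line of response i) by simple Monte Carlo counting.

    detect_model(j, rng, n) must return an array of LOR indices (or -1 for
    'not detected') for n emissions from voxel j -- a stand-in for a full
    physics simulation such as GATE.
    """
    rng = np.random.default_rng(seed)
    A = np.zeros((n_lors, n_voxels))
    for j in range(n_voxels):
        lors = detect_model(j, rng, n_events)         # simulated detections
        lors = lors[lors >= 0]                        # drop undetected events
        counts = np.bincount(lors, minlength=n_lors)
        A[:, j] = counts / n_events                   # MC estimate of a[i, j]
    return A

# Hypothetical toy model: voxel j is mostly seen by LOR j, with some blur.
def toy_model(j, rng, n):
    detected = rng.random(n) < 0.3                    # assumed 30% detection efficiency
    lor = np.clip(np.round(j + rng.normal(0.0, 1.0, n)), 0, 9).astype(int)
    return np.where(detected, lor, -1)

A = mc_system_matrix(n_voxels=10, n_lors=10, detect_model=toy_model)
print(A.shape, A.sum(axis=0))   # per-voxel detection efficiency estimates (~0.3)
```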

  2. Industrial dynamic tomographic reconstruction

    International Nuclear Information System (INIS)

    Oliveira, Eric Ferreira de

    2016-01-01

    The state-of-the-art methods applied to industrial processes are currently based on the principles of classical tomographic reconstruction developed for static distributions, or are limited to cases of low variability of the density distribution of the tomographed object. Noise and motion artifacts are the main problems caused by a mismatch in the data from views acquired at different instants. These add to the known fact that using a limited amount of data can result in noise, artifacts and inconsistencies with the distribution under study. One objective of the present work is to discuss the difficulties that arise when reconstruction algorithms originally developed for static distributions are applied to dynamic tomography. Another objective is to propose solutions that reduce the temporal information loss caused by employing regular acquisition systems for dynamic processes. With respect to dynamic image reconstruction, a comparison was conducted between different static reconstruction methods, such as MART and FBP, when used for dynamic scenarios. This comparison was based on an MCNPx simulation as well as an analytical setup of an aluminum cylinder that moves along the section of a riser during the acquisition process, and also on cross-section images from CFD techniques. As for the adaptation of current tomographic acquisition systems to dynamic processes, this work established a sequence of tomographic views in a just-in-time fashion for visualization purposes, a way of visually displaying density information as soon as it becomes amenable to image reconstruction. A third contribution was to take advantage of the triple color channel used to display colored images on most displays, so that, by appropriately scaling the acquired values of each view in the linear system of the reconstruction, it was possible to imprint a temporal trace into the regularly

  3. Segmentation-Driven Tomographic Reconstruction

    DEFF Research Database (Denmark)

    Kongskov, Rasmus Dalgas

    The tomographic reconstruction problem is concerned with creating a model of the interior of an object from some measured data, typically projections of the object. After reconstructing an object it is often desired to segment it, either automatically or manually. For computed tomography (CT)... such that the segmentation subsequently can be carried out by use of a simple segmentation method, for instance just a thresholding method. We tested the advantages of going from a two-stage reconstruction method to a one-stage segmentation-driven reconstruction method for the phase contrast tomography reconstruction...

  4. Inverse Monte Carlo: a unified reconstruction algorithm for SPECT

    International Nuclear Information System (INIS)

    Floyd, C.E.; Coleman, R.E.; Jaszczak, R.J.

    1985-01-01

    Inverse Monte Carlo (IMOC) is presented as a unified reconstruction algorithm for Emission Computed Tomography (ECT) providing simultaneous compensation for scatter, attenuation, and the variation of collimator resolution with depth. The technique of inverse Monte Carlo is used to find an inverse solution to the photon transport equation (an integral equation for photon flux from a specified source) for a parameterized source and specific boundary conditions. The system of linear equations so formed is solved to yield the source activity distribution for a set of acquired projections. For the studies presented here, the equations are solved using the EM (maximum likelihood) algorithm, although other solution algorithms, such as least squares, could be employed. While the present results specifically consider the reconstruction of camera-based Single Photon Emission Computed Tomography (SPECT) images, the technique is equally valid for Positron Emission Tomography (PET) if a Monte Carlo model of such a system is used. As a preliminary evaluation, experimentally acquired SPECT phantom studies for imaging Tc-99m (140 keV) are presented, which demonstrate quantitative compensation for scatter and attenuation for a two-dimensional (single-slice) reconstruction. The algorithm may be expanded in a straightforward manner to full three-dimensional reconstruction, including compensation for out-of-plane scatter.
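
    The EM (maximum-likelihood) solution of the linear system mentioned above is, in its simplest form, the familiar MLEM update x <- x * A^T(p / Ax) / A^T 1. A minimal dense-matrix sketch, with an illustrative random system standing in for the inverse Monte Carlo model:

```python
import numpy as np

def mlem(A, projections, n_iter=200, eps=1e-12):
    """Basic MLEM iteration: x <- x / (A^T 1) * A^T (p / (A x))."""
    sensitivity = A.sum(axis=0)                 # A^T 1
    x = np.ones(A.shape[1])                     # uniform initial activity estimate
    for _ in range(n_iter):
        forward = A @ x                         # expected projections
        ratio = projections / np.maximum(forward, eps)
        x *= (A.T @ ratio) / np.maximum(sensitivity, eps)
    return x

# Illustrative noiseless test system (not a real scanner model)
rng = np.random.default_rng(1)
A = rng.random((64, 16))
x_true = rng.random(16)
p = A @ x_true
x_rec = mlem(A, p)
print(float(np.linalg.norm(A @ x_rec - p) / np.linalg.norm(p)))  # residual shrinks with iterations
```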

  5. Tomographs based on non-conventional radiation sources and methods

    International Nuclear Information System (INIS)

    Barbuzza, R.; Fresno, M. del; Venere, Marcelo J.; Clausse, Alejandro; Moreno, C.

    2000-01-01

    Computer techniques for tomographic reconstruction of objects X-rayed with a compact plasma focus (PF) are presented. The implemented reconstruction algorithms are based on stochastic searching for solutions of the Radon equation, using Genetic Algorithms and Monte Carlo methods. Numerical experiments using actual projections were performed, demonstrating the feasibility of applying both methods to the tomographic reconstruction problem. (author)

  6. Tomographic reconstruction by using FPSIRT (Fast Particle System Iterative Reconstruction Technique)

    Energy Technology Data Exchange (ETDEWEB)

    Moreira, Icaro Valgueiro M.; Melo, Silvio de Barros; Dantas, Carlos; Lima, Emerson Alexandre; Silva, Ricardo Martins; Cardoso, Halisson Alberdan C., E-mail: ivmm@cin.ufpe.br, E-mail: sbm@cin.ufpe.br, E-mail: rmas@cin.ufpe.br, E-mail: hacc@cin.ufpe.br, E-mail: ccd@ufpe.br, E-mail: eal@cin.ufpe.br [Universidade Federal de Pernambuco (UFPE), Recife, PE (Brazil)

    2015-07-01

    The PSIRT (Particle System Iterative Reconstruction Technique) is a method of tomographic image reconstruction primarily designed to work with configurations suitable for industrial applications. A particle system is an optimization technique inspired by real physical systems that associates with the reconstructing material a set of particles with certain physical features, subject to a force field, which can produce movement. The system constantly updates the set of particles by repositioning them in such a way as to approach equilibrium. The elastic potential along a trajectory is a function of the difference between the attenuation coefficient in the current configuration and the corresponding input data. PSIRT has been successfully used to reconstruct simulated and real objects subject to sets of parallel and fan-beam lines at different angles, representing typical gamma-ray tomographic arrangements. One of PSIRT's limitations was its performance, too slow for real-time scenarios. This work presents a reformulation of PSIRT's computational model, which grants the new algorithm, FPSIRT (Fast Particle System Iterative Reconstruction Technique), performance up to 200 times faster than PSIRT's. A comparison of their application to real and simulated data from the HSGT, the High Speed Gamma Tomograph, is also presented. (author)

  7. Tomographic reconstruction by using FPSIRT (Fast Particle System Iterative Reconstruction Technique)

    International Nuclear Information System (INIS)

    Moreira, Icaro Valgueiro M.; Melo, Silvio de Barros; Dantas, Carlos; Lima, Emerson Alexandre; Silva, Ricardo Martins; Cardoso, Halisson Alberdan C.

    2015-01-01

    The PSIRT (Particle System Iterative Reconstruction Technique) is a method of tomographic image reconstruction primarily designed to work with configurations suitable for industrial applications. A particle system is an optimization technique inspired by real physical systems that associates with the reconstructing material a set of particles with certain physical features, subject to a force field, which can produce movement. The system constantly updates the set of particles by repositioning them in such a way as to approach equilibrium. The elastic potential along a trajectory is a function of the difference between the attenuation coefficient in the current configuration and the corresponding input data. PSIRT has been successfully used to reconstruct simulated and real objects subject to sets of parallel and fan-beam lines at different angles, representing typical gamma-ray tomographic arrangements. One of PSIRT's limitations was its performance, too slow for real-time scenarios. This work presents a reformulation of PSIRT's computational model, which grants the new algorithm, FPSIRT (Fast Particle System Iterative Reconstruction Technique), performance up to 200 times faster than PSIRT's. A comparison of their application to real and simulated data from the HSGT, the High Speed Gamma Tomograph, is also presented. (author)

  8. Preparation and tomographic reconstruction of an arbitrary single-photon path qubit

    International Nuclear Information System (INIS)

    Baek, So-Young; Kim, Yoon-Ho

    2011-01-01

    We report methods for preparation and tomographic reconstruction of an arbitrary single-photon path qubit. The arbitrary single-photon path qubit is prepared losslessly by passing the heralded single-photon state from spontaneous parametric down-conversion through a variable beam splitter. Quantum state tomography of the single-photon path qubit is implemented by introducing path-projection measurements based on first-order single-photon quantum interference. Using these state preparation and path-projection measurement methods, we demonstrate preparation and complete tomographic reconstruction of the single-photon path qubit with arbitrary purity. -- Highlights: → We report methods for preparation and tomographic reconstruction of an arbitrary single-photon path qubit. → We implement path-projection measurements based on first-order single-photon quantum interference. → We demonstrate preparation and complete tomographic reconstruction of the single-photon path qubit with arbitrary purity.

  9. Industrial dynamic tomographic reconstruction; Reconstrucao tomografica dinamica industrial

    Energy Technology Data Exchange (ETDEWEB)

    Oliveira, Eric Ferreira de

    2016-07-01

    The state-of-the-art methods applied to industrial processes are currently based on the principles of classical tomographic reconstruction developed for static distributions, or are limited to cases of low variability of the density distribution of the tomographed object. Noise and motion artifacts are the main problems caused by a mismatch in the data from views acquired at different instants. These add to the known fact that using a limited amount of data can result in noise, artifacts and inconsistencies with the distribution under study. One objective of the present work is to discuss the difficulties that arise when reconstruction algorithms originally developed for static distributions are applied to dynamic tomography. Another objective is to propose solutions that reduce the temporal information loss caused by employing regular acquisition systems for dynamic processes. With respect to dynamic image reconstruction, a comparison was conducted between different static reconstruction methods, such as MART and FBP, when used for dynamic scenarios. This comparison was based on an MCNPx simulation as well as an analytical setup of an aluminum cylinder that moves along the section of a riser during the acquisition process, and also on cross-section images from CFD techniques. As for the adaptation of current tomographic acquisition systems to dynamic processes, this work established a sequence of tomographic views in a just-in-time fashion for visualization purposes, a way of visually displaying density information as soon as it becomes amenable to image reconstruction. A third contribution was to take advantage of the triple color channel used to display colored images on most displays, so that, by appropriately scaling the acquired values of each view in the linear system of the reconstruction, it was possible to imprint a temporal trace into the regularly

  10. Tomographic image reconstruction using training images

    DEFF Research Database (Denmark)

    Soltani, Sara; Andersen, Martin Skovgaard; Hansen, Per Christian

    2017-01-01

    We describe and examine an algorithm for tomographic image reconstruction where prior knowledge about the solution is available in the form of training images. We first construct a non-negative dictionary based on prototype elements from the training images; this problem is formulated within...

  11. Tomographic ventricular reconstruction using multiple view first-pass radionuclide angiography

    International Nuclear Information System (INIS)

    Lacy, J.L.; Ball, M.E.; Verani, M.S.; Wiles, H.; Roberts, R.

    1985-01-01

    In first-pass radionuclide angiography (FPRA), images of both left and right ventricles are uncontaminated by adjacent structures. Thus, the problem of tomographic reconstruction is vastly simplified compared to equilibrium blood pool imaging, in which all structures are imaged simultaneously. Tomographic reconstruction from a limited number of views may thus be possible. A simple filtered interpolative back-projection reconstruction technique was employed. In this technique, interpolation was used between sectional distributions at successive angles. Interpolations yielding 9 and 13 back-projection angles of 22.5° and 15° were evaluated. Ventricular borders were obtained in each back-projected tomographic slice by locating the intensity level which provided the correct total ventricular volume. Cast cross sections were quantitatively well represented by these borders. This ventricular border definition algorithm forms the basis for applications of the technique in animals and humans.

  12. Image interface in Java for tomographic reconstruction in nuclear medicine

    International Nuclear Information System (INIS)

    Andrade, M.A.; Silva, A.M. Marques da

    2004-01-01

    The aim of this study is to implement software for tomographic reconstruction of SPECT data in nuclear medicine with a flexible, cross-platform interface design, written in Java. Validation tests were performed based on simulated SPECT data. The results showed that the implemented algorithms and filters agree with the theoretical context. We intend to extend the system by implementing additional tomographic reconstruction techniques and Java threads, in order to provide simultaneous image processing. (author)

  13. Connections model for tomographic images reconstruction

    International Nuclear Information System (INIS)

    Rodrigues, R.G.S.; Pela, C.A.; Roque, S.F. A.C.

    1998-01-01

    This paper presents an artificial neural network with a topology adequate for tomographic image reconstruction. The associated error function is derived and the learning algorithm is constructed. Simulated results are presented and demonstrate the existence of a generalized solution for nets with a linear activation function. (Author)

  14. DG TOMO: A new method for tomographic reconstruction

    International Nuclear Information System (INIS)

    Freitas, D. de; Feschet, F.; Cachin, F.; Geissler, B.; Bapt, A.; Karidioula, I.; Martin, C.; Kelly, A.; Mestas, D.; Gerard, Y.; Reveilles, J.P.; Maublant, J.

    2006-01-01

    Aim: FBP and OSEM are the most popular tomographic reconstruction methods in scintigraphy. FBP is a simple method, but reconstruction artifacts are generated whose corrections induce degradation of the spatial resolution. OSEM takes account of statistical fluctuations, but noise increases strongly after a certain number of iterations. We compare a new method of tomographic reconstruction based on discrete geometry (DG TOMO) to FBP and OSEM. Materials and methods: Acquisitions were performed on a three-head gamma-camera (Philips) with a NEMA phantom containing six spheres of 10 to 37 mm inner diameter, filled with around 325 MBq/l of technetium-99m. The spheres were positioned in water containing 3 MBq/l of technetium-99m. Acquisitions were realized during a 180° rotation around the phantom in 25-s steps. DG TOMO has been developed in our laboratory in order to minimize the number of projections at acquisition. Tomographic reconstructions utilizing 32 and 16 projections with FBP, OSEM and DG TOMO were performed and transverse slices were compared. Results: FBP with 32 projections detects only the activity in the three largest spheres (diameter ≥22 mm). With 16 projections, the star effect is predominant and the contrast of the third sphere is very low. OSEM with 32 projections provides a better image, but the three smallest spheres (diameter ≤17 mm) are difficult to distinguish. With 16 projections, the three smaller spheres are not detectable. The results of DG TOMO are similar to OSEM. Conclusion: Since the parameters of DG TOMO can be further optimized, this method appears as a promising alternative for tomoscintigraphy reconstruction.

  15. Expediting model-based optoacoustic reconstructions with tomographic symmetries

    International Nuclear Information System (INIS)

    Lutzweiler, Christian; Deán-Ben, Xosé Luís; Razansky, Daniel

    2014-01-01

    Purpose: Image quantification in optoacoustic tomography implies the use of accurate forward models of excitation, propagation, and detection of optoacoustic signals, while inversions with high spatial resolution usually involve very large matrices, leading to unreasonably long computation times. The development of fast and memory-efficient model-based approaches thus represents an important challenge for advancing the quantitative and dynamic imaging capabilities of tomographic optoacoustic imaging. Methods: Herein, a method for simplification and acceleration of model-based inversions, relying on inherent symmetries present in common tomographic acquisition geometries, has been introduced. The method is showcased for the case of cylindrical symmetries by using polar image discretization of the time-domain optoacoustic forward model combined with efficient storage and inversion strategies. Results: The suggested methodology is shown to render fast and accurate model-based inversions in both numerical simulations and post mortem small animal experiments. In the case of a full-view detection scheme, the memory requirements are reduced by one order of magnitude while high-resolution reconstructions are achieved at video rate. Conclusions: By considering the rotational symmetry present in many tomographic optoacoustic imaging systems, the proposed methodology allows exploiting the advantages of model-based algorithms with feasible computational requirements and fast reconstruction times, so that its convenience and general applicability in optoacoustic imaging systems with tomographic symmetries is anticipated.

  16. Reconstruction of tomographic image from x-ray projections of a few views

    International Nuclear Information System (INIS)

    Kobayashi, Fujio; Yamaguchi, Shoichiro

    1982-01-01

    Computer tomographs have progressed rapidly, and in the latest high-performance types the imaging time has been shortened to less than 5 s, but clear images of the heart have not yet been obtained. The X-ray tomographs used so far irradiate X-rays from many directions and measure the projected data; by limiting the projection directions to a small number, this study aimed to shorten the X-ray imaging time and to reduce X-ray exposure. In this paper, a method is proposed by which tomographic images are reconstructed from projection data in a small number of directions by a generalized inverse matrix penalty method. This calculation method was newly devised by the authors for this purpose. It is a kind of nonlinear programming method with a restrictive condition using a generalized inverse matrix, and it is characterized by a simple calculation procedure and rapid convergence. Moreover, the effect on reconstructed images when errors are included in the projection data was examined. A simple computer simulation to reconstruct tomographic images using projection data from four directions was also performed, and the usefulness of the method was confirmed. It contributes to the development of super-high-speed tomographs in the future. (Kako, I.)
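
    The generalized-inverse idea behind few-view reconstruction can be sketched as a minimum-norm pseudo-inverse solution of a deliberately coarse forward model; this is not the authors' penalty method, and the projection operator below is a crude illustrative stand-in.

```python
import numpy as np

def few_view_projection_matrix(n, angles_deg):
    """Very coarse projection model: for each angle, sum image pixels along
    rotated rows (nearest-neighbour binning). Stand-in for a real forward model."""
    rows = []
    ys, xs = np.mgrid[0:n, 0:n] - (n - 1) / 2.0
    for a in np.deg2rad(angles_deg):
        t = xs * np.cos(a) + ys * np.sin(a)      # signed distance to detector bins
        bins = np.clip(np.round(t + (n - 1) / 2.0), 0, n - 1).astype(int)
        M = np.zeros((n, n * n))
        M[bins.ravel(), np.arange(n * n)] = 1.0
        rows.append(M)
    return np.vstack(rows)

n = 16
angles = [0, 45, 90, 135]                        # only four view directions
A = few_view_projection_matrix(n, angles)

image = np.zeros((n, n)); image[5:11, 5:11] = 1.0    # simple square phantom
p = A @ image.ravel()

x = np.linalg.pinv(A) @ p                        # minimum-norm generalized-inverse solution
print(float(np.abs(x.reshape(n, n) - image).mean()))  # coarse few-view reconstruction error
```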

  17. Tomographic Reconstruction from a Few Views: A Multi-Marginal Optimal Transport Approach

    Energy Technology Data Exchange (ETDEWEB)

    Abraham, I., E-mail: isabelle.abraham@cea.fr [CEA Ile de France (France); Abraham, R., E-mail: romain.abraham@univ-orleans.fr; Bergounioux, M., E-mail: maitine.bergounioux@univ-orleans.fr [Université d’Orléans, UFR Sciences, MAPMO, UMR 7349 (France); Carlier, G., E-mail: carlier@ceremade.dauphine.fr [CEREMADE, UMR CNRS 7534, Université Paris IX Dauphine, Pl. de Lattre de Tassigny (France)

    2017-02-15

    In this article, we focus on tomographic reconstruction. The problem is to determine the shape of the interior interface using a tomographic approach while very few X-ray radiographs are performed. We use a multi-marginal optimal transport approach. Preliminary numerical results are presented.

  18. Tomographic reconstruction of transverse phase space from turn-by-turn profile data

    CERN Document Server

    Hancock, S; Lindroos, M

    1999-01-01

    Tomographic methods have the potential for useful application in beam diagnostics. The tomographic reconstruction of transverse phase space density from turn-by-turn profile data has been studied with particular attention to the effects of dispersion and chromaticity. It is shown that the modified Algebraic Reconstruction Technique (ART) that deals successfully with the problem of non-linear motion in the longitudinal plane cannot, in general, be extended to cover the transverse case. Instead, an approach is proposed in which the effect of dispersion is deconvoluted from the measured profiles before the phase space picture is reconstructed using either the modified ART algorithm or the inverse Radon Transform. This requires an accurate knowledge of the momentum distribution of the beam and the modified ART reconstruction of longitudinal phase space density yields just such information. The method has been tested extensively with simulated data.

  19. Tomographic Image Reconstruction Using Training Images with Matrix and Tensor Formulations

    DEFF Research Database (Denmark)

    Soltani, Sara

    Reducing X-ray exposure while maintaining the image quality is a major challenge in computed tomography (CT), since the imperfect data produced from few-view and/or low-intensity projections result in low-quality images that suffer from severe artifacts when using conventional... the image resolution compared to a classical reconstruction method such as Filtered Back Projection (FBP). Some priors for the tomographic reconstruction take the form of cross-section images of similar objects, providing a set of so-called training images that hold the key to the structural... information about the solution. The training images must be reliable and application-specific. This PhD project aims at providing a mathematical and computational framework for the use of training sets as non-parametric priors for the solution in tomographic image reconstruction. Through an unsupervised...

  20. MCPT: A Monte Carlo code for simulation of photon transport in tomographic scanners

    International Nuclear Information System (INIS)

    Prettyman, T.H.; Gardner, R.P.; Verghese, K.

    1990-01-01

    MCPT is a special-purpose Monte Carlo code designed to simulate photon transport in tomographic scanners. Variance reduction schemes and sampling games present in MCPT were selected to characterize features common to most tomographic scanners. Combined splitting and biasing (CSB) games are used to systematically sample important detection pathways. An efficient splitting game is used to tally particle energy deposition in detection zones. The pulse-height distribution of each detector can be found by convolving the calculated energy deposition distribution with the detector's resolution function. A general geometric modelling package, HERMETOR, is used to describe the geometry of the tomographic scanners and provide MCPT with the information needed for particle tracking. MCPT's modelling capabilities are described and preliminary experimental validation is presented. (orig.)
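
    The convolution of an energy-deposition distribution with a detector resolution function, mentioned above, can be sketched in a few lines of NumPy; the Gaussian, sqrt(E)-scaled resolution model and all parameter values are assumptions for illustration only.

```python
import numpy as np

def pulse_height_spectrum(energies_keV, deposition, fwhm_at_662=0.07):
    """Convolve an energy-deposition histogram with a Gaussian resolution
    function whose FWHM scales as sqrt(E) (a common scintillator assumption)."""
    spectrum = np.zeros_like(deposition, dtype=float)
    for e0, weight in zip(energies_keV, deposition):
        if weight == 0.0:
            continue
        fwhm = fwhm_at_662 * 662.0 * np.sqrt(e0 / 662.0)    # assumed resolution model
        sigma = fwhm / 2.355
        gauss = np.exp(-0.5 * ((energies_keV - e0) / sigma) ** 2)
        spectrum += weight * gauss / (gauss.sum() + 1e-300)  # spread while preserving counts
    return spectrum

# Toy deposition: full-energy peak at 662 keV plus a flat Compton continuum
E = np.arange(1.0, 801.0)                       # 1 keV bins
dep = np.where(E < 478, 0.002, 0.0)             # continuum up to the Compton edge
dep[661] = 0.5                                  # full-energy deposition at 662 keV
print(float(pulse_height_spectrum(E, dep).sum()))   # total intensity preserved by the convolution
```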

  1. Rapid tomographic reconstruction based on machine learning for time-resolved combustion diagnostics

    Science.gov (United States)

    Yu, Tao; Cai, Weiwei; Liu, Yingzheng

    2018-04-01

    Optical tomography has attracted a surge of research effort recently due to progress in both imaging concepts and sensor and laser technologies. The high spatial and temporal resolutions achievable by these methods provide unprecedented opportunities for diagnosis of complicated turbulent combustion. However, due to the high data throughput and the inefficiency of the prevailing iterative methods, the tomographic reconstructions, which are typically conducted off-line, are computationally formidable. In this work, we propose an efficient inversion method based on a machine learning algorithm, which can extract useful information from previous reconstructions and build efficient neural networks to serve as a surrogate model that rapidly predicts the reconstructions. The extreme learning machine is cited here as an example for demonstrative purposes, simply due to its ease of implementation, fast learning speed, and good generalization performance. Extensive numerical studies were performed, and the results show that the new method can dramatically reduce the computational time compared with the classical iterative methods. This technique is expected to be an alternative to existing methods when sufficient training data are available. Although this work is discussed in the context of tomographic absorption spectroscopy, we expect it to be useful also for other high-speed tomographic modalities such as volumetric laser-induced fluorescence and tomographic laser-induced incandescence, which have been demonstrated for combustion diagnostics.
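
    A bare-bones extreme learning machine (random fixed hidden layer, least-squares readout) acting as a surrogate from projection data to reconstructions might look as follows; the toy linear "reconstruction operator" and all sizes are illustrative, not the paper's setup.

```python
import numpy as np

class ELM:
    """Extreme learning machine: random fixed hidden layer plus linear readout."""
    def __init__(self, n_in, n_hidden, seed=0):
        rng = np.random.default_rng(seed)
        self.W = rng.normal(size=(n_in, n_hidden))   # random, never trained
        self.b = rng.normal(size=n_hidden)
        self.beta = None

    def _hidden(self, X):
        return np.tanh(X @ self.W + self.b)

    def fit(self, X, Y):
        H = self._hidden(X)
        # Output weights by least squares: the only "training" step in an ELM
        self.beta = np.linalg.lstsq(H, Y, rcond=None)[0]
        return self

    def predict(self, X):
        return self._hidden(X) @ self.beta

# Toy usage: learn a fixed random linear "reconstruction operator"
rng = np.random.default_rng(1)
R = rng.normal(size=(30, 100))                  # projections -> image, unknown to the ELM
X_train = rng.normal(size=(500, 30))            # simulated projection data
Y_train = X_train @ R                           # corresponding "reconstructions"
elm = ELM(30, 200).fit(X_train, Y_train)
X_test = rng.normal(size=(10, 30))
print(float(np.abs(elm.predict(X_test) - X_test @ R).mean()))   # surrogate prediction error
```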

  2. A high-throughput system for high-quality tomographic reconstruction of large datasets at Diamond Light Source.

    Science.gov (United States)

    Atwood, Robert C; Bodey, Andrew J; Price, Stephen W T; Basham, Mark; Drakopoulos, Michael

    2015-06-13

    Tomographic datasets collected at synchrotrons are becoming very large and complex, and, therefore, need to be managed efficiently. Raw images may have high pixel counts, and each pixel can be multidimensional and associated with additional data such as those derived from spectroscopy. In time-resolved studies, hundreds of tomographic datasets can be collected in sequence, yielding terabytes of data. Users of tomographic beamlines are drawn from various scientific disciplines, and many are keen to use tomographic reconstruction software that does not require a deep understanding of reconstruction principles. We have developed Savu, a reconstruction pipeline that enables users to rapidly reconstruct data to consistently create high-quality results. Savu is designed to work in an 'orthogonal' fashion, meaning that data can be converted between projection and sinogram space throughout the processing workflow as required. The Savu pipeline is modular and allows processing strategies to be optimized for users' purposes. In addition to the reconstruction algorithms themselves, it can include modules for identification of experimental problems, artefact correction, general image processing and data quality assessment. Savu is open source, open licensed and 'facility-independent': it can run on standard cluster infrastructure at any institution.

  3. Tomographic reconstruction of OH* chemiluminescence in two interacting turbulent flames

    International Nuclear Information System (INIS)

    Worth, Nicholas A; Dawson, James R

    2013-01-01

    The tomographic reconstruction of OH* chemiluminescence was performed on two interacting turbulent premixed bluff-body stabilized flames under steady flow conditions and acoustic excitation. These measurements elucidate the complex three-dimensional (3D) vortex–flame interactions which have previously not been accessible. The experiment was performed using a single camera and intensifier, with multiple views acquired by repositioning the camera, permitting calculation of the mean and phase-averaged volumetric OH* distributions. The reconstructed flame structure and phase-averaged dynamics are compared with OH planar laser-induced fluorescence and flame surface density measurements for the first time. The volumetric data revealed that the large-scale vortex–flame structures formed along the shear layers of each flame collide when the two flames meet, resulting in complex 3D flame structures in between the two flames. With a fairly simple experimental setup, it is shown that the tomographic reconstruction of OH* chemiluminescence in forced flames is a powerful tool that can yield important physical insights into large-scale 3D flame dynamics that are important in combustion instability. (paper)

  4. Direct computation of harmonic moments for tomographic reconstruction

    International Nuclear Information System (INIS)

    Nara, Takaaki; Ito, Nobutaka; Takamatsu, Tomonori; Sakurai, Tetsuya

    2007-01-01

    A novel algorithm to compute harmonic moments of a density function from its projections is presented for tomographic reconstruction. For a projection p(r, θ), we define the harmonic moment of projection of order n by $\int_0^{\pi}\!\int_{-\infty}^{\infty} p(r,\theta)\,(re^{i\theta})^{n}\,dr\,d\theta$ and show that it coincides with the harmonic moment of the density function up to a constant. Furthermore, we show that the harmonic moment of projection of order n can be computed exactly using n + 1 projection directions, which leads to an efficient algorithm to reconstruct the vertices of a polygon from projections.
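
    The moment definition above can be checked numerically for the simplest case of a unit point source, whose projection is a delta function so the r-integral collapses; the sketch below verifies that the projection moment equals the harmonic moment of the density up to the constant π/2ⁿ (this constant is an inference from the stated definition, not quoted from the paper).

```python
import numpy as np

def projection_moment_point_source(x0, y0, n, n_theta=2000):
    """Numerically evaluate  int_0^pi int p(r,theta) (r e^{i theta})^n dr dtheta
    for a unit point source at (x0, y0), whose projection is a delta at
    r = x0*cos(theta) + y0*sin(theta); the r-integral then collapses."""
    theta = np.linspace(0.0, np.pi, n_theta, endpoint=False)
    r = x0 * np.cos(theta) + y0 * np.sin(theta)
    integrand = (r * np.exp(1j * theta)) ** n
    return integrand.sum() * (np.pi / n_theta)   # Riemann sum over [0, pi)

# The harmonic moment of the density itself is (x0 + i*y0)**n; the projection
# moment should agree with it up to the constant pi / 2**n.
x0, y0, n = 0.4, -0.7, 3
m_proj = projection_moment_point_source(x0, y0, n)
m_dens = (x0 + 1j * y0) ** n
print(m_proj, (np.pi / 2 ** n) * m_dens)         # the two values should agree closely
```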

  5. 3D Tomographic Image Reconstruction using CUDA C

    International Nuclear Information System (INIS)

    Dominguez, J. S.; Assis, J. T.; Oliveira, L. F. de

    2011-01-01

    This paper presents the study and implementation of software for three-dimensional reconstruction of images obtained with a tomographic system, using the capabilities of Graphics Processing Units (GPUs). Reconstruction by the filtered back-projection method was developed using CUDA C, for maximum utilization of the processing capabilities of GPUs in solving computational problems with large computational cost that are highly parallelizable. The potential of GPUs is discussed and their advantages in solving this kind of problem are shown. The results in terms of runtime are compared with non-parallelized implementations and show a great reduction of processing time. (Author)
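
    As a CPU reference for the filtered back-projection step (not the CUDA C implementation from the paper), a short scikit-image sketch; the phantom and angles are illustrative.

```python
import numpy as np
from skimage.transform import radon, iradon

# Simple disc phantom
n = 128
yy, xx = np.mgrid[0:n, 0:n] - n / 2
phantom = ((xx ** 2 + yy ** 2) < (n / 4) ** 2).astype(float)

theta = np.linspace(0.0, 180.0, 180, endpoint=False)
sinogram = radon(phantom, theta=theta)          # forward projection
recon = iradon(sinogram, theta=theta)           # filtered back-projection (ramp filter)

print(float(np.abs(recon - phantom).mean()))    # mean error of the FBP estimate
```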

  6. Distance weighting for improved tomographic reconstructions

    International Nuclear Information System (INIS)

    Koeppe, R.A.; Holden, J.E.

    1984-01-01

    An improved method for the reconstruction of emission computed axial tomography images has been developed. The method is a modification of filtered back-projection, where the back-projected values are weighted to reflect the loss of information with distance from the camera, which is inherent in gamma camera imaging. This information loss is a result of loss of spatial resolution with distance, attenuation, and scatter. The weighting scheme can best be described by considering the contributions of any two opposing views to the reconstruction image pixels. The weight applied to the projections of one view is set equal to the relative amount of the original activity that was initially received in that projection, assuming a uniform attenuating medium. This yields a weighting value which is a function of distance into the image, with a value of one for pixels near the camera, a value of 0.5 at the image center, and a value of zero on the opposite side. Tomographic reconstructions produced with this method show improved spatial resolution when compared to conventional 360° reconstructions. The improvement is in the tangential direction, where simulations have indicated a FWHM improvement of 1 to 1.5 millimeters. The resolution in the radial direction is essentially the same for both methods. Visual inspection of the reconstructed images shows improved resolution and contrast.
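
    One simple weighting function consistent with the description above (weights of opposing views sum to one, roughly 1 near the camera, 0.5 at the centre, roughly 0 on the far side, assuming a uniform attenuating medium) is sketched below; the exponential model and the value of mu are assumptions.

```python
import numpy as np

def opposing_view_weights(depth, total_depth, mu=0.15):
    """Weight for the 'near' view of an opposing pair at a given depth into
    the image, assuming a uniform attenuation coefficient mu (cm^-1).
    The far view receives 1 - w, so each opposing pair always sums to one."""
    near = np.exp(-mu * depth)                  # activity surviving to the near camera
    far = np.exp(-mu * (total_depth - depth))   # activity surviving to the far camera
    return near / (near + far)

d = np.linspace(0.0, 30.0, 7)                   # depths across a 30 cm field of view
print(np.round(opposing_view_weights(d, 30.0), 3))
# approx. [0.989 0.953 0.818 0.5 0.182 0.047 0.011]: ~1 near the camera, 0.5 at centre, ~0 opposite
```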

  7. Spectrometry and emission tomographic image reconstruction stimulated by neutrons via EM algorithm and Monte Carlo Method

    International Nuclear Information System (INIS)

    Viana, Rodrigo Sartorelo Salemi

    2014-01-01

    NSECT (Neutron Stimulated Emission Computed Tomography) is a new spectroscopic technique able to evaluate in vivo the concentration of elements using the inelastic scattering reaction (n,n'). Since its introduction, several improvements have been proposed with the aim of investigating applications for clinical diagnosis and reducing the absorbed dose associated with CT acquisition. In this context, two new diagnostic applications are presented using spectroscopic and tomographic approaches of NSECT. A new methodology has also been proposed to optimize the sinogram sampling, which is directly related to the quality of the reconstruction obtained with the irradiation protocol. The studies were developed based on simulations with the MCNP5 code. Diagnosis of Renal Cell Carcinoma (RCC) and the detection of breast microcalcifications were evaluated in studies conducted using a human phantom. The obtained results demonstrate the ability of the NSECT technique to detect changes in the composition of the modeled tissues as a function of the development of the evaluated pathologies. The proposed method for optimizing sinograms was able to analytically simulate the composition of the irradiated medium, allowing the assessment of reconstruction quality and effective dose in terms of the sampling rate. However, future research must be conducted to quantify the detection sensitivity according to the selected elements. (author)

  8. Advanced modeling in positron emission tomography using Monte Carlo simulations for improving reconstruction and quantification

    International Nuclear Information System (INIS)

    Stute, Simon

    2010-01-01

    Positron Emission Tomography (PET) is a medical imaging technique that plays a major role in oncology, especially using 18F-Fluoro-Deoxyglucose. However, PET images suffer from modest spatial resolution and from high noise. As a result, there is still no consensus on how tumor metabolically active volume and tumor uptake should be characterized. In the meantime, research groups keep producing new methods for such characterizations that need to be assessed. A Monte Carlo simulation based method has been developed to produce simulated PET images of patients suffering from cancer, indistinguishable from clinical images, and for which all parameters are known. The method uses high-resolution PET images from patient acquisitions, from which the physiological heterogeneous activity distribution can be modeled. It was shown that the performance of quantification methods on such highly realistic simulated images is significantly lower and more variable than with simple phantom studies. Fourteen different quantification methods were also compared in realistic conditions using a group of such simulated patients. In addition, the proposed method was extended to simulate serial PET scans in the context of patient monitoring, including a modeling of the tumor changes, as well as the variability over time of the non-tumoral physiological activity distribution. Monte Carlo simulations were also used to study the detection probability inside the crystals of the tomograph. A model of the crystal response was derived and included in the system matrix involved in tomographic reconstruction. The resulting reconstruction method was compared with other sophisticated methods for modeling the detector response in the image space, proposed in the literature. We demonstrated the superiority of the proposed method over equivalent approaches on simulated data, and illustrated its robustness on clinical data. For a same noise level, it is possible to reconstruct PET images offering a

  9. Longitudinal and transverse digital image reconstruction with a tomographic scanner

    International Nuclear Information System (INIS)

    Pickens, D.R.; Price, R.R.; Erickson, J.J.; Patton, J.A.; Partain, C.L.; Rollo, F.D.

    1981-01-01

    A Siemens Gammasonics PHO/CON-192 Multiplane Imager is interfaced to a digital computer for the purpose of performing tomographic reconstructions from the data collected during a single scan. Data from the two moving gamma cameras, as well as camera position information, are sent to the computer by an interface designed in the authors' laboratory. Backprojection reconstruction is implemented by the computer. Longitudinal images in whole-body format as well as smaller formats are reconstructed for up to six planes simultaneously from the list-mode data. Transverse reconstructions are demonstrated for 201Tl myocardial scans. Post-reconstruction deconvolution processing to remove the blur artifact (characteristic of focal plane tomography) is applied to a multiplane phantom. Digital acquisition of data and reconstruction of images are practical, and can extend the usefulness of the machine when compared with the film output. (author)

  10. A tensor-based dictionary learning approach to tomographic image reconstruction

    DEFF Research Database (Denmark)

    Soltani, Sara; Kilmer, Misha E.; Hansen, Per Christian

    2016-01-01

    We consider tomographic reconstruction using priors in the form of a dictionary learned from training images. The reconstruction has two stages: first we construct a tensor dictionary prior from our training data, and then we pose the reconstruction problem in terms of recovering the expansion coefficients in that dictionary. Our approach differs from past approaches in that (a) we use a third-order tensor representation for our images and (b) we recast the reconstruction problem using the tensor formulation. The dictionary learning problem is presented as a non-negative tensor factorization problem with sparsity constraints. The reconstruction problem is formulated in a convex optimization framework by looking for a solution with a sparse representation in the tensor dictionary. Numerical results show that our tensor formulation leads to very sparse representations of both the training images...

  11. Monte Carlo simulation for the design of industrial gamma-ray transmission tomography

    International Nuclear Information System (INIS)

    Kim, Jongbum; Jung, Sunghee; Moon, Jinho; Kwon, Taekyong; Cho, Gyuseong

    2011-01-01

    A Monte Carlo simulation and an experiment were carried out for a large-scale industrial gamma-ray tomographic scanning geometry. The geometry of the tomographic system has a moving source with 16 stationary detectors. This geometry is advantageous for the diagnosis of a large-scale industrial plant. The simulation was carried out for the phantom with 32 views, 16 detectors, and different energy bins. The simulation data were processed to be used for image reconstruction. Image reconstruction was performed by a Diagonally-Scaled Gradient-Ascent algorithm for the simulation data. Experiments were conducted in a 78 cm diameter column filled with polypropylene grains. Sixteen 0.5-inch-thick, 1-inch-long NaI(Tl) cylindrical detectors and a 20 mCi 137Cs radioactive source were used. The experimental results were compared to the simulation data and were found to be similar to the Monte Carlo simulation results. This showed that the Monte Carlo simulation is useful for predicting the result of the industrial gamma tomographic scan method, and it can also give a solution for designing the industrial gamma tomography system and preparing the field experiment. (author)

  12. A distributed multi-GPU system for high speed electron microscopic tomographic reconstruction.

    Science.gov (United States)

    Zheng, Shawn Q; Branlund, Eric; Kesthelyi, Bettina; Braunfeld, Michael B; Cheng, Yifan; Sedat, John W; Agard, David A

    2011-07-01

    Full resolution electron microscopic tomographic (EMT) reconstruction of large-scale tilt series requires significant computing power. The desire to perform multiple cycles of iterative reconstruction and realignment dramatically increases the pressing need to improve reconstruction performance. This has motivated us to develop a distributed multi-GPU (graphics processing unit) system to provide the required computing power for rapid constrained, iterative reconstructions of very large three-dimensional (3D) volumes. The participating GPUs reconstruct segments of the volume in parallel, and subsequently, the segments are assembled to form the complete 3D volume. Owing to its power and versatility, the CUDA (NVIDIA, USA) platform was selected for GPU implementation of the EMT reconstruction. For a system containing 10 GPUs provided by 5 GTX295 cards, 10 cycles of SIRT reconstruction for a tomogram of 4096² × 512 voxels from an input tilt series containing 122 projection images of 4096² pixels (single precision float) takes a total of 1845 s of which 1032 s are for computation with the remainder being the system overhead. The same system takes only 39 s total to reconstruct 1024² × 256 voxels from 122 1024² pixel projections. While the system overhead is non-trivial, performance analysis indicates that adding extra GPUs to the system would lead to steadily enhanced overall performance. Therefore, this system can be easily expanded to generate superior computing power for very large tomographic reconstructions and especially to empower iterative cycles of reconstruction and realignment. Copyright © 2011 Elsevier B.V. All rights reserved.
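
    The SIRT iteration at the heart of this pipeline is compact; a single-threaded NumPy sketch is shown below for reference (the real system partitions the volume across GPUs with CUDA), with a toy projection matrix.

```python
import numpy as np

def sirt(A, p, n_iter=100, relax=1.0):
    """Simultaneous Iterative Reconstruction Technique:
    x <- x + relax * C A^T R (p - A x), with R, C the inverse row/column sums."""
    row_sum = np.maximum(A.sum(axis=1), 1e-12)
    col_sum = np.maximum(A.sum(axis=0), 1e-12)
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        residual = (p - A @ x) / row_sum
        x += relax * (A.T @ residual) / col_sum
    return x

# Illustrative toy system (not a tilt-series geometry)
rng = np.random.default_rng(0)
A = rng.random((200, 50))
x_true = rng.random(50)
p = A @ x_true
x = sirt(A, p, n_iter=500)
print(float(np.linalg.norm(A @ x - p) / np.linalg.norm(p)))   # residual shrinks with iterations
```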

  13. An evolutionary algorithm for tomographic reconstructions in limited data sets problems

    International Nuclear Information System (INIS)

    Turcanu, Catrinel; Craciunescu, Teddy

    2000-01-01

    The paper proposes a new method for tomographic reconstruction. Unlike nuclear medicine applications, in physical science problems we are often confronted with limited data sets: constraints on the number of projections or limited-angle views. The problem of image reconstruction from projections may be considered as a problem of finding an image (solution) having projections that match the experimental ones. In our approach, we choose a statistical correlation coefficient to evaluate the fitness of any potential solution. The optimization process is carried out by an evolutionary algorithm. Our algorithm has some problem-oriented characteristics. One of them is that a chromosome, representing a potential solution, is not linear but coded as a matrix of pixels corresponding to a two-dimensional image. This kind of internal representation reflects the genuine manifestation of the solution, so that slight differences between two points in the original problem space give rise to similar differences once they are coded. Another particular feature is a newly built crossover operator: the grid-based crossover, suitable for high-dimensional two-dimensional chromosomes. Except for the population size and the dimension of the cutting grid for the grid-based crossover, all the other parameters of the algorithm are independent of the geometry of the tomographic reconstruction. The performance of the method is evaluated in comparison with a traditional tomographic method, based on the maximization of the entropy of the image, that has proved to work well with limited data sets. The test phantom is typical of an application with limited data sets: the determination of neutron energy spectra with time resolution in the case of short-pulsed neutron emission. Both the qualitative judgement and the quantitative one, based on figures of merit, point out that the proposed method ensures an improved reconstruction of shapes, sizes and resolution in the image, even in the presence of noise.
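
    A heavily simplified sketch of the ingredients named above (image-shaped chromosomes, a correlation-coefficient fitness against measured projections, and a grid-based crossover that swaps rectangular cells between parents) is given below; the toy projection operator and all parameters are illustrative and not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)
N, POP, GEN = 16, 40, 200

def project(img):
    """Toy 'projection' operator: row sums, column sums and the main diagonal sum."""
    return np.concatenate([img.sum(axis=0), img.sum(axis=1), [np.trace(img)]])

def fitness(img, p_meas):
    """Statistical correlation between candidate and measured projections."""
    return np.corrcoef(project(img), p_meas)[0, 1]

def grid_crossover(a, b, cell=4):
    """Grid-based crossover: swap whole rectangular cells between two parents."""
    child = a.copy()
    for i in range(0, N, cell):
        for j in range(0, N, cell):
            if rng.random() < 0.5:
                child[i:i+cell, j:j+cell] = b[i:i+cell, j:j+cell]
    return child

def mutate(img, rate=0.02):
    img = img.copy()
    mask = rng.random(img.shape) < rate
    img[mask] = rng.random(int(mask.sum()))
    return img

truth = np.zeros((N, N)); truth[4:10, 6:12] = 1.0   # phantom known only via its projections
p_meas = project(truth)

pop = [rng.random((N, N)) for _ in range(POP)]
for _ in range(GEN):
    pop.sort(key=lambda im: fitness(im, p_meas), reverse=True)
    parents = pop[:POP // 2]                         # elitist selection
    children = []
    while len(children) < POP - len(parents):
        i, j = rng.choice(len(parents), size=2, replace=False)
        children.append(mutate(grid_crossover(parents[i], parents[j])))
    pop = parents + children

print(max(fitness(im, p_meas) for im in pop))        # correlation of the best individual
```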

  14. Investigation of the noise effect on tomographic reconstructions for a tangentially viewing vacuum ultraviolet imaging diagnostic

    International Nuclear Information System (INIS)

    Ming, Tingfeng; Ohdachi, Satoshi; Suzuki, Yasuhiro

    2011-01-01

    Tomographic reconstruction for a tangentially viewing two-dimensional (2D) imaging system is studied. A method to calculate the geometry matrix in 2D tomography is introduced. An algorithm based on a Phillips-Tikhonov (P-T) type regularization method is investigated, and numerical tests using the P-T method are conducted with both tokamak and Heliotron configurations. The numerical tests show that the P-T method is not sensitive to the added noise levels and the emission profiles with higher mode numbers can be reconstructed with adequate resolution. The results indicate that this method is suitable for 2D tomographic reconstruction for a tangentially viewing vacuum ultraviolet telescope system. (author)
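
    A Phillips-Tikhonov-type inversion in its simplest form minimizes ||Ax - p||² + λ||Lx||² with L a second-difference smoothing operator; a minimal sketch with a toy geometry matrix (the tangential-view geometry matrix from the paper is not reproduced here):

```python
import numpy as np

def phillips_tikhonov(A, p, lam=1e-2):
    """Solve min_x ||A x - p||^2 + lam * ||L x||^2 with L the 1-D
    second-difference operator (the classical Phillips smoothing penalty)."""
    n = A.shape[1]
    L = np.zeros((n - 2, n))
    for i in range(n - 2):
        L[i, i:i + 3] = [1.0, -2.0, 1.0]
    lhs = A.T @ A + lam * (L.T @ L)
    return np.linalg.solve(lhs, A.T @ p)

# Illustrative under-determined test case with a smooth emission profile
rng = np.random.default_rng(0)
x_true = np.sin(np.linspace(0, np.pi, 40)) ** 2      # smooth emission profile
A = rng.random((25, 40))                             # toy geometry matrix
p = A @ x_true + rng.normal(0, 0.05, 25)             # noisy line-integral data
x_rec = phillips_tikhonov(A, p, lam=1.0)
print(float(np.abs(x_rec - x_true).mean()))          # regularized solution stays smooth
```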

  15. A distributed multi-GPU system for high speed electron microscopic tomographic reconstruction

    Energy Technology Data Exchange (ETDEWEB)

    Zheng, Shawn Q.; Branlund, Eric; Kesthelyi, Bettina; Braunfeld, Michael B.; Cheng, Yifan; Sedat, John W. [The Howard Hughes Medical Institute and the W.M. Keck Advanced Microscopy Laboratory, Department of Biochemistry and Biophysics, University of California, San Francisco, 600, 16th Street, Room S412D, CA 94158-2517 (United States); Agard, David A., E-mail: agard@msg.ucsf.edu [The Howard Hughes Medical Institute and the W.M. Keck Advanced Microscopy Laboratory, Department of Biochemistry and Biophysics, University of California, San Francisco, 600, 16th Street, Room S412D, CA 94158-2517 (United States)

    2011-07-15

    Full resolution electron microscopic tomographic (EMT) reconstruction of large-scale tilt series requires significant computing power. The desire to perform multiple cycles of iterative reconstruction and realignment dramatically increases the pressing need to improve reconstruction performance. This has motivated us to develop a distributed multi-GPU (graphics processing unit) system to provide the required computing power for rapid constrained, iterative reconstructions of very large three-dimensional (3D) volumes. The participating GPUs reconstruct segments of the volume in parallel, and subsequently, the segments are assembled to form the complete 3D volume. Owing to its power and versatility, the CUDA (NVIDIA, USA) platform was selected for GPU implementation of the EMT reconstruction. For a system containing 10 GPUs provided by 5 GTX295 cards, 10 cycles of SIRT reconstruction for a tomogram of 4096² × 512 voxels from an input tilt series containing 122 projection images of 4096² pixels (single precision float) takes a total of 1845 s of which 1032 s are for computation with the remainder being the system overhead. The same system takes only 39 s total to reconstruct 1024² × 256 voxels from 122 1024² pixel projections. While the system overhead is non-trivial, performance analysis indicates that adding extra GPUs to the system would lead to steadily enhanced overall performance. Therefore, this system can be easily expanded to generate superior computing power for very large tomographic reconstructions and especially to empower iterative cycles of reconstruction and realignment. -- Highlights: → A distributed multi-GPU system has been developed for electron microscopic tomography (EMT). → This system allows for rapid constrained, iterative reconstruction of very large volumes. → This system can be easily expanded to generate superior computing power for large-scale iterative EMT realignment.

  16. A distributed multi-GPU system for high speed electron microscopic tomographic reconstruction

    International Nuclear Information System (INIS)

    Zheng, Shawn Q.; Branlund, Eric; Kesthelyi, Bettina; Braunfeld, Michael B.; Cheng, Yifan; Sedat, John W.; Agard, David A.

    2011-01-01

    Full resolution electron microscopic tomographic (EMT) reconstruction of large-scale tilt series requires significant computing power. The desire to perform multiple cycles of iterative reconstruction and realignment dramatically increases the pressing need to improve reconstruction performance. This has motivated us to develop a distributed multi-GPU (graphics processing unit) system to provide the required computing power for rapid constrained, iterative reconstructions of very large three-dimensional (3D) volumes. The participating GPUs reconstruct segments of the volume in parallel, and subsequently, the segments are assembled to form the complete 3D volume. Owing to its power and versatility, the CUDA (NVIDIA, USA) platform was selected for GPU implementation of the EMT reconstruction. For a system containing 10 GPUs provided by 5 GTX295 cards, 10 cycles of SIRT reconstruction for a tomogram of 4096² × 512 voxels from an input tilt series containing 122 projection images of 4096² pixels (single precision float) takes a total of 1845 s of which 1032 s are for computation with the remainder being the system overhead. The same system takes only 39 s total to reconstruct 1024² × 256 voxels from 122 1024² pixel projections. While the system overhead is non-trivial, performance analysis indicates that adding extra GPUs to the system would lead to steadily enhanced overall performance. Therefore, this system can be easily expanded to generate superior computing power for very large tomographic reconstructions and especially to empower iterative cycles of reconstruction and realignment. -- Highlights: → A distributed multi-GPU system has been developed for electron microscopic tomography (EMT). → This system allows for rapid constrained, iterative reconstruction of very large volumes. → This system can be easily expanded to generate superior computing power for large-scale iterative EMT realignment.

  17. A Penalization Approach for Tomographic Reconstruction of Binary Axially Symmetric Objects

    International Nuclear Information System (INIS)

    Abraham, R.; Bergounioux, M.; Trelat, E.

    2008-01-01

    We propose a variational method for tomographic reconstruction of blurred and noisy binary images based on a penalization process of a minimization problem set in the space of functions of bounded variation. We prove existence and/or uniqueness results and derive a penalized optimality system. Numerical simulations are provided to demonstrate the relevance of the approach.

  18. Comparison among tomographic reconstruction with limited data

    International Nuclear Information System (INIS)

    Oliveira, Eric F.; Dantas, Carlos C.; Vasconcelos, Daniel A.A.; Cadiz, Luis F.; Melo, Silvio B.

    2011-01-01

    Nowadays there is continuing interest in applying computed tomography (CT) techniques to non-destructive testing and inspection of many industrial products. These applications of CT usually require a differentiated analysis when there are strong limitations on acquiring a sufficiently large amount of projection data. The use of a low number of tomographic data normally degrades the quality of the reconstructed image, highlighting the formation of artifacts and noise. This work investigates the most commonly used reconstruction methods (FBP, ART, SIRT, MART, SMART) and shows the performance of each one in this limited scenario. For this purpose, all methods were implemented and tested with a phantom of uniform density and well-known distribution, with measurements of gamma-radiation transmission in a first-generation CT scanner. The phantom is a concentric stainless steel tube coupled with a half-cylinder of aluminum. Among the tested methods, FBP presented the highest root mean square error, with the formation of visible artifacts. The artifacts are diminished but still visible with the ART and SIRT techniques, and the best performance was observed with the MART and SMART techniques. The technical superiority of these multiplicative methods is clearly seen in the reconstructed image quality, endorsing their application to situations of limited input data. (author)
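
    The multiplicative update that distinguishes MART from the additive techniques can be sketched as follows; this is a generic MART variant with a toy 0/1 ray-pixel incidence matrix, not the specific implementation used in the comparison.

```python
import numpy as np

def mart(A, p, n_iter=50, relax=1.0, eps=1e-12):
    """Multiplicative ART: for each ray i,
    x_j <- x_j * (p_i / (A x)_i) ** (relax * a_ij / max_j a_ij)."""
    x = np.ones(A.shape[1])
    for _ in range(n_iter):
        for i in range(A.shape[0]):
            ax = A[i] @ x
            if ax < eps or p[i] < eps:
                continue                          # skip empty or zero-count rays
            ratio = p[i] / ax
            x *= ratio ** (relax * A[i] / A[i].max())
    return x

# Illustrative toy system (not the phantom geometry from the paper)
rng = np.random.default_rng(0)
A = (rng.random((30, 20)) > 0.6).astype(float)   # sparse 0/1 ray-pixel incidence
x_true = rng.random(20) + 0.1
p = A @ x_true
x = mart(A, p)
print(float(np.linalg.norm(A @ x - p) / np.linalg.norm(p)))   # data residual after MART
```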

  19. Investigating Gravity Waves in Polar Mesospheric Clouds Using Tomographic Reconstructions of AIM Satellite Imagery

    Science.gov (United States)

    Hart, V. P.; Taylor, M. J.; Doyle, T. E.; Zhao, Y.; Pautet, P.-D.; Carruth, B. L.; Rusch, D. W.; Russell, J. M.

    2018-01-01

    This research presents the first application of tomographic techniques for investigating gravity wave structures in polar mesospheric clouds (PMCs) imaged by the Cloud Imaging and Particle Size instrument on the NASA AIM satellite. Albedo data comprising consecutive PMC scenes were used to tomographically reconstruct a 3-D layer using the Partially Constrained Algebraic Reconstruction Technique algorithm and a previously developed "fanning" technique. For this pilot study, a large region (760 × 148 km) of the PMC layer (altitude 83 km) was sampled with a 2 km horizontal resolution, and an intensity weighted centroid technique was developed to create novel 2-D surface maps, characterizing the individual gravity waves as well as their altitude variability. Spectral analysis of seven selected wave events observed during the Northern Hemisphere 2007 PMC season exhibited dominant horizontal wavelengths of 60-90 km, consistent with previous studies. These tomographic analyses have enabled a broad range of new investigations. For example, a clear spatial anticorrelation was observed between the PMC albedo and wave-induced altitude changes, with higher-albedo structures aligning well with wave troughs, while low-intensity regions aligned with wave crests. This result appears to be consistent with current theories of PMC development in the mesopause region. This new tomographic imaging technique also provides valuable wave amplitude information enabling further mesospheric gravity wave investigations, including quantitative analysis of their hemispheric and interannual characteristics and variations.
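
    The intensity-weighted centroid used to collapse the reconstructed 3-D layer into a 2-D altitude surface is essentially a weighted mean along the vertical axis; a small sketch with a synthetic wave-perturbed layer follows (grid spacing and amplitudes are illustrative, loosely following the values quoted above).

```python
import numpy as np

def centroid_altitude_map(volume, altitudes_km, floor=0.0):
    """volume: 3-D albedo array indexed (altitude, y, x);
    returns a 2-D map of the intensity-weighted centroid altitude."""
    w = np.clip(volume - floor, 0.0, None)              # ignore values below a floor
    total = w.sum(axis=0)
    total = np.where(total > 0, total, np.nan)          # undefined where there is no signal
    return np.tensordot(altitudes_km, w, axes=(0, 0)) / total

# Toy layer: a thin Gaussian sheet whose peak altitude varies sinusoidally in x,
# mimicking a gravity-wave perturbation of a PMC layer near 83 km.
alt = np.linspace(80.0, 86.0, 61)
x = np.linspace(0.0, 760.0, 380)                        # km, 2 km sampling
peak = 83.0 + 0.5 * np.sin(2 * np.pi * x / 70.0)        # ~70 km horizontal wavelength
vol = np.exp(-0.5 * ((alt[:, None] - peak[None, :]) / 0.7) ** 2)[:, None, :]
zmap = centroid_altitude_map(vol, alt)
print(zmap.shape, float(np.nanmax(zmap) - np.nanmin(zmap)))   # ~1 km peak-to-trough
```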

  20. Dense velocity reconstruction from tomographic PTV with material derivatives

    Science.gov (United States)

    Schneiders, Jan F. G.; Scarano, Fulvio

    2016-09-01

    A method is proposed to reconstruct the instantaneous velocity field from time-resolved volumetric particle tracking velocimetry (PTV, e.g., 3D-PTV, tomographic PTV and Shake-the-Box), employing both the instantaneous velocity and the velocity material derivative of the sparse tracer particles. The constraint to the measured temporal derivative of the PTV particle tracks improves the consistency of the reconstructed velocity field. The method is christened 'pouring time into space', as it leverages temporal information to increase the spatial resolution of volumetric PTV measurements. This approach becomes relevant in cases where the spatial resolution is limited by the seeding concentration. The method solves an optimization problem to find the vorticity and velocity fields that minimize a cost function, which includes, next to the instantaneous velocity, also the velocity material derivative. The velocity and its material derivative are related through the vorticity transport equation, and the cost function is minimized using the limited-memory Broyden-Fletcher-Goldfarb-Shanno (L-BFGS) algorithm. The procedure is assessed numerically with a simulated PTV experiment in a turbulent boundary layer from a direct numerical simulation (DNS). The experimental validation considers a tomographic particle image velocimetry (PIV) experiment in a similar turbulent boundary layer and the additional case of a jet flow. The proposed technique ('vortex-in-cell plus', VIC+) is compared to tomographic PIV analysis (3D iterative cross-correlation), PTV interpolation methods (linear and adaptive Gaussian windowing) and to vortex-in-cell (VIC) interpolation without the material derivative. A visible increase in resolved details in the turbulent structures is obtained with the VIC+ approach, both in numerical simulations and experiments. This results in a more accurate determination of the turbulent stress distribution in turbulent boundary layer investigations. Data from a jet

  1. Tomographic reconstruction of the time-averaged density distribution in two-phase flow

    International Nuclear Information System (INIS)

    Fincke, J.R.

    1982-01-01

    The technique of reconstructive tomography has been applied to the measurement of time-average density and density distribution in a two-phase flow field. The technique of reconstructive tomography provides a model-independent method of obtaining flow-field density information. A tomographic densitometer system for the measurement of two-phase flow has two unique problems: a limited number of data values and a correspondingly coarse reconstruction grid. These problems were studied both experimentally through the use of prototype hardware on a 3-in. pipe, and analytically through computer generation of simulated data. The prototype data were taken on phantoms constructed of all Plexiglas and Plexiglas laminated with wood and polyurethane foam. Reconstructions obtained from prototype data are compared with reconstructions from the simulated data. Also presented are some representative results in a horizontal air/water flow

  2. Connections model for tomographic images reconstruction; Modelo conexionista para reconstrucao de imagens tomograficas

    Energy Technology Data Exchange (ETDEWEB)

    Rodrigues, R.G.S.; Pela, C.A.; Roque, S.F. A.C. [Departamento de Fisica e Matematica (FFCLRP) USP, Av. Bandeirantes, 3900, 14040-901 Ribeirao Preto, Sao Paulo (Brazil)

    1998-12-31

    This paper presents an artificial neural network with a topology suited to tomographic image reconstruction. The associated error function is derived and the learning algorithm is constructed. Simulation results are presented and demonstrate the existence of a generalized solution for nets with a linear activation function. (Author)
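
    Below is a toy sketch in the spirit of this connectionist reconstruction: a single linear layer maps projection data to image pixels and is trained by gradient descent on a squared-error function; with a linear activation the optimum is a generalized (pseudo-inverse-like) least-squares solution. Sizes and the forward operator are invented assumptions.

```python
# Single-layer linear network trained to invert a toy projection operator.
import numpy as np

rng = np.random.default_rng(1)
n_proj, n_pix, n_train = 40, 64, 1000
A = rng.normal(size=(n_proj, n_pix))        # hypothetical forward (projection) operator
imgs = rng.normal(size=(n_train, n_pix))    # toy training images
projs = imgs @ A.T                          # corresponding projection data

W = np.zeros((n_pix, n_proj))               # network weights: projections -> image pixels
lr = 5e-3
for it in range(2001):                      # gradient descent on the error function
    resid = projs @ W.T - imgs
    W -= lr * (resid.T @ projs) / n_train
    if it % 500 == 0:
        print(it, np.mean(resid ** 2))      # training error decreases toward its floor
```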

  3. Terahertz Imaging for Biomedical Applications Pattern Recognition and Tomographic Reconstruction

    CERN Document Server

    Yin, Xiaoxia; Abbott, Derek

    2012-01-01

    Terahertz Imaging for Biomedical Applications: Pattern Recognition and Tomographic Reconstruction presents the necessary algorithms needed to assist screening, diagnosis, and treatment, and these algorithms will play a critical role in the accurate detection of abnormalities present in biomedical imaging. Terahertz biomedical imaging has become an area of interest due to its ability to simultaneously acquire both image and spectral information. Terahertz imaging systems are being commercialized with an increasing number of trials performed in a biomedical setting. Terahertz tomographic imaging and detection technology contributes to the ability to identify opaque objects with clear boundaries, and would be useful in both in vivo and ex vivo environments. This book also: introduces terahertz radiation techniques and provides a number of topical examples of signal and image processing, as well as machine learning; presents the most recent developments in an emerging field, terahertz radiation; utilizes new methods...

  4. A maximum-likelihood reconstruction algorithm for tomographic gamma-ray nondestructive assay

    International Nuclear Information System (INIS)

    Prettyman, T.H.; Estep, R.J.; Cole, R.A.; Sheppard, G.A.

    1994-01-01

    A new tomographic reconstruction algorithm for nondestructive assay with high resolution gamma-ray spectroscopy (HRGS) is presented. The reconstruction problem is formulated using a maximum-likelihood approach in which the statistical structure of both the gross and continuum measurements used to determine the full-energy response in HRGS is precisely modeled. An accelerated expectation-maximization algorithm is used to determine the optimal solution. The algorithm is applied to safeguards and environmental assays of large samples (for example, 55-gal. drums) in which high continuum levels caused by Compton scattering are routinely encountered. Details of the implementation of the algorithm and a comparative study of the algorithm's performance are presented
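
    As a hedged illustration of the expectation-maximization machinery referred to above, the sketch below runs a plain MLEM loop on a tiny invented system matrix and Poisson counts. A real assay would additionally model the full-energy response and continuum background described in the abstract.

```python
# Minimal MLEM (maximum-likelihood expectation maximization) loop on toy data.
import numpy as np

rng = np.random.default_rng(2)
n_det, n_vox = 30, 16
A = rng.random((n_det, n_vox))              # toy system matrix (detection probabilities)
x_true = 100.0 * rng.random(n_vox)          # unknown activity distribution
counts = rng.poisson(A @ x_true)            # measured counts (Poisson statistics)

x = np.ones(n_vox)                          # positive initial estimate
sens = A.sum(axis=0)                        # sensitivity (column sums of A)
for _ in range(200):                        # multiplicative EM updates
    ratio = counts / (A @ x + 1e-12)
    x *= (A.T @ ratio) / sens
```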

  5. An efficient reconstruction algorithm for differential phase-contrast tomographic images from a limited number of views

    International Nuclear Information System (INIS)

    Sunaguchi, Naoki; Yuasa, Tetsuya; Gupta, Rajiv; Ando, Masami

    2015-01-01

    The main focus of this paper is reconstruction of tomographic phase-contrast image from a set of projections. We propose an efficient reconstruction algorithm for differential phase-contrast computed tomography that can considerably reduce the number of projections required for reconstruction. The key result underlying this research is a projection theorem that states that the second derivative of the projection set is linearly related to the Laplacian of the tomographic image. The proposed algorithm first reconstructs the Laplacian image of the phase-shift distribution from the second-derivative of the projections using total variation regularization. The second step is to obtain the phase-shift distribution by solving a Poisson equation whose source is the Laplacian image previously reconstructed under the Dirichlet condition. We demonstrate the efficacy of this algorithm using both synthetically generated simulation data and projection data acquired experimentally at a synchrotron. The experimental phase data were acquired from a human coronary artery specimen using dark-field-imaging optics pioneered by our group. Our results demonstrate that the proposed algorithm can reduce the number of projections to approximately 33% as compared with the conventional filtered backprojection method, without any detrimental effect on the image quality
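
    The second step described above, recovering the phase-shift map from its reconstructed Laplacian by solving a Poisson equation with homogeneous Dirichlet boundaries, can be sketched with a discrete sine transform solver as below. The grid, spacing and source image are toy assumptions, not the authors' data or exact boundary treatment.

```python
# Poisson solve (Dirichlet boundaries) of a phase map from its discrete Laplacian via DST.
import numpy as np
from scipy.fft import dstn, idstn

n = 127
x = np.arange(1, n + 1) / (n + 1.0)                       # interior grid points
phi_true = np.sin(np.pi * x)[:, None] * np.sin(np.pi * x)[None, :]   # toy phase-shift map
pad = np.pad(phi_true, 1)                                 # zero (Dirichlet) boundary values
lap = (pad[2:, 1:-1] + pad[:-2, 1:-1] +
       pad[1:-1, 2:] + pad[1:-1, :-2] - 4 * phi_true)     # 5-point discrete Laplacian

k = np.arange(1, n + 1)
lam = 4.0 * np.sin(np.pi * k / (2 * (n + 1))) ** 2        # Dirichlet Laplacian eigenvalues
denom = lam[:, None] + lam[None, :]
phi = idstn(-dstn(lap, type=1) / denom, type=1)           # Poisson solve in DST space
print(np.max(np.abs(phi - phi_true)))                     # machine-precision recovery
```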

  6. Tomographic apparatus and method for reconstructing planar slices from non-absorbed and non-scattered radiation

    International Nuclear Information System (INIS)

    1980-01-01

    An apparatus is described which can be used in computerized tomographic systems for constructing a representation of an object and which uses a fan-shaped beam source, detectors and a convolution method of data reconstruction. (U.K.)

  7. Imaging of turbulent structures and tomographic reconstruction of TORPEX plasma emissivity

    International Nuclear Information System (INIS)

    Iraji, D.; Furno, I.; Fasoli, A.; Theiler, C.

    2010-01-01

    In the TORPEX [A. Fasoli et al., Phys. Plasmas 13, 055902 (2006)], a simple magnetized plasma device, low frequency electrostatic fluctuations associated with interchange waves are routinely measured by means of extensive sets of Langmuir probes. To complement the electrostatic probe measurements of plasma turbulence and to study plasma structures smaller than the spatial resolution of the probe array, a nonperturbative direct imaging system has been developed on TORPEX, including a fast framing Photron-APX-RS camera and an image intensifier unit. From the line-integrated camera images, we compute the poloidal emissivity profile of the plasma by applying a tomographic reconstruction technique using a pixel method and solving an overdetermined set of equations by singular value decomposition. This allows comparing statistical, spectral, and spatial properties of visible light radiation with electrostatic fluctuations. The shape and position of the time-averaged reconstructed plasma emissivity are observed to be similar to those of the ion saturation current profile. In the core plasma, excluding the electron cyclotron and upper hybrid resonant layers, the mean value of the plasma emissivity is observed to vary as (T_e)^alpha (n_e)^beta, in which alpha = 0.25-0.7 and beta = 0.8-1.4, in agreement with a collisional radiative model. The tomographic reconstruction is applied to fast camera movies acquired at a rate of 50 kframes/s with 2 μs exposure time to obtain the temporal evolution of the emissivity fluctuations. Conditional average sampling is also applied to visualize and measure the sizes of structures associated with the interchange mode. The ω-time and two-dimensional k-space Fourier analyses of the reconstructed emissivity fluctuations show the same interchange mode that is detected in the ω and k spectra of the ion saturation current fluctuations measured by probes. Small scale turbulent plasma structures can be detected and tracked in the reconstructed emissivity
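
    A hedged sketch of the pixel-method inversion mentioned above: line-integrated signals are modelled as a geometry (chord-length) matrix acting on pixel emissivities, and the overdetermined system is solved by a truncated singular value decomposition. The geometry matrix and signals below are random stand-ins, not TORPEX data.

```python
# Overdetermined pixel-method tomography solved with a truncated SVD pseudo-inverse.
import numpy as np

rng = np.random.default_rng(3)
n_chords, n_pix = 400, 100
G = rng.random((n_chords, n_pix))               # chord-length / geometry matrix (toy)
emiss_true = rng.random(n_pix)
signals = G @ emiss_true + 0.01 * rng.normal(size=n_chords)   # noisy line integrals

U, s, Vt = np.linalg.svd(G, full_matrices=False)
keep = s > 1e-3 * s[0]                          # truncate small singular values for stability
emiss = Vt[keep].T @ ((U[:, keep].T @ signals) / s[keep])     # reconstructed emissivity
```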

  8. A maximum entropy reconstruction technique for tomographic particle image velocimetry

    International Nuclear Information System (INIS)

    Bilsky, A V; Lozhkin, V A; Markovich, D M; Tokarev, M P

    2013-01-01

    This paper studies a novel approach for reducing the computational complexity of tomographic PIV. The proposed approach is an algebraic reconstruction technique, termed MENT (maximum entropy). This technique computes the three-dimensional light intensity distribution several times faster than SMART, using at least ten times less memory. Additionally, the reconstruction quality remains nearly the same as with SMART. This paper presents a theoretical comparison of the computational performance of MENT, SMART and MART, followed by validation using synthetic particle images. Both the theoretical assessment and the validation on synthetic images demonstrate a significant reduction in computational time. The data processing accuracy of MENT was compared to that of SMART in a slot jet experiment. A comparison of the average velocity profiles shows a high level of agreement between the results obtained with MENT and those obtained with SMART. (paper)

  9. Evaluation of tomographic image quality of extended and conventional parallel hole collimators using maximum likelihood expectation maximization algorithm by Monte Carlo simulations.

    Science.gov (United States)

    Moslemi, Vahid; Ashoor, Mansour

    2017-10-01

    One of the major problems associated with parallel hole collimators (PCs) is the trade-off between their resolution and sensitivity. To address this problem, a novel PC - namely, the extended parallel hole collimator (EPC) - was proposed, in which trapezoidal denticles were added onto the septa on the detector side. In this study, an EPC was designed and its performance was compared with that of two PCs, PC35 and PC41, with a hole size of 1.5 mm and hole lengths of 35 and 41 mm, respectively. The Monte Carlo method was used to calculate the important parameters such as resolution, sensitivity, scattering, and penetration ratio. A Jaszczak phantom was also simulated to evaluate the resolution and contrast of tomographic images, which were produced by the EPC6, PC35, and PC41 using the Monte Carlo N-particle version 5 code, and the tomographic images were reconstructed using the maximum likelihood expectation maximization algorithm. The sensitivity of the EPC6 was increased by 20.3% in comparison with that of the PC41 at identical spatial resolution and full-width at tenth maximum. Moreover, the penetration and scattering ratio of the EPC6 was 1.2% less than that of the PC41. The simulated phantom images show that the EPC6 increases the contrast-resolution and contrast-to-noise ratio compared with those of PC41 and PC35. When compared with PC41 and PC35, the EPC6 improved the trade-off between resolution and sensitivity, reduced the penetration and scattering ratios, and produced images with higher quality. The EPC6 can be used to increase the detectability of finer details in nuclear medicine images.

  10. GPU acceleration towards real-time image reconstruction in 3D tomographic diffractive microscopy

    Science.gov (United States)

    Bailleul, J.; Simon, B.; Debailleul, M.; Liu, H.; Haeberlé, O.

    2012-06-01

    Phase microscopy techniques have regained interest because they allow the observation of unprepared specimens with excellent temporal resolution. Tomographic diffractive microscopy is an extension of holographic microscopy which permits 3D observations with a finer resolution than incoherent light microscopes. Specimens are imaged by a series of 2D holograms: their accumulation progressively fills the range of frequencies of the specimen in Fourier space. A 3D inverse FFT eventually provides a spatial image of the specimen. Consequently, acquisition and then reconstruction are both required before an image is produced, which hinders real-time control of the observed specimen. The MIPS Laboratory has built a tomographic diffractive microscope with an unsurpassed 130 nm resolution but a low imaging speed of no less than one minute per acquisition. Afterwards, a high-end PC reconstructs the 3D image in 20 seconds. We now aim at an interactive system providing preview images during the acquisition for monitoring purposes. We first present a prototype implementing this solution on CPU: acquisition and reconstruction are tied in a producer-consumer scheme, sharing common data in CPU memory. Then we present a prototype dispatching some reconstruction tasks to the GPU in order to take advantage of SIMD parallelization for FFT and higher bandwidth for filtering operations. The CPU scheme takes 6 seconds for a 3D image update while the GPU scheme can go down to between 1 and 2 seconds depending on the GPU class. This opens opportunities for 4D imaging of living organisms or crystallization processes. We also consider the relevance of GPU for 3D image interaction in our specific conditions.
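
    A hedged sketch of the preview pipeline described above: 2D hologram spectra progressively fill a 3D Fourier volume, and a spatial preview can be produced at any time by a 3D inverse FFT. The frequency-placement mask below is a random stand-in for the actual Ewald-sphere cap geometry of the microscope, and all sizes are illustrative.

```python
# Progressive Fourier-space accumulation with on-demand 3-D inverse FFT previews.
import numpy as np

n = 64
fourier = np.zeros((n, n, n), dtype=complex)     # accumulated object spectrum
coverage = np.zeros((n, n, n))                   # how often each frequency was measured
rng = np.random.default_rng(4)
obj = rng.random((n, n, n))                      # toy specimen
obj_ft = np.fft.fftn(obj)

for _ in range(20):                              # successive holographic acquisitions
    cap = rng.random((n, n, n)) < 0.05           # stand-in for one illumination's frequency support
    fourier[cap] += obj_ft[cap]                  # add the newly measured frequencies
    coverage[cap] += 1
    avg = np.where(coverage > 0, fourier / np.maximum(coverage, 1), 0)
    preview = np.fft.ifftn(avg).real             # spatial preview available during acquisition
```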

  11. Monte Carlo simulation with the Gate software using grid computing

    International Nuclear Information System (INIS)

    Reuillon, R.; Hill, D.R.C.; Gouinaud, C.; El Bitar, Z.; Breton, V.; Buvat, I.

    2009-03-01

    Monte Carlo simulations are widely used in emission tomography, for protocol optimization, design of processing or data analysis methods, tomographic reconstruction, or tomograph design optimization. Monte Carlo simulations that need many replicates to obtain good statistical results can easily be executed in parallel using the 'Multiple Replications In Parallel' approach. However, several precautions have to be taken in the generation of the parallel streams of pseudo-random numbers. In this paper, we present the distribution of Monte Carlo simulations performed with the GATE software using local clusters and grid computing. We obtained very convincing results with this large medical application, thanks to the EGEE Grid (Enabling Grid for E-science), achieving in one week computations that could have taken more than 3 years of processing on a single computer. This work has been achieved thanks to a generic object-oriented toolbox called DistMe which we designed to automate this kind of parallelization for Monte Carlo simulations. This toolbox, written in Java, is freely available on SourceForge and helped to ensure a rigorous distribution of pseudo-random number streams. It is based on the use of a documented XML format for the status of random number generators. (authors)
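
    The 'Multiple Replications In Parallel' idea with rigorously separated pseudo-random streams can be sketched as below, here using NumPy's SeedSequence spawning rather than the DistMe toolbox described in the abstract; the estimator itself is just a toy Monte Carlo computation.

```python
# Independent replicates with non-overlapping random-number streams, run in parallel.
import numpy as np
from concurrent.futures import ProcessPoolExecutor

def one_replicate(seed_seq):
    rng = np.random.default_rng(seed_seq)          # independent, non-overlapping stream
    hits = rng.random(1_000_000) ** 2 + rng.random(1_000_000) ** 2 <= 1.0
    return 4.0 * hits.mean()                       # toy Monte Carlo estimate (pi)

if __name__ == "__main__":
    children = np.random.SeedSequence(2024).spawn(8)   # one stream per replicate
    with ProcessPoolExecutor() as pool:
        estimates = list(pool.map(one_replicate, children))
    print(np.mean(estimates))
```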

  12. Evaluation of a 3D point cloud tetrahedral tomographic reconstruction method

    Science.gov (United States)

    Pereira, N. F.; Sitek, A.

    2010-09-01

    Tomographic reconstruction on an irregular grid may be superior to reconstruction on a regular grid. This is achieved through an appropriate choice of the image space model, the selection of an optimal set of points and the use of any available prior information during the reconstruction process. Accordingly, a number of reconstruction-related parameters must be optimized for best performance. In this work, a 3D point cloud tetrahedral mesh reconstruction method is evaluated for quantitative tasks. A linear image model is employed to obtain the reconstruction system matrix and five point generation strategies are studied. The evaluation is performed using the recovery coefficient, as well as voxel- and template-based estimates of bias and variance measures, computed over specific regions in the reconstructed image. A similar analysis is performed for regular grid reconstructions that use voxel basis functions. The maximum likelihood expectation maximization reconstruction algorithm is used. For the tetrahedral reconstructions, of the five point generation methods that are evaluated, three use image priors. For evaluation purposes, an object consisting of overlapping spheres with varying activity is simulated. The exact parallel projection data of this object are obtained analytically using a parallel projector, and multiple Poisson noise realizations of these exact data are generated and reconstructed using the different point generation strategies. The unconstrained nature of point placement in some of the irregular mesh-based reconstruction strategies has superior activity recovery for small, low-contrast image regions. The results show that, with an appropriately generated set of mesh points, the irregular grid reconstruction methods can out-perform reconstructions on a regular grid for mathematical phantoms, in terms of the performance measures evaluated.

  13. Evaluation of a 3D point cloud tetrahedral tomographic reconstruction method

    International Nuclear Information System (INIS)

    Pereira, N F; Sitek, A

    2010-01-01

    Tomographic reconstruction on an irregular grid may be superior to reconstruction on a regular grid. This is achieved through an appropriate choice of the image space model, the selection of an optimal set of points and the use of any available prior information during the reconstruction process. Accordingly, a number of reconstruction-related parameters must be optimized for best performance. In this work, a 3D point cloud tetrahedral mesh reconstruction method is evaluated for quantitative tasks. A linear image model is employed to obtain the reconstruction system matrix and five point generation strategies are studied. The evaluation is performed using the recovery coefficient, as well as voxel- and template-based estimates of bias and variance measures, computed over specific regions in the reconstructed image. A similar analysis is performed for regular grid reconstructions that use voxel basis functions. The maximum likelihood expectation maximization reconstruction algorithm is used. For the tetrahedral reconstructions, of the five point generation methods that are evaluated, three use image priors. For evaluation purposes, an object consisting of overlapping spheres with varying activity is simulated. The exact parallel projection data of this object are obtained analytically using a parallel projector, and multiple Poisson noise realizations of these exact data are generated and reconstructed using the different point generation strategies. The unconstrained nature of point placement in some of the irregular mesh-based reconstruction strategies has superior activity recovery for small, low-contrast image regions. The results show that, with an appropriately generated set of mesh points, the irregular grid reconstruction methods can out-perform reconstructions on a regular grid for mathematical phantoms, in terms of the performance measures evaluated.

  14. Evaluation of a 3D point cloud tetrahedral tomographic reconstruction method

    Energy Technology Data Exchange (ETDEWEB)

    Pereira, N F; Sitek, A, E-mail: nfp4@bwh.harvard.ed, E-mail: asitek@bwh.harvard.ed [Department of Radiology, Brigham and Women's Hospital-Harvard Medical School, Boston, MA (United States)

    2010-09-21

    Tomographic reconstruction on an irregular grid may be superior to reconstruction on a regular grid. This is achieved through an appropriate choice of the image space model, the selection of an optimal set of points and the use of any available prior information during the reconstruction process. Accordingly, a number of reconstruction-related parameters must be optimized for best performance. In this work, a 3D point cloud tetrahedral mesh reconstruction method is evaluated for quantitative tasks. A linear image model is employed to obtain the reconstruction system matrix and five point generation strategies are studied. The evaluation is performed using the recovery coefficient, as well as voxel- and template-based estimates of bias and variance measures, computed over specific regions in the reconstructed image. A similar analysis is performed for regular grid reconstructions that use voxel basis functions. The maximum likelihood expectation maximization reconstruction algorithm is used. For the tetrahedral reconstructions, of the five point generation methods that are evaluated, three use image priors. For evaluation purposes, an object consisting of overlapping spheres with varying activity is simulated. The exact parallel projection data of this object are obtained analytically using a parallel projector, and multiple Poisson noise realizations of these exact data are generated and reconstructed using the different point generation strategies. The unconstrained nature of point placement in some of the irregular mesh-based reconstruction strategies has superior activity recovery for small, low-contrast image regions. The results show that, with an appropriately generated set of mesh points, the irregular grid reconstruction methods can out-perform reconstructions on a regular grid for mathematical phantoms, in terms of the performance measures evaluated.

  15. The 3D tomographic image reconstruction software for prompt-gamma measurement of the boron neutron capture therapy

    International Nuclear Information System (INIS)

    Morozov, Boris; Auterinen, Iiro; Kotiluoto, Petri; Kortesniemi, Mika

    2006-01-01

    A tomographic imaging system based on measuring the spatial distribution of the neutron capture reaction during Boron Neutron Capture Therapy (BNCT) would be very useful for clinical purposes. Using gamma-detectors in a 2D panel, boron neutron capture and hydrogen neutron capture gamma-rays emitted by the neutron-irradiated region can be detected, and an image of the neutron capture events can be reconstructed. A 3D reconstruction software package has been written to support the development of a 3D prompt-gamma tomographic system. The package consists of three independent modules: phantom generation, reconstruction and evaluation modules. The reconstruction modules are based on the algebraic reconstruction technique (ART) and on the maximum likelihood expectation maximization method (ML-EM). In addition, two variants of ART, the simultaneous iterative reconstruction technique (SIRT) and the component averaging algorithm (CAV), have been included in the package, employing parallel codes for multiprocessor architectures. All implemented algorithms can use two different basis functions for the reconstruction of the region: one is the traditional voxel function, the other is the so-called blob function, a smooth, spherically symmetric generalized Kaiser-Bessel function. The generation module provides the phantom and projections with background by tracing the prompt gamma-rays for a given scanner geometry. The evaluation module makes statistical comparisons between the generated and reconstructed images, and provides figure-of-merit (FOM) values for the applied reconstruction algorithms. The package has been written in the C language and tested under Linux and Windows platforms. A simple graphical user interface (GUI) is used for command execution and visualization purposes. (author)
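
    For reference, the 'blob' basis function mentioned above, the generalized Kaiser-Bessel window, can be evaluated as sketched below. The parameter values (radius a, shape alpha, order m) are illustrative defaults, not those used in the BNCT package.

```python
# Generalized Kaiser-Bessel 'blob' basis function on a radial grid.
import numpy as np
from scipy.special import iv   # modified Bessel function of the first kind

def kaiser_bessel_blob(r, a=2.0, alpha=10.4, m=2):
    """Value of the generalized Kaiser-Bessel blob at radius r (zero for r > a)."""
    r = np.asarray(r, dtype=float)
    z = np.sqrt(np.clip(1.0 - (r / a) ** 2, 0.0, 1.0))
    val = (z ** m) * iv(m, alpha * z) / iv(m, alpha)
    return np.where(r <= a, val, 0.0)

radii = np.linspace(0.0, 3.0, 7)
print(kaiser_bessel_blob(radii))   # smoothly decays to zero at the blob radius
```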

  16. Evaluation of tomographic ISOCAM Park II gamma camera parameters using Monte Carlo method

    International Nuclear Information System (INIS)

    Oramas Polo, Ivón

    2015-01-01

    In this paper, the tomographic parameters of the ISOCAM Park II gamma camera were evaluated using the Monte Carlo code SIMIND. Uniformity, resolution and contrast were evaluated by simulating a Jaszczak phantom. In addition, a qualitative assessment of the center of rotation was performed. The simulation results are compared against the manufacturer's specifications for the gamma camera, taking into account the National Protocol for Quality Control of Nuclear Medicine Instruments of the Cuban Medical Equipment Control Center. A computational Jaszczak phantom model with three different activity distributions was obtained; it can be used to perform studies with gamma cameras. (author)

  17. Advances in tomographic PIV

    NARCIS (Netherlands)

    Novara, M.

    2013-01-01

    This research deals with advanced developments in 3D particle image velocimetry based on the tomographic PIV technique (Tomo-PIV). The latter is a relatively recent measurement technique introduced by Elsinga et al. in 2005, which is based on the tomographic reconstruction of particle tracers in

  18. Implementation of Japanese male and female tomographic phantoms to multi-particle Monte Carlo code for ionizing radiation dosimetry

    International Nuclear Information System (INIS)

    Lee, Choonsik; Nagaoka, Tomoaki; Lee, Jai-Ki

    2006-01-01

    Japanese male and female tomographic phantoms, which have been developed for radio-frequency electromagnetic-field dosimetry, were implemented into a multi-particle Monte Carlo transport code to evaluate realistic dose distributions in the human body exposed to radiation fields. The Japanese tomographic phantoms, which were developed from whole-body magnetic resonance images of average adult Japanese male and female subjects, were processed as follows to be implemented into the general-purpose multi-particle Monte Carlo code MCNPX2.5. The original array sizes of the Japanese male and female phantoms, 320 x 160 x 866 voxels and 320 x 160 x 804 voxels, respectively, were reduced to 320 x 160 x 433 voxels and 320 x 160 x 402 voxels owing to the memory limitation of MCNPX2.5. The 3D voxel arrays of the phantoms were processed using the built-in repeated structure algorithm, in which the human anatomy is described by a repeated lattice of tiny cubes containing the material composition and organ index number. The original phantom data were converted into an ASCII file, which can be directly ported into the lattice card of the MCNPX2.5 input deck by using an in-house code. A total of 30 material compositions obtained from International Commission on Radiation Units and Measurements (ICRU) Report 46 were assigned to 54 and 55 organs and tissues in the male and female phantoms, respectively, and imported into the material card of MCNPX2.5 along with the corresponding cross section data. Illustrative calculations of absorbed doses for 26 internal organs and of the effective dose were performed for idealized broad parallel photon and neutron beams in anterior-posterior irradiation geometry, which is typical for workers at nuclear power plants. The results were compared with data from other Japanese and Caucasian tomographic phantoms, and with International Commission on Radiological Protection (ICRP) Report 74. The further investigation of the difference in organ dose and effective dose among tomographic
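
    A hedged sketch of the phantom preparation step described above: the axial dimension of an organ-index voxel array is halved to fit memory limits and the result is dumped to an ASCII file suitable for pasting into a lattice card. The array size, the decimation strategy (keeping every second slice rather than merging voxel pairs) and the file name are illustrative assumptions.

```python
# Downsample a voxel phantom along z and export its organ indices as ASCII rows.
import numpy as np

phantom = np.random.default_rng(5).integers(0, 55, size=(32, 16, 86), dtype=np.int16)
half = phantom[:, :, ::2]                        # keep every second slice along z

with open("lattice_card.txt", "w") as f:         # one row of organ indices per line
    for iz in range(half.shape[2]):
        for iy in range(half.shape[1]):
            f.write(" ".join(str(v) for v in half[:, iy, iz]) + "\n")
```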

  19. Tomographic reconstruction of atmospheric volumes from infrared limb-imager measurements

    Energy Technology Data Exchange (ETDEWEB)

    Ungermann, Joern

    2011-08-12

    State-of-the-art nadir and limb sounders, but also in situ measurements, do not offer the capability to highly resolve the atmosphere in all three dimensions. This leaves an observational gap with respect to small-scale structures that arise frequently in the atmosphere and that still lack a quantitative understanding. For instance, filaments and tropopause folds in the upper troposphere and lower stratosphere (UTLS) are crucial for its composition and variability. One way to achieve a highly resolved three-dimensional (3-D) picture of the atmosphere is the tomographic evaluation of limb-imager measurements. This thesis presents a methodology for the tomographic reconstruction of atmospheric constituents. To be able to deal with the large increase of observations and unknowns compared to conventional retrievals, great care is taken to reduce memory consumption and processing time. This method is used to evaluate the performance of two upcoming infrared limb-imager instruments and to prepare their missions. The first examined instrument is the infrared limb-imager on board of PREMIER (Process Exploration through Measurements of Infrared and millimetrewave Emitted Radiation), one of three remaining candidates for ESA's 7th Earth Explorer mission. Scientific goals of PREMIER are, among others, the examination of gravity waves and the quantification of processes controlling atmospheric composition in the UTLS, a region of particular importance for climate change. Simulations based on the performance requirements of this instrument deliver a vertical resolution that is slightly better than its vertical field-of-view (about 0.75 km) and a horizontal resolution of ~25 km x 70 km. Non-linear end-to-end simulations for various gravity wave patterns demonstrate that the high 3-D resolution of PREMIER considerably extends the range of detectable gravity waves in terms of horizontal and vertical wavelength compared to previous observations. The second examined

  20. Tomographic reconstruction of atmospheric volumes from infrared limb-imager measurements

    Energy Technology Data Exchange (ETDEWEB)

    Ungermann, Joern

    2011-08-12

    State-of-the-art nadir and limb sounders, but also in situ measurements, do not offer the capability to highly resolve the atmosphere in all three dimensions. This leaves an observational gap with respect to small-scale structures that arise frequently in the atmosphere and that still lack a quantitative understanding. For instance, filaments and tropopause folds in the upper troposphere and lower stratosphere (UTLS) are crucial for its composition and variability. One way to achieve a highly resolved three-dimensional (3-D) picture of the atmosphere is the tomographic evaluation of limb-imager measurements. This thesis presents a methodology for the tomographic reconstruction of atmospheric constituents. To be able to deal with the large increase of observations and unknowns compared to conventional retrievals, great care is taken to reduce memory consumption and processing time. This method is used to evaluate the performance of two upcoming infrared limb-imager instruments and to prepare their missions. The first examined instrument is the infrared limb-imager on board of PREMIER (Process Exploration through Measurements of Infrared and millimetrewave Emitted Radiation), one of three remaining candidates for ESA's 7th Earth Explorer mission. Scientific goals of PREMIER are, among others, the examination of gravity waves and the quantification of processes controlling atmospheric composition in the UTLS, a region of particular importance for climate change. Simulations based on the performance requirements of this instrument deliver a vertical resolution that is slightly better than its vertical field-of-view (about 0.75 km) and a horizontal resolution of ~25 km x 70 km. Non-linear end-to-end simulations for various gravity wave patterns demonstrate that the high 3-D resolution of PREMIER considerably extends the range of detectable gravity waves in terms of horizontal and vertical wavelength compared to previous observations. The second examined instrument

  1. The robustness of two tomography reconstructing techniques with heavily noisy dynamical experimental data from a high speed gamma-ray tomograph

    International Nuclear Information System (INIS)

    Vasconcelos, Geovane Vitor; Melo, Silvio de Barros; Dantas, Carlos Costa; Moreira, Icaro Malta; Johansen, Geira; Maad, Rachid

    2013-01-01

    The PSIRT (Particle Systems Iterative Reconstructive Technique) is, like the ART method, an iterative tomographic reconstruction technique, recommended for reconstructing the catalyst density distribution in FCC-type risers in the oil refining process. PSIRT is based on particle systems from computer graphics: the material to be reconstructed is initially represented as a set of particles subject to force fields emanating from the beams, whose intensities are parameterized by the difference between the experimental reading of a given beam trajectory and the value corresponding to the current number of particles landed in that trajectory. A dynamical process then takes place in which the attracting force fields of the beams compete for the particles. At the end, once equilibrium is established, the particles are replaced by the corresponding pixel regions. The High Speed Gamma-ray Tomograph is a 5-source fan-beam device with a 17-detector deck per source, capable of producing up to a thousand complete sinograms per second. Around 70,000 experimental sinograms from this tomograph were produced, simulating the movement of gas bubbles at different angular speeds immersed in oil within the vessel, through the use of a two-hole polypropylene phantom. The sinogram frames were set with several different detector integration times. This article studies and compares the robustness of both the ART and PSIRT methods in this heavily noisy scenario, where the noise comes not only from limitations in the dynamical sampling, but also from the underlying apparatus that produces the counts in the tomograph. Visual inspection of the resulting images suggests that PSIRT is a more robust method than ART for noisy data, since it almost never presents globally scattered noise. (author)

  2. Precision of quantum tomographic detection of radiation

    Energy Technology Data Exchange (ETDEWEB)

    D'Ariano, G.M. (Dipartimento di Fisica "Alessandro Volta", Via A. Bassi 6, I-27100, Pavia (Italy); Istituto Nazionale di Fisica Nucleare, Sezione di Pavia, Via A. Bassi 6, I-27100, Pavia (Italy)); Macchiavello, Chiara (Dipartimento di Fisica "Alessandro Volta", Via A. Bassi 6, I-27100, Pavia (Italy)); Paris, M.G.A. (Dipartimento di Fisica "Alessandro Volta", Via A. Bassi 6, I-27100, Pavia (Italy))

    1994-11-21

    Homodyne tomography provides an experimental technique for reconstructing the density matrix of the radiation field. Here we analyze the tomographic precision in recovering observables like the photon number, the quadrature, and the phase. We show that tomographic reconstruction, despite providing a complete characterization of the state of the field, is generally much less efficient than conventional detection techniques. ((orig.))

  3. Precision of quantum tomographic detection of radiation

    International Nuclear Information System (INIS)

    D'Ariano, G.M.; Macchiavello, Chiara; Paris, M.G.A.

    1994-01-01

    Homodyne tomography provides an experimental technique for reconstructing the density matrix of the radiation field. Here we analyze the tomographic precision in recovering observables like the photon number, the quadrature, and the phase. We show that tomographic reconstruction, despite providing a complete characterization of the state of the field, is generally much less efficient than conventional detection techniques. ((orig.))

  4. Simulation of Tomographic Reconstruction of Magnetosphere Plasma Distribution By Multi-spacecraft Systems.

    Science.gov (United States)

    Kunitsyn, V.; Nesterov, I.; Andreeva, E.; Zelenyi, L.; Veselov, M.; Galperin, Y.; Buchner, J.

    A satellite radiotomography method for electron density distributions was recently proposed for a closely-spaced multi-spacecraft group of high-altitude satellites to study the physics of the reconnection process. The original idea of the ROY project is to use a constellation of spacecraft (one main and several sub-satellites) in order to carry out closely-spaced multipoint measurements and 2D tomographic reconstruction of electron density in the space between the main satellite and the subsatellites. The distances between the satellites were chosen to vary from tens to a few hundred kilometers. The easiest data interpretation is achieved when the subsatellites are placed along the plasma streamline. Then, whenever a plasma density irregularity moves between the main satellite and the subsatellites, it will be scanned in different directions and a 2D plasma distribution can be obtained from these projections. However, in general the subsatellites are not placed exactly along the plasma streamline. The method of plasma velocity determination relative to multi-spacecraft systems is considered. Possibilities of 3D tomographic imaging using multi-spacecraft systems are analyzed. The modeling has shown that an efficient scheme for 3D tomographic imaging would be to place spacecraft in different planes so that the angle between the planes is no more than ten degrees. Work is supported by INTAS PROJECT 2000-465.

  5. Optimization of reconstruction algorithms using Monte Carlo simulation

    International Nuclear Information System (INIS)

    Hanson, K.M.

    1989-01-01

    A method for optimizing reconstruction algorithms is presented that is based on how well a specified task can be performed using the reconstructed images. Task performance is numerically assessed by a Monte Carlo simulation of the complete imaging process including the generation of scenes appropriate to the desired application, subsequent data taking, reconstruction, and performance of the stated task based on the final image. The use of this method is demonstrated through the optimization of the Algebraic Reconstruction Technique (ART), which reconstructs images from their projections by an iterative procedure. The optimization is accomplished by varying the relaxation factor employed in the updating procedure. In some of the imaging situations studied, it is found that the optimization of constrained ART, in which a nonnegativity constraint is invoked, can vastly increase the detectability of objects. There is little improvement attained for unconstrained ART. The general method presented may be applied to the problem of designing neutron-diffraction spectrometers. 11 refs., 6 figs., 2 tabs
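
    The relaxation-factor tuning studied above can be illustrated by the constrained ART loop below, in which each ray update is scaled by a relaxation factor and negative pixel values are clipped. The system matrix, data and the set of relaxation factors tried are toy stand-ins for the simulated imaging chain.

```python
# Constrained ART with a tunable relaxation factor on toy projection data.
import numpy as np

rng = np.random.default_rng(6)
n_rays, n_pix = 120, 64
A = rng.random((n_rays, n_pix))                  # ray/pixel intersection weights (toy)
x_true = rng.random(n_pix)
p = A @ x_true + 0.01 * rng.normal(size=n_rays)  # measured projections

def art(relaxation, n_sweeps=20, nonneg=True):
    x = np.zeros(n_pix)
    for _ in range(n_sweeps):
        for i in range(n_rays):                  # one additive update per ray
            a = A[i]
            x += relaxation * (p[i] - a @ x) / (a @ a) * a
            if nonneg:
                np.clip(x, 0.0, None, out=x)     # non-negativity constraint
    return x

for lam in (0.2, 0.5, 1.0):
    print(f"relaxation {lam}: error {np.linalg.norm(art(lam) - x_true):.3f}")
```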

  6. Optimization of reconstruction algorithms using Monte Carlo simulation

    International Nuclear Information System (INIS)

    Hanson, K.M.

    1989-01-01

    A method for optimizing reconstruction algorithms is presented that is based on how well a specified task can be performed using the reconstructed images. Task performance is numerically assessed by a Monte Carlo simulation of the complete imaging process including the generation of scenes appropriate to the desired application, subsequent data taking, reconstruction, and performance of the stated task based on the final image. The use of this method is demonstrated through the optimization of the Algebraic Reconstruction Technique (ART), which reconstructs images from their projections by an iterative procedure. The optimization is accomplished by varying the relaxation factor employed in the updating procedure. In some of the imaging situations studied, it is found that the optimization of constrained ART, in which a non-negativity constraint is invoked, can vastly increase the detectability of objects. There is little improvement attained for unconstrained ART. The general method presented may be applied to the problem of designing neutron-diffraction spectrometers. (author)

  7. Bayesian tomographic reconstruction of microsystems

    International Nuclear Information System (INIS)

    Salem, Sofia Fekih; Vabre, Alexandre; Mohammad-Djafari, Ali

    2007-01-01

    X-ray transmission microtomography plays an increasingly dominant role in the study and understanding of microsystems. Within this framework, a high-resolution X-ray microtomography experimental setup was developed at CEA-List to quantify the physical parameters related to fluid flow in microsystems. Several difficulties arise from the nature of the experimental data collected on this setup: increased measurement errors due to various physical phenomena occurring during image formation (scattering, beam hardening), and specificities of the setup (limited angle, partial view of the object, weak contrast). To reconstruct the object we must solve an inverse problem. This inverse problem is known to be ill-posed. It therefore needs to be regularized by introducing prior information. The main prior information we account for is that the object is composed of a finite known number of different materials distributed in compact regions. This a priori information is introduced via a Gauss-Markov field for the contrast distributions, with a hidden Potts-Markov field for the material classes, within the Bayesian estimation framework. The computations are done by using an appropriate Markov Chain Monte Carlo (MCMC) technique. In this paper, we first present the basic steps of the proposed algorithms. Then we focus on one of the main steps in any iterative reconstruction method, which is the computation of the forward and adjoint operators (projection and backprojection). A fast implementation of these two operators is crucial for the real application of the method. We give some details on the fast computation of these steps and show some preliminary results of simulations
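
    A minimal sketch of a matched projector/backprojector pair of the kind needed inside any iterative (here Bayesian/MCMC) reconstruction: the forward operator sums image pixels along rays and the adjoint spreads ray values back. Only axis-aligned views are used, so this is a toy stand-in for the fast projectors discussed above; the adjointness check verifies that the pair is consistent.

```python
# Matched forward projection / backprojection pair with an adjointness check.
import numpy as np

def forward(img):
    # projections at 0 and 90 degrees: column sums and row sums
    return np.concatenate([img.sum(axis=0), img.sum(axis=1)])

def adjoint(proj, shape):
    n, m = shape
    p0, p90 = proj[:m], proj[m:]
    return np.tile(p0, (n, 1)) + np.tile(p90[:, None], (1, m))

rng = np.random.default_rng(7)
img = rng.random((16, 16))
y = rng.random(32)
# <forward(img), y> must equal <img, adjoint(y)> for a true adjoint pair
print(np.allclose(forward(img) @ y, np.sum(img * adjoint(y, img.shape))))
```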

  8. Tomographic visualization of stress corrosion cracks in tubing

    International Nuclear Information System (INIS)

    Morris, R.A.; Kruger, R.P.; Wecksung, G.W.

    1979-06-01

    A feasibility study was conducted to determine the possibility of detecting and sizing cracks in reactor cooling water tubes using tomographic techniques. Due to time and financial constraints, only one tomographic reconstruction using the best technique available was made. The results indicate that tomographic reconstructions can, in fact, detect cracks in the tubing and might possibly be capable of measuring the depth of the cracks. Limits of detectability and sensitivity have not been determined but should be investigated in any future work

  9. Statistical list-mode image reconstruction for the high resolution research tomograph

    International Nuclear Information System (INIS)

    Rahmim, A; Lenox, M; Reader, A J; Michel, C; Burbar, Z; Ruth, T J; Sossi, V

    2004-01-01

    We have investigated statistical list-mode reconstruction applicable to a depth-encoding high resolution research tomograph. An image non-negativity constraint has been employed in the reconstructions and is shown to effectively remove the overestimation bias introduced by the sinogram non-negativity constraint. We have furthermore implemented a convergent subsetized (CS) list-mode reconstruction algorithm, based on previous work (Hsiao et al 2002 Conf. Rec. SPIE Med. Imaging 4684 10-19; Hsiao et al 2002 Conf. Rec. IEEE Int. Symp. Biomed. Imaging 409-12) on convergent histogram OSEM reconstruction. We have demonstrated that the first step of the convergent algorithm is exactly equivalent (unlike the histogram-mode case) to the regular subsetized list-mode EM algorithm, while the second and final step takes the form of additive updates in image space. We have shown that in terms of contrast, noise as well as FWHM width behaviour, the CS algorithm is robust and does not result in limit cycles. A hybrid algorithm based on the ordinary and the convergent algorithms is also proposed, and is shown to combine the advantages of the two algorithms (i.e. it is able to reach a higher image quality in fewer iterations while maintaining the convergent behaviour), making the hybrid approach a good alternative to the ordinary subsetized list-mode EM algorithm

  10. Application of the FDK algorithm for multi-slice tomographic image reconstruction; Aplicacao do algoritmo FDK para a reconstrucao de imagens tomograficas multicortes

    Energy Technology Data Exchange (ETDEWEB)

    Costa, Paulo Roberto, E-mail: pcosta@if.usp.b [Universidade de Sao Paulo (IFUSP), SP (Brazil). Inst. de Fisica. Dept. de Fisica Nuclear; Araujo, Ericky Caldas de Almeida [Fine Image Technology, Sao Paulo, SP (Brazil)

    2010-08-15

    This work consisted of the study and application of the FDK (Feldkamp-Davis-Kress) algorithm for tomographic image reconstruction using cone-beam geometry, resulting in the implementation of an adapted multi-slice computed tomography system. For the acquisition of the projections, a rotating platform coupled to a goniometer, X-ray equipment and a charge-coupled device (CCD) type digital image detector were used. The FDK algorithm was implemented on a computer with a Pentium(R) Xeon(TM) 3.0 processor, which was used for the reconstruction process. Initially, the original FDK algorithm was applied considering only the ideal physical conditions in the measurement process. Then some artifact corrections related to the projection measurement process were incorporated. The implemented MSCT system was calibrated. A specially designed and manufactured object with a known linear attenuation coefficient distribution (mu(r)) was used for this purpose. Finally, the implemented MSCT system was used for multi-slice tomographic reconstruction of an inhomogeneous object, whose distribution mu(r) was unknown. Some aspects of the reconstructed images were analyzed to assess the robustness and reproducibility of the system. During the system calibration, a linear relationship between CT number and the linear attenuation coefficients of materials was verified, which validates the application of the implemented multi-slice tomographic system for the characterization of the linear attenuation coefficients of several distinct objects. (author)
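
    The calibration step described above, fitting a linear relation between reconstructed CT numbers and known linear attenuation coefficients and then using it to characterize an unknown object, can be sketched as below. All numbers are invented for illustration and are not the authors' measurements.

```python
# Linear CT-number-to-attenuation calibration and its use on an unknown region.
import numpy as np

mu_known = np.array([0.10, 0.19, 0.27, 0.42])          # cm^-1, calibration materials (toy)
ct_measured = np.array([102.0, 196.0, 275.0, 431.0])   # mean CT numbers in matching ROIs

slope, intercept = np.polyfit(ct_measured, mu_known, deg=1)   # least-squares linear fit

def ct_to_mu(ct_number):
    return slope * ct_number + intercept

print(ct_to_mu(350.0))   # estimated mu (cm^-1) of an unknown region
```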

  11. Tomographic scanning apparatus

    International Nuclear Information System (INIS)

    1981-01-01

    Details are given of a tomographic scanning apparatus, with particular reference to a multiplexer slip ring means for receiving output from the detectors and enabling interfeed to the image reconstruction station. (U.K.)

  12. Analytical algorithm for the generation of polygonal projection data for tomographic reconstruction

    International Nuclear Information System (INIS)

    Davis, G.R.

    1996-01-01

    Tomographic reconstruction algorithms and filters can be tested using a mathematical phantom, that is, a computer program which takes numerical data as its input and outputs derived projection data. The input data is usually in the form of pixel 'densities' over a regular grid, or position and dimensions of simple, geometrical objects. The former technique allows a greater variety of objects to be simulated, but is less suitable in the case when very small (relative to the ray-spacing) features are to be simulated. The second technique is normally used to simulate biological specimens, typically a human skull, modelled as a number of ellipses. This is not suitable for simulating non-biological specimens with features such as straight edges and fine cracks. We have therefore devised an algorithm for simulating objects described as a series of polygons. These polygons, or parts of them, may be smaller than the ray-spacing and there is no limit, except that imposed by computing resources, on the complexity, number or superposition of polygons. A simple test of such a phantom, reconstructed using the filtered back-projection method, revealed reconstruction artefacts not normally seen with 'biological' phantoms. (orig.)
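
    The core computation for such an analytical polygon phantom is the length of the chord a ray cuts through each polygon; projection data are then the density-weighted sum of chords over all polygons. The sketch below handles a single convex polygon by intersecting the ray with every edge; the square and ray are toy examples, not the paper's phantom definition.

```python
# Chord length of a ray through a convex polygon (building block of a polygon phantom).
import numpy as np

def chord_length(p0, d, vertices):
    """Length of the intersection of the line p0 + t*d with a convex polygon."""
    d = np.asarray(d, float) / np.linalg.norm(d)
    verts = np.asarray(vertices, float)
    ts = []
    for a, b in zip(verts, np.roll(verts, -1, axis=0)):      # loop over polygon edges
        e = b - a
        denom = d[0] * (-e[1]) + d[1] * e[0]                 # 2x2 determinant
        if abs(denom) < 1e-12:
            continue                                          # ray parallel to this edge
        t = ((a[0] - p0[0]) * (-e[1]) + (a[1] - p0[1]) * e[0]) / denom
        u = (d[0] * (a[1] - p0[1]) - d[1] * (a[0] - p0[0])) / denom
        if 0.0 <= u <= 1.0:
            ts.append(t)                                      # intersection lies on the edge
    return max(ts) - min(ts) if len(ts) >= 2 else 0.0

square = [(0, 0), (1, 0), (1, 1), (0, 1)]
print(chord_length(p0=(-1.0, 0.5), d=(1.0, 0.0), vertices=square))   # expected 1.0
```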

  13. Reconstruction of Clear-PEM data with STIR

    CERN Document Server

    Martins, M V; Rodrigues, P; Trindade, A; Oliveira, N; Correia, M; Cordeiro, H; Ferreira, N C; Varela, J; Almeida, P

    2006-01-01

    The Clear-PEM scanner is a device based on planar detectors that is currently under development within the Crystal Clear Collaboration, at CERN. The basis for 3D image reconstruction in Clear-PEM is the software for tomographic image reconstruction (STIR). STIR is an open source object-oriented library that efficiently deals with the 3D positron emission tomography data sets. This library was originally designed for the traditional cylindrical scanners. In order to make its use compatible with planar scanner data, new functionalities were introduced into the library's framework. In this work, Monte Carlo simulations of the Clear-PEM scanner acquisitions were used as input for image reconstruction with the 3D OSEM algorithm available in STIR. The results presented indicate that dual plate PEM data can be accurately reconstructed using the enhanced STIR framework.

  14. Image properties of list mode likelihood reconstruction for a rectangular positron emission mammography with DOI measurements

    International Nuclear Information System (INIS)

    Qi, Jinyi; Klein, Gregory J.; Huesman, Ronald H.

    2000-01-01

    A positron emission mammography scanner is under development at our Laboratory. The tomograph has a rectangular geometry consisting of four banks of detector modules. For each detector, the system can measure the depth of interaction information inside the crystal. The rectangular geometry leads to irregular radial and angular sampling and spatially variant sensitivity that are different from conventional PET systems. Therefore, it is of importance to study the image properties of the reconstructions. We adapted the theoretical analysis that we had developed for conventional PET systems to the list mode likelihood reconstruction for this tomograph. The local impulse response and covariance of the reconstruction can be easily computed using FFT. These theoretical results are also used with computer observer models to compute the signal-to-noise ratio for lesion detection. The analysis reveals the spatially variant resolution and noise properties of the list mode likelihood reconstruction. The theoretical predictions are in good agreement with Monte Carlo results

  15. Detectability in the presence of computed tomographic reconstruction noise

    International Nuclear Information System (INIS)

    Hanson, K.M.

    1977-01-01

    The multitude of commercial computed tomographic (CT) scanners which have recently been introduced for use in diagnostic radiology has given rise to a need to compare these different machines in terms of image quality and dose to the patient. It is therefore desirable to arrive at a figure of merit for a CT image which gives a measure of the diagnostic efficacy of that image. This figure of merit may well be dependent upon the specific visual task being performed. It is clearly important that the capabilities and deficiencies of the human observer as well as the interface between man and machine, namely the viewing system, be taken into account in formulating the figure of merit. Since the CT reconstruction is the result of computer processing, it is possible to use this processing to alter the characteristics of the displayed images. This image processing may improve or degrade the figure of merit

  16. Enhancement of precision and reduction of measuring points in tomographic reconstructions

    International Nuclear Information System (INIS)

    Lustfeld, H.; Hirschfeld, J.A.; Reissel, M.; Steffen, B.

    2011-01-01

    Accurate external measurements are required in tomographic problems to obtain reasonable knowledge of the internal structures. The distribution of the external measuring points is crucial. We suggest a procedure for systematically optimizing this distribution, viz. increasing the precision (i.e. shrinking the error bars) of the reconstruction by detecting the important measuring points and eliminating the irrelevant ones. In a realistic numerical example we apply our scheme to magnetotomography of fuel cells. The result is striking: starting from a smooth distribution of measuring points on the surface of a cuboid around the fuel cell, the number of measuring points can systematically be reduced by more than 90%. At the same time the precision increases by a factor of nearly 3.

  17. Linear information retrieval method in X-ray grating-based phase contrast imaging and its interchangeability with tomographic reconstruction

    Science.gov (United States)

    Wu, Z.; Gao, K.; Wang, Z. L.; Shao, Q. G.; Hu, R. F.; Wei, C. X.; Zan, G. B.; Wali, F.; Luo, R. H.; Zhu, P. P.; Tian, Y. C.

    2017-06-01

    In X-ray grating-based phase contrast imaging, information retrieval is necessary for quantitative research, especially for phase tomography. However, numerous and repetitive processes have to be performed for tomographic reconstruction. In this paper, we report a novel information retrieval method, which enables retrieving phase and absorption information by means of a linear combination of two mutually conjugate images. Thanks to the distributive law of multiplication and the commutative and associative laws of addition, the information retrieval can be performed after tomographic reconstruction, thus simplifying the retrieval procedure dramatically. The theoretical model of this method is established in both parallel-beam geometry for the Talbot interferometer and fan-beam geometry for the Talbot-Lau interferometer. Numerical experiments are also performed to confirm the feasibility and validity of the proposed method. In addition, we discuss its applicability in cone-beam geometry and its advantages compared with other methods. Moreover, this method can also be employed in other differential phase contrast imaging methods, such as diffraction enhanced imaging, non-interferometric imaging, and edge illumination.
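
    The interchangeability claimed above rests on linearity: because both the retrieval (a linear combination of two mutually conjugate images) and tomographic reconstruction are linear, the combination can be applied either before or after reconstruction. The toy check below uses an arbitrary linear operator as a stand-in for filtered backprojection and invented retrieval coefficients.

```python
# Numerical check that a linear retrieval commutes with a linear reconstruction operator.
import numpy as np

rng = np.random.default_rng(8)
n = 64
R = rng.normal(size=(n, n))                 # stand-in linear reconstruction operator
I1, I2 = rng.random(n), rng.random(n)       # mutually conjugate images (toy data)
a, b = 0.7, -0.3                            # retrieval coefficients (illustrative)

retrieve_then_reconstruct = R @ (a * I1 + b * I2)
reconstruct_then_retrieve = a * (R @ I1) + b * (R @ I2)
print(np.allclose(retrieve_then_reconstruct, reconstruct_then_retrieve))   # True
```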

  18. Looking for the Signal: A guide to iterative noise and artefact removal in X-ray tomographic reconstructions of porous geomaterials

    Science.gov (United States)

    Bruns, S.; Stipp, S. L. S.; Sørensen, H. O.

    2017-07-01

    X-ray micro- and nanotomography has evolved into a quantitative analysis tool rather than a mere qualitative visualization technique for the study of porous natural materials. Tomographic reconstructions are subject to noise that has to be handled by image filters prior to quantitative analysis. Typically, denoising filters are designed to handle random noise, such as Gaussian or Poisson noise. In tomographic reconstructions, noise has been projected from Radon space to Euclidean space, i.e. post-reconstruction noise cannot be expected to be random but rather correlated. Reconstruction artefacts, such as streak or ring artefacts, aggravate the filtering process, so algorithms performing well with random noise are not guaranteed to provide satisfactory results for X-ray tomography reconstructions. With sufficient image resolution, the crystalline origin of most geomaterials results in tomography images of objects that are untextured. We developed a denoising framework for these kinds of samples that combines a noise level estimate with iterative nonlocal means denoising. This allows splitting the denoising task into several weak denoising subtasks where the later filtering steps provide a controlled level of texture removal. We provide a hands-on explanation of this iterative denoising approach, and the validity and quality of the image enhancement filter were evaluated in a benchmarking experiment with noise footprints of varying levels of correlation and residual artefacts, extracted from real tomography reconstructions. We found that our denoising solution was superior to other denoising algorithms over a broad range of contrast-to-noise ratios on artificial piecewise-constant signals.
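
    A hedged sketch of the iterative weak-denoising idea described above: a noise level estimate drives several nonlocal-means passes, each removing only a fraction of the estimated noise. The number of passes, the fraction, the patch sizes and the test image are illustrative choices, not the published parameters or data.

```python
# Iterative weak nonlocal-means denoising driven by a per-pass noise estimate.
import numpy as np
from skimage.restoration import denoise_nl_means, estimate_sigma

rng = np.random.default_rng(9)
clean = np.zeros((128, 128))
clean[32:96, 32:96] = 1.0                        # piecewise-constant toy "geomaterial" slice
noisy = clean + 0.2 * rng.normal(size=clean.shape)

img = noisy
for _ in range(4):                               # several weak denoising subtasks
    sigma = estimate_sigma(img)                  # current noise level estimate
    img = denoise_nl_means(img, h=0.6 * sigma, sigma=sigma,
                           patch_size=5, patch_distance=6, fast_mode=True)
print(np.std(noisy - clean), np.std(img - clean))   # residual error before / after
```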

  19. Reconstruction of computed tomographic image from a few x-ray projections by means of accelerative gradient method

    International Nuclear Information System (INIS)

    Kobayashi, Fujio; Yamaguchi, Shoichiro

    1982-01-01

    A method for the reconstruction of computed tomographic images was proposed to reduce the X-ray exposure dose. The method reconstructs images from a small number of X-ray projections by means of an accelerative gradient method. The computational procedures are described. The algorithm is simple, the computation converges quickly, and the required memory capacity is small. A numerical simulation was carried out to confirm the validity of the method. A sample of simple shape was considered, projection data were given, and the images were reconstructed from 6 views. Good results were obtained, and the method is considered to be useful. (Kato, T.)
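
    As a hedged illustration of reconstructing from few views with an accelerated gradient scheme, the sketch below applies Nesterov-type momentum to the least-squares objective with a non-negativity constraint. The projection matrix for 6 views is a toy stand-in, and the original accelerative gradient method may differ in detail.

```python
# Accelerated (Nesterov/FISTA-style) gradient reconstruction from a few projection views.
import numpy as np

rng = np.random.default_rng(10)
n_views, n_rays, n_pix = 6, 32, 16 * 16
A = rng.random((n_views * n_rays, n_pix))     # toy projection operator for 6 views
x_true = rng.random(n_pix)
p = A @ x_true                                # simulated projection data

L = np.linalg.norm(A, 2) ** 2                 # Lipschitz constant of the gradient
x = np.zeros(n_pix); y = x.copy(); t = 1.0
for _ in range(300):                          # accelerated gradient iterations
    grad = A.T @ (A @ y - p)
    x_new = np.clip(y - grad / L, 0.0, None)  # gradient step plus non-negativity
    t_new = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t * t))
    y = x_new + (t - 1.0) / t_new * (x_new - x)
    x, t = x_new, t_new

print(np.linalg.norm(A @ x - p) / np.linalg.norm(p))   # relative projection residual
```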

  20. Fully 3D GPU PET reconstruction

    Energy Technology Data Exchange (ETDEWEB)

    Herraiz, J.L., E-mail: joaquin@nuclear.fis.ucm.es [Grupo de Fisica Nuclear, Departmento Fisica Atomica, Molecular y Nuclear, Universidad Complutense de Madrid (Spain); Espana, S. [Department of Radiation Oncology, Massachusetts General Hospital and Harvard Medical School, Boston, MA (United States); Cal-Gonzalez, J. [Grupo de Fisica Nuclear, Departmento Fisica Atomica, Molecular y Nuclear, Universidad Complutense de Madrid (Spain); Vaquero, J.J. [Departmento de Bioingenieria e Ingenieria Espacial, Universidad Carlos III, Madrid (Spain); Desco, M. [Departmento de Bioingenieria e Ingenieria Espacial, Universidad Carlos III, Madrid (Spain); Unidad de Medicina y Cirugia Experimental, Hospital General Universitario Gregorio Maranon, Madrid (Spain); Udias, J.M. [Grupo de Fisica Nuclear, Departmento Fisica Atomica, Molecular y Nuclear, Universidad Complutense de Madrid (Spain)

    2011-08-21

    Fully 3D iterative tomographic image reconstruction is computationally very demanding. Graphics Processing Units (GPUs) have been proposed for many years as potential accelerators of complex scientific problems, but it was not until the recent advances in GPU programmability that the best available reconstruction codes started to be implemented to run on GPUs. This work presents GPU-based fully 3D PET iterative reconstruction software. This new code may reconstruct sinogram data from several commercially available PET scanners. The most important and time-consuming parts of the code, the forward and backward projection operations, are based on an accurate model of the scanner obtained with the Monte Carlo code PeneloPET and have been massively parallelized on the GPU. For the PET scanners considered, the GPU-based code is more than 70 times faster than a similar code running on a single core of a fast CPU, obtaining in both cases the same images. The code has been designed to be easily adapted to reconstruct sinograms from any other PET scanner, including scanner prototypes.

  1. Fully 3D GPU PET reconstruction

    International Nuclear Information System (INIS)

    Herraiz, J.L.; Espana, S.; Cal-Gonzalez, J.; Vaquero, J.J.; Desco, M.; Udias, J.M.

    2011-01-01

    Fully 3D iterative tomographic image reconstruction is computationally very demanding. Graphics Processing Units (GPUs) have been proposed for many years as potential accelerators of complex scientific problems, but it was not until the recent advances in GPU programmability that the best available reconstruction codes started to be implemented to run on GPUs. This work presents GPU-based fully 3D PET iterative reconstruction software. This new code may reconstruct sinogram data from several commercially available PET scanners. The most important and time-consuming parts of the code, the forward and backward projection operations, are based on an accurate model of the scanner obtained with the Monte Carlo code PeneloPET and have been massively parallelized on the GPU. For the PET scanners considered, the GPU-based code is more than 70 times faster than a similar code running on a single core of a fast CPU, obtaining in both cases the same images. The code has been designed to be easily adapted to reconstruct sinograms from any other PET scanner, including scanner prototypes.

  2. Precision tomographic analysis of reactor fuels

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Yong Deok; Lee, Chang Hee; Kim, Jong Soo; Jeong, Jwong Hwan; Nam, Ki Yong

    2001-03-01

    For the tomographic assay, a survey of the current status, analysis of neutron beam characteristics, MCNP code simulation, sim-fuel fabrication, a neutron experiment on the sim-fuel, and the design of a multi-axis operation system were carried out. In the sensitivity simulation, the reconstruction results showed good agreement. The scoping test at ANL was also very helpful for the actual assay. The results are therefore applied to the HANARO tomographic system setup and to subsequent research.

  3. Precision tomographic analysis of reactor fuels

    International Nuclear Information System (INIS)

    Lee, Yong Deok; Lee, Chang Hee; Kim, Jong Soo; Jeong, Jwong Hwan; Nam, Ki Yong

    2001-03-01

    For the tomographic assay, a survey of the current status, analysis of neutron beam characteristics, MCNP code simulation, sim-fuel fabrication, a neutron experiment on the sim-fuel, and the design of a multi-axis operation system were carried out. In the sensitivity simulation, the reconstruction results showed good agreement. The scoping test at ANL was also very helpful for the actual assay. The results are therefore applied to the HANARO tomographic system setup and to subsequent research.

  4. MLE [Maximum Likelihood Estimator] reconstruction of a brain phantom using a Monte Carlo transition matrix and a statistical stopping rule

    International Nuclear Information System (INIS)

    Veklerov, E.; Llacer, J.; Hoffman, E.J.

    1987-10-01

    In order to study properties of the Maximum Likelihood Estimator (MLE) algorithm for image reconstruction in Positron Emission Tomography (PET), the algorithm is applied to data obtained by the ECAT-III tomograph from a brain phantom. The procedure for subtracting accidental coincidences from the data stream generated by this physical phantom is such that the resultant data are not Poisson distributed. This makes the present investigation different from other investigations based on computer-simulated phantoms. It is shown that the MLE algorithm is robust enough to yield comparatively good images, especially when the phantom is in the periphery of the field of view, even though the underlying assumption of the algorithm is violated. Two transition matrices are utilized. The first uses geometric considerations only. The second is derived by a Monte Carlo simulation which takes into account Compton scattering in the detectors, positron range, etc. It is demonstrated that the images obtained from the Monte Carlo matrix are superior in some specific ways. A stopping rule derived earlier, allowing the user to stop the iterative process before the images begin to deteriorate, is tested. Since the rule is based on the Poisson assumption, it does not work well with the presently available data, although it is successful with computer-simulated Poisson data.
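
    For reference, the multiplicative MLE (MLEM) update that such studies iterate can be written compactly. The sketch below assumes a generic transition matrix P, whether derived from geometry or from a Monte Carlo simulation, and uses illustrative variable names rather than the ECAT-III processing chain.

    import numpy as np

    def mlem(P, counts, n_iter=50, eps=1e-12):
        """P: (n_bins, n_voxels) transition matrix; counts: measured coincidence data."""
        x = np.ones(P.shape[1])                         # uniform initial image
        sens = P.sum(axis=0) + eps                      # per-voxel detection sensitivity
        for _ in range(n_iter):
            expected = P @ x + eps                      # forward projection
            x = x * (P.T @ (counts / expected)) / sens  # multiplicative MLEM update
        return x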

  5. Automated angular and translational tomographic alignment and application to phase-contrast imaging

    DEFF Research Database (Denmark)

    Cunha Ramos, Tiago Joao; Jørgensen, Jakob Sauer; Andreasen, Jens Wenzel

    2017-01-01

    X-ray computerized tomography (CT) is a 3D imaging technique that makes use of x-ray illumination and image reconstruction techniques to reproduce the internal cross-sections of a sample. Tomographic projection data usually require an initial relative alignment or knowledge of the exact object po... reconstruction artifacts and limit the attained resolution in the final tomographic reconstruction. Alignment algorithms that require manual interaction impede data analysis with ever-increasing data acquisition rates, supplied by more brilliant sources. We present in this paper an iterative reconstruction... improvement in the reconstruction resolution. A MATLAB implementation is made publicly available and will allow robust analysis of large volumes of phase-contrast tomography data.

  6. Positioning of Nuclear Fuel Assemblies by Means of Image Analysis on Tomographic Data

    International Nuclear Information System (INIS)

    Troeng, Mats

    2005-06-01

    A tomographic measurement technique for nuclear fuel assemblies has been developed at the Department of Radiation Sciences at Uppsala University. The technique requires highly accurate information about the position of the measured nuclear fuel assembly relative to the measurement equipment. In experimental campaigns performed earlier, separate positioning measurements have therefore been performed in connection with the tomographic measurements. In this work, another positioning approach has been investigated, which requires only the collection of tomographic data. Here, a simplified tomographic reconstruction is performed, whereby an image is obtained. By performing image analysis on this image, the lateral and angular position of the fuel assembly can be determined. The position information can then be used to perform a more accurate tomographic reconstruction involving detailed physical modeling. Two image analysis techniques have been developed in this work. The stability of the two techniques with respect to some central parameters has been studied. The agreement between these image analysis techniques and the previously used positioning technique was found to meet the desired requirements. Furthermore, it has been shown that the image analysis techniques offer more detailed information than the previous technique. In addition, their off-line analysis properties reduce the need for valuable measurement time. When utilizing the positions obtained from the image analysis techniques in tomographic reconstructions of the rod-by-rod power distribution, the repeatability of the reconstructed values was improved. Furthermore, the reconstructions resulted in better agreement with theoretical data.

  7. Novel edge treatment method for improving the transmission reconstruction quality in Tomographic Gamma Scanning.

    Science.gov (United States)

    Han, Miaomiao; Guo, Zhirong; Liu, Haifeng; Li, Qinghua

    2018-05-01

    Tomographic Gamma Scanning (TGS) is a method used for the nondestructive assay of radioactive wastes. In TGS, the actual irregular edge voxels are treated as regular cubic voxels in the traditional treatment method. In this study, in order to improve the performance of TGS, a novel edge treatment method is proposed that considers the actual shapes of these voxels. The two different edge voxel treatment methods were compared by computing the pixel-level relative errors and normalized mean square errors (NMSEs) between the reconstructed transmission images and the ideal images. Both methods were coupled with two different iterative algorithms, the Algebraic Reconstruction Technique (ART) with a non-negativity constraint and Maximum Likelihood Expectation Maximization (MLEM). The results demonstrated that the traditional method for edge voxel treatment can introduce significant error and that the real irregular edge voxel treatment method can improve the performance of TGS by obtaining better transmission reconstruction images. With the real irregular edge voxel treatment method, the MLEM and ART algorithms are comparable when assaying homogeneous matrices, but the MLEM algorithm is superior to the ART algorithm when assaying heterogeneous matrices. Copyright © 2018 Elsevier Ltd. All rights reserved.
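
    A minimal sketch of the ART variant named above, with the non-negativity constraint applied after each row update, is given below; the system matrix, relaxation factor and sweep count are illustrative assumptions rather than the parameters used in the study.

    import numpy as np

    def art_nonneg(A, b, n_sweeps=10, relax=0.5):
        """Cyclic ART with a non-negativity constraint for transmission reconstruction."""
        x = np.zeros(A.shape[1])
        row_norms = (A ** 2).sum(axis=1)
        for _ in range(n_sweeps):
            for i in range(A.shape[0]):
                if row_norms[i] == 0.0:
                    continue
                x += relax * (b[i] - A[i] @ x) / row_norms[i] * A[i]
                np.maximum(x, 0.0, out=x)               # enforce non-negativity after each row
        return x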

  8. A study of the decoding of multiple pinhole coded aperture RI tomographic images

    International Nuclear Information System (INIS)

    Hasegawa, Takeo; Kobayashi, Akitoshi; Nishiyama, Yutaka; Akagi, Kiyoshi; Uehata, Hiroshi

    1981-01-01

    In order to obtain a radioisotope (RI) tomographic image, there are various, methods, including the RCT method, Time Modulate method, and Multiple Pinhole Coded Aperture (MPCA) method and others. The MPCA method has several advantages. Using the MPCA method, there is no need to move either the detector or the patient, Furthermore, the generally used γ-camera may be used without any alterations. Due to certain problems in reconstructing the tomographic image, the use of the MPCA method in clinical practice is limited to representation of small organs (e.g. heart) using the 7-Pinhole collimator. This research presents an experimental approach to overcome the problems in reconstruction of tomographic images of large organs (organs other than the heart, such as the brain, liver, lung etc.) by introducing a reconstruction algorithm and correction software into the MPCA method. There are 2 main problems in MPCA image reconstruction: (1) Due to the rounding-off procedure, there is both point omission and shifting of point coordinates. (2) The central portion is characterized by high-counts. Both of these problems were solved by incorporating a reconstruction algorithm and a correction function. The resultant corrected tomographic image was processed using a filter derived from subjecting a PSF to a Fourier transform. Thus, it has become possible to obtain a high-quality tomographic image of large organs for clinical use. (author)

  9. Data and Analysis from a Time-Resolved Tomographic Optical Beam Diagnostic

    International Nuclear Information System (INIS)

    Frayer, Daniel K.; Johnson, Douglas; Ekdahl, Carl

    2010-01-01

    An optical tomographic diagnostic instrument developed for the acquisition of high-speed time-resolved images has been fielded at the Dual-Axis Radiographic Hydrodynamic Test (DARHT) Facility at Los Alamos National Laboratory. The instrument was developed for the creation of time histories of electron-beam cross section through the collection of Cerenkov light. Four optical lines of sight optically collapse an image and relay projections via an optical fiber relay to recording instruments; a tomographic reconstruction algorithm creates the time history. Because the instrument may be operated in an adverse environment, it may be operated, adjusted, and calibrated remotely. The instrument was operated over the course of various activities during and after DARHT commissioning, and tomographic reconstructions reported verifiable beam characteristics. Results from the collected data and reconstructions and analysis of the data are discussed.

  10. Evaluation of interpolation methods for surface-based motion compensated tomographic reconstruction for cardiac angiographic C-arm data

    International Nuclear Information System (INIS)

    Müller, Kerstin; Schwemmer, Chris; Hornegger, Joachim; Zheng Yefeng; Wang Yang; Lauritsch, Günter; Rohkohl, Christopher; Maier, Andreas K.; Schultz, Carl; Fahrig, Rebecca

    2013-01-01

    experiments showed that TPS interpolation provided the best results. The quantitative results in the phantom experiments showed comparable nRMSE of ≈0.047 ± 0.004 for the TPS and Shepard's method. Only slightly inferior results for the smoothed weighting function and the linear approach were achieved. The UQI resulted in a value of ≈ 99% for all four interpolation methods. On clinical human data sets, the best results were clearly obtained with the TPS interpolation. The mean contour deviation between the TPS reconstruction and the standard FDK reconstruction improved in the three human cases by 1.52, 1.34, and 1.55 mm. The Dice coefficient showed less sensitivity with respect to variations in the ventricle boundary. Conclusions: In this work, the influence of different motion interpolation methods on left ventricle motion compensated tomographic reconstructions was investigated. The best quantitative reconstruction results of a phantom, a porcine, and human clinical data sets were achieved with the TPS approach. In general, the framework of motion estimation using a surface model and motion interpolation to a dense MVF provides the ability for tomographic reconstruction using a motion compensation technique.
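
    Interpolating a sparse, surface-based motion estimate to a dense motion vector field with thin plate splines can be sketched as follows; SciPy's radial basis function interpolator is used here as a stand-in for the authors' TPS implementation, and all names are illustrative.

    import numpy as np
    from scipy.interpolate import RBFInterpolator

    def dense_motion_field(surface_points, surface_motion, voxel_centres):
        """surface_points: (n, 3) mesh vertices; surface_motion: (n, 3) displacements;
        voxel_centres: (m, 3) positions where a dense motion vector field is needed."""
        tps = RBFInterpolator(surface_points, surface_motion, kernel='thin_plate_spline')
        return tps(voxel_centres)                       # (m, 3) interpolated displacements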

  11. Terahertz wave tomographic imaging with a Fresnel lens

    Institute of Scientific and Technical Information of China (English)

    S. Wang; X.-C. Zhang

    2003-01-01

    We demonstrate three-dimensional tomographic imaging using a Fresnel lens with broadband terahertz pulses. Objects at various locations along the beam propagation path are uniquely imaged on the same imaging plane using a Fresnel lens with different frequencies of the imaging beam. This procedure allows the reconstruction of an object's tomographic contrast image by assembling the frequency-dependent images.

  12. 256-Slice coronary computed tomographic angiography in patients with atrial fibrillation: optimal reconstruction phase and image quality

    Energy Technology Data Exchange (ETDEWEB)

    Oda, Seitaro; Yuki, Hideaki; Kidoh, Masafumi; Utsunomiya, Daisuke; Nakaura, Takeshi; Namimoto, Tomohiro; Yamashita, Yasuyuki [Kumamoto University, Department of Diagnostic Radiology, Faculty of Life Sciences, Chuou-ku, Kumamoto (Japan); Honda, Keiichi; Yoshimura, Akira; Katahira, Kazuhiro [Kumamoto Chuo Hospital, Department of Diagnostic Radiology, Minami-ku, Kumamoto (Japan); Noda, Katsuo; Oshima, Shuichi [Kumamoto Chuo Hospital, Department of Cardiology, Minami-ku, Kumamoto (Japan)

    2016-01-15

    To assess the optimal reconstruction phase and the image quality of coronary computed tomographic angiography (CCTA) in patients with atrial fibrillation (AF). We performed CCTA in 60 patients with AF and 60 controls with sinus rhythm. The images were reconstructed in multiple phases in all parts of the cardiac cycle, and the optimal reconstruction phase with the fewest motion artefacts was identified. The coronary artery segments were visually evaluated to investigate their assessability. In 46 (76.7 %) patients, the optimal reconstruction phase was end-diastole, whereas in 6 (10.0 %) patients it was end-systole or mid-diastole, and in 2 (3.3 %) patients it was another cardiac phase. In 53 (88.3 %) of the controls, the optimal reconstruction phase was mid-diastole, whereas it was end-systole in 4 (6.7 %), and in 3 (5.0 %) it was another cardiac phase. There was a significant difference between patients with AF and the controls in the optimal phase (p < 0.01) but not in the visual image quality score (p = 0.06). The optimal reconstruction phase in most patients with AF was the end-diastolic phase. The end-systolic phase tended to be optimal in AF patients with higher average heart rates. (orig.)

  13. A morphological study of the mandibular molar region using reconstructed helical computed tomographic images

    International Nuclear Information System (INIS)

    Tsuno, Hiroaki; Noguchi, Makoto; Noguchi, Akira; Yoshida, Keiko; Tachinami, Yasuharu

    2010-01-01

    This study investigated the morphological variance in the mandibular molar region using reconstructed helical computed tomographic (CT) images. In addition, we discuss the necessity of CT scanning as part of the preoperative assessment process for dental implantation, by comparing the results with the findings of panoramic radiography. Sixty patients examined using CT as part of the preoperative assessment for dental implantation were analyzed. Reconstructed CT images were used to evaluate the bone quality and cross-sectional bone morphology of the mandibular molar region. The mandibular cortical index (MCI) and X-ray density ratio of this region were assessed using panoramic radiography in order to analyze the correlation between the findings of the CT images and panoramic radiography. CT images showed that there was a decrease in bone quality in cases with high MCI. Cross-sectional CT images revealed that the undercuts on the lingual side in the highly radiolucent areas in the basal portion were more frequent than those in the alveolar portion. This study showed that three-dimensional reconstructed CT images can help to detect variances in mandibular morphology that might be missed by panoramic radiography. In conclusion, it is suggested that CT should be included as an important examination tool before dental implantation. (author)

  14. Simulation studies on the tomographic reconstruction of the equatorial and low-latitude ionosphere in the context of the Indian tomography experiment: CRABEX

    Directory of Open Access Journals (Sweden)

    S. V. Thampi

    2004-11-01

    Equatorial ionosphere poses a challenge to any algorithm used for tomographic reconstruction because of phenomena like the Equatorial Ionization Anomaly (EIA) and Equatorial Spread F (ESF). Any tomographic reconstruction of ionospheric density distributions in the equatorial region is not acceptable if it does not image these phenomena, which exhibit large spatial and temporal variability, to a reasonable accuracy. The accuracy of the reconstructed image generally depends on many factors, such as the satellite-receiver configuration, the ray path modelling, grid intersections and, finally, the reconstruction algorithm. The present simulation study is performed to examine these in the context of the operational Coherent Radio Beacon Experiment (CRABEX) network just commenced in India. The feasibility of using this network for studies of the equatorial and low-latitude ionosphere over Indian longitudes has been investigated through simulations. The electron density distributions that are characteristic of EIA and ESF are fed into various simulations and the reconstructed tomograms are investigated in terms of their reproducing capabilities. It is seen that, with the present receiver chain extending from 8.5° N to 34° N, it would be possible to obtain accurate images of EIA and the plasma bubbles. The Singular Value Decomposition (SVD) algorithm has been used for the inversion procedure in this study. As is known, by the very nature of ionospheric tomography experiments, the received data contain various kinds of errors, like measurement and discretization errors. The sensitivity of the inversion algorithm, SVD in the present case, to these errors has also been investigated and quantified.
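
    The SVD-based inversion mentioned above amounts to a truncated pseudo-inverse; the sketch below is a minimal illustration in which the geometry matrix, data vector and truncation threshold are assumptions for the example, not the CRABEX processing chain.

    import numpy as np

    def svd_inversion(A, tec_data, rel_cutoff=1e-2):
        """A: (rays x grid cells) geometry matrix; tec_data: integrated (slant TEC) measurements."""
        U, s, Vt = np.linalg.svd(A, full_matrices=False)
        keep = s > rel_cutoff * s[0]                    # discard small singular values
        inv_s = np.where(keep, 1.0 / s, 0.0)
        return Vt.T @ (inv_s * (U.T @ tec_data))        # regularized pseudo-inverse solution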

  15. Evaluation of interpolation methods for surface-based motion compensated tomographic reconstruction for cardiac angiographic C-arm data

    Energy Technology Data Exchange (ETDEWEB)

    Mueller, Kerstin; Schwemmer, Chris; Hornegger, Joachim [Pattern Recognition Lab, Department of Computer Science, Erlangen Graduate School in Advanced Optical Technologies (SAOT), Friedrich-Alexander-Universitaet Erlangen-Nuernberg, Erlangen 91058 (Germany); Zheng Yefeng; Wang Yang [Imaging and Computer Vision, Siemens Corporate Research, Princeton, New Jersey 08540 (United States); Lauritsch, Guenter; Rohkohl, Christopher; Maier, Andreas K. [Siemens AG, Healthcare Sector, Forchheim 91301 (Germany); Schultz, Carl [Thoraxcenter, Erasmus MC, Rotterdam 3000 (Netherlands); Fahrig, Rebecca [Department of Radiology, Stanford University, Stanford, California 94305 (United States)

    2013-03-15

    all experiments showed that TPS interpolation provided the best results. The quantitative results in the phantom experiments showed comparable nRMSE of ≈0.047 ± 0.004 for the TPS and Shepard's method. Only slightly inferior results for the smoothed weighting function and the linear approach were achieved. The UQI resulted in a value of ≈99% for all four interpolation methods. On clinical human data sets, the best results were clearly obtained with the TPS interpolation. The mean contour deviation between the TPS reconstruction and the standard FDK reconstruction improved in the three human cases by 1.52, 1.34, and 1.55 mm. The Dice coefficient showed less sensitivity with respect to variations in the ventricle boundary. Conclusions: In this work, the influence of different motion interpolation methods on left ventricle motion compensated tomographic reconstructions was investigated. The best quantitative reconstruction results of a phantom, a porcine, and human clinical data sets were achieved with the TPS approach. In general, the framework of motion estimation using a surface model and motion interpolation to a dense MVF provides the ability for tomographic reconstruction using a motion compensation technique.

  16. Real-Space x-ray tomographic reconstruction of randomly oriented objects with sparse data frames.

    Science.gov (United States)

    Ayyer, Kartik; Philipp, Hugh T; Tate, Mark W; Elser, Veit; Gruner, Sol M

    2014-02-10

    Schemes for X-ray imaging single protein molecules using new x-ray sources, like x-ray free electron lasers (XFELs), require processing many frames of data that are obtained by taking temporally short snapshots of identical molecules, each with a random and unknown orientation. Due to the small size of the molecules and short exposure times, average signal levels of much less than 1 photon/pixel/frame are expected, much too low to be processed using standard methods. One approach to process the data is to use statistical methods developed in the EMC algorithm (Loh & Elser, Phys. Rev. E, 2009) which processes the data set as a whole. In this paper we apply this method to a real-space tomographic reconstruction using sparse frames of data (below 10⁻² photons/pixel/frame) obtained by performing x-ray transmission measurements of a low-contrast, randomly-oriented object. This extends the work by Philipp et al. (Optics Express, 2012) to three dimensions and is one step closer to the single molecule reconstruction problem.

  17. Image reconstruction using Monte Carlo simulation and artificial neural networks

    International Nuclear Information System (INIS)

    Emert, F.; Missimner, J.; Blass, W.; Rodriguez, A.

    1997-01-01

    PET data sets are subject to two types of distortions during acquisition: the imperfect response of the scanner and attenuation and scattering in the active distribution. In addition, the reconstruction of voxel images from the line projections composing a data set can introduce artifacts. Monte Carlo simulation provides a means for modeling the distortions and artificial neural networks a method for correcting for them as well as minimizing artifacts. (author) figs., tab., refs

  18. The Reconstruction Toolkit (RTK), an open-source cone-beam CT reconstruction toolkit based on the Insight Toolkit (ITK)

    International Nuclear Information System (INIS)

    Rit, S; Vila Oliva, M; Sarrut, D; Brousmiche, S; Labarbe, R; Sharp, G C

    2014-01-01

    We propose the Reconstruction Toolkit (RTK, http://www.openrtk.org), an open-source toolkit for fast cone-beam CT reconstruction, based on the Insight Toolkit (ITK) and using GPU code extracted from Plastimatch. RTK is developed by an open consortium (see affiliations) under the non-contaminating Apache 2.0 license. The quality of the platform is checked daily with regression tests in partnership with Kitware, the company supporting ITK. Several features are already available: Elekta, Varian and IBA inputs, multi-threaded Feldkamp-Davis-Kress reconstruction on CPU and GPU, Parker short scan weighting, multi-threaded CPU and GPU forward projectors, etc. Each feature is accessible either through command line tools or through C++ classes that can be included in independent software. A MIDAS community has been opened to share CatPhan datasets of several vendors (Elekta, Varian and IBA). RTK will be used in the upcoming cone-beam CT scanner developed by IBA for proton therapy rooms. Many features are under development: new input format support, iterative reconstruction, hybrid Monte Carlo / deterministic CBCT simulation, etc. RTK has been built to freely share tomographic reconstruction developments between researchers and is open for new contributions.

  19. Tomographic anthropomorphic models. Pt. 1

    International Nuclear Information System (INIS)

    Veit, R.; Zankl, M.; Petoussi, N.; Mannweiler, E.; Drexler, G.; Williams, G.

    1989-01-01

    The first generation of heterogeneous anthropomorphic mathematical models to be used in dose calculations was the MIRD-5 adult phantom, followed by the pediatric MIRD-type phantoms and by the GSF sex-specific phantoms ADAM and EVA. A new generation of realistic anthropomorphic models is now introduced. The organs and tissues of these models consist of a well defined number of volume elements (voxels), derived from computed tomographic (CT) data; consequently, these models were named voxel or tomographic models. So far two voxel models of real patients are available: one of an 8 week old baby and one of a 7 year old child. For simplicity, the model of the baby will be referred to as BABY and that of the child as CHILD. In chapter 1 a brief literature review is given of the existing mathematical models and their applications. The reasons that led to the construction of the new CT models are discussed. In chapter 2 the technique is described which allows any physical object to be converted into computer files to be used for dose calculations. The technique, which produces three dimensional reconstructions of high resolution, is discussed. In chapter 3 the main characteristics of the models of the baby and child are given. Tables of organ masses and volumes are presented together with three dimensional images of some organs and tissues. Special mention is given to the assessment of bone marrow distribution. Chapter 4 gives a short description of the Monte Carlo code used in conjunction with the models to calculate organ and tissue doses resulting from photon exposures. Some technical details concerning the computer files which describe the models are also given. (orig./HP)

  20. Tomographic apparatus and method for reconstructing planar slices from non-absorbed radiation

    International Nuclear Information System (INIS)

    1976-01-01

    In a tomographic apparatus and method for reconstructing two-dimensional planar slices from linear projections of non-absorbed radiation, useful in the fields of medical radiology, microscopy, and non-destructive testing, a beam of radiation in the shape of a fan is passed through an object lying in the same quasi-plane as the object slice, and the non-absorption thereof is recorded on oppositely situated detectors aligned with the source of radiation. There is relative rotation between the source-detector configuration and the object within the quasi-plane. Periodic values of the detected radiation are taken, convolved with certain functions, and back-projected to produce a two-dimensional output picture on a visual display illustrating a facsimile of the object slice. A series of two-dimensional pictures obtained simultaneously or serially can be combined to produce a three-dimensional portrayal of the entire object.
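
    The convolve-and-back-project principle described above is what is now called filtered back projection; a minimal parallel-beam illustration with scikit-image is given below (the apparatus uses a fan-beam geometry, which additionally requires rebinning or fan-beam weighting not shown here).

    import numpy as np
    from skimage.data import shepp_logan_phantom
    from skimage.transform import radon, iradon

    phantom = shepp_logan_phantom()
    angles = np.linspace(0.0, 180.0, 180, endpoint=False)
    sinogram = radon(phantom, theta=angles)             # simulated line projections
    slice_image = iradon(sinogram, theta=angles)        # ramp-filtered (convolved) back projection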

  1. Tomographic sensing and localization of fluorescently labeled circulating cells in mice in vivo

    International Nuclear Information System (INIS)

    Zettergren, Eric; Swamy, Tushar; Niedre, Mark; Runnels, Judith; Lin, Charles P

    2012-01-01

    Sensing and enumeration of specific types of circulating cells in small animals is an important problem in many areas of biomedical research. Microscopy-based fluorescence in vivo flow cytometry methods have been developed previously, but these are typically limited to sampling of very small blood volumes, so that very rare circulating cells may escape detection. Recently, we described the development of a ‘diffuse fluorescence flow cytometer’ (DFFC) that allows sampling of much larger blood vessels and therefore circulating blood volumes in the hindlimb, forelimb or tail of a mouse. In this work, we extend this concept by developing and validating a method to tomographically localize circulating fluorescently labeled cells in the cross section of a tissue-simulating optical flow phantom and mouse limb. This was achieved using two modulated light sources and an array of six fiber-coupled detectors that allowed rapid, high-sensitivity acquisition of full tomographic data sets at 10 Hz. These were reconstructed into two-dimensional cross-sectional images using Monte Carlo models of light propagation and the randomized algebraic reconstruction technique. We were able to obtain continuous images of moving cells in the sample cross section with 0.5 mm accuracy or better. We first demonstrated this concept in limb-mimicking optical flow phantoms with up to four flow channels, and then in the tails of mice with fluorescently labeled multiple myeloma cells. This approach increases the overall diagnostic utility of our DFFC instrument. (paper)

  2. Development of a 30-week-pregnant female tomographic model from computed tomography (CT) images for Monte Carlo organ dose calculations

    International Nuclear Information System (INIS)

    Shi Chengyu; Xu, X. George

    2004-01-01

    Assessment of radiation dose and risk to a pregnant woman and her fetus is an important task in radiation protection. Although tomographic models for male and female patients of different ages have been developed using medical images, such models for pregnant women had not been developed to date. This paper reports the construction of a partial-body model of a pregnant woman from a set of computed tomography (CT) images. The patient was 30 weeks into pregnancy, and the CT scan covered the portion of the body from above the liver to below the pubic symphysis in 70 slices. The thickness for each slice is 7 mm, and the image resolution is 512 x 512 pixels in a 48 cm x 48 cm field; thus, the voxel size is 6.15 mm³. The images were segmented to identify 34 major internal organs and tissues considered sensitive to radiation. Even though the masses are noticeably different from other models, the three-dimensional visualization verified the segmentation and its suitability for Monte Carlo calculations. The model has been implemented into a Monte Carlo code, EGS4-VLSI (very large segmented images), for the calculations of radiation dose to a pregnant woman. The specific absorbed fraction (SAF) results for internal photons were compared with those from a stylized model. Small and large differences were found, and the differences can be explained by mass differences and by the relative geometry differences between the source and the target organs. The research provides the radiation dosimetry community with the first voxelized tomographic model of a pregnant woman, opening the door to future dosimetry studies

  3. Medical Imaging Image Quality Assessment with Monte Carlo Methods

    International Nuclear Information System (INIS)

    Michail, C M; Fountos, G P; Kalyvas, N I; Valais, I G; Kandarakis, I S; Karpetas, G E; Martini, Niki; Koukou, Vaia

    2015-01-01

    The aim of the present study was to assess image quality of PET scanners through a thin layer chromatography (TLC) plane source. The source was simulated using a previously validated Monte Carlo model. The model was developed using the GATE MC package, and reconstructed images were obtained with the STIR software for tomographic image reconstruction, with cluster computing. The PET scanner simulated in this study was the GE DiscoveryST. A plane source consisting of a TLC plate was simulated by a layer of silica gel on aluminum (Al) foil substrates, immersed in an 18F-FDG bath solution (1 MBq). Image quality was assessed in terms of the Modulation Transfer Function (MTF). MTF curves were estimated from transverse reconstructed images of the plane source. Images were reconstructed by the maximum likelihood estimation (MLE)-OSMAPOSL algorithm. OSMAPOSL reconstruction was assessed by using various subsets (3 to 21) and iterations (1 to 20), as well as by using various beta (hyper)parameter values. MTF values were found to increase up to the 12th iteration and to remain almost constant thereafter. The MTF improves when lower beta values are used. The simulated PET evaluation method based on the TLC plane source can also be useful in research for the further development of PET and SPECT scanners through GATE simulations. (paper)
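
    The MTF estimate from a thin plane source reduces to a Fourier transform of the line spread function taken across the source in the reconstructed slice; the sketch below assumes a 1D profile has already been extracted, and its names and normalization are illustrative rather than the authors' processing.

    import numpy as np

    def mtf_from_profile(profile, pixel_size_mm):
        """profile: 1D intensity profile across the imaged plane source (the LSF)."""
        lsf = profile - profile.min()
        lsf = lsf / lsf.sum()                           # normalized line spread function
        mtf = np.abs(np.fft.rfft(lsf))
        freqs = np.fft.rfftfreq(lsf.size, d=pixel_size_mm)   # spatial frequency in cycles/mm
        return freqs, mtf / mtf[0]                      # MTF normalized so that MTF(0) = 1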

  4. Experimental device, corresponding forward model and processing of the experimental data using wavelet analysis for tomographic image reconstruction applied to eddy current nondestructive evaluation

    International Nuclear Information System (INIS)

    Joubert, P.Y.; Madaoui, N.

    1999-01-01

    In the context of eddy current nondestructive evaluation using a tomographic image reconstruction process, the success of the reconstruction depends not only on the choice of the forward model and of the inversion algorithms, but also on the ability to extract the pertinent data from the raw signal provided by the sensor. We present in this paper an experimental device designed for imaging purposes, the corresponding forward model, and a pre-processing of the experimental data using wavelet analysis. These three steps, implemented together with an inversion algorithm, will in the future allow image reconstruction of 3-D flaws. (authors)
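
    Wavelet pre-processing of the raw sensor signal might look like the following soft-thresholding sketch; the wavelet family, decomposition level and threshold are assumptions for the example and not the processing chain of the paper.

    import pywt

    def wavelet_preprocess(signal, wavelet='db4', level=4, threshold=0.1):
        """Soft-threshold the detail coefficients of a 1D eddy current signal."""
        coeffs = pywt.wavedec(signal, wavelet, level=level)
        coeffs = [coeffs[0]] + [pywt.threshold(c, threshold, mode='soft') for c in coeffs[1:]]
        return pywt.waverec(coeffs, wavelet)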

  5. Tomographic reconstruction of binary fields

    International Nuclear Information System (INIS)

    Roux, Stéphane; Leclerc, Hugo; Hild, François

    2012-01-01

    A novel algorithm is proposed for reconstructing binary images from their projection along a set of different orientations. Based on a nonlinear transformation of the projection data, classical back-projection procedures can be used iteratively to converge to the sought image. A multiscale implementation allows for a faster convergence. The algorithm is tested on images up to 1 Mb definition, and an error free reconstruction is achieved with a very limited number of projection data, saving a factor of about 100 on the number of projections required for classical reconstruction algorithms.
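
    The overall loop structure, a classical back-projection-type update alternated with a step that pushes the image toward binary values, can be sketched as below; the specific nonlinear transformation of the projection data used in the paper is not reproduced, and a simple thresholding step stands in for it.

    import numpy as np
    from skimage.transform import radon, iradon

    def binary_recon(sinogram, angles, size, n_iter=20, mix=0.5):
        """sinogram is assumed to have been generated with the same (circle=False) geometry."""
        x = np.zeros((size, size))
        for _ in range(n_iter):
            residual = sinogram - radon(x, theta=angles, circle=False)
            x = x + iradon(residual, theta=angles, circle=False, output_size=size)
            x = np.clip(x, 0.0, 1.0)
            x = (1.0 - mix) * x + mix * (x > 0.5)       # pull grey values toward {0, 1}
        return x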

  6. Jini service to reconstruct tomographic data

    Science.gov (United States)

    Knoll, Peter; Mirzaei, S.; Koriska, K.; Koehn, H.

    2002-06-01

    A number of imaging systems rely on the reconstruction of a 3-dimensional model from its projections through the process of computed tomography (CT). In medical imaging, for example, magnetic resonance imaging (MRI), positron emission tomography (PET), and single photon emission computed tomography (SPECT) acquire two-dimensional projections of a three-dimensional object. In order to calculate the 3-dimensional representation of the object, i.e. its voxel distribution, several reconstruction algorithms have been developed. Currently, mainly two reconstruction approaches are in use: filtered back projection (FBP) and iterative methods. Although the quality of iteratively reconstructed SPECT slices is better than that of FBP slices, such iterative algorithms are rarely used for clinical routine studies because of their low availability and increased reconstruction time. We used Jini and a self-developed iterative reconstruction algorithm to design and implement a Jini reconstruction service. With this service, the physician selects the patient study from a database and a Jini client automatically discovers the registered Jini reconstruction services in the department's Intranet. After downloading the proxy object of this Jini service, the SPECT acquisition data are reconstructed. The resulting transaxial slices are visualized using a Jini slice viewer, which can be used for various imaging modalities.

  7. Phillips-Tikhonov regularization with a priori information for neutron emission tomographic reconstruction on Joint European Torus

    Energy Technology Data Exchange (ETDEWEB)

    Bielecki, J.; Scholz, M.; Drozdowicz, K. [Institute of Nuclear Physics, Polish Academy of Sciences, PL-31342 Krakow (Poland); Giacomelli, L. [CCFE, Culham Science Centre, Abingdon OX14 3DB (United Kingdom); Istituto di Fisica del Plasma “P. Caldirola,” Milano (Italy); Kiptily, V.; Kempenaars, M. [CCFE, Culham Science Centre, Abingdon OX14 3DB (United Kingdom); Conroy, S. [CCFE, Culham Science Centre, Abingdon OX14 3DB (United Kingdom); Department of Physics and Astronomy, Uppsala University (Sweden); Craciunescu, T. [IAP, National Institute for Laser Plasma and Radiation Physics, Bucharest (Romania); Collaboration: EUROfusion Consortium, JET, Culham Science Centre, Abingdon OX14 3DB (United Kingdom)

    2015-09-15

    A method of tomographic reconstruction of the neutron emissivity in the poloidal cross section of the Joint European Torus (JET, Culham, UK) tokamak was developed. Due to the very limited data set (two projection angles, 19 lines of sight only) provided by the neutron emission profile monitor (KN3 neutron camera), the reconstruction is an ill-posed inverse problem. The aim of this work is to contribute to the development of reliable plasma tomography reconstruction methods that could be routinely used at the JET tokamak. The proposed method is based on Phillips-Tikhonov regularization and incorporates a priori knowledge of the shape of the normalized neutron emissivity profile. For the purpose of the optimal selection of the regularization parameters, the shape of the normalized neutron emissivity profile is approximated by the shape of the normalized electron density profile measured by the LIDAR or high resolution Thomson scattering JET diagnostics. In contrast with some previously developed methods for the ill-posed plasma tomography reconstruction problem, the developed algorithms do not include any post-processing of the obtained solution, and the physical constraints on the solution are imposed during the regularization process. The accuracy of the method is first evaluated by several tests with synthetic data based on various plasma neutron emissivity models (phantoms). Then, the method is applied to the neutron emissivity reconstruction for JET D plasma discharge #85100. It is demonstrated that this method shows good performance and reliability and it can be routinely used for plasma neutron emissivity reconstruction on JET.
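
    The Phillips-Tikhonov step itself reduces to solving a regularized normal equation; the sketch below is a generic illustration in which the geometry matrix, the smoothing operator L and the regularization weight are assumptions, and the profile-based selection of the weight described above is not reproduced.

    import numpy as np

    def tikhonov_recon(A, data, L, alpha):
        """Minimize ||A g - data||^2 + alpha * ||L g||^2 for the emissivity vector g."""
        return np.linalg.solve(A.T @ A + alpha * (L.T @ L), A.T @ data)

    def second_difference_operator(n):
        """A common choice for L: the 1D second-difference (smoothing) operator."""
        L = np.zeros((n - 2, n))
        for i in range(n - 2):
            L[i, i:i + 3] = [1.0, -2.0, 1.0]
        return L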

  8. GPU based Monte Carlo for PET image reconstruction: detector modeling

    International Nuclear Information System (INIS)

    Légrády; Cserkaszky, Á; Lantos, J.; Patay, G.; Bükki, T.

    2011-01-01

    Monte Carlo (MC) calculations and Graphics Processing Units (GPUs) are a natural match, almost as if GPUs were dedicated hardware designed for the task, given the similarities between visible light transport and neutral particle trajectories. A GPU-based MC gamma transport code has been developed for Positron Emission Tomography iterative image reconstruction, calculating the projection from unknowns to data at each iteration step while taking into account the full physics of the system. This paper describes the simplified scintillation detector modeling and its effect on convergence. (author)

  9. The inverse problems of reconstruction in the X-rays, gamma or positron tomographic imaging systems

    International Nuclear Information System (INIS)

    Grangeat, P.

    1999-01-01

    The revolution in imaging brought about by tomographic techniques in the 1970s allows the computation of maps of local values of attenuation or emission activity. Reconstruction techniques thus make the connection from integral measurements to the distribution of characteristic information by inversion of the measurement equations. They are a major application of solution techniques for inverse problems. In the first part the author recalls the physical principles of measurement in X-ray, gamma and positron imaging. He then presents the various problems with their associated inversion techniques. The third part is devoted to the fields of activity and examples, concluding in the last part with the outlook. (A.L.B.)

  10. GPU based Monte Carlo for PET image reconstruction: parameter optimization

    International Nuclear Information System (INIS)

    Cserkaszky, Á; Légrády, D.; Wirth, A.; Bükki, T.; Patay, G.

    2011-01-01

    This paper presents the optimization of a fully Monte Carlo (MC) based iterative image reconstruction of Positron Emission Tomography (PET) measurements. With our MC reconstruction method all the physical effects in a PET system are taken into account, thus superior image quality is achieved in exchange for increased computational effort. The method is feasible because we utilize the enormous processing power of Graphical Processing Units (GPUs) to solve the inherently parallel problem of photon transport. The MC approach regards the simulated positron decays as samples in the mathematical sums required in the iterative reconstruction algorithm, so, to complement the fast architecture, our optimization work focuses on the number of simulated positron decays required to obtain sufficient image quality. We have achieved significant results in determining the optimal number of samples for arbitrary measurement data; this allows the best image quality to be achieved with the least possible computational effort. Based on this research, recommendations can be given for effective partitioning of the computational effort into the iterations in limited-time reconstructions. (author)

  11. Ratios between effective doses for tomographic and mathematical models due to internal exposure to photons

    International Nuclear Information System (INIS)

    Lima, F.R.A.; Kramer, R.; Khoury, H.J.; Santos, A.M.; Loureiro, E.C.M.

    2005-01-01

    The development of new and sophisticated Monte Carlo codes and tomographic (voxel) human phantoms motivated the International Commission on Radiological Protection (ICRP) to revise the traditional models of exposure, which have been used to calculate effective dose coefficients for organs and tissues based on mathematical phantoms known as MIRD5. This paper shows the results of calculations using the tomographic phantoms MAX (Male Adult voXel) and FAX (Female Adult voXel), recently developed by the authors, as well as the sex-specific MIRD5-type phantoms ADAM and EVA, coupled to the EGS4 and MCNP4C Monte Carlo codes, for internal exposure to photons with energies between 10 keV and 4 MeV for several source organs. Effective doses for both types of model, tomographic and mathematical, are compared separately as functions of the Monte Carlo code used, the compositions of the human tissues, and the anatomy reproduced from the tomographic data. The results indicate that, for internal photon exposure, the use of voxel-based exposure models increases the effective dose values by up to 70% for some of the source organs considered in this study, when compared with the corresponding results obtained with MIRD-5-type phantoms

  12. 3D velocity measurements in a premixed flame by tomographic PIV

    International Nuclear Information System (INIS)

    Tokarev, M P; Sharaborin, D K; Lobasov, A S; Chikishev, L M; Dulin, V M; Markovich, D M

    2015-01-01

    Tomographic particle image velocimetry (PIV) has become a standard tool for 3D velocity measurements in non-reacting flows. However, the majority of the measurements in flows with combustion are limited to small resolved depth compared to the size of the field of view (typically 1 : 10). The limitations are associated with inhomogeneity of the volume illumination and the non-uniform flow seeding, the optical distortions and errors in the 3D calibration, and the unwanted flame luminosity. In the present work, the above constraints were overcome for the tomographic PIV experiment in a laminar axisymmetric premixed flame. The measurements were conducted for a 1 : 1 depth-to-size ratio using a system of eight CCD cameras and a 200 mJ pulsed laser. The results show that camera calibration based on the triangulation of the tracer particles in the non-reacting conditions provided reliable accuracy for the 3D image reconstruction in the flame. The modification of the tomographic reconstruction allowed a posteriori removal of unwanted bright objects, which were located outside of the region of interest but affected the reconstruction quality. This study reports on a novel experience for the instantaneous 3D velocimetry in laboratory-scale flames by using tomographic PIV. (paper)

  13. Three dimensional reconstruction of tomographic images of the retina

    International Nuclear Information System (INIS)

    Glittenberg, C.; Zeiler, F.; Falkner, C.; Binder, S.; Povazay, B.; Hermann, B.; Drexler, W.

    2007-01-01

    The development of a new display system for the three-dimensional visualization of tomographic images in ophthalmology; specifically, a system that can use stacks of B-mode scans from an ultrahigh resolution optical coherence tomography examination to vividly display retinal specimens as three-dimensional objects. Several subroutines were programmed in the rendering and raytracing program Cinema 4D XL 9.102 Studio Bundle (Maxon Computer Inc., Friedrichsburg, Germany), which could process stacks of tomographic scans into three-dimensional objects. Ultrahigh resolution optical coherence tomography examinations were performed on patients with various retinal pathologies and post-processed with the subroutines that had been designed. All ultrahigh resolution optical coherence tomographies were performed with a titanium:sapphire based ultra-broad-bandwidth (160 nm) femtosecond laser system (INTEGRAL, Femtolasers Productions GmbH, Vienna, Austria) with an axial resolution of 3 μm. A new three-dimensional display system for tomographic images in ophthalmology was developed, which allows a highly vivid display of physiological and pathological structures of the retina. The system also distinguishes itself through its high interactivity and adaptability. This new display system allows the visualization of physiological and pathological structures of the retina in a new way, which will give us new insight into their morphology and development. (author) [de

  14. Reconstruction of tomographic images from projections of a small number of views by means of mathematical programming

    International Nuclear Information System (INIS)

    Kobayashi, Fujio; Yamaguchi, Shoichiro

    1985-01-01

    Fundamental studies have been made on the application of mathematical programming to the reconstruction of tomographic images from projections of a small number of views, without requiring any circular symmetry or periodicity. Linear programming and quadratic programming were applied to minimize the quadratic sum of the residuals and finally obtain optimized reconstruction images. The mathematical algorithms were verified by computer simulation, and the relationship between the number of picture elements and the number of iterations necessary for convergence was also investigated. The methods of linear programming and quadratic programming require fairly simple mathematical procedures, and strict solutions can be obtained within a finite number of iterations. Their only drawback is the requirement of a large quantity of computer memory, but this problem will be resolved by the advent of large, fast memory devices in the near future. (Aoki, K.)
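
    The quadratic-programming formulation, minimizing the quadratic sum of residuals under simple bounds on the pixel values, can be written directly with a bounded least-squares solver; the sketch below uses SciPy for illustration and its names are assumptions, not the original implementation.

    import numpy as np
    from scipy.optimize import lsq_linear

    def qp_reconstruction(A, projections, upper=np.inf):
        """A: (rays x pixels) projection matrix; projections: measured data from a few views."""
        result = lsq_linear(A, projections, bounds=(0.0, upper))   # bounded least squares (a QP)
        return result.x                                 # reconstructed pixel values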

  15. Applicability of a set of tomographic reconstruction algorithms for quantitative SPECT on irradiated nuclear fuel assemblies

    Energy Technology Data Exchange (ETDEWEB)

    Jacobsson Svärd, Staffan, E-mail: staffan.jacobsson_svard@physics.uu.se; Holcombe, Scott; Grape, Sophie

    2015-05-21

    A fuel assembly operated in a nuclear power plant typically contains 100–300 fuel rods, depending on fuel type, which become strongly radioactive during irradiation in the reactor core. For operational and security reasons, it is of interest to experimentally deduce rod-wise information from the fuel, preferably by means of non-destructive measurements. The tomographic SPECT technique offers such possibilities through its two-step application; (1) recording the gamma-ray flux distribution around the fuel assembly, and (2) reconstructing the assembly's internal source distribution, based on the recorded radiation field. In this paper, algorithms for performing the latter step and extracting quantitative relative rod-by-rod data are accounted for. As compared to application of SPECT in nuclear medicine, nuclear fuel assemblies present a much more heterogeneous distribution of internal attenuation to gamma radiation than the human body, typically with rods containing pellets of heavy uranium dioxide surrounded by cladding of a zirconium alloy placed in water or air. This inhomogeneity severely complicates the tomographic quantification of the rod-wise relative source content, and the deduction of conclusive data requires detailed modelling of the attenuation to be introduced in the reconstructions. However, as shown in this paper, simplified models may still produce valuable information about the fuel. Here, a set of reconstruction algorithms for SPECT on nuclear fuel assemblies are described and discussed in terms of their quantitative performance for two applications; verification of fuel assemblies' completeness in nuclear safeguards, and rod-wise fuel characterization. It is argued that a request not to base the former assessment on any a priori information brings constraints on which reconstruction methods may be used in that case, whereas the use of a priori information on geometry and material content enables highly accurate quantitative

  16. Computerized tomographic in non-destructive testing

    International Nuclear Information System (INIS)

    Lopes, R.T.

    1988-01-01

    The process of computerized tomography has been developed for medical imaging purposes using X-ray tomographs, and little attention has been given to other possible applications of the technique because of its cost. As an alternative to this problem, we constructed a Tomographic System (STAC-1), using gamma rays, for nonmedical applications. In this work we summarize the basic theory of image reconstruction using computerized tomography, and we describe the considerations leading to the development of the experimental system. The image reconstruction method implemented in the system is filtered backprojection (convolution), with a digital filter system that carries out pre-filtering of the projections. The experimental system is described, with details of the control and data processing. An alternative and complementary system, using film as a detector, is shown in preliminary form. This thesis discusses and shows the theoretical and practical aspects considered in the construction of the STAC-1, as well as its limitations and applications [pt

  17. Generalized Row-Action Methods for Tomographic Imaging

    DEFF Research Database (Denmark)

    Andersen, Martin Skovgaard; Hansen, Per Christian

    2014-01-01

    Row-action methods play an important role in tomographic image reconstruction. Many such methods can be viewed as incremental gradient methods for minimizing a sum of a large number of convex functions, and despite their relatively poor global rate of convergence, these methods often exhibit fast initial convergence which is desirable in applications where a low-accuracy solution is acceptable. In this paper, we propose relaxed variants of a class of incremental proximal gradient methods, and these variants generalize many existing row-action methods for tomographic imaging. Moreover, they allow...
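
    A relaxed row-action (incremental) update of the kind generalized above can be sketched as follows for a sum of quadratic terms; the diminishing relaxation schedule and the non-negativity projection standing in for the proximal step are illustrative assumptions, not the authors' method.

    import numpy as np

    def relaxed_row_action(A, b, n_epochs=10, relax0=1.0, seed=0):
        rng = np.random.default_rng(seed)
        x = np.zeros(A.shape[1])
        for epoch in range(n_epochs):
            relax = relax0 / (epoch + 1)                # diminishing relaxation parameter
            for i in rng.permutation(A.shape[0]):
                a = A[i]
                denom = a @ a
                if denom == 0.0:
                    continue
                x += relax * (b[i] - a @ x) / denom * a # single-row (incremental) update
                np.maximum(x, 0.0, out=x)               # simple proximal step: non-negativity
        return x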

  18. A parallelizable compression scheme for Monte Carlo scatter system matrices in PET image reconstruction

    International Nuclear Information System (INIS)

    Rehfeld, Niklas; Alber, Markus

    2007-01-01

    Scatter correction techniques in iterative positron emission tomography (PET) reconstruction increasingly utilize Monte Carlo (MC) simulations which are very well suited to model scatter in the inhomogeneous patient. Due to memory constraints the results of these simulations are not stored in the system matrix, but added or subtracted as a constant term or recalculated in the projector at each iteration. This implies that scatter is not considered in the back-projector. The presented scheme provides a method to store the simulated Monte Carlo scatter in a compressed scatter system matrix. The compression is based on parametrization and B-spline approximation and allows the formation of the scatter matrix based on low statistics simulations. The compression as well as the retrieval of the matrix elements are parallelizable. It is shown that the proposed compression scheme provides sufficient compression so that the storage in memory of a scatter system matrix for a 3D scanner is feasible. Scatter matrices of two different 2D scanner geometries were compressed and used for reconstruction as a proof of concept. Compression ratios of 0.1% could be achieved and scatter induced artifacts in the images were successfully reduced by using the compressed matrices in the reconstruction algorithm

  19. Fast GPU-based Monte Carlo code for SPECT/CT reconstructions generates improved 177Lu images.

    Science.gov (United States)

    Rydén, T; Heydorn Lagerlöf, J; Hemmingsson, J; Marin, I; Svensson, J; Båth, M; Gjertsson, P; Bernhardt, P

    2018-01-04

    Full Monte Carlo (MC)-based SPECT reconstructions have a strong potential for correcting for image degrading factors, but the reconstruction times are long. The objective of this study was to develop a highly parallel Monte Carlo code for fast, ordered subset expectation maximization (OSEM) reconstructions of SPECT/CT images. The MC code was written in the Compute Unified Device Architecture language for a computer with four graphics processing units (GPUs) (GeForce GTX Titan X, Nvidia, USA). This enabled simulations of parallel photon emissions from the voxel matrix (128³ or 256³). Each computed tomography (CT) number was converted to attenuation coefficients for photo absorption, coherent scattering, and incoherent scattering. For photon scattering, the deflection angle was determined by the differential scattering cross sections. An angular response function was developed and used to model the accepted angles for photon interaction with the crystal, and a detector scattering kernel was used for modeling the photon scattering in the detector. Predefined energy and spatial resolution kernels for the crystal were used. The MC code was implemented in the OSEM reconstruction of clinical and phantom 177Lu SPECT/CT images. The Jaszczak image quality phantom was used to evaluate the performance of the MC reconstruction in comparison with attenuation-corrected (AC) OSEM reconstructions and attenuation-corrected OSEM reconstructions with resolution recovery corrections (RRC). The performance of the MC code was 3200 million photons/s. The required number of photons emitted per voxel to obtain a sufficiently low noise level in the simulated image was 200 for a 128³ voxel matrix. With this number of emitted photons/voxel, the MC-based OSEM reconstruction with ten subsets was performed within 20 s/iteration. The images converged after around six iterations. Therefore, the reconstruction time was around 3 min. The activity recovery for the spheres in the Jaszczak phantom was
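
    The OSEM update around which the GPU Monte Carlo projector is wrapped can be written generically as below; the sketch uses an explicit system matrix and illustrative names, whereas the code described above replaces the forward projection by on-the-fly Monte Carlo simulation.

    import numpy as np

    def osem(P, counts, n_subsets=10, n_iter=6, eps=1e-12):
        """P: (n_bins, n_voxels) system matrix; counts: measured projection data."""
        subsets = np.array_split(np.arange(P.shape[0]), n_subsets)
        x = np.ones(P.shape[1])
        for _ in range(n_iter):
            for idx in subsets:
                Ps = P[idx]
                expected = Ps @ x + eps                 # forward projection of the subset
                x = x * (Ps.T @ (counts[idx] / expected)) / (Ps.sum(axis=0) + eps)
        return x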

  20. Image reconstruction for a Positron Emission Tomograph optimized for breast cancer imaging

    International Nuclear Information System (INIS)

    Virador, Patrick R.G.

    2000-01-01

    The author performs image reconstruction for a novel Positron Emission Tomography camera that is optimized for breast cancer imaging. This work addresses, for the first time, the problem of fully 3D tomographic reconstruction using a septa-less, stationary (i.e. no rotation or linear motion), rectangular camera whose Field of View (FOV) encompasses the entire volume enclosed by detector modules capable of measuring Depth of Interaction (DOI) information. The camera is rectangular in shape in order to accommodate breasts of varying sizes while allowing for soft compression of the breast during the scan. This non-standard geometry of the camera exacerbates two problems: (a) radial elongation due to crystal penetration and (b) reconstructing images from irregularly sampled data. Packing considerations also give rise to regions in projection space that are not sampled, which leads to missing information. The author presents new Fourier-method-based image reconstruction algorithms that incorporate DOI information and accommodate the irregular sampling of the camera in a consistent manner by defining lines of response (LORs) between the measured interaction points, instead of rebinning the events into predefined crystal-face LORs, which is the only other method proposed thus far to handle DOI information. The new procedures maximize the use of the increased sampling provided by the DOI while minimizing interpolation in the data. The new algorithms use fixed-width, evenly spaced radial bins in order to take advantage of the speed of the Fast Fourier Transform (FFT), which necessitates the use of irregular angular sampling in order to minimize the number of unnormalizable Zero-Efficiency Bins (ZEBs). In order to address the persisting ZEBs and the issue of missing information originating from packing considerations, the algorithms (a) perform nearest-neighbor smoothing in 2D in the radial bins and (b) employ a semi-iterative procedure to estimate the unsampled data.

  1. Image reconstruction for a Positron Emission Tomograph optimized for breast cancer imaging

    Energy Technology Data Exchange (ETDEWEB)

    Virador, Patrick R.G. [Univ. of California, Berkeley, CA (United States)]

    2000-04-01

    The author performs image reconstruction for a novel Positron Emission Tomography camera that is optimized for breast cancer imaging. This work addresses, for the first time, the problem of fully 3D tomographic reconstruction using a septa-less, stationary (i.e. no rotation or linear motion), rectangular camera whose Field of View (FOV) encompasses the entire volume enclosed by detector modules capable of measuring Depth of Interaction (DOI) information. The camera is rectangular in shape in order to accommodate breasts of varying sizes while allowing for soft compression of the breast during the scan. This non-standard geometry of the camera exacerbates two problems: (a) radial elongation due to crystal penetration and (b) reconstructing images from irregularly sampled data. Packing considerations also give rise to regions in projection space that are not sampled, which leads to missing information. The author presents new Fourier-method-based image reconstruction algorithms that incorporate DOI information and accommodate the irregular sampling of the camera in a consistent manner by defining lines of response (LORs) between the measured interaction points, instead of rebinning the events into predefined crystal-face LORs, which is the only other method proposed thus far to handle DOI information. The new procedures maximize the use of the increased sampling provided by the DOI while minimizing interpolation in the data. The new algorithms use fixed-width, evenly spaced radial bins in order to take advantage of the speed of the Fast Fourier Transform (FFT), which necessitates the use of irregular angular sampling in order to minimize the number of unnormalizable Zero-Efficiency Bins (ZEBs). In order to address the persisting ZEBs and the issue of missing information originating from packing considerations, the algorithms (a) perform nearest-neighbor smoothing in 2D in the radial bins and (b) employ a semi-iterative procedure to estimate the unsampled data.

  2. Robust statistical reconstruction for charged particle tomography

    Science.gov (United States)

    Schultz, Larry Joe; Klimenko, Alexei Vasilievich; Fraser, Andrew Mcleod; Morris, Christopher; Orum, John Christopher; Borozdin, Konstantin N; Sossong, Michael James; Hengartner, Nicolas W

    2013-10-08

    Systems and methods for charged particle detection, including statistical reconstruction of object volume scattering density profiles from charged particle tomographic data. The probability distribution of charged particle scattering is determined using a statistical multiple-scattering model, and a substantially maximum likelihood estimate of the object volume scattering density is computed using a maximum likelihood/expectation maximization (ML/EM) algorithm to reconstruct the object volume scattering density. The presence of and/or type of object occupying the volume of interest can be identified from the reconstructed volume scattering density profile. The charged particle tomographic data can be cosmic ray muon tomographic data from a muon tracker for scanning packages, containers, vehicles or cargo. The method can be implemented using a computer program which is executable on a computer.

  3. TomoBank: a tomographic data repository for computational x-ray science

    Science.gov (United States)

    De Carlo, Francesco; Gürsoy, Doğa; Ching, Daniel J.; Joost Batenburg, K.; Ludwig, Wolfgang; Mancini, Lucia; Marone, Federica; Mokso, Rajmund; Pelt, Daniël M.; Sijbers, Jan; Rivers, Mark

    2018-03-01

    There is a widening gap between the fast advancement of computational methods for tomographic reconstruction and their successful implementation in production software at various synchrotron facilities. This is due in part to the lack of readily available instrument datasets and phantoms representative of real materials for validation and comparison of new numerical methods. Recent advancements in detector technology have made sub-second and multi-energy tomographic data collection possible (Gibbs et al 2015 Sci. Rep. 5 11824), but have also increased the demand to develop new reconstruction methods able to handle in situ (Pelt and Batenburg 2013 IEEE Trans. Image Process. 22 5238-51) and dynamic systems (Mohan et al 2015 IEEE Trans. Comput. Imaging 1 96-111) that can be quickly incorporated in beamline production software (Gürsoy et al 2014 J. Synchrotron Radiat. 21 1188-93). The x-ray tomography data bank, tomoBank, provides a repository of experimental and simulated datasets with the aim to foster collaboration among computational scientists, beamline scientists, and experimentalists and to accelerate the development and implementation of tomographic reconstruction methods for synchrotron facility production software by providing easy access to challenging datasets and their descriptors.

  4. Tomographic anthropomorphic models. Pt. 2. Organ doses from computed tomographic examinations in paediatric radiology

    International Nuclear Information System (INIS)

    Zankl, M.; Panzer, W.; Drexler, G.

    1993-11-01

    This report provides a catalogue of organ dose conversion factors resulting from computed tomographic (CT) examinations of children. Two radiation qualities and two exposure geometries were simulated as well as the use of asymmetrical beams. The use of further beam shaping devices was not considered. The organ dose conversion factors are applicable to babies at the age of ca. 2 months and to children between 5 and 7 years but can be used for other ages as well with the appropriate adjustments. For the calculations, the patients were represented by the GSF tomographic anthropomorphic models BABY and CHILD. The radiation transport in the body was simulated using a Monte Carlo method. The doses are presented as conversion factors of mean organ doses per air kerma free in air on the axis of rotation. Mean organ dose conversion factors are given per organ and per scanned body section of 1 cm height. The mean dose to an organ resulting from a particular CT examination can be estimated by summing up the contributions to the organ dose from all relevant sections. To facilitate the selection of the appropriate sections, a table is given which relates the tomographic models' coordinates to certain anatomical landmarks in the human body. (orig.)
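
    As a purely illustrative sketch of the summation procedure described above (the conversion factors, section coordinates and air kerma below are invented, not taken from the catalogue), a mean organ dose can be accumulated over the scanned 1 cm sections as follows:

```python
import numpy as np

# Hypothetical conversion factors (organ dose per unit air kerma free in air)
# for 1 cm body sections indexed along the phantom's longitudinal axis.
conversion_factor = {"thyroid": np.array([0.02, 0.15, 0.60, 0.55, 0.10])}
section_z = np.array([10, 11, 12, 13, 14])        # cm, phantom coordinates

scan_start, scan_stop = 11, 13                    # scanned range, from anatomical landmarks
air_kerma_free_in_air = 20.0                      # mGy on the axis of rotation, per section

in_scan = (section_z >= scan_start) & (section_z <= scan_stop)
thyroid_dose = air_kerma_free_in_air * conversion_factor["thyroid"][in_scan].sum()
print(f"mean thyroid dose: {thyroid_dose:.1f} mGy")
```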

  5. On the feasibility of tomographic-PIV with low pulse energy illumination in a lifted turbulent jet flame

    Science.gov (United States)

    Boxx, I.; Carter, C. D.; Meier, W.

    2014-08-01

    Tomographic particle image velocimetry (tomographic-PIV) is a recently developed measurement technique used to acquire volumetric velocity field data in liquid and gaseous flows. The technique relies on line-of-sight reconstruction of the rays between a 3D particle distribution and a multi-camera imaging system. In a turbulent flame, however, index-of-refraction variations resulting from local heat-release may inhibit reconstruction and thereby render the technique infeasible. The objective of this study was to test the efficacy of tomographic-PIV in a turbulent flame. An additional goal was to determine the feasibility of acquiring usable tomographic-PIV measurements in a turbulent flame at multi-kHz acquisition rates with current-generation laser and camera technology. To this end, a setup consisting of four complementary metal oxide semiconductor cameras and a dual-cavity Nd:YAG laser was implemented to test the technique in a lifted turbulent jet flame. While the cameras were capable of kHz-rate image acquisition, the laser operated at a pulse repetition rate of only 10 Hz. However, use of this laser allowed exploration of the required pulse energy and thus power for a kHz-rate system. The imaged region was 29 × 28 × 2.7 mm in size. The tomographic reconstruction of the 3D particle distributions was accomplished using the multiplicative algebraic reconstruction technique. The results indicate that volumetric velocimetry via tomographic-PIV is feasible with pulse energies of 25 mJ, which is within the capability of current-generation kHz-rate diode-pumped solid-state lasers.

  6. A fast multi-resolution approach to tomographic PIV

    Science.gov (United States)

    Discetti, Stefano; Astarita, Tommaso

    2012-03-01

    Tomographic particle image velocimetry (Tomo-PIV) is a recently developed three-component, three-dimensional, non-intrusive anemometric measurement technique, based on an optical tomographic reconstruction applied to simultaneously recorded images of the distribution of light intensity scattered by seeding particles immersed in the flow. Nowadays, the reconstruction process is carried out mainly by iterative algebraic reconstruction techniques, well suited to handle the problem of a limited number of views, but computationally intensive and memory demanding. The adoption of the multiplicative algebraic reconstruction technique (MART) has become more and more accepted. In the present work, a novel multi-resolution approach is proposed, relying on the adoption of a coarser grid in the first step of the reconstruction to obtain a fast estimation of a reliable and accurate first guess. A performance assessment, carried out on three-dimensional computer-generated distributions of particles, shows a substantial acceleration of the reconstruction process for all the tested seeding densities with respect to the standard method based on 5 MART iterations; a considerable reduction in memory storage is also achieved. Furthermore, a slight accuracy improvement is noticed. A modified version, improved by a multiplicative line-of-sight estimation of the first guess on the compressed configuration, is also tested, exhibiting a further remarkable decrease in both memory storage and computational effort, mostly at the lowest tested seeding densities, while retaining the same performance in terms of accuracy.
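
    For readers unfamiliar with MART, the multiplicative update that both the standard and the multi-resolution variants rely on can be sketched as follows; the toy weighting matrix, particle field and relaxation factor are placeholders rather than a tomo-PIV camera model.

```python
import numpy as np

rng = np.random.default_rng(2)

n_vox, n_rays = 400, 120
W = (rng.random((n_rays, n_vox)) < 0.05).astype(float)   # toy weight (projection) matrix
E_true = rng.random(n_vox) * (rng.random(n_vox) < 0.03)  # sparse particle intensity field
I = W @ E_true                                           # recorded pixel intensities

E = np.ones(n_vox)                                       # uniform first guess
mu = 1.0                                                 # MART relaxation factor

for _ in range(5):                                       # the standard 5 MART iterations
    for j in range(n_rays):
        proj = W[j] @ E
        touched = W[j] > 0
        if proj > 0 and I[j] > 0:
            # multiplicative correction applied only to voxels seen by ray j
            E[touched] *= (I[j] / proj) ** (mu * W[j][touched])
        elif proj > 0 and I[j] == 0:
            E[touched] = 0.0                             # zero pixels null the line of sight
```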

  7. High resolution x-ray CMT: Reconstruction methods

    Energy Technology Data Exchange (ETDEWEB)

    Brown, J.K.

    1997-02-01

    This paper qualitatively discusses the primary characteristics of methods for reconstructing tomographic images from a set of projections. These reconstruction methods can be categorized as either "analytic" or "iterative" techniques. Analytic algorithms are derived from the formal inversion of equations describing the imaging process, while iterative algorithms incorporate a model of the imaging process and provide a mechanism to iteratively improve image estimates. Analytic reconstruction algorithms are typically computationally more efficient than iterative methods; however, analytic algorithms are available for a relatively limited set of imaging geometries and situations. Thus, the framework of iterative reconstruction methods is better suited for high-accuracy tomographic reconstruction codes.
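
    To make the "analytic" category concrete, a bare-bones parallel-beam filtered backprojection (the canonical analytic algorithm) is sketched below; the toy forward projector, ideal ramp filter and nearest-pixel backprojection are illustrative simplifications, not the methods of the report.

```python
import numpy as np
from scipy.ndimage import rotate

def project(image, angles_deg):
    """Crude parallel-beam forward projection by image rotation (toy model)."""
    return np.array([rotate(image, a, reshape=False, order=1).sum(axis=0)
                     for a in angles_deg])

def fbp(sinogram, angles_deg):
    """Bare-bones filtered backprojection: ramp filter + nearest-pixel backprojection."""
    n_det = sinogram.shape[1]
    ramp = np.abs(np.fft.fftfreq(n_det))                       # ideal ramp filter
    filtered = np.real(np.fft.ifft(np.fft.fft(sinogram, axis=1) * ramp, axis=1))

    image = np.zeros((n_det, n_det))
    centre = (n_det - 1) / 2.0
    yy, xx = np.mgrid[:n_det, :n_det] - centre
    for row, theta in zip(filtered, np.deg2rad(angles_deg)):
        t = xx * np.cos(theta) + yy * np.sin(theta) + centre   # detector coordinate per pixel
        inside = (t >= 0) & (t <= n_det - 1)
        image[inside] += row[np.round(t[inside]).astype(int)]
    return image * np.pi / len(angles_deg)

# Toy demo on a rotationally symmetric phantom (sign conventions are glossed over)
n = 64
yy, xx = np.mgrid[:n, :n] - (n - 1) / 2.0
phantom = (np.hypot(xx, yy) < 12).astype(float)
angles = np.arange(0, 180, 2)
reconstruction = fbp(project(phantom, angles), angles)
```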

  8. Construction of tomographic head model using sectioned photographic images of cadaver

    International Nuclear Information System (INIS)

    Lee, Choon Sik; Lee, Jai Ki; Park, Jin Seo; Chung, Min Suk

    2004-01-01

    Tomographic models are currently the most complete, developed and realistic models of the human anatomy. They have been used to estimate organ doses for diagnostic radiation examinations, radiotherapy treatment planning, and radiation protection. The quality of the original anatomic images is a key factor in building a high-quality tomographic model. Computed tomography (CT) and magnetic resonance imaging (MRI) scans, from which most current tomographic models are constructed, have inherent shortcomings. In this study, a tomographic model of a Korean adult male head was constructed using serially sectioned photographs of a cadaver. The cadaver was embedded, frozen, serially sectioned and photographed with a high-resolution digital camera at 0.2 mm intervals. The contours of organs and tissues in the photographs were segmented by several trained anatomists. The 120 segmented images of the head at 2 mm intervals were converted into binary files and ported into a Monte Carlo code to perform an example calculation of organ dose. A whole-body tomographic model will be constructed using the procedure developed in this study.

  9. Seismic tomographic constraints on plate-tectonic reconstructions of Nazca subduction under South America since late Cretaceous (˜80 Ma)

    Science.gov (United States)

    Chen, Y. W.; Wu, J.; Suppe, J.

    2017-12-01

    Global seismic tomography has provided new and increasingly higher resolution constraints on subducted lithospheric remnants in terms of their position, depth, and volumes. In this study we aim to link tomographic slab anomalies in the mantle under South America to Andean geology using methods to unfold (i.e. structurally restore) slabs back to the earth's surface and input them to globally consistent plate reconstructions (Wu et al., 2016). The Andean margin of South America has long been interpreted as a classic example of a continuous subduction system since the early Jurassic or later. However, significant gaps in Andean plate tectonic reconstructions exist due to missing or incomplete geology from extensive Nazca-South America plate convergence (i.e. >5000 km since 80 Ma). We mapped and unfolded the Nazca slab from global seismic tomography to produce a quantitative plate reconstruction of the Andes back to the late Cretaceous 80 Ma. Our plate model predicts that the latest phase of Nazca subduction began in the late Cretaceous after a 100 to 80 Ma plate reorganization, which is supported by Andean geology indicating a margin-wide compressional event in the mid-late Cretaceous (Tunik et al., 2010). Our Andean plate tectonic reconstructions predict the Andean margin experienced periods of strike-slip/transtensional and even divergent plate tectonics between 80 and 55 Ma. This prediction is roughly consistent with the arc magmatism in northern Chile between 20° and 36°S that resumed at 80 Ma after a magmatic gap. Our model indicates the Andean margin only became fully convergent after 55 Ma. We provide additional constraints on pre-subduction Nazca plate paleogeography by extracting P-wave velocity perturbations within our mapped slab surfaces following Wu et al. (2016). We identified localized slow anomalies within our mapped Nazca slab that apparently show the size and position of the subducted Nazca ridge, Carnegie ridge and the hypothesized Inca plateau.

  10. New method to analyze internal disruptions with tomographic reconstructions

    Energy Technology Data Exchange (ETDEWEB)

    Tanzi, C.P. [EURATOM-FOM Association, FOM-Instituut voor Plasmafysica Rijnhuizen, P.O. BOX 1207, 3430 BE Nieuwegein (The Netherlands)]; de Blank, H.J. [Max-Planck-Institut fuer Plasmaphysik, EURATOM-IPP Association, 85740 Garching (Germany)]

    1997-03-01

    Sawtooth crashes have been investigated on the Rijnhuizen Tokamak Project (RTP) [N. J. Lopes Cardozo et al., Proceedings of the 14th International Conference on Plasma Physics and Controlled Nuclear Fusion Research, Würzburg, 1992 (International Atomic Energy Agency, Vienna, 1993), Vol. 1, p. 271]. Internal disruptions in tokamak plasmas often exhibit an m=1 poloidal mode structure prior to the collapse which can be clearly identified by means of multicamera soft x-ray diagnostics. In this paper tomographic reconstructions of such m=1 modes are analyzed with a new method, based on magnetohydrodynamic (MHD) invariants computed from the two-dimensional emissivity profiles, which quantifies the amount of profile flattening not only after the crash but also during the precursor oscillations. The results are interpreted by comparing them with two models which simulate the measurements of the m=1 redistribution of soft x-ray emissivity prior to the sawtooth crash. One model is based on the magnetic reconnection model of Kadomtsev. The other involves ideal MHD motion only. In cases where differences in magnetic topology between the two models cannot be seen in the tomograms, the analysis of profile flattening has an advantage. The analysis shows that in RTP the clearly observed m=1 displacement of some sawteeth requires the presence of convective ideal MHD motion, whereas other precursors are consistent with magnetic reconnection of up to 75% of the magnetic flux within the q=1 surface. The possibility of ideal interchange combined with enhanced cross-field transport is not excluded. © 1997 American Institute of Physics.

  11. New method to analyze internal disruptions with tomographic reconstructions

    International Nuclear Information System (INIS)

    Tanzi, C.P.; de Blank, H.J.

    1997-01-01

    Sawtooth crashes have been investigated on the Rijnhuizen Tokamak Project (RTP) [N. J. Lopes Cardozo et al., Proceedings of the 14th International Conference on Plasma Physics and Controlled Nuclear Fusion Research, Wuerzburg, 1992 (International Atomic Energy Agency, Vienna, 1993), Vol. 1, p. 271]. Internal disruptions in tokamak plasmas often exhibit an m=1 poloidal mode structure prior to the collapse which can be clearly identified by means of multicamera soft x-ray diagnostics. In this paper tomographic reconstructions of such m=1 modes are analyzed with a new method, based on magnetohydrodynamic (MHD) invariants computed from the two-dimensional emissivity profiles, which quantifies the amount of profile flattening not only after the crash but also during the precursor oscillations. The results are interpreted by comparing them with two models which simulate the measurements of the m=1 redistribution of soft x-ray emissivity prior to the sawtooth crash. One model is based on the magnetic reconnection model of Kadomtsev. The other involves ideal MHD motion only. In cases where differences in magnetic topology between the two models cannot be seen in the tomograms, the analysis of profile flattening has an advantage. The analysis shows that in RTP the clearly observed m=1 displacement of some sawteeth requires the presence of convective ideal MHD motion, whereas other precursors are consistent with magnetic reconnection of up to 75% of the magnetic flux within the q=1 surface. The possibility of ideal interchange combined with enhanced cross-field transport is not excluded. copyright 1997 American Institute of Physics

  12. Neutron tomography using projection data obtained by Monte Carlo simulation for nondestructive evaluation

    International Nuclear Information System (INIS)

    Silva, A.X. da; Crispim, V.R.

    2002-01-01

    This work presents the application of a computer package for generating projection data for neutron computerized tomography and, in the second part, discusses an application of neutron tomography, using projection data obtained by the Monte Carlo technique, to the detection and localization of light materials such as those containing hydrogen, concealed by heavy materials such as iron and lead. For the tomographic reconstructions of the simulated samples, only six equal projection angles distributed between 0° and 180° were used, with reconstruction performed by an algorithm (ARIEM) based on the principle of maximum entropy. With the neutron tomography it was possible to detect and locate polyethylene and water hidden by lead and iron (1 cm thick). Thus, it is demonstrated that thermal neutron tomography is a viable test method which can provide important interior information about test components, and is therefore extremely useful in routine industrial applications. (author)

  13. Visual hull method for tomographic PIV measurement of flow around moving objects

    Energy Technology Data Exchange (ETDEWEB)

    Adhikari, D.; Longmire, E.K. [University of Minnesota, Department of Aerospace Engineering and Mechanics, Minneapolis, MN (United States)]

    2012-10-15

    Tomographic particle image velocimetry (PIV) is a recently developed method to measure three components of velocity within a volumetric space. We present a visual hull technique that automates identification and masking of discrete objects within the measurement volume, and we apply existing tomographic PIV reconstruction software to measure the velocity surrounding the objects. The technique is demonstrated by considering flow around falling bodies of different shape with Reynolds number ∼1,000. Acquired image sets are processed using separate routines to reconstruct both the volumetric mask around the object and the surrounding tracer particles. After particle reconstruction, the reconstructed object mask is used to remove any ghost particles that otherwise appear within the object volume. Velocity vectors corresponding with fluid motion can then be determined up to the boundary of the visual hull without being contaminated or affected by the neighboring object velocity. Although the visual hull method is not meant for precise tracking of objects, the reconstructed object volumes nevertheless can be used to estimate the object location and orientation at each time step. (orig.)
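
    The role of the visual hull as an intersection of back-projected silhouettes, and its use for masking ghost particles inside the object, can be illustrated with a toy orthographic two-view example; everything below is schematic and not the authors' implementation.

```python
import numpy as np

n = 64
vol = np.zeros((n, n, n), bool)
vol[20:40, 25:45, 30:50] = True                      # toy falling object

# Binary silhouettes from two orthographic "cameras" looking along z and x
sil_z = vol.any(axis=2)                              # image seen along z (axes 0, 1)
sil_x = vol.any(axis=0)                              # image seen along x (axes 1, 2)

# Visual hull: a voxel is kept only if it projects inside every silhouette
hull = sil_z[:, :, None] & sil_x[None, :, :]

# Ghost-particle removal: zero reconstructed intensity inside the object hull
reconstruction = np.random.default_rng(3).random((n, n, n))
reconstruction[hull] = 0.0
```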

  14. Tomographic image reconstruction using Artificial Neural Networks

    International Nuclear Information System (INIS)

    Paschalis, P.; Giokaris, N.D.; Karabarbounis, A.; Loudos, G.K.; Maintas, D.; Papanicolas, C.N.; Spanoudaki, V.; Tsoumpas, Ch.; Stiliaris, E.

    2004-01-01

    A new image reconstruction technique based on the use of an Artificial Neural Network (ANN) is presented. The most crucial factors in designing such a reconstruction system are the network architecture and the number of input projections needed to reconstruct the image. Although the training phase requires a large number of input samples and considerable CPU time, the trained network is characterized by simplicity and quick response. The performance of this ANN is tested using several image patterns. It is intended to be used together with a phantom rotating table and the γ-camera of IASA for SPECT image reconstruction.

  15. Tomographic PIV: principles and practice

    International Nuclear Information System (INIS)

    Scarano, F

    2013-01-01

    A survey is given of the major developments in three-dimensional velocity field measurements using the tomographic particle image velocimetry (PIV) technique. The appearance of tomo-PIV dates back seven years from the present review (Elsinga et al 2005a 6th Int. Symp. PIV (Pasadena, CA)) and this approach has rapidly spread as a versatile, robust and accurate technique to investigate three-dimensional flows (Arroyo and Hinsch 2008 Topics in Applied Physics vol 112 ed A Schröder and C E Willert (Berlin: Springer) pp 127–54) and turbulence physics in particular. A considerable number of applications have been achieved over a wide range of flow problems, which requires the current status and capabilities of tomographic PIV to be reviewed. The fundamental aspects of the technique are discussed beginning from hardware considerations for volume illumination, imaging systems, their configurations and system calibration. The data processing aspects are of uppermost importance: image pre-processing, 3D object reconstruction and particle motion analysis are presented with their fundamental aspects along with the most advanced approaches. Reconstruction and cross-correlation algorithms, attaining higher measurement precision, spatial resolution or higher computational efficiency, are also discussed. The exploitation of 3D and time-resolved (4D) tomographic PIV data includes the evaluation of flow field pressure on the basis of the flow governing equation. The discussion also covers a-posteriori error analysis techniques. The most relevant applications of tomo-PIV in fluid mechanics are surveyed, covering experiments in air and water flows. In measurements in flow regimes from low-speed to supersonic, most emphasis is given to the complex 3D organization of turbulent coherent structures. (topical review)

  16. Voxel-based model construction from colored tomographic images

    International Nuclear Information System (INIS)

    Loureiro, Eduardo Cesar de Miranda

    2002-07-01

    This work presents a new approach to the construction of voxel-based phantoms that was implemented to simplify the segmentation of organs and tissues, reducing the time spent on this procedure. The segmentation is performed by painting tomographic images, attributing a different color to each organ or tissue. A voxel-based head and neck phantom was built using this new approach. The way the data are stored allows an increase in the performance of the radiation transport code. The program that calculates the radiation transport also works with image files. This capability allows image reconstruction showing isodose areas, from several points of view, increasing the information available to the user. Virtual X-ray photographs can also be obtained, allowing studies aimed at optimizing radiographic techniques while assessing, at the same time, the doses to organs and tissues. The accuracy of the program presented here, called MCvoxEL, which implements this new approach, was tested by comparison with results from two modern, well-supported Monte Carlo codes. Dose conversion factors for parallel X-ray exposure were also calculated. (author)

  17. Reconstruction of 2D PET data with Monte Carlo generated system matrix for generalized natural pixels

    International Nuclear Information System (INIS)

    Vandenberghe, Stefaan; Staelens, Steven; Byrne, Charles L; Soares, Edward J; Lemahieu, Ignace; Glick, Stephen J

    2006-01-01

    In discrete detector PET, natural pixels are image basis functions calculated from the response of detector pairs. By using reconstruction with natural pixel basis functions, the discretization of the object into a predefined grid can be avoided. Here, we propose to use generalized natural pixel reconstruction. Using this approach, the basis functions are not the detector sensitivity functions as in the natural pixel case but uniform parallel strips. The backprojection of the strip coefficients results in the reconstructed image. This paper proposes an easy and efficient way to generate the matrix M directly by Monte Carlo simulation. Elements of the generalized natural pixel system matrix are formed by calculating the intersection of a parallel strip with the detector sensitivity function. These generalized natural pixels are easier to use than conventional natural pixels because the final step from solution to a square pixel representation is done by simple backprojection. Due to rotational symmetry in the PET scanner, the matrix M is block circulant and only the first block row needs to be stored. Data were generated using a fast Monte Carlo simulator based on ray tracing. The proposed method was compared to a list-mode MLEM algorithm, which used ray tracing for the forward and backprojection. Comparison of the algorithms with different phantoms showed that an improved resolution can be obtained using generalized natural pixel reconstruction with accurate system modelling. In addition, it was noted that for the same resolution a lower noise level is present in this reconstruction. A numerical observer study showed the proposed method exhibited increased performance as compared to a standard list-mode EM algorithm. In another study, more realistic data were generated using the GATE Monte Carlo simulator. For these data, a more uniform contrast recovery and a better contrast-to-noise performance were observed. It was observed that major improvements in contrast

  18. Construction of Korean male tomographic model segmented from PET-CT data

    International Nuclear Information System (INIS)

    Lee, Choon Sik; Park, Sang Kyun; Lee, Jai Ki

    2004-01-01

    Tomographic human models currently provide the most realistic representation of human anatomy for radiation dosimetry calculations. Most of the models have been constructed using computed tomographic (CT) or magnetic resonance (MR) images obtained from a single individual. Each scan type has its inherent advantages and disadvantages: a CT scan delivers a considerable radiation dose to the subject, and an MR scan takes a long time to obtain clear images of an immobile subject. An emerging source of medical images for the construction of tomographic models is PET-CT, which is performed when looking for cancer. In this study, a tomographic model of a Korean adult male was developed by processing whole-body CT images of a PET-CT-scanned healthy volunteer. The 343 slices of the CT images were semi-automatically segmented layer by layer using a graphic software package and a screen digitizer. The third Korean tomographic model, named KRMAN-2, consisting of 300×150×344 voxels of size 2×2×5 mm³, was constructed. Examples of application to Monte Carlo radiation dosimetry calculations in idealized whole-body irradiations are given and discussed.

  19. Ratios between effective doses for tomographic and mathematical models due to external exposure to photons

    International Nuclear Information System (INIS)

    Kramer, R.; Khoury, H.J.; Yoriyaz, H.; Lima, F.R.A.; Loureiro, E.C.M.

    2005-01-01

    The development of Monte Carlo codes and of new, sophisticated tomographic (voxel-based) human models motivated the ICRP to propose a revision of the traditional exposure models, which have been used to calculate doses in organs and tissues with MIRD-5-type mathematical phantoms. This article presents calculations made with the recently developed tomographic phantoms MAX (Male Adult voXel) and FAX (Female Adult voXel) and also, for comparison, with the ADAM and EVA mathematical phantoms. All models were coupled to the EGS4 and MCNP4 codes for whole-body external irradiation by photons. AP, PA and rotational exposures were simulated for energies between 10 keV and 10 MeV. The calculated effective doses were compared separately to evaluate: the replacement of the Monte Carlo code; the composition of the tissues; and the replacement of mathematical phantoms by tomographic ones. The results indicate that, for external exposures to photons, introducing voxel-based models can lead to a reduction of about 10% in the effective dose for the energies considered in this study.

  20. A tomograph VMEbus parallel processing data acquisition system

    International Nuclear Information System (INIS)

    Wilkinson, N.A.; Rogers, J.G.; Atkins, M.S.

    1989-01-01

    This paper describes a VME based data acquisition system suitable for the development of Positron Volume Imaging tomographs which use 3-D data for improved image resolution over slice-oriented tomographs. The data acquisition must be flexible enough to accommodate several 3-D reconstruction algorithms; hence, a software-based system is most suitable. Furthermore, because of the increased dimensions and resolution of volume imaging tomographs, the raw data event rate is greater than that of slice-oriented machines. These dual requirements are met by our data acquisition system. Flexibility is achieved through an array of processors connected over a VMEbus, operating asynchronously and in parallel. High raw data throughput is achieved using a dedicated high speed data transfer device available for the VMEbus. The device can attain a raw data rate of 2.5 million coincidence events per second for raw events which are 64 bits wide.

  1. Tomographic reconstruction with B-splines surfaces

    International Nuclear Information System (INIS)

    Oliveira, Eric F.; Dantas, Carlos C.; Melo, Silvio B.; Mota, Icaro V.; Lira, Mailson

    2011-01-01

    Algebraic reconstruction techniques, when applied to a limited number of data, usually suffer from noise caused by the correction process or by inconsistencies in the data arising from the stochastic process of radioactive emission and from equipment oscillations. Post-processing of the reconstructed image with filters can be done to mitigate the presence of noise. In general these processes also attenuate the discontinuities present at edges that distinguish objects or artifacts, causing excessive blurring in the reconstructed image. This paper proposes built-in noise reduction that at the same time ensures an adequate level of smoothness in the reconstructed surface, representing the unknowns as linear combinations of elements of a piecewise polynomial basis, i.e. a B-spline basis. For that, the algebraic technique ART is modified to accommodate first-, second- and third-degree bases, ensuring C⁰, C¹ and C² smoothness levels, respectively. For comparison, three methodologies are applied: ART, ART post-processed with regular B-spline filters (ART*) and the proposed method with the built-in B-spline filter (BsART). Simulations with input data produced from common mathematical phantoms were conducted. For the phantoms used, the BsART method consistently presented the smallest errors among the three methods. This study has shown the superiority of the change made to embed the filter in ART when compared to the post-filtered ART. (author)
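
    The essential change in BsART, as described above, is that the ART update acts on B-spline coefficients rather than on pixel values, so smoothness is built into the representation itself; the 1D sketch below (toy projection matrix, phantom and relaxation factor) only illustrates that idea and is not the authors' code.

```python
import numpy as np
from scipy.interpolate import BSpline

rng = np.random.default_rng(4)

n_pix, degree, n_coef = 64, 3, 16                       # cubic B-splines give C2 smoothness
grid = np.linspace(0.0, 1.0, n_pix)
knots = np.concatenate((np.zeros(degree),
                        np.linspace(0.0, 1.0, n_coef - degree + 1),
                        np.ones(degree)))

# Basis matrix B: column j is the j-th B-spline evaluated on the pixel grid
B = np.column_stack([BSpline(knots, np.eye(n_coef)[j], degree)(grid)
                     for j in range(n_coef)])

A = (rng.random((40, n_pix)) < 0.1).astype(float)       # toy projection matrix
f_true = np.exp(-0.5 * ((grid - 0.4) / 0.1) ** 2)       # smooth 1D phantom
p = A @ f_true + rng.normal(0, 0.01, 40)                # noisy ray sums

M = A @ B                                               # rays act directly on spline coefficients
c = np.zeros(n_coef)
lam = 0.5                                               # ART relaxation factor
for _ in range(20):                                     # ART sweeps over the rays
    for i in range(M.shape[0]):
        row = M[i]
        c += lam * (p[i] - row @ c) / (row @ row + 1e-12) * row

f_rec = B @ c                                           # smooth reconstructed profile
```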

  2. Tensor-based dictionary learning for dynamic tomographic reconstruction

    International Nuclear Information System (INIS)

    Tan, Shengqi; Wu, Zhifang; Zhang, Yanbo; Mou, Xuanqin; Wang, Ge; Cao, Guohua; Yu, Hengyong

    2015-01-01

    In dynamic computed tomography (CT) reconstruction, the data acquisition speed limits the spatio-temporal resolution. Recently, compressed sensing theory has been instrumental in improving CT reconstruction from far fewer projections. In this paper, we present an adaptive method to train a tensor-based spatio-temporal dictionary for sparse representation of an image sequence during the reconstruction process. The correlations among atoms and across phases are considered to capture the characteristics of an object. The reconstruction problem is solved by the alternating direction method of multipliers. To recover fine or sharp structures such as edges, the nonlocal total variation is incorporated into the algorithmic framework. Preclinical examples including a sheep lung perfusion study and dynamic mouse cardiac imaging demonstrate that the proposed approach outperforms the vectorized dictionary-based CT reconstruction in the case of few-view reconstruction. (paper)

  3. Tensor-based Dictionary Learning for Dynamic Tomographic Reconstruction

    Science.gov (United States)

    Tan, Shengqi; Zhang, Yanbo; Wang, Ge; Mou, Xuanqin; Cao, Guohua; Wu, Zhifang; Yu, Hengyong

    2015-01-01

    In dynamic computed tomography (CT) reconstruction, the data acquisition speed limits the spatio-temporal resolution. Recently, compressed sensing theory has been instrumental in improving CT reconstruction from far fewer projections. In this paper, we present an adaptive method to train a tensor-based spatio-temporal dictionary for sparse representation of an image sequence during the reconstruction process. The correlations among atoms and across phases are considered to capture the characteristics of an object. The reconstruction problem is solved by the alternating direction method of multipliers. To recover fine or sharp structures such as edges, the nonlocal total variation is incorporated into the algorithmic framework. Preclinical examples including a sheep lung perfusion study and dynamic mouse cardiac imaging demonstrate that the proposed approach outperforms the vectorized dictionary-based CT reconstruction in the case of few-view reconstruction. PMID:25779991

  4. Reconstructed image quality analysis of an industrial instant non-scanning tomography system with different types of collimators by the Monte Carlo simulation

    Energy Technology Data Exchange (ETDEWEB)

    Velo, Alexandre F.; Carvalho, Diego V.; Alvarez, Alexandre G.; Hamada, Margarida M.; Mesquita, Carlos H., E-mail: afvelo@usp.br [Instituto de Pesquisas Energéticas e Nucleares (IPEN/CNEN-SP), São Paulo, SP (Brazil)]

    2017-07-01

    The greatest impact of tomography technology is currently in medicine. The great success of medical tomography is due to the fact that the human body presents reasonably standardized dimensions with a well-established chemical composition. Generally, these favorable conditions are not found in large industrial objects. In industry there is much interest in using tomographic information in order to know the interior of: (1) manufactured industrial objects or (2) machines and their means of production. In these cases, the purpose of the tomograph is to: (a) control the quality of the final product and (b) optimize production, contributing to the pilot phase of projects and analyzing the quality of the means of production. In different industrial processes, e.g. in chemical reactors and distillation columns, the phenomena related to multiphase processes are usually fast, requiring high temporal resolution of the computed tomography (CT) data acquisition. In this context, instant non-scanning tomographs and fifth-generation tomographs meet these requirements. An instant non-scanning tomography system is being developed at IPEN/CNEN. In this work, in order to optimize the system, this tomograph was simulated with different collimators, using the Monte Carlo method with the MCNP4C code. The image quality was evaluated with MATLAB® 2013b by analysis of the following parameters: contrast-to-noise ratio (CNR), root mean square error (RMSE), signal-to-noise ratio (SNR) and the spatial resolution given by the Modulation Transfer Function (MTF(f)), in order to determine which collimator best fits the instant non-scanning tomograph. Three situations were simulated: (1) no collimator; (2) a ø25 mm x 50 mm cylindrical collimator with a ø5.0 mm x 50 mm septum; (3) a ø25 mm x 50 mm cylindrical collimator with a 24 mm x 5.0 mm x 50 mm slit septum. (author)

  5. Reconstructed image quality analysis of an industrial instant non-scanning tomography system with different types of collimators by the Monte Carlo simulation

    International Nuclear Information System (INIS)

    Velo, Alexandre F.; Carvalho, Diego V.; Alvarez, Alexandre G.; Hamada, Margarida M.; Mesquita, Carlos H.

    2017-01-01

    The greatest impact of tomography technology is currently in medicine. The great success of medical tomography is due to the fact that the human body presents reasonably standardized dimensions with a well-established chemical composition. Generally, these favorable conditions are not found in large industrial objects. In industry there is much interest in using tomographic information in order to know the interior of: (1) manufactured industrial objects or (2) machines and their means of production. In these cases, the purpose of the tomograph is to: (a) control the quality of the final product and (b) optimize production, contributing to the pilot phase of projects and analyzing the quality of the means of production. In different industrial processes, e.g. in chemical reactors and distillation columns, the phenomena related to multiphase processes are usually fast, requiring high temporal resolution of the computed tomography (CT) data acquisition. In this context, instant non-scanning tomographs and fifth-generation tomographs meet these requirements. An instant non-scanning tomography system is being developed at IPEN/CNEN. In this work, in order to optimize the system, this tomograph was simulated with different collimators, using the Monte Carlo method with the MCNP4C code. The image quality was evaluated with MATLAB® 2013b by analysis of the following parameters: contrast-to-noise ratio (CNR), root mean square error (RMSE), signal-to-noise ratio (SNR) and the spatial resolution given by the Modulation Transfer Function (MTF(f)), in order to determine which collimator best fits the instant non-scanning tomograph. Three situations were simulated: (1) no collimator; (2) a ø25 mm x 50 mm cylindrical collimator with a ø5.0 mm x 50 mm septum; (3) a ø25 mm x 50 mm cylindrical collimator with a 24 mm x 5.0 mm x 50 mm slit septum. (author)
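
    For reference, the scalar figures of merit named above (CNR, SNR, RMSE) can be computed from a reconstructed slice with a few lines of array code; the synthetic image and regions of interest below are placeholders, and the MTF, which requires an edge- or point-spread analysis, is omitted.

```python
import numpy as np

rng = np.random.default_rng(5)

phantom = np.zeros((128, 128))
phantom[40:90, 40:90] = 1.0                             # toy object
recon = phantom + rng.normal(0.0, 0.05, phantom.shape)  # "reconstructed" slice

roi_obj = recon[50:80, 50:80]                           # region inside the object
roi_bkg = recon[5:35, 5:35]                             # background region

snr = roi_obj.mean() / roi_bkg.std()
cnr = (roi_obj.mean() - roi_bkg.mean()) / roi_bkg.std()
rmse = np.sqrt(np.mean((recon - phantom) ** 2))
print(f"SNR={snr:.1f}  CNR={cnr:.1f}  RMSE={rmse:.3f}")
```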

  6. Evaluation of tomographic-image based geometries with PENELOPE Monte Carlo

    International Nuclear Information System (INIS)

    Kakoi, A.A.Y.; Galina, A.C.; Nicolucci, P.

    2009-01-01

    The Monte Carlo method can be used to evaluate treatment planning systems or to determine dose distributions in radiotherapy planning due to its accuracy and precision. In the Monte Carlo simulation packages typically used in radiotherapy, however, a realistic representation of the patient geometry cannot be used, which compromises the accuracy of the results. In this work, an algorithm for the description of geometries based on CT images of patients, developed to be used with the Monte Carlo simulation package PENELOPE, is tested by simulating the dose distribution produced by a 10 MV photon beam. The simulated geometry was based on CT images from a prostate cancer treatment plan. The volumes of interest in the treatment were adequately represented in the simulation geometry, allowing the algorithm to be used in the verification of doses in radiotherapy treatments. (author)

  7. Methodological study of radionuclide tomographic phase analysis in localization of accessory conduction pathway in patients with wolff-parkinson-white syndrome

    International Nuclear Information System (INIS)

    Wo Jinshan; Zhu Junren; Li Zhishan

    1994-01-01

    In this study, the methodology of tomographic phase analysis for detecting the site of the accessory conduction pathway (ACP) in patients with Wolff-Parkinson-White syndrome is presented. We analyzed the major factors that affect image reconstruction, the selection of tomographic planes and the phase analysis, and discussed the key step of reconstructing the short-axis section parallel and closest to the level of the atrioventricular rings. In five patients undergoing this procedure prior to surgery, tomographic phase analysis correctly identified the site of the ACP, confirmed by epicardial mapping, in all five cases. Our results suggest this approach to be an objective, clear and correct one for localizing the ACP.

  8. Optimized molecular reconstruction procedure combining hybrid reverse Monte Carlo and molecular dynamics

    Energy Technology Data Exchange (ETDEWEB)

    Bousige, Colin; Boţan, Alexandru; Coasne, Benoît, E-mail: coasne@mit.edu [Department of Civil and Environmental Engineering, Massachusetts Institute of Technology, 77 Massachusetts Avenue, Cambridge, Massachusetts 02139 (United States); UMI 3466 CNRS-MIT, 77 Massachusetts Avenue, Cambridge, Massachusetts 02139 (United States); Ulm, Franz-Josef [Department of Civil and Environmental Engineering, Massachusetts Institute of Technology, 77 Massachusetts Avenue, Cambridge, Massachusetts 02139 (United States); Pellenq, Roland J.-M. [Department of Civil and Environmental Engineering, Massachusetts Institute of Technology, 77 Massachusetts Avenue, Cambridge, Massachusetts 02139 (United States); UMI 3466 CNRS-MIT, 77 Massachusetts Avenue, Cambridge, Massachusetts 02139 (United States); CINaM, CNRS/Aix Marseille Université, Campus de Luminy, 13288 Marseille Cedex 09 (France)

    2015-03-21

    We report an efficient atom-scale reconstruction method that consists of combining the Hybrid Reverse Monte Carlo algorithm (HRMC) with Molecular Dynamics (MD) in the framework of a simulated annealing technique. In the spirit of the experimentally constrained molecular relaxation technique [Biswas et al., Phys. Rev. B 69, 195207 (2004)], this modified procedure offers a refined strategy in the field of reconstruction techniques, with special interest for heterogeneous and disordered solids such as amorphous porous materials. While the HRMC method generates physical structures, thanks to the use of energy penalties, the combination with MD makes the method at least one order of magnitude faster than HRMC simulations to obtain structures of similar quality. Furthermore, in order to ensure the transferability of this technique, we provide rational arguments to select the various input parameters such as the relative weight ω of the energy penalty with respect to the structure optimization. By applying the method to disordered porous carbons, we show that adsorption properties provide data to test the global texture of the reconstructed sample but are only weakly sensitive to the presence of defects. In contrast, the vibrational properties such as the phonon density of states are found to be very sensitive to the local structure of the sample.
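
    A rough sketch of the kind of acceptance rule that combines the fit to structural data with an energy penalty weighted by ω, as mentioned above; this is a generic Metropolis-style HRMC step, not the authors' code, and all numbers are placeholders.

```python
import numpy as np

rng = np.random.default_rng(6)

def hrmc_accept(delta_chi2, delta_energy, omega, temperature):
    """Metropolis-style acceptance on chi^2 (fit to structure data) plus an energy penalty."""
    delta_cost = delta_chi2 + omega * delta_energy
    return delta_cost <= 0 or rng.random() < np.exp(-delta_cost / temperature)

# Example: a trial atom move that worsens the fit slightly but lowers the energy
print(hrmc_accept(delta_chi2=0.4, delta_energy=-1.0, omega=0.5, temperature=1.0))
```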

  9. Motion tracking-enhanced MART for tomographic PIV

    International Nuclear Information System (INIS)

    Novara, Matteo; Scarano, Fulvio; Batenburg, Kees Joost

    2010-01-01

    A novel technique to increase the accuracy of multiplicative algebraic reconstruction technique (MART) reconstruction from tomographic particle image velocimetry (PIV) recordings at higher seeding density than currently possible is presented. The motion tracking enhancement (MTE) method is based on the combined utilization of images from two or more exposures to enhance the reconstruction of individual intensity fields. The working principle is first introduced qualitatively, and the mathematical background is given that explains how the MART reconstruction can be improved on the basis of an improved first-guess object, obtained from the combination of non-simultaneous views reduced to the same time instant by deforming the 3D objects with an estimate of the particle motion field. The performance of MTE is quantitatively evaluated by numerical simulation of the imaging, reconstruction and image correlation processes. The cases of two or more exposures obtained from time-resolved experiments are considered. The iterative application of MTE appears to significantly improve the reconstruction quality, first by decreasing the intensity of the ghost images and second by increasing the intensity and the reconstruction precision for the actual particles. Based on computer simulations, the maximum imaged seeding density that can be dealt with is tripled with respect to the MART analysis applied to a single exposure. The analysis also illustrates that the maximum effect of the MTE method is comparable to that of doubling the number of cameras in the tomographic system. Experiments performed on a transitional jet at Re = 5000 apply the MTE method to double-frame recordings. The velocity measurement precision is increased for a system with fewer views (two or three cameras compared with four cameras). The ghost particles' intensity is also visibly reduced, although to a lesser extent with respect to the computer simulations. The velocity and vorticity field obtained from a three

  10. Correction of ring artifacts in X-ray tomographic images

    DEFF Research Database (Denmark)

    Lyckegaard, Allan; Johnson, G.; Tafforeau, P.

    2011-01-01

    Ring artifacts are systematic intensity distortions located on concentric circles in reconstructed tomographic X-ray images. When using X-ray tomography to study, for instance, low-contrast grain boundaries in metals, it is crucial to correct for the ring artifacts in the images as they may have the same intensity level as the grain boundaries and thus make it impossible to perform grain segmentation. This paper describes an implementation of a method for correcting the ring artifacts in tomographic X-ray images of simple objects such as metal samples where the object and the background are separable. The method is implemented in Matlab, it works with very little user interaction and may run in parallel on a cluster if applied to a whole stack of images. The strength and robustness of the method implemented will be demonstrated on three tomographic X-ray data sets: a mono-phase β
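
    One common way to implement such a correction (shown here only as a generic sketch, not necessarily the method of this paper) is to resample the reconstructed slice to polar coordinates, where rings become angle-independent stripes that can be estimated and subtracted; this simple estimate assumes the true structure is not itself strongly circularly symmetric.

```python
import numpy as np
from scipy.ndimage import map_coordinates, median_filter

def remove_rings(image, n_theta=720):
    """Suppress concentric ring artifacts by estimating an angle-independent bias per radius."""
    n = image.shape[0]                                  # assumes a square slice
    centre = (n - 1) / 2.0
    radii = np.arange(n // 2)
    thetas = np.linspace(0.0, 2.0 * np.pi, n_theta, endpoint=False)

    # Resample the image onto a (theta, radius) grid, where rings become vertical stripes
    rr, tt = np.meshgrid(radii, thetas)
    coords = np.array([centre + rr * np.sin(tt), centre + rr * np.cos(tt)])
    polar = map_coordinates(image, coords, order=1)

    # Rings are angle-independent: take the angular mean per radius and keep only its
    # fast radial oscillations (the smoothed version preserves genuine radial structure)
    radial_profile = polar.mean(axis=0)
    ring_bias = radial_profile - median_filter(radial_profile, size=21)

    # Subtract the estimated bias at each pixel's radius on the Cartesian grid
    yy, xx = np.mgrid[:n, :n] - centre
    r = np.hypot(xx, yy)
    return image - np.interp(r, radii, ring_bias)
```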

  11. 3D tomographic imaging with the γ-eye planar scintigraphic gamma camera

    Science.gov (United States)

    Tunnicliffe, H.; Georgiou, M.; Loudos, G. K.; Simcox, A.; Tsoumpas, C.

    2017-11-01

    γ-eye is a desktop planar scintigraphic gamma camera (100 mm × 50 mm field of view) designed by BET Solutions as an affordable tool for dynamic, whole body, small-animal imaging. This investigation tests the viability of using γ-eye for the collection of tomographic data for 3D SPECT reconstruction. Two software packages, QSPECT and STIR (software for tomographic image reconstruction), have been compared. Reconstructions have been performed using QSPECT’s implementation of the OSEM algorithm and STIR’s OSMAPOSL (Ordered Subset Maximum A Posteriori One Step Late) and OSSPS (Ordered Subsets Separable Paraboloidal Surrogate) algorithms. Reconstructed images of phantom and mouse data have been assessed in terms of spatial resolution, sensitivity to varying activity levels and uniformity. The effect of varying the number of iterations, the voxel size (1.25 mm default voxel size reduced to 0.625 mm and 0.3125 mm), the point spread function correction and the weight of prior terms were explored. While QSPECT demonstrated faster reconstructions, STIR outperformed it in terms of resolution (as low as 1 mm versus 3 mm), particularly when smaller voxel sizes were used, and in terms of uniformity, particularly when prior terms were used. Little difference in terms of sensitivity was seen throughout.

  12. Estimation of spatial uncertainties of tomographic velocity models

    Energy Technology Data Exchange (ETDEWEB)

    Jordan, M.; Du, Z.; Querendez, E. [SINTEF Petroleum Research, Trondheim (Norway)]

    2012-12-15

    This research project aims to evaluate the possibility of assessing the spatial uncertainties in tomographic velocity model building in a quantitative way. The project is intended to serve as a test of whether accurate and specific uncertainty estimates (e.g., in meters) can be obtained. The project is based on Monte Carlo-type perturbations of the velocity model as obtained from the tomographic inversion, guided by diagonal and off-diagonal elements of the resolution and the covariance matrices. The implementation and testing of this method was based on the SINTEF in-house stereotomography code, using small synthetic 2D data sets. To test the method, the calculation and output of the covariance and resolution matrices was implemented, and software to perform the error estimation was created. The work included the creation of 2D synthetic data sets, the implementation and testing of the software to conduct the tests (output of the covariance and resolution matrices, which are not implicitly provided by stereotomography), application to synthetic data sets, analysis of the test results, and creation of the final report. The results show that this method can be used to estimate the spatial errors in tomographic images quantitatively. The results agree with the known errors for our synthetic models. However, the method can only be applied to structures in the model where the change of seismic velocity is larger than the predicted error of the velocity parameter amplitudes. In addition, the analysis is dependent on the tomographic method, e.g., regularization and parameterization. The conducted tests were very successful and we believe that this method could be developed further to be applied to third-party tomographic images.
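
    Schematically, the Monte Carlo-type perturbation described above amounts to drawing model perturbations consistent with the inversion's covariance and propagating them to a quantity of interest; the covariance and velocity model below are invented placeholders, not output of the stereotomography code.

```python
import numpy as np

rng = np.random.default_rng(7)

n_params = 50                                         # toy velocity-model parameters
v_inverted = 2000.0 + 10.0 * np.arange(n_params)      # m/s, result of the tomographic inversion

# Toy posterior covariance with correlated neighbouring parameters
i, j = np.meshgrid(np.arange(n_params), np.arange(n_params))
cov = (25.0 ** 2) * np.exp(-np.abs(i - j) / 5.0)      # (m/s)^2

# Monte Carlo perturbations of the model drawn from that covariance
samples = rng.multivariate_normal(v_inverted, cov, size=500)

# Spread of any derived quantity, e.g. vertical travel time through 10 m cells
travel_times = (10.0 / samples).sum(axis=1)
print(f"travel time: {travel_times.mean()*1e3:.2f} +/- {travel_times.std()*1e3:.2f} ms")
```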

  13. Multicore Performance of Block Algebraic Iterative Reconstruction Methods

    DEFF Research Database (Denmark)

    Sørensen, Hans Henrik B.; Hansen, Per Christian

    2014-01-01

    Algebraic iterative methods are routinely used for solving the ill-posed sparse linear systems arising in tomographic image reconstruction. Here we consider the algebraic reconstruction technique (ART) and the simultaneous iterative reconstruction techniques (SIRT), both of which rely on semiconvergence. Block versions of these methods, based on a partitioning of the linear system, are able to combine the fast semiconvergence of ART with the better multicore properties of SIRT. These block methods separate into two classes: those that, in each iteration, access the blocks in a sequential manner ... a fixed relaxation parameter in each method, namely, the one that leads to the fastest semiconvergence. Computational results show that for multicore computers, the sequential approach is preferable.
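
    The two classes of block methods can be illustrated with a toy sketch: a block-sequential sweep feeds each SIRT-like block update the latest iterate, while a block-parallel sweep computes all block updates from the same iterate and averages them; the matrix, data and relaxation choice below are placeholders, not the paper's test problems.

```python
import numpy as np

rng = np.random.default_rng(8)
A = rng.random((300, 100))
b = A @ rng.random(100)
blocks = np.array_split(np.arange(300), 6)
lam = 1.0 / np.linalg.norm(A, 2) ** 2          # conservative fixed relaxation parameter

def sirt_step(A_blk, b_blk, x, lam):
    """One Landweber/SIRT-like update restricted to a block of rows."""
    return x + lam * A_blk.T @ (b_blk - A_blk @ x)

x_seq = np.zeros(100)
x_par = np.zeros(100)
for _ in range(20):
    # block-sequential: each block sees the latest iterate (ART-like semiconvergence)
    for idx in blocks:
        x_seq = sirt_step(A[idx], b[idx], x_seq, lam)
    # block-parallel: all blocks start from the same iterate; updates are averaged
    x_par = np.mean([sirt_step(A[idx], b[idx], x_par, lam) for idx in blocks], axis=0)
```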

  14. Real-time quasi-3D tomographic reconstruction

    Science.gov (United States)

    Buurlage, Jan-Willem; Kohr, Holger; Palenstijn, Willem Jan; Joost Batenburg, K.

    2018-06-01

    Developments in acquisition technology and a growing need for time-resolved experiments pose great computational challenges in tomography. In addition, access to reconstructions in real time is a highly demanded feature but has so far been out of reach. We show that by exploiting the mathematical properties of filtered backprojection-type methods, having access to real-time reconstructions of arbitrarily oriented slices becomes feasible. Furthermore, we present software for visualization and on-demand reconstruction of slices. A user can interactively shift and rotate slices in a GUI, while the software updates the slice in real time. For certain use cases, the possibility to study arbitrarily oriented slices in real time directly from the measured data provides sufficient visual and quantitative insight. Two such applications are discussed in this article.

  15. Processing of acquisition data for a time of flight positron tomograph

    International Nuclear Information System (INIS)

    Robert, G.

    1987-10-01

    After a review of the basic principles of time-of-flight positron tomography, the LETI positron tomograph is briefly described. For performance optimization (acquisition, calibration, image reconstruction), various specialized operators have been designed; the realization of the acquisition system is presented. [fr]

  16. A tomograph VMEbus parallel processing data acquisition system

    International Nuclear Information System (INIS)

    Atkins, M.S.; Wilkinson, N.A.; Rogers, J.G.

    1988-11-01

    This paper describes a VME-based data acquisition system suitable for the development of Positron Volume Imaging tomographs, which use 3-D data for improved image resolution over slice-oriented tomographs. The data acquisition must be flexible enough to accommodate several 3-D reconstruction algorithms; hence, a software-based system is most suitable. Furthermore, because of the increased dimensions and resolution of volume imaging tomographs, the raw data event rate is greater than that of slice-oriented machines. These dual requirements are met by our data acquisition system. Flexibility is achieved through an array of processors connected over a VMEbus, operating asynchronously and in parallel. High raw data throughput is achieved using a dedicated high-speed data transfer device available for the VMEbus. The device can attain a rate of 2.5 million coincidence events per second for raw events which are 64 bits wide. Real-time data acquisition and pre-processing requirements can be met by about forty 20 MHz Motorola 68020/68881 processors

  17. Time-resolved tomographic images of a relativistic electron beam

    International Nuclear Information System (INIS)

    Koehler, H.A.; Jacoby, B.A.; Nelson, M.

    1984-07-01

    We obtained a sequential series of time-resolved tomographic two-dimensional images of a 4.5-MeV, 6-kA, 30-ns electron beam. Three linear fiber-optic arrays of 30 or 60 fibers each were positioned around the beam axis at 0°, 61°, and 117°. The beam interacting with nitrogen at 20 Torr emitted light that was focused onto the fiber arrays and transmitted to a streak camera, where the data were recorded on film. The film was digitized, and two-dimensional images were reconstructed using the maximum-entropy tomographic technique. These images were then combined to produce an ultra-high-speed movie of the electron-beam pulse

  18. Compton scatter and randoms corrections for origin ensembles 3D PET reconstructions

    Energy Technology Data Exchange (ETDEWEB)

    Sitek, Arkadiusz [Harvard Medical School, Boston, MA (United States). Dept. of Radiology; Brigham and Women's Hospital, Boston, MA (United States); Kadrmas, Dan J. [Utah Univ., Salt Lake City, UT (United States). Utah Center for Advanced Imaging Research (UCAIR)

    2011-07-01

    In this work we develop a novel approach to correction for scatter and randoms in the reconstruction of data acquired by 3D positron emission tomography (PET), applicable to tomographic reconstruction done by the origin ensemble (OE) approach. Statistical image reconstruction using OE is based on calculating the expectations of the numbers of emitted events per voxel over the complete-data space. Since the OE estimator is fundamentally different from regular statistical estimators, such as those based on maximum likelihood, the standard methods of implementing scatter and randoms corrections cannot be used. Based on the prompt, scatter, and random rates, each detected event is graded in terms of its probability of being a true event. These grades are utilized by the Markov chain Monte Carlo (MCMC) algorithm used in the OE approach for calculating the expectation, over the complete-data space, of the number of emitted events per voxel (the OE estimator). We show that the results obtained with OE are almost identical to those obtained with the maximum likelihood-expectation maximization (ML-EM) reconstruction algorithm for experimental phantom data acquired using a Siemens Biograph mCT 3D PET/CT scanner. The developed correction removes artifacts due to scatter and randoms in the investigated 3D PET datasets. (orig.)
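
    A minimal sketch of the event-grading idea is given below: each list-mode event is weighted by the estimated probability of being a true coincidence, and these weights replace unit counts when expected emissions per voxel are accumulated. The actual OE estimator averages such counts over many MCMC samples of the event origins; the function true_event_weight and the toy rates are assumptions made for the illustration.

```python
import numpy as np

def true_event_weight(prompts, scatter, randoms):
    """Grade each detected coincidence with the probability that it is a true
    (unscattered, non-random) event, based on estimated rates for its LOR."""
    trues = np.clip(prompts - scatter - randoms, 0.0, None)
    return np.divide(trues, prompts, out=np.zeros_like(trues), where=prompts > 0)

# toy list-mode data: each event has an estimated prompt/scatter/random rate
# for its line of response and a current voxel assignment (its "origin")
rng = np.random.default_rng(0)
n_events, n_voxels = 10000, 50
prompts = rng.uniform(5, 20, n_events)
scatter = rng.uniform(0, 5, n_events)
randoms = rng.uniform(0, 3, n_events)
origin = rng.integers(0, n_voxels, n_events)      # assignments sampled by the OE MCMC

w = true_event_weight(prompts, scatter, randoms)
# expectation of emitted (true) events per voxel: weighted instead of unit counts
expected = np.bincount(origin, weights=w, minlength=n_voxels)
print(expected[:5])
```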

  19. The performance of a hybrid analytical-Monte Carlo system response matrix in pinhole SPECT reconstruction

    International Nuclear Information System (INIS)

    El Bitar, Z; Pino, F; Candela, C; Ros, D; Pavía, J; Rannou, F R; Ruibal, A; Aguiar, P

    2014-01-01

    It is well known that in pinhole SPECT (single-photon emission computed tomography), iterative reconstruction methods including accurate estimations of the system response matrix can lead to submillimeter spatial resolution. There are two different methods for obtaining the system response matrix: those that model the system analytically, using an approach that includes an experimental characterization of the detector response, and those that make use of Monte Carlo simulations. Methods based on analytical approaches are faster and handle statistical noise better than those based on Monte Carlo simulations, but they require tedious experimental measurements of the detector response. One suggested approach for avoiding an experimental characterization, while circumventing the problem of statistical noise introduced by Monte Carlo simulations, is to perform an analytical computation of the system response matrix combined with a Monte Carlo characterization of the detector response. Our findings showed that this approach can achieve high spatial resolution, similar to that obtained when the system response matrix computation includes an experimental characterization. Furthermore, we have shown that using simulated detector responses has the advantage of yielding a precise estimate of the shift between the point of entry of the photon beam into the detector and the point of interaction inside the detector. Taking this into account, it was possible to slightly improve the spatial resolution at the edge of the field of view. (paper)

  20. Mesooptical microscope as a tomographical device

    International Nuclear Information System (INIS)

    Soroko, L.M.

    1989-01-01

    It is shown that there are at least four features common to mesooptical microscopes, on the one hand, and reconstructive tomography, on the other. The following characteristics of the mesooptical microscope show its tomographic properties: the structure of the output data concerning the orientation and position in space of straight-line objects going at small angles to the perpendicular of the given tomographic plane; the behaviour of the two-dimensional Fourier transform of a straight-line object as it is rotated with respect to a specified axis in space; the scanning algorithm of the nuclear emulsion volume by the fence-like illuminated region in the mesooptical microscope for searching for particle tracks going parallel to the optical axis of the microscope; and, finally, the fact that the mesooptical images of straight-line particle tracks with a common vertex in the nuclear emulsion lie on a sinogram. 12 refs.; 16 figs

  1. A Monte Carlo-based model for simulation of digital chest tomosynthesis

    International Nuclear Information System (INIS)

    Ullman, G.; Dance, D. R.; Sandborg, M.; Carlsson, G. A.; Svalkvist, A.; Baath, M.

    2010-01-01

    The aim of this work was to calculate synthetic digital chest tomosynthesis projections using a computer simulation model based on the Monte Carlo method. An anthropomorphic chest phantom was scanned in a computed tomography scanner, segmented and included in the computer model to allow for simulation of realistic high-resolution X-ray images. The input parameters to the model were adapted to correspond to the VolumeRAD chest tomosynthesis system from GE Healthcare. Sixty tomosynthesis projections were calculated with projection angles ranging from +15° to -15°. The images from primary photons were calculated using an analytical model of the anti-scatter grid and a pre-calculated detector response function. The contributions from scattered photons were calculated using an in-house Monte Carlo-based model employing a number of variance reduction techniques, such as the collision density estimator. Tomographic section images were reconstructed by transferring the simulated projections into the VolumeRAD system. The reconstruction was performed for three types of images using: (i) noise-free primary projections, (ii) primary projections including contributions from scattered photons and (iii) projections as in (ii) with added correlated noise. The simulated section images were compared with corresponding section images from projections taken with the real, anthropomorphic phantom from which the digital voxel phantom was originally created. The present article describes a work in progress aiming towards developing a model intended for optimisation of chest tomosynthesis, allowing for simulation of both existing and future chest tomosynthesis systems. (authors)

  2. Tomographic capabilities of the new GEM based SXR diagnostic of WEST

    Science.gov (United States)

    Jardin, A.; Mazon, D.; O'Mullane, M.; Mlynar, J.; Loffelmann, V.; Imrisek, M.; Chernyshova, M.; Czarski, T.; Kasprowicz, G.; Wojenski, A.; Bourdelle, C.; Malard, P.

    2016-07-01

    The tokamak WEST (Tungsten Environment in Steady-State Tokamak) will start operating by the end of 2016 as a test bed for the ITER divertor components in long-pulse operation. In this context, radiative cooling of heavy impurities like tungsten (W) in the soft X-ray (SXR) range [0.1 keV; 20 keV] is a critical issue for plasma core performance. Reliable tools are thus required to monitor the local impurity density and avoid W accumulation. The WEST SXR diagnostic will be equipped with two new GEM (Gas Electron Multiplier) based poloidal cameras allowing 2D tomographic reconstructions to be performed in tunable energy bands. In this paper the tomographic capabilities of the Minimum Fisher Information (MFI) algorithm, developed for Tore Supra and upgraded for WEST, are investigated, in particular through a set of emissivity phantoms and the standard WEST scenario, including reconstruction errors, the influence of noise and the computational time.

  3. Simultaneous tomographic reconstruction and segmentation with class priors

    DEFF Research Database (Denmark)

    Romanov, Mikhail; Dahl, Anders Bjorholm; Dong, Yiqiu

    2015-01-01

    In some approaches, reconstruction and segmentation are combined to produce a reconstruction that is identical to the segmentation. We consider instead a hybrid approach that simultaneously produces both a reconstructed image and a segmentation. We incorporate priors about the desired classes of the segmentation through a Hidden Markov Measure Field Model.

  4. Tomographic and analog 3-D simulations using NORA. [Non-Overlapping Redundant Image Array formed by multiple pinholes

    Science.gov (United States)

    Yin, L. I.; Trombka, J. I.; Bielefeld, M. J.; Seltzer, S. M.

    1984-01-01

    The results of two computer simulations demonstrate the feasibility of using the nonoverlapping redundant array (NORA) to form three-dimensional images of objects with X-rays. Pinholes admit the X-rays to nonoverlapping points on a detector. The object is reconstructed in the analog mode by optical correlation and in the digital mode by tomographic computations. Trials were run with a stick-figure pyramid and extended objects with out-of-focus backgrounds. Substitution of spherical optical lenses for the pinholes increased the light transmission sufficiently that objects could be easily viewed in a dark room. Out-of-focus aberrations in tomographic reconstruction could be eliminated using Chang's (1976) algorithm.

  5. Tomographic examination table

    International Nuclear Information System (INIS)

    Redington, R.W.; Henkes, J.L.

    1979-01-01

    Equipment is described for positioning and supporting patients during tomographic mammography using X-rays. The equipment consists of a table and fabric slings which permit the examination of the downward, pendant breast of a prone patient by allowing the breast to pass through an aperture in the table into a fluid-filled container. The fluid has an X-ray absorption coefficient similar to that of soft human tissue, allowing high density-resolution radiography and permitting accurate detection of breast tumours. The shape of the equipment and the positioning of the patient allow the detector and X-ray source to rotate 360° about a vertical axis through the breast. This permits the use of relatively simple image reconstruction algorithms and a divergent X-ray geometry. (UK)

  6. A new algorithm for γ-ray tomographic imaging using a scintillation camera

    International Nuclear Information System (INIS)

    Terajima, Hirokatsu; Nakajima, Masato; Itoh, Takashi.

    1979-01-01

    Gamma-ray tomographic imaging, which gives the 3-dimensional distribution of radioisotopes (RI) in the human body, is being actively investigated because conventional images are 2-dimensional projections; it is not yet employed in practice, however, because of problems with the quality of the tomographic images obtained. One approach is to determine the radioisotope distribution on each tomographic plane by placing a planar detector parallel to the assumed tomographic planes and processing the 2-dimensional radioisotope projection images thus obtained; it does not require an iterative reconstruction algorithm. The authors have proposed an algorithm for this method and have carried out experiments to verify its validity. The radioisotope phantom is composed of overlapping acrylic cubic vessels of 30 mm sides containing radioisotopes arranged 2-dimensionally in each layer, and a multi-pinhole shutter array is used as the collimator. The projection image of the radioisotope distribution on the scintillator face is converted into digital imaging data sampled in a 2-dimensional space of 64 x 64 with a mini-computer. Among the probable causes affecting the reconstructed image quality, statistical fluctuation, absorption of gamma rays and the shape of the collimator aperture are discussed. The results indicate that this method is more effective than conventional methods and can be an effective technique for medical diagnosis and therapy, because it determines the 3-dimensional distribution of RI using existing equipment. (Wakatsuki, Y.)

  7. Design of a volume-imaging positron emission tomograph

    International Nuclear Information System (INIS)

    Harrop, R.; Rogers, J.G.; Coombes, G.H.; Wilkinson, N.A.; Pate, B.D.; Morrison, K.S.; Stazyk, M.; Dykstra, C.J.; Barney, J.S.; Atkins, M.S.; Doherty, P.W.; Saylor, D.P.

    1988-11-01

    Progress is reported in several areas of design of a positron volume imaging tomograph. As a means of increasing the volume imaged and the detector packing fraction, a lens system of detector light coupling is considered. A prototype layered scintillator detector demonstrates improved spatial resolution due to a unique Compton rejection capability. The conceptual design of a new mechanism for measuring scattered radiation during emission scans has been tested by Monte Carlo simulation. The problem of how to use effectively the resulting sampled scattered radiation projections is presented and discussed

  8. Coded aperture optimization using Monte Carlo simulations

    International Nuclear Information System (INIS)

    Martineau, A.; Rocchisani, J.M.; Moretti, J.L.

    2010-01-01

    Coded apertures using Uniformly Redundant Arrays (URA) have been unsuccessfully evaluated for two-dimensional and three-dimensional imaging in nuclear medicine. The images reconstructed from coded projections contain artifacts and suffer from poor spatial resolution in the longitudinal direction. We introduce a Maximum-Likelihood Expectation-Maximization (MLEM) algorithm for three-dimensional coded aperture imaging which uses a projection matrix calculated by Monte Carlo simulations. The aim of the algorithm is to reduce artifacts and improve the three-dimensional spatial resolution in the reconstructed images. Firstly, we present the validation of GATE (Geant4 Application for Emission Tomography) for Monte Carlo simulations of a coded mask installed on a clinical gamma camera. The coded mask modelling was validated by comparison between experimental and simulated data in terms of energy spectra, sensitivity and spatial resolution. In the second part of the study, we use the validated model to calculate the projection matrix with Monte Carlo simulations. A three-dimensional thyroid phantom study was performed to compare the performance of the three-dimensional MLEM reconstruction with the conventional correlation method. The results indicate that the artifacts are reduced and the three-dimensional spatial resolution is improved with the Monte Carlo-based MLEM reconstruction.
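
    For reference, the MLEM update into which such a Monte Carlo-derived projection matrix would be plugged has the standard multiplicative form sketched below. This is a generic textbook implementation with a random toy matrix, not the authors' GATE-based code; the function name and test values are assumptions.

```python
import numpy as np

def mlem(A, y, n_iter=50, eps=1e-12):
    """Maximum-Likelihood Expectation-Maximization with an explicit projection
    matrix A; in the setting of the paper, A would come from Monte Carlo
    simulation of the coded aperture rather than from an analytic model."""
    x = np.ones(A.shape[1])
    sens = A.sum(axis=0) + eps            # sensitivity image (column sums)
    for _ in range(n_iter):
        proj = A @ x + eps
        x *= (A.T @ (y / proj)) / sens
    return x

# toy example with a random non-negative projection matrix
rng = np.random.default_rng(0)
A = rng.random((200, 40))
x_true = np.zeros(40)
x_true[[5, 20, 33]] = [3.0, 1.0, 2.0]
y = rng.poisson(A @ x_true)               # Poisson counts, as in emission imaging
x_hat = mlem(A, y)
print(np.round(x_hat[[5, 20, 33]], 2))
```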

  9. William, a voxel model of child anatomy from tomographic images for Monte Carlo dosimetry calculations

    International Nuclear Information System (INIS)

    Caon, M.

    2010-01-01

    Medical imaging provides two-dimensional pictures of the human internal anatomy from which a three-dimensional model of organs and tissues suitable for calculating radiation dose may be constructed. Diagnostic CT provides the greatest exposure to radiation per examination, and the frequency of CT examination is high. Estimates of dose from diagnostic radiography are still determined from data derived from geometric models (rather than anatomical models), models scaled from adult bodies (rather than bodies of children) and CT scanner hardware that is no longer used. The aim of anatomical modelling is to produce a mathematical representation of internal anatomy that has organs of realistic size, shape and positioning. The organs and tissues are represented by a great many cuboidal volumes (voxels). The conversion of medical images to voxels is called segmentation; on completion, every pixel in an image is assigned to a tissue or organ. Segmentation is time consuming. An image processing package is used to identify organ boundaries in each image. Thirty to forty tomographic voxel models of anatomy have been reported in the literature. Each model is of an individual, or a composite from several individuals. Images of children are particularly scarce, so there remains a need for more paediatric anatomical models. I am working on segmenting 'William', a set of 368 PET-CT images from head to toe of a seven-year-old boy. William will be used for Monte Carlo calculations of dose from CT examination using a simulated modern CT scanner.

  10. Dosimetric reconstruction of a radiological accident by numerical simulation associating an anthropomorphic model and a Monte Carlo computation code

    International Nuclear Information System (INIS)

    Courageot, Estelle

    2010-01-01

    After a description of the context of radiological accidents (definition, history, context, exposure types, associated clinical symptoms of irradiation and contamination, medical treatment, lessons learned) and a presentation of dose assessment in the case of external exposure (clinical, biological and physical dosimetry), this research thesis describes the principles of numerical reconstruction of a radiological accident, presents some computation codes (the Monte Carlo method, the MCNPX code) and the SESAME tool, and reports an application to an actual case (an accident which occurred in Ecuador in April 2009). The next part reports the developments performed to modify the posture of voxelized phantoms, and their experimental and numerical validation. The last part reports a feasibility study for the reconstruction of radiological accidents occurring in external radiotherapy. This work is based on a Monte Carlo simulation of a linear accelerator, with the aim of identifying the most relevant parameters to be implemented in SESAME for the case of external radiotherapy

  11. An original emission tomograph for in vivo brain imaging of small animals

    International Nuclear Information System (INIS)

    Ochoa, A.V.; Ploux, L.; Mastrippolito, R.

    1996-01-01

    The principle of a new tomograph, TOHR, dedicated to the analysis of small volumes with very high resolution, is presented in this paper. We use radioisotopes emitting uncorrelated multiple photons (X or gamma rays) and a large-solid-angle focusing collimator to perform tomographic imaging without a reconstruction algorithm. With this original device, detection efficiency and resolution are independent, and submillimetric resolution can be achieved. A feasibility study shows that the predicted performance of TOHR can be achieved. We discuss its potential in rat brain tomography by simulating a realistic neuropharmacological experiment using a 1.4 mm resolution prototype of TOHR under development

  12. Design and applications of Computed Industrial Tomographic Imaging System (CITIS)

    Energy Technology Data Exchange (ETDEWEB)

    Ramakrishna, G S; Kumar, Umesh; Datta, S S [Bhabha Atomic Research Centre, Bombay (India). Isotope Div.

    1994-12-31

    This paper highlights the design and development of a prototype Computed Tomographic (CT) imaging system and its software for image reconstruction, simulation and display. It also describes results obtained with several test specimens, including a Dhruva reactor uranium fuel assembly, and the possibility of using neutrons as well as high-energy X-rays in computed tomography. 5 refs., 4 figs.

  13. Conceptual design of the tomographic system for simultaneous studying of soft and hard X-ray emission from dense magnetized plasma

    Energy Technology Data Exchange (ETDEWEB)

    Bielecki, J., E-mail: jakub.bielecki@ifj.edu.edu; Wójcik-Gargula, A.; Scholz, M.

    2016-11-15

    The article presents a new approach for investigating the spatial distributions of soft and hard X-rays emitted from dense magnetized plasma. The approach is based on the application of tomographic methods to the reconstruction of the X-ray emission in a plasma focus (PF) device. Quantitative investigation of the anisotropy of the reconstructed X-ray plasma emissivity may help to explain the nature of the fusion reaction mechanisms in a PF device. The aim of this work is to present a conceptual design of a novel dual-energy X-ray emission tomographic system dedicated to the PF-24 plasma focus device. The system, which enables the simultaneous registration of soft and hard X-rays, is composed of three X-ray pinhole cameras. Each camera is equipped with a pair of 16-element Si photodiode arrays arranged in two layers separated by an aluminum attenuator. The Geant4 code was used to optimize the layout and parameters of the applied detectors. In addition, a method of tomographic reconstruction from the sparse data set provided by the experimental setup is presented.

  14. Generalized Filtered Back-Projection for Digital Breast Tomosynthesis Reconstruction

    NARCIS (Netherlands)

    Erhard, K.; Grass, M.; Hitziger, S.; Iske, A.; Nielsen, T.

    2012-01-01

    Filtered back-projection (FBP) has been commonly used as an efficient and robust reconstruction technique in tomographic X-ray imaging during the last decades. For limited-angle tomography acquisitions such as digital breast tomosynthesis, however, standard FBP reconstruction algorithms provide poor image quality.

  15. The use of transport and diffusion equations in the three-dimensional reconstruction of computerized tomographic images

    Energy Technology Data Exchange (ETDEWEB)

    Pires, Sandrerley Ramos, E-mail: sandrerley@eee.ufg.br [Escola de Engenharia Eletrica e de Computacao - EEEC, Universidade Federal de Goias - UFG, Goiania, GO (Brazil); Flores, Edna Lucia; Pires, Dulcineia Goncalves F.; Carrijo, Gilberto Arantes; Veiga, Antonio Claudio Paschoarelli [Faculdade de Engenharia Eletrica - FEELT, Universidade Federal de Uberlandia - UFU, Uberlandia, MG (Brazil); Barcelos, Celia Aparecida Z. [Faculdade de Matematica, Universidade Federal de Uberlandia - UFU, Uberlandia, MG (Brazil)

    2012-09-15

    The visualization of a computerized tomographic (CT) exam in 3D increases the quality of the medical diagnosis and, consequently, the probability of success of the treatment. To obtain a high-quality image it is necessary to acquire slices which are close to one another. Motivated by the goal of reaching an improved balance between the number of slices and the visualization quality, this research work presents a digital inpainting technique of 3D interpolation for CT slices used in the visualization of human body structures. The inpainting is carried out via non-linear partial differential equations (PDEs). PDEs have been used in the image-processing context to fill in damaged regions of a digital 2D image. Inspired by this idea, this article proposes an interpolation method for filling in the empty regions between CT slices. To do so, considering the high similarity between two consecutive real slices, the first step of the proposed method is to create virtual slices. The virtual slices contain all the similarity between the intercalated slices and, where there are no similarities between the real slices, the virtual slices contain undefined portions. In the second step of the proposed method, the created virtual slices are used together with the real slice images in the reconstruction, in three dimensions, of the structure mapped in the exam. The proposed method is capable of reconstructing the curvatures of the patient's internal structures without using slices that are close to one another. The experiments carried out show the proposed method's efficiency. (author)
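
    A highly simplified sketch of the virtual-slice idea is given below: pixels where two consecutive slices agree are kept, and the undefined region is filled by a plain linear-diffusion (heat equation) inpainting iteration. The paper uses non-linear PDEs; the function virtual_slice, the agreement tolerance and the circular test slices are assumptions made for the illustration.

```python
import numpy as np

def virtual_slice(slice_a, slice_b, tol=0.05, n_iter=500, dt=0.2):
    """Create an interpolated slice between two consecutive CT slices: keep the
    pixels where the slices already agree and fill the remaining (undefined)
    region with a simple heat-equation inpainting iteration. This is a linear
    stand-in for the nonlinear PDEs used in the paper."""
    agree = np.abs(slice_a - slice_b) < tol
    u = 0.5 * (slice_a + slice_b)
    known = u.copy()
    for _ in range(n_iter):
        lap = (np.roll(u, 1, 0) + np.roll(u, -1, 0) +
               np.roll(u, 1, 1) + np.roll(u, -1, 1) - 4 * u)
        u = u + dt * lap
        u[agree] = known[agree]           # keep the defined pixels fixed
    return u

# toy pair of slices: a disc whose radius changes between consecutive slices
yy, xx = np.mgrid[-32:32, -32:32]
a = (xx**2 + yy**2 < 20**2).astype(float)
b = (xx**2 + yy**2 < 14**2).astype(float)
mid = virtual_slice(a, b)
print(mid[32, 32 + 17])   # intermediate value in the ring where the slices disagree
```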

  16. Validation of the GATE Monte Carlo simulation platform for modelling a CsI(Tl) scintillation camera dedicated to small-animal imaging

    International Nuclear Information System (INIS)

    Lazaro, D; Buvat, I; Loudos, G; Strul, D; Santin, G; Giokaris, N; Donnarieix, D; Maigne, L; Spanoudaki, V; Styliaris, S; Staelens, S; Breton, V

    2004-01-01

    Monte Carlo simulations are increasingly used in scintigraphic imaging to model imaging systems and to develop and assess tomographic reconstruction algorithms and correction methods for improved image quantitation. GATE (GEANT4 application for tomographic emission) is a new Monte Carlo simulation platform based on GEANT4 dedicated to nuclear imaging applications. This paper describes the GATE simulation of a prototype of scintillation camera dedicated to small-animal imaging and consisting of a CsI(Tl) crystal array coupled to a position-sensitive photomultiplier tube. The relevance of GATE to model the camera prototype was assessed by comparing simulated 99mTc point spread functions, energy spectra, sensitivities, scatter fractions and image of a capillary phantom with the corresponding experimental measurements. Results showed an excellent agreement between simulated and experimental data: experimental spatial resolutions were predicted with an error less than 100 μm. The difference between experimental and simulated system sensitivities for different source-to-collimator distances was within 2%. Simulated and experimental scatter fractions in a [98-82 keV] energy window differed by less than 2% for sources located in water. Simulated and experimental energy spectra agreed very well between 40 and 180 keV. These results demonstrate the ability and flexibility of GATE for simulating original detector designs. The main weakness of GATE concerns the long computation time it requires: this issue is currently under investigation by the GEANT4 and the GATE collaborations.

  17. Simulating multi-spacecraft Heliospheric Imager observations for tomographic reconstruction of interplanetary CMEs

    Science.gov (United States)

    Barnes, D.

    2017-12-01

    The multiple, spatially separated vantage points afforded by the STEREO and SOHO missions provide physicists with a means to infer the three-dimensional structure of the solar corona via tomographic imaging. The reconstruction process combines these multiple projections of the optically thin plasma to constrain its three-dimensional density structure and has been successfully applied to the low corona using the STEREO and SOHO coronagraphs. However, the technique is also possible at larger, inter-planetary distances using wide-angle imagers, such as the STEREO Heliospheric Imagers (HIs), to observe faint solar wind plasma and Coronal Mass Ejections (CMEs). Limited small-scale structure may be inferred from only three, or fewer, viewpoints and the work presented here is done so with the aim of establishing techniques for observing CMEs with upcoming and future HI-like technology. We use simulated solar wind densities to compute realistic white-light HI observations, with which we explore the requirements of such instruments for determining solar wind plasma density structure via tomography. We exploit this information to investigate the optimal orbital characteristics, such as spacecraft number, separation, inclination and eccentricity, necessary to perform the technique with HIs. Further to this we argue that tomography may be greatly enhanced by means of improved instrumentation; specifically, the use of wide-angle imagers capable of measuring polarised light. This work has obvious space weather applications, serving as a demonstration for potential future missions (such as at L1 and L5) and will prove timely in fully exploiting the science return from the upcoming Solar Orbiter and Parker Solar Probe missions.

  18. A general purpose tomographic program with combined inversions

    International Nuclear Information System (INIS)

    Xu Wenbin; Dong Jiafu; Li Fanzhu

    1996-01-01

    A general tomographic program has been developed by combining the Bessel expansion with the Zernike expansion. It is useful for studying the magnetic island structure of the tearing mode and for reconstructing the density profiles of impurities in tokamak plasmas. This combined method has the advantages of both expansions, i.e. there are no spurious images at the edge and the inversion precision is high in the centre of the plasma

  19. Low-dose computed tomographic imaging in orbital trauma

    Energy Technology Data Exchange (ETDEWEB)

    Jackson, A.; Whitehouse, R.W. (Manchester Univ. (United Kingdom). Dept. of Diagnostic Radiology)

    1993-08-01

    The authors review findings in 75 computed tomographic (CT) examinations of 66 patients with orbital trauma who were imaged using a low-radiation-dose CT technique. Imaging was performed using a dynamic scan mode and exposure factors of 120 kVp and 80 mAs resulting in a skin dose of 11 mGy with an effective dose-equivalent of 0.22 mSv. Image quality was diagnostic in all cases and excellent in 73 examinations. Soft-tissue abnormalities within the orbit including muscle adhesions were well demonstrated both on primary axial and reconstructed multiplanar images. The benefits of multiplanar reconstructions are stressed and the contribution of soft-tissue injuries to symptomatic diplopia examined. (author).

  20. Three-dimensional computed tomographic angiography to predict weight and volume of deep inferior epigastric artery perforator flap for breast reconstruction.

    Science.gov (United States)

    Rosson, Gedge D; Shridharani, Sachin M; Magarakis, Michael; Manahan, Michele A; Stapleton, Sahael M; Gilson, Marta M; Flores, Jaime I; Basdag, Basak; Fishman, Elliot K

    2011-10-01

    Three-dimensional computed tomographic angiography (3D CTA) can be used preoperatively to evaluate the course and caliber of perforating blood vessels for abdominal free-flap breast reconstruction. For postmastectomy breast reconstruction, many women inquire whether the abdominal tissue volume will match that of the breast to be removed. Therefore, our goal was to estimate the preoperative volume and weight of the proposed flap and compare them with the actual volume and weight to determine if diagnostic imaging can accurately identify the amount of tissue that could potentially be harvested. Preoperative 3D CTA was performed in 15 patients who underwent breast reconstruction using the deep inferior epigastric artery perforator flap. Before each angiogram, stereotactic fiducials were placed on the planned flap outline. The radiologist reviewed each preoperative angiogram to estimate the volume, and thus the weight, of the flap. These estimated weights were compared with the actual intraoperative weights. The average estimated weight was 99.7% of the actual weight. The interquartile range (25th to 75th percentile), which represents the "middle half" of the patients, was 91-109%, indicating that half of the patients had an estimated weight within 9% of the actual weight; however, there was a large range (70-133%). 3D CTA with stereotactic fiducials allows surgeons to adequately estimate abdominal flap volume before surgery, potentially giving guidance on the amount of tissue that can be harvested from a patient's lower abdomen. Copyright © 2011 Wiley-Liss, Inc.

  1. Tomographic PIV: particles versus blobs

    International Nuclear Information System (INIS)

    Champagnat, Frédéric; Cornic, Philippe; Besnerais, Guy Le; Plyer, Aurélien; Cheminet, Adam; Leclaire, Benjamin

    2014-01-01

    We present an alternative approach to tomographic particle image velocimetry (tomo-PIV) that seeks to recover nearly single voxel particles rather than blobs of extended size. The baseline of our approach is a particle-based representation of image data. An appropriate discretization of this representation yields an original linear forward model with a weight matrix built with specific samples of the system’s point spread function (PSF). Such an approach requires only a few voxels to explain the image appearance, therefore it favors much more sparsely reconstructed volumes than classic tomo-PIV. The proposed forward model is general and flexible and can be embedded in a classical multiplicative algebraic reconstruction technique (MART) or a simultaneous multiplicative algebraic reconstruction technique (SMART) inversion procedure. We show, using synthetic PIV images and by way of a large exploration of the generating conditions and a variety of performance metrics, that the model leads to better results than the classical tomo-PIV approach, in particular in the case of seeding densities greater than 0.06 particles per pixel and of PSFs characterized by a standard deviation larger than 0.8 pixels. (paper)
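
    The multiplicative update into which such a PSF-sampled weight matrix enters is the standard MART iteration, sketched below on a random sparse toy system. This is not the authors' implementation; the function mart, the relaxation parameter mu and the toy weights are assumptions made for the example.

```python
import numpy as np

def mart(W, p, n_iter=20, mu=1.0, eps=1e-12):
    """Multiplicative ART: row-by-row multiplicative updates of the voxel
    intensities E given image data p and a weight matrix W. In the particle-
    based model described above, W would hold samples of the camera PSF
    instead of the usual voxel-blob projection weights."""
    E = np.ones(W.shape[1])
    for _ in range(n_iter):
        for i in range(W.shape[0]):
            proj = W[i] @ E + eps
            E *= (p[i] / proj) ** (mu * W[i])
    return E

# toy problem: a few bright "particles" among mostly empty voxels
rng = np.random.default_rng(2)
W = rng.random((120, 60)) * (rng.random((120, 60)) < 0.2)   # sparse weights
E_true = np.zeros(60)
E_true[[7, 23, 41]] = [1.0, 0.8, 1.2]
p = W @ E_true
E_hat = mart(W, p, n_iter=30)
print(np.round(E_hat[[7, 23, 41]], 2))
```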

  2. Data-parallel tomographic reconstruction : A comparison of filtered backprojection and direct Fourier reconstruction

    NARCIS (Netherlands)

    Roerdink, J.B.T.M.; Westenberg, M.A

    1998-01-01

    We consider the parallelization of two standard 2D reconstruction algorithms, filtered backprojection and direct Fourier reconstruction, using the data-parallel programming style. The algorithms are implemented on a Connection Machine CM-5 with 16 processors and a peak performance of 2 Gflop/s.

  3. Tomographic apparatus for reconstructing planar slices from non-absorbed and non-scattered radiation

    International Nuclear Information System (INIS)

    1980-01-01

    Apparatus which can be used in computerized tomographic systems is described: a source for producing a fan-shaped beam, detectors to be used in conjunction with the source, and equipment for rotating the source supports. (U.K.)

  4. Monte Carlo simulation studies on scintillation detectors and image reconstruction of brain-phantom tumors in TOFPET

    Directory of Open Access Journals (Sweden)

    Mondal Nagendra

    2009-01-01

    This study presents Monte Carlo simulation (MCS) results for the detection efficiencies, spatial resolutions and resolving powers of time-of-flight (TOF) PET detector systems. Cerium-activated lutetium oxyorthosilicate (Lu2SiO5:Ce, in short LSO), barium fluoride (BaF2) and BriLanCe 380 (cerium-doped lanthanum tri-bromide, in short LaBr3) scintillation crystals are studied in view of their good time and energy resolutions and shorter decay times. The results of MCS based on GEANT show that the spatial resolution, detection efficiency and resolving power of LSO are better than those of BaF2 and LaBr3, although it possesses inferior time and energy resolutions. Instead of the conventional position reconstruction method, the newly established image reconstruction method (discussed in previous work) is applied to produce the images. Validation is an important step to ensure that this imaging method fulfills the purposes discussed; it is carried out by reconstructing images of two tumors in a brain phantom.

  5. Tomographic Reconstruction of Tracer Gas Concentration Profiles in a Room with the Use of a Single OP-FTIR and Two Iterative Algorithms: ART and PWLS.

    Science.gov (United States)

    Park, Doo Y; Fessler, Jeffrey A; Yost, Michael G; Levine, Steven P

    2000-03-01

    Computed tomographic (CT) reconstructions of air contaminant concentration fields were conducted in a room-sized chamber employing a single open-path Fourier transform infrared (OP-FTIR) instrument and a combination of 52 flat mirrors and 4 retroreflectors. Data for a total of 56 beam paths were repeatedly collected for around 1 hr while maintaining a stable concentration gradient. The plane of the room was divided into 195 pixels (13 × 15) for reconstruction. The algebraic reconstruction technique (ART) failed to reconstruct the original concentration gradient patterns for most cases. These poor results were caused by the "highly underdetermined condition" in which the number of unknown values (195 pixels) exceeds that of known data (56 path integral concentrations) in the experimental setting. A new CT algorithm, called penalized weighted least-squares (PWLS), was applied to remedy this condition. The peak locations were correctly positioned in the PWLS-CT reconstructions. A notable feature of the PWLS-CT reconstructions was a significant reduction of the highly irregular noise peaks found in the ART-CT reconstructions. However, the peak heights were slightly reduced in the PWLS-CT reconstructions due to the nature of the PWLS algorithm. PWLS could converge on the original concentration gradient even when a fairly high error was embedded into some experimentally measured path integral concentrations. It was also found in the simulation tests that the PWLS algorithm was very robust with respect to random errors in the path integral concentrations. This beam geometry and the use of a single OP-FTIR scanning system, in combination with the PWLS algorithm, constitute a system applicable to both environmental and industrial settings.
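
    In generic form, PWLS minimizes a weighted data-fit term plus a roughness penalty, which is what stabilizes the underdetermined beam geometry. The sketch below solves a tiny version directly; it is not the code used in the study, and the first-difference penalty, the weights and the value of beta are assumptions made for the illustration.

```python
import numpy as np

def pwls(A, y, weights, beta):
    """Penalized weighted least squares: minimize
        (y - A x)' W (y - A x) + beta * x' R x
    with W a diagonal weighting (relative inverse variances) and R a
    first-difference roughness penalty that stabilizes the underdetermined
    geometry. Solved directly here; large problems need an iterative solver."""
    n = A.shape[1]
    W = np.diag(weights)
    D = np.eye(n) - np.eye(n, k=1)          # simple 1-D first-difference operator
    R = D.T @ D
    return np.linalg.solve(A.T @ W @ A + beta * R, A.T @ W @ y)

# toy underdetermined problem: 8 path-integral measurements, 20 unknown pixels
rng = np.random.default_rng(3)
A = rng.random((8, 20))
x_true = np.exp(-0.5 * ((np.arange(20) - 12) / 3.0) ** 2)   # smooth concentration peak
y = A @ x_true + 0.01 * rng.standard_normal(8)
x_hat = pwls(A, y, weights=np.ones(8), beta=1.0)
print(int(np.argmax(x_hat)), int(np.argmax(x_true)))        # compare peak locations
```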

  6. Self-masking noise subtraction (SMNS) in digital X-ray tomosynthesis for the improvement of tomographic image quality

    International Nuclear Information System (INIS)

    Oh, J.E.; Cho, H.S.; Choi, S.I.; Park, Y.O.; Lee, M.S.; Cho, H.M.; Yang, Y.J.; Je, U.K.; Woo, T.H.; Lee, H.K.

    2011-01-01

    In this paper, we proposed a simple and effective reconstruction algorithm, the so-called self-masking noise subtraction (SMNS), in digital X-ray tomosynthesis to reduce the tomographic blur that is inherent in the conventional tomosynthesis based upon the shift-and-add (SAA) method. Using the SAA and the SMNS algorithms, we investigated the influence of tomographic parameters such as tomographic angle (θ) and angle step (Δθ) on the image quality, measuring the signal-difference-to-noise ratio (SDNR). Our simulation results show that the proposed algorithm seems to be efficient in reducing the tomographic blur and, thus, improving image sharpness. We expect the simulation results to be useful for the optimal design of a digital X-ray tomosynthesis system for our ongoing application of nondestructive testing (NDT).
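
    For orientation, the conventional shift-and-add step that SMNS builds on can be sketched as below: each projection is shifted by the parallax of the plane of interest and the results are averaged, so in-plane structures add up while off-plane structures smear into the tomographic blur discussed above. The sketch does not implement the SMNS correction itself; the function name and the point-object test data are assumptions.

```python
import numpy as np

def shift_and_add(projections, angles_deg, plane_height, pixel_pitch=1.0):
    """Conventional shift-and-add tomosynthesis: shift each projection by the
    parallax appropriate for the plane of interest and average the results.
    Structures in that plane add coherently; structures in other planes are
    smeared out, producing the usual tomographic blur."""
    out = np.zeros_like(projections[0], dtype=float)
    for proj, ang in zip(projections, angles_deg):
        shift = int(round(plane_height * np.tan(np.radians(ang)) / pixel_pitch))
        out += np.roll(proj, shift, axis=1)
    return out / len(projections)

# toy data: a point object at height 10 produces angle-dependent shifts
angles = np.linspace(-15, 15, 21)
height = 10.0
projections = []
for ang in angles:
    p = np.zeros((32, 64))
    p[16, 32 - int(round(height * np.tan(np.radians(ang))))] = 1.0
    projections.append(p)

recon = shift_and_add(projections, angles, plane_height=height)
print(recon[16, 32], recon[16, 40])   # focused at the true in-plane position
```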

  7. Monte Carlo Study of the Effect of Collimator Thickness on Tc-99m Source Response in Single Photon Emission Computed Tomography

    International Nuclear Information System (INIS)

    Islamian, Jalil Pirayesh; Toossi, Mohammad Taghi Bahreyni; Momennezhad, Mahdi; Zakavi, Seyyed Rasoul; Sadeghi, Ramin; Ljungberg, Michael

    2012-01-01

    In single photon emission computed tomography (SPECT), the collimator is a crucial element of the imaging chain and controls the noise-resolution tradeoff of the collected data. The current study is an evaluation of the effects of different thicknesses of a low-energy high-resolution (LEHR) collimator on tomographic spatial resolution in SPECT. In the present study, the SIMIND Monte Carlo program was used to simulate a SPECT system equipped with an LEHR collimator. A point source of 99mTc, an acrylic cylindrical Jaszczak phantom with cold spheres and rods, and a human anthropomorphic torso phantom (the 4D-NCAT phantom) were used. Simulated planar images and reconstructed tomographic images were evaluated both qualitatively and quantitatively. Based on the tabulated detector parameters, the contributions of Compton scattering and photoelectric interactions, and the peak-to-Compton (P/C) area in the energy spectra obtained by scanning the sources with 11 collimator thicknesses (ranging from 2.400 to 2.410 cm), we concluded that 2.405 cm is the proper thickness for the LEHR parallel-hole collimator. Image quality analyses with the structural similarity index (SSIM) algorithm and by visual inspection showed that images of suitable quality were obtained with a collimator thickness of 2.405 cm. The projections and reconstructed images prepared with the 2.405 cm LEHR collimator thickness also showed suitable quality and performance parameters compared with the other collimator thicknesses

  8. Sensitivity study of Poisson corruption in tomographic measurements for air-water flows

    International Nuclear Information System (INIS)

    Munshi, P.; Vaidya, M.S.

    1993-01-01

    An application of computerized tomography (CT) for measuring void fraction profiles in two-phase air-water flows was reported earlier. Those attempts involved some special radial methods for tomographic reconstruction as well as the popular convolution backprojection (CBP) method. The CBP method is capable of reconstructing void profiles for nonsymmetric flows as well. In this paper, we investigate the effect of corrupted CT data for gamma-ray sources and a CBP algorithm. The corruption in such a case is due to the statistical (Poisson) nature of the source
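
    The Poisson corruption model for gamma-ray transmission data is simple to state: the detected counts are Poisson-distributed around N0·exp(−∫μ dl), so the recovered path integrals carry count-dependent noise. The short sketch below illustrates how the projection error falls as the incident count N0 grows; it is a generic model, not the authors' code, and the numbers are arbitrary.

```python
import numpy as np

def noisy_projections(mu_path, n0=1e4, seed=0):
    """Corrupt ideal gamma-ray transmission data with Poisson counting noise:
    the detected counts are Poisson with mean N0 * exp(-path integral of mu),
    and the measured path integral is recovered as -log(N / N0)."""
    rng = np.random.default_rng(seed)
    counts = rng.poisson(n0 * np.exp(-mu_path))
    counts = np.maximum(counts, 1)               # avoid log(0) for weak beams
    return -np.log(counts / n0)

# ideal attenuation line integrals through a two-phase (air-water) cross-section
ideal = np.linspace(0.0, 2.0, 64)
for n0 in (1e3, 1e5):
    meas = noisy_projections(ideal, n0=n0)
    print(f"N0={n0:g}:  rms projection error {np.sqrt(np.mean((meas - ideal)**2)):.4f}")
```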

  9. Reconstruction of Monte Carlo replicas from Hessian parton distributions

    Energy Technology Data Exchange (ETDEWEB)

    Hou, Tie-Jiun [Department of Physics, Southern Methodist University,Dallas, TX 75275-0181 (United States); Gao, Jun [INPAC, Shanghai Key Laboratory for Particle Physics and Cosmology,Department of Physics and Astronomy, Shanghai Jiao-Tong University, Shanghai 200240 (China); High Energy Physics Division, Argonne National Laboratory,Argonne, Illinois, 60439 (United States); Huston, Joey [Department of Physics and Astronomy, Michigan State University,East Lansing, MI 48824 (United States); Nadolsky, Pavel [Department of Physics, Southern Methodist University,Dallas, TX 75275-0181 (United States); Schmidt, Carl; Stump, Daniel [Department of Physics and Astronomy, Michigan State University,East Lansing, MI 48824 (United States); Wang, Bo-Ting; Xie, Ke Ping [Department of Physics, Southern Methodist University,Dallas, TX 75275-0181 (United States); Dulat, Sayipjamal [Department of Physics and Astronomy, Michigan State University,East Lansing, MI 48824 (United States); School of Physics Science and Technology, Xinjiang University,Urumqi, Xinjiang 830046 (China); Center for Theoretical Physics, Xinjiang University,Urumqi, Xinjiang 830046 (China); Pumplin, Jon; Yuan, C.P. [Department of Physics and Astronomy, Michigan State University,East Lansing, MI 48824 (United States)

    2017-03-20

    We explore connections between two common methods for quantifying the uncertainty in parton distribution functions (PDFs), based on the Hessian error matrix and Monte-Carlo sampling. CT14 parton distributions in the Hessian representation are converted into Monte-Carlo replicas by a numerical method that reproduces important properties of CT14 Hessian PDFs: the asymmetry of CT14 uncertainties and positivity of individual parton distributions. The ensembles of CT14 Monte-Carlo replicas constructed this way at NNLO and NLO are suitable for various collider applications, such as cross section reweighting. Master formulas for computation of asymmetric standard deviations in the Monte-Carlo representation are derived. A correction is proposed to address a bias in asymmetric uncertainties introduced by the Taylor series approximation. A numerical program is made available for conversion of Hessian PDFs into Monte-Carlo replicas according to normal, log-normal, and Watt-Thorne sampling procedures.
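
    The basic (symmetric, Gaussian) version of the Hessian-to-replicas conversion can be written in a few lines, as sketched below. The published procedure additionally preserves the asymmetry of the CT14 errors and the positivity of the individual distributions, which this sketch does not attempt; the function name and the toy eigenvector sets are assumptions made for the illustration.

```python
import numpy as np

def hessian_to_replicas(f0, f_plus, f_minus, n_rep=1000, seed=0):
    """Convert Hessian PDF error sets into Monte Carlo replicas using the
    symmetric master formula
        f_k = f0 + sum_i r_ki * (f_i_plus - f_i_minus) / 2,   r_ki ~ N(0, 1).
    Only the basic linear-propagation step is kept here."""
    rng = np.random.default_rng(seed)
    delta = 0.5 * (np.asarray(f_plus) - np.asarray(f_minus))   # shape (n_eig, n_obs)
    r = rng.standard_normal((n_rep, delta.shape[0]))
    return f0 + r @ delta

# toy "PDF values" on a grid of x points, with two Hessian eigenvector pairs
x = np.linspace(0.01, 0.5, 5)
f0 = x ** -0.5
f_plus = np.vstack([f0 * 1.05, f0 * 1.02])
f_minus = np.vstack([f0 * 0.96, f0 * 0.99])
reps = hessian_to_replicas(f0, f_plus, f_minus)
print(reps.mean(axis=0) / f0)   # close to 1: replicas reproduce the central value
print(reps.std(axis=0) / f0)    # replica spread tracks the Hessian error sets
```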

  10. RADON reconstruction in longitudinal phase space

    International Nuclear Information System (INIS)

    Mane, V.; Peggs, S.; Wei, J.

    1997-01-01

    Longitudinal particle motion in circular accelerators is typically monitored by one-dimensional (1-D) profiles. Adiabatic particle motion in two-dimensional (2-D) phase space can be reconstructed with tomographic techniques, using 1-D profiles. A computer program, RADON, has been developed in C++ to process digitized mountain-range data and perform the phase space reconstruction for the AGS, and later for the Relativistic Heavy Ion Collider (RHIC)

  11. Assessment of Normal Eyeball Protrusion Using Computed Tomographic Imaging and Three-Dimensional Reconstruction in Korean Adults.

    Science.gov (United States)

    Shin, Kang-Jae; Gil, Young-Chun; Lee, Shin-Hyo; Kim, Jeong-Nam; Yoo, Ja-Young; Kim, Soon-Heum; Choi, Hyun-Gon; Shin, Hyun Jin; Koh, Ki-Seok; Song, Wu-Chul

    2017-01-01

    The aim of the present study was to assess normal eyeball protrusion from the orbital rim using two- and three-dimensional images and to demonstrate the better suitability of CT images for the assessment of exophthalmos. Facial computed tomographic (CT) images of Korean adults were acquired in sagittal and transverse views. The CT images were used to reconstruct three-dimensional volumes of the faces using computer software. The protrusion distances from the orbital rims and the diameters of the eyeballs were measured in the two views of the CT images and in the three-dimensional volume of the face. Relative exophthalmometry was calculated as the difference in protrusion distance between the right and left sides. The eyeball protrusion was 4.9 and 12.5 mm in the sagittal and transverse views, respectively. The protrusion distance was 2.9 mm in the three-dimensional volume of the face. There were no significant differences between the right and left sides in the degree of protrusion, and the difference was within 2 mm in more than 90% of the subjects. The results of the present study will provide reliable criteria for precise diagnosis and postoperative monitoring using CT imaging of diseases such as thyroid-associated ophthalmopathy and orbital tumors.

  12. ALICE EMCal Reconstructable Energy Non-Linearity From Test Beam Monte Carlo

    CERN Document Server

    Carter, Thomas Michael

    2017-01-01

    Calorimeters play many important roles in modern high energy physics detectors, such as event selection, triggering, and precision energy measurements. In the ALICE experiment, the EMCal provides triggering on high-energy jets, reduces the measurement bias in jet quenching studies, improves the jet energy resolution, and improves electron and photon measurements [3]. With the EMCal detector taking on so many important roles in the ALICE experiment, it is important to fully understand, characterize and model its interactions with particles. In 2010, SPS and PS electron test beam measurements were performed on an EMCal mini-module [2]. Alongside this, the test beam setup and geometry were recreated in Geant4 by Nico [1]. Figure 1 shows the reconstructable energy linearity for the SPS test beam data and that obtained from the test beam Monte Carlo, indicating the amount of energy deposited as hits in the EMCal module. It can be seen that for energies above ∼100 GeV there is a significant drop in the reconstructable energy.

  13. Initial results from the Donner 600-crystal positron tomograph

    International Nuclear Information System (INIS)

    Derenzo, S.E.; Huesman, R.H.; Cahoon, J.L.; Geyer, A.B.; Uber, D.C.; Vuletich, T.; Budinger, T.F.

    1987-01-01

    These results show that 3-mm BGO crystals can improve the resolution in positron tomography by a substantial factor. The measured crystal-pair resolution of 2.4 mm FWHM and the reconstructed image resolution of 2.9 mm FWHM at the center of the tomograph are in good agreement with expected values. The most serious limitation of the detector design is that only a single section can be imaged. 4 refs., 4 figs

  14. GPS tomography. Validation of reconstructed 3-D humidity fields with radiosonde profiles

    Energy Technology Data Exchange (ETDEWEB)

    Shangguan, M.; Bender, M.; Ramatschi, M.; Dick, G.; Wickert, J. [Helmholtz Centre Potsdam, German Research Centre for Geosciences (GFZ), Potsdam (Germany); Raabe, A. [Leipzig Institute for Meteorology (LIM), Leipzig (Germany); Galas, R. [Technische Univ. Berlin (Germany). Dept. for Geodesy and Geoinformation Sciences

    2013-11-01

    Water vapor plays an important role in meteorological applications; GeoForschungsZentrum (GFZ) therefore developed a tomographic system to derive 3-D distributions of the tropospheric water vapor above Germany using GPS data from about 300 ground stations. Input data for the tomographic reconstructions are generated by the Earth Parameter and Orbit determination System (EPOS) software of the GFZ, which provides zenith total delay (ZTD), integrated water vapor (IWV) and slant total delay (STD) data operationally with a temporal resolution of 2.5 min (STD) and 15 min (ZTD, IWV). The water vapor distribution in the atmosphere is derived by tomographic reconstruction techniques. The quality of the solution is dependent on many factors such as the spatial coverage of the atmosphere with slant paths, the spatial distribution of their intersections and the accuracy of the input observations. Independent observations are required to validate the tomographic reconstructions and to get precise information on the accuracy of the derived 3-D water vapor fields. To determine the quality of the GPS tomography, more than 8000 vertical water vapor profiles at 13 German radiosonde stations were used for the comparison. The radiosondes were launched twice a day (at 00:00 UTC and 12:00 UTC) in 2007. In this paper, parameters of the entire profiles such as the wet refractivity, and the zenith wet delay have been compared. Before the validation the temporal and spatial distribution of the slant paths, serving as a basis for tomographic reconstruction, as well as their angular distribution were studied. The mean wet refractivity differences between tomography and radiosonde data for all points vary from -1.3 to 0.3, and the root mean square is within the range of 6.5-9. About 32% of 6803 profiles match well, 23% match badly and 45% are difficult to classify as they match only in parts.

  15. Simulation of a Quality Control Jaszczak Phantom with SIMIND Monte Carlo and Adding the Phantom as an Accessory to the Program

    International Nuclear Information System (INIS)

    Pirayesh Islamian, J.; Bahreyni Toosi, M. T.; Momennezhad, M.; Naseri, Sh.; Ljungberg, M.

    2012-01-01

    Quality control is an important procedure in nuclear medicine imaging. A Jaszczak SPECT phantom provides consistent performance information for any SPECT or PET system. This article describes the simulation of a Jaszczak phantom and the creation of an executable phantom file for the comparative assessment of SPECT cameras using the SIMIND Monte Carlo simulation program, which is well established for SPECT. The simulation was based on the Deluxe model of the Jaszczak phantom with defined geometry. Quality control tests were provided, together with an initial imaging example and suggested use, for the assessment of parameters such as spatial resolution, limits of lesion detection, and contrast, in comparison with a Siemens E.Cam SPECT system. The phantom simulation was verified by matching tomographic spatial resolution, image contrast and uniformity against an experimental SPECT of the phantom, using filtered backprojection images reconstructed from the spheres and rods. The calculated contrasts of the rods were 0.774, 0.627, 0.575, 0.372, 0.191, and 0.132 for the experiment with rod diameters of 31.8, 25.4, 19.1, 15.9, 12.7, and 9.5 mm, respectively. The calculated contrasts of the simulated rods were 0.661, 0.527, 0.487, 0.400, 0.23, and 0.2 for cold rods and 0.92, 0.91, 0.88, 0.81, 0.76, and 0.56 for hot rods. The reconstructed tomographic spatial resolution of both the experimental and the simulated SPECT of the phantom was about 9.5 mm. An executable phantom file and an input phantom file were created for the SIMIND Monte Carlo program. This phantom may be used for simulated SPECT systems and would be ideal for verification of the simulated systems against real ones by comparing the results of quality control and image evaluation. It is also envisaged that this phantom could be used with a range of radionuclide doses in simulation situations such as cold, hot, and background uptakes for the assessment of detection characteristics when a new, similar clinical SPECT procedure is being simulated.

  16. Simulation of a Quality Control Jaszczak Phantom with SIMIND Monte Carlo and Adding the Phantom as an Accessory to the Program

    Directory of Open Access Journals (Sweden)

    Jalil Pirayesh Islamian

    2012-03-01

    Introduction: Quality control is an important procedure in nuclear medicine imaging. A Jaszczak SPECT phantom provides consistent performance information for any SPECT or PET system. This article describes the simulation of a Jaszczak phantom and the creation of an executable phantom file for the comparative assessment of SPECT cameras using the SIMIND Monte Carlo simulation program, which is well established for SPECT. Materials and Methods: The simulation was based on the Deluxe model of the Jaszczak phantom with defined geometry. Quality control tests were provided, together with an initial imaging example and suggested use, for the assessment of parameters such as spatial resolution, limits of lesion detection, and contrast, in comparison with a Siemens E.Cam SPECT system. Results: The phantom simulation was verified by matching tomographic spatial resolution, image contrast and uniformity against an experimental SPECT of the phantom, using filtered backprojection images reconstructed from the spheres and rods. The calculated contrasts of the rods were 0.774, 0.627, 0.575, 0.372, 0.191, and 0.132 for the experiment with rod diameters of 31.8, 25.4, 19.1, 15.9, 12.7, and 9.5 mm, respectively. The calculated contrasts of the simulated rods were 0.661, 0.527, 0.487, 0.400, 0.23, and 0.2 for cold rods and 0.92, 0.91, 0.88, 0.81, 0.76, and 0.56 for hot rods. The reconstructed tomographic spatial resolution of both the experimental and the simulated SPECT of the phantom was about 9.5 mm. An executable phantom file and an input phantom file were created for the SIMIND Monte Carlo program. Conclusion: This phantom may be used for simulated SPECT systems and would be ideal for verification of the simulated systems against real ones by comparing the results of quality control and image evaluation. It is also envisaged that this phantom could be used with a range of radionuclide doses in simulation situations such as cold, hot, and background uptakes for the assessment of detection characteristics when a new, similar clinical SPECT procedure is being simulated.

  17. Lamb-Wave-Based Tomographic Imaging Techniques for Hole-Edge Corrosion Monitoring in Plate Structures

    Directory of Open Access Journals (Sweden)

    Dengjiang Wang

    2016-11-01

    This study presents a novel monitoring method for hole-edge corrosion damage in plate structures based on Lamb-wave tomographic imaging techniques. An experimental procedure with a cross-hole layout using 16 piezoelectric transducers (PZTs) was designed. The A0 mode of the Lamb wave was selected, which is sensitive to thickness-loss damage. The iterative algebraic reconstruction technique (ART) was used to locate and quantify the corrosion damage at the edge of the hole. Hydrofluoric acid with a concentration of 20% was used to corrode the specimen artificially. To evaluate the effectiveness of the proposed method, the real corrosion damage was compared with the corrosion damage predicted by the tomographic method. The results show that the Lamb-wave-based tomographic method can be used to monitor hole-edge corrosion damage accurately.

  18. Iterative reconstruction using a Monte Carlo based system transfer matrix for dedicated breast positron emission tomography

    Energy Technology Data Exchange (ETDEWEB)

    Saha, Krishnendu [Ohio Medical Physics Consulting, Dublin, Ohio 43017 (United States); Straus, Kenneth J.; Glick, Stephen J. [Department of Radiology, University of Massachusetts Medical School, Worcester, Massachusetts 01655 (United States); Chen, Yu. [Department of Radiation Oncology, Columbia University, New York, New York 10032 (United States)

    2014-08-28

    To maximize sensitivity, it is desirable that ring Positron Emission Tomography (PET) systems dedicated for imaging the breast have a small bore. Unfortunately, due to parallax error this causes substantial degradation in spatial resolution for objects near the periphery of the breast. In this work, a framework for computing and incorporating an accurate system matrix into iterative reconstruction is presented in an effort to reduce spatial resolution degradation towards the periphery of the breast. The GATE Monte Carlo Simulation software was utilized to accurately model the system matrix for a breast PET system. A strategy for increasing the count statistics in the system matrix computation and for reducing the system element storage space was used by calculating only a subset of matrix elements and then estimating the rest of the elements by using the geometric symmetry of the cylindrical scanner. To implement this strategy, polar voxel basis functions were used to represent the object, resulting in a block-circulant system matrix. Simulation studies using a breast PET scanner model with ring geometry demonstrated improved contrast at 45% reduced noise level and 1.5 to 3 times resolution performance improvement when compared to MLEM reconstruction using a simple line-integral model. The GATE based system matrix reconstruction technique promises to improve resolution and noise performance and reduce image distortion at FOV periphery compared to line-integral based system matrix reconstruction.
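
    As a rough illustration of where such a system matrix enters the reconstruction, the sketch below shows a standard MLEM update written so that a (possibly Monte-Carlo-derived) sparse matrix can be dropped in; the random sparse matrix, the sinogram, and the iteration count are placeholders, and the polar-voxel/block-circulant storage scheme described in the abstract is not reproduced.

        import numpy as np
        from scipy.sparse import random as sparse_random

        def mlem(P, y, n_iters=50, eps=1e-12):
            """Standard MLEM update  x <- x / s * P^T(y / (P x)),  with s = P^T 1.
            P is the (n_bins x n_voxels) system matrix, e.g. estimated by Monte
            Carlo; y is the measured sinogram."""
            x = np.ones(P.shape[1])
            sens = np.asarray(P.sum(axis=0)).ravel() + eps   # sensitivity image P^T 1
            for _ in range(n_iters):
                proj = P @ x + eps                           # forward projection
                x *= (P.T @ (y / proj)) / sens               # multiplicative EM update
            return x

        # toy usage with a random sparse stand-in for a Monte-Carlo system matrix
        P = sparse_random(200, 64, density=0.1, format="csr", random_state=0)
        x_true = np.random.default_rng(0).random(64)
        y = P @ x_true
        print(np.round(mlem(P, y)[:8], 3))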

  19. Reconstruction in PET cameras with irregular sampling and depth of interaction capability

    International Nuclear Information System (INIS)

    Virador, P.R.G.; Moses, W.W.; Huesman, R.H.

    1998-01-01

    The authors present 2D reconstruction algorithms for a rectangular PET camera capable of measuring depth of interaction (DOI). The camera geometry leads to irregular radial and angular sampling of the tomographic data. DOI information increases sampling density, allowing the use of evenly spaced quarter-crystal width radial bins with minimal interpolation of irregularly spaced data. In the regions where DOI does not increase sampling density (chords normal to crystal faces), fine radial sinogram binning leads to zero efficiency bins if uniform angular binning is used. These zero efficiency sinogram bins lead to streak artifacts if not corrected. To minimize these unnormalizable sinogram bins the authors use two angular binning schemes: Fixed Width and Natural Width. Fixed Width uses a fixed angular width except in the problem regions where appropriately chosen widths are applied. Natural Width uses angle widths which are derived from intrinsic detector sampling. Using a modified filtered-backprojection algorithm to accommodate these angular binning schemes, the authors reconstruct artifact free images with nearly isotropic and position independent spatial resolution. Results from Monte Carlo data indicate that they have nearly eliminated image degradation due to crystal penetration

  20. X-ray Tomographic Microscopy at TOMCAT

    Energy Technology Data Exchange (ETDEWEB)

    Marone, F; Hintermueller, C; McDonald, S; Abela, R; Mikuljan, G; Isenegger, A; Stampanoni, M, E-mail: federica.marone@psi.c [Swiss Light Source, Paul Scherrer Institut, 5232 Villigen (Switzerland)

    2009-09-01

    Synchrotron-based X-ray Tomographic Microscopy is a powerful technique for fast, non-destructive, high-resolution quantitative volumetric investigations on diverse samples. At the TOMCAT (TOmographic Microscopy and Coherent rAdiology experimenTs) beamline at the Swiss Light Source, synchrotron light is delivered by a 2.9 T superbend. The main optical component, a Double Crystal Multilayer Monochromator, covers an energy range between 8 and 45 keV. The standard TOMCAT detector offers fields of view ranging from 0.75×0.75 mm² up to 12.1×12.1 mm², with pixel sizes of 0.37 µm and 5.92 µm, respectively. In addition to routine measurements, which exploit absorption contrast, the high coherence of the source also enables phase contrast tomography, implemented with two complementary techniques (Modified Transport of Intensity approach and Grating Interferometry). Typical acquisition times for a tomogram are on the order of a few minutes, ensuring high throughput and allowing for semi-dynamical investigations. Raw data are automatically post-processed online and full reconstructed volumes are available shortly after a scan with minimal user intervention.

  1. Reconstruction of point cross-section from ENDF data file for Monte Carlo applications

    International Nuclear Information System (INIS)

    Kumawat, H.; Saxena, A.; Carminati, F.

    2016-12-01

    Monte Carlo neutron transport codes are among the best tools to simulate complex systems like fission and fusion reactors, accelerator-driven sub-critical systems, radioactivity management of spent fuel and waste, optimization and characterization of neutron detectors, optimization of boron neutron capture therapy, imaging, etc. The neutron cross-section and secondary particle emission properties are the main input parameters of such codes. The fission, capture and elastic scattering cross-sections have complex resonance structures. The Evaluated Nuclear Data File (ENDF) contains these cross-sections and secondary parameters. We report the development of a reconstruction procedure to generate point cross-sections and probabilities from the ENDF data file. The cross-sections are compared with the values obtained from the PREPRO and, in some cases, NJOY codes. The results are in good agreement. (author)

  2. Analytical, experimental, and Monte Carlo system response matrix for pinhole SPECT reconstruction

    Energy Technology Data Exchange (ETDEWEB)

    Aguiar, Pablo, E-mail: pablo.aguiar.fernandez@sergas.es [Fundación Ramón Domínguez, Medicina Nuclear, CHUS, Spain and Grupo de Imaxe Molecular, IDIS, Santiago de Compostela 15706 (Spain); Pino, Francisco [Unitat de Biofísica, Facultat de Medicina, Universitat de Barcelona, Spain and Servei de Física Médica i Protecció Radiológica, Institut Catalá d' Oncologia, Barcelona 08036 (Spain); Silva-Rodríguez, Jesús [Fundación Ramón Domínguez, Medicina Nuclear, CHUS, Santiago de Compostela 15706 (Spain); Pavía, Javier [Servei de Medicina Nuclear, Hospital Clínic, Barcelona (Spain); Institut d' Investigacions Biomèdiques August Pí i Sunyer (IDIBAPS) (Spain); CIBER en Bioingeniería, Biomateriales y Nanomedicina (CIBER-BBN), Barcelona 08036 (Spain); Ros, Doménec [Unitat de Biofísica, Facultat de Medicina, Casanova 143 (Spain); Institut d' Investigacions Biomèdiques August Pí i Sunyer (IDIBAPS) (Spain); CIBER en Bioingeniería, Biomateriales y Nanomedicina (CIBER-BBN), Barcelona 08036 (Spain); Ruibal, Álvaro [Servicio Medicina Nuclear, CHUS (Spain); Grupo de Imaxe Molecular, Facultade de Medicina (USC), IDIS, Santiago de Compostela 15706 (Spain); Fundación Tejerina, Madrid (Spain); and others

    2014-03-15

    Purpose: To assess the performance of two approaches to the system response matrix (SRM) calculation in pinhole single photon emission computed tomography (SPECT) reconstruction. Methods: Evaluation was performed using experimental data from a low magnification pinhole SPECT system that consisted of a rotating flat detector with a monolithic scintillator crystal. The SRM was computed following two approaches, which were based on Monte Carlo simulations (MC-SRM) and analytical techniques in combination with an experimental characterization (AE-SRM). The spatial response of the system, obtained by using the two approaches, was compared with experimental data. The effect of the MC-SRM and AE-SRM approaches on the reconstructed image was assessed in terms of image contrast, signal-to-noise ratio, image quality, and spatial resolution. To this end, acquisitions were carried out using a hot cylinder phantom (consisting of five fillable rods with diameters of 5, 4, 3, 2, and 1 mm and a uniform cylindrical chamber) and a custom-made Derenzo phantom, with center-to-center distances between adjacent rods of 1.5, 2.0, and 3.0 mm. Results: Good agreement was found for the spatial response of the system between measured data and results derived from MC-SRM and AE-SRM. Only minor differences for point sources at distances smaller than the radius of rotation and large incidence angles were found. Assessment of the effect on the reconstructed image showed a similar contrast for both approaches, with values higher than 0.9 for rod diameters greater than 1 mm and higher than 0.8 for rod diameter of 1 mm. The comparison in terms of image quality showed that all rods in the different sections of a custom-made Derenzo phantom could be distinguished. The spatial resolution (FWHM) was 0.7 mm at iteration 100 using both approaches. The SNR was lower for reconstructed images using MC-SRM than for those reconstructed using AE-SRM, indicating that AE-SRM deals better with the

  3. Analytical, experimental, and Monte Carlo system response matrix for pinhole SPECT reconstruction

    International Nuclear Information System (INIS)

    Aguiar, Pablo; Pino, Francisco; Silva-Rodríguez, Jesús; Pavía, Javier; Ros, Doménec; Ruibal, Álvaro

    2014-01-01

    Purpose: To assess the performance of two approaches to the system response matrix (SRM) calculation in pinhole single photon emission computed tomography (SPECT) reconstruction. Methods: Evaluation was performed using experimental data from a low magnification pinhole SPECT system that consisted of a rotating flat detector with a monolithic scintillator crystal. The SRM was computed following two approaches, which were based on Monte Carlo simulations (MC-SRM) and analytical techniques in combination with an experimental characterization (AE-SRM). The spatial response of the system, obtained by using the two approaches, was compared with experimental data. The effect of the MC-SRM and AE-SRM approaches on the reconstructed image was assessed in terms of image contrast, signal-to-noise ratio, image quality, and spatial resolution. To this end, acquisitions were carried out using a hot cylinder phantom (consisting of five fillable rods with diameters of 5, 4, 3, 2, and 1 mm and a uniform cylindrical chamber) and a custom-made Derenzo phantom, with center-to-center distances between adjacent rods of 1.5, 2.0, and 3.0 mm. Results: Good agreement was found for the spatial response of the system between measured data and results derived from MC-SRM and AE-SRM. Only minor differences for point sources at distances smaller than the radius of rotation and large incidence angles were found. Assessment of the effect on the reconstructed image showed a similar contrast for both approaches, with values higher than 0.9 for rod diameters greater than 1 mm and higher than 0.8 for rod diameter of 1 mm. The comparison in terms of image quality showed that all rods in the different sections of a custom-made Derenzo phantom could be distinguished. The spatial resolution (FWHM) was 0.7 mm at iteration 100 using both approaches. The SNR was lower for reconstructed images using MC-SRM than for those reconstructed using AE-SRM, indicating that AE-SRM deals better with the

  4. A prototype high-resolution animal positron tomograph with avalanche photodiode arrays and LSO crystals

    International Nuclear Information System (INIS)

    Ziegler, S.I.; Pichler, B.J.; Rafecas, M.; Schwaiger, M.

    2001-01-01

    To fully utilize positron emission tomography (PET) as a non-invasive tool for tissue characterization, dedicated instrumentation is being developed which is specially suited for imaging mice and rats. Semiconductor detectors, such as avalanche photodiodes (APDs), may offer an alternative to photomultiplier tubes for the readout of scintillation crystals. Since the scintillation characteristics of lutetium oxyorthosilicate (LSO) are well matched to APDs, the combination of LSO and APDs seems favourable, and the goal of this study was to build a positron tomograph with LSO-APD modules to prove the feasibility of such an approach. A prototype PET scanner based on APD readout of small, individual LSO crystals was developed for tracer studies in mice and rats. The tomograph consists of two sectors (86 mm distance), each comprising three LSO-APD modules, which can be rotated for the acquisition of complete projections. In each module, small LSO crystals (3.7 × 3.7 × 12 mm³) are individually coupled to one channel within matrices containing 2 × 8 square APDs (2.6 × 2.6 mm² sensitive area per channel). The list-mode data are reconstructed with a penalized weighted least squares algorithm which includes the spatially dependent line spread function of the tomograph. Basic performance parameters were measured with phantoms and first experiments with rats and mice were conducted to introduce this methodology for biomedical imaging. The reconstructed field of view covers 68 mm, which is 80% of the total detector diameter. Image resolution was shown to be 2.4 mm within the whole reconstructed field of view. Using a lower energy threshold of 450 keV, the system sensitivity was 350 Hz/MBq for a line source in air in the centre of the field of view. In a water-filled cylinder of 4.6 cm diameter, the scatter fraction at the centre of the field of view was 16% (450 keV threshold). The count rate was linear up to 700 coincidence counts per second. In vivo studies of anaesthetized

  5. Feasibility study for image reconstruction in circular digital tomosynthesis (CDTS) from limited-scan angle data based on compressed-sensing theory

    Energy Technology Data Exchange (ETDEWEB)

    Park, Yeonok; Je, Uikyu; Cho, Hyosung, E-mail: hscho1@yonsei.ac.kr; Hong, Daeki; Park, Chulkyu; Cho, Heemoon; Choi, Sungil; Woo, Taeho

    2015-03-21

    In this work, we performed a feasibility study of image reconstruction in circular digital tomosynthesis (CDTS) from limited-scan-angle data based on compressed-sensing (CS) theory. Here, the X-ray source moves along an arc within a limited scan angle (≤ 180°) on a circular path set perpendicular to the axial direction during image acquisition. Compared to a full-angle (360°) scan geometry, this geometry allows the imaging system to be designed more compactly, and it gives better tomographic quality than conventional linear digital tomosynthesis (DTS). We implemented an efficient CS-based reconstruction algorithm for the proposed geometry and performed systematic simulations to investigate the image characteristics. We successfully reconstructed CDTS images from incomplete projections acquired at several selected limited scan angles of 45°, 90°, 135°, and 180° for a given tomographic angle of 80°, and evaluated the reconstruction quality. Our simulation results indicate that the proposed method can provide tomographic quality superior to conventional DTS for the axial view and even for the other views (i.e., sagittal and coronal), as in computed tomography. - Highlights: • Image reconstruction is performed in circular digital tomosynthesis (CDTS). • The designed geometry allows a more compact imaging system and better image quality. • An efficient compressed-sensing (CS)-based reconstruction algorithm is implemented. • The proposed method can provide superior tomographic quality for the axial view.
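
    As a generic illustration of the compressed-sensing recipe such reconstructions build on (not the authors' algorithm), the sketch below alternates a gradient step on the data-fidelity term with a few descent steps on a smoothed total-variation penalty; the dense projection matrix A of shape (number of measurements, number of pixels), the step sizes, and the iteration counts are all illustrative assumptions.

        import numpy as np

        def tv_grad(img, eps=1e-8):
            """Gradient of a smoothed (isotropic) total-variation penalty of a 2D image."""
            dx = np.zeros_like(img); dy = np.zeros_like(img)
            dx[:, :-1] = img[:, 1:] - img[:, :-1]
            dy[:-1, :] = img[1:, :] - img[:-1, :]
            mag = np.sqrt(dx**2 + dy**2 + eps)
            px, py = dx / mag, dy / mag
            grad = -(px + py)
            grad[:, 1:] += px[:, :-1]
            grad[1:, :] += py[:-1, :]
            return grad

        def cs_reconstruct(A, b, shape, n_outer=100, n_tv=10, tv_step=0.02):
            """Alternate a projected gradient step on ||A x - b||^2 with a few
            TV-descent steps (a generic ASD-POCS-flavoured scheme)."""
            x = np.zeros(shape)
            lip = np.linalg.norm(A, 2) ** 2              # Lipschitz constant of the data gradient
            for _ in range(n_outer):
                r = A @ x.ravel() - b                    # data mismatch
                x -= (A.T @ r).reshape(shape) / lip      # gradient step on the data term
                x = np.clip(x, 0.0, None)                # non-negativity constraint
                for _ in range(n_tv):
                    x -= tv_step * tv_grad(x)            # edge-preserving regularisation
            return x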

  6. Tomographic Particle Image Velocimetry Using Colored Shadow Imaging

    KAUST Repository

    Alarfaj, Meshal K.

    2016-02-01

    Tomographic particle image velocimetry (PIV) is a recent PIV method capable of reconstructing the full 3D velocity field of complex flows within a 3D volume. For nearly the last decade, it has become the most powerful tool for the study of turbulent velocity fields and promises great advancements in the study of fluid mechanics. Among the early published studies, a good number have suggested enhancements and optimizations of different aspects of this technique to improve its effectiveness. One major aspect, which is the core of the present work, is related to reducing the cost of the tomographic PIV setup. In this thesis, we attempt to reduce this cost by using an experimental setup exploiting four commercial digital still cameras in combination with low-cost light-emitting diodes (LEDs). We use two different colors to distinguish the two light pulses. By using colored shadows with red and green LEDs, we can identify the particle locations within the measurement volume at the two different times, thereby allowing calculation of the velocities. The present work tests this technique on the flow patterns of a jet ejected from a tube in a water tank. Results from the image processing are presented and challenges discussed.

  7. Frequency-domain optical tomographic image reconstruction algorithm with the simplified spherical harmonics (SP3) light propagation model.

    Science.gov (United States)

    Kim, Hyun Keol; Montejo, Ludguier D; Jia, Jingfei; Hielscher, Andreas H

    2017-06-01

    We introduce here the finite volume formulation of the frequency-domain simplified spherical harmonics model with n-th order absorption coefficients (FD-SPN) that approximates the frequency-domain equation of radiative transfer (FD-ERT). We then present the FD-SPN based reconstruction algorithm that recovers absorption and scattering coefficients in biological tissue. The FD-SPN model with 3rd-order absorption coefficient (i.e., FD-SP3) is used as a forward model to solve the inverse problem. The FD-SP3 is discretized with a node-centered finite volume scheme and solved with a restarted generalized minimum residual (GMRES) algorithm. The absorption and scattering coefficients are retrieved using a limited-memory Broyden-Fletcher-Goldfarb-Shanno (L-BFGS) algorithm. Finally, the forward and inverse algorithms are evaluated using numerical phantoms with optical properties and size that mimic small-volume tissue such as finger joints and small animals. The forward results show that the FD-SP3 model approximates the FD-ERT (S12) solution with relatively high accuracy; the average errors in the phase (<3.7%) and the amplitude (<7.1%) of the partial current at the boundary are reported. From the inverse results we find that the absorption and scattering coefficient maps are more accurately reconstructed with the SP3 model than with the SP1 model. Therefore, this work shows that the FD-SP3 is an efficient model for optical tomographic imaging of small-volume media with non-diffuse properties, both in terms of computational time and accuracy, as it requires significantly lower CPU time than the FD-ERT (S12) and is also more accurate than the FD-SP1.

  8. Application of tomographic inversion in studying airglow in the mesopause region

    Directory of Open Access Journals (Sweden)

    T. Nygrén

    Full Text Available It is pointed out that observations of periodic nightglow structures give excellent information on atmospheric gravity waves in the mesosphere and lower thermosphere. The periods, the horizontal wavelengths and the phase speeds of the waves can be determined from airglow images and, using several cameras, the approximate altitude of the luminous layer can also be determined by triangulation. In this paper the possibility of applying tomographic methods for reconstructing the airglow structures is investigated using numerical simulations. A ground-based chain of cameras is assumed, two-dimensional airglow models in the vertical plane above the chain are constructed, and simulated data are calculated by integrating the models along a great number of rays with different elevation angles for each camera. After addition of random noise, these data are then inverted to obtain reconstructions of the models. A tomographic analysis package originally designed for satellite radiotomography is used in the inversion. The package is based on a formulation of stochastic inversion which allows the input of a priori information to the solver in terms of regularization variances. The reconstruction is carried out in two stages. In the first inversion, constant regularization variances are used within a wide altitude range. The results are used in determining the approximate altitude range of the airglow structures. Then, in the second inversion, constant non-zero regularization variances are used inside this region and zero variances outside it. With this method reliable reconstructions of the models are obtained. The number of cameras as well as their separations are varied in order to find out the limitations of the method.

    Key words. Tomography · Airglow · Mesopause · Gravity waves
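
    The stochastic inversion with regularization variances described above reduces, in its simplest dense-matrix form, to a Bayesian linear estimate. The sketch below illustrates that structure and the two-stage use of the prior variances; the matrices, noise levels, and layer extent are placeholder values, not the actual satellite-radiotomography package used by the authors.

        import numpy as np

        def stochastic_inversion(A, y, noise_var, prior_var):
            """Bayesian (stochastic) linear inversion
                x_hat = (A^T Cn^-1 A + Cx^-1)^-1 A^T Cn^-1 y
            with diagonal noise covariance Cn = diag(noise_var) and diagonal prior
            (regularization) covariance Cx = diag(prior_var). Small prior variances
            pin the solution toward zero, large ones leave it unconstrained."""
            Cn_inv = np.diag(1.0 / np.asarray(noise_var, dtype=float))
            Cx_inv = np.diag(1.0 / np.maximum(np.asarray(prior_var, dtype=float), 1e-12))
            lhs = A.T @ Cn_inv @ A + Cx_inv
            rhs = A.T @ Cn_inv @ y
            return np.linalg.solve(lhs, rhs)

        # toy two-stage use: a broad prior first, then non-zero variances only inside
        # the altitude range suggested by the first pass (all numbers illustrative)
        rng = np.random.default_rng(1)
        A = rng.random((40, 30))                      # stand-in projection geometry
        x_true = np.zeros(30); x_true[10:20] = 1.0    # "airglow layer"
        y = A @ x_true + 0.01 * rng.standard_normal(40)
        first = stochastic_inversion(A, y, np.full(40, 1e-4), np.full(30, 1.0))
        layer = np.abs(first) > 0.5 * np.abs(first).max()
        second = stochastic_inversion(A, y, np.full(40, 1e-4),
                                      np.where(layer, 1.0, 1e-9))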

  9. An iterative reconstruction from truncated projection data

    International Nuclear Information System (INIS)

    Anon.

    1985-01-01

    Various methods have been proposed for tomographic reconstruction from truncated projection data. In this paper, a reconstruction method is discussed which consists of iterations of filtered back-projection, reprojection and some nonlinear processing. First, the method is constructed so that it converges to a fixed point. Then, to examine its effectiveness, comparisons are made by computer experiments with two existing reconstruction methods for truncated projection data, namely, extrapolation based on a smoothness assumption followed by filtered back-projection, and modified additive ART.
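
    The structure of such a fixed-point iteration (back-project the mismatch on the measured rays, apply a nonlinear constraint, repeat) can be sketched with generic operators. In the toy usage below a scaled transpose stands in for filtered back-projection and a small random matrix for the projector; both are purely illustrative and not the method of the paper.

        import numpy as np

        def truncated_fixed_point(project, backproject, p_meas, meas_mask, x0,
                                  n_iters=200,
                                  constrain=lambda x: np.clip(x, 0.0, None)):
            """x_{k+1} = C( x_k + B( M * (p_meas - P x_k) ) ): only the measured part
            of the projection mismatch (selected by meas_mask) is fed back through a
            back-projection-type operator B; C is the nonlinear constraint."""
            x = x0.copy()
            for _ in range(n_iters):
                mismatch = np.where(meas_mask, p_meas - project(x), 0.0)
                x = constrain(x + backproject(mismatch))
            return x

        # toy usage: a small linear system with the last rows "truncated away"
        rng = np.random.default_rng(2)
        A = rng.random((60, 40))
        x_true = np.clip(rng.standard_normal(40), 0.0, None)
        p_full = A @ x_true
        meas_mask = np.arange(60) < 45                   # rays 45..59 are missing
        scale = np.linalg.norm(A, 2) ** 2                # crude stand-in for FBP scaling
        x_rec = truncated_fixed_point(lambda x: A @ x, lambda p: A.T @ p / scale,
                                      p_full, meas_mask, x0=np.zeros(40))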

  10. Acceleration of iterative tomographic reconstruction using graphics processors

    International Nuclear Information System (INIS)

    Belzunce, M.A.; Osorio, A.; Verrastro, C.A.

    2009-01-01

    Using iterative algorithms for image reconstruction in 3D positron emission tomography has been shown to produce images with better quality than analytical methods. However, these algorithms are computationally expensive. New graphics processing units (GPUs) provide high performance at low cost, together with programming tools that make it possible to execute parallel algorithms easily in scientific applications. In this work, we try to achieve an acceleration of image reconstruction algorithms in 3D PET by using a GPU. A parallel implementation of the 3D ML-EM algorithm was developed using the Siddon algorithm as projector and back-projector. Results show that accelerations of more than one order of magnitude can be achieved, keeping similar image quality. (author)

  11. Rational approximations for tomographic reconstructions

    International Nuclear Information System (INIS)

    Reynolds, Matthew; Beylkin, Gregory; Monzón, Lucas

    2013-01-01

    We use optimal rational approximations of projection data collected in x-ray tomography to improve image resolution. Under the assumption that the object of interest is described by functions with jump discontinuities, for each projection we construct its rational approximation with a small (near optimal) number of terms for a given accuracy threshold. This allows us to augment the measured data, i.e., double the number of available samples in each projection or, equivalently, extend (double) the domain of their Fourier transform. We also develop a new, fast, polar coordinate Fourier domain algorithm which uses our nonlinear approximation of projection data in a natural way. Using augmented projections of the Shepp–Logan phantom, we provide a comparison between the new algorithm and the standard filtered back-projection algorithm. We demonstrate that the reconstructed image has improved resolution without additional artifacts near sharp transitions in the image. (paper)

  12. Pipelining Computational Stages of the Tomographic Reconstructor for Multi-Object Adaptive Optics on a Multi-GPU System

    KAUST Repository

    Charara, Ali; Ltaief, Hatem; Gratadour, Damien; Keyes, David E.; Sevin, Arnaud; Abdelfattah, Ahmad; Gendron, Eric; Morel, Carine; Vidal, Fabrice

    2014-01-01

    called MOSAIC has been proposed to perform multi-object spectroscopy using the Multi-Object Adaptive Optics (MOAO) technique. The core implementation of the simulation lies in the intensive computation of a tomographic reconstructor (TR), which is used

  13. Fast Tomographic Reconstruction From Limited Data Using Artificial Neural Networks

    NARCIS (Netherlands)

    D.M. Pelt (Daniël); K.J. Batenburg (Joost)

    2013-01-01

    Image reconstruction from a small number of projections is a challenging problem in tomography. Advanced algorithms that incorporate prior knowledge can sometimes produce accurate reconstructions, but they typically require long computation times. Furthermore, the required prior

  14. Development of a X-ray micro-tomograph and its application to reservoir rocks characterization

    International Nuclear Information System (INIS)

    Ferreira de Paiva, R.

    1995-10-01

    We describe the construction and application to three-dimensional studies of a laboratory micro-tomograph for the characterisation of heterogeneous solids at the scale of a few microns. The system is based on an electron microprobe and a two-dimensional X-ray detector. The use of a low beam divergence for image acquisition allows the use of simple and rapid reconstruction software whilst retaining reasonable acquisition times. Spatial resolutions of better than 3 microns in radiography and 10 microns in tomography are obtained. The applications of microtomography in the petroleum industry are illustrated by the study of fibre orientation in polymer composites, of the distribution of minerals and pore space in reservoir rocks, and of the interaction of salt water with a model porous medium. A correction for X-ray beam hardening is described and used to obtain improved discrimination of the phases present in the sample. In the case of a North Sea reservoir rock we show the possibility of distinguishing quartz, feldspar and, in certain zones, kaolinite. The representativeness of the tomographic reconstruction is demonstrated by comparing the surface of the reconstructed specimen with corresponding images obtained by scanning electron microscopy. (author). 58 refs., 10 tabs., 71 photos

  15. Effect of noise in computed tomographic reconstructions on detectability

    International Nuclear Information System (INIS)

    Hanson, K.M.

    1982-01-01

    The detectability of features in an image is ultimately limited by the random fluctuations in density or noise present in that image. The noise in CT reconstructions arising from the statistical fluctuations in the one-dimensional input projection measurements has an unusual character owing to the reconstruction procedure. Such CT image noise differs from the white noise normally found in images in its lack of low-frequency components. The noise power spectrum of CT reconstructions can be related to the effective density of x-ray quanta detected in the projection measurements, designated as NEQ (noise-equivalent quanta). The detectability of objects that are somewhat larger than the spatial resolution is directly related to NEQ. Since contrast resolution may be defined in terms of the ability to detect large, low-contrast objects, the measurement of a CT scanner's NEQ may be used to characterize its contrast sensitivity

  16. Volume reconstruction optimization for tomo-PIV algorithms applied to experimental data

    Science.gov (United States)

    Martins, Fabio J. W. A.; Foucaut, Jean-Marc; Thomas, Lionel; Azevedo, Luis F. A.; Stanislas, Michel

    2015-08-01

    Tomographic PIV is a three-component volumetric velocity measurement technique based on the tomographic reconstruction of a particle distribution imaged by multiple camera views. In essence, the performance and accuracy of this technique are highly dependent on the parametric adjustment and the reconstruction algorithm used. Although synthetic data have been widely employed to optimize experiments, the resulting reconstructed volumes might not have optimal quality. The purpose of the present study is to offer quality indicators that can be applied to data samples in order to improve the quality of velocity results obtained by the tomo-PIV technique. The proposed methodology can potentially lead to a significant reduction in the time required to optimize a tomo-PIV reconstruction, while also leading to better-quality velocity results. Tomo-PIV data provided by a six-camera turbulent boundary-layer experiment were used to optimize the reconstruction algorithms according to this methodology. Velocity statistics obtained by the optimized BIMART, SMART and MART algorithms were compared with hot-wire anemometer data, and velocity measurement uncertainties were computed. Results indicated that the BIMART and SMART algorithms produced reconstructed volumes of quality equivalent to the standard MART, with the benefit of reduced computational time.

  17. Volume reconstruction optimization for tomo-PIV algorithms applied to experimental data

    International Nuclear Information System (INIS)

    Martins, Fabio J W A; Foucaut, Jean-Marc; Stanislas, Michel; Thomas, Lionel; Azevedo, Luis F A

    2015-01-01

    Tomographic PIV is a three-component volumetric velocity measurement technique based on the tomographic reconstruction of a particle distribution imaged by multiple camera views. In essence, the performance and accuracy of this technique are highly dependent on the parametric adjustment and the reconstruction algorithm used. Although synthetic data have been widely employed to optimize experiments, the resulting reconstructed volumes might not have optimal quality. The purpose of the present study is to offer quality indicators that can be applied to data samples in order to improve the quality of velocity results obtained by the tomo-PIV technique. The proposed methodology can potentially lead to a significant reduction in the time required to optimize a tomo-PIV reconstruction, while also leading to better-quality velocity results. Tomo-PIV data provided by a six-camera turbulent boundary-layer experiment were used to optimize the reconstruction algorithms according to this methodology. Velocity statistics obtained by the optimized BIMART, SMART and MART algorithms were compared with hot-wire anemometer data, and velocity measurement uncertainties were computed. Results indicated that the BIMART and SMART algorithms produced reconstructed volumes of quality equivalent to the standard MART, with the benefit of reduced computational time. (paper)
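
    For reference, the multiplicative MART update that BIMART and SMART modify can be sketched as below; the weighting matrix W, the recorded pixel intensities p, and the relaxation parameter are placeholders rather than a tomo-PIV implementation.

        import numpy as np

        def mart(W, p, n_iters=5, mu=1.0, eps=1e-12):
            """Multiplicative ART: every voxel along the line of sight of pixel i is
            rescaled by (recorded intensity / current reprojection)^(mu * w_ij)."""
            n_pix, n_vox = W.shape
            E = np.ones(n_vox)                       # voxel intensities, uniform start
            for _ in range(n_iters):
                for i in range(n_pix):
                    w = W[i]
                    proj = w @ E + eps               # reprojection of pixel i
                    E *= (p[i] / proj) ** (mu * w)   # multiplicative correction
            return E

    The simultaneous variants apply the corrections from all pixels in a single update per iteration instead of one pixel at a time, which is what makes them attractive for fast, parallel implementations such as those compared above.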

  18. Sound field reconstruction based on the acousto-optic effect

    DEFF Research Database (Denmark)

    Torras Rosell, Antoni; Barrera Figueroa, Salvador; Jacobsen, Finn

    2011-01-01

    be measured with a laser Doppler vibrometer; furthermore, it can be exploited to characterize an arbitrary sound field using tomographic techniques. This paper briefly reviews the fundamental principles governing the acousto-optic effect in air, and presents an investigation of the tomographic reconstruction ... within the audible frequency range by means of simulations and experimental results. The good agreement observed between simulations and measurements is further confirmed with representations of the sound field obtained with traditional microphone array measurements.

  19. Monte Carlo Method with Heuristic Adjustment for Irregularly Shaped Food Product Volume Measurement

    Directory of Open Access Journals (Sweden)

    Joko Siswantoro

    2014-01-01

    Full Text Available Volume measurement plays an important role in the production and processing of food products. Various methods have been proposed to measure the volume of food products with irregular shapes based on 3D reconstruction. However, 3D reconstruction comes with a high computational cost. Furthermore, some of the volume measurement methods based on 3D reconstruction have a low accuracy. Another method for measuring the volume of objects uses the Monte Carlo method, which performs volume measurements using random points. The Monte Carlo method only requires information on whether random points fall inside or outside an object and does not require a 3D reconstruction. This paper proposes volume measurement using a computer vision system for irregularly shaped food products, without 3D reconstruction, based on the Monte Carlo method with heuristic adjustment. Five images of a food product were captured using five cameras and processed to produce binary images. Monte Carlo integration with heuristic adjustment was performed to measure the volume based on the information extracted from the binary images. The experimental results show that the proposed method provided high accuracy and precision compared to the water displacement method. In addition, the proposed method is more accurate and faster than the space carving method.

  20. Monte Carlo method with heuristic adjustment for irregularly shaped food product volume measurement.

    Science.gov (United States)

    Siswantoro, Joko; Prabuwono, Anton Satria; Abdullah, Azizi; Idrus, Bahari

    2014-01-01

    Volume measurement plays an important role in the production and processing of food products. Various methods have been proposed to measure the volume of food products with irregular shapes based on 3D reconstruction. However, 3D reconstruction comes with a high computational cost. Furthermore, some of the volume measurement methods based on 3D reconstruction have a low accuracy. Another method for measuring the volume of objects uses the Monte Carlo method, which performs volume measurements using random points. The Monte Carlo method only requires information on whether random points fall inside or outside an object and does not require a 3D reconstruction. This paper proposes volume measurement using a computer vision system for irregularly shaped food products, without 3D reconstruction, based on the Monte Carlo method with heuristic adjustment. Five images of a food product were captured using five cameras and processed to produce binary images. Monte Carlo integration with heuristic adjustment was performed to measure the volume based on the information extracted from the binary images. The experimental results show that the proposed method provided high accuracy and precision compared to the water displacement method. In addition, the proposed method is more accurate and faster than the space carving method.
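
    The core of the Monte Carlo volume estimate is simple enough to sketch directly: sample random points in a bounding box and count the fraction classified as inside the object. In the toy usage an analytic sphere stands in for the camera-silhouette test described above, and the heuristic adjustment itself is not reproduced.

        import numpy as np

        def monte_carlo_volume(inside_fn, bbox_min, bbox_max, n_samples=200_000, rng=None):
            """Estimate a volume by sampling random points inside a bounding box and
            counting the fraction that the predicate classifies as inside the object."""
            rng = np.random.default_rng(rng)
            lo, hi = np.asarray(bbox_min, float), np.asarray(bbox_max, float)
            pts = lo + (hi - lo) * rng.random((n_samples, 3))
            frac_inside = np.mean(inside_fn(pts))
            return frac_inside * np.prod(hi - lo)

        # illustrative stand-in for "inside all binary camera silhouettes":
        # a sphere of radius 3 cm (true volume = 4/3*pi*27 ~ 113.1 cm^3)
        inside_sphere = lambda pts: (pts ** 2).sum(axis=1) <= 3.0 ** 2
        print(monte_carlo_volume(inside_sphere, [-3, -3, -3], [3, 3, 3]))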

  1. Design and applications of Computed Industrial Tomographic Imaging System (CITIS)

    International Nuclear Information System (INIS)

    Ramakrishna, G.S.; Umesh Kumar; Datta, S.S.; Rao, S.M.

    1996-01-01

    Computed tomographic imaging is an advanced technique for nondestructive testing (NDT) and examination. For the first time in India, a computer-aided tomography system has been indigenously developed at BARC for testing industrial components and was successfully demonstrated. The system, in addition to computed tomography (CT), can also perform digital radiography (DR), making it a powerful tool for NDT applications. It has wide applications in the nuclear, space and allied fields. The authors have developed a computed industrial tomographic imaging system with a cesium-137 gamma radiation source for the nondestructive examination of engineering and industrial specimens. This presentation highlights the design and development of a prototype system and its software for image reconstruction, simulation and display. The paper also describes results obtained with several test specimens, current developments, and the possibility of using neutrons as well as high-energy X-rays in computed tomography. (author)

  2. GPS tomography: validation of reconstructed 3-D humidity fields with radiosonde profiles

    Directory of Open Access Journals (Sweden)

    M. Shangguan

    2013-09-01

    Full Text Available Water vapor plays an important role in meteorological applications; GeoForschungsZentrum (GFZ) therefore developed a tomographic system to derive 3-D distributions of the tropospheric water vapor above Germany using GPS data from about 300 ground stations. Input data for the tomographic reconstructions are generated by the Earth Parameter and Orbit determination System (EPOS) software of the GFZ, which provides zenith total delay (ZTD), integrated water vapor (IWV) and slant total delay (STD) data operationally with a temporal resolution of 2.5 min (STD) and 15 min (ZTD, IWV). The water vapor distribution in the atmosphere is derived by tomographic reconstruction techniques. The quality of the solution is dependent on many factors such as the spatial coverage of the atmosphere with slant paths, the spatial distribution of their intersections and the accuracy of the input observations. Independent observations are required to validate the tomographic reconstructions and to get precise information on the accuracy of the derived 3-D water vapor fields. To determine the quality of the GPS tomography, more than 8000 vertical water vapor profiles at 13 German radiosonde stations were used for the comparison. The radiosondes were launched twice a day (at 00:00 UTC and 12:00 UTC) in 2007. In this paper, parameters of the entire profiles, such as the wet refractivity and the zenith wet delay, have been compared. Before the validation, the temporal and spatial distribution of the slant paths serving as a basis for the tomographic reconstruction, as well as their angular distribution, were studied. The mean wet refractivity differences between tomography and radiosonde data for all points vary from −1.3 to 0.3, and the root mean square is within the range of 6.5–9. About 32% of the 6803 profiles match well, 23% match badly and 45% are difficult to classify as they match only in parts.

  3. A resolution-enhancing image reconstruction method for few-view differential phase-contrast tomography

    Science.gov (United States)

    Guan, Huifeng; Anastasio, Mark A.

    2017-03-01

    It is well-known that properly designed image reconstruction methods can facilitate reductions in imaging doses and data-acquisition times in tomographic imaging. The ability to do so is particularly important for emerging modalities such as differential X-ray phase-contrast tomography (D-XPCT), which are currently limited by these factors. An important application of D-XPCT is high-resolution imaging of biomedical samples. However, reconstructing high-resolution images from few-view tomographic measurements remains a challenging task. In this work, a two-step sub-space reconstruction strategy is proposed and investigated for use in few-view D-XPCT image reconstruction. It is demonstrated that the resulting iterative algorithm can mitigate the high-frequency information loss caused by data incompleteness and produce images that have better preserved high spatial frequency content than those produced by use of a conventional penalized least squares (PLS) estimator.

  4. Tomographic evaluation of a dual-head positron emission tomography system

    International Nuclear Information System (INIS)

    Efthimiou, N; Maistros, S; Tripolitis, X; Panayiotakis, G; Samartzis, A; Loudos, G

    2011-01-01

    In this paper we present the performance evaluation results, in the planar and tomographic modes, of a low-cost positron emission tomography camera dedicated to small-animal imaging. The system consists of two pixelated Lu2SiO5 crystals, two Hamamatsu H8500 position-sensitive photomultiplier tubes, fast amplification electronics and an FPGA-USB-based read-out system. The parameters that have been studied are (i) saturation as a function of the head distance and photon acceptance angle, (ii) the effect of the number of projections and of half or complete head rotation, (iii) spatial resolution as a function of the head distance, (iv) spatial resolution as a function of acceptance angle, (v) the system's sensitivity as a function of these parameters and (vi) performance in imaging small mice. Image reconstruction has been carried out using open source software developed by our group (QSPECT), which is designed mainly for SPECT imaging. The results indicate that the system has a linear response for activities up to at least 2 MBq, which are typical in small-animal imaging. The best tomographic spatial resolution was measured to be ∼2 mm. The system has been found suitable for imaging of small mice in both the planar and tomographic modes.

  5. Effect of Shot Noise on Simultaneous Sensing in Frequency Division Multiplexed Diffuse Optical Tomographic Imaging Process.

    Science.gov (United States)

    Jang, Hansol; Lim, Gukbin; Hong, Keum-Shik; Cho, Jaedu; Gulsen, Gultekin; Kim, Chang-Seok

    2017-11-28

    Diffuse optical tomography (DOT) has been studied for use in the detection of breast cancer, cerebral oxygenation, and cognitive brain signals. As optical imaging studies have increased significantly, acquiring imaging data in real time has become increasingly important. We have developed frequency-division multiplexing (FDM) DOT systems to analyze their performance with respect to acquisition time and imaging quality, in comparison with the conventional time-division multiplexing (TDM) DOT. A large tomographic area of a cylindrical phantom 60 mm in diameter could be successfully reconstructed using both TDM DOT and FDM DOT systems. In our experiment with 6 source-detector (S-D) pairs, the TDM DOT and FDM DOT systems required 6.18 and 1 s, respectively, to obtain a single tomographic data set. While the absorption coefficient of the reconstruction image was underestimated in the case of the FDM DOT, we experimentally confirmed that the abnormal region can be clearly distinguished from the background phantom using both methods.

  6. Effect of Shot Noise on Simultaneous Sensing in Frequency Division Multiplexed Diffuse Optical Tomographic Imaging Process

    Directory of Open Access Journals (Sweden)

    Hansol Jang

    2017-11-01

    Full Text Available Diffuse optical tomography (DOT) has been studied for use in the detection of breast cancer, cerebral oxygenation, and cognitive brain signals. As optical imaging studies have increased significantly, acquiring imaging data in real time has become increasingly important. We have developed frequency-division multiplexing (FDM) DOT systems to analyze their performance with respect to acquisition time and imaging quality, in comparison with the conventional time-division multiplexing (TDM) DOT. A large tomographic area of a cylindrical phantom 60 mm in diameter could be successfully reconstructed using both TDM DOT and FDM DOT systems. In our experiment with 6 source-detector (S-D) pairs, the TDM DOT and FDM DOT systems required 6.18 and 1 s, respectively, to obtain a single tomographic data set. While the absorption coefficient of the reconstruction image was underestimated in the case of the FDM DOT, we experimentally confirmed that the abnormal region can be clearly distinguished from the background phantom using both methods.

  7. Engineering developments on the UBC-TRIUMF modified PETT VI positron emission tomograph

    International Nuclear Information System (INIS)

    Evans, B.; Harrop, R.; Heywood, D.

    1982-10-01

    A tomograph with the PETT VI geometry has been built with improvements generally applicable to such devices. In addition to improved temperature control, the gantry features commercial CsF detectors with the newer Amperex photomultiplier tubes. Much of the coincidence support circuitry is of an original design, utilizing new 'fast' TTL family devices. A local DEC 11/23 microprocessor provides for routine diagnostic and reliability checking, gantry control, and acquisition of single-detector counting rates. Image reconstruction and display are performed by a medium-size VAX 11/780 computer, operating in a time-sharing environment. Some preliminary performance characteristics of the machine have been measured. The reconstructed in-slice resolution, as well as the reconstructed slice thickness, has been measured as a function of radius for both 'straight' and 'cross' slices.

  8. On a novel low cost high accuracy experimental setup for tomographic particle image velocimetry

    International Nuclear Information System (INIS)

    Discetti, Stefano; Ianiro, Andrea; Astarita, Tommaso; Cardone, Gennaro

    2013-01-01

    This work deals with the critical aspects related to cost reduction of a Tomo PIV setup and to the bias errors introduced in the velocity measurements by the coherent motion of the ghost particles. The proposed solution consists of using two independent imaging systems composed of three (or more) low speed single frame cameras, which can be up to ten times cheaper than double shutter cameras with the same image quality. Each imaging system is used to reconstruct a particle distribution in the same measurement region, relative to the first and the second exposure, respectively. The reconstructed volumes are then interrogated by cross-correlation in order to obtain the measured velocity field, as in the standard tomographic PIV implementation. Moreover, differently from tomographic PIV, the ghost particle distributions of the two exposures are uncorrelated, since their spatial distribution is camera orientation dependent. For this reason, the proposed solution promises more accurate results, without the bias effect of the coherent ghost particles motion. Guidelines for the implementation and the application of the present method are proposed. The performances are assessed with a parametric study on synthetic experiments. The proposed low cost system produces a much lower modulation with respect to an equivalent three-camera system. Furthermore, the potential accuracy improvement using the Motion Tracking Enhanced MART (Novara et al 2010 Meas. Sci. Technol. 21 035401) is much higher than in the case of the standard implementation of tomographic PIV. (paper)

  9. Evaluation of reconstruction techniques in regional cerebral blood flow SPECT using trade-off plots: a Monte Carlo study.

    Science.gov (United States)

    Olsson, Anna; Arlig, Asa; Carlsson, Gudrun Alm; Gustafsson, Agnetha

    2007-09-01

    The image quality of single photon emission computed tomography (SPECT) depends on the reconstruction algorithm used. The purpose of the present study was to evaluate parameters in ordered subset expectation maximization (OSEM) and to compare it systematically with filtered back-projection (FBP) for the reconstruction of regional cerebral blood flow (rCBF) SPECT, incorporating attenuation and scatter correction. The evaluation was based on the trade-off between contrast recovery and statistical noise using different subset sizes, numbers of iterations and filter parameters. Monte Carlo simulated SPECT studies of a digital human brain phantom were used. The contrast recovery was calculated as the measured contrast divided by the true contrast. Statistical noise in the reconstructed images was calculated as the coefficient of variation in pixel values. A constant contrast level was reached above 195 equivalent maximum likelihood expectation maximization iterations. The choice of subset size was not crucial as long as there were at least two projections per subset. The OSEM reconstruction was found to give 5-14% higher contrast recovery than FBP for all clinically relevant noise levels in rCBF SPECT. The Butterworth filter, power 6, achieved the highest stable contrast recovery level at all clinically relevant noise levels. The cut-off frequency should be chosen according to the noise level accepted in the image. Trade-off plots are shown to be a practical way of deciding the number of iterations and subset size for the OSEM reconstruction and can be used for other examination types in nuclear medicine.
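
    The two quantities the trade-off plots are built from can be computed as below; the ROI and background masks, the reconstructed image, and the contrast definition (ROI relative to background) are placeholder assumptions, and the exact ROI definitions of the study are not reproduced.

        import numpy as np

        def contrast_recovery(img, roi_mask, bkg_mask, true_contrast):
            """Measured (ROI vs. background) contrast divided by the true contrast."""
            roi_mean = img[roi_mask].mean()
            bkg_mean = img[bkg_mask].mean()
            measured = (roi_mean - bkg_mean) / bkg_mean
            return measured / true_contrast

        def noise_cov(img, bkg_mask):
            """Statistical noise as the coefficient of variation of pixel values."""
            vals = img[bkg_mask]
            return vals.std() / vals.mean()

    Plotting one (noise_cov, contrast_recovery) point per reconstruction setting (subset size, iteration number, filter) then yields trade-off curves of the kind used above to choose the OSEM parameters.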

  10. Improved quantitative 90Y bremsstrahlung SPECT/CT reconstruction with Monte Carlo scatter modeling.

    Science.gov (United States)

    Dewaraja, Yuni K; Chun, Se Young; Srinivasa, Ravi N; Kaza, Ravi K; Cuneo, Kyle C; Majdalany, Bill S; Novelli, Paula M; Ljungberg, Michael; Fessler, Jeffrey A

    2017-12-01

    In 90Y microsphere radioembolization (RE), accurate post-therapy imaging-based dosimetry is important for establishing absorbed dose versus outcome relationships for developing future treatment planning strategies. Additionally, accurately assessing microsphere distributions is important because of concerns for unexpected activity deposition outside the liver. Quantitative 90Y imaging by either SPECT or PET is challenging. In 90Y SPECT, model-based methods are necessary for scatter correction because energy-window-based methods are not feasible with the continuous bremsstrahlung energy spectrum. The objective of this work was to implement and evaluate a scatter estimation method for accurate 90Y bremsstrahlung SPECT/CT imaging. Since a fully Monte Carlo (MC) approach to 90Y SPECT reconstruction is computationally very demanding, in the present study the scatter estimate generated by a MC simulator was combined with an analytical projector in the 3D OS-EM reconstruction model. A single window (105–195 keV) was used for both the acquisition and the projector modeling. A liver/lung torso phantom with intrahepatic lesions and low-uptake extrahepatic objects was imaged to evaluate SPECT/CT reconstruction without and with scatter correction. Clinical application was demonstrated by applying the reconstruction approach to five patients treated with RE to determine lesion and normal liver activity concentrations using a (liver) relative calibration. There was convergence of the scatter estimate after just two updates, greatly reducing computational requirements. In the phantom study, compared with reconstruction without scatter correction, with MC scatter modeling there was substantial improvement in activity recovery in intrahepatic lesions (from > 55% to > 86%), normal liver (from 113% to 104%), and lungs (from 227% to 104%), with only a small degradation in noise (13% vs. 17%). Similarly, with scatter modeling contrast improved substantially both visually and in

  11. An efficient simultaneous reconstruction technique for tomographic particle image velocimetry

    Science.gov (United States)

    Atkinson, Callum; Soria, Julio

    2009-10-01

    To date, Tomo-PIV has involved the use of the multiplicative algebraic reconstruction technique (MART), where the intensity of each 3D voxel is iteratively corrected to satisfy one recorded projection, or pixel intensity, at a time. This results in reconstruction times of multiple hours for each velocity field and requires considerable computer memory in order to store the associated weighting coefficients and intensity values for each point in the volume. In this paper, a rapid and less memory intensive reconstruction algorithm is presented based on a multiplicative line-of-sight (MLOS) estimation that determines possible particle locations in the volume, followed by simultaneous iterative correction. Reconstructions of simulated images are presented for two simultaneous algorithms (SART and SMART) as well as the now standard MART algorithm, which indicate that the same accuracy as MART can be achieved 5.5 times faster, or 77 times faster with 15 times less memory if the processing and storage of the weighting matrix is considered. Application of MLOS-SMART and MART to a turbulent boundary layer at Reθ = 2200 using a four-camera Tomo-PIV system with a volume of 1,000 × 1,000 × 160 voxels is discussed. Results indicate improvements in reconstruction speed of 15 times that of MART with a precalculated weighting matrix, or 65 times if calculation of the weighting matrix is considered. Furthermore, the memory needed to store a large weighting matrix and volume intensity is reduced by almost 40 times in this case.
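
    The MLOS first guess itself is compact: a voxel is retained only if every camera back-projects non-zero intensity onto it. A minimal sketch follows, assuming the per-camera back-projected volumes have already been computed elsewhere; it is an illustration of the idea rather than the authors' code.

        import numpy as np

        def mlos_first_guess(backprojected_volumes):
            """Multiplicative line-of-sight estimate: the voxel-wise geometric mean of
            the intensities back-projected from each camera. A voxel keeps a non-zero
            value only if it receives light in every view; the result then restricts
            and seeds the subsequent simultaneous (SMART) iterations."""
            est = np.ones_like(backprojected_volumes[0], dtype=float)
            for bp in backprojected_volumes:
                est *= bp
            return est ** (1.0 / len(backprojected_volumes))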

  12. Use of regularized algebraic methods in tomographic reconstruction

    International Nuclear Information System (INIS)

    Koulibaly, P.M.; Darcourt, J.; Blanc-Ferraud, L.; Migneco, O.; Barlaud, M.

    1997-01-01

    Algebraic methods are used in emission tomography to facilitate the compensation of attenuation and of Compton scattering. We have tested on a phantom the use of regularization (a priori introduction of information), as well as the taking into account of the spatial resolution variation with depth (SRVD). Hence, we have compared the performances of two back-projection filtering (BPF) methods and of two algebraic methods (AM) in terms of FWHM (by means of a point source), of the reduction of background noise (σ/m) on the homogeneous part of Jaszczak's phantom, and of reconstruction speed (time unit = BPF). The BPF methods make use of a grade filter (maximal resolution, no noise treatment), alone or combined with a Hann low-pass filter (fc = 0.4), as well as of an attenuation correction. The AM, which embody attenuation and scattering corrections, are, on the one hand, OS EM (Ordered Subsets, partitioning and rearranging of the projection matrix; Expectation Maximization) without regularization or SRVD correction, and, on the other hand, OS MAP EM (Maximum A Posteriori), regularized and embodying the SRVD correction. A table is given containing, for each method used (grade, Hann, OS EM and OS MAP EM), the values of FWHM, σ/m and time, respectively. One can observe that the OS MAP EM algebraic method improves both the resolution, by taking the SRVD into account in the reconstruction process, and the noise, through regularization. In addition, owing to the OS technique, the reconstruction times are acceptable.

  13. Validation of Spherically Symmetric Inversion by Use of a Tomographically Reconstructed Three-Dimensional Electron Density of the Solar Corona

    Science.gov (United States)

    Wang, Tongjiang; Davila, Joseph M.

    2014-01-01

    Determining the coronal electron density by the inversion of white-light polarized brightness (pB) measurements by coronagraphs is a classic problem in solar physics. An inversion technique based on spherically symmetric geometry (spherically symmetric inversion, SSI) was developed in the 1950s and has been widely applied to interpret various observations. However, to date there is no study of the uncertainty estimation of this method. Here we present a detailed assessment of this method using as a model a three-dimensional (3D) electron density in the corona from 1.5 to 4 solar radii, reconstructed by a tomography method from STEREO/COR1 observations during the solar minimum in February 2008 (Carrington Rotation, CR 2066). We first show in theory and observation that the spherically symmetric polynomial approximation (SSPA) method and the Van de Hulst inversion technique are equivalent. Then we assess the SSPA method using synthesized pB images from the 3D density model, and find that the SSPA density values are close to the model inputs for the streamer core near the plane of the sky (POS), with differences generally smaller than about a factor of two; the former has a lower peak but extends more in both the longitudinal and latitudinal directions than the latter. We estimate that the SSPA method may resolve the coronal density structure near the POS with an angular resolution in longitude of about 50 deg. Our results confirm the suggestion that the SSI method is applicable to the solar minimum streamer (belt), as stated in some previous studies. In addition, we demonstrate that the SSPA method can be used to reconstruct the 3D coronal density, roughly in agreement with the reconstruction by tomography for a period of low solar activity (CR 2066). We suggest that the SSI method is complementary to the 3D tomographic technique in some cases, given that the development of the latter is still an ongoing research effort.

  14. Ghost hunting—an assessment of ghost particle detection and removal methods for tomographic-PIV

    International Nuclear Information System (INIS)

    Elsinga, G E; Tokgoz, S

    2014-01-01

    This paper discusses and compares several methods, which aim to remove spurious peaks, i.e. ghost particles, from the volume intensity reconstruction in tomographic-PIV. The assessment is based on numerical simulations of time-resolved tomographic-PIV experiments in linear shear flows. Within the reconstructed volumes, intensity peaks are detected and tracked over time. These peaks are associated with particles (either ghosts or actual particles) and are characterized by their peak intensity, size and track length. Peak intensity and track length are found to be effective in discriminating between most ghosts and the actual particles, although not all ghosts can be detected using only a single threshold. The size of the reconstructed particles does not reveal an important difference between ghosts and actual particles. The joint distribution of peak intensity and track length however does, under certain conditions, allow a complete separation of ghosts and actual particles. The ghosts can have either a high intensity or a long track length, but not both combined, like all the actual particles. Removing the detected ghosts from the reconstructed volume and performing additional MART iterations can decrease the particle position error at low to moderate seeding densities, but increases the position error, velocity error and tracking errors at higher densities. The observed trends in the joint distribution of peak intensity and track length are confirmed by results from a real experiment in laminar Taylor–Couette flow. This diagnostic plot allows an estimate of the number of ghosts that are indistinguishable from the actual particles. (paper)
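
    The resulting diagnostic can be applied as a simple joint threshold: a peak is accepted as an actual particle only if it is both bright enough and tracked long enough. The sketch below assumes the peak properties have already been extracted from the tracked reconstructions; the threshold values are illustrative and would be read off the joint distribution described above.

        import numpy as np

        def flag_ghosts(peak_intensity, track_length, intensity_thresh, length_thresh):
            """Flag reconstructed peaks as ghosts unless they are both bright enough
            and tracked long enough (actual particles satisfy both criteria)."""
            peak_intensity = np.asarray(peak_intensity, dtype=float)
            track_length = np.asarray(track_length)
            is_particle = (peak_intensity >= intensity_thresh) & (track_length >= length_thresh)
            return ~is_particle

        # e.g. ghosts = flag_ghosts(I_peak, L_track, intensity_thresh=0.3, length_thresh=5)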

  15. Data acquisition system for a positron tomograph using time-of-flight information

    International Nuclear Information System (INIS)

    Bertin, Francois.

    1981-12-01

    Progress in nuclear instrumentation has led to the development of scintillators much faster than the NaI crystal traditionally used in nuclear medicine. As a result it is now possible to measure time-of-flight, i.e. the time between the arrival of two γ rays emitted in coincidence on two detectors. With this extra information the β⁺ annihilation site may be located. The introduction of time-of-flight in tomographic techniques called for research along two lines: - ''theoretical'' research leading to the creation of a new image reconstruction algorithm taking into account time-of-flight information - applied research leading to the development of an efficient measurement line and sophisticated data acquisition and processing electronics. This research has been carried out at LETI and is briefly outlined in chapter I. Chapter II shows how the introduction of time-of-flight and the modification of the reconstruction algorithm complicate the electronics and computing equipment of the tomograph. Several acquisition and processing strategies are proposed, then the need to use an intermediate mass storage and hence to design a complex acquisition operator is demonstrated. Chapter III examines the structure of the acquisition operator and the resulting block diagram is presented in detail in chapter IV [fr

  16. Iterative reconstruction of magnetic induction using Lorentz transmission electron tomography

    International Nuclear Information System (INIS)

    Phatak, C.; Gürsoy, D.

    2015-01-01

    Intense ongoing research on complex nanomagnetic structures requires a fundamental understanding of the 3D magnetization and the stray fields around the nano-objects. 3D visualization of such fields offers the best way to achieve this. Lorentz transmission electron microscopy provides a suitable combination of high resolution and ability to quantitatively visualize the magnetization vectors using phase retrieval methods. In this paper, we present a formalism to represent the magnetic phase shift of electrons as a Radon transform of the magnetic induction of the sample. Using this formalism, we then present the application of common tomographic methods particularly the iterative methods, to reconstruct the 3D components of the vector field. We present an analysis of the effect of missing wedge and the limited angular sampling as well as reconstruction of complex 3D magnetization in a nanowire using simulations. - Highlights: • We present a formalism to represent electron-optical magnetic phase shift as a Radon transform of the 3D magnetic induction of the nano-object. • We have analyzed four different tomographic reconstruction methods for vectorial data reconstruction. • Reconstruction methods were tested for varying experimental limitations such as limited tilt range and limited angular sampling. • The analysis showed that Gridrec and SIRT methods performed better with lower errors than other reconstruction methods

  17. Advanced reconstruction algorithms for electron tomography: From comparison to combination

    Energy Technology Data Exchange (ETDEWEB)

    Goris, B. [EMAT, University of Antwerp, Groenenborgerlaan 171, B-2020 Antwerp (Belgium); Roelandts, T. [Vision Lab, University of Antwerp, Universiteitsplein 1, B-2610 Wilrijk (Belgium); Batenburg, K.J. [Vision Lab, University of Antwerp, Universiteitsplein 1, B-2610 Wilrijk (Belgium); Centrum Wiskunde and Informatica, Science Park 123, NL-1098XG Amsterdam (Netherlands); Heidari Mezerji, H. [EMAT, University of Antwerp, Groenenborgerlaan 171, B-2020 Antwerp (Belgium); Bals, S., E-mail: sara.bals@ua.ac.be [EMAT, University of Antwerp, Groenenborgerlaan 171, B-2020 Antwerp (Belgium)

    2013-04-15

    In this work, the simultaneous iterative reconstruction technique (SIRT), the total variation minimization (TVM) reconstruction technique and the discrete algebraic reconstruction technique (DART) for electron tomography are compared and the advantages and disadvantages are discussed. Furthermore, we describe how the result of a three dimensional (3D) reconstruction based on TVM can provide objective information that is needed as the input for a DART reconstruction. This approach results in a tomographic reconstruction of which the segmentation is carried out in an objective manner. - Highlights: ► A comparative study between different reconstruction algorithms for tomography is performed. ► Reconstruction algorithms that use prior knowledge about the specimen give superior results. ► One reconstruction algorithm can provide the prior knowledge for a second algorithm.
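
    For reference, a minimal SIRT iteration of the kind compared above might look as follows; the toy system matrix, iteration count and non-negativity constraint (standing in for prior knowledge) are assumptions.

```python
import numpy as np

# Minimal SIRT sketch for a linear tomography model p = A @ x, with A an
# assumed (toy) projection matrix. Row/column sums provide the usual SIRT
# normalisation; a non-negativity clamp mimics simple prior knowledge.

def sirt(A, p, n_iter=2000):
    x = np.zeros(A.shape[1])
    row_sum = A.sum(axis=1); row_sum[row_sum == 0] = 1.0
    col_sum = A.sum(axis=0); col_sum[col_sum == 0] = 1.0
    for _ in range(n_iter):
        residual = (p - A @ x) / row_sum          # normalise per projection ray
        x += (A.T @ residual) / col_sum           # distribute back over voxels
        np.maximum(x, 0.0, out=x)                 # simple non-negativity prior
    return x

rng = np.random.default_rng(0)
A = rng.random((60, 40))
x_true = rng.random(40)
x_rec = sirt(A, A @ x_true)
print(np.linalg.norm(A @ x_rec - A @ x_true))     # small for a consistent system
```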

  18. Synchronized multiartifact reduction with tomographic reconstruction (SMART-RECON): A statistical model based iterative image reconstruction method to eliminate limited-view artifacts and to mitigate the temporal-average artifacts in time-resolved CT.

    Science.gov (United States)

    Chen, Guang-Hong; Li, Yinsheng

    2015-08-01

    In x-ray computed tomography (CT), a violation of the Tuy data sufficiency condition leads to limited-view artifacts. In some applications, it is desirable to use data corresponding to a narrow temporal window to reconstruct images with reduced temporal-average artifacts. However, the need to reduce temporal-average artifacts in practice may result in a violation of the Tuy condition and thus undesirable limited-view artifacts. In this paper, the authors present a new iterative reconstruction method, synchronized multiartifact reduction with tomographic reconstruction (SMART-RECON), to eliminate limited-view artifacts using data acquired within an ultranarrow temporal window that severely violates the Tuy condition. In time-resolved contrast-enhanced CT acquisitions, image contrast dynamically changes during data acquisition. Each image reconstructed from data acquired in a given temporal window represents one time frame and can be denoted as an image vector. Conventionally, each individual time frame is reconstructed independently. In this paper, all image frames are grouped into a spatial-temporal image matrix and are reconstructed together. Rather than the spatial and/or temporal smoothing regularizers commonly used in iterative image reconstruction, the nuclear norm of the spatial-temporal image matrix is used in SMART-RECON to regularize the reconstruction of all image time frames. This regularizer exploits the low-dimensional structure of the spatial-temporal image matrix to mitigate limited-view artifacts when an ultranarrow temporal window is desired in some applications to reduce temporal-average artifacts. Both numerical simulations in two-dimensional image slices with known ground truth and in vivo human subject data acquired in a contrast-enhanced cone beam CT exam have been used to validate the proposed SMART-RECON algorithm and to demonstrate the initial performance of the algorithm. Reconstruction errors and temporal fidelity of the reconstructed
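
    The nuclear-norm regularization can be illustrated by its proximal step, singular value soft thresholding, applied to a spatial-temporal matrix whose columns are vectorized time frames; the data, the threshold and the absence of the data-fidelity update are simplifications, not the SMART-RECON algorithm itself.

```python
import numpy as np

# Singular value soft thresholding: the proximal operator of the nuclear norm,
# which rank-regularises a spatial-temporal image matrix. A full solver would
# alternate this step with a data-fidelity (projection-consistency) update.

def svt(X, tau):
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    return U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt

rng = np.random.default_rng(1)
frames = rng.random((4096, 3)) @ rng.random((3, 20))   # rank-3 "dynamic" object, 20 frames
frames += 0.01 * rng.random((4096, 20))                # small noise raises the rank to 20
low_rank = svt(frames, tau=5.0)
print(np.linalg.matrix_rank(low_rank))                 # thresholding recovers rank 3
```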

  19. Mobile 3D tomograph

    International Nuclear Information System (INIS)

    Illerhaus, Bernhard; Goebbels, Juergen; Onel, Yener; Sauerwein, Christoph

    2008-01-01

    Mobile tomographs often have the problem that high spatial resolution is impossible owing to the position or setup of the tomograph. While the tree tomograph developed by Messrs. Isotopenforschung Dr. Sauerwein GmbH worked well in practice, it is no longer used as the spatial resolution and measuring time are insufficient for many modern applications. The paper shows that the mechanical base of the method is sufficient for 3D CT measurements with modern detectors and X-ray tubes. CT measurements with very good statistics take less than 10 min. This means that mobile systems can be used, e.g. in examinations of non-transportable cultural objects or monuments. Enhancement of the spatial resolution of mobile tomographs capable of measuring in any position is made difficult by the fact that the tomograph has moving parts and will therefore have weight shifts. With the aid of tomographies whose spatial resolution is far higher than the mechanical accuracy, a correction method is presented for direct integration of the Feldkamp algorithm [de

  20. A review of US anthropometric reference data (1971-2000) with comparisons to both stylized and tomographic anatomic models

    International Nuclear Information System (INIS)

    Huh, C; Bolch, W E

    2003-01-01

    Two classes of anatomic models currently exist for use in both radiation protection and radiation dose reconstruction: stylized mathematical models and tomographic voxel models. The former utilize 3D surface equations to represent internal organ structure and external body shape, while the latter are based on segmented CT or MR images of a single individual. While tomographic models are clearly more anthropomorphic than stylized models, a given model's characterization as being anthropometric is dependent upon the reference human to which the model is compared. In the present study, data on total body mass, standing/sitting heights and body mass index are collected and reviewed for the US population covering the time interval from 1971 to 2000. These same anthropometric parameters are then assembled for the ORNL series of stylized models, the GSF series of tomographic models (Golem, Helga, Donna, etc), the adult male Zubal tomographic model and the UF newborn tomographic model. The stylized ORNL models of the adult male and female are found to be fairly representative of present-day average US males and females, respectively, in terms of both standing and sitting heights for ages between 20 and 60-80 years. While the ORNL adult male model provides a reasonably close match to the total body mass of the average US 21-year-old male (within ∼5%), present-day 40-year-old males have an average total body mass that is ∼16% higher. For radiation protection purposes, the use of the larger 73.7 kg adult ORNL stylized hermaphrodite model provides a much closer representation of average present-day US females at ages ranging from 20 to 70 years. In terms of the adult tomographic models from the GSF series, only Donna (40-year-old F) closely matches her age-matched US counterpart in terms of average body mass. Regarding standing heights, the better matches to US age-correlated averages belong to Irene (32-year-old F) for the females and Golem (38-year-old M) for the males

  1. High-throughput full-automatic synchrotron-based tomographic microscopy

    International Nuclear Information System (INIS)

    Mader, Kevin; Marone, Federica; Hintermueller, Christoph; Mikuljan, Gordan; Isenegger, Andreas; Stampanoni, Marco

    2011-01-01

    At the TOMCAT (TOmographic Microscopy and Coherent rAdiology experimenTs) beamline of the Swiss Light Source with an energy range of 8-45 keV and voxel size from 0.37 μm to 7.4 μm, full tomographic datasets are typically acquired in 5 to 10 min. To exploit the speed of the system and enable high-throughput studies to be performed in a fully automatic manner, a package of automation tools has been developed. The samples are automatically exchanged, aligned, moved to the correct region of interest, and scanned. This task is accomplished through the coordination of Python scripts, a robot-based sample-exchange system, sample positioning motors and a CCD camera. The tools are suited for any samples that can be mounted on a standard SEM stub, and require no specific environmental conditions. Up to 60 samples can be analyzed at a time without user intervention. The throughput of the system is dependent on resolution, energy and sample size, but rates of four samples per hour have been achieved with 0.74 μm voxel size at 17.5 keV. The maximum intervention-free scanning time is theoretically unlimited, and in practice experiments have been running unattended as long as 53 h (the average beam time allocation at TOMCAT is 48 h per user). The system is the first fully automated high-throughput tomography station: mounting samples, finding regions of interest, scanning and reconstructing can be performed without user intervention. The system also includes many features which accelerate and simplify the process of tomographic microscopy.

  2. Reconstruction of emission coefficients for a non-axisymmetric coupling arc by algebraic reconstruction technique

    International Nuclear Information System (INIS)

    Zhang Guangjun; Xiong Jun; Gao Hongming; Wu Lin

    2011-01-01

    A preliminary investigation of tomographic reconstruction of an asymmetric arc plasma has been carried out. The objective of this work is to reconstruct emission coefficients of a non-axisymmetric coupling arc from measured intensities by means of an algebraic reconstruction technique (ART). In order to define the optimal experimental scheme for good quality with limited views, the dependence of the reconstruction quality on three configurations (four, eight and ten projection angles) is presented and discussed via a displaced Gaussian model. Then, the emission coefficients of a free burning arc are reconstructed by the ART with the ten-view configuration and an Abel inversion, respectively, and good agreement is obtained. Finally, the emission coefficient profiles of the coupling arc are successfully achieved with the ten-view configuration. The results show that the distribution of the emission coefficient for the coupling arc differs from a centrosymmetric shape. The ART is perfectly suitable for reconstructing emission coefficients of the coupling arc with the ten-view configuration, proving the feasibility and utility of the ART for characterizing an asymmetric arc.
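
    A minimal ART (Kaczmarz-type) update of the kind used above is sketched below; the dense random system matrix stands in for the actual projection geometry and the relaxation factor is an assumption.

```python
import numpy as np

# Minimal ART sketch: each measured intensity is treated as one ray sum
# p_i = a_i . x, and the current estimate is corrected ray by ray.

def art(A, p, n_sweeps=50, relax=1.0):
    x = np.zeros(A.shape[1])
    for _ in range(n_sweeps):
        for a_i, p_i in zip(A, p):
            norm2 = a_i @ a_i
            if norm2 > 0:
                x += relax * (p_i - a_i @ x) / norm2 * a_i
    return x

rng = np.random.default_rng(2)
A = rng.random((80, 64))                  # toy stand-in for the view geometry
x_true = rng.random(64)
x_rec = art(A, A @ x_true)
print(np.linalg.norm(x_rec - x_true))     # approaches zero for a consistent system
```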

  3. AOF LTAO mode: reconstruction strategy and first test results

    Science.gov (United States)

    Oberti, Sylvain; Kolb, Johann; Le Louarn, Miska; La Penna, Paolo; Madec, Pierre-Yves; Neichel, Benoit; Sauvage, Jean-François; Fusco, Thierry; Donaldson, Robert; Soenke, Christian; Suárez Valles, Marcos; Arsenault, Robin

    2016-07-01

    GALACSI is the Adaptive Optics (AO) system serving the instrument MUSE in the framework of the Adaptive Optics Facility (AOF) project. Its Narrow Field Mode (NFM) is a Laser Tomography AO (LTAO) mode delivering high resolution in the visible across a small Field of View (FoV) of 7.5" diameter around the optical axis. From a reconstruction standpoint, GALACSI NFM intends to optimize the correction on axis by estimating the turbulence in volume via a tomographic process, then projecting the turbulence profile onto one single Deformable Mirror (DM) located in the pupil, close to the ground. In this paper, the laser tomographic reconstruction process is described. Several methods (virtual DM, virtual layer projection) are studied, under the constraint of a single matrix vector multiplication. The pseudo-synthetic interaction matrix model and the LTAO reconstructor design are analysed. Moreover, the reconstruction parameter space is explored, in particular the regularization terms. Furthermore, we present here the strategy to define the modal control basis and split the reconstruction between the Low Order (LO) loop and the High Order (HO) loop. Finally, closed loop performance obtained with a 3D turbulence generator will be analysed with respect to the most relevant system parameters to be tuned.

  4. Tomographic image reconstruction and rendering with texture-mapping hardware

    International Nuclear Information System (INIS)

    Azevedo, S.G.; Cabral, B.K.; Foran, J.

    1994-07-01

    The image reconstruction problem, also known as the inverse Radon transform, for x-ray computed tomography (CT) is found in numerous applications in medicine and industry. The most common algorithm used in these cases is filtered backprojection (FBP), which, while a simple procedure, is time-consuming for large images on any type of computational engine. Specially-designed, dedicated parallel processors are commonly used in medical CT scanners, whose results are then passed to a graphics workstation for rendering and analysis. However, a fast direct FBP algorithm can be implemented on modern texture-mapping hardware in current high-end workstation platforms. This is done by casting the FBP algorithm as an image warping operation with summing. Texture-mapping hardware, such as that on the Silicon Graphics Reality Engine (TM), shows around 600 times speedup of backprojection over a CPU-based implementation (a 100 MHz R4400 in this case). This technique has the further advantages of flexibility and rapid programming. In addition, the same hardware can be used for both image reconstruction and for volumetric rendering. The techniques can also be used to accelerate iterative reconstruction algorithms. The hardware architecture also allows more complex operations than straight-ray backprojection if they are required, including fan-beam, cone-beam, and curved ray paths, with little or no speed penalties
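
    The casting of filtered backprojection as "image warping with summing" can be sketched on the CPU as a rotate-and-accumulate loop; the rotation routine and the assumption that the sinogram has already been ramp-filtered are simplifications of the hardware-accelerated version described above.

```python
import numpy as np
from scipy.ndimage import rotate

# Backprojection as warp-and-sum: each (assumed already ramp-filtered)
# projection is smeared along its ray direction, warped to its view angle,
# and accumulated into the image.

def backproject(filtered_sinogram, angles_deg, size):
    image = np.zeros((size, size))
    for proj, theta in zip(filtered_sinogram, angles_deg):
        smear = np.tile(proj, (size, 1))                  # constant along rays
        image += rotate(smear, theta, reshape=False, order=1)
    return image * np.pi / (2 * len(angles_deg))

angles = np.linspace(0.0, 180.0, 60, endpoint=False)
sino = np.random.default_rng(3).random((60, 128))         # stand-in filtered sinogram
print(backproject(sino, angles, 128).shape)                # (128, 128)
```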

  5. MO-FG-202-08: Real-Time Monte Carlo-Based Treatment Dose Reconstruction and Monitoring for Radiotherapy

    Energy Technology Data Exchange (ETDEWEB)

    Tian, Z; Shi, F; Gu, X; Tan, J; Hassan-Rezaeian, N; Jiang, S; Jia, X [UT Southwestern Medical Center, Dallas, TX (United States); Graves, Y [University of California, San Diego, La Jolla, CA (United States)

    2016-06-15

    Purpose: This proof-of-concept study is to develop a real-time Monte Carlo (MC) based treatment-dose reconstruction and monitoring system for radiotherapy, especially for treatments with complicated delivery, to catch treatment delivery errors at the earliest possible opportunity and interrupt the treatment only when an unacceptable dosimetric deviation from our expectation occurs. Methods: First an offline scheme is launched to pre-calculate the expected dose from the treatment plan, used as ground truth for real-time monitoring later. Then an online scheme with three concurrent threads is launched during treatment delivery, to reconstruct and monitor the patient dose in a temporally resolved fashion in real time. Thread T1 acquires machine status every 20 ms to calculate and accumulate a fluence map (FM). Once our accumulation threshold is reached, T1 transfers the FM to T2 for dose reconstruction and starts to accumulate a new FM. A GPU-based MC dose calculation is performed on T2 when the MC dose engine is ready and a new FM is available. The reconstructed instantaneous dose is directed to T3 for dose accumulation and real-time visualization. Multiple dose metrics (e.g. maximum and mean dose for targets and organs) are calculated from the current accumulated dose and compared with the pre-calculated expected values. Once the discrepancies go beyond our tolerance, an error message will be sent to interrupt the treatment delivery. Results: A VMAT head-and-neck patient case was used to test the performance of our system. Real-time machine status acquisition was simulated here. The differences between the actual dose metrics and the expected ones were 0.06%–0.36%, indicating an accurate delivery. A ∼10 Hz frequency of dose reconstruction and monitoring was achieved, with 287.94 s of online computation time compared to 287.84 s of treatment delivery time. Conclusion: Our study has demonstrated the feasibility of computing a dose distribution in a temporally resolved fashion
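
    The three-thread layout can be sketched with ordinary queues; the mock fluence/dose functions, batch size and tolerance below are illustrative assumptions rather than the authors' implementation.

```python
import queue, threading, time

# T1 accumulates a fluence map from machine status, T2 runs the (here mocked)
# Monte Carlo dose calculation, T3 accumulates dose and checks metrics.

fm_queue, dose_queue = queue.Queue(), queue.Queue()

def t1_acquire(n_samples=50, batch=10):
    fm = 0.0
    for _ in range(n_samples):
        fm += 1.0                      # mock "machine status" contribution
        time.sleep(0.001)              # stands in for the 20 ms polling period
        if fm >= batch:
            fm_queue.put(fm); fm = 0.0
    fm_queue.put(None)                 # sentinel: delivery finished

def t2_dose():
    while (fm := fm_queue.get()) is not None:
        dose_queue.put(fm * 0.01)      # mock MC dose for this fluence batch
    dose_queue.put(None)

def t3_monitor(expected=0.5, tolerance=0.2):
    total = 0.0
    while (d := dose_queue.get()) is not None:
        total += d
        if total > expected * (1 + tolerance):
            print("interrupt: dose exceeds expectation"); return
    print("delivery finished, accumulated dose:", total)

threads = [threading.Thread(target=f) for f in (t1_acquire, t2_dose, t3_monitor)]
for t in threads:
    t.start()
for t in threads:
    t.join()
```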

  6. Monte Carlo simulation in nuclear medicine

    International Nuclear Information System (INIS)

    Morel, Ch.

    2007-01-01

    The Monte Carlo method allows random processes to be simulated by using series of pseudo-random numbers. It has become an important tool in nuclear medicine to assist in the design of new medical imaging devices, optimise their use and analyse their data. Presently, the sophistication of the simulation tools allows the introduction of Monte Carlo predictions into data correction and image reconstruction processes. The ability to simulate time-dependent processes opens up new horizons for Monte Carlo simulation in nuclear medicine. In the near future, these developments will allow imaging and dosimetry issues to be tackled simultaneously and, soon, case-by-case Monte Carlo simulations may become part of the nuclear medicine diagnostic process. This paper describes some Monte Carlo method basics and the sampling methods that were developed for it. It gives a referenced list of different simulation software packages used in nuclear medicine and enumerates some of their present and prospective applications. (author)

  7. Analysis of stability of tomographic reconstruction of x-ray medical images

    Directory of Open Access Journals (Sweden)

    Л. А. Булавін

    2017-09-01

    Full Text Available Slice reconstruction in X-ray computed tomography is reduced to the solution of integral equations, or a system of algebraic equations in the discrete case. It is considered to be an ill-posed problem due to the inconsistencies in the number of equations and variables and due to errors in the experimental data. Therefore, determination of the best method of slice reconstruction is of great interest. Furthermore, all available methods give approximate results. The aim of this article was two-fold: (i) to compare two methods of image reconstruction, viz. inverse projection and the variational method, using a numerical experiment; (ii) to obtain the relationship between image accuracy and experimental error. It appeared that the image obtained by inverse projection is unstable: there was no convergence of the approximate image to the accurate one as the experimental error approached zero. In turn, the image obtained by the variational method was accurate at zero experimental error. Finally, the latter showed better slice reconstruction, despite the low number of projections and the experimental errors.

  8. Optical Computed-Tomographic Microscope for Three-Dimensional Quantitative Histology

    Directory of Open Access Journals (Sweden)

    Ravil Chamgoulov

    2004-01-01

    Full Text Available A novel optical computed‐tomographic microscope has been developed allowing quantitative three‐dimensional (3D) imaging and analysis of fixed pathological material. Rather than a conventional two‐dimensional (2D) image, the instrument produces a 3D representation of fixed absorption‐stained material, from which quantitative histopathological features can be measured more accurately. The accurate quantification of these features is critically important in disease diagnosis and the clinical classification of cancer. The system consists of two high NA objective lenses, a light source, a digital spatial light modulator (DMD, by Texas Instruments), an x–y stage, and a CCD detector. The DMD, positioned at the back pupil‐plane of the illumination objective, is employed to illuminate the specimen with parallel rays at any desired angle. The system uses a modification of the convolution backprojection algorithm for reconstruction. In contrast to fluorescent images acquired by a confocal microscope, this instrument produces 3D images of absorption stained material. Microscopic 3D volume reconstructions of absorption‐stained cells have been demonstrated. Reconstructed 3D images of individual cells and tissue can be cut virtually with the distance between the axial slices less than 0.5 μm.

  9. Monte Carlo simulation of gamma ray tomography for image reconstruction

    Energy Technology Data Exchange (ETDEWEB)

    Guedes, Karlos A.N.; Moura, Alex; Dantas, Carlos; Melo, Silvio; Lima, Emerson, E-mail: karlosguedes@hotmail.com [Universidade Federal de Pernambuco (UFPE), Recife, PE (Brazil); Meric, Ilker [University of Bergen (Norway)

    2015-07-01

    Monte Carlo simulations of an object of known density and shape were validated against gamma-ray tomography in static experiments. An aluminum half-moon piece placed inside a steel pipe was the MC simulation test object, which was also measured by means of gamma-ray transmission. The wall effect of the steel pipe due to the irradiation geometry in a single source-detector-pair tomography was evaluated by comparison with theoretical data. The MCNPX code requires a defined geometry for each photon trajectory, which practically prevents its direct use for tomographic reconstruction simulation. The solution was to write a program in the Delphi language to automate the creation of input files. Simulations of tomography data with the automated MCNPX code were carried out and validated by experimental data. The data produced in this sequence were stored in a databank. The experimental setup used a Cesium-137 isotopic radioactive source (7.4 × 10⁹ Bq) and a NaI(Tl) scintillation detector with a (51 × 51) × 10⁻³ m crystal size coupled to a multichannel analyzer, together with a stainless steel tube of 0.154 m internal diameter and 0.014 m wall thickness. The results show that the MCNPX simulation code adapted to automated input files is useful for generating the data matrix M(θ,t) of a computerized gamma-ray tomography for any object of known density and regular shape. Experimental validation used the RMSE from gamma-ray paths and from attenuation coefficient data. (author)

  10. Line-scanning tomographic optical microscope with isotropic transfer function

    International Nuclear Information System (INIS)

    Gajdátsy, Gábor; Dudás, László; Erdélyi, Miklós; Szabó, Gábor

    2010-01-01

    An imaging method and optical system, referred to as a line-scanning tomographic optical microscope (LSTOM) using a combination of line-scanning technique and CT reconstruction principle, is proposed and studied theoretically and experimentally. In our implementation a narrow focus line is scanned over the sample and the reflected light is measured in a confocal arrangement. One such scan is equivalent to a transverse projection in tomography. Repeating the scanning procedure in several directions, a number of transverse projections are recorded from which the image can be obtained using conventional CT reconstruction algorithms. The resolution of the image is independent of the spatial dimensions and structure of the applied detector; furthermore, the transfer function of the system is isotropic. The imaging performance of the implemented confocal LSTOM was compared with a point-scanning confocal microscope, based on recorded images. These images demonstrate that the resolution of the confocal LSTOM exceeds (by 15%) the resolution limit of a point-scanning confocal microscope

  11. Interval-based reconstruction for uncertainty quantification in PET

    Science.gov (United States)

    Kucharczak, Florentin; Loquin, Kevin; Buvat, Irène; Strauss, Olivier; Mariano-Goulart, Denis

    2018-02-01

    A new directed interval-based tomographic reconstruction algorithm, called non-additive interval based expectation maximization (NIBEM) is presented. It uses non-additive modeling of the forward operator that provides intervals instead of single-valued projections. The detailed approach is an extension of the maximum likelihood—expectation maximization algorithm based on intervals. The main motivation for this extension is that the resulting intervals have appealing properties for estimating the statistical uncertainty associated with the reconstructed activity values. After reviewing previously published theoretical concepts related to interval-based projectors, this paper describes the NIBEM algorithm and gives examples that highlight the properties and advantages of this interval valued reconstruction.
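
    For orientation, the single-valued MLEM update that NIBEM generalizes to intervals is sketched below; the system matrix and count data are toy assumptions, and the interval-valued projector of the paper is not reproduced.

```python
import numpy as np

# Classical MLEM update for emission tomography: x <- x * A^T(y / Ax) / sens,
# with A an assumed non-negative system matrix and y the measured counts.

def mlem(A, y, n_iter=100):
    x = np.ones(A.shape[1])
    sens = A.sum(axis=0)                       # sensitivity image
    for _ in range(n_iter):
        proj = A @ x
        proj[proj == 0] = 1e-12                # avoid division by zero
        x *= (A.T @ (y / proj)) / sens
    return x

rng = np.random.default_rng(4)
A = rng.random((200, 50))
x_true = rng.random(50) + 0.1
y = rng.poisson(A @ x_true * 50) / 50.0        # noisy "measured" projections
print(mlem(A, y)[:5])                          # activity estimate, first 5 voxels
```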

  12. Influence of light refraction on the image reconstruction in transmission optical tomography of scattering media

    International Nuclear Information System (INIS)

    Tereshchenko, Sergei A; Potapov, D A; Podgaetskii, Vitalii M; Smirnov, A V

    2002-01-01

    A distorting influence of light refraction at the boundaries of scattering media on the results of tomographic reconstruction of images of radially symmetric objects is investigated. The methods for the correction of such refraction-caused distortions are described. The results of the image reconstruction for two model cylindrical objects are presented.

  13. Tomographic anthropomorphic models. Pt. 4. Organ doses for adults due to idealized external photon exposures

    International Nuclear Information System (INIS)

    Zankl, M.; Petoussi-Henss, N.; Fill, U.; Regulla, D.

    2002-01-01

    The present report contains extensive tables and figures of conversion coefficients of organ and tissue equivalent dose, normalised to air kerma free in air, for voxel anthropomorphic phantoms and for standard geometries of external photon radiation, estimated with Monte Carlo techniques. Four realistic adult voxel phantoms were used for the calculations, based on computed tomographic data of real people: three male phantoms, two of them being of average size and one representing a big man, and one female phantom of a tall and somewhat overweight woman. (orig.)

  14. Tomographic anthropomorphic models. Pt. 4. Organ doses for adults due to idealized external photon exposures

    CERN Document Server

    Zankl, M; Petoussi-Henss, N; Regulla, D

    2002-01-01

    The present report contains extensive tables and figures of conversion coefficients of organ and tissue equivalent dose, normalised to air kerma free in air, for voxel anthropomorphic phantoms and for standard geometries of external photon radiation, estimated with Monte Carlo techniques. Four realistic adult voxel phantoms were used for the calculations, based on computed tomographic data of real people: three male phantoms, two of them being of average size and one representing a big man, and one female phantom of a tall and somewhat overweight woman.

  15. Computed tomographic reconstruction of beam profiles with a multi-wire chamber

    International Nuclear Information System (INIS)

    Alonso, J.R.; Tobias, C.A.; Chu, W.T.

    1979-03-01

    MEDUSA (MEdical Dose Uniformity SAmpler), a 16 plane multi-wire proportional chamber, has been built to accurately measure beam profiles. The large number of planes allows for reconstruction of highly detailed beam intensity structures by means of Fourier convolution reconstruction techniques. This instrument is being used for verification and tuning of the Bevalac radiotherapy beams, but has potential applications in many beam profile monitoring situations

  16. Phase-contrast tomographic imaging using an X-ray interferometer

    Energy Technology Data Exchange (ETDEWEB)

    Momose, A. [Hitachi Ltd, Advanced Research Lab., Saitama (Japan); Takeda, T.; Itai, Y. [Univ. of Tsukuba, Inst. of Clinical Medicine, Ibaraki (Japan); Yoneyama, A. [Hitachi Ltd, Central Resarch Lab., Tokyo (Japan); Hirano, K. [High Energy Accelerator Research Organization, Inst. of Materials Structure Science, Ibaraki (Japan)

    1998-05-01

    Apparatus for phase-contrast X-ray computed tomography using a monolithic X-ray interferometer is presented with some observational results for human breast tissues. Structures characteristic of the tissues were revealed in the phase-contrast tomograms. The procedure of image analysis consists of phase retrieval from X-ray interference patterns and tomographic image reconstruction from the retrieved phase shift. Next, feasibility of phase-contrast imaging using a two-crystal X-ray interferometer was studied aiming at in vivo observation in the future. In a preliminary study, the two-crystal X-ray interferometer was capable of generating fringes of 70% visibility using synchrotron X-rays. 35 refs.

  17. Phase-contrast tomographic imaging using an X-ray interferometer

    International Nuclear Information System (INIS)

    Momose, A.; Takeda, T.; Itai, Y.; Yoneyama, A.; Hirano, K.

    1998-01-01

    Apparatus for phase-contrast X-ray computed tomography using a monolithic X-ray interferometer is presented with some observational results for human breast tissues. Structures characteristic of the tissues were revealed in the phase-contrast tomograms. The procedure of image analysis consists of phase retrieval from X-ray interference patterns and tomographic image reconstruction from the retrieved phase shift. Next, feasibility of phase-contrast imaging using a two-crystal X-ray interferometer was studied aiming at in vivo observation in the future. In a preliminary study, the two-crystal X-ray interferometer was capable of generating fringes of 70% visibility using synchrotron X-rays

  18. Development of the Shimadzu computed tomographic scanner SCT-200N

    International Nuclear Information System (INIS)

    Ishihara, Hiroshi; Yamaoka, Nobuyuki; Saito, Masahiro

    1982-01-01

    The Shimadzu Computed Tomographic Scanner SCT-200N has been developed as an ideal CT scanner for diagnosing the head and spine. Due to the large aperture, moderate scan time and the Zoom Scan Mode, any part of the body can be scanned. High-quality images can be obtained by adopting the precisely stabilized X-ray unit and a densely packed array of 64 detectors. As for its operation, the capability of computed radiography (CR) prior to patient positioning and real-time reconstruction ensure efficient patient throughput. Details of the SCT-200N are described in this paper. (author)

  19. IRVE-II Post-Flight Trajectory Reconstruction

    Science.gov (United States)

    O'Keefe, Stephen A.; Bose, David M.

    2010-01-01

    NASA's Inflatable Re-entry Vehicle Experiment (IRVE) II successfully demonstrated an inflatable aerodynamic decelerator after being launched aboard a sounding rocket from Wallops Flight Facility (WFF). Preliminary day-of-flight data compared well with pre-flight Monte Carlo analysis, and a more complete trajectory reconstruction performed with an Extended Kalman Filter (EKF) approach followed. The reconstructed trajectory and comparisons to an attitude solution provided by NASA Sounding Rocket Operations Contract (NSROC) personnel at WFF are presented. Additional comparisons are made between the reconstructed trajectory and pre- and post-flight Monte Carlo trajectory predictions. Alternative observations of the trajectory are summarized which leverage flight accelerometer measurements, the pre-flight aerodynamic database, and on-board flight video. Finally, analysis of the payload separation and aeroshell deployment events is presented. The flight trajectory is reconstructed to a fidelity sufficient to assess overall project objectives related to flight dynamics, and overall the IRVE-II flight dynamics are in line with expectations

  20. Optimisation of the image resolution of a positron emission tomograph

    International Nuclear Information System (INIS)

    Ziemons, K.

    1993-10-01

    The resolution and the respective signal-to-noise ratios of reconstructed images were the main focus of this work on the optimisation of PET systems. Monte Carlo modelling calculations were applied to derive possible improvements of the technical design or performance of the PET system. (DG) [de

  1. Radionuclide imaging with coded apertures and three-dimensional image reconstruction from focal-plane tomography

    International Nuclear Information System (INIS)

    Chang, L.T.

    1976-05-01

    Two techniques for radionuclide imaging and reconstruction have been studied; both are used to improve depth resolution. The first technique is called coded aperture imaging, which is a technique of tomographic imaging. The second technique is a special 3-D image reconstruction method, which is introduced as an improvement to so-called focal-plane tomography

  2. GATE Monte Carlo simulation of GE discovery 600 and a uniformity phantom

    Energy Technology Data Exchange (ETDEWEB)

    Sheen, Heesoon [Sungkyunkwan University, Seoul (Korea, Republic of); GE Healthcare Korea, Seoul (Korea, Republic of); Im, Kichun; Choi, Yong; Shin, Hanback [Sogang University, Seoul (Korea, Republic of); Han, Youngyih [Samsung Medical Center, Seoul (Korea, Republic of); Sungkyunkwan University, Seoul (Korea, Republic of); Chung, Kwangzoo; Cho, Junsang [Samsung Medical Center, Seoul (Korea, Republic of); Ahn, Sanghee [Sungkyunkwan University, Seoul (Korea, Republic of)

    2014-12-15

    GATE (Geant4 Application for Tomographic Emission) Monte Carlo simulations have been successful in the application of emission tomography for precise modeling of various physical processes. Most previous studies on Monte Carlo simulations have only involved performance assessments using virtual phantoms. Although that allows the performance of a simulated positron emission tomography (PET) scanner to be evaluated, it does not reflect the reality of practical conditions. This restriction causes substantial drawbacks in GATE simulations of real situations. To overcome the described limitation and to provide a method to enable simulation research relevant to clinically important issues, we conducted a GATE simulation using real data from a scanner rather than a virtual phantom and evaluated the scanner's performance. For that purpose, the system and the geometry of a commercial GE PET/CT (computed tomography) scanner, the BGO-based Discovery 600 (D600), was developed for the first time. The performance of the modeled PET system was evaluated by using the National Electrical Manufacturers Association NEMA NU 2-2007 protocols, and the results were compared with those of the reference data. The sensitivity, scatter fraction, noise-equivalent count rate (NECR), and resolution were estimated by using the protocol of the NEMA NU 2-2007. Sensitivities were 9.01 cps/kBq at 0 cm and 9.43 cps/kBq at 10 cm. The scatter fraction was 39.5%. The NECR peak was 89.7 kcps at 14.7 kBq/cc. Resolutions were 4.8 mm in the transaxial plane and 5.9 mm in the axial plane at 1 cm, and 6.2 mm in the transaxial plane and 6.4 mm in the axial plane at 10 cm. The resolutions exceeded the limit values provided by the manufacturer. The uniformity phantom was simulated using the CT and the PET data. The output data in ROOT format were converted and then reconstructed by using a C program and STIR (Software for Tomographic Image Reconstruction). The reconstructed images of the simulated uniformity phantom data had

  3. GATE Monte Carlo simulation of GE Discovery 600 and a uniformity phantom

    Science.gov (United States)

    Sheen, Heesoon; Im, Ki Chun; Choi, Yong; Shin, Hanback; Han, Youngyih; Chung, Kwangzoo; Cho, Junsang; Ahn, Sang Hee

    2014-12-01

    GATE (Geant4 Application for Tomographic Emission) Monte Carlo simulations have been successful in the application of emission tomography for precise modeling of various physical processes. Most previous studies on Monte Carlo simulations have only involved performance assessments using virtual phantoms. Although that allows the performance of a simulated positron emission tomography (PET) scanner to be evaluated, it does not reflect the reality of practical conditions. This restriction causes substantial drawbacks in GATE simulations of real situations. To overcome the described limitation and to provide a method to enable simulation research relevant to clinically important issues, we conducted a GATE simulation using real data from a scanner rather than a virtual phantom and evaluated the scanner's performance. For that purpose, the system and the geometry of a commercial GE PET/CT (computed tomography) scanner, the BGO-based Discovery 600 (D600), was developed for the first time. The performance of the modeled PET system was evaluated by using the National Electrical Manufacturers Association NEMA NU 2-2007 protocols, and the results were compared with those of the reference data. The sensitivity, scatter fraction, noise-equivalent count rate (NECR), and resolution were estimated by using the protocol of the NEMA NU2-2007. Sensitivities were 9.01 cps/kBq at 0 cm and 9.43 cps/kBq at 10 cm. The scatter fraction was 39.5%. The NECR peak was 89.7 kcps at 14.7 kBq/cc. Resolutions were 4.8 mm in the transaxial plane and 5.9 mm in the axial plane at 1 cm, and 6.2 mm in the transaxial plane and 6.4 mm in the axial plane at 10 cm. The resolutions exceeded the limit values provided by the manufacturer. The uniformity phantom was simulated using the CT and the PET data. The output data in ROOT format were converted and then reconstructed by using a C program and STIR (Software for Tomographic Image Reconstruction). The reconstructed images of the simulated uniformity phantom data had

  4. GATE Monte Carlo simulation of GE discovery 600 and a uniformity phantom

    International Nuclear Information System (INIS)

    Sheen, Heesoon; Im, Kichun; Choi, Yong; Shin, Hanback; Han, Youngyih; Chung, Kwangzoo; Cho, Junsang; Ahn, Sanghee

    2014-01-01

    GATE (Geant4 Application for Tomographic Emission) Monte Carlo simulations have been successful in the application of emission tomography for precise modeling of various physical processes. Most previous studies on Monte Carlo simulations have only involved performance assessments using virtual phantoms. Although that allows the performance of a simulated positron emission tomography (PET) scanner to be evaluated, it does not reflect the reality of practical conditions. This restriction causes substantial drawbacks in GATE simulations of real situations. To overcome the described limitation and to provide a method to enable simulation research relevant to clinically important issues, we conducted a GATE simulation using real data from a scanner rather than a virtual phantom and evaluated the scanner's performance. For that purpose, the system and the geometry of a commercial GE PET/CT (computed tomography) scanner, the BGO-based Discovery 600 (D600), was developed for the first time. The performance of the modeled PET system was evaluated by using the National Electrical Manufacturers Association NEMA NU 2-2007 protocols, and the results were compared with those of the reference data. The sensitivity, scatter fraction, noise-equivalent count rate (NECR), and resolution were estimated by using the protocol of the NEMA NU2-2007. Sensitivities were 9.01 cps/kBq at 0 cm and 9.43 cps/kBq at 10 cm. The scatter fraction was 39.5%. The NECR peak was 89.7 kcps at 14.7 kBq/cc. Resolutions were 4.8 mm in the transaxial plane and 5.9 mm in the axial plane at 1 cm, and 6.2 mm in the transaxial plane and 6.4 mm in the axial plane at 10 cm. The resolutions exceeded the limit values provided by the manufacturer. The uniformity phantom was simulated using the CT and the PET data. The output data in ROOT format were converted and then reconstructed by using a C program and STIR (Software for Tomographic Image Reconstruction). The reconstructed images of the simulated uniformity phantom data had

  5. Comparative study of the macroscopic finding, conventional tomographic imaging, and computed tomographic imaging in locating the mandibular canal

    International Nuclear Information System (INIS)

    Choi, Hang Moon; You, Dong Soo

    1995-01-01

    The purpose of this study was to compare conventional tomography with reformatted computed tomography for dental implants in locating the mandibular canal. Five dogs were used; after conventional tomographs and fitted computed tomographs were taken, four dentists traced all films. Mandibles were sectioned with 2 mm slice thickness and the sections were then radiographed (contact radiography). Each radiographic image was traced and linear measurements were made from the mandibular canal to the alveolar crest, buccal cortex, lingual cortex, and inferior border. The following results were obtained: 1. Reformatted computed tomographs were more accurate than conventional tomographs for the alveolar crest to canal length, with a -0.6 mm difference between the real values and the radiographs. 2. The average measurements of the buccal cortex to mandibular canal width and the lingual cortex to mandibular canal width were more accurate on conventional tomographs than on reformatted computed tomographs, but their standard deviations were higher. 3. Standard deviations of reformatted computed tomographs were lower than those of conventional tomographs at all compared sites. 4. With reformatted computed tomography 62.5% of the measurements performed were within ±1 mm of the true value, and with conventional tomography 24.1% were. 5. Mandibular canal invisibility was 0.8% for reformatted computed tomography and 9.2% for conventional tomography. Reformatted computed tomography has been shown to be a more useful radiographic technique for assessment of the mandibular canal than conventional tomography.

  6. Synchrotron radiation X-ray tomographic microscopy (SRXTM) of brachiopod shell interiors for taxonomy: Preliminary report

    OpenAIRE

    Motchurova-Dekova Neda; Harper David A.T.

    2010-01-01

    Synchrotron radiation X-ray tomographic microscopy (SRXTM) is a non-destructive technique for the investigation and visualization of the internal features of solid opaque objects, which allows reconstruction of a complete three-dimensional image of internal structures by recording of the differences in the effects on the passage of waves of energy reacting with those structures. Contrary to X-rays, produced in a conventional X-ray tube, the intense synchrot...

  7. Discrete tomographic reconstruction of 2D polycrystal orientation maps from X-ray diffraction projections using Gibbs priors

    DEFF Research Database (Denmark)

    Rodek, L.; Knudsen, E.; Poulsen, H.F.

    2005-01-01

    discrete tomographic algorithm, applying image-modelling Gibbs priors and a homogeneity condition. The optimization of the objective function is accomplished via the Gibbs Sampler in conjunction with simulated annealing. In order to express the structure of the orientation map, the similarity...

  8. Tomographic apparatus for reconstructing planar slices from non-absorbed and non-scattered radiation

    International Nuclear Information System (INIS)

    1980-01-01

    After briefly reviewing the history of computerised tomography, the deficiencies inherent in the various methods that have been adopted are discussed, e.g. slow data collection time, blurring of images and poor spatial resolution. Tomographic apparatus and processing methods are then described which can overcome these problems. The apparatus consists of a fan-shaped source of X-rays and a detector array which both rotate around the patient being examined. The data reduction process is derived in great detail; it is claimed that digital processing using convolution techniques is much faster than conventional methods. (U.K.)

  9. Planning surgical reconstruction in Treacher-Collins syndrome using virtual simulation.

    Science.gov (United States)

    Nikkhah, Dariush; Ponniah, Allan; Ruff, Cliff; Dunaway, David

    2013-11-01

    Treacher-Collins syndrome is a rare autosomal dominant condition of varying phenotypic expression. The surgical correction in this syndrome is difficult, and the approach varies between craniofacial departments worldwide. The authors aimed to design standardized tools for planning orbitozygomatic and mandibular reconstruction in Treacher-Collins syndrome using geometric morphometrics. The Great Ormond Street Hospital database was retrospectively searched for patients with Treacher-Collins syndrome. Thirteen children (aged 2 to 15 years) who had suitable preoperative three-dimensional computed tomographic head scans were included. Six Treacher-Collins syndrome three-dimensional computed tomographic head scans were quantitatively compared to 26 age-matched normal dry skulls using a template of 96 anatomically defined landmarks. Thin-plate spline videos illustrated the characteristic deformities of retromicrognathia and maxillary and orbitozygomatic hypoplasia in the Treacher-Collins syndrome population. Geometric morphometrics was used in the virtual reconstruction of the orbitozygomatic and mandibular region in Treacher-Collins syndrome patients. Intrarater and interrater reliability of the landmarks was acceptable and within a standard deviation of less than 1 mm on 97 percent and 100 percent of 10 repeated scans, respectively. Virtual normalization of the Treacher-Collins syndrome skull effectively describes characteristic skeletal deformities and provides a useful guide to surgical reconstruction. Size-matched stereolithographic templates derived from thin-plate spline warps can provide effective intraoperative templates for zygomatic and mandibular reconstruction in the Treacher-Collins syndrome patient. Diagnostic, V.

  10. Reconstruction of Exposure to m-Xylene from Human Biomonitoring Data Using PBPK Modelling, Bayesian Inference, and Markov Chain Monte Carlo Simulation

    Science.gov (United States)

    McNally, Kevin; Cotton, Richard; Cocker, John; Jones, Kate; Bartels, Mike; Rick, David; Price, Paul; Loizou, George

    2012-01-01

    There are numerous biomonitoring programs, both recent and ongoing, to evaluate environmental exposure of humans to chemicals. Due to the lack of exposure and kinetic data, the correlation of biomarker levels with exposure concentrations leads to difficulty in utilizing biomonitoring data for biological guidance values. Exposure reconstruction or reverse dosimetry is the retrospective interpretation of external exposure consistent with biomonitoring data. We investigated the integration of physiologically based pharmacokinetic modelling, global sensitivity analysis, Bayesian inference, and Markov chain Monte Carlo simulation to obtain a population estimate of inhalation exposure to m-xylene. We used exhaled breath and venous blood m-xylene and urinary 3-methylhippuric acid measurements from a controlled human volunteer study in order to evaluate the ability of our computational framework to predict known inhalation exposures. We also investigated the importance of model structure and dimensionality with respect to its ability to reconstruct exposure. PMID:22719759
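
    A minimal sketch of the Markov chain Monte Carlo step is given below, with a one-parameter linear response standing in for the PBPK model; the forward model, measurement noise and prior are all illustrative assumptions.

```python
import numpy as np

# Random-walk Metropolis sampler for an exposure concentration given
# biomarker measurements through a (toy) forward model.

def forward_model(exposure_ppm):
    return 0.8 * exposure_ppm                     # assumed biomarker response

observed = np.array([16.2, 15.1, 17.0])           # hypothetical biomarker data
sigma, prior_scale = 1.0, 100.0

def log_post(exposure):
    if exposure <= 0:
        return -np.inf                            # exposure must be positive
    resid = observed - forward_model(exposure)
    return -0.5 * np.sum(resid**2) / sigma**2 - exposure / prior_scale

rng = np.random.default_rng(5)
chain, current = [], 10.0
for _ in range(20000):
    proposal = current + rng.normal(0.0, 0.5)     # random-walk proposal
    if np.log(rng.random()) < log_post(proposal) - log_post(current):
        current = proposal
    chain.append(current)
print(np.mean(chain[5000:]))                      # posterior mean exposure (~20 ppm here)
```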

  11. Reconstruction of Exposure to m-Xylene from Human Biomonitoring Data Using PBPK Modelling, Bayesian Inference, and Markov Chain Monte Carlo Simulation

    Directory of Open Access Journals (Sweden)

    Kevin McNally

    2012-01-01

    Full Text Available There are numerous biomonitoring programs, both recent and ongoing, to evaluate environmental exposure of humans to chemicals. Due to the lack of exposure and kinetic data, the correlation of biomarker levels with exposure concentrations leads to difficulty in utilizing biomonitoring data for biological guidance values. Exposure reconstruction or reverse dosimetry is the retrospective interpretation of external exposure consistent with biomonitoring data. We investigated the integration of physiologically based pharmacokinetic modelling, global sensitivity analysis, Bayesian inference, and Markov chain Monte Carlo simulation to obtain a population estimate of inhalation exposure to m-xylene. We used exhaled breath and venous blood m-xylene and urinary 3-methylhippuric acid measurements from a controlled human volunteer study in order to evaluate the ability of our computational framework to predict known inhalation exposures. We also investigated the importance of model structure and dimensionality with respect to its ability to reconstruct exposure.

  12. A tomographic test of cosmological principle using the JLA compilation of type Ia supernovae

    Science.gov (United States)

    Chang, Zhe; Lin, Hai-Nan; Sang, Yu; Wang, Sai

    2018-05-01

    We test the cosmological principle by fitting a dipolar modulation of distance modulus and searching for an evolution of this modulation with respect to cosmological redshift. Based on a redshift tomographic method, we divide the Joint Light-curve Analysis compilation of supernovae of type Ia into different redshift bins, and employ a Markov-Chain Monte-Carlo method to infer the anisotropic amplitude and direction in each redshift bin. However, we do not find any significant deviations from the cosmological principle, and the anisotropic amplitude is stringently constrained to be less than a few thousandths at 95% confidence level.

  13. Electron and Photon Reconstruction and Identification with the ATLAS Detector

    International Nuclear Information System (INIS)

    Kuna, Marine

    2011-01-01

    This article presents the electron and photon reconstruction performance of the ATLAS detector based on the first LHC collision data at √(s)=7 TeV. Calorimetric and tracker related electron identification variables are in a fair agreement with the Monte Carlo model describing the detector response. The position of the reconstructed photon conversion vertices has been used to compare the description of the inner detector used in the Monte Carlo geometry to that from data. The energy flow measured in the electromagnetic calorimeter has been used to provide the same comparison at larger radii. π⁰→γγ and J/Ψ→e⁺e⁻ peaks were observed with reconstructed masses in good agreement with both Monte Carlo and PDG values. 17 W→eν candidates and one Z→e⁺e⁻ candidate have been observed in 6.69 nb⁻¹ of data.

  14. Comparison of tomography reconstruction by maximum entropy and filtered retro projection

    International Nuclear Information System (INIS)

    Abdala, F.J.P.; Simpson, D.M.; Roberty, N.C.

    1992-01-01

    Tomographic reconstruction with few projections is studied, comparing the maximum entropy method with filtered backprojection. Simulations with and without the presence of noise, and also with the presence of an object of high density inside the skull, are shown. (C.G.C.)
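
    One classical route to a maximum-entropy-type solution is a multiplicative (MART-style) update, sketched below on assumed toy data; this illustrates the principle only and is not the specific algorithm evaluated in the paper.

```python
import numpy as np

# Multiplicative (MART-type) update: for consistent, non-negative data this
# family of iterations drives the reconstruction towards a maximum-entropy
# solution of the ray equations p = A x.

def mart(A, p, n_sweeps=200, relax=0.5):
    x = np.ones(A.shape[1])
    for _ in range(n_sweeps):
        for a_i, p_i in zip(A, p):
            ray = a_i @ x
            if ray > 0 and p_i > 0:
                x *= (p_i / ray) ** (relax * a_i / a_i.max())
    return x

rng = np.random.default_rng(6)
A = (rng.random((30, 25)) > 0.6).astype(float)     # sparse 0/1 "ray" matrix
x_true = rng.random(25) + 0.5
x_rec = mart(A, A @ x_true)
print(np.linalg.norm(A @ x_rec - A @ x_true))      # residual decreases towards zero
```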

  15. In-situ tomographic observation of crack formation and propagation in tungsten materials in the framework of FEMaS-CA

    International Nuclear Information System (INIS)

    Riesch, J.; Linsmeier, C.; Nielsen, S.F.

    2010-01-01

    The EU has funded the Fusion Energy Materials Science Coordination Action (FEMaS-CA) with the intention of utilizing the know-how of the materials community to help overcome the materials science problems of fusion-related materials. In this framework three different material concepts, a tungsten-copper composite (W/Cu), vacuum plasma sprayed tungsten (VPSW), and a tungsten-fiber/tungsten-matrix composite (Wf/Wm), were investigated by means of in-situ tomography during mechanical testing. The measuring campaign was conducted at the high-energy beamline ID15A at the European Synchrotron Radiation Facility (ESRF) in Grenoble. A tensile testing machine was used to perform displacement-controlled tension tests. At the end of each well-defined displacement step a tomogram was taken. Tomographic reconstructions were successfully produced of samples with high tungsten content and sample diameters up to 1 mm. Force-displacement curves were measured during loading to complete fracture. Crack propagation could be observed in the tomographic reconstructions. This paper describes the first results with special focus on the experimental work and the role of FEMaS-CA. (Author)

  16. Update on orbital reconstruction.

    Science.gov (United States)

    Chen, Chien-Tzung; Chen, Yu-Ray

    2010-08-01

    Orbital trauma is common and frequently complicated by ocular injuries. The recent literature on orbital fracture is analyzed with emphasis on epidemiological data assessment, surgical timing, method of approach and reconstruction materials. Computed tomographic (CT) scan has become a routine evaluation tool for orbital trauma, and mobile CT can be applied intraoperatively if necessary. Concomitant serious ocular injury should be carefully evaluated preoperatively. Patients presenting with nonresolving oculocardiac reflex, 'white-eyed' blowout fracture, or diplopia with a positive forced duction test and CT evidence of orbital tissue entrapment require early surgical repair. Otherwise, enophthalmos can be corrected by late surgery with a similar outcome to early surgery. The use of an endoscope-assisted approach for orbital reconstruction continues to grow, offering an alternative method. Advances in alloplastic materials have improved surgical outcome and shortened operating time. In this review of modern orbital reconstruction, several controversial issues such as surgical indication, surgical timing, method of approach and choice of reconstruction material are discussed. Preoperative fine-cut CT image and thorough ophthalmologic examination are key elements to determine surgical indications. The choice of surgical approach and reconstruction materials much depends on the surgeon's experience and the reconstruction area. Prefabricated alloplastic implants together with image software and stereolithographic models are significant advances that help to more accurately reconstruct the traumatized orbit. The recent evolution of orbit reconstruction improves functional and aesthetic results and minimizes surgical complications.

  17. Improved diffusion coefficients generated from Monte Carlo codes

    International Nuclear Information System (INIS)

    Herman, B. R.; Forget, B.; Smith, K.; Aviles, B. N.

    2013-01-01

    Monte Carlo codes are becoming more widely used for reactor analysis. Some of these applications involve the generation of diffusion theory parameters including macroscopic cross sections and diffusion coefficients. Two approximations used to generate diffusion coefficients are assessed using the Monte Carlo code MC21. The first is the method of homogenization; whether to weight either fine-group transport cross sections or fine-group diffusion coefficients when collapsing to few-group diffusion coefficients. The second is a fundamental approximation made to the energy-dependent P1 equations to derive the energy-dependent diffusion equations. Standard Monte Carlo codes usually generate a flux-weighted transport cross section with no correction to the diffusion approximation. Results indicate that this causes noticeable tilting in reconstructed pin powers in simple test lattices with L2 norm error of 3.6%. This error is reduced significantly to 0.27% when weighting fine-group diffusion coefficients by the flux and applying a correction to the diffusion approximation. Noticeable tilting in reconstructed fluxes and pin powers was reduced when applying these corrections. (authors)
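
    The homogenization choice contrasted above can be illustrated with a small numerical sketch. The numbers below are purely illustrative and are not taken from the paper; they simply show that flux-weighting the transport cross section and flux-weighting the fine-group diffusion coefficients generally give different few-group diffusion coefficients.

```python
# Illustrative sketch (made-up numbers, not the paper's data): collapsing
# fine-group data to one few-group diffusion coefficient with the two weighting
# options contrasted above.
import numpy as np

phi = np.array([1.0, 0.8, 0.5, 0.2])            # fine-group scalar fluxes (arbitrary units)
sigma_tr = np.array([0.20, 0.25, 0.30, 0.40])   # fine-group transport cross sections (1/cm)
D_fine = 1.0 / (3.0 * sigma_tr)                 # fine-group diffusion coefficients (cm)

# Option 1: flux-weight the transport cross section, then form D = 1 / (3 * Sigma_tr)
sigma_tr_few = np.sum(phi * sigma_tr) / np.sum(phi)
D_from_sigma = 1.0 / (3.0 * sigma_tr_few)

# Option 2: flux-weight the fine-group diffusion coefficients directly
D_from_D = np.sum(phi * D_fine) / np.sum(phi)

print(f"D from collapsed Sigma_tr: {D_from_sigma:.4f} cm")
print(f"D from collapsed D_g     : {D_from_D:.4f} cm")   # generally not equal
```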

  18. Voxel-based model construction from colored tomographic images; Construcao de simuladores baseados em elementos de volume a partir de imagens tomograficas coloridas

    Energy Technology Data Exchange (ETDEWEB)

    Loureiro, Eduardo Cesar de Miranda

    2002-07-01

    This work presents a new approach to the construction of voxel-based phantoms that was implemented to simplify the segmentation of organs and tissues, reducing the time spent on this procedure. The segmentation is performed by painting tomographic images and attributing a different color to each organ or tissue. A voxel-based head and neck phantom was built using this new approach. The way the data are stored improves the performance of the radiation transport code. The program that calculates the radiation transport also works with image files. This capability allows image reconstruction showing isodose areas from several points of view, increasing the information available to the user. Virtual X-ray photographs can also be obtained, allowing studies aimed at optimizing radiographic techniques while assessing, at the same time, the doses in organs and tissues. The accuracy of the program presented here, called MCvoxEL, which implements this new approach, was tested by comparison with results from two modern and well-supported Monte Carlo codes. Dose conversion factors for parallel X-ray exposure were also calculated. (author)

  19. Interactive reconstruction in single-photon tomography

    International Nuclear Information System (INIS)

    Miller, T.R.; Wallis, J.W.; Wilson, A.D.

    1989-01-01

    A new method is described to allow interactive selection of the reconstruction filter at the time of interpretation of images from single-photon tomography. In the filtered back-projection algorithm, the only part of the reconstruction process requiring user interaction is the selection of the window function. Since the ramp and window filters have different purposes, they can be separated, placing the window at the end of the reconstruction process as a three-dimensional filter. All stages of reconstruction except the window filtering are performed before the physician begins to interpret the study. The three-dimensional filtering is performed very rapidly with the use of the Chebyshev convolution algorithm. A 64 x 64 x 64 pixel cube of data is filtered in 13-33 s using filters of lengths 3-11. Smaller volumes of image data can be filtered in less than 1 s; thus, the user can interactively choose any desired filter for a given tomographic study at the time of interpretation of the images. (orig.)
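
    The idea of deferring the window filter can be sketched as follows. This is not the authors' implementation: a Gaussian stands in for the clinical window function, scikit-image and SciPy are assumed to be available, and the Chebyshev convolution speed-up is not reproduced; only the separation of the ramp filter (applied once) from the user-selectable post-filter (applied interactively) is illustrated.

```python
# Sketch of the separation idea above (not the authors' code): reconstruct each
# slice with the ramp filter only, then apply the user-selected smoothing window
# afterwards as a fast three-dimensional post-filter. A Gaussian stands in for
# the clinical window function; the Chebyshev convolution is not reproduced.
import numpy as np
from scipy.ndimage import gaussian_filter
from skimage.data import shepp_logan_phantom
from skimage.transform import radon, iradon, rescale

theta = np.linspace(0.0, 180.0, 120, endpoint=False)
slice2d = rescale(shepp_logan_phantom(), 0.16)          # 64x64 slice
ramp_only = iradon(radon(slice2d, theta=theta), theta=theta, filter_name="ramp")

volume = np.stack([ramp_only] * 64, axis=0)             # toy 64x64x64 volume

def interactive_window(vol, sigma):
    """Apply the chosen window as a 3D filter at interpretation time."""
    return gaussian_filter(vol, sigma=sigma)

for sigma in (0.5, 1.0, 2.0):                           # physician tries several windows
    smoothed = interactive_window(volume, sigma)
    print(f"sigma = {sigma}: volume standard deviation = {smoothed.std():.4f}")
```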

  20. Direct iterative reconstruction of computed tomography trajectories (DIRECTT)

    International Nuclear Information System (INIS)

    Lange, A.; Hentschel, M.P.; Schors, J.

    2004-01-01

    The direct reconstruction approach employs an iterative procedure by selection of and angular averaging over projected trajectory data of volume elements. This avoids the blur effects of the classical Fourier method due to the sampling theorem, although longer computing time is required. The reconstructed tomographic images reveal at least the spatial resolution of the radiation detector. Any set of projection angles may be selected for the measurements. Limited rotation of the object still yields good reconstruction of details. Projections of a partial region of the object can be reconstructed without additional artifacts, thus reducing the overall radiation dose. Noisy signal data from low dose irradiation have low impact on spatial resolution. The image quality is monitored during all iteration steps and is pre-selected according to the specific requirements. DIRECTT can be applied independently from the measurement equipment in addition to conventional reconstruction or as a refinement filter. (author)

  1. SU-E-J-100: Reconstruction of Prompt Gamma Ray Three Dimensional SPECT Image From Boron Neutron Capture Therapy(BNCT)

    Energy Technology Data Exchange (ETDEWEB)

    Yoon, D; Jung, J; Suh, T [The Catholic University of Korea, College of medicine, Department of biomedical engineering (Korea, Republic of)

    2014-06-01

    Purpose: The purpose of this work is to confirm the feasibility of acquiring a three-dimensional single photon emission computed tomography (SPECT) image from boron neutron capture therapy (BNCT) using Monte Carlo simulation. Methods: The pixelated SPECT detector, collimator and phantom were simulated using the Monte Carlo N-Particle eXtended (MCNPX) simulation tool. A thermal neutron source (<1 eV) was used to react with the boron uptake regions (BUR) in the phantom. Each geometry had a spherical pattern, and three different BURs (A, B and C regions, density: 2.08 g/cm3) were located in the middle of the brain phantom. The data from 128 projections for each sorting process were used to achieve image reconstruction. The ordered subset expectation maximization (OSEM) reconstruction algorithm was used to obtain a tomographic image with eight subsets and five iterations. Receiver operating characteristic (ROC) curve analysis was used to evaluate the geometric accuracy of the reconstructed image. Results: The OSEM image was compared with the original phantom pattern image. The area under the curve (AUC) was calculated as the gross area under each ROC curve. The three calculated AUC values were 0.738 (A region), 0.623 (B region), and 0.817 (C region). The differences between the center-to-center distances of two boron regions and the distances between the corresponding maximum-count points were 0.3 cm, 1.6 cm and 1.4 cm. Conclusion: The possibility of extracting a 3D BNCT SPECT image was confirmed using the Monte Carlo simulation and the OSEM algorithm. The prospects for obtaining an actual BNCT SPECT image were estimated from the quality of the simulated image and the simulation conditions. When multiple tumor regions are to be treated using BNCT, a reasonable model for determining how many useful images can be obtained from SPECT could be provided to BNCT facilities. This research was supported by the Leading Foreign Research Institute Recruitment Program through the National Research
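
    A minimal sketch of the OSEM update used above (eight subsets, five iterations) is given below. The system matrix, counts and dimensions are toy values, not the MCNPX-simulated SPECT model of the record.

```python
# Toy OSEM sketch (random system matrix, not the MCNPX-simulated SPECT model):
# ordered-subset expectation maximization with 8 subsets and 5 iterations,
# matching the settings quoted in the record.
import numpy as np

rng = np.random.default_rng(0)
n_pix, n_proj = 64, 128                        # 128 projections, as in the record
A = rng.random((n_proj, n_pix))                # toy system matrix (detection probabilities)
x_true = rng.random(n_pix)                     # unknown activity distribution
y = rng.poisson(A @ x_true * 50.0)             # noisy projection counts

n_subsets, n_iter = 8, 5
subsets = [np.arange(s, n_proj, n_subsets) for s in range(n_subsets)]

x = np.ones(n_pix)                             # uniform initial estimate
for _ in range(n_iter):
    for rows in subsets:
        A_s, y_s = A[rows], y[rows]
        forward = A_s @ x
        ratio = np.where(forward > 0, y_s / forward, 0.0)
        x *= (A_s.T @ ratio) / (A_s.T @ np.ones(len(rows)))  # multiplicative EM update

print("correlation with the true activity:", np.corrcoef(x, x_true)[0, 1])
```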

  2. Spin tomography

    Energy Technology Data Exchange (ETDEWEB)

    D' Ariano, G M [Quantum Optics and Information Group, INFM Udr Pavia, Dipartimento di Fisica ' Alessandro Volta' and INFM, Via Bassi 6, 27100 Pavia (Italy); Maccone, L [Quantum Optics and Information Group, INFM Udr Pavia, Dipartimento di Fisica ' Alessandro Volta' and INFM, Via Bassi 6, 27100 Pavia (Italy); Paini, M [Quantum Optics and Information Group, INFM Udr Pavia, Dipartimento di Fisica ' Alessandro Volta' and INFM, Via Bassi 6, 27100 Pavia (Italy)

    2003-02-01

    We propose a tomographic reconstruction scheme for spin states. The experimental set-up, which is a modification of the Stern-Gerlach scheme, can be easily performed with currently available technology. The method is generalized to multiparticle states, analysing the spin-1/2 case for indistinguishable particles. Some Monte Carlo numerical simulations are given to illustrate the technique.

  3. Spin tomography

    International Nuclear Information System (INIS)

    D'Ariano, G M; Maccone, L; Paini, M

    2003-01-01

    We propose a tomographic reconstruction scheme for spin states. The experimental set-up, which is a modification of the Stern-Gerlach scheme, can be easily performed with currently available technology. The method is generalized to multiparticle states, analysing the spin-1/2 case for indistinguishable particles. Some Monte Carlo numerical simulations are given to illustrate the technique

  4. Tomographic imaging system

    International Nuclear Information System (INIS)

    Hayakawa, T.; Horiba, I.; Kohno, H.; Nakaya, C.; Sekihara, K.; Shiono, H.; Tomura, T.; Yamamoto, S.; Yanaka, S.

    1980-01-01

    A tomographic imaging system comprising: irradiating means for irradiating a cross-section of an object under consideration with radiation rays from plural directions; detector means for detecting the radiation rays transmitted through the cross-section of said object to produce an output signal; first memory means for storing the output signal of said detector means; and an image reconstructing section for performing a convolution integral operation on the contents of said first memory means by means of a first weighting function to reconstruct a three-dimensional image of the cross-section of said object, said image reconstructing section including (I) second memory means for storing a second weighting function, said second weighting function being provided with a predetermined positive and negative (N-1)th order when the output signal of said detector means produced by the irradiation of the cross-section of said object from one of said plural directions is sampled by N points, the value of the (N-1)th order of said second weighting function being an integration of said first weighting function from the (N-1)th order to positive infinity and the value of -(N-1)th order of said second weighting function being an integration of said first weighting function from the -(N-1)th order to negative infinity, (II) control means for successively reading out the contents of said first and second memory means, and (III) operational means for performing multiplying and summing operations on the read-out contents of said first and second memory means, said operational means producing the product of the values of the (N-1)th and -(N-1)th orders of said second weighting function and a component of the output signal of said detector means relating to the radiation rays free from the absorption thereof by said object.

  5. Characterization and MCNP simulation of neutron energy spectrum shift after transmission through strong absorbing materials and its impact on tomography reconstructed image.

    Science.gov (United States)

    Hachouf, N; Kharfi, F; Boucenna, A

    2012-10-01

    An ideal neutron radiograph, for quantification and 3D tomographic image reconstruction, should be a transmission image which exactly obeys the exponential attenuation law of a monochromatic neutron beam. There are many reasons for which this assumption does not hold for highly neutron-absorbing materials. The main deviations from the ideal are due essentially to the neutron beam hardening effect. The main challenges of this work are the characterization of neutron transmission through boron-enriched steel materials and the observation of beam hardening. The influence of the beam hardening effect on neutron tomographic images of samples based on these materials is then studied. MCNP and FBP simulations are performed to adjust linear attenuation coefficient data and to carry out 2D tomographic image reconstruction with and without beam hardening corrections. A beam hardening correction procedure is developed and applied based on qualitative and quantitative analyses of the projection data. Results from the original and corrected 2D reconstructed images show the efficiency of the proposed correction procedure. Copyright © 2012 Elsevier Ltd. All rights reserved.
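
    A common way to implement such a correction is to linearize the measured projections against a calibration of known absorber thicknesses; the sketch below shows this generic polynomial linearization and is not necessarily the procedure developed in the record. All numbers are illustrative.

```python
# Generic beam-hardening linearization sketch (not necessarily the procedure of
# the record): measured -ln(transmission) values for known thicknesses of the
# strong absorber are fitted with a polynomial, and raw projections are remapped
# so that they follow a single-energy exponential attenuation law again.
import numpy as np

# Hypothetical step-wedge calibration for a boron-enriched steel sample;
# the numbers are illustrative only.
thickness = np.array([0.0, 0.2, 0.4, 0.6, 0.8, 1.0])            # cm
measured_p = np.array([0.0, 0.55, 1.00, 1.38, 1.70, 1.97])      # -ln(T), saturating with depth
mu_ref = 3.0                                                     # reference attenuation (1/cm)

# Fit measured projection -> equivalent thickness, then rescale to an ideal projection.
coeffs = np.polyfit(measured_p, thickness, deg=3)

def correct(projection):
    """Map a beam-hardened projection value onto the ideal monochromatic one."""
    return mu_ref * np.polyval(coeffs, projection)

raw = np.array([0.3, 1.0, 1.8])
print("corrected projections:", correct(raw))
```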

  6. Experimental and theoretical analysis for improved microscope design of optical projection tomographic microscopy.

    Science.gov (United States)

    Coe, Ryan L; Seibel, Eric J

    2013-09-01

    We present theoretical and experimental results of axial displacement of objects relative to a fixed condenser focal plane (FP) in optical projection tomographic microscopy (OPTM). OPTM produces three-dimensional, reconstructed images of single cells from two-dimensional projections. The cell rotates in a microcapillary to acquire projections from different perspectives where the objective FP is scanned through the cell while the condenser FP remains fixed at the center of the microcapillary. This work uses a combination of experimental and theoretical methods to improve the OPTM instrument design.

  7. Image reconstruction methods in positron tomography

    International Nuclear Information System (INIS)

    Townsend, D.W.; Defrise, M.

    1993-01-01

    In the two decades since the introduction of the X-ray scanner into radiology, medical imaging techniques have become widely established as essential tools in the diagnosis of disease. As a consequence of recent technological and mathematical advances, the non-invasive, three-dimensional imaging of internal organs such as the brain and the heart is now possible, not only for anatomical investigations using X-ray but also for studies which explore the functional status of the body using positron-emitting radioisotopes. This report reviews the historical and physical basis of medical imaging techniques using positron-emitting radioisotopes. Mathematical methods which enable three-dimensional distributions of radioisotopes to be reconstructed from projection data (sinograms) acquired by detectors suitably positioned around the patient are discussed. The extension of conventional two-dimensional tomographic reconstruction algorithms to fully three-dimensional reconstruction is described in detail. (orig.)

  8. Clamshell tomograph

    International Nuclear Information System (INIS)

    Derenzo, S. E.; Budinger, Th. F.

    1984-01-01

    In brief, the invention is a tomograph modified to be in a clamshell configuration so that the ring or rings may be moved to multiple sampling positions. The tomograph includes an array of detectors arranged in successive adjacent relative locations along a closed curve in a first position in a selected plane, and means for securing the detectors in the relative locations in a first sampling position. The securing means is movable in the plane in two sections and pivotable at one point and only one point to enable movement of at least one of the sections to a second sampling position out of the closed curve, so that the ends of the section which are opposite the point are moved apart by a predetermined space.

  9. Impact of time-of-flight on indirect 3D and direct 4D parametric image reconstruction in the presence of inconsistent dynamic PET data

    Science.gov (United States)

    Kotasidis, F. A.; Mehranian, A.; Zaidi, H.

    2016-05-01

    Kinetic parameter estimation in dynamic PET suffers from reduced accuracy and precision when parametric maps are estimated using kinetic modelling following image reconstruction of the dynamic data. Direct approaches to parameter estimation attempt to directly estimate the kinetic parameters from the measured dynamic data within a unified framework. Such image reconstruction methods have been shown to generate parametric maps of improved precision and accuracy in dynamic PET. However, due to the interleaving between the tomographic and kinetic modelling steps, any tomographic or kinetic modelling errors in certain regions or frames, tend to spatially or temporally propagate. This results in biased kinetic parameters and thus limits the benefits of such direct methods. Kinetic modelling errors originate from the inability to construct a common single kinetic model for the entire field-of-view, and such errors in erroneously modelled regions could spatially propagate. Adaptive models have been used within 4D image reconstruction to mitigate the problem, though they are complex and difficult to optimize. Tomographic errors in dynamic imaging on the other hand, can originate from involuntary patient motion between dynamic frames, as well as from emission/transmission mismatch. Motion correction schemes can be used, however, if residual errors exist or motion correction is not included in the study protocol, errors in the affected dynamic frames could potentially propagate either temporally, to other frames during the kinetic modelling step or spatially, during the tomographic step. In this work, we demonstrate a new strategy to minimize such error propagation in direct 4D image reconstruction, focusing on the tomographic step rather than the kinetic modelling step, by incorporating time-of-flight (TOF) within a direct 4D reconstruction framework. Using ever improving TOF resolutions (580 ps, 440 ps, 300 ps and 160 ps), we demonstrate that direct 4D TOF image

  10. Optimization of Bayesian Emission tomographic reconstruction for region of interest quantitation

    International Nuclear Information System (INIS)

    Qi, Jinyi

    2003-01-01

    Region of interest (ROI) quantitation is an important task in emission tomography (e.g., positron emission tomography and single photon emission computed tomography). It is essential for exploring clinical factors such as tumor activity, growth rate, and the efficacy of therapeutic interventions. Bayesian methods based on the maximum a posteriori principle (also called penalized maximum-likelihood methods) have been developed for emission image reconstruction to deal with the low signal-to-noise ratio of the emission data. Similar to the filter cut-off frequency in the filtered backprojection method, the smoothing parameter of the image prior in Bayesian reconstruction controls the resolution and noise trade-off and hence affects ROI quantitation. In this paper we present an approach for choosing the optimum smoothing parameter in Bayesian reconstruction for ROI quantitation. Bayesian reconstructions are difficult to analyze because the resolution and noise properties are nonlinear and object-dependent. Building on the recent progress in deriving approximate expressions for the local impulse response function and the covariance matrix, we derived simplified theoretical expressions for the bias, the variance, and the ensemble mean squared error (EMSE) of the ROI quantitation. One problem in evaluating ROI quantitation is that the truth is often required for calculating the bias. This is overcome by using the ensemble distribution of the activity inside the ROI and computing the average EMSE. The resulting expressions allow fast evaluation of the image quality for different smoothing parameters. The optimum smoothing parameter of the image prior can then be selected to minimize the EMSE.
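
    The selection rule itself can be sketched in a few lines: scan the smoothing parameter, combine squared bias and variance into an EMSE, and keep the minimizer. The bias and variance functions below are illustrative placeholders, not the simplified expressions derived in the paper.

```python
# Toy sketch of the selection rule (the paper's derived bias/variance expressions
# are not reproduced): scan the smoothing parameter, form EMSE = bias^2 + variance,
# and keep the minimizer.
import numpy as np

betas = np.logspace(-3, 1, 50)                  # candidate smoothing parameters

def roi_bias(beta):                             # illustrative: more smoothing -> more bias
    return 0.8 * beta / (beta + 0.1)

def roi_variance(beta):                         # illustrative: more smoothing -> less noise
    return 0.05 / (1.0 + 20.0 * beta)

emse = roi_bias(betas) ** 2 + roi_variance(betas)
best = betas[np.argmin(emse)]
print(f"optimum smoothing parameter ~ {best:.3f} (EMSE = {emse.min():.4f})")
```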

  11. Unfolding and smoothing applied to the quality enhancement of neutron tomographic images

    International Nuclear Information System (INIS)

    Almeida, Gevaldo L. de; Silvani, Maria I.; Lopes, Ricardo T.

    2008-01-01

    Resolution and contrast are the major parameters defining the quality of a computer-aided tomographic image. These parameters depend upon several features of the image acquisition system, such as detector resolution, geometrical arrangement of the source-object-detector, beam divergence, source strength, detector efficiency and counting time. Roughly, the finite detector resolution is the main source of systematic errors affecting the separation power of the image acquisition system, while the electronic noise and statistical fluctuation are responsible for the data dispersion, which spoils the contrast. An algorithm has been developed in this work aiming at the improvement of the image quality through the minimization of both types of errors. The systematic ones are reduced by a mathematical unfolding of the position spectra - used as projections to reconstruct the 2D images - using the Line Spread Function (LSF) of the neutron tomographic system. The principle behind this technique is that every single channel contains information about all channels of the spectrum, but it is concealed due to the automatic integration carried out by the detector. Therefore, knowing the shape of this curve, it is possible to retrieve the original spectra. These spectra are unfortunately corrupted by the unavoidable statistical fluctuation, and by oscillations arising from the unfolding process, which strongly affect the quality of the final unfolded image. In order to reduce this impact, the spectra have been filtered by a Fourier transform technique or smoothed with a least-squares fitting procedure. The algorithm has been applied to spectra of some test bodies generated by an earlier developed tomographic simulator, which reproduces the spectra furnished by a thermal neutron tomographic system employing a position-sensitive detector. The obtained results have shown that the unfolded spectra produce final images capable of resolving features otherwise not achievable with the
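
    The two steps, unfolding with the LSF and subsequent smoothing, can be sketched on a one-dimensional toy projection as below. This is not the authors' algorithm: a regularized Fourier (Wiener-type) deconvolution and a Gaussian smoothing are used as simple stand-ins, and the LSF is assumed to be Gaussian.

```python
# One-dimensional toy of the two steps above (not the authors' algorithm): a sharp
# projection blurred by a Gaussian LSF is unfolded with a regularized Fourier
# (Wiener-type) inverse filter, then smoothed to damp the unfolding oscillations.
import numpy as np
from scipy.ndimage import gaussian_filter1d

n = 256
x = np.arange(n)
true_proj = np.where((x > 100) & (x < 130), 1.0, 0.0)           # sharp feature

lsf = np.exp(-0.5 * ((x - n // 2) / 3.0) ** 2)                  # assumed Gaussian LSF
lsf /= lsf.sum()
H = np.fft.fft(np.fft.ifftshift(lsf))                           # detector transfer function
blurred = np.real(np.fft.ifft(np.fft.fft(true_proj) * H))       # circular convolution
measured = blurred + np.random.default_rng(1).normal(0.0, 0.01, n)  # statistical fluctuation

wiener = np.conj(H) / (np.abs(H) ** 2 + 1e-2)                   # regularized inverse filter
unfolded = np.real(np.fft.ifft(np.fft.fft(measured) * wiener))
smoothed = gaussian_filter1d(unfolded, sigma=1.0)               # damp residual oscillations

for name, sig in (("measured", measured), ("unfolded + smoothed", smoothed)):
    rms = np.sqrt(np.mean((sig - true_proj) ** 2))
    print(f"{name}: RMS deviation from the true projection = {rms:.4f}")
```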

  12. Optimization of number and signal to noise ratio radiographs for defects 3D reconstruction in industrial control

    International Nuclear Information System (INIS)

    Bruandet, J.-P.

    2001-01-01

    Among the numerous techniques for non-destructive evaluation (NDE), X-ray systems are well suited to inspecting the interior of objects. Acquiring several radiographs of the inspected object under different points of view enables three-dimensional structural information to be recovered. In this NDE application, tomographic testing is considered. This work deals with two optimizations of tomographic testing in order to improve the characterization of defects that may occur in metallic welds. The first consists in the optimization of the acquisition strategy. Because tomographic testing is performed on-line, the total duration for image acquisition is fixed, limiting the number of available views. Hence, for a given acquisition duration, it is possible either to acquire a very limited number of radiographs with a good signal-to-noise ratio in each single acquisition or a larger number of radiographs with a limited signal-to-noise ratio. The second consists in optimizing the 3D reconstruction algorithms from a limited number of cone-beam projections. To manage the lack of data, we first used algebraic reconstruction algorithms such as ART or regularized ICM. In terms of acquisition strategy optimization, an increase in the number of projections proved to be valuable. Taking into account specific prior knowledge such as a support constraint or a physical noise model in attenuation images also improved reconstruction quality. Then, a new regularized region-based reconstruction approach was developed. The defects to reconstruct are binary (lack of material in a homogeneous object); as a consequence, they are entirely described by their shapes. Because the number of defects to recover is unknown and totally arbitrary, a level-set formulation allowing topological changes to be handled was used. Results obtained with a regularized level-set reconstruction algorithm are promising in the proposed context. (author) [fr]

  13. Ectomography - a tomographic method for gamma camera imaging

    International Nuclear Information System (INIS)

    Dale, S.; Edholm, P.E.; Hellstroem, L.G.; Larsson, S.

    1985-01-01

    In computerised gamma camera imaging the projections are readily obtained in digital form, and the number of picture elements may be relatively few. This condition makes emission techniques suitable for ectomography - a tomographic technique for directly visualising arbitrary sections of the human body. The camera rotates around the patient to acquire different projections in a way similar to SPECT. This method differs from SPECT, however, in that the camera is placed at an angle to the rotational axis, and receives two-dimensional, rather than one-dimensional, projections. Images of body sections are reconstructed by digital filtration and combination of the acquired projections. The main advantages of ectomography - a high and uniform resolution, a low and uniform attenuation and a high signal-to-noise ratio - are obtained when imaging sections close and parallel to a body surface. The filtration eliminates signals representing details outside the section and gives the section a certain thickness. Ectomographic transverse images of a line source and of a human brain have been reconstructed. Details within the sections are correctly visualised and details outside are effectively eliminated. For comparison, the same sections have been imaged with SPECT. (author)

  14. Suprathermal electron studies in the TCV tokamak: Design of a tomographic hard-x-ray spectrometer

    International Nuclear Information System (INIS)

    Gnesin, S.; Coda, S.; Decker, J.; Peysson, Y.

    2008-01-01

    Electron cyclotron resonance heating and electron cyclotron current drive (ECCD), disruptive events, and sawtooth activity are all known to produce suprathermal electrons in fusion devices, motivating increasingly detailed studies of the generation and dynamics of this suprathermal population. Measurements have been performed in past years in the Tokamak à Configuration Variable (TCV) using a single pinhole hard-x-ray (HXR) camera and electron-cyclotron-emission radiometers, leading, in particular, to the identification of the crucial role of spatial transport in the physics of ECCD. The observation of a poloidal asymmetry in the emitted suprathermal bremsstrahlung radiation motivates the design of a proposed new tomographic HXR spectrometer reported in this paper. The design, which is based on a compact modified Soller collimator concept, is being aided by simulations of tomographic reconstruction. Quantitative criteria have been developed to optimize the design for the greatly variable shapes and positions of TCV plasmas.

  15. Electron and Photon Reconstruction and Identification with the ATLAS Detector

    CERN Document Server

    Kuna, M; The ATLAS collaboration

    2011-01-01

    This article presents the electron and photon reconstruction performance in ATLAS with the first LHC collision data at $\sqrt{s} = 7$ TeV collected up to the beginning of June 2010. Calorimetric and tracker-related electron identification variables are shown to be in fair agreement with the Monte Carlo model. The position of the reconstructed photon conversion vertices has been used to compare the inner detector model used in Monte Carlo to the real one from data. The energy flow measured in the electromagnetic calorimeter with minimum bias data has been used to provide the same comparison at larger radii. $\pi^0 \rightarrow \gamma\gamma$ and $J/\psi \rightarrow ee$ peaks were observed with a reconstructed mass in good agreement with both the Monte Carlo and PDG values. 17 $W \rightarrow e\nu$ candidates and one $Z \rightarrow ee$ candidat...

  16. Assessment in dogs tympanic bulla, through virtual tomographic endoscopy; Avaliacao de bulas timpanicas em caes, por meio da endoscopia tomografica virtual

    Energy Technology Data Exchange (ETDEWEB)

    Silva, Luciana Carandina da; Sabino, Emanuelle Guidugli, E-mail: lucianacarandina@uol.com.br [Universidade Estadual Paulista Julio de Mesquita Filho (UNESP), Botucatu, SP (Brazil). Faculdade de Medicina Veterinaria. Dept. de Reproducao Animal e Radiologia Veterinaria; Vulcano, Luiz Carlos; Machado, Vania Maria de Vasconcelos [Universidade Estadual Paulista Julio de Mesquita Filho (UNESP), Botucatu, SP (Brazil). Faculdade de Medicina Veterinaria

    2012-07-01

    Dogs usually have problems related to the auditory canal. For the diagnosis of these pathologies, a physical examination is necessary and, in some cases, radiographic examination and computed tomography. The tympanic bulla is not easily visualized radiographically, since there are many structures overlaying it in the image obtained. Computed tomography has been the technique of choice to assess this structure faithfully. A new alternative for assessment of the tympanic bulla is virtual tomographic endoscopy, which allows an improvement of the image obtained through the virtual tomographic technique. This paper provides information on the use of computed tomography and of a new technique, virtual tomographic endoscopy, in order to improve these techniques and demonstrate their reliability in the diagnosis of the ear canals of dogs. Therefore, we performed computed tomography of the tympanic bulla on healthy animals, and later performed image reconstruction in three-dimensional (3D) mode for virtual endoscopy. (author)

  17. Design considerations for a time-resolved tomographic diagnostic at DARHT

    International Nuclear Information System (INIS)

    Morris I. Kaufman, Daniel Frayer, Wendi Dreesen, Douglas Johnson, Alfred Meidinger

    2006-01-01

    An instrument has been developed to acquire time-resolved tomographic data from the electron beam at the DARHT [Dual-Axis Radiographic Hydrodynamic Test] facility at Los Alamos National Laboratory. The instrument contains four optical lines of sight that view a single tilted object. The lens design optically integrates along one optical axis for each line of sight. These images are relayed via fiber optic arrays to streak cameras, and the recorded streaks are used to reconstruct the original two-dimensional data. Installation of this instrument into the facility requires automation of both the optomechanical adjustments and calibration of the instrument in a constrained space. Additional design considerations include compound tilts on the object and image planes

  18. Statistical reconstruction for cosmic ray muon tomography.

    Science.gov (United States)

    Schultz, Larry J; Blanpied, Gary S; Borozdin, Konstantin N; Fraser, Andrew M; Hengartner, Nicolas W; Klimenko, Alexei V; Morris, Christopher L; Orum, Chris; Sossong, Michael J

    2007-08-01

    Highly penetrating cosmic ray muons constantly shower the earth at a rate of about 1 muon per cm2 per minute. We have developed a technique which exploits the multiple Coulomb scattering of these particles to perform nondestructive inspection without the use of artificial radiation. In prior work [1]-[3], we have described heuristic methods for processing muon data to create reconstructed images. In this paper, we present a maximum likelihood/expectation maximization tomographic reconstruction algorithm designed for the technique. This algorithm borrows much from techniques used in medical imaging, particularly emission tomography, but the statistics of muon scattering dictates differences. We describe the statistical model for multiple scattering, derive the reconstruction algorithm, and present simulated examples. We also propose methods to improve the robustness of the algorithm to experimental errors and events departing from the statistical model.

  19. [Diprosopus triophthalmus. From ancient terracotta sculptures to spiral computer tomographic reconstruction].

    Science.gov (United States)

    Sokiranski, R; Pirsig, W; Nerlich, A

    2005-03-01

    A still-born male fetus from the 19th century, fixed in formalin and presenting as diprosopia triophthalmica, was analysed by helical computer tomography and virtually reconstructed without damage. This rare, incomplete, symmetrical duplication of the face on a single head with three eyes, two noses and two mouths develops in the first 3 weeks of gestation and is a subset of the category of conjoined twins with unknown underlying etiology. Spiral computer tomography of fixed tissue demonstrated in the more than 100 year old specimen that virtual reconstruction can be performed in nearly the same way as in patients (contrast medium application not possible). The radiological reconstruction of the Munich fetus, here confined to head and neck data, is the basis for comparison with a number of imaging procedures of the last 3000 years. Starting with some Neolithic Mesoamerican ceramics, the "Pretty Ladies of Tlatilco", diprosopia triophthalmica was also depicted on engravings of the 16th and 17th century A.D. by artists as well as by the anatomist Soemmering and his engraver Berndt in the 18th century. Our modern spiral computer tomography confirms the ability of our ancestors to depict diprosopia triophthalmica in paintings and sculptures with a high level of natural precision.

  20. Initial results from the Donner 600 crystal positron tomograph

    International Nuclear Information System (INIS)

    Derenzo, S.E.; Huesman, R.H.; Cahoon, J.L.; Geyer, A.; Uber, D.; Vuletich, T.; Budinger, T.F.

    1986-10-01

    We describe a positron tomograph using a single ring of 600 close-packed 3 mm wide bismuth germanate (BGO) crystals coupled to 14 mm phototubes. The phototube preamplifier circuit derives a timing pulse from the first photoelectron, and sends it to address and coincidence circuits only if the integrated pulse height is within a pre-set window. The timing delays and pulse height windows for all 600 detectors and the coincidence timing windows are computer adjustable. An orbiting positron source is used for transmission measurements and a look-up table is used to reject scattered and random coincidences that do not pass through the source. Data can be acquired using a stationary mode for 1.57 mm lateral sampling or the two-position clam sampling mode for 0.79 mm lateral sampling. High maximum data rates are provided by 45 parallel coincidence circuits and 4 parallel histogram memory units. With two-position sampling and 1.57 mm bins, the reconstructed point spread function (PSF) of a 0.35 mm diam 22 Na wire source at the center of the tomograph is circular with 2.9 mm full-width at half-maximum (fwhm) and the PSF at a distance of 8 cm from the center is elliptical with a radial fwhm of 4.0 mm and tangential fwhm of 3.0 mm. 12 refs., 6 figs., 3 tabs

  1. Vertex Reconstruction for AEGIS’ FACT Detector

    CERN Document Server

    Themistokleous, Neofytos

    2017-01-01

    My project dealt with the development of a vertex reconstruction technique to discriminate antihydrogen from background signals in the AEGIS apparatus. It involved the creation of a Toy Monte-Carlo to simulate particle annihilation events, and a vertex reconstruction utility based on the Bayesian theory of probability. The first results, based on 107 generated events with a single track in the detector, are encouraging. For such events, the algorithm can reconstruct the z-coordinate accurately, while for the r-coordinate the result is less accurate.

  2. Monitoring of health of trees by gamma-ray tomographic scanners and the first Kanpur error theorem

    International Nuclear Information System (INIS)

    Verma, Ruchi; Razdan, Mayuri; Quraishi, A.M.; Munshi, Prabhat

    2004-01-01

    CT scanners nondestructively produce images of a given cross-section with the help of a radiation source-detector system and a suitable tomographic reconstruction algorithm. These CT images have an inherent error associated with them, and for unknown objects it is not possible to calculate it directly. Careful application of the first Kanpur theorem, however, gives an indirect estimate of the inaccuracy of these images. An interesting outcome of this theorem is the monitoring of the health of trees. (author)

  3. Tomographic bremsstrahlung imaging with yttrium-90 in the context of radioembolisation of liver tumors; Tomografische Bildgebung mit Yttrium-90-Bremsstrahlung im Rahmen der Radioembolisation von Lebertumoren

    Energy Technology Data Exchange (ETDEWEB)

    Grosser, Oliver Stephan

    2013-04-12

    The aim was to establish tomographic bremsstrahlung SPECT imaging (BSPECT) for the clinical validation of Selective Internal Radiotherapy (SIRT) with Yttrium-90 ({sup 90}Y) labelled microspheres. Various energy ranges (75 ± 3.8 keV; 135 ± 6.8 keV; 167 ± 8.4 keV) and the summation window were studied to see whether they were suitable for BSPECT. To this end, clinically available reconstruction techniques were analysed for their suitability for BSPECT. The tomographic examinations were performed on a cylindrical phantom filled with spheres of different diameters d = [28; 35; 40; 50; 60] mm in a non-active water-filled background. The spheres were filled with identical {sup 90}Y activity concentration (AC). Measurements were conducted at AC = [14.58; 5.20; 1.98; 0.66] MBq/cm{sup 3}. The BSPECT data were reconstructed with filtered back-projection (FBP), a 2D Ordered-Subset Expectation Maximisation algorithm (2D-OSEM) and a 3D Geometric Mean Algorithm (3D-GMA). Evaluation was made visually and on the basis of objective performance parameters such as contrast, signal-to-noise ratio (SNR) and image noise. While the 75 ± 3.8 keV window was identified as suitable for BSPECT, limitations were revealed as to the use of different implementations of the Point Spread Function (PSF). It was found for all reconstruction techniques that, at a given sphere diameter, there existed a linear relationship between the AC in the spheres and the reconstructed pulse rate per volume element. The recovery effect was verified for small spheres. The iterative techniques were found to be suitable for BSPECT at all AC. At low AC, the 3D-GMA exhibited the least noise and the highest SNR. The FBP turned out to be entirely inappropriate for BSPECT. The narrow energy window in which the bremsstrahlung interferes with the characteristic X-radiation of lead can be used for BSPECT. In this approach, the tomographic data reconstructed with different algorithms exhibited varying image quality, with the iterative

  4. X-ray differential phase-contrast tomographic reconstruction with a phase line integral retrieval filter

    International Nuclear Information System (INIS)

    Fu, Jian; Hu, Xinhua; Li, Chen

    2015-01-01

    We report an alternative reconstruction technique for x-ray differential phase-contrast computed tomography (DPC-CT). This approach is based on a new phase line integral projection retrieval filter, which is rooted in the derivative property of the Fourier transform and counteracts the differential nature of the DPC-CT projections. It first retrieves the phase line integrals from the DPC-CT projections. Then the standard filtered back-projection (FBP) algorithms popular in x-ray absorption-contrast CT are directly applied to the retrieved phase line integrals to reconstruct the DPC-CT images. Compared with the conventional DPC-CT reconstruction algorithms, the proposed method removes the Hilbert imaginary filter and allows for the direct use of absorption-contrast FBP algorithms. Consequently, FBP-oriented image processing techniques and reconstruction acceleration software that have already been successfully used in absorption-contrast CT can be directly adopted to improve the DPC-CT image quality and speed up the reconstruction.
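
    The retrieval-then-FBP idea can be sketched as follows, assuming ideal noise-free differential projections generated from a phantom and a recent scikit-image; the paper's exact filter implementation is not reproduced. The differential projection is divided by i2πf in the detector-coordinate Fourier domain, which undoes the derivative (the undetermined zero-frequency component is set to zero), and the result is fed to a standard absorption-type FBP.

```python
# Sketch of retrieval followed by FBP (ideal, noise-free differential projections
# made from a phantom; not the paper's exact filter): dividing by i*2*pi*f in the
# detector-coordinate Fourier domain undoes the derivative, the undetermined
# zero-frequency term is set to zero, and standard FBP is applied to the result.
import numpy as np
from skimage.data import shepp_logan_phantom
from skimage.transform import radon, iradon, rescale

phantom = rescale(shepp_logan_phantom(), 0.25)          # 100x100 test object
theta = np.linspace(0.0, 180.0, 180, endpoint=False)
sino = radon(phantom, theta=theta)
dpc_sino = np.gradient(sino, axis=0)                    # mimic differential phase projections

n_det = sino.shape[0]
f = np.fft.fftfreq(n_det)                               # cycles per detector pixel
integrator = np.zeros(n_det, dtype=complex)
nz = f != 0
integrator[nz] = 1.0 / (2j * np.pi * f[nz])             # inverse of the derivative operator

retrieved = np.real(np.fft.ifft(np.fft.fft(dpc_sino, axis=0) * integrator[:, None], axis=0))
recon = iradon(retrieved, theta=theta, filter_name="ramp")
print("correlation with the phantom:", np.corrcoef(recon.ravel(), phantom.ravel())[0, 1])
```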

  5. Theoretical Analysis of Penalized Maximum-Likelihood Patlak Parametric Image Reconstruction in Dynamic PET for Lesion Detection.

    Science.gov (United States)

    Yang, Li; Wang, Guobao; Qi, Jinyi

    2016-04-01

    Detecting cancerous lesions is a major clinical application of emission tomography. In a previous work, we studied penalized maximum-likelihood (PML) image reconstruction for lesion detection in static PET. Here we extend our theoretical analysis of static PET reconstruction to dynamic PET. We study both the conventional indirect reconstruction and direct reconstruction for Patlak parametric image estimation. In indirect reconstruction, Patlak parametric images are generated by first reconstructing a sequence of dynamic PET images, and then performing Patlak analysis on the time activity curves (TACs) pixel-by-pixel. In direct reconstruction, Patlak parametric images are estimated directly from raw sinogram data by incorporating the Patlak model into the image reconstruction procedure. PML reconstruction is used in both the indirect and direct reconstruction methods. We use a channelized Hotelling observer (CHO) to assess lesion detectability in Patlak parametric images. Simplified expressions for evaluating the lesion detectability have been derived and applied to the selection of the regularization parameter value to maximize detection performance. The proposed method is validated using computer-based Monte Carlo simulations. Good agreements between the theoretical predictions and the Monte Carlo results are observed. Both theoretical predictions and Monte Carlo simulation results show the benefit of the indirect and direct methods under optimized regularization parameters in dynamic PET reconstruction for lesion detection, when compared with the conventional static PET reconstruction.
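
    For reference, the indirect route described above amounts to a pixel-wise linear fit of the Patlak plot. The sketch below does this for a single synthetic voxel with made-up kinetic values; it is not the reconstruction code studied in the paper.

```python
# Single-voxel sketch of the indirect route (synthetic input function and made-up
# kinetic values, not the paper's data): reconstruct a time-activity curve, then
# fit the Patlak plot y = Ki * x + V with x(t) = int_0^t Cp dt / Cp(t) and
# y(t) = C_tissue(t) / Cp(t), using only the late (linear) frames.
import numpy as np

t = np.linspace(1.0, 60.0, 24)                       # frame mid-times (min)
cp = 10.0 * np.exp(-0.1 * t) + 1.0                   # synthetic plasma input function
Ki_true, V_true = 0.05, 0.3

# Trapezoidal integral of cp from time zero (cp assumed constant before the first frame).
cum_cp = np.concatenate(([0.0], np.cumsum(0.5 * (cp[1:] + cp[:-1]) * np.diff(t)))) + cp[0] * t[0]
tissue = Ki_true * cum_cp + V_true * cp              # irreversible-uptake voxel TAC

x = cum_cp / cp                                      # Patlak abscissa
y = tissue / cp                                      # Patlak ordinate
late = t > 20.0                                      # keep only the linear part of the plot
Ki_est, V_est = np.polyfit(x[late], y[late], 1)
print(f"estimated Ki = {Ki_est:.4f} (true {Ki_true}), V = {V_est:.3f} (true {V_true})")
```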

  6. Effects of small variations of speed of sound in optoacoustic tomographic imaging

    International Nuclear Information System (INIS)

    Deán-Ben, X. Luís; Ntziachristos, Vasilis; Razansky, Daniel

    2014-01-01

    Purpose: Speed of sound difference in the imaged object and surrounding coupling medium may reduce the resolution and overall quality of optoacoustic tomographic reconstructions obtained by assuming a uniform acoustic medium. In this work, the authors investigate the effects of acoustic heterogeneities and discuss potential benefits of accounting for those during the reconstruction procedure. Methods: The time shift of optoacoustic signals in an acoustically heterogeneous medium is studied theoretically by comparing different continuous and discrete wave propagation models. A modification of filtered back-projection reconstruction is subsequently implemented by considering a straight acoustic rays model for ultrasound propagation. The results obtained with this reconstruction procedure are compared numerically and experimentally to those obtained assuming a heuristically fitted uniform speed of sound in both full-view and limited-view optoacoustic tomography scenarios. Results: The theoretical analysis showcases that the errors in the time-of-flight of the signals predicted by considering the straight acoustic rays model tend to be generally small. When using this model for reconstructing simulated data, the resulting images accurately represent the theoretical ones. On the other hand, significant deviations in the location of the absorbing structures are found when using a uniform speed of sound assumption. The experimental results obtained with tissue-mimicking phantoms and a mouse postmortem are found to be consistent with the numerical simulations. Conclusions: Accurate analysis of effects of small speed of sound variations demonstrates that accounting for differences in the speed of sound allows improving optoacoustic reconstruction results in realistic imaging scenarios involving acoustic heterogeneities in tissues and surrounding media

  7. Application of tomographic techniques to two-dimensional surface analysis using the Harwell nuclear microprobe

    International Nuclear Information System (INIS)

    Huddleston, J.; Hutchinson, I.G.; Pierce, T.B.

    1983-01-01

    Nuclear methods of surface analysis are discussed briefly, and the circumstances are described in which a two-dimensional analysis of the sample surface is desirable to enable the surface composition to be mapped accurately. Tomographic techniques of data manipulation are outlined. Data acquisition in the present case is performed by moving the sample in a defined sequence of positions, at each of which analytical data are gathered by the proton microprobe. The method and equipment are outlined. Data processing leading to the reconstruction of the image is summarised. (U.K.)

  8. Formation of tomographic images with neutrons

    International Nuclear Information System (INIS)

    Duarte, A.; Tenreiro, C; Valencia, J; Steinman, G.; Henriquez, C

    2000-01-01

    The possibility of having a non-destructive method of analysis for archaeological and paleontological samples is of interest. A special group of fossil samples has come to our attention which, because of their value, should be preserved; therefore, the availability of an indirect, non-destructive, non-contaminating analytical technique is important. The strong absorption of the usual kinds of radiation by a fossilized sample restricts the application of conventional methods of analysis. A type of radiation that is not completely attenuated by thick samples, of sizes that are typical in paleontology, is necessary. Neutrons may be considered an ideal non-invasive probe, with the possibility of developing a technique for the formation and analysis of images. A technique has been developed for the spatial reconstruction of the contents of a fossilized sample (tomography) with neutrons, without touching or altering the sample in any way. The neutron beam was extracted from the RECH-1 reactor belonging to the CCHEN, La Reina. The tomographic images of the contents of a fossilized egg are presented for the first time; they represent views or cuts of the content, as well as a set that permits the three-dimensional reconstruction of the inside of the object and its subsequent animation in graphic format. This project developed a technique for taking neutron radiographs of this kind of sample, including the numerical algorithms and the treatment and formation of the images. (CW)

  9. A low error reconstruction method for confocal holography to determine 3-dimensional properties

    Energy Technology Data Exchange (ETDEWEB)

    Jacquemin, P.B., E-mail: pbjacque@nps.edu [Mechanical Engineering, University of Victoria, EOW 548,800 Finnerty Road, Victoria, BC (Canada); Herring, R.A. [Mechanical Engineering, University of Victoria, EOW 548,800 Finnerty Road, Victoria, BC (Canada)

    2012-06-15

    A confocal holography microscope developed at the University of Victoria uniquely combines holography with a scanning confocal microscope to non-intrusively measure fluid temperatures in three-dimensions (Herring, 1997), (Abe and Iwasaki, 1999), (Jacquemin et al., 2005). The Confocal Scanning Laser Holography (CSLH) microscope was built and tested to verify the concept of 3D temperature reconstruction from scanned holograms. The CSLH microscope used a focused laser to non-intrusively probe a heated fluid specimen. The focused beam probed the specimen instead of a collimated beam in order to obtain different phase-shift data for each scan position. A collimated beam produced the same information for scanning along the optical propagation z-axis. No rotational scanning mechanisms were used in the CSLH microscope which restricted the scan angle to the cone angle of the probe beam. Limited viewing angle scanning from a single view point window produced a challenge for tomographic 3D reconstruction. The reconstruction matrices were either singular or ill-conditioned making reconstruction with significant error or impossible. Establishing boundary conditions with a particular scanning geometry resulted in a method of reconstruction with low error referred to as 'wily'. The wily reconstruction method can be applied to microscopy situations requiring 3D imaging where there is a single viewpoint window, a probe beam with high numerical aperture, and specified boundary conditions for the specimen. The issues and progress of the wily algorithm for the CSLH microscope are reported herein. -- Highlights: ► Evaluation of an optical confocal holography device to measure 3D temperature of a heated fluid. ► Processing of multiple holograms containing the cumulative refractive index through the fluid. ► Reconstruction issues due to restricting angular scanning to the numerical aperture of the

  10. A low error reconstruction method for confocal holography to determine 3-dimensional properties

    International Nuclear Information System (INIS)

    Jacquemin, P.B.; Herring, R.A.

    2012-01-01

    A confocal holography microscope developed at the University of Victoria uniquely combines holography with a scanning confocal microscope to non-intrusively measure fluid temperatures in three-dimensions (Herring, 1997), (Abe and Iwasaki, 1999), (Jacquemin et al., 2005). The Confocal Scanning Laser Holography (CSLH) microscope was built and tested to verify the concept of 3D temperature reconstruction from scanned holograms. The CSLH microscope used a focused laser to non-intrusively probe a heated fluid specimen. The focused beam probed the specimen instead of a collimated beam in order to obtain different phase-shift data for each scan position. A collimated beam produced the same information for scanning along the optical propagation z-axis. No rotational scanning mechanisms were used in the CSLH microscope which restricted the scan angle to the cone angle of the probe beam. Limited viewing angle scanning from a single view point window produced a challenge for tomographic 3D reconstruction. The reconstruction matrices were either singular or ill-conditioned making reconstruction with significant error or impossible. Establishing boundary conditions with a particular scanning geometry resulted in a method of reconstruction with low error referred to as “wily”. The wily reconstruction method can be applied to microscopy situations requiring 3D imaging where there is a single viewpoint window, a probe beam with high numerical aperture, and specified boundary conditions for the specimen. The issues and progress of the wily algorithm for the CSLH microscope are reported herein. -- Highlights: ► Evaluation of an optical confocal holography device to measure 3D temperature of a heated fluid. ► Processing of multiple holograms containing the cumulative refractive index through the fluid. ► Reconstruction issues due to restricting angular scanning to the numerical aperture of the beam. ► Minimizing tomographic reconstruction error by defining boundary

  11. A Monte Carlo simulation of scattering reduction in spectral x-ray computed tomography

    DEFF Research Database (Denmark)

    Busi, Matteo; Olsen, Ulrik Lund; Bergbäck Knudsen, Erik

    2017-01-01

    In X-ray computed tomography (CT), scattered radiation plays an important role in the accurate reconstruction of the inspected object, leading to a loss of contrast between the different materials in the reconstruction volume and cupping artifacts in the images. We present a Monte Carlo simulation...

  12. Fast approach to evaluate MAP reconstruction for lesion detection and localization

    International Nuclear Information System (INIS)

    Qi, Jinyi; Huesman, Ronald H.

    2004-01-01

    Lesion detection is an important task in emission tomography. Localization ROC (LROC) studies are often used to analyze lesion detection and localization performance. Most researchers rely on Monte Carlo reconstruction samples to obtain LROC curves, which can be very time-consuming for iterative algorithms. In this paper we develop a fast approach to obtain LROC curves that does not require Monte Carlo reconstructions. We use a channelized Hotelling observer model to search for lesions, and the results can be easily extended to other numerical observers. We theoretically analyzed the mean and covariance of the observer output. Assuming the observer outputs are multivariate Gaussian random variables, an LROC curve can be directly generated by integrating the conditional probability density functions. The high-dimensional integrals are calculated using a Monte Carlo method. The proposed approach is very fast because no iterative reconstruction is involved. Computer simulations show that the results of the proposed method match well with those obtained using the traditional LROC analysis.
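
    The observer model at the heart of this approach can be sketched with synthetic channel outputs, as below; the paper's theoretical mean and covariance expressions are not reproduced, and the lesion-present channel means and covariance are made-up values.

```python
# Channelized Hotelling observer sketch with synthetic channel outputs (the means
# and covariance below are made-up values, not the paper's derived expressions).
# The observer template is w = S^-1 (m1 - m0); detectability follows from the
# separation of the two Gaussian distributions of the observer output.
import numpy as np
from math import erf

rng = np.random.default_rng(2)
n_ch, n_samp = 4, 2000
m0 = np.zeros(n_ch)                                # lesion-absent channel means
m1 = np.array([0.6, 0.4, 0.2, 0.1])                # lesion-present channel means
cov = 0.5 * np.eye(n_ch) + 0.1                     # common channel covariance

absent = rng.multivariate_normal(m0, cov, n_samp)
present = rng.multivariate_normal(m1, cov, n_samp)

w = np.linalg.solve(cov, m1 - m0)                  # Hotelling template
t0, t1 = absent @ w, present @ w                   # observer outputs under each hypothesis

snr = (t1.mean() - t0.mean()) / np.sqrt(0.5 * (t1.var() + t0.var()))
auc = 0.5 * (1.0 + erf(snr / 2.0))                 # ROC area for Gaussian outputs
print(f"observer SNR = {snr:.2f}, ROC area = {auc:.3f}")
```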

  13. Collimator performance evaluation by Monte-Carlo techniques

    International Nuclear Information System (INIS)

    Milanesi, L.; Bettinardi, V.; Bellotti, E.; Gilardi, M.C.; Todd-Pokropek, A.; Fazio, F.

    1985-01-01

    A computer program using Monte-Carlo techniques has been developed to simulate gamma camera collimator performance. Input data include hole length, septum thickness, hole size and shape, collimator material, source characteristics, source-to-collimator distance and medium, radiation energy, and total number of events. Agreement between Monte-Carlo simulations and experimental measurements was found for commercial hexagonal parallel-hole collimators in terms of septal penetration, transfer function and sensitivity. The method was then used to rationalize collimator design for tomographic brain studies. A radius of rotation of 15 cm was assumed. By keeping the resolution at 15 cm constant (FWHM = 1.3 cm), the SPECT response to a point source was obtained in a scattering medium for three theoretical collimators. Sensitivity was maximized in the first collimator, uniformity of resolution response in the third, while the second represented a trade-off between the two. The high-sensitivity design may be superior in the hot-spot and/or low-activity situation, while for distributed sources of high activity a uniform resolution response should be preferred. The method can be used to personalize collimator design to different clinical needs in SPECT.

  14. Shell model the Monte Carlo way

    International Nuclear Information System (INIS)

    Ormand, W.E.

    1995-01-01

    The formalism for the auxiliary-field Monte Carlo approach to the nuclear shell model is presented. The method is based on a linearization of the two-body part of the Hamiltonian in an imaginary-time propagator using the Hubbard-Stratonovich transformation. The foundation of the method, as applied to the nuclear many-body problem, is discussed. Topics presented in detail include: (1) the density-density formulation of the method, (2) computation of the overlaps, (3) the sign of the Monte Carlo weight function, (4) techniques for performing Monte Carlo sampling, and (5) the reconstruction of response functions from an imaginary-time auto-correlation function using MaxEnt techniques. Results obtained using schematic interactions, which have no sign problem, are presented to demonstrate the feasibility of the method, while an extrapolation method for realistic Hamiltonians is presented. In addition, applications at finite temperature are outlined

  15. Shell model the Monte Carlo way

    Energy Technology Data Exchange (ETDEWEB)

    Ormand, W.E.

    1995-03-01

    The formalism for the auxiliary-field Monte Carlo approach to the nuclear shell model is presented. The method is based on a linearization of the two-body part of the Hamiltonian in an imaginary-time propagator using the Hubbard-Stratonovich transformation. The foundation of the method, as applied to the nuclear many-body problem, is discussed. Topics presented in detail include: (1) the density-density formulation of the method, (2) computation of the overlaps, (3) the sign of the Monte Carlo weight function, (4) techniques for performing Monte Carlo sampling, and (5) the reconstruction of response functions from an imaginary-time auto-correlation function using MaxEnt techniques. Results obtained using schematic interactions, which have no sign problem, are presented to demonstrate the feasibility of the method, while an extrapolation method for realistic Hamiltonians is presented. In addition, applications at finite temperature are outlined.

  16. Three-dimensional multislice spiral computed tomographic angiography: a potentially useful tool for safer free tissue transfer to complicated regions

    DEFF Research Database (Denmark)

    Demirtas, Yener; Cifci, Mehmet; Kelahmetoglu, Osman

    2009-01-01

    Three-dimensional multislice spiral computed tomographic angiography (3D-MSCTA) is a minimally invasive method of vascular mapping. The aim of this study was to evaluate the clinical usefulness of this imaging technique in delineating the recipient vessels for safer free tissue transfer to complicated regions. ... be kept in mind, especially in the patients with peripheral vascular disease. 3D-MSCTA has the potential to replace digital subtraction angiography for planning of microvascular reconstructions, and newer devices with higher resolutions will probably increase the reliability of this technique.

  17. Tomographic phase analysis to detect the site of accessory conduction pathway in Wolff-Parkinson-White syndrome

    International Nuclear Information System (INIS)

    Nakajima, K.; Bunko, H.; Tada, A.; Tonami, N.; Taki, J.; Nanbu, I.; Hisada, K.; Misaki, T.; Iwa, T.

    1984-01-01

    Phase analysis has been applied to Wolff-Parkinson-White syndrome (WPW) to detect the site of the accessory conduction pathway (ACP); however, planar phase analysis is limited in estimating the precise location of the ACP. In this study, the authors applied phase analysis to gated blood pool tomography. Twelve patients with WPW who underwent epicardial mapping and surgical division of the ACP were studied by both gated emission computed tomography (GECT) and a routine gated blood pool study (GBPS). The GBPS was performed with Tc-99m red blood cells in multiple projections: modified left anterior oblique, right anterior oblique and/or left lateral views. In GECT, short-axis, horizontal and vertical long-axis blood pool images were reconstructed. Phase analysis was performed using the fundamental frequency of the Fourier transform in both GECT and GBPS images, and abnormal initial contractions on both the planar and tomographic phase analyses were compared with the location of the surgically confirmed ACPs. In planar phase analysis, an abnormal initial phase was identified in 7 out of 12 (58%) patients, while in tomographic phase analysis the localization of the ACP was predicted in 11 out of 12 (92%) patients. Tomographic phase analysis was superior to planar phase imaging in 8 out of 12 patients for estimating the location of the ACP. Phase analysis by GECT avoids the overlap of blood pools in the cardiac chambers and has the advantage of identifying the propagation of phase three-dimensionally. Tomographic phase analysis is a good adjunctive method for estimating the site of the ACP in patients with WPW.
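
    As an illustration of the phase computation described above (first Fourier harmonic of a gated time-activity curve), here is a minimal sketch in Python; the function name and the phase convention are assumptions, not the authors' implementation:

    ```python
    import numpy as np

    def first_harmonic_phase(tac):
        """Phase and amplitude of the fundamental Fourier frequency of a gated
        time-activity curve (one cardiac cycle, N frames).  Hypothetical helper
        illustrating first-harmonic phase analysis."""
        tac = np.asarray(tac, dtype=float)
        n = tac.size
        c1 = np.fft.fft(tac)[1] / n                    # fundamental-frequency coefficient
        amplitude = 2.0 * np.abs(c1)
        phase_deg = np.degrees(np.angle(c1)) % 360.0   # onset-of-contraction phase
        return phase_deg, amplitude

    # usage: apply pixel by pixel to the gated blood-pool series, then display
    # the phase image to look for an abnormally early (pre-excited) region
    ```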

  18. Tomographic phase analysis to detect the site of accessory conduction pathway in Wolff-Parkinson-White syndrome

    Energy Technology Data Exchange (ETDEWEB)

    Nakajima, K.; Bunko, H.; Tada, A.; Tonami, N.; Taki, J.; Nanbu, I.; Hisada, K.; Misaki, T.; Iwa, T.

    1984-01-01

    Phase analysis has been applied to Wolff-Parkinson-White syndrome (WPW) to detect the site of the accessory conduction pathway (ACP); however, planar phase analysis is limited in estimating the precise location of the ACP. In this study, the authors applied phase analysis to gated blood pool tomography. Twelve patients with WPW who underwent epicardial mapping and surgical division of the ACP were studied by both gated emission computed tomography (GECT) and a routine gated blood pool study (GBPS). The GBPS was performed with Tc-99m red blood cells in multiple projections: modified left anterior oblique, right anterior oblique and/or left lateral views. In GECT, short-axis, horizontal and vertical long-axis blood pool images were reconstructed. Phase analysis was performed using the fundamental frequency of the Fourier transform in both GECT and GBPS images, and abnormal initial contractions on both the planar and tomographic phase analyses were compared with the location of the surgically confirmed ACPs. In planar phase analysis, an abnormal initial phase was identified in 7 out of 12 (58%) patients, while in tomographic phase analysis the localization of the ACP was predicted in 11 out of 12 (92%) patients. Tomographic phase analysis was superior to planar phase imaging in 8 out of 12 patients for estimating the location of the ACP. Phase analysis by GECT avoids the overlap of blood pools in the cardiac chambers and has the advantage of identifying the propagation of phase three-dimensionally. Tomographic phase analysis is a good adjunctive method for estimating the site of the ACP in patients with WPW.

  19. Tomographical properties of uniformly redundant arrays

    International Nuclear Information System (INIS)

    Cannon, T.M.; Fenimore, E.E.

    1978-01-01

    Recent work in coded aperture imaging has shown that the uniformly redundant array (URA) can image distant planar radioactive sources with no artifacts. The performance of two URA apertures when used in a close-up tomographic imaging system is investigated. It is shown that a URA based on m sequences is superior to one based on quadratic residues. The m sequence array not only produces less obnoxious artifacts in tomographic imaging, but is also more resilient to some described detrimental effects of close-up imaging. It is shown that in spite of these close-up effects, tomographic depth resolution increases as the source is moved closer to the detector

  20. Tomographic reconstruction of neopterous carboniferous insect nymphs.

    Directory of Open Access Journals (Sweden)

    Russell Garwood

    Two new polyneopteran insect nymphs from the Montceau-les-Mines Lagerstätte of France are presented. Both are preserved in three dimensions, and are imaged with the aid of X-ray micro-tomography, allowing their morphology to be recovered in unprecedented detail. One, Anebos phrixos gen. et sp. nov., is of uncertain affinities, and preserves portions of the antennae and eyes, coupled with a heavily spined habitus. The other is a roachoid with long antennae and chewing mouthparts very similar in form to the most generalized mandibulate mouthparts of extant orthopteroid insects. Computer reconstructions reveal limbs in both specimens, allowing identification of the segments and annulation in the tarsus, while poorly developed thoracic wing pads suggest both are young instars. This work describes the morphologically best-known Palaeozoic insect nymphs, allowing a better understanding of the juveniles' palaeobiology and palaeoecology. We also consider the validity of evidence from Palaeozoic juvenile insects in wing origin theories. The study of juvenile Palaeozoic insects is currently a neglected field, yet these fossils provide direct evidence on the evolution of insect development. It is hoped this study will stimulate a renewed interest in such work.

  1. A new approach for quantitative evaluation of reconstruction algorithms in SPECT

    International Nuclear Information System (INIS)

    Raeisi, E.; Rajabi, H.; Aghamiri, S. M. R.

    2006-01-01

    In nuclear medicine, phantoms are mainly used to evaluate the overall performance of imaging systems, and practically no phantom is exclusively designed for the evaluation of software performance. In this study the Hoffman brain phantom was used for quantitative evaluation of reconstruction techniques. The phantom was modified to acquire tomographic and planar images of the same structure. The planar image may be used as the reference image to evaluate the quality of the reconstructed slices, using companion software developed in MATLAB. Materials and Methods: The designed phantom was composed of 4 independent 2D slices that could be placed juxtaposed to form the 3D phantom. Each slice was composed of objects of different size and shape (for example: circle, triangle, and rectangle). Each 2D slice was imaged at distances ranging from 0 to 15 cm from the collimator surface. The phantom in its 3D configuration was imaged by acquiring 128 views of 128*128 matrix size. Reconstruction was performed using different filtering conditions and the reconstructed images were compared to the corresponding planar images. The modulation transfer function, scatter fraction and attenuation map were calculated for each reconstructed image. Results: Since all the acquisition parameters were identical for the 2D and the 3D imaging, it was assumed that the difference in the quality of the images was exclusively due to the reconstruction conditions. The planar images were assumed to be the most perfect images that could be obtained with the system. The comparison of the reconstructed slices with the corresponding planar images yielded the optimum reconstruction condition. The results clearly showed that the Wiener filter yields superior image quality among all the tested filters. The extent of the improvement was quantified in terms of the universal image quality index. Conclusion: The phantom and the accompanying software were evaluated and found to be quite useful in
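
    The universal image quality index used above to quantify the improvement is commonly computed with the Wang-Bovik formulation; a minimal, global (single-window) sketch, assuming that formulation and hypothetical array inputs:

    ```python
    import numpy as np

    def universal_image_quality_index(x, y):
        """Wang-Bovik universal image quality index between a reference image x
        and a test image y (values in [-1, 1]; 1 means identical images)."""
        x = np.asarray(x, dtype=float).ravel()
        y = np.asarray(y, dtype=float).ravel()
        mx, my = x.mean(), y.mean()
        vx, vy = x.var(), y.var()
        cov = ((x - mx) * (y - my)).mean()
        return 4.0 * cov * mx * my / ((vx + vy) * (mx**2 + my**2))
    ```

    In practice the index is usually evaluated over small sliding windows and averaged; that is a straightforward extension of this sketch.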

  2. Monte Carlo electron-trajectory simulations in bright-field and dark-field STEM: Implications for tomography of thick biological sections

    Energy Technology Data Exchange (ETDEWEB)

    Sousa, A.A.; Hohmann-Marriott, M.F.; Zhang, G. [Laboratory of Bioengineering and Physical Science, National Institute of Biomedical Imaging and Bioengineering, National Institutes of Health, Bldg. 13, Rm. 3N17, 13 South Drive, Bethesda, MD 20892-5766 (United States); Leapman, R.D. [Laboratory of Bioengineering and Physical Science, National Institute of Biomedical Imaging and Bioengineering, National Institutes of Health, Bldg. 13, Rm. 3N17, 13 South Drive, Bethesda, MD 20892-5766 (United States)], E-mail: leapmanr@mail.nih.gov

    2009-02-15

    A Monte Carlo electron-trajectory calculation has been implemented to assess the optimal detector configuration for scanning transmission electron microscopy (STEM) tomography of thick biological sections. By modeling specimens containing 2 and 3 at% osmium in a carbon matrix, it was found that for 1-μm-thick samples the bright-field (BF) and annular dark-field (ADF) signals give similar contrast and signal-to-noise ratio provided the ADF inner angle and BF outer angle are chosen optimally. Spatial resolution in STEM imaging of thick sections is compromised by multiple elastic scattering which results in a spread of scattering angles and thus a spread in lateral distances of the electrons leaving the bottom surface. However, the simulations reveal that a large fraction of these multiply scattered electrons are excluded from the BF detector, which results in higher spatial resolution in BF than in high-angle ADF images for objects situated towards the bottom of the sample. The calculations imply that STEM electron tomography of thick sections should be performed using a BF rather than an ADF detector. This advantage was verified by recording simultaneous BF and high-angle ADF STEM tomographic tilt series from a stained 600-nm-thick section of C. elegans. It was found that loss of spatial resolution occurred markedly at the bottom surface of the specimen in the ADF STEM but significantly less in the BF STEM tomographic reconstruction. Our results indicate that it might be feasible to use BF STEM tomography to determine the 3D structure of whole eukaryotic microorganisms prepared by freeze-substitution, embedding, and sectioning.

  3. Dual-Source Swept-Source Optical Coherence Tomography Reconstructed on Integrated Spectrum

    Directory of Open Access Journals (Sweden)

    Shoude Chang

    2012-01-01

    Dual-source swept-source optical coherence tomography (DS-SSOCT) has two individual sources with different central wavelengths, linewidths, and bandwidths. Because of the differences between the two sources, the tomograms reconstructed individually from each source have different aspect ratios, which makes comparison and integration difficult. We report a method to merge two sets of DS-SSOCT raw data onto a common spectrum, on which both data sets have the same spectral density and a correct separation. The reconstructed tomographic image can seamlessly integrate the two bands of OCT data together. The final image has higher axial resolution and richer spectroscopic information than either of the individually reconstructed tomographic images.
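
    A minimal sketch of the kind of spectral merging described above, assuming linear-in-wavenumber resampling of both bands onto one common grid with the inter-band gap left empty; the variable names and the simple zero-filled gap are illustrative assumptions, not the authors' algorithm:

    ```python
    import numpy as np

    def merge_dual_band_alines(k1, s1, k2, s2, n_fft=4096):
        """Place two swept-source interferogram bands, sampled at wavenumbers
        k1 and k2, onto one common equally spaced spectrum with the correct
        separation, then reconstruct a single depth profile (A-line)."""
        k1, s1, k2, s2 = (np.asarray(a, dtype=float) for a in (k1, s1, k2, s2))
        k = np.linspace(min(k1.min(), k2.min()), max(k1.max(), k2.max()), n_fft)
        s = np.zeros(n_fft)
        for kb, sb in ((k1, s1), (k2, s2)):
            sel = (k >= kb.min()) & (k <= kb.max())
            s[sel] = np.interp(k[sel], kb, sb)   # requires kb sorted ascending
        return np.abs(np.fft.ifft(s))            # merged-band A-line
    ```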

  4. Work in progress. Flashing tomosynthesis: a tomographic technique for quantitative coronary angiography

    International Nuclear Information System (INIS)

    Woelke, H.; Hanrath, P.; Schlueter, M.; Bleifeld, W.; Klotz, E.; Weiss, H.; Waller, D.; von Weltzien, J.

    1982-01-01

    Flashing tomosynthesis, a procedure that consists of a recording step and a reconstruction step, facilitates the tomographic imaging of coronary arteries. In a comparative study 10 postmortem coronary arteriograms were examined with 35-mm cine technique and with flashing tomosynthesis. The degrees of stenosis found with both of these techniques were compared with morphometrically obtained values. A higher correlation coefficient existed for the degrees of stenosis obtained with tomosynthesis and morphometry (r=0.92, p<0.001, SEE=9%) than for those obtained with cine technique and morphometry (r=0.82, p<0.001, SEE=16%). The technique has also been successfully carried out in 5 patients with coronary artery disease

  5. Application of reconstructive tomography to the measurement of density distribution in two-phase flow

    International Nuclear Information System (INIS)

    Fincke, J.R.; Berggren, M.J.; Johnson, S.A.

    1980-01-01

    The technique of reconstructive tomography has been applied to the measurement of average density and density distribution in multiphase flows. The technique of reconstructive tomography provides a model independent method of obtaining flow field density information. The unique features of interest in application of a practical tomographic densitometer system are the limited number of data values and the correspondingly coarse reconstruction grid (0.5 by 0.5 cm). These features were studied both experimentally, through the use of prototype hardware on a 3-in. pipe, and analytically, through computer generation of simulated data. Prototypical data were taken on phantoms constructed of Plexiglas and laminated Plexiglas, wood, and polyurethane foam. Reconstructions obtained from prototype data were compared with reconstructions from the simulated data

  6. Implementation of a Monte Carlo simulation environment for fully 3D PET on a high-performance parallel platform

    CERN Document Server

    Zaidi, H; Morel, Christian

    1998-01-01

    This paper describes the implementation of the Eidolon Monte Carlo program designed to simulate fully three-dimensional (3D) cylindrical positron tomographs on a MIMD parallel architecture. The original code was written in Objective-C and developed under the NeXTSTEP development environment. Different steps involved in porting the software on a parallel architecture based on PowerPC 604 processors running under AIX 4.1 are presented. Basic aspects and strategies of running Monte Carlo calculations on parallel computers are described. A linear decrease of the computing time was achieved with the number of computing nodes. The improved time performances resulting from parallelisation of the Monte Carlo calculations makes it an attractive tool for modelling photon transport in 3D positron tomography. The parallelisation paradigm used in this work is independent from the chosen parallel architecture

  7. Tomographic scanning apparatus

    International Nuclear Information System (INIS)

    1981-01-01

    This patent specification relates to a tomographic scanning apparatus using a fan beam and digital output signal, and particularly to the design of the gas-pressurized ionization detection system. (U.K.)

  8. Theoretical evaluation of the detectability of random lesions in bayesian emission reconstruction

    International Nuclear Information System (INIS)

    Qi, Jinyi

    2003-01-01

    Detecting cancerous lesions is an important task in positron emission tomography (PET). Bayesian methods based on the maximum a posteriori principle (also called penalized maximum likelihood methods) have been developed to deal with the low signal-to-noise ratio in the emission data. Similar to the filter cut-off frequency in the filtered backprojection method, the prior parameters in Bayesian reconstruction control the resolution-noise trade-off and hence affect the detectability of lesions in reconstructed images. Bayesian reconstructions are difficult to analyze because the resolution and noise properties are nonlinear and object-dependent. Most research has been based on Monte Carlo simulations, which are very time consuming. Building on recent progress in the theoretical analysis of the image properties of statistical reconstructions and the development of numerical observers, here we develop a theoretical approach for fast computation of lesion detectability in Bayesian reconstruction. The results can be used to choose the optimum hyperparameter for maximum lesion detectability. New in this work is the use of theoretical expressions that explicitly model the statistical variation of the lesion and background without assuming that the object variation is (locally) stationary. The theoretical results are validated using Monte Carlo simulations. The comparisons show good agreement between the theoretical predictions and the Monte Carlo results.
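
    A generic form of the penalized (MAP) objective referred to above, written with a Poisson log-likelihood, a system matrix A, and a prior energy U weighted by the hyperparameter β (the specific prior used in this work is not spelled out here):

    $$
    \hat{x}=\arg\max_{x\ge 0}\left\{\sum_{i}\Big(y_{i}\,\log [Ax]_{i}-[Ax]_{i}\Big)-\beta\,U(x)\right\}
    $$

    Larger β gives smoother, lower-noise images at lower resolution, which is exactly the trade-off the detectability analysis is meant to optimize.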

  9. Three-dimensional image reconstruction. I. Determination of pattern orientation

    International Nuclear Information System (INIS)

    Blankenbecler, Richard

    2004-01-01

    The problem of determining the Euler angles of a randomly oriented three-dimensional (3D) object from its 2D Fraunhofer diffraction patterns is discussed. This problem arises in the reconstruction of a positive semidefinite 3D object using oversampling techniques. In such a problem, the data consist of a measured set of magnitudes from 2D tomographic images of the object at several unknown orientations. After the orientation angles are determined, the object itself can then be reconstructed by a variety of methods using oversampling, the magnitude data from the 2D images, physical constraints on the image, and then iteration to determine the phases

  10. Emerging tomographic methods within the petroleum industry

    International Nuclear Information System (INIS)

    Johansen, Geir Anton

    2013-01-01

    Since industrial process tomography was introduced as a concept almost two decades ago, the considerable progress within a large variety of sensing modalities has to a large extent been technology driven. Industrial tomography applications may be divided into three categories: 1) Laboratory systems, 2) Field equipment for diagnostics and mapping purposes, and 3) Permanently installed systems. Examples on emerging methods on all categories will be presented, either from R and D at the University of Bergen and/or our industrial partners. Most developments are within the first category, where tomographs are used to provide better understanding of various processes such as pipe flow, separators, mixers and reactors. Here tomographic data is most often used to provide better process knowledge, for reference measurements and validation and development of process models, and finally for development for instruments and process equipment. The requirement here may be either high spatial resolution or high temporal resolution, or combinations of these. Tomographic field measurements are applied to either to inspect processes or equipment on a regular base or at faulty or irregular operation, or to map multicomponent systems such petroleum reservoirs, their structure and the distribution gas, oil and water within them. The latter will only be briefly touched upon here. Tomographic methods are increasingly being used for process and equipment diagnostics. The requirements vary and solutions based on repetition of single measurements, such as in column scanning, to full tomographic systems where there is sufficiently space or access. The third category is tomographic instruments that are permanently installed in situ in a process. These need not provide full tomographic images and instruments with fewer views are often preferred to reduce complexity and increase the instrument reliability. (author)

  11. Amplitude-based data selection for optimal retrospective reconstruction in micro-SPECT

    Science.gov (United States)

    Breuilly, M.; Malandain, G.; Guglielmi, J.; Marsault, R.; Pourcher, T.; Franken, P. R.; Darcourt, J.

    2013-04-01

    Respiratory motion can blur the tomographic reconstruction of positron emission tomography or single-photon emission computed tomography (SPECT) images, which subsequently impairs quantitative measurements, e.g. in the upper abdomen area. Respiratory signal phase-based gated reconstruction addresses this problem, but deteriorates the signal-to-noise ratio (SNR) and other intensity-based quality measures. This paper proposes a 3D reconstruction method dedicated to micro-SPECT imaging of mice. From a 4D acquisition, the phase images exhibiting motion are identified and the associated list-mode data are discarded, which enables the reconstruction of a 3D image without respiratory artefacts. The proposed method allows a motion-free reconstruction exhibiting both satisfactory count statistics and accuracy of measures. With respect to standard 3D reconstruction (non-gated 3D reconstruction) without breathing motion correction, an increase of 14.6% of the mean standardized uptake value has been observed, while, with respect to a gated 4D reconstruction, up to 60% less noise and an increase of up to 124% of the SNR have been demonstrated.
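
    The selection step can be pictured with a short sketch: keep only the respiratory bins whose surrogate signal indicates little motion and pool their projection data into a single 3D data set. The function name, the thresholding rule and pooling by summation are assumptions for illustration only:

    ```python
    import numpy as np

    def select_quiescent_sinograms(gated_sinograms, resp_amplitude, threshold):
        """Amplitude-based data selection: keep only respiratory phases whose
        surrogate amplitude stays below a threshold (near end-expiration) and
        pool their projections into one motion-reduced sinogram."""
        gated_sinograms = np.asarray(gated_sinograms)   # shape: (n_phases, ...)
        resp_amplitude = np.asarray(resp_amplitude)     # one value per phase
        keep = resp_amplitude < threshold
        return gated_sinograms[keep].sum(axis=0), keep  # reconstruct this 3D data set
    ```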

  12. Comparison of kinetic models for data from a positron emission tomograph

    International Nuclear Information System (INIS)

    Coxson, P.G.; Huesman, R.H.; Lim, S.; Klein, G.J.; Reutter, B.W.; Budinger, T.F.

    1995-01-01

    The purpose of this research was to compare a physiological model of 82Rb in the myocardium with two reduced order models with regard to their ability to assess physiological parameters of diagnostic significance. A three compartment physiological model of 82Rb uptake in the myocardium was used to simulate kinetic region of interest data from a positron emission tomograph (PET). Simulations were generated for eight different blood flow rates reflecting the physiological range of interest. Two reduced order models which are commonly used with myocardial PET studies were fit to the simulated data and the parameters of the reduced order models were compared with the physiological parameters. Then all three models were fit to the simulated data with noise added. Monte Carlo simulations were used to evaluate and compare the diagnostic utility of the reduced order models
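
    For reference, a generic two-tissue-compartment formulation of such kinetic models is shown below; the actual three-compartment 82Rb model and the reduced-order models compared in this record may use different state variables and rate constants:

    $$
    \frac{dC_{1}}{dt}=K_{1}\,C_{p}(t)-(k_{2}+k_{3})\,C_{1}(t)+k_{4}\,C_{2}(t),
    \qquad
    \frac{dC_{2}}{dt}=k_{3}\,C_{1}(t)-k_{4}\,C_{2}(t),
    $$

    with the region-of-interest signal modeled as $C_{\mathrm{ROI}}(t)=(1-f_{v})\,[C_{1}(t)+C_{2}(t)]+f_{v}\,C_{p}(t)$, where $C_{p}$ is the arterial input function and $f_{v}$ the vascular fraction.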

  13. Tomographic scanning apparatus

    International Nuclear Information System (INIS)

    1981-01-01

    Details are presented of a tomographic scanning apparatus, its rotational assembly, and the control and circuit elements, with particular reference to the amplifier and multiplexing circuits enabling detector signal calibration. (U.K.)

  14. Tomographic Small-Animal Imaging Using a High-Resolution Semiconductor Camera

    Science.gov (United States)

    Kastis, GA; Wu, MC; Balzer, SJ; Wilson, DW; Furenlid, LR; Stevenson, G; Barber, HB; Barrett, HH; Woolfenden, JM; Kelly, P; Appleby, M

    2015-01-01

    We have developed a high-resolution, compact semiconductor camera for nuclear medicine applications. The modular unit has been used to obtain tomographic images of phantoms and mice. The system consists of a 64 x 64 CdZnTe detector array and a parallel-hole tungsten collimator mounted inside a 17 cm x 5.3 cm x 3.7 cm tungsten-aluminum housing. The detector is a 2.5 cm x 2.5 cm x 0.15 cm slab of CdZnTe connected to a 64 x 64 multiplexer readout via indium-bump bonding. The collimator is 7 mm thick, with a 0.38 mm pitch that matches the detector pixel pitch. We obtained a series of projections by rotating the object in front of the camera. The axis of rotation was vertical and about 1.5 cm away from the collimator face. Mouse holders were made out of acrylic plastic tubing to facilitate rotation and the administration of gas anesthetic. Acquisition times were varied from 60 sec to 90 sec per image for a total of 60 projections at an equal spacing of 6 degrees between projections. We present tomographic images of a line phantom and mouse bone scan and assess the properties of the system. The reconstructed images demonstrate spatial resolution on the order of 1–2 mm. PMID:26568676

  15. Development of a portable computed tomographic scanner for on-line imaging of industrial piping systems

    International Nuclear Information System (INIS)

    Jaafar Abdullah; Mohd Arif Hamzah; Mohd Soyapi Mohd Yusof; Mohd Fitri Abdul Rahman; Fadil IsmaiI; Rasif Mohd Zain

    2003-01-01

    Computed tomography (CT) technology is being increasingly developed for industrial application. This paper presents the development of a portable computed tomographic scanner for on-line imaging of industrial piping systems. The theoretical approach, the system hardware, the data acquisition system and the adopted algorithm for image reconstruction are discussed. The scanner has large potential to be used to determine the extent of corrosion under insulation (CUI), to detect blockages, to measure the thickness of deposit/materials built-up on the walls and to improve understanding of material flow in pipelines. (Author)

  16. Computer tomographic diagnosis of echinococcosis

    Energy Technology Data Exchange (ETDEWEB)

    Haertel, M.; Fretz, C.; Fuchs, W.A.

    1980-08-01

    The computer tomographic appearances and differential diagnosis in 22 patients with echinococcosis are described; of these, twelve were of the cystic and ten of the alveolar type. The computer tomographic appearances are characterised by the presence of daughter cysts (66%) within the sharply demarcated parasitic cyst of water density. In the absence of daughter cysts, a definite aetiological diagnosis cannot be made, although there is a tendency to calcification of the occasionally multiple echinococcus cysts. The computer tomographic appearances of advanced alveolar echinococcosis are characterised by partial colliquative necrosis, with calcification around the necrotic areas (90%). The absence of CT evidence of partial necrosis and calcification of the pseudotumour makes it difficult to establish a specific diagnosis. The conclusive and non-invasive character of the procedure and its reproducibility make computer tomography the method of choice for the diagnosis and follow-up of echinococcosis.

  17. Pipelining Computational Stages of the Tomographic Reconstructor for Multi-Object Adaptive Optics on a Multi-GPU System

    KAUST Repository

    Charara, Ali

    2014-11-01

    The European Extremely Large Telescope project (E-ELT) is one of Europe's highest priorities in ground-based astronomy. ELTs are built on top of a variety of highly sensitive and critical astronomical instruments. In particular, a new instrument called MOSAIC has been proposed to perform multi-object spectroscopy using the Multi-Object Adaptive Optics (MOAO) technique. The core implementation of the simulation lies in the intensive computation of a tomographic reconstructor (TR), which is used to drive the deformable mirror in real time from the measurements. A new numerical algorithm is proposed (1) to capture the actual experimental noise and (2) to substantially speed up previous implementations by exposing more concurrency, while reducing the number of floating-point operations. Based on the Matrices Over Runtime System at Exascale numerical library (MORSE), a dynamic scheduler drives all computational stages of the tomographic reconstructor simulation and allows to pipeline and to run tasks out-of-order across different stages on heterogeneous systems, while ensuring data coherency and dependencies. The proposed TR simulation outperforms asymptotically previous state-of-the-art implementations up to 13-fold speedup. At more than 50000 unknowns, this appears to be the largest-scale AO problem submitted to computation, to date, and opens new research directions for extreme scale AO simulations. © 2014 IEEE.

  18. Three dimensional reconstruction of computed tomographic images by computer graphics method

    International Nuclear Information System (INIS)

    Kashiwagi, Toru; Kimura, Kazufumi.

    1986-01-01

    A three dimensional computer reconstruction system for CT images has been developed in a commonly used radionuclide data processing system using a computer graphics technique. The three dimensional model was constructed from organ surface information of CT images (slice thickness: 5 or 10 mm). Surface contours of the organs were extracted manually from a set of parallel transverse CT slices in serial order and stored in the computer memory. Interpolation was made between a set of the extracted contours by cubic spline functions, then three dimensional models were reconstructed. The three dimensional images were displayed as wire-frame and/or solid models on the color CRT. Solid model images were obtained as follows. The organ surface constructed from contours was divided into many triangular patches. The intensity of light on each patch was calculated from the direction of the incident light, the eye position and the normal to the triangular patch. Firstly, this system was applied to a liver phantom. Reconstructed images of the liver phantom were coincident with the actual object. The system has also been applied to various human organs such as brain, lung, liver, etc. The anatomical organ surface could be realistically viewed from any direction. The images make the location and configuration of organs in vivo easier to understand than the original CT images. Furthermore, the spatial relationship among organs and/or lesions was clearly shown by superimposition of wire-frame and/or differently colored solid models. Therefore, it is expected that this system is clinically useful for evaluating patho-morphological changes in broad perspective. (author)
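
    The shading rule described for the solid-model display (intensity from the direction of the incident light and the normal of each triangular patch) amounts to Lambertian shading; a minimal sketch, with hypothetical names and a simple ambient term added for illustration:

    ```python
    import numpy as np

    def patch_intensity(v0, v1, v2, light_dir, ambient=0.1):
        """Lambertian intensity of one triangular surface patch defined by
        vertices v0, v1, v2, lit from direction light_dir."""
        n = np.cross(np.asarray(v1, float) - v0, np.asarray(v2, float) - v0)
        n = n / np.linalg.norm(n)                       # patch normal
        l = np.asarray(light_dir, dtype=float)
        l = l / np.linalg.norm(l)                       # unit light direction
        return ambient + (1.0 - ambient) * max(0.0, float(np.dot(n, l)))
    ```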

  19. Use of a hybrid iterative reconstruction technique to reduce image noise and improve image quality in obese patients undergoing computed tomographic pulmonary angiography.

    Science.gov (United States)

    Kligerman, Seth; Mehta, Dhruv; Farnadesh, Mahmmoudreza; Jeudy, Jean; Olsen, Kathryn; White, Charles

    2013-01-01

    To determine whether an iterative reconstruction (IR) technique (iDose, Philips Healthcare) can reduce image noise and improve image quality in obese patients undergoing computed tomographic pulmonary angiography (CTPA). The study was Health Insurance Portability and Accountability Act compliant and approved by our institutional review board. A total of 33 obese patients (average body mass index: 42.7) underwent CTPA studies following standard departmental protocols. The data were reconstructed with filtered back projection (FBP) and 3 iDose strengths (iDoseL1, iDoseL3, and iDoseL5) for a total of 132 studies. FBP data were collected from 33 controls (average body mass index: 22) undergoing CTPA. Regions of interest were drawn at 6 identical levels in the pulmonary artery (PA), from the main PA to a subsegmental branch, in both the control group and study groups using each algorithm. Noise and attenuation were measured at all PA levels. Three thoracic radiologists graded each study on a scale of 1 (very poor) to 5 (ideal) in 4 categories: image quality, noise, PA enhancement, and "plastic" appearance. Statistical analysis was performed using an unpaired t test, 1-way analysis of variance, and linear weighted κ. Compared with the control group, noise was significantly higher with the FBP, iDoseL1, and iDoseL3 algorithms, but did not differ significantly between the control group and the iDoseL5 algorithm in the study group. Analysis within the study group showed a significant and progressive decrease in noise and increase in the contrast-to-noise ratio as the level of IR was increased, with corresponding improvements in the graded scores for image quality, noise and PA enhancement at increasing levels of iDose. The use of an IR technique leads to qualitative and quantitative improvements in image noise and image quality in obese patients undergoing CTPA.

  20. Enhanced imaging of microcalcifications in digital breast tomosynthesis through improved image-reconstruction algorithms

    International Nuclear Information System (INIS)

    Sidky, Emil Y.; Pan Xiaochuan; Reiser, Ingrid S.; Nishikawa, Robert M.; Moore, Richard H.; Kopans, Daniel B.

    2009-01-01

    Purpose: The authors develop a practical, iterative algorithm for image-reconstruction in undersampled tomographic systems, such as digital breast tomosynthesis (DBT). Methods: The algorithm controls image regularity by minimizing the image total p variation (TpV), a function that reduces to the total variation when p=1.0 or the image roughness when p=2.0. Constraints on the image, such as image positivity and estimated projection-data tolerance, are enforced by projection onto convex sets. The fact that the tomographic system is undersampled translates to the mathematical property that many widely varied resultant volumes may correspond to a given data tolerance. Thus the application of image regularity serves two purposes: (1) Reduction in the number of resultant volumes out of those allowed by fixing the data tolerance, finding the minimum image TpV for fixed data tolerance, and (2) traditional regularization, sacrificing data fidelity for higher image regularity. The present algorithm allows for this dual role of image regularity in undersampled tomography. Results: The proposed image-reconstruction algorithm is applied to three clinical DBT data sets. The DBT cases include one with microcalcifications and two with masses. Conclusions: Results indicate that there may be a substantial advantage in using the present image-reconstruction algorithm for microcalcification imaging.
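
    One common discrete definition of the total p-variation used as the regularity measure is given below; a small constant ε is often added for differentiability, and the authors' exact discretization may differ:

    $$
    \mathrm{TpV}_{p}(x)=\sum_{j}\left(\left|\nabla x\right|_{j}^{2}+\varepsilon\right)^{p/2},
    $$

    which reduces to the total variation for p = 1 and to the image roughness for p = 2, matching the limiting cases quoted in the abstract.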

  1. Evaluation of reconstruction algorithms in SPECT neuroimaging: Pt. 1

    International Nuclear Information System (INIS)

    Heejoung Kim; Zeeberg, B.R.; Reba, R.C.

    1993-01-01

    In the presence of statistical noise, an iterative reconstruction algorithm (IRA) for the quantitative reconstruction of single-photon-emission computed tomographic (SPECT) brain images overcomes major limitations of applying the standard filtered back projection (FBP) reconstruction algorithm to projection data which have been degraded by convolution of the true radioactivity distribution with a finite-resolution distance-dependent detector response: (a) the non-uniformity within the grey (or white) matter voxels which results even though the true model is uniform within these voxels; (b) a significantly lower ratio of grey/white matter voxel values than in the true model; and (c) an inability to detect an altered radioactivity value within the grey (or white) matter voxels. It is normally expected that an algorithm which improves spatial resolution and quantitative accuracy might also increase the magnitude of the statistical noise in the reconstructed image. However, the noise properties in the IRA images are very similar to those in the FBP images. (Author)

  2. Regional compensation for statistical maximum likelihood reconstruction error of PET image pixels

    International Nuclear Information System (INIS)

    Forma, J; Ruotsalainen, U; Niemi, J A

    2013-01-01

    In positron emission tomography (PET), there is an increasing interest in studying not only the regional mean tracer concentration, but also its variation arising from local differences in physiology, the tissue heterogeneity. In reconstructed images, however, this physiological variation is overshadowed by a large reconstruction error, which is caused by noisy data and the inversion of the tomographic problem. We present a new procedure which can quantify the error variation in regional reconstructed values for a given PET measurement, and reveal the remaining tissue heterogeneity. The error quantification is made by creating and reconstructing noise realizations of virtual sinograms, which are statistically similar to the measured sinogram. Tests with physical phantom data show that the characterization of the error variation and the true heterogeneity is possible, despite the model error present when a real measurement is considered. (paper)
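
    A minimal sketch of how such noise realizations can be generated, assuming Poisson statistics in the sinogram bins; the names are hypothetical and the paper's actual construction of the virtual sinograms may differ:

    ```python
    import numpy as np

    def virtual_sinogram_realizations(measured_sinogram, n_realizations, seed=None):
        """Generate noise realizations statistically similar to a measured
        Poisson sinogram by resampling each bin.  Reconstructing each realization
        with the same algorithm and comparing regional values estimates the
        reconstruction-error variation for that measurement."""
        rng = np.random.default_rng(seed)
        lam = np.asarray(measured_sinogram, dtype=float)
        return rng.poisson(lam, size=(n_realizations,) + lam.shape)
    ```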

  3. Tomographic scanning apparatus

    International Nuclear Information System (INIS)

    1981-01-01

    This patent specification describes a tomographic scanning apparatus, with particular reference to the adjustable fan beam and its collimator system, together with the facility for taking a conventional x-radiograph without moving the patient. (U.K.)

  4. Quantification of rat brain SPECT with 123I-ioflupane: evaluation of different reconstruction methods and image degradation compensations using Monte Carlo simulation

    International Nuclear Information System (INIS)

    Roé-Vellvé, N; Pino, F; Cot, A; Ros, D; Falcon, C; Gispert, J D; Pavía, J; Marin, C

    2014-01-01

    SPECT studies with 123I-ioflupane facilitate the diagnosis of Parkinson’s disease (PD). The effect on quantification of image degradations has been extensively evaluated in human studies but their impact on studies of experimental PD models is still unclear. The aim of this work was to assess the effect of compensating for the degrading phenomena on the quantification of small animal SPECT studies using 123I-ioflupane. This assessment enabled us to evaluate the feasibility of quantitatively detecting small pathological changes using different reconstruction methods and levels of compensation for the image degrading phenomena. Monte Carlo simulated studies of a rat phantom were reconstructed and quantified. Compensations for point spread function (PSF), scattering, attenuation and partial volume effect were progressively included in the quantification protocol. A linear relationship was found between calculated and simulated specific uptake ratio (SUR) in all cases. In order to significantly distinguish disease stages, noise-reduction during the reconstruction process was the most relevant factor, followed by PSF compensation. The smallest detectable SUR interval was determined by biological variability rather than by image degradations or coregistration errors. The quantification methods that gave the best results allowed us to distinguish PD stages with SUR values that are as close as 0.5 using groups of six rats to represent each stage. (paper)
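
    The specific uptake ratio quantified above is conventionally computed from the mean counts in the target (striatal) region and a nonspecific reference region; a standard definition (the exact region choices are study-specific) is

    $$
    \mathrm{SUR}=\frac{\bar{C}_{\mathrm{striatum}}-\bar{C}_{\mathrm{reference}}}{\bar{C}_{\mathrm{reference}}}.
    $$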

  5. Characteristics of computed tomographic reconstruction noise and their effect on detectability

    International Nuclear Information System (INIS)

    Hanson, K.M.; Boyd, D.P.

    1977-01-01

    The EMI 5005 scanner produces images with noise characteristics similar to those in simulated CT reconstructions. A detectability phantom is described which provides a means of investigating the effect of the peculiar noise correlations present in CT scanner images on human detection capability

  6. SXR measurement and W transport survey using GEM tomographic system on WEST

    Science.gov (United States)

    Mazon, D.; Jardin, A.; Malard, P.; Chernyshova, M.; Coston, C.; Malard, P.; O'Mullane, M.; Czarski, T.; Malinowski, K.; Faisse, F.; Ferlay, F.; Verger, J. M.; Bec, A.; Larroque, S.; Kasprowicz, G.; Wojenski, A.; Pozniak, K.

    2017-11-01

    Measuring Soft X-Ray (SXR) radiation (0.1-20 keV) of fusion plasmas is a standard way of accessing valuable information on particle transport. Since heavy impurities like tungsten (W) can degrade plasma core performance and cause radiative collapses, it is necessary to develop new diagnostics able to monitor the impurity distribution in harsh fusion environments like ITER. A gaseous detector with energy discrimination would be a very good candidate for this purpose. The design and implementation of a new SXR diagnostic developed for the WEST project, based on a triple Gas Electron Multiplier (GEM) detector, is presented. This detector works in photon counting mode and provides energy discrimination capabilities. The SXR system is composed of two 1D cameras (vertical and horizontal views, respectively), located in the same poloidal cross-section to allow for tomographic reconstruction. An array (20 cm × 2 cm) consists of up to 128 detectors in front of a beryllium pinhole (equipped with a 1 mm diameter diaphragm) inserted at about 50 cm depth inside a cooled thimble in order to retrieve a wide plasma view. Acquisition of the low-energy spectrum is ensured by a helium buffer installed between the pinhole and the detector. Complementary water cooling systems are used to maintain a constant temperature (25 °C) inside the thimble. Finally, a real-time automatic extraction system has been developed to protect the diagnostic during baking phases or any unwanted overheating events. Preliminary simulations of plasma emissivity and W distribution have been performed for WEST using a recently developed synthetic diagnostic coupled to a tomographic algorithm based on the minimum Fisher information (MFI) inversion method. First GEM acquisitions are presented, as well as an estimate of the effect of transport in the presence of ICRH on the W density reconstruction capabilities of the GEM.

  7. Segmented slant hole collimator for stationary cardiac SPECT: Monte Carlo simulations.

    Science.gov (United States)

    Mao, Yanfei; Yu, Zhicong; Zeng, Gengsheng L

    2015-09-01

    This work is a preliminary study of a stationary cardiac SPECT system. The goal of this research is to propose a stationary cardiac SPECT system using segmented slant-hole collimators and to perform computer simulations to test the feasibility. Compared to the rotational SPECT, a stationary system has a benefit of acquiring temporally consistent projections. The most challenging issue in building a stationary system is to provide sufficient projection view-angles. A GATE (GEANT4 application for tomographic emission) Monte Carlo model was developed to simulate a two-detector stationary cardiac SPECT that uses segmented slant-hole collimators. Each detector contains seven segmented slant-hole sections that slant to a common volume at the rotation center. Consequently, 14 view-angles over 180° were acquired without any gantry rotation. The NCAT phantom was used for data generation and a tailored maximum-likelihood expectation-maximization algorithm was used for image reconstruction. Effects of a limited number of view-angles and data truncation were carefully evaluated in the paper. Simulation results indicated that the proposed segmented slant-hole stationary cardiac SPECT system is able to acquire sufficient data for cardiac imaging without a loss of image quality, even when the uptakes in the liver and kidneys are high. Seven views are acquired simultaneously at each detector, leading to 5-fold sensitivity gain over the conventional dual-head system at the same total acquisition time, which in turn increases the signal-to-noise ratio by 19%. The segmented slant-hole SPECT system also showed a good performance in lesion detection. In our prototype system, a short hole-length was used to reduce the dead zone between neighboring collimator segments. The measured sensitivity gain is about 17-fold over the conventional dual-head system. The GATE Monte Carlo simulations confirm the feasibility of the proposed stationary cardiac SPECT system with segmented slant

  8. Segmented slant hole collimator for stationary cardiac SPECT: Monte Carlo simulations

    Energy Technology Data Exchange (ETDEWEB)

    Mao, Yanfei, E-mail: ymao@ucair.med.utah.edu [Department of Radiology, Utah Center for Advanced Imaging Research (UCAIR), University of Utah, Salt Lake City, Utah 84108 and Department of Bioengineering, University of Utah, Salt Lake City, Utah 84112 (United States); Yu, Zhicong [Department of Radiology, Mayo Clinic, Rochester, Minnesota 55905 (United States); Zeng, Gengsheng L. [Department of Radiology, Utah Center for Advanced Imaging Research (UCAIR), University of Utah, Salt Lake City, Utah 84108 and Department of Engineering, Weber State University, Ogden, Utah 84408 (United States)

    2015-09-15

    Purpose: This work is a preliminary study of a stationary cardiac SPECT system. The goal of this research is to propose a stationary cardiac SPECT system using segmented slant-hole collimators and to perform computer simulations to test the feasibility. Compared to the rotational SPECT, a stationary system has a benefit of acquiring temporally consistent projections. The most challenging issue in building a stationary system is to provide sufficient projection view-angles. Methods: A GATE (GEANT4 application for tomographic emission) Monte Carlo model was developed to simulate a two-detector stationary cardiac SPECT that uses segmented slant-hole collimators. Each detector contains seven segmented slant-hole sections that slant to a common volume at the rotation center. Consequently, 14 view-angles over 180° were acquired without any gantry rotation. The NCAT phantom was used for data generation and a tailored maximum-likelihood expectation-maximization algorithm was used for image reconstruction. Effects of limited number of view-angles and data truncation were carefully evaluated in the paper. Results: Simulation results indicated that the proposed segmented slant-hole stationary cardiac SPECT system is able to acquire sufficient data for cardiac imaging without a loss of image quality, even when the uptakes in the liver and kidneys are high. Seven views are acquired simultaneously at each detector, leading to 5-fold sensitivity gain over the conventional dual-head system at the same total acquisition time, which in turn increases the signal-to-noise ratio by 19%. The segmented slant-hole SPECT system also showed a good performance in lesion detection. In our prototype system, a short hole-length was used to reduce the dead zone between neighboring collimator segments. The measured sensitivity gain is about 17-fold over the conventional dual-head system. Conclusions: The GATE Monte Carlo simulations confirm the feasibility of the proposed stationary cardiac

  9. Iterative reconstruction with attenuation compensation from cone-beam projections acquired via nonplanar orbits

    International Nuclear Information System (INIS)

    Zeng, G.L.; Weng, Y.; Gullberg, G.T.

    1997-01-01

    Single photon emission computed tomography (SPECT) imaging with cone-beam collimators provides improved sensitivity and spatial resolution for imaging small objects with large field-of-view detectors. It is known that Tuy's cone-beam data sufficiency condition must be met to obtain artifact-free reconstructions. Even though Tuy's condition was derived for an attenuation-free situation, the authors hypothesize that an artifact-free reconstruction can be obtained even if the cone-beam data are attenuated, provided the imaging orbit satisfies Tuy's condition and the exact attenuation map is known. In the authors' studies, emission data are acquired using nonplanar circle-and-line orbits to acquire cone-beam data for tomographic reconstructions. An extended iterative ML-EM (maximum likelihood-expectation maximization) reconstruction algorithm is derived and used to reconstruct projection data with either a pre-acquired or assumed attenuation map. Quantitative accuracy of the attenuation corrected emission reconstruction is significantly improved
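
    For orientation, a minimal ML-EM iteration with a dense system matrix is sketched below; attenuation enters here only through the matrix elements, so this is an illustration of the principle rather than the authors' extended cone-beam algorithm:

    ```python
    import numpy as np

    def mlem(A, y, n_iter=50):
        """Minimal ML-EM sketch for emission tomography.
        A : system matrix (projection bins x voxels), may include attenuation factors
        y : measured counts per projection bin"""
        x = np.ones(A.shape[1])
        sens = A.sum(axis=0)                               # per-voxel sensitivity
        for _ in range(n_iter):
            proj = A @ x                                   # forward projection
            ratio = np.divide(y, proj, out=np.zeros_like(proj), where=proj > 0)
            x *= (A.T @ ratio) / np.maximum(sens, 1e-12)   # multiplicative update
        return x
    ```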

  10. Effect of object functions on tomographic reconstruction: a numerical study

    International Nuclear Information System (INIS)

    Babu Rao, C.; Baldev Raj; Ravichandran, V.S.; Munshi, P.

    1996-01-01

    Convolution back projection (CBP) is the most widely used algorithm in computed tomography (CT). Theoretical studies show that, under ideal conditions, the error in the reconstruction can be correlated with the second Fourier-space derivative of the filter function and with the Laplacian of the object function. This paper looks into the second aspect of the error function, presenting a systematic numerical study of the effect of object functions on global and local errors. (author)
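
    Schematically, the cited correlation can be written as a proportionality (constants and sampling factors omitted; this restates the abstract rather than the exact theoretical result):

    $$
    e(x,y)\;\propto\;W''(0)\,\nabla^{2}f(x,y),
    $$

    where W is the Fourier-space filter (window) function, evaluated at zero spatial frequency in this schematic form, and f is the object function.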

  11. Iterative reconstruction: how it works, how to apply it

    Energy Technology Data Exchange (ETDEWEB)

    Seibert, James Anthony [University of California Davis Medical Center, Department of Radiology, Sacramento, CA (United States)

    2014-10-15

    Computed tomography acquires X-ray projection data from multiple angles through an object to generate a tomographic rendition of its attenuation characteristics. Filtered back projection is a fast, closed analytical solution to the reconstruction process, whereby all projections are equally weighted, but is prone to deliver inadequate image quality when the dose levels are reduced. Iterative reconstruction is an algorithmic method that uses statistical and geometric models to variably weight the image data in a process that can be solved iteratively to independently reduce noise and preserve resolution and image quality. Applications of this technology in a clinical setting can result in dose reductions on the order of 20-40% compared to a standard filtered back projection reconstruction for most exams. A carefully planned implementation strategy and methodological approach is necessary to achieve the goals of lower dose with uncompromised image quality. (orig.)

  12. Development of an anthropomorphic model and a Monte Carlo calculation code devoted to the physical reconstruction of a radiological accident

    International Nuclear Information System (INIS)

    Roux, A.

    2001-01-01

    The diversity of radiological accidents makes medical prognosis and the choice of therapy difficult on the basis of clinical observations alone. To complete this information, it is important to know the global dose received by the organism and the dose distributions at depth in tissues. The dose can be estimated by a physical reconstruction of the accident with the help of tools based on experimental techniques or on calculation. The geometry-construction software (M.G.E.D.), associated with the Monte Carlo photon and neutron transport code (M.O.R.S.E.), meets these constraints. An important result of this work is the determination of the principal parameters that must be known as a function of the accident type, as well as the precision required for these parameters. (N.C.)

  13. Axial tomographic scanner

    International Nuclear Information System (INIS)

    1976-01-01

    An axial tomographic system is described comprising axial tomographic means for collecting sets of data corresponding to the transmission or absorption of a number of beams of penetrating radiation through a planar slice of an object. It includes means to locate an object to be analyzed, a source and detector for directing one or more beams of penetrating radiation through the object from the source to the detector, and means to rotate (and optionally translate) the source as well as means to process the collected sets of data. Data collection, data processing, and data display can each be conducted independently of each other. An additional advantage of the system described is that the raw data (i.e., the originally collected data) are not destroyed by the data processing but instead are retained intact for further reference or use, if needed

  14. A general algorithm for the reconstruction of jet events in e+e- annihilation

    International Nuclear Information System (INIS)

    Goddard, M.C.

    1981-01-01

    A general method is described to reconstruct a predetermined number of jets. It can reconstruct the jet axes as accurately as any existing algorithm and is up to one hundred times faster. Results are shown from the reconstruction of 2-jet, 3-jet and 4-jet Monte Carlo events. (author)

  15. 3-dimensional charge collection efficiency measurements using volumetric tomographic reconstruction

    Energy Technology Data Exchange (ETDEWEB)

    Dobos, Daniel [CERN, Geneva (Switzerland)

    2016-07-01

    For a better understanding of the electric field distribution in 3D semiconductor detectors and to allow efficiency-based design improvements, a method has been developed to measure the 3D spatial charge collection efficiency of planar, 3D silicon and diamond sensors using 3D volumetric reconstruction techniques. Simulation results and first measurements demonstrated the feasibility of this method and show that, with beam telescopes soon to be available that are ten times faster, even small structures and efficiency differences will become measurable in a few hours.

  16. Setup of HDRK-Man voxel model in Geant4 Monte Carlo code

    Energy Technology Data Exchange (ETDEWEB)

    Jeong, Jong Hwi; Cho, Sung Koo; Kim, Chan Hyeong [Hanyang Univ., Seoul (Korea, Republic of); Choi, Sang Hyoun [Inha Univ., Incheon (Korea, Republic of); Cho, Kun Woo [Korea Institute of Nuclear Safety, Daejeon (Korea, Republic of)

    2008-10-15

    Many different voxel models, developed using tomographic images of the human body, are used in various fields including both ionizing and non-ionizing radiation fields. Recently a high-quality voxel model, named HDRK-Man, was constructed at Hanyang University and used to calculate dose conversion coefficient (DCC) values for external photon and neutron beams using the MCNPX Monte Carlo code. The objective of the present study is to set up the HDRK-Man model in Geant4 in order to use it in more advanced calculations such as 4-D Monte Carlo simulations and space dosimetry studies involving very high energy particles. To that end, the HDRK-Man was ported to Geant4 and used to calculate the DCC values for external photon beams. The calculated values were then compared with the results of the MCNPX code. In addition, a computational Linux cluster was built to improve the computing speed in Geant4.

  17. Monte Carlo simulation of continuous-space crystal growth

    International Nuclear Information System (INIS)

    Dodson, B.W.; Taylor, P.A.

    1986-01-01

    We describe a method, based on Monte Carlo techniques, of simulating the atomic growth of crystals without the discrete lattice space assumed by conventional Monte Carlo growth simulations. Since no lattice space is assumed, problems involving epitaxial growth, heteroepitaxy, phonon-driven mechanisms, surface reconstruction, and many other phenomena incompatible with the lattice-space approximation can be studied. Also, use of the Monte Carlo method circumvents to some extent the extreme limitations on simulated timescale inherent in crystal-growth techniques which might be proposed using molecular dynamics. The implementation of the new method is illustrated by studying the growth of strained-layer superlattice (SLS) interfaces in two-dimensional Lennard-Jones atomic systems. Despite the extreme simplicity of such systems, the qualitative features of SLS growth seen here are similar to those observed experimentally in real semiconductor systems

  18. Fast parallel algorithm for three-dimensional distance-driven model in iterative computed tomography reconstruction

    International Nuclear Information System (INIS)

    Chen Jian-Lin; Li Lei; Wang Lin-Yuan; Cai Ai-Long; Xi Xiao-Qi; Zhang Han-Ming; Li Jian-Xin; Yan Bin

    2015-01-01

    The projection matrix model is used to describe the physical relationship between reconstructed object and projection. Such a model has a strong influence on projection and backprojection, two vital operations in iterative computed tomographic reconstruction. The distance-driven model (DDM) is a state-of-the-art technology that simulates forward and back projections. This model has a low computational complexity and a relatively high spatial resolution; however, it includes only a few methods in a parallel operation with a matched model scheme. This study introduces a fast and parallelizable algorithm to improve the traditional DDM for computing the parallel projection and backprojection operations. Our proposed model has been implemented on a GPU (graphic processing unit) platform and has achieved satisfactory computational efficiency with no approximation. The runtime for the projection and backprojection operations with our model is approximately 4.5 s and 10.5 s per loop, respectively, with an image size of 256×256×256 and 360 projections with a size of 512×512. We compare several general algorithms that have been proposed for maximizing GPU efficiency by using the unmatched projection/backprojection models in a parallel computation. The imaging resolution is not sacrificed and remains accurate during computed tomographic reconstruction. (paper)

  19. Tomographic method and apparatus

    International Nuclear Information System (INIS)

    Moore, R.M.

    1981-01-01

    A tomographic x-ray machine has a camera and film-plane section which move about a primary axis for imaging a selected cross-section of an anatomical member onto the film. A ''scout image'' of the member is taken at right angles to the plane of the desired cross-section to indicate the cross-section's angle with respect to the primary axis. The film plane is then located at the same angle with respect to a film cassette axis as the selected cross-section makes with the primary axis. The film plane and the cross-section are then maintained in parallel planes throughout motion of the camera and film plane during tomographic radiography. (author)

  20. Homogenization of steady-state creep of porous metals using three-dimensional microstructural reconstructions

    DEFF Research Database (Denmark)

    Kwok, Kawai; Boccaccini, Dino; Persson, Åsa Helen

    2016-01-01

    The effective steady-state creep response of porous metals is studied by numerical homogenization and analytical modeling in this paper. The numerical homogenization is based on finite element models of three-dimensional microstructures directly reconstructed from tomographic images. The effects ...... model, and closely matched by the Gibson-Ashby compression and the Ramakrishnan-Arunchalam creep models. [All rights reserved Elsevier]....

  1. Tomographic reconstruction of melanin structures of optical coherence tomography via the finite-difference time-domain simulation

    Science.gov (United States)

    Huang, Shi-Hao; Wang, Shiang-Jiu; Tseng, Snow H.

    2015-03-01

    Optical coherence tomography (OCT) provides high-resolution, cross-sectional images of the internal microstructure of biological tissue. We use the Finite-Difference Time-Domain (FDTD) method to analyze the data acquired by OCT, which can help us reconstruct the refractive index of the biological tissue. We calculate the refractive index tomography and try to match the simulation with the data acquired by OCT. Specifically, we try to reconstruct the structure of melanin, which has complex refractive indices and is the key component of the human pigment system. The results indicate that better reconstruction can be achieved for homogeneous samples, whereas the reconstruction is degraded for samples with fine structure or with complex interfaces. The simulated reconstruction shows structures of the melanin that may be useful for biomedical optics applications.

  2. Positron transaxial emission tomograph with computerized image reconstruction

    International Nuclear Information System (INIS)

    Jatteau, Michel.

    1981-01-01

    This invention concerns a positron transaxial emission tomography apparatus with computerized image reconstruction, like those used in nuclear medicine for studying the metabolism of organs, in physiological examinations and as a diagnosis aid. The operation is based on the principle of the detection of photons emitted when the positrons are annihilated by impact with an electron. The appliance is mainly composed of: (a) - a set of gamma ray detectors distributed on a polygonal arrangement around the body area to be examined, (b) - circuits for amplifying the signals delivered by the gamma ray detectors, (c) - computers essentially comprising energy integration and discrimination circuits and provided at the output of the detectors for calculating and delivering, as from the amplified signals, information on the position and energy relative to each occurrence constituted by the detections of photons, (d) - time coincidence circuits for selecting by emission of detector validation signals, only those occurrences, among the ensemble of those detected, which effectively result from the annihilation of positrons inside the area examined, (e) - a data processing system

  3. Simulation and reconstruction of the BESIII EMC

    International Nuclear Information System (INIS)

    He Miao

    2011-01-01

    The simulation and reconstruction software of the BESIII Electromagnetic Calorimeter (EMC) is developed based on the object-oriented language C++, in the framework of Gaudi. The performance of the EMC is studied with data and compared with Monte Carlo samples.

  4. Development of a X-ray micro-tomograph and its application to reservoir rocks characterization; Developpement d`un microtomographe X et application a la caracterisation des roches reservoirs

    Energy Technology Data Exchange (ETDEWEB)

    Ferreira de Paiva, R.

    1995-10-01

    We describe the construction and application to studies in three dimensions of a laboratory micro-tomograph for the characterisation of heterogeneous solids at the scale of a few microns. The system is based on an electron microprobe and a two-dimensional X-ray detector. The use of a low beam divergence for image acquisition allows use of simple and rapid reconstruction software whilst retaining reasonable acquisition times. Spatial resolutions of better than 3 microns in radiography and 10 microns in tomography are obtained. The applications of microtomography in the petroleum industry are illustrated by the study of fibre orientation in polymer composites, of the distribution of minerals and pore space in reservoir rocks, and of the interaction of salt water with a model porous medium. A correction for X-ray beam hardening is described and used to obtain improved discrimination of the phases present in the sample. In the case of a North Sea reservoir rock we show the possibility of distinguishing quartz, feldspar and, in certain zones, kaolinite. The representativeness of the tomographic reconstruction is demonstrated by comparing the surface of the reconstructed specimen with corresponding images obtained in scanning electron microscopy. (author). 58 refs., 10 tabs., 71 photos.

  5. Tomographic extreme-ultraviolet spectrographs: TESS.

    Science.gov (United States)

    Cotton, D M; Stephan, A; Cook, T; Vickers, J; Taylor, V; Chakrabarti, S

    2000-08-01

    We describe the system of Tomographic Extreme Ultraviolet (EUV) SpectrographS (TESS) that are the primary instruments for the Tomographic Experiment using Radiative Recombinative Ionospheric EUV and Radio Sources (TERRIERS) satellite. The spectrographs were designed to make high-sensitivity [(80 counts/s)/Rayleigh; one Rayleigh is equivalent to 10^6 photons/(4π sr cm^2 s)], line-of-sight measurements of the O I 135.6- and 91.1-nm emissions suitable for tomographic inversion. The system consists of five spectrographs, four identical nightglow instruments (for redundancy and added sensitivity), and one instrument with a smaller aperture to reduce sensitivity and increase spectral resolution for daytime operation. Each instrument has a bandpass of 80-140 nm with approximately 2- and 1-nm resolution for the night and day instruments, respectively. They utilize microchannel-plate-based two-dimensional imaging detectors with wedge-and-strip anode readouts. The instruments were designed, fabricated, and calibrated at Boston University, and the TERRIERS satellite was launched on 18 May 1999 from Vandenberg Air Force Base, California.

  6. System architecture for high speed reconstruction in time-of-flight positron tomography

    International Nuclear Information System (INIS)

    Campagnolo, R.E.; Bouvier, A.; Chabanas, L.; Robert, C.

    1985-06-01

    A new-generation Time Of Flight (TOF) positron tomograph with high resolution and high count rate capabilities is under development in our group. After a short recall of the data acquisition process and image reconstruction in a TOF PET camera, we present the data acquisition system, which achieves a data transfer rate of 0.8 mega-events per second in list mode, or more if necessary. We describe the reconstruction process, based on a five-stage pipeline architecture using home-made processors. The expected performance with this architecture is a reconstruction time of six seconds per image (256x256 pixels) of one million events. This time could be reduced to 4 seconds. We conclude with the future developments of the system

  7. Sound field reconstruction using acousto-optic tomography

    DEFF Research Database (Denmark)

    Torras Rosell, Antoni; Barrera Figueroa, Salvador; Jacobsen, Finn

    2012-01-01

    When sound propagates through a medium, it results in pressure fluctuations that change the instantaneous density of the medium. Under such circumstances, the refractive index that characterizes the propagation of light is not constant, but influenced by the acoustic field. This kind of interaction...... the acousto-optic effect in air, and demonstrates that it can be measured with a laser Doppler vibrometer in the audible frequency range. The tomographic reconstruction is tested by means of computer simulations and measurements. The main features observed in the simulations are also recognized...

  8. Implementation of 3D tomographic visualisation through planar ICT data from experimental gamma-ray tomographic system

    International Nuclear Information System (INIS)

    Umesh Kumar; Singh, Gursharan; Ravindran, V.R.

    2001-01-01

    Industrial Computed Tomography (ICT) is one of the latest methods of non-destructive testing and examination. Different prototypes of the Computed Industrial Tomographic Imaging System (CITIS) have been developed and experimental data have been generated in the Isotope Applications Division. The experimental gamma-ray-based tomographic imaging system comprises a beam generator containing approx. 220 GBq (6 curies) of 137Cs, a single NaI(Tl)-PMT integral assembly in a thick shielding and associated electronics, a stepper-motor-controlled mechanical manipulator, collimators and the required software. CITIS data are normally acquired in one orientation of the sample. It may sometimes be required to view a tomographic plane in a different orientation. Also, 3D visualization may be required with the available 2D data set. All of these can be achieved by processing the available data. We have customized some of the routines provided by the IDL (Interactive Data Language) package to suit our requirements. The present paper discusses the methodology adopted for this purpose with an illustrative example. (author)

  9. Scanning tomographic particle image velocimetry applied to a turbulent jet

    KAUST Repository

    Casey, T. A.

    2013-02-21

    We introduce a modified tomographic PIV technique using four high-speed video cameras and a scanning pulsed laser-volume. By rapidly illuminating adjacent subvolumes onto separate video frames, we can resolve a larger total volume of velocity vectors, while retaining good spatial resolution. We demonstrate this technique by performing time-resolved measurements of the turbulent structure of a round jet, using up to 9 adjacent volume slices. In essence this technique resolves more velocity planes in the depth direction by maintaining optimal particle image density and limiting the number of ghost particles. The total measurement volumes contain between 1×10^6 and 3×10^6 velocity vectors calculated from up to 1500 reconstructed depthwise image planes, showing time-resolved evolution of the large-scale vortical structures for a turbulent jet of Re up to 10 000.

  10. Atlas of the Underworld : Paleo-subduction, -geography, -atmosphere and -sea level reconstructed from present-day mantle structure

    NARCIS (Netherlands)

    van der Meer, Douwe G.

    2017-01-01

    In this thesis, I aimed at searching for new ways of constraining paleo-geographic, -atmosphere and -sea level reconstructions, through an extensive investigation of mantle structure in seismic tomographic models. To this end, I explored evidence for paleo-subduction in these models and how this may

  11. PET reconstruction

    International Nuclear Information System (INIS)

    O'Sullivan, F.; Pawitan, Y.; Harrison, R.L.; Lewellen, T.K.

    1990-01-01

    In statistical terms, filtered backprojection can be viewed as smoothed Least Squares (LS). In this paper, the authors report on improvement in LS resolution by: incorporating locally adaptive smoothers, imposing positivity and using statistical methods for optimal selection of the resolution parameter. The resulting algorithm has high computational efficiency relative to more elaborate Maximum Likelihood (ML) type techniques (i.e. EM with sieves). Practical aspects of the procedure are discussed in the context of PET and illustrations with computer simulated and real tomograph data are presented. The relative recovery coefficients for a 9 mm sphere in a computer simulated hot-spot phantom range from 0.3 to 0.6 when the number of counts ranges from 10,000 to 640,000 respectively. The authors will also present results illustrating the relative efficacy of ML and LS reconstruction techniques

  12. Computer tomographs

    International Nuclear Information System (INIS)

    Niedzwiedzki, M.

    1982-01-01

    Physical foundations and developments in transmission and emission computer tomography are presented. On the basis of the available literature and private communications, a comparison is made of the various transmission tomographs. A new technique of computer emission tomography, ECT, unknown in Poland, is described. An evaluation of two methods of ECT, namely positron and single-photon emission tomography, is made. (author)

  13. Design and performance of a new positron computed tomograph (P.C.T.) using the time-of-flight (T.O.F.) information

    International Nuclear Information System (INIS)

    Laval, M.; Allemand, R.; Bouvier, A.

    1982-09-01

    A new tomograph for positron imaging using the time of flight measurement is described. Fast CsF crystals are used in this first prototype. Compared to the classical reconstruction method, the result of adding this information is a substantial increase in sensitivity, a reduced random coincidence count rate, and a slight decrease of the scatter contribution in the images. Further improvements in the T.O.F. accuracy can be expected by using faster crystals

  14. Representation of photon limited data in emission tomography using origin ensembles

    Energy Technology Data Exchange (ETDEWEB)

    Sitek, A [Radiology Department, Brigham and Women' s Hospital and Harvard Medical School, 75 Francis Street, Boston, MA 02115 (United States)], E-mail: asitek@bwh.harvard.edu

    2008-06-21

    Representation and reconstruction of data obtained by emission tomography scanners are challenging due to high noise levels in the data. Typically, images obtained using tomographic measurements are represented using grids. In this work, we define images as sets of origins of events detected during tomographic measurements; we call these origin ensembles (OEs). A state in the ensemble is characterized by a vector of 3N parameters Y, where the parameters are the coordinates of origins of detected events in a three-dimensional space and N is the number of detected events. The 3N-dimensional probability density function (PDF) for that ensemble is derived, and we present an algorithm for OE image estimation from tomographic measurements. A displayable image (e.g. grid based image) is derived from the OE formulation by calculating ensemble expectations based on the PDF using the Markov chain Monte Carlo method. The approach was applied to computer-simulated 3D list-mode positron emission tomography data. The reconstruction errors for a 10 000 000 event acquisition of simulated data ranged from 0.1 to 34.8%, depending on object size and sampling density. The method was also applied to experimental data and the results of the OE method were consistent with those obtained by a standard maximum-likelihood approach. The method is a new approach to representation and reconstruction of data obtained by photon-limited emission tomography measurements.
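
    As a rough illustration of the last step of the OE method, the sketch below (Python, with hypothetical inputs) turns a set of origin-ensemble states, e.g. produced by some Markov chain Monte Carlo sampler, into a displayable grid image by averaging voxel counts over the states; it does not implement the OE sampler itself.

        import numpy as np

        def oe_grid_image(states, grid_shape=(64, 64, 64), bounds=((0.0, 1.0),) * 3):
            """Average a list of origin-ensemble states into a displayable grid image.
            Each state is an (N, 3) array of event-origin coordinates; averaging the
            voxel occupancies over states approximates the ensemble expectation."""
            edges = [np.linspace(lo, hi, n + 1) for (lo, hi), n in zip(bounds, grid_shape)]
            image = np.zeros(grid_shape)
            for origins in states:
                counts, _ = np.histogramdd(origins, bins=edges)
                image += counts
            return image / len(states)

        # Illustrative usage with fake sampler output: 20 states of 10,000 origins each.
        rng = np.random.default_rng(0)
        fake_states = [rng.random((10_000, 3)) for _ in range(20)]
        img = oe_grid_image(fake_states)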

  15. Representation of photon limited data in emission tomography using origin ensembles

    Science.gov (United States)

    Sitek, A.

    2008-06-01

    Representation and reconstruction of data obtained by emission tomography scanners are challenging due to high noise levels in the data. Typically, images obtained using tomographic measurements are represented using grids. In this work, we define images as sets of origins of events detected during tomographic measurements; we call these origin ensembles (OEs). A state in the ensemble is characterized by a vector of 3N parameters Y, where the parameters are the coordinates of origins of detected events in a three-dimensional space and N is the number of detected events. The 3N-dimensional probability density function (PDF) for that ensemble is derived, and we present an algorithm for OE image estimation from tomographic measurements. A displayable image (e.g. grid based image) is derived from the OE formulation by calculating ensemble expectations based on the PDF using the Markov chain Monte Carlo method. The approach was applied to computer-simulated 3D list-mode positron emission tomography data. The reconstruction errors for a 10 000 000 event acquisition of simulated data ranged from 0.1 to 34.8%, depending on object size and sampling density. The method was also applied to experimental data and the results of the OE method were consistent with those obtained by a standard maximum-likelihood approach. The method is a new approach to representation and reconstruction of data obtained by photon-limited emission tomography measurements.

  16. Representation of photon limited data in emission tomography using origin ensembles

    International Nuclear Information System (INIS)

    Sitek, A

    2008-01-01

    Representation and reconstruction of data obtained by emission tomography scanners are challenging due to high noise levels in the data. Typically, images obtained using tomographic measurements are represented using grids. In this work, we define images as sets of origins of events detected during tomographic measurements; we call these origin ensembles (OEs). A state in the ensemble is characterized by a vector of 3N parameters Y, where the parameters are the coordinates of origins of detected events in a three-dimensional space and N is the number of detected events. The 3N-dimensional probability density function (PDF) for that ensemble is derived, and we present an algorithm for OE image estimation from tomographic measurements. A displayable image (e.g. grid based image) is derived from the OE formulation by calculating ensemble expectations based on the PDF using the Markov chain Monte Carlo method. The approach was applied to computer-simulated 3D list-mode positron emission tomography data. The reconstruction errors for a 10 000 000 event acquisition of simulated data ranged from 0.1 to 34.8%, depending on object size and sampling density. The method was also applied to experimental data and the results of the OE method were consistent with those obtained by a standard maximum-likelihood approach. The method is a new approach to representation and reconstruction of data obtained by photon-limited emission tomography measurements

  17. Optical image reconstruction using DC data: simulations and experiments

    International Nuclear Information System (INIS)

    Huabei Jiang; Paulsen, K.D.; Oesterberg, U.L.

    1996-01-01

    In this paper, we explore optical image formation using a diffusion approximation of light propagation in tissue which is modelled with a finite-element method for optically heterogeneous media. We demonstrate successful image reconstruction based on absolute experimental DC data obtained with a continuous wave 633 nm He-Ne laser system and a 751 nm diode laser system in laboratory phantoms having two optically distinct regions. The experimental systems used exploit a tomographic type of data collection scheme that provides information from which a spatially variable optical property map is deduced. Reconstruction of scattering coefficient only and simultaneous reconstruction of both scattering and absorption profiles in tissue-like phantoms are obtained from measured and simulated data. Images with different contrast levels between the heterogeneity and the background are also reported and the results show that although it is possible to obtain qualitative visual information on the location and size of a heterogeneity, it may not be possible to quantitatively resolve contrast levels or optical properties using reconstructions from DC data only. Sensitivity of image reconstruction to noise in the measurement data is investigated through simulations. The application of boundary constraints has also been addressed. (author)

  18. Fast implementations of reconstruction-based scatter compensation in fully 3D SPECT image reconstruction

    International Nuclear Information System (INIS)

    Kadrmas, Dan J.; Karimi, Seemeen S.; Frey, Eric C.; Tsui, Benjamin M.W.

    1998-01-01

    Accurate scatter compensation in SPECT can be performed by modelling the scatter response function during the reconstruction process. This method is called reconstruction-based scatter compensation (RBSC). It has been shown that RBSC has a number of advantages over other methods of compensating for scatter, but using RBSC for fully 3D compensation has resulted in prohibitively long reconstruction times. In this work we propose two new methods that can be used in conjunction with existing methods to achieve marked reductions in RBSC reconstruction times. The first method, coarse-grid scatter modelling, significantly accelerates the scatter model by exploiting the fact that scatter is dominated by low-frequency information. The second method, intermittent RBSC, further accelerates the reconstruction process by limiting the number of iterations during which scatter is modelled. The fast implementations were evaluated using a Monte Carlo simulated experiment of the 3D MCAT phantom with 99m Tc tracer, and also using experimentally acquired data with 201 Tl tracer. Results indicated that these fast methods can reconstruct, with fully 3D compensation, images very similar to those obtained using standard RBSC methods, and in reconstruction times that are an order of magnitude shorter. Using these methods, fully 3D iterative reconstruction with RBSC can be performed well within the realm of clinically realistic times (under 10 minutes for 64x64x24 image reconstruction). (author)

  19. Tomographic scanning apparatus

    International Nuclear Information System (INIS)

    1981-01-01

    Details are given of a tomographic scanning apparatus, with particular reference to the means of adjusting the apparent gain of the signal processing means for receiving output signals from the detectors, to compensate for drift in the gain characteristics, including means for passing a reference signal. (U.K.)

  20. Random signal tomographical analysis of two-phase flow

    International Nuclear Information System (INIS)

    Han, P.; Wesser, U.

    1990-01-01

    This paper reports on radiation tomography, which is a useful tool for studying the internal structures of two-phase flow. However, general tomography analysis gives only time-averaged results, hence much information is lost. As a result, it is sometimes difficult to identify the flow regime; for example, the time-averaged picture does not significantly change as an annular flow develops from a slug flow. A two-phase flow diagnostic technique based on random signal tomographical analysis is developed. It extracts more information by studying the statistical variation of the measured signal with time. Local statistical parameters, including mean value, variance, skewness and flatness etc., are reconstructed from the information obtained by a general tomography technique. The results provide more important information: not only can the void fraction be easily calculated, but the flow pattern can also be identified more objectively and more accurately. The experimental setup is introduced. It consisted of a two-phase flow loop, an X-ray system, a fan-like five-beam detector system and a signal acquisition and processing system. In the experiment, for both horizontal and vertical test sections (aluminum and steel tube with Di/Do = 40/45 mm), different flow situations are realized by independently adjusting air and water mass flow. Through a glass tube connected with the test section, some typical flow patterns are visualized and used for comparing with the reconstruction results
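
    The local statistical parameters mentioned above can be computed per measured beam signal before being fed to the tomographic reconstruction; a minimal Python sketch (synthetic signals, illustrative values only) is:

        import numpy as np
        from scipy import stats

        def beam_statistics(signal):
            """Local statistical parameters of one beam signal over time, of the kind
            reconstructed tomographically in the approach described above."""
            return {
                "mean": float(np.mean(signal)),
                "variance": float(np.var(signal)),
                "skewness": float(stats.skew(signal)),
                "flatness": float(stats.kurtosis(signal, fisher=False)),
            }

        # Illustrative usage: an intermittent (slug-like) signal shows much larger
        # skewness and flatness than a steady (annular-like) signal.
        rng = np.random.default_rng(1)
        steady = 0.8 + 0.02 * rng.standard_normal(5000)
        slugging = np.where(rng.random(5000) < 0.2, 0.2, 0.9) + 0.02 * rng.standard_normal(5000)
        print(beam_statistics(steady))
        print(beam_statistics(slugging))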

  1. Alignment Solution for CT Image Reconstruction using Fixed Point and Virtual Rotation Axis.

    Science.gov (United States)

    Jun, Kyungtaek; Yoon, Seokhwan

    2017-01-25

    Since X-ray tomography is now widely adopted in many different areas, it becomes more crucial to find a robust routine for handling tomographic data to obtain better-quality reconstructions. Though there are several existing techniques, it seems helpful to have a more automated method to remove the possible errors that hinder clearer image reconstruction. Here, we propose an alternative method and a new algorithm using the sinogram and the fixed point. An advanced physical concept, the Center of Attenuation (CA), was also introduced to show how this fixed point is applied to the reconstruction of images having the errors we categorize in this article. Our technique showed promising performance in restoring images having translation and vertical tilt errors.

  2. GPU-accelerated few-view CT reconstruction using the OSC and TV techniques

    Energy Technology Data Exchange (ETDEWEB)

    Matenine, Dmitri [Montreal Univ., QC (Canada). Dept. de Physique; Hissoiny, Sami [Ecole Polytechnique de Montreal, QC (Canada). Dept. de Genie Informatique et Genie Logiciel; Despres, Philippe [Centre Hospitalier Univ. de Quebec, QC (Canada). Dept. de Radio-Oncologie

    2011-07-01

    The present work proposes a promising iterative reconstruction technique designed specifically for X-ray transmission computed tomography (CT). The main objective is to reduce diagnostic radiation dose through the reduction of the number of CT projections, while preserving image quality. The second objective is to provide a fast implementation compatible with clinical activities. The proposed tomographic reconstruction technique is a combination of the Ordered Subsets Convex (OSC) algorithm and the Total Variation minimization (TV) regularization technique. The results in terms of image quality and computational speed are discussed. Using this technique, it was possible to obtain reconstructed slices of relatively good quality with as few as 100 projections, leading to potential dose reduction factors of up to an order of magnitude depending on the application. The algorithm was implemented on a Graphical Processing Unit (GPU) and yielded reconstruction times of approximately 185 ms per slice. (orig.)
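
    The TV part of such an OSC-TV scheme is typically a few regularization steps interleaved with the data-fidelity updates; the following Python sketch shows only a simple smoothed-TV gradient-descent step on a 2D slice (the OSC update itself is not shown, and the discretization here is a generic forward-difference scheme, not the authors' implementation).

        import numpy as np

        def tv_gradient(u, eps=1e-8):
            """Gradient of a smoothed total-variation functional of a 2D image
            (forward differences with replicated boundary)."""
            ux = np.diff(u, axis=1, append=u[:, -1:])
            uy = np.diff(u, axis=0, append=u[-1:, :])
            norm = np.sqrt(ux ** 2 + uy ** 2 + eps)
            px, py = ux / norm, uy / norm
            div = (px - np.roll(px, 1, axis=1)) + (py - np.roll(py, 1, axis=0))
            return -div

        def tv_steps(u, n_steps=20, step=0.1):
            """A few steepest-descent TV steps, of the kind interleaved with OSC updates."""
            for _ in range(n_steps):
                u = u - step * tv_gradient(u)
            return u

        # Illustrative usage on a noisy slice.
        rng = np.random.default_rng(2)
        slice_noisy = np.zeros((128, 128))
        slice_noisy[32:96, 32:96] = 1.0
        slice_noisy += 0.2 * rng.standard_normal(slice_noisy.shape)
        slice_smoothed = tv_steps(slice_noisy)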

  3. Tomographic array

    International Nuclear Information System (INIS)

    1976-01-01

    A tomographic array with the following characteristics is described. An X-ray screen serving as detector is placed before a photomultiplier tube, which itself is placed in front of a television camera connected to a set of image processors. The detector is concave towards the source and is replaceable. Different images of the object are obtained simultaneously. Optical fibers and lenses are used for transmission within the system

  4. A positron emission tomograph designed for 3/4 mm resolution

    International Nuclear Information System (INIS)

    McInytre, J.A.; Allen, R.D.; Aguiar, J.; Paulson, J.T.

    1995-01-01

    Two factors of the design for a positron tomograph affect the magnitude of the tomograph spatial resolution: the gamma ray detector width and the analogue measurement of the scintillator location. In the tomograph design reported here the analogue measurement is eliminated and the detector transaxial width factor is reduced to 3/4 mm. The analogue measurement is eliminated by transmitting the scintillation light from each individual scintillator through optical fibers to four photomultipliers (PMTs); the identities of the PMTs then provide a digital address for the scintillation location. Plastic scintillators are used to provide enough scintillation light for transmission through the optical fibers. Bonuses from the use of plastic scintillators are, first, the reduction of the scintillator dead time to about 10 nsec, second, a large reduction of cross-talk between neighboring scintillators, third, the reduction of resolution loss from off-axis gamma rays and, fourth, the ability to sample the axial image at one-eighth the axial resolution distance of 2.5 mm. The designed tomograph incorporates 20 rings. Two of the 32 tomograph 20-ring modules have been constructed to measure the resolution and other characteristics of the tomograph

  5. Monte Carlo Simulation for Particle Detectors

    CERN Document Server

    Pia, Maria Grazia

    2012-01-01

    Monte Carlo simulation is an essential component of experimental particle physics in all the phases of its life-cycle: the investigation of the physics reach of detector concepts, the design of facilities and detectors, the development and optimization of data reconstruction software, the data analysis for the production of physics results. This note briefly outlines some research topics related to Monte Carlo simulation, that are relevant to future experimental perspectives in particle physics. The focus is on physics aspects: conceptual progress beyond current particle transport schemes, the incorporation of materials science knowledge relevant to novel detection technologies, functionality to model radiation damage, the capability for multi-scale simulation, quantitative validation and uncertainty quantification to determine the predictive power of simulation. The R&D on simulation for future detectors would profit from cooperation within various components of the particle physics community, and synerg...

  6. A filtered backprojection reconstruction algorithm for Compton camera

    Energy Technology Data Exchange (ETDEWEB)

    Lojacono, Xavier; Maxim, Voichita; Peyrin, Francoise; Prost, Remy [Lyon Univ., Villeurbanne (France). CNRS, Inserm, INSA-Lyon, CREATIS, UMR5220; Zoglauer, Andreas [California Univ., Berkeley, CA (United States). Space Sciences Lab.

    2011-07-01

    In this paper we present a filtered backprojection reconstruction algorithm for Compton Camera detectors of particles. Compared to iterative methods, widely used for the reconstruction of images from Compton camera data, analytical methods are fast, easy to implement and avoid convergence issues. The method we propose is exact for an idealized Compton camera composed of two parallel plates of infinite dimension. We show that it copes well with low number of detected photons simulated from a realistic device. Images reconstructed from both synthetic data and realistic ones obtained with Monte Carlo simulations demonstrate the efficiency of the algorithm. (orig.)
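
    For reference, a standard parallel-beam filtered backprojection (the textbook algorithm, not the Compton-camera-specific variant derived in the record above) can be sketched in a few lines of Python:

        import numpy as np

        def fbp(sinogram, angles_deg):
            """Minimal parallel-beam filtered backprojection.
            sinogram has shape (n_angles, n_detectors); angles are in degrees."""
            n_angles, n_det = sinogram.shape
            # Ramp filter applied along the detector axis in Fourier space.
            ramp = np.abs(np.fft.fftfreq(n_det))
            filtered = np.real(np.fft.ifft(np.fft.fft(sinogram, axis=1) * ramp, axis=1))
            # Backproject onto an n_det x n_det grid centred on the rotation axis.
            xs = np.arange(n_det) - (n_det - 1) / 2.0
            X, Y = np.meshgrid(xs, xs)
            recon = np.zeros((n_det, n_det))
            for a, theta in enumerate(np.deg2rad(angles_deg)):
                t = X * np.cos(theta) + Y * np.sin(theta) + (n_det - 1) / 2.0
                recon += np.interp(t, np.arange(n_det), filtered[a])  # out-of-range samples are clamped
            return recon * np.pi / (2 * n_angles)

        # Illustrative usage with a toy sinogram of a single off-centre point source.
        angles = np.arange(0, 180, 1.0)
        sino = np.zeros((len(angles), 129))
        for a, theta in enumerate(np.deg2rad(angles)):
            sino[a, int(round(20 * np.cos(theta) + 64))] = 1.0
        image = fbp(sino, angles)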

  7. Spectral reconstruction for a 6 MV linear accelerator

    International Nuclear Information System (INIS)

    Hernandez-Bojorquez, M.; Martinez-Davalos, A.; Larraga, J. M.

    2004-01-01

    In this work we present the first results of an x-ray spectral reconstruction for a 6 MV Varian LINAC. The shape of the spectrum will be used in Monte Carlo treatment planning in order to improve the quality and accuracy of the calculated dose distributions. We based our simulation method on the formalism proposed by Francois et al. In this method the spectrum is reconstructed from transmission measurements under narrow beam geometry for multiple attenuator thicknesses. These data allowed us to reconstruct the x-ray spectrum through direct solution of matrix systems using spectral algebra formalism
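
    A hedged sketch of the transmission-inversion idea (not the Francois et al. formalism itself; the attenuation coefficients and bin structure below are made-up illustrative numbers) is to write the transmission curve as a non-negative combination of mono-energetic exponentials and solve for the weights:

        import numpy as np
        from scipy.optimize import nnls

        def reconstruct_spectrum(thicknesses, transmission, mu):
            """Recover relative spectral weights w_j >= 0 from narrow-beam transmission data,
            assuming the model T(t_i) = sum_j w_j * exp(-mu_j * t_i)."""
            A = np.exp(-np.outer(thicknesses, mu))    # (n_thicknesses, n_energy_bins)
            w, _ = nnls(A, transmission)
            return w / w.sum()

        # Illustrative usage with a made-up 3-bin "spectrum" and attenuation coefficients.
        mu = np.array([0.25, 0.10, 0.05])             # cm^-1 per energy bin (hypothetical)
        true_w = np.array([0.2, 0.5, 0.3])
        t = np.linspace(0.0, 30.0, 16)                # attenuator thicknesses in cm
        T = np.exp(-np.outer(t, mu)) @ true_w
        print(reconstruct_spectrum(t, T, mu))         # should recover true_w closely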

  8. Introduction to curved rotary tomographic apparatus 'TOMOREX'

    International Nuclear Information System (INIS)

    Kubota, Kazuo; Shinojima, Masayasu; Kohirasawa, Hideo; Tokui, Mitsuru

    1980-01-01

    In recent years, the panorama X-ray photographic method has become widely used for the X-ray diagnosis of teeth, jawbones and faces. One type, based on the principle of tomography, is the curved-surface rotary tomographic method utilizing a fine-gap X-ray beam. With the synchronous rotation of an X-ray tube and a photographic film around the face, describing a U-shaped tomographic plane along the dental arch, an upper or lower jawbone is photographed. In the ''TOMOREX'', which belongs to this type, different tomographic planes are available, so that by selecting any position in advance, the desired part can be photographed. Furthermore, patients can be examined while lying on a stretcher. The mechanism and equipment, and the photographic method for eye sockets, cheekbones, upper jaw cavities and stereoscopic images are described. (J.P.N.)

  9. Research of the system response of neutron double scatter imaging for MLEM reconstruction

    Energy Technology Data Exchange (ETDEWEB)

    Zhang, M., E-mail: wyj2013@163.com [Northwest Institute of Nuclear Technology, Xi’an 710024 (China); State Key Laboratory of Intense Pulsed Radiation-Simulation and Effect, Xi’an 710024 (China); Peng, B.D.; Sheng, L.; Li, K.N.; Zhang, X.P.; Li, Y.; Li, B.K.; Yuan, Y.; Wang, P.W.; Zhang, X.D.; Li, C.H. [Northwest Institute of Nuclear Technology, Xi’an 710024 (China); State Key Laboratory of Intense Pulsed Radiation-Simulation and Effect, Xi’an 710024 (China)

    2015-03-01

    A Maximum Likelihood image reconstruction technique has been applied to neutron scatter imaging. The response function of the imaging system can be obtained by Monte Carlo simulation, which is very time-consuming if the number of image pixels and particles is large. In this work, to improve time efficiency, an analytical approach based on the probability of neutron interaction and transport in the detector is developed to calculate the system response function. The response function was applied to calculate the relative efficiency of the neutron scatter imaging system as a function of the incident neutron energy. The calculated results agreed with simulations by the MCNP5 software. Then the maximum likelihood expectation maximization (MLEM) reconstruction method with the system response function was used to reconstruct data simulated by the Monte Carlo method. The results showed good consistency between the reconstructed positions and the true positions. Compared with back-projection reconstruction, the improvement in image quality was obvious, and the locations could be discerned easily for multiple radiation point sources.
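
    Once a system response matrix is available, whether from the analytical model or from Monte Carlo, the MLEM update itself is compact; the following Python sketch uses a random matrix purely as a stand-in for a real response function.

        import numpy as np

        def mlem(A, y, n_iter=50):
            """Maximum-likelihood expectation-maximization with a precomputed system
            response matrix A (n_measurement_bins x n_image_pixels) and data y."""
            x = np.ones(A.shape[1])
            sensitivity = A.sum(axis=0)               # A^T 1
            for _ in range(n_iter):
                forward = A @ x
                forward[forward == 0] = 1e-12         # guard against division by zero
                x = x / sensitivity * (A.T @ (y / forward))
            return x

        # Illustrative usage with a random stand-in response matrix and two point sources.
        rng = np.random.default_rng(3)
        A = rng.random((200, 64))
        x_true = np.zeros(64)
        x_true[[10, 40]] = [5.0, 2.0]
        y = rng.poisson(A @ x_true)
        x_hat = mlem(A, y)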

  10. Quality assurance for the ALICE Monte Carlo procedure

    CERN Document Server

    Ajaz, M; Hristov, Peter; Revol, Jean Pierre

    2009-01-01

    We implement the already existing macro, $ALICE_ROOT/STEER/CheckESD.C, that is run after reconstruction to compute the physics efficiency, as a task that will run on a PROOF framework like CAF. The task was implemented in a C++ class called AliAnalysisTaskCheckESD, and it inherits from the AliAnalysisTaskSE base class. The function of AliAnalysisTaskCheckESD is to compute the ratio of the number of reconstructed particles to the number of particles generated by the Monte Carlo generator. The class AliAnalysisTaskCheckESD was successfully implemented. It was used during the production for first physics and made it possible to discover several problems (missing tracks in the MUON arm reconstruction, low efficiency in the PHOS detector, etc.). The code is committed to the SVN repository and will become a standard tool for quality assurance.

  11. New Developments for Jet Substructure Reconstruction in CMS

    CERN Document Server

    CMS Collaboration

    2017-01-01

    We present Monte Carlo based studies showcasing several developments for jet substructure reconstruction in CMS. These include quark/gluon tagging algorithms using Boosted Decision Trees and Deep Neural Networks, the XCone jet clustering algorithm and the Boosted Event Shape Tagger (BEST).

  12. Hybrid Photoacoustic/Ultrasound Tomograph for Real-Time Finger Imaging.

    Science.gov (United States)

    Oeri, Milan; Bost, Wolfgang; Sénégond, Nicolas; Tretbar, Steffen; Fournelle, Marc

    2017-10-01

    We report a target-enclosing, hybrid tomograph with a total of 768 elements based on capacitive micromachined ultrasound transducer technology and providing fast, high-resolution 2-D/3-D photoacoustic and ultrasound tomography tailored to finger imaging. A freely programmable ultrasound beamforming platform sampling data at 80 MHz was developed to realize plane wave transmission under multiple angles. A multiplexing unit enables the connection and control of a large number of elements. Fast image reconstruction is provided by GPU processing. The tomograph is composed of four independent and fully automated movable arc-shaped transducers, allowing imaging of all three finger joints. The system benefits from photoacoustics, yielding high optical contrast and enabling visualization of finger vascularization, and ultrasound provides morphologic information on joints and surrounding tissue. A diode-pumped, Q-switched Nd:YAG laser and an optical parametric oscillator are used to broaden the spectrum of emitted wavelengths to provide multispectral imaging. Custom-made optical fiber bundles enable illumination of the region of interest in the plane of acoustic detection. Precision in positioning of the probe in motion is ensured by use of a motor-driven guide slide. The current position of the probe is encoded by the stage and used to relate ultrasound and photoacoustic signals to the corresponding region of interest of the suspicious finger joint. The system is characterized in phantoms and a healthy human finger in vivo. The results obtained promise to provide new opportunities in finger diagnostics and establish photoacoustic/ultrasound-tomography in medical routine. Copyright © 2017 World Federation for Ultrasound in Medicine & Biology. Published by Elsevier Inc. All rights reserved.

  13. Relaxed Simultaneous Tomographic Reconstruction and Segmentation with Class Priors for Poisson Noise

    DEFF Research Database (Denmark)

    Romanov, Mikhail; Dahl, Anders Bjorholm; Dong, Yiqiu

    : our new algorithm can handle Poisson noise in the data, and it can solve much larger problems since it does not store the matrix. We formulate this algorithm and test it on artificial test problems. Our results show that the algorithm performs well, and that we are able to produce reconstructions...

  14. A new electrode-mesh tomograph for advanced studies on bubbly flow characteristics

    Energy Technology Data Exchange (ETDEWEB)

    Richter, St.; Aritomi, M. [Tokyo Institute of Technology, Tokyo (Japan)

    2001-07-01

    For studies on the characteristics of bubble flow in a rectangular channel (20 x 100 mm), a new electrode-mesh tomograph has been applied. The measuring principle is based on local conductivity measurement and signal conditioning. The applied sensor scans the local void fraction distribution in 2 parallel planes, separated by 1.5 mm in the flow direction, with a resolution of 3.0 x 2.2 mm and an overall sampling rate of 1200 Hz (all 256 points). Algorithms for the calculation of the local instantaneous void fraction distribution and the true gas velocity are presented. Based on these values, the approximate shape of the bubbles has been reconstructed and the gas volume flow through the sensor evaluated. The superficial gas velocity as well as the local distribution of the gas volume flux can be calculated. An extensive sensitivity study illustrating the applicability and accuracy is presented, based on experimental observations as well as theoretical considerations. The evaluated results are compared with high-speed video observations of the flow field as well as data comparing the reconstructed volume flow with measurements by a laminar flow meter. Good agreement can be stated. (author)
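
    The abstract does not state the exact algorithm for the true gas velocity, but a common approach with two axially separated measurement planes is to cross-correlate their void-fraction signals and convert the lag at the correlation peak into a velocity; a minimal Python sketch under that assumption (using the 1.5 mm spacing and 1200 Hz rate quoted above, synthetic signals otherwise) is:

        import numpy as np

        def transit_velocity(signal_plane1, signal_plane2, plane_distance_m, sampling_rate_hz):
            """Gas velocity from the lag maximizing the cross-correlation of void-fraction
            signals measured in two axially separated planes."""
            s1 = signal_plane1 - np.mean(signal_plane1)
            s2 = signal_plane2 - np.mean(signal_plane2)
            xcorr = np.correlate(s2, s1, mode="full")
            lag_samples = int(np.argmax(xcorr)) - (len(s1) - 1)
            if lag_samples <= 0:
                return np.nan                         # no physically meaningful delay found
            return plane_distance_m * sampling_rate_hz / lag_samples

        # Illustrative usage: plane 2 sees the same bubbles 5 samples later.
        rng = np.random.default_rng(4)
        s1 = (rng.random(2400) < 0.3).astype(float)
        s2 = np.roll(s1, 5)
        print(transit_velocity(s1, s2, plane_distance_m=1.5e-3, sampling_rate_hz=1200.0))
        # about 0.36 m/s for a 1.5 mm plane spacing sampled at 1200 Hz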

  15. Deformable 3D–2D registration for CT and its application to low dose tomographic fluoroscopy

    International Nuclear Information System (INIS)

    Flach, Barbara; Brehm, Marcus; Sawall, Stefan; Kachelrieß, Marc

    2014-01-01

    Many applications in medical imaging include image registration for matching of images from the same or different modalities. In the case of full data sampling, the respective reconstructed images are usually of such a good image quality that standard deformable volume-to-volume (3D–3D) registration approaches can be applied. But research in temporal-correlated image reconstruction and dose reductions increases the number of cases where rawdata are available from only few projection angles. Here, deteriorated image quality leads to non-acceptable deformable volume-to-volume registration results. Therefore a registration approach is required that is robust against a decreasing number of projections defining the target position. We propose a deformable volume-to-rawdata (3D–2D) registration method that aims at finding a displacement vector field maximizing the alignment of a CT volume and the acquired rawdata based on the sum of squared differences in rawdata domain. The registration is constrained by a regularization term in accordance with a fluid-based diffusion. Both cost function components, the rawdata fidelity and the regularization term, are optimized in an alternating manner. The matching criterion is optimized by a conjugate gradient descent for nonlinear functions, while the regularization is realized by convolution of the vector fields with Gaussian kernels. We validate the proposed method and compare it to the demons algorithm, a well-known 3D–3D registration method. The comparison is done for a range of 4–60 target projections using datasets from low dose tomographic fluoroscopy as an application example. The results show a high correlation to the ground truth target position without introducing artifacts even in the case of very few projections. In particular the matching in the rawdata domain is improved compared to the 3D–3D registration for the investigated range. The proposed volume-to-rawdata registration increases the robustness

  16. Tomographic array

    International Nuclear Information System (INIS)

    1976-01-01

    The configuration of a tomographic array in which the object can rotate about its axis is described. The X-ray detector is a cylindrical screen perpendicular to the axis of rotation. The X-ray source has a line-shaped focus coinciding with the axis of rotation. The beam is fan-shaped with one side of this fan lying along the axis of rotation. The detector screen is placed inside an X-ray image multiplier tube

  17. Exact reconstruction in 2D dynamic CT: compensation of time-dependent affine deformations

    International Nuclear Information System (INIS)

    Roux, Sebastien; Desbat, Laurent; Koenig, Anne; Grangeat, Pierre

    2004-01-01

    This work is dedicated to the reduction of reconstruction artefacts due to motion occurring during the acquisition of computerized tomographic projections. This problem has to be solved when imaging moving organs such as the lungs or the heart. The proposed method belongs to the class of motion compensation algorithms, where the model of motion is included in the reconstruction formula. We address two fundamental questions. First what conditions on the deformation are required for the reconstruction of the object from projections acquired sequentially during the deformation, and second how do we reconstruct the object from those projections. Here we answer these questions in the particular case of 2D general time-dependent affine deformations, assuming the motion parameters are known. We treat the problem of admissibility conditions on the deformation in the parallel-beam and fan-beam cases. Then we propose exact reconstruction methods based on rebinning or sequential FBP formulae for each of these geometries and present reconstructed images obtained with the fan-beam algorithm on simulated data

  18. Weak lensing galaxy cluster field reconstruction

    Science.gov (United States)

    Jullo, E.; Pires, S.; Jauzac, M.; Kneib, J.-P.

    2014-02-01

    In this paper, we compare three methods to reconstruct galaxy cluster density fields with weak lensing data. The first method called FLens integrates an inpainting concept to invert the shear field with possible gaps, and a multi-scale entropy denoising procedure to remove the noise contained in the final reconstruction, that arises mostly from the random intrinsic shape of the galaxies. The second and third methods are based on a model of the density field made of a multi-scale grid of radial basis functions. In one case, the model parameters are computed with a linear inversion involving a singular value decomposition (SVD). In the other case, the model parameters are estimated using a Bayesian Monte Carlo Markov Chain optimization implemented in the lensing software LENSTOOL. Methods are compared on simulated data with varying galaxy density fields. We pay particular attention to the errors estimated with resampling. We find the multi-scale grid model optimized with Monte Carlo Markov Chain to provide the best results, but at high computational cost, especially when considering resampling. The SVD method is much faster but yields noisy maps, although this can be mitigated with resampling. The FLens method is a good compromise with fast computation, high signal-to-noise ratio reconstruction, but lower resolution maps. All three methods are applied to the MACS J0717+3745 galaxy cluster field, and reveal the filamentary structure discovered in Jauzac et al. We conclude that sensitive priors can help to get high signal-to-noise ratio, and unbiased reconstructions.
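
    The SVD-based linear inversion mentioned for the second method can be sketched generically as a truncated pseudo-inverse (illustrative Python, with a random operator standing in for the lensing model; truncation of small singular values is what trades resolution against noise):

        import numpy as np

        def svd_solve(A, b, n_modes=None):
            """Linear inversion of b = A x with a truncated singular value decomposition;
            dropping small singular values suppresses noise at the cost of resolution."""
            U, s, Vt = np.linalg.svd(A, full_matrices=False)
            if n_modes is None:
                n_modes = len(s)
            s_inv = np.zeros_like(s)
            s_inv[:n_modes] = 1.0 / s[:n_modes]
            return Vt.T @ (s_inv * (U.T @ b))

        # Illustrative usage: noisy overdetermined system, keeping only the strongest modes.
        rng = np.random.default_rng(5)
        A = rng.standard_normal((200, 50))
        x_true = rng.standard_normal(50)
        b = A @ x_true + 0.1 * rng.standard_normal(200)
        x_hat = svd_solve(A, b, n_modes=30)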

  19. Development of tomographic reconstruction algorithms for the PIXE analysis of biological samples

    International Nuclear Information System (INIS)

    Nguyen, D.T.

    2008-05-01

    The development of 3-dimensional microscopy techniques offering a spatial resolution of 1 μm or less has opened a large field of investigation in cell biology. Amongst them, an interesting advantage of ion beam micro-tomography is its ability to give quantitative results in terms of local concentrations in a direct way, using Particle Induced X-ray Emission Tomography (PIXET) combined with Scanning Transmission Ion Microscopy Tomography (STIMT). After a brief introduction to existing reconstruction techniques, we present the principle of the DISRA code, the most complete written so far, which is the basis of the present work. We have modified and extended the DISRA algorithm by considering the specific aspects of biological specimens. Moreover, correction procedures were added to the code to reduce noise in the tomograms. For portability purposes, a Windows graphical interface was designed to easily enter and modify the experimental parameters used in the reconstruction, and to control the several steps of data reduction. Results of STIMT and PIXET experiments on reference specimens and on human cancer cells are also presented. (author)

  20. MO-DE-BRA-06: 3D Image Acquisition and Reconstruction Explained with Online Animations

    International Nuclear Information System (INIS)

    Kesner, A

    2016-01-01

    Purpose: Understanding the principles of 3D imaging and image reconstruction is fundamental to the field of medical imaging. Clinicians, technologists, physicists, patients, students, and inquisitive minds all stand to benefit from greater comprehension of the supporting technologies. To help explain the basic principles of 3D imaging, we developed multi-frame animations that convey the concepts of tomographic imaging. The series of free (gif) animations are accessible online, and provide a multimedia introduction to the main concepts of image reconstruction. Methods: Text and animations were created to convey the principles of analytic tomography in CT, PET, and SPECT. Specific topics covered included: principles of sinograms/image data storage, forward projection, principles of PET acquisitions, and filtered backprojection. A total of 8 animations were created and presented for CT, PET, and digital phantom formats. In addition, a free executable is also provided to allow users to create their own tomographic animations – providing an opportunity for interaction and personalization to help foster user interest. Results: Tutorial text and animations have been posted online, freely available to view or download. The animations are in first position in a google search of “image reconstruction animations”. The website currently receives approximately 200 hits/month, from all over the world, and the usage is growing. Positive feedback has been collected from users. Conclusion: We identified a need for improved teaching tools to help visualize the (temporally variant) concepts of image reconstruction, and have shown that animations can be a useful tool for this aspect of education. Furthermore, posting animations freely on the web has shown to be a good way to maximize their impact in the community. In future endeavors, we hope to expand this animated content, to cover principles of iterative reconstruction, as well as other phenomena relating to imaging.

  1. MO-DE-BRA-06: 3D Image Acquisition and Reconstruction Explained with Online Animations

    Energy Technology Data Exchange (ETDEWEB)

    Kesner, A

    2016-06-15

    Purpose: Understanding the principles of 3D imaging and image reconstruction is fundamental to the field of medical imaging. Clinicians, technologists, physicists, patients, students, and inquisitive minds all stand to benefit from greater comprehension of the supporting technologies. To help explain the basic principles of 3D imaging, we developed multi-frame animations that convey the concepts of tomographic imaging. The series of free (gif) animations are accessible online, and provide a multimedia introduction to the main concepts of image reconstruction. Methods: Text and animations were created to convey the principles of analytic tomography in CT, PET, and SPECT. Specific topics covered included: principles of sinograms/image data storage, forward projection, principles of PET acquisitions, and filtered backprojection. A total of 8 animations were created and presented for CT, PET, and digital phantom formats. In addition, a free executable is also provided to allow users to create their own tomographic animations – providing an opportunity for interaction and personalization to help foster user interest. Results: Tutorial text and animations have been posted online, freely available to view or download. The animations are in first position in a google search of “image reconstruction animations”. The website currently receives approximately 200 hits/month, from all over the world, and the usage is growing. Positive feedback has been collected from users. Conclusion: We identified a need for improved teaching tools to help visualize the (temporally variant) concepts of image reconstruction, and have shown that animations can be a useful tool for this aspect of education. Furthermore, posting animations freely on the web has shown to be a good way to maximize their impact in the community. In future endeavors, we hope to expand this animated content, to cover principles of iterative reconstruction, as well as other phenomena relating to imaging.

  2. Calibration simulation. A calibration Monte-Carlo program for the OPAL jet chamber

    International Nuclear Information System (INIS)

    Biebel, O.

    1989-12-01

    A calibration Monte Carlo program has been developed as a tool to investigate the interdependence of track reconstruction and calibration constants. Three categories of calibration effects have been considered: The precise knowledge of sense wire positions, necessary to reconstruct the particle trajectories in the jet chamber. Included are the staggering and the sag of the sense wires as well as tilts and rotations of their support structures. The various contributions to the measured drift time, with special emphasis on the aberration due to the track angle and the presence of a transverse magnetic field. A very precise knowledge of the drift velocity and the Lorentz angle of the drift paths with respect to the drift field is also required. The effects degrading particle identification via energy loss dE/dx. Impurities of the gas mixture and saturation effects depending on the track angle as well as the influence of the pulse-shaping electronics have been studied. These effects have been parametrised with coefficients corresponding to the calibration constants required for track reconstruction. Excellent agreement with the input data has been achieved when determining calibration constants from Monte Carlo data generated with these parametrisations. (orig.)

  3. Investigation of Compton scattering correction methods in cardiac SPECT by Monte Carlo simulations

    International Nuclear Information System (INIS)

    Silva, A.M. Marques da; Furlan, A.M.; Robilotta, C.C.

    2001-01-01

    The goal of this work was the use of Monte Carlo simulations to investigate the effects of two scattering correction methods: dual energy window (DEW) and dual photopeak window (DPW), in quantitative cardiac SPECT reconstruction. MCAT torso-cardiac phantom, with 99m Tc and non-uniform attenuation map was simulated. Two different photopeak windows were evaluated in DEW method: 15% and 20%. Two 10% wide subwindows centered symmetrically within the photopeak were used in DPW method. Iterative ML-EM reconstruction with modified projector-backprojector for attenuation correction was applied. Results indicated that the choice of the scattering and photopeak windows determines the correction accuracy. For the 15% window, fitted scatter fraction gives better results than k = 0.5. For the 20% window, DPW is the best method, but it requires parameters estimation using Monte Carlo simulations. (author)
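
    The DEW correction itself is a simple window subtraction; a minimal Python sketch with synthetic projection counts (the k = 0.5 value is the classical choice discussed above) is:

        import numpy as np

        def dew_correction(photopeak_projection, scatter_projection, k=0.5):
            """Dual-energy-window scatter correction: scatter in the photopeak window is
            estimated as k times the counts in a lower scatter window and subtracted."""
            primary = photopeak_projection - k * scatter_projection
            return np.clip(primary, 0.0, None)        # negative counts are not physical

        # Illustrative usage on synthetic projection counts.
        rng = np.random.default_rng(6)
        true_primary = rng.poisson(100.0, size=(64, 64)).astype(float)
        scatter_window = rng.poisson(40.0, size=(64, 64)).astype(float)
        photopeak = true_primary + 0.5 * scatter_window
        corrected = dew_correction(photopeak, scatter_window, k=0.5)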

  4. Wide-band antenna design for use in minimal-scan, microwave tomographic imaging

    Science.gov (United States)

    Klaser, Jacob

    Microwave tomography is widely used in biomedical imaging and nondestructive evaluation of dielectric materials. A novel microwave tomography system that uses an electrically-conformable mirror to steer the incident energy for producing multi-view projection data is being developed in the Non-Destructive Evaluation Laboratory (NDEL). Such a system will have a significant advantage over existing tomography systems in terms of simplicity of design and operation, particularly when there is limited access to the structure that is being imaged. The major components of a mirror-based tomography system are the source mirror assembly and a receiver array for capturing the multi-view projection data. This thesis addresses the design and development of the receiver array. This imaging array features balanced, anti-podal Vivaldi antennas, which offer large bandwidth, high gain and a compact size. From the simulations, as well as the experimental results for the antenna, the return loss (S11) is below -10 dB for the range from 2.2 GHz to 8.2 GHz, and the gain is measured to be near 6 dB. The data gathered from the receiver array is then run through MATLAB code for tomographic reconstruction using the Filtered Back-Propagation algorithm from limited-view projections. Initial results of reconstruction from the measured data show the feasibility of the approach, but a significant challenge remains in interpolating the data for a limited number of receiving antenna elements and removing noise from the reconstructed image.

  5. Arbitrary layer tomographic method and apparatus

    International Nuclear Information System (INIS)

    Kato, H.; Ishida, M.

    1984-01-01

    Many two-dimensional X-ray projection distribution images, obtained by exposing an object to X-rays in various directions, are first stored at positions different from one another in a stimulable phosphor sheet, or respectively in many stimulable phosphor sheets. The stimulable phosphor sheet or sheets are then scanned with stimulating rays, and the light emitted thereby from the stimulable phosphor sheet or sheets is photoelectrically read out to obtain electric signals representing the X-ray projection distribution images. The electric signals are processed to obtain a tomographic image of an arbitrary tomographic layer of the object

  6. Search for 'Little Higgs' and reconstruction algorithms developments in Atlas

    International Nuclear Information System (INIS)

    Rousseau, D.

    2007-05-01

    This document summarizes developments of framework and reconstruction algorithms for the ATLAS detector at the LHC. A library of reconstruction algorithms has been developed in a more and more complex environment. The reconstruction software originally designed on an optimistic Monte-Carlo simulation, has been confronted with a more detailed 'as-built' simulation. The 'Little Higgs' is an effective theory which can be taken for granted, or as an opportunity to study heavy resonances. In several cases, these resonances can be detected in original channels like tZ, ZH or WH. (author)

  7. Multispectral x-ray CT: multivariate statistical analysis for efficient reconstruction

    Science.gov (United States)

    Kheirabadi, Mina; Mustafa, Wail; Lyksborg, Mark; Lund Olsen, Ulrik; Bjorholm Dahl, Anders

    2017-10-01

    Recent developments in multispectral X-ray detectors allow for efficient identification of materials based on their chemical composition. This has a range of applications including security inspection, which is our motivation. In this paper, we analyze data from a tomographic setup employing the MultiX detector, which records projection data in 128 energy bins covering the range from 20 to 160 keV. Obtaining all information from these data requires reconstructing 128 tomograms, which is computationally expensive. Instead, we propose to reduce the dimensionality of the projection data prior to reconstruction and to reconstruct from the reduced data. We analyze three linear methods for dimensionality reduction using a dataset with 37 equally-spaced projection angles. Four bottles with different materials are recorded, for which we obtain similar discrimination of their content using a very reduced subset of tomograms compared to the 128 tomograms that would otherwise be needed without dimensionality reduction.
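
    The abstract does not state which three linear dimensionality-reduction methods were analyzed; the sketch below uses plain PCA on the energy axis as one plausible instance, with hypothetical data shapes, to show how 128 energy bins can be collapsed to a few components before reconstruction:

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical multispectral sinogram: (angles, detector bins, energy bins)
sino = rng.poisson(50, size=(37, 64, 128)).astype(float)

# Treat the 128 energy bins as features and the spatial samples as rows.
X = sino.reshape(-1, 128)
X_centered = X - X.mean(axis=0)

# PCA via SVD: keep a handful of components instead of 128 energy bins.
n_keep = 4
U, s, Vt = np.linalg.svd(X_centered, full_matrices=False)
scores = X_centered @ Vt[:n_keep].T            # reduced projection data
reduced_sino = scores.reshape(37, 64, n_keep)

# Only n_keep tomograms (one per component) now need to be reconstructed
# instead of 128; material discrimination is then done in this reduced space.
print(reduced_sino.shape)
```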

  8. Vectorization with SIMD extensions speeds up reconstruction in electron tomography.

    Science.gov (United States)

    Agulleiro, J I; Garzón, E M; García, I; Fernández, J J

    2010-06-01

    Electron tomography allows structural studies of cellular structures at molecular detail. Large 3D reconstructions are needed to meet the resolution requirements. The processing time to compute these large volumes may be considerable, so high performance computing techniques have traditionally been used. This work presents a vector approach to tomographic reconstruction that relies on the exploitation of the SIMD extensions available in modern processors, in combination with other single-processor optimization techniques. This approach succeeds in producing full resolution tomograms with a substantial reduction in processing time, as evaluated with the most common reconstruction algorithms, namely WBP and SIRT. The main advantage stems from the fact that this approach runs on standard computers without the need for specialized hardware, which facilitates the development, use and management of programs. Future trends in processor design open excellent opportunities for vector processing with the processor's SIMD extensions in the field of 3D electron microscopy.
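
    The paper works with CPU SIMD intrinsics inside WBP and SIRT; as a loose analogy only (NumPy arrays standing in for vector registers, with made-up data), the sketch below contrasts a scalar per-voxel backprojection loop with the equivalent vectorized accumulation:

```python
import numpy as np

def backproject_row_scalar(image_row, proj, idx, weight):
    # one voxel at a time -- maps to scalar CPU instructions
    for i in range(image_row.size):
        image_row[i] += weight[i] * proj[idx[i]]

def backproject_row_vector(image_row, proj, idx, weight):
    # whole row at once -- the access pattern SIMD/vector units exploit
    image_row += weight * proj[idx]

rng = np.random.default_rng(0)
proj = rng.random(256)                  # one filtered projection
idx = rng.integers(0, 256, size=512)    # detector bin hit by each voxel
weight = rng.random(512)                # interpolation weights
row_a, row_b = np.zeros(512), np.zeros(512)
backproject_row_scalar(row_a, proj, idx, weight)
backproject_row_vector(row_b, proj, idx, weight)
assert np.allclose(row_a, row_b)
```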

  9. Gamma Ray Tomographic Scan Method for Large Scale Industrial Plants

    International Nuclear Information System (INIS)

    Moon, Jin Ho; Jung, Sung Hee; Kim, Jong Bum; Park, Jang Geun

    2011-01-01

    Gamma-ray tomography systems have been used to investigate chemical processes for the last decade. There have been many cases of gamma-ray tomography for laboratory-scale work, but not many for industrial-scale work. Non-tomographic equipment with gamma-ray sources is often used in process diagnosis; gamma radiography, gamma column scanning and the radioisotope tracer technique are examples of gamma-ray applications in industry. Although such non-tomographic gamma-ray equipment is widely used outdoors, most gamma-ray tomographic systems have remained indoor equipment. As gamma tomography has developed, however, the demand for gamma tomography of real-scale plants has also increased. To develop an industrial-scale system, we introduced a gamma-ray tomographic system with fixed detectors and a rotating source. The general system configuration is similar to a fourth-generation geometry, but the main effort has been devoted to enabling instant installation of the system at a real-scale industrial plant. This work is a first attempt to apply fourth-generation industrial gamma tomographic scanning experimentally. Individual 0.5-inch NaI detectors were used for gamma-ray detection, arranged in a circle around the industrial plant. This tomographic scan method reduces mechanical complexity and requires a much smaller space than a conventional CT. Those properties make it easy to obtain measurement data for a real-scale plant.

  10. Large R jet reconstruction and calibration at 13 TeV with the ATLAS detector

    CERN Document Server

    Taenzer, Joe; The ATLAS collaboration

    2017-01-01

    Large-R jets are used by many ATLAS analyses working in boosted regimes. ATLAS large-R jets are reconstructed from locally calibrated calorimeter topoclusters with the anti-k_{t} algorithm with radius parameter R = 1.0, and then groomed to remove pile-up with the trimming algorithm with f_{cut} = 0.05 and subjet radius R = 0.2. Monte Carlo based energy and mass calibrations correct the reconstructed jet energy and mass to truth, followed by in-situ calibrations using a number of different techniques. Large-R jets can also be reconstructed using small-R jets as constituents instead of topoclusters, a technique called jet reclustering, or from track calo clusters (TCCs), which are constituents constructed using both tracking and calorimeter information. An overview of large-R jet reconstruction is presented here, along with selected results from the jet mass calibrations, both Monte Carlo based and in-situ, from jet reclustering, and from track calo clusters.

  11. Tomographic Approach in Three-Orthogonal-Basis Quantum Key Distribution

    International Nuclear Information System (INIS)

    Liang Wen-Ye; Yin Zhen-Qiang; Chen Hua; Li Hong-Wei; Chen Wei; Han Zheng-Fu; Wen Hao

    2015-01-01

    At present, there is an increasing awareness of three-orthogonal-basis quantum key distribution protocols such as the reference-frame-independent (RFI) protocol and the six-state protocol. For secure key rate estimation of these protocols, there are two methods: one is the conventional approach, and the other is the tomographic approach. However, a comparison between these two methods has not yet been given. In this work, with the general model of a rotation channel, we estimate the key rate using the conventional and tomographic methods respectively. Results show that the conventional estimation approach in the RFI protocol is equivalent to the tomographic approach only in the case where one of the three orthogonal bases is always aligned. In other cases, the tomographic approach performs much better than the respective conventional approaches of the RFI protocol and the six-state protocol. Furthermore, based on the experimental data, we illustrate the deep connections between the tomographic and the conventional RFI approach representations. (paper)

  12. Radiographic test phantom for computed tomographic lung nodule analysis

    International Nuclear Information System (INIS)

    Zerhouni, E.A.

    1987-01-01

    This patent describes a method for evaluating a computed tomographic scan of a nodule in a lung of a human or non-human animal. The method comprises generating a computed tomograph of a transverse section of the animal containing lung and nodule tissue, and generating a second computed tomograph of a test phantom, a device which simulates the transverse section of the animal. The tissue-simulating portions of the device are constructed of materials having radiographic densities substantially identical to those of the corresponding tissue in the simulated transverse section, and have voids therein which simulate, in size and shape, the lung cavities in the transverse section. These voids contain a test reference nodule constructed of a material of predetermined radiographic density, which simulates in size, shape and position within a lung cavity void of the test phantom the nodule in the transverse section of the animal. The respective tomographs are then compared.

  13. A comparison of quantitative reconstruction techniques for PIXE-tomography analysis applied to biological samples

    Energy Technology Data Exchange (ETDEWEB)

    Beasley, D.G., E-mail: dgbeasley@ctn.ist.utl.pt [IST/C2TN, Universidade de Lisboa, Campus Tecnológico e Nuclear, E.N.10, 2686-953 Sacavém (Portugal); Alves, L.C. [IST/C2TN, Universidade de Lisboa, Campus Tecnológico e Nuclear, E.N.10, 2686-953 Sacavém (Portugal); Barberet, Ph.; Bourret, S.; Devès, G.; Gordillo, N.; Michelet, C. [Univ. Bordeaux, CENBG, UMR 5797, F-33170 Gradignan (France); CNRS, IN2P3, CENBG, UMR 5797, F-33170 Gradignan (France); Le Trequesser, Q. [Univ. Bordeaux, CENBG, UMR 5797, F-33170 Gradignan (France); CNRS, IN2P3, CENBG, UMR 5797, F-33170 Gradignan (France); Institut de Chimie de la Matière Condensée de Bordeaux (ICMCB, UPR9048) CNRS, Université de Bordeaux, 87 avenue du Dr. A. Schweitzer, Pessac F-33608 (France); Marques, A.C. [IST/IPFN, Universidade de Lisboa, Campus Tecnológico e Nuclear, E.N.10, 2686-953 Sacavém (Portugal); Seznec, H. [Univ. Bordeaux, CENBG, UMR 5797, F-33170 Gradignan (France); CNRS, IN2P3, CENBG, UMR 5797, F-33170 Gradignan (France); Silva, R.C. da [IST/IPFN, Universidade de Lisboa, Campus Tecnológico e Nuclear, E.N.10, 2686-953 Sacavém (Portugal)

    2014-07-15

    The tomographic reconstruction of biological specimens requires robust algorithms, able to deal with low density contrast and low element concentrations. At the IST/ITN microprobe facility, new GPU-accelerated reconstruction software, JPIXET, has been developed, which can significantly increase the speed of quantitative reconstruction of Proton Induced X-ray Emission Tomography (PIXE-T) data. It has a user-friendly graphical user interface for pre-processing, data analysis and reconstruction of PIXE-T and Scanning Transmission Ion Microscopy Tomography (STIM-T). The reconstruction of PIXE-T data is performed using either an algorithm based on a GPU-accelerated version of the Maximum Likelihood Expectation Maximisation (MLEM) method or a GPU-accelerated version of the Discrete Image Space Reconstruction Algorithm (DISRA) (Sakellariou (2001) [2]). The original DISRA, its accelerated version, and the MLEM algorithm were compared for the reconstruction of a biological sample of Caenorhabditis elegans – a small worm. This sample was analysed at the microbeam line of the AIFIRA facility of CENBG, Bordeaux. A qualitative PIXE-T reconstruction was obtained using the CENBG software package TomoRebuild (Habchi et al. (2013) [6]). The effects of pre-processing and experimental conditions on the elemental concentrations are discussed.
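
    For readers unfamiliar with MLEM, the following is a minimal, generic (CPU, dense-matrix) sketch of the multiplicative update used by such reconstructions; it is not the GPU-accelerated JPIXET code, and the toy system matrix is invented for illustration:

```python
import numpy as np

def mlem(A, p, n_iter=50, eps=1e-12):
    """MLEM update: x <- x * A^T(p / (A x)) / (A^T 1)."""
    x = np.ones(A.shape[1])
    sens = A.T @ np.ones(A.shape[0])            # sensitivity image
    for _ in range(n_iter):
        forward = np.maximum(A @ x, eps)        # forward projection
        x *= (A.T @ (p / forward)) / np.maximum(sens, eps)
    return x

# tiny illustration: 3 projections of a 2-pixel object
A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])
p = A @ np.array([2.0, 5.0])
print(mlem(A, p))   # converges towards [2, 5]
```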

  14. Studies of discrete symmetries in a purely leptonic system using the Jagiellonian Positron Emission Tomograph

    Directory of Open Access Journals (Sweden)

    Moskal P.

    2016-01-01

    Discrete symmetries such as parity (P), charge conjugation (C) and time reversal (T) are of fundamental importance in physics and cosmology. Breaking of charge conjugation symmetry (C) and of its combination with parity (CP) constitute necessary conditions for the existence of the asymmetry between matter and antimatter in the observed Universe. The presently known sources of discrete symmetry violation can account for only a tiny fraction of the excess of matter over antimatter. So far, CP and T symmetry violations were observed only for systems involving quarks and have never been reported for purely leptonic objects. In this article we briefly describe an experimental proposal for the test of discrete symmetries in the decays of the positronium atom, which is made exclusively of leptons. The experiments are conducted by means of the Jagiellonian Positron Emission Tomograph (J-PET), which is constructed from strips of plastic scintillators enabling registration of photons from the positronium annihilation. The J-PET tomograph together with the positronium target system enables measurement of expectation values for discrete-symmetry-odd operators constructed from (i) the spin vector of the ortho-positronium atom, (ii) the momentum vectors of photons originating from the decay of positronium, and (iii) the linear polarization direction of annihilation photons. Linearly polarized positronium will be produced in highly porous aerogel or polymer targets, exploiting longitudinally polarized positrons emitted by the sodium isotope 22Na. Information about the polarization vector of ortho-positronium will be available on an event-by-event basis and will be reconstructed from the known position of the positron source and the reconstructed position of the ortho-positronium annihilation. In 2016 the first tests and calibration runs are planned, and data collection with high statistics will commence in 2017.

  15. Development of the two Korean adult tomographic computational phantoms for organ dosimetry

    International Nuclear Information System (INIS)

    Lee, Choonsik; Lee, Choonik; Park, Sang-Hyun; Lee, Jai-Ki

    2006-01-01

    Following the previously developed Korean tomographic phantom, KORMAN, two additional whole-body tomographic phantoms of Korean adult males were developed from magnetic resonance (MR) and computed tomography (CT) images, respectively. Two healthy male volunteers, whose body dimensions were fairly representative of the average Korean adult male, were recruited and scanned for phantom development. Contiguous whole-body MR images were obtained from one subject, exclusive of the arms, while whole-body CT images were acquired from the second individual. A total of 29 organs and tissues and 19 skeletal sites were segmented via image manipulation techniques such as gray-level thresholding, region growing, and manual drawing, and each segmented image slice was subsequently reviewed by an experienced radiologist for anatomical accuracy. The resulting phantoms, the MR-based KTMAN-1 (Korean Typical MAN-1) and the CT-based KTMAN-2 (Korean Typical MAN-2), consist of 300 x 150 x 344 voxels with a voxel resolution of 2 x 2 x 5 mm3 for both phantoms. Masses of segmented organs and tissues were calculated as the product of a nominal reference density, the per-voxel volume, and the cumulative number of voxels defining each organ or tissue. These organ masses were then compared with those of both the Asian and the ICRP reference adult male. Organ masses within both KTMAN-1 and KTMAN-2 showed differences within 40% of Asian and ICRP reference values, with the exception of the skin, gall bladder, and pancreas, which displayed larger differences. The resulting three-dimensional binary file was ported to the Monte Carlo code MCNPX2.4 to calculate organ doses following external irradiation for illustrative purposes. Colon, lung, liver, and stomach absorbed doses, as well as the effective dose, for idealized photon irradiation geometries (anterior-posterior and right lateral) were determined, and then compared with data from two other tomographic phantoms (Asian and Caucasian), and

  16. Statistical analysis of nonlinearly reconstructed near-infrared tomographic images: Part I--Theory and simulations.

    Science.gov (United States)

    Pogue, Brian W; Song, Xiaomei; Tosteson, Tor D; McBride, Troy O; Jiang, Shudong; Paulsen, Keith D

    2002-07-01

    Near-infrared (NIR) diffuse tomography is an emerging method for imaging the interior of tissues to quantify concentrations of hemoglobin and exogenous chromophores non-invasively in vivo. It often exploits an optical diffusion model-based image reconstruction algorithm to estimate spatial property values from measurements of the light flux at the surface of the tissue. In this study, mean-squared error (MSE) over the image is used to evaluate methods for regularizing the ill-posed inverse image reconstruction problem in NIR tomography. Estimates of image bias and image standard deviation were calculated based upon 100 repeated reconstructions of a test image with randomly distributed noise added to the light flux measurements. It was observed that the bias error dominates at high regularization parameter values while variance dominates as the algorithm is allowed to approach the optimal solution. This optimum does not necessarily correspond to the minimum projection error solution, but typically requires further iteration with a decreasing regularization parameter to reach the lowest image error. Increasing measurement noise causes a need to constrain the minimum regularization parameter to higher values in order to achieve a minimum in the overall image MSE.
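
    A minimal sketch of the repeated-reconstruction analysis described above (with a toy 1D "reconstruction" by smoothing and an assumed noise level): the same true image is reconstructed many times with fresh measurement noise, and the pixel-wise MSE is split into squared bias and variance:

```python
import numpy as np

def mse_decomposition(reconstruct, true_image, make_noisy_data, n_runs=100):
    """Repeat the reconstruction with fresh noise and split the pixel-wise
    mean-squared error into squared bias and variance."""
    recons = np.stack([reconstruct(make_noisy_data()) for _ in range(n_runs)])
    bias2 = (recons.mean(axis=0) - true_image) ** 2
    variance = recons.var(axis=0)
    return bias2.mean(), variance.mean(), (bias2 + variance).mean()

# toy example: "reconstruction" = smoothing a noisy copy of a 1D object;
# heavier smoothing trades variance for bias, as regularization does here
rng = np.random.default_rng(1)
true = np.zeros(64)
true[24:40] = 1.0
noisy = lambda: true + rng.normal(0.0, 0.3, true.shape)
smooth = lambda d: np.convolve(d, np.ones(5) / 5, mode="same")
print(mse_decomposition(smooth, true, noisy))
```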

  17. Monte Carlo-based dose reconstruction in a rat model for scattered ionizing radiation investigations.

    Science.gov (United States)

    Kirkby, Charles; Ghasroddashti, Esmaeel; Kovalchuk, Anna; Kolb, Bryan; Kovalchuk, Olga

    2013-09-01

    In radiation biology, rats are often irradiated, but the precise dose distributions are often lacking, particularly in areas that receive scatter radiation. We used a non-dedicated set of resources to calculate detailed dose distributions, including doses to peripheral organs well outside of the primary field, in common rat exposure settings. We conducted a detailed dose reconstruction in a rat through an analog to the conventional human treatment planning process. The process consisted of: (i) Characterizing source properties of an X-ray irradiator system, (ii) acquiring a computed tomography (CT) scan of a rat model, and (iii) using a Monte Carlo (MC) dose calculation engine to generate the dose distribution within the rat model. We considered cranial and liver irradiation scenarios where the rest of the body was protected by a lead shield. Organs of interest were the brain, liver and gonads. The study also included paired scenarios where the dose to adjacent, shielded rats was determined as a potential control for analysis of bystander effects. We established the precise doses and dose distributions delivered to the peripheral organs in single and paired rats. Mean doses to non-targeted organs in irradiated rats ranged from 0.03-0.1% of the reference platform dose. Mean doses to the adjacent rat peripheral organs were consistent to within 10% those of the directly irradiated rat. This work provided details of dose distributions in rat models under common irradiation conditions and established an effective scenario for delivering only scattered radiation consistent with that in a directly irradiated rat.

  18. First experience with a mobile computed tomograph in the USSR

    International Nuclear Information System (INIS)

    Portnoj, L.M.

    1989-01-01

    Experience with the use of a mobile computed tomograph mounted in a bus is presented. Problems concerning staffing, the selection of host medical institutions, etc., are considered. The efficiency of mobile computed tomographs in revealing different diseases is pointed out.

  19. Dynamic dual-tracer PET reconstruction.

    Science.gov (United States)

    Gao, Fei; Liu, Huafeng; Jian, Yiqiang; Shi, Pengcheng

    2009-01-01

    Although it has important medical implications, simultaneous dual-tracer positron emission tomography reconstruction remains a challenging problem, primarily because the photon measurements from the two tracers overlap. In this paper, we propose a simultaneous dynamic dual-tracer reconstruction of tissue activity maps based on guidance from tracer kinetics. The dual-tracer reconstruction problem is formulated in a state-space representation, where parallel compartment models serve as the continuous-time system equation describing the tracer kinetic processes of the two tracers, and the imaging data are expressed as discrete sampling of the system states in the measurement equation. The image reconstruction problem thereby becomes a state estimation problem in a continuous-discrete hybrid paradigm, and H-infinity filtering is adopted as the estimation strategy. As H-infinity filtering makes no assumptions on the system and measurement statistics, robust reconstruction results can be obtained for the dual-tracer PET imaging system, where the statistical properties of the measurement data and the system uncertainty are not available a priori, even when there are disturbances in the kinetic parameters. Experimental results on digital phantoms, Monte Carlo simulations and physical phantoms have demonstrated superior performance.

  20. Quantitative tomography simulations and reconstruction algorithms

    International Nuclear Information System (INIS)

    Martz, H.E.; Aufderheide, M.B.; Goodman, D.; Schach von Wittenau, A.; Logan, C.; Hall, J.; Jackson, J.; Slone, D.

    2000-01-01

    X-ray, neutron and proton transmission radiography and computed tomography (CT) are important diagnostic tools that are at the heart of LLNL's effort to meet the goals of the DOE's Advanced Radiography Campaign. This campaign seeks to improve radiographic simulation and analysis so that radiography can be a useful quantitative diagnostic tool for stockpile stewardship. Current radiographic accuracy does not allow satisfactory separation of experimental effects from the true features of an object's tomographically reconstructed image. This can lead to difficult and sometimes incorrect interpretation of the results. By improving our ability to simulate the whole radiographic and CT system, it will be possible to examine the contribution of system components to various experimental effects, with the goal of removing or reducing them. In this project, we are merging this simulation capability with a maximum-likelihood (constrained-conjugate-gradient, CCG) reconstruction technique, yielding a physics-based, forward-model image-reconstruction code. In addition, we seek to improve the accuracy of computed tomography from transmission radiographs by studying what physics is needed in the forward model. During FY 2000, an improved version of the LLNL ray-tracing code called HADES was coupled with a recently developed LLNL CT algorithm known as CCG. The problem of image reconstruction is expressed as a large matrix equation relating a model for the object being reconstructed to its projections (radiographs). Using a constrained-conjugate-gradient search algorithm, a maximum likelihood solution is sought. This search continues until the difference between the input measured radiographs or projections and the simulated or calculated projections is satisfactorily small.
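
    As a much-simplified sketch of the search at the heart of such a code (plain unconstrained least-squares conjugate gradient on a toy system matrix; the LLNL CCG algorithm additionally applies constraints and a maximum-likelihood objective):

```python
import numpy as np

def cgls(A, b, n_iter=30):
    """Unconstrained least-squares conjugate gradient applied to A x = b."""
    x = np.zeros(A.shape[1])
    r = b - A @ x
    p = s = A.T @ r
    gamma = s @ s
    for _ in range(n_iter):
        if gamma < 1e-15:            # residual gradient vanished: converged
            break
        q = A @ p
        alpha = gamma / (q @ q)
        x = x + alpha * p
        r = r - alpha * q
        s = A.T @ r
        gamma_new = s @ s
        p = s + (gamma_new / gamma) * p
        gamma = gamma_new
    return x

A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])
b = A @ np.array([3.0, 4.0])
print(cgls(A, b))   # recovers [3, 4]
```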

  1. Handling data redundancy in helical cone beam reconstruction with a cone-angle-based window function and its asymptotic approximation

    International Nuclear Information System (INIS)

    Tang Xiangyang; Hsieh Jiang

    2007-01-01

    A cone-angle-based window function is defined in this manuscript for image reconstruction using helical cone beam filtered backprojection (CB-FBP) algorithms. Rather than defining the window boundaries on the two-dimensional detector acquiring projection data for computed tomographic imaging, the cone-angle-based window function deals with data redundancy by selecting rays with the smallest cone angle relative to the reconstruction plane. To be computationally efficient, an asymptotic approximation of the cone-angle-based window function is also given and analyzed in this paper. A benefit of using such an asymptotic approximation is the avoidance of functional discontinuities that cause artifacts in reconstructed tomographic images. The cone-angle-based window function and its asymptotic approximation provide a way, equivalent to the Tam-Danielsson window, for helical CB-FBP reconstruction algorithms to deal with data redundancy, regardless of whether the helical pitch is constant or dynamically variable during a scan. Taking the cone-parallel geometry as an example, a computer simulation study is conducted to evaluate the proposed window function and its asymptotic approximation for helical CB-FBP reconstruction algorithms handling data redundancy. The computer-simulated Forbild head and thorax phantoms are utilized in the performance evaluation, showing that the proposed cone-angle-based window function and its asymptotic approximation can deal with data redundancy very well in cone beam image reconstruction from projection data acquired along helical source trajectories. Moreover, a numerical study carried out in this paper reveals that the proposed cone-angle-based window function is actually equivalent to the Tam-Danielsson window, and rigorous mathematical proofs are being investigated.
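
    The paper's exact window function and its asymptotic form are not reproduced in the abstract; the sketch below only illustrates the underlying idea with generic stand-ins: a hard selection of the redundant ray with the smallest cone angle (discontinuous) versus an assumed smooth soft-min weighting (continuous and normalized), which is not the paper's asymptotic approximation:

```python
import numpy as np

def hard_cone_angle_weights(cone_angles):
    """All weight on the redundant ray with the smallest |cone angle|."""
    w = np.zeros_like(cone_angles)
    w[np.argmin(np.abs(cone_angles))] = 1.0
    return w

def soft_cone_angle_weights(cone_angles, sharpness=50.0):
    """Generic smooth soft-min stand-in: continuous in the cone angles and
    normalized to one, avoiding the discontinuity of the hard selection."""
    w = np.exp(-sharpness * np.abs(cone_angles))
    return w / w.sum()

angles = np.array([0.08, -0.02, 0.15])  # cone angles (rad) of redundant rays
print(hard_cone_angle_weights(angles))
print(soft_cone_angle_weights(angles))
```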

  2. A new method to evaluate image quality of nuclear medicine tomographs

    International Nuclear Information System (INIS)

    Giannone, C.A.; Cabrejas, M.L.; Arashiro, J.A.

    2002-01-01

    Objective: To evaluate the usefulness of a new statistic, the Performance Index (PI), in making judgements about the diagnostic accuracy of nuclear medicine tomographs (NMT). Methods: A phantom was designed for blind evaluation of device performance. It has 8 cold cylindrical inserts of different diameters. Acquisitions were performed in 40 labs following a defined protocol (under an International Atomic Energy Agency survey). Non-reconstructed sets of views were processed and evaluated at a central lab using the same protocol for all the studies. Lesion detection was performed on eye-selected reconstructed slices after applying a smoothing filter and a look-up table (LUT) with fixed thresholds: counts/pixel = mean ± K · standard deviation, with K = 1, 2, 3 or >3. The number and location of the inserts were reported by blind observers, after which the true and false positive fractions were assessed by another observer. Receiver operating characteristic (ROC) analysis cannot be applied in our experiment, where each image with multiple simulated lesions needs to be evaluated. A free-response ROC analysis, developed for evaluating observer performance, also has flaws. Moreover, our goal was to assess device performance while minimising the observer component. A new index, PI, that simultaneously considers the number of true and false positives (TP and FP), was evaluated to categorise NMT. PI is the ratio between the positive predictive value and the sensitivity, expressed as its complement with a constant added to avoid a singularity. Results: The smoothing filter and the selected LUT lead to observer-independent simulated lesion detection. Based on statistical analysis (bootstrapping), it is concluded that the number of observed false positives must be lower than the number of observed true positives (no. FP < no. TP) for an instrument to be accepted for clinical purposes. Moreover, the number of observed TP must be considered in relation to a minimum tomographic resolution needed to achieve enough
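
    The exact published form of PI is not given in the abstract; the sketch below only computes its two ingredients, positive predictive value and sensitivity, from observed counts for a phantom with a known number of inserts (the counts used are hypothetical):

```python
def ppv_and_sensitivity(n_tp, n_fp, n_inserts):
    """Positive predictive value and sensitivity from observed counts,
    for a phantom with a known number of (cold) inserts."""
    ppv = n_tp / (n_tp + n_fp) if (n_tp + n_fp) else 0.0
    sensitivity = n_tp / n_inserts
    return ppv, sensitivity

# e.g. 6 of the 8 cold inserts detected, with 2 false positives reported
print(ppv_and_sensitivity(n_tp=6, n_fp=2, n_inserts=8))
```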

  3. BPF-type region-of-interest reconstruction for parallel translational computed tomography.

    Science.gov (United States)

    Wu, Weiwen; Yu, Hengyong; Wang, Shaoyu; Liu, Fenglin

    2017-01-01

    The objective of this study is to present and test a new ultra-low-cost, linear-scan-based tomography architecture. As in linear tomosynthesis, the source and detector are translated in opposite directions, and the data acquisition system targets a region-of-interest (ROI) to acquire data for image reconstruction. This kind of tomographic architecture is named parallel translational computed tomography (PTCT). In previous studies, filtered backprojection (FBP)-type algorithms were developed to reconstruct images from PTCT. However, the ROI images reconstructed from truncated projections have severe truncation artefacts. To overcome this limitation, in this study we propose two backprojection filtering (BPF)-type algorithms, named MP-BPF and MZ-BPF, to reconstruct ROI images from truncated PTCT data. A weight function is constructed to deal with data redundancy for multi-linear translation modes. Extensive numerical simulations are performed to evaluate the proposed MP-BPF and MZ-BPF algorithms for PTCT in fan-beam geometry. Qualitative and quantitative results demonstrate that the proposed BPF-type algorithms can not only more accurately reconstruct ROI images from truncated projections but also generate high-quality images for the entire image support in some circumstances.

  4. Prediction of beam hardening artefacts in computed tomography using Monte Carlo simulations

    DEFF Research Database (Denmark)

    Thomsen, M.; Bergbäck Knudsen, Erik; Willendrup, Peter Kjær

    2015-01-01

    We show how radiological images of both single and multi material samples can be simulated using the Monte Carlo simulation tool McXtrace and how these images can be used to make a three dimensional reconstruction. Good numerical agreement between the X-ray attenuation coefficient in experimental...

  5. THREE-DIMENSIONAL TOMOGRAPHIC RECONSTRUCTION OF FOUNDRY ARTICLES ON LIMITED MODEL AND EXPERIMENTAL DATA

    Directory of Open Access Journals (Sweden)

    V. L. Vengrinovich

    2009-01-01

    The ways of overcoming the lack of source information are presented, allowing the energy of the primary X-radiation necessary for radiography to be reduced while providing high quality of reconstruction.

  6. Computed tomographic findings of intracranial pyogenic abscess

    International Nuclear Information System (INIS)

    Kim, S. J.; Suh, J. H.; Park, C. Y.; Lee, K. C.; Chung, S. S.

    1982-01-01

    The early diagnosis and effective treatment of brain abscess pose a difficult clinical problem. With the advent of computed tomography, however, it appears that mortality due to intracranial abscess has diminished significantly. 54 cases of intracranial pyogenic abscess are presented. Etiologic factors and computed tomographic findings are analyzed, and the following results are obtained. 1. The common etiologic factors are otitis media, previous surgery, and head trauma, in order of frequency. 2. The most common initial computed tomographic finding of brain abscess is ring contrast enhancement with surrounding brain edema. 3. The most characteristic pattern of ring contrast enhancement is a smooth, thin-walled ring. 4. Most thick, irregular ring enhancements are abscesses associated with cyanotic heart disease or poor surgical results. 5. The most common finding of epidural and subdural empyema is a crescentic radiolucent area with thin-walled contrast enhancement, without surrounding brain edema, over the convexity of the brain.

  7. Tomographic particle image velocimetry investigation of the flow in a modeled human carotid artery bifurcation

    Science.gov (United States)

    Buchmann, N. A.; Atkinson, C.; Jeremy, M. C.; Soria, J.

    2011-04-01

    Hemodynamic forces within the human carotid artery are well known to play a key role in the initiation and progression of vascular diseases such as atherosclerosis. The degree and extent of the disease largely depends on the prevailing three-dimensional flow structure and wall shear stress (WSS) distribution. This work presents tomographic PIV (Tomo-PIV) measurements of the flow structure and WSS in a physiologically accurate model of the human carotid artery bifurcation. The vascular geometry is reconstructed from patient-specific data and reproduced in a transparent flow phantom to demonstrate the feasibility of Tomo-PIV in a complex three-dimensional geometry. Tomographic reconstruction is performed with the multiplicative line-of-sight (MLOS) estimation and simultaneous multiplicative algebraic reconstruction (SMART) technique. The implemented methodology is validated by comparing the results with Stereo-PIV measurements in the same facility. Using a steady flow assumption, the measurement error and RMS uncertainty are directly inferred from the measured velocity field. It is shown that the measurement uncertainty increases for increasing light sheet thickness and increasing velocity gradients, which are largest near the vessel walls. For a typical volume depth of 6 mm (or 256 pixel), the analysis indicates that the velocity derived from 3D cross-correlation can be measured within ±2% of the maximum velocity (or ±0.2 pixel) near the center of the vessel and within ±5% (±0.6 pixel) near the vessel wall. The technique is then applied to acquire 3D-3C velocity field data at multiple axial locations within the carotid artery model, which are combined to yield the flow field and WSS in a volume of approximately 26 mm × 27 mm × 60 mm. Shear stress is computed from the velocity gradient tensor and a method for inferring the WSS distribution on the vessel wall is presented. The results indicate the presence of a complex and three-dimensional flow structure, with

  8. A Convex Reconstruction Model for X-ray Tomographic Imaging with Uncertain Flat-fields

    DEFF Research Database (Denmark)

    Aggrawal, Hari Om; Andersen, Martin Skovgaard; Rose, Sean

    2018-01-01

    has a negligible effect on the reconstruction quality. However, in time- or dose-limited applications such as dynamic CT, this uncertainty may cause severe and systematic artifacts known as ring artifacts. By carefully modeling the measurement process and by taking uncertainties into account, we...

  9. Apparatus for tomography in which signal profiles gathered from divergent radiation can be reconstructed in signal profiles, each corresponding with a beam of parallel rays

    International Nuclear Information System (INIS)

    1976-01-01

    A tomograph is discussed which is capable of gathering divergent radiation and reconstructing it into signal profiles or images, each corresponding to a beam of parallel rays; this may eliminate the interfering point dispersion function which normally occurs.

  10. Intensity-based bayesian framework for image reconstruction from sparse projection data

    International Nuclear Information System (INIS)

    Rashed, E.A.; Kudo, Hiroyuki

    2009-01-01

    This paper presents a Bayesian framework for iterative image reconstruction from projection data measured over a limited number of views. The classical Nyquist sampling rule yields the minimum number of projection views required for accurate reconstruction. However, challenges exist in many medical and industrial imaging applications in which the projection data is undersampled. Classical analytical reconstruction methods such as filtered backprojection (FBP) are not a good choice for use in such cases because the data undersampling in the angular range introduces aliasing and streak artifacts that degrade lesion detectability. In this paper, we propose a Bayesian framework for maximum likelihood-expectation maximization (ML-EM)-based iterative reconstruction methods that incorporates a priori knowledge obtained from expected intensity information. The proposed framework is based on the fact that, in tomographic imaging, it is often possible to expect a set of intensity values of the reconstructed object with relatively high accuracy. The image reconstruction cost function is modified to include the l1-norm distance to the a priori known information. The proposed method has the potential to regularize the solution to reduce artifacts without missing lesions that cannot be expected from the a priori information. Numerical studies showed a significant improvement in image quality and lesion detectability under the condition of highly undersampled projection data. (author)

  11. The ATLAS Fast Monte Carlo Production Chain Project

    CERN Document Server

    Jansky, Roland Wolfgang; The ATLAS collaboration

    2015-01-01

    During the last years, ATLAS has successfully deployed a new integrated simulation framework (ISF) which allows a flexible mixture of full and fast detector simulation techniques within the processing of one event. The speed-up in detector simulation of up to a factor of 100 thereby made possible renders the subsequent digitization and reconstruction the dominant contributions to the Monte Carlo (MC) production CPU cost. The slowest components of both digitization and reconstruction are in the Inner Detector, due to the complex signal modeling needed to emulate the detector readout and to the combinatorial nature of the reconstruction problem, respectively. Alternative fast approaches have been developed for these components: for the silicon-based detectors, a simpler geometrical clustering approach has been deployed, replacing the charge drift emulation in the standard digitization modules, which achieves a very high accuracy in describing the standard output. For the Inner Detector track...

  12. X-ray tomographic in-service testing of girth welds - The European project TomoWELD

    International Nuclear Information System (INIS)

    Ewert, Uwe; Redmer, Bernhard; Walter, David; Thiessenhusen, Kai-Uwe; Bellon, Carsten; Nicholson, P. Ian; Clarke, Alan; Finke-Haerkoenen, Klaus-Peter; Scharfschwerdt, Joerg W.; Rohde, Karsten

    2015-01-01

    The new standard ISO 17636-2:2013, 'NDT of welded joints - Radiographic testing - Part 2: X- and gamma radiographic testing with digital detectors', defines the testing practice for digital radiography of welds in production and in-service inspection. Furthermore, DIN 25435-7:2014, 'In-service inspections of the components of the primary circuit of light water reactors - Part 7: Radiographic testing', was published. The essential requirements are discussed. The new TomoWELD system can both perform measurements according to these standards and record tomographic cross-sectional images (equivalent to metallographic sections) to determine image sizes. Areas of application are chemical and nuclear facilities. It provides fast testing of girth welds compared with the use of film or imaging plates. In 2006 the mechanized planar tomography system TomoCAR was introduced, with which cross-sectional images could already be measured. TomoWELD uses a new photon-counting and energy-resolving detector with CdTe-CMOS crystal hybrids. The new detector allows the choice of energy thresholds and enables the reduction of the influence of scattered radiation on the radiographic images and the reconstructed cross-sectional images. An optimized irradiation geometry with a new manipulator design and a fast GPU-based reconstruction algorithm are used to accelerate the reconstruction and to improve the reconstruction results. The size and the shape of planar and voluminous irregularities can be determined. The concept and the first pictures are presented. (Contains mainly PowerPoint slides)

  13. Penalized maximum likelihood reconstruction for x-ray differential phase-contrast tomography

    International Nuclear Information System (INIS)

    Brendel, Bernhard; Teuffenbach, Maximilian von; Noël, Peter B.; Pfeiffer, Franz; Koehler, Thomas

    2016-01-01

    Purpose: The purpose of this work is to propose a cost function with regularization to iteratively reconstruct attenuation, phase, and scatter images simultaneously from differential phase contrast (DPC) acquisitions, without the need for phase retrieval, and to examine its properties. Furthermore, this reconstruction method is applied to an acquisition pattern that is suitable for a DPC tomographic system with a continuously rotating gantry (sliding window acquisition), overcoming the severe smearing of noniterative reconstruction. Methods: We derive a penalized maximum likelihood reconstruction algorithm to directly reconstruct attenuation, phase, and scatter images from the measured detector values of a DPC acquisition. The proposed penalty comprises, for each of the three images, an independent smoothing prior. Image quality of the proposed reconstruction is compared to images generated with FBP and iterative reconstruction after phase retrieval. Furthermore, the influence between the priors is analyzed. Finally, the proposed reconstruction algorithm is applied to experimental sliding window data acquired at a synchrotron, and the results are compared to reconstructions based on phase retrieval. Results: The results show that the proposed algorithm significantly increases image quality in comparison to reconstructions based on phase retrieval. No significant mutual influence between the proposed independent priors could be observed. Furthermore, it could be shown that the iterative reconstruction of a sliding window acquisition results in images with substantially reduced smearing artifacts. Conclusions: Although the proposed cost function is inherently nonconvex, it can be used to reconstruct images with fewer aliasing artifacts and fewer streak artifacts than reconstruction methods based on phase retrieval. Furthermore, the proposed method can be used to reconstruct images of sliding window acquisitions with negligible smearing artifacts.

  14. A study of the decoding of multiple pinhole coded aperture RI tomographic images

    International Nuclear Information System (INIS)

    Hasegawa, Takeo; Kobayashi, Akitoshi; Nishiyama, Yutaka

    1980-01-01

    The authors constructed a Multiple Pinhole Coded Aperture (MPCA) and developed related decoding software. When simple coordinate transformation was performed, omission of points and shifting of counts occurred. By selecting various tomographic planes and collecting counts for each tomographic depth from the shadowgram, a solution to these problems was found. The counts from the central portion of the tomographic image from the MPCA were incorrectly high; this was rectified by a correction function added to improve the uniformity correction program of the γ-camera. Depth resolution of the tomographic image improved in proportion to the area encompassed by the pinhole configuration. An MPCA with a uniform arrangement of pinholes (e.g., pinholes arranged parallel to the X-axis or the Y-axis) yielded decoded tomographic images of inferior quality. Optimum results were obtained with a ring-shaped arrangement, yielding clinically applicable tomographic images even for large objects. (author)

  15. Positron emission mammography with tomographic acquisition using dual planar detectors: initial evaluations

    International Nuclear Information System (INIS)

    Smith, Mark F; Raylman, Raymond R; Majewski, Stan; Weisenberger, Andrew G

    2004-01-01

    Positron emission mammography (PEM) with tomographic acquisition using dual planar detectors rotating about the breast can obtain complete angular sampling and has the potential to improve activity estimation compared with PEM using stationary detectors. PEM tomography (PEMT) was compared with stationary PEM for point source and compressed breast phantom studies performed with a compact dual detector system. The acquisition geometries were appropriate for the target application of PEM guidance of stereotactic core biopsy. Images were reconstructed with a three-dimensional iterative maximum likelihood expectation maximization algorithm. PEMT eliminated blurring normal to the detectors seen with stationary PEM. Depth of interaction effects distorted the shape of the point spread functions for PEMT as the angular range from normal incidence of lines of response used in image reconstruction increased. Streak artefacts in PEMT for large detector rotation increments led to the development of an expression for the maximum rotation increment that maintains complete angular sampling. Studies with a compressed breast phantom were used to investigate contrast and signal-to-noise ratio (SNR) trade-offs for different sized spherical tumour models. PEMT and PEM both had advantages depending on lesion size and detector separation. The most appropriate acquisition method for specific detection or quantitation tasks requires additional investigation

  16. Observation of Jet Photoproduction and Comparison to Monte Carlo Simulation

    Energy Technology Data Exchange (ETDEWEB)

    Lincoln, Donald W. [Rice Univ., Houston, TX (United States)

    1994-01-01

    The photon is the carrier of the electromagnetic force. However in addition to its well known nature, the theories of QCD and quantum mechanics would indicate that the photon can also for brief periods of time split into a $q\\bar{q}$ pair (an extended photon.) How these constituents share energy and momentum is an interesting question and such a measurement was investigated by scattering photons off protons. The post collision kinematics should reveal pre-collision information. Unfortunately, when these constituents exit the collision point, they undergo subsequent interactions (gluon radiation, fragmentation, etc.) which scramble their kinematics. An algorithm was explored which was shown via Monte Carlo techniques to partially disentangle these post collision interactions and reveal the collision kinematics. The presence or absence of large transverse momenta internal ($k_\\perp$) to the photon has a significant impact on the ability to reconstruct the kinematics of the leading order calculation hard scatter system. Reconstruction of the next to leading order high $E_\\perp$ partons is more straightforward. Since the photon exhibits this unusual behavior only part of the time, many of the collisions recorded will be with a non-extended (or direct) photon. Unless a method for culling only the extended photons out can be invented, this contamination of direct photons must be accounted for. No such culling method is currently known, and so any measurement will necessarily contain both photon types. Theoretical predictions using Monte Carlo methods are compared with the data and are found to reproduce many experimentally measured distributions quite well. Overall the LUND Monte Carlo reproduces the data better than the HERWIG Monte Carlo. As expected at low jet $E_\\perp$, the data set seems to be dominated by extended photons, with the mix becoming nearly equal at jet $E_\\perp > 4$ GeV. The existence of a large photon $k_\\perp$ appears to be favored.

  17. Class of backpropagation techniques for limited-angle reconstruction in microwave tomography

    International Nuclear Information System (INIS)

    Paladhi, P. Roy; Tayebi, A.; Udpa, L.; Udpa, S.; Sinha, A.

    2015-01-01

    Filtered backpropagation (FBPP) is a well-known technique used in diffraction tomography (DT). For accurate reconstruction using FBPP, full 360° angular coverage is necessary. However, it has been shown that, using some inherent redundancies in the projection data of a tomographic setup, accurate reconstruction is still possible with 270° coverage, which is called the minimal-scan angle range. This can be done by applying weighting functions (or filters) to the projection data of the object to eliminate the redundancies and accurately reconstruct the image from 270° coverage. This paper demonstrates procedures to generate many general classes of these weighting filters. These are all equivalent at 270° coverage but vary in performance at lower angular coverages and in the presence of noise. This paper presents a comparative analysis of the different filters when the angular coverage is lower than the minimal-scan angle of 270°. Simulation studies have been performed to find optimum weighting filters for sub-minimal angular coverage (<270°).

  18. HeinzelCluster: accelerated reconstruction for FORE and OSEM3D.

    Science.gov (United States)

    Vollmar, S; Michel, C; Treffert, J T; Newport, D F; Casey, M; Knöss, C; Wienhard, K; Liu, X; Defrise, M; Heiss, W D

    2002-08-07

    Using iterative three-dimensional (3D) reconstruction techniques for positron emission tomography (PET) is not feasible on most single-processor machines due to the excessive computing time needed, especially so for the large sinogram sizes of our high-resolution research tomograph (HRRT). In our first approach to speed up reconstruction time, we transform the 3D scan into the format of a two-dimensional (2D) scan with sinograms that can be reconstructed independently, using Fourier rebinning (FORE) and a fast 2D reconstruction method. On our dedicated reconstruction cluster (seven four-processor systems, Intel PIII@700 MHz, switched fast ethernet and Myrinet, Windows NT Server), we process these 2D sinograms in parallel. We have achieved a speedup > 23 using 26 processors and also compared results for different communication methods (RPC, Syngo, Myrinet GM). The other approach is to parallelize OSEM3D (implementation of C. Michel), which has produced the best results for HRRT data so far and is more suitable for an adequate treatment of the sinogram gaps that result from the detector geometry of the HRRT. We have implemented two levels of parallelization for our dedicated cluster (a shared-memory fine-grain level on each node utilizing all four processors and a coarse-grain level allowing for 15 nodes), reducing the time for one core iteration from over 7 h to about 35 min.
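
    A minimal sketch of the coarse-grain parallelism described above: after FORE, the rebinned 2D sinograms are independent, so each slice can be reconstructed by a separate worker process. The per-slice reconstruction here is a placeholder (a real implementation would call 2D FBP or OSEM), and all sizes are illustrative:

```python
import numpy as np
from multiprocessing import Pool

def reconstruct_slice(sinogram):
    # placeholder for the per-slice 2D reconstruction (FBP, OSEM2D, ...)
    return sinogram.sum(axis=0)

def reconstruct_volume(sinograms, n_workers=4):
    # coarse-grain parallelism: one rebinned 2D sinogram per task
    with Pool(n_workers) as pool:
        slices = pool.map(reconstruct_slice, sinograms)
    return np.stack(slices)

if __name__ == "__main__":
    sinos = [np.random.rand(180, 128) for _ in range(64)]  # 64 rebinned planes
    volume = reconstruct_volume(sinos)
    print(volume.shape)   # (64, 128)
```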

  19. Radiographic and tomographic study of the elbow joint in dogs

    International Nuclear Information System (INIS)

    Sendyk-Grunkraut, Alessandra; Martin, Claudia M.; Souza, Alexandre N.A.; Patricio, Geni Cristina F.; Lorigados, Carla A.B.; Matera, Julia M.; Fonseca-Pinto, Ana C.B.C.

    2017-01-01

    Elbow dysplasia includes ununited anconeal process, fragmented medial coronoid process, osteochondrosis of the humeral trochlea, articular incongruity and degenerative joint disease. The aim of this study was to present detailed morphologic and morphometric aspects of the elbow joint in dogs and to correlate the radiographic and tomographic (CT) examinations. Inter-observer variation in CT measurements of articular incongruity, comparative analysis of the radiographic views, and the ulnar notch angle and its agreement between radiographic and tomographic examinations were evaluated in 44 elbows of dogs of different ages. The statistical analyses included the kappa coefficient, interclass correlation, Fisher's test and McNemar's test. It was evident that each individual radiographic view had poor agreement with the tomographic exam, suggesting that more than two radiographic views are needed. There was no agreement among the three evaluators for the ulnar notch angle on the radiographic and tomographic exams. However, there was good/moderate agreement among evaluators for the articular incongruity measurement in the sagittal plane. It was concluded that none of the five radiographic views was better than the others for radiographic analysis, because each view better identified a particular elbow compartment; tomographic measurements of radioulnar incongruity were not reproducible in the frontal plane but showed good/moderate agreement between observers in the sagittal plane; and the ulnar notch angle showed no repeatability on the radiographic exam and no reproducibility on the tomographic exam. (author)

  20. Concepts and Plans towards fast large scale Monte Carlo production for the ATLAS Experiment

    Science.gov (United States)

    Ritsch, E.; Atlas Collaboration

    2014-06-01

    The huge success of the physics program of the ATLAS experiment at the Large Hadron Collider (LHC) during Run 1 relies upon a great number of simulated Monte Carlo events. This Monte Carlo production currently takes the biggest part of the computing resources in use by ATLAS. In this document we describe the plans to overcome the computing resource limitations for large-scale Monte Carlo production in the ATLAS experiment for Run 2 and beyond. A number of fast detector simulation, digitization and reconstruction techniques are discussed, based upon a new flexible detector simulation framework. To benefit optimally from these developments, a redesigned ATLAS MC production chain is presented at the end of this document.

  1. SU-E-T-222: Computational Optimization of Monte Carlo Simulation On 4D Treatment Planning Using the Cloud Computing Technology

    International Nuclear Information System (INIS)

    Chow, J

    2015-01-01

    Purpose: This study evaluated the efficiency of 4D lung radiation treatment planning using Monte Carlo simulation on the cloud. The EGSnrc Monte Carlo code was used for dose calculation on the 4D-CT image set. Methods: The 4D lung radiation treatment plan was created by the DOSCTP linked to the cloud, based on the Amazon Elastic Compute Cloud platform. Dose calculation was carried out by Monte Carlo simulation on the 4D-CT image set on the cloud, and results were sent to the FFD4D image deformation program for dose reconstruction. The dependence of treatment-plan computing time on the number of compute nodes was optimized with variations of the number of CT image sets in the breathing cycle and the dose reconstruction time of the FFD4D. Results: It is found that the dependence of computing time on the number of compute nodes was affected by the diminishing return of the number of nodes used in the Monte Carlo simulation. Moreover, the performance of the 4D treatment planning could be optimized by using fewer than 10 compute nodes on the cloud. The effects of the number of image sets and the dose reconstruction time on the dependence of computing time on the number of nodes were not significant when more than 15 compute nodes were used in the Monte Carlo simulations. Conclusion: The issue of long computing time in 4D treatment planning, which requires Monte Carlo dose calculations in all CT image sets in the breathing cycle, can be solved using cloud computing technology. It is concluded that the optimized number of compute nodes selected in simulation should be between 5 and 15, as the dependence of computing time on the number of nodes is significant.

  2. SU-E-T-222: Computational Optimization of Monte Carlo Simulation On 4D Treatment Planning Using the Cloud Computing Technology

    Energy Technology Data Exchange (ETDEWEB)

    Chow, J [Princess Margaret Cancer Center, Toronto, ON (Canada)

    2015-06-15

    Purpose: This study evaluated the efficiency of 4D lung radiation treatment planning using Monte Carlo simulation on the cloud. The EGSnrc Monte Carlo code was used for dose calculation on the 4D-CT image set. Methods: The 4D lung radiation treatment plan was created by the DOSCTP linked to the cloud, based on the Amazon Elastic Compute Cloud platform. Dose calculation was carried out by Monte Carlo simulation on the 4D-CT image set on the cloud, and results were sent to the FFD4D image deformation program for dose reconstruction. The dependence of treatment-plan computing time on the number of compute nodes was optimized with variations of the number of CT image sets in the breathing cycle and the dose reconstruction time of the FFD4D. Results: It is found that the dependence of computing time on the number of compute nodes was affected by the diminishing return of the number of nodes used in the Monte Carlo simulation. Moreover, the performance of the 4D treatment planning could be optimized by using fewer than 10 compute nodes on the cloud. The effects of the number of image sets and the dose reconstruction time on the dependence of computing time on the number of nodes were not significant when more than 15 compute nodes were used in the Monte Carlo simulations. Conclusion: The issue of long computing time in 4D treatment planning, which requires Monte Carlo dose calculations in all CT image sets in the breathing cycle, can be solved using cloud computing technology. It is concluded that the optimized number of compute nodes selected in simulation should be between 5 and 15, as the dependence of computing time on the number of nodes is significant.
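
    The diminishing return with added compute nodes reported above can be illustrated with an Amdahl-type model in which part of the workload (e.g. dose reconstruction and data transfer) does not parallelize; the serial and parallel hours below are assumed numbers, not values from the abstract:

```python
def plan_time_hours(n_nodes, serial_hours=0.5, parallel_hours=8.0):
    """Amdahl-type model: a fixed serial part plus a part that scales with
    the number of compute nodes (both time values are assumptions)."""
    return serial_hours + parallel_hours / n_nodes

for n in (1, 5, 10, 15, 20, 30):
    print(f"{n:2d} nodes: {plan_time_hours(n):.2f} h")
```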

  3. Comparative evaluation of two commercial PET scanners, ECAT EXACT HR+ and Biograph 2, using GATE

    International Nuclear Information System (INIS)

    Karakatsanis, N.; Sakellios, N.; Tsantilas, N.X.; Dikaios, N.; Tsoumpas, C.; Lazaro, D.; Loudos, G.; Schmidtlein, C.R.; Louizi, K.; Valais, J.; Nikolopoulos, D.; Malamitsi, J.; Kandarakis, J.; Nikita, K.

    2006-01-01

    Geant4 Application for Tomographic Emission (GATE) is a generic Monte Carlo simulation platform based on the general-purpose code GEANT4 and designed to simulate positron emission tomography (PET) and single photon emission tomography systems. Monte Carlo simulations are used in nuclear medicine to model imaging systems and to develop and assess tomographic reconstruction algorithms and correction methods for improved image quantification. The purpose of this study is to validate GATE models of two commercially available PET scanners, the ECAT EXACT HR+ and the PET/CT Biograph 2. The geometry of the system components has been described in GATE, including the detector ring, crystal blocks, PMTs, etc. The energy and spatial resolution of the scanners, as given by the manufacturers, have been taken into account. The GATE simulation results are compared directly to experimental data obtained using a number of NEMA NU 2-2001 performance protocols, including spatial resolution, sensitivity and scatter fraction. All the respective phantoms are precisely modeled. Furthermore, an approximate dead-time model at the level of both single and coincidence events was developed so that the simulated count rate curve satisfactorily matches the experimental count rate performance curve for each scanner. In addition, a software tool was developed to build the sinograms from the simulated data and import them into the software for tomographic image reconstruction, where the FBP3DRP reconstruction algorithm was applied. An agreement of less than 0.8 mm was obtained between the spatial resolution of the simulated system and the experimental results. Also, the simulated scatter fraction for the NEMA NU 2-2001 scatter phantom matched the experimental results to within 3% of measured values. Finally, the ratio of the simulated sensitivities with sources radially offset 0 and 10 cm from the central axis of each of the two scanners reaches an agreement of less than 1% between the simulated and experimental values. This

  4. Monte Carlo based dosimetry and treatment planning for neutron capture therapy of brain tumors

    International Nuclear Information System (INIS)

    Zamenhof, R.G.; Brenner, J.F.; Wazer, D.E.; Madoc-Jones, H.; Clement, S.D.; Harling, O.K.; Yanch, J.C.

    1990-01-01

    Monte Carlo based dosimetry and computer-aided treatment planning for neutron capture therapy have been developed to provide the necessary link between physical dosimetric measurements performed on the MITR-II epithermal-neutron beams and the need of the radiation oncologist to synthesize large amounts of dosimetric data into a clinically meaningful treatment plan for each individual patient. Monte Carlo simulation has been employed to characterize the spatial dose distributions within a skull/brain model irradiated by an epithermal-neutron beam designed for neutron capture therapy applications. The geometry and elemental composition employed for the mathematical skull/brain model and the neutron and photon fluence-to-dose conversion formalism are presented. A treatment planning program, NCTPLAN, developed specifically for neutron capture therapy, is described. Examples are presented illustrating both one and two-dimensional dose distributions obtainable within the brain with an experimental epithermal-neutron beam, together with beam quality and treatment plan efficacy criteria which have been formulated for neutron capture therapy. The incorporation of three-dimensional computed tomographic image data into the treatment planning procedure is illustrated

  5. Monte Carlo simulations in small animal PET imaging

    Energy Technology Data Exchange (ETDEWEB)

    Branco, Susana [Universidade de Lisboa, Faculdade de Ciencias, Instituto de Biofisica e Engenharia Biomedica, Lisbon (Portugal)], E-mail: susana.silva@fc.ul.pt; Jan, Sebastien [Service Hospitalier Frederic Joliot, CEA/DSV/DRM, Orsay (France); Almeida, Pedro [Universidade de Lisboa, Faculdade de Ciencias, Instituto de Biofisica e Engenharia Biomedica, Lisbon (Portugal)

    2007-10-01

    This work is based on the use of an implemented Positron Emission Tomography (PET) simulation system dedicated to small animal PET imaging. Geant4 Application for Tomographic Emission (GATE), a Monte Carlo simulation platform based on the Geant4 libraries, is well suited to modeling the microPET FOCUS system and to implementing realistic phantoms, such as the MOBY phantom, and data maps from real examinations. The use of a microPET FOCUS simulation model with GATE has been validated for spatial resolution, counting rate performance, imaging contrast recovery and quantitative analysis. Results from realistic studies of the mouse body using {sup 18}F{sup -} and [{sup 18}F]FDG imaging protocols are presented. These simulations include the injection of realistic doses into the animal and realistic time framing. The results have shown that it is possible to simulate small animal PET acquisitions under realistic conditions, and are expected to be useful for improving the quantitative analysis of PET mouse body studies.

  6. Monte Carlo codes and Monte Carlo simulator program

    International Nuclear Information System (INIS)

    Higuchi, Kenji; Asai, Kiyoshi; Suganuma, Masayuki.

    1990-03-01

    Four typical Monte Carlo codes, KENO-IV, MORSE, MCNP and VIM, have been vectorized on the VP-100 at the Computing Center, JAERI. The problems in vector processing of Monte Carlo codes on vector processors have become clear through this work. As a result, it is recognized that there are difficulties in obtaining good performance when vector processing Monte Carlo codes. A Monte Carlo computing machine, which processes Monte Carlo codes with high performance, has been under development at our Computing Center since 1987. The concept of the Monte Carlo computing machine and its performance have been investigated and estimated by using a software simulator. In this report, the problems in vectorization of Monte Carlo codes, the Monte Carlo pipelines proposed to mitigate these difficulties, and the results of the performance estimation of the Monte Carlo computing machine by the simulator are described. (author)

  7. Reconstruction, Energy Calibration, and Identification of Hadronically Decaying Tau Leptons in the ATLAS Experiment for Run-2 of the LHC

    CERN Document Server

    The ATLAS collaboration

    2015-01-01

    The reconstruction algorithm, energy calibration, and identification methods for hadronically decaying tau leptons in ATLAS used at the start of Run-2 of the Large Hadron Collider are described in this note. All algorithms have been optimised for Run-2 conditions. The energy calibration relies on Monte Carlo samples with hadronic tau lepton decays, and applies multiplicative factors based on the pT of the reconstructed tau lepton to the energy measurements in the calorimeters. The identification employs boosted decision trees. Systematic uncertainties on the energy scale, reconstruction efficiency and identification efficiency of hadronically decaying tau leptons are determined using Monte Carlo samples that simulate varying conditions.

  8. Muon reconstruction and the search for leptoquarks at LHC

    CERN Document Server

    Ruckert, B

    2006-01-01

    This diploma thesis focuses on the reconstruction of high-energy muons. This simulation study was performed within the ATLAS experiment at the Large Hadron Collider (LHC), a pp-collider with a centre-of-mass energy of √s = 14 TeV. The purpose of this study was to identify muons with strongly overestimated transverse momentum using Monte Carlo simulated data which have been generated using Pythia and run through a full detector simulation. These muons can lead to a faked leptoquark signal, as leptoquark decays can include high-energy muons. If leptoquarks exist, only a small number of such events is expected, which makes a reliable momentum measurement a crucial point. To achieve an optimal reconstruction, selection criteria have been developed which compare the track's χ², the particle's η-direction and the reconstructed pT values from the different reconstruction algorithms, namely the inner detector standalone reconstruction, the muon spectrometer standalone reconstruction and a combination of both. Th...

  9. Vertex Reconstruction in the ATLAS Experiment at the LHC

    CERN Document Server

    Bouhova-Thacker, E; The ATLAS collaboration; Kostyukhin, V; Liebig, W; Limper, M; Piacquadio, G; Lichard, P; Weiser, C; Wildauer, A

    2009-01-01

    In the harsh environment of the Large Hadron Collider at CERN (design luminosity of $10^{34}$ cm$^{-2}$ s$^{-1}$), efficient reconstruction of vertices is crucial for many physics analyses. Described in this paper are the strategies for vertex reconstruction used in the ATLAS experiment and their implementation in the software framework Athena. The algorithms for the reconstruction of primary and secondary vertices, as well as for the finding of photon conversions and for vertex reconstruction in jets, are described. Special emphasis is placed on vertex fitting with the application of additional constraints. The implementation of the mentioned algorithms follows a very modular design based on object-oriented C++ and the use of abstract interfaces. The user-friendly concept allows event reconstruction and physics analyses to compare and optimize their choice among different vertex reconstruction strategies. The performance of the implemented algorithms has been studied on a variety of Monte Carlo samples and results are presented.

  10. Achievement report for fiscal 1998. Optical tomographic system; 1998 nendo seika hokokusho. Hikari danso imaging system

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1999-05-01

    Evaluations were made of the spatial resolution and measurement time of an optical tomographic system using the developed 64-channel time-resolved spectroscopy and an image reconstruction algorithm. With respect to the spatial resolution, the target value of 1 cm was verified from tomographic images of a phantom with a diameter of 10 cm, simulating a neonate. A measurement time of 20 minutes was achieved, one third of the target value. In installing the equipment at Hokkaido University, the speeds of the optical switches and attenuators were increased, reducing the measurement time to one minute. For the installation at the Kanagawa Rehabilitation Center, a nano-second light pulser was developed whose average output was increased 40-fold, and improvements were made to the optical switches, the attenuators, and the display software, by which the measurement time was reduced by a further 30 seconds compared with that at Hokkaido University. For the clinical evaluation, the evaluation protocol resolved by the Experiment Evaluation Special Committee was submitted for deliberation to the Medical Welfare Device Clinical Evaluation Committee. Upon authorization by the Committee, the clinical evaluations were performed at Hokkaido University and the Kanagawa Rehabilitation Center. (NEDO)

  11. A novel technique to incorporate structural prior information into multi-modal tomographic reconstruction

    International Nuclear Information System (INIS)

    Kazantsev, Daniil; Dobson, Katherine J; Withers, Philip J; Lee, Peter D; Ourselin, Sébastien; Arridge, Simon R; Hutton, Brian F; Kaestner, Anders P; Lionheart, William R B

    2014-01-01

    There has been a rapid expansion of multi-modal imaging techniques in tomography. In biomedical imaging, patients are now regularly imaged using both single photon emission computed tomography (SPECT) and x-ray computed tomography (CT), or using both positron emission tomography and magnetic resonance imaging (MRI). In non-destructive testing of materials both neutron CT (NCT) and x-ray CT are widely applied to investigate the inner structure of material or track the dynamics of physical processes. The potential benefits from combining modalities have led to increased interest in iterative reconstruction algorithms that can utilize the data from more than one imaging mode simultaneously. We present a new regularization term in iterative reconstruction that enables information from one imaging modality to be used as a structural prior to improve resolution of the second modality. The regularization term is based on a modified anisotropic tensor diffusion filter that has shape-adapted smoothing properties. By considering the underlying orientations of normal and tangential vector fields for two co-registered images, the diffusion flux is rotated and scaled adaptively to image features. The images can have different greyscale values and different spatial resolutions. The proposed approach is particularly good at isolating oriented features in images which are important for medical and materials science applications. By enhancing the edges, it enables both easy identification and volume fraction measurement, aiding the segmentation algorithms used for quantification. The approach is tested on a standard denoising and deblurring image recovery problem, and then applied to 2D and 3D reconstruction problems; thereby highlighting the capabilities of the algorithm. Using synthetic data from SPECT co-registered with MRI, and real NCT data co-registered with x-ray CT, we show how the method can be used across a range of imaging modalities. (paper)
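
    A much simplified stand-in for such an orientation-dependent regularizer can be written in a few lines of Python. The sketch below is not the authors' filter; it only illustrates the idea of scaling the diffusion flux along the normal and tangential directions of a co-registered prior image, and all parameter values are illustrative.

        import numpy as np

        def structural_prior_weights(prior, eps=1e-3):
            # Per-pixel unit vectors normal and tangential to the edges of the prior
            # image, plus a 0..1 edge-strength map.
            gy, gx = np.gradient(prior.astype(float))
            mag = np.sqrt(gx**2 + gy**2) + eps
            n = np.stack([gx / mag, gy / mag])      # across-edge (normal) direction
            t = np.stack([-n[1], n[0]])             # along-edge (tangential) direction
            return n, t, mag / mag.max()

        def directional_smoothing_step(img, prior, dt=0.15, kappa=0.9):
            # One explicit diffusion step that smooths mainly along the prior edges
            # and suppresses flux across them.
            n, t, edge = structural_prior_weights(prior)
            gy, gx = np.gradient(img.astype(float))
            g_n = gx * n[0] + gy * n[1]             # gradient component across prior edges
            g_t = gx * t[0] + gy * t[1]             # gradient component along prior edges
            flux_x = (1 - kappa * edge) * g_n * n[0] + g_t * t[0]
            flux_y = (1 - kappa * edge) * g_n * n[1] + g_t * t[1]
            div = np.gradient(flux_x, axis=1) + np.gradient(flux_y, axis=0)
            return img + dt * div

        prior = np.zeros((64, 64)); prior[:, 32:] = 1.0        # toy prior with one edge
        noisy = prior + 0.3 * np.random.default_rng(0).standard_normal(prior.shape)
        smoothed = directional_smoothing_step(noisy, prior)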

  12. Monte-Carlo simulations and image reconstruction for novel imaging scenarios in emission tomography

    International Nuclear Information System (INIS)

    Gillam, John E.; Rafecas, Magdalena

    2016-01-01

    Emission imaging incorporates both the development of dedicated devices for data acquisition as well as algorithms for recovering images from that data. Emission tomography is an indirect approach to imaging. The effect of device modification on the final image can be understood through both the way in which data are gathered, using simulation, and the way in which the image is formed from that data, or image reconstruction. When developing novel devices, systems and imaging tasks, accurate simulation and image reconstruction allow performance to be estimated, and in some cases optimized, using computational methods before or during the process of physical construction. However, there are a vast range of approaches, algorithms and pre-existing computational tools that can be exploited and the choices made will affect the accuracy of the in silico results and quality of the reconstructed images. On the one hand, should important physical effects be neglected in either the simulation or reconstruction steps, specific enhancements provided by novel devices may not be represented in the results. On the other hand, over-modeling of device characteristics in either step leads to large computational overheads that can confound timely results. Here, a range of simulation methodologies and toolkits are discussed, as well as reconstruction algorithms that may be employed in emission imaging. The relative advantages and disadvantages of a range of options are highlighted using specific examples from current research scenarios.

  13. Monte-Carlo simulations and image reconstruction for novel imaging scenarios in emission tomography

    Energy Technology Data Exchange (ETDEWEB)

    Gillam, John E. [The University of Sydney, Faculty of Health Sciences and The Brain and Mind Centre, Camperdown (Australia); Rafecas, Magdalena, E-mail: rafecas@imt.uni-luebeck.de [University of Lubeck, Institute of Medical Engineering, Ratzeburger Allee 160, 23538 Lübeck (Germany)

    2016-02-11

    Emission imaging incorporates both the development of dedicated devices for data acquisition as well as algorithms for recovering images from that data. Emission tomography is an indirect approach to imaging. The effect of device modification on the final image can be understood through both the way in which data are gathered, using simulation, and the way in which the image is formed from that data, or image reconstruction. When developing novel devices, systems and imaging tasks, accurate simulation and image reconstruction allow performance to be estimated, and in some cases optimized, using computational methods before or during the process of physical construction. However, there are a vast range of approaches, algorithms and pre-existing computational tools that can be exploited and the choices made will affect the accuracy of the in silico results and quality of the reconstructed images. On the one hand, should important physical effects be neglected in either the simulation or reconstruction steps, specific enhancements provided by novel devices may not be represented in the results. On the other hand, over-modeling of device characteristics in either step leads to large computational overheads that can confound timely results. Here, a range of simulation methodologies and toolkits are discussed, as well as reconstruction algorithms that may be employed in emission imaging. The relative advantages and disadvantages of a range of options are highlighted using specific examples from current research scenarios.

  14. A study of total measurement error in tomographic gamma scanning to assay nuclear material with emphasis on a bias issue for low-activity samples

    International Nuclear Information System (INIS)

    Burr, T.L.; Mercer, D.J.; Prettyman, T.H.

    1998-01-01

    Field experience with the tomographic gamma scanner to assay nuclear material suggests that the analysis techniques can significantly impact the assay uncertainty. For example, currently implemented image reconstruction methods exhibit a positive bias for low-activity samples. Preliminary studies indicate that bias reduction could be achieved at the expense of increased random error variance. In this paper, the authors examine three possible bias sources: (1) measurement error in the estimated transmission matrix, (2) the positivity constraint on the estimated mass of nuclear material, and (3) improper treatment of the measurement error structure. The authors present results from many small-scale simulation studies to examine this bias/variance tradeoff for a few image reconstruction methods in the presence of the three possible bias sources
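
    Bias source (2), the positivity constraint, can be reproduced with a toy numerical experiment: clipping an unbiased but noisy mass estimate at zero shifts its mean upward while reducing its variance. The Python sketch below is illustrative only; the true mass, noise level and sample size are arbitrary values, not taken from the paper.

        import numpy as np

        rng = np.random.default_rng(0)
        true_mass = 0.05      # arbitrary units, near the detection limit
        sigma = 0.5           # standard deviation of the (unbiased) mass estimator

        unconstrained = true_mass + sigma * rng.standard_normal(100_000)
        constrained = np.clip(unconstrained, 0.0, None)     # positivity constraint

        print("unconstrained mean:", round(unconstrained.mean(), 4))  # ~0.05, unbiased
        print("constrained mean:  ", round(constrained.mean(), 4))    # well above 0.05, biased
        print("constrained std:   ", round(constrained.std(), 4))     # smaller variance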

  15. Microstructural Quantification, Property Prediction, and Stochastic Reconstruction of Heterogeneous Materials Using Limited X-Ray Tomography Data

    Science.gov (United States)

    Li, Hechao

    An accurate knowledge of the complex microstructure of a heterogeneous material is crucial for establishing quantitative structure-property relations and for predicting and optimizing its performance. X-ray tomography has provided a non-destructive means for microstructure characterization in both 3D and 4D (i.e., structural evolution over time). Traditional reconstruction algorithms such as the filtered-back-projection (FBP) method or algebraic reconstruction techniques (ART) require a huge number of tomographic projections and a segmentation process before microstructural quantification can be conducted. This can be quite time consuming and computationally intensive. In this thesis, a novel procedure is first presented that allows one to directly extract key structural information in the form of spatial correlation functions from limited x-ray tomography data. The key component of the procedure is the computation of a "probability map", which provides the probability of an arbitrary point in the material system belonging to a specific phase. The correlation functions of interest are then readily computed from the probability map. Using effective medium theory, accurate predictions of physical properties (e.g., elastic moduli) can be obtained. Secondly, a stochastic optimization procedure is presented that enables one to accurately reconstruct material microstructure from a small number of x-ray tomographic projections (e.g., 20 - 40). Moreover, a stochastic procedure for multi-modal data fusion is proposed, where both X-ray projections and correlation functions computed from limited 2D optical images are fused to accurately reconstruct complex heterogeneous materials in 3D. This multi-modal reconstruction algorithm is shown to integrate the complementary data in an effective optimization procedure, indicating its efficiency in using limited structural information. Finally, the accuracy of the stochastic reconstruction procedure using limited X
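
    The step from a probability map to spatial correlation functions can be illustrated with a short FFT-based estimator. The Python sketch below assumes a 2D probability map with periodic boundaries and estimates the radially averaged two-point correlation function S2(r); it illustrates the general idea only and is not the procedure developed in the thesis.

        import numpy as np

        def two_point_correlation(prob_map):
            # Radially averaged two-point correlation S2(r) of a probability map,
            # estimated with FFTs under periodic boundary conditions.
            p = np.asarray(prob_map, dtype=float)
            f = np.fft.fftn(p)
            auto = np.fft.fftshift(np.fft.ifftn(f * np.conj(f)).real / p.size)
            cy, cx = auto.shape[0] // 2, auto.shape[1] // 2
            y, x = np.indices(auto.shape)
            r = np.hypot(y - cy, x - cx).astype(int)
            counts = np.bincount(r.ravel())
            s2 = np.bincount(r.ravel(), weights=auto.ravel()) / np.maximum(counts, 1)
            return s2   # s2[0] ~ volume fraction; s2[r] -> (volume fraction)**2 at large r

        rng = np.random.default_rng(1)
        demo_map = (rng.random((128, 128)) < 0.3).astype(float)   # toy binary "probability map"
        print(two_point_correlation(demo_map)[:5])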

  16. Model-based microwave image reconstruction: simulations and experiments

    International Nuclear Information System (INIS)

    Ciocan, Razvan; Jiang Huabei

    2004-01-01

    We describe an integrated microwave imaging system that can provide spatial maps of the dielectric properties of heterogeneous media from tomographically collected data. The hardware system (800-1200 MHz) was built around a lock-in amplifier with 16 fixed antennas. The reconstruction algorithm was implemented using a Newton iterative method with combined Marquardt-Tikhonov regularization. System performance was evaluated using heterogeneous media mimicking human breast tissue. The finite element method coupled with the Bayliss-Turkel radiation boundary conditions was applied to compute the electric field distribution in the heterogeneous media of interest. The results show that inclusions embedded in a 76 mm-diameter background medium can be quantitatively reconstructed from both simulated and experimental data. Quantitative analysis of the microwave images obtained suggests that an inclusion of 14 mm in diameter is the smallest object that can presently be fully characterized using experimental data, while objects as small as 10 mm in diameter can be quantitatively resolved with simulated data
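
    The core update of such a Newton iteration with combined Marquardt-Tikhonov regularization can be sketched as a damped normal-equation solve. The Python fragment below is a generic illustration, not the authors' code; the Jacobian, residual and regularization parameter are toy values.

        import numpy as np

        def regularized_gauss_newton_step(J, r, lam):
            # Solve (J^T J + lam * I) dx = J^T r for the dielectric-property update dx.
            JtJ = J.T @ J
            return np.linalg.solve(JtJ + lam * np.eye(JtJ.shape[0]), J.T @ r)

        rng = np.random.default_rng(0)
        J = rng.standard_normal((32, 10))   # toy Jacobian: 32 measurements, 10 unknowns
        r = rng.standard_normal(32)         # toy data residual
        dx = regularized_gauss_newton_step(J, r, lam=0.1)
        print(dx[:3])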

  17. Conceptual design of a compact positron tomograph for prostate imaging

    Energy Technology Data Exchange (ETDEWEB)

    Huber, J.S.; Derenzo, S.E.; Qi, J.; Moses, W.W.; Huesman, R.H.; Budinger, T.F.

    2000-11-04

    We present a conceptual design of a compact positron tomograph for prostate imaging using a pair of external curved detector banks, one placed above and one below the patient. The lower detector bank is fixed below the patient bed, and the top bank adjusts vertically for maximum sensitivity and patient access. Each bank is composed of 40 conventional block detectors, forming two arcs (44 cm minor, 60 cm major axis) that are tilted to minimize attenuation and positioned as close as possible to the patient to improve sensitivity. The individual detectors are angled to point towards the prostate to minimize resolution degradation in that region. Inter-plane septa extend 5 cm beyond the scintillator crystals to reduce random and scatter backgrounds. A patient is not fully encircled by detector rings in order to minimize cost, causing incomplete sampling due to the side gaps. Monte Carlo simulation (including random and scatter) demonstrates the feasibility of detecting a spherical tumor of 2.5 cm diameter with a tumor to background ratio of 2:1, utilizing the number of events that should be achievable with a 6-minute scan after a 10 mCi injection (e.g., carbon-11 choline or fluorine-18 fluorocholine).

  18. Construction of a positron emission tomograph with 2.4 mm detectors

    International Nuclear Information System (INIS)

    McIntyre, J.A.; Sprosst, R.L.; Wang, K.

    1986-01-01

    One-quarter of one ring of a positron tomograph has been constructed. The positron annihilation gamma rays are detected by polished plastic scintillators which direct scintillation light by internal reflection to optical fibers for transmission to the photo-multiplier tubes. By viewing each scintillator with four sets of optical fibers, the number of photomultipliers required for an eight ring tomograph with 1024 detectors per ring (2.4 mm wide detectors) can be reduced from 8192 to 288, and the cost of the tomograph reduced accordingly

  19. Muon reconstruction performance in ATLAS at Run 2

    CERN Document Server

    Lesage, Arthur; The ATLAS collaboration

    2016-01-01

    This article documents the performance of the ATLAS muon identification and reconstruction using the first LHC dataset recorded at $\\sqrt{s} = 13$ TeV in 2015. Using a large sample of $J/\\psi\\rightarrow\\mu\\mu$ and $Z\\rightarrow\\mu\\mu$ decays, measurements of the reconstruction efficiency, as well as of the momentum scale and resolution, are presented and compared to Monte Carlo simulations. The reconstruction efficiency is measured to be close to $99\\%$ over most of the covered phase space. For $|\\eta| > 2.2$, the $p_{\\text{T}}$ resolution for a typical muon from $Z\\rightarrow\\mu\\mu$ decays is $2.9\\%$ while the precision on the momentum scale for low-$p_{\\text{T}}$ muons from $J/\\psi\\rightarrow\\mu\\mu$ decays is about $0.2\\%$.

  20. SPET reconstruction with a non-uniform attenuation coefficient using an analytical regularizing iterative method

    International Nuclear Information System (INIS)

    Soussaline, F.; LeCoq, C.; Raynaud, C.; Kellershohn

    1982-01-01

    The potential of the Regularizing Iterative Method (RIM), when used in brain studies, is evaluated. RIM is designed to provide fast and accurate reconstruction of tomographic images when non-uniform attenuation is to be accounted for. As indicated by phantom studies, this method improves the contrast and the signal-to-noise ratio compared with those obtained with the Filtered Back Projection (FBP) technique. Preliminary results obtained in brain studies using isopropyl-amphetamine I-123 (AMPI-123) are very encouraging in terms of quantitative regional cellular activity. However, the clinical usefulness of this mathematically accurate reconstruction procedure remains to be demonstrated by comparing quantitative data in heart or liver studies, where control values can be obtained

  1. A semi-automatic method for positioning a femoral bone reconstruction for strict view generation.

    Science.gov (United States)

    Milano, Federico; Ritacco, Lucas; Gomez, Adrian; Gonzalez Bernaldo de Quiros, Fernan; Risk, Marcelo

    2010-01-01

    In this paper we present a semi-automatic method for femoral bone positioning after 3D image reconstruction from Computed Tomography images. This serves as grounding for the definition of strict axial, longitudinal and anterior-posterior views, overcoming the problem of patient positioning biases in 2D femoral bone measuring methods. After the bone reconstruction is aligned to a standard reference frame, new tomographic slices can be generated, on which unbiased measures may be taken. This could allow not only accurate inter-patient comparisons but also intra-patient comparisons, i.e., comparisons of images of the same patient taken at different times. This method could enable medical doctors to diagnose and follow up several bone deformities more easily.

  2. Method for optimizing side shielding in positron-emission tomographs and for comparing detector materials

    International Nuclear Information System (INIS)

    Derenzo, S.E.

    1980-01-01

    This report presents analytical formulas for the image-forming and background event rates seen by circular positron-emission tomographs with parallel side shielding. These formulas include deadtime losses, detector efficiency, coincidence resolving time, amount of activity, patient port diameter, shielding gap, and shielding depth. A figure of merit, defined in terms of these quantities, describes the signal-to-noise ratio in the reconstructed image of a 20-cm cylinder of water with uniformly dispersed activity. Results are presented for the scintillators NaI(Tl), bismuth germanate (BGO), CsF, and plastic; and for Ge(Li) and wire chambers with converters. In these examples, BGO provided the best signal-to-noise for activity levels below 1000 μCi per cm, and CsF had the advantage for higher activity levels

  3. Minimal residual cone-beam reconstruction with attenuation correction in SPECT

    International Nuclear Information System (INIS)

    La, Valerie; Grangeat, Pierre

    1998-01-01

    This paper presents an iterative method based on the minimal residual algorithm for attenuation-compensated tomographic reconstruction from attenuated cone-beam projections, given the attenuation distribution. Unlike conjugate-gradient based reconstruction techniques, the proposed minimal-residual-based algorithm directly solves a quasisymmetric, preconditioned linear system. It thus avoids the use of the normal equations, which improves the convergence rate. Two main contributions are introduced. First, a regularization method is derived for quasisymmetric problems, based on a Tikhonov-Phillips regularization applied to the factorization of the symmetric part of the system matrix. This regularization is made spatially adaptive to avoid smoothing the region of interest. Second, our existing reconstruction algorithm for attenuation correction in parallel-beam geometry is extended to cone-beam geometry. A circular orbit is considered. Two preconditioning operators are proposed: the first is Grangeat's inversion formula and the second is Feldkamp's inversion formula. Experimental results obtained on simulated data are presented and the shadow zone effect on attenuated data is illustrated. (author)
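
    The contrast with normal-equation approaches can be illustrated with a generic minimal-residual (GMRES-type) solve of a small quasisymmetric system. The Python sketch below is not the authors' solver; the toy matrix merely mimics the "symmetric part plus small non-symmetric perturbation" structure described in the abstract.

        import numpy as np
        from scipy.sparse.linalg import gmres

        rng = np.random.default_rng(0)
        n = 200
        B = rng.standard_normal((n, n)) / np.sqrt(n)
        A = np.eye(n) + 0.2 * (B + B.T)       # well-conditioned symmetric part
        A += 0.025 * (B - B.T)                # small skew part -> quasisymmetric system
        b = rng.standard_normal(n)

        x, info = gmres(A, b, maxiter=500)    # minimal-residual solve of A x = b
        print(info, np.linalg.norm(A @ x - b) / np.linalg.norm(b))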

  4. Reconstruction and visualization of nanoparticle composites by transmission electron tomography

    Energy Technology Data Exchange (ETDEWEB)

    Wang, X.Y. [National Institute for Nanotechnology, 11421 Saskatchewan Drive, Edmonton, Canada T6H 2M9 (Canada); Department of Physics, University of Alberta, Edmonton, Canada T6G 2G7 (Canada); Lockwood, R. [National Institute for Nanotechnology, 11421 Saskatchewan Drive, Edmonton, Canada T6H 2M9 (Canada); Malac, M., E-mail: marek.malac@nrc-cnrc.gc.ca [National Institute for Nanotechnology, 11421 Saskatchewan Drive, Edmonton, Canada T6H 2M9 (Canada); Department of Physics, University of Alberta, Edmonton, Canada T6G 2G7 (Canada); Furukawa, H. [SYSTEM IN FRONTIER INC., 2-8-3, Shinsuzuharu bldg. 4F, Akebono-cho, Tachikawa-shi, Tokyo 190-0012 (Japan); Li, P.; Meldrum, A. [National Institute for Nanotechnology, 11421 Saskatchewan Drive, Edmonton, Canada T6H 2M9 (Canada)

    2012-02-15

    This paper examines the limits of transmission electron tomography reconstruction methods for a nanocomposite object composed of many closely packed nanoparticles. Two commonly used reconstruction methods in TEM tomography were examined and compared, and the sources of various artefacts were explored. Common visualization methods were investigated, and the resulting 'interpretation artefacts' (i.e., deviations from 'actual' particle sizes and shapes arising from the visualization) were determined. Setting a known or estimated nanoparticle volume fraction as a criterion for thresholding does not in fact give a good visualization. Unexpected effects associated with common built-in image filtering methods were also found. Ultimately, this work set out to establish the common problems and pitfalls associated with electron beam tomographic reconstruction and visualization of samples consisting of closely spaced nanoparticles. Highlights: • Electron tomography limits were explored by both experiment and simulation. • Reliable quantitative volumetry using electron tomography is not presently feasible. • Volume rendering appears to be the better choice for visualization of composite samples.

  5. Experience of computed tomographic myelography and discography in cervical problem

    Energy Technology Data Exchange (ETDEWEB)

    Nakatani, Shigeru; Yamamoto, Masayuki; Uratsuji, Masaaki; Suzuki, Kunio; Matsui, Eigo [Hyogo Prefectural Awaji Hospital, Sumoto, Hyogo (Japan); Kurihara, Akira

    1983-06-01

    CTM (computed tomographic myelography) was performed on 15 cases of cervical lesions, and on 5 of them, CTD (computed tomographic discography) was also performed. CTM revealed the intervertebral state and, in combination with CTD, provided more accurate information. The combined method of CTM and CTD was useful for soft disc herniation.

  6. Resolving ambiguities in reconstructed grain maps using discrete tomography

    DEFF Research Database (Denmark)

    Alpers, A.; Knudsen, E.; Poulsen, H.F.

    2005-01-01

    reconstruct the image from diffraction data, but they are often unable to assign unambiguous values to all pixels. We present an approach that resolves these ambiguous pixels by using a Monte Carlo technique that exploits the discrete nature of the problem and utilizes proven methods of discrete tomography...

  7. Global Distribution of Mercury's Neutrals from MESSENGER Measurements Combined with a Tomographic Method

    Science.gov (United States)

    Sarantos, Menelaos; McClintock, Bill; Vervack, Ron, Jr.; Killen, Rosemary; Merkel, Aimee; Slavin, James; Solomon, Sean C.

    2011-01-01

    The MErcury Surface, Space ENvironment, GEochemistry, and Ranging (MESSENGER) spacecraft entered orbit about Mercury on March 18, 2011. Since then, the Ultraviolet and Visible Spectrometer (UVVS) onboard this spacecraft has been observing Mercury's collisionless exosphere. We present measurements by MESSENGER UVVS of the sodium, calcium, and magnesium distributions that were obtained during multiple passes through the tail over a period of one month. Global maps of the exosphere were constructed daily from such measurements using a recently developed tomographic technique. During this period, Mercury moved towards the Sun from being about 0.44 astronomical units (AU) to approximately 0.32 AU from the Sun. Hence, our reconstructions provide information about the three-dimensional structure of the exosphere, the source processes for these species, and their dependence with orbital distance during the entire in-leg of Mercury's orbit.

  8. Tomographic findings of acute pulmonary toxoplasmosis in immunocompetent patients.

    Science.gov (United States)

    de Souza Giassi, Karina; Costa, Andre Nathan; Apanavicius, Andre; Teixeira, Fernando Bin; Fernandes, Caio Julio Cesar; Helito, Alfredo Salim; Kairalla, Ronaldo Adib

    2014-11-25

    Toxoplasmosis is one of the most common human zoonoses and is generally benign in most individuals. Pulmonary involvement is common in immunocompromised subjects but very rare in immunocompetent ones, and there are scarce reports of tomographic findings in the literature. The aim of this study is to describe three immunocompetent patients diagnosed with acute pulmonary toxoplasmosis and their respective thoracic tomographic findings. Acute toxoplasmosis was diagnosed according to the results of serological tests suggestive of recent primary infection and the absence of an alternative etiology. From 2009 to 2013, three patients were diagnosed with acute respiratory failure secondary to acute toxoplasmosis. The patients, two women and one man, were 38, 56 and 36 years old. All presented with a two-week febrile illness and progressive dyspnea before admission. Laboratory tests demonstrated lymphocytosis, slight changes in liver enzymes and high inflammatory markers. Tomographic findings were bilateral smooth septal and peribronchovascular thickening (100%), ground-glass opacities (100%), atelectasis (33%), random nodules (33%), lymph node enlargement (33%) and pleural effusion (66%). All the patients improved after treatment, and complete resolution of the tomographic findings was found at follow-up. These cases provide a unique description of the presentation and evolution of pulmonary tomographic manifestations of toxoplasmosis in immunocompetent patients. Toxoplasma pneumonia manifests with fever, dyspnea and a non-productive cough that may result in respiratory failure. In animal models, the changes were described as interstitial pneumonitis with focal infiltrates of neutrophils that can finally evolve into a pattern of diffuse alveolar damage with focal necrosis. The tomographic findings are characterized by ground glass opacities and smooth septal and marked peribronchovascular thickening, and may mimic pulmonary congestion

  9. Data acquisition, reconstruction, and display

    International Nuclear Information System (INIS)

    Huesman, R.H.

    1981-01-01

    A special emphasis of the Research Medicine program is the development of methods for acquiring and manipulating data from the Donner 280-crystal positron emission tomograph. This past year, development of a system capable of taking 1 million events per second while simultaneously correcting for unwanted accidental coincidence events was completed. The system permits the simultaneous acquisition of data for eight different time-slices of the cardiac cycle. A microprocessor responds to the patient's electrocardiogram (EKG) signal, routing data to the histogram memory corresponding to the phase of the cardiac cycle indicated by the signal. Additional work completed this year includes quantitation of the signal-to-noise ratio to be expected when imaging the human head. Effort is continuing on the more complicated problem of noise propagation in reconstructions of the human thorax

  10. Utilisation of spatial and temporal correlations in positron emission tomography

    International Nuclear Information System (INIS)

    Sureau, F.

    2008-06-01

    In this thesis we propose, implement, and evaluate algorithms that improve spatial resolution in reconstructed images and reduce data noise in positron emission tomography imaging. These algorithms have been developed for a high resolution tomograph (HRRT) and applied to brain imaging, but can be used for other tomographs or studies. We first developed an iterative reconstruction algorithm including a stationary and isotropic, experimentally measured model of resolution in image space. We evaluated the impact of such a resolution model in Monte-Carlo simulations, physical phantom experiments and two clinical studies by comparing our algorithm with a reference reconstruction algorithm. This study suggests that biases due to partial volume effects are reduced, in particular in the clinical studies. Better spatial and temporal correlations are also found at the voxel level. However, other methods should be developed to further reduce data noise. We then proposed a maximum a posteriori de-noising algorithm that can be used on dynamic data to temporally de-noise the raw data (sinograms) or the reconstructed images. The prior modeled the wavelet-basis coefficients of the noise-free signals (in an image or sinogram). We compared this technique with a reference de-noising method on replicated simulations. This illustrates the potential benefits of our approach to sinogram de-noising. (author)
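
    The wavelet-domain prior can be approximated in its simplest form by soft-thresholding the detail coefficients of each dynamic frame. The Python sketch below, using the PyWavelets package, is a generic stand-in for the thesis' maximum a posteriori de-noising step, not its actual algorithm; the wavelet, decomposition level and threshold are illustrative choices.

        import numpy as np
        import pywt

        def wavelet_denoise(frame, wavelet="db4", level=3, threshold=0.5):
            # Soft-threshold the detail coefficients, keep the coarse approximation.
            coeffs = pywt.wavedec2(frame, wavelet, level=level)
            out = [coeffs[0]]
            for details in coeffs[1:]:
                out.append(tuple(pywt.threshold(d, threshold, mode="soft") for d in details))
            return pywt.waverec2(out, wavelet)

        rng = np.random.default_rng(0)
        frame = np.zeros((64, 64)); frame[24:40, 24:40] = 1.0       # toy noise-free frame
        noisy = frame + 0.2 * rng.standard_normal(frame.shape)
        denoised = wavelet_denoise(noisy)
        print(float(np.abs(noisy - frame).mean()), float(np.abs(denoised - frame).mean()))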

  11. Compact Positron Tomograph for Prostate Imaging

    National Research Council Canada - National Science Library

    Huber, Jennifer

    2004-01-01

    The goal of this project is to construct a functioning compact positron tomograph, whose geometry is optimized for detecting prostate tumors with molecular tracers such as 11Ccholine (carbon-11 choline...

  12. Compact Positron Tomograph for Prostate Imaging

    National Research Council Canada - National Science Library

    Huber, Jennifer S

    2005-01-01

    The goal of this project is to construct a functioning compact positron tomograph, whose geometry is optimized for detecting prostate tumors with molecular tracers such as 11Ccholine (carbon-11 choline...

  13. Diagnostic accuracy of multi-slice computed tomographic angiography in the detection of cerebral aneurysms

    International Nuclear Information System (INIS)

    Haghighatkhah, H. R.; Sabouri, S.; Borzouyeh, F.; Bagherzadeh, M. H.; Bakhshandeh, H.; Jalali, A. H.

    2008-01-01

    Multislice computed tomographic angiography is a rapid and minimally invasive method for the detection of intracranial aneurysms. The purpose of this study was to compare multislice computed tomographic angiography with digital subtraction angiography in the diagnosis of cerebral aneurysms. Patients and Methods: In this cross-sectional study we evaluated 111 consecutive patients [42 (37.8%) male and 69 (62.2%) female] who were admitted with clinical symptoms and signs suggestive of harboring an intracranial aneurysm, using a four-detector multislice computed tomographic angiography scanner. We then compared the results of multislice computed tomographic angiography with digital subtraction angiography as the gold standard method. Digital subtraction angiography was performed by bilateral selective common carotid artery injections and either unilateral or bilateral vertebral artery injections, as necessary. Multislice computed tomographic angiography images were interpreted by one radiologist, and digital subtraction angiography was performed by another radiologist who was blinded to the interpretation of the multislice computed tomographic angiograms. Results: The mean ± SD age of the patients was 49.1 ± 13.6 years (range: 12-84 years). We performed multislice computed tomographic angiography in 111 patients and digital subtraction angiography in 85 patients. The sensitivity, specificity, positive predictive value, negative predictive value, and positive and negative likelihood ratios of multislice computed tomographic angiography, when compared with digital subtraction angiography as the gold standard, were 100%, 90%, 87.5%, 100%, 10 and 0, respectively. Conclusion: Multislice computed tomographic angiography seems to be an accurate and noninvasive imaging modality in the diagnosis of intracranial aneurysms

  14. Concepts and Plans towards fast large scale Monte Carlo production for the ATLAS Experiment

    CERN Document Server

    Chapman, J; Duehrssen, M; Elsing, M; Froidevaux, D; Harrington, R; Jansky, R; Langenberg, R; Mandrysch, R; Marshall, Z; Ritsch, E; Salzburger, A

    2014-01-01

    The huge success of the physics program of the ATLAS experiment at the Large Hadron Collider (LHC) during run I relies upon a great number of simulated Monte Carlo events. This Monte Carlo production currently takes the biggest part of the computing resources in use by ATLAS. In this document we describe the plans to overcome the computing resource limitations for large scale Monte Carlo production in the ATLAS experiment for run II, and beyond. A number of fast detector simulation, digitization and reconstruction techniques are discussed, based upon a new flexible detector simulation framework. To optimally benefit from these developments, a redesigned ATLAS MC production chain is presented at the end of this document.

  15. An interior-point method for total variation regularized positron emission tomography image reconstruction

    Science.gov (United States)

    Bai, Bing

    2012-03-01

    There has been a lot of work on total variation (TV) regularized tomographic image reconstruction recently. Many of them use gradient-based optimization algorithms with a differentiable approximation of the TV functional. In this paper we apply TV regularization in Positron Emission Tomography (PET) image reconstruction. We reconstruct the PET image in a Bayesian framework, using Poisson noise model and TV prior functional. The original optimization problem is transformed to an equivalent problem with inequality constraints by adding auxiliary variables. Then we use an interior point method with logarithmic barrier functions to solve the constrained optimization problem. In this method, a series of points approaching the solution from inside the feasible region are found by solving a sequence of subproblems characterized by an increasing positive parameter. We use preconditioned conjugate gradient (PCG) algorithm to solve the subproblems directly. The nonnegativity constraint is enforced by bend line search. The exact expression of the TV functional is used in our calculations. Simulation results show that the algorithm converges fast and the convergence is insensitive to the values of the regularization and reconstruction parameters.
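
    The objective being minimized can be written down compactly: a Poisson log-likelihood data term, a TV prior and a logarithmic barrier enforcing nonnegativity, with the barrier weight driven towards zero over the interior-point iterations. The Python sketch below only evaluates this objective on toy data; it is not the paper's PCG-based solver, and the system matrix, counts and parameter values are illustrative.

        import numpy as np

        def neg_log_posterior(x, A, y, shape, beta, mu, eps=1e-8):
            # Poisson negative log-likelihood + TV prior + log barrier for x >= 0.
            ybar = A @ x + eps
            data_term = np.sum(ybar - y * np.log(ybar))
            img = x.reshape(shape)
            gx = np.diff(img, axis=1); gy = np.diff(img, axis=0)
            tv = np.sum(np.sqrt(gx[:-1, :]**2 + gy[:, :-1]**2 + eps))
            barrier = -mu * np.sum(np.log(x + eps))
            return data_term + beta * tv + barrier

        side = 16
        rng = np.random.default_rng(0)
        A = rng.random((300, side * side)) / side      # toy system matrix
        x_true = rng.random(side * side) + 0.05        # strictly positive toy image
        y = rng.poisson(A @ x_true)                    # toy projection counts
        print(neg_log_posterior(x_true, A, y, (side, side), beta=0.1, mu=1e-3))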

  16. Limited data tomographic image reconstruction via dual formulation of total variation minimization

    Science.gov (United States)

    Jang, Kwang Eun; Sung, Younghun; Lee, Kangeui; Lee, Jongha; Cho, Seungryong

    2011-03-01

    X-ray mammography is the primary imaging modality for breast cancer screening. For the dense breast, however, the mammogram is usually difficult to read due to the tissue overlap caused by the superposition of normal tissues. Digital breast tomosynthesis (DBT), which measures several low dose projections over a limited angle range, may be an alternative modality for breast imaging, since it allows the visualization of cross-sectional information of the breast. DBT, however, may suffer from aliasing artifacts and severe noise corruption. To overcome these problems, a total variation (TV) regularized statistical reconstruction algorithm is presented. Inspired by the dual formulation of TV minimization in denoising and deblurring problems, we derived a gradient-type algorithm based on the statistical model of X-ray tomography. The objective function is comprised of a data fidelity term derived from the statistical model and a TV regularization term. The gradient of the objective function can be easily calculated using simple operations in terms of auxiliary variables. After a descent step, the data fidelity term is updated in each iteration. Since the proposed algorithm can be implemented without sophisticated operations such as matrix inversion, it provides an efficient way to include TV regularization in the statistical reconstruction method, which results in fast and robust estimation for low dose projections over a limited angle range. Initial tests with an experimental DBT system confirmed our findings.
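
    The dual formulation that inspired the algorithm is easiest to see in the pure denoising setting, where TV minimization reduces to a projection computed by a fixed-point iteration on a dual variable (Chambolle's method). The Python sketch below shows that denoising building block only, not the full statistical DBT reconstruction; the regularization weight, step size and test image are illustrative.

        import numpy as np

        def grad(u):
            gx = np.zeros_like(u); gy = np.zeros_like(u)
            gx[:, :-1] = u[:, 1:] - u[:, :-1]
            gy[:-1, :] = u[1:, :] - u[:-1, :]
            return gx, gy

        def div(px, py):
            # Discrete divergence, the negative adjoint of grad above.
            dx = np.zeros_like(px); dy = np.zeros_like(py)
            dx[:, 0] = px[:, 0]; dx[:, 1:] = px[:, 1:] - px[:, :-1]
            dy[0, :] = py[0, :]; dy[1:, :] = py[1:, :] - py[:-1, :]
            return dx + dy

        def tv_denoise_dual(f, lam=0.1, n_iter=100, tau=0.125):
            # Chambolle-type fixed-point iteration on the dual variable p;
            # the denoised image is f - lam * div(p).
            px = np.zeros_like(f); py = np.zeros_like(f)
            for _ in range(n_iter):
                gx, gy = grad(div(px, py) - f / lam)
                norm = np.sqrt(gx**2 + gy**2)
                px = (px + tau * gx) / (1.0 + tau * norm)
                py = (py + tau * gy) / (1.0 + tau * norm)
            return f - lam * div(px, py)

        rng = np.random.default_rng(0)
        clean = np.zeros((64, 64)); clean[16:48, 16:48] = 1.0
        denoised = tv_denoise_dual(clean + 0.2 * rng.standard_normal(clean.shape))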

  17. MO-DE-209-02: Tomosynthesis Reconstruction Methods

    International Nuclear Information System (INIS)

    Mainprize, J.

    2016-01-01

    Digital Breast Tomosynthesis (DBT) is rapidly replacing mammography as the standard of care in breast cancer screening and diagnosis. DBT is a form of computed tomography, in which a limited set of projection images are acquired over a small angular range and reconstructed into tomographic data. The angular range varies from 15° to 50° and the number of projections varies between 9 and 25 projections, as determined by the equipment manufacturer. It is equally valid to treat DBT as the digital analog of classical tomography – that is, linear tomography. In fact, the name “tomosynthesis” stands for “synthetic tomography.” DBT shares many common features with classical tomography, including the radiographic appearance, dose, and image quality considerations. As such, both the science and practical physics of DBT systems is a hybrid between computed tomography and classical tomographic methods. In this lecture, we will explore the continuum from radiography to computed tomography to illustrate the characteristics of DBT. This lecture will consist of four presentations that will provide a complete overview of DBT, including a review of the fundamentals of DBT acquisition, a discussion of DBT reconstruction methods, an overview of dosimetry for DBT systems, and summary of the underlying image theory of DBT thereby relating image quality and dose. Learning Objectives: To understand the fundamental principles behind tomosynthesis image acquisition. To understand the fundamentals of tomosynthesis image reconstruction. To learn the determinants of image quality and dose in DBT, including measurement techniques. To learn the image theory underlying tomosynthesis, and the relationship between dose and image quality. ADM is a consultant to, and holds stock in, Real Time Tomography, LLC. ADM receives research support from Hologic Inc., Analogic Inc., and Barco NV.; ADM is a member of the Scientific Advisory Board for Gamma Medica Inc.; A. Maidment, Research Support

  18. MO-DE-209-02: Tomosynthesis Reconstruction Methods

    Energy Technology Data Exchange (ETDEWEB)

    Mainprize, J. [Sunnybrook Health Sciences Centre, Toronto, ON (Canada)

    2016-06-15

    Digital Breast Tomosynthesis (DBT) is rapidly replacing mammography as the standard of care in breast cancer screening and diagnosis. DBT is a form of computed tomography, in which a limited set of projection images are acquired over a small angular range and reconstructed into tomographic data. The angular range varies from 15° to 50° and the number of projections varies between 9 and 25 projections, as determined by the equipment manufacturer. It is equally valid to treat DBT as the digital analog of classical tomography – that is, linear tomography. In fact, the name “tomosynthesis” stands for “synthetic tomography.” DBT shares many common features with classical tomography, including the radiographic appearance, dose, and image quality considerations. As such, both the science and practical physics of DBT systems is a hybrid between computed tomography and classical tomographic methods. In this lecture, we will explore the continuum from radiography to computed tomography to illustrate the characteristics of DBT. This lecture will consist of four presentations that will provide a complete overview of DBT, including a review of the fundamentals of DBT acquisition, a discussion of DBT reconstruction methods, an overview of dosimetry for DBT systems, and summary of the underlying image theory of DBT thereby relating image quality and dose. Learning Objectives: To understand the fundamental principles behind tomosynthesis image acquisition. To understand the fundamentals of tomosynthesis image reconstruction. To learn the determinants of image quality and dose in DBT, including measurement techniques. To learn the image theory underlying tomosynthesis, and the relationship between dose and image quality. ADM is a consultant to, and holds stock in, Real Time Tomography, LLC. ADM receives research support from Hologic Inc., Analogic Inc., and Barco NV.; ADM is a member of the Scientific Advisory Board for Gamma Medica Inc.; A. Maidment, Research Support

  19. A First-Order Primal-Dual Reconstruction Algorithm for Few-View SPECT

    DEFF Research Database (Denmark)

    Wolf, Paul; Jørgensen, Jakob Heide; Gilat-Schmidt, Taly

    2012-01-01

    A sparsity-exploiting algorithm intended for few-view Single Photon Emission Computed Tomography (SPECT) reconstruction is proposed and characterized. The algorithm models the object as piecewise constant subject to a blurring operation. Monte Carlo simulations were performed to provide more proj...

  20. Detection of explosive substances by tomographic inspection using neutron and gamma-ray spectroscopy

    International Nuclear Information System (INIS)

    Farahmand, M.; Boston, A.J.; Grint, A.N.; Nolan, P.J.; Joyce, M.J.; Mackin, R.O.; D'Mellow, B.; Aspinall, M.; Peyton, A.J.; Silfhout, R. van

    2007-01-01

    neutron detector providing data for inversion to tomographic images. In this paper, we present our approach to the design and implementation of a system for the efficient screening of goods in luggage and cargo containers. The simulation in a Monte Carlo framework using GEANT4 has been carried out for the imaging of gamma-ray events using the Compton camera design which will be discussed. The results of Compton camera measurements using HPGe detectors and the subsequent reconstructed images will also be presented

  1. An attenuation measurement technique for rotating planar detector positron tomographs

    International Nuclear Information System (INIS)

    McNeil, P.A.; Julyan, P.J.; Parker, D.J.

    1997-01-01

    This paper presents a new attenuation measurement technique suitable for rotating planar detector positron tomographs. Transmission measurements are made using two unshielded positron-emitting line sources, one attached to the front face of each detector. Many of the scattered and accidental coincidences are rejected by including only those coincidences that form a vector passing within a predetermined distance of either line source. Some scattered and accidental coincidences are still included, which reduces the measured linear attenuation; in principle their contribution can be accurately estimated and subtracted, but in practice, when limited statistics are available (as is the case with the multi-wire Birmingham positron camera), this background subtraction unacceptably increases the noise. Instead an attenuation image having the correct features can be reconstructed from the measured projections. For objects containing only a few discrete linear attenuation coefficients, segmentation of this attenuation image reduces noise and allows the correct linear attenuation coefficients to be restored by renormalization. Reprojection through the segmented image may then provide quantitatively correct attenuation correction factors of sufficient statistical quality to correct for attenuation in PET emission images. (author)

  2. New techniques for resolution enhancement of 3D x-ray tomographic imaging from incomplete data

    International Nuclear Information System (INIS)

    Vengrinovich, V.; Zolotarev, S.; Denkevich, Y.; Tillack, G.-R.

    2004-01-01

    The accurate evaluation of dimensions directly from tomographic images restored from only a few x-ray projections, taken in a limited observation sector, is considered, using pipe wall thickness assessment as a typical example. Both experiments and simulations are used to identify the main error sources. It is taken as known that neglecting scattered radiation and beam hardening effects results in image blurring, strong artifacts and, finally, inaccurate sizing. A computerized technique is developed to simulate the contributions of scattered radiation and beam hardening so that they can subsequently be removed from the projection data. After these accompanying effects are removed, iterative Bayesian techniques are applied to reconstruct images from the projections, using volumetric and/or shell representations of objects such as pipes. The achieved error of virtual pipe wall thickness assessment from 3D images can be as small as 300 μm, compared with the 1 mm provided by current techniques. Finally, it is concluded that standard projection techniques using X- or gamma-rays in combination with X-ray film or imaging plates can be applied for data acquisition to reconstruct wall thickness profiles in an in-field environment. (author)

  3. Computed tomographic findings of intracranial gliosis

    International Nuclear Information System (INIS)

    Weisberg, L.

    1981-01-01

    The clinical and computed tomographic (CT) findings in eight patients with pathological evidence of cerebral gliosis are analyzed. CT findings do not permit differentiation of gliosis from other neoplastic and non-neoplastic conditions. (orig.)

  4. Application of iterative reconstruction in dynamic studies

    International Nuclear Information System (INIS)

    Meikle, S.R.

    1998-01-01

    Full text: The conventional approach to analysing dynamic tomographic data (SPECT or PET) is to reconstruct projections corresponding to each time interval separately and then fit a suitable tracer kinetic model to the dynamic sequence (method 1). This approach assumes that the tracer distribution remains static during any given time interval and, for practical reasons, filtered back-projection (FBP) is the preferred reconstruction algorithm. However, alternative approaches exist which lend themselves to iterative algorithms, such as EM. One approach is to fit the model directly to the projection data, followed by EM reconstruction of the parameter estimates (method 2). This requires that the tracer model can be expressed as a linear function of the unknown model parameters. A third alternative is to incorporate the tracer model into the reconstruction algorithm (method 3). Such an extension was described during the early development of the EM algorithm, referred to as the EM parametric image reconstruction algorithm (EM-PIRA). We have investigated these various strategies for analysing dynamic data and their relative pros and cons. Tracer modelling was performed using a general model, referred to as spectral analysis, which makes no restriction on the number of physiological compartments and satisfies the linearity requirement of method 2. A kinetic software phantom was created and used to test the convergence and noise properties of the different approaches. In summary, method 2 is the most practical as it reduces the number of reconstructions by at least an order of magnitude and provides improved signal-to-noise ratios compared with method 1. EM-PIRA allows greater flexibility in the choice of parametric images and appears to have a regularising effect on convergence. Methods 2 and 3 are also better suited to dynamic scanning with a rotating camera, as they can potentially account for changes in tracer distribution between projections
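
    The linearity requirement of method 2 is exactly what spectral analysis provides: the tissue (or projection-bin) time-activity curve is modeled as a non-negative combination of the input function convolved with a fixed set of decaying exponentials, so the unknowns enter linearly and can be fitted by non-negative least squares. The Python sketch below is a generic illustration of that fit, with an invented input function, basis and noise level, not the abstract's implementation.

        import numpy as np
        from scipy.optimize import nnls

        t = np.arange(0.0, 61.0)                       # frame mid-times (min), illustrative
        input_fn = t * np.exp(-t / 5.0)                # toy plasma input function
        rates = np.logspace(-3, 0, 20)                 # spectral basis decay rates (1/min)
        dt = t[1] - t[0]
        # basis functions: input function convolved with decaying exponentials
        basis = np.stack([np.convolve(input_fn, np.exp(-r * t))[:t.size] * dt
                          for r in rates], axis=1)

        true_coeffs = np.zeros(rates.size); true_coeffs[[4, 15]] = [0.02, 0.005]
        rng = np.random.default_rng(0)
        tac = basis @ true_coeffs + 0.001 * rng.standard_normal(t.size)   # noisy tissue curve

        coeffs, _ = nnls(basis, tac)                   # non-negative spectral coefficients
        print(np.flatnonzero(coeffs > 1e-4))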

  5. Monte Carlo and Quasi-Monte Carlo Sampling

    CERN Document Server

    Lemieux, Christiane

    2009-01-01

    Presents essential tools for using quasi-Monte Carlo sampling in practice. This book focuses on issues related to Monte Carlo methods - uniform and non-uniform random number generation, variance reduction techniques. It covers several aspects of quasi-Monte Carlo methods.
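
    The difference between the two sampling families is easy to demonstrate on a toy integral: a low-discrepancy (Halton) point set typically gives a smaller quadrature error than pseudo-random sampling at the same sample size. The Python sketch below uses scipy.stats.qmc and an arbitrary smooth test integrand; it is an illustration, not an example taken from the book.

        import numpy as np
        from scipy.stats import qmc

        def f(u):                       # test integrand; its integral over [0,1]^2 is 0.25
            return u[:, 0] * u[:, 1]

        n = 4096
        rng = np.random.default_rng(0)
        mc_estimate = f(rng.random((n, 2))).mean()

        halton = qmc.Halton(d=2, scramble=True, seed=0)
        qmc_estimate = f(halton.random(n)).mean()

        print("plain MC error :", abs(mc_estimate - 0.25))
        print("Halton QMC error:", abs(qmc_estimate - 0.25))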

  6. Post-processing methods of rendering and visualizing 3-D reconstructed tomographic images

    Energy Technology Data Exchange (ETDEWEB)

    Wong, S.T.C. [Univ. of California, San Francisco, CA (United States)

    1997-02-01

    The purpose of this presentation is to discuss the computer processing techniques applied to tomographic images, after they have been generated by imaging scanners, for volume visualization. Volume visualization is concerned with the representation, manipulation, and rendering of volumetric data. Since the first digital images were produced from computed tomography (CT) scanners in the mid 1970s, applications of visualization in medicine have expanded dramatically. Today, three-dimensional (3D) medical visualization has expanded from using CT data, the first inherently digital source of 3D medical data, to using data from various medical imaging modalities, including magnetic resonance scanners, positron emission scanners, digital ultrasound, electronic and confocal microscopy, and other medical imaging modalities. We have advanced from rendering anatomy to aid diagnosis and visualize complex anatomic structures to planning and assisting surgery and radiation treatment. New, more accurate and cost-effective procedures for clinical services and biomedical research have become possible by integrating computer graphics technology with medical images. This trend is particularly noticeable in the current market-driven health care environment. For example, interventional imaging, image-guided surgery, and stereotactic and visualization techniques are now making their way into surgical practice. In this presentation, we discuss only computer-display-based approaches to volumetric medical visualization. That is, we assume that the display device available is two-dimensional (2D) in nature and all analysis of multidimensional image data is to be carried out via the 2D screen of the device. There are technologies such as holography and virtual reality that do provide a "true 3D screen". To confine the scope, this presentation will not discuss such approaches.

  7. Post-processing methods of rendering and visualizing 3-D reconstructed tomographic images

    International Nuclear Information System (INIS)

    Wong, S.T.C.

    1997-01-01

    The purpose of this presentation is to discuss the computer processing techniques of tomographic images, after they have been generated by imaging scanners, for volume visualization. Volume visualization is concerned with the representation, manipulation, and rendering of volumetric data. Since the first digital images were produced from computed tomography (CT) scanners in the mid 1970s, applications of visualization in medicine have expanded dramatically. Today, three-dimensional (3D) medical visualization has expanded from using CT data, the first inherently digital source of 3D medical data, to using data from various medical imaging modalities, including magnetic resonance scanners, positron emission scanners, digital ultrasound, electronic and confocal microscopy, and other medical imaging modalities. We have advanced from rendering anatomy to aid diagnosis and visualize complex anatomic structures to planning and assisting surgery and radiation treatment. New, more accurate and cost-effective procedures for clinical services and biomedical research have become possible by integrating computer graphics technology with medical images. This trend is particularly noticeable in the current market-driven health care environment. For example, interventional imaging, image-guided surgery, and stereotactic and visualization techniques are now making their way into surgical practice. In this presentation, we discuss only computer-display-based approaches of volumetric medical visualization. That is, we assume that the display device available is two-dimensional (2D) in nature and all analysis of multidimensional image data is to be carried out via the 2D screen of the device. There are technologies such as holography and virtual reality that do provide a "true 3D screen". To confine the scope, this presentation will not discuss such approaches.

  8. A new method of detection for a positron emission tomograph using a time of flight method

    International Nuclear Information System (INIS)

    Gresset, Christian.

    1981-05-01

    The first chapter presents the advantages of short-lived positron emitters (β+) and the essential characteristics of the positron tomographs built to date. The second chapter presents the interest of an original image-reconstruction technique: the time-of-flight technique. The third chapter describes the characterization methods developed to verify the suitability of cesium fluoride for tomography. Chapter four presents the results obtained with these methods. It appears that cesium fluoride is presently the best scintillator for positron emission tomography combined with the time-of-flight technique. The hypotheses made about the achievable performance of such machines are validated by phantom experiments. The bismuth germanate detector nevertheless retains all its interest for skull tomography. [fr]
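
    For context, the time-of-flight technique mentioned above localizes the annihilation point along the line of response from the arrival-time difference of the two photons; in standard notation (not taken from the thesis),

        \[ \Delta x = \frac{c\,\Delta t}{2}, \qquad \sigma_x = \frac{c\,\sigma_{\Delta t}}{2}, \]

    so a coincidence timing resolution of a few hundred picoseconds confines each event to a few centimetres along the line of response, which is what makes fast scintillators such as cesium fluoride attractive.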

  9. Continuous analog of multiplicative algebraic reconstruction technique for computed tomography

    Science.gov (United States)

    Tateishi, Kiyoko; Yamaguchi, Yusaku; Abou Al-Ola, Omar M.; Kojima, Takeshi; Yoshinaga, Tetsuya

    2016-03-01

    We propose a hybrid dynamical system as a continuous analog to the block-iterative multiplicative algebraic reconstruction technique (BI-MART), which is a well-known iterative image reconstruction algorithm for computed tomography. The hybrid system is described by a switched nonlinear system with a piecewise smooth vector field or differential equation and, for consistent inverse problems, the convergence of non-negatively constrained solutions to a globally stable equilibrium is guaranteed by the Lyapunov theorem. Namely, we can prove theoretically that a weighted Kullback-Leibler divergence measure can be a common Lyapunov function for the switched system. We show that discretizing the differential equation by using the first-order approximation (Euler's method) based on the geometric multiplicative calculus leads to the same iterative formula as the BI-MART, with the scaling parameter acting as the time step of the numerical discretization. The present paper is the first to reveal that a kind of iterative image reconstruction algorithm is constructed by the discretization of a continuous-time dynamical system for solving tomographic inverse problems. Iterative algorithms obtained by discretizing the continuous-time system not only with the Euler method but also with lower-order Runge-Kutta methods can be used for image reconstruction. A numerical example showing the characteristics of the discretized iterative methods is presented.
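
    A minimal sketch of the multiplicative ART update (block size one) that the continuous-time system discretizes to; the relaxation parameter lam plays the role of the time step mentioned in the abstract, and the parameter values and stopping rule are illustrative:

        import numpy as np

        def mart(A, y, n_sweeps=20, lam=1.0, x0=None):
            """Classic multiplicative ART for y = A x with non-negative data.

            A: (n_rays, n_pixels) system matrix; y: (n_rays,) measured projections.
            The multiplicative update keeps the iterate strictly positive, matching
            the non-negativity constraint discussed above.
            """
            m, n = A.shape
            x = np.ones(n) if x0 is None else x0.astype(float).copy()
            for _ in range(n_sweeps):
                for i in range(m):
                    Ax_i = A[i] @ x
                    if Ax_i <= 0 or y[i] <= 0:
                        continue
                    x *= (y[i] / Ax_i) ** (lam * A[i])
            return x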

  10. Original circuitry for TOHR tomograph

    International Nuclear Information System (INIS)

    Cuzon, J.C.; Pinot, L.

    1999-01-01

    Having industrialization in mind, a specific electronics for a high resolution tomograph is designed outside the usual standards of nuclear physics. All the information is converted into the time domain, and a fast processor placed upstream of the data acquisition carries out the time and energy coincidences. (authors)

  11. Parallel-scanning tomosynthesis using a slot scanning technique: Fixed-focus reconstruction and the resulting image quality

    International Nuclear Information System (INIS)

    Shibata, Koichi; Notohara, Daisuke; Sakai, Takihito

    2014-01-01

    Purpose: Parallel-scanning tomosynthesis (PS-TS) is a novel technique that fuses the slot scanning technique and the conventional tomosynthesis (TS) technique. This approach allows one to obtain long-view tomosynthesis images in addition to normally sized tomosynthesis images, even when using a system that has no linear tomographic scanning function. The reconstruction technique and an evaluation of the resulting image quality for PS-TS are described in this paper. Methods: The PS-TS image-reconstruction technique consists of several steps: (1) the projection images are divided into strips, (2) the strips are stitched together to construct images corresponding to the reconstruction plane, (3) the stitched images are filtered, and (4) the filtered stitched images are back-projected. In the case of PS-TS using the fixed-focus reconstruction method (PS-TS-F), one set of stitched images is used for the reconstruction planes at all heights, thus avoiding the necessity of repeating steps (1)–(3). A physical evaluation of the image quality of PS-TS-F compared with that of the conventional linear TS was performed using an R/F table (Sonialvision safire, Shimadzu Corp., Kyoto, Japan). The tomographic plane with the best theoretical spatial resolution (the in-focus plane, IFP) was set at a height of 100 mm from the table top by adjusting the reconstruction program. First, the spatial frequency response was evaluated at heights of −100, −50, 0, 50, 100, and 150 mm from the IFP using the edge of a 0.3-mm-thick copper plate. Second, the spatial resolution at each height was visually evaluated using an x-ray test pattern (Model No. 38, PTW Freiburg, Germany). Third, the slice sensitivity at each height was evaluated via the wire method using a 0.1-mm-diameter tungsten wire. Phantom studies using a knee phantom and a whole-body phantom were also performed. Results: The spatial frequency response of PS-TS-F yielded the best results at the IFP and degraded slightly as the
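
    As a point of reference for step (4), a minimal shift-and-add back-projection for a single reconstruction plane is sketched below; it does not reproduce the stitched-strip handling or the actual shift geometry of PS-TS, and the shift values are assumed to be supplied:

        import numpy as np

        def shift_and_add(filtered_projs, shifts_px):
            """Generic tomosynthesis back-projection for one reconstruction plane.

            filtered_projs: list of 2-D filtered projection images (output of step (3)).
            shifts_px:      per-projection integer pixel shifts that bring structures in
                            the chosen plane into register; in a fixed-focus scheme such
                            as PS-TS-F these would be fixed for the in-focus plane.
            """
            plane = np.zeros_like(filtered_projs[0], dtype=float)
            for img, s in zip(filtered_projs, shifts_px):
                plane += np.roll(img, s, axis=0)    # shift along the scan direction
            return plane / len(filtered_projs)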

  12. Parallel-scanning tomosynthesis using a slot scanning technique: fixed-focus reconstruction and the resulting image quality.

    Science.gov (United States)

    Shibata, Koichi; Notohara, Daisuke; Sakai, Takihito

    2014-11-01

    Parallel-scanning tomosynthesis (PS-TS) is a novel technique that fuses the slot scanning technique and the conventional tomosynthesis (TS) technique. This approach allows one to obtain long-view tomosynthesis images in addition to normally sized tomosynthesis images, even when using a system that has no linear tomographic scanning function. The reconstruction technique and an evaluation of the resulting image quality for PS-TS are described in this paper. The PS-TS image-reconstruction technique consists of several steps: (1) the projection images are divided into strips, (2) the strips are stitched together to construct images corresponding to the reconstruction plane, (3) the stitched images are filtered, and (4) the filtered stitched images are back-projected. In the case of PS-TS using the fixed-focus reconstruction method (PS-TS-F), one set of stitched images is used for the reconstruction planes at all heights, thus avoiding the necessity of repeating steps (1)-(3). A physical evaluation of the image quality of PS-TS-F compared with that of the conventional linear TS was performed using an R/F table (Sonialvision safire, Shimadzu Corp., Kyoto, Japan). The tomographic plane with the best theoretical spatial resolution (the in-focus plane, IFP) was set at a height of 100 mm from the table top by adjusting the reconstruction program. First, the spatial frequency response was evaluated at heights of -100, -50, 0, 50, 100, and 150 mm from the IFP using the edge of a 0.3-mm-thick copper plate. Second, the spatial resolution at each height was visually evaluated using an x-ray test pattern (Model No. 38, PTW Freiburg, Germany). Third, the slice sensitivity at each height was evaluated via the wire method using a 0.1-mm-diameter tungsten wire. Phantom studies using a knee phantom and a whole-body phantom were also performed. The spatial frequency response of PS-TS-F yielded the best results at the IFP and degraded slightly as the distance from the IFP increased. A

  13. Parallel-scanning tomosynthesis using a slot scanning technique: Fixed-focus reconstruction and the resulting image quality

    Energy Technology Data Exchange (ETDEWEB)

    Shibata, Koichi, E-mail: shibatak@suzuka-u.ac.jp [Department of Radiological Technology, Faculty of Health Science, Suzuka University of Medical Science 1001-1, Kishioka-cho, Suzuka 510-0293 (Japan); Notohara, Daisuke; Sakai, Takihito [R and D Department, Medical Systems Division, Shimadzu Corporation 1, Nishinokyo-Kuwabara-cho, Nakagyo-ku, Kyoto 604-8511 (Japan)

    2014-11-01

    Purpose: Parallel-scanning tomosynthesis (PS-TS) is a novel technique that fuses the slot scanning technique and the conventional tomosynthesis (TS) technique. This approach allows one to obtain long-view tomosynthesis images in addition to normally sized tomosynthesis images, even when using a system that has no linear tomographic scanning function. The reconstruction technique and an evaluation of the resulting image quality for PS-TS are described in this paper. Methods: The PS-TS image-reconstruction technique consists of several steps: (1) the projection images are divided into strips, (2) the strips are stitched together to construct images corresponding to the reconstruction plane, (3) the stitched images are filtered, and (4) the filtered stitched images are back-projected. In the case of PS-TS using the fixed-focus reconstruction method (PS-TS-F), one set of stitched images is used for the reconstruction planes at all heights, thus avoiding the necessity of repeating steps (1)–(3). A physical evaluation of the image quality of PS-TS-F compared with that of the conventional linear TS was performed using an R/F table (Sonialvision safire, Shimadzu Corp., Kyoto, Japan). The tomographic plane with the best theoretical spatial resolution (the in-focus plane, IFP) was set at a height of 100 mm from the table top by adjusting the reconstruction program. First, the spatial frequency response was evaluated at heights of −100, −50, 0, 50, 100, and 150 mm from the IFP using the edge of a 0.3-mm-thick copper plate. Second, the spatial resolution at each height was visually evaluated using an x-ray test pattern (Model No. 38, PTW Freiburg, Germany). Third, the slice sensitivity at each height was evaluated via the wire method using a 0.1-mm-diameter tungsten wire. Phantom studies using a knee phantom and a whole-body phantom were also performed. Results: The spatial frequency response of PS-TS-F yielded the best results at the IFP and degraded slightly as the

  14. Minimum Detectable Activity for Tomographic Gamma Scanning System

    Energy Technology Data Exchange (ETDEWEB)

    Venkataraman, Ram [Canberra Industries (AREVA BDNM), Meriden, CT (United States); Smith, Susan [Canberra Industries (AREVA BDNM), Meriden, CT (United States); Kirkpatrick, J. M. [Canberra Industries (AREVA BDNM), Meriden, CT (United States); Croft, Stephen [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2015-01-01

    For any radiation measurement system, it is useful to explore and establish the detection limits and a minimum detectable activity (MDA) for the radionuclides of interest, even if the system is to be used at far higher values. The MDA serves as an important figure of merit, and often a system is optimized and configured so that it can meet the MDA requirements of a measurement campaign. The non-destructive assay (NDA) systems based on gamma ray analysis are no exception, and well-established conventions, such as the Currie method, exist for estimating the detection limits and the MDA. However, the Tomographic Gamma Scanning (TGS) technique poses some challenges for the estimation of detection limits and MDAs. The TGS combines high resolution gamma ray spectrometry (HRGS) with low spatial resolution image reconstruction techniques. In non-imaging gamma ray based NDA techniques, measured counts in a full energy peak can be used to estimate the activity of a radionuclide, independently of other counting trials. However, in the case of the TGS each “view” is a full spectral grab (each a counting trial), and each scan consists of 150 spectral grabs in the transmission and emission scans per vertical layer of the item. The set of views in a complete scan are then used to solve for the radionuclide activities on a voxel by voxel basis, over 16 layers of a 10x10 voxel grid. Thus, the raw count data are not independent trials any more, but rather constitute input to a matrix solution for the emission image values at the various locations inside the item volume used in the reconstruction. So, the validity of the methods used to estimate the MDA for an imaging technique such as the TGS warrants close scrutiny, because the pair-counting concept of Currie is not directly applicable. One can also raise questions as to whether the TGS, along with other image reconstruction techniques which heavily intertwine data, is a suitable method if one expects to measure samples whose activities
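
    For reference, the conventional Currie expressions that the abstract takes as its starting point can be written, for a peak-region background of B counts, in the usual form (standard notation, not specific to the TGS):

        \[ L_C \approx 2.33\sqrt{B}, \qquad L_D \approx 2.71 + 4.65\sqrt{B}, \qquad
           \mathrm{MDA} = \frac{L_D}{\varepsilon\, p_\gamma\, t}, \]

    where \(\varepsilon\) is the full-energy-peak detection efficiency, \(p_\gamma\) the gamma emission probability and \(t\) the count time. The point made above is that these formulas assume independent counting trials, which the voxel-by-voxel matrix solution of the TGS does not provide directly.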

  15. Non-stationary reconstruction for dynamic fluorescence molecular tomography with extended Kalman filter.

    Science.gov (United States)

    Liu, Xin; Wang, Hongkai; Yan, Zhuangzhi

    2016-11-01

    Dynamic fluorescence molecular tomography (FMT) plays an important role in drug delivery research. However, the majority of current reconstruction methods focus on solving stationary FMT problems. If the stationary reconstruction methods are applied to time-varying fluorescence measurements, the reconstructed results may suffer from a high level of artifacts. In addition, with the stationary methods only one tomographic image can be obtained per full circle of projection data. As a result, the movement of fluorophore in the imaged object may not be detected due to the relatively long data acquisition time (typically >1 min). In this paper, we apply the extended Kalman filter (EKF) technique to solve the non-stationary fluorescence tomography problem. In particular, to improve the EKF reconstruction performance, the generalized inverse of the Kalman gain is calculated by a second-order iterative method. Numerical simulation, phantom, and in vivo experiments are performed to evaluate the performance of the method. The experimental results indicate that by using the proposed EKF-based second-order iterative (EKF-SOI) method, we can not only clearly resolve the time-varying distributions of fluorophore within the imaged object, but also greatly improve the reconstruction time resolution (~2.5 sec/frame), which makes it possible to detect the movement of fluorophore during the imaging process.
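
    A minimal sketch of one textbook extended Kalman filter predict/update step, for orientation; it is not the EKF-SOI variant of the paper, which additionally replaces the matrix inverse in the gain with a second-order iterative approximation:

        import numpy as np

        def ekf_step(x, P, z, f, F, h, H, Q, R):
            """One generic EKF recursion step.

            x, P : current state estimate and covariance (here, the fluorophore distribution)
            z    : new measurement (e.g., one projection or frame of fluorescence data)
            f, F : state transition function and its Jacobian
            h, H : measurement (forward) model and its Jacobian
            Q, R : process and measurement noise covariances
            """
            x_pred = f(x)                                   # predict
            F_k = F(x)
            P_pred = F_k @ P @ F_k.T + Q
            H_k = H(x_pred)                                 # update
            S = H_k @ P_pred @ H_k.T + R
            K = P_pred @ H_k.T @ np.linalg.inv(S)           # Kalman gain (exact inverse here)
            x_new = x_pred + K @ (z - h(x_pred))
            P_new = (np.eye(len(x)) - K @ H_k) @ P_pred
            return x_new, P_new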

  16. Using additive manufacturing in accuracy evaluation of reconstructions from computed tomography.

    Science.gov (United States)

    Smith, Erin J; Anstey, Joseph A; Venne, Gabriel; Ellis, Randy E

    2013-05-01

    Bone models derived from patient imaging and fabricated using additive manufacturing technology have many potential uses including surgical planning, training, and research. This study evaluated the accuracy of bone surface reconstruction of two diarthrodial joints, the hip and shoulder, from computed tomography. Image segmentation of the tomographic series was used to develop a three-dimensional virtual model, which was fabricated using fused deposition modelling. Laser scanning was used to compare cadaver bones, printed models, and intermediate segmentations. The overall bone reconstruction process had a reproducibility of 0.3 ± 0.4 mm. Production of the model had an accuracy of 0.1 ± 0.1 mm, while the segmentation had an accuracy of 0.3 ± 0.4 mm, indicating that segmentation accuracy was the key factor in reconstruction. Generally, the shape of the articular surfaces was reproduced accurately, with poorer accuracy near the periphery of the articular surfaces, particularly in regions with periosteum covering and where osteophytes were apparent.

  17. Ratios between the effective doses for tomographic phantoms MAX and FAX

    International Nuclear Information System (INIS)

    Kramer, R.; Khoury, H.J.

    2005-01-01

    In the last two decades, the coefficients for the equivalent dose in organs and tissues, as well as for the effective dose, recommended by the International Commission on Radiological Protection (ICRP), were determined using exposure models based on MIRD-type stylized phantoms, which represent the human body with its radiosensitive organs and tissues according to the ICRP 23 Reference Man; Monte Carlo codes that simulate radiation physics in a simplified way; tissue compositions taken from different sources and sometimes applied in an unrealistic way; and the list of organs and tissues at risk with their corresponding weighting factors published in ICRP 60. In the meantime, the International Commission on Radiation Units and Measurements (ICRU) published reference data on human tissue compositions in ICRU 44, and the ICRP released new anatomical and physiological reference data in report number 89. In addition, a draft report with recommendations to be released in 2005 (http://icrp.org/) advances significant changes in the list of radiosensitive organs and tissues as well as their corresponding weighting factors. As a practical consequence, all components of the traditional stylized exposure models should be replaced: the Monte Carlo codes, the human phantoms, the tissue compositions, and the selection of the organs and tissues at risk with their respective weighting factors used to determine the effective dose. This article presents the results of comprehensive research into the dosimetric consequences of replacing the stylized exposure models. The calculations were done using the EGS4 and MCNP4C Monte Carlo codes for external and internal exposure to photons and electrons with the ADAM and EVA phantoms, as well as with the tomographic phantoms MAX and FAX, for different tissue compositions and distributions. The ratios between effective doses for exposure models based on voxel phantoms and effective doses for the stylized models for external and internal exposure to photons and

  18. Monte Carlo calculations of the optical coupling between bismuth germanate crystals and photomultiplier tubes

    International Nuclear Information System (INIS)

    Derenzo, S.E.; Riles, J.K.

    1981-10-01

    The high density and atomic number of bismuth germanate (Bi4Ge3O12 or BGO) make it a very useful detector for positron emission tomography. Modern tomograph designs use large numbers of small, closely-packed crystals for high spatial resolution and high sensitivity. However, the low light output, the high refractive index (n=2.15), and the need for accurate timing make it important to optimize the transfer of light to the photomultiplier tube (PMT). We describe the results of a Monte Carlo computer program developed to study the effect of crystal shape, reflector type, and the refractive index of the PMT window on coupling efficiency. The program simulates total internal, external, and Fresnel reflection as well as internal absorption and scattering by bubbles.
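
    A minimal sketch of the per-interface physics such a photon-tracking simulation samples at the crystal boundary; the PMT window index of 1.5 is an assumption, and geometry, internal absorption and bubble scattering are not modelled here:

        import numpy as np

        N_BGO, N_WINDOW = 2.15, 1.5        # BGO crystal index; assumed PMT window index

        def fresnel_unpolarized(theta_i, n1=N_BGO, n2=N_WINDOW):
            """Reflectance of unpolarized light at a planar n1 -> n2 interface.

            Returns 1.0 beyond the critical angle (total internal reflection); otherwise
            the average of the s- and p-polarized Fresnel reflectances.
            """
            sin_t = n1 / n2 * np.sin(theta_i)
            if sin_t >= 1.0:
                return 1.0
            theta_t = np.arcsin(sin_t)
            rs = (n1 * np.cos(theta_i) - n2 * np.cos(theta_t)) / \
                 (n1 * np.cos(theta_i) + n2 * np.cos(theta_t))
            rp = (n1 * np.cos(theta_t) - n2 * np.cos(theta_i)) / \
                 (n1 * np.cos(theta_t) + n2 * np.cos(theta_i))
            return 0.5 * (rs**2 + rp**2)

        critical_angle = np.arcsin(N_WINDOW / N_BGO)   # ~44 deg: light striking the exit face
                                                       # at larger angles is totally reflected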

  19. Direct reconstruction of pharmacokinetic parameters in dynamic fluorescence molecular tomography by the augmented Lagrangian method

    Science.gov (United States)

    Zhu, Dianwen; Zhang, Wei; Zhao, Yue; Li, Changqing

    2016-03-01

    Dynamic fluorescence molecular tomography (FMT) has the potential to quantify physiological or biochemical information, known as pharmacokinetic parameters, which are important for cancer detection, drug development and delivery, etc. To image those parameters, there are indirect methods, which are easier to implement but tend to provide images with a low signal-to-noise ratio, and direct methods, which model all the measurement noise together and are statistically more efficient. The direct reconstruction methods in dynamic FMT have attracted a lot of attention recently. However, the coupling of tomographic image reconstruction and the nonlinearity of kinetic parameter estimation due to the compartment modeling has imposed a huge computational burden on the direct reconstruction of the kinetic parameters. In this paper, we propose to take advantage of both the direct and indirect reconstruction ideas through a variable splitting strategy under the augmented Lagrangian framework. Each iteration of the direct reconstruction is split into two steps: the dynamic FMT image reconstruction and the node-wise nonlinear least squares fitting of the pharmacokinetic parameter images. Through numerical simulation studies, we have found that the proposed algorithm can achieve good reconstruction results within a small amount of time. This will be the first step toward combined dynamic PET and FMT imaging in the future.
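
    One common way to write such a variable splitting, given here in generic form rather than as the exact formulation of the paper: introduce auxiliary dynamic images \(x\) constrained to equal the compartment-model output \(C(\theta)\),

        \[ \min_{x,\theta}\ \tfrac12\|y - A x\|_2^2 \quad \text{s.t.} \quad x = C(\theta), \]

    with augmented Lagrangian

        \[ \mathcal{L}_\rho(x,\theta,\lambda) = \tfrac12\|y - Ax\|_2^2
           + \lambda^{\top}\big(x - C(\theta)\big) + \tfrac{\rho}{2}\|x - C(\theta)\|_2^2, \]

    minimized alternately over \(x\) (a dynamic FMT image-reconstruction step) and over \(\theta\) (a node-wise nonlinear least-squares fit), followed by the dual update \(\lambda \leftarrow \lambda + \rho\,(x - C(\theta))\).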

  20. On the estimation of wall pressure coherence using time-resolved tomographic PIV

    Science.gov (United States)

    Pröbsting, Stefan; Scarano, Fulvio; Bernardini, Matteo; Pirozzoli, Sergio

    2013-07-01

    Three-dimensional time-resolved velocity field measurements are obtained using a high-speed tomographic Particle Image Velocimetry (PIV) system on a fully developed flat-plate turbulent boundary layer for the estimation of wall pressure fluctuations. The work focuses on the applicability of tomographic PIV to compute the coherence of pressure fluctuations, with attention to the estimation of the streamwise and spanwise coherence lengths. The latter is required for estimates of aeroacoustic noise radiation by boundary layers and trailing-edge flows, but is also of interest for vibro-structural problems. The pressure field is obtained by solving the Poisson equation for incompressible flows, where the source terms are provided by the time-resolved velocity field measurements. The measured 3D velocity data are compared to results obtained from planar PIV and a Direct Numerical Simulation (DNS) at a similar Reynolds number. An improved method for the estimation of the material derivative, based on a least-squares estimator of the velocity derivative along a particle trajectory, is proposed and applied. Computed surface pressure fluctuations are further verified by means of simultaneous measurements with a pinhole microphone and compared to the DNS results and a semi-empirical model available from the literature. The correlation coefficient for the reconstructed pressure time series with respect to the pinhole microphone measurements attains approximately 0.5 for the band-pass filtered signal over the range of frequencies resolved by the velocity field measurements. Scaled power spectra of the pressure at a single point compare favorably to the DNS results and those available from the literature. Finally, the coherence of surface pressure fluctuations and the resulting spanwise and streamwise coherence lengths are estimated and compared to semi-empirical models and DNS results.
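
    A minimal sketch of how a spanwise coherence length can be estimated from reconstructed wall-pressure time series; the nperseg value, the reference-position choice, and the definition of the length as the separation integral of the root coherence are illustrative and may differ from the estimator used in the paper:

        import numpy as np
        from scipy.signal import coherence

        def spanwise_coherence_length(p_signals, dz, fs, f_target):
            """Coherence length at one frequency from pressure series on a spanwise line.

            p_signals: (n_z, n_t) wall-pressure time series at positions spaced by dz [m].
            Returns l_z(f_target) = integral over separation of sqrt(gamma^2), using the
            first position as the reference signal.
            """
            gamma_sq = np.zeros(p_signals.shape[0])
            for k in range(p_signals.shape[0]):
                f, Cxy = coherence(p_signals[0], p_signals[k], fs=fs, nperseg=256)
                gamma_sq[k] = np.interp(f_target, f, Cxy)
            return np.trapz(np.sqrt(gamma_sq), dx=dz)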