Medical Imaging Image Quality Assessment with Monte Carlo Methods
Michail, C. M.; Karpetas, G. E.; Fountos, G. P.; Kalyvas, N. I.; Martini, Niki; Koukou, Vaia; Valais, I. G.; Kandarakis, I. S.
2015-09-01
The aim of the present study was to assess image quality of PET scanners through a thin layer chromatography (TLC) plane source. The source was simulated using a previously validated Monte Carlo model. The model was developed by using the GATE MC package and reconstructed images obtained with the STIR software for tomographic image reconstruction, with cluster computing. The PET scanner simulated in this study was the GE DiscoveryST. A plane source consisted of a TLC plate, was simulated by a layer of silica gel on aluminum (Al) foil substrates, immersed in 18F-FDG bath solution (1MBq). Image quality was assessed in terms of the Modulation Transfer Function (MTF). MTF curves were estimated from transverse reconstructed images of the plane source. Images were reconstructed by the maximum likelihood estimation (MLE)-OSMAPOSL algorithm. OSMAPOSL reconstruction was assessed by using various subsets (3 to 21) and iterations (1 to 20), as well as by using various beta (hyper) parameter values. MTF values were found to increase up to the 12th iteration whereas remain almost constant thereafter. MTF improves by using lower beta values. The simulated PET evaluation method based on the TLC plane source can be also useful in research for the further development of PET and SPECT scanners though GATE simulations.
Monte Carlo simulation of PET images for injection dose optimization
Czech Academy of Sciences Publication Activity Database
Boldyš, Jiří; Dvořák, Jiří; Bělohlávek, O.; Skopalová, M.
London : Taylor and Francis, 2011 - (Manuel, J.; Tavares, R.; Jorge, N.), s. 1-6 ISBN 978-0-415-68395-1. [VipIMAGE 2011 - third ECCOMAS thematic conference on computational vision and medical image processing. Olhao, Algarve (PT), 12.10.2011-14.10.2011] R&D Projects: GA MŠk(CZ) 1M0572 Institutional research plan: CEZ:AV0Z10750506 Keywords : positron emission tomography * Monte Carlo simulation * biological system modeling * image quality Subject RIV: BD - Theory of Information http://library.utia.cas.cz/separaty/2012/ZOI/boldys-monte carlo simulation of pet images for injection dose optimization.pdf
Monte Carlo studies for medical imaging detector optimization
Fois, G. R.; Cisbani, E.; Garibaldi, F.
2016-02-01
This work reports on the Monte Carlo optimization studies of detection systems for Molecular Breast Imaging with radionuclides and Bremsstrahlung Imaging in nuclear medicine. Molecular Breast Imaging requires competing performances of the detectors: high efficiency and high spatial resolutions; in this direction, it has been proposed an innovative device which combines images from two different, and somehow complementary, detectors at the opposite sides of the breast. The dual detector design allows for spot compression and improves significantly the performance of the overall system if all components are well tuned, layout and processing carefully optimized; in this direction the Monte Carlo simulation represents a valuable tools. In recent years, Bremsstrahlung Imaging potentiality in internal radiotherapy (with beta-radiopharmaceuticals) has been clearly emerged; Bremsstrahlung Imaging is currently performed with existing detector generally used for single photon radioisotopes. We are evaluating the possibility to adapt an existing compact gamma camera and optimize by Monte Carlo its performance for Bremsstrahlung imaging with photons emitted by the beta- from 90 Y.
Relevance of accurate Monte Carlo modeling in nuclear medical imaging
Zaidi, H
1999-01-01
Monte Carlo techniques have become popular in different areas of medical physics with advantage of powerful computing systems. In particular, they have been extensively applied to simulate processes involving random behavior and to quantify physical parameters that are difficult or even impossible to calculate by experimental measurements. Recent nuclear medical imaging innovations such as single-photon emission computed tomography (SPECT), positron emission tomography (PET), and multiple emission tomography (MET) are ideal for Monte Carlo modeling techniques because of the stochastic nature of radiation emission, transport and detection processes. Factors which have contributed to the wider use include improved models of radiation transport processes, the practicality of application with the development of acceleration schemes and the improved speed of computers. This paper presents derivation and methodological basis for this approach and critically reviews their areas of application in nuclear imaging. An ...
GPU based Monte Carlo for PET image reconstruction: detector modeling
International Nuclear Information System (INIS)
Monte Carlo (MC) calculations and Graphical Processing Units (GPUs) are almost like the dedicated hardware designed for the specific task given the similarities between visible light transport and neutral particle trajectories. A GPU based MC gamma transport code has been developed for Positron Emission Tomography iterative image reconstruction calculating the projection from unknowns to data at each iteration step taking into account the full physics of the system. This paper describes the simplified scintillation detector modeling and its effect on convergence. (author)
Monte Carlo simulation of PET images for injection doseoptimization
Czech Academy of Sciences Publication Activity Database
Boldyš, Jiří; Dvořák, Jiří; Skopalová, M.; Bělohlávek, O.
2013-01-01
Roč. 29, č. 9 (2013), s. 988-999. ISSN 2040-7939 R&D Projects: GA MŠk 1M0572 Institutional support: RVO:67985556 Keywords : positron emission tomography * Monte Carlo simulation * biological system modeling * image quality Subject RIV: FD - Oncology ; Hematology Impact factor: 1.542, year: 2013 http://library.utia.cas.cz/separaty/2013/ZOI/boldys-0397175.pdf
Image reconstruction using Monte Carlo simulation and artificial neural networks
International Nuclear Information System (INIS)
PET data sets are subject to two types of distortions during acquisition: the imperfect response of the scanner and attenuation and scattering in the active distribution. In addition, the reconstruction of voxel images from the line projections composing a data set can introduce artifacts. Monte Carlo simulation provides a means for modeling the distortions and artificial neural networks a method for correcting for them as well as minimizing artifacts. (author) figs., tab., refs
Microscopic imaging through turbid media Monte Carlo modeling and applications
Gu, Min; Deng, Xiaoyuan
2015-01-01
This book provides a systematic introduction to the principles of microscopic imaging through tissue-like turbid media in terms of Monte-Carlo simulation. It describes various gating mechanisms based on the physical differences between the unscattered and scattered photons and method for microscopic image reconstruction, using the concept of the effective point spread function. Imaging an object embedded in a turbid medium is a challenging problem in physics as well as in biophotonics. A turbid medium surrounding an object under inspection causes multiple scattering, which degrades the contrast, resolution and signal-to-noise ratio. Biological tissues are typically turbid media. Microscopic imaging through a tissue-like turbid medium can provide higher resolution than transillumination imaging in which no objective is used. This book serves as a valuable reference for engineers and scientists working on microscopy of tissue turbid media.
Monte Carlo simulations in small animal PET imaging
International Nuclear Information System (INIS)
This work is based on the use of an implemented Positron Emission Tomography (PET) simulation system dedicated for small animal PET imaging. Geant4 Application for Tomographic Emission (GATE), a Monte Carlo simulation platform based on the Geant4 libraries, is well suited for modeling the microPET FOCUS system and to implement realistic phantoms, such as the MOBY phantom, and data maps from real examinations. The use of a microPET FOCUS simulation model with GATE has been validated for spatial resolution, counting rates performances, imaging contrast recovery and quantitative analysis. Results from realistic studies of the mouse body using -F and [18F]FDG imaging protocols are presented. These simulations include the injection of realistic doses into the animal and realistic time framing. The results have shown that it is possible to simulate small animal PET acquisitions under realistic conditions, and are expected to be useful to improve the quantitative analysis in PET mouse body studies
Monte Carlo simulations in small animal PET imaging
Energy Technology Data Exchange (ETDEWEB)
Branco, Susana [Universidade de Lisboa, Faculdade de Ciencias, Instituto de Biofisica e Engenharia Biomedica, Lisbon (Portugal)], E-mail: susana.silva@fc.ul.pt; Jan, Sebastien [Service Hospitalier Frederic Joliot, CEA/DSV/DRM, Orsay (France); Almeida, Pedro [Universidade de Lisboa, Faculdade de Ciencias, Instituto de Biofisica e Engenharia Biomedica, Lisbon (Portugal)
2007-10-01
This work is based on the use of an implemented Positron Emission Tomography (PET) simulation system dedicated for small animal PET imaging. Geant4 Application for Tomographic Emission (GATE), a Monte Carlo simulation platform based on the Geant4 libraries, is well suited for modeling the microPET FOCUS system and to implement realistic phantoms, such as the MOBY phantom, and data maps from real examinations. The use of a microPET FOCUS simulation model with GATE has been validated for spatial resolution, counting rates performances, imaging contrast recovery and quantitative analysis. Results from realistic studies of the mouse body using {sup -}F and [{sup 18}F]FDG imaging protocols are presented. These simulations include the injection of realistic doses into the animal and realistic time framing. The results have shown that it is possible to simulate small animal PET acquisitions under realistic conditions, and are expected to be useful to improve the quantitative analysis in PET mouse body studies.
Monte Carlo simulations of landmine detection using neutron backscattering imaging
Energy Technology Data Exchange (ETDEWEB)
Datema, Cor P. E-mail: c.datema@iri.tudelft.nl; Bom, Victor R.; Eijk, Carel W.E. van
2003-11-01
Neutron backscattering is a technique that has successfully been applied to the detection of non-metallic landmines. Most of the effort in this field has concentrated on single detectors that are scanned across the soil. Here, two new approaches are presented in which a two-dimensional image of the hydrogen distribution in the soil is made. The first method uses an array of position-sensitive {sup 3}He-tubes that is placed in close proximity of the soil. The second method is based on coded aperture imaging. Here, thermal neutrons from the soil are projected onto a detector which is typically placed one to several meters above the soil. Both methods use a pulsed D/D neutron source. The Monte Carlo simulation package GEANT 4 was used to investigate the performance of both imaging systems.
Robust Image Denoising using a Virtual Flash Image for Monte Carlo Ray Tracing
DEFF Research Database (Denmark)
Moon, Bochang; Jun, Jong Yun; Lee, JongHyeob;
2013-01-01
We propose an efficient and robust image-space denoising method for noisy images generated by Monte Carlo ray tracing methods. Our method is based on two new concepts: virtual flash images and homogeneous pixels. Inspired by recent developments in flash photography, virtual flash images emulate...... parameters. To highlight the benefits of our method, we apply our method to two Monte Carlo ray tracing methods, photon mapping and path tracing, with various input scenes. We demonstrate that using virtual flash images and homogeneous pixels with a standard denoising method outperforms state...... values. While denoising each pixel, we consider only homogeneous pixels—pixels that are statistically equivalent to each other. This makes it possible to define a stochastic error bound of our method, and this bound goes to zero as the number of ray samples goes to infinity, irrespective of denoising...
Performance of three-photon PET imaging: Monte Carlo simulations
Kacperski, Krzysztof; Spyrou, Nicholas M.
2005-12-01
We have recently introduced the idea of making use of three-photon positron annihilations in positron emission tomography. In this paper, the basic characteristics of the three-gamma imaging in PET are studied by means of Monte Carlo simulations and analytical computations. Two typical configurations of human and small animal scanners are considered. Three-photon imaging requires high-energy resolution detectors. Parameters currently attainable by CdZnTe semiconductor detectors, the technology of choice for the future development of radiation imaging, are assumed. Spatial resolution is calculated as a function of detector energy resolution and size, position in the field of view, scanner size and the energies of the three-gamma annihilation photons. Possible ways to improve the spatial resolution obtained for nominal parameters, 1.5 cm and 3.2 mm FWHM for human and small animal scanners, respectively, are indicated. Counting rates of true and random three-photon events for typical human and small animal scanning configurations are assessed. A simple formula for minimum size of lesions detectable in the three-gamma based images is derived. Depending on the contrast and total number of registered counts, lesions of a few mm size for human and sub mm for small animal scanners can be detected.
GPU based Monte Carlo for PET image reconstruction: parameter optimization
International Nuclear Information System (INIS)
This paper presents the optimization of a fully Monte Carlo (MC) based iterative image reconstruction of Positron Emission Tomography (PET) measurements. With our MC re- construction method all the physical effects in a PET system are taken into account thus superior image quality is achieved in exchange for increased computational effort. The method is feasible because we utilize the enormous processing power of Graphical Processing Units (GPUs) to solve the inherently parallel problem of photon transport. The MC approach regards the simulated positron decays as samples in mathematical sums required in the iterative reconstruction algorithm, so to complement the fast architecture, our work of optimization focuses on the number of simulated positron decays required to obtain sufficient image quality. We have achieved significant results in determining the optimal number of samples for arbitrary measurement data, this allows to achieve the best image quality with the least possible computational effort. Based on this research recommendations can be given for effective partitioning of computational effort into the iterations in limited time reconstructions. (author)
Performance of three-photon PET imaging: Monte Carlo simulations
Kacperski, K; Kacperski, Krzysztof; Spyrou, Nicholas M.
2005-01-01
We have recently introduced the idea of making use of three-photon positron annihilations in positron emission tomography. In this paper the basic characteristics of the three-gamma imaging in PET are studied by means of Monte Carlo simulations and analytical computations. Two typical configurations of human and small animal scanners are considered. Three-photon imaging requires high energy resolution detectors. Parameters currently attainable by CdZnTe semiconductor detectors, the technology of choice for the future development of radiation imaging, are assumed. Spatial resolution is calculated as a function of detector energy resolution and size, position in the field of view, scanner size, and the energies of the three gamma annihilation photons. Possible ways to improve the spatial resolution obtained for nominal parameters: 1.5 cm and 3.2 mm FWHM for human and small animal scanners, respectively, are indicated. Counting rates of true and random three-photon events for typical human and small animal scann...
Effect of paper porosity on OCT images: Monte Carlo study
Kirillin, Mikhail Yu.; Priezzhev, Alexander V.; Myllylä, Risto
2008-06-01
Non-invasive measurement of paper porosity is an important problem for papermaking industry. Presently used techniques are invasive and require long time for processing the sample. In recent years optical coherence tomography (OCT) has been proved to be an effective tool for non-invasive study of optically non-uniform scattering media including paper. The aim of present work is to study the potential ability of OCT for sensing the porosity of a paper sample by means of numerical simulations. The paper sample is characterized by variation of porosity along the sample while numerical simulations allow one to consider the samples with constant porosity which is useful for evaluation of the technique abilities. The calculations were performed implementing Monte Carlo-based technique developed earlier for simulation of OCT signals from multilayer paper models. A 9-layer model of paper consisting of five fiber layers and four air layers with non-planar boundaries was considered. The porosity of the samples was varied from 30 to 80% by varying the thicknesses of the layers. The simulations were performed for model paper samples without and with optical clearing agents (benzyl alcohol, 1-pentanol, isopropanol) applied. It was shown that the simulated OCT images of model paper with various porosities significantly differ revealing the potentiality of the OCT technique for sensing the porosity. When obtaining the images of paper samples with optical clearing agents applied, the inner structure of the samples is also revealed providing additional information about the samples under study.
Monte Carlo simulation of the image formation process in portal imaging
International Nuclear Information System (INIS)
We have written Monte Carlo programs to simulate the formation of radiological images. Our code is used to propagate a simulated x-ray fluence through each component of an existing video-based portal imaging system. This simulated fluence consists of a 512x512 pixel image containing both contrast-detail patterns as well as checker patterns to assess spatial resolution of the simulated portal imager. All of the components of the portal imaging system were modeled as a cascade of eight linear stages. Using this code, one can assess the visual impact of changing components in the imaging chain by changing the appropriate probability density function. Virtual experiments were performed to assess the visual impact of replacing the lens and TV camera by an amorphous silicon array, and the effect of scattered radiation on portal images
Monte Carlo modeling of ultrasound probes for image guided radiotherapy
Energy Technology Data Exchange (ETDEWEB)
Bazalova-Carter, Magdalena, E-mail: bazalova@uvic.ca [Department of Radiation Oncology, Stanford University, Stanford, California 94305 and Department of Physics and Astronomy, University of Victoria, Victoria, British Columbia V8W 2Y2 (Canada); Schlosser, Jeffrey [SoniTrack Systems, Inc., Palo Alto, California 94304 (United States); Chen, Josephine [Department of Radiation Oncology, UCSF, San Francisco, California 94143 (United States); Hristov, Dimitre [Department of Radiation Oncology, Stanford University, Stanford, California 94305 (United States)
2015-10-15
Purpose: To build Monte Carlo (MC) models of two ultrasound (US) probes and to quantify the effect of beam attenuation due to the US probes for radiation therapy delivered under real-time US image guidance. Methods: MC models of two Philips US probes, an X6-1 matrix-array transducer and a C5-2 curved-array transducer, were built based on their megavoltage (MV) CT images acquired in a Tomotherapy machine with a 3.5 MV beam in the EGSnrc, BEAMnrc, and DOSXYZnrc codes. Mass densities in the probes were assigned based on an electron density calibration phantom consisting of cylinders with mass densities between 0.2 and 8.0 g/cm{sup 3}. Beam attenuation due to the US probes in horizontal (for both probes) and vertical (for the X6-1 probe) orientation was measured in a solid water phantom for 6 and 15 MV (15 × 15) cm{sup 2} beams with a 2D ionization chamber array and radiographic films at 5 cm depth. The MC models of the US probes were validated by comparison of the measured dose distributions and dose distributions predicted by MC. Attenuation of depth dose in the (15 × 15) cm{sup 2} beams and small circular beams due to the presence of the probes was assessed by means of MC simulations. Results: The 3.5 MV CT number to mass density calibration curve was found to be linear with R{sup 2} > 0.99. The maximum mass densities in the X6-1 and C5-2 probes were found to be 4.8 and 5.2 g/cm{sup 3}, respectively. Dose profile differences between MC simulations and measurements of less than 3% for US probes in horizontal orientation were found, with the exception of the penumbra region. The largest 6% dose difference was observed in dose profiles of the X6-1 probe placed in vertical orientation, which was attributed to inadequate modeling of the probe cable. Gamma analysis of the simulated and measured doses showed that over 96% of measurement points passed the 3%/3 mm criteria for both probes placed in horizontal orientation and for the X6-1 probe in vertical orientation. The
TH-E-18A-01: Developments in Monte Carlo Methods for Medical Imaging
Energy Technology Data Exchange (ETDEWEB)
Badal, A [U.S. Food and Drug Administration (CDRH/OSEL), Silver Spring, MD (United States); Zbijewski, W [Johns Hopkins University, Baltimore, MD (United States); Bolch, W [University of Florida, Gainesville, FL (United States); Sechopoulos, I [Emory University, Atlanta, GA (United States)
2014-06-15
Monte Carlo simulation methods are widely used in medical physics research and are starting to be implemented in clinical applications such as radiation therapy planning systems. Monte Carlo simulations offer the capability to accurately estimate quantities of interest that are challenging to measure experimentally while taking into account the realistic anatomy of an individual patient. Traditionally, practical application of Monte Carlo simulation codes in diagnostic imaging was limited by the need for large computational resources or long execution times. However, recent advancements in high-performance computing hardware, combined with a new generation of Monte Carlo simulation algorithms and novel postprocessing methods, are allowing for the computation of relevant imaging parameters of interest such as patient organ doses and scatter-to-primaryratios in radiographic projections in just a few seconds using affordable computational resources. Programmable Graphics Processing Units (GPUs), for example, provide a convenient, affordable platform for parallelized Monte Carlo executions that yield simulation times on the order of 10{sup 7} xray/ s. Even with GPU acceleration, however, Monte Carlo simulation times can be prohibitive for routine clinical practice. To reduce simulation times further, variance reduction techniques can be used to alter the probabilistic models underlying the x-ray tracking process, resulting in lower variance in the results without biasing the estimates. Other complementary strategies for further reductions in computation time are denoising of the Monte Carlo estimates and estimating (scoring) the quantity of interest at a sparse set of sampling locations (e.g. at a small number of detector pixels in a scatter simulation) followed by interpolation. Beyond reduction of the computational resources required for performing Monte Carlo simulations in medical imaging, the use of accurate representations of patient anatomy is crucial to the
TH-E-18A-01: Developments in Monte Carlo Methods for Medical Imaging
International Nuclear Information System (INIS)
Monte Carlo simulation methods are widely used in medical physics research and are starting to be implemented in clinical applications such as radiation therapy planning systems. Monte Carlo simulations offer the capability to accurately estimate quantities of interest that are challenging to measure experimentally while taking into account the realistic anatomy of an individual patient. Traditionally, practical application of Monte Carlo simulation codes in diagnostic imaging was limited by the need for large computational resources or long execution times. However, recent advancements in high-performance computing hardware, combined with a new generation of Monte Carlo simulation algorithms and novel postprocessing methods, are allowing for the computation of relevant imaging parameters of interest such as patient organ doses and scatter-to-primaryratios in radiographic projections in just a few seconds using affordable computational resources. Programmable Graphics Processing Units (GPUs), for example, provide a convenient, affordable platform for parallelized Monte Carlo executions that yield simulation times on the order of 107 xray/ s. Even with GPU acceleration, however, Monte Carlo simulation times can be prohibitive for routine clinical practice. To reduce simulation times further, variance reduction techniques can be used to alter the probabilistic models underlying the x-ray tracking process, resulting in lower variance in the results without biasing the estimates. Other complementary strategies for further reductions in computation time are denoising of the Monte Carlo estimates and estimating (scoring) the quantity of interest at a sparse set of sampling locations (e.g. at a small number of detector pixels in a scatter simulation) followed by interpolation. Beyond reduction of the computational resources required for performing Monte Carlo simulations in medical imaging, the use of accurate representations of patient anatomy is crucial to the virtual
Monte Carlo simulation of gamma ray tomography for image reconstruction
International Nuclear Information System (INIS)
The Monte Carlo simulations of known density and shape object was validate with Gamma Ray Tomography in static experiments. An aluminum half-moon piece placed inside a steel pipe was the MC simulation test object that was also measured by means of gamma ray transmission. Wall effect of the steel pipe due to irradiation geometry in a single pair source-detector tomography was evaluated by comparison with theoretical data. MCNPX code requires a defined geometry to each photon trajectory which practically prevents this usage for tomography reconstruction simulation. The solution was found by writing a program in Delphi language to create input files automation code. Simulations of tomography data by automated MNCPX code were carried out and validated by experimental data. Working in this sequence the produced data needed a databank to be stored. Experimental setup used a Cesium-137 isotopic radioactive source (7.4 × 109 Bq), and NaI(Tl) scintillation detector of (51 × 51) × 10−3 m crystal size coupled to a multichannel analyzer. A stainless steel tubes of 0,154 m internal diameter, 0.014 m thickness wall. The results show that the MCNPX simulation code adapted to automated input file is useful for generating a matrix data M(θ,t), of a computerized gamma ray tomography for any known density and regular shape object. Experimental validation used RMSE from gamma ray paths and from attenuation coefficient data. (author)
Monte Carlo simulation of gamma ray tomography for image reconstruction
Energy Technology Data Exchange (ETDEWEB)
Guedes, Karlos A.N.; Moura, Alex; Dantas, Carlos; Melo, Silvio; Lima, Emerson, E-mail: karlosguedes@hotmail.com [Universidade Federal de Pernambuco (UFPE), Recife, PE (Brazil); Meric, Ilker [University of Bergen (Norway)
2015-07-01
The Monte Carlo simulations of known density and shape object was validate with Gamma Ray Tomography in static experiments. An aluminum half-moon piece placed inside a steel pipe was the MC simulation test object that was also measured by means of gamma ray transmission. Wall effect of the steel pipe due to irradiation geometry in a single pair source-detector tomography was evaluated by comparison with theoretical data. MCNPX code requires a defined geometry to each photon trajectory which practically prevents this usage for tomography reconstruction simulation. The solution was found by writing a program in Delphi language to create input files automation code. Simulations of tomography data by automated MNCPX code were carried out and validated by experimental data. Working in this sequence the produced data needed a databank to be stored. Experimental setup used a Cesium-137 isotopic radioactive source (7.4 × 109 Bq), and NaI(Tl) scintillation detector of (51 × 51) × 10−3 m crystal size coupled to a multichannel analyzer. A stainless steel tubes of 0,154 m internal diameter, 0.014 m thickness wall. The results show that the MCNPX simulation code adapted to automated input file is useful for generating a matrix data M(θ,t), of a computerized gamma ray tomography for any known density and regular shape object. Experimental validation used RMSE from gamma ray paths and from attenuation coefficient data. (author)
Energy Technology Data Exchange (ETDEWEB)
Cirrone, G.A.P., E-mail: cirrone@lns.infn.it [Laboratori Nazionali del Sud - National Instiute for Nuclear Physics INFN (INFN-LNS), Via S.Sofia 64, 95100 Catania (Italy); Bucciolini, M. [Department of ' Fisiopatologia Clinica' , University of Florence, V.le Morgagni 85, I-50134 Florence (Italy); Bruzzi, M. [Energetic Department, University of Florence, Via S. Marta 3, I-50139 Florence (Italy); Candiano, G. [Laboratorio di Tecnologie Oncologiche HSR, Giglio Contrada, Pietrapollastra-Pisciotto, 90015 Cefalu, Palermo (Italy); Civinini, C. [National Institute for Nuclear Physics INFN, Section of Florence, Via G. Sansone 1, Sesto Fiorentino, I-50019 Florence (Italy); Cuttone, G. [Laboratori Nazionali del Sud - National Instiute for Nuclear Physics INFN (INFN-LNS), Via S.Sofia 64, 95100 Catania (Italy); Guarino, P. [Nuclear Engineering Department, University of Palermo, Via... Palermo (Italy); Laboratori Nazionali del Sud - National Instiute for Nuclear Physics INFN (INFN-LNS), Via S.Sofia 64, 95100 Catania (Italy); Lo Presti, D. [Physics Department, University of Catania, Via S. Sofia 64, I-95123, Catania (Italy); Mazzaglia, S.E. [Laboratori Nazionali del Sud - National Instiute for Nuclear Physics INFN (INFN-LNS), Via S.Sofia 64, 95100 Catania (Italy); Pallotta, S. [Department of ' Fisiopatologia Clinica' , University of Florence, V.le Morgagni 85, I-50134 Florence (Italy); Randazzo, N. [National Institute for Nuclear Physics INFN, Section of Catania, Via S.Sofia 64, 95123 Catania (Italy); Sipala, V. [National Institute for Nuclear Physics INFN, Section of Catania, Via S.Sofia 64, 95123 Catania (Italy); Physics Department, University of Catania, Via S. Sofia 64, I-95123, Catania (Italy); Stancampiano, C. [National Institute for Nuclear Physics INFN, Section of Catania, Via S.Sofia 64, 95123 Catania (Italy); and others
2011-12-01
In this paper the use of the Filtered Back Projection (FBP) Algorithm, in order to reconstruct tomographic images using the high energy (200-250 MeV) proton beams, is investigated. The algorithm has been studied in detail with a Monte Carlo approach and image quality has been analysed and compared with the total absorbed dose. A proton Computed Tomography (pCT) apparatus, developed by our group, has been fully simulated to exploit the power of the Geant4 Monte Carlo toolkit. From the simulation of the apparatus, a set of tomographic images of a test phantom has been reconstructed using the FBP at different absorbed dose values. The images have been evaluated in terms of homogeneity, noise, contrast, spatial and density resolution.
International Nuclear Information System (INIS)
In this paper the use of the Filtered Back Projection (FBP) Algorithm, in order to reconstruct tomographic images using the high energy (200-250 MeV) proton beams, is investigated. The algorithm has been studied in detail with a Monte Carlo approach and image quality has been analysed and compared with the total absorbed dose. A proton Computed Tomography (pCT) apparatus, developed by our group, has been fully simulated to exploit the power of the Geant4 Monte Carlo toolkit. From the simulation of the apparatus, a set of tomographic images of a test phantom has been reconstructed using the FBP at different absorbed dose values. The images have been evaluated in terms of homogeneity, noise, contrast, spatial and density resolution.
Pet image reconstruction with on-the-Fly Monte Carlo using GPU - 213
International Nuclear Information System (INIS)
Particle transport Monte Carlo is an inherently parallel algorithm and with all the further similarities to visible light ray-tracing it can easily be realized on current Graphics Processing Units (GPUs). This paper describes a GPU-based gamma photon transport code with application to the iterative image reconstruction steps of Positron Emission Tomography. The aim of the investigations here is the development of a Monte Carlo code capable of calculating the forward projection of each iteration step in the reconstruction with calculation times on the minute scale. Achieved simulation speed is in the order of 108 positrons per second. (authors)
Patient-specific CT dose determination from CT images using Monte Carlo simulations
Liang, Qing
Radiation dose from computed tomography (CT) has become a public concern with the increasing application of CT as a diagnostic modality, which has generated a demand for patient-specific CT dose determinations. This thesis work aims to provide a clinically applicable Monte-Carlo-based CT dose calculation tool based on patient CT images. The source spectrum was simulated based on half-value layer measurements. Analytical calculations along with the measured flux distribution were used to estimate the bowtie-filter geometry. Relative source output at different points in a cylindrical phantom was measured and compared with Monte Carlo simulations to verify the determined spectrum and bowtie-filter geometry. Sensitivity tests were designed with four spectra with the same kVp and different half-value layers, and showed that the relative output at different locations in a phantom is sensitive to different beam qualities. An mAs-to-dose conversion factor was determined with in-air measurements using an Exradin A1SL ionization chamber. Longitudinal dose profiles were measured with thermoluminescent dosimeters (TLDs) and compared with the Monte-Carlo-simulated dose profiles to verify the mAs-to-dose conversion factor. Using only the CT images to perform Monte Carlo simulations would cause dose underestimation due to the lack of a scatter region. This scenario was demonstrated with a cylindrical phantom study. Four different image extrapolation methods from the existing CT images and the Scout images were proposed. The results show that performing image extrapolation beyond the scan region improves the dose calculation accuracy under both step-shoot scan mode and helical scan mode. Two clinical studies were designed and comparisons were performed between the current CT dose metrics and the Monte-Carlo-based organ dose determination techniques proposed in this work. The results showed that the current CT dosimetry failed to show dose differences between patients with the same
International Nuclear Information System (INIS)
The resulting neutron captures in 10B are used for radiation therapy. The occurrence point of the characteristic 478 keV prompt gamma rays agrees with the neutron capture point. If these prompt gamma rays are detected by external instruments such as a gamma camera or single photon emission computed tomography (SPECT), the therapy region can be monitored during the treatment using images. A feasibility study and analysis of a reconstructed image using many projections (128) were conducted. The optimization of the detection system and a detailed neutron generator simulation were beyond the scope of this study. The possibility of extracting a 3D BNCT-SPECT image was confirmed using the Monte Carlo simulation and OSEM algorithm. The quality of the prompt gamma ray SPECT image obtained from BNCT was evaluated quantitatively using three different boron uptake regions and was shown to depend on the location and size relations. The prospects for obtaining an actual BNCT-SPECT image were also estimated from the quality of the simulated image and the simulation conditions. When multi tumor regions should be treated using the BNCT method, a reasonable model to determine how many useful images can be obtained from SPECT can be provided to the BNCT facilities based on the preceding imaging research. However, because the scope of this research was limited to checking the feasibility of 3D BNCT-SPECT image reconstruction using multiple projections, along with an evaluation of the image, some simulation conditions were taken from previous studies. In the future, a simulation will be conducted that includes optimized conditions for an actual BNCT facility, along with an imaging process for motion correction in BNCT. Although an excessively long simulation time was required to obtain enough events for image reconstruction, the feasibility of acquiring a 3D BNCT-SPECT image using multiple projections was confirmed using a Monte Carlo simulation, and a quantitative image analysis was
Validation of the Monte Carlo simulator GATE for indium-111 imaging.
Assié, K; Gardin, I; Véra, P; Buvat, I
2005-07-01
Monte Carlo simulations are useful for optimizing and assessing single photon emission computed tomography (SPECT) protocols, especially when aiming at measuring quantitative parameters from SPECT images. Before Monte Carlo simulated data can be trusted, the simulation model must be validated. The purpose of this work was to validate the use of GATE, a new Monte Carlo simulation platform based on GEANT4, for modelling indium-111 SPECT data, the quantification of which is of foremost importance for dosimetric studies. To that end, acquisitions of (111)In line sources in air and in water and of a cylindrical phantom were performed, together with the corresponding simulations. The simulation model included Monte Carlo modelling of the camera collimator and of a back-compartment accounting for photomultiplier tubes and associated electronics. Energy spectra, spatial resolution, sensitivity values, images and count profiles obtained for experimental and simulated data were compared. An excellent agreement was found between experimental and simulated energy spectra. For source-to-collimator distances varying from 0 to 20 cm, simulated and experimental spatial resolution differed by less than 2% in air, while the simulated sensitivity values were within 4% of the experimental values. The simulation of the cylindrical phantom closely reproduced the experimental data. These results suggest that GATE enables accurate simulation of (111)In SPECT acquisitions. PMID:15972984
The Influence of Void on Image Quality in Industrial SPECT: a Monte Carlo Study
International Nuclear Information System (INIS)
Recently, industrial SPECT was developed in the Korea Atomic Energy Research Institute (KAERI) to investigate the spatial distribution, mixing characteristics, and flow pattern of flow media in industrial process plants. To improve the image quality, various technique were proposed by Park et al., and the performance of industrial SPECT was evaluated by the Monte Carlo code and experiments under single phase flow case in the vessel. In practice, industrial flows are mixture of different phases with various specific purposes such as mass transfer, purification and etc. In the present study, industrial SPECT and homogeneous void were simulated by using Monte Carlo code in order to evaluate the effect of void in the vessel on image quality of industrial SPECT
International Nuclear Information System (INIS)
COG is a major multiparticle simulation code in the LLNL Monte Carlo radiation transport toolkit. It was designed to solve deep-penetration radiation shielding problems in arbitrarily complex 3D geometries, involving coupled transport of photons, neutrons, and electrons. COG was written to provide as much accuracy as the underlying cross-sections will allow, and has a number of variance-reduction features to speed computations. Recently COG has been applied to the simulation of high- resolution radiographs of complex objects and the evaluation of contraband detection schemes. In this paper we will give a brief description of the capabilities of the COG transport code and show several examples of neutron and gamma-ray imaging simulations. Keywords: Monte Carlo, radiation transport, simulated radiography, nonintrusive inspection, neutron imaging
Polarization imaging of multiply-scattered radiation based on integral-vector Monte Carlo method
International Nuclear Information System (INIS)
A new integral-vector Monte Carlo method (IVMCM) is developed to analyze the transfer of polarized radiation in 3D multiple scattering particle-laden media. The method is based on a 'successive order of scattering series' expression of the integral formulation of the vector radiative transfer equation (VRTE) for application of efficient statistical tools to improve convergence of Monte Carlo calculations of integrals. After validation against reference results in plane-parallel layer backscattering configurations, the model is applied to a cubic container filled with uniformly distributed monodispersed particles and irradiated by a monochromatic narrow collimated beam. 2D lateral images of effective Mueller matrix elements are calculated in the case of spherical and fractal aggregate particles. Detailed analysis of multiple scattering regimes, which are very similar for unpolarized radiation transfer, allows identifying the sensitivity of polarization imaging to size and morphology.
Feasibility Study of Neutron Dose for Real Time Image Guided Proton Therapy: A Monte Carlo Study
Kim, Jin Sung; Shin, Jung Suk; Kim, Daehyun; Shin, EunHyuk; Chung, Kwangzoo; Cho, Sungkoo; Ahn, Sung Hwan; Ju, Sanggyu; Chung, Yoonsun; Jung, Sang Hoon; Han, Youngyih
2015-01-01
Two full rotating gantry with different nozzles (Multipurpose nozzle with MLC, Scanning Dedicated nozzle) with conventional cyclotron system is installed and under commissioning for various proton treatment options at Samsung Medical Center in Korea. The purpose of this study is to investigate neutron dose equivalent per therapeutic dose, H/D, to x-ray imaging equipment under various treatment conditions with monte carlo simulation. At first, we investigated H/D with the various modifications...
Monte Carlo modeling of neutron imaging at the SINQ spallation source
International Nuclear Information System (INIS)
Modeling of the Swiss Spallation Neutron Source (SINQ) has been used to demonstrate the neutron radiography capability of the newly released MPI-version of the MCNPX Monte Carlo code. A detailed MCNPX model was developed of SINQ and its associated neutron transmission radiography (NEUTRA) facility. Preliminary validation of the model was performed by comparing the calculated and measured neutron fluxes in the NEUTRA beam line, and a simulated radiography image was generated for a sample consisting of steel tubes containing different materials. This paper describes the SINQ facility, provides details of the MCNPX model, and presents preliminary results of the neutron imaging. (authors)
Pinhole X-ray Fluorescence Imaging of Gadolinium Nanoparticles: A Preliminary Monte Carlo Study
Energy Technology Data Exchange (ETDEWEB)
Jung, Seong Moon; Sung, Won Mo; Ye, Sung Joon [Seoul National University, Seoul (Korea, Republic of)
2014-10-15
X-ray fluorescence imaging is a modality for the element-specific imaging of a subject through analysis of characteristic x-rays produced by exploiting the interaction of high atomic number elements and incoming x-rays. Previous studies have utilized a polychromatic x-ray source to investigate the production of in vivo x-ray fluorescence images for the assessment of concentrations and locations of gold nanoparticles. However, previous efforts have so far been unable to detect low concentrations, such as 0.001% gold by weight, which is an expected concentration accumulated in tumors. We examined the feasibility of a monochromatic synchrotron x-rays implementation of pinhole x-ray fluorescence imaging by Monte Carlo simulations using MCNP5. In the current study, gadolinium (Gd) nanoparticles, which have been widely used as a contrast agent in magnetic resonance imaging and also as a dose enhancer in radiation therapy, were chosen for tumor targeting. Since a monochromatic x-ray source is used, the increased x-ray fluorescence signals allow the detection of low concentrations of Gd. Two different monochromatic x-ray beam energies, 50.5 keV near the Kedge energy (i.e., 50.207 keV) of Gd and 55 keV, were compared by their respective imaging results. Using Monte Carlo simulations the feasibility of imaging low concentrations of Gd nanoparticles (e.g., 0.001 wt%) with x-ray fluorescence using monochromatic synchrotron x-rays of two different energies was shown. In the case of imaging a single Gd column inserted in the center of a water phantom, the fluorescence signals from 0.05 wt% and 0.1 wt% Gd columns irradiated with a 50.5 keV photon beam were higher than those irradiated with 55 keV. Below 0.05 wt% region no significant differences were found.
Pinhole X-ray Fluorescence Imaging of Gadolinium Nanoparticles: A Preliminary Monte Carlo Study
International Nuclear Information System (INIS)
X-ray fluorescence imaging is a modality for the element-specific imaging of a subject through analysis of characteristic x-rays produced by exploiting the interaction of high atomic number elements and incoming x-rays. Previous studies have utilized a polychromatic x-ray source to investigate the production of in vivo x-ray fluorescence images for the assessment of concentrations and locations of gold nanoparticles. However, previous efforts have so far been unable to detect low concentrations, such as 0.001% gold by weight, which is an expected concentration accumulated in tumors. We examined the feasibility of a monochromatic synchrotron x-rays implementation of pinhole x-ray fluorescence imaging by Monte Carlo simulations using MCNP5. In the current study, gadolinium (Gd) nanoparticles, which have been widely used as a contrast agent in magnetic resonance imaging and also as a dose enhancer in radiation therapy, were chosen for tumor targeting. Since a monochromatic x-ray source is used, the increased x-ray fluorescence signals allow the detection of low concentrations of Gd. Two different monochromatic x-ray beam energies, 50.5 keV near the Kedge energy (i.e., 50.207 keV) of Gd and 55 keV, were compared by their respective imaging results. Using Monte Carlo simulations the feasibility of imaging low concentrations of Gd nanoparticles (e.g., 0.001 wt%) with x-ray fluorescence using monochromatic synchrotron x-rays of two different energies was shown. In the case of imaging a single Gd column inserted in the center of a water phantom, the fluorescence signals from 0.05 wt% and 0.1 wt% Gd columns irradiated with a 50.5 keV photon beam were higher than those irradiated with 55 keV. Below 0.05 wt% region no significant differences were found
Novel imaging and quality assurance techniques for ion beam therapy a Monte Carlo study
Rinaldi, I; Jäkel, O; Mairani, A; Parodi, K
2010-01-01
Ion beams exhibit a finite and well defined range in matter together with an “inverted” depth-dose profile, the so-called Bragg peak. These favourable physical properties may enable superior tumour-dose conformality for high precision radiation therapy. On the other hand, they introduce the issue of sensitivity to range uncertainties in ion beam therapy. Although these uncertainties are typically taken into account when planning the treatment, correct delivery of the intended ion beam range has to be assured to prevent undesired underdosage of the tumour or overdosage of critical structures outside the target volume. Therefore, it is necessary to define dedicated Quality Assurance procedures to enable in-vivo range verification before or during therapeutic irradiation. For these purposes, Monte Carlo transport codes are very useful tools to support the development of novel imaging modalities for ion beam therapy. In the present work, we present calculations performed with the FLUKA Monte Carlo code and pr...
International Nuclear Information System (INIS)
The image acquisition methods applied to nuclear medicine and radiobiology are a valuable research study for determination of thyroid anatomy to seek disorders associated to follicular cells. The Monte Carlo (MC) simulation has also been used in problems related to radiation detection in order to map medical images since the improvement of data processing compatible with personnel computers (PC). This work presents an innovative study to find out the adequate scintillation inorganic detector array that could be coupled to a specific light photo sensor, a charge coupled device (CCD) through a fiber optic plate in order to map the follicles of thyroid gland. The goal is to choose the type of detector that fits the application suggested here with spatial resolution of 10 μm and good detector efficiency. The methodology results are useful to map a follicle image using gamma radiation emission. A source - detector simulation is performed by using a MCNP4B (Monte Carlo for Neutron Photon transport) general code considering different source energies, detector materials and geometries including pixel sizes and reflector types. The results demonstrate that by using MCNP4B code is possible to searching for useful parameters related to the systems used in nuclear medicine, specifically in radiobiology applied to endocrine physiology studies to acquiring thyroid follicles images. (author)
International Nuclear Information System (INIS)
This research thesis addresses the dosimetric control of radiotherapy treatments by using amorphous silicon digital portal imagery. In a first part, the author reports the analysis of the dosimetric abilities of the imager (iViewGT) which is used in the radiotherapy department. The stability of the imager response on a short and on a long term has been studied. A relationship between the image grey level and the dose has been established for a reference irradiation field. The influence of irradiation parameters on the grey level variation with respect to the dose has been assessed. The obtained results show the possibility to use this system for dosimetry provided that a precise calibration is performed while taking the most influencing irradiation parameters into account, i.e. photon beam nominal energy, field size, and patient thickness. The author reports the development of a Monte Carlo simulation to model the imager response. It models the accelerator head by a generalized source point. Space and energy distributions of photons are calculated. This modelling can also be applied to the calculation of dose distribution within a patient, or to study physical interactions in the accelerator head. Then, the author explores a new approach to dose portal image prediction within the frame of an in vivo dosimetric control. He computes the image transmitted through the patient by Monte Carlo simulation, and measures the portal image of the irradiation field without the patient. Validation experiments are reported, and problems to be solved are highlighted (computation time, improvement of the collimator simulation)
Coded aperture coherent scatter imaging for breast cancer detection: a Monte Carlo evaluation
Lakshmanan, Manu N.; Morris, Robert E.; Greenberg, Joel A.; Samei, Ehsan; Kapadia, Anuj J.
2016-03-01
It is known that conventional x-ray imaging provides a maximum contrast between cancerous and healthy fibroglandular breast tissues of 3% based on their linear x-ray attenuation coefficients at 17.5 keV, whereas coherent scatter signal provides a maximum contrast of 19% based on their differential coherent scatter cross sections. Therefore in order to exploit this potential contrast, we seek to evaluate the performance of a coded- aperture coherent scatter imaging system for breast cancer detection and investigate its accuracy using Monte Carlo simulations. In the simulations we modeled our experimental system, which consists of a raster-scanned pencil beam of x-rays, a bismuth-tin coded aperture mask comprised of a repeating slit pattern with 2-mm periodicity, and a linear-array of 128 detector pixels with 6.5-keV energy resolution. The breast tissue that was scanned comprised a 3-cm sample taken from a patient-based XCAT breast phantom containing a tomosynthesis- based realistic simulated lesion. The differential coherent scatter cross section was reconstructed at each pixel in the image using an iterative reconstruction algorithm. Each pixel in the reconstructed image was then classified as being either air or the type of breast tissue with which its normalized reconstructed differential coherent scatter cross section had the highest correlation coefficient. Comparison of the final tissue classification results with the ground truth image showed that the coded aperture imaging technique has a cancerous pixel detection sensitivity (correct identification of cancerous pixels), specificity (correctly ruling out healthy pixels as not being cancer) and accuracy of 92.4%, 91.9% and 92.0%, respectively. Our Monte Carlo evaluation of our experimental coded aperture coherent scatter imaging system shows that it is able to exploit the greater contrast available from coherently scattered x-rays to increase the accuracy of detecting cancerous regions within the breast.
Monte Carlo Modeling of Cascade Gamma Rays in 86Y PET imaging: Preliminary results
Zhu, Xuping; El Fakhri, Georges
2009-01-01
86Y is a PET agent that could be used as an ideal surrogate to allow personalized dosimetry in 90Y radionuclide therapy. However, 86Y also emits cascade gamma rays. We have developed a Monte Carlo program based on SimSET to model cascade gamma rays in PET imaging. The new simulation was validated with the GATE simulation package. Agreements within 15% were found in spatial resolution, apparent scatter fraction (ratio of coincidences outside peak regions in line source sinograms), singles and ...
Directory of Open Access Journals (Sweden)
Bruno MarceloGS
2008-01-01
Full Text Available We present in this paper a sequential Monte Carlo methodology for joint detection and tracking of a multiaspect target in image sequences. Unlike the traditional contact/association approach found in the literature, the proposed methodology enables integrated, multiframe target detection and tracking incorporating the statistical models for target aspect, target motion, and background clutter. Two implementations of the proposed algorithm are discussed using, respectively, a resample-move (RS particle filter and an auxiliary particle filter (APF. Our simulation results suggest that the APF configuration outperforms slightly the RS filter in scenarios of stealthy targets.
Monte Carlo simulation of grating-based neutron phase contrast imaging at CPHS
International Nuclear Information System (INIS)
Since the launching of the Compact Pulsed Hadron Source (CPHS) project of Tsinghua University in 2009, works have begun on the design and engineering of an imaging/radiography instrument for the neutron source provided by CPHS. The instrument will perform basic tasks such as transmission imaging and computerized tomography. Additionally, we include in the design the utilization of coded-aperture and grating-based phase contrast methodology, as well as the options of prompt gamma-ray analysis and neutron-energy selective imaging. Previously, we had implemented the hardware and data-analysis software for grating-based X-ray phase contrast imaging. Here, we investigate Geant4-based Monte Carlo simulations of neutron refraction phenomena and then model the grating-based neutron phase contrast imaging system according to the classic-optics-based method. The simulated experimental results of the retrieving phase shift gradient information by five-step phase-stepping approach indicate the feasibility of grating-based neutron phase contrast imaging as an option for the cold neutron imaging instrument at the CPHS.
International Nuclear Information System (INIS)
Full text: Medical imaging provides two-dimensional pictures of the human internal anatomy from which may be constructed a three-dimensional model of organs and tissues suitable for calculation of dose from radiation. Diagnostic CT provides the greatest exposure to radiation per examination and the frequency of CT examination is high. Esti mates of dose from diagnostic radiography are still determined from data derived from geometric models (rather than anatomical models), models scaled from adult bodies (rather than bodies of children) and CT scanner hardware that is no longer used. The aim of anatomical modelling is to produce a mathematical representation of internal anatomy that has organs of realistic size, shape and positioning. The organs and tissues are represented by a great many cuboidal volumes (voxels). The conversion of medical images to voxels is called segmentation and on completion every pixel in an image is assigned to a tissue or organ. Segmentation is time consuming. An image processing pack age is used to identify organ boundaries in each image. Thirty to forty tomographic voxel models of anatomy have been reported in the literature. Each model is of an individual, or a composite from several individuals. Images of children are particularly scarce. So there remains a need for more paediatric anatomical models. I am working on segmenting ''William'' who is 368 PET-CT images from head to toe of a seven year old boy. William will be used for Monte Carlo dose calculations of dose from CT examination using a simulated modern CT scanner.
Medical images of patients in voxel structures in high resolution for Monte Carlo simulation
Energy Technology Data Exchange (ETDEWEB)
Boia, Leonardo S.; Menezes, Artur F.; Silva, Ademir X., E-mail: lboia@con.ufrj.b, E-mail: ademir@con.ufrj.b [Universidade Federal do Rio de Janeiro (PEN/COPPE/UFRJ), RJ (Brazil). Coordenacao dos Programas de Pos-Graduacao de Engenharia. Programa de Engenharia Nuclear; Salmon Junior, Helio A. [Clinicas Oncologicas Integradas (COI), Rio de Janeiro, RJ (Brazil)
2011-07-01
This work aims to present a computational process of conversion of tomographic and MRI medical images from patients in voxel structures to an input file, which will be manipulated in Monte Carlo Simulation code for tumor's radiotherapic treatments. The problem's scenario inherent to the patient is simulated by such process, using the volume element (voxel) as a unit of computational tracing. The head's voxel structure geometry has voxels with volumetric dimensions around 1 mm{sup 3} and a population of millions, which helps - in that way, for a realistic simulation and a decrease in image's digital process techniques for adjustments and equalizations. With such additional data from the code, a more critical analysis can be developed in order to determine the volume of the tumor, and the protection, beside the patients' medical images were borrowed by Clinicas Oncologicas Integradas (COI/RJ), joined to the previous performed planning. In order to execute this computational process, SAPDI computational system is used in a digital image process for optimization of data, conversion program Scan2MCNP, which manipulates, processes, and converts the medical images into voxel structures to input files and the graphic visualizer Moritz for the verification of image's geometry placing. (author)
Medical images of patients in voxel structures in high resolution for Monte Carlo simulation
International Nuclear Information System (INIS)
This work aims to present a computational process of conversion of tomographic and MRI medical images from patients in voxel structures to an input file, which will be manipulated in Monte Carlo Simulation code for tumor's radiotherapic treatments. The problem's scenario inherent to the patient is simulated by such process, using the volume element (voxel) as a unit of computational tracing. The head's voxel structure geometry has voxels with volumetric dimensions around 1 mm3 and a population of millions, which helps - in that way, for a realistic simulation and a decrease in image's digital process techniques for adjustments and equalizations. With such additional data from the code, a more critical analysis can be developed in order to determine the volume of the tumor, and the protection, beside the patients' medical images were borrowed by Clinicas Oncologicas Integradas (COI/RJ), joined to the previous performed planning. In order to execute this computational process, SAPDI computational system is used in a digital image process for optimization of data, conversion program Scan2MCNP, which manipulates, processes, and converts the medical images into voxel structures to input files and the graphic visualizer Moritz for the verification of image's geometry placing. (author)
Badal, Andreu; Kyprianou, Iacovos; Badano, Aldo; Sempau, Josep; Myers, Kyle J.
2007-03-01
X-ray imaging system optimization increases the benefit-to-cost ratio by reducing the radiation dose to the patient while maximizing image quality. We present a new simulation tool for the generation of realistic medical x-ray images for assessment and optimization of complete imaging systems. The Monte Carlo code simulates radiation transport physics using the subroutine package PENELOPE, which accurately simulates the transport of electrons and photons within the typical medical imaging energy range. The new code implements a novel object-oriented geometry package that allows simulations with homogeneous objects of arbitrary shapes described by triangle meshes. The flexibility of this code, which uses the industry standard PLY input-file format, allows the use of detailed anatomical models developed using computer-aided design tools applied to segmented CT and MRI data. The use of triangle meshes highly simplifies the ray-tracing algorithm without reducing the generality of the code, since most surface models can be tessellated into triangles while retaining their geometric details. Our algorithm incorporates an octree spatial data structure to sort the triangles and accelerate the simulation, reaching execution speeds comparable to the original quadric geometry model of PENELOPE. Coronary angiograms were simulated using a tessellated version of the NURBS-based Cardiac-Torso (NCAT) phantom. The phantom models 330 objects, comprised in total of 5 million triangles. The dose received by each organ and the contribution of the different scattering processes to the final image were studied in detail.
Energy Technology Data Exchange (ETDEWEB)
Descalle, M-A; Chuang, C; Pouliot, J
2002-01-30
Patient positioning accuracy remains an issue for external beam radiotherapy. Currently, kilovoltage verification images are used as reference by clinicians to compare the actual patient treatment position with the planned position. These images are qualitatively different from treatment-time megavoltage portal images. This study will investigate the feasibility of using PEREGRINE, a 3D Monte Carlo calculation engine, to create reference images for portal image comparisons. Portal images were acquired using an amorphous-silicon flat-panel EPID for (1) the head and pelvic sections of an anthropomorphic phantom with 7-8 mm displacements applied, and (2) a prostate patient on five treatment days. Planning CT scans were used to generate simulated reference images with PEREGRINE. A correlation algorithm quantified the setup deviations between simulated and portal images. Monte Carlo simulated images exhibit similar qualities to portal images, the phantom slabs appear clearly. Initial positioning differences and applied displacements were detected and quantified. We find that images simulated with Monte Carlo methods can be used as reference images to detect and quantify set-up errors during treatment.
Optimal design of Anger camera for bremsstrahlung imaging: Monte Carlo evaluation.
Directory of Open Access Journals (Sweden)
Stephan eWalrand
2014-06-01
Full Text Available A conventional Anger camera is not adapted to bremsstrahlung imaging and, as a result, even using a reduced energy acquisition window, geometric x-rays represent less than 15% of the recorded events. This increases noise, limits the contrast, and reduces the quantification accuracy.Monte Carlo simulations of energy spectra showed that a camera based on a 30mm-thick BGO crystal and equipped with a high energy pinhole collimator is well adapted to bremsstrahlung imaging. The total scatter contamination is reduced by a factor ten versus a conventional NaI camera equipped with a high energy parallel hole collimator enabling acquisition using an extended energy window ranging from 50 to 350 keV. By using the recorded event energy in the reconstruction method, shorter acquisition time and reduced orbit range will be usable allowing the design of a simplified mobile gantry. This is more convenient for use in a busy catheterization room. After injecting a safe activity, a fast SPECT could be performed without moving the catheter tip in order to assess the liver dosimetry and estimate the additional safe activity that could still be injected.Further long running time Monte Carlo simulations of realistic acquisitions will allow assessing the quantification capability of such system. Simultaneously, a dedicated bremsstrahlung prototype camera reusing PMT-BGO blocks coming from a retired PET system is currently under design for further evaluation.
Nishidate, Izumi; Wiswadarma, Aditya; Hase, Yota; Tanaka, Noriyuki; Maeda, Takaaki; Niizeki, Kyuichi; Aizu, Yoshihisa
2011-08-01
In order to visualize melanin and blood concentrations and oxygen saturation in human skin tissue, a simple imaging technique based on multispectral diffuse reflectance images acquired at six wavelengths (500, 520, 540, 560, 580 and 600nm) was developed. The technique utilizes multiple regression analysis aided by Monte Carlo simulation for diffuse reflectance spectra. Using the absorbance spectrum as a response variable and the extinction coefficients of melanin, oxygenated hemoglobin, and deoxygenated hemoglobin as predictor variables, multiple regression analysis provides regression coefficients. Concentrations of melanin and total blood are then determined from the regression coefficients using conversion vectors that are deduced numerically in advance, while oxygen saturation is obtained directly from the regression coefficients. Experiments with a tissue-like agar gel phantom validated the method. In vivo experiments with human skin of the human hand during upper limb occlusion and of the inner forearm exposed to UV irradiation demonstrated the ability of the method to evaluate physiological reactions of human skin tissue.
Monte Carlo simulation of CD-SEM images for linewidth and critical dimension metrology.
Li, Y G; Zhang, P; Ding, Z J
2013-01-01
In semiconductor industry, strict critical dimension control by using a critical dimension scanning electron microscope (CD-SEM) is an extremely urgent task in near-term years. A Monte Carlo simulation model for study of CD-SEM image has been established, which is based on using Mott's cross section for electron elastic scattering and the full Penn dielectric function formalism for electron inelastic scattering and the associated secondary electron (SE) production. In this work, a systematic calculation of CD-SEM line-scan profiles and 2D images of trapezoidal Si lines has been performed by taking into account different experimental factors including electron beam condition (primary energy, probe size), line geometry (width, height, foot/corner rounding, sidewall angle, and roughness), material properties, and SE signal detection. The influences of these factors to the critical dimension metrology are investigated, leading to build a future comprehensive model-based library. PMID:22887037
Peterson, J R; Kahn, S M; Rasmussen, A P; Peng, E; Ahmad, Z; Bankert, J; Chang, C; Claver, C; Gilmore, D K; Grace, E; Hannel, M; Hodge, M; Lorenz, S; Lupu, A; Meert, A; Nagarajan, S; Todd, N; Winans, A; Young, M
2015-01-01
We present a comprehensive methodology for the simulation of astronomical images from optical survey telescopes. We use a photon Monte Carlo approach to construct images by sampling photons from models of astronomical source populations, and then simulating those photons through the system as they interact with the atmosphere, telescope, and camera. We demonstrate that all physical effects for optical light that determine the shapes, locations, and brightnesses of individual stars and galaxies can be accurately represented in this formalism. By using large scale grid computing, modern processors, and an efficient implementation that can produce 400,000 photons/second, we demonstrate that even very large optical surveys can be now be simulated. We demonstrate that we are able to: 1) construct kilometer scale phase screens necessary for wide-field telescopes, 2) reproduce atmospheric point-spread-function moments using a fast novel hybrid geometric/Fourier technique for non-diffraction limited telescopes, 3) ac...
Gigliotti, C. R.; Altabella, L.; Boschi, F.; Spinelli, A. E.
2016-07-01
The goal of this work is to compare the performances of different beta minus detection strategies for image guided surgery or ex vivo tissue analysis. In particular we investigated Cerenkov luminescence imaging (CLI) with and without the use of a radiator, direct and indirect beta detection and bremsstrahlung imaging using beta emitters commonly employed in Nuclear Medicine. Monte Carlo simulations were implemented using the GAMOS plug-in for GEANT4 considering a slab of muscle and a radioactive source (32P or 90Y) placed at 0.5 mm depth. We estimated the gain that can be obtained in terms of produced photons using different materials placed on the slab used as Cerenkov radiators, we then focused on the number of exiting photons and their spatial distribution for the different strategies. The use of radiator to enhance Cerenkov signal reduces the spatial resolution because of the increased optical spread. We found that direct beta detection and CLI are best approaches in term of resolution while the use of a thin scintillator increases the signal but the spatial resolution is degraded. Bremsstrahlung presents lower signal and it does not represent the best choice for image guided surgery. CLI represents a more flexible approach for image guided surgery using or ex vivo tissue analysis using beta-emitter imaging.
Monte Carlo simulation in proton computed tomography: a study of image reconstruction technique
Energy Technology Data Exchange (ETDEWEB)
Inocente, Guilherme Franco; Stenico, Gabriela V.; Hormaza, Joel Mesa [Universidade Estadual Paulista Julio de Mesquita Filho (UNESP), Botucatu, SP (Brazil). Inst. de Biociencias. Dept. de Fisica e Biofisica
2012-07-01
Full text: The radiation method is one of the most used for cancer treatment. In this context arises therapy with proton beams in front of conventional radiotherapy. It is known that with proton therapy there are more advantages to the patient treated when compared with more conventional methods. The dose distributed along the path, especially in healthy tissues - neighbor the tumor, is smaller and the accuracy of treatment is much better. To carry out the treatment, the patient undergoes a plan through images for visualization and location of the target volume. The main method for obtaining these images is computed tomography X-ray (XCT). For treatment with proton beam this imaging technique can to generate some uncertainties. The purpose of this project is to study the feasibility of reconstructing images generated from the irradiation with proton beams, thereby reducing some inaccuracies, as it will be the same type of radiation as treatment planning, and also to drastically reduce some errors location, since the planning can be done at the same place and just before where the patient is treated. This study aims to obtain a relationship between the intrinsic property of the interaction of photons and protons with matter. For this we use computational simulation based on Monte Carlo method with the code SRIM 2008 and MCNPX v.2.5.0, to reconstruct images using the technique used in conventional computed tomography. (author)
Khromova, A N; Arfelli, F; Menk, R H; Besch, H J; Plothow-Besch, H; 10.1109/NSSMIC.2004.1466758
2010-01-01
In this work we present a novel 3D Monte Carlo photon transport program for simulation of multiple refractive scattering based on the refractive properties of X-rays in highly scattering media, like lung tissue. Multiple scattering reduces not only the quality of the image, but contains also information on the internal structure of the object. This information can be exploited utilizing image modalities such as Diffraction Enhanced Imaging (DEI). To study the effect of multiple scattering a Monte Carlo program was developed that simulates multiple refractive scattering of X-ray photons on monodisperse PMMA (poly-methyl-methacrylate) microspheres representing alveoli in lung tissue. Eventually, the results of the Monte Carlo program were compared to the measurements taken at the SYRMEP beamline at Elettra (Trieste, Italy) on special phantoms showing a good agreement between both data.
Monte Carlo validation of optimal material discrimination using spectral x-ray imaging
Nik, Syen J; Watts, Richard; Dale, Tony; Currie, Bryn; Meyer, Juergen
2014-01-01
The validation of a previous work on the optimization of material discrimination in spectral x-ray imaging is reported. Using Monte Carlo simulations based on the BEAMnrc package, material decomposition was performed on the projection images of phantoms containing up to three materials. The simulated projection data was first decomposed into material basis images by minimizing the z-score between expected and simulated counts. Statistical analysis was performed for the pixels within the region-of-interest consisting of contrast material(s) in the BEAMnrc simulations. With the consideration of scattered radiation and a realistic scanning geometry, the theoretical optima of energy bin borders provided by the algorithm were shown to have an accuracy of $\\pm$2 keV for the decomposition of 2 and 3 materials. Finally, the signal-to-noise ratio predicted by the theoretical model was also validated. The counts per pixel needed for achieving a specific imaging aim can therefore be estimated using the validated model.
Monte Carlo simulation of breast tumor imaging properties with compact, discrete gamma cameras
International Nuclear Information System (INIS)
The authors describe Monte Carlo simulation results for breast tumor imaging using a compact, discrete gamma camera. The simulations were designed to analyze and optimize camera design, particularly collimator configuration and detector pixel size. Simulated planar images of 5--15 mm diameter tumors in a phantom patient (including a breast, torso, and heart) were generated for imaging distances of 5--55 mm, pixel sizes of 2 x 2--4 x 4 mm2, and hexagonal and square hole collimators with sensitivities from 4,000 to 16,000 counts/mCi/sec. Other factors considered included T/B (tumor-to-background tissue uptake ratio) and detector energy resolution. Image properties were quantified by computing the observed tumor fwhm (full-width at half-maximum) and S/N (sum of detected tumor events divided by the statistical noise). Results suggest that hexagonal and square hole collimators perform comparably, that higher sensitivity collimators provide higher tumor S/N with little increase in the observed tumor fwhm, that smaller pixels only slightly improve tumor fwhm and S/N, and that improved detector energy resolution has little impact on either the observed tumor fwhm or the observed tumor S/N
Monte Carlo modeling of neutron and gamma-ray imaging systems
Hall, James M.
1997-02-01
Detailed numerical prototypes are essential to the design of efficient and cost-effective neutron and gamma-ray imaging systems. We have exploited the unique capabilities of an LLNL-developed radiation transport code (COG) to develop code modules capable of simulating the performance of neutron and gamma-ray imaging systems over a wide range of source energies. COG allows us to simulate complex, energy-, angle-, and time-dependent radiation sources, model 3D system geometries with 'real world' complexity, specify detailed elemental and isotopic distributions and predict the responses of various types of imaging detectors with full Monte Carlo accuracy. COG references detailed, evaluated nuclear interaction databases allowing users to account for multiple scattering, energy straggling, and secondary particle production phenomena which may significantly effect the performance of an imaging system but may be difficult or even impossible to estimate using simple analytical models. In this work we will present examples illustrating the use of these routines in the analysis of industrial radiographic systems for thick target inspection, non-intrusive luggage and cargo scanning systems, and international treaty verification.
Monte Carlo modeling of neutron and gamma-ray imaging systems
International Nuclear Information System (INIS)
Detailed numerical prototypes are essential to design of efficient and cost-effective neutron and gamma-ray imaging systems. We have exploited the unique capabilities of an LLNL-developed radiation transport code (COG) to develop code modules capable of simulating the performance of neutron and gamma-ray imaging systems over a wide range of source energies. COG allows us to simulate complex, energy-, angle-, and time-dependent radiation sources, model 3-dimensional system geometries with ''real world'' complexity, specify detailed elemental and isotopic distributions and predict the responses of various types of imaging detectors with full Monte Carlo accuray. COG references detailed, evaluated nuclear interaction databases allowingusers to account for multiple scattering, energy straggling, and secondary particle production phenomena which may significantly effect the performance of an imaging system by may be difficult or even impossible to estimate using simple analytical models. This work presents examples illustrating the use of these routines in the analysis of industrial radiographic systems for thick target inspection, nonintrusive luggage and cargoscanning systems, and international treaty verification
International Nuclear Information System (INIS)
A four-dimensional (x, y, z, t) composite superquadric-based object model of the human heart for Monte Carlo simulation of radiological imaging systems has been developed. The phantom models the real temporal geometric conditions of a beating heart for frame rates up to 32 per cardiac cycle. Phantom objects are described by boolean combinations of superquadric ellipsoid sections.Moving spherical coordinate systems are chosen to model wall movement whereby points of the ventricle and atria walls are assumed to move towards a moving center-of-gravity point. Due to the non-static coordinate systems, the atrial/ventricular valve plane of the mathematical heart phantom moves up and down along the left ventricular long axis resulting in reciprocal emptying and filling of atria and ventricles. Compared to the base movement, the epicardial apex as well as the superior atria area are almost fixed in space. Since geometric parameters of the objects are directly applied on intersection calculations of the photon ray with object boundaries during Monte Carlo simulation, no phantom discretization artifacts are involved
Lazaro, Delphine; Buvat, I; Loudos, G.; Strul, D.; Santin, G.; Giokaris, N.; Donnarieix, D.; Maigne, L; Spanoudaki, V.; Styliaris, S.; Staelens, S.; BRETON, Vincent
2004-01-01
Monte Carlo simulations are increasingly used in scintigraphic imaging to model imaging systems and to develop and assess tomographic reconstruction algorithms and correction methods for improved image quantitation. GATE (GEANT 4 Application for Tomographic Emission) is a new Monte Carlo simulation platform based on GEANT4 dedicated to nuclear imaging applications. This paper describes the GATE simulation of a prototype of scintillation camera dedicated to small animal imaging and consisting ...
GPU accelerated Monte-Carlo simulation of SEM images for metrology
Verduin, T.; Lokhorst, S. R.; Hagen, C. W.
2016-03-01
In this work we address the computation times of numerical studies in dimensional metrology. In particular, full Monte-Carlo simulation programs for scanning electron microscopy (SEM) image acquisition are known to be notoriously slow. Our quest in reducing the computation time of SEM image simulation has led us to investigate the use of graphics processing units (GPUs) for metrology. We have succeeded in creating a full Monte-Carlo simulation program for SEM images, which runs entirely on a GPU. The physical scattering models of this GPU simulator are identical to a previous CPU-based simulator, which includes the dielectric function model for inelastic scattering and also refinements for low-voltage SEM applications. As a case study for the performance, we considered the simulated exposure of a complex feature: an isolated silicon line with rough sidewalls located on a at silicon substrate. The surface of the rough feature is decomposed into 408 012 triangles. We have used an exposure dose of 6 mC/cm2, which corresponds to 6 553 600 primary electrons on average (Poisson distributed). We repeat the simulation for various primary electron energies, 300 eV, 500 eV, 800 eV, 1 keV, 3 keV and 5 keV. At first we run the simulation on a GeForce GTX480 from NVIDIA. The very same simulation is duplicated on our CPU-based program, for which we have used an Intel Xeon X5650. Apart from statistics in the simulation, no difference is found between the CPU and GPU simulated results. The GTX480 generates the images (depending on the primary electron energy) 350 to 425 times faster than a single threaded Intel X5650 CPU. Although this is a tremendous speedup, we actually have not reached the maximum throughput because of the limited amount of available memory on the GTX480. Nevertheless, the speedup enables the fast acquisition of simulated SEM images for metrology. We now have the potential to investigate case studies in CD-SEM metrology, which otherwise would take unreasonable
International Nuclear Information System (INIS)
The most dental imaging is performed by means a imaging system consisting of a film/screen combination. Fluorescent intensifying screens for X-ray films are used in order to reduce the radiation dose. They produce visible light which increases the efficiency of the film. In addition, the primary radiation can be scattered elastically (Rayleigh scattering) and inelastically (Compton scattering) which will degrade the image resolution. Scattered radiation produced in Gd2O2S:Tb intensifying screens was simulated by using a Monte Carlo radiation transport code - the EGS4. The magnitude of scattered radiation striking the film is typically quantified using the scatter to primary radiation and the scatter fraction. The angular distribution of the intensity of the scattered radiation (sum of both the scattering effects) was simulated, showing that the ratio of secondary-to-primary radiation incident on the X-ray film is about 5.67% and 3.28 % and the scatter function is about 5.27% and 3.18% for the front and back screen, respectively, over the range from 0 to π rad. (author)
Fast Monte Carlo Simulation for Patient-specific CT/CBCT Imaging Dose Calculation
Jia, Xun; Gu, Xuejun; Jiang, Steve B
2011-01-01
Recently, X-ray imaging dose from computed tomography (CT) or cone beam CT (CBCT) scans has become a serious concern. Patient-specific imaging dose calculation has been proposed for the purpose of dose management. While Monte Carlo (MC) dose calculation can be quite accurate for this purpose, it suffers from low computational efficiency. In response to this problem, we have successfully developed a MC dose calculation package, gCTD, on GPU architecture under the NVIDIA CUDA platform for fast and accurate estimation of the x-ray imaging dose received by a patient during a CT or CBCT scan. Techniques have been developed particularly for the GPU architecture to achieve high computational efficiency. Dose calculations using CBCT scanning geometry in a homogeneous water phantom and a heterogeneous Zubal head phantom have shown good agreement between gCTD and EGSnrc, indicating the accuracy of our code. In terms of improved efficiency, it is found that gCTD attains a speed-up of ~400 times in the homogeneous water ...
Atmospheric correction of Earth-observation remote sensing images by Monte Carlo method
Indian Academy of Sciences (India)
Hanane Hadjit; Abdelaziz Oukebdane; Ahmad Hafid Belbachir
2013-10-01
In earth observation, the atmospheric particles contaminate severely, through absorption and scattering, the reflected electromagnetic signal from the earth surface. It will be greatly beneficial for land surface characterization if we can remove these atmospheric effects from imagery and retrieve surface reflectance that characterizes the surface properties with the purpose of atmospheric correction. Giving the geometric parameters of the studied image and assessing the parameters describing the state of the atmosphere, it is possible to evaluate the atmospheric reflectance, and upward and downward transmittances which take part in the garbling data obtained from the image. To that end, an atmospheric correction algorithm for high spectral resolution data over land surfaces has been developed. It is designed to obtain the main atmospheric parameters needed in the image correction and the interpretation of optical observations. It also estimates the optical characteristics of the Earth-observation imagery (LANDSAT and SPOT). The physics underlying the problem of solar radiation propagations that takes into account multiple scattering and sphericity of the atmosphere has been treated using Monte Carlo techniques.
Feasibility Study of Neutron Dose for Real Time Image Guided Proton Therapy: A Monte Carlo Study
Kim, Jin Sung; Kim, Daehyun; Shin, EunHyuk; Chung, Kwangzoo; Cho, Sungkoo; Ahn, Sung Hwan; Ju, Sanggyu; Chung, Yoonsun; Jung, Sang Hoon; Han, Youngyih
2015-01-01
Two full rotating gantry with different nozzles (Multipurpose nozzle with MLC, Scanning Dedicated nozzle) with conventional cyclotron system is installed and under commissioning for various proton treatment options at Samsung Medical Center in Korea. The purpose of this study is to investigate neutron dose equivalent per therapeutic dose, H/D, to x-ray imaging equipment under various treatment conditions with monte carlo simulation. At first, we investigated H/D with the various modifications of the beam line devices (Scattering, Scanning, Multi-leaf collimator, Aperture, Compensator) at isocenter, 20, 40, 60 cm distance from isocenter and compared with other research groups. Next, we investigated the neutron dose at x-ray equipments used for real time imaging with various treatment conditions. Our investigation showed the 0.07 ~ 0.19 mSv/Gy at x-ray imaging equipments according to various treatment options and intestingly 50% neutron dose reduction effect of flat panel detector was observed due to multi- lea...
Optimal processing for gel electrophoresis images: Applying Monte Carlo Tree Search in GelApp.
Nguyen, Phi-Vu; Ghezal, Ali; Hsueh, Ya-Chih; Boudier, Thomas; Gan, Samuel Ken-En; Lee, Hwee Kuan
2016-08-01
In biomedical research, gel band size estimation in electrophoresis analysis is a routine process. To facilitate and automate this process, numerous software have been released, notably the GelApp mobile app. However, the band detection accuracy is limited due to a band detection algorithm that cannot adapt to the variations in input images. To address this, we used the Monte Carlo Tree Search with Upper Confidence Bound (MCTS-UCB) method to efficiently search for optimal image processing pipelines for the band detection task, thereby improving the segmentation algorithm. Incorporating this into GelApp, we report a significant enhancement of gel band detection accuracy by 55.9 ± 2.0% for protein polyacrylamide gels, and 35.9 ± 2.5% for DNA SYBR green agarose gels. This implementation is a proof-of-concept in demonstrating MCTS-UCB as a strategy to optimize general image segmentation. The improved version of GelApp-GelApp 2.0-is freely available on both Google Play Store (for Android platform), and Apple App Store (for iOS platform). PMID:27251892
Energy Technology Data Exchange (ETDEWEB)
von Wittenau, A; Aufderheide, M B; Henderson, G L
2010-05-07
Given the cost and lead-times involved in high-energy proton radiography, it is prudent to model proposed radiographic experiments to see if the images predicted would return useful information. We recently modified our raytracing transmission radiography modeling code HADES to perform simplified Monte Carlo simulations of the transport of protons in a proton radiography beamline. Beamline objects include the initial diffuser, vacuum magnetic fields, windows, angle-selecting collimators, and objects described as distorted 2D (planar or cylindrical) meshes or as distorted 3D hexahedral meshes. We present an overview of the algorithms used for the modeling and code timings for simulations through typical 2D and 3D meshes. We next calculate expected changes in image blur as scattering materials are placed upstream and downstream of a resolution test object (a 3 mm thick sheet of tantalum, into which 0.4 mm wide slits have been cut), and as the current supplied to the focusing magnets is varied. We compare and contrast the resulting simulations with the results of measurements obtained at the 800 MeV Los Alamos LANSCE Line-C proton radiography facility.
Energy Technology Data Exchange (ETDEWEB)
Schach von Wittenau, Alexis E., E-mail: schachvonwittenau1@llnl.gov [Lawrence Livermore National Laboratory, Livermore, CA 94551 (United States); Aufderheide, Maurice; Henderson, Gary [Lawrence Livermore National Laboratory, Livermore, CA 94551 (United States)
2011-10-01
Given the cost and lead-times involved in high-energy proton radiography, it is prudent to model proposed radiographic experiments to see if the images predicted would return useful information. We recently modified our raytracing transmission radiography modeling code HADES to perform simplified Monte Carlo simulations of the transport of protons in a proton radiography beamline. Beamline objects include the initial diffuser, vacuum magnetic fields, windows, angle-selecting collimators, and objects described as distorted 2D (planar or cylindrical) meshes or as distorted 3D hexahedral meshes. We describe the algorithms used for simulations through typical 2D and 3D meshes. We calculate expected changes in image blur as scattering materials are placed upstream and downstream of a resolution test object (a 3 mm thick sheet of tantalum, into which 0.4 mm wide slits have been cut), and as the current supplied to the focusing magnets is varied. We compare and contrast the resulting simulations with the results of measurements obtained at the 800 MeV Los Alamos LANSCE Line-C proton radiography facility.
Monte Carlo simulation of novel breast imaging modalities based on coherent x-ray scattering
International Nuclear Information System (INIS)
We present upgraded versions of MC-GPU and penEasyImaging, two open-source Monte Carlo codes for the simulation of radiographic projections and CT, that have been extended and validated to account for the effect of molecular interference in the coherent x-ray scatter. The codes were first validation by comparison between simulated and measured energy dispersive x-ray diffraction (EDXRD) spectra. A second validation was by evaluation of the rejection factor of a focused anti-scatter grid. To exemplify the capabilities of the new codes, the modified MC-GPU code was used to examine the possibility of characterizing breast tissue composition and microcalcifications in a volume of interest inside a whole breast phantom using EDXRD and to simulate a coherent scatter computed tomography (CSCT) system based on first generation CT acquisition geometry. It was confirmed that EDXRD and CSCT have the potential to characterize tissue composition inside a whole breast. The GPU-accelerated code was able to simulate, in just a few hours, a complete CSCT acquisition composed of 9758 independent pencil-beam projections. In summary, it has been shown that the presented software can be used for fast and accurate simulation of novel breast imaging modalities relying on scattering measurements and therefore can assist in the characterization and optimization of promising modalities currently under development. (paper)
The use of computed tomography images in Monte Carlo treatment planning
Bazalova, Magdalena
Monte Carlo (MC) dose calculations cannot accurately assess the dose delivered to the patient during radiotherapy unless the patient anatomy is well known. This thesis focuses on the conversion of patient computed tomography (CT) images into MC geometry files. Metal streaking artifacts and their effect on MC dose calculations are first studied. A correction algorithm is applied to artifact-corrupted images and dose errors due to density and tissue mis-assignment are quantified in a phantom and a patient study. The correction algorithm and MC dose calculations for various treatment beams are also investigated using phantoms with real hip prostheses. As a result of this study, we suggest that a metal artifact correction algorithm should be a part of any MC treatment planning. By means of MC simulations, scatter is proven to be a major cause of metal artifacts. The use of dual-energy CT (DECT) for a novel tissue segmentation scheme is thoroughly investigated. First, MC simulations are used to determine the optimal beam filtration for an accurate DECT material extraction. DECT is then tested on a CT scanner with a phantom and a good agreement in the extraction of two material properties, the relative electron density rhoe and the effective atomic number Z is found. Compared to the conventional tissue segmentation based on rhoe-differences, the novel tissue segmentation scheme uses differences in both rhoe and Z. The phantom study demonstrates that the novel method based on rhoe and Z information works well and makes MC dose calculations more accurate. This thesis demonstrates that DECT suppresses streaking artifacts from brachytherapy seeds. Brachytherapy MC dose calculations using single-energy CT images with artifacts and DECT images with suppressed artifacts are performed and the effect of artifact reduction is investigated. The patient and canine DECT studies also show that image noise and object motion are very important factors in DECT. A solution for reduction
Generation of scintigraphic images in a virtual dosimetry trial based on Monte Carlo modelling
International Nuclear Information System (INIS)
Full text of publication follows. Aim: the purpose of dosimetry calculations in therapeutic nuclear medicine is to maximize tumour absorbed dose while minimizing normal tissue toxicities. However a wide heterogeneity of dosimetric approaches is observed: there is no standardized dosimetric protocol to date. The DosiTest project (www.dositest.com) intends to identify critical steps in the dosimetry chain by implementing clinical dosimetry in different Nuclear Medicine departments, on scintigraphic images generated by Monte Carlo simulation from a same virtual patient. This study aims at presenting the different steps contributing to image generation, following the imaging protocol of a given participating centre, Milan's European Institute of Oncology (IEO). Materiel and methods: the chosen clinical application is that of 111In-pentetreotide (OctreoscanTM). Pharmacokinetic data from the literature are used to derive a compartmental model. The kinetic rates between 6 compartments (liver, spleen, kidneys, blood, urine, remainder body) were obtained from WinSaam [3]: the activity in each compartment is known at any time point. The TestDose [1] software (computing architecture of DosiTest) implements the NURBS-based phantom NCAT-WB [2] to generate anatomical data for the virtual patient. IEO gamma-camera was modelled with GATE [4] v6.2. Scintigraphic images were simulated for each compartment and the resulting projections were weighted by the respective pharmacokinetics for each compartment. The final step consisted in aggregating each compartment to generate the resulting image. Results: following IEO's imaging protocol, planar and tomographic image simulations were generated at various time points. Computation times (on a 480 virtual cores computing cluster) for 'step and shoot' whole body simulations (5 steps/time point) and acceptable statistics were: 10 days for extra-vascular fluid, 28 h for blood, 12 h for liver, 7 h for kidneys, and 1-2 h for
Monte Carlo Simulation Of Emission Tomography And Other Medical Imaging Techniques
Harrison, Robert L.
2010-01-01
An introduction to Monte Carlo simulation of emission tomography. This paper reviews the history and principles of Monte Carlo simulation, then applies these principles to emission tomography using the public domain simulation package SimSET (a Simulation System for Emission Tomography) as an example. Finally, the paper discusses how the methods are modified for X-ray computed tomography and radiotherapy simulations.
Monte Carlo Modeling of Cascade Gamma Rays in 86Y PET imaging: Preliminary results
Zhu, Xuping; El Fakhri, Georges
2011-01-01
86Y is a PET agent that could be used as an ideal surrogate to allow personalized dosimetry in 90Y radionuclide therapy. However, 86Y also emits cascade gamma rays. We have developed a Monte Carlo program based on SimSET to model cascade gamma rays in PET imaging. The new simulation was validated with the GATE simulation package. Agreements within 15% were found in spatial resolution, apparent scatter fraction (ratio of coincidences outside peak regions in line source sinograms), singles and coincidences statistics and detected photons energy distribution within the PET energy window. A 20% discrepancy was observed in the absolute scatter fraction, likely caused by differences in the tracking of higher-energy cascade gamma photons. On average the new simulation is 6 times faster than GATE, and the computing time can be further improved by using variance reduction techniques currently available in SimSET. Comparison with phantom acquisitions showed agreements in spatial resolutions and the general shape of projection profiles; however, the standard scatter correction method on the scanner is not directly applicable for 86Y PET as it leads to incorrect scatter fractions. The new simulation was used to characterize 86Y PET. Compared with conventional 18F PET, in which major contamination at low count rates comes from scattered events, cascade gamma-involved events are more important in 86Y PET. The two types of contaminations have completely different distribution patterns, which should be considered for the corrections of their effects. Our approach will be further improved in the future in the modeling of random coincidences and tracking of high energy photons, and simulation results will be used for the development of correction methods in 86Y PET. PMID:19521011
Monte Carlo modeling of cascade gamma rays in {sup 86}Y PET imaging: preliminary results
Energy Technology Data Exchange (ETDEWEB)
Zhu Xuping; El Fakhri, Georges [Radiology Department, Massachusetts General Hospital and Harvard Medical School, 55 Fruit Street, Boston, Massachusetts, MA (United States)], E-mail: xzhu4@Partners.org, E-mail: elfakhri@pet.mgh.harvard.edu
2009-07-07
{sup 86}Y is a PET agent that could be used as an ideal surrogate to allow personalized dosimetry in {sup 90}Y radionuclide therapy. However, {sup 86}Y also emits cascade gamma rays. We have developed a Monte Carlo program based on SimSET (Simulation System for Emission Tomography) to model cascade gamma rays in PET imaging. The new simulation was validated with the GATE simulation package. Agreements within 15% were found in spatial resolution, apparent scatter fraction (ratio of coincidences outside peak regions in line source sinograms), single and coincidence statistics and detected photons energy distribution within the PET energy window. A discrepancy of 20% was observed in the absolute scatter fraction, likely caused by differences in the tracking of higher energy cascade gamma photons. On average, the new simulation is 6 times faster than GATE, and the computing time can be further improved by using variance reduction techniques currently available in SimSET. Comparison with phantom acquisitions showed agreements in spatial resolutions and the general shape of projection profiles; however, the standard scatter correction method on the scanner is not directly applicable to {sup 86}Y PET as it leads to incorrect scatter fractions. The new simulation was used to characterize {sup 86}Y PET. Compared with conventional {sup 18}F PET, in which major contamination at low count rates comes from scattered events, cascade gamma-involved events are more important in {sup 86}Y PET. The two types of contaminations have completely different distribution patterns, which should be considered for the corrections of their effects. Our approach will be further improved in the future in the modeling of random coincidences and tracking of high-energy photons, and simulation results will be used for the development of correction methods in {sup 86}Y PET.
Monte Carlo modeling of cascade gamma rays in 86Y PET imaging: preliminary results
Zhu, Xuping; El Fakhri, Georges
2009-07-01
86Y is a PET agent that could be used as an ideal surrogate to allow personalized dosimetry in 90Y radionuclide therapy. However, 86Y also emits cascade gamma rays. We have developed a Monte Carlo program based on SimSET (Simulation System for Emission Tomography) to model cascade gamma rays in PET imaging. The new simulation was validated with the GATE simulation package. Agreements within 15% were found in spatial resolution, apparent scatter fraction (ratio of coincidences outside peak regions in line source sinograms), single and coincidence statistics and detected photons energy distribution within the PET energy window. A discrepancy of 20% was observed in the absolute scatter fraction, likely caused by differences in the tracking of higher energy cascade gamma photons. On average, the new simulation is 6 times faster than GATE, and the computing time can be further improved by using variance reduction techniques currently available in SimSET. Comparison with phantom acquisitions showed agreements in spatial resolutions and the general shape of projection profiles; however, the standard scatter correction method on the scanner is not directly applicable to 86Y PET as it leads to incorrect scatter fractions. The new simulation was used to characterize 86Y PET. Compared with conventional 18F PET, in which major contamination at low count rates comes from scattered events, cascade gamma-involved events are more important in 86Y PET. The two types of contaminations have completely different distribution patterns, which should be considered for the corrections of their effects. Our approach will be further improved in the future in the modeling of random coincidences and tracking of high-energy photons, and simulation results will be used for the development of correction methods in 86Y PET.
International Nuclear Information System (INIS)
A novel procedure for the generation of a realistic virtual Computed Tomography (CT) image of a patient, using the advanced Boundary RE Presentation (BREP)-based model MASH, has been implemented. This method can be used in radiotherapy assessment. It is shown that it is possible to introduce an artificial cancer, which can be modeled using mesh surfaces. The use of virtual CT images based on BREP models presents several advantages with respect to CT images of actual patients, such as automation, control and flexibility. As an example, two artificial cases, namely a brain and a prostate cancer, were created through the generation of images and tumor/organ contours. As a secondary objective, the described methodology has been used to generate input files for treatment planning system (TPS) and Monte Carlo code dose evaluation. In this paper, we consider treatment plans generated assuming a dose delivery via an active proton beam scanning performed with the INFN-IBA TPS kernel. Additionally, Monte Carlo simulations of the two treatment plans were carried out with GATE/GEANT4. The work demonstrates the feasibility of the approach based on the BREP modeling to produce virtual CT images. In conclusion, this study highlights the benefits in using digital phantom model capable of representing different anatomical structures and varying tumors across different patients. These models could be useful for assessing radiotherapy treatment planning systems (TPS) and computer simulations for the evaluation of the adsorbed dose. (author)
Energy Technology Data Exchange (ETDEWEB)
Milian, F. M.; Attili, A.; Russo, G; Marchetto, F.; Cirio, R., E-mail: felix_mas_milian@yahoo.com, E-mail: attili@to.infn.it, E-mail: russo@to.infn.it, E-mail: fmarchet@to.infn.it, E-mail: cirio@to.infn.it [Istituto Nazionale di Fisica Nucleare (INFN), Torino, TO (Italy); Bourhaleb, F., E-mail: bourhale@to.infn.it [Universita di Torino (UNITO), Torino, TO (Italy)
2013-07-01
A novel procedure for the generation of a realistic virtual Computed Tomography (CT) image of a patient, using the advanced Boundary RE Presentation (BREP)-based model MASH, has been implemented. This method can be used in radiotherapy assessment. It is shown that it is possible to introduce an artificial cancer, which can be modeled using mesh surfaces. The use of virtual CT images based on BREP models presents several advantages with respect to CT images of actual patients, such as automation, control and flexibility. As an example, two artificial cases, namely a brain and a prostate cancer, were created through the generation of images and tumor/organ contours. As a secondary objective, the described methodology has been used to generate input files for treatment planning system (TPS) and Monte Carlo code dose evaluation. In this paper, we consider treatment plans generated assuming a dose delivery via an active proton beam scanning performed with the INFN-IBA TPS kernel. Additionally, Monte Carlo simulations of the two treatment plans were carried out with GATE/GEANT4. The work demonstrates the feasibility of the approach based on the BREP modeling to produce virtual CT images. In conclusion, this study highlights the benefits in using digital phantom model capable of representing different anatomical structures and varying tumors across different patients. These models could be useful for assessing radiotherapy treatment planning systems (TPS) and computer simulations for the evaluation of the adsorbed dose. (author)
International Nuclear Information System (INIS)
Thoracic cancer treatment presents dosimetric difficulties due to respiratory motion and lung inhomogeneity. Monte Carlo and deformable image registration techniques have been proposed to be used in four-dimensional (4D) dose calculations to overcome the difficulties. This study validates the 4D Monte Carlo dosimetry with measurement, compares 4D dosimetry of different tumor sizes and tumor motion ranges, and demonstrates differences of dose-volume histograms (DVH) with the number of respiratory phases that are included in 4D dosimetry. BEAMnrc was used in dose calculations while an optical flow algorithm was used in deformable image registration and dose mapping. Calculated and measured doses of a moving phantom agreed within 3% at the center of the moving gross tumor volumes (GTV). 4D CT image sets of lung cancer cases were used in the analysis of 4D dosimetry. For a small tumor (12.5 cm3) with motion range of 1.5 cm, reduced tumor volume coverage was observed in the 4D dose with a beam margin of 1 cm. For large tumors and tumors with small motion range (around 1 cm), the 4D dosimetry did not differ appreciably from the static plans. The dose-volume histogram (DVH) analysis shows that the inclusion of only extreme respiratory phases in 4D dosimetry is a reasonable approximation of all-phase inclusion for lung cancer cases similar to the ones studied, which reduces the calculation in 4D dosimetry
International Nuclear Information System (INIS)
An optimization of anti-scatter grid design using Monte Carlo techniques in diagnostic radiology is presented. The criterion for optimization was to find the combinations of the grid parameters (lead strip width, grid ratio and strip density) and tube potential which result in the lowest mean absorbed dose in the patient at fixed image contrast. The optimization was performed in three irradiation geometries, representing different scattering conditions (paediatric examinations, and two adult lumbar spine examinations) and was restricted to grids using fibre materials in covers and interspaces. (author)
Energy Technology Data Exchange (ETDEWEB)
Khorsandi, M.; Feghhi, S.A.H., E-mail: A_feghhi@sbu.ac.ir
2015-08-01
In industrial Gamma-ray CT, specifically for large-dimension plants or processes, the simplicity and portability of CT system necessitate to use individual gamma-ray detectors for imaging purposes. Considering properties of the gamma-ray source as well as characteristics of the detectors, including penetration depth, energy resolution, size, etc., the quality of reconstructed images is limited. Therefore, implementation of an appropriate reconstruction procedure is important to improve the image quality. In this paper, an accurate and applicable procedure has been proposed for image reconstruction of Gamma-ray CT of large-dimension industrial plants. Additionally, a portable configuration of tomographic system was introduced and simulated in MCNPX Monte Carlo code. The simulation results were validated through comparison with the experimental results reported in the literature. Evaluations showed that maximum difference between reconstruction error in this work and the benchmark was less than 1.3%. Additional investigation has been carried out on a typical standard phantom introduced by IAEA using the validated procedure. Image quality assessment showed that the reconstruction error was less than 1.7% using different algorithms and a good contrast higher than 76% was obtained. Our overall results are indicative of the fact that the procedures and methods introduced in this work are quite efficient for improving the image quality of gamma CT of industrial plants.
International Nuclear Information System (INIS)
In industrial Gamma-ray CT, specifically for large-dimension plants or processes, the simplicity and portability of CT system necessitate to use individual gamma-ray detectors for imaging purposes. Considering properties of the gamma-ray source as well as characteristics of the detectors, including penetration depth, energy resolution, size, etc., the quality of reconstructed images is limited. Therefore, implementation of an appropriate reconstruction procedure is important to improve the image quality. In this paper, an accurate and applicable procedure has been proposed for image reconstruction of Gamma-ray CT of large-dimension industrial plants. Additionally, a portable configuration of tomographic system was introduced and simulated in MCNPX Monte Carlo code. The simulation results were validated through comparison with the experimental results reported in the literature. Evaluations showed that maximum difference between reconstruction error in this work and the benchmark was less than 1.3%. Additional investigation has been carried out on a typical standard phantom introduced by IAEA using the validated procedure. Image quality assessment showed that the reconstruction error was less than 1.7% using different algorithms and a good contrast higher than 76% was obtained. Our overall results are indicative of the fact that the procedures and methods introduced in this work are quite efficient for improving the image quality of gamma CT of industrial plants
ROSI--an object-oriented and parallel-computing Monte Carlo simulation for X-ray imaging
International Nuclear Information System (INIS)
In the field of X-ray imaging, Monte Carlo simulation is an important tool. It gives the possibility of understanding experimental results and it allows the construction of virtual imaging setups with predictions of their quality. For these reasons, we developed the Roentgen Simulation (ROSI) which is based on the object-oriented C++ class library GISMO. The interaction algorithms are based on the established EGS4-code and its current LSCAT-extension. ROSI introduces random variables for modelling physical parameters by a given random distribution, e.g. the source position or the direction and energy of the photons to be emitted. It is possible to run ROSI in parallel on a local computer network (Beowulf cluster) to obtain simulation data in shorter time. Finally, it has an easy-to-use interface. We will present the concept of ROSI and demonstrate its flexibility by an example
Monte Carlo simulation of image properties of an X-ray intensifying screen
Wang Yi; Wang Kui Lu; Liu Guo Zhi; Liu Ya Qian
2000-01-01
A Monte Carlo simulation program named MCPEP has been developed. Based on the existing simulation program that simulates the transfer of X-ray photons and the secondary electrons, MCPEP also simulates the light photons in the screen. The performances of an intensifying screen (Gd sub 2 O sub 2 S : Tb) with different thickness and different X-ray energies have been analyzed by MCPEP. The calculated light photon probability distribution, average light photon number per absorbed X-ray photon, statistical factor for light emission, X-ray detection efficiency, detective quantum efficiency (DQE) and point spread function (PSF) of the screen are presented.
Lazaro, D; Loudos, G; Strul, D; Santin, G; Giokaris, N; Donnarieix, D; Maigne, L; Spanoudaki, V; Styliaris, S; Staelens, S; Breton, V
2004-01-01
Monte Carlo simulations are increasingly used in scintigraphic imaging to model imaging systems and to develop and assess tomographic reconstruction algorithms and correction methods for improved image quantitation. GATE (GEANT 4 Application for Tomographic Emission) is a new Monte Carlo simulation platform based on GEANT4 dedicated to nuclear imaging applications. This paper describes the GATE simulation of a prototype of scintillation camera dedicated to small animal imaging and consisting of a CsI(Tl) crystal array coupled to a position sensitive photomultiplier tube. The relevance of GATE to model the camera prototype was assessed by comparing simulated 99mTc point spread functions, energy spectra, sensitivities, scatter fractions and image of a capillary phantom with the corresponding experimental measurements. Results showed an excellent agreement between simulated and experimental data: experimental spatial resolutions were predicted with an error less than 100 mu m. The difference between experimental...
A Monte Carlo study of the effect of coded-aperture material and thickness on neutron imaging
International Nuclear Information System (INIS)
In this paper, a coded-aperture design for a scintillator-based neutron imaging system has been simulated using a series of Monte Carlo simulations. Using Monte Carlo simulations, work to optimise a system making use of the EJ-426 neutron scintillator detector has been conducted. This type of scintillator has a low sensitivity to gamma rays and is therefore particularly useful for neutron detection in a mixed radiation environment. Simulations have been conducted using varying coded-aperture materials and different coded-aperture thicknesses. From this, neutron images have been produced, compared qualitatively and quantitatively for each case to find the best material for the MURA (modified uniformly redundant array) pattern. The neutron images generated also allow observations on how differing thicknesses of coded-aperture impact the system. A system in which a neutron sensitive scintillator has been used in conjunction with a MURA coded aperture to detect and locate a neutron emitting point source centralised in the system has been simulated. A comparison between the results of the different coded-aperture thicknesses is discussed, via the calculation of system error between the reconstructed source location and the actual location. As the system is small scale with a relatively large step along the axis (0.5 cm), it is justifiable to say that the smaller error values provide satisfactory results, which correlate with only a few centimetres difference in the reconstructed source location to actual source location. A general trend of increasing error can be deduced both as the thickness of the coded-aperture material decreases and the capture cross section of the different materials reduces. (authors)
Image quality assessment of LaBr3-based whole-body 3D PET scanners: a Monte Carlo evaluation
International Nuclear Information System (INIS)
The main thrust for this work is the investigation and design of a whole-body PET scanner based on new lanthanum bromide scintillators. We use Monte Carlo simulations to generate data for a 3D PET scanner based on LaBr3 detectors, and to assess the count-rate capability and the reconstructed image quality of phantoms with hot and cold spheres using contrast and noise parameters. Previously we have shown that LaBr3 has very high light output, excellent energy resolution and fast timing properties which can lead to the design of a time-of-flight (TOF) whole-body PET camera. The data presented here illustrate the performance of LaBr3 without the additional benefit of TOF information, although our intention is to develop a scanner with TOF measurement capability. The only drawbacks of LaBr3 are the lower stopping power and photo-fraction which affect both sensitivity and spatial resolution. However, in 3D PET imaging where energy resolution is very important for reducing scattered coincidences in the reconstructed image, the image quality attained in a non-TOF LaBr3 scanner can potentially equal or surpass that achieved with other high sensitivity scanners. Our results show that there is a gain in NEC arising from the reduced scatter and random fractions in a LaBr3 scanner. The reconstructed image resolution is slightly worse than a high-Z scintillator, but at increased count-rates, reduced pulse pileup leads to an image resolution similar to that of LSO. Image quality simulations predict reduced contrast for small hot spheres compared to an LSO scanner, but improved noise characteristics at similar clinical activity levels
Feasibility study of the neutron dose for real-time image-guided proton therapy: A Monte Carlo study
Kim, Jin Sung; Shin, Jung Suk; Kim, Daehyun; Shin, Eunhyuk; Chung, Kwangzoo; Cho, Sungkoo; Ahn, Sung Hwan; Ju, Sanggyu; Chung, Yoonsun; Jung, Sang Hoon; Han, Youngyih
2015-07-01
Two full rotating gantries with different nozzles (multipurpose nozzle with MLC, scanning dedicated nozzle) for a conventional cyclotron system are installed and being commissioned for various proton treatment options at Samsung Medical Center in Korea. The purpose of this study is to use Monte Carlo simulation to investigate the neutron dose equivalent per therapeutic dose, H/D, for X-ray imaging equipment under various treatment conditions. At first, we investigated the H/D for various modifications of the beamline devices (scattering, scanning, multi-leaf collimator, aperture, compensator) at the isocenter and at 20, 40 and 60 cm distances from the isocenter, and we compared our results with those of other research groups. Next, we investigated the neutron dose at the X-ray equipment used for real-time imaging under various treatment conditions. Our investigation showed doses of 0.07 ~ 0.19 mSv/Gy at the X-ray imaging equipment, depending on the treatment option and interestingly, the 50% neutron dose reduction was observed due to multileaf collimator during proton scanning treatment with the multipurpose nozzle. In future studies, we plan to measure the neutron dose experimentally and to validate the simulation data for X-ray imaging equipment for use as an additional neutron dose reduction method.
Directory of Open Access Journals (Sweden)
Mondal Nagendra
2009-01-01
Full Text Available This study presents Monte Carlo Simulation (MCS results of detection efficiencies, spatial resolutions and resolving powers of a time-of-flight (TOF PET detector systems. Cerium activated Lutetium Oxyorthosilicate (Lu 2 SiO 5 : Ce in short LSO, Barium Fluoride (BaF 2 and BriLanCe 380 (Cerium doped Lanthanum tri-Bromide, in short LaBr 3 scintillation crystals are studied in view of their good time and energy resolutions and shorter decay times. The results of MCS based on GEANT show that spatial resolution, detection efficiency and resolving power of LSO are better than those of BaF 2 and LaBr 3 , although it possesses inferior time and energy resolutions. Instead of the conventional position reconstruction method, newly established image reconstruction (talked about in the previous work method is applied to produce high-tech images. Validation is a momentous step to ensure that this imaging method fulfills all purposes of motivation discussed by reconstructing images of two tumors in a brain phantom.
Branco, Susana; Almeida, Pedro; Jan, Sébastien
2011-01-01
The rapid growth in genetics and molecular biology combined with the development of techniques for genetically engineering small animals has led to increased interest in in vivo small animal imaging. Small animal imaging has been applied frequently to the imaging of small animals (mice and rats), which are ubiquitous in modeling human diseases and testing treatments. The use of PET in small animals allows the use of subjects as their own control, reducing the interanimal variability....
Sensitivity study for CT image use in Monte Carlo treatment planning
Verhaegen, Frank; Devic, Slobodan
2005-03-01
An important step in Monte Carlo treatment planning (MCTP), which is commonly performed uncritically, is segmentation of the patient CT data into a voxel phantom for dose calculation. In addition to assigning mass densities to voxels, as is done in conventional TP, this entails assigning media. Mis-assignment of media can potentially lead to significant dose errors in MCTP. In this work, a test phantom with exact-known composition was used to study CT segmentation errors and to quantify subsequent MCTP inaccuracies. For our test cases, we observed dose errors in some regions of up to 10% for 6 and 15 MV photons, more than 30% for an 18 MeV electron beam and more than 40% for 250 kVp photons. It is concluded that a careful CT calibration with a suitable phantom is essential. Generic calibrations and the use of commercial CT phantoms have to be critically assessed.
International Nuclear Information System (INIS)
Monte Carlo simulation techniques are applied to track the annihilation photons from positron decay, and store the photon histories. Reasonably realistic models of the isotope distribution in the brain and heart during typical PET studies, as well as the traditional phantoms used for measuring PET scanner performance can be built out of up to 10 hollow or solid cylinders. Separate programs model the source distribution and its attenuation characteristics, the collimators and the detectors. These modules are connected by compact gamma history files which are stored on disc or tape. Over 50 million gamma ray histories can be saved on a 1 Gbyte disc, representing the decay of several billion atoms. This allows for good precision even for single thin slices in scanners with wide axial acceptance. The simulation results include spectrum analysis, sensitivity to true coincident events, scattered coincident and single rays, and the effects on these parameters of detector dead time. (author)
DEFF Research Database (Denmark)
Xie, Haiyan; Liu, Haichun; Svenmarker, Pontus; Axelsson, Johan; Xu, Can T.; Gräfe, Susanna; Lundeman, Jesper Holm; Cheng, Haynes Pak Hay; Svanberg, Sune; Bendsoe, Niels; Andersen, Peter E.; Svanberg, Katarina; Andersson-Engels, Stefan
2011-01-01
light by turbid media, such as biological tissue, the detected fluorescence signal does not have a simple and unique dependence on the fluorophore concentration for different tissues, but depends in a complex way on other parameters as well. For this reason, little has been done on drug quantification...... in vivo by the fluorescence imaging technique. In this paper we present a novel approach to compensate for the light absorption in homogeneous turbid media both for the excitation and emission light, utilizing time-resolved fluorescence white Monte Carlo simulations combined with the Beer-Lambert law....... This method shows that the corrected fluorescence intensity is almost proportional to the absolute fluorophore concentration. The results on controllable tissue phantoms and murine tissues are presented and show good correlations between the evaluated fluorescence intensities after the light...
Cerenkov luminescence imaging of human breast cancer: a Monte Carlo simulations study
Boschi, F.; Pagliazzi, M.; Spinelli, A. E.
2016-03-01
Cerenkov luminescence imaging (CLI) is a novel molecular imaging technique based on the detection of Cerenkov light produced by beta particles traveling through biological tissues. In this paper we simulated using 18F and 90Y the possibility of detecting Cerenkov luminescence in human breast tissues, in order to evaluate the potential of the CLI technique in a clinical setting. A human breast digital phantom was obtained from an 18F-FDG CT-PET scan. The spectral features of the breast surface emission were obtained as well as the simulated images obtainable by a cooled CCD detector. The simulated images revealed a signal to noise ratio equal to 6 for a 300 s of acquisition time. We concluded that a dedicated human Cerenkov imaging detector can be designed in order to offer a valid low cost alternative to diagnostic techniques in nuclear medicine, in particular allowing the detection of beta-minus emitters used in radiotherapy.
Cerenkov luminescence imaging of human breast cancer: a Monte Carlo simulations study
International Nuclear Information System (INIS)
Cerenkov luminescence imaging (CLI) is a novel molecular imaging technique based on the detection of Cerenkov light produced by beta particles traveling through biological tissues. In this paper we simulated using 18F and 90Y the possibility of detecting Cerenkov luminescence in human breast tissues, in order to evaluate the potential of the CLI technique in a clinical setting. A human breast digital phantom was obtained from an 18F-FDG CT-PET scan. The spectral features of the breast surface emission were obtained as well as the simulated images obtainable by a cooled CCD detector. The simulated images revealed a signal to noise ratio equal to 6 for a 300 s of acquisition time. We concluded that a dedicated human Cerenkov imaging detector can be designed in order to offer a valid low cost alternative to diagnostic techniques in nuclear medicine, in particular allowing the detection of beta-minus emitters used in radiotherapy
Golosio, Bruno; Schoonjans, Tom; Brunetti, Antonio; Oliva, Piernicola; Masala, Giovanni Luca
2014-03-01
The simulation of X-ray imaging experiments is often performed using deterministic codes, which can be relatively fast and easy to use. However, such codes are generally not suitable for the simulation of even slightly more complex experimental conditions, involving, for instance, first-order or higher-order scattering, X-ray fluorescence emissions, or more complex geometries, particularly for experiments that combine spatial resolution with spectral information. In such cases, simulations are often performed using codes based on the Monte Carlo method. In a simple Monte Carlo approach, the interaction position of an X-ray photon and the state of the photon after an interaction are obtained simply according to the theoretical probability distributions. This approach may be quite inefficient because the final channels of interest may include only a limited region of space or photons produced by a rare interaction, e.g., fluorescent emission from elements with very low concentrations. In the field of X-ray fluorescence spectroscopy, this problem has been solved by combining the Monte Carlo method with variance reduction techniques, which can reduce the computation time by several orders of magnitude. In this work, we present a C++ code for the general simulation of X-ray imaging and spectroscopy experiments, based on the application of the Monte Carlo method in combination with variance reduction techniques, with a description of sample geometry based on quadric surfaces. We describe the benefits of the object-oriented approach in terms of code maintenance, the flexibility of the program for the simulation of different experimental conditions and the possibility of easily adding new modules. Sample applications in the fields of X-ray imaging and X-ray spectroscopy are discussed. Catalogue identifier: AERO_v1_0 Program summary URL:http://cpc.cs.qub.ac.uk/summaries/AERO_v1_0.html Program obtainable from: CPC Program Library, Queen’s University, Belfast, N. Ireland
Jiang, Xu; Deng, Yong; Luo, Zhaoyang; Luo, Qingming
2015-10-01
The excessive time required by fluorescence diffuse optical tomography (fDOT) image reconstruction based on path-history fluorescence Monte Carlo model is its primary limiting factor. Herein, we present a method that accelerates fDOT image reconstruction. We employ three-level parallel architecture including multiple nodes in cluster, multiple cores in central processing unit (CPU), and multiple streaming multiprocessors in graphics processing unit (GPU). Different GPU memories are selectively used, the data-writing time is effectively eliminated, and the data transport per iteration is minimized. Simulation experiments demonstrated that this method can utilize general-purpose computing platforms to efficiently implement and accelerate fDOT image reconstruction, thus providing a practical means of using path-history-based fluorescence Monte Carlo model for fDOT imaging. PMID:26480115
International Nuclear Information System (INIS)
A Monte Carlo model of a novel electronic portal imaging device (EPID) has been developed using Geant4 and its performance for imaging and dosimetry applications in radiotherapy has been characterised. The EPID geometry is based on a physical prototype under ongoing investigation and comprises an array of plastic scintillating fibres in place of the metal plate/phosphor screen in standard EPIDs. Geometrical and optical transport parameters were varied to investigate their impact on imaging and dosimetry performance. Detection efficiency was most sensitive to variations in fibre length, achieving a peak value of 36% at 50 mm using 400 keV x-rays for the lengths considered. Increases in efficiency for longer fibres were partially offset by reductions in sensitivity. Removing the extra-mural absorber surrounding individual fibres severely decreased the modulation transfer function (MTF), highlighting its importance in maximising spatial resolution. Field size response and relative dose profile simulations demonstrated a water-equivalent dose response and thus the prototype’s suitability for dosimetry applications. Element-to-element mismatch between scintillating fibres and underlying photodiode pixels resulted in a reduced MTF for high spatial frequencies and quasi-periodic variations in dose profile response. This effect is eliminated when fibres are precisely matched to underlying pixels. Simulations strongly suggest that with further optimisation, this prototype EPID may be capable of simultaneous imaging and dosimetry in radiotherapy. (paper)
International Nuclear Information System (INIS)
Computed tomography has advanced very rapidly, resulting in a rapid increase in the number of examinations performed, as well as in the range of clinical applications now based on this technique. CT is a procedure generally associated with relatively high dose levels. It constitutes 5% of worldwide radiological examinations; yet contributes one third of the collective dose associated with medical imaging. Therefore, studies on the issue of dose optimization are welcome. Computer simulation uses theoretical models to predict performance of real systems. This kind of simulation is able, among other tasks, to evaluate the impact of geometric parameters of a tomograph on spatial resolution, for example: detector size; source-detector distance and source-isocentre distance. The development of new pre-process and image reconstruction algorithms and the evaluation of dose in organs and tissues of human body are tasks that have been developed by means of computer simulations. In this work, conventional CT acquisitions were completely simulated. Monte Carlo techniques were applied in radiation transport simulation to obtain projection data and to calculate conversion coefficients for doses in organs and tissues of a female anthropomorphic phantom. The computer simulation of the entire measurement and reconstruction process is a valuable and very effective means to evaluate the effects of individual parameters. A graphical reconstruction algorithm was used to obtain virtual tomographic images, the attenuation values of each projection were added to the related pixel of the reconstruction matrix. Several different geometries were simulated and the effect of geometric parameters on image quality and dose in organs and tissues was evaluated. (author)
Shirakawa, Seiji; Tadokoro, Masanori; Hashimoto, Hiroshi; Ushiroda, Tomoya; Toyama, Hiroshi
2015-01-01
In this study, we devised and evaluated a method for attenuation correction of the hot spot in (111)In planar images. By use of the difference in transmittance between two energies (171 and 245 keV), the depth of the hot spot was calculated. Planar images of point sources in a numerical phantom (water) with depths from 0 to 20 cm at 2 cm intervals were prepared by Monte Carlo simulation. From the linear attenuation coefficient of the two energies and the 171/245 keV count ratio-depth relationship, the depth of the point source was calculated, and an attenuation correction was performed. A simulation was made under conditions taking into account both attenuation and scatter (A(+)S(+)) and attenuation alone (A(+)S(-)). The attenuation correction was evaluated with use of corrected and true counts obtained from homogeneous phantoms mimicking attenuation in soft tissue, bone, and the lungs, and heterogeneous phantoms prepared by combining them. In the A(+)S(+) condition, images were affected markedly by scattered photons in all phantoms at depths of 4-8 cm. The errors at depths of 10 cm or greater were within ±10 % in water and within ±6 % in soft tissue. However, the errors were about -30 % in bone and about +70 % in lung, indicating that scatter distributions different from those in water increased the errors. In the A(+)S(-) condition, the errors were within ±5 % in all homogeneous and heterogeneous phantoms, and satisfactory results were obtained. Precise attenuation correction of scatter-corrected planar images was confirmed to be possible with this method. PMID:25149323
Monte Carlo simulations of the dose from imaging with GE eXplore 120 micro-CT using GATE
Energy Technology Data Exchange (ETDEWEB)
Bretin, Florian; Bahri, Mohamed Ali; Luxen, André; Phillips, Christophe; Plenevaux, Alain; Seret, Alain, E-mail: aseret@ulg.ac.be [Cyclotron Research Centre, University of Liège, Sart Tilman B30, Liège 4000 (Belgium)
2015-10-15
Purpose: Small animals are increasingly used as translational models in preclinical imaging studies involving microCT, during which the subjects can be exposed to large amounts of radiation. While the radiation levels are generally sublethal, studies have shown that low-level radiation can change physiological parameters in mice. In order to rule out any influence of radiation on the outcome of such experiments, or resulting deterministic effects in the subjects, the levels of radiation involved need to be addressed. The aim of this study was to investigate the radiation dose delivered by the GE eXplore 120 microCT non-invasively using Monte Carlo simulations in GATE and to compare results to previously obtained experimental values. Methods: Tungsten X-ray spectra were simulated at 70, 80, and 97 kVp using an analytical tool and their half-value layers were simulated for spectra validation against experimentally measured values of the physical X-ray tube. A Monte Carlo model of the microCT system was set up and four protocols that are regularly applied to live animal scanning were implemented. The computed tomography dose index (CTDI) inside a PMMA phantom was derived and multiple field of view acquisitions were simulated using the PMMA phantom, a representative mouse and rat. Results: Simulated half-value layers agreed with experimentally obtained results within a 7% error window. The CTDI ranged from 20 to 56 mGy and closely matched experimental values. Derived organ doses in mice reached 459 mGy in bones and up to 200 mGy in soft tissue organs using the highest energy protocol. Dose levels in rats were lower due to the increased mass of the animal compared to mice. The uncertainty of all dose simulations was below 14%. Conclusions: Monte Carlo simulations proved a valuable tool to investigate the 3D dose distribution in animals from microCT. Small animals, especially mice (due to their small volume), receive large amounts of radiation from the GE eXplore 120
Monte Carlo simulations of the dose from imaging with GE eXplore 120 micro-CT using GATE
International Nuclear Information System (INIS)
Purpose: Small animals are increasingly used as translational models in preclinical imaging studies involving microCT, during which the subjects can be exposed to large amounts of radiation. While the radiation levels are generally sublethal, studies have shown that low-level radiation can change physiological parameters in mice. In order to rule out any influence of radiation on the outcome of such experiments, or resulting deterministic effects in the subjects, the levels of radiation involved need to be addressed. The aim of this study was to investigate the radiation dose delivered by the GE eXplore 120 microCT non-invasively using Monte Carlo simulations in GATE and to compare results to previously obtained experimental values. Methods: Tungsten X-ray spectra were simulated at 70, 80, and 97 kVp using an analytical tool and their half-value layers were simulated for spectra validation against experimentally measured values of the physical X-ray tube. A Monte Carlo model of the microCT system was set up and four protocols that are regularly applied to live animal scanning were implemented. The computed tomography dose index (CTDI) inside a PMMA phantom was derived and multiple field of view acquisitions were simulated using the PMMA phantom, a representative mouse and rat. Results: Simulated half-value layers agreed with experimentally obtained results within a 7% error window. The CTDI ranged from 20 to 56 mGy and closely matched experimental values. Derived organ doses in mice reached 459 mGy in bones and up to 200 mGy in soft tissue organs using the highest energy protocol. Dose levels in rats were lower due to the increased mass of the animal compared to mice. The uncertainty of all dose simulations was below 14%. Conclusions: Monte Carlo simulations proved a valuable tool to investigate the 3D dose distribution in animals from microCT. Small animals, especially mice (due to their small volume), receive large amounts of radiation from the GE eXplore 120
Efthimiou, N.; Papadimitroulas, P.; Kostou, T.; Loudos, G.
2015-09-01
Commercial clinical and preclinical PET scanners rely on the full cylindrical geometry for whole body scans as well as for dedicated organs. In this study we propose the construction of a low cost dual-head C-shaped PET system dedicated for small animal brain imaging. Monte Carlo simulation studies were performed using GATE toolkit to evaluate the optimum design in terms of sensitivity, distortions in the FOV and spatial resolution. The PET model is based on SiPMs and BGO pixelated arrays. Four different configurations with C- angle 0°, 15°, 30° and 45° within the modules, were considered. Geometrical phantoms were used for the evaluation process. STIR software, extended by an efficient multi-threaded ray tracing technique, was used for the image reconstruction. The algorithm automatically adjusts the size of the FOV according to the shape of the detector's geometry. The results showed improvement in sensitivity of ∼15% in case of 45° C-angle compared to the 0° case. The spatial resolution was found 2 mm for 45° C-angle.
International Nuclear Information System (INIS)
This work presents an innovative study to find out the adequate scintillation inorganic detector array to be used coupled to a specific light photo sensor, a charge coupled device (CCD), through a fiber optic plate. The goal is to choose the type of detector that fits a 2-dimensional imaging acquisition of a cell thyroid tissue application with high resolution and detection efficiency in order to map a follicle image using gamma radiation emission. A point or volumetric source - detector simulation by using a MCNP4B general code, considering different source energies, detector materials and geometry including pixel sizes and reflector types was performed. In this study, simulations were performed for 7 x 7 and 127 x 127 arrays using CsI(Tl) and BGO scintillation crystals with pixel size ranging from 1 x 1 cm2 to 10 x 10 μm2 and radiation thickness ranging from 1 mm to 10 mm. The effect of all these parameters was investigated to find the best source-detector system that result in an image with the best contrast details. The results showed that it is possible to design a specific imaging system that allows searching for in-vitro studies, specifically in radiobiology applied to endocrine physiology. (author)
Energy Technology Data Exchange (ETDEWEB)
Silva, Carlos Borges da; Santanna, Claudio Reis de [Instituto de Engenharia Nuclear (IEN/CNEN-RJ), Rio de Janeiro, RJ (Brazil)]. E-mails: borges@ien.gov.br; santanna@ien.gov.br; Braz, Delson [Universidade Federal do Rio de Janeiro (UFRJ), RJ (Brazil). Coordenacao dos Programas de Pos-graduacao de Engenharia (COPPE). Lab. de Instrumentacao Nuclear]. E-mail: delson@lin.ufrj.br; Carvalho, Denise Pires de [Universidade Federal do Rio de Janeiro (UFRJ), RJ (Brazil). Inst. de Biofisica Carlos Chagas Filho. Lab. de Fisiologia Endocrina]. E-mail: dencarv@ufrj.br
2007-07-01
This work presents an innovative study to find out the adequate scintillation inorganic detector array to be used coupled to a specific light photo sensor, a charge coupled device (CCD), through a fiber optic plate. The goal is to choose the type of detector that fits a 2-dimensional imaging acquisition of a cell thyroid tissue application with high resolution and detection efficiency in order to map a follicle image using gamma radiation emission. A point or volumetric source - detector simulation by using a MCNP4B general code, considering different source energies, detector materials and geometry including pixel sizes and reflector types was performed. In this study, simulations were performed for 7 x 7 and 127 x 127 arrays using CsI(Tl) and BGO scintillation crystals with pixel size ranging from 1 x 1 cm{sup 2} to 10 x 10 {mu}m{sup 2} and radiation thickness ranging from 1 mm to 10 mm. The effect of all these parameters was investigated to find the best source-detector system that result in an image with the best contrast details. The results showed that it is possible to design a specific imaging system that allows searching for in-vitro studies, specifically in radiobiology applied to endocrine physiology. (author)
International Nuclear Information System (INIS)
Based on the digital image analysis and inverse Monte-Carlo method, the proximate analysis method is deve-loped and the optical properties of hairs of different types are estimated in three spectral ranges corresponding to three colour components. The scattering and absorption properties of hairs are separated for the first time by using the inverse Monte-Carlo method. The content of different types of melanin in hairs is estimated from the absorption coefficient. It is shown that the dominating type of melanin in dark hairs is eumelanin, whereas in light hairs pheomelanin dominates. (special issue devoted to multiple radiation scattering in random media)
Framework for the construction of a Monte Carlo simulated brain PET–MR image database
International Nuclear Information System (INIS)
Simultaneous PET–MR acquisition reduces the possibility of registration mismatch between the two modalities. This facilitates the application of techniques, either during reconstruction or post-reconstruction, that aim to improve the PET resolution by utilising structural information provided by MR. However, in order to validate such methods for brain PET–MR studies it is desirable to evaluate the performance using data where the ground truth is known. In this work, we present a framework for the production of datasets where simulations of both the PET and MR, based on real data, are generated such that reconstruction and post-reconstruction approaches can be fairly compared. -- Highlights: • A framework for simulating realistic brain PET–MR images is proposed. • The imaging data created is formed from real acquisitions. • Partial volume correction techniques can be fairly compared using this framework
3D imaging using combined neutron-photon fan-beam tomography: A Monte Carlo study.
Hartman, J; Yazdanpanah, A Pour; Barzilov, A; Regentova, E
2016-05-01
The application of combined neutron-photon tomography for 3D imaging is examined using MCNP5 simulations for objects of simple shapes and different materials. Two-dimensional transmission projections were simulated for fan-beam scans using 2.5MeV deuterium-deuterium and 14MeV deuterium-tritium neutron sources, and high-energy X-ray sources, such as 1MeV, 6MeV and 9MeV. Photons enable assessment of electron density and related mass density, neutrons aid in estimating the product of density and material-specific microscopic cross section- the ratio between the two provides the composition, while CT allows shape evaluation. Using a developed imaging technique, objects and their material compositions have been visualized. PMID:26953978
Thompson, C J; Moreno-Cantu, J; Picard, Y
1992-03-01
Monte Carlo simulation techniques are applied to track the annihilation photons from positron decay, and store the photon histories. Reasonably realistic models of the isotope distribution in the brain and heart during typical PET studies, as well as the traditional phantoms used for measuring PET scanner performance can be built out of up to 10 hollow or solid cylinders. Separate programs model the source distribution and its attenuation characteristics, the collimators and the detectors. These modules are connected by compact gamma history files which are stored on disc or tape. Over 50 million gamma ray histories can be saved on a 1 Gbyte disc, representing the decay of several billion atoms. This allows for good precision even for single thin slices in scanners with wide axial acceptance. The simulation results include spectrum analysis, sensitivity to true coincident events, scattered coincident and single rays, and the effects on these parameters of detector dead time. The storage of intermediate results on tape reduces simulation time, since most common source geometries need be generated only once. The sensitivities in multi-slice systems are presented as matrices of coincident crystal planes. The matrix shows the true count sensitivity and the scatter fraction together for each valid combination of planes. This presentation is very useful for assessing the effects of various degrees of inter-plane collimation. The spatial resolution analysis includes the effects of positron range, non-collinearity of the gamma rays, multiple interaction within the detectors, and the effects of quantization into single crystals in multiple-crystal block detectors. Each of these effects can be turned on or off without repeating the simulation. Both in-plane and axial resolutions are calculated as a function of location of the positron-emitting nucleus and the angle of incidence of gamma rays on the crystals. Single crystals, blocks and crystals with depth of interaction
International Nuclear Information System (INIS)
Purpose: Monte Carlo simulations play a vital role in the understanding of the fundamental limitations, design, and optimization of existing and emerging medical imaging systems. Efforts in this area have resulted in the development of a wide variety of open-source software packages. One such package, hybridMANTIS, uses a novel hybrid concept to model indirect scintillator detectors by balancing the computational load using dual CPU and graphics processing unit (GPU) processors, obtaining computational efficiency with reasonable accuracy. In this work, the authors describe two open-source visualization interfaces, webMANTIS and visualMANTIS to facilitate the setup of computational experiments via hybridMANTIS. Methods: The visualization tools visualMANTIS and webMANTIS enable the user to control simulation properties through a user interface. In the case of webMANTIS, control via a web browser allows access through mobile devices such as smartphones or tablets. webMANTIS acts as a server back-end and communicates with an NVIDIA GPU computing cluster that can support multiuser environments where users can execute different experiments in parallel. Results: The output consists of point response and pulse-height spectrum, and optical transport statistics generated by hybridMANTIS. The users can download the output images and statistics through a zip file for future reference. In addition, webMANTIS provides a visualization window that displays a few selected optical photon path as they get transported through the detector columns and allows the user to trace the history of the optical photons. Conclusions: The visualization tools visualMANTIS and webMANTIS provide features such as on the fly generation of pulse-height spectra and response functions for microcolumnar x-ray imagers while allowing users to save simulation parameters and results from prior experiments. The graphical interfaces simplify the simulation setup and allow the user to go directly from specifying
Energy Technology Data Exchange (ETDEWEB)
Dong, Han; Sharma, Diksha; Badano, Aldo, E-mail: aldo.badano@fda.hhs.gov [Division of Imaging, Diagnostics, and Software Reliability, Center for Devices and Radiological Health, U.S. Food and Drug Administration, Silver Spring, Maryland 20993 (United States)
2014-12-15
Purpose: Monte Carlo simulations play a vital role in the understanding of the fundamental limitations, design, and optimization of existing and emerging medical imaging systems. Efforts in this area have resulted in the development of a wide variety of open-source software packages. One such package, hybridMANTIS, uses a novel hybrid concept to model indirect scintillator detectors by balancing the computational load using dual CPU and graphics processing unit (GPU) processors, obtaining computational efficiency with reasonable accuracy. In this work, the authors describe two open-source visualization interfaces, webMANTIS and visualMANTIS to facilitate the setup of computational experiments via hybridMANTIS. Methods: The visualization tools visualMANTIS and webMANTIS enable the user to control simulation properties through a user interface. In the case of webMANTIS, control via a web browser allows access through mobile devices such as smartphones or tablets. webMANTIS acts as a server back-end and communicates with an NVIDIA GPU computing cluster that can support multiuser environments where users can execute different experiments in parallel. Results: The output consists of point response and pulse-height spectrum, and optical transport statistics generated by hybridMANTIS. The users can download the output images and statistics through a zip file for future reference. In addition, webMANTIS provides a visualization window that displays a few selected optical photon path as they get transported through the detector columns and allows the user to trace the history of the optical photons. Conclusions: The visualization tools visualMANTIS and webMANTIS provide features such as on the fly generation of pulse-height spectra and response functions for microcolumnar x-ray imagers while allowing users to save simulation parameters and results from prior experiments. The graphical interfaces simplify the simulation setup and allow the user to go directly from specifying
Energy Technology Data Exchange (ETDEWEB)
Hristov, D; Schlosser, J; Bazalova, M [Stanford Universtiy, Stanford, CA (United States); Chen, J [UCSF Comprehensive Cancer Center, Lafayette, CA (United States)
2014-06-01
Purpose: To quantify the effect of ultrasound (US) probe beam attenuation for radiation therapy delivered under real-time US image guidance by means of Monte Carlo (MC) simulations. Methods: MC models of two Philips US probes, an X6-1 matrix-array transducer and a C5-2 curved-array transducer, were built based on their CT images in the EGSnrc BEAMnrc and DOSXYZnrc codes. Due to the metal parts, the probes were scanned in a Tomotherapy machine with a 3.5 MV beam. Mass densities in the probes were assigned based on an electron density calibration phantom consisting of cylinders with mass densities between 0.2–8.0 g/cm{sup 3}. Beam attenuation due to the probes was measured in a solid water phantom for a 6 MV and 15 MV 15x15 cm{sup 2} beam delivered on a Varian Trilogy linear accelerator. The dose was measured with the PTW-729 ionization chamber array at two depths and compared to MC simulations. The extreme case beam attenuation expected in robotic US image guided radiotherapy for probes in upright position was quantified by means of MC simulations. Results: The 3.5 MV CT number to mass density calibration curve was found to be linear with R{sup 2} > 0.99. The maximum mass densities were 4.6 and 4.2 g/cm{sup 3} in the C5-2 and X6-1 probe, respectively. Gamma analysis of the simulated and measured doses revealed that over 98% of measurement points passed the 3%/3mm criteria for both probes and measurement depths. The extreme attenuation for probes in upright position was found to be 25% and 31% for the C5-2 and X6-1 probe, respectively, for both 6 and 15 MV beams at 10 cm depth. Conclusion: MC models of two US probes used for real-time image guidance during radiotherapy have been built. As a Result, radiotherapy treatment planning with the imaging probes in place can now be performed. J Schlosser is an employee of SoniTrack Systems, Inc. D Hristov has financial interest in SoniTrack Systems, Inc.
TH-A-18C-09: Ultra-Fast Monte Carlo Simulation for Cone Beam CT Imaging of Brain Trauma
International Nuclear Information System (INIS)
Purpose: Application of cone-beam CT (CBCT) to low-contrast soft tissue imaging, such as in detection of traumatic brain injury, is challenged by high levels of scatter. A fast, accurate scatter correction method based on Monte Carlo (MC) estimation is developed for application in high-quality CBCT imaging of acute brain injury. Methods: The correction involves MC scatter estimation executed on an NVIDIA GTX 780 GPU (MC-GPU), with baseline simulation speed of ~1e7 photons/sec. MC-GPU is accelerated by a novel, GPU-optimized implementation of variance reduction (VR) techniques (forced detection and photon splitting). The number of simulated tracks and projections is reduced for additional speed-up. Residual noise is removed and the missing scatter projections are estimated via kernel smoothing (KS) in projection plane and across gantry angles. The method is assessed using CBCT images of a head phantom presenting a realistic simulation of fresh intracranial hemorrhage (100 kVp, 180 mAs, 720 projections, source-detector distance 700 mm, source-axis distance 480 mm). Results: For a fixed run-time of ~1 sec/projection, GPU-optimized VR reduces the noise in MC-GPU scatter estimates by a factor of 4. For scatter correction, MC-GPU with VR is executed with 4-fold angular downsampling and 1e5 photons/projection, yielding 3.5 minute run-time per scan, and de-noised with optimized KS. Corrected CBCT images demonstrate uniformity improvement of 18 HU and contrast improvement of 26 HU compared to no correction, and a 52% increase in contrast-tonoise ratio in simulated hemorrhage compared to “oracle” constant fraction correction. Conclusion: Acceleration of MC-GPU achieved through GPU-optimized variance reduction and kernel smoothing yields an efficient (<5 min/scan) and accurate scatter correction that does not rely on additional hardware or simplifying assumptions about the scatter distribution. The method is undergoing implementation in a novel CBCT dedicated to brain
TH-A-18C-09: Ultra-Fast Monte Carlo Simulation for Cone Beam CT Imaging of Brain Trauma
Energy Technology Data Exchange (ETDEWEB)
Sisniega, A; Zbijewski, W; Stayman, J [Department of Biomedical Engineering, Johns Hopkins University (United States); Yorkston, J [Carestream Health (United States); Aygun, N [Department of Radiology, Johns Hopkins University (United States); Koliatsos, V [Department of Neurology, Johns Hopkins University (United States); Siewerdsen, J [Department of Biomedical Engineering, Johns Hopkins University (United States); Department of Radiology, Johns Hopkins University (United States)
2014-06-15
Purpose: Application of cone-beam CT (CBCT) to low-contrast soft tissue imaging, such as in detection of traumatic brain injury, is challenged by high levels of scatter. A fast, accurate scatter correction method based on Monte Carlo (MC) estimation is developed for application in high-quality CBCT imaging of acute brain injury. Methods: The correction involves MC scatter estimation executed on an NVIDIA GTX 780 GPU (MC-GPU), with baseline simulation speed of ~1e7 photons/sec. MC-GPU is accelerated by a novel, GPU-optimized implementation of variance reduction (VR) techniques (forced detection and photon splitting). The number of simulated tracks and projections is reduced for additional speed-up. Residual noise is removed and the missing scatter projections are estimated via kernel smoothing (KS) in projection plane and across gantry angles. The method is assessed using CBCT images of a head phantom presenting a realistic simulation of fresh intracranial hemorrhage (100 kVp, 180 mAs, 720 projections, source-detector distance 700 mm, source-axis distance 480 mm). Results: For a fixed run-time of ~1 sec/projection, GPU-optimized VR reduces the noise in MC-GPU scatter estimates by a factor of 4. For scatter correction, MC-GPU with VR is executed with 4-fold angular downsampling and 1e5 photons/projection, yielding 3.5 minute run-time per scan, and de-noised with optimized KS. Corrected CBCT images demonstrate uniformity improvement of 18 HU and contrast improvement of 26 HU compared to no correction, and a 52% increase in contrast-tonoise ratio in simulated hemorrhage compared to “oracle” constant fraction correction. Conclusion: Acceleration of MC-GPU achieved through GPU-optimized variance reduction and kernel smoothing yields an efficient (<5 min/scan) and accurate scatter correction that does not rely on additional hardware or simplifying assumptions about the scatter distribution. The method is undergoing implementation in a novel CBCT dedicated to brain
Energy Technology Data Exchange (ETDEWEB)
Barrera, C A; Moran, M J
2007-08-21
The Neutron Imaging System (NIS) is one of seven ignition target diagnostics under development for the National Ignition Facility. The NIS is required to record hot-spot (13-15 MeV) and downscattered (6-10 MeV) images with a resolution of 10 microns and a signal-to-noise ratio (SNR) of 10 at the 20% contour. The NIS is a valuable diagnostic since the downscattered neutrons reveal the spatial distribution of the cold fuel during an ignition attempt, providing important information in the case of a failed implosion. The present study explores the parameter space of several line-of-sight (LOS) configurations that could serve as the basis for the final design. Six commercially available organic scintillators were experimentally characterized for their light emission decay profile and neutron sensitivity. The samples showed a long lived decay component that makes direct recording of a downscattered image impossible. The two best candidates for the NIS detector material are: EJ232 (BC422) plastic fibers or capillaries filled with EJ399B. A Monte Carlo-based end-to-end model of the NIS was developed to study the imaging capabilities of several LOS configurations and verify that the recovered sources meet the design requirements. The model includes accurate neutron source distributions, aperture geometries (square pinhole, triangular wedge, mini-penumbral, annular and penumbral), their point spread functions, and a pixelated scintillator detector. The modeling results show that a useful downscattered image can be obtained by recording the primary peak and the downscattered images, and then subtracting a decayed version of the former from the latter. The difference images need to be deconvolved in order to obtain accurate source distributions. The images are processed using a frequency-space modified-regularization algorithm and low-pass filtering. The resolution and SNR of these sources are quantified by using two surrogate sources. The simulations show that all LOS
Directory of Open Access Journals (Sweden)
David G. Gadian
2011-10-01
Full Text Available A common feature of many magnetic resonance image (MRI data processing methods is the voxel-by-voxel (a voxel is a volume element manner in which the processing is performed. In general, however, MRI data are expected to exhibit some level of spatial correlation, rendering an independent-voxels treatment inefficient in its use of the data. Bayesian random effect models are expected to be more efficient owing to their information-borrowing behaviour. To illustrate the Bayesian random effects approach, this paper outlines a Markov chain Monte Carlo (MCMC analysis of a perfusion MRI dataset, implemented in R using the BRugs package. BRugs provides an interface to WinBUGS and its GeoBUGS add-on. WinBUGS is a widely used programme for performing MCMC analyses, with a focus on Bayesian random effect models. A simultaneous modeling of both voxels (restricted to a region of interest and multiple subjects is demonstrated. Despite the low signal-to-noise ratio in the magnetic resonance signal intensity data, useful model signal intensity profiles are obtained. The merits of random effects modeling are discussed in comparison with the alternative approaches based on region-of-interest averaging and repeated independent voxels analysis. This paper focuses on perfusion MRI for the purpose of illustration, the main proposition being that random effects modeling is expected to be beneficial in many other MRI applications in which the signal-to-noise ratio is a limiting factor.
Energy Technology Data Exchange (ETDEWEB)
Fallahpoor, M; Abbasi, M [Tehran University of Medical Sciences, Vali-Asr Hospital, Tehran, Tehran (Iran, Islamic Republic of); Sen, A [University of Houston, Houston, TX (United States); Parach, A [Shahid Sadoughi University of Medical Sciences, Yazd, Yazd (Iran, Islamic Republic of); Kalantari, F [UT Southwestern Medical Center, Dallas, TX (United States)
2015-06-15
Purpose: Patient-specific 3-dimensional (3D) internal dosimetry in targeted radionuclide therapy is essential for efficient treatment. Two major steps to achieve reliable results are: 1) generating quantitative 3D images of radionuclide distribution and attenuation coefficients and 2) using a reliable method for dose calculation based on activity and attenuation map. In this research, internal dosimetry for 153-Samarium (153-Sm) was done by SPECT-CT images coupled GATE Monte Carlo package for internal dosimetry. Methods: A 50 years old woman with bone metastases from breast cancer was prescribed 153-Sm treatment (Gamma: 103keV and beta: 0.81MeV). A SPECT/CT scan was performed with the Siemens Simbia-T scanner. SPECT and CT images were registered using default registration software. SPECT quantification was achieved by compensating for all image degrading factors including body attenuation, Compton scattering and collimator-detector response (CDR). Triple energy window method was used to estimate and eliminate the scattered photons. Iterative ordered-subsets expectation maximization (OSEM) with correction for attenuation and distance-dependent CDR was used for image reconstruction. Bilinear energy mapping is used to convert Hounsfield units in CT image to attenuation map. Organ borders were defined by the itk-SNAP toolkit segmentation on CT image. GATE was then used for internal dose calculation. The Specific Absorbed Fractions (SAFs) and S-values were reported as MIRD schema. Results: The results showed that the largest SAFs and S-values are in osseous organs as expected. S-value for lung is the highest after spine that can be important in 153-Sm therapy. Conclusion: We presented the utility of SPECT-CT images and Monte Carlo for patient-specific dosimetry as a reliable and accurate method. It has several advantages over template-based methods or simplified dose estimation methods. With advent of high speed computers, Monte Carlo can be used for treatment planning
International Nuclear Information System (INIS)
Purpose: Patient-specific 3-dimensional (3D) internal dosimetry in targeted radionuclide therapy is essential for efficient treatment. Two major steps to achieve reliable results are: 1) generating quantitative 3D images of radionuclide distribution and attenuation coefficients and 2) using a reliable method for dose calculation based on activity and attenuation map. In this research, internal dosimetry for 153-Samarium (153-Sm) was done by SPECT-CT images coupled GATE Monte Carlo package for internal dosimetry. Methods: A 50 years old woman with bone metastases from breast cancer was prescribed 153-Sm treatment (Gamma: 103keV and beta: 0.81MeV). A SPECT/CT scan was performed with the Siemens Simbia-T scanner. SPECT and CT images were registered using default registration software. SPECT quantification was achieved by compensating for all image degrading factors including body attenuation, Compton scattering and collimator-detector response (CDR). Triple energy window method was used to estimate and eliminate the scattered photons. Iterative ordered-subsets expectation maximization (OSEM) with correction for attenuation and distance-dependent CDR was used for image reconstruction. Bilinear energy mapping is used to convert Hounsfield units in CT image to attenuation map. Organ borders were defined by the itk-SNAP toolkit segmentation on CT image. GATE was then used for internal dose calculation. The Specific Absorbed Fractions (SAFs) and S-values were reported as MIRD schema. Results: The results showed that the largest SAFs and S-values are in osseous organs as expected. S-value for lung is the highest after spine that can be important in 153-Sm therapy. Conclusion: We presented the utility of SPECT-CT images and Monte Carlo for patient-specific dosimetry as a reliable and accurate method. It has several advantages over template-based methods or simplified dose estimation methods. With advent of high speed computers, Monte Carlo can be used for treatment planning
Wang, Yi; El-Mohri, Youcef; Antonuk, Larry E.; Zhao, Qihua
2010-07-01
The use of thick, segmented scintillators in electronic portal imagers offers the potential for significant improvement in x-ray detection efficiency compared to conventional phosphor screens. Such improvement substantially increases the detective quantum efficiency (DQE), leading to the possibility of achieving soft-tissue visualization at clinically practical (i.e. low) doses using megavoltage (MV) cone-beam computed tomography. While these DQE increases are greatest at zero spatial frequency, they are diminished at higher frequencies as a result of degradation of spatial resolution due to lateral spreading of secondary radiation within the scintillator—an effect that is more pronounced for thicker scintillators. The extent of this spreading is even more accentuated for radiation impinging the scintillator at oblique angles of incidence due to beam divergence. In this paper, Monte Carlo simulations of radiation transport, performed to investigate and quantify the effects of beam divergence on the imaging performance of MV imagers based on two promising scintillators (BGO and CsI:Tl), are reported. In these studies, 10-40 mm thick scintillators, incorporating low-density polymer, or high-density tungsten septal walls, were examined for incident angles corresponding to that encountered at locations up to ~15 cm from the central beam axis (for an imager located 130 cm from a radiotherapy x-ray source). The simulations demonstrate progressively more severe spatial resolution degradation (quantified in terms of the effect on the modulation transfer function) as a function of increasing angle of incidence (as well as of the scintillator thickness). Since the noise power behavior was found to be largely independent of the incident angle, the dependence of the DQE on the incident angle is therefore primarily determined by the spatial resolution. The observed DQE degradation suggests that 10 mm thick scintillators are not strongly affected by beam divergence for
Lazaro, D.; Buvat, I.; Loudos, G.; Strul, D.; Santin, G.; Giokaris, N.; Donnarieix, D.; Maigne, L.; Spanoudaki, V.; Styliaris, S.; Staelens, S.; Breton, V.
2004-01-01
Monte Carlo simulations are increasingly used in scintigraphic imaging to model imaging systems and to develop and assess tomographic reconstruction algorithms and correction methods for improved image quantitation. GATE (GEANT4 application for tomographic emission) is a new Monte Carlo simulation platform based on GEANT4 dedicated to nuclear imaging applications. This paper describes the GATE simulation of a prototype of scintillation camera dedicated to small-animal imaging and consisting of a CsI(Tl) crystal array coupled to a position-sensitive photomultiplier tube. The relevance of GATE to model the camera prototype was assessed by comparing simulated 99mTc point spread functions, energy spectra, sensitivities, scatter fractions and image of a capillary phantom with the corresponding experimental measurements. Results showed an excellent agreement between simulated and experimental data: experimental spatial resolutions were predicted with an error less than 100 µm. The difference between experimental and simulated system sensitivities for different source-to-collimator distances was within 2%. Simulated and experimental scatter fractions in a [98-182 keV] energy window differed by less than 2% for sources located in water. Simulated and experimental energy spectra agreed very well between 40 and 180 keV. These results demonstrate the ability and flexibility of GATE for simulating original detector designs. The main weakness of GATE concerns the long computation time it requires: this issue is currently under investigation by the GEANT4 and the GATE collaborations.
Lazaro, D; Buvat, I; Loudos, G; Strul, D; Santin, G; Giokaris, N; Donnarieix, D; Maigne, L; Spanoudaki, V; Styliaris, S; Staelens, S; Breton, V
2004-01-21
Monte Carlo simulations are increasingly used in scintigraphic imaging to model imaging systems and to develop and assess tomographic reconstruction algorithms and correction methods for improved image quantitation. GATE (GEANT4 application for tomographic emission) is a new Monte Carlo simulation platform based on GEANT4 dedicated to nuclear imaging applications. This paper describes the GATE simulation of a prototype of scintillation camera dedicated to small-animal imaging and consisting of a CsI(Tl) crystal array coupled to a position-sensitive photomultiplier tube. The relevance of GATE to model the camera prototype was assessed by comparing simulated 99mTc point spread functions, energy spectra, sensitivities, scatter fractions and image of a capillary phantom with the corresponding experimental measurements. Results showed an excellent agreement between simulated and experimental data: experimental spatial resolutions were predicted with an error less than 100 microns. The difference between experimental and simulated system sensitivities for different source-to-collimator distances was within 2%. Simulated and experimental scatter fractions in a [98-182 keV] energy window differed by less than 2% for sources located in water. Simulated and experimental energy spectra agreed very well between 40 and 180 keV. These results demonstrate the ability and flexibility of GATE for simulating original detector designs. The main weakness of GATE concerns the long computation time it requires: this issue is currently under investigation by the GEANT4 and the GATE collaborations. PMID:15083671
Energy Technology Data Exchange (ETDEWEB)
Moore, Stephen C. [Department of Radiology, Brigham and Women' s Hospital and Harvard Medical School, 75 Francis Street, Boston, MA 02115 (United States)]. E-mail: scmoore@bwh.harvard.edu; Ouyang, Jinsong [Department of Radiology, Brigham and Women' s Hospital and Harvard Medical School, 75 Francis Street, Boston, MA 02115 (United States); Park, Mi-Ae [Department of Radiology, Brigham and Women' s Hospital and Harvard Medical School, 75 Francis Street, Boston, MA 02115 (United States); El Fakhri, Georges [Department of Radiology, Brigham and Women' s Hospital and Harvard Medical School, 75 Francis Street, Boston, MA 02115 (United States)
2006-12-20
We have incorporated Monte Carlo (MC)-based estimates of patient scatter, detector scatter, and crosstalk into an iterative reconstruction algorithm, and compared its performance to that of a general spectral (GS) approach. We extended the MC-based reconstruction algorithm of de Jong et al. by (1) using the 'Delta scattering' method to determine photon interaction points (2) simulating scatter maps for many energy bins simultaneously, and (3) decoupling the simulation of the object and detector by using pre-stored point spread functions (PSF) that included all collimator and detector effects. A numerical phantom was derived from a segmented CT scan of a torso phantom. The relative values of In-111 activity concentration simulated in soft tissue, liver, spine, left lung, right lung, and five spherical tumors (1.3-2.0 cm diam.) were 1.0, 1.5, 1.5, 0.3, 0.5, and 10.0, respectively. GS scatter projections were incorporated additively in an OSEM reconstruction (6 subsetsx10 projectionsx2 photopeak windows). After three iterations, GS scatter projections were replaced by MC-estimated scatter projections for two additional iterations. MC-based compensation was quantitatively compared to GS-based compensation after five iterations. The bias of organ activity estimates ranged from -13% to -6.5% (GS), and from -1.4% to +5.0% (MC); tumor bias ranged from -20.0% to +10.0% for GS (mean{+-}std.dev.=-4.3{+-}11.9%), and from -2.2 to +18.8% for MC (+4.1{+-}8.6%). Image noise in all organs was less with MC than with GS.
Bozkurt, Ahmet
The distribution of absorbed doses in the body can be computationally determined using mathematical or tomographic representations of human anatomy. A whole- body model was developed from the color images of the National Library of Medicine's Visible Human Project® for simulating the transport of radiation in the human body. The model, called Visible Photographic Man (VIP-Man), has sixty-one organs and tissues represented in the Monte Carlo code MCNPX at 4-mm voxel resolution. Organ dose calculations from external neutron sources were carried out using VIP-man and MCNPX to determine a new set of dose conversion coefficients to be used in radiation protection. Monoenergetic neutron beams between 10-9 MeV and 10 GeV were studied under six different irradiation geometries: anterior-posterior, posterior-anterior, right lateral, left lateral, rotational and isotropic. The results for absorbed doses in twenty-four organs and the effective doses based on twelve critical organs are presented in tabular form. A comprehensive comparison of the results with those from the mathematical models show discrepancies that can be attributed to the variations in body modeling (size, location and shape of the individual organs) and the use of different nuclear datasets or models to derive the reaction cross sections, as well as the use of different transport packages for simulation radiation effects. The organ dose results based on the realistic VIP-Man body model allow the existing radiation protection dosimetry on neutrons to be re-evaluated and improved.
Study of the point spread function (PSF) for {sup 123}I SPECT imaging using Monte Carlo simulation
Energy Technology Data Exchange (ETDEWEB)
Cot, A [Departament de FIsica i Enginyeria Nuclear, Universitat Politecnica de Catalunya, Diagonal 647, 08028 Barcelona (Spain); Sempau, J [Institut de Tecniques Energetiques, Universitat Politecnica de Catalunya, Diagonal 647, 08028 Barcelona (Spain); Pareto, D [Unitat de BiofIsica i Bioenginyeria, Universitat de Barcelona, Casanova 143, 08036 Barcelona (Spain); Bullich, S [Unitat de BiofIsica i Bioenginyeria, Universitat de Barcelona, Casanova 143, 08036 Barcelona (Spain); PavIa, J [Servei de Medicina Nuclear, Hospital ClInic i Provincial de Barcelona, Villarroel 170, 08036 Barcelona (Spain); Calvino, F [Departament de FIsica i Enginyeria Nuclear, Universitat Politecnica de Catalunya, Diagonal 647, 08028 Barcelona (Spain); Ros, D [Unitat de BiofIsica i Bioenginyeria, Universitat de Barcelona, Casanova 143, 08036 Barcelona (Spain)
2004-07-21
The iterative reconstruction algorithms employed in brain single-photon emission computed tomography (SPECT) allow some quantitative parameters of the image to be improved. These algorithms require accurate modelling of the so-called point spread function (PSF). Nowadays, most in vivo neurotransmitter SPECT studies employ pharmaceuticals radiolabelled with {sup 123}I. In addition to an intense line at 159 keV, the decay scheme of this radioisotope includes some higher energy gammas which may have a non-negligible contribution to the PSF. The aim of this work is to study this contribution for two low-energy high-resolution collimator configurations, namely, the parallel and the fan beam. The transport of radiation through the material system is simulated with the Monte Carlo code PENELOPE. We have developed a main program that deals with the intricacies associated with tracking photon trajectories through the geometry of the collimator and detection systems. The simulated PSFs are partly validated with a set of experimental measurements that use the 511 keV annihilation photons emitted by a {sup 18}F source. Sensitivity and spatial resolution have been studied, showing that a significant fraction of the detection events in the energy window centred at 159 keV (up to approximately 49% for the parallel collimator) are originated by higher energy gamma rays, which contribute to the spatial profile of the PSF mostly outside the 'geometrical' region dominated by the low-energy photons. Therefore, these high-energy counts are to be considered as noise, a fact that should be taken into account when modelling PSFs for reconstruction algorithms. We also show that the fan beam collimator gives higher signal-to-noise ratios than the parallel collimator for all the source positions analysed.
Directory of Open Access Journals (Sweden)
Kohei Arai
2013-04-01
Full Text Available Comparative study on linear and nonlinear mixed pixel models of which pixels in remote sensing satellite images is composed with plural ground cover materials mixed together, is conducted for remote sensing satellite image analysis. The mixed pixel models are based on Cierniewski of ground surface reflectance model. The comparative study is conducted by using of Monte Carlo Ray Tracing: MCRT simulations. Through simulation study, the difference between linear and nonlinear mixed pixel models is clarified. Also it is found that the simulation model is validated.
Fast Monte Carlo based joint iterative reconstruction for simultaneous 99mTc/ 123I SPECT imaging.
Ouyang, Jinsong; El Fakhri, Georges; Moore, Stephen C
2007-08-01
Simultaneous 99mTC/ 123I SPECT allows the assessment of two physiological functions under identical conditions. The separation of these radionuclides is difficult, however, because their energies are close. Most energy-window-based scatter correction methods do not fully model either physical factors or patient-specific activity and attenuation distributions. We have developed a fast Monte Carlo (MC) simulation-based multiple-radionuclide and multiple-energy joint ordered-subset expectation-maximization (JOSEM) iterative reconstruction algorithm, MC-JOSEM. MC-JOSEM simultaneously corrects for scatter and cross talk as well as detector response within the reconstruction algorithm. We evaluated MC-JOSEM for simultaneous brain profusion (99mTc-HMPAO) and neurotransmission (123I-altropane) SPECT. MC simulations of 99mTc and 123I studies were generated separately and then combined to mimic simultaneous 99mTc/ 123I SPECT. All the details of photon transport through the brain, the collimator, and detector, including Compton and coherent scatter, septal penetration, and backscatter from components behind the crystal, were modeled. We reconstructed images from simultaneous dual-radionuclide projections in three ways. First, we reconstructed the photopeak-energy-window projections (with an asymmetric energy window for 1231) using the standard ordered-subsets expectation-maximization algorithm (NSC-OSEM). Second, we used standard OSEM to reconstruct 99mTc photopeak-energy-window projections, while including an estimate of scatter from a Compton-scatter energy window (SC-OSEM). Third, we jointly reconstructed both 99mTc and 123I images using projection data associated with two photo-peak energy windows and an intermediate-energy window using MC-JOSEM. For 15 iterations of reconstruction, the bias and standard deviation of 99mTc activity estimates in several brain structures were calculated for NSC-OSEM, SC-OSEM, and MC-JOSEM, using images reconstructed from primary
Longo, Mariaconcetta; Marchioni, Chiara; Insero, Teresa; Donnarumma, Raffaella; D'Adamo, Alessandro; Lucatelli, Pierleone; Fanelli, Fabrizio; Salvatori, Filippo Maria; Cannavale, Alessandro; Di Castro, Elisabetta
2016-03-01
This study evaluates X-ray exposure in patient undergoing abdominal extra-vascular interventional procedures by means of Digital Imaging and COmmunications in Medicine (DICOM) image headers and Monte Carlo simulation. The main aim was to assess the effective and equivalent doses, under the hypothesis of their correlation with the dose area product (DAP) measured during each examination. This allows to collect dosimetric information about each patient and to evaluate associated risks without resorting to in vivo dosimetry. The dose calculation was performed in 79 procedures through the Monte Carlo simulator PCXMC (A PC-based Monte Carlo program for calculating patient doses in medical X-ray examinations), by using the real geometrical and dosimetric irradiation conditions, automatically extracted from DICOM headers. The DAP measurements were also validated by using thermoluminescent dosemeters on an anthropomorphic phantom. The expected linear correlation between effective doses and DAP was confirmed with an R(2) of 0.974. Moreover, in order to easily calculate patient doses, conversion coefficients that relate equivalent doses to measurable quantities, such as DAP, were obtained. PMID:26211013
International Nuclear Information System (INIS)
This study evaluates X-ray exposure in patient undergoing abdominal extra-vascular interventional procedures by means of Digital Imaging and Communications in Medicine (DICOM) image headers and Monte Carlo simulation. The main aim was to assess the effective and equivalent doses, under the hypothesis of their correlation with the dose area product (DAP) measured during each examination. This allows to collect dosimetric information about each patient and to evaluate associated risks without resorting to in vivo dosimetry. The dose calculation was performed in 79 procedures through the Monte Carlo simulator PCXMC (A PC-based Monte Carlo program for calculating patient doses in medical X-ray examinations), by using the real geometrical and dosimetric irradiation conditions, automatically extracted from DICOM headers. The DAP measurements were also validated by using thermoluminescent dosemeters on an anthropomorphic phantom. The expected linear correlation between effective doses and DAP was confirmed with an R2 of 0.974. Moreover, in order to easily calculate patient doses, conversion coefficients that relate equivalent doses to measurable quantities, such as DAP, were obtained. (authors)
Lee, Seung-Wan; Choi, Yu-Na; Cho, Hyo-Min; Lee, Young-Jin; Ryu, Hyun-Ju; Kim, Hee-Joung
2012-08-01
The energy-resolved photon counting detector provides the spectral information that can be used to generate images. The novel imaging methods, including the K-edge imaging, projection-based energy weighting imaging and image-based energy weighting imaging, are based on the energy-resolved photon counting detector and can be realized by using various energy windows or energy bins. The location and width of the energy windows or energy bins are important because these techniques generate an image using the spectral information defined by the energy windows or energy bins. In this study, the reconstructed images acquired with K-edge imaging, projection-based energy weighting imaging and image-based energy weighting imaging were simulated using the Monte Carlo simulation. The effect of energy windows or energy bins was investigated with respect to the contrast, coefficient-of-variation (COV) and contrast-to-noise ratio (CNR). The three images were compared with respect to the CNR. We modeled the x-ray computed tomography system based on the CdTe energy-resolved photon counting detector and polymethylmethacrylate phantom, which have iodine, gadolinium and blood. To acquire K-edge images, the lower energy thresholds were fixed at K-edge absorption energy of iodine and gadolinium and the energy window widths were increased from 1 to 25 bins. The energy weighting factors optimized for iodine, gadolinium and blood were calculated from 5, 10, 15, 19 and 33 energy bins. We assigned the calculated energy weighting factors to the images acquired at each energy bin. In K-edge images, the contrast and COV decreased, when the energy window width was increased. The CNR increased as a function of the energy window width and decreased above the specific energy window width. When the number of energy bins was increased from 5 to 15, the contrast increased in the projection-based energy weighting images. There is a little difference in the contrast, when the number of energy bin is
International Nuclear Information System (INIS)
Computed radiography (CR) is gradually replacing film. The application of CR for two-dimensional profiles and off-axis ratio (OAR) measurement using an imaging plate (IP) in a CR system is currently under discussion. However, a well known problem for IPs in dosimetry is that they use high atomic number (Z) materials, such as Ba, which have an energy dependency in a photon interaction. Although there are some reports that it is possible to compensate for the energy dependency with metal filters, the appropriate thicknesses of these filters and where they should be located have not been investigated. The purpose of this study is to find the most suitable filter for use with an IP as a dosimetric tool. Monte Carlo simulation (Geant4 8.1) was used to determine the filter to minimize the measurement error in OAR measurements of 4 MV x-rays. In this simulation, the material and thickness of the filter and distance between the IP and the filter were varied to determine most suitable filter conditions that gave the best fit to the MC calculated OAR in water. With regard to changing the filter material, we found that using higher Z and higher density material increased the effectiveness of the filter. Also, increasing the distance between the filter and the IP reduced the effectiveness, whereas increasing the thickness of the filter increased the effectiveness. The result of this study showed that the most appropriate filter conditions consistent with the calculated OAR in water were the ones with the IP sandwiched between two 2 mm thick lead filters at a distance of 5 mm from the IP or the IP sandwiched directly between two 1 mm lead filters. Using these filters, we measured the OAR at 10 cm depth with 100 cm source-to-surface distance and surface 10x10 cm2 field size. The results of this measurement represented that it is possible to achieve measurements with less than within 2.0% and 2.0% in the field and with less than 1.1% and 0.6% out of the field by using 2 and 1 mm
,
2015-01-01
We present a sophisticated likelihood reconstruction algorithm for shower-image analysis of imaging Cherenkov telescopes. The reconstruction algorithm is based on the comparison of the camera pixel amplitudes with the predictions from a Monte Carlo based model. Shower parameters are determined by a maximisation of a likelihood function. Maximisation of the likelihood as a function of shower fit parameters is performed using a numerical non-linear optimisation technique. A related reconstruction technique has already been developed by the CAT and the H.E.S.S. experiments, and provides a more precise direction and energy reconstruction of the photon induced shower compared to the second moment of the camera image analysis. Examples are shown of the performance of the analysis on simulated gamma-ray data from the VERITAS array.
International Nuclear Information System (INIS)
The source of uncertainty is not exclusive of the Monte Carlo method, but it will be present in any algorithm which takes into account the correction for heterogeneity. Although we hope that the uncertainty described above is small, the objective of this work is to try to quantify depending on the CT study. (Author)
International Nuclear Information System (INIS)
In this paper, we present the results of Monte Carlo simulations of atmospheric showers induced by diffuse γ-rays as detected by the high-energy gamma-ray astronomy (HEGRA) system of five imaging atmospheric Cerenkov telescopes (IACTs). We have investigated the sensitivity of observations on extended γ-ray emission over the entire field of view of the instrument. We discuss a technique to search for extended γ-ray sources within the field of view of the instrument. We give estimates for HEGRA sensitivity of observations on extended TeV γ-ray sources
International Nuclear Information System (INIS)
In conventional PET systems,the parallax error degrades image resolution and causes image distortion. To remedy this, a PET ring diameter has to be much larger than the required size of field of view (FOV), and therefore the cost goes up. Measurement of depth-of-interaction (DOI) information is effective to reduce the parallax error and improve the image quality. This study is aimed at developing a practical method to incorporate DOI information in PET sonogram generation and image reconstruction processes and evaluate its efficacy through Monte Carlo simulation. An animal PET system with 30-mm long LSO crystals and 2-mm DOI measurement accuracy was simulated and list-mode PET data were collected. A sonogram generation method was proposed to bin each coincidence event to the correct LOR location according to both incident crystal indices and DOI positions of the two annihilation photons. The sonograms were reconstructed with an iterative OSMAPEM (ordered subset maximum a posteriori expectation maximization) algorithm. Two phantoms (a rod source phantom and a Derenzo phantom) were simulated, and the benefits of DOI were investigated in terms of reconstructed source diameter (FWHM) and source positioning accuracy. The results demonstrate that the proposed method works well to incorporate DOI information in data processing, which not only overcomes the image distortion problem but also significantly improves image resolution and resolution uniformity and results in satisfactory image quality. (authors)
International Nuclear Information System (INIS)
A Monte Carlo method of multiple scattered coherent light with the information of shear wave propagation in scattering media is presented. The established Monte-Carlo algorithm is mainly relative to optical phase variations due to the acoustic-radiation-force shear-wave-induced displacements of light scatterers. Both the distributions and temporal behaviors of optical phase increments in probe locations are obtained. Consequently, shear wave speed is evaluated quantitatively. It is noted that the phase increments exactly track the propagations of shear waves induced by focus-ultrasound radiation force. In addition, attenuations of shear waves are demonstrated in simulation results. By using linear regression processing, the shear wave speed, which is set to 2.1 m/s in simulation, is estimated to be 2.18 m/s and 2.35 m/s at time sampling intervals of 0.2 ms and 0.5 ms, respectively
International Nuclear Information System (INIS)
This work presents a simulation based study by Monte Carlo which uses two adaptive neuro-fuzzy inference systems (ANFIS) for cross talk compensation of simultaneous 99mTc/201Tl dual-radioisotope SPECT imaging. We have compared two neuro-fuzzy systems based on fuzzy c-means (FCM) and subtractive (SUB) clustering. Our approach incorporates eight energy-windows image acquisition from 28 keV to 156 keV and two main photo peaks of 201Tl (77±10% keV) and 99mTc (140±10% keV). The Geant4 application in emission tomography (GATE) is used as a Monte Carlo simulator for three cylindrical and a NURBS Based Cardiac Torso (NCAT) phantom study. Three separate acquisitions including two single-isotopes and one dual isotope were performed in this study. Cross talk and scatter corrected projections are reconstructed by an iterative ordered subsets expectation maximization (OSEM) algorithm which models the non-uniform attenuation in the projection/back-projection. ANFIS-FCM/SUB structures are tuned to create three to sixteen fuzzy rules for modeling the photon cross-talk of the two radioisotopes. Applying seven to nine fuzzy rules leads to a total improvement of the contrast and the bias comparatively. It is found that there is an out performance for the ANFIS-FCM due to its acceleration and accurate results
Heidary, Saeed; Setayeshi, Saeed
2015-01-01
This work presents a simulation based study by Monte Carlo which uses two adaptive neuro-fuzzy inference systems (ANFIS) for cross talk compensation of simultaneous 99mTc/201Tl dual-radioisotope SPECT imaging. We have compared two neuro-fuzzy systems based on fuzzy c-means (FCM) and subtractive (SUB) clustering. Our approach incorporates eight energy-windows image acquisition from 28 keV to 156 keV and two main photo peaks of 201Tl (77±10% keV) and 99mTc (140±10% keV). The Geant4 application in emission tomography (GATE) is used as a Monte Carlo simulator for three cylindrical and a NURBS Based Cardiac Torso (NCAT) phantom study. Three separate acquisitions including two single-isotopes and one dual isotope were performed in this study. Cross talk and scatter corrected projections are reconstructed by an iterative ordered subsets expectation maximization (OSEM) algorithm which models the non-uniform attenuation in the projection/back-projection. ANFIS-FCM/SUB structures are tuned to create three to sixteen fuzzy rules for modeling the photon cross-talk of the two radioisotopes. Applying seven to nine fuzzy rules leads to a total improvement of the contrast and the bias comparatively. It is found that there is an out performance for the ANFIS-FCM due to its acceleration and accurate results.
Energy Technology Data Exchange (ETDEWEB)
Heidary, Saeed, E-mail: saeedheidary@aut.ac.ir; Setayeshi, Saeed, E-mail: setayesh@aut.ac.ir
2015-01-11
This work presents a simulation based study by Monte Carlo which uses two adaptive neuro-fuzzy inference systems (ANFIS) for cross talk compensation of simultaneous {sup 99m}Tc/{sup 201}Tl dual-radioisotope SPECT imaging. We have compared two neuro-fuzzy systems based on fuzzy c-means (FCM) and subtractive (SUB) clustering. Our approach incorporates eight energy-windows image acquisition from 28 keV to 156 keV and two main photo peaks of {sup 201}Tl (77±10% keV) and {sup 99m}Tc (140±10% keV). The Geant4 application in emission tomography (GATE) is used as a Monte Carlo simulator for three cylindrical and a NURBS Based Cardiac Torso (NCAT) phantom study. Three separate acquisitions including two single-isotopes and one dual isotope were performed in this study. Cross talk and scatter corrected projections are reconstructed by an iterative ordered subsets expectation maximization (OSEM) algorithm which models the non-uniform attenuation in the projection/back-projection. ANFIS-FCM/SUB structures are tuned to create three to sixteen fuzzy rules for modeling the photon cross-talk of the two radioisotopes. Applying seven to nine fuzzy rules leads to a total improvement of the contrast and the bias comparatively. It is found that there is an out performance for the ANFIS-FCM due to its acceleration and accurate results.
Lee, Youngjin; Lee, Amy Candy; Kim, Hee-Joung
2016-09-01
Recently, significant effort has been spent on the development of photons counting detector (PCD) based on a CdTe for applications in X-ray imaging system. The motivation of developing PCDs is higher image quality. Especially, the K-edge subtraction (KES) imaging technique using a PCD is able to improve image quality and useful for increasing the contrast resolution of a target material by utilizing contrast agent. Based on above-mentioned technique, we presented an idea for an improved K-edge log-subtraction (KELS) imaging technique. The KELS imaging technique based on the PCDs can be realized by using different subtraction energy width of the energy window. In this study, the effects of the KELS imaging technique and subtraction energy width of the energy window was investigated with respect to the contrast, standard deviation, and CNR with a Monte Carlo simulation. We simulated the PCD X-ray imaging system based on a CdTe and polymethylmethacrylate (PMMA) phantom which consists of the various iodine contrast agents. To acquired KELS images, images of the phantom using above and below the iodine contrast agent K-edge absorption energy (33.2 keV) have been acquired at different energy range. According to the results, the contrast and standard deviation were decreased, when subtraction energy width of the energy window is increased. Also, the CNR using a KELS imaging technique is higher than that of the images acquired by using whole energy range. Especially, the maximum differences of CNR between whole energy range and KELS images using a 1, 2, and 3 mm diameter iodine contrast agent were acquired 11.33, 8.73, and 8.29 times, respectively. Additionally, the optimum subtraction energy width of the energy window can be acquired at 5, 4, and 3 keV for the 1, 2, and 3 mm diameter iodine contrast agent, respectively. In conclusion, we successfully established an improved KELS imaging technique and optimized subtraction energy width of the energy window, and based on
Directory of Open Access Journals (Sweden)
Mahsa Noori Asl
2013-01-01
Full Text Available Compton-scattered photons included within the photopeak pulse-height window result in the degradation of SPECT images both qualitatively and quantitatively. The purpose of this study is to evaluate and compare six scatter correction methods based on setting the energy windows in 99m Tc spectrum. SIMIND Monte Carlo simulation is used to generate the projection images from a cold-sphere hot-background phantom. For evaluation of different scatter correction methods, three assessment criteria including image contrast, signal-to-noise ratio (SNR and relative noise of the background (RNB are considered. Except for the dual-photopeak window (DPW method, the image contrast of the five cold spheres is improved in the range of 2.7-26%. Among methods considered, two methods show a nonuniform correction performance. The RNB for all of the scatter correction methods is ranged from minimum 0.03 for DPW method to maximum 0.0727 for the three energy window (TEW method using trapezoidal approximation. The TEW method using triangular approximation because of ease of implementation, good improvement of the image contrast and the SNR for the five cold spheres, and the low noise level is proposed as most appropriate correction method.
International Nuclear Information System (INIS)
Human anatomical models have been indispensable to radiation protection dosimetry using Monte Carlo calculations. Existing MIRD-based mathematical models are easy to compute and standardize, but they are simplified and crude compared to human anatomy. This article describes the development of an image-based whole-body model, called VIP-Man, using transversal color photographic images obtained from the National Library of Medicine's Visible Human Project for Monte Carlo organ dose calculations involving photons, electron, neutrons, and protons. As the first of a series of papers on dose calculations based on VIP-Man, this article provides detailed information about how to construct an image-based model, as well as how to adopt it into well-tested Monte Carlo codes, EGS4, MCNP4B, and MCNPX
International Nuclear Information System (INIS)
Purpose: Cone beam computed tomography (CBCT) image quality is limited by scattered photons. Monte Carlo (MC) simulations provide the ability of predicting the patient-specific scatter contamination in clinical CBCT imaging. Lengthy simulations prevent MC-based scatter correction from being fully implemented in a clinical setting. This study investigates the combination of using fast MC simulations to predict scatter distributions with a ray tracing algorithm to allow calibration between simulated and clinical CBCT images. Material and methods: An EGSnrc-based user code (egscbct), was used to perform MC simulations of an Elekta XVI CBCT imaging system. A 60keV x-ray source was used, and air kerma scored at the detector plane. Several variance reduction techniques (VRTs) were used to increase the scatter calculation efficiency. Three patient phantoms based on CT scans were simulated, namely a brain, a thorax and a pelvis scan. A ray tracing algorithm was used to calculate the detector signal due to primary photons. A total of 288 projections were simulated, one for each thread on the computer cluster used for the investigation. Results: Scatter distributions for the brain, thorax and pelvis scan were simulated within 2 % statistical uncertainty in two hours per scan. Within the same time, the ray tracing algorithm provided the primary signal for each of the projections. Thus, all the data needed for MC-based scatter correction in clinical CBCT imaging was obtained within two hours per patient, using a full simulation of the clinical CBCT geometry. Conclusions: This study shows that use of MC-based scatter corrections in CBCT imaging has a great potential to improve CBCT image quality. By use of powerful VRTs to predict scatter distributions and a ray tracing algorithm to calculate the primary signal, it is possible to obtain the necessary data for patient specific MC scatter correction within two hours per patient
International Nuclear Information System (INIS)
There have been many efforts to advance the technology of X-ray digital mammography in order to enhance the early detection of breast pathology. The purpose of this study was to evaluate image quality and the radiation dose after magnifying X-ray digital mammography using the Geant4 Application for Tomographic Emission (GATE). In this study, we simulated a Monte Carlo model of an X-ray digital mammographic system, and we present a technique for magnification and discuss how it affects the image quality. The simulated X-ray digital mammographic system with GATE consists of an X-ray source, a compression paddle, a supporting plate, and an imaging plate (IP) of computed radiography (CR). The degree of magnification ranged from 1.0 to 2.0. We designed a semi-cylindrical phantom with a thickness of 45-mm and a radius of 50-mm in order to evaluate the image quality after magnification. The phantom was made of poly methyl methacrylate (PMMA) and contained four spherical specks with diameters of 750, 500, 250, and 100-μm to simulate microcalcifications. The simulation studies were performed with an X-ray energy spectrum calculated using the spectrum processor SRS-78. A combination of a molybdenum anode and a molybdenum filter (Mo/Mo) was used for the mammographic X-ray tubes. The effects of the degree of magnification were investigated in terms of both the contrast-to-noise ratio (CNR) and the average glandular dose (AGD). The results show that the CNR increased as the degree of magnification increased and decreased as breast glandularity increased. The AGD showed only a minor increase with magnification. Based on the results, magnification of mammographic images can be used to obtain high image quality with an increased CNR. Our X-ray digital mammographic system model with GATE may be used as a basis for future studies on X-ray imaging characteristics.
Setiani, Tia Dwi; Suprijadi, Haryanto, Freddy
2016-03-01
Monte Carlo (MC) is one of the powerful techniques for simulation in x-ray imaging. MC method can simulate the radiation transport within matter with high accuracy and provides a natural way to simulate radiation transport in complex systems. One of the codes based on MC algorithm that are widely used for radiographic images simulation is MC-GPU, a codes developed by Andrea Basal. This study was aimed to investigate the time computation of x-ray imaging simulation in GPU (Graphics Processing Unit) compared to a standard CPU (Central Processing Unit). Furthermore, the effect of physical parameters to the quality of radiographic images and the comparison of image quality resulted from simulation in the GPU and CPU are evaluated in this paper. The simulations were run in CPU which was simulated in serial condition, and in two GPU with 384 cores and 2304 cores. In simulation using GPU, each cores calculates one photon, so, a large number of photon were calculated simultaneously. Results show that the time simulations on GPU were significantly accelerated compared to CPU. The simulations on the 2304 core of GPU were performed about 64 -114 times faster than on CPU, while the simulation on the 384 core of GPU were performed about 20 - 31 times faster than in a single core of CPU. Another result shows that optimum quality of images from the simulation was gained at the history start from 108 and the energy from 60 Kev to 90 Kev. Analyzed by statistical approach, the quality of GPU and CPU images are relatively the same.
Lakshmanan, Manu N.; Harrawood, Brian P.; Samei, Ehsan; Kapadia, Anuj J.
2015-08-01
Breast cancer patients undergoing surgery often choose to have a breast conserving surgery (BCS) instead of mastectomy for removal of only the breast tumor. If post-surgical analysis such as histological assessment of the resected tumor reveals insufficient healthy tissue margins around the cancerous tumor, the patient must undergo another surgery to remove the missed tumor tissue. Such re-excisions are reported to occur in 20%-70% of BCS patients. A real-time surgical margin assessment technique that is fast and consistently accurate could greatly reduce the number of re-excisions performed in BCS. We describe here a tumor margin assessment method based on x-ray coherent scatter computed tomography (CSCT) imaging and demonstrate its utility in surgical margin assessment using Monte Carlo simulations. A CSCT system was simulated in Geant4 and used to simulate two virtual anthropomorphic CSCT scans of phantoms resembling surgically resected tissue. The resulting images were volume-rendered and found to distinguish cancerous tumors embedded in complex distributions of adipose and fibroglandular breast tissue (as is expected in the breast). The images exhibited sufficient spatial and spectral (i.e. momentum transfer) resolution to classify the tissue in any given voxel as healthy or cancerous. ROC analysis of the classification accuracy revealed an area under the curve of up to 0.97. These results indicate that coherent scatter imaging is promising as a possible fast and accurate surgical margin assessment technique.
International Nuclear Information System (INIS)
Breast cancer patients undergoing surgery often choose to have a breast conserving surgery (BCS) instead of mastectomy for removal of only the breast tumor. If post-surgical analysis such as histological assessment of the resected tumor reveals insufficient healthy tissue margins around the cancerous tumor, the patient must undergo another surgery to remove the missed tumor tissue. Such re-excisions are reported to occur in 20%–70% of BCS patients. A real-time surgical margin assessment technique that is fast and consistently accurate could greatly reduce the number of re-excisions performed in BCS. We describe here a tumor margin assessment method based on x-ray coherent scatter computed tomography (CSCT) imaging and demonstrate its utility in surgical margin assessment using Monte Carlo simulations. A CSCT system was simulated in Geant4 and used to simulate two virtual anthropomorphic CSCT scans of phantoms resembling surgically resected tissue. The resulting images were volume-rendered and found to distinguish cancerous tumors embedded in complex distributions of adipose and fibroglandular breast tissue (as is expected in the breast). The images exhibited sufficient spatial and spectral (i.e. momentum transfer) resolution to classify the tissue in any given voxel as healthy or cancerous. ROC analysis of the classification accuracy revealed an area under the curve of up to 0.97. These results indicate that coherent scatter imaging is promising as a possible fast and accurate surgical margin assessment technique. (paper)
International Nuclear Information System (INIS)
Purpose: The focus of this work was the demonstration and validation of VirtuaLinac with clinical photon beams and to investigate the implementation of low-Z targets in a TrueBeam linear accelerator (Linac) using Monte Carlo modeling. Methods: VirtuaLinac, a cloud based web application utilizing Geant4 Monte Carlo code, was used to model the Linac treatment head components. Particles were propagated through the lower portion of the treatment head using BEAMnrc. Dose distributions and spectral distributions were calculated using DOSXYZnrc and BEAMdp, respectively. For validation, 6 MV flattened and flattening filter free (FFF) photon beams were generated and compared to measurement for square fields, 10 and 40 cm wide and at dmax for diagonal profiles. Two low-Z targets were investigated: a 2.35 MeV carbon target and the proposed 2.50 MeV commercial imaging target for the TrueBeam platform. A 2.35 MeV carbon target was also simulated in a 2100EX Clinac using BEAMnrc. Contrast simulations were made by scoring the dose in the phosphor layer of an IDU20 aSi detector after propagating through a 4 or 20 cm thick phantom composed of water and ICRP bone. Results: Measured and modeled depth dose curves for 6 MV flattened and FFF beams agree within 1% for 98.3% of points at depths greater than 0.85 cm. Ninety three percent or greater of points analyzed for the diagonal profiles had a gamma value less than one for the criteria of 1.5 mm and 1.5%. The two low-Z target photon spectra produced in TrueBeam are harder than that from the carbon target in the Clinac. Percent dose at depth 10 cm is greater by 3.6% and 8.9%; the fraction of photons in the diagnostic energy range (25–150 keV) is lower by 10% and 28%; and contrasts are lower by factors of 1.1 and 1.4 (4 cm thick phantom) and 1.03 and 1.4 (20 cm thick phantom), for the TrueBeam 2.35 MV/carbon and commercial imaging beams, respectively. Conclusions: VirtuaLinac is a promising new tool for Monte Carlo modeling of novel
Miksys, N.; Xu, C.; Beaulieu, L.; Thomson, R. M.
2015-08-01
This work investigates and compares CT image metallic artifact reduction (MAR) methods and tissue assignment schemes (TAS) for the development of virtual patient models for permanent implant brachytherapy Monte Carlo (MC) dose calculations. Four MAR techniques are investigated to mitigate seed artifacts from post-implant CT images of a homogeneous phantom and eight prostate patients: a raw sinogram approach using the original CT scanner data and three methods (simple threshold replacement (STR), 3D median filter, and virtual sinogram) requiring only the reconstructed CT image. Virtual patient models are developed using six TAS ranging from the AAPM-ESTRO-ABG TG-186 basic approach of assigning uniform density tissues (resulting in a model not dependent on MAR) to more complex models assigning prostate, calcification, and mixtures of prostate and calcification using CT-derived densities. The EGSnrc user-code BrachyDose is employed to calculate dose distributions. All four MAR methods eliminate bright seed spot artifacts, and the image-based methods provide comparable mitigation of artifacts compared with the raw sinogram approach. However, each MAR technique has limitations: STR is unable to mitigate low CT number artifacts, the median filter blurs the image which challenges the preservation of tissue heterogeneities, and both sinogram approaches introduce new streaks. Large local dose differences are generally due to differences in voxel tissue-type rather than mass density. The largest differences in target dose metrics (D90, V100, V150), over 50% lower compared to the other models, are when uncorrected CT images are used with TAS that consider calcifications. Metrics found using models which include calcifications are generally a few percent lower than prostate-only models. Generally, metrics from any MAR method and any TAS which considers calcifications agree within 6%. Overall, the studied MAR methods and TAS show promise for further retrospective MC dose
International Nuclear Information System (INIS)
This work investigates and compares CT image metallic artifact reduction (MAR) methods and tissue assignment schemes (TAS) for the development of virtual patient models for permanent implant brachytherapy Monte Carlo (MC) dose calculations. Four MAR techniques are investigated to mitigate seed artifacts from post-implant CT images of a homogeneous phantom and eight prostate patients: a raw sinogram approach using the original CT scanner data and three methods (simple threshold replacement (STR), 3D median filter, and virtual sinogram) requiring only the reconstructed CT image. Virtual patient models are developed using six TAS ranging from the AAPM-ESTRO-ABG TG-186 basic approach of assigning uniform density tissues (resulting in a model not dependent on MAR) to more complex models assigning prostate, calcification, and mixtures of prostate and calcification using CT-derived densities. The EGSnrc user-code BrachyDose is employed to calculate dose distributions. All four MAR methods eliminate bright seed spot artifacts, and the image-based methods provide comparable mitigation of artifacts compared with the raw sinogram approach. However, each MAR technique has limitations: STR is unable to mitigate low CT number artifacts, the median filter blurs the image which challenges the preservation of tissue heterogeneities, and both sinogram approaches introduce new streaks. Large local dose differences are generally due to differences in voxel tissue-type rather than mass density. The largest differences in target dose metrics (D90, V100, V150), over 50% lower compared to the other models, are when uncorrected CT images are used with TAS that consider calcifications. Metrics found using models which include calcifications are generally a few percent lower than prostate-only models. Generally, metrics from any MAR method and any TAS which considers calcifications agree within 6%. Overall, the studied MAR methods and TAS show promise for further retrospective MC dose
Ortigão, C
2004-01-01
A reliable Monte Carlo simulation study is of significance importance to evaluate the performance of a gamma-ray detector and the search for compromises between spatial resolution, sensitivity and energy resolution. The development of a simulation package for a new compact gamma camera based on GEANT3 is described in this report. This simulation takes into account the interaction of gamma-rays in the crystal, the production and transport of scintillation photons and allows an accurate radiation transport description of photon attenuation in high-Z collimators, for SPECT applications. In order to achieve the best setup configuration different detector arrangements were explored, namely different scintillation crystals, coatings, reflector properties and polishing types. The conventional detector system, based on PMT light readout, was compared with an HPD system. Different collimators were studied for high resolution applications with compact gamma-cameras.
International Nuclear Information System (INIS)
The purpose of the present study is to introduce a compression algorithm for the CT (computed tomography) data used in Monte Carlo simulations. Performing simulations on the CT data implies large computational costs as well as large memory requirements since the number of voxels in such data reaches typically into hundreds of millions voxels. CT data, however, contain homogeneous regions which could be regrouped to form larger voxels without affecting the simulation's accuracy. Based on this property we propose a compression algorithm based on octrees: in homogeneous regions the algorithm replaces groups of voxels with a smaller number of larger voxels. This reduces the number of voxels while keeping the critical high-density gradient area. Results obtained using the present algorithm on both phantom and clinical data show that compression rates up to 75% are possible without losing the dosimetric accuracy of the simulation
International Nuclear Information System (INIS)
SPECT was investigated in the 1970's, developed in the 1980's and popularized in the 1990's in developed countries. Now there are more than 160 systems working in those countries. The Institute of Heavy Ion Physics, Beijing University, has built a laboratory. The goal of the laboratory is to develop a low cost and high performance SPECT system with 37 PMT and PC computer system because the number of PMT is not so important for space resolution while it costs too much. The speed of PC development is the fastest among all computers. The high speed, large size of memory and wide developed environment of software is suitable for such purpose. During the development, the Monte Carlo simulation is very important to make the design of the production better
Roé-Vellvé, N.; Pino, F.; Falcon, C.; Cot, A.; Gispert, J. D.; Marin, C.; Pavía, J.; Ros, D.
2014-08-01
SPECT studies with 123I-ioflupane facilitate the diagnosis of Parkinson’s disease (PD). The effect on quantification of image degradations has been extensively evaluated in human studies but their impact on studies of experimental PD models is still unclear. The aim of this work was to assess the effect of compensating for the degrading phenomena on the quantification of small animal SPECT studies using 123I-ioflupane. This assessment enabled us to evaluate the feasibility of quantitatively detecting small pathological changes using different reconstruction methods and levels of compensation for the image degrading phenomena. Monte Carlo simulated studies of a rat phantom were reconstructed and quantified. Compensations for point spread function (PSF), scattering, attenuation and partial volume effect were progressively included in the quantification protocol. A linear relationship was found between calculated and simulated specific uptake ratio (SUR) in all cases. In order to significantly distinguish disease stages, noise-reduction during the reconstruction process was the most relevant factor, followed by PSF compensation. The smallest detectable SUR interval was determined by biological variability rather than by image degradations or coregistration errors. The quantification methods that gave the best results allowed us to distinguish PD stages with SUR values that are as close as 0.5 using groups of six rats to represent each stage.
Imbert, Laetitia; Galbrun, Ernest; Odille, Freddy; Poussier, Sylvain; Noel, Alain; Wolf, Didier; Karcher, Gilles; Marie, Pierre-Yves
2015-02-01
Geant4 application for tomographic emission (GATE), a Monte-Carlo simulation platform, has previously been used for optimizing tomoscintigraphic images recorded with scintillation Anger cameras but not with the new-generation heart-centric cadmium-zinc-telluride (CZT) cameras. Using the GATE platform, this study aimed at simulating the SPECT recordings from one of these new CZT cameras and to assess this simulation by direct comparison between simulated and actual recorded data, ranging from point sources to human images. Geometry and movement of detectors, as well as their respective energy responses, were modeled for the CZT ‘D.SPECT’ camera in the GATE platform. Both simulated and actual recorded data were obtained from: (1) point and linear sources of 99mTc for compared assessments of detection sensitivity and spatial resolution, (2) a cardiac insert filled with a 99mTc solution for compared assessments of contrast-to-noise ratio and sharpness of myocardial borders and (3) in a patient with myocardial infarction using segmented cardiac magnetic resonance imaging images. Most of the data from the simulated images exhibited high concordance with the results of actual images with relative differences of only: (1) 0.5% for detection sensitivity, (2) 6.7% for spatial resolution, (3) 2.6% for contrast-to-noise ratio and 5.0% for sharpness index on the cardiac insert placed in a diffusing environment. There was also good concordance between actual and simulated gated-SPECT patient images for the delineation of the myocardial infarction area, although the quality of the simulated images was clearly superior with increases around 50% for both contrast-to-noise ratio and sharpness index. SPECT recordings from a new heart-centric CZT camera can be simulated with the GATE software with high concordance relative to the actual physical properties of this camera. These simulations may be conducted up to the stage of human SPECT-images even if further refinement is needed
International Nuclear Information System (INIS)
Geant4 application for tomographic emission (GATE), a Monte-Carlo simulation platform, has previously been used for optimizing tomoscintigraphic images recorded with scintillation Anger cameras but not with the new-generation heart-centric cadmium–zinc–telluride (CZT) cameras. Using the GATE platform, this study aimed at simulating the SPECT recordings from one of these new CZT cameras and to assess this simulation by direct comparison between simulated and actual recorded data, ranging from point sources to human images. Geometry and movement of detectors, as well as their respective energy responses, were modeled for the CZT ‘D.SPECT’ camera in the GATE platform. Both simulated and actual recorded data were obtained from: (1) point and linear sources of 99mTc for compared assessments of detection sensitivity and spatial resolution, (2) a cardiac insert filled with a 99mTc solution for compared assessments of contrast-to-noise ratio and sharpness of myocardial borders and (3) in a patient with myocardial infarction using segmented cardiac magnetic resonance imaging images. Most of the data from the simulated images exhibited high concordance with the results of actual images with relative differences of only: (1) 0.5% for detection sensitivity, (2) 6.7% for spatial resolution, (3) 2.6% for contrast-to-noise ratio and 5.0% for sharpness index on the cardiac insert placed in a diffusing environment. There was also good concordance between actual and simulated gated-SPECT patient images for the delineation of the myocardial infarction area, although the quality of the simulated images was clearly superior with increases around 50% for both contrast-to-noise ratio and sharpness index. SPECT recordings from a new heart-centric CZT camera can be simulated with the GATE software with high concordance relative to the actual physical properties of this camera. These simulations may be conducted up to the stage of human SPECT-images even if further refinement is
International Nuclear Information System (INIS)
In-beam positron emission tomography (PET) can enable visualization of an irradiated field using positron emitters (β+ decay). In particle therapies, many kinds of secondary particles are produced by nuclear interactions, which affect PET imaging. Our purpose in this work was to evaluate effects of secondary particles on in-beam PET imaging using the Monte Carlo simulation code, Geant4, by reproducing an experiment with a small OpenPET prototype in which a PMMA phantom was irradiated by a 11C beam. The number of incident particles to the detectors and their spectra, background coincidence for the PET scan, and reconstructed images were evaluated for three periods, spill-time (beam irradiation), pause-time (accelerating the particles) and beam-off time (duration after the final spill). For spill-time, we tested a background reduction technique in which coincidence events correlated with the accelerator radiofrequency were discarded (RF gated) that has been proposed in the literature. Also, background generation processes were identified. For spill-time, most background coincidences were caused by prompt gamma rays, and only 1.4% of the total coincidences generated β+ signals. Differently, for pause-time and beam-off time, more than 75% of the total coincidence events were signals. Using these coincidence events, we failed to reconstruct images during the spill-time, but we obtained successful reconstructions for the pause-time and beam-off time, which was consistent with the experimental results. From the simulation, we found that the absence of materials in the beam line and using the RF gated technique improved the signal-to-noise ratio for the spill-time. From an additional simulation with range shifter-less irradiation and the RF gated technique, we showed the feasibility of image reconstruction during the spill-time. (paper)
Kavanagh, A.; Olivo, A.; Speller, R; Vojnovic, B
2013-01-01
A simple method of simulating possible coded aperture phase contrast X-ray imaging apparatus is presented. The method is based on ray tracing, with the rays treated ballistically within a voxelized sample and with the phase-shift-induced angular deviations and absorptions applied at a plane in the middle of the sample. For the particular case of a coded aperture phase contrast configuration suitable for small animal pre-clinical imaging we present results obtained using a high resolution voxe...
Bonamente, Massimillano; Joy, Marshall K.; Carlstrom, John E.; Reese, Erik D.; LaRoque, Samuel J.
2004-01-01
X-ray and Sunyaev-Zel'dovich effect data can be combined to determine the distance to galaxy clusters. High-resolution X-ray data are now available from Chandra, which provides both spatial and spectral information, and Sunyaev-Zel'dovich effect data were obtained from the BIMA and Owens Valley Radio Observatory (OVRO) arrays. We introduce a Markov Chain Monte Carlo procedure for the joint analysis of X-ray and Sunyaev- Zel'dovich effect data. The advantages of this method are the high computational efficiency and the ability to measure simultaneously the probability distribution of all parameters of interest, such as the spatial and spectral properties of the cluster gas and also for derivative quantities such as the distance to the cluster. We demonstrate this technique by applying it to the Chandra X-ray data and the OVRO radio data for the galaxy cluster A611. Comparisons with traditional likelihood ratio methods reveal the robustness of the method. This method will be used in follow-up paper to determine the distances to a large sample of galaxy cluster.
Zakhnini, Abdelhamid; Kulenkampff, Johannes; Sauerzapf, Sophie; Pietrzyk, Uwe; Lippmann-Pipke, Johanna
2013-08-01
Understanding conservative fluid flow and reactive tracer transport in soils and rock formations requires quantitative transport visualization methods in 3D+t. After a decade of research and development we established the GeoPET as a non-destructive method with unrivalled sensitivity and selectivity, with due spatial and temporal resolution by applying Positron Emission Tomography (PET), a nuclear medicine imaging method, to dense rock material. Requirements for reaching the physical limit of image resolution of nearly 1 mm are (a) a high-resolution PET-camera, like our ClearPET scanner (Raytest), and (b) appropriate correction methods for scatter and attenuation of 511 keV—photons in the dense geological material. The latter are by far more significant in dense geological material than in human and small animal body tissue (water). Here we present data from Monte Carlo simulations (MCS) reflecting selected GeoPET experiments. The MCS consider all involved nuclear physical processes of the measurement with the ClearPET-system and allow us to quantify the sensitivity of the method and the scatter fractions in geological media as function of material (quartz, Opalinus clay and anhydrite compared to water), PET isotope (18F, 58Co and 124I), and geometric system parameters. The synthetic data sets obtained by MCS are the basis for detailed performance assessment studies allowing for image quality improvements. A scatter correction method is applied exemplarily by subtracting projections of simulated scattered coincidences from experimental data sets prior to image reconstruction with an iterative reconstruction process.
International Nuclear Information System (INIS)
A recent study investigated the feasibility to develop a bench-top x-ray fluorescence computed tomography (XFCT) system capable of determining the spatial distribution and concentration of gold nanoparticles (GNPs) in vivo using a diagnostic energy range polychromatic (i.e. 110 kVp) pencil-beam source. In this follow-up study, we examined the feasibility of a polychromatic cone-beam implementation of XFCT by Monte Carlo (MC) simulations using the MCNP5 code. In the current MC model, cylindrical columns with various sizes (5-10 mm in diameter) containing water loaded with GNPs (0.1-2% gold by weight) were inserted into a 5 cm diameter cylindrical polymethyl methacrylate phantom. The phantom was then irradiated by a lead-filtered 110 kVp x-ray source, and the resulting gold fluorescence and Compton-scattered photons were collected by a series of energy-sensitive tallies after passing through lead parallel-hole collimators. A maximum-likelihood iterative reconstruction algorithm was implemented to reconstruct the image of GNP-loaded objects within the phantom. The effects of attenuation of both the primary beam through the phantom and the gold fluorescence photons en route to the detector were corrected during the image reconstruction. Accurate images of the GNP-containing phantom were successfully reconstructed for three different phantom configurations, with both spatial distribution and relative concentration of GNPs well identified. The pixel intensity of regions containing GNPs was linearly proportional to the gold concentration. The current MC study strongly suggests the possibility of developing a bench-top, polychromatic, cone-beam XFCT system for in vivo imaging.
International Nuclear Information System (INIS)
A simulation toolkit, GATE (Geant4 Application for Tomographic Emission), was used to develop an accurate Monte Carlo (MC) simulation of a fully integrated 3T PET/MR hybrid imaging system (Siemens Biograph mMR). The PET/MR components of the Biograph mMR were simulated in order to allow a detailed study of variations of the system design on the PET performance, which are not easy to access and measure on a real PET/MR system. The 3T static magnetic field of the MR system was taken into account in all Monte Carlo simulations. The validation of the MC model was carried out against actual measurements performed on the PET/MR system by following the NEMA (National Electrical Manufacturers Association) NU 2-2007 standard. The comparison of simulated and experimental performance measurements included spatial resolution, sensitivity, scatter fraction, and count rate capability. The validated system model was then used for two different applications. The first application focused on investigating the effect of an extension of the PET field-of-view on the PET performance of the PET/MR system. The second application deals with simulating a modified system timing resolution and coincidence time window of the PET detector electronics in order to simulate time-of-flight (TOF) PET detection. A dedicated phantom was modeled to investigate the impact of TOF on overall PET image quality. Simulation results showed that the overall divergence between simulated and measured data was found to be less than 10%. Varying the detector geometry showed that the system sensitivity and noise equivalent count rate of the PET/MR system increased progressively with an increasing number of axial detector block rings, as to be expected. TOF-based PET reconstructions of the modeled phantom showed an improvement in signal-to-noise ratio and image contrast to the conventional non-TOF PET reconstructions. In conclusion, the validated MC simulation model of an integrated PET/MR system with an overall
International Nuclear Information System (INIS)
The radiation detection efficiency of four scintillators employed, or designed to be employed, in positron emission imaging (PET) was evaluated as a function of the crystal thickness by applying Monte Carlo Methods. The scintillators studied were the LuSiO5 (LSO), LuAlO3 (LuAP), Gd2SiO5 (GSO) and the YAlO3 (YAP). Crystal thicknesses ranged from 0 to 50 mm. The study was performed via a previously generated photon transport Monte Carlo code. All photon track and energy histories were recorded and the energy transferred or absorbed in the scintillator medium was calculated together with the energy redistributed and retransported as secondary characteristic fluorescence radiation. Various parameters were calculated e.g. the fraction of the incident photon energy absorbed, transmitted or redistributed as fluorescence radiation, the scatter to primary ratio, the photon and energy distribution within each scintillator block etc. As being most significant, the fraction of the incident photon energy absorbed was found to increase with increasing crystal thickness tending to form a plateau above the 30 mm thickness. For LSO, LuAP, GSO and YAP scintillators, respectively, this fraction had the value of 44.8, 36.9 and 45.7% at the 10 mm thickness and 96.4, 93.2 and 96.9% at the 50 mm thickness. Within the plateau area approximately (57-59)% (59-63)% (52-63)% and (58-61)% of this fraction was due to scattered and reabsorbed radiation for the LSO, GSO, YAP and LuAP scintillators, respectively. In all cases, a negligible fraction (<0.1%) of the absorbed energy was found to escape the crystal as fluorescence radiation
Montanari, Davide; Silvestri, Chiara; Graves, Yan J; Yan, Hao; Cervino, Laura; Rice, Roger; Jiang, Steve B; Jia, Xun
2013-01-01
Cone beam CT (CBCT) has been widely used for patient setup in image guided radiation therapy (IGRT). Radiation dose from CBCT scans has become a clinical concern. The purposes of this study are 1) to commission a GPU-based Monte Carlo (MC) dose calculation package gCTD for Varian On-Board Imaging (OBI) system and test the calculation accuracy, and 2) to quantitatively evaluate CBCT dose from the OBI system in typical IGRT scan protocols. We first conducted dose measurements in a water phantom. X-ray source model parameters used in gCTD are obtained through a commissioning process. gCTD accuracy is demonstrated by comparing calculations with measurements in water and in CTDI phantoms. 25 brain cancer patients are used to study dose in a standard-dose head protocol, and 25 prostate cancer patients are used to study dose in pelvis protocol and pelvis spotlight protocol. Mean dose to each organ is calculated. Mean dose to 2% voxels that have the highest dose is also computed to quantify the maximum dose. It is fo...
International Nuclear Information System (INIS)
We studied the performance of a dual-panel positron emission tomography (PET) camera dedicated to breast cancer imaging using Monte Carlo simulation. The PET camera under development has two 10x15 cm2 plates that are constructed from arrays of 1x1x3 mm3 LSO crystals coupled to novel ultra-thin (3 intrinsic spatial resolution, 3, 2x2x10 mm3, 3x3x30 mm3, and 4x4x20 mm3 LSO crystal resolutions and different panel separations. Images were reconstructed by focal plane tomography with attenuation and normalization corrections applied. Simulation results indicate that with an activity concentration ratio of tumor:breast:heart:torso of 10:1:10:1 and 30 s of acquisition time, only the dual-plate PET camera comprising 1x1x3 mm3 crystals could resolve 2.5 mm diameter spheres with an average peak-to-valley ratio of 1.3
Hajizadeh-Safar, M; Ghorbani, M; Khoshkharam, S; Ashrafi, Z
2014-07-01
Gamma camera is an important apparatus in nuclear medicine imaging. Its detection part is consists of a scintillation detector with a heavy collimator. Substitution of semiconductor detectors instead of scintillator in these cameras has been effectively studied. In this study, it is aimed to introduce a new design of P-N semiconductor detector array for nuclear medicine imaging. A P-N semiconductor detector composed of N-SnO2 :F, and P-NiO:Li, has been introduced through simulating with MCNPX monte carlo codes. Its sensitivity with different factors such as thickness, dimension, and direction of emission photons were investigated. It is then used to configure a new design of an array in one-dimension and study its spatial resolution for nuclear medicine imaging. One-dimension array with 39 detectors was simulated to measure a predefined linear distribution of Tc(99_m) activity and its spatial resolution. The activity distribution was calculated from detector responses through mathematical linear optimization using LINPROG code on MATLAB software. Three different configurations of one-dimension detector array, horizontal, vertical one sided, and vertical double-sided were simulated. In all of these configurations, the energy windows of the photopeak were ± 1%. The results show that the detector response increases with an increase of dimension and thickness of the detector with the highest sensitivity for emission photons 15-30° above the surface. Horizontal configuration array of detectors is not suitable for imaging of line activity sources. The measured activity distribution with vertical configuration array, double-side detectors, has no similarity with emission sources and hence is not suitable for imaging purposes. Measured activity distribution using vertical configuration array, single side detectors has a good similarity with sources. Therefore, it could be introduced as a suitable configuration for nuclear medicine imaging. It has been shown that using
Bashkatov, A. N.; Genina, Elina A.; Kochubei, V. I.; Tuchin, Valerii V.
2006-12-01
Based on the digital image analysis and inverse Monte-Carlo method, the proximate analysis method is deve-loped and the optical properties of hairs of different types are estimated in three spectral ranges corresponding to three colour components. The scattering and absorption properties of hairs are separated for the first time by using the inverse Monte-Carlo method. The content of different types of melanin in hairs is estimated from the absorption coefficient. It is shown that the dominating type of melanin in dark hairs is eumelanin, whereas in light hairs pheomelanin dominates.
Ney, Michael; Abdulhalim, Ibrahim
2016-03-01
Skin cancer detection at its early stages has been the focus of a large number of experimental and theoretical studies during the past decades. Among these studies two prominent approaches presenting high potential are reflectometric sensing at the THz wavelengths region and polarimetric imaging techniques in the visible wavelengths. While THz radiation contrast agent and source of sensitivity to cancer related tissue alterations was considered to be mainly the elevated water content in the cancerous tissue, the polarimetric approach has been verified to enable cancerous tissue differentiation based on cancer induced structural alterations to the tissue. Combining THz with the polarimetric approach, which is considered in this study, is examined in order to enable higher detection sensitivity than previously pure reflectometric THz measurements. For this, a comprehensive MC simulation of radiative transfer in a complex skin tissue model fitted for the THz domain that considers the skin`s stratified structure, tissue material optical dispersion modeling, surface roughness, scatterers, and substructure organelles has been developed. Additionally, a narrow beam Mueller matrix differential analysis technique is suggested for assessing skin cancer induced changes in the polarimetric image, enabling the tissue model and MC simulation to be utilized for determining the imaging parameters resulting in maximal detection sensitivity.
Monte Carlo techniques in radiation therapy
Verhaegen, Frank
2013-01-01
Modern cancer treatment relies on Monte Carlo simulations to help radiotherapists and clinical physicists better understand and compute radiation dose from imaging devices as well as exploit four-dimensional imaging data. With Monte Carlo-based treatment planning tools now available from commercial vendors, a complete transition to Monte Carlo-based dose calculation methods in radiotherapy could likely take place in the next decade. Monte Carlo Techniques in Radiation Therapy explores the use of Monte Carlo methods for modeling various features of internal and external radiation sources, including light ion beams. The book-the first of its kind-addresses applications of the Monte Carlo particle transport simulation technique in radiation therapy, mainly focusing on external beam radiotherapy and brachytherapy. It presents the mathematical and technical aspects of the methods in particle transport simulations. The book also discusses the modeling of medical linacs and other irradiation devices; issues specific...
Energy Technology Data Exchange (ETDEWEB)
Teymurazyan, A. [Imaging Research, Sunnybrook Health Sciences Centre, Department of Medical Biophysics, University of Toronto, Toronto M4N 3M5 (Canada); Rowlands, J. A. [Imaging Research, Sunnybrook Health Sciences Centre, Department of Medical Biophysics, University of Toronto, Toronto M4N 3M5 (Canada); Thunder Bay Regional Research Institute (TBRRI), Thunder Bay P7A 7T1 (Canada); Department of Radiation Oncology, University of Toronto, Toronto M5S 3E2 (Canada); Pang, G., E-mail: geordi.pang@sunnybrook.ca [Imaging Research, Sunnybrook Health Sciences Centre, Department of Medical Biophysics, University of Toronto, Toronto M4N 3M5 (Canada); Department of Radiation Oncology, University of Toronto, Toronto M5S 3E2 (Canada); Odette Cancer Centre, Toronto M4N 3M5 (Canada); Department of Physics, Ryerson University, Toronto M5B 2K3 (Canada)
2014-04-15
Purpose: Electronic Portal Imaging Devices (EPIDs) have been widely used in radiation therapy and are still needed on linear accelerators (Linacs) equipped with kilovoltage cone beam CT (kV-CBCT) or MRI systems. Our aim is to develop a new high quantum efficiency (QE) Čerenkov Portal Imaging Device (CPID) that is quantum noise limited at dose levels corresponding to a single Linac pulse. Methods: Recently a new concept of CPID for MV x-ray imaging in radiation therapy was introduced. It relies on Čerenkov effect for x-ray detection. The proposed design consisted of a matrix of optical fibers aligned with the incident x-rays and coupled to an active matrix flat panel imager (AMFPI) for image readout. A weakness of such design is that too few Čerenkov light photons reach the AMFPI for each incident x-ray and an AMFPI with an avalanche gain is required in order to overcome the readout noise for portal imaging application. In this work the authors propose to replace the optical fibers in the CPID with light guides without a cladding layer that are suspended in air. The air between the light guides takes on the role of the cladding layer found in a regular optical fiber. Since air has a significantly lower refractive index (∼1 versus 1.38 in a typical cladding layer), a much superior light collection efficiency is achieved. Results: A Monte Carlo simulation of the new design has been conducted to investigate its feasibility. Detector quantities such as quantum efficiency (QE), spatial resolution (MTF), and frequency dependent detective quantum efficiency (DQE) have been evaluated. The detector signal and the quantum noise have been compared to the readout noise. Conclusions: Our studies show that the modified new CPID has a QE and DQE more than an order of magnitude greater than that of current clinical systems and yet a spatial resolution similar to that of current low-QE flat-panel based EPIDs. Furthermore it was demonstrated that the new CPID does not require an
Energy Technology Data Exchange (ETDEWEB)
Crespo, Cristina; Aguiar, Pablo [Universitat de Barcelona - IDIBAPS, Unitat de Biofisica i Bioenginyeria, Departament de Ciencies Fisiologiques I, Facultat de Medicina, Barcelona (Spain); Gallego, Judith [Universitat Politecnica de Catalunya, Institut de Tecniques Energetiques, Barcelona (Spain); Institut de Bioenginyeria de Catalunya, Barcelona (Spain); Cot, Albert [Universitat de Barcelona - IDIBAPS, Unitat de Biofisica i Bioenginyeria, Departament de Ciencies Fisiologiques I, Facultat de Medicina, Barcelona (Spain); Universitat Politecnica de Catalunya, Seccio d' Enginyeria Nuclear, Departament de Fisica i Enginyeria Nuclear, Barcelona (Spain); Falcon, Carles; Ros, Domenec [Universitat de Barcelona - IDIBAPS, Unitat de Biofisica i Bioenginyeria, Departament de Ciencies Fisiologiques I, Facultat de Medicina, Barcelona (Spain); CIBER en Bioingenieria, Biomateriales y Nanomedicina (CIBER-BBN), Barcelona (Spain); Bullich, Santiago [Hospital del Mar, Center for Imaging in Psychiatry, CRC-MAR, Barcelona (Spain); Pareto, Deborah [CIBER en Bioingenieria, Biomateriales y Nanomedicina (CIBER-BBN), Barcelona (Spain); PRBB, Institut d' Alta Tecnologia, Barcelona (Spain); Sempau, Josep [Universitat Politecnica de Catalunya, Institut de Tecniques Energetiques, Barcelona (Spain); CIBER en Bioingenieria, Biomateriales y Nanomedicina (CIBER-BBN), Barcelona (Spain); Lomena, Francisco [IDIBAPS, Servei de Medicina Nuclear, Hospital Clinic, Barcelona (Spain); Calvino, Francisco [Universitat Politecnica de Catalunya, Institut de Tecniques Energetiques, Barcelona (Spain); Universitat Politecnica de Catalunya, Seccio d' Enginyeria Nuclear, Departament de Fisica i Enginyeria Nuclear, Barcelona (Spain); Pavia, Javier [CIBER en Bioingenieria, Biomateriales y Nanomedicina (CIBER-BBN), Barcelona (Spain); IDIBAPS, Servei de Medicina Nuclear, Hospital Clinic, Barcelona (Spain)
2008-07-15
{sup 123}I-labelled radioligands are commonly used for single-photon emission computed tomography (SPECT) imaging of the dopaminergic system to study the dopamine transporter binding. The aim of this work was to compare the quantitative capabilities of two different SPECT systems through Monte Carlo (MC) simulation. The SimSET MC code was employed to generate simulated projections of a numerical phantom for two gamma cameras equipped with a parallel and a fan-beam collimator, respectively. A fully 3D iterative reconstruction algorithm was used to compensate for attenuation, the spatially variant point spread function (PSF) and scatter. A post-reconstruction partial volume effect (PVE) compensation was also developed. For both systems, the correction for all degradations and PVE compensation resulted in recovery factors of the theoretical specific uptake ratio (SUR) close to 100%. For a SUR value of 4, the recovered SUR for the parallel imaging system was 33% for a reconstruction without corrections (OSEM), 45% for a reconstruction with attenuation correction (OSEM-A), 56% for a 3D reconstruction with attenuation and PSF corrections (OSEM-AP), 68% for OSEM-AP with scatter correction (OSEM-APS) and 97% for OSEM-APS plus PVE compensation (OSEM-APSV). For the fan-beam imaging system, the recovered SUR was 41% without corrections, 55% for OSEM-A, 65% for OSEM-AP, 75% for OSEM-APS and 102% for OSEM-APSV. Our findings indicate that the correction for degradations increases the quantification accuracy, with PVE compensation playing a major role in the SUR quantification. The proposed methodology allows us to reach similar SUR values for different SPECT systems, thereby allowing a reliable standardisation in multicentric studies. (orig.)
Energy Technology Data Exchange (ETDEWEB)
Peterson, Mikael, E-mail: Mikael.Peterson@med.lu.se; Strand, Sven-Erik; Ljungberg, Michael [Department of Medical Radiation Physics, Clinical Science, Lund University, Lund 221 85 (Sweden)
2015-04-15
Purpose: Pinhole collimation is the most common method of high-resolution preclinical single photon emission computed tomography imaging. The collimators are usually constructed from dense materials with high atomic numbers, such as gold and platinum, which are expensive and not always flexible in the fabrication step. In this work, the authors have investigated the properties of a fusible alloy called Rose’s metal and its potential in pinhole preclinical imaging. When compared to current standard pinhole materials such as gold and platinum, Rose’s metal has a lower density and a relatively low effective atomic number. However, it is inexpensive, has a low melting point, and does not contract when solidifying. Once cast, the piece can be machined with high precision. The aim of this study was to evaluate the imaging properties for Rose’s metal and compare them with those of standard materials. Methods: After validating their Monte Carlo code by comparing its results with published data and the results from analytical calculations, they investigated different pinhole geometries by varying the collimator material, acceptance angle, aperture diameter, and photon incident angle. The penetration-to-scatter and penetration-to-total component ratios, sensitivity, and the spatial resolution were determined for gold, tungsten, and Rose’s metal for two radionuclides, {sup 99}Tc{sup m} and {sup 125}I. Results: The Rose’s metal pinhole-imaging simulations show higher penetration/total and scatter/total ratios. For example, the penetration/total is 50% for gold and 75% for Rose’s metal when simulating {sup 99}Tc{sup m} with a 0.3 mm aperture diameter and a 60° acceptance angle. However, the degradation in spatial resolution remained below 10% relative to the spatial resolution for gold for acceptance angles below 40° and aperture diameters larger than 0.5 mm. Conclusions: Extra penetration and scatter associated with Rose’s metal contribute to degradation in the
Energy Technology Data Exchange (ETDEWEB)
Laliena Bielsa, V.; Jimenez Albericio, F. J.; Gandia Martinez, A.; Font Gomez, J. A.; Mengual Gil, M. A.; Andres Redondo, M. M.
2013-07-01
The source of uncertainty is not exclusive of the Monte Carlo method, but it will be present in any algorithm which takes into account the correction for heterogeneity. Although we hope that the uncertainty described above is small, the objective of this work is to try to quantify depending on the CT study. (Author)
International Nuclear Information System (INIS)
Purpose: Our purpose is to evaluate the administered absorbed dose in pediatric, nuclear imaging studies. Monte Carlo simulations with the incorporation of pediatric computational models can serve as reference for the accurate determination of absorbed dose. The procedure of the calculated dosimetric factors is described, while a dataset of reference doses is created. Methods: Realistic simulations were executed using the GATE toolkit and a series of pediatric computational models, developed by the “IT'IS Foundation”. The series of the phantoms used in our work includes 6 models in the range of 5–14 years old (3 boys and 3 girls). Pre-processing techniques were applied to the images, to incorporate the phantoms in GATE simulations. The resolution of the phantoms was set to 2 mm3. The most important organ densities were simulated according to the GATE “Materials Database”. Several used radiopharmaceuticals in SPECT and PET applications are being tested, following the EANM pediatric dosage protocol. The biodistributions of the several isotopes used as activity maps in the simulations, were derived by the literature. Results: Initial results of absorbed dose per organ (mGy) are presented in a 5 years old girl from the whole body exposure to 99mTc - SestaMIBI, 30 minutes after administration. Heart, kidney, liver, ovary, pancreas and brain are the most critical organs, in which the S-factors are calculated. The statistical uncertainty in the simulation procedure was kept lower than 5%. The Sfactors for each target organ are calculated in Gy/(MBq*sec) with highest dose being absorbed in kidneys and pancreas (9.29*1010 and 0.15*1010 respectively). Conclusion: An approach for the accurate dosimetry on pediatric models is presented, creating a reference dosage dataset for several radionuclides in children computational models with the advantages of MC techniques. Our study is ongoing, extending our investigation to other reference models and evaluating the
Energy Technology Data Exchange (ETDEWEB)
Papadimitroulas, P; Kagadis, GC [University of Patras, Rion, Ahaia (Greece); Loudos, G [Technical Educational Institute of Athens, Aigaleo, Attiki (Greece)
2014-06-15
Purpose: Our purpose is to evaluate the administered absorbed dose in pediatric, nuclear imaging studies. Monte Carlo simulations with the incorporation of pediatric computational models can serve as reference for the accurate determination of absorbed dose. The procedure of the calculated dosimetric factors is described, while a dataset of reference doses is created. Methods: Realistic simulations were executed using the GATE toolkit and a series of pediatric computational models, developed by the “IT'IS Foundation”. The series of the phantoms used in our work includes 6 models in the range of 5–14 years old (3 boys and 3 girls). Pre-processing techniques were applied to the images, to incorporate the phantoms in GATE simulations. The resolution of the phantoms was set to 2 mm3. The most important organ densities were simulated according to the GATE “Materials Database”. Several used radiopharmaceuticals in SPECT and PET applications are being tested, following the EANM pediatric dosage protocol. The biodistributions of the several isotopes used as activity maps in the simulations, were derived by the literature. Results: Initial results of absorbed dose per organ (mGy) are presented in a 5 years old girl from the whole body exposure to 99mTc - SestaMIBI, 30 minutes after administration. Heart, kidney, liver, ovary, pancreas and brain are the most critical organs, in which the S-factors are calculated. The statistical uncertainty in the simulation procedure was kept lower than 5%. The Sfactors for each target organ are calculated in Gy/(MBq*sec) with highest dose being absorbed in kidneys and pancreas (9.29*10{sup 10} and 0.15*10{sup 10} respectively). Conclusion: An approach for the accurate dosimetry on pediatric models is presented, creating a reference dosage dataset for several radionuclides in children computational models with the advantages of MC techniques. Our study is ongoing, extending our investigation to other reference models and
International Nuclear Information System (INIS)
Ajou University School of Medicine made the serially sectioned anatomical images from the Visible Korean Human (VKH) Project in Korea. The VKH images, which are the high-resolution color photographic images, show the organs and tissues in the human body very clearly at 0.2 mm intervals. In this study, we constructed a high-quality voxel model (VKH-Man) with a total of 30 organs and tissues by manual and automatic segmentation method using the serially sectioned anatomical image data from the Visible Korean Human (VKH) project in Korea. The height and weight of VKH-Man voxel model is 164 cm and 57.6 kg, respectively, and the voxel resolution is 1.875 x 1.875 x 2 mm3. However, this voxel phantom can be used to calculate the organ and tissue doses of only one person. Therefore, in this study, we adjusted the voxel phantom to the 'Reference Korean' data to construct the voxel phantom that represents the radiation workers in Korea. The height and weight of the voxel model (HDRK-Man) that is finally developed are 171 cm and 68 kg, respectively, and the voxel resolution is 1.981 x 1.981 x 2.0854 mm3. VKH-Man and HDRK-Man voxel model were implemented in a Monte Carlo particle transport simulation code for calculation of the organ and tissue doses in various irradiation geometries. The calculated values were compared with each other to see the effect of the adjustment and also compared with other computational models (KTMAN-2, ICRP-74 and VIP-Man). According to the results, the adjustment of the voxel model was found hardly affect the dose calculations and most of the organ and tissue equivalent doses showed some differences among the models. These results shows that the difference in figure, and organ topology affects the organ doses more than the organ size. The calculated values of the effective dose from VKH-Man and HDRK-Man according to the ICRP-60 and upcoming ICRP recommendation were compared. For the other radiation geometries (AP, LLAT, RLAT) except for PA
Schmid, S.; Landry, G.; Thieke, C.; Verhaegen, F.; Ganswindt, U.; Belka, C.; Parodi, K.; Dedes, G.
2015-12-01
Proton range verification based on prompt gamma imaging is increasingly considered in proton therapy. Tissue heterogeneity normal to the beam direction or near the end of range may considerably degrade the ability of prompt gamma imaging to detect proton range shifts. The goal of this study was to systematically investigate the accuracy and precision of range detection from prompt gamma emission profiles for various fractions for intensity modulated proton therapy of prostate cancer, using a comprehensive clinical dataset of 15 different CT scans for 5 patients. Monte Carlo simulations using Geant4 were performed to generate spot-by-spot dose distributions and prompt gamma emission profiles for prostate treatment plans. The prompt gammas were scored at their point of emission. Three CT scans of the same patient were used to evaluate the impact of inter-fractional changes on proton range. The range shifts deduced from the comparison of prompt gamma emission profiles in the planning CT and subsequent CTs were then correlated to the corresponding range shifts deduced from the dose distributions for individual pencil beams. The distributions of range shift differences between prompt gamma and dose were evaluated in terms of precision (defined as half the 95% inter-percentile range IPR) and accuracy (median). In total about 1700 individual proton pencil beams were investigated. The IPR of the relative range shift differences between the dose profiles and the prompt gamma profiles varied between ±1.4 mm and ±2.9 mm when using the more robust profile shifting analysis. The median was found smaller than 1 mm. Methods to identify and reject unreliable spots for range verification due to range mixing were derived and resulted in an average 10% spot rejection, clearly improving the prompt gamma-dose correlation. This work supports that prompt gamma imaging can offer a reliable indicator of range changes due to anatomical variations and tissue heterogeneity
Energy Technology Data Exchange (ETDEWEB)
Silva, Carlos Borges da
2007-05-15
The image acquisition methods applied to nuclear medicine and radiobiology are a valuable research study for determination of thyroid anatomy to seek disorders associated to follicular cells. The Monte Carlo (MC) simulation has also been used in problems related to radiation detection in order to map medical images since the improvement of data processing compatible with personnel computers (PC). This work presents an innovative study to find out the adequate scintillation inorganic detector array that could be coupled to a specific light photo sensor, a charge coupled device (CCD) through a fiber optic plate in order to map the follicles of thyroid gland. The goal is to choose the type of detector that fits the application suggested here with spatial resolution of 10 {mu}m and good detector efficiency. The methodology results are useful to map a follicle image using gamma radiation emission. A source - detector simulation is performed by using a MCNP4B (Monte Carlo for Neutron Photon transport) general code considering different source energies, detector materials and geometries including pixel sizes and reflector types. The results demonstrate that by using MCNP4B code is possible to searching for useful parameters related to the systems used in nuclear medicine, specifically in radiobiology applied to endocrine physiology studies to acquiring thyroid follicles images. (author)
Bauer, J; Unholtz, D; Kurz, C; Parodi, K
2013-08-01
We report on the experimental campaign carried out at the Heidelberg Ion-Beam Therapy Center (HIT) to optimize the Monte Carlo (MC) modelling of proton-induced positron-emitter production. The presented experimental strategy constitutes a pragmatic inverse approach to overcome the known uncertainties in the modelling of positron-emitter production due to the lack of reliable cross-section data for the relevant therapeutic energy range. This work is motivated by the clinical implementation of offline PET/CT-based treatment verification at our facility. Here, the irradiation induced tissue activation in the patient is monitored shortly after the treatment delivery by means of a commercial PET/CT scanner and compared to a MC simulated activity expectation, derived under the assumption of a correct treatment delivery. At HIT, the MC particle transport and interaction code FLUKA is used for the simulation of the expected positron-emitter yield. For this particular application, the code is coupled to externally provided cross-section data of several proton-induced reactions. Studying experimentally the positron-emitting radionuclide yield in homogeneous phantoms provides access to the fundamental production channels. Therefore, five different materials have been irradiated by monoenergetic proton pencil beams at various energies and the induced β(+) activity subsequently acquired with a commercial full-ring PET/CT scanner. With the analysis of dynamically reconstructed PET images, we are able to determine separately the spatial distribution of different radionuclide concentrations at the starting time of the PET scan. The laterally integrated radionuclide yields in depth are used to tune the input cross-section data such that the impact of both the physical production and the imaging process on the various positron-emitter yields is reproduced. The resulting cross-section data sets allow to model the absolute level of measured β(+) activity induced in the investigated
Chabert, I.; Barat, E.; Dautremer, T.; Montagu, T.; Agelou, M.; Croc de Suray, A.; Garcia-Hernandez, J. C.; Gempp, S.; Benkreira, M.; de Carlan, L.; Lazaro, D.
2016-07-01
This work aims at developing a generic virtual source model (VSM) preserving all existing correlations between variables stored in a Monte Carlo pre-computed phase space (PS) file, for dose calculation and high-resolution portal image prediction. The reference PS file was calculated using the PENELOPE code, after the flattening filter (FF) of an Elekta Synergy 6 MV photon beam. Each particle was represented in a mobile coordinate system by its radial position (r s ) in the PS plane, its energy (E), and its polar and azimuthal angles (φ d and θ d ), describing the particle deviation compared to its initial direction after bremsstrahlung, and the deviation orientation. Three sub-sources were created by sorting out particles according to their last interaction location (target, primary collimator or FF). For each sub-source, 4D correlated-histograms were built by storing E, r s , φ d and θ d values. Five different adaptive binning schemes were studied to construct 4D histograms of the VSMs, to ensure histogram efficient handling as well as an accurate reproduction of E, r s , φ d and θ d distribution details. The five resulting VSMs were then implemented in PENELOPE. Their accuracy was first assessed in the PS plane, by comparing E, r s , φ d and θ d distributions with those obtained from the reference PS file. Second, dose distributions computed in water, using the VSMs and the reference PS file located below the FF, and also after collimation in both water and heterogeneous phantom, were compared using a 1.5%–0 mm and a 2%–0 mm global gamma index, respectively. Finally, portal images were calculated without and with phantoms in the beam. The model was then evaluated using a 1%–0 mm global gamma index. Performance of a mono-source VSM was also investigated and led, as with the multi-source model, to excellent results when combined with an adaptive binning scheme.
Chabert, I; Barat, E; Dautremer, T; Montagu, T; Agelou, M; Croc de Suray, A; Garcia-Hernandez, J C; Gempp, S; Benkreira, M; de Carlan, L; Lazaro, D
2016-07-21
This work aims at developing a generic virtual source model (VSM) preserving all existing correlations between variables stored in a Monte Carlo pre-computed phase space (PS) file, for dose calculation and high-resolution portal image prediction. The reference PS file was calculated using the PENELOPE code, after the flattening filter (FF) of an Elekta Synergy 6 MV photon beam. Each particle was represented in a mobile coordinate system by its radial position (r s ) in the PS plane, its energy (E), and its polar and azimuthal angles (φ d and θ d ), describing the particle deviation compared to its initial direction after bremsstrahlung, and the deviation orientation. Three sub-sources were created by sorting out particles according to their last interaction location (target, primary collimator or FF). For each sub-source, 4D correlated-histograms were built by storing E, r s , φ d and θ d values. Five different adaptive binning schemes were studied to construct 4D histograms of the VSMs, to ensure histogram efficient handling as well as an accurate reproduction of E, r s , φ d and θ d distribution details. The five resulting VSMs were then implemented in PENELOPE. Their accuracy was first assessed in the PS plane, by comparing E, r s , φ d and θ d distributions with those obtained from the reference PS file. Second, dose distributions computed in water, using the VSMs and the reference PS file located below the FF, and also after collimation in both water and heterogeneous phantom, were compared using a 1.5%-0 mm and a 2%-0 mm global gamma index, respectively. Finally, portal images were calculated without and with phantoms in the beam. The model was then evaluated using a 1%-0 mm global gamma index. Performance of a mono-source VSM was also investigated and led, as with the multi-source model, to excellent results when combined with an adaptive binning scheme. PMID:27353090
Dunn, William L
2012-01-01
Exploring Monte Carlo Methods is a basic text that describes the numerical methods that have come to be known as "Monte Carlo." The book treats the subject generically through the first eight chapters and, thus, should be of use to anyone who wants to learn to use Monte Carlo. The next two chapters focus on applications in nuclear engineering, which are illustrative of uses in other fields. Five appendices are included, which provide useful information on probability distributions, general-purpose Monte Carlo codes for radiation transport, and other matters. The famous "Buffon's needle proble
International Nuclear Information System (INIS)
Highlights: • A new Monte Carlo photon transport code ARCHER-CT for CT dose calculations is developed to execute on the GPU and coprocessor. • ARCHER-CT is verified against MCNP. • The GPU code on an Nvidia M2090 GPU is 5.15–5.81 times faster than the parallel CPU code on an Intel X5650 6-core CPU. • The coprocessor code on an Intel Xeon Phi 5110p coprocessor is 3.30–3.38 times faster than the CPU code. - Abstract: Hardware accelerators are currently becoming increasingly important in boosting high performance computing systems. In this study, we tested the performance of two accelerator models, Nvidia Tesla M2090 GPU and Intel Xeon Phi 5110p coprocessor, using a new Monte Carlo photon transport package called ARCHER-CT we have developed for fast CT imaging dose calculation. The package contains three components, ARCHER-CTCPU, ARCHER-CTGPU and ARCHER-CTCOP designed to be run on the multi-core CPU, GPU and coprocessor architectures respectively. A detailed GE LightSpeed Multi-Detector Computed Tomography (MDCT) scanner model and a family of voxel patient phantoms are included in the code to calculate absorbed dose to radiosensitive organs under user-specified scan protocols. The results from ARCHER agree well with those from the production code Monte Carlo N-Particle eXtended (MCNPX). It is found that all the code components are significantly faster than the parallel MCNPX run on 12 MPI processes, and that the GPU and coprocessor codes are 5.15–5.81 and 3.30–3.38 times faster than the parallel ARCHER-CTCPU, respectively. The M2090 GPU performs better than the 5110p coprocessor in our specific test. Besides, the heterogeneous computation mode in which the CPU and the hardware accelerator work concurrently can increase the overall performance by 13–18%
International Nuclear Information System (INIS)
Hardware accelerators are currently becoming increasingly important in boosting high performance computing systems. In this study, we tested the performance of two accelerator models, NVIDIA Tesla M2090 GPU and Intel Xeon Phi 5110p coprocessor, using a new Monte Carlo photon transport package called ARCHER-CT we have developed for fast CT imaging dose calculation. The package contains three code variants, ARCHER-CT(CPU), ARCHER-CT(GPU) and ARCHER-CT(COP) to run in parallel on the multi-core CPU, GPU and coprocessor architectures respectively. A detailed GE LightSpeed Multi-Detector Computed Tomography (MDCT) scanner model and a family of voxel patient phantoms were included in the code to calculate absorbed dose to radiosensitive organs under specified scan protocols. The results from ARCHER agreed well with those from the production code Monte Carlo N-Particle eXtended (MCNPX). It was found that all the code variants were significantly faster than the parallel MCNPX running on 12 MPI processes, and that the GPU and coprocessor performed equally well, being 2.89-4.49 and 3.01-3.23 times faster than the parallel ARCHER-CT(CPU) running with 12 hyper-threads. (authors)
Foudray, Angela M K; Habte, Frezghi; Chinn, Garry; Zhang, Jin; Levin, Craig S
2006-01-01
We are investigating a high-sensitivity, high-resolution positron emission tomography (PET) system for clinical use in the detection, diagnosis and staging of breast cancer. Using conventional figures of merit, design parameters were evaluated for count rate performance, module dead time, and construction complexity. The detector system modeled comprises extremely thin position-sensitive avalanche photodiodes coupled to lutetium oxy-orthosilicate scintillation crystals. Previous investigations of detector geometries with Monte Carlo indicated that one of the largest impacts on sensitivity is local scintillation crystal density when considering systems having the same average scintillation crystal densities (same crystal packing fraction and system solid-angle coverage). Our results show the system has very good scatter and randoms rejection at clinical activity ranges ( approximately 200 muCi). PMID:17645997
Coded aperture optimization using Monte Carlo simulations
International Nuclear Information System (INIS)
Coded apertures using Uniformly Redundant Arrays (URA) have been unsuccessfully evaluated for two-dimensional and three-dimensional imaging in Nuclear Medicine. The images reconstructed from coded projections contain artifacts and suffer from poor spatial resolution in the longitudinal direction. We introduce a Maximum-Likelihood Expectation-Maximization (MLEM) algorithm for three-dimensional coded aperture imaging which uses a projection matrix calculated by Monte Carlo simulations. The aim of the algorithm is to reduce artifacts and improve the three-dimensional spatial resolution in the reconstructed images. Firstly, we present the validation of GATE (Geant4 Application for Emission Tomography) for Monte Carlo simulations of a coded mask installed on a clinical gamma camera. The coded mask modelling was validated by comparison between experimental and simulated data in terms of energy spectra, sensitivity and spatial resolution. In the second part of the study, we use the validated model to calculate the projection matrix with Monte Carlo simulations. A three-dimensional thyroid phantom study was performed to compare the performance of the three-dimensional MLEM reconstruction with conventional correlation method. The results indicate that the artifacts are reduced and three-dimensional spatial resolution is improved with the Monte Carlo-based MLEM reconstruction.
Monte Carlo Radiative Transfer
Whitney, Barbara A
2011-01-01
I outline methods for calculating the solution of Monte Carlo Radiative Transfer (MCRT) in scattering, absorption and emission processes of dust and gas, including polarization. I provide a bibliography of relevant papers on methods with astrophysical applications.
Monte Carlo transition probabilities
Lucy, L. B.
2001-01-01
Transition probabilities governing the interaction of energy packets and matter are derived that allow Monte Carlo NLTE transfer codes to be constructed without simplifying the treatment of line formation. These probabilities are such that the Monte Carlo calculation asymptotically recovers the local emissivity of a gas in statistical equilibrium. Numerical experiments with one-point statistical equilibrium problems for Fe II and Hydrogen confirm this asymptotic behaviour. In addition, the re...
International Nuclear Information System (INIS)
The investigation of some effects of the bombardment of material surfaces by keV ions has been carried out using a Monte Carlo computer simulation. This code models the target as an amorphous semi-infinite solid. Up to three elements may be used in the solid. The target's composition change during bombardment is simulated. Of the many phenomena taking place during the bombardment the simulation program will be used to study the dose dependence of the sputtering yield and ion-induced Auger electron emission. The study concludes that implanted Ar beam atoms do not alter the kinematics of the target by the amount necessary to cause the experimentally observed increase in the sputtering yield of Si. The computer study also suggests that for Ar bombardment of Al and Si the Auger electrons are predominantly from sputtered atoms. A novel technique for cleaning the surface of a liquid is described. This method has been used to obtain a highly clean surface on a liquid Ga drop. The surface flow generated by the ion bombardment is discussed. The circuit and structure of a quartz crystal third overtone resonator is presented. The resonator is designed for use in making dose dependent sputtering yield measurements. This apparatus can detect the removal of a fraction of a monolayer from a thin film that has been evaporated on its surface
Ponomarev, Artem; Cucinotta, F.
2011-01-01
To create a generalized mechanistic model of DNA damage in human cells that will generate analytical and image data corresponding to experimentally observed DNA damage foci and will help to improve the experimental foci yields by simulating spatial foci patterns and resolving problems with quantitative image analysis. Material and Methods: The analysis of patterns of RIFs (radiation-induced foci) produced by low- and high-LET (linear energy transfer) radiation was conducted by using a Monte Carlo model that combines the heavy ion track structure with characteristics of the human genome on the level of chromosomes. The foci patterns were also simulated in the maximum projection plane for flat nuclei. Some data analysis was done with the help of image segmentation software that identifies individual classes of RIFs and colocolized RIFs, which is of importance to some experimental assays that assign DNA damage a dual phosphorescent signal. Results: The model predicts the spatial and genomic distributions of DNA DSBs (double strand breaks) and associated RIFs in a human cell nucleus for a particular dose of either low- or high-LET radiation. We used the model to do analyses for different irradiation scenarios. In the beam-parallel-to-the-disk-of-a-flattened-nucleus scenario we found that the foci appeared to be merged due to their high density, while, in the perpendicular-beam scenario, the foci appeared as one bright spot per hit. The statistics and spatial distribution of regions of densely arranged foci, termed DNA foci chains, were predicted numerically using this model. Another analysis was done to evaluate the number of ion hits per nucleus, which were visible from streaks of closely located foci. In another analysis, our image segmentaiton software determined foci yields directly from images with single-class or colocolized foci. Conclusions: We showed that DSB clustering needs to be taken into account to determine the true DNA damage foci yield, which helps to
Hajizadeh-Safar, M.; Ghorbani, M.; Khoshkharam, S.; Ashrafi, Z.
2014-01-01
Gamma camera is an important apparatus in nuclear medicine imaging. Its detection part is consists of a scintillation detector with a heavy collimator. Substitution of semiconductor detectors instead of scintillator in these cameras has been effectively studied. In this study, it is aimed to introduce a new design of P-N semiconductor detector array for nuclear medicine imaging. A P-N semiconductor detector composed of N-SnO2 :F, and P-NiO:Li, has been introduced through simulating with MCNPX...
Institute of Scientific and Technical Information of China (English)
贾清刚; 张天奎; 张凤娜; 胡华四
2013-01-01
开发了基于Geant4的Z箍缩中子编码成像系统模拟平台,实现聚变中子编码成像诊断系统各关键部件的完整模拟.获得了低中子产额(约1010量级)下,中子经编码孔编码后在闪烁体阵列中形成的发光分布图像.利用维纳滤波、Richardson-Lucy(RL)及遗传算法(GA)对低中子产额下获得的极低信噪比图像进行重建,并对信噪比、中子产额及重建效果进行了对比研究,结果表明:遗传算法对低信噪比中子编码图像的重建具有很强的鲁棒性;中子编码图像的信噪比与遗传算法重建结果的准确性呈正比.%The model of Z-pinch driven fusion imaging diagnosis system was set up by a Monte Carlo code based on the Geant4 simulation toolkit. All physical processes that the reality involves are taken into consideration in simulation. The light image of low neutron yield (about 1010) pill was obtained. Three types of image reconstruction algorithm, i. e. Richardson-Lucy, Wiener filtering and genetic algorithm were employed to reconstruct the neutron image with a low signal to noise ratio (SNR) and yield. The effects of neutron yields and the SNR on reconstruction performance were discussed. The results show that genetic algorithm is very robust for reconstructing neutron images with a low SNR. And the index of reconstruction performance and the image correlation coefficient using genetic algorithm, are proportional to the SNR of the neutron coded image.
2009-01-01
Carlo Rubbia turned 75 on March 31, and CERN held a symposium to mark his birthday and pay tribute to his impressive contribution to both CERN and science. Carlo Rubbia, 4th from right, together with the speakers at the symposium.On 7 April CERN hosted a celebration marking Carlo Rubbia’s 75th birthday and 25 years since he was awarded the Nobel Prize for Physics. "Today we will celebrate 100 years of Carlo Rubbia" joked CERN’s Director-General, Rolf Heuer in his opening speech, "75 years of his age and 25 years of the Nobel Prize." Rubbia received the Nobel Prize along with Simon van der Meer for contributions to the discovery of the W and Z bosons, carriers of the weak interaction. During the symposium, which was held in the Main Auditorium, several eminent speakers gave lectures on areas of science to which Carlo Rubbia made decisive contributions. Among those who spoke were Michel Spiro, Director of the French National Insti...
DEFF Research Database (Denmark)
Slot Thing, Rune; Bernchou, Uffe; Mainegra-Hing, Ernesto;
2013-01-01
_cbct), was used to perform MC simulations of an Elekta XVI CBCT imaging system. A 60keV x-ray source was used, and air kerma scored at the detector plane. Several variance reduction techniques (VRTs) were used to increase the scatter calculation efficiency. Three patient phantoms based on CT scans were...... simulated, namely a brain, a thorax and a pelvis scan. A ray tracing algorithm was used to calculate the detector signal due to primary photons. A total of 288 projections were simulated, one for each thread on the computer cluster used for the investigation. Results. Scatter distributions for the brain......, thorax and pelvis scan were simulated within 2% statistical uncertainty in two hours per scan. Within the same time, the ray tracing algorithm provided the primary signal for each of the projections. Thus, all the data needed for MC-based scatter correction in clinical CBCT imaging was obtained within...
Lee, Young-Jin; Park, Su-Jin; Lee, Seung-Wan; Kim, Dae-Hong; Kim, Ye-Seul; Kim, Hee-Joung
2013-05-01
The photon counting detector based on cadmium telluride (CdTe) or cadmium zinc telluride (CZT) is a promising imaging modality that provides many benefits compared to conventional scintillation detectors. By using a pinhole collimator with the photon counting detector, we were able to improve both the spatial resolution and the sensitivity. The purpose of this study was to evaluate the photon counting and conventional scintillation detectors in a pinhole single-photon emission computed tomography (SPECT) system. We designed five pinhole SPECT systems of two types: one type with a CdTe photon counting detector and the other with a conventional NaI(Tl) scintillation detector. We conducted simulation studies and evaluated imaging performance. The results demonstrated that the spatial resolution of the CdTe photon counting detector was 0.38 mm, with a sensitivity 1.40 times greater than that of a conventional NaI(Tl) scintillation detector for the same detector thickness. Also, the average scatter fractions of the CdTe photon counting and the conventional NaI(Tl) scintillation detectors were 1.93% and 2.44%, respectively. In conclusion, we successfully evaluated various pinhole SPECT systems for small animal imaging.
Monte Carlo dose mapping on deforming anatomy
Zhong, Hualiang; Siebers, Jeffrey V.
2009-10-01
This paper proposes a Monte Carlo-based energy and mass congruent mapping (EMCM) method to calculate the dose on deforming anatomy. Different from dose interpolation methods, EMCM separately maps each voxel's deposited energy and mass from a source image to a reference image with a displacement vector field (DVF) generated by deformable image registration (DIR). EMCM was compared with other dose mapping methods: energy-based dose interpolation (EBDI) and trilinear dose interpolation (TDI). These methods were implemented in EGSnrc/DOSXYZnrc, validated using a numerical deformable phantom and compared for clinical CT images. On the numerical phantom with an analytically invertible deformation map, EMCM mapped the dose exactly the same as its analytic solution, while EBDI and TDI had average dose errors of 2.5% and 6.0%. For a lung patient's IMRT treatment plan, EBDI and TDI differed from EMCM by 1.96% and 7.3% in the lung patient's entire dose region, respectively. As a 4D Monte Carlo dose calculation technique, EMCM is accurate and its speed is comparable to 3D Monte Carlo simulation. This method may serve as a valuable tool for accurate dose accumulation as well as for 4D dosimetry QA.
Energy Technology Data Exchange (ETDEWEB)
Wollaber, Allan Benton [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)
2016-06-16
This is a powerpoint which serves as lecture material for the Parallel Computing summer school. It goes over the fundamentals of Monte Carlo. Welcome to Los Alamos, the birthplace of “Monte Carlo” for computational physics. Stanislaw Ulam, John von Neumann, and Nicholas Metropolis are credited as the founders of modern Monte Carlo methods. The name “Monte Carlo” was chosen in reference to the Monte Carlo Casino in Monaco (purportedly a place where Ulam’s uncle went to gamble). The central idea (for us) – to use computer-generated “random” numbers to determine expected values or estimate equation solutions – has since spread to many fields. "The first thoughts and attempts I made to practice [the Monte Carlo Method] were suggested by a question which occurred to me in 1946 as I was convalescing from an illness and playing solitaires. The question was what are the chances that a Canfield solitaire laid out with 52 cards will come out successfully? After spending a lot of time trying to estimate them by pure combinatorial calculations, I wondered whether a more practical method than “abstract thinking” might not be to lay it out say one hundred times and simply observe and count the number of successful plays... Later [in 1946], I described the idea to John von Neumann, and we began to plan actual calculations." - Stanislaw Ulam.
Leonardo Rossi
Carlo Caso (1940 - 2007) Our friend and colleague Carlo Caso passed away on July 7th, after several months of courageous fight against cancer. Carlo spent most of his scientific career at CERN, taking an active part in the experimental programme of the laboratory. His long and fruitful involvement in particle physics started in the sixties, in the Genoa group led by G. Tomasini. He then made several experiments using the CERN liquid hydrogen bubble chambers -first the 2000HBC and later BEBC- to study various facets of the production and decay of meson and baryon resonances. He later made his own group and joined the NA27 Collaboration to exploit the EHS Spectrometer with a rapid cycling bubble chamber as vertex detector. Amongst their many achievements, they were the first to measure, with excellent precision, the lifetime of the charmed D mesons. At the start of the LEP era, Carlo and his group moved to the DELPHI experiment, participating in the construction and running of the HPC electromagnetic c...
International Nuclear Information System (INIS)
Dedicated single-photon-emission computed tomography (SPECT) systems based on pixelated semiconductors such as cadmium telluride (CdTe) are in development to study small animal models of human disease. In an effort to develop a high-resolution, low-dose system for small animal imaging, we compared a CdTe-based SPECT system and a conventional NaI(Tl)-based SPECT system in terms of spatial resolution, sensitivity, contrast, and contrast-to-noise ratio (CNR). In addition, we investigated the radiation absorbed dose and calculated a figure of merit (FOM) for both SPECT systems. Using the conventional NaI(Tl)-based SPECT system, we achieved a spatial resolution of 1.66 mm at a 30 mm source-to-collimator distance, and a resolution of 2.4-mm hot-rods. Using the newly-developed CdTe-based SPECT system, we achieved a spatial resolution of 1.32 mm FWHM at a 30 mm source-to-collimator distance, and a resolution of 1.7-mm hot-rods. The sensitivities at a 30 mm source-to-collimator distance were 115.73 counts/sec/MBq and 83.38 counts/sec/MBq for the CdTe-based SPECT and conventional NaI(Tl)-based SPECT systems, respectively. To compare quantitative measurements in the mouse brain, we calculated the CNR for images from both systems. The CNR from the CdTe-based SPECT system was 4.41, while that from the conventional NaI(Tl)-based SPECT system was 3.11 when the injected striatal dose was 160 Bq/voxel. The CNR increased as a function of injected dose in both systems. The FOM of the CdTe-based SPECT system was superior to that of the conventional NaI(Tl)-based SPECT system, and the highest FOM was achieved with the CdTe-based SPECT at a dose of 40 Bq/voxel injected into the striatum. Thus, a CdTe-based SPECT system showed significant improvement in performance compared with a conventional system in terms of spatial resolution, sensitivity, and CNR, while reducing the radiation dose to the small animal subject. Herein, we discuss the feasibility of a CdTe-based SPECT system for high
Monte Carlo photon benchmark problems
International Nuclear Information System (INIS)
Photon benchmark calculations have been performed to validate the MCNP Monte Carlo computer code. These are compared to both the COG Monte Carlo computer code and either experimental or analytic results. The calculated solutions indicate that the Monte Carlo method, and MCNP and COG in particular, can accurately model a wide range of physical problems. 8 refs., 5 figs
Charlie Samuya Veric
2001-01-01
The importance of Carlos Bulosan in Filipino and Filipino-American radical history and literature is indisputable. His eminence spans the pacific, and he is known, diversely, as a radical poet, fictionist, novelist, and labor organizer. Author of the canonical America Iis the Hearts, Bulosan is celebrated for chronicling the conditions in America in his time, such as racism and unemployment. In the history of criticism on Bulosan's life and work, however, there is an undeclared general consen...
Energy Technology Data Exchange (ETDEWEB)
Wollaber, Allan Benton [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)
2016-06-16
This is a powerpoint presentation which serves as lecture material for the Parallel Computing summer school. It goes over the fundamentals of the Monte Carlo calculation method. The material is presented according to the following outline: Introduction (background, a simple example: estimating π), Why does this even work? (The Law of Large Numbers, The Central Limit Theorem), How to sample (inverse transform sampling, rejection), and An example from particle transport.
International Nuclear Information System (INIS)
The contributon Monte Carlo method is based on a new recipe to calculate target responses by means of volume integral of the contributon current in a region between the source and the detector. A comprehensive description of the method, its implementation in the general-purpose MCNP code, and results of the method for realistic nonhomogeneous, energy-dependent problems are presented. 23 figures, 10 tables
Directory of Open Access Journals (Sweden)
Charlie Samuya Veric
2001-12-01
Full Text Available The importance of Carlos Bulosan in Filipino and Filipino-American radical history and literature is indisputable. His eminence spans the pacific, and he is known, diversely, as a radical poet, fictionist, novelist, and labor organizer. Author of the canonical America Iis the Hearts, Bulosan is celebrated for chronicling the conditions in America in his time, such as racism and unemployment. In the history of criticism on Bulosan's life and work, however, there is an undeclared general consensus that views Bulosan and his work as coherent permanent texts of radicalism and anti-imperialism. Central to the existence of such a tradition of critical reception are the generations of critics who, in more ways than one, control the discourse on and of Carlos Bulosan. This essay inquires into the sphere of the critical reception that orders, for our time and for the time ahead, the reading and interpretation of Bulosan. What eye and seeing, the essay asks, determine the perception of Bulosan as the angel of radicalism? What is obscured in constructing Bulosan as an immutable figure of the political? What light does the reader conceive when the personal is brought into the open and situated against the political? the essay explores the answers to these questions in Bulosan's loving letters to various friends, strangers, and white American women. The presence of these interrogations, the essay believes, will secure ultimately the continuing importance of Carlos Bulosan to radical literature and history.
2009-01-01
On 7 April CERN will be holding a symposium to mark the 75th birthday of Carlo Rubbia, who shared the 1984 Nobel Prize for Physics with Simon van der Meer for contributions to the discovery of the W and Z bosons, carriers of the weak interaction. Following a presentation by Rolf Heuer, lectures will be given by eminent speakers on areas of science to which Carlo Rubbia has made decisive contributions. Michel Spiro, Director of the French National Institute of Nuclear and Particle Physics (IN2P3) of the CNRS, Lyn Evans, sLHC Project Leader, and Alan Astbury of the TRIUMF Laboratory will talk about the physics of the weak interaction and the discovery of the W and Z bosons. Former CERN Director-General Herwig Schopper will lecture on CERN’s accelerators from LEP to the LHC. Giovanni Bignami, former President of the Italian Space Agency, will speak about his work with Carlo Rubbia. Finally, Hans Joachim Schellnhuber of the Potsdam Institute for Climate Research and Sven Kul...
2009-01-01
On 7 April CERN will be holding a symposium to mark the 75th birthday of Carlo Rubbia, who shared the 1984 Nobel Prize for Physics with Simon van der Meer for contributions to the discovery of the W and Z bosons, carriers of the weak interaction. Following a presentation by Rolf Heuer, lectures will be given by eminent speakers on areas of science to which Carlo Rubbia has made decisive contributions. Michel Spiro, Director of the French National Institute of Nuclear and Particle Physics (IN2P3) of the CNRS, Lyn Evans, sLHC Project Leader, and Alan Astbury of the TRIUMF Laboratory will talk about the physics of the weak interaction and the discovery of the W and Z bosons. Former CERN Director-General Herwig Schopper will lecture on CERN’s accelerators from LEP to the LHC. Giovanni Bignami, former President of the Italian Space Agency and Professor at the IUSS School for Advanced Studies in Pavia will speak about his work with Carlo Rubbia. Finally, Hans Joachim Sch...
National Aeronautics and Space Administration — Images for the website main pages and all configurations. The upload and access points for the other images are: Website Template RSW images BSCW Images HIRENASD...
Optimization of Monte Carlo simulations
Bryskhe, Henrik
2009-01-01
This thesis considers several different techniques for optimizing Monte Carlo simulations. The Monte Carlo system used is Penelope but most of the techniques are applicable to other systems. The two mayor techniques are the usage of the graphics card to do geometry calculations, and raytracing. Using graphics card provides a very efficient way to do fast ray and triangle intersections. Raytracing provides an approximation of Monte Carlo simulation but is much faster to perform. A program was ...
Quantum Gibbs ensemble Monte Carlo
International Nuclear Information System (INIS)
We present a path integral Monte Carlo method which is the full quantum analogue of the Gibbs ensemble Monte Carlo method of Panagiotopoulos to study the gas-liquid coexistence line of a classical fluid. Unlike previous extensions of Gibbs ensemble Monte Carlo to include quantum effects, our scheme is viable even for systems with strong quantum delocalization in the degenerate regime of temperature. This is demonstrated by an illustrative application to the gas-superfluid transition of 4He in two dimensions
International Nuclear Information System (INIS)
The course of ''Monte Carlo Techniques'' will try to give a general overview of how to build up a method based on a given theory, allowing you to compare the outcome of an experiment with that theory. Concepts related with the construction of the method, such as, random variables, distributions of random variables, generation of random variables, random-based numerical methods, will be introduced in this course. Examples of some of the current theories in High Energy Physics describing the e+e- annihilation processes (QED, Electro-Weak, QCD) will also be briefly introduced. A second step in the employment of this method is related to the detector. The interactions that a particle could have along its way, through the detector as well as the response of the different materials which compound the detector will be quoted in this course. An example of detector at LEP era, in which these techniques are being applied, will close the course. (orig.)
Energy Technology Data Exchange (ETDEWEB)
Marcus, Ryan C. [Los Alamos National Laboratory
2012-07-25
MCMini is a proof of concept that demonstrates the possibility for Monte Carlo neutron transport using OpenCL with a focus on performance. This implementation, written in C, shows that tracing particles and calculating reactions on a 3D mesh can be done in a highly scalable fashion. These results demonstrate a potential path forward for MCNP or other Monte Carlo codes.
Monte Carlo Methods in Physics
International Nuclear Information System (INIS)
Method of Monte Carlo integration is reviewed briefly and some of its applications in physics are explained. A numerical experiment on random generators used in the monte Carlo techniques is carried out to show the behavior of the randomness of various methods in generating them. To account for the weight function involved in the Monte Carlo, the metropolis method is used. From the results of the experiment, one can see that there is no regular patterns of the numbers generated, showing that the program generators are reasonably good, while the experimental results, shows a statistical distribution obeying statistical distribution law. Further some applications of the Monte Carlo methods in physics are given. The choice of physical problems are such that the models have available solutions either in exact or approximate values, in which comparisons can be mode, with the calculations using the Monte Carlo method. Comparison show that for the models to be considered, good agreement have been obtained
Applications of Monte Carlo methods in nuclear science and engineering
International Nuclear Information System (INIS)
With the advent of inexpensive computing power over the past two decades and development of variance reduction techniques, applications of Monte Carlo radiation transport techniques have proliferated dramatically. The motivation of variance reduction technique is for computational efficiency. The typical variance reduction techniques worth mentioning here are: importance sampling, implicit capture, energy and angular biasing, Russian Roulette, exponential transform, next event estimator, weight window generator, range rejection technique (only for charged particles) etc. Applications of Monte Carlo in radiation transport include nuclear safeguards, accelerator applications, homeland security, nuclear criticality, health physics, radiological safety, radiography, radiotherapy physics, radiation standards, nuclear medicine (dosimetry and imaging) etc. Towards health care, Monte Carlo particle transport techniques offer exciting tools for radiotherapy research (cancer treatments involving photons, electrons, neutrons, protons, pions and other heavy ions) where they play an increasingly important role. Research and applications of Monte Carlo techniques in radiotherapy span a very wide range from fundamental studies of cross sections and development of particle transport algorithms, to clinical evaluation of treatment plans for a variety of radiotherapy modalities. Recent development is the voxel-based Monte Carlo Radiotherapy Treatment Planning involving external electron beam and patient data in the form of DICOM (Digital Imaging and Communications in Medicine) images. Articles relevant to the INIS are indexed separately
Computed radiography simulation using the Monte Carlo code MCNPX
International Nuclear Information System (INIS)
Simulating x-ray images has been of great interest in recent years as it makes possible an analysis of how x-ray images are affected owing to relevant operating parameters. In this paper, a procedure for simulating computed radiographic images using the Monte Carlo code MCNPX is proposed. The sensitivity curve of the BaFBr image plate detector as well as the characteristic noise of a 16-bit computed radiography system were considered during the methodology's development. The results obtained confirm that the proposed procedure for simulating computed radiographic images is satisfactory, as it allows obtaining results comparable with experimental data. (author)
Parallelizing Monte Carlo with PMC
Energy Technology Data Exchange (ETDEWEB)
Rathkopf, J.A.; Jones, T.R.; Nessett, D.M.; Stanberry, L.C.
1994-11-01
PMC (Parallel Monte Carlo) is a system of generic interface routines that allows easy porting of Monte Carlo packages of large-scale physics simulation codes to Massively Parallel Processor (MPP) computers. By loading various versions of PMC, simulation code developers can configure their codes to run in several modes: serial, Monte Carlo runs on the same processor as the rest of the code; parallel, Monte Carlo runs in parallel across many processors of the MPP with the rest of the code running on other MPP processor(s); distributed, Monte Carlo runs in parallel across many processors of the MPP with the rest of the code running on a different machine. This multi-mode approach allows maintenance of a single simulation code source regardless of the target machine. PMC handles passing of messages between nodes on the MPP, passing of messages between a different machine and the MPP, distributing work between nodes, and providing independent, reproducible sequences of random numbers. Several production codes have been parallelized under the PMC system. Excellent parallel efficiency in both the distributed and parallel modes results if sufficient workload is available per processor. Experiences with a Monte Carlo photonics demonstration code and a Monte Carlo neutronics package are described.
Monte Carlo Simulation and Experimental Characterization of a Dual Head Gamma Camera
Rodrigues, S; Abreu, M C; Santos, N; Rato-Mendes, P; Peralta, L
2007-01-01
The GEANT4 Monte Carlo simulation and experimental characterization of the Siemens E.Cam Dual Head gamma camera hosted in the Particular Hospital of Algarve have been done. Imaging tests of thyroid and other phantoms have been made "in situ" and compared with the results obtained with the Monte Carlo simulation.
Accelerated GPU based SPECT Monte Carlo simulations.
Garcia, Marie-Paule; Bert, Julien; Benoit, Didier; Bardiès, Manuel; Visvikis, Dimitris
2016-06-01
Monte Carlo (MC) modelling is widely used in the field of single photon emission computed tomography (SPECT) as it is a reliable technique to simulate very high quality scans. This technique provides very accurate modelling of the radiation transport and particle interactions in a heterogeneous medium. Various MC codes exist for nuclear medicine imaging simulations. Recently, new strategies exploiting the computing capabilities of graphical processing units (GPU) have been proposed. This work aims at evaluating the accuracy of such GPU implementation strategies in comparison to standard MC codes in the context of SPECT imaging. GATE was considered the reference MC toolkit and used to evaluate the performance of newly developed GPU Geant4-based Monte Carlo simulation (GGEMS) modules for SPECT imaging. Radioisotopes with different photon energies were used with these various CPU and GPU Geant4-based MC codes in order to assess the best strategy for each configuration. Three different isotopes were considered: (99m) Tc, (111)In and (131)I, using a low energy high resolution (LEHR) collimator, a medium energy general purpose (MEGP) collimator and a high energy general purpose (HEGP) collimator respectively. Point source, uniform source, cylindrical phantom and anthropomorphic phantom acquisitions were simulated using a model of the GE infinia II 3/8" gamma camera. Both simulation platforms yielded a similar system sensitivity and image statistical quality for the various combinations. The overall acceleration factor between GATE and GGEMS platform derived from the same cylindrical phantom acquisition was between 18 and 27 for the different radioisotopes. Besides, a full MC simulation using an anthropomorphic phantom showed the full potential of the GGEMS platform, with a resulting acceleration factor up to 71. The good agreement with reference codes and the acceleration factors obtained support the use of GPU implementation strategies for improving computational
Accelerated GPU based SPECT Monte Carlo simulations
Garcia, Marie-Paule; Bert, Julien; Benoit, Didier; Bardiès, Manuel; Visvikis, Dimitris
2016-06-01
Monte Carlo (MC) modelling is widely used in the field of single photon emission computed tomography (SPECT) as it is a reliable technique to simulate very high quality scans. This technique provides very accurate modelling of the radiation transport and particle interactions in a heterogeneous medium. Various MC codes exist for nuclear medicine imaging simulations. Recently, new strategies exploiting the computing capabilities of graphical processing units (GPU) have been proposed. This work aims at evaluating the accuracy of such GPU implementation strategies in comparison to standard MC codes in the context of SPECT imaging. GATE was considered the reference MC toolkit and used to evaluate the performance of newly developed GPU Geant4-based Monte Carlo simulation (GGEMS) modules for SPECT imaging. Radioisotopes with different photon energies were used with these various CPU and GPU Geant4-based MC codes in order to assess the best strategy for each configuration. Three different isotopes were considered: 99m Tc, 111In and 131I, using a low energy high resolution (LEHR) collimator, a medium energy general purpose (MEGP) collimator and a high energy general purpose (HEGP) collimator respectively. Point source, uniform source, cylindrical phantom and anthropomorphic phantom acquisitions were simulated using a model of the GE infinia II 3/8" gamma camera. Both simulation platforms yielded a similar system sensitivity and image statistical quality for the various combinations. The overall acceleration factor between GATE and GGEMS platform derived from the same cylindrical phantom acquisition was between 18 and 27 for the different radioisotopes. Besides, a full MC simulation using an anthropomorphic phantom showed the full potential of the GGEMS platform, with a resulting acceleration factor up to 71. The good agreement with reference codes and the acceleration factors obtained support the use of GPU implementation strategies for improving computational efficiency
TARC: Carlo Rubbia's Energy Amplifier
Laurent Guiraud
1997-01-01
Transmutation by Adiabatic Resonance Crossing (TARC) is Carlo Rubbia's energy amplifier. This CERN experiment demonstrated that long-lived fission fragments, such as 99-TC, can be efficiently destroyed.
Carlos Chagas: biographical sketch.
Moncayo, Alvaro
2010-01-01
Carlos Chagas was born on 9 July 1878 in the farm "Bon Retiro" located close to the City of Oliveira in the interior of the State of Minas Gerais, Brazil. He started his medical studies in 1897 at the School of Medicine of Rio de Janeiro. In the late XIX century, the works by Louis Pasteur and Robert Koch induced a change in the medical paradigm with emphasis in experimental demonstrations of the causal link between microbes and disease. During the same years in Germany appeared the pathological concept of disease, linking organic lesions with symptoms. All these innovations were adopted by the reforms of the medical schools in Brazil and influenced the scientific formation of Chagas. Chagas completed his medical studies between 1897 and 1903 and his examinations during these years were always ranked with high grades. Oswaldo Cruz accepted Chagas as a doctoral candidate and directed his thesis on "Hematological studies of Malaria" which was received with honors by the examiners. In 1903 the director appointed Chagas as research assistant at the Institute. In those years, the Institute of Manguinhos, under the direction of Oswaldo Cruz, initiated a process of institutional growth and gathered a distinguished group of Brazilian and foreign scientists. In 1907, he was requested to investigate and control a malaria outbreak in Lassance, Minas Gerais. In this moment Chagas could not have imagined that this field research was the beginning of one of the most notable medical discoveries. Chagas was, at the age of 28, a Research Assistant at the Institute of Manguinhos and was studying a new flagellate parasite isolated from triatomine insects captured in the State of Minas Gerais. Chagas made his discoveries in this order: first the causal agent, then the vector and finally the human cases. These notable discoveries were carried out by Chagas in twenty months. At the age of 33 Chagas had completed his discoveries and published the scientific articles that gave him world
Modelling cerebral blood oxygenation using Monte Carlo XYZ-PA
Zam, Azhar; Jacques, Steven L.; Alexandrov, Sergey; Li, Youzhi; Leahy, Martin J.
2013-02-01
Continuous monitoring of cerebral blood oxygenation is critically important for the management of many lifethreatening conditions. Non-invasive monitoring of cerebral blood oxygenation with a photoacoustic technique offers advantages over current invasive and non-invasive methods. We introduce a Monte Carlo XYZ-PA to model the energy deposition in 3D and the time-resolved pressures and velocity potential based on the energy absorbed by the biological tissue. This paper outlines the benefits of using Monte Carlo XYZ-PA for optimization of photoacoustic measurement and imaging. To the best of our knowledge this is the first fully integrated tool for photoacoustic modelling.
Fast Monte Carlo-assisted simulation of cloudy Earth backgrounds
Adler-Golden, Steven; Richtsmeier, Steven C.; Berk, Alexander; Duff, James W.
2012-11-01
A calculation method has been developed for rapidly synthesizing radiometrically accurate ultraviolet through longwavelengthinfrared spectral imagery of the Earth for arbitrary locations and cloud fields. The method combines cloudfree surface reflectance imagery with cloud radiance images calculated from a first-principles 3-D radiation transport model. The MCScene Monte Carlo code [1-4] is used to build a cloud image library; a data fusion method is incorporated to speed convergence. The surface and cloud images are combined with an upper atmospheric description with the aid of solar and thermal radiation transport equations that account for atmospheric inhomogeneity. The method enables a wide variety of sensor and sun locations, cloud fields, and surfaces to be combined on-the-fly, and provides hyperspectral wavelength resolution with minimal computational effort. The simulations agree very well with much more time-consuming direct Monte Carlo calculations of the same scene.
Monte Carlo techniques in diagnostic and therapeutic nuclear medicine
International Nuclear Information System (INIS)
community at large. The application of Monte Carlo techniques in medical physics is an ever lasting enthusiastic topic and an area of considerable research interest. Monte Carlo modelling has contributed to a better understanding of the physics of radiation transport in medical physics. As an example, the large number of applications of the Monte Carlo method attests to its usefulness as a research tool n different areas of nuclear medicine imaging including detector modelling and systems design, image reconstruction and correction techniques, internal dosimetry and pharmacokinetic modelling. In particular, Monte Carlo simulation is a gold standard for the simulation of nuclear medicine imaging systems and is an indispensable research tool to develop and evaluate dose planning algorithms. Recent developments in nuclear medicine instrumentation including high-resolution SPECT/PET scanners and multimodality imagers as well as applications in patient-specific dosimetry are ideal for Monte Carlo modelling techniques because of the stochastic nature of radiation emission, transport and detection processes. Factors, which have contributed to the wider use, include improved models of radiation transport processes, the practicality of application with the development of acceleration schemes and the improved speed of computers as well as the availability of multiple-processor parallel processing systems
Proton Upset Monte Carlo Simulation
O'Neill, Patrick M.; Kouba, Coy K.; Foster, Charles C.
2009-01-01
The Proton Upset Monte Carlo Simulation (PROPSET) program calculates the frequency of on-orbit upsets in computer chips (for given orbits such as Low Earth Orbit, Lunar Orbit, and the like) from proton bombardment based on the results of heavy ion testing alone. The software simulates the bombardment of modern microelectronic components (computer chips) with high-energy (.200 MeV) protons. The nuclear interaction of the proton with the silicon of the chip is modeled and nuclear fragments from this interaction are tracked using Monte Carlo techniques to produce statistically accurate predictions.
Monte Carlo Particle Lists: MCPL
Kittelmann, Thomas; Knudsen, Erik B; Willendrup, Peter; Cai, Xiao Xiao; Kanaki, Kalliopi
2016-01-01
A binary format with lists of particle state information, for interchanging particles between various Monte Carlo simulation applications, is presented. Portable C code for file manipulation is made available to the scientific community, along with converters and plugins for several popular simulation packages.
Monte Carlo scatter correction for SPECT
Liu, Zemei
The goal of this dissertation is to present a quantitatively accurate and computationally fast scatter correction method that is robust and easily accessible for routine applications in SPECT imaging. A Monte Carlo based scatter estimation method is investigated and developed further. The Monte Carlo simulation program SIMIND (Simulating Medical Imaging Nuclear Detectors), was specifically developed to simulate clinical SPECT systems. The SIMIND scatter estimation (SSE) method was developed further using a multithreading technique to distribute the scatter estimation task across multiple threads running concurrently on multi-core CPU's to accelerate the scatter estimation process. An analytical collimator that ensures less noise was used during SSE. The research includes the addition to SIMIND of charge transport modeling in cadmium zinc telluride (CZT) detectors. Phenomena associated with radiation-induced charge transport including charge trapping, charge diffusion, charge sharing between neighboring detector pixels, as well as uncertainties in the detection process are addressed. Experimental measurements and simulation studies were designed for scintillation crystal based SPECT and CZT based SPECT systems to verify and evaluate the expanded SSE method. Jaszczak Deluxe and Anthropomorphic Torso Phantoms (Data Spectrum Corporation, Hillsborough, NC, USA) were used for experimental measurements and digital versions of the same phantoms employed during simulations to mimic experimental acquisitions. This study design enabled easy comparison of experimental and simulated data. The results have consistently shown that the SSE method performed similarly or better than the triple energy window (TEW) and effective scatter source estimation (ESSE) methods for experiments on all the clinical SPECT systems. The SSE method is proven to be a viable method for scatter estimation for routine clinical use.
Microlens assembly error analysis for light field camera based on Monte Carlo method
Li, Sai; Yuan, Yuan; Zhang, Hao-Wei; Liu, Bin; Tan, He-Ping
2016-08-01
This paper describes numerical analysis of microlens assembly errors in light field cameras using the Monte Carlo method. Assuming that there were no manufacturing errors, home-built program was used to simulate images of coupling distance error, movement error and rotation error that could appear during microlens installation. By researching these images, sub-aperture images and refocus images, we found that the images present different degrees of fuzziness and deformation for different microlens assembly errors, while the subaperture image presents aliasing, obscured images and other distortions that result in unclear refocus images.
Monte Carlo simulation of tomography techniques using the platform Gate
International Nuclear Information System (INIS)
Simulations play a key role in functional imaging, with applications ranging from scanner design, scatter correction, protocol optimisation. GATE (Geant4 for Application Tomography Emission) is a platform for Monte Carlo Simulation. It is based on Geant4 to generate and track particles, to model geometry and physics process. Explicit modelling of time includes detector motion, time of flight, tracer kinetics. Interfaces to voxellised models and image reconstruction packages improve the integration of GATE in the global modelling cycle. In this work Monte Carlo simulations are used to understand and optimise the gamma camera's performances. We study the effect of the distance between source and collimator, the diameter of the holes and the thick of the collimator on the spatial resolution, energy resolution and efficiency of the gamma camera. We also study the reduction of simulation's time and implement a model of left ventricle in GATE. (Author). 7 refs
Monte Carlo dose calculation in dental amalgam phantom
Mohd Zahri Abdul Aziz; Yusoff, A. L.; N D Osman; R. Abdullah; Rabaie, N. A.; M S Salikin
2015-01-01
It has become a great challenge in the modern radiation treatment to ensure the accuracy of treatment delivery in electron beam therapy. Tissue inhomogeneity has become one of the factors for accurate dose calculation, and this requires complex algorithm calculation like Monte Carlo (MC). On the other hand, computed tomography (CT) images used in treatment planning system need to be trustful as they are the input in radiotherapy treatment. However, with the presence of metal amalgam in treatm...
Efficient Monte Carlo methods for light transport in scattering media
Jarosz, Wojciech
2008-01-01
In this dissertation we focus on developing accurate and efficient Monte Carlo methods for synthesizing images containing general participating media. Participating media such as clouds, smoke, and fog are ubiquitous in the world and are responsible for many important visual phenomena which are of interest to computer graphics as well as related fields. When present, the medium participates in lighting interactions by scattering or absorbing photons as they travel through the scene. Though th...
Fidorra, M; Seeger, H M
2007-01-01
Monte Carlo (MC) Simulations, Differential Scanning Calorimetry (DSC) and Fourier Transform InfraRed (FTIR) spectroscopy were used to study the melting behavior of single lipid components in two-component membranes of 1,2-Dimyristoyl-D54-sn-Glycero-3-Phosphocholine (DMPC-d54) and 1,2-Distearoyl-sn-Glycero-3-Phosphocholine (DSPC). Microscopic information on the temperature dependent melting of the single lipid species could be investigated using FTIR. The microscopic behavior measured could be well described by the results from the MC simulations. These simulations also allowed to calculate heat capacity profiles as determined with DSC. These ones provide macroscopic information about melting enthalpies and entropy changes which are not accessible with FTIR. Therefore, the MC simulations allowed us to link the two different experimental approaches of FTIR and DSC.
Shell model Monte Carlo methods
International Nuclear Information System (INIS)
We review quantum Monte Carlo methods for dealing with large shell model problems. These methods reduce the imaginary-time many-body evolution operator to a coherent superposition of one-body evolutions in fluctuating one-body fields; resultant path integral is evaluated stochastically. We first discuss the motivation, formalism, and implementation of such Shell Model Monte Carlo methods. There then follows a sampler of results and insights obtained from a number of applications. These include the ground state and thermal properties of pf-shell nuclei, thermal behavior of γ-soft nuclei, and calculation of double beta-decay matrix elements. Finally, prospects for further progress in such calculations are discussed. 87 refs
Kinematics of multigrid Monte Carlo
International Nuclear Information System (INIS)
We study the kinematics of multigrid Monte Carlo algorithms by means of acceptance rates for nonlocal Metropolis update proposals. An approximation formula for acceptance rates is derived. We present a comparison of different coarse-to-fine interpolation schemes in free field theory, where the formula is exact. The predictions of the approximation formula for several interacting models are well confirmed by Monte Carlo simulations. The following rule is found: For a critical model with fundametal Hamiltonian Η(φ), absence of critical slowing down can only be expected if the expansion of (Η(φ+ψ)) in terms of the shift ψ contains no relevant (mass) term. We also introduce a multigrid update procedure for nonabelian lattice gauge theory and study the acceptance rates for gauge group SU(2) in four dimensions. (orig.)
Asynchronous Anytime Sequential Monte Carlo
Paige, Brooks; Wood, Frank; Doucet, Arnaud; Teh, Yee Whye
2014-01-01
We introduce a new sequential Monte Carlo algorithm we call the particle cascade. The particle cascade is an asynchronous, anytime alternative to traditional particle filtering algorithms. It uses no barrier synchronizations which leads to improved particle throughput and memory efficiency. It is an anytime algorithm in the sense that it can be run forever to emit an unbounded number of particles while keeping within a fixed memory budget. We prove that the particle cascade is an unbiased mar...
Neural Adaptive Sequential Monte Carlo
Gu, Shixiang; Ghahramani, Zoubin; Turner, Richard E
2015-01-01
Sequential Monte Carlo (SMC), or particle filtering, is a popular class of methods for sampling from an intractable target distribution using a sequence of simpler intermediate distributions. Like other importance sampling-based methods, performance is critically dependent on the proposal distribution: a bad proposal can lead to arbitrarily inaccurate estimates of the target distribution. This paper presents a new method for automatically adapting the proposal using an approximation of the Ku...
Parallel Monte Carlo reactor neutronics
International Nuclear Information System (INIS)
The issues affecting implementation of parallel algorithms for large-scale engineering Monte Carlo neutron transport simulations are discussed. For nuclear reactor calculations, these include load balancing, recoding effort, reproducibility, domain decomposition techniques, I/O minimization, and strategies for different parallel architectures. Two codes were parallelized and tested for performance. The architectures employed include SIMD, MIMD-distributed memory, and workstation network with uneven interactive load. Speedups linear with the number of nodes were achieved
Adaptive Multilevel Monte Carlo Simulation
Hoel, H
2011-08-23
This work generalizes a multilevel forward Euler Monte Carlo method introduced in Michael B. Giles. (Michael Giles. Oper. Res. 56(3):607–617, 2008.) for the approximation of expected values depending on the solution to an Itô stochastic differential equation. The work (Michael Giles. Oper. Res. 56(3):607– 617, 2008.) proposed and analyzed a forward Euler multilevelMonte Carlo method based on a hierarchy of uniform time discretizations and control variates to reduce the computational effort required by a standard, single level, Forward Euler Monte Carlo method. This work introduces an adaptive hierarchy of non uniform time discretizations, generated by an adaptive algorithmintroduced in (AnnaDzougoutov et al. Raùl Tempone. Adaptive Monte Carlo algorithms for stopped diffusion. In Multiscale methods in science and engineering, volume 44 of Lect. Notes Comput. Sci. Eng., pages 59–88. Springer, Berlin, 2005; Kyoung-Sook Moon et al. Stoch. Anal. Appl. 23(3):511–558, 2005; Kyoung-Sook Moon et al. An adaptive algorithm for ordinary, stochastic and partial differential equations. In Recent advances in adaptive computation, volume 383 of Contemp. Math., pages 325–343. Amer. Math. Soc., Providence, RI, 2005.). This form of the adaptive algorithm generates stochastic, path dependent, time steps and is based on a posteriori error expansions first developed in (Anders Szepessy et al. Comm. Pure Appl. Math. 54(10):1169– 1214, 2001). Our numerical results for a stopped diffusion problem, exhibit savings in the computational cost to achieve an accuracy of ϑ(TOL),from(TOL−3), from using a single level version of the adaptive algorithm to ϑ(((TOL−1)log(TOL))2).
Monomial Gamma Monte Carlo Sampling
Zhang, Yizhe; Wang, Xiangyu; Chen, Changyou; Fan, Kai; Carin, Lawrence
2016-01-01
We unify slice sampling and Hamiltonian Monte Carlo (HMC) sampling by demonstrating their connection under the canonical transformation from Hamiltonian mechanics. This insight enables us to extend HMC and slice sampling to a broader family of samplers, called monomial Gamma samplers (MGS). We analyze theoretically the mixing performance of such samplers by proving that the MGS draws samples from a target distribution with zero-autocorrelation, in the limit of a single parameter. This propert...
Extending canonical Monte Carlo methods
Velazquez, L.; Curilef, S.
2010-02-01
In this paper, we discuss the implications of a recently obtained equilibrium fluctuation-dissipation relation for the extension of the available Monte Carlo methods on the basis of the consideration of the Gibbs canonical ensemble to account for the existence of an anomalous regime with negative heat capacities C < 0. The resulting framework appears to be a suitable generalization of the methodology associated with the so-called dynamical ensemble, which is applied to the extension of two well-known Monte Carlo methods: the Metropolis importance sampling and the Swendsen-Wang cluster algorithm. These Monte Carlo algorithms are employed to study the anomalous thermodynamic behavior of the Potts models with many spin states q defined on a d-dimensional hypercubic lattice with periodic boundary conditions, which successfully reduce the exponential divergence of the decorrelation time τ with increase of the system size N to a weak power-law divergence \\tau \\propto N^{\\alpha } with α≈0.2 for the particular case of the 2D ten-state Potts model.
International Nuclear Information System (INIS)
We have shown that the transport equation can be solved with particles, like the Monte-Carlo method, but without random numbers. In the Monte-Carlo method, particles are created from the source, and are followed from collision to collision until either they are absorbed or they leave the spatial domain. In our method, particles are created from the original source, with a variable weight taking into account both collision and absorption. These particles are followed until they leave the spatial domain, and we use them to determine a first collision source. Another set of particles is then created from this first collision source, and tracked to determine a second collision source, and so on. This process introduces an approximation which does not exist in the Monte-Carlo method. However, we have analyzed the effect of this approximation, and shown that it can be limited. Our method is deterministic, gives reproducible results. Furthermore, when extra accuracy is needed in some region, it is easier to get more particles to go there. It has the same kind of applications: rather problems where streaming is dominant than collision dominated problems
Prediction of beam hardening artefacts in computed tomography using Monte Carlo simulations
DEFF Research Database (Denmark)
Thomsen, M.; Bergbäck Knudsen, Erik; Willendrup, Peter Kjær;
2015-01-01
We show how radiological images of both single and multi material samples can be simulated using the Monte Carlo simulation tool McXtrace and how these images can be used to make a three dimensional reconstruction. Good numerical agreement between the X-ray attenuation coefficient in experimental...
Monte Carlo simulation of a prototype photodetector used in radiotherapy
Kausch, C; Albers, D; Schmidt, R; Schreiber, B
2000-01-01
The imaging performance of prototype electronic portal imaging devices (EPID) has been investigated. Monte Carlo simulations have been applied to calculate the modulation transfer function (MTF( f )), the noise power spectrum (NPS( f )) and the detective quantum efficiency (DQE( f )) for different new type of EPIDs, which consist of a detector combination of metal or polyethylene (PE), a phosphor layer of Gd sub 2 O sub 2 S and a flat array of photodiodes. The simulated results agree well with measurements. Based on simulated results, possible optimization of these devices is discussed.
Forward physics Monte Carlo (FPMC)
Czech Academy of Sciences Publication Activity Database
Boonekamp, M.; Juránek, Vojtěch; Kepka, Oldřich; Royon, C.
Hamburg : Verlag Deutsches Elektronen-Synchrotron, 2009 - (Jung, H.; De Roeck, A.), s. 758-762 ISBN N. [HERA and the LHC workshop series on the implications of HERA for LHC physics. Geneve (CH), 26.05.2008-30.05.2008] R&D Projects: GA MŠk LC527; GA MŠk LA08032 Institutional research plan: CEZ:AV0Z10100502 Keywords : forward physics * diffraction * two-photon * Monte Carlo Subject RIV: BF - Elementary Particles and High Energy Physics http://arxiv.org/PS_cache/arxiv/pdf/0903/0903.3861v2.pdf
Carlos Restrepo. Un verdadero Maestro
Pelayo Correa
2009-01-01
Carlos Restrepo fue el primer profesor de Patología y un miembro ilustre del grupo de pioneros que fundaron la Facultad de Medicina de la Universidad del Valle. Estos pioneros convergieron en Cali en la década de 1950, en posesión de un espíritu renovador y creativo que emprendió con mucho éxito la labor de cambiar la cultura académica del Valle del Cauca. Ellos encontraron una sociedad apacible, que disfrutaba de la generosidad de su entorno, sin deseos de romper las tradiciones centenarias...
Therapeutic Applications of Monte Carlo Calculations in Nuclear Medicine
Energy Technology Data Exchange (ETDEWEB)
Coulot, J
2003-08-07
Monte Carlo techniques are involved in many applications in medical physics, and the field of nuclear medicine has seen a great development in the past ten years due to their wider use. Thus, it is of great interest to look at the state of the art in this domain, when improving computer performances allow one to obtain improved results in a dramatically reduced time. The goal of this book is to make, in 15 chapters, an exhaustive review of the use of Monte Carlo techniques in nuclear medicine, also giving key features which are not necessary directly related to the Monte Carlo method, but mandatory for its practical application. As the book deals with therapeutic' nuclear medicine, it focuses on internal dosimetry. After a general introduction on Monte Carlo techniques and their applications in nuclear medicine (dosimetry, imaging and radiation protection), the authors give an overview of internal dosimetry methods (formalism, mathematical phantoms, quantities of interest). Then, some of the more widely used Monte Carlo codes are described, as well as some treatment planning softwares. Some original techniques are also mentioned, such as dosimetry for boron neutron capture synovectomy. It is generally well written, clearly presented, and very well documented. Each chapter gives an overview of each subject, and it is up to the reader to investigate it further using the extensive bibliography provided. Each topic is discussed from a practical point of view, which is of great help for non-experienced readers. For instance, the chapter about mathematical aspects of Monte Carlo particle transport is very clear and helps one to apprehend the philosophy of the method, which is often a difficulty with a more theoretical approach. Each chapter is put in the general (clinical) context, and this allows the reader to keep in mind the intrinsic limitation of each technique involved in dosimetry (for instance activity quantitation). Nevertheless, there are some minor
Monte Carlo primer for health physicists
International Nuclear Information System (INIS)
The basic ideas and principles of Monte Carlo calculations are presented in the form of a primer for health physicists. A simple integral with a known answer is evaluated by two different Monte Carlo approaches. Random number, which underlie Monte Carlo work, are discussed, and a sample table of random numbers generated by a hand calculator is presented. Monte Carlo calculations of dose and linear energy transfer (LET) from 100-keV neutrons incident on a tissue slab are discussed. The random-number table is used in a hand calculation of the initial sequence of events for a 100-keV neutron entering the slab. Some pitfalls in Monte Carlo work are described. While this primer addresses mainly the bare bones of Monte Carlo, a final section briefly describes some of the more sophisticated techniques used in practice to reduce variance and computing time
Interacting Particle Markov Chain Monte Carlo
Rainforth, Tom; Naesseth, Christian A.; Lindsten, Fredrik; Paige, Brooks; van de Meent, Jan-Willem; Doucet, Arnaud; Wood, Frank
2016-01-01
We introduce interacting particle Markov chain Monte Carlo (iPMCMC), a PMCMC method that introduces a coupling between multiple standard and conditional sequential Monte Carlo samplers. Like related methods, iPMCMC is a Markov chain Monte Carlo sampler on an extended space. We present empirical results that show significant improvements in mixing rates relative to both non-interacting PMCMC samplers and a single PMCMC sampler with an equivalent total computational budget. An additional advant...
Mean field simulation for Monte Carlo integration
Del Moral, Pierre
2013-01-01
In the last three decades, there has been a dramatic increase in the use of interacting particle methods as a powerful tool in real-world applications of Monte Carlo simulation in computational physics, population biology, computer sciences, and statistical machine learning. Ideally suited to parallel and distributed computation, these advanced particle algorithms include nonlinear interacting jump diffusions; quantum, diffusion, and resampled Monte Carlo methods; Feynman-Kac particle models; genetic and evolutionary algorithms; sequential Monte Carlo methods; adaptive and interacting Marko
Multidimensional stochastic approximation Monte Carlo.
Zablotskiy, Sergey V; Ivanov, Victor A; Paul, Wolfgang
2016-06-01
Stochastic Approximation Monte Carlo (SAMC) has been established as a mathematically founded powerful flat-histogram Monte Carlo method, used to determine the density of states, g(E), of a model system. We show here how it can be generalized for the determination of multidimensional probability distributions (or equivalently densities of states) of macroscopic or mesoscopic variables defined on the space of microstates of a statistical mechanical system. This establishes this method as a systematic way for coarse graining a model system, or, in other words, for performing a renormalization group step on a model. We discuss the formulation of the Kadanoff block spin transformation and the coarse-graining procedure for polymer models in this language. We also apply it to a standard case in the literature of two-dimensional densities of states, where two competing energetic effects are present g(E_{1},E_{2}). We show when and why care has to be exercised when obtaining the microcanonical density of states g(E_{1}+E_{2}) from g(E_{1},E_{2}). PMID:27415383
Single scatter electron Monte Carlo
Energy Technology Data Exchange (ETDEWEB)
Svatos, M.M. [Lawrence Livermore National Lab., CA (United States)|Wisconsin Univ., Madison, WI (United States)
1997-03-01
A single scatter electron Monte Carlo code (SSMC), CREEP, has been written which bridges the gap between existing transport methods and modeling real physical processes. CREEP simulates ionization, elastic and bremsstrahlung events individually. Excitation events are treated with an excitation-only stopping power. The detailed nature of these simulations allows for calculation of backscatter and transmission coefficients, backscattered energy spectra, stopping powers, energy deposits, depth dose, and a variety of other associated quantities. Although computationally intense, the code relies on relatively few mathematical assumptions, unlike other charged particle Monte Carlo methods such as the commonly-used condensed history method. CREEP relies on sampling the Lawrence Livermore Evaluated Electron Data Library (EEDL) which has data for all elements with an atomic number between 1 and 100, over an energy range from approximately several eV (or the binding energy of the material) to 100 GeV. Compounds and mixtures may also be used by combining the appropriate element data via Bragg additivity.
Quantitative Monte Carlo-based holmium-166 SPECT reconstruction
International Nuclear Information System (INIS)
Purpose: Quantitative imaging of the radionuclide distribution is of increasing interest for microsphere radioembolization (RE) of liver malignancies, to aid treatment planning and dosimetry. For this purpose, holmium-166 (166Ho) microspheres have been developed, which can be visualized with a gamma camera. The objective of this work is to develop and evaluate a new reconstruction method for quantitative 166Ho SPECT, including Monte Carlo-based modeling of photon contributions from the full energy spectrum.Methods: A fast Monte Carlo (MC) simulator was developed for simulation of 166Ho projection images and incorporated in a statistical reconstruction algorithm (SPECT-fMC). Photon scatter and attenuation for all photons sampled from the full 166Ho energy spectrum were modeled during reconstruction by Monte Carlo simulations. The energy- and distance-dependent collimator-detector response was modeled using precalculated convolution kernels. Phantom experiments were performed to quantitatively evaluate image contrast, image noise, count errors, and activity recovery coefficients (ARCs) of SPECT-fMC in comparison with those of an energy window-based method for correction of down-scattered high-energy photons (SPECT-DSW) and a previously presented hybrid method that combines MC simulation of photopeak scatter with energy window-based estimation of down-scattered high-energy contributions (SPECT-ppMC+DSW). Additionally, the impact of SPECT-fMC on whole-body recovered activities (Aest) and estimated radiation absorbed doses was evaluated using clinical SPECT data of six 166Ho RE patients.Results: At the same noise level, SPECT-fMC images showed substantially higher contrast than SPECT-DSW and SPECT-ppMC+DSW in spheres ≥17 mm in diameter. The count error was reduced from 29% (SPECT-DSW) and 25% (SPECT-ppMC+DSW) to 12% (SPECT-fMC). ARCs in five spherical volumes of 1.96–106.21 ml were improved from 32%–63% (SPECT-DSW) and 50%–80% (SPECT-ppMC+DSW) to 76%–103
1-D EQUILIBRIUM DISCRETE DIFFUSION MONTE CARLO
Energy Technology Data Exchange (ETDEWEB)
T. EVANS; ET AL
2000-08-01
We present a new hybrid Monte Carlo method for 1-D equilibrium diffusion problems in which the radiation field coexists with matter in local thermodynamic equilibrium. This method, the Equilibrium Discrete Diffusion Monte Carlo (EqDDMC) method, combines Monte Carlo particles with spatially discrete diffusion solutions. We verify the EqDDMC method with computational results from three slab problems. The EqDDMC method represents an incremental step toward applying this hybrid methodology to non-equilibrium diffusion, where it could be simultaneously coupled to Monte Carlo transport.
Monte Carlo Application ToolKit (MCATK)
International Nuclear Information System (INIS)
Highlights: • Component-based Monte Carlo radiation transport parallel software library. • Designed to build specialized software applications. • Provides new functionality for existing general purpose Monte Carlo transport codes. • Time-independent and time-dependent algorithms with population control. • Algorithm verification and validation results are provided. - Abstract: The Monte Carlo Application ToolKit (MCATK) is a component-based software library designed to build specialized applications and to provide new functionality for existing general purpose Monte Carlo radiation transport codes. We will describe MCATK and its capabilities along with presenting some verification and validations results
An energy transfer method for 4D Monte Carlo dose calculation
Siebers, Jeffrey V; Zhong, Hualiang
2008-01-01
This article presents a new method for four-dimensional Monte Carlo dose calculations which properly addresses dose mapping for deforming anatomy. The method, called the energy transfer method (ETM), separates the particle transport and particle scoring geometries: Particle transport takes place in the typical rectilinear coordinate system of the source image, while energy deposition scoring takes place in a desired reference image via use of deformable image registration. Dose is the energy ...
Zaidi, H
1999-01-01
the many applications of Monte Carlo modelling in nuclear medicine imaging make it desirable to increase the accuracy and computational speed of Monte Carlo codes. The accuracy of Monte Carlo simulations strongly depends on the accuracy in the probability functions and thus on the cross section libraries used for photon transport calculations. A comparison between different photon cross section libraries and parametrizations implemented in Monte Carlo simulation packages developed for positron emission tomography and the most recent Evaluated Photon Data Library (EPDL97) developed by the Lawrence Livermore National Laboratory was performed for several human tissues and common detector materials for energies from 1 keV to 1 MeV. Different photon cross section libraries and parametrizations show quite large variations as compared to the EPDL97 coefficients. This latter library is more accurate and was carefully designed in the form of look-up tables providing efficient data storage, access, and management. Toge...
Geant4 Simulation of Neutron Penumbral Imaging
Institute of Scientific and Technical Information of China (English)
ZHENG; Yu-lai; WANG; Qiang; YANG; Lu; LI; Yong
2012-01-01
<正>The penumbral imaging technology is effective analysis method of Inertial Confinement Fusion (ICF) neutron imaging. To meet neutron penumbral imaging need, simulation of neutron transport in penumbral imaging systems was done by using Monte Carlo program Geant4, and two-dimensional image was got.
International Nuclear Information System (INIS)
The Monte Carlo code MONK is a general program written to provide a high degree of flexibility to the user. MONK is distinguished by its detailed representation of nuclear data in point form i.e., the cross-section is tabulated at specific energies instead of the more usual group representation. The nuclear data are unadjusted in the point form but recently the code has been modified to accept adjusted group data as used in fast and thermal reactor applications. The various geometrical handling capabilities and importance sampling techniques are described. In addition to the nuclear data aspects, the following features are also described; geometrical handling routines, tracking cycles, neutron source and output facilities. 12 references. (U.S.)
Carlos Battilana: Profesor, Gestor, Amigo
José Pacheco; Eleazar Aliaga
2009-01-01
El Comité Editorial de Anales ha perdido a uno de sus miembros más connotados. Brillante docente de nuestra Facultad, Carlos Alberto Battilana Guanilo (1945-2009) supo transmitir los conocimientos y atraer la atención de sus auditorios, de jóvenes estudiantes o de contemporáneos ya no tan jóvenes. Interesó a sus alumnos en la senda de la capacitación permanente y en la investigación. Por otro lado, comprometió a médicos distinguidos a conformar y liderar grupos con interés en la ciencia-amist...
Monte Carlo lattice program KIM
International Nuclear Information System (INIS)
The Monte Carlo program KIM solves the steady-state linear neutron transport equation for a fixed-source problem or, by successive fixed-source runs, for the eigenvalue problem, in a two-dimensional thermal reactor lattice. Fluxes and reaction rates are the main quantities computed by the program, from which power distribution and few-group averaged cross sections are derived. The simulation ranges from 10 MeV to zero and includes anisotropic and inelastic scattering in the fast energy region, the epithermal Doppler broadening of the resonances of some nuclides, and the thermalization phenomenon by taking into account the thermal velocity distribution of some molecules. Besides the well known combinatorial geometry, the program allows complex configurations to be represented by a discrete set of points, an approach greatly improving calculation speed
Monte Carlo application tool-kit (MCATK)
International Nuclear Information System (INIS)
The Monte Carlo Application tool-kit (MCATK) is a C++ component-based software library designed to build specialized applications and to provide new functionality for existing general purpose Monte Carlo radiation transport codes such as MCNP. We will describe MCATK and its capabilities along with presenting some verification and validations results. (authors)
Heidary, Saeed; Setayeshi, Saeed; Ghannadi-Maragheh, Mohammad
2014-09-01
The aim of this study is to compare the adaptive neuro-fuzzy inference system (ANFIS) and the artificial neural network (ANN) to estimate the cross-talk contamination of 99 m Tc / 201 Tl image acquisition in the 201 Tl energy window (77 ± 15% keV). GATE (Geant4 Application in Emission and Tomography) is employed due to its ability to simulate multiple radioactive sources concurrently. Two kinds of phantoms, including two digital and one physical phantom, are used. In the real and the simulation studies, data acquisition is carried out using eight energy windows. The ANN and the ANFIS are prepared in MATLAB, and the GATE results are used as a training data set. Three indications are evaluated and compared. The ANFIS method yields better outcomes for two indications (Spearman's rank correlation coefficient and contrast) and the two phantom results in each category. The maximum image biasing, which is the third indication, is found to be 6% more than that for the ANN.
Carlos Restrepo. Un verdadero Maestro
Directory of Open Access Journals (Sweden)
Pelayo Correa
2009-12-01
Full Text Available Carlos Restrepo fue el primer profesor de Patología y un miembro ilustre del grupo de pioneros que fundaron la Facultad de Medicina de la Universidad del Valle. Estos pioneros convergieron en Cali en la década de 1950, en posesión de un espíritu renovador y creativo que emprendió con mucho éxito la labor de cambiar la cultura académica del Valle del Cauca. Ellos encontraron una sociedad apacible, que disfrutaba de la generosidad de su entorno, sin deseos de romper las tradiciones centenarias de estilo de vida sencillo y satisfecho. Cuando los hijos tenían el deseo y la capacidad de seguir estudios universitarios, especialmente en el área de la medicina, la familia los enviaba a climas menos cálidos, que supuestamente favorecían la función cerebral y la acumulación de conocimientos. Los pioneros de la educación médica en el Valle del Cauca, en buena parte reclutados en universidades nacionales y extranjeras, sabían muy bien que el ambiente vallecaucano no impide una formación universitaria de primera clase. Carlos Restrepo era prototipo del espíritu de cambio y formación intelectual de las nuevas generaciones. Lo manifestaba de múltiples maneras, en buena parte con su genio alegre, extrovertido, optimista, de risa fácil y contagiosa. Pero esta fase amable de su personalidad no ocultaba su tarea formativa; exigía de sus discípulos dedicación y trabajo duro, con fidelidad expresados en memorables caricaturas que exageraban su genio ocasionalmente explosivo. El grupo de pioneros se enfocó con un espíritu de total entrega (tiempo completo y dedicación exclusiva y organizó la nueva Facultad en bien definidos y estructurados departamentos: Anatomía, Bioquímica, Fisiología, Farmacología, Patología, Medicina Interna, Cirugía, Obstetricia y Ginecología, Psiquiatría y Medicina Preventiva. Los departamentos integraron sus funciones primordiales en la enseñanza, la investigación y el servicio a la comunidad. El centro
Common misconceptions in Monte Carlo particle transport
Energy Technology Data Exchange (ETDEWEB)
Booth, Thomas E., E-mail: teb@lanl.gov [LANL, XCP-7, MS F663, Los Alamos, NM 87545 (United States)
2012-07-15
Monte Carlo particle transport is often introduced primarily as a method to solve linear integral equations such as the Boltzmann transport equation. This paper discusses some common misconceptions about Monte Carlo methods that are often associated with an equation-based focus. Many of the misconceptions apply directly to standard Monte Carlo codes such as MCNP and some are worth noting so that one does not unnecessarily restrict future methods. - Highlights: Black-Right-Pointing-Pointer Adjoint variety and use from a Monte Carlo perspective. Black-Right-Pointing-Pointer Misconceptions and preconceived notions about statistical weight. Black-Right-Pointing-Pointer Reasons that an adjoint based weight window sometimes works well or does not. Black-Right-Pointing-Pointer Pulse height/probability of initiation tallies and 'the' transport equation. Black-Right-Pointing-Pointer Highlights unnecessary preconceived notions about Monte Carlo transport.
MATLAB platform for Monte Carlo planning and dosimetry experimental evaluation
International Nuclear Information System (INIS)
A new platform for the full Monte Carlo planning and an independent experimental evaluation that it can be integrated into clinical practice. The tool has proved its usefulness and efficiency and now forms part of the flow of work of our research group, the tool used for the generation of results, which are to be suitably revised and are being published. This software is an effort of integration of numerous algorithms of image processing, along with planning optimization algorithms, allowing the process of MCTP planning from a single interface. In addition, becomes a flexible and accurate tool for the evaluation of experimental dosimetric data for the quality control of actual treatments. (Author)
Monte Carlo simulation of liver cancer treatment with 166Ho-loaded glass microspheres
International Nuclear Information System (INIS)
Microspheres loaded with pure beta-emitter radioisotopes are used in the treatment of some types of liver cancer. The Instituto de Pesquisas Energéticas e Nucleares (IPEN) is developing 166Ho-loaded glass microspheres as an alternative to the commercially available 90Y microspheres. This work describes the implementation of a Monte Carlo code to simulate both the irradiation effects and the imaging of 166Ho and 90Y sources localized in different parts of the liver. Results obtained with the code and perspectives for the future are discussed. - Highlights: ► Monte Carlo simulation of treatments with 166Ho- and 90Y-loaded microspheres. ► A voxelized anthropomorphic phantom and a simplified gamma camera were used. ► Volumetric dose map with 1.2 mm resolution was calculated. ► Image of the gamma camera was produced. ► Perspectives of treatment planning using Monte Carlo and GEANT4
Use of Monte Carlo Methods in brachytherapy
International Nuclear Information System (INIS)
The Monte Carlo method has become a fundamental tool for brachytherapy dosimetry mainly because no difficulties associated with experimental dosimetry. In brachytherapy the main handicap of experimental dosimetry is the high dose gradient near the present sources making small uncertainties in the positioning of the detectors lead to large uncertainties in the dose. This presentation will review mainly the procedure for calculating dose distributions around a fountain using the Monte Carlo method showing the difficulties inherent in these calculations. In addition we will briefly review other applications of the method of Monte Carlo in brachytherapy dosimetry, as its use in advanced calculation algorithms, calculating barriers or obtaining dose applicators around. (Author)
Monte Carlo simulation for soot dynamics
Directory of Open Access Journals (Sweden)
Zhou Kun
2012-01-01
Full Text Available A new Monte Carlo method termed Comb-like frame Monte Carlo is developed to simulate the soot dynamics. Detailed stochastic error analysis is provided. Comb-like frame Monte Carlo is coupled with the gas phase solver Chemkin II to simulate soot formation in a 1-D premixed burner stabilized flame. The simulated soot number density, volume fraction, and particle size distribution all agree well with the measurement available in literature. The origin of the bimodal distribution of particle size distribution is revealed with quantitative proof.
Monte carlo simulation for soot dynamics
Zhou, Kun
2012-01-01
A new Monte Carlo method termed Comb-like frame Monte Carlo is developed to simulate the soot dynamics. Detailed stochastic error analysis is provided. Comb-like frame Monte Carlo is coupled with the gas phase solver Chemkin II to simulate soot formation in a 1-D premixed burner stabilized flame. The simulated soot number density, volume fraction, and particle size distribution all agree well with the measurement available in literature. The origin of the bimodal distribution of particle size distribution is revealed with quantitative proof.
Fast quantum Monte Carlo on a GPU
Lutsyshyn, Y
2013-01-01
We present a scheme for the parallelization of quantum Monte Carlo on graphical processing units, focusing on bosonic systems and variational Monte Carlo. We use asynchronous execution schemes with shared memory persistence, and obtain an excellent acceleration. Comparing with single core execution, GPU-accelerated code runs over x100 faster. The CUDA code is provided along with the package that is necessary to execute variational Monte Carlo for a system representing liquid helium-4. The program was benchmarked on several models of Nvidia GPU, including Fermi GTX560 and M2090, and the latest Kepler architecture K20 GPU. Kepler-specific optimization is discussed.
Importance iteration in MORSE Monte Carlo calculations
International Nuclear Information System (INIS)
An expression to calculate point values (the expected detector response of a particle emerging from a collision or the source) is derived and implemented in the MORSE-SGC/S Monte Carlo code. It is outlined how these point values can be smoothed as a function of energy and as a function of the optical thickness between the detector and the source. The smoothed point values are subsequently used to calculate the biasing parameters of the Monte Carlo runs to follow. The method is illustrated by an example, which shows that the obtained biasing parameters lead to a more efficient Monte Carlo calculation. (orig.)
Advanced computers and Monte Carlo
International Nuclear Information System (INIS)
High-performance parallelism that is currently available is synchronous in nature. It is manifested in such architectures as Burroughs ILLIAC-IV, CDC STAR-100, TI ASC, CRI CRAY-1, ICL DAP, and many special-purpose array processors designed for signal processing. This form of parallelism has apparently not been of significant value to many important Monte Carlo calculations. Nevertheless, there is much asynchronous parallelism in many of these calculations. A model of a production code that requires up to 20 hours per problem on a CDC 7600 is studied for suitability on some asynchronous architectures that are on the drawing board. The code is described and some of its properties and resource requirements ae identified to compare with corresponding properties and resource requirements are identified to compare with corresponding properties and resource requirements are identified to compare with corresponding properties and resources of some asynchronous multiprocessor architectures. Arguments are made for programer aids and special syntax to identify and support important asynchronous parallelism. 2 figures, 5 tables
Construction of the quantitative analysis environment using Monte Carlo simulation
International Nuclear Information System (INIS)
The thoracic phantom image was acquisitioned of the axial section to construct maps of the source and density with Monte Carlo (MC) simulation. The phantom was Heart/Liver Type HL (Kyoto Kagaku Co., Ltd.) single photon emission CT (SPECT)/CT machine was Symbia T6 (Siemence) with the collimator LMEGP (low-medium energy general purpose). Maps were constructed from CT images with an in-house software using Visual studio C Sharp (Microsoft). The code simulation of imaging nuclear detectors (SIMIND) was used for MC simulation, Prominence processor (Nihon Medi-Physics) for filter processing and image reconstruction, and the environment DELL Precision T7400 for all image processes. For the actual experiment, the phantom was given 15 MBq of 99mTc assuming the uptake 2% at the dose of 740 MBq in its myocardial portion and SPECT image was acquisitioned and reconstructed with Butter-worth filter and filter back projection method. CT images were similarly obtained in 0.3 mm thick slices, which were filed in one formatted with digital imaging and communication in medicine (DICOM), and then processed for application to SIMIND for mapping the source and density. Physical and mensuration factors were examined in ideal images by sequential exclusion and simulation of those factors as attenuation, scattering, spatial resolution deterioration and statistical fluctuation. Gamma energy spectrum, SPECT projection and reconstructed images given by the simulation were found to well agree with the actual data, and the precision of MC simulation was confirmed. Physical and mensuration factors were found to be evaluable individually, suggesting the usefulness of the simulation for assessing the precision of their correction. (T.T.)
Guideline for radiation transport simulation with the Monte Carlo method
International Nuclear Information System (INIS)
Today, the photon and neutron transport calculations with the Monte Carlo method have been progressed with advanced Monte Carlo codes and high-speed computers. Monte Carlo simulation is rather suitable expression than the calculation. Once Monte Carlo codes become more friendly and performance of computer progresses, most of the shielding problems will be solved by using the Monte Carlo codes and high-speed computers. As those codes prepare the standard input data for some problems, the essential techniques for solving the Monte Carlo method and variance reduction techniques of the Monte Carlo calculation might lose the interests to the general Monte Carlo users. In this paper, essential techniques of the Monte Carlo method and the variance reduction techniques, such as importance sampling method, selection of estimator, and biasing technique, are described to afford a better understanding of the Monte Carlo method and Monte Carlo code. (author)
Performance analysis based on a Monte Carlo simulation of a liquid xenon PET detector
International Nuclear Information System (INIS)
Liquid xenon is a very attractive medium for position-sensitive gamma-ray detectors for a very wide range of applications, namely, in medical radionuclide imaging. Recently, the authors have proposed a liquid xenon detector for positron emission tomography (PET). In this paper, some aspects of the performance of a liquid xenon PET detector prototype were studied by means of Monte Carlo simulation
A Monte-Carlo-Based Network Method for Source Positioning in Bioluminescence Tomography
Zhun Xu; Xiaolei Song; Xiaomeng Zhang; Jing Bai
2007-01-01
We present an approach based on the improved Levenberg Marquardt (LM) algorithm of backpropagation (BP) neural network to estimate the light source position in bioluminescent imaging. For solving the forward problem, the table-based random sampling algorithm (TBRS), a fast Monte Carlo simulation method ...
11th International Conference on Monte Carlo and Quasi-Monte Carlo Methods in Scientific Computing
Nuyens, Dirk
2016-01-01
This book presents the refereed proceedings of the Eleventh International Conference on Monte Carlo and Quasi-Monte Carlo Methods in Scientific Computing that was held at the University of Leuven (Belgium) in April 2014. These biennial conferences are major events for Monte Carlo and quasi-Monte Carlo researchers. The proceedings include articles based on invited lectures as well as carefully selected contributed papers on all theoretical aspects and applications of Monte Carlo and quasi-Monte Carlo methods. Offering information on the latest developments in these very active areas, this book is an excellent reference resource for theoreticians and practitioners interested in solving high-dimensional computational problems, arising, in particular, in finance, statistics and computer graphics.
"Shaakal" Carlos kaebas arreteerija kohtusse / Margo Pajuste
Pajuste, Margo
2006-01-01
Ilmunud ka: Postimees : na russkom jazõke 3. juuli lk. 11. Vangistatud kurikuulus terrorist "Shaakal" Carlos alustas kohtuasja oma kunagise vahistaja vastu. Ta süüdistab Prantsusmaa luureteenistuse endist juhti inimröövis
Monte Carlo simulations for plasma physics
Energy Technology Data Exchange (ETDEWEB)
Okamoto, M.; Murakami, S.; Nakajima, N.; Wang, W.X. [National Inst. for Fusion Science, Toki, Gifu (Japan)
2000-07-01
Plasma behaviours are very complicated and the analyses are generally difficult. However, when the collisional processes play an important role in the plasma behaviour, the Monte Carlo method is often employed as a useful tool. For examples, in neutral particle injection heating (NBI heating), electron or ion cyclotron heating, and alpha heating, Coulomb collisions slow down high energetic particles and pitch angle scatter them. These processes are often studied by the Monte Carlo technique and good agreements can be obtained with the experimental results. Recently, Monte Carlo Method has been developed to study fast particle transports associated with heating and generating the radial electric field. Further it is applied to investigating the neoclassical transport in the plasma with steep gradients of density and temperatures which is beyong the conventional neoclassical theory. In this report, we briefly summarize the researches done by the present authors utilizing the Monte Carlo method. (author)
Monte Carlo Treatment Planning for Advanced Radiotherapy
DEFF Research Database (Denmark)
Cronholm, Rickard
validation of a Monte Carlo model of a medical linear accelerator (i), converting a CT scan of a patient to a Monte Carlo compliant phantom (ii) and translating the treatment plan parameters (including beam energy, angles of incidence, collimator settings etc) to a Monte Carlo input file (iii). A protocol...... more sophisticated than previous algorithms since it uses delineations of structures in order to include and/or exclude certain media in various anatomical regions. This method has the potential to reduce anatomically irrelevant media assignment. In house MATLAB scripts translating the treatment plan...... presented. Comparison between dose distribution for clinical treatment plans generated by a commercial Treatment Planning System and by the implemented Monte Carlo Treatment Planning workflow were conducted. Good agreement was generally found, but for regions involving large density gradients differences of...
Experience with the Monte Carlo Method
International Nuclear Information System (INIS)
Monte Carlo simulation of radiation transport provides a powerful research and design tool that resembles in many aspects laboratory experiments. Moreover, Monte Carlo simulations can provide an insight not attainable in the laboratory. However, the Monte Carlo method has its limitations, which if not taken into account can result in misleading conclusions. This paper will present the experience of this author, over almost three decades, in the use of the Monte Carlo method for a variety of applications. Examples will be shown on how the method was used to explore new ideas, as a parametric study and design optimization tool, and to analyze experimental data. The consequences of not accounting in detail for detector response and the scattering of radiation by surrounding structures are two of the examples that will be presented to demonstrate the pitfall of condensed
Frontiers of quantum Monte Carlo workshop: preface
International Nuclear Information System (INIS)
The introductory remarks, table of contents, and list of attendees are presented from the proceedings of the conference, Frontiers of Quantum Monte Carlo, which appeared in the Journal of Statistical Physics
Monte Carlo methods for particle transport
Haghighat, Alireza
2015-01-01
The Monte Carlo method has become the de facto standard in radiation transport. Although powerful, if not understood and used appropriately, the method can give misleading results. Monte Carlo Methods for Particle Transport teaches appropriate use of the Monte Carlo method, explaining the method's fundamental concepts as well as its limitations. Concise yet comprehensive, this well-organized text: * Introduces the particle importance equation and its use for variance reduction * Describes general and particle-transport-specific variance reduction techniques * Presents particle transport eigenvalue issues and methodologies to address these issues * Explores advanced formulations based on the author's research activities * Discusses parallel processing concepts and factors affecting parallel performance Featuring illustrative examples, mathematical derivations, computer algorithms, and homework problems, Monte Carlo Methods for Particle Transport provides nuclear engineers and scientists with a practical guide ...
Guérin, Bastein; Fakhri, Georges El
2008-01-01
We have developed and validated a realistic simulation of random coincidences, pixelated block detectors, light sharing among crystal elements and dead-time in 2D and 3D positron emission tomography (PET) imaging based on the SimSET Monte Carlo simulation software. Our simulation was validated by comparison to a Monte Carlo transport code widely used for PET modeling, GATE, and to measurements made on a PET scanner.
Monte Carlo simulation of granular fluids
Montanero, J. M.
2003-01-01
An overview of recent work on Monte Carlo simulations of a granular binary mixture is presented. The results are obtained numerically solving the Enskog equation for inelastic hard-spheres by means of an extension of the well-known direct Monte Carlo simulation (DSMC) method. The homogeneous cooling state and the stationary state reached using the Gaussian thermostat are considered. The temperature ratio, the fourth velocity moments and the velocity distribution functions are obtained for bot...
Carlos II: el centenario olvidado
Directory of Open Access Journals (Sweden)
Luis Antonio RIBOT GARCÍA
2009-12-01
Full Text Available RESUMEN: A partir de una reflexión inicial sobre el fenómeno de las conmemoraciones, el autor se plantea las causas por las que el tercer centenario de la muerte de Carlos II no dará lugar a ninguna conmemoración. Con independencia de las valoraciones de todo tipo que puedan hacerse de dichas celebraciones, lo cierto es que, en este caso, tal vez hubieran permitido acercar al gran público a uno de los monarcas peor conocidos y menos valorados de la historia de España. Lo más grave, sin embargo, es que la sombra del desconocimiento y el juicio peyorativo se extienden también sobre todo su reinado. Las investigaciones sobre aquel periodo, sin embargo, a pesar de que no abundan, muestran una realidad bastante distinta, en la que la decadencia y la pérdida de la hegemonía internacional convivieron con importantes iniciativas y realizaciones políticas, tanto en el ámbito interno de la Monarquía, como en las relaciones internacionales.ABSTRACT: Parting from an initial reflection about the phenomenon of commemorations, the author ponders the causes for which the third centenary of Charles IFs death will not be the subjet of any celebrations. Besides any evaluations which might be made of these events, the truth is that, perhaps, in this case, a commemoration would have brought the general public closer to one of the least known and worst valued monarchs in the history of Spain. What is more serious, however, is the fact that the shadow of ignorance and pejorative judgement extend also over the entirety of his reign. Though scarce, research about this period shows a very different reality, in wich decadence and the loss of international hegemony cohabitated with important political initiatives and achievements, both in the monarchy's internal domain and in the international arena.
Monte Carlo photon transport techniques
International Nuclear Information System (INIS)
The basis of Monte Carlo calculation of photon transport problems is the computer simulation of individual photon histories and their subsequent averaging to provide the quantities of interest. As the history of a photon is followed the values of variables are selected and decisions made by sampling known distributions using random numbers. The transport of photon is simulated by creation of particles from a defined source region, generally with a random initial orientation in space, with tracking of particles as they travel through the system, sampling the probability density functions for their interactions to evaluate their trajectories and energy deposition at different points in the system. The interactions determine the penetration and the motion of particles. The computational model, for radiation transport problems includes geometry and material specifications. Every computer code contains a database of experimentally obtained quantities, known as cross-sections that determine the probability of a particle interacting with the medium through which it is transported. Every cross-section is peculiar to the type and energy of the incident particle and to the kind of interaction it undergoes. These partial cross-sections are summed to form the total cross-section; the ratio of the partial cross-section to the total cross-section gives the probability of this particular interaction occurring. Cross-section data for the interaction types of interest must be supplied for each material present. The model also consists of algorithms used to compute the result of interactions (changes in particle energy, direction, etc.) based on the physical principles that describe the interaction of radiation with matter and the cross-section data provided
Timing resolution of scintillation-detector systems: a Monte Carlo analysis
Choong, Woon-Seng
2009-01-01
Recent advancements in fast scintillating materials and fast photomultiplier tubes (PMTs) have stimulated renewed interest in time-of-flight (TOF) positron emission tomography (PET). It is well known that the improvement in the timing resolution in PET can significantly reduce the noise variance in the reconstructed image resulting in improved image quality. In order to evaluate the timing performance of scintillation detectors used in TOF PET, we use a Monte Carlo analysis to model the physi...
Monte Carlo studies of the VERITAS array of Cherenkov telescopes
Maier, G
2007-01-01
VERITAS is a system of four imaging Cherenkov telescopes located at the Fred Lawrence Whipple Observatory in southern Arizona. We present here results of detailed Monte Carlo simulations of the array response to extensive air showers. Cherenkov image and shower parameter distributions are calculated and show good agreement with distributions obtained from observations of background cosmic rays and high-energy gamma-rays. Cosmic-ray and gamma-ray rates are accurately predicted by the simulations. The energy threshold of the 3-telescope system is about 150 GeV after gamma-hadron separation cuts; the detection rate after gamma-selection cuts for the Crab Nebula is 7.5 gamma's/min. The three-telescope system is able to detect a source with a flux equivalent to 10% of the Crab Nebula flux in 1.2 h of observations (5 sigma detection).
Therapeutic Applications of Monte Carlo Calculations in Nuclear Medicine
Sgouros, George
2003-01-01
This book examines the applications of Monte Carlo (MC) calculations in therapeutic nuclear medicine, from basic principles to computer implementations of software packages and their applications in radiation dosimetry and treatment planning. It is written for nuclear medicine physicists and physicians as well as radiation oncologists, and can serve as a supplementary text for medical imaging, radiation dosimetry and nuclear engineering graduate courses in science, medical and engineering faculties. With chapters is written by recognised authorities in that particular field, the book covers the entire range of MC applications in therapeutic medical and health physics, from its use in imaging prior to therapy to dose distribution modelling targeted radiotherapy. The contributions discuss the fundamental concepts of radiation dosimetry, radiobiological aspects of targeted radionuclide therapy and the various components and steps required for implementing a dose calculation and treatment planning methodology in ...
Characterization of parallel-hole collimator using Monte Carlo Simulation
International Nuclear Information System (INIS)
Accuracy of in vivo activity quantification improves after the correction of penetrated and scattered photons. However, accurate assessment is not possible with physical experiment. We have used Monte Carlo Simulation to accurately assess the contribution of penetrated and scattered photons in the photopeak window. Simulations were performed with Simulation of Imaging Nuclear Detectors Monte Carlo Code. The simulations were set up in such a way that it provides geometric, penetration, and scatter components after each simulation and writes binary images to a data file. These components were analyzed graphically using Microsoft Excel (Microsoft Corporation, USA). Each binary image was imported in software (ImageJ) and logarithmic transformation was applied for visual assessment of image quality, plotting profile across the center of the images and calculating full width at half maximum (FWHM) in horizontal and vertical directions. The geometric, penetration, and scatter at 140 keV for low-energy general-purpose were 93.20%, 4.13%, 2.67% respectively. Similarly, geometric, penetration, and scatter at 140 keV for low-energy high-resolution (LEHR), medium-energy general-purpose (MEGP), and high-energy general-purpose (HEGP) collimator were (94.06%, 3.39%, 2.55%), (96.42%, 1.52%, 2.06%), and (96.70%, 1.45%, 1.85%), respectively. For MEGP collimator at 245 keV photon and for HEGP collimator at 364 keV were 89.10%, 7.08%, 3.82% and 67.78%, 18.63%, 13.59%, respectively. Low-energy general-purpose and LEHR collimator is best to image 140 keV photon. HEGP can be used for 245 keV and 364 keV; however, correction for penetration and scatter must be applied if one is interested to quantify the in vivo activity of energy 364 keV. Due to heavy penetration and scattering, 511 keV photons should not be imaged with HEGP collimator
The Monte Carlo code MCBEND - where it is and where it's going
International Nuclear Information System (INIS)
The Monte Carlo method forms a corner stone to the calculational procedures established in the UK for shielding design and assessment. The emphasis of the work in the shielding area is centred on the Monte Carlo code MCBEND. The work programme in support of the code is broadly directed towards utilisation of new hardware, the development of improved modelling algorithms, the development of new acceleration methods for specific applications and enhancements to user image. This paper summarises the current status of MCBEND and reviews developments carried out over the past two years and planned for the future. (author)
International Nuclear Information System (INIS)
A Monte Carlo radiation transport simulation program, EGS Nova, and a computer aided design software, BRL-CAD, have been coupled within the framework of Sindbad, a nondestructive evaluation (NDE) simulation system. In its current status, the program is very valuable in a NDE laboratory context, as it helps simulate the images due to the uncollided and scattered photon fluxes in a single NDE software environment, without having to switch to a Monte Carlo code parameters set. Numerical validations show a good agreement with EGS4 computed and published data. As the program's major drawback is the execution time, computational efficiency improvements are foreseen. (orig.)
Optimization of reconstruction algorithms using Monte Carlo simulation
International Nuclear Information System (INIS)
A method for optimizing reconstruction algorithms is presented that is based on how well a specified task can be performed using the reconstructed images. Task performance is numerically assessed by a Monte Carlo simulation of the complete imaging process including the generation of scenes appropriate to the desired application, subsequent data taking, reconstruction, and performance of the stated task based on the final image. The use of this method is demonstrated through the optimization of the Algebraic Reconstruction Technique (ART), which reconstructs images from their projections by a iterative procedure. The optimization is accomplished by varying the relaxation factor employed in the updating procedure. In some of the imaging situations studied, it is found that the optimization of constrained ART, in which a nonnegativity constraint is invoked, can vastly increase the detectability of objects. There is little improvement attained for unconstrained ART. The general method presented may be applied to the problem of designing neutron-diffraction spectrometers. 11 refs., 6 figs., 2 tabs
Quantitative Monte Carlo-based holmium-166 SPECT reconstruction
Energy Technology Data Exchange (ETDEWEB)
Elschot, Mattijs; Smits, Maarten L. J.; Nijsen, Johannes F. W.; Lam, Marnix G. E. H.; Zonnenberg, Bernard A.; Bosch, Maurice A. A. J. van den; Jong, Hugo W. A. M. de [Department of Radiology and Nuclear Medicine, University Medical Center Utrecht, Heidelberglaan 100, 3584 CX Utrecht (Netherlands); Viergever, Max A. [Image Sciences Institute, University Medical Center Utrecht, Heidelberglaan 100, 3584 CX Utrecht (Netherlands)
2013-11-15
Purpose: Quantitative imaging of the radionuclide distribution is of increasing interest for microsphere radioembolization (RE) of liver malignancies, to aid treatment planning and dosimetry. For this purpose, holmium-166 ({sup 166}Ho) microspheres have been developed, which can be visualized with a gamma camera. The objective of this work is to develop and evaluate a new reconstruction method for quantitative {sup 166}Ho SPECT, including Monte Carlo-based modeling of photon contributions from the full energy spectrum.Methods: A fast Monte Carlo (MC) simulator was developed for simulation of {sup 166}Ho projection images and incorporated in a statistical reconstruction algorithm (SPECT-fMC). Photon scatter and attenuation for all photons sampled from the full {sup 166}Ho energy spectrum were modeled during reconstruction by Monte Carlo simulations. The energy- and distance-dependent collimator-detector response was modeled using precalculated convolution kernels. Phantom experiments were performed to quantitatively evaluate image contrast, image noise, count errors, and activity recovery coefficients (ARCs) of SPECT-fMC in comparison with those of an energy window-based method for correction of down-scattered high-energy photons (SPECT-DSW) and a previously presented hybrid method that combines MC simulation of photopeak scatter with energy window-based estimation of down-scattered high-energy contributions (SPECT-ppMC+DSW). Additionally, the impact of SPECT-fMC on whole-body recovered activities (A{sup est}) and estimated radiation absorbed doses was evaluated using clinical SPECT data of six {sup 166}Ho RE patients.Results: At the same noise level, SPECT-fMC images showed substantially higher contrast than SPECT-DSW and SPECT-ppMC+DSW in spheres ≥17 mm in diameter. The count error was reduced from 29% (SPECT-DSW) and 25% (SPECT-ppMC+DSW) to 12% (SPECT-fMC). ARCs in five spherical volumes of 1.96–106.21 ml were improved from 32%–63% (SPECT-DSW) and 50%–80
Successful vectorization - reactor physics Monte Carlo code
International Nuclear Information System (INIS)
Most particle transport Monte Carlo codes in use today are based on the ''history-based'' algorithm, wherein one particle history at a time is simulated. Unfortunately, the ''history-based'' approach (present in all Monte Carlo codes until recent years) is inherently scalar and cannot be vectorized. In particular, the history-based algorithm cannot take advantage of vector architectures, which characterize the largest and fastest computers at the current time, vector supercomputers such as the Cray X/MP or IBM 3090/600. However, substantial progress has been made in recent years in developing and implementing a vectorized Monte Carlo algorithm. This algorithm follows portions of many particle histories at the same time and forms the basis for all successful vectorized Monte Carlo codes that are in use today. This paper describes the basic vectorized algorithm along with descriptions of several variations that have been developed by different researchers for specific applications. These applications have been mainly in the areas of neutron transport in nuclear reactor and shielding analysis and photon transport in fusion plasmas. The relative merits of the various approach schemes will be discussed and the present status of known vectorization efforts will be summarized along with available timing results, including results from the successful vectorization of 3-D general geometry, continuous energy Monte Carlo. (orig.)
International Nuclear Information System (INIS)
Full text: Aim Motion of lung tumours due to respiratory motion is a significant problem in radiotherapy. The aim of this work was to develop a Monte Carlo model of a commercially available motion phantom. Method The Geant4 C++ based Monte Carlo package was used to replicate the QUASAR motion phantom from Modus Medical. The physical QUASAR phantom contains moving inserts which represent the target and is capable of numerous dosimetric and imaging quality assurance functions. The Monte Carlo phantom model in this work allows the user to import patient respiratory data recorded with the Varian Real-time Position Management system. The spatial and temporal motion of the virtual phantom is determined by the patient data, therefore, making it ideal for patient specific QA. A user interface was created that allows patient data and scoring options to be assigned as well as media and density selections for all inserts. Results The virtual QUASAR Monte Carlo phantom is able to replicate patient motion and determine the effects of motion on dose distributions. The Monte Carlo model replicates patient superior inferior respiratory motion accurately and creates a platform for patient specific QA and TPS verification. Furthermore, dose calculation within the phantom can be performed with the increased accuracy of Monte Carlo and compared with measurements. Conclusion The added accuracy of dose-calculation afforded by Monte Carlo methods along with the ability to QA motion management protocols makes the virtual QUASAR phantom a useful tool in motion management for radiotherapy.
Quantum Monte Carlo calculations of light nuclei
International Nuclear Information System (INIS)
Quantum Monte Carlo calculations using realistic two- and three-nucleon interactions are presented for nuclei with up to eight nucleons. We have computed the ground and a few excited states of all such nuclei with Greens function Monte Carlo (GFMC) and all of the experimentally known excited states using variational Monte Carlo (VMC). The GFMC calculations show that for a given Hamiltonian, the VMC calculations of excitation spectra are reliable, but the VMC ground-state energies are significantly above the exact values. We find that the Hamiltonian we are using (which was developed based on 3H, 4He, and nuclear matter calculations) underpredicts the binding energy of p-shell nuclei. However our results for excitation spectra are very good and one can see both shell-model and collective spectra resulting from fundamental many-nucleon calculations. Possible improvements in the three-nucleon potential are also be discussed
Yours in Revolution: Retrofitting Carlos the Jackal
Directory of Open Access Journals (Sweden)
Samuel Thomas
2013-09-01
Full Text Available This paper explores the representation of ‘Carlos the Jackal’, the one-time ‘World’s Most Wanted Man’ and ‘International Face of Terror’ – primarily in cin-ema but also encompassing other forms of popular culture and aspects of Cold War policy-making. At the centre of the analysis is Olivier Assayas’s Carlos (2010, a transnational, five and a half hour film (first screened as a TV mini-series about the life and times of the infamous militant. Concentrating on the var-ious ways in which Assayas expresses a critical preoccupation with names and faces through complex formal composition, the project examines the play of ab-straction and embodiment that emerges from the narrativisation of terrorist vio-lence. Lastly, it seeks to engage with the hidden implications of Carlos in terms of the intertwined trajectories of formal experimentation and revolutionary politics.
Monte Carlo strategies in scientific computing
Liu, Jun S
2008-01-01
This paperback edition is a reprint of the 2001 Springer edition This book provides a self-contained and up-to-date treatment of the Monte Carlo method and develops a common framework under which various Monte Carlo techniques can be "standardized" and compared Given the interdisciplinary nature of the topics and a moderate prerequisite for the reader, this book should be of interest to a broad audience of quantitative researchers such as computational biologists, computer scientists, econometricians, engineers, probabilists, and statisticians It can also be used as the textbook for a graduate-level course on Monte Carlo methods Many problems discussed in the alter chapters can be potential thesis topics for masters’ or PhD students in statistics or computer science departments Jun Liu is Professor of Statistics at Harvard University, with a courtesy Professor appointment at Harvard Biostatistics Department Professor Liu was the recipient of the 2002 COPSS Presidents' Award, the most prestigious one for sta...
Quantum Monte Carlo with Variable Spins
Melton, Cody A; Mitas, Lubos
2016-01-01
We investigate the inclusion of variable spins in electronic structure quantum Monte Carlo, with a focus on diffusion Monte Carlo with Hamiltonians that include spin-orbit interactions. Following our previous introduction of fixed-phase spin-orbit diffusion Monte Carlo (FPSODMC), we thoroughly discuss the details of the method and elaborate upon its technicalities. We present a proof for an upper-bound property for complex nonlocal operators, which allows for the implementation of T-moves to ensure the variational property. We discuss the time step biases associated with our particular choice of spin representation. Applications of the method are also presented for atomic and molecular systems. We calculate the binding energies and geometry of the PbH and Sn$_2$ molecules, as well as the electron affinities of the 6$p$ row elements in close agreement with experiments.
SPQR: a Monte Carlo reactor kinetics code
International Nuclear Information System (INIS)
The SPQR Monte Carlo code has been developed to analyze fast reactor core accident problems where conventional methods are considered inadequate. The code is based on the adiabatic approximation of the quasi-static method. This initial version contains no automatic material motion or feedback. An existing Monte Carlo code is used to calculate the shape functions and the integral quantities needed in the kinetics module. Several sample problems have been devised and analyzed. Due to the large statistical uncertainty associated with the calculation of reactivity in accident simulations, the results, especially at later times, differ greatly from deterministic methods. It was also found that in large uncoupled systems, the Monte Carlo method has difficulty in handling asymmetric perturbations
CosmoPMC: Cosmology Population Monte Carlo
Kilbinger, Martin; Cappe, Olivier; Cardoso, Jean-Francois; Fort, Gersende; Prunet, Simon; Robert, Christian P; Wraith, Darren
2011-01-01
We present the public release of the Bayesian sampling algorithm for cosmology, CosmoPMC (Cosmology Population Monte Carlo). CosmoPMC explores the parameter space of various cosmological probes, and also provides a robust estimate of the Bayesian evidence. CosmoPMC is based on an adaptive importance sampling method called Population Monte Carlo (PMC). Various cosmology likelihood modules are implemented, and new modules can be added easily. The importance-sampling algorithm is written in C, and fully parallelised using the Message Passing Interface (MPI). Due to very little overhead, the wall-clock time required for sampling scales approximately with the number of CPUs. The CosmoPMC package contains post-processing and plotting programs, and in addition a Monte-Carlo Markov chain (MCMC) algorithm. The sampling engine is implemented in the library pmclib, and can be used independently. The software is available for download at http://www.cosmopmc.info.
Analytical positron range modelling in heterogeneous media for PET Monte Carlo simulation
Energy Technology Data Exchange (ETDEWEB)
Lehnert, Wencke; Meikle, Steven R [Discipline of Medical Radiation Sciences, Faculty of Health Sciences, University of Sydney, PO Box 170, Lidcombe NSW 1825 (Australia); Gregoire, Marie-Claude; Reilhac, Anthonin, E-mail: wlehnert@uni.sydney.edu.au [Australian Nuclear Science and Technology Organisation, Lucas Heights NSW 2234 (Australia)
2011-06-07
Monte Carlo simulation codes that model positron interactions along their tortuous path are expected to be accurate but are usually slow. A simpler and potentially faster approach is to model positron range from analytical annihilation density distributions. The aims of this paper were to efficiently implement and validate such a method, with the addition of medium heterogeneity representing a further challenge. The analytical positron range model was evaluated by comparing annihilation density distributions with those produced by the Monte Carlo simulator GATE and by quantitatively analysing the final reconstructed images of Monte Carlo simulated data. In addition, the influence of positronium formation on positron range and hence on the performance of Monte Carlo simulation was investigated. The results demonstrate that 1D annihilation density distributions for different isotope-media combinations can be fitted with Gaussian functions and hence be described by simple look-up-tables of fitting coefficients. Together with the method developed for simulating positron range in heterogeneous media, this allows for efficient modelling of positron range in Monte Carlo simulation. The level of agreement of the analytical model with GATE depends somewhat on the simulated scanner and the particular research task, but appears to be suitable for lower energy positron emitters, such as {sup 18}F or {sup 11}C. No reliable conclusion about the influence of positronium formation on positron range and simulation accuracy could be drawn.
Accuracy of Monte Carlo simulations compared to in-vivo MDCT dosimetry
Energy Technology Data Exchange (ETDEWEB)
Bostani, Maryam, E-mail: mbostani@mednet.ucla.edu; McMillan, Kyle; Cagnon, Chris H.; McNitt-Gray, Michael F. [Departments of Biomedical Physics and Radiology, David Geffen School of Medicine, University of California, Los Angeles, Los Angeles, California 90024 (United States); Mueller, Jonathon W. [United States Air Force, Keesler Air Force Base, Biloxi, Mississippi 39534 (United States); Cody, Dianna D. [University of Texas M.D. Anderson Cancer Center, Houston, Texas 77030 (United States); DeMarco, John J. [Departments of Biomedical Physics and Radiation Oncology, David Geffen School of Medicine, University of California, Los Angeles, Los Angeles, California 90024 (United States)
2015-02-15
Purpose: The purpose of this study was to assess the accuracy of a Monte Carlo simulation-based method for estimating radiation dose from multidetector computed tomography (MDCT) by comparing simulated doses in ten patients to in-vivo dose measurements. Methods: MD Anderson Cancer Center Institutional Review Board approved the acquisition of in-vivo rectal dose measurements in a pilot study of ten patients undergoing virtual colonoscopy. The dose measurements were obtained by affixing TLD capsules to the inner lumen of rectal catheters. Voxelized patient models were generated from the MDCT images of the ten patients, and the dose to the TLD for all exposures was estimated using Monte Carlo based simulations. The Monte Carlo simulation results were compared to the in-vivo dose measurements to determine accuracy. Results: The calculated mean percent difference between TLD measurements and Monte Carlo simulations was −4.9% with standard deviation of 8.7% and a range of −22.7% to 5.7%. Conclusions: The results of this study demonstrate very good agreement between simulated and measured doses in-vivo. Taken together with previous validation efforts, this work demonstrates that the Monte Carlo simulation methods can provide accurate estimates of radiation dose in patients undergoing CT examinations.
Modelling photon transport in non-uniform media for SPECT with a vectorized Monte Carlo code.
Smith, M F
1993-10-01
A vectorized Monte Carlo code has been developed for modelling photon transport in non-uniform media for single-photon-emission computed tomography (SPECT). The code is designed to compute photon detection kernels, which are used to build system matrices for simulating SPECT projection data acquisition and for use in matrix-based image reconstruction. Non-uniform attenuating and scattering regions are constructed from simple three-dimensional geometric shapes, in which the density and mass attenuation coefficients are individually specified. On a Stellar GS1000 computer, Monte Carlo simulations are performed between 1.6 and 2.0 times faster when the vector processor is utilized than when computations are performed in scalar mode. Projection data acquired with a clinical SPECT gamma camera for a line source in a non-uniform thorax phantom are well modelled by Monte Carlo simulations. The vectorized Monte Carlo code was used to stimulate a 99Tcm SPECT myocardial perfusion study, and compensations for non-uniform attenuation and the detection of scattered photons improve activity estimation. The speed increase due to vectorization makes Monte Carlo simulation more attractive as a tool for modelling photon transport in non-uniform media for SPECT. PMID:8248288
Monte Carlo dose calculation in dental amalgam phantom.
Aziz, Mohd Zahri Abdul; Yusoff, A L; Osman, N D; Abdullah, R; Rabaie, N A; Salikin, M S
2015-01-01
It has become a great challenge in the modern radiation treatment to ensure the accuracy of treatment delivery in electron beam therapy. Tissue inhomogeneity has become one of the factors for accurate dose calculation, and this requires complex algorithm calculation like Monte Carlo (MC). On the other hand, computed tomography (CT) images used in treatment planning system need to be trustful as they are the input in radiotherapy treatment. However, with the presence of metal amalgam in treatment volume, the CT images input showed prominent streak artefact, thus, contributed sources of error. Hence, metal amalgam phantom often creates streak artifacts, which cause an error in the dose calculation. Thus, a streak artifact reduction technique was applied to correct the images, and as a result, better images were observed in terms of structure delineation and density assigning. Furthermore, the amalgam density data were corrected to provide amalgam voxel with accurate density value. As for the errors of dose uncertainties due to metal amalgam, they were reduced from 46% to as low as 2% at d80 (depth of the 80% dose beyond Zmax) using the presented strategies. Considering the number of vital and radiosensitive organs in the head and the neck regions, this correction strategy is suggested in reducing calculation uncertainties through MC calculation. PMID:26500401
Monte carlo dose calculation in dental amalgam phantom
Directory of Open Access Journals (Sweden)
Mohd Zahri Abdul Aziz
2015-01-01
Full Text Available It has become a great challenge in the modern radiation treatment to ensure the accuracy of treatment delivery in electron beam therapy. Tissue inhomogeneity has become one of the factors for accurate dose calculation, and this requires complex algorithm calculation like Monte Carlo (MC. On the other hand, computed tomography (CT images used in treatment planning system need to be trustful as they are the input in radiotherapy treatment. However, with the presence of metal amalgam in treatment volume, the CT images input showed prominent streak artefact, thus, contributed sources of error. Hence, metal amalgam phantom often creates streak artifacts, which cause an error in the dose calculation. Thus, a streak artifact reduction technique was applied to correct the images, and as a result, better images were observed in terms of structure delineation and density assigning. Furthermore, the amalgam density data were corrected to provide amalgam voxel with accurate density value. As for the errors of dose uncertainties due to metal amalgam, they were reduced from 46% to as low as 2% at d80 (depth of the 80% dose beyond Zmax using the presented strategies. Considering the number of vital and radiosensitive organs in the head and the neck regions, this correction strategy is suggested in reducing calculation uncertainties through MC calculation.
Directory of Open Access Journals (Sweden)
Pedro Pablo Ferrer Gallego
2012-07-01
Full Text Available RESUMEN: Se presenta y comenta un conjunto de cartas enviadas por Carlos Vicioso a Carlos Pau durante su estancia en Bicorp (Valencia entre 1914 y 1915. Las cartas se encuentran depositadas en el Archivo Histórico del Instituto Botánico de Barcelona. Esta correspondencia epistolar marca el comienzo de la relación científica entre Vicioso y Pau, basada en un primer momento en las consultas que le hace Vicioso al de Segorbe para la determinación de las especies que a través de pliegos de herbario envía desde su estancia en la localidad valenciana. En la actualidad estos pliegos testigo se encuentran conservados en diferentes herbarios oficiales nacionales y también extranjeros, fruto del envío e intercambio de material entre Vicioso y otros botánicos de la época, principalmente con Pau, Sennen y Font Quer.ABSTRACT: Epistolary correspondece between Carlos Vicioso and Carlos Pau during a stay in Bicorp (Valencia. A set of letters sent from Carlos Vicioso to Carlos Pau during a stay in Bicorp (Valencia between 1914 and 1915 are here presented and discussed. The letters are located in the Archivo Histórico del Instituto Botánico de Barcelona. This lengthy correspondence among the authors shows the beginning of their scientific relationship. At first, the correspondence was based on the consults from Vicioso to Pau for the determination of the species which were sent from Bicorp, in the herbarium sheets. Nowadays, these witness sheets are preserved in the national and international herbaria, thanks to the botanical material exchange among Vicioso and other botanist of the time, mainly with Pau, Sennen and Font Quer.
Yours in revolution : retrofitting Carlos the Jackal.
Samuel Thomas
2013-01-01
This paper explores the representation of ‘Carlos the Jackal’, the one-time ‘World’s Most Wanted Man’ and ‘International Face of Terror’ – primarily in cin-ema but also encompassing other forms of popular culture and aspects of Cold War policy-making. At the centre of the analysis is Olivier Assayas’s Carlos (2010), a transnational, five and a half hour film (first screened as a TV mini-series) about the life and times of the infamous militant. Concentrating on the var-ious ways in which Assa...
Geodesic Monte Carlo on Embedded Manifolds.
Byrne, Simon; Girolami, Mark
2013-12-01
Markov chain Monte Carlo methods explicitly defined on the manifold of probability distributions have recently been established. These methods are constructed from diffusions across the manifold and the solution of the equations describing geodesic flows in the Hamilton-Jacobi representation. This paper takes the differential geometric basis of Markov chain Monte Carlo further by considering methods to simulate from probability distributions that themselves are defined on a manifold, with common examples being classes of distributions describing directional statistics. Proposal mechanisms are developed based on the geodesic flows over the manifolds of support for the distributions, and illustrative examples are provided for the hypersphere and Stiefel manifold of orthonormal matrices. PMID:25309024
Interaction picture density matrix quantum Monte Carlo
International Nuclear Information System (INIS)
The recently developed density matrix quantum Monte Carlo (DMQMC) algorithm stochastically samples the N-body thermal density matrix and hence provides access to exact properties of many-particle quantum systems at arbitrary temperatures. We demonstrate that moving to the interaction picture provides substantial benefits when applying DMQMC to interacting fermions. In this first study, we focus on a system of much recent interest: the uniform electron gas in the warm dense regime. The basis set incompleteness error at finite temperature is investigated and extrapolated via a simple Monte Carlo sampling procedure. Finally, we provide benchmark calculations for a four-electron system, comparing our results to previous work where possible
Monte Carlo modeling of Tajoura reactor
International Nuclear Information System (INIS)
From neutronics point of view, reactor modeling is concerned with the determination of the reactor neutronic parameters which can be obtained through the solution of the neutron transport equation. The attractiveness of the Monte Carlo method is in its capability of handling geometrically complicated problems and due to the nature of the method a large number of particles can be tracked from birth to death before any statistically significant results can be obtained. In this paper the MCNP, a Monte Carlo code, is implemented in the modeling of the Tajoura reactor. (author)
Monte Carlo dose computation for IMRT optimization*
Laub, W.; Alber, M.; Birkner, M.; Nüsslin, F.
2000-07-01
A method which combines the accuracy of Monte Carlo dose calculation with a finite size pencil-beam based intensity modulation optimization is presented. The pencil-beam algorithm is employed to compute the fluence element updates for a converging sequence of Monte Carlo dose distributions. The combination is shown to improve results over the pencil-beam based optimization in a lung tumour case and a head and neck case. Inhomogeneity effects like a broader penumbra and dose build-up regions can be compensated for by intensity modulation.
Monte Carlo electron/photon transport
International Nuclear Information System (INIS)
A review of nonplasma coupled electron/photon transport using Monte Carlo method is presented. Remarks are mainly restricted to linerarized formalisms at electron energies from 1 keV to 1000 MeV. Applications involving pulse-height estimation, transport in external magnetic fields, and optical Cerenkov production are discussed to underscore the importance of this branch of computational physics. Advances in electron multigroup cross-section generation is reported, and its impact on future code development assessed. Progress toward the transformation of MCNP into a generalized neutral/charged-particle Monte Carlo code is described. 48 refs
Monte Carlo simulation of neutron scattering instruments
International Nuclear Information System (INIS)
A library of Monte Carlo subroutines has been developed for the purpose of design of neutron scattering instruments. Using small-angle scattering as an example, the philosophy and structure of the library are described and the programs are used to compare instruments at continuous wave (CW) and long-pulse spallation source (LPSS) neutron facilities. The Monte Carlo results give a count-rate gain of a factor between 2 and 4 using time-of-flight analysis. This is comparable to scaling arguments based on the ratio of wavelength bandwidth to resolution width
Monte Carlo applications to radiation shielding problems
International Nuclear Information System (INIS)
Monte Carlo methods are a class of computational algorithms that rely on repeated random sampling of physical and mathematical systems to compute their results. However, basic concepts of MC are both simple and straightforward and can be learned by using a personal computer. Uses of Monte Carlo methods require large amounts of random numbers, and it was their use that spurred the development of pseudorandom number generators, which were far quicker to use than the tables of random numbers which had been previously used for statistical sampling. In Monte Carlo simulation of radiation transport, the history (track) of a particle is viewed as a random sequence of free flights that end with an interaction event where the particle changes its direction of movement, loses energy and, occasionally, produces secondary particles. The Monte Carlo simulation of a given experimental arrangement (e.g., an electron beam, coming from an accelerator and impinging on a water phantom) consists of the numerical generation of random histories. To simulate these histories we need an interaction model, i.e., a set of differential cross sections (DCS) for the relevant interaction mechanisms. The DCSs determine the probability distribution functions (pdf) of the random variables that characterize a track; 1) free path between successive interaction events, 2) type of interaction taking place and 3) energy loss and angular deflection in a particular event (and initial state of emitted secondary particles, if any). Once these pdfs are known, random histories can be generated by using appropriate sampling methods. If the number of generated histories is large enough, quantitative information on the transport process may be obtained by simply averaging over the simulated histories. The Monte Carlo method yields the same information as the solution of the Boltzmann transport equation, with the same interaction model, but is easier to implement. In particular, the simulation of radiation
Monte Carlo dose distributions for radiosurgery
International Nuclear Information System (INIS)
The precision of Radiosurgery Treatment planning systems is limited by the approximations of their algorithms and by their dosimetrical input data. This fact is especially important in small fields. However, the Monte Carlo methods is an accurate alternative as it considers every aspect of particle transport. In this work an acoustic neurinoma is studied by comparing the dose distribution of both a planning system and Monte Carlo. Relative shifts have been measured and furthermore, Dose-Volume Histograms have been calculated for target and adjacent organs at risk. (orig.)
Monte Carlo simulation of granular fluids
Montanero, J M
2003-01-01
An overview of recent work on Monte Carlo simulations of a granular binary mixture is presented. The results are obtained numerically solving the Enskog equation for inelastic hard-spheres by means of an extension of the well-known direct Monte Carlo simulation (DSMC) method. The homogeneous cooling state and the stationary state reached using the Gaussian thermostat are considered. The temperature ratio, the fourth velocity moments and the velocity distribution functions are obtained for both cases. The shear viscosity characterizing the momentum transport in the thermostatted case is calculated as well. The simulation results are compared with analytical predictions showing an excellent agreement.
Fast sequential Monte Carlo methods for counting and optimization
Rubinstein, Reuven Y; Vaisman, Radislav
2013-01-01
A comprehensive account of the theory and application of Monte Carlo methods Based on years of research in efficient Monte Carlo methods for estimation of rare-event probabilities, counting problems, and combinatorial optimization, Fast Sequential Monte Carlo Methods for Counting and Optimization is a complete illustration of fast sequential Monte Carlo techniques. The book provides an accessible overview of current work in the field of Monte Carlo methods, specifically sequential Monte Carlo techniques, for solving abstract counting and optimization problems. Written by authorities in the
Application of Monte Carlo methods in tomotherapy and radiation biophysics
Hsiao, Ya-Yun
Helical tomotherapy is an attractive treatment for cancer therapy because highly conformal dose distributions can be achieved while the on-board megavoltage CT provides simultaneous images for accurate patient positioning. The convolution/superposition (C/S) dose calculation methods typically used for Tomotherapy treatment planning may overestimate skin (superficial) doses by 3-13%. Although more accurate than C/S methods, Monte Carlo (MC) simulations are too slow for routine clinical treatment planning. However, the computational requirements of MC can be reduced by developing a source model for the parts of the accelerator that do not change from patient to patient. This source model then becomes the starting point for additional simulations of the penetration of radiation through patient. In the first section of this dissertation, a source model for a helical tomotherapy is constructed by condensing information from MC simulations into series of analytical formulas. The MC calculated percentage depth dose and beam profiles computed using the source model agree within 2% of measurements for a wide range of field sizes, which suggests that the proposed source model provides an adequate representation of the tomotherapy head for dose calculations. Monte Carlo methods are a versatile technique for simulating many physical, chemical and biological processes. In the second major of this thesis, a new methodology is developed to simulate of the induction of DNA damage by low-energy photons. First, the PENELOPE Monte Carlo radiation transport code is used to estimate the spectrum of initial electrons produced by photons. The initial spectrum of electrons are then combined with DNA damage yields for monoenergetic electrons from the fast Monte Carlo damage simulation (MCDS) developed earlier by Semenenko and Stewart (Purdue University). Single- and double-strand break yields predicted by the proposed methodology are in good agreement (1%) with the results of published
Monte Carlo methods in AB initio quantum chemistry quantum Monte Carlo for molecules
Lester, William A; Reynolds, PJ
1994-01-01
This book presents the basic theory and application of the Monte Carlo method to the electronic structure of atoms and molecules. It assumes no previous knowledge of the subject, only a knowledge of molecular quantum mechanics at the first-year graduate level. A working knowledge of traditional ab initio quantum chemistry is helpful, but not essential.Some distinguishing features of this book are: Clear exposition of the basic theory at a level to facilitate independent study. Discussion of the various versions of the theory: diffusion Monte Carlo, Green's function Monte Carlo, and release n
On the use of stochastic approximation Monte Carlo for Monte Carlo integration
Liang, Faming
2009-03-01
The stochastic approximation Monte Carlo (SAMC) algorithm has recently been proposed as a dynamic optimization algorithm in the literature. In this paper, we show in theory that the samples generated by SAMC can be used for Monte Carlo integration via a dynamically weighted estimator by calling some results from the literature of nonhomogeneous Markov chains. Our numerical results indicate that SAMC can yield significant savings over conventional Monte Carlo algorithms, such as the Metropolis-Hastings algorithm, for the problems for which the energy landscape is rugged. © 2008 Elsevier B.V. All rights reserved.
Use of Monte Carlo Methods in brachytherapy; Uso del metodo de Monte Carlo en braquiterapia
Energy Technology Data Exchange (ETDEWEB)
Granero Cabanero, D.
2015-07-01
The Monte Carlo method has become a fundamental tool for brachytherapy dosimetry mainly because no difficulties associated with experimental dosimetry. In brachytherapy the main handicap of experimental dosimetry is the high dose gradient near the present sources making small uncertainties in the positioning of the detectors lead to large uncertainties in the dose. This presentation will review mainly the procedure for calculating dose distributions around a fountain using the Monte Carlo method showing the difficulties inherent in these calculations. In addition we will briefly review other applications of the method of Monte Carlo in brachytherapy dosimetry, as its use in advanced calculation algorithms, calculating barriers or obtaining dose applicators around. (Author)
Poster — Thur Eve — 14: Improving Tissue Segmentation for Monte Carlo Dose Calculation using DECT
Energy Technology Data Exchange (ETDEWEB)
Di Salvio, A.; Bedwani, S.; Carrier, J-F. [Centre hospitalier de l' Université de Montréal (Canada); Bouchard, H. [National Physics Laboratory, Teddington (United Kingdom)
2014-08-15
Purpose: To improve Monte Carlo dose calculation accuracy through a new tissue segmentation technique with dual energy CT (DECT). Methods: Electron density (ED) and effective atomic number (EAN) can be extracted directly from DECT data with a stoichiometric calibration method. Images are acquired with Monte Carlo CT projections using the user code egs-cbct and reconstructed using an FDK backprojection algorithm. Calibration is performed using projections of a numerical RMI phantom. A weighted parameter algorithm then uses both EAN and ED to assign materials to voxels from DECT simulated images. This new method is compared to a standard tissue characterization from single energy CT (SECT) data using a segmented calibrated Hounsfield unit (HU) to ED curve. Both methods are compared to the reference numerical head phantom. Monte Carlo simulations on uniform phantoms of different tissues using dosxyz-nrc show discrepancies in depth-dose distributions. Results: Both SECT and DECT segmentation methods show similar performance assigning soft tissues. Performance is however improved with DECT in regions with higher density, such as bones, where it assigns materials correctly 8% more often than segmentation with SECT, considering the same set of tissues and simulated clinical CT images, i.e. including noise and reconstruction artifacts. Furthermore, Monte Carlo results indicate that kV photon beam depth-dose distributions can double between two tissues of density higher than muscle. Conclusions: A direct acquisition of ED and the added information of EAN with DECT data improves tissue segmentation and increases the accuracy of Monte Carlo dose calculation in kV photon beams.
Accelerating Hasenbusch's acceleration of hybrid Monte Carlo
International Nuclear Information System (INIS)
Hasenbusch has proposed splitting the pseudo-fermionic action into two parts, in order to speed-up Hybrid Monte Carlo simulations of QCD. We have tested a different splitting, also using clover-improved Wilson fermions. An additional speed-up between 5 and 20% over the original proposal was achieved in production runs. (orig.)
A comparison of Monte Carlo generators
Golan, Tomasz
2014-01-01
A comparison of GENIE, NEUT, NUANCE, and NuWro Monte Carlo neutrino event generators is presented using a set of four observables: protons multiplicity, total visible energy, most energetic proton momentum, and $\\pi^+$ two-dimensional energy vs cosine distribution.
Scalable Domain Decomposed Monte Carlo Particle Transport
Energy Technology Data Exchange (ETDEWEB)
O' Brien, Matthew Joseph [Univ. of California, Davis, CA (United States)
2013-12-05
In this dissertation, we present the parallel algorithms necessary to run domain decomposed Monte Carlo particle transport on large numbers of processors (millions of processors). Previous algorithms were not scalable, and the parallel overhead became more computationally costly than the numerical simulation.
Using CIPSI nodes in diffusion Monte Carlo
Caffarel, Michel; Giner, Emmanuel; Scemama, Anthony
2016-01-01
Several aspects of the recently proposed DMC-CIPSI approach consisting in using selected Configuration Interaction (SCI) approaches such as CIPSI (Configuration Interaction using a Perturbative Selection done Iteratively) to build accurate nodes for diffusion Monte Carlo (DMC) calculations are presented and discussed. The main ideas are illustrated with a number of calculations for diatomics molecules and for the benchmark G1 set.
Monte Carlo methods beyond detailed balance
Schram, Raoul D.; Barkema, Gerard T.
2015-01-01
Monte Carlo algorithms are nearly always based on the concept of detailed balance and ergodicity. In this paper we focus on algorithms that do not satisfy detailed balance. We introduce a general method for designing non-detailed balance algorithms, starting from a conventional algorithm satisfying
Monte Carlo Renormalization Group: a review
International Nuclear Information System (INIS)
The logic and the methods of Monte Carlo Renormalization Group (MCRG) are reviewed. A status report of results for 4-dimensional lattice gauge theories derived using MCRG is presented. Existing methods for calculating the improved action are reviewed and evaluated. The Gupta-Cordery improved MCRG method is described and compared with the standard one. 71 refs., 8 figs
Monte Carlo simulation of the microcanonical ensemble
International Nuclear Information System (INIS)
We consider simulating statistical systems with a random walk on a constant energy surface. This combines features of deterministic molecular dynamics techniques and conventional Monte Carlo simulations. For discrete systems the method can be programmed to run an order of magnitude faster than other approaches. It does not require high quality random numbers and may also be useful for nonequilibrium studies. 10 references
Extending canonical Monte Carlo methods: II
International Nuclear Information System (INIS)
We have previously presented a methodology for extending canonical Monte Carlo methods inspired by a suitable extension of the canonical fluctuation relation C = β2(δE2) compatible with negative heat capacities, C α, as is shown in the particular case of the 2D seven-state Potts model where the exponent α = 0.14–0.18
A Monte Carlo simulation of photomultiplier resolution
International Nuclear Information System (INIS)
A Monte Carlo simulation of dynode statistics has been used to generate multiphotoelectron distributions to compare with actual photomultiplier resolution results. In place of Poission of Polya statistics, in this novel approach, the basis for the simulation is an experimentally determined single electron response. The relevance of this method to the study of intrinsic line widths of scintillators is discussed
Parallel processing Monte Carlo radiation transport codes
International Nuclear Information System (INIS)
Issues related to distributed-memory multiprocessing as applied to Monte Carlo radiation transport are discussed. Measurements of communication overhead are presented for the radiation transport code MCNP which employs the communication software package PVM, and average efficiency curves are provided for a homogeneous virtual machine
A note on simultaneous Monte Carlo tests
DEFF Research Database (Denmark)
Hahn, Ute
In this short note, Monte Carlo tests of goodness of fit for data of the form X(t), t ∈ I are considered, that reject the null hypothesis if X(t) leaves an acceptance region bounded by an upper and lower curve for some t in I. A construction of the acceptance region is proposed that complies to a...
Optical monitoring of rheumatoid arthritis: Monte Carlo generated reconstruction kernels
Minet, O.; Beuthan, J.; Hielscher, A. H.; Zabarylo, U.
2008-06-01
Optical imaging in biomedicine is governed by the light absorption and scattering interaction on microscopic and macroscopic constituents in the medium. Therefore, light scattering characteristics of human tissue correlate with the stage of some diseases. In the near infrared range the scattering event with the coefficient approximately two orders of magnitude greater than absorption plays a dominant role. When measuring the optical parameters variations were discovered that correlate with the rheumatoid arthritis of a small joint. The potential of an experimental setup for transillumination the finger joint with a laser diode and the pattern of the stray light detection are demonstrated. The scattering caused by skin contains no useful information and it can be removed by a deconvolution technique to enhance the diagnostic value of this non-invasive optical method. Monte Carlo simulations ensure both the construction of the corresponding point spread function and both the theoretical verification of the stray light picture in rather complex geometry.
Kuidas kirjutatakse ajalugu? / Carlo Ginzburg ; interv. Marek Tamm
Ginzburg, Carlo
2007-01-01
Ülevaade Pisa Euroopa kultuuride professori C. Ginzburg'i teostest. Varem. ilm.: Märgid, jäljed ja tõendid : intervjuu Carlo Ginzburgiga // Ginzburg, Carlo. Juust ja vaglad. - Tallinn, 2000. - Lk. 262-271
R and D on automatic modeling methods for Monte Carlo codes FLUKA
International Nuclear Information System (INIS)
FLUKA is a fully integrated particle physics Monte Carlo simulation package. It is necessary to create the geometry models before calculation. However, it is time- consuming and error-prone to describe the geometry models manually. This study developed an automatic modeling method which could automatically convert computer-aided design (CAD) geometry models into FLUKA models. The conversion program was integrated into CAD/image-based automatic modeling program for nuclear and radiation transport simulation (MCAM). Its correctness has been demonstrated. (authors)
Monte Carlo study of real time dynamics
Alexandru, Andrei; Bedaque, Paulo F; Vartak, Sohan; Warrington, Neill C
2016-01-01
Monte Carlo studies involving real time dynamics are severely restricted by the sign problem that emerges from highly oscillatory phase of the path integral. In this letter, we present a new method to compute real time quantities on the lattice using the Schwinger-Keldysh formalism via Monte Carlo simulations. The key idea is to deform the path integration domain to a complex manifold where the phase oscillations are mild and the sign problem is manageable. We use the previously introduced "contraction algorithm" to create a Markov chain on this alternative manifold. We substantiate our approach by analyzing the quantum mechanical anharmonic oscillator. Our results are in agreement with the exact ones obtained by diagonalization of the Hamiltonian. The method we introduce is generic and in principle applicable to quantum field theory albeit very slow. We discuss some possible improvements that should speed up the algorithm.
Monte Carlo Simulation for Particle Detectors
Pia, Maria Grazia
2012-01-01
Monte Carlo simulation is an essential component of experimental particle physics in all the phases of its life-cycle: the investigation of the physics reach of detector concepts, the design of facilities and detectors, the development and optimization of data reconstruction software, the data analysis for the production of physics results. This note briefly outlines some research topics related to Monte Carlo simulation, that are relevant to future experimental perspectives in particle physics. The focus is on physics aspects: conceptual progress beyond current particle transport schemes, the incorporation of materials science knowledge relevant to novel detection technologies, functionality to model radiation damage, the capability for multi-scale simulation, quantitative validation and uncertainty quantification to determine the predictive power of simulation. The R&D on simulation for future detectors would profit from cooperation within various components of the particle physics community, and synerg...
Multilevel Monte Carlo Approaches for Numerical Homogenization
Efendiev, Yalchin R.
2015-10-01
In this article, we study the application of multilevel Monte Carlo (MLMC) approaches to numerical random homogenization. Our objective is to compute the expectation of some functionals of the homogenized coefficients, or of the homogenized solutions. This is accomplished within MLMC by considering different sizes of representative volumes (RVEs). Many inexpensive computations with the smallest RVE size are combined with fewer expensive computations performed on larger RVEs. Likewise, when it comes to homogenized solutions, different levels of coarse-grid meshes are used to solve the homogenized equation. We show that, by carefully selecting the number of realizations at each level, we can achieve a speed-up in the computations in comparison to a standard Monte Carlo method. Numerical results are presented for both one-dimensional and two-dimensional test-cases that illustrate the efficiency of the approach.
Coevolution Based Adaptive Monte Carlo Localization (CEAMCL
Directory of Open Access Journals (Sweden)
Luo Ronghua
2008-11-01
Full Text Available An adaptive Monte Carlo localization algorithm based on coevolution mechanism of ecological species is proposed. Samples are clustered into species, each of which represents a hypothesis of the robot's pose. Since the coevolution between the species ensures that the multiple distinct hypotheses can be tracked stably, the problem of premature convergence when using MCL in highly symmetric environments can be solved. And the sample size can be adjusted adaptively over time according to the uncertainty of the robot's pose by using the population growth model. In addition, by using the crossover and mutation operators in evolutionary computation, intra-species evolution can drive the samples move towards the regions where the desired posterior density is large. So a small size of samples can represent the desired density well enough to make precise localization. The new algorithm is termed coevolution based adaptive Monte Carlo localization (CEAMCL. Experiments have been carried out to prove the efficiency of the new localization algorithm.
Monte Carlo simulation of gas Cerenkov detectors
International Nuclear Information System (INIS)
Theoretical study of selected gamma-ray and electron diagnostic necessitates coupling Cerenkov radiation to electron/photon cascades. A Cerenkov production model and its incorporation into a general geometry Monte Carlo coupled electron/photon transport code is discussed. A special optical photon ray-trace is implemented using bulk optical properties assigned to each Monte Carlo zone. Good agreement exists between experimental and calculated Cerenkov data in the case of a carbon-dioxide gas Cerenkov detector experiment. Cerenkov production and threshold data are presented for a typical carbon-dioxide gas detector that converts a 16.7 MeV photon source to Cerenkov light, which is collected by optics and detected by a photomultiplier
No-compromise reptation quantum Monte Carlo
International Nuclear Information System (INIS)
Since its publication, the reptation quantum Monte Carlo algorithm of Baroni and Moroni (1999 Phys. Rev. Lett. 82 4745) has been applied to several important problems in physics, but its mathematical foundations are not well understood. We show that their algorithm is not of typical Metropolis-Hastings type, and we specify conditions required for the generated Markov chain to be stationary and to converge to the intended distribution. The time-step bias may add up, and in many applications it is only the middle of a reptile that is the most important. Therefore, we propose an alternative, 'no-compromise reptation quantum Monte Carlo' to stabilize the middle of the reptile. (fast track communication)
Fast Lattice Monte Carlo Simulations of Polymers
Wang, Qiang; Zhang, Pengfei
2014-03-01
The recently proposed fast lattice Monte Carlo (FLMC) simulations (with multiple occupancy of lattice sites (MOLS) and Kronecker δ-function interactions) give much faster/better sampling of configuration space than both off-lattice molecular simulations (with pair-potential calculations) and conventional lattice Monte Carlo simulations (with self- and mutual-avoiding walk and nearest-neighbor interactions) of polymers.[1] Quantitative coarse-graining of polymeric systems can also be performed using lattice models with MOLS.[2] Here we use several model systems, including polymer melts, solutions, blends, as well as confined and/or grafted polymers, to demonstrate the great advantages of FLMC simulations in the study of equilibrium properties of polymers.
Status of Monte Carlo at Los Alamos
International Nuclear Information System (INIS)
Four papers were presented by Group X-6 on April 22, 1980, at the Oak Ridge Radiation Shielding Information Center (RSIC) Seminar-Workshop on Theory and Applications of Monte Carlo Methods. These papers are combined into one report for convenience and because they are related to each other. The first paper (by Thompson and Cashwell) is a general survey about X-6 and MCNP and is an introduction to the other three papers. It can also serve as a resume of X-6. The second paper (by Godfrey) explains some of the details of geometry specification in MCNP. The third paper (by Cashwell and Schrandt) illustrates calculating flux at a point with MCNP; in particular, the once-more-collided flux estimator is demonstrated. Finally, the fourth paper (by Thompson, Deutsch, and Booth) is a tutorial on some variance-reduction techniques. It should be required for a fledging Monte Carlo practitioner
Composite biasing in Monte Carlo radiative transfer
Baes, Maarten; Lunttila, Tuomas; Bianchi, Simone; Camps, Peter; Juvela, Mika; Kuiper, Rolf
2016-01-01
Biasing or importance sampling is a powerful technique in Monte Carlo radiative transfer, and can be applied in different forms to increase the accuracy and efficiency of simulations. One of the drawbacks of the use of biasing is the potential introduction of large weight factors. We discuss a general strategy, composite biasing, to suppress the appearance of large weight factors. We use this composite biasing approach for two different problems faced by current state-of-the-art Monte Carlo radiative transfer codes: the generation of photon packages from multiple components, and the penetration of radiation through high optical depth barriers. In both cases, the implementation of the relevant algorithms is trivial and does not interfere with any other optimisation techniques. Through simple test models, we demonstrate the general applicability, accuracy and efficiency of the composite biasing approach. In particular, for the penetration of high optical depths, the gain in efficiency is spectacular for the spe...
EU Commissioner Carlos Moedas visits SESAME
CERN Bulletin
2015-01-01
The European Commissioner for research, science and innovation, Carlos Moedas, visited the SESAME laboratory in Jordan on Monday 13 April. When it begins operation in 2016, SESAME, a synchrotron light source, will be the Middle East’s first major international science centre, carrying out experiments ranging from the physical sciences to environmental science and archaeology. CERN Director-General Rolf Heuer (left) and European Commissioner Carlos Moedas with the model SESAME magnet. © European Union, 2015. Commissioner Moedas was accompanied by a European Commission delegation led by Robert-Jan Smits, Director-General of DG Research and Innovation, as well as Rolf Heuer, CERN Director-General, Jean-Pierre Koutchouk, coordinator of the CERN-EC Support for SESAME Magnets (CESSAMag) project and Princess Sumaya bint El Hassan of Jordan, a leading advocate of science in the region. They toured the SESAME facility together with SESAME Director, Khaled Tou...
Hybrid Monte Carlo with Chaotic Mixing
Kadakia, Nirag
2016-01-01
We propose a hybrid Monte Carlo (HMC) technique applicable to high-dimensional multivariate normal distributions that effectively samples along chaotic trajectories. The method is predicated on the freedom of choice of the HMC momentum distribution, and due to its mixing properties, exhibits sample-to-sample autocorrelations that decay far faster than those in the traditional hybrid Monte Carlo algorithm. We test the methods on distributions of varying correlation structure, finding that the proposed technique produces superior covariance estimates, is less reliant on step-size tuning, and can even function with sparse or no momentum re-sampling. The method presented here is promising for more general distributions, such as those that arise in Bayesian learning of artificial neural networks and in the state and parameter estimation of dynamical systems.
Monte Carlo Shell Model Mass Predictions
International Nuclear Information System (INIS)
The nuclear mass calculation is discussed in terms of large-scale shell model calculations. First, the development and limitations of the conventional shell model calculations are mentioned. In order to overcome the limitations, the Quantum Monte Carlo Diagonalization (QMCD) method has been proposed. The basic formulation and features of the QMCD method are presented as well as its application to the nuclear shell model, referred to as Monte Carlo Shell Model (MCSM). The MCSM provides us with a breakthrough in shell model calculations: the structure of low-lying states can be studied with realistic interactions for a nearly unlimited variety of nuclei. Thus, the MCSM can contribute significantly to the study of nuclear masses. An application to N∼20 unstable nuclei far from the β-stability line is mentioned
Quantum Monte Carlo calculations for carbon nanotubes
Luu, Thomas; Lähde, Timo A.
2016-04-01
We show how lattice quantum Monte Carlo can be applied to the electronic properties of carbon nanotubes in the presence of strong electron-electron correlations. We employ the path-integral formalism and use methods developed within the lattice QCD community for our numerical work. Our lattice Hamiltonian is closely related to the hexagonal Hubbard model augmented by a long-range electron-electron interaction. We apply our method to the single-quasiparticle spectrum of the (3,3) armchair nanotube configuration, and consider the effects of strong electron-electron correlations. Our approach is equally applicable to other nanotubes, as well as to other carbon nanostructures. We benchmark our Monte Carlo calculations against the two- and four-site Hubbard models, where a direct numerical solution is feasible.
Status of Monte Carlo at Los Alamos
International Nuclear Information System (INIS)
At Los Alamos the early work of Fermi, von Neumann, and Ulam has been developed and supplemented by many followers, notably Cashwell and Everett, and the main product today is the continuous-energy, general-purpose, generalized-geometry, time-dependent, coupled neutron-photon transport code called MCNP. The Los Alamos Monte Carlo research and development effort is concentrated in Group X-6. MCNP treats an arbitrary three-dimensional configuration of arbitrary materials in geometric cells bounded by first- and second-degree surfaces and some fourth-degree surfaces (elliptical tori). Monte Carlo has evolved into perhaps the main method for radiation transport calculations at Los Alamos. MCNP is used in every technical division at the Laboratory by over 130 users about 600 times a month accounting for nearly 200 hours of CDC-7600 time
A Monte Carlo solution to skyshine radiation
International Nuclear Information System (INIS)
A Monte Carlo method was used to calculate the skyshine doses from 2-ft exposure cell ceiling of an accelerator. Modifications were made to the Monte Carlo program MORSE code to perform this analysis. Adjoint mode calculations provided optimum Russian roulette and splitting parameters which were later used in the forward mode calculations. Russian roulette and splitting were used at the collision sites and at boundary crossings. Exponential transform was used for particle pathlength stretching. The TIGER code was used to generate the anisotropic source term and P5 Legendre expansion was used to compute the cross sections. Where negative fluxes occured at detector locations due to large angle scatterings, a macroscopic cross section data bank was used to make Klein-Nishina and pair production flux estimates. With the above modifications, sixty detectors at locations ranging from 10 to 300 ft from the cell wall showed good statistical responses (5 to 10% fsd)
The lund Monte Carlo for jet fragmentation
International Nuclear Information System (INIS)
We present a Monte Carlo program based on the Lund model for jet fragmentation. Quark, gluon, diquark and hadron jets are considered. Special emphasis is put on the fragmentation of colour singlet jet systems, for which energy, momentum and flavour are conserved explicitly. The model for decays of unstable particles, in particular the weak decay of heavy hadrons, is described. The central part of the paper is a detailed description on how to use the FORTRAN 77 program. (Author)
New Dynamic Monte Carlo Renormalization Group Method
Lacasse, Martin-D.; Vinals, Jorge; Grant, Martin
1992-01-01
The dynamical critical exponent of the two-dimensional spin-flip Ising model is evaluated by a Monte Carlo renormalization group method involving a transformation in time. The results agree very well with a finite-size scaling analysis performed on the same data. The value of $z = 2.13 \\pm 0.01$ is obtained, which is consistent with most recent estimates.
Autocorrelations in hybrid Monte Carlo simulations
International Nuclear Information System (INIS)
Simulations of QCD suffer from severe critical slowing down towards the continuum limit. This problem is known to be prominent in the topological charge, however, all observables are affected to various degree by these slow modes in the Monte Carlo evolution. We investigate the slowing down in high statistics simulations and propose a new error analysis method, which gives a realistic estimate of the contribution of the slow modes to the errors. (orig.)
Monte Carlo methods for preference learning
DEFF Research Database (Denmark)
Viappiani, P.
2012-01-01
Utility elicitation is an important component of many applications, such as decision support systems and recommender systems. Such systems query the users about their preferences and give recommendations based on the system’s belief about the utility function. Critical to these applications is th...... is the acquisition of prior distribution about the utility parameters and the possibility of real time Bayesian inference. In this paper we consider Monte Carlo methods for these problems....
The Moment Guided Monte Carlo Method
Degond, Pierre; Dimarco, Giacomo; Pareschi, Lorenzo
2009-01-01
In this work we propose a new approach for the numerical simulation of kinetic equations through Monte Carlo schemes. We introduce a new technique which permits to reduce the variance of particle methods through a matching with a set of suitable macroscopic moment equations. In order to guarantee that the moment equations provide the correct solutions, they are coupled to the kinetic equation through a non equilibrium term. The basic idea, on which the method relies, consists in guiding the p...
Handbook of Markov chain Monte Carlo
Brooks, Steve
2011-01-01
""Handbook of Markov Chain Monte Carlo"" brings together the major advances that have occurred in recent years while incorporating enough introductory material for new users of MCMC. Along with thorough coverage of the theoretical foundations and algorithmic and computational methodology, this comprehensive handbook includes substantial realistic case studies from a variety of disciplines. These case studies demonstrate the application of MCMC methods and serve as a series of templates for the construction, implementation, and choice of MCMC methodology.
Simulated Annealing using Hybrid Monte Carlo
Salazar, Rafael; Toral, Raúl
1997-01-01
We propose a variant of the simulated annealing method for optimization in the multivariate analysis of differentiable functions. The method uses global actualizations via the hybrid Monte Carlo algorithm in their generalized version for the proposal of new configurations. We show how this choice can improve upon the performance of simulated annealing methods (mainly when the number of variables is large) by allowing a more effective searching scheme and a faster annealing schedule.
Coevolution Based Adaptive Monte Carlo Localization (CEAMCL)
Luo Ronghua; Hong Bingrong
2004-01-01
An adaptive Monte Carlo localization algorithm based on coevolution mechanism of ecological species is proposed. Samples are clustered into species, each of which represents a hypothesis of the robot's pose. Since the coevolution between the species ensures that the multiple distinct hypotheses can be tracked stably, the problem of premature convergence when using MCL in highly symmetric environments can be solved. And the sample size can be adjusted adaptively over time according to the unce...
Monte Carlo modeling of liquid scinttilation spectra
Czech Academy of Sciences Publication Activity Database
Šimek, Ondřej; Šídlová, V.; Světlík, Ivo; Tomášková, Lenka
Praha : ČVUT v Praze, 2007, s. 90-93. ISBN 978-80-01-03901-4. [Dny radiační ochrany /29./. Kouty nad Desnou, Hrubý Jeseník (CZ), 05.11.2007-09.11.2007] Institutional research plan: CEZ:AV0Z10480505 Keywords : Monte Carlo modelling * liquid scintillation spectra * energy deposition Subject RIV: BG - Nuclear, Atomic and Molecular Physics, Colliders
Monte Carlo Simulations of Star Clusters
Giersz, M
2000-01-01
A revision of Stod\\'o{\\l}kiewicz's Monte Carlo code is used to simulate evolution of large star clusters. The survey on the evolution of multi-mass N-body systems influenced by the tidal field of a parent galaxy and by stellar evolution is discussed. For the first time, the simulation on the "star-by-star" bases of evolution of 1,000,000 body star cluster is presented. \\
Replica Exchange for Reactive Monte Carlo Simulations
Czech Academy of Sciences Publication Activity Database
Turner, C.H.; Brennan, J.K.; Lísal, Martin
2007-01-01
Roč. 111, č. 43 (2007), s. 15706-15715. ISSN 1932-7447 R&D Projects: GA ČR GA203/05/0725; GA AV ČR 1ET400720409; GA AV ČR 1ET400720507 Institutional research plan: CEZ:AV0Z40720504 Keywords : monte carlo * simulation * reactive system Subject RIV: CF - Physical ; Theoretical Chemistry
A Ballistic Monte Carlo Approximation of {\\pi}
Dumoulin, Vincent
2014-01-01
We compute a Monte Carlo approximation of {\\pi} using importance sampling with shots coming out of a Mossberg 500 pump-action shotgun as the proposal distribution. An approximated value of 3.136 is obtained, corresponding to a 0.17% error on the exact value of {\\pi}. To our knowledge, this represents the first attempt at estimating {\\pi} using such method, thus opening up new perspectives towards computing mathematical constants using everyday tools.
Topological zero modes in Monte Carlo simulations
International Nuclear Information System (INIS)
We present an improvement of global Metropolis updating steps, the instanton hits, used in a hybrid Monte Carlo simulation of the two-flavor Schwinger model with staggered fermions. These hits are designed to change the topological sector of the gauge field. In order to match these hits to an unquenched simulation with pseudofermions, the approximate zero mode structure of the lattice Dirac operator has to be considered explicitly. (orig.)
On adaptive Markov chain Monte Carlo algorithms
Atchadé, Yves F.; Rosenthal, Jeffrey S.
2005-01-01
We look at adaptive Markov chain Monte Carlo algorithms that generate stochastic processes based on sequences of transition kernels, where each transition kernel is allowed to depend on the history of the process. We show under certain conditions that the stochastic process generated is ergodic, with appropriate stationary distribution. We use this result to analyse an adaptive version of the random walk Metropolis algorithm where the scale parameter σ is sequentially adapted using a Robbins-...
Introduction to the Monte Carlo methods
International Nuclear Information System (INIS)
Codes illustrating the use of Monte Carlo methods in high energy physics such as the inverse transformation method, the ejection method, the particle propagation through the nucleus, the particle interaction with the nucleus, etc. are presented. A set of useful algorithms of random number generators is given (the binomial distribution, the Poisson distribution, β-distribution, γ-distribution and normal distribution). 5 figs., 1 tab
Tracklength biassing in Monte Carlo radiation transport
International Nuclear Information System (INIS)
Tracklength stretching is employed in deep penetration Monte Carlo studies for variance reduction. Incorporating a dependence of the biassing on the angular disposition of the track improves the procedure. Linear and exponential forms for this dependence are investigated here, using Spanier's self-learning technique. Suitable biassing parameters are worked out for representative shield systems, for use in practical simulations. Of the two, we find that the exponential scheme performs better. (orig.)
A Monte Carlo for BFKL Physics
Orr, Lynne H.; Stirling, W. J.
2000-01-01
Virtual photon scattering in e^+e^- collisions can result in events with the electron-positron pair at large rapidity separation with hadronic activity in between. The BFKL equation resums large logarithms that dominate the cross section for this process. We report here on a Monte Carlo method for solving the BFKL equation that allows kinematic constraints to be taken into account. The application to e^+e^- collisions is in progress.
Lookahead Strategies for Sequential Monte Carlo
Lin, Ming; Chen, Rong; Liu, Jun
2013-01-01
Based on the principles of importance sampling and resampling, sequential Monte Carlo (SMC) encompasses a large set of powerful techniques dealing with complex stochastic dynamic systems. Many of these systems possess strong memory, with which future information can help sharpen the inference about the current state. By providing theoretical justification of several existing algorithms and introducing several new ones, we study systematically how to construct efficient SMC algorithms to take ...
Archimedes, the Free Monte Carlo simulator
Sellier, Jean Michel D.
2012-01-01
Archimedes is the GNU package for Monte Carlo simulations of electron transport in semiconductor devices. The first release appeared in 2004 and since then it has been improved with many new features like quantum corrections, magnetic fields, new materials, GUI, etc. This document represents the first attempt to have a complete manual. Many of the Physics models implemented are described and a detailed description is presented to make the user able to write his/her own input deck. Please, fee...
Monte Carlo simulation and numerical integration
Geweke, John F.
1995-01-01
This is a survey of simulation methods in economics, with a specific focus on integration problems. It describes acceptance methods, importance sampling procedures, and Markov chain Monte Carlo methods for simulation from univariate and multivariate distributions and their application to the approximation of integrals. The exposition gives emphasis to combinations of different approaches and assessment of the accuracy of numerical approximations to integrals and expectations. The survey illus...
jTracker and Monte Carlo Comparison
Selensky, Lauren; SeaQuest/E906 Collaboration
2015-10-01
SeaQuest is designed to observe the characteristics and behavior of `sea-quarks' in a proton by reconstructing them from the subatomic particles produced in a collision. The 120 GeV beam from the main injector collides with a fixed target and then passes through a series of detectors which records information about the particles produced in the collision. However, this data becomes meaningful only after it has been processed, stored, analyzed, and interpreted. Several programs are involved in this process. jTracker (sqerp) reads wire or hodoscope hits and reconstructs the tracks of potential dimuon pairs from a run, and Geant4 Monte Carlo simulates dimuon production and background noise from the beam. During track reconstruction, an event must meet the criteria set by the tracker to be considered a viable dimuon pair; this ensures that relevant data is retained. As a check, a comparison between a new version of jTracker and Monte Carlo was made in order to see how accurately jTracker could reconstruct the events created by Monte Carlo. In this presentation, the results of the inquest and their potential effects on the programming will be shown. This work is supported by U.S. DOE MENP Grant DE-FG02-03ER41243.
Quantum Monte Carlo for vibrating molecules
International Nuclear Information System (INIS)
Quantum Monte Carlo (QMC) has successfully computed the total electronic energies of atoms and molecules. The main goal of this work is to use correlation function quantum Monte Carlo (CFQMC) to compute the vibrational state energies of molecules given a potential energy surface (PES). In CFQMC, an ensemble of random walkers simulate the diffusion and branching processes of the imaginary-time time dependent Schroedinger equation in order to evaluate the matrix elements. The program QMCVIB was written to perform multi-state VMC and CFQMC calculations and employed for several calculations of the H2O and C3 vibrational states, using 7 PES's, 3 trial wavefunction forms, two methods of non-linear basis function parameter optimization, and on both serial and parallel computers. In order to construct accurate trial wavefunctions different wavefunctions forms were required for H2O and C3. In order to construct accurate trial wavefunctions for C3, the non-linear parameters were optimized with respect to the sum of the energies of several low-lying vibrational states. In order to stabilize the statistical error estimates for C3 the Monte Carlo data was collected into blocks. Accurate vibrational state energies were computed using both serial and parallel QMCVIB programs. Comparison of vibrational state energies computed from the three C3 PES's suggested that a non-linear equilibrium geometry PES is the most accurate and that discrete potential representations may be used to conveniently determine vibrational state energies
Monte Carlo small-sample perturbation calculations
International Nuclear Information System (INIS)
Two different Monte Carlo methods have been developed for benchmark computations of small-sample-worths in simplified geometries. The first is basically a standard Monte Carlo perturbation method in which neutrons are steered towards the sample by roulette and splitting. One finds, however, that two variance reduction methods are required to make this sort of perturbation calculation feasible. First, neutrons that have passed through the sample must be exempted from roulette. Second, neutrons must be forced to undergo scattering collisions in the sample. Even when such methods are invoked, however, it is still necessary to exaggerate the volume fraction of the sample by drastically reducing the size of the core. The benchmark calculations are then used to test more approximate methods, and not directly to analyze experiments. In the second method the flux at the surface of the sample is assumed to be known. Neutrons entering the sample are drawn from this known flux and tracking by Monte Carlo. The effect of the sample or the fission rate is then inferred from the histories of these neutrons. The characteristics of both of these methods are explored empirically
Exploring backscattered imaging in low voltage FE-SEM
Lewis, P.; Micklethwaite, S.; Harrington, J; Dixon, M.; Brydson, R; Hondow, N
2015-01-01
Contrast levels in backscattered SEM images were investigated, utilising stage deceleration for low voltage imaging and also electron energy filtering. Image contrast variations are explained via use of Monte Carlo simulations which can predict the optimum accelerating and filter voltages for imaging complex sample mixtures.
Juste, Belén; Miró, R.; Abella, V.; Santos, A.; Verdú, Gumersindo
2015-11-01
Radiation therapy treatment planning based on Monte Carlo simulation provide a very accurate dose calculation compared to deterministic systems. Nowadays, Metal-Oxide-Semiconductor Field Effect Transistor (MOSFET) dosimeters are increasingly utilized in radiation therapy to verify the received dose by patients. In the present work, we have used the MCNP6 (Monte Carlo N-Particle transport code) to simulate the irradiation of an anthropomorphic phantom (RANDO) with a medical linear accelerator. The detailed model of the Elekta Precise multileaf collimator using a 6 MeV photon beam was designed and validated by means of different beam sizes and shapes in previous works. To include in the simulation the RANDO phantom geometry a set of Computer Tomography images of the phantom was obtained and formatted. The slices are input in PLUNC software, which performs the segmentation by defining anatomical structures and a Matlab algorithm writes the phantom information in MCNP6 input deck format. The simulation was verified and therefore the phantom model and irradiation was validated throughout the comparison of High-Sensitivity MOSFET dosimeter (Best medical Canada) measurements in different points inside the phantom with simulation results. On-line Wireless MOSFET provide dose estimation in the extremely thin sensitive volume, so a meticulous and accurate validation has been performed. The comparison show good agreement between the MOSFET measurements and the Monte Carlo calculations, confirming the validity of the developed procedure to include patients CT in simulations and approving the use of Monte Carlo simulations as an accurate therapy treatment plan.
Monte Carlo simulation for dual head gamma camera
International Nuclear Information System (INIS)
Monte Carlo (MC) simulation technique was used widely in medical physics applications. In nuclear medicine MC was used to design new medical imaging devices such as positron emission tomography (PET), gamma camera and single photon emission computed tomography (SPECT). Also it can be used to study the factors affecting image quality and internal dosimetry, Gate is on of monte Carlo code that has a number of advantages for simulation of SPECT and PET. There is a limit accessibilities in machines which are used in clinics because of the work load of machines. This makes it hard to evaluate some factors effecting machine performance which must be evaluated routinely. Also because of difficulties of carrying out scientific research and training of students, MC model can be optimum solution for the problem. The aim of this study was to use gate monte Carlo code to model Nucline spirit, medico dual head gamma camera hosted in radiation and isotopes center of Khartoum which is equipped with low energy general purpose LEGP collimators. This was used model to evaluate spatial resolution and sensitivity which is important factor affecting image quality and to demonstrate the validity of gate by comparing experimental results with simulation results on spatial resolution. The gate model of Nuclide spirit, medico dual head gamma camera was developed by applying manufacturer specifications. Then simulation was run. In evaluation of spatial resolution the FWHM was calculated from image profile of line source of Tc 99m gammas emitter of energy 140 KeV at different distances from modeled camera head at 5,10,15,20,22,27,32,37 cm and for these distances the spatial resolution was founded to be 5.76, 7.73, 10.7, 13.8, 14.01,16.91, 19.75 and 21.9 mm, respectively. These results showed a decrement of spatial resolution with increase of the distance between object (line source) and collimator in linear manner. FWHM calculated at 10 cm was compared with experimental results. The
Cost effective distributed computing for Monte Carlo radiation dosimetry
International Nuclear Information System (INIS)
Full text: An inexpensive computing facility has been established for performing repetitive Monte Carlo simulations with the BEAM and EGS4/EGSnrc codes of linear accelerator beams, for calculating effective dose from diagnostic imaging procedures and of ion chambers and phantoms used for the Australian high energy absorbed dose standards. The facility currently consists of 3 dual-processor 450 MHz processor PCs linked by a high speed LAN. The 3 PCs can be accessed either locally from a single keyboard/monitor/mouse combination using a SwitchView controller or remotely via a computer network from PCs with suitable communications software (e.g. Telnet, Kermit etc). All 3 PCs are identically configured to have the Red Hat Linux 6.0 operating system. A Fortran compiler and the BEAM and EGS4/EGSnrc codes are available on the 3 PCs. The preparation of sequences of jobs utilising the Monte Carlo codes is simplified using load-distributing software (enFuzion 6.0 marketed by TurboLinux Inc, formerly Cluster from Active Tools) which efficiently distributes the computing load amongst all 6 processors. We describe 3 applications of the system - (a) energy spectra from radiotherapy sources, (b) mean mass-energy absorption coefficients and stopping powers for absolute absorbed dose standards and (c) dosimetry for diagnostic procedures; (a) and (b) are based on the transport codes BEAM and FLURZnrc while (c) is a Fortran/EGS code developed at ARPANSA. Efficiency gains ranged from 3 for (c) to close to the theoretical maximum of 6 for (a) and (b), with the gain depending on the amount of 'bookkeeping' to begin each task and the time taken to complete a single task. We have found the use of a load-balancing batch processing system with many PCs to be an economical way of achieving greater productivity for Monte Carlo calculations or of any computer intensive task requiring many runs with different parameters. Copyright (2000) Australasian College of Physical Scientists and
Benchmarking of proton transport in Super Monte Carlo simulation program
International Nuclear Information System (INIS)
Full text of the publication follows. The Monte Carlo (MC) method has been traditionally applied in nuclear design and analysis due to its capability of dealing with complicated geometries and multi-dimensional physics problems as well as obtaining accurate results. The Super Monte Carlo Simulation Program (SuperMC) is developed by FDS Team in China for fusion, fission, and other nuclear applications. The simulations of radiation transport, isotope burn-up, material activation, radiation dose, and biology damage could be performed using SuperMC. Complicated geometries and the whole physical process of various types of particles in broad energy scale can be well handled. Bi-directional automatic conversion between general CAD models and full-formed input files of SuperMC is supported by MCAM, which is a CAD/image-based automatic modeling program for neutronics and radiation transport simulation. Mixed visualization of dynamical 3D dataset and geometry model is supported by RVIS, which is a nuclear radiation virtual simulation and assessment system. Continuous-energy cross section data from hybrid evaluated nuclear data library HENDL are utilized to support simulation. Neutronic fixed source and critical design parameters calculates for reactors of complex geometry and material distribution based on the transport of neutron and photon have been achieved in our former version of SuperMC. Recently, the proton transport has also been integrated in SuperMC in the energy region up to 10 GeV. The physical processes considered for proton transport include electromagnetic processes and hadronic processes. The electromagnetic processes include ionization, multiple scattering, Bremsstrahlung, and pair production processes. Public evaluated data from HENDL are used in some electromagnetic processes. In hadronic physics, the Bertini intra-nuclear cascade model with excitons, preequilibrium model, nucleus explosion model, fission model, and evaporation model are incorporated to
Virtual detector characterisation with Monte-Carlo simulations
Sukowski, F.; Yaneu Yaneu, J. F.; Salamon, M.; Ebert, S.; Uhlmann, N.
2009-08-01
In the field of X-ray imaging flat-panel detectors which convert X-rays into electrical signals, are widely used. For different applications, detectors differ in several specific parameters that can be used for characterizing the detector. At the Development Center X-ray Technology EZRT we studied the question how well these characteristics can be determined by only knowing the layer composition of a detector. In order to determine the required parameters, the Monte-Carlo (MC) simulation program ROSI [J. Giersch et al., Nucl. Instr. and Meth. A 509 (2003) 151] was used while taking into account all primary and secondary particle interactions as well as the focal spot size of the X-ray tube. For the study, the Hamamatsu C9311DK [Technical Datasheet Hamamatsu C9311DK flat panel sensor, Hamamatsu Photonics, ( www.hamamatsu.com)], a scintillator-based detector, and the Ajat DIC 100TL [Technical description of Ajat DIC 100TL, Ajat Oy Ltd., ( www.ajat.fi)], a direct converting semiconductor detector, were used. The layer compositions of the two detectors were implemented into the MC simulation program. The following characteristics were measured [N. Uhlmann et al., Nucl. Instr. and Meth. A 591 (2008) 46] and compared to simulation results: The basic spatial resolution (BSR), the modulation transfer function (MTF), the contrast sensitivity (CS) and the specific material thickness range (SMTR). To take scattering of optical photons into account DETECT2000 [C. Moisan et al., DETECT2000—A Program for Modeling Optical Properties of Scintillators, Department of Electrical and Computer Engineering, Laval University, Quebec City, 2000], another Monte-Carlo simulation was used.
Guideline of Monte Carlo calculation. Neutron/gamma ray transport simulation by Monte Carlo method
2002-01-01
This report condenses basic theories and advanced applications of neutron/gamma ray transport calculations in many fields of nuclear energy research. Chapters 1 through 5 treat historical progress of Monte Carlo methods, general issues of variance reduction technique, cross section libraries used in continuous energy Monte Carlo codes. In chapter 6, the following issues are discussed: fusion benchmark experiments, design of ITER, experiment analyses of fast critical assembly, core analyses of JMTR, simulation of pulsed neutron experiment, core analyses of HTTR, duct streaming calculations, bulk shielding calculations, neutron/gamma ray transport calculations of the Hiroshima atomic bomb. Chapters 8 and 9 treat function enhancements of MCNP and MVP codes, and a parallel processing of Monte Carlo calculation, respectively. An important references are attached at the end of this report.
International Nuclear Information System (INIS)
Methods for Monte Carlo procedure in radiation measurement by SPECT (single photon emission computed tomography) and 3-D PET (3-dimensional positron emission tomography) are described together with its application to develop and optimize the scattering correction method in 201Tl-SPECT. In the medical technology, the Monte Carlo simulation makes it possible to quantify the behavior of a photon like scattering and absorption, and which can be performed by the use of EGS4 simulation code consisting from Step A - E. With the method, data collection procedures of the diagnostic equipments for nuclear medicine and application to develop the transmission radiation source for SPECT are described. Precision of the scattering correction method is also evaluated in the SPECT by the Monte Carlo simulation. The simulation is a useful tool for evaluating the behavior of radiation in the human body which can not be actually measured. (K.H.)
Radiographic imaging simulator
International Nuclear Information System (INIS)
Three-dimensional model for radiography imaging simulation was developed. The simulation is based on voxelized model and ray-tracing technique. To reduce the computational burden, the detector is treated in the simplified manner. Multiple scattering was treated by means of dose buildup factors. Results are compared to ones from realistic MCNP and FOTELP Monte Carlo simulations. It was shown that proposed model can be used in clinical practice when more exact techniques are not economical. (author)
Monte Carlo modelling for individual monitoring
International Nuclear Information System (INIS)
Full text: Individual monitoring techniques provide suitable tools for the estimate of personal dose equivalent Hp(d), representative of the effective dose, in case of external irradiation, or the evaluation of the committed effective dose by inference from activity measurements, in case of internal contamination. In both these fields Monte Carlo techniques play a crucial role: they can provide a series of parameters that are usually difficult, sometimes impossible, to be assessed experimentally. The aim of this paper is to give a panoramic view of Monte Carlo studies in external exposures individual monitoring field; internal dosimetry applications are briefly summarized in another paper. The operative practice in the field of occupational exposure relies on the employment of personal dosemeters to be worn appropriately on the body in order to guarantee a reliable estimate of the radiation protection quantities (i.e. effective dose or equivalent dose). Personal dosemeters are calibrated in terms of the ICRU operational quantity personal dose equivalent, Hp(d), that should, in principle, represent a reasonably conservative approximation of the radiation protection quantity (this condition is not fulfilled in a specific neutron energy range). All the theoretical and practical implementation of photon individual monitoring relies on two main aspects: the definition of the operational quantities and the calculation of the corresponding conversion coefficients for the field quantities (fluence and air kerma); the characterization of individual dosemeters in terms of these operational quantities with the associated energy and angular type test evaluations carried out on suitable calibration phantoms. For the first aspect (evaluation of conversion coefficients) rather exhaustive tabulations of Monte Carlo evaluated conversion coefficients has been published in ICRP and ICRU reports as well as in the open literature. For the second aspect (type test and calibration
Quantum Monte Carlo for vibrating molecules
Energy Technology Data Exchange (ETDEWEB)
Brown, W.R. [Univ. of California, Berkeley, CA (United States). Chemistry Dept.]|[Lawrence Berkeley National Lab., CA (United States). Chemical Sciences Div.
1996-08-01
Quantum Monte Carlo (QMC) has successfully computed the total electronic energies of atoms and molecules. The main goal of this work is to use correlation function quantum Monte Carlo (CFQMC) to compute the vibrational state energies of molecules given a potential energy surface (PES). In CFQMC, an ensemble of random walkers simulate the diffusion and branching processes of the imaginary-time time dependent Schroedinger equation in order to evaluate the matrix elements. The program QMCVIB was written to perform multi-state VMC and CFQMC calculations and employed for several calculations of the H{sub 2}O and C{sub 3} vibrational states, using 7 PES`s, 3 trial wavefunction forms, two methods of non-linear basis function parameter optimization, and on both serial and parallel computers. In order to construct accurate trial wavefunctions different wavefunctions forms were required for H{sub 2}O and C{sub 3}. In order to construct accurate trial wavefunctions for C{sub 3}, the non-linear parameters were optimized with respect to the sum of the energies of several low-lying vibrational states. In order to stabilize the statistical error estimates for C{sub 3} the Monte Carlo data was collected into blocks. Accurate vibrational state energies were computed using both serial and parallel QMCVIB programs. Comparison of vibrational state energies computed from the three C{sub 3} PES`s suggested that a non-linear equilibrium geometry PES is the most accurate and that discrete potential representations may be used to conveniently determine vibrational state energies.
A Monte Carlo approach to water management
Koutsoyiannis, D.
2012-04-01
Common methods for making optimal decisions in water management problems are insufficient. Linear programming methods are inappropriate because hydrosystems are nonlinear with respect to their dynamics, operation constraints and objectives. Dynamic programming methods are inappropriate because water management problems cannot be divided into sequential stages. Also, these deterministic methods cannot properly deal with the uncertainty of future conditions (inflows, demands, etc.). Even stochastic extensions of these methods (e.g. linear-quadratic-Gaussian control) necessitate such drastic oversimplifications of hydrosystems that may make the obtained results irrelevant to the real world problems. However, a Monte Carlo approach is feasible and can form a general methodology applicable to any type of hydrosystem. This methodology uses stochastic simulation to generate system inputs, either unconditional or conditioned on a prediction, if available, and represents the operation of the entire system through a simulation model as faithful as possible, without demanding a specific mathematical form that would imply oversimplifications. Such representation fully respects the physical constraints, while at the same time it evaluates the system operation constraints and objectives in probabilistic terms, and derives their distribution functions and statistics through Monte Carlo simulation. As the performance criteria of a hydrosystem operation will generally be highly nonlinear and highly nonconvex functions of the control variables, a second Monte Carlo procedure, implementing stochastic optimization, is necessary to optimize system performance and evaluate the control variables of the system. The latter is facilitated if the entire representation is parsimonious, i.e. if the number of control variables is kept at a minimum by involving a suitable system parameterization. The approach is illustrated through three examples for (a) a hypothetical system of two reservoirs
Zhang, Xiaofeng
2012-03-01
Image formation in fluorescence diffuse optical tomography is critically dependent on construction of the Jacobian matrix. For clinical and preclinical applications, because of the highly heterogeneous characteristics of the medium, Monte Carlo methods are frequently adopted to construct the Jacobian. Conventional adjoint Monte Carlo method typically compute the Jacobian by multiplying the photon density fields radiated from the source at the excitation wavelength and from the detector at the emission wavelength. Nonetheless, this approach assumes that the source and the detector in Green's function are reciprocal, which is invalid in general. This assumption is particularly questionable in small animal imaging, where the mean free path length of photons is typically only one order of magnitude smaller than the representative dimension of the medium. We propose a new method that does not rely on the reciprocity of the source and the detector by tracing photon propagation entirely from the source to the detector. This method relies on the perturbation Monte Carlo theory to account for the differences in optical properties of the medium at the excitation and the emission wavelengths. Compared to the adjoint methods, the proposed method is more valid in reflecting the physical process of photon transport in diffusive media and is more efficient in constructing the Jacobian matrix for densely sampled configurations.
Validation of variance reduction techniques in Mediso (SPIRIT DH-V) SPECT system by Monte Carlo
International Nuclear Information System (INIS)
Monte Carlo simulation of nuclear medical imaging systems is a widely used method for reproducing their operation in a real clinical environment, There are several Single Photon Emission Tomography (SPECT) systems in Cuba. For this reason it is clearly necessary to introduce a reliable and fast simulation platform in order to obtain consistent image data. This data will reproduce the original measurements conditions. In order to fulfill these requirements Monte Carlo platform GAMOS (Geant4 Medicine Oriented Architecture for Applications) have been used. Due to the very size and complex configuration of parallel hole collimators in real clinical SPECT systems, Monte Carlo simulation usually consumes excessively high time and computing resources. main goal of the present work is to optimize the efficiency of calculation by means of new GAMOS functionality. There were developed and validated two GAMOS variance reduction techniques to speed up calculations. These procedures focus and limit transport of gamma quanta inside the collimator. The obtained results were asses experimentally in Mediso (SPIRIT DH-V) SPECT system. Main quality control parameters, such as sensitivity and spatial resolution were determined. Differences of 4.6% sensitivity and 8.7% spatial resolution were reported against manufacturer values. Simulation time was decreased up to 650 times. Using these techniques it was possible to perform several studies in almost 8 hours each. (Author)
Modulated pulse bathymetric lidar Monte Carlo simulation
Luo, Tao; Wang, Yabo; Wang, Rong; Du, Peng; Min, Xia
2015-10-01
A typical modulated pulse bathymetric lidar system is investigated by simulation using a modulated pulse lidar simulation system. In the simulation, the return signal is generated by Monte Carlo method with modulated pulse propagation model and processed by mathematical tools like cross-correlation and digital filter. Computer simulation results incorporating the modulation detection scheme reveal a significant suppression of the water backscattering signal and corresponding target contrast enhancement. More simulation experiments are performed with various modulation and reception variables to investigate the effect of them on the bathymetric system performance.
Monte Carlo Simulation of an American Option
Directory of Open Access Journals (Sweden)
Gikiri Thuo
2007-04-01
Full Text Available We implement gradient estimation techniques for sensitivity analysis of option pricing which can be efficiently employed in Monte Carlo simulation. Using these techniques we can simultaneously obtain an estimate of the option value together with the estimates of sensitivities of the option value to various parameters of the model. After deriving the gradient estimates we incorporate them in an iterative stochastic approximation algorithm for pricing an option with early exercise features. We illustrate the procedure using an example of an American call option with a single dividend that is analytically tractable. In particular we incorporate estimates for the gradient with respect to the early exercise threshold level.
Monte-Carlo simulations: FLUKA vs. MCNPX
Czech Academy of Sciences Publication Activity Database
Oden, M.; Krása, Antonín; Majerle, Mitja; Svoboda, Ondřej; Wagner, Vladimír
Melville : AMER INST PHYSICS, 2007 - (Granja, C.; Leroy, C.; Štekl, I.), s. 219-221 ISBN 978-0-7354-0472-4. ISSN 0094-243X. - (AIP Conference Proceedings. 958). [4th International Summer School on Nuclear Physics Methods and Accelerators in Biology and Medicine . Praha (CZ), 08.07.2007-19.07.2007] R&D Projects: GA MŠk(CZ) LC07050 Institutional research plan: CEZ:AV0Z10480505 Keywords : neutron production * spallation reaction * Monte-Carlo simulation Subject RIV: BG - Nuclear , Atomic and Molecular Physics, Colliders
The Moment Guided Monte Carlo Method
Degond, Pierre; Pareschi, Lorenzo
2009-01-01
In this work we propose a new approach for the numerical simulation of kinetic equations through Monte Carlo schemes. We introduce a new technique which permits to reduce the variance of particle methods through a matching with a set of suitable macroscopic moment equations. In order to guarantee that the moment equations provide the correct solutions, they are coupled to the kinetic equation through a non equilibrium term. The basic idea, on which the method relies, consists in guiding the particle positions and velocities through moment equations so that the concurrent solution of the moment and kinetic models furnishes the same macroscopic quantities.
Carlos Pereda y la cultura argumental
Eduardo Harada O.
2010-01-01
En este artículo se discute la fenomenología de la atención argumental de Carlos Pereda. Se trata de mostrar que esta fenomenología toma en cuenta todos los aspectos de la argumentación, principalmente, las reglas y virtudes epistémicas que sirven para controlar esta actividad de manera interna así como evitar los vértigos argumentales, además, no sólo estudia a los argumentos o apoyos determinados o deductivos sino, igualmente, a los subdeterminados, pues sostiene que éstos son una parte imp...
Markov chains analytic and Monte Carlo computations
Graham, Carl
2014-01-01
Markov Chains: Analytic and Monte Carlo Computations introduces the main notions related to Markov chains and provides explanations on how to characterize, simulate, and recognize them. Starting with basic notions, this book leads progressively to advanced and recent topics in the field, allowing the reader to master the main aspects of the classical theory. This book also features: Numerous exercises with solutions as well as extended case studies.A detailed and rigorous presentation of Markov chains with discrete time and state space.An appendix presenting probabilistic notions that are nec
Discovering correlated fermions using quantum Monte Carlo.
Wagner, Lucas K; Ceperley, David M
2016-09-01
It has become increasingly feasible to use quantum Monte Carlo (QMC) methods to study correlated fermion systems for realistic Hamiltonians. We give a summary of these techniques targeted at researchers in the field of correlated electrons, focusing on the fundamentals, capabilities, and current status of this technique. The QMC methods often offer the highest accuracy solutions available for systems in the continuum, and, since they address the many-body problem directly, the simulations can be analyzed to obtain insight into the nature of correlated quantum behavior. PMID:27518859
Energy Technology Data Exchange (ETDEWEB)
Marcus, Ryan C. [Los Alamos National Laboratory
2012-07-24
Overview of this presentation is (1) Exascale computing - different technologies, getting there; (2) high-performance proof-of-concept MCMini - features and results; and (3) OpenCL toolkit - Oatmeal (OpenCL Automatic Memory Allocation Library) - purpose and features. Despite driver issues, OpenCL seems like a good, hardware agnostic tool. MCMini demonstrates the possibility for GPGPU-based Monte Carlo methods - it shows great scaling for HPC application and algorithmic equivalence. Oatmeal provides a flexible framework to aid in the development of scientific OpenCL codes.
Monte Carlo methods for applied scientists
Dimov, Ivan T
2007-01-01
The Monte Carlo method is inherently parallel and the extensive and rapid development in parallel computers, computational clusters and grids has resulted in renewed and increasing interest in this method. At the same time there has been an expansion in the application areas and the method is now widely used in many important areas of science including nuclear and semiconductor physics, statistical mechanics and heat and mass transfer. This book attempts to bridge the gap between theory and practice concentrating on modern algorithmic implementation on parallel architecture machines. Although
Variation After Response in Quantum Monte Carlo
Neuscamman, Eric
2016-01-01
We present a new method for modeling electronically excited states that overcomes a key failing of linear response theory by allowing the underlying ground state ansatz to relax in the presence of an excitation. The method is variational, has a cost similar to ground state variational Monte Carlo, and admits both open and periodic boundary conditions. We present preliminary numerical results showing that, when paired with the Jastrow antisymmetric geminal power ansatz, the variation-after-response formalism delivers accuracies for valence and charge transfer single excitations on par with equation of motion coupled cluster, while surpassing even this very high-level method's accuracy for excitations with significant doubly excited character.
Monte Carlo method in radiation transport problems
International Nuclear Information System (INIS)
In neutral radiation transport problems (neutrons, photons), two values are important: the flux in the phase space and the density of particles. To solve the problem with Monte Carlo method leads to, among other things, build a statistical process (called the play) and to provide a numerical value to a variable x (this attribution is called score). Sampling techniques are presented. Play biasing necessity is proved. A biased simulation is made. At last, the current developments (rewriting of programs for instance) are presented due to several reasons: two of them are the vectorial calculation apparition and the photon and neutron transport in vacancy media
Introduction to Monte-Carlo method
International Nuclear Information System (INIS)
We recall first some well known facts about random variables and sampling. Then we define the Monte-Carlo method in the case where one wants to compute a given integral. Afterwards, we ship to discrete Markov chains for which we define random walks, and apply to finite difference approximations of diffusion equations. Finally we consider Markov chains with continuous state (but discrete time), transition probabilities and random walks, which are the main piece of this work. The applications are: diffusion and advection equations, and the linear transport equation with scattering
Monte Carlo simulation of block copolymer brushes
International Nuclear Information System (INIS)
We studied a simplified model of a polymer brush formed by linear chains, which were restricted to a simple cubic lattice. The chain macromolecules consisted of a sequence of two kinds of segment, arranged in a specific sequence. The chains were grafted to an impenetrable surface, i.e. they were terminally attached to the surface at one end. The number of chains was varied from low to high grafting density. The model system was studied under different solvent quality, from good to poor solvent. The properties of this model system were studied by means of Monte Carlo simulations. The sampling algorithm was based on local changes of the chain's conformations
Discovering correlated fermions using quantum Monte Carlo
Wagner, Lucas K.; Ceperley, David M.
2016-09-01
It has become increasingly feasible to use quantum Monte Carlo (QMC) methods to study correlated fermion systems for realistic Hamiltonians. We give a summary of these techniques targeted at researchers in the field of correlated electrons, focusing on the fundamentals, capabilities, and current status of this technique. The QMC methods often offer the highest accuracy solutions available for systems in the continuum, and, since they address the many-body problem directly, the simulations can be analyzed to obtain insight into the nature of correlated quantum behavior.
IN MEMORIAM CARLOS RESTREPO. UN VERDADERO MAESTRO
Pelayo Correa
2009-01-01
Carlos Restrepo fue el primer profesor de Patología y un miembro ilustre del grupo de pioneros que fundaron la Facultad de Medicina de la Universidad del Valle. Estos pioneros convergieron en Cali en la década de 1950, en posesión de un espíritu renovador y creativo que emprendió con mucho éxito la labor de cambiar la cultura académica del Valle del Cauca. Ellos encontraron una sociedad apacible, que disfrutaba de la generosidad de su entorno, sin deseos de romper las tradiciones centenarias ...
Monte Carlo modelling for neutron guide losses
International Nuclear Information System (INIS)
In modern research reactors, neutron guides are commonly used for beam conducting. The neutron guide is a well polished or equivalently smooth glass tube covered inside by sputtered or evaporated film of natural Ni or 58Ni isotope where the neutrons are totally reflected. A Monte Carlo calculation was carried out to establish the real efficiency and the spectral as well as spatial distribution of the neutron beam at the end of a glass mirror guide. The losses caused by mechanical inaccuracy and mirror quality were considered and the effects due to the geometrical arrangement were analyzed. (author) 2 refs.; 2 figs
Kinetic Monte Carlo simulation of dislocation dynamics
International Nuclear Information System (INIS)
A kinetic Monte Carlo simulation of dislocation motion is introduced. The dislocations are assumed to be composed of pure edge and screw segments confined to a fixed lattice. The stress and temperature dependence of the dislocation velocity is studied, and finite-size effects are discussed. It is argued that surfaces and boundaries may play a significant role in the velocity of dislocations. The simulated dislocations are shown to display kinetic roughening according to the exponents predicted by the Kardar-Parisi-Zhang equation. copyright 1999 The American Physical Society
Monte Carlo Simulation of Quantum Computation
Cerf, N. J.; Koonin, S. E.
1997-01-01
The many-body dynamics of a quantum computer can be reduced to the time evolution of non-interacting quantum bits in auxiliary fields by use of the Hubbard-Stratonovich representation of two-bit quantum gates in terms of one-bit gates. This makes it possible to perform the stochastic simulation of a quantum algorithm, based on the Monte Carlo evaluation of an integral of dimension polynomial in the number of quantum bits. As an example, the simulation of the quantum circuit for the Fast Fouri...
Monte Carlo simulation for Kaonic deuterium studies
International Nuclear Information System (INIS)
Full text: The SIDDHARTA experiment at the DAFNE collider measured the shift and with of the ground level in kaonic hydrogen caused by the strong interaction between the kaons and protons. The measurement of the X-ray transitions to the 1s level in kaonic deuterium will allow, together with the available results from kaonic hydrogen, to extract the isospin- dependent antikaon-nucleon scattering lengths. I will present the Monte Carlo simulation of the SIDDHARTA-2 setup, in the framework of GEANT4. The program is used to optimize the critical parameters of the setup in order to perform the kaonic deuterium measurement. (author)
Monte Carlo simulations for heavy ion dosimetry
Geithner, Oksana
2006-01-01
Water-to-air stopping power ratio ( ) calculations for the ionization chamber dosimetry of clinically relevant ion beams with initial energies from 50 to 450 MeV/u have been performed using the Monte Carlo technique. To simulate the transport of a particle in water the computer code SHIELD-HIT v2 was used which is a substantially modified version of its predecessor SHIELD-HIT v1. The code was partially rewritten, replacing formerly used single precision variables with double precision variabl...
by means of FLUKA Monte Carlo method
Directory of Open Access Journals (Sweden)
Ermis Elif Ebru
2015-01-01
Full Text Available Calculations of gamma-ray mass attenuation coefficients of various detector materials (crystals were carried out by means of FLUKA Monte Carlo (MC method at different gamma-ray energies. NaI, PVT, GSO, GaAs and CdWO4 detector materials were chosen in the calculations. Calculated coefficients were also compared with the National Institute of Standards and Technology (NIST values. Obtained results through this method were highly in accordance with those of the NIST values. It was concluded from the study that FLUKA MC method can be an alternative way to calculate the gamma-ray mass attenuation coefficients of the detector materials.
Archimedes, the Free Monte Carlo simulator
Sellier, Jean Michel D
2012-01-01
Archimedes is the GNU package for Monte Carlo simulations of electron transport in semiconductor devices. The first release appeared in 2004 and since then it has been improved with many new features like quantum corrections, magnetic fields, new materials, GUI, etc. This document represents the first attempt to have a complete manual. Many of the Physics models implemented are described and a detailed description is presented to make the user able to write his/her own input deck. Please, feel free to contact the author if you want to contribute to the project.
BOOK REVIEW: Therapeutic Applications of Monte Carlo Calculations in Nuclear Medicine
Coulot, J.
2003-08-01
H Zaidi and G Sgouros (eds) Bristol: Institute of Physics Publishing (2002) £70.00, ISBN: 0750308168 Monte Carlo techniques are involved in many applications in medical physics, and the field of nuclear medicine has seen a great development in the past ten years due to their wider use. Thus, it is of great interest to look at the state of the art in this domain, when improving computer performances allow one to obtain improved results in a dramatically reduced time. The goal of this book is to make, in 15 chapters, an exhaustive review of the use of Monte Carlo techniques in nuclear medicine, also giving key features which are not necessary directly related to the Monte Carlo method, but mandatory for its practical application. As the book deals with `therapeutic' nuclear medicine, it focuses on internal dosimetry. After a general introduction on Monte Carlo techniques and their applications in nuclear medicine (dosimetry, imaging and radiation protection), the authors give an overview of internal dosimetry methods (formalism, mathematical phantoms, quantities of interest). Then, some of the more widely used Monte Carlo codes are described, as well as some treatment planning softwares. Some original techniques are also mentioned, such as dosimetry for boron neutron capture synovectomy. It is generally well written, clearly presented, and very well documented. Each chapter gives an overview of each subject, and it is up to the reader to investigate it further using the extensive bibliography provided. Each topic is discussed from a practical point of view, which is of great help for non-experienced readers. For instance, the chapter about mathematical aspects of Monte Carlo particle transport is very clear and helps one to apprehend the philosophy of the method, which is often a difficulty with a more theoretical approach. Each chapter is put in the general (clinical) context, and this allows the reader to keep in mind the intrinsic limitation of each technique
Simulation of Cone Beam CT System Based on Monte Carlo Method
Wang, Yu; Cao, Ruifen; Hu, Liqin; Li, Bingbing
2014-01-01
Adaptive Radiation Therapy (ART) was developed based on Image-guided Radiation Therapy (IGRT) and it is the trend of photon radiation therapy. To get a better use of Cone Beam CT (CBCT) images for ART, the CBCT system model was established based on Monte Carlo program and validated against the measurement. The BEAMnrc program was adopted to the KV x-ray tube. Both IOURCE-13 and ISOURCE-24 were chosen to simulate the path of beam particles. The measured Percentage Depth Dose (PDD) and lateral dose profiles under 1cm water were compared with the dose calculated by DOSXYZnrc program. The calculated PDD was better than 1% within the depth of 10cm. More than 85% points of calculated lateral dose profiles was within 2%. The correct CBCT system model helps to improve CBCT image quality for dose verification in ART and assess the CBCT image concomitant dose risk.
Monte Carlo Hamiltonian: Generalization to Quantum Field Theory
Luo, Xiang-Qian; Jirari, H.; Kroger, H; Moriarty, K.
2001-01-01
Monte Carlo techniques with importance sampling have been extensively applied to lattice gauge theory in the Lagrangian formulation. Unfortunately, it is extremely difficult to compute the excited states using the conventional Monte Carlo algorithm. Our recently developed approach: the Monte Carlo Hamiltonian method, has been designed to overcome the difficulties of the conventional approach. In this paper, we extend the method to many body systems and quantum field theory. The Klein-Gordon f...
Alternative Monte Carlo Approach for General Global Illumination
Institute of Scientific and Technical Information of China (English)
徐庆; 李朋; 徐源; 孙济洲
2004-01-01
An alternative Monte Carlo strategy for the computation of global illumination problem was presented.The proposed approach provided a new and optimal way for solving Monte Carlo global illumination based on the zero variance importance sampling procedure. A new importance driven Monte Carlo global illumination algorithm in the framework of the new computing scheme was developed and implemented. Results, which were obtained by rendering test scenes, show that this new framework and the newly derived algorithm are effective and promising.
Unbiased combinations of nonanalog Monte Carlo techniques and fair games
International Nuclear Information System (INIS)
Historically, Monte Carlo variance reduction techniques have developed one at a time in response to calculational needs. This paper provides the theoretical basis for obtaining unbiased Monte Carlo estimates from all possible combinations of variance reduction techniques. Hitherto, the techniques have not been proven to be unbiased in arbitrary combinations. The authors are unaware of any Monte Carlo techniques (in any linear process) that are not treated by the theorem herein. (author)
Temperature variance study in Monte-Carlo photon transport theory
International Nuclear Information System (INIS)
We study different Monte-Carlo methods for solving radiative transfer problems, and particularly Fleck's Monte-Carlo method. We first give the different time-discretization schemes and the corresponding stability criteria. Then we write the temperature variance as a function of the variances of temperature and absorbed energy at the previous time step. Finally we obtain some stability criteria for the Monte-Carlo method in the stationary case
Monte Carlo likelihood inference for missing data models
Sung, Yun Ju; Geyer, Charles J.
2007-01-01
We describe a Monte Carlo method to approximate the maximum likelihood estimate (MLE), when there are missing data and the observed data likelihood is not available in closed form. This method uses simulated missing data that are independent and identically distributed and independent of the observed data. Our Monte Carlo approximation to the MLE is a consistent and asymptotically normal estimate of the minimizer θ* of the Kullback–Leibler information, as both Monte Carlo and observed data sa...
MontePython: Implementing Quantum Monte Carlo using Python
J.K. Nilsen
2006-01-01
We present a cross-language C++/Python program for simulations of quantum mechanical systems with the use of Quantum Monte Carlo (QMC) methods. We describe a system for which to apply QMC, the algorithms of variational Monte Carlo and diffusion Monte Carlo and we describe how to implement theses methods in pure C++ and C++/Python. Furthermore we check the efficiency of the implementations in serial and parallel cases to show that the overhead using Python can be negligible.
Validation of Compton Scattering Monte Carlo Simulation Models
Weidenspointner, Georg; Hauf, Steffen; Hoff, Gabriela; Kuster, Markus; Pia, Maria Grazia; Saracco, Paolo
2014-01-01
Several models for the Monte Carlo simulation of Compton scattering on electrons are quantitatively evaluated with respect to a large collection of experimental data retrieved from the literature. Some of these models are currently implemented in general purpose Monte Carlo systems; some have been implemented and evaluated for possible use in Monte Carlo particle transport for the first time in this study. Here we present first and preliminary results concerning total and differential Compton scattering cross sections.
Combinatorial nuclear level density by a Monte Carlo method
Cerf, N.
1993-01-01
We present a new combinatorial method for the calculation of the nuclear level density. It is based on a Monte Carlo technique, in order to avoid a direct counting procedure which is generally impracticable for high-A nuclei. The Monte Carlo simulation, making use of the Metropolis sampling scheme, allows a computationally fast estimate of the level density for many fermion systems in large shell model spaces. We emphasize the advantages of this Monte Carlo approach, particularly concerning t...
Discrete diffusion Monte Carlo for frequency-dependent radiative transfer
Energy Technology Data Exchange (ETDEWEB)
Densmore, Jeffrey D [Los Alamos National Laboratory; Kelly, Thompson G [Los Alamos National Laboratory; Urbatish, Todd J [Los Alamos National Laboratory
2010-11-17
Discrete Diffusion Monte Carlo (DDMC) is a technique for increasing the efficiency of Implicit Monte Carlo radiative-transfer simulations. In this paper, we develop an extension of DDMC for frequency-dependent radiative transfer. We base our new DDMC method on a frequency-integrated diffusion equation for frequencies below a specified threshold. Above this threshold we employ standard Monte Carlo. With a frequency-dependent test problem, we confirm the increased efficiency of our new DDMC technique.
Challenges and prospects for whole-coremonte Carlo analysis
International Nuclear Information System (INIS)
The advantages for using Monte Carlo methods to analyze full-core reactor configurations include essentially exact representation of geometry and physical phenomena that are important for reactor analysis. But this substantial advantage comes at a substantial cost because of the computational burden, both in terms of memory demand and computational time. This paper focuses on the challenges facing full-core Monte Carlo for keff calculations and the prospects for Monte Carlo becoming a routine tool for reactor analysis.
Neutron transport calculations using Quasi-Monte Carlo methods
Energy Technology Data Exchange (ETDEWEB)
Moskowitz, B.S.
1997-07-01
This paper examines the use of quasirandom sequences of points in place of pseudorandom points in Monte Carlo neutron transport calculations. For two simple demonstration problems, the root mean square error, computed over a set of repeated runs, is found to be significantly less when quasirandom sequences are used ({open_quotes}Quasi-Monte Carlo Method{close_quotes}) than when a standard Monte Carlo calculation is performed using only pseudorandom points.
FastDIRC: a fast Monte Carlo and reconstruction algorithm for DIRC detectors
Hardin, John
2016-01-01
FastDIRC is a novel fast Monte Carlo and reconstruction algorithm for DIRC detectors. A DIRC employs rectangular fused-silica bars both as Cherenkov radiators and as light guides. Cherenkov-photon imaging and time-of-propagation information are utilized by a DIRC to identify charged particles. GEANT-based DIRC Monte Carlo simulations are extremely CPU intensive. The FastDIRC algorithm permits fully simulating a DIRC detector more than 10000 times faster than using GEANT. This facilitates designing a DIRC-reconstruction algorithm that improves the Cherenkov-angle resolution of a DIRC detector by about 30% compared to existing algorithms. FastDIRC also greatly reduces the time required to study competing DIRC-detector designs.
Institute of Scientific and Technical Information of China (English)
Dong CHEN; Yongping LEI; Xiaoyan LI; Yaowu SHI; Zhiling TIAN
2003-01-01
In the present research Monte Carlo technique was used to simulate the grain growth in heat-affected zone(HAZ) of an ultrafine grain steel. An experimental data based (EBD) model proposed by Gao was used to establish the relation between tMCS and real time temperature kinetics in our simulation. The simulations give out the evolution of grain structure and grain size distribution in HAZ of the ultrafine grain steel. A Microsoft Window based on computer program for the simulation of grain growth in the HAZ of weldment in three dimensions has been developed using Monte Carlo technique. For the system, inputting the temperature field data and material properties, the evolution of grain structure, both image of simulated grain structure and numerical datum reflecting grain size distribution can be produced by the program. The system was applied to the ultrafine grain steel welding, and the simulated results show that the ultrafine grain steel has large tendency of grain growth.
Information Geometry and Sequential Monte Carlo
Sim, Aaron; Stumpf, Michael P H
2012-01-01
This paper explores the application of methods from information geometry to the sequential Monte Carlo (SMC) sampler. In particular the Riemannian manifold Metropolis-adjusted Langevin algorithm (mMALA) is adapted for the transition kernels in SMC. Similar to its function in Markov chain Monte Carlo methods, the mMALA is a fully adaptable kernel which allows for efficient sampling of high-dimensional and highly correlated parameter spaces. We set up the theoretical framework for its use in SMC with a focus on the application to the problem of sequential Bayesian inference for dynamical systems as modelled by sets of ordinary differential equations. In addition, we argue that defining the sequence of distributions on geodesics optimises the effective sample sizes in the SMC run. We illustrate the application of the methodology by inferring the parameters of simulated Lotka-Volterra and Fitzhugh-Nagumo models. In particular we demonstrate that compared to employing a standard adaptive random walk kernel, the SM...
Quantum Monte Carlo for atoms and molecules
International Nuclear Information System (INIS)
The diffusion quantum Monte Carlo with fixed nodes (QMC) approach has been employed in studying energy-eigenstates for 1--4 electron systems. Previous work employing the diffusion QMC technique yielded energies of high quality for H2, LiH, Li2, and H2O. Here, the range of calculations with this new approach has been extended to include additional first-row atoms and molecules. In addition, improvements in the previously computed fixed-node energies of LiH, Li2, and H2O have been obtained using more accurate trial functions. All computations were performed within, but are not limited to, the Born-Oppenheimer approximation. In our computations, the effects of variation of Monte Carlo parameters on the QMC solution of the Schroedinger equation were studied extensively. These parameters include the time step, renormalization time and nodal structure. These studies have been very useful in determining which choices of such parameters will yield accurate QMC energies most efficiently. Generally, very accurate energies (90--100% of the correlation energy is obtained) have been computed with single-determinant trail functions multiplied by simple correlation functions. Improvements in accuracy should be readily obtained using more complex trial functions
High performance computing&Monte Carlo
Energy Technology Data Exchange (ETDEWEB)
Brown, F. B. (Forrest B.); Martin, W. R. (William R.)
2004-01-01
High performance computing (HPC), used for the most demanding computational problems, has evolved from single processor custom systems in the 1960s and 1970s, to vector processors in the 1980s, to parallel processors in the 1990s, to clusters of commodity processors in the 2000s. Performance/price has increased by a factor of more than I million over that time, so that today's desktop PC is more powerful than yesterday's supercomputer. With the introduction of inexpensive Linux clusters and the standardization of parallel software through MPI and OpenMP, parallel computing is now widespread and available to everyone. Monte Carlo codes for particle transport are especially well-positioned to take advantage of accessible parallel computing, due to the inherently parallel nature of the computational algorithm. We review Monte Carlo particle parallelism, including the basic algorithm, load-balancing, fault tolerance, and scaling, using MCNP5 as an example. Due to memory limitations, especially on single nodes of Linux clusters, domain decomposition has been tried, with partial success. We conclude with a new scheme, data decomposition, which holds promise for very large problems.
Monte Carlo generators in ATLAS software
International Nuclear Information System (INIS)
This document describes how Monte Carlo (MC) generators can be used in the ATLAS software framework (Athena). The framework is written in C++ using Python scripts for job configuration. Monte Carlo generators that provide the four-vectors describing the results of LHC collisions are written in general by third parties and are not part of Athena. These libraries are linked from the LCG Generator Services (GENSER) distribution. Generators are run from within Athena and the generated event output is put into a transient store, in HepMC format, using StoreGate. A common interface, implemented via inheritance of a GeneratorModule class, guarantees common functionality for the basic generation steps. The generator information can be accessed and manipulated by helper packages like TruthHelper. The ATLAS detector simulation as well access the truth information from StoreGate1. Steering is done through specific interfaces to allow for flexible configuration using ATLAS Python scripts. Interfaces to most general purpose generators, including: Pythia6, Pythia8, Herwig, Herwig++ and Sherpa are provided, as well as to more specialized packages, for example Phojet and Cascade. A second type of interface exist for the so called Matrix Element generators that only generate the particles produced in the hard scattering process and write events in the Les Houches event format. A generic interface to pass these events to Pythia6 and Herwig for parton showering and hadronisation has been written.
Monte Carlo calculations in lattice gauge theories
International Nuclear Information System (INIS)
This paper covers the following: a few words of motivation for numerical simulations, and a description of the Monte Carlo method as applied to lattice gauge theories. This is followed by a discussion of systems that contain bosonic degrees of freedom only. The authors review Monte Carlo results for pure gauge systems, illustrating the determination of a variety of observables - the string tension, the potential, the temperature at which quarks become deconfined, and attempts to calculate the mass gap of the theory, also called the glue-ball mass. They try to explain what happens if one considers various types of the action, how one verifies universality in the passage to the continuum limit and we mention briefly simulations applied to systems that go beyond just gauge fields and include other bosonic fields, known in general as Higgs scalars. Finally they consider fermions on the lattice, pointing out conceptual problems in the formulation of the Dirac equation on the lattice, and then discussing the difficulties that arise in attempting to apply the same kind of numerical methods to fermionic systems, the approximations and the techniques that are used to overcome these problems and some of the numerical results
Feedback-optimized parallel tempering Monte Carlo
Katzgraber, Helmut G.; Trebst, Simon; Huse, David A.; Troyer, Matthias
2006-03-01
We introduce an algorithm for systematically improving the efficiency of parallel tempering Monte Carlo simulations by optimizing the simulated temperature set. Our approach is closely related to a recently introduced adaptive algorithm that optimizes the simulated statistical ensemble in generalized broad-histogram Monte Carlo simulations. Conventionally, a temperature set is chosen in such a way that the acceptance rates for replica swaps between adjacent temperatures are independent of the temperature and large enough to ensure frequent swaps. In this paper, we show that by choosing the temperatures with a modified version of the optimized ensemble feedback method we can minimize the round-trip times between the lowest and highest temperatures which effectively increases the efficiency of the parallel tempering algorithm. In particular, the density of temperatures in the optimized temperature set increases at the 'bottlenecks' of the simulation, such as phase transitions. In turn, the acceptance rates are now temperature dependent in the optimized temperature ensemble. We illustrate the feedback-optimized parallel tempering algorithm by studying the two-dimensional Ising ferromagnet and the two-dimensional fully frustrated Ising model, and briefly discuss possible feedback schemes for systems that require configurational averages, such as spin glasses.
The MCNPX Monte Carlo Radiation Transport Code
International Nuclear Information System (INIS)
MCNPX (Monte Carlo N-Particle eXtended) is a general-purpose Monte Carlo radiation transport code with three-dimensional geometry and continuous-energy transport of 34 particles and light ions. It contains flexible source and tally options, interactive graphics, and support for both sequential and multi-processing computer platforms. MCNPX is based on MCNP4c and has been upgraded to most MCNP5 capabilities. MCNP is a highly stable code tracking neutrons, photons and electrons, and using evaluated nuclear data libraries for low-energy interaction probabilities. MCNPX has extended this base to a comprehensive set of particles and light ions, with heavy ion transport in development. Models have been included to calculate interaction probabilities when libraries are not available. Recent additions focus on the time evolution of residual nuclei decay, allowing calculation of transmutation and delayed particle emission. MCNPX is now a code of great dynamic range, and the excellent neutronics capabilities allow new opportunities to simulate devices of interest to experimental particle physics, particularly calorimetry. This paper describes the capabilities of the current MCNPX version 2.6.C, and also discusses ongoing code development
THE MCNPX MONTE CARLO RADIATION TRANSPORT CODE
Energy Technology Data Exchange (ETDEWEB)
WATERS, LAURIE S. [Los Alamos National Laboratory; MCKINNEY, GREGG W. [Los Alamos National Laboratory; DURKEE, JOE W. [Los Alamos National Laboratory; FENSIN, MICHAEL L. [Los Alamos National Laboratory; JAMES, MICHAEL R. [Los Alamos National Laboratory; JOHNS, RUSSELL C. [Los Alamos National Laboratory; PELOWITZ, DENISE B. [Los Alamos National Laboratory
2007-01-10
MCNPX (Monte Carlo N-Particle eXtended) is a general-purpose Monte Carlo radiation transport code with three-dimensional geometry and continuous-energy transport of 34 particles and light ions. It contains flexible source and tally options, interactive graphics, and support for both sequential and multi-processing computer platforms. MCNPX is based on MCNP4B, and has been upgraded to most MCNP5 capabilities. MCNP is a highly stable code tracking neutrons, photons and electrons, and using evaluated nuclear data libraries for low-energy interaction probabilities. MCNPX has extended this base to a comprehensive set of particles and light ions, with heavy ion transport in development. Models have been included to calculate interaction probabilities when libraries are not available. Recent additions focus on the time evolution of residual nuclei decay, allowing calculation of transmutation and delayed particle emission. MCNPX is now a code of great dynamic range, and the excellent neutronics capabilities allow new opportunities to simulate devices of interest to experimental particle physics; particularly calorimetry. This paper describes the capabilities of the current MCNPX version 2.6.C, and also discusses ongoing code development.
Reactor perturbation calculations by Monte Carlo methods
International Nuclear Information System (INIS)
Whilst Monte Carlo methods are useful for reactor calculations involving complicated geometry, it is difficult to apply them to the calculation of perturbation worths because of the large amount of computing time needed to obtain good accuracy. Various ways of overcoming these difficulties are investigated in this report, with the problem of estimating absorbing control rod worths particularly in mind. As a basis for discussion a method of carrying out multigroup reactor calculations by Monte Carlo methods is described. Two methods of estimating a perturbation worth directly, without differencing two quantities of like magnitude, are examined closely but are passed over in favour of a third method based on a correlation technique. This correlation method is described, and demonstrated by a limited range of calculations for absorbing control rods in a fast reactor. In these calculations control rod worths of between 1% and 7% in reactivity are estimated to an accuracy better than 10% (3 standard errors) in about one hour's computing time on the English Electric KDF.9 digital computer. (author)
Bayesian segmentation of hyperspectral images
Mohammadpour, Adel; Mohammad-Djafari, Ali
2007-01-01
In this paper we consider the problem of joint segmentation of hyperspectral images in the Bayesian framework. The proposed approach is based on a Hidden Markov Modeling (HMM) of the images with common segmentation, or equivalently with common hidden classification label variables which is modeled by a Potts Markov Random Field. We introduce an appropriate Markov Chain Monte Carlo (MCMC) algorithm to implement the method and show some simulation results.
Bayesian segmentation of hyperspectral images
Mohammadpour, Adel; Féron, Olivier; Mohammad-Djafari, Ali
2004-11-01
In this paper we consider the problem of joint segmentation of hyperspectral images in the Bayesian framework. The proposed approach is based on a Hidden Markov Modeling (HMM) of the images with common segmentation, or equivalently with common hidden classification label variables which is modeled by a Potts Markov Random Field. We introduce an appropriate Markov Chain Monte Carlo (MCMC) algorithm to implement the method and show some simulation results.
Monte Carlo techniques for analyzing deep-penetration problems
International Nuclear Information System (INIS)
Current methods and difficulties in Monte Carlo deep-penetration calculations are reviewed, including statistical uncertainty and recent adjoint optimization of splitting, Russian roulette, and exponential transformation biasing. Other aspects of the random walk and estimation processes are covered, including the relatively new DXANG angular biasing technique. Specific items summarized are albedo scattering, Monte Carlo coupling techniques with discrete ordinates and other methods, adjoint solutions, and multigroup Monte Carlo. The topic of code-generated biasing parameters is presented, including the creation of adjoint importance functions from forward calculations. Finally, current and future work in the area of computer learning and artificial intelligence is discussed in connection with Monte Carlo applications
Quantum Monte Carlo Endstation for Petascale Computing
Energy Technology Data Exchange (ETDEWEB)
Lubos Mitas
2011-01-26
NCSU research group has been focused on accomplising the key goals of this initiative: establishing new generation of quantum Monte Carlo (QMC) computational tools as a part of Endstation petaflop initiative for use at the DOE ORNL computational facilities and for use by computational electronic structure community at large; carrying out high accuracy quantum Monte Carlo demonstration projects in application of these tools to the forefront electronic structure problems in molecular and solid systems; expanding the impact of QMC methods and approaches; explaining and enhancing the impact of these advanced computational approaches. In particular, we have developed quantum Monte Carlo code (QWalk, www.qwalk.org) which was significantly expanded and optimized using funds from this support and at present became an actively used tool in the petascale regime by ORNL researchers and beyond. These developments have been built upon efforts undertaken by the PI's group and collaborators over the period of the last decade. The code was optimized and tested extensively on a number of parallel architectures including petaflop ORNL Jaguar machine. We have developed and redesigned a number of code modules such as evaluation of wave functions and orbitals, calculations of pfaffians and introduction of backflow coordinates together with overall organization of the code and random walker distribution over multicore architectures. We have addressed several bottlenecks such as load balancing and verified efficiency and accuracy of the calculations with the other groups of the Endstation team. The QWalk package contains about 50,000 lines of high quality object-oriented C++ and includes also interfaces to data files from other conventional electronic structure codes such as Gamess, Gaussian, Crystal and others. This grant supported PI for one month during summers, a full-time postdoc and partially three graduate students over the period of the grant duration, it has resulted in 13
Acceleration of GATE Monte Carlo simulations
De Beenhouwer, Jan
2008-01-01
Positron Emission Tomography (PET) and Single Photon Emission Computed Tomography are forms of medical imaging that produce functional images that reflect biological processes. They are based on the tracer principle. A biologically active substance, a pharmaceutical, is selected so that its spatial and temporal distribution in the body reflects a certain body function or metabolism. In order to form images of the distribution, the pharmaceutical is labeled with gamma-ray-emitting or positron-...
Morse Monte Carlo Radiation Transport Code System
Energy Technology Data Exchange (ETDEWEB)
Emmett, M.B.
1975-02-01
The report contains sections containing descriptions of the MORSE and PICTURE codes, input descriptions, sample problems, deviations of the physical equations and explanations of the various error messages. The MORSE code is a multipurpose neutron and gamma-ray transport Monte Carlo code. Time dependence for both shielding and criticality problems is provided. General three-dimensional geometry may be used with an albedo option available at any material surface. The PICTURE code provide aid in preparing correct input data for the combinatorial geometry package CG. It provides a printed view of arbitrary two-dimensional slices through the geometry. By inspecting these pictures one may determine if the geometry specified by the input cards is indeed the desired geometry. 23 refs. (WRF)
Computation cluster for Monte Carlo calculations
International Nuclear Information System (INIS)
Two computation clusters based on Rocks Clusters 5.1 Linux distribution with Intel Core Duo and Intel Core Quad based computers were made at the Department of the Nuclear Physics and Technology. Clusters were used for Monte Carlo calculations, specifically for MCNP calculations applied in Nuclear reactor core simulations. Optimization for computation speed was made on hardware and software basis. Hardware cluster parameters, such as size of the memory, network speed, CPU speed, number of processors per computation, number of processors in one computer were tested for shortening the calculation time. For software optimization, different Fortran compilers, MPI implementations and CPU multi-core libraries were tested. Finally computer cluster was used in finding the weighting functions of neutron ex-core detectors of VVER-440. (authors)
Atomistic Monte Carlo simulation of lipid membranes
DEFF Research Database (Denmark)
Wüstner, Daniel; Sklenar, Heinz
2014-01-01
molecule, as assessed by calculation of molecular energies and entropies. We also show transition from a crystalline-like to a fluid DPPC bilayer by the CBC local-move MC method, as indicated by the electron density profile, head group orientation, area per lipid, and whole-lipid displacements. We discuss......Biological membranes are complex assemblies of many different molecules of which analysis demands a variety of experimental and computational approaches. In this article, we explain challenges and advantages of atomistic Monte Carlo (MC) simulation of lipid membranes. We provide an introduction...... into the various move sets that are implemented in current MC methods for efficient conformational sampling of lipids and other molecules. In the second part, we demonstrate for a concrete example, how an atomistic local-move set can be implemented for MC simulations of phospholipid monomers and...
Monte Carlo modeling and meteor showers
International Nuclear Information System (INIS)
Prediction of short lived increases in the cosmic dust influx, the concentration in lower thermosphere of atoms and ions of meteor origin and the determination of the frequency of micrometeor impacts on spacecraft are all of scientific and practical interest and all require adequate models of meteor showers at an early stage of their existence. A Monte Carlo model of meteor matter ejection from a parent body at any point of space was worked out by other researchers. This scheme is described. According to the scheme, the formation of ten well known meteor streams was simulated and the possibility of genetic affinity of each of them with the most probable parent comet was analyzed. Some of the results are presented
Monte Carlo modeling and meteor showers
Kulikova, N. V.
1987-08-01
Prediction of short lived increases in the cosmic dust influx, the concentration in lower thermosphere of atoms and ions of meteor origin and the determination of the frequency of micrometeor impacts on spacecraft are all of scientific and practical interest and all require adequate models of meteor showers at an early stage of their existence. A Monte Carlo model of meteor matter ejection from a parent body at any point of space was worked out by other researchers. This scheme is described. According to the scheme, the formation of ten well known meteor streams was simulated and the possibility of genetic affinity of each of them with the most probable parent comet was analyzed. Some of the results are presented.
Monte Carlo Exploration of Warped Higgsless Models
Hewett, J L; Rizzo, T G
2004-01-01
We have performed a detailed Monte Carlo exploration of the parameter space for a warped Higgsless model of electroweak symmetry breaking in 5 dimensions. This model is based on the $SU(2)_L\\times SU(2)_R\\times U(1)_{B-L}$ gauge group in an AdS$_5$ bulk with arbitrary gauge kinetic terms on both the Planck and TeV branes. Constraints arising from precision electroweak measurements and collider data are found to be relatively easy to satisfy. We show, however, that the additional requirement of perturbative unitarity up to the cut-off, $\\simeq 10$ TeV, in $W_L^+W_L^-$ elastic scattering in the absence of dangerous tachyons eliminates all models. If successful models of this class exist, they must be highly fine-tuned.
MORSE Monte Carlo radiation transport code system
International Nuclear Information System (INIS)
This report is an addendum to the MORSE report, ORNL-4972, originally published in 1975. This addendum contains descriptions of several modifications to the MORSE Monte Carlo Code, replacement pages containing corrections, Part II of the report which was previously unpublished, and a new Table of Contents. The modifications include a Klein Nishina estimator for gamma rays. Use of such an estimator required changing the cross section routines to process pair production and Compton scattering cross sections directly from ENDF tapes and writing a new version of subroutine RELCOL. Another modification is the use of free form input for the SAMBO analysis data. This required changing subroutines SCORIN and adding new subroutine RFRE. References are updated, and errors in the original report have been corrected
Monte Carlo and detector simulation in OOP
International Nuclear Information System (INIS)
Object-Oriented Programming techniques are explored with an eye towards applications in High Energy Physics codes. Two prototype examples are given: MCOOP (a particle Monte Carlo generator) and GISMO (a detector simulation/analysis package). The OOP programmer does no explicit or detailed memory management nor other bookkeeping chores; hence, the writing, modification, and extension of the code is considerably simplified. Inheritance can be used to simplify the class definitions as well as the instance variables and action methods of each class; thus the work required to add new classes, parameters, or new methods is minimal. The software industry is moving rapidly to OOP since it has been proven to improve programmer productivity, and promises even more for the future by providing truly reusable software. The High Energy Physics community clearly needs to follow this trend
Variable length trajectory compressible hybrid Monte Carlo
Nishimura, Akihiko
2016-01-01
Hybrid Monte Carlo (HMC) generates samples from a prescribed probability distribution in a configuration space by simulating Hamiltonian dynamics, followed by the Metropolis (-Hastings) acceptance/rejection step. Compressible HMC (CHMC) generalizes HMC to a situation in which the dynamics is reversible but not necessarily Hamiltonian. This article presents a framework to further extend the algorithm. Within the existing framework, each trajectory of the dynamics must be integrated for the same amount of (random) time to generate a valid Metropolis proposal. Our generalized acceptance/rejection mechanism allows a more deliberate choice of the integration time for each trajectory. The proposed algorithm in particular enables an effective application of variable step size integrators to HMC-type sampling algorithms based on reversible dynamics. The potential of our framework is further demonstrated by another extension of HMC which reduces the wasted computations due to unstable numerical approximations and corr...
Monte Carlo stratified source-sampling
International Nuclear Information System (INIS)
In 1995, at a conference on criticality safety, a special session was devoted to the Monte Carlo open-quotes eigenvalue of the worldclose quotes problem. Argonne presented a paper, at that session, in which the anomalies originally observed in that problem were reproduced in a much simplified model-problem configuration, and removed by a version of stratified source-sampling. The original test-problem was treated by a special code designed specifically for that purpose. Recently ANL started work on a method for dealing with more realistic eigenvalue of the world configurations, and has been incorporating this method into VIM. The original method has been modified to take into account real-world statistical noise sources not included in the model problem. This paper constitutes a status report on work still in progress
Response decomposition with Monte Carlo correlated coupling
International Nuclear Information System (INIS)
Particle histories that contribute to a detector response are categorized according to whether they are fully confined inside a source-detector enclosure or cross and recross the same enclosure. The contribution from the confined histories is expressed using a forward problem with the external boundary condition on the source-detector enclosure. The contribution from the crossing and recrossing histories is expressed as the surface integral at the same enclosure of the product of the directional cosine and the fluxes in the foregoing forward problem and the adjoint problem for the whole spatial domain. The former contribution can be calculated by a standard forward Monte Carlo. The latter contribution can be calculated by correlated coupling of forward and adjoint histories independently of the former contribution. We briefly describe the computational method and discuss its application to perturbation analysis for localized material changes. (orig.)
Hybrid algorithms in quantum Monte Carlo
International Nuclear Information System (INIS)
With advances in algorithms and growing computing powers, quantum Monte Carlo (QMC) methods have become a leading contender for high accuracy calculations for the electronic structure of realistic systems. The performance gain on recent HPC systems is largely driven by increasing parallelism: the number of compute cores of a SMP and the number of SMPs have been going up, as the Top500 list attests. However, the available memory as well as the communication and memory bandwidth per element has not kept pace with the increasing parallelism. This severely limits the applicability of QMC and the problem size it can handle. OpenMP/MPI hybrid programming provides applications with simple but effective solutions to overcome efficiency and scalability bottlenecks on large-scale clusters based on multi/many-core SMPs. We discuss the design and implementation of hybrid methods in QMCPACK and analyze its performance on current HPC platforms characterized by various memory and communication hierarchies.
San Carlos Apache Tribe - Energy Organizational Analysis
Energy Technology Data Exchange (ETDEWEB)
Rapp, James; Albert, Steve
2012-04-01
The San Carlos Apache Tribe (SCAT) was awarded $164,000 in late-2011 by the U.S. Department of Energy (U.S. DOE) Tribal Energy Program's "First Steps Toward Developing Renewable Energy and Energy Efficiency on Tribal Lands" Grant Program. This grant funded: The analysis and selection of preferred form(s) of tribal energy organization (this Energy Organization Analysis, hereinafter referred to as "EOA"). Start-up staffing and other costs associated with the Phase 1 SCAT energy organization. An intern program. Staff training. Tribal outreach and workshops regarding the new organization and SCAT energy programs and projects, including two annual tribal energy summits (2011 and 2012). This report documents the analysis and selection of preferred form(s) of a tribal energy organization.
Commensurabilities between ETNOs: a Monte Carlo survey
Marcos, C de la Fuente
2016-01-01
Many asteroids in the main and trans-Neptunian belts are trapped in mean motion resonances with Jupiter and Neptune, respectively. As a side effect, they experience accidental commensurabilities among themselves. These commensurabilities define characteristic patterns that can be used to trace the source of the observed resonant behaviour. Here, we explore systematically the existence of commensurabilities between the known ETNOs using their heliocentric and barycentric semimajor axes, their uncertainties, and Monte Carlo techniques. We find that the commensurability patterns present in the known ETNO population resemble those found in the main and trans-Neptunian belts. Although based on small number statistics, such patterns can only be properly explained if most, if not all, of the known ETNOs are subjected to the resonant gravitational perturbations of yet undetected trans-Plutonian planets. We show explicitly that some of the statistically significant commensurabilities are compatible with the Planet Nin...
International Nuclear Information System (INIS)
This study aims to utilize a measurement-based Monte Carlo (MBMC) method to evaluate the accuracy of dose distributions calculated using the Eclipse radiotherapy treatment planning system (TPS) based on the anisotropic analytical algorithm. Dose distributions were calculated for the nasopharyngeal carcinoma (NPC) patients treated with the intensity modulated radiotherapy (IMRT). Ten NPC IMRT plans were evaluated by comparing their dose distributions with those obtained from the in-house MBMC programs for the same CT images and beam geometry. To reconstruct the fluence distribution of the IMRT field, an efficiency map was obtained by dividing the energy fluence of the intensity modulated field by that of the open field, both acquired from an aS1000 electronic portal imaging device. The integrated image of the non-gated mode was used to acquire the full dose distribution delivered during the IMRT treatment. This efficiency map redistributed the particle weightings of the open field phase-space file for IMRT applications. Dose differences were observed in the tumor and air cavity boundary. The mean difference between MBMC and TPS in terms of the planning target volume coverage was 0.6% (range: 0.0–2.3%). The mean difference for the conformity index was 0.01 (range: 0.0–0.01). In conclusion, the MBMC method serves as an independent IMRT dose verification tool in a clinical setting. - Highlights: ► The patient-based Monte Carlo method serves as a reference standard to verify IMRT doses. ► 3D Dose distributions for NPC patients have been verified by the Monte Carlo method. ► Doses predicted by the Monte Carlo method matched closely with those by the TPS. ► The Monte Carlo method predicted a higher mean dose to the middle ears than the TPS. ► Critical organ doses should be confirmed to avoid overdose to normal organs
International Nuclear Information System (INIS)
This study focused on predicting the electronic portal imaging device (EPID) image of intensity modulated radiation treatment (IMRT) fields in the absence of attenuation material in the beam with Monte Carlo methods. As IMRT treatments consist of a series of segments of various sizes that are not always delivered on the central axis, large spectral variations may be observed between the segments. The effect of these spectral variations on the EPID response was studied with fields of various sizes and off-axis positions. A detailed description of the EPID was implemented in a Monte Carlo model. The EPID model was validated by comparing the EPID output factors for field sizes between 1x1 and 26x26 cm2 at the isocenter. The Monte Carlo simulations agreed with the measurements to within 1.5%. The Monte Carlo model succeeded in predicting the EPID response at the center of the fields of various sizes and offsets to within 1% of the measurements. Large variations (up to 29%) of the EPID response were observed between the various offsets. The EPID response increased with field size and with field offset for most cases. The Monte Carlo model was then used to predict the image of a simple test IMRT field delivered on the beam axis and with an offset. A variation of EPID response up to 28% was found between the on- and off-axis delivery. Finally, two clinical IMRT fields were simulated and compared to the measurements. For all IMRT fields, simulations and measurements agreed within 3%--0.2 cm for 98% of the pixels. The spectral variations were quantified by extracting from the spectra at the center of the fields the total photon yield (Ytotal), the photon yield below 1 MeV (Ylow), and the percentage of photons below 1 MeV (Plow). For the studied cases, a correlation was shown between the EPID response variation and Ytotal, Ylow, and Plow
Monte Carlo approaches to effective field theories
International Nuclear Information System (INIS)
In this paper, we explore the application of continuum Monte Carlo methods to effective field theory models. Effective field theories, in this context, are those in which a Fock space decomposition of the state is useful. These problems arise both in nuclear and condensed matter physica. In nuclear physics, much work has been done on effective field theories of mesons and baryons. While the theories are not fundamental, they should be able to describe nuclear properties at low energy and momentum scales. After describing the methods, we solve two simple scalar field theory problems; the polaron and two nucleons interacting through scalar meson exchange. The methods presented here are rather straightforward extensions of methods used to solve quantum mechanics problems. Monte Carlo methods are used to avoid the truncation inherent in a Tamm-Dancoff approach and its associated difficulties. Nevertheless, the methods will be most valuable when the Fock space decomposition of the states is useful. Hence, while they are not intended for ab initio studies of QCD, they may prove valuable in studies of light nuclei, or for systems of interacting electrons and phonons. In these problems a Fock space decomposition can be used to reduce the number of degrees of freedom and to retain the rotational symmetries exactly. The problems we address here are comparatively simple, but offer useful initial tests of the method. We present results for the polaron and two non-relativistic nucleons interacting through scalar meson exchange. In each case, it is possible to integrate out the boson degrees of freedom exactly, and obtain a retarded form of the action that depends only upon the fermion paths. Here we keep the explicit bosons, though, since we would like to retain information about the boson components of the states and it will be necessary to keep these components in order to treat non-scalar of interacting bosonic fields
Monte Carlo modelling of TRIGA research reactor
Energy Technology Data Exchange (ETDEWEB)
El Bakkari, B., E-mail: bakkari@gmail.co [Reactor Operating Unit (UCR), National Centre of Sciences, Energy and Nuclear Techniques (CNESTEN/CENM), POB 1382, Rabat (Morocco); ERSN-LMR, Department of Physics, Faculty of Sciences, POB 2121, Tetuan (Morocco); Nacir, B. [Reactor Operating Unit (UCR), National Centre of Sciences, Energy and Nuclear Techniques (CNESTEN/CENM), POB 1382, Rabat (Morocco); El Bardouni, T. [ERSN-LMR, Department of Physics, Faculty of Sciences, POB 2121, Tetuan (Morocco); El Younoussi, C. [Reactor Operating Unit (UCR), National Centre of Sciences, Energy and Nuclear Techniques (CNESTEN/CENM), POB 1382, Rabat (Morocco); ERSN-LMR, Department of Physics, Faculty of Sciences, POB 2121, Tetuan (Morocco); Merroun, O. [ERSN-LMR, Department of Physics, Faculty of Sciences, POB 2121, Tetuan (Morocco); Htet, A. [Reactor Technology Unit (UTR), National Centre of Sciences, Energy and Nuclear Techniques (CNESTEN/CENM), POB 1382, Rabat (Morocco); Boulaich, Y. [Reactor Operating Unit (UCR), National Centre of Sciences, Energy and Nuclear Techniques (CNESTEN/CENM), POB 1382, Rabat (Morocco); ERSN-LMR, Department of Physics, Faculty of Sciences, POB 2121, Tetuan (Morocco); Zoubair, M.; Boukhal, H. [ERSN-LMR, Department of Physics, Faculty of Sciences, POB 2121, Tetuan (Morocco); Chakir, M. [EPTN-LPMR, Faculty of Sciences, Kenitra (Morocco)
2010-10-15
The Moroccan 2 MW TRIGA MARK II research reactor at Centre des Etudes Nucleaires de la Maamora (CENM) achieved initial criticality on May 2, 2007. The reactor is designed to effectively implement the various fields of basic nuclear research, manpower training, and production of radioisotopes for their use in agriculture, industry, and medicine. This study deals with the neutronic analysis of the 2-MW TRIGA MARK II research reactor at CENM and validation of the results by comparisons with the experimental, operational, and available final safety analysis report (FSAR) values. The study was prepared in collaboration between the Laboratory of Radiation and Nuclear Systems (ERSN-LMR) from Faculty of Sciences of Tetuan (Morocco) and CENM. The 3-D continuous energy Monte Carlo code MCNP (version 5) was used to develop a versatile and accurate full model of the TRIGA core. The model represents in detailed all components of the core with literally no physical approximation. Continuous energy cross-section data from the more recent nuclear data evaluations (ENDF/B-VI.8, ENDF/B-VII.0, JEFF-3.1, and JENDL-3.3) as well as S({alpha}, {beta}) thermal neutron scattering functions distributed with the MCNP code were used. The cross-section libraries were generated by using the NJOY99 system updated to its more recent patch file 'up259'. The consistency and accuracy of both the Monte Carlo simulation and neutron transport physics were established by benchmarking the TRIGA experiments. Core excess reactivity, total and integral control rods worth as well as power peaking factors were used in the validation process. Results of calculations are analysed and discussed.
Monte Carlo modelling of TRIGA research reactor
International Nuclear Information System (INIS)
The Moroccan 2 MW TRIGA MARK II research reactor at Centre des Etudes Nucleaires de la Maamora (CENM) achieved initial criticality on May 2, 2007. The reactor is designed to effectively implement the various fields of basic nuclear research, manpower training, and production of radioisotopes for their use in agriculture, industry, and medicine. This study deals with the neutronic analysis of the 2-MW TRIGA MARK II research reactor at CENM and validation of the results by comparisons with the experimental, operational, and available final safety analysis report (FSAR) values. The study was prepared in collaboration between the Laboratory of Radiation and Nuclear Systems (ERSN-LMR) from Faculty of Sciences of Tetuan (Morocco) and CENM. The 3-D continuous energy Monte Carlo code MCNP (version 5) was used to develop a versatile and accurate full model of the TRIGA core. The model represents in detailed all components of the core with literally no physical approximation. Continuous energy cross-section data from the more recent nuclear data evaluations (ENDF/B-VI.8, ENDF/B-VII.0, JEFF-3.1, and JENDL-3.3) as well as S(α, β) thermal neutron scattering functions distributed with the MCNP code were used. The cross-section libraries were generated by using the NJOY99 system updated to its more recent patch file 'up259'. The consistency and accuracy of both the Monte Carlo simulation and neutron transport physics were established by benchmarking the TRIGA experiments. Core excess reactivity, total and integral control rods worth as well as power peaking factors were used in the validation process. Results of calculations are analysed and discussed.
Monte Carlo modelling of TRIGA research reactor
El Bakkari, B.; Nacir, B.; El Bardouni, T.; El Younoussi, C.; Merroun, O.; Htet, A.; Boulaich, Y.; Zoubair, M.; Boukhal, H.; Chakir, M.
2010-10-01
The Moroccan 2 MW TRIGA MARK II research reactor at Centre des Etudes Nucléaires de la Maâmora (CENM) achieved initial criticality on May 2, 2007. The reactor is designed to effectively implement the various fields of basic nuclear research, manpower training, and production of radioisotopes for their use in agriculture, industry, and medicine. This study deals with the neutronic analysis of the 2-MW TRIGA MARK II research reactor at CENM and validation of the results by comparisons with the experimental, operational, and available final safety analysis report (FSAR) values. The study was prepared in collaboration between the Laboratory of Radiation and Nuclear Systems (ERSN-LMR) from Faculty of Sciences of Tetuan (Morocco) and CENM. The 3-D continuous energy Monte Carlo code MCNP (version 5) was used to develop a versatile and accurate full model of the TRIGA core. The model represents in detailed all components of the core with literally no physical approximation. Continuous energy cross-section data from the more recent nuclear data evaluations (ENDF/B-VI.8, ENDF/B-VII.0, JEFF-3.1, and JENDL-3.3) as well as S( α, β) thermal neutron scattering functions distributed with the MCNP code were used. The cross-section libraries were generated by using the NJOY99 system updated to its more recent patch file "up259". The consistency and accuracy of both the Monte Carlo simulation and neutron transport physics were established by benchmarking the TRIGA experiments. Core excess reactivity, total and integral control rods worth as well as power peaking factors were used in the validation process. Results of calculations are analysed and discussed.
Criticality benchmarking of ANET Monte Carlo code
International Nuclear Information System (INIS)
In this work the new Monte Carlo code ANET is tested on criticality calculations. ANET is developed based on the high energy physics code GEANT of CERN and aims at progressively satisfying several requirements regarding both simulations of GEN II/III reactors, as well as of innovative nuclear reactor designs such as the Accelerator Driven Systems (ADSs). Here ANET is applied on three different nuclear configurations, including a subcritical assembly, a Material Testing Reactor and the conceptual configuration of an ADS. In the first case, calculation of the effective multiplication factor (keff) are performed for the Training Nuclear Reactor of the Aristotle University of Thessaloniki, while in the second case keff is computed for the fresh fueled core of the Portuguese research reactor (RPJ) just after its conversion to Low Enriched Uranium, considering the control rods at the position that renders the reactor critical. In both cases ANET computations are compared with corresponding results obtained by three different well established codes, including both deterministic (XSDRNPM/CITATION) and Monte Carlo (TRIPOLI, MCNP). In the RPI case, keff computations are also compared with observations during the reactor core commissioning since the control rods are considered at criticality position. The above verification studies show ANET to produce reasonable results since they are satisfactorily compared with other models as well as with observations. For the third case (ADS), preliminary ANET computations of keff for various intensities of the proton beam are presented, showing also a reasonable code performance concerning both the order of magnitude and the relative variation of the computed parameter. (author)
Energy Technology Data Exchange (ETDEWEB)
Gallego Franco, P.; Garcia Marcos, R.
2015-07-01
GAMOS simulation code based on Geant4 is a very powerful tool for the design and modeling optimization on Positron Emission Tomography (PET) systems. In order to obtain a proper image quality, it results to be extremely important determine the optimal activity which is going to be delivered. For this reason a study about the internal system parameters that affects image quality, such as scatter fraction (SF) and the count rate equivalent noise (NEC), has been carried out. The study involves the comparison of experimental measures on both parameters, with those obtained by Monte Carlo simulation of Siemens Pet Biograph 6 True Point with True V option. Based on simulations results, a paralizable dead-time model that adjusts, depending on the activity provided, the proper dead-time for scanner detectors. Also a study about the variation of this proper dead-time with the activity has been carried out. (Author)
Quantum Monte Carlo Simulations : Algorithms, Limitations and Applications
Raedt, H. De
1992-01-01
A survey is given of Quantum Monte Carlo methods currently used to simulate quantum lattice models. The formalisms employed to construct the simulation algorithms are sketched. The origin of fundamental (minus sign) problems which limit the applicability of the Quantum Monte Carlo approach is shown
CERN Summer Student Report 2016 Monte Carlo Data Base Improvement
Caciulescu, Alexandru Razvan
2016-01-01
During my Summer Student project I worked on improving the Monte Carlo Data Base and MonALISA services for the ALICE Collaboration. The project included learning the infrastructure for tracking and monitoring of the Monte Carlo productions as well as developing a new RESTful API for seamless integration with the JIRA issue tracking framework.
Managing the Knowledge Commons: Interview with Carlo Vercellone -
Vercellone, Carlo
2015-01-01
Interview with Dr. Carlo Vercellone, one of the leading theorists of cognitive capitalism and economist at the CNRS Lab of The Sorbonne Economic Centre (Centre d'Economie de la Sorbonne, CES). - See more at: http://www.nesta.org.uk/blog/managing-knowledge-commons-interview-carlo-vercellone#sthash.1F1Ig5dF.dpuf,
Adjoint electron-photon transport Monte Carlo calculations with ITS
International Nuclear Information System (INIS)
A general adjoint coupled electron-photon Monte Carlo code for solving the Boltzmann-Fokker-Planck equation has recently been created. It is a modified version of ITS 3.0, a coupled electronphoton Monte Carlo code that has world-wide distribution. The applicability of the new code to radiation-interaction problems of the type found in space environments is demonstrated
Neutron point-flux calculation by Monte Carlo
International Nuclear Information System (INIS)
A survey of the usual methods for estimating flux at a point is given. The associated variance-reducing techniques in direct Monte Carlo games are explained. The multigroup Monte Carlo codes MC for critical systems and PUNKT for point source-point detector-systems are represented, and problems in applying the codes to practical tasks are discussed. (author)
Monte Carlo simulations of single polymer force-extension relations
International Nuclear Information System (INIS)
We present Monte Carlo simulations for studying the statistical mechanics of arbitrarily long single molecules under stretching. In many cases in which the thermodynamic limit is not satisfied, different statistical ensembles yield different macroscopic force-displacement curves. In this work we provide a description of the Monte Carlo simulations and discuss in details the assumptions adopted.
Nuclear data treatment for SAM-CE Monte Carlo calculations
International Nuclear Information System (INIS)
The treatment of nuclear data by the SAM-CE Monte Carlo code system is presented. The retrieval of neutron, gamma production, and photon data from the ENDF/B fils is described. Integral cross sections as well as differential data are utilized in the Monte Carlo calculations, and the processing procedures for the requisite data are summarized
Fission Matrix Capability for MCNP Monte Carlo
Energy Technology Data Exchange (ETDEWEB)
Carney, Sean E. [Los Alamos National Laboratory; Brown, Forrest B. [Los Alamos National Laboratory; Kiedrowski, Brian C. [Los Alamos National Laboratory; Martin, William R. [Los Alamos National Laboratory
2012-09-05
In a Monte Carlo criticality calculation, before the tallying of quantities can begin, a converged fission source (the fundamental eigenvector of the fission kernel) is required. Tallies of interest may include powers, absorption rates, leakage rates, or the multiplication factor (the fundamental eigenvalue of the fission kernel, k{sub eff}). Just as in the power iteration method of linear algebra, if the dominance ratio (the ratio of the first and zeroth eigenvalues) is high, many iterations of neutron history simulations are required to isolate the fundamental mode of the problem. Optically large systems have large dominance ratios, and systems containing poor neutron communication between regions are also slow to converge. The fission matrix method, implemented into MCNP[1], addresses these problems. When Monte Carlo random walk from a source is executed, the fission kernel is stochastically applied to the source. Random numbers are used for: distances to collision, reaction types, scattering physics, fission reactions, etc. This method is used because the fission kernel is a complex, 7-dimensional operator that is not explicitly known. Deterministic methods use approximations/discretization in energy, space, and direction to the kernel. Consequently, they are faster. Monte Carlo directly simulates the physics, which necessitates the use of random sampling. Because of this statistical noise, common convergence acceleration methods used in deterministic methods do not work. In the fission matrix method, we are using the random walk information not only to build the next-iteration fission source, but also a spatially-averaged fission kernel. Just like in deterministic methods, this involves approximation and discretization. The approximation is the tallying of the spatially-discretized fission kernel with an incorrect fission source. We address this by making the spatial mesh fine enough that this error is negligible. As a consequence of discretization we get a
IN MEMORIAM CARLOS RESTREPO. UN VERDADERO MAESTRO
Directory of Open Access Journals (Sweden)
Pelayo Correa
2009-06-01
Full Text Available Carlos Restrepo fue el primer profesor de Patología y un miembro ilustre del grupo de pioneros que fundaron la Facultad de Medicina de la Universidad del Valle. Estos pioneros convergieron en Cali en la década de 1950, en posesión de un espíritu renovador y creativo que emprendió con mucho éxito la labor de cambiar la cultura académica del Valle del Cauca. Ellos encontraron una sociedad apacible, que disfrutaba de la generosidad de su entorno, sin deseos de romper las tradiciones centenarias de estilo de vida sencillo y satisfecho. Cuando los hijos tenían el deseo y la capacidad de seguir estudios universitarios, especialmente en el área de la medicina, la familia los enviaba a climas menos cálidos, que supuestamente favorecían la función cerebral y la acumulación de conocimientos. Los pioneros de la educación médica en el Valle del Cauca, en buena parte reclutados en universidades nacionales y extranjeras, sabían muy bien que el ambiente vallecaucano no impide una formación universitaria de primera clase.Carlos Restrepo era prototipo del espíritu de cambio y formación intelectual de las nuevas generaciones. Lo manifestaba de múltiples maneras, en buena parte con su genio alegre, extrovertido, optimista, de risa fácil y contagiosa. Pero esta fase amable de su personalidad no ocultaba su tarea formativa; exigía de sus discípulos dedicación y trabajo duro, con fidelidad expresados en memorables caricaturas que exageraban su genio ocasionalmente explosivo.El grupo de pioneros se enfocó con un espíritu de total entrega (tiempo completo y dedicación exclusiva y organizó la nueva Facultad en bien definidos y estructurados departamentos: Anatomía, Bioquímica, Fisiología, Farmacología, Patología, Medicina Interna, Cirugía, Obstetricia y Ginecología, Psiquiatría y Medicina Preventiva. Los departamentos integraron sus funciones primordiales en la enseñanza, la investigación y el servicio a la comunidad. El centro
Radiation doses in volume-of-interest breast computed tomography—A Monte Carlo simulation study
Energy Technology Data Exchange (ETDEWEB)
Lai, Chao-Jen, E-mail: cjlai3711@gmail.com; Zhong, Yuncheng; Yi, Ying; Wang, Tianpeng; Shaw, Chris C. [Department of Imaging Physics, The University of Texas MD Anderson Cancer Center, Houston, Texas 77030-4009 (United States)
2015-06-15
Purpose: Cone beam breast computed tomography (breast CT) with true three-dimensional, nearly isotropic spatial resolution has been developed and investigated over the past decade to overcome the problem of lesions overlapping with breast anatomical structures on two-dimensional mammographic images. However, the ability of breast CT to detect small objects, such as tissue structure edges and small calcifications, is limited. To resolve this problem, the authors proposed and developed a volume-of-interest (VOI) breast CT technique to image a small VOI using a higher radiation dose to improve that region’s visibility. In this study, the authors performed Monte Carlo simulations to estimate average breast dose and average glandular dose (AGD) for the VOI breast CT technique. Methods: Electron–Gamma-Shower system code-based Monte Carlo codes were used to simulate breast CT. The Monte Carlo codes estimated were validated using physical measurements of air kerma ratios and point doses in phantoms with an ion chamber and optically stimulated luminescence dosimeters. The validated full cone x-ray source was then collimated to simulate half cone beam x-rays to image digital pendant-geometry, hemi-ellipsoidal, homogeneous breast phantoms and to estimate breast doses with full field scans. 13-cm in diameter, 10-cm long hemi-ellipsoidal homogeneous phantoms were used to simulate median breasts. Breast compositions of 25% and 50% volumetric glandular fractions (VGFs) were used to investigate the influence on breast dose. The simulated half cone beam x-rays were then collimated to a narrow x-ray beam with an area of 2.5 × 2.5 cm{sup 2} field of view at the isocenter plane and to perform VOI field scans. The Monte Carlo results for the full field scans and the VOI field scans were then used to estimate the AGD for the VOI breast CT technique. Results: The ratios of air kerma ratios and dose measurement results from the Monte Carlo simulation to those from the physical
Baräo, Fernando; Nakagawa, Masayuki; Távora, Luis; Vaz, Pedro
2001-01-01
This book focusses on the state of the art of Monte Carlo methods in radiation physics and particle transport simulation and applications, the latter involving in particular, the use and development of electron--gamma, neutron--gamma and hadronic codes. Besides the basic theory and the methods employed, special attention is paid to algorithm development for modeling, and the analysis of experiments and measurements in a variety of fields ranging from particle to medical physics.
Quantum Monte Carlo methods algorithms for lattice models
Gubernatis, James; Werner, Philipp
2016-01-01
Featuring detailed explanations of the major algorithms used in quantum Monte Carlo simulations, this is the first textbook of its kind to provide a pedagogical overview of the field and its applications. The book provides a comprehensive introduction to the Monte Carlo method, its use, and its foundations, and examines algorithms for the simulation of quantum many-body lattice problems at finite and zero temperature. These algorithms include continuous-time loop and cluster algorithms for quantum spins, determinant methods for simulating fermions, power methods for computing ground and excited states, and the variational Monte Carlo method. Also discussed are continuous-time algorithms for quantum impurity models and their use within dynamical mean-field theory, along with algorithms for analytically continuing imaginary-time quantum Monte Carlo data. The parallelization of Monte Carlo simulations is also addressed. This is an essential resource for graduate students, teachers, and researchers interested in ...
Hybrid SN/Monte Carlo research and results
International Nuclear Information System (INIS)
The neutral particle transport equation is solved by a hybrid method that iteratively couples regions where deterministic (SN) and stochastic (Monte Carlo) methods are applied. The Monte Carlo and SN regions are fully coupled in the sense that no assumption is made about geometrical separation or decoupling. The hybrid method has been successfully applied to realistic shielding problems. The vectorized Monte Carlo algorithm in the hybrid method has been ported to the massively parallel architecture of the Connection Machine. Comparisons of performance on a vector machine (Cray Y-MP) and the Connection Machine (CM-2) show that significant speedups are obtainable for vectorized Monte Carlo algorithms on massively parallel machines, even when realistic problems requiring variance reduction are considered. However, the architecture of the Connection Machine does place some limitations on the regime in which the Monte Carlo algorithm may by expected to perform well. (author)
Reconstruction of Monte Carlo replicas from Hessian parton distributions
Hou, Tie-Jiun; Huston, Joey; Nadolsky, Pavel; Schmidt, Carl; Stump, Daniel; Wang, Bo-Ting; Xie, Ke-Ping; Dulat, Sayipjamal; Pumplin, Jon; Yuan, C -P
2016-01-01
We explore connections between two common methods for quantifying the uncertainty in parton distribution functions (PDFs), based on the Hessian error matrix and Monte-Carlo sampling. CT14 parton distributions in the Hessian representation are converted into Monte-Carlo replicas by a numerical method that reproduces important properties of CT14 Hessian PDFs: the asymmetry of CT14 uncertainties and positivity of individual parton distributions. The ensembles of CT14 Monte-Carlo replicas constructed this way at NNLO and NLO are suitable for various collider applications, such as cross section reweighting. Master formulas for computation of asymmetric standard deviations in the Monte-Carlo representation are derived. A numerical program is made available for conversion of Hessian PDFs into Monte-Carlo replicas according to normal, log-normal, and Watt-Thorne sampling procedures.
Problems in radiation shielding calculations with Monte Carlo methods
International Nuclear Information System (INIS)
The Monte Carlo method is a very useful tool for solving a large class of radiation transport problem. In contrast with deterministic method, geometric complexity is a much less significant problem for Monte Carlo calculations. However, the accuracy of Monte Carlo calculations is of course, limited by statistical error of the quantities to be estimated. In this report, we point out some typical problems to solve a large shielding system including radiation streaming. The Monte Carlo coupling technique was developed to settle such a shielding problem accurately. However, the variance of the Monte Carlo results using the coupling technique of which detectors were located outside the radiation streaming, was still not enough. So as to bring on more accurate results for the detectors located outside the streaming and also for a multi-legged-duct streaming problem, a practicable way of ''Prism Scattering technique'' is proposed in the study. (author)
Contrast to Noise Ratio and Contrast Detail Analysis in Mammography:A Monte Carlo Study
Metaxas, V.; Delis, H.; Kalogeropoulou, C.; Zampakis, P.; Panayiotakis, G.
2015-09-01
The mammographic spectrum is one of the major factors affecting image quality in mammography. In this study, a Monte Carlo (MC) simulation model was used to evaluate image quality characteristics of various mammographic spectra. The anode/filter combinations evaluated, were those traditionally used in mammography, for tube voltages between 26 and 30 kVp. The imaging performance was investigated in terms of Contrast to Noise Ratio (CNR) and Contrast Detail (CD) analysis, by involving human observers, utilizing a mathematical CD phantom. Soft spectra provided the best characteristics in terms of both CNR and CD scores, while tube voltage had a limited effect. W-anode spectra filtered with k-edge filters demonstrated an improved performance, that sometimes was better compared to softer x-ray spectra, produced by Mo or Rh anode. Regarding the filter material, k-edge filters showed superior performance compared to Al filters.
Commissioning and First Observations with Wide FastCam at the Telescopio Carlos S\\'anchez
Velasco, Sergio; Oscoz, Alejandro; López, Roberto L; Puga, Marta; Murga, Gaizka; Pérez-Garrido, Antonio; Pallé, Enric; Ricci, Davide; Ayuso, Ismael; Hernández-Sánchez, Mónica; Truant, Nicola
2016-01-01
The FastCam instrument platform, jointly developed by the IAC and the UPCT, allows, in real-time, acquisition, selection and storage of images with a resolution that reaches the diffraction limit of medium-sized telescopes. FastCam incorporates a specially designed software package to analyse series of tens of thousands of images in parallel with the data acquisition at the telescope. Wide FastCam is a new instrument that, using the same software for data acquisition, does not look for lucky imaging but fast observations in a much larger field of view. Here we describe the commissioning process and first observations with Wide FastCam at the Telescopio Carlos S\\'anchez (TCS) in the Observatorio del Teide.
In silico imaging: Definition, possibilities and challenges
International Nuclear Information System (INIS)
The capability to simulate the imaging performance of new detector concepts is crucial to develop the next generation of medical imaging systems. Proper modeling tools allow for optimal designs that maximize image quality while minimizing patient and occupational radiation doses. In this context, in silico imaging has become an emerging field of imaging research. This paper reviews current progress and challenges in the simulation of imaging systems with a focus on Monte Carlo approaches to X-ray detector modeling, acceleration approaches, and validation strategies.
A hybrid Monte Carlo and response matrix Monte Carlo method in criticality calculation
International Nuclear Information System (INIS)
Full core calculations are very useful and important in reactor physics analysis, especially in computing the full core power distributions, optimizing the refueling strategies and analyzing the depletion of fuels. To reduce the computing time and accelerate the convergence, a method named Response Matrix Monte Carlo (RMMC) method based on analog Monte Carlo simulation was used to calculate the fixed source neutron transport problems in repeated structures. To make more accurate calculations, we put forward the RMMC method based on non-analog Monte Carlo simulation and investigate the way to use RMMC method in criticality calculations. Then a new hybrid RMMC and MC (RMMC+MC) method is put forward to solve the criticality problems with combined repeated and flexible geometries. This new RMMC+MC method, having the advantages of both MC method and RMMC method, can not only increase the efficiency of calculations, also simulate more complex geometries rather than repeated structures. Several 1-D numerical problems are constructed to test the new RMMC and RMMC+MC method. The results show that RMMC method and RMMC+MC method can efficiently reduce the computing time and variations in the calculations. Finally, the future research directions are mentioned and discussed at the end of this paper to make RMMC method and RMMC+MC method more powerful. (authors)
International Nuclear Information System (INIS)
Track structure Monte Carlo simulations of ionising radiation in water are often used to estimate radiation damage to DNA. For this purpose, an accurate simulation of the transport of densely ionising low-energy secondary electrons is particularly important, but is impaired by a high uncertainty of the required physical interaction cross section data of liquid water. A possible tool for the verification of the secondary electron transport in a track structure simulation has been suggested by Toburen et al. (2010), who have measured the angle-dependent energy spectra of electrons, emitted from a thin layer of amorphous solid water (ASW) upon a passage of 6 MeV protons. In this work, simulations were performed for the setup of their experiment, using the PTB Track structure code (PTra) and Geant4-DNA. To enable electron transport below the ionisation threshold, additional excitation and dissociative attachment anion states were included in PTra and activated in Geant4. Additionally, a surface potential was considered in both simulations, such that the escape probability for an electron is dependent on its energy and impact angle at the ASW/vacuum interface. For vanishing surface potential, the simulated spectra are in good agreement with the measured spectra for energies above 50 eV. Below, the simulations overestimate the yield of electrons by a factor up to 4 (PTra) or 7 (Geant4-DNA), which is still a better agreement than obtained in previous simulations of this experimental situation. The agreement of the simulations with experimental data was significantly improved by using a step-like increase of the potential energy at the ASW surface. - Highlights: ► Benchmarked electron transport in track structure simulations using liquid water. ► Simulated differential electron spectra agree with measured data. ► The agreement was improved by including a 3 eV surface potential step.
International Nuclear Information System (INIS)
The accuracy of Single Photon Emission Computed Tomography (SPECT) images is degraded by physical effects, namely photon attenuation, Compton scatter and spatially varying collimator response. The 3D nature of these effects is usually neglected by the methods used to correct for these effects. To deal with the 3D nature of the problem, a 3D projector modeling the spread of photons in 3D can be used in iterative tomographic reconstruction. The 3D projector can be estimated analytically with some approximations, or using precise Monte Carlo simulations. This latter approach has not been applied to fully 3D reconstruction yet due to impractical storage and computation time. The goal of this paper was to determine the gain to be expected from fully 3D Monte Carlo (F3DMC) modeling of the projector in iterative reconstruction, compared to conventional 2D and 3D reconstruction methods. As a proof-of-concept, two small datasets were considered. The projections of the two phantoms were simulated using the Monte Carlo simulation code GATE, as well as the corresponding projector, by taking into account all physical effects (attenuation, scatter, camera point spread function) affecting the imaging process. F3DMC was implemented by using this 3D projector in a maximum likelihood expectation maximization (MLEM) iterative reconstruction. To assess the value of F3DMC, data were reconstructed using 4 methods: filtered backprojection (FBP), MLEM without attenuation correction (MLEM), MLEM with attenuation correction, Jaszczak scatter correction and 3D correction for depth-dependent spatial resolution using an analytical model (MLEMC) and F3DMC. Our results suggest that F3DMC improves mainly imaging sensitivity and signal-to-noise ratio (SNR): sensitivity is multiplied by about 103 and SNR is increased by 20 to 70% compared to MLEMC. Computation of a more robust projector and application of the method on more realistic datasets are currently under investigation. (authors)
PeneloPET, a Monte Carlo PET simulation tool based on PENELOPE: features and validation
Energy Technology Data Exchange (ETDEWEB)
Espana, S; Herraiz, J L; Vicente, E; Udias, J M [Grupo de Fisica Nuclear, Departmento de Fisica Atomica, Molecular y Nuclear, Universidad Complutense de Madrid, Madrid (Spain); Vaquero, J J; Desco, M [Unidad de Medicina y CirugIa Experimental, Hospital General Universitario Gregorio Maranon, Madrid (Spain)], E-mail: jose@nuc2.fis.ucm.es
2009-03-21
Monte Carlo simulations play an important role in positron emission tomography (PET) imaging, as an essential tool for the research and development of new scanners and for advanced image reconstruction. PeneloPET, a PET-dedicated Monte Carlo tool, is presented and validated in this work. PeneloPET is based on PENELOPE, a Monte Carlo code for the simulation of the transport in matter of electrons, positrons and photons, with energies from a few hundred eV to 1 GeV. PENELOPE is robust, fast and very accurate, but it may be unfriendly to people not acquainted with the FORTRAN programming language. PeneloPET is an easy-to-use application which allows comprehensive simulations of PET systems within PENELOPE. Complex and realistic simulations can be set by modifying a few simple input text files. Different levels of output data are available for analysis, from sinogram and lines-of-response (LORs) histogramming to fully detailed list mode. These data can be further exploited with the preferred programming language, including ROOT. PeneloPET simulates PET systems based on crystal array blocks coupled to photodetectors and allows the user to define radioactive sources, detectors, shielding and other parts of the scanner. The acquisition chain is simulated in high level detail; for instance, the electronic processing can include pile-up rejection mechanisms and time stamping of events, if desired. This paper describes PeneloPET and shows the results of extensive validations and comparisons of simulations against real measurements from commercial acquisition systems. PeneloPET is being extensively employed to improve the image quality of commercial PET systems and for the development of new ones.
International Nuclear Information System (INIS)
Quantitative scintigrafic images, obtained by NaI(Tl) scintillation cameras, are limited by photon attenuation and contribution from scattered photons. A Monte Carlo program was developed in order to evaluate these effects. Simple source-phantom geometries and more complex nonhomogeneous cases can be simulated. Comparisons with experimental data for both homogeneous and nonhomogeneous regions and with published results have shown good agreement. The usefulness for simulation of parameters in scintillation camera systems, stationary as well as in SPECT systems, has also been demonstrated. An attenuation correction method based on density maps and build-up functions has been developed. The maps were obtained from a transmission measurement using an external 57Co flood source and the build-up was simulated by the Monte Carlo code. Two scatter correction methods, the dual-window method and the convolution-subtraction method, have been compared using the Monte Carlo method. The aim was to compare the estimated scatter with the true scatter in the photo-peak window. It was concluded that accurate depth-dependent scatter functions are essential for a proper scatter correction. A new scatter and attenuation correction method has been developed based on scatter line-spread functions (SLSF) obtained for different depths and lateral positions in the phantom. An emission image is used to determine the source location in order to estimate the scatter in the photo-peak window. Simulation studies of a clinically realistic source in different positions in cylindrical water phantoms were made for three photon energies. The SLSF-correction method was also evaluated by simulation studies for 1. a myocardial source, 2. uniform source in the lungs and 3. a tumour located in the lungs in a realistic, nonhomogeneous computer phantom. The results showed that quantitative images could be obtained in nonhomogeneous regions. (67 refs.)
Evaluation of high packing density powder X-ray screens by Monte Carlo methods
International Nuclear Information System (INIS)
Phosphor materials are employed in intensifying screens of both digital and conventional X-ray imaging detectors. High packing density powder screens have been developed (e.g. screens in ceramic form) exhibiting high-resolution and light emission properties, and thus contributing to improved image transfer characteristics and higher radiation to light conversion efficiency. For the present study, a custom Monte Carlo simulation program was used in order to examine the performance of ceramic powder screens, under various radiographic conditions. The model was developed using Mie scattering theory for the description of light interactions, based on the physical characteristics (e.g. complex refractive index, light wavelength) of the phosphor material. Monte Carlo simulations were carried out assuming: (a) X-ray photon energy ranging from 18 up to 49 keV, (b) Gd2O2S:Tb phosphor material with packing density of 70% and grain size of 7 μm and (c) phosphor thickness ranging between 30 and 70 mg/cm2. The variation of the Modulation Transfer Function (MTF) and the Luminescence Efficiency (LE) with respect to the X-ray energy and the phosphor thickness was evaluated. Both aforementioned imaging characteristics were shown to take high values at 49 keV X-ray energy and 70 mg/cm2 phosphor thickness. It was found that high packing density screens may be appropriate for use in medical radiographic systems
Energy Technology Data Exchange (ETDEWEB)
Tringe, J.W., E-mail: tringe2@llnl.gov [Lawrence Livermore National Laboratory, 7000 East Avenue, Livermore, CA (United States); Ileri, N. [Lawrence Livermore National Laboratory, 7000 East Avenue, Livermore, CA (United States); Department of Chemical Engineering & Materials Science, University of California, Davis, CA (United States); Levie, H.W. [Lawrence Livermore National Laboratory, 7000 East Avenue, Livermore, CA (United States); Stroeve, P.; Ustach, V.; Faller, R. [Department of Chemical Engineering & Materials Science, University of California, Davis, CA (United States); Renaud, P. [Swiss Federal Institute of Technology, Lausanne, (EPFL) (Switzerland)
2015-08-18
Highlights: • WGA proteins in nanochannels modeled by Molecular Dynamics and Monte Carlo. • Protein surface coverage characterized by atomic force microscopy. • Models indicate transport characteristics depend strongly on surface coverage. • Results resolve of a four orders of magnitude difference in diffusion coefficient values. - Abstract: We use Molecular Dynamics and Monte Carlo simulations to examine molecular transport phenomena in nanochannels, explaining four orders of magnitude difference in wheat germ agglutinin (WGA) protein diffusion rates observed by fluorescence correlation spectroscopy (FCS) and by direct imaging of fluorescently-labeled proteins. We first use the ESPResSo Molecular Dynamics code to estimate the surface transport distance for neutral and charged proteins. We then employ a Monte Carlo model to calculate the paths of protein molecules on surfaces and in the bulk liquid transport medium. Our results show that the transport characteristics depend strongly on the degree of molecular surface coverage. Atomic force microscope characterization of surfaces exposed to WGA proteins for 1000 s show large protein aggregates consistent with the predicted coverage. These calculations and experiments provide useful insight into the details of molecular motion in confined geometries.
Development and validation of MCNPX-based Monte Carlo treatment plan verification system
Directory of Open Access Journals (Sweden)
Iraj Jabbari
2015-01-01
Full Text Available A Monte Carlo treatment plan verification (MCTPV system was developed for clinical treatment plan verification (TPV, especially for the conformal and intensity-modulated radiotherapy (IMRT plans. In the MCTPV, the MCNPX code was used for particle transport through the accelerator head and the patient body. MCTPV has an interface with TiGRT planning system and reads the information which is needed for Monte Carlo calculation transferred in digital image communications in medicine-radiation therapy (DICOM-RT format. In MCTPV several methods were applied in order to reduce the simulation time. The relative dose distribution of a clinical prostate conformal plan calculated by the MCTPV was compared with that of TiGRT planning system. The results showed well implementation of the beams configuration and patient information in this system. For quantitative evaluation of MCTPV a two-dimensional (2D diode array (MapCHECK2 and gamma index analysis were used. The gamma passing rate (3%/3 mm of an IMRT plan was found to be 98.5% for total beams. Also, comparison of the measured and Monte Carlo calculated doses at several points inside an inhomogeneous phantom for 6- and 18-MV photon beams showed a good agreement (within 1.5%. The accuracy and timing results of MCTPV showed that MCTPV could be used very efficiently for additional assessment of complicated plans such as IMRT plan.
High-resolution and Monte Carlo additions to the SASKTRAN radiative transfer model
Directory of Open Access Journals (Sweden)
D. J. Zawada
2015-06-01
Full Text Available The Optical Spectrograph and InfraRed Imaging System (OSIRIS instrument on board the Odin spacecraft has been measuring limb-scattered radiance since 2001. The vertical radiance profiles measured as the instrument nods are inverted, with the aid of the SASKTRAN radiative transfer model, to obtain vertical profiles of trace atmospheric constituents. Here we describe two newly developed modes of the SASKTRAN radiative transfer model: a high-spatial-resolution mode and a Monte Carlo mode. The high-spatial-resolution mode is a successive-orders model capable of modelling the multiply scattered radiance when the atmosphere is not spherically symmetric; the Monte Carlo mode is intended for use as a highly accurate reference model. It is shown that the two models agree in a wide variety of solar conditions to within 0.2 %. As an example case for both models, Odin–OSIRIS scans were simulated with the Monte Carlo model and retrieved using the high-resolution model. A systematic bias of up to 4 % in retrieved ozone number density between scans where the instrument is scanning up or scanning down was identified. The bias is largest when the sun is near the horizon and the solar scattering angle is far from 90°. It was found that calculating the multiply scattered diffuse field at five discrete solar zenith angles is sufficient to eliminate the bias for typical Odin–OSIRIS geometries.
International Nuclear Information System (INIS)
Highlights: • WGA proteins in nanochannels modeled by Molecular Dynamics and Monte Carlo. • Protein surface coverage characterized by atomic force microscopy. • Models indicate transport characteristics depend strongly on surface coverage. • Results resolve of a four orders of magnitude difference in diffusion coefficient values. - Abstract: We use Molecular Dynamics and Monte Carlo simulations to examine molecular transport phenomena in nanochannels, explaining four orders of magnitude difference in wheat germ agglutinin (WGA) protein diffusion rates observed by fluorescence correlation spectroscopy (FCS) and by direct imaging of fluorescently-labeled proteins. We first use the ESPResSo Molecular Dynamics code to estimate the surface transport distance for neutral and charged proteins. We then employ a Monte Carlo model to calculate the paths of protein molecules on surfaces and in the bulk liquid transport medium. Our results show that the transport characteristics depend strongly on the degree of molecular surface coverage. Atomic force microscope characterization of surfaces exposed to WGA proteins for 1000 s show large protein aggregates consistent with the predicted coverage. These calculations and experiments provide useful insight into the details of molecular motion in confined geometries
Comparing analytical and Monte Carlo optical diffusion models in phosphor-based X-ray detectors
Kalyvas, N.; Liaparinos, P.
2014-03-01
Luminescent materials are employed as radiation to light converters in detectors of medical imaging systems, often referred to as phosphor screens. Several processes affect the light transfer properties of phosphors. Amongst the most important is the interaction of light. Light attenuation (absorption and scattering) can be described either through "diffusion" theory in theoretical models or "quantum" theory in Monte Carlo methods. Although analytical methods, based on photon diffusion equations, have been preferentially employed to investigate optical diffusion in the past, Monte Carlo simulation models can overcome several of the analytical modelling assumptions. The present study aimed to compare both methodologies and investigate the dependence of the analytical model optical parameters as a function of particle size. It was found that the optical photon attenuation coefficients calculated by analytical modeling are decreased with respect to the particle size (in the region 1- 12 μm). In addition, for particles sizes smaller than 6μm there is no simultaneous agreement between the theoretical modulation transfer function and light escape values with respect to the Monte Carlo data.
Calculation of radiation dose to the lens of the eye using Monte Carlo simulation
International Nuclear Information System (INIS)
The radiation dose to the lens of the eye of patients undergoing diagnostic and interventional radiological procedures of the lacrimal drainage system has been calculated using a Monte Carlo technique. The technique has also been suggested for the retrospective estimation of the lens dose; when applied to individual patients, good correlation is obtained. In such study, data is required for image acquisition frame numbers and fluoro on-time, mean exposure values for these parameters, and the ratio of lens-to-air dose (viz. the head factor, HF) derived for a standard adult head
Monte Carlo design for a new neutron collimator at the ENEA Casaccia TRIGA reactor.
Burgio, N; Rosa, R
2004-10-01
The TRIGA RC-1 1MW reactor operating at ENEA Casaccia Center is currently being developed as a second neutron imaging facility that shall be devoted to computed tomography as well as neutron tomography. In order to reduce the gamma-ray content in the neutron beam, the reactor tangential piercing channel was selected. A set of Monte Carlo simulation was used to design the neutron collimator, to determine the preliminary choice of the materials to be employed in the collimator design. PMID:15246415
Rapid Monte Carlo simulation of detector DQE(f)
International Nuclear Information System (INIS)
Purpose: Performance optimization of indirect x-ray detectors requires proper characterization of both ionizing (gamma) and optical photon transport in a heterogeneous medium. As the tool of choice for modeling detector physics, Monte Carlo methods have failed to gain traction as a design utility, due mostly to excessive simulation times and a lack of convenient simulation packages. The most important figure-of-merit in assessing detector performance is the detective quantum efficiency (DQE), for which most of the computational burden has traditionally been associated with the determination of the noise power spectrum (NPS) from an ensemble of flood images, each conventionally having 107 − 109 detected gamma photons. In this work, the authors show that the idealized conditions inherent in a numerical simulation allow for a dramatic reduction in the number of gamma and optical photons required to accurately predict the NPS. Methods: The authors derived an expression for the mean squared error (MSE) of a simulated NPS when computed using the International Electrotechnical Commission-recommended technique based on taking the 2D Fourier transform of flood images. It is shown that the MSE is inversely proportional to the number of flood images, and is independent of the input fluence provided that the input fluence is above a minimal value that avoids biasing the estimate. The authors then propose to further lower the input fluence so that each event creates a point-spread function rather than a flood field. The authors use this finding as the foundation for a novel algorithm in which the characteristic MTF(f), NPS(f), and DQE(f) curves are simultaneously generated from the results of a single run. The authors also investigate lowering the number of optical photons used in a scintillator simulation to further increase efficiency. Simulation results are compared with measurements performed on a Varian AS1000 portal imager, and with a previously published simulation
TOPAS: An innovative proton Monte Carlo platform for research and clinical applications
Energy Technology Data Exchange (ETDEWEB)
Perl, J.; Shin, J.; Schuemann, J.; Faddegon, B.; Paganetti, H. [SLAC National Accelerator Laboratory, 2575 Sand Hill Road, Menlo Park, California 94025 (United States); University of California San Francisco Comprehensive Cancer Center, 1600 Divisadero Street, San Francisco, California 94143-1708 (United States); Department of Radiation Oncology, Massachusetts General Hospital and Harvard Medical School, Boston, Massachusetts 02114 (United States); University of California San Francisco Comprehensive Cancer Center, 1600 Divisadero Street, San Francisco, California 94143-1708 (United States); Department of Radiation Oncology, Massachusetts General Hospital and Harvard Medical School, Boston, Massachusetts 02114 (United States)
2012-11-15
Purpose: While Monte Carlo particle transport has proven useful in many areas (treatment head design, dose calculation, shielding design, and imaging studies) and has been particularly important for proton therapy (due to the conformal dose distributions and a finite beam range in the patient), the available general purpose Monte Carlo codes in proton therapy have been overly complex for most clinical medical physicists. The learning process has large costs not only in time but also in reliability. To address this issue, we developed an innovative proton Monte Carlo platform and tested the tool in a variety of proton therapy applications. Methods: Our approach was to take one of the already-established general purpose Monte Carlo codes and wrap and extend it to create a specialized user-friendly tool for proton therapy. The resulting tool, TOol for PArticle Simulation (TOPAS), should make Monte Carlo simulation more readily available for research and clinical physicists. TOPAS can model a passive scattering or scanning beam treatment head, model a patient geometry based on computed tomography (CT) images, score dose, fluence, etc., save and restart a phase space, provides advanced graphics, and is fully four-dimensional (4D) to handle variations in beam delivery and patient geometry during treatment. A custom-designed TOPAS parameter control system was placed at the heart of the code to meet requirements for ease of use, reliability, and repeatability without sacrificing flexibility. Results: We built and tested the TOPAS code. We have shown that the TOPAS parameter system provides easy yet flexible control over all key simulation areas such as geometry setup, particle source setup, scoring setup, etc. Through design consistency, we have insured that user experience gained in configuring one component, scorer or filter applies equally well to configuring any other component, scorer or filter. We have incorporated key lessons from safety management, proactively
Penumbral imaging and numerical evaluation of large area source neutron imaging system
Institute of Scientific and Technical Information of China (English)
无
2009-01-01
The fusion neutron penumbral imaging system Monte Carlo model was established. The transfer functions of the two discrete units in the neutron source were obtained in two situations:Imaging in geometrical near-optical and real situation. The spatial resolutions of the imaging system in two situations were evaluated and compared. The penumbral images of four units in the source were obtained by means of 2-dimensional (2D) convolution and Monte Carlo simulation. The penumbral images were reconstructed with the same method of filter. The same results were confirmed. The encoding essence of penumbral imaging was revealed. With MCNP(Monte Carlo N-particle) simulation,the neutron penumbral images of the large area source (200 μm×200 μm) on scintillation fiber array were obtained. The improved Wiener filter method was used to reconstruct the penumbral image and the source image was obtained. The results agree with the preset neutron source image. The feasibility of the neutron imaging system was verified.
Penumbral imaging and numerical evaluation of large area source neutron imaging system
Institute of Scientific and Technical Information of China (English)
WU YueLei; HU HuaSi; ZHANG BoPing; LI LinBo; CHEN Da; SHAN Qing; ZHU Jie
2009-01-01
The fusion neutron penumbral imaging system Monte Carlo model was established. The transfer func-tions of the two discrete units in the neutron source were obtained in two situations: Imaging in geo-metrical near-optical and real situation. The spatial resolutions of the imaging system in two situations were evaluated and compared. The penumbral images of four units in the source were obtained by means of 2-dimensional (2D) convolution and Monte Carlo simulation. The penumbral images were reconstructed with the same method of filter. The same results were confirmed. The encoding essence of penumbral imaging was revealed. With MCNP(Monte Carlo N-particle) simulation, the neutron pen-umbral images of the large area source (200 μm×200 μm) on scintillation fiber array were obtained. The improved Wiener filter method was used to reconstruct the penumbral image and the source image was obtained. The results agree with the preset neutron source image. The feasibility of the neutron imaging system was verified.
SU-E-J-144: Low Activity Studies of Carbon 11 Activation Via GATE Monte Carlo
International Nuclear Information System (INIS)
Purpose: To investigate the behavior of a Monte Carlo simulation code with low levels of activity (∼1,000Bq). Such activity levels are expected from phantoms and patients activated via a proton therapy beam. Methods: Three different ranges for a therapeutic proton radiation beam were examined in a Monte Carlo simulation code: 13.5, 17.0 and 21.0cm. For each range, the decay of an equivalent length11C source and additional sources of length plus or minus one cm was studied in a benchmark PET simulation for activities of 1000, 2000 and 3000Bq. The ranges were chosen to coincide with a previous activation study, and the activities were chosen to coincide with the approximate level of isotope creation expected in a phantom or patient irradiated by a therapeutic proton beam. The GATE 7.0 simulation was completed on a cluster node, running Scientific Linux Carbon 6 (Red Hat©). The resulting Monte Carlo data were investigated with the ROOT (CERN) analysis tool. The half-life of11C was extracted via a histogram fit to the number of simulated PET events vs. time. Results: The average slope of the deviation of the extracted carbon half life from the expected/nominal value vs. activity showed a generally positive value. This was unexpected, as the deviation should, in principal, decrease with increased activity and lower statistical uncertainty. Conclusion: For activity levels on the order of 1,000Bq, the behavior of a benchmark PET test was somewhat unexpected. It is important to be aware of the limitations of low activity PET images, and low activity Monte Carlo simulations. This work was funded in part by the Philips corporation
Carlos Gardel, el patrimonio que sonrie
Directory of Open Access Journals (Sweden)
María Julia Carozzi
2003-10-01
Full Text Available Analizando los modos en que los porteños recordaron a Carlos Gardel en el mes del 68 aniversario de su muerte, el artículo intenta dar cuenta de una de las formas en que los habitantes de la ciudad de Buenos Aires conciben aquello que es memorable, identifican aquello en que se reconocen como porteños y singularizan aquello frente a lo cual experimentan sentimientos de pertenencia colectiva. El trabajo señala la centralidad que el milagro, la mimesis y el contacto directo con su cuerpo desempeñan en la preservación de la memoria de Gardel, quien encarna tanto al tango como a su éxito en el mundo. El caso de Gardel se presenta como un ejemplo de la organización de la memoria y la identidad de los porteños en particular y los argentinos en general en torno a personas reales a quienes se les asigna un valor extraordinario. Al sostener su profundo enraizamiento en cuerpos humanos concretos, tornan problemática la adopción local de los conceptos globalmente aceptados de patrimonio histórico y cultural.The article analyses one of the ways in which the inhabitants of Buenos Aires conceive that which is memorable, source of positive identification and origin of feelings of communitas by examining their commemoration of the 68th anniversary of the death of Carlos Gardel. It underscores the central role that miracles, mimesis and direct bodily contact play in the preservation of the memory of the star, who incarnates both the tango and its world-wide success. The case of Gardel is presented as an example of the centrality that real persons of extraordinary value have in the organization of local memory and collective identity. Since they are embedded in concrete human bodies, they reveal problems in the local adoption of globally accepted concepts of historical and cultural heritage.
Recent advances and future prospects for Monte Carlo
Energy Technology Data Exchange (ETDEWEB)
Brown, Forrest B [Los Alamos National Laboratory
2010-01-01
The history of Monte Carlo methods is closely linked to that of computers: The first known Monte Carlo program was written in 1947 for the ENIAC; a pre-release of the first Fortran compiler was used for Monte Carlo In 1957; Monte Carlo codes were adapted to vector computers in the 1980s, clusters and parallel computers in the 1990s, and teraflop systems in the 2000s. Recent advances include hierarchical parallelism, combining threaded calculations on multicore processors with message-passing among different nodes. With the advances In computmg, Monte Carlo codes have evolved with new capabilities and new ways of use. Production codes such as MCNP, MVP, MONK, TRIPOLI and SCALE are now 20-30 years old (or more) and are very rich in advanced featUres. The former 'method of last resort' has now become the first choice for many applications. Calculations are now routinely performed on office computers, not just on supercomputers. Current research and development efforts are investigating the use of Monte Carlo methods on FPGAs. GPUs, and many-core processors. Other far-reaching research is exploring ways to adapt Monte Carlo methods to future exaflop systems that may have 1M or more concurrent computational processes.
Development of Monte Carlo machine for particle transport problem
International Nuclear Information System (INIS)
Monte Carlo machine, Monte-4 has been developed to realize high performance computing of Monte Carlo codes for particle transport. The calculation for particle tracking in a complex geometry requires (1) classification of particles by the region types using multi-way conditional branches, and (2) determination whether intersections of particle paths with surfaces of the regions are on the boundaries of the regions or not, using nests of conditional branches. However, these procedures require scalar operations or unusual vector operations. Thus the speedup ratios have been low, i.e. nearly two times, in vector processing of Monte Carlo codes for particle transport on conventional vector processors. The Monte Carlo machine Monte-4 has been equipped with the special hardware called Monte Carlo pipelines to process these procedures with high performance. Additionally Monte-4 has been equipped with enhanced load/store pipelines to realize fast transfer of indirectly addressed data for the purpose of resolving imbalances between the performance of data transfers and arithmetic operations in vector processing of Monte Carlo codes on conventional vector processors. Finally, Monte-4 has a parallel processing capability with four processors to multiply the performance of vector processing. We have evaluated the effective performance of Monte-4 using production-level Monte Carlo codes such as vectorized KENO-IV and MCNP. In the performance evaluation, nearly ten times speedup ratios have been obtained, compared with scalar processing of the original codes. (author)