Energy Technology Data Exchange (ETDEWEB)
Kuruvilla Verghese
2002-04-05
This report summarizes the highlights of the research performed under the 1-year NEER grant from the Department of Energy. The primary goal of this study was to investigate the effects of certain design changes in the Fisher Senoscan mammography system, and of the degree of breast compression, on the discernability of microcalcifications in calcification clusters often observed in mammograms with tumor lesions. The most important design change that one can contemplate in a digital mammography system to improve resolution of calcifications is the reduction of the pixel dimensions of the digital detector. Breast compression is painful to the patient and is thought to be a deterrent to women seeking routine mammographic screening. Calcification clusters often serve as markers (indicators) of breast cancer.
Monte Carlo simulations of medical imaging modalities
Energy Technology Data Exchange (ETDEWEB)
Estes, G.P. [Los Alamos National Lab., NM (United States)]
1998-09-01
Because continuous-energy Monte Carlo radiation transport calculations can be nearly exact simulations of physical reality (within data limitations, geometric approximations, transport algorithms, etc.), it follows that one should be able to closely approximate the results of many experiments from first-principles computations. This line of reasoning has led to various MCNP studies that involve simulations of medical imaging modalities and other visualization methods such as radiography, Anger camera, computerized tomography (CT) scans, and SABRINA particle track visualization. It is the intent of this paper to summarize some of these imaging simulations in the hope of stimulating further work, especially as computer power increases. Improved interpretation and prediction of medical images should ultimately lead to enhanced medical treatments. It is also reasonable to assume that such computations could be used to design new or more effective imaging instruments.
Monte Carlo studies for medical imaging detector optimization
Fois, G. R.; Cisbani, E.; Garibaldi, F.
2016-02-01
This work reports on Monte Carlo optimization studies of detection systems for Molecular Breast Imaging with radionuclides and for Bremsstrahlung Imaging in nuclear medicine. Molecular Breast Imaging requires competing detector performances: high efficiency and high spatial resolution; in this direction, an innovative device has been proposed which combines images from two different, and somewhat complementary, detectors at opposite sides of the breast. The dual-detector design allows for spot compression and significantly improves the performance of the overall system if all components are well tuned and the layout and processing are carefully optimized; in this direction the Monte Carlo simulation represents a valuable tool. In recent years, the potential of Bremsstrahlung Imaging in internal radiotherapy (with beta-radiopharmaceuticals) has clearly emerged; Bremsstrahlung Imaging is currently performed with existing detectors generally used for single-photon radioisotopes. We are evaluating the possibility of adapting an existing compact gamma camera and optimizing its performance by Monte Carlo for Bremsstrahlung imaging of the photons emitted by the beta- decay of 90 Y.
Reconstruction of Human Monte Carlo Geometry from Segmented Images
Zhao, Kai; Cheng, Mengyun; Fan, Yanchang; Wang, Wen; Long, Pengcheng; Wu, Yican
2014-06-01
Human computational phantoms have been used extensively for scientific experimental analysis and experimental simulation. This article presents a method for human geometry reconstruction from a series of segmented images of a Chinese visible human dataset. The phantom geometry can describe the detailed structure of an organ and can be converted into the input file of Monte Carlo codes for dose calculation. A whole-body computational phantom of a Chinese adult female, named Rad-HUMAN and containing about 28.8 billion voxels, has been established by the FDS Team. For convenient processing, different organs in the images were segmented with different RGB colors and the voxels were assigned positions within the dataset. For refinement, the positions were first sampled. Although the large numbers of voxels inside an organ are three-dimensionally adjacent, there was no thorough merging method to reduce the number of cells needed to describe the organ. In this study, the voxels on the organ surface were taken into consideration in the merging, which produces fewer cells for the organs. At the same time, an index-based sorting algorithm was put forward to increase the merging speed. Finally, Rad-HUMAN, which includes a total of 46 organs and tissues, was described by cuboids in the Monte Carlo geometry for the simulation. The Monte Carlo geometry was constructed directly from the segmented images and the voxels were merged exhaustively. Each organ geometry model was constructed without ambiguity or self-crossing, and its geometry information represents the accurate appearance and precise interior structure of the organs. The constructed geometry largely retains the original shape of the organs and can easily be written into the input files of different Monte Carlo codes such as MCNP. Its general applicability was demonstrated and its high performance was experimentally verified.
Robust Image Denoising using a Virtual Flash Image for Monte Carlo Ray Tracing
DEFF Research Database (Denmark)
Moon, Bochang; Jun, Jong Yun; Lee, JongHyeob
2013-01-01
We propose an efficient and robust image-space denoising method for noisy images generated by Monte Carlo ray tracing methods. Our method is based on two new concepts: virtual flash images and homogeneous pixels. Inspired by recent developments in flash photography, virtual flash images emulate...
Monte Carlo simulation of PET images for injection doseoptimization
Czech Academy of Sciences Publication Activity Database
Boldyš, Jiří; Dvořák, Jiří; Skopalová, M.; Bělohlávek, O.
2013-01-01
Vol. 29, No. 9 (2013), pp. 988-999. ISSN 2040-7939. R&D Projects: GA MŠk 1M0572. Institutional support: RVO:67985556. Keywords: positron emission tomography * Monte Carlo simulation * biological system modeling * image quality. Subject RIV: FD - Oncology; Hematology. Impact factor: 1.542, year: 2013. http://library.utia.cas.cz/separaty/2013/ZOI/boldys-0397175.pdf
Image reconstruction using Monte Carlo simulation and artificial neural networks
International Nuclear Information System (INIS)
Emert, F.; Missimner, J.; Blass, W.; Rodriguez, A.
1997-01-01
PET data sets are subject to two types of distortions during acquisition: the imperfect response of the scanner and attenuation and scattering in the active distribution. In addition, the reconstruction of voxel images from the line projections composing a data set can introduce artifacts. Monte Carlo simulation provides a means for modeling the distortions and artificial neural networks a method for correcting for them as well as minimizing artifacts. (author) figs., tab., refs
Monte Carlo modeling of human tooth optical coherence tomography imaging
International Nuclear Information System (INIS)
Shi, Boya; Meng, Zhuo; Wang, Longzhi; Liu, Tiegen
2013-01-01
We present a Monte Carlo model for optical coherence tomography (OCT) imaging of human tooth. The model is implemented by combining the simulation of a Gaussian beam with the simulation of photon propagation in a two-layer human tooth model with non-parallel surfaces through a Monte Carlo method. The geometry and the optical parameters of the human tooth model are chosen on the basis of the experimental OCT images. The results show that the simulated OCT images are qualitatively consistent with the experimental ones. Using the model, we demonstrate the following: firstly, two types of photons contribute to the information of morphological features and noise in the OCT image of a human tooth, respectively. Secondly, the critical imaging depth of the tooth model is obtained, and it is found to decrease significantly with increasing mineral loss, simulated as different enamel scattering coefficients. Finally, the best focus position is located below and close to the dental surface by analysis of the effect of focus positions on the OCT signal and critical imaging depth. We anticipate that this modeling will become a powerful and accurate tool for a preliminary numerical study of the OCT technique on diseases of dental hard tissue in human teeth. (paper)
Monte Carlo modeling of human tooth optical coherence tomography imaging
Shi, Boya; Meng, Zhuo; Wang, Longzhi; Liu, Tiegen
2013-07-01
We present a Monte Carlo model for optical coherence tomography (OCT) imaging of human tooth. The model is implemented by combining the simulation of a Gaussian beam with the simulation of photon propagation in a two-layer human tooth model with non-parallel surfaces through a Monte Carlo method. The geometry and the optical parameters of the human tooth model are chosen on the basis of the experimental OCT images. The results show that the simulated OCT images are qualitatively consistent with the experimental ones. Using the model, we demonstrate the following: firstly, two types of photons contribute to the information of morphological features and noise in the OCT image of a human tooth, respectively. Secondly, the critical imaging depth of the tooth model is obtained, and it is found to decrease significantly with increasing mineral loss, simulated as different enamel scattering coefficients. Finally, the best focus position is located below and close to the dental surface by analysis of the effect of focus positions on the OCT signal and critical imaging depth. We anticipate that this modeling will become a powerful and accurate tool for a preliminary numerical study of the OCT technique on diseases of dental hard tissue in human teeth.
Microscopic imaging through turbid media Monte Carlo modeling and applications
Gu, Min; Deng, Xiaoyuan
2015-01-01
This book provides a systematic introduction to the principles of microscopic imaging through tissue-like turbid media in terms of Monte Carlo simulation. It describes various gating mechanisms based on the physical differences between unscattered and scattered photons, and methods for microscopic image reconstruction using the concept of the effective point spread function. Imaging an object embedded in a turbid medium is a challenging problem in physics as well as in biophotonics. A turbid medium surrounding an object under inspection causes multiple scattering, which degrades the contrast, resolution, and signal-to-noise ratio. Biological tissues are typically turbid media. Microscopic imaging through a tissue-like turbid medium can provide higher resolution than transillumination imaging, in which no objective is used. This book serves as a valuable reference for engineers and scientists working on microscopy of tissue-like turbid media.
Computational radiology and imaging with the MCNP Monte Carlo code
Energy Technology Data Exchange (ETDEWEB)
Estes, G.P.; Taylor, W.M.
1995-05-01
MCNP, a 3D coupled neutron/photon/electron Monte Carlo radiation transport code, is currently used in medical applications such as cancer radiation treatment planning, interpretation of diagnostic radiation images, and treatment beam optimization. This paper will discuss MCNP's current uses and capabilities, as well as envisioned improvements that would further enhance MCNP's role in computational medicine. It will be demonstrated that the methodology exists to simulate medical images (e.g. SPECT). Techniques will be discussed that would enable the construction of 3D computational geometry models of individual patients for use in patient-specific studies that would improve the quality of care for patients.
Markov chain Monte Carlo sampling based terahertz holography image denoising.
Chen, Guanghao; Li, Qi
2015-05-10
Terahertz digital holography has attracted much attention in recent years. This technology combines the strong transmittance of terahertz radiation with the unique features of digital holography. Nonetheless, the low clarity of the captured images has hampered the popularization of this imaging technique. In this paper, we apply a digital image denoising technique to our multiframe superposed images. The noise suppression model is formulated as Bayesian least-squares estimation and is solved with Markov chain Monte Carlo (MCMC) sampling. In this algorithm, a weighted mean filter with a Gaussian kernel is first applied to the noisy image, and then, by nonlinear contrast transform, the contrast of the image is restored to its former level. By randomly walking on the preprocessed image, the MCMC-based filter keeps collecting samples, assigning them weights by similarity assessment, and constructs multiple sample sequences. Finally, these sequences are used to estimate the value of each pixel. Our algorithm shares some good qualities with nonlocal means filtering and the algorithm based on conditional sampling proposed by Wong et al. [Opt. Express 18, 8338 (2010), doi:10.1364/OE.18.008338], such as good uniformity, and moreover reveals better performance in structure preservation, as shown in numerical comparisons using the structural similarity index measurement and the peak signal-to-noise ratio.
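For readers who want a concrete feel for this kind of sampling-based denoising, the following Python sketch is a heavily simplified toy in the spirit of the abstract (a Gaussian pre-filter, then a per-pixel random walk whose samples are weighted by patch similarity); the function names, step size, patch radius and bandwidth are illustrative assumptions, not the published algorithm.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def patch(img, y, x, r):
    return img[y - r:y + r + 1, x - r:x + r + 1]

def mcmc_denoise(noisy, n_samples=200, r=2, h=0.1, step=3, seed=0):
    """Toy sampling-based nonlocal denoiser: a Gaussian-smoothed guide image
    is built first, then for each pixel a random walk collects samples that
    are weighted by patch similarity and averaged (border pixels are left
    unprocessed for brevity)."""
    rng = np.random.default_rng(seed)
    guide = gaussian_filter(noisy, sigma=1.0)   # weighted-mean pre-filter
    H, W = noisy.shape
    out = noisy.copy()
    for y in range(r, H - r):
        for x in range(r, W - r):
            p0 = patch(guide, y, x, r)
            cy, cx, wsum, vsum = y, x, 0.0, 0.0
            for _ in range(n_samples):
                cy = int(np.clip(cy + rng.integers(-step, step + 1), r, H - r - 1))
                cx = int(np.clip(cx + rng.integers(-step, step + 1), r, W - r - 1))
                d2 = np.mean((patch(guide, cy, cx, r) - p0) ** 2)
                w = np.exp(-d2 / (h * h))       # similarity weight of this sample
                wsum += w
                vsum += w * noisy[cy, cx]
            out[y, x] = vsum / max(wsum, 1e-12)
    return out

# Quick check on a synthetic square: the sampled estimate should lower the MSE.
rng = np.random.default_rng(1)
clean = np.zeros((48, 48)); clean[16:32, 16:32] = 1.0
noisy = clean + rng.normal(0.0, 0.3, clean.shape)
print(np.mean((noisy - clean) ** 2), np.mean((mcmc_denoise(noisy) - clean) ** 2))
```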
Image based Monte Carlo modeling for computational phantom
International Nuclear Information System (INIS)
Cheng, M.; Wang, W.; Zhao, K.; Fan, Y.; Long, P.; Wu, Y.
2013-01-01
Full text of the publication follows. The evaluation of the effects of ionizing radiation and of the risk of radiation exposure on the human body has become one of the most important issues in the radiation protection and radiotherapy fields, helping to avoid unnecessary radiation and decrease harm to the human body. In order to accurately evaluate the dose to the human body, it is necessary to construct more realistic computational phantoms. However, manual description and verification of the models for Monte Carlo (MC) simulation are very tedious, error-prone and time-consuming. In addition, it is difficult to locate and fix geometry errors, and difficult to describe material information and assign it to cells. MCAM (CAD/Image-based Automatic Modeling Program for Neutronics and Radiation Transport Simulation) was developed as an interface program to achieve both CAD- and image-based automatic modeling. The advanced version (Version 6) of MCAM can achieve automatic conversion from CT/segmented sectioned images to computational phantoms such as MCNP models. The image-based automatic modeling program (MCAM 6.0) has been tested on several medical images and sectioned images, and it has been applied in the construction of Rad-HUMAN. Following manual segmentation and 3D reconstruction, a whole-body computational phantom of a Chinese adult female called Rad-HUMAN was created using MCAM 6.0 from sectioned images of a Chinese visible human dataset. Rad-HUMAN contains 46 organs/tissues, which faithfully represent the average anatomical characteristics of the Chinese female. The dose conversion coefficients (Dt/Ka) from kerma free-in-air to absorbed dose of Rad-HUMAN were calculated. Rad-HUMAN can be applied to predict and evaluate dose distributions in a Treatment Planning System (TPS), as well as radiation exposure of the human body in radiation protection. (authors)
Image based Monte Carlo Modeling for Computational Phantom
Cheng, Mengyun; Wang, Wen; Zhao, Kai; Fan, Yanchang; Long, Pengcheng; Wu, Yican
2014-06-01
The evaluation of the effects of ionizing radiation and of the risk of radiation exposure on the human body has become one of the most important issues in the radiation protection and radiotherapy fields, helping to avoid unnecessary radiation and decrease harm to the human body. In order to accurately evaluate the dose to the human body, it is necessary to construct more realistic computational phantoms. However, manual description and verification of the models for Monte Carlo (MC) simulation are very tedious, error-prone and time-consuming. In addition, it is difficult to locate and fix geometry errors, and difficult to describe material information and assign it to cells. MCAM (CAD/Image-based Automatic Modeling Program for Neutronics and Radiation Transport Simulation) was developed by the FDS Team (Advanced Nuclear Energy Research Team, http://www.fds.org.cn) as an interface program to achieve both CAD- and image-based automatic modeling. The advanced version (Version 6) of MCAM can achieve automatic conversion from CT/segmented sectioned images to computational phantoms such as MCNP models. The image-based automatic modeling program (MCAM 6.0) has been tested on several medical images and sectioned images, and it has been applied in the construction of Rad-HUMAN. Following manual segmentation and 3D reconstruction, a whole-body computational phantom of a Chinese adult female called Rad-HUMAN was created using MCAM 6.0 from sectioned images of a Chinese visible human dataset. Rad-HUMAN contains 46 organs/tissues, which faithfully represent the average anatomical characteristics of the Chinese female. The dose conversion coefficients (Dt/Ka) from kerma free-in-air to absorbed dose of Rad-HUMAN were calculated. Rad-HUMAN can be applied to predict and evaluate dose distributions in a Treatment Planning System (TPS), as well as radiation exposure of the human body in radiation protection.
Monte Carlo simulation of breast imaging using synchrotron radiation
Energy Technology Data Exchange (ETDEWEB)
Fitousi, N. T.; Delis, H.; Panayiotakis, G. [Department of Medical Physics, Faculty of Medicine, University of Patras, 26504 Patras (Greece)
2012-04-15
Purpose: Synchrotron radiation (SR), being the brightest artificial source of x-rays with a very promising geometry, has raised scientific expectations that it could be used for breast imaging with optimized results. The "in situ" evaluation of this technique is difficult to perform, mostly due to the limited number of SR facilities available worldwide. In this study, a simulation model for SR breast imaging was developed, based on Monte Carlo simulation techniques, and validated using data acquired in the SYRMEP beamline of the Elettra facility in Trieste, Italy. Furthermore, primary results concerning the performance of SR were derived. Methods: The developed model includes the exact setup of the SR beamline, considering that the x-ray source is located almost 23 m from the slit, while the photon energy was considered to originate from a very narrow Gaussian spectrum. Breast phantoms, made of Perspex and filled with air cavities, were irradiated with energies in the range of 16-28 keV. The model included a Gd2O2S detector with the same characteristics as the one available in the SYRMEP beamline. Following the development and validation of the model, experiments were performed in order to evaluate the contrast resolution of SR. A phantom made of adipose tissue and filled with inhomogeneities of several compositions and sizes was designed and utilized to simulate irradiation under conventional mammography and SR conditions. Results: The validation results of the model showed excellent agreement with the experimental data, with the correlation for contrast being 0.996. Significant differences only appeared at the edges of the phantom, where phase effects occur. The initial evaluation experiments revealed that SR shows very good performance in terms of the image quality indices utilized, namely subject contrast and contrast-to-noise ratio. The response of subject contrast to energy is monotonic; however, this does not stand for
TH-E-18A-01: Developments in Monte Carlo Methods for Medical Imaging
International Nuclear Information System (INIS)
Badal, A; Zbijewski, W; Bolch, W; Sechopoulos, I
2014-01-01
Monte Carlo simulation methods are widely used in medical physics research and are starting to be implemented in clinical applications such as radiation therapy planning systems. Monte Carlo simulations offer the capability to accurately estimate quantities of interest that are challenging to measure experimentally while taking into account the realistic anatomy of an individual patient. Traditionally, practical application of Monte Carlo simulation codes in diagnostic imaging was limited by the need for large computational resources or long execution times. However, recent advancements in high-performance computing hardware, combined with a new generation of Monte Carlo simulation algorithms and novel postprocessing methods, are allowing for the computation of relevant imaging parameters of interest such as patient organ doses and scatter-to-primary ratios in radiographic projections in just a few seconds using affordable computational resources. Programmable Graphics Processing Units (GPUs), for example, provide a convenient, affordable platform for parallelized Monte Carlo executions that yield simulation times on the order of 10^7 x-rays/s. Even with GPU acceleration, however, Monte Carlo simulation times can be prohibitive for routine clinical practice. To reduce simulation times further, variance reduction techniques can be used to alter the probabilistic models underlying the x-ray tracking process, resulting in lower variance in the results without biasing the estimates. Other complementary strategies for further reductions in computation time are denoising of the Monte Carlo estimates and estimating (scoring) the quantity of interest at a sparse set of sampling locations (e.g. at a small number of detector pixels in a scatter simulation) followed by interpolation. Beyond reduction of the computational resources required for performing Monte Carlo simulations in medical imaging, the use of accurate representations of patient anatomy is crucial to the virtual
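The sparse-scoring-plus-interpolation strategy mentioned at the end of this abstract can be illustrated with a short sketch: a smooth scatter field is assumed to have been scored by the Monte Carlo run on a coarse grid of detector pixels and is then interpolated to the full detector resolution. The grid sizes and the synthetic scatter field below are placeholders, not values from the abstract.

```python
import numpy as np
from scipy.interpolate import RectBivariateSpline

# Pretend the Monte Carlo scatter estimate was scored only on a sparse
# 16 x 16 grid of detector pixels (a synthetic smooth field is used here).
coarse_y = np.linspace(0.0, 1.0, 16)
coarse_x = np.linspace(0.0, 1.0, 16)
yy, xx = np.meshgrid(coarse_y, coarse_x, indexing="ij")
scatter_coarse = np.exp(-((yy - 0.5) ** 2 + (xx - 0.5) ** 2) / 0.1)

# Because the scatter signal varies slowly across the detector, a spline
# interpolation up to the full 512 x 512 pixel grid is usually sufficient.
spline = RectBivariateSpline(coarse_y, coarse_x, scatter_coarse)
scatter_full = spline(np.linspace(0.0, 1.0, 512), np.linspace(0.0, 1.0, 512))
print(scatter_full.shape)  # (512, 512)
```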
Monte Carlo Simulations of Background Spectra in Integral Imager Detectors
Armstrong, T. W.; Colborn, B. L.; Dietz, K. L.; Ramsey, B. D.; Weisskopf, M. C.
1998-01-01
Predictions of the expected gamma-ray backgrounds in the ISGRI (CdTe) and PiCsIT (CsI) detectors on INTEGRAL due to cosmic-ray interactions and the diffuse gamma-ray background have been made using a coupled set of Monte Carlo radiation transport codes (HETC, FLUKA, EGS4, and MORSE) and a detailed, 3-D mass model of the spacecraft and detector assemblies. The simulations include both the prompt background component from induced hadronic and electromagnetic cascades and the delayed component due to emissions from induced radioactivity. Background spectra have been obtained with and without the use of active (BGO) shielding and charged particle rejection to evaluate the effectiveness of anticoincidence counting on background rejection.
TH-E-18A-01: Developments in Monte Carlo Methods for Medical Imaging
Energy Technology Data Exchange (ETDEWEB)
Badal, A [U.S. Food and Drug Administration (CDRH/OSEL), Silver Spring, MD (United States); Zbijewski, W [Johns Hopkins University, Baltimore, MD (United States); Bolch, W [University of Florida, Gainesville, FL (United States); Sechopoulos, I [Emory University, Atlanta, GA (United States)
2014-06-15
Monte Carlo simulation methods are widely used in medical physics research and are starting to be implemented in clinical applications such as radiation therapy planning systems. Monte Carlo simulations offer the capability to accurately estimate quantities of interest that are challenging to measure experimentally while taking into account the realistic anatomy of an individual patient. Traditionally, practical application of Monte Carlo simulation codes in diagnostic imaging was limited by the need for large computational resources or long execution times. However, recent advancements in high-performance computing hardware, combined with a new generation of Monte Carlo simulation algorithms and novel postprocessing methods, are allowing for the computation of relevant imaging parameters of interest such as patient organ doses and scatter-to-primary ratios in radiographic projections in just a few seconds using affordable computational resources. Programmable Graphics Processing Units (GPUs), for example, provide a convenient, affordable platform for parallelized Monte Carlo executions that yield simulation times on the order of 10^7 x-rays/s. Even with GPU acceleration, however, Monte Carlo simulation times can be prohibitive for routine clinical practice. To reduce simulation times further, variance reduction techniques can be used to alter the probabilistic models underlying the x-ray tracking process, resulting in lower variance in the results without biasing the estimates. Other complementary strategies for further reductions in computation time are denoising of the Monte Carlo estimates and estimating (scoring) the quantity of interest at a sparse set of sampling locations (e.g. at a small number of detector pixels in a scatter simulation) followed by interpolation. Beyond reduction of the computational resources required for performing Monte Carlo simulations in medical imaging, the use of accurate representations of patient anatomy is crucial to the
Monte Carlo SURE-based parameter selection for parallel magnetic resonance imaging reconstruction.
Weller, Daniel S; Ramani, Sathish; Nielsen, Jon-Fredrik; Fessler, Jeffrey A
2014-05-01
Regularizing parallel magnetic resonance imaging (MRI) reconstruction significantly improves image quality but requires tuning parameter selection. We propose a Monte Carlo method for automatic parameter selection based on Stein's unbiased risk estimate that minimizes the multichannel k-space mean squared error (MSE). We automatically tune parameters for image reconstruction methods that preserve the undersampled acquired data, which cannot be accomplished using existing techniques. We derive a weighted MSE criterion appropriate for data-preserving regularized parallel imaging reconstruction and the corresponding weighted Stein's unbiased risk estimate. We describe a Monte Carlo approximation of the weighted Stein's unbiased risk estimate that uses two evaluations of the reconstruction method per candidate parameter value. We reconstruct images using the denoising sparse images from GRAPPA using the nullspace method (DESIGN) and L1 iterative self-consistent parallel imaging (L1-SPIRiT). We validate the Monte Carlo Stein's unbiased risk estimate against the weighted MSE. We select the regularization parameter using these methods for various noise levels and undersampling factors and compare the results to those using MSE-optimal parameters. Our method selects nearly MSE-optimal regularization parameters for both DESIGN and L1-SPIRiT over a range of noise levels and undersampling factors. The proposed method automatically provides nearly MSE-optimal choices of regularization parameters for data-preserving nonlinear parallel MRI reconstruction methods. Copyright © 2013 Wiley Periodicals, Inc.
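The Monte Carlo divergence trick that makes such a SURE computation practical can be sketched generically as follows; this is the standard (unweighted) SURE for white Gaussian noise with a toy soft-thresholding "reconstruction", not the weighted k-space criterion derived in the paper, and all names and values are illustrative.

```python
import numpy as np

def mc_sure(y, denoise, sigma, eps=1e-3, rng=None):
    """Monte Carlo estimate of Stein's unbiased risk estimate (SURE) for a
    denoiser f applied to y = x + n with n ~ N(0, sigma^2 I).  A single
    random probe approximates the divergence of f, so the reconstruction is
    evaluated twice per candidate parameter value."""
    rng = np.random.default_rng() if rng is None else rng
    n = y.size
    f_y = denoise(y)
    b = rng.choice([-1.0, 1.0], size=y.shape)              # Rademacher probe
    div = np.sum(b * (denoise(y + eps * b) - f_y)) / eps   # ~ trace of the Jacobian
    return np.sum((y - f_y) ** 2) / n - sigma ** 2 + 2.0 * sigma ** 2 * div / n

def soft_threshold(lam):
    return lambda v: np.sign(v) * np.maximum(np.abs(v) - lam, 0.0)

# Pick the threshold that minimizes MC-SURE for a sparse spike signal in noise.
sigma = 0.1
rng = np.random.default_rng(0)
y = rng.normal(0.0, sigma, 4096) + 1.0 * (np.arange(4096) % 64 == 0)
lams = np.linspace(0.0, 0.5, 26)
sures = [mc_sure(y, soft_threshold(l), sigma, rng=rng) for l in lams]
print("lambda minimizing MC-SURE:", lams[int(np.argmin(sures))])
```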
Adaptable three-dimensional Monte Carlo modeling of imaged blood vessels in skin
Pfefer, T. Joshua; Barton, Jennifer K.; Chan, Eric K.; Ducros, Mathieu G.; Sorg, Brian S.; Milner, Thomas E.; Nelson, J. Stuart; Welch, Ashley J.
1997-06-01
In order to reach a higher level of accuracy in simulation of port wine stain treatment, we propose to discard the typical layered geometry and cylindrical blood vessel assumptions made in optical models and use imaging techniques to define actual tissue geometry. Two main additions to the typical 3D, weighted photon, variable step size Monte Carlo routine were necessary to achieve this goal. First, optical low coherence reflectometry (OLCR) images of rat skin were used to specify a 3D material array, with each entry assigned a label to represent the type of tissue in that particular voxel. Second, the Monte Carlo algorithm was altered so that when a photon crosses into a new voxel, the remaining path length is recalculated using the new optical properties, as specified by the material array. The model has shown good agreement with data from the literature. Monte Carlo simulations using OLCR images of asymmetrically curved blood vessels show various effects such as shading, scattering-induced peaks at vessel surfaces, and directionality-induced gradients in energy deposition. In conclusion, this augmentation of the Monte Carlo method can accurately simulate light transport for a wide variety of nonhomogeneous tissue geometries.
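The voxel-crossing rule described above (nothing is resampled, the remaining free path is simply rescaled by the ratio of attenuation coefficients when a new voxel is entered) can be sketched as follows. The grid, attenuation values and helper names are illustrative, and the weighted-photon scattering and absorption bookkeeping of a full skin model is omitted.

```python
import numpy as np

def distance_to_voxel_boundary(pos, d, voxel_size):
    """Distance along unit direction d from pos to the nearest voxel face."""
    t = np.full(3, np.inf)
    for k in range(3):
        if d[k] > 0:
            t[k] = ((np.floor(pos[k] / voxel_size) + 1) * voxel_size - pos[k]) / d[k]
        elif d[k] < 0:
            t[k] = (np.floor(pos[k] / voxel_size) * voxel_size - pos[k]) / d[k]
    return max(t.min(), 1e-9)        # avoid getting stuck exactly on a face

def propagate(pos, d, mu_t, voxel_size, rng):
    """Move a photon from pos along d until the sampled free path is used up,
    rescaling the remaining path whenever a voxel with a different attenuation
    coefficient is entered.  Returns the interaction position, or None if the
    photon leaves the material grid."""
    idx = tuple((pos // voxel_size).astype(int))
    s = -np.log(rng.random()) / mu_t[idx]        # free path in the current medium
    while True:
        t_b = distance_to_voxel_boundary(pos, d, voxel_size)
        if s <= t_b:                             # interaction inside this voxel
            return pos + s * d
        pos = pos + (t_b + 1e-9) * d             # step across into the next voxel
        new_idx = tuple((pos // voxel_size).astype(int))
        if any(i < 0 or i >= n for i, n in zip(new_idx, mu_t.shape)):
            return None                          # photon escaped the volume
        s = (s - t_b) * mu_t[idx] / mu_t[new_idx]   # recalculate remaining path
        idx = new_idx

rng = np.random.default_rng(1)
mu = np.full((50, 50, 50), 1.0)                  # background tissue, mm^-1 (illustrative)
mu[20:30, 20:30, 20:30] = 20.0                   # a strongly attenuating "vessel"
hit = propagate(np.array([0.5, 25.0, 25.0]), np.array([1.0, 0.0, 0.0]), mu, 1.0, rng)
print("interaction at:", hit)
```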
Monte Carlo Radiative Transfer Modeling of Lightning Observed in Galileo Images of Jupiter
Dyudine, U. A.; Ingersoll, Andrew P.
2002-01-01
We study lightning on Jupiter and the clouds illuminated by the lightning using images taken by the Galileo orbiter. The Galileo images have a resolution of 25 km/pixel and are able to resolve the shape of the single lightning spots in the images, which have full widths at half the maximum intensity in the range of 90-160 km. We compare the measured lightning flash images with simulated images produced by our 3-D Monte Carlo light-scattering model. The model calculates Monte Carlo scattering of photons in a 3-D opacity distribution. During each scattering event, light is partially absorbed. The new direction of the photon after scattering is chosen according to a Henyey-Greenstein phase function. An image from each direction is produced by accumulating photons emerging from the cloud in a small range (bins) of emission angles. Lightning bolts are modeled either as points or vertical lines. Our results suggest that some of the observed scattering patterns are produced in a 3-D cloud rather than in a plane-parallel cloud layer. Lightning is estimated to occur at least as deep as the bottom of the expected water cloud. For the six cases studied, we find that the clouds above the lightning are optically thick (tau > 5). Jovian flashes are more regular and circular than the largest terrestrial flashes observed from space. On Jupiter there is nothing equivalent to the 30-40-km horizontal flashes which are seen on Earth.
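The scattering kernel described above (a Henyey-Greenstein phase function with partial absorption at each event and angular binning of the photons emerging from the cloud top) can be sketched in a few lines; the asymmetry parameter, single-scattering albedo and optical depth below are placeholders, not values fitted to the Galileo data.

```python
import numpy as np

def sample_hg_costheta(g, rng):
    """Sample cos(theta) from the Henyey-Greenstein phase function."""
    xi = rng.random()
    if abs(g) < 1e-6:
        return 2.0 * xi - 1.0                    # isotropic limit
    s = (1.0 - g * g) / (1.0 - g + 2.0 * g * xi)
    return (1.0 + g * g - s * s) / (2.0 * g)

def scatter(d, g, rng):
    """Rotate unit vector d by an HG-sampled polar angle and a uniform azimuth."""
    cost = sample_hg_costheta(g, rng)
    sint = np.sqrt(max(0.0, 1.0 - cost * cost))
    phi = 2.0 * np.pi * rng.random()
    a = np.array([1.0, 0.0, 0.0]) if abs(d[2]) > 0.9 else np.array([0.0, 0.0, 1.0])
    u = np.cross(d, a); u /= np.linalg.norm(u)   # orthonormal frame around d
    v = np.cross(d, u)
    return cost * d + sint * (np.cos(phi) * u + np.sin(phi) * v)

# Plane-parallel toy cloud: photons start at the base moving up; the weight is
# reduced at every scattering (partial absorption) and escaping photons are
# binned by emission angle.
rng = np.random.default_rng(0)
g, albedo, tau_top = 0.85, 0.95, 5.0
bins = np.zeros(9)
for _ in range(20000):
    tau, d, w = 0.0, np.array([0.0, 0.0, 1.0]), 1.0
    while True:
        tau += -np.log(rng.random()) * d[2]      # vertical optical depth travelled
        if tau >= tau_top:                       # escaped through the cloud top
            bins[int(np.arccos(d[2]) / (np.pi / 2) * len(bins))] += w
            break
        if tau < 0.0 or w < 1e-4:                # lost out the bottom / negligible
            break
        w *= albedo                              # partial absorption at the event
        d = scatter(d, g, rng)
print(bins / bins.sum())                         # relative intensity vs. emission angle
```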
International Nuclear Information System (INIS)
Roshan, Hoda Rezaei; Mahmoudian, Babak; Gharepapagh, Esmaeil; Azarm, Ahmadreza; Pirayesh Islamian, Jalil
2016-01-01
Treatment efficacy of radioembolization using Yttrium-90 (90Y) microspheres is assessed by 90Y bremsstrahlung single photon emission computed tomography (SPECT) imaging following radioembolization. The radioisotopic image has the potential of providing a reliable activity map of the 90Y microsphere distribution. One of the main reasons for the poor image quality in 90Y bremsstrahlung SPECT imaging is the continuous and broad energy spectrum of the bremsstrahlung photons. Furthermore, collimator geometry plays an important role in the spatial resolution, sensitivity and image contrast. Due to the relatively poor quality of 90Y bremsstrahlung SPECT images, we intend to optimize the medium-energy (ME) parallel-hole collimator and the energy window. The Siemens e.cam gamma camera equipped with a ME collimator and a voxelized phantom were simulated with the SImulating Medical Imaging Nuclear Detectors (SIMIND) program. We used the SIMIND Monte Carlo program to generate 90Y bremsstrahlung SPECT projections of the digital Jaszczak phantom. The phantom consists of six hot spheres ranging from 9.5 to 31.8 mm in diameter, which are used to evaluate the image contrast. In order to assess the effect of the energy window on the image contrast, three energy windows ranging from 60 to 160 keV, 160 to 400 keV, and 60 to 400 keV were set on the 90Y bremsstrahlung spectrum. As well, the effect of the hole diameter of the ME collimator on the image contrast and bremsstrahlung spectrum was investigated. For fixed collimator and septa thickness values (3.28 cm and 1.14 mm, respectively), a hole diameter range (2.35-3.3 mm) was chosen based on an appropriate balance between spatial resolution and sensitivity. The optimal energy window for 90Y bremsstrahlung SPECT imaging was the extended energy window from 60 to 400 keV. The optimal value of the hole diameter of the ME collimator was found to be 3.3 mm. Geometry of the ME parallel-hole collimator and energy
Monte Carlo modeling of neutron imaging at the SINQ spallation source
International Nuclear Information System (INIS)
Lebenhaft, J.R.; Lehmann, E.H.; Pitcher, E.J.; McKinney, G.W.
2003-01-01
Modeling of the Swiss Spallation Neutron Source (SINQ) has been used to demonstrate the neutron radiography capability of the newly released MPI-version of the MCNPX Monte Carlo code. A detailed MCNPX model was developed of SINQ and its associated neutron transmission radiography (NEUTRA) facility. Preliminary validation of the model was performed by comparing the calculated and measured neutron fluxes in the NEUTRA beam line, and a simulated radiography image was generated for a sample consisting of steel tubes containing different materials. This paper describes the SINQ facility, provides details of the MCNPX model, and presents preliminary results of the neutron imaging. (authors)
Preliminary Monte Carlo Investigation of Using Ir-192 as the Source for Real Time Imaging Purpose.
Shi, Chengyu; Wang, Brian
2017-02-01
The purpose of this study is to investigate the potential use of Ir-192 as the source for real-time imaging during HDR (High Dose Rate) brachytherapy treatment. A phantom measurement was performed to determine the dose outside of the body. The Monte Carlo code EGSnrcMP egs_inprz was used to simulate and calculate the outside-of-the-body x-ray signal for CT reconstruction. Matlab code was developed to reconstruct the Ir-192 source and for 3D visualization in order to assess the reconstructed CT resolution, signal-to-noise ratio, and imaging dose information. The measured dose was 0.67 ± 0.04 cGy, which was comparable to the Monte Carlo simulation result of 0.71 ± 0.20 cGy. The reconstructed source diameter was 1.3 mm, compared with 1.1 mm for the real source. The signal-to-noise ratio was 19.91 dB following de-noising. The source position was within 1 mm of difference between programmed and simulated results. Although the Ir-192 signal is weak for CT imaging, it is possible to use it as a CT imaging x-ray source for HDR treatment localization, verification and dosimetry purposes. Further study is needed for the detailed design of an outside-of-the-body CT-like device for use in brachytherapy imaging.
Novel imaging and quality assurance techniques for ion beam therapy a Monte Carlo study
Rinaldi, I; Jäkel, O; Mairani, A; Parodi, K
2010-01-01
Ion beams exhibit a finite and well defined range in matter together with an “inverted” depth-dose profile, the so-called Bragg peak. These favourable physical properties may enable superior tumour-dose conformality for high precision radiation therapy. On the other hand, they introduce the issue of sensitivity to range uncertainties in ion beam therapy. Although these uncertainties are typically taken into account when planning the treatment, correct delivery of the intended ion beam range has to be assured to prevent undesired underdosage of the tumour or overdosage of critical structures outside the target volume. Therefore, it is necessary to define dedicated Quality Assurance procedures to enable in-vivo range verification before or during therapeutic irradiation. For these purposes, Monte Carlo transport codes are very useful tools to support the development of novel imaging modalities for ion beam therapy. In the present work, we present calculations performed with the FLUKA Monte Carlo code and pr...
Freud, N.; Letang, J.-M.; Babot, D.
2005-10-01
In this paper, we propose a hybrid approach to simulate multiple scattering of photons in objects under X-ray inspection, without recourse to parallel computing and without any approximation sacrificing accuracy. Photon scattering is considered from two points of view: it contributes to X-ray imaging and to the dose absorbed by the patient. The proposed hybrid approach consists of a Monte Carlo stage followed by a deterministic phase, thus taking advantage of the complementarity between these two methods. In the first stage, a set of scattering events occurring in the inspected object is determined by means of classical Monte Carlo simulation. Then this set of scattering events is used to compute the energy imparted to the detector, with a deterministic algorithm based on a "forced detection" scheme. Regarding dose evaluation, we propose to assess separately the energy deposited by direct radiation (using a deterministic algorithm) and by scattered radiation (using our hybrid approach). The results obtained in a test case are compared to those obtained with the Monte Carlo method alone (Geant4 code) and found to be in excellent agreement. The proposed hybrid approach makes it possible to simulate the contribution of each type (Compton or Rayleigh) and order of scattering, separately or together, with a single PC, within reasonable computation times (from minutes to hours, depending on the required detector resolution and statistics). It is possible to simulate radiographic images virtually free from photon noise. In the case of dose evaluation, the hybrid approach appears particularly suitable to calculate the dose absorbed by regions of interest (rather than the entire irradiated organ) with computation time and statistical fluctuations considerably reduced in comparison with conventional Monte Carlo simulation.
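The deterministic "forced detection" stage can be pictured with a schematic sketch: every scattering event recorded during the Monte Carlo stage contributes to every detector pixel with a weight given by the phase function, the solid angle of the pixel and the attenuation along the straight path to it. The event and pixel data structures, the generic phase-function argument and the simple ray-summed attenuation below are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def mu_at(p, mu_map, voxel_size):
    """Attenuation coefficient at point p, clamped to the nearest voxel."""
    idx = np.clip((p // voxel_size).astype(int), 0, np.array(mu_map.shape) - 1)
    return mu_map[tuple(idx)]

def forced_detection(events, pixels, phase, mu_map, voxel_size):
    """Score recorded scattering events on the detector deterministically:
    contribution = weight * phase(cos_theta) * solid_angle * exp(-optical depth).
    `events` holds (position, incoming_direction, weight) tuples and `pixels`
    holds (center, area, normal) tuples."""
    image = np.zeros(len(pixels))
    for pos, d_in, w in events:
        for j, (c, area, n) in enumerate(pixels):
            r = c - pos
            dist = np.linalg.norm(r)
            d_out = r / dist
            cos_t = float(np.dot(d_in, d_out))
            d_omega = area * abs(np.dot(d_out, n)) / dist ** 2   # pixel solid angle
            ts = np.linspace(0.0, dist, max(int(dist / voxel_size) * 2, 2))
            mus = np.array([mu_at(pos + t * d_out, mu_map, voxel_size) for t in ts])
            tau = float(np.sum(0.5 * (mus[1:] + mus[:-1]) * np.diff(ts)))
            image[j] += w * phase(cos_t) * d_omega * np.exp(-tau)
    return image

# Illustrative use: one event, one detector pixel, a uniform attenuation map and
# a Thomson-like phase function (all numbers are placeholders).
mu_map = np.full((20, 20, 40), 0.02)             # mm^-1
events = [(np.array([10.0, 10.0, 10.0]), np.array([0.0, 0.0, 1.0]), 1.0)]
pixels = [(np.array([10.0, 10.0, 39.0]), 1.0, np.array([0.0, 0.0, -1.0]))]
thomson = lambda c: 3.0 / (16.0 * np.pi) * (1.0 + c * c)
print(forced_detection(events, pixels, thomson, mu_map, voxel_size=1.0))
```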
Monte Carlo simulation of the iView GT portal imager dosimetry
International Nuclear Information System (INIS)
Juste, B.; Miro, R.; Diez, S.; Campayo, J.M.; Verdu, G.
2010-01-01
This work is mainly focused on developing a methodology to obtain portal dosimetry with an amorphous silicon electronic portal imaging device (EPID) by means of Monte Carlo simulations and experimental measurements. Accordingly, pixel intensity values of portal images have been compared with the dose measured by an ionization chamber and the dose obtained from Monte Carlo simulations. To that end, several images were acquired with the Elekta iView GT EPID using an attenuator phantom slab (10 cm thickness of solid water) and a 6 MV photon beam with different monitor units. The average pixel value in a region of interest (ROI) centered on the beam axis was extracted from each image and compared to the dose measurements performed with the ionization chamber. These parameters were found to be linearly correlated with the number of monitor units (MU). Since MCNP5 simulations allow calculation of the deposited dose in the ROI within the phosphor layer of the EPID model, we can compare the portal dose with the simulated transit dose in order to perform treatment control.
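The reported linear relation between ROI pixel value and monitor units lends itself to a simple least-squares calibration; the ROI readings below are made-up placeholders used only to show the procedure.

```python
import numpy as np

# Hypothetical ROI mean pixel values acquired at different monitor units (MU).
mu_units = np.array([10.0, 25.0, 50.0, 100.0, 200.0])
roi_mean = np.array([410.0, 1020.0, 2050.0, 4080.0, 8150.0])

slope, intercept = np.polyfit(mu_units, roi_mean, 1)     # linear calibration
r = np.corrcoef(mu_units, roi_mean)[0, 1]
print(f"pixel value ~ {slope:.1f} * MU + {intercept:.1f} (r = {r:.4f})")

# Convert a new ROI reading into an MU-equivalent signal that can be checked
# against the ionization-chamber dose or the MCNP5-simulated transit dose.
print("MU-equivalent of ROI value 3000:", (3000.0 - intercept) / slope)
```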
Cipiccia, S.; Reboredo, D.; Vittoria, Fabio A.; Welsh, G. H.; Grant, P.; Grant, D. W.; Brunetti, E.; Wiggins, S. M.; Olivo, A.; Jaroszynski, D. A.
2015-05-01
X-ray phase contrast imaging (X-PCi) is a very promising method of dramatically enhancing the contrast of X-ray images of microscopic weakly absorbing objects and soft tissue, which may lead to significant advancements in high-resolution, low-dose medical imaging. The interest in X-PCi is giving rise to a demand for effective simulation methods. Monte Carlo codes have proved to be a valuable tool for studying X-PCi, including coherent effects. The laser-plasma wakefield accelerator (LWFA) is a very compact particle accelerator that uses plasma as the accelerating medium. Accelerating gradients in excess of 1 GV/cm can be obtained, which makes LWFAs over a thousand times more compact than conventional accelerators. LWFAs are also sources of brilliant betatron radiation, which is promising for applications including medical imaging. We present a study that explores the potential of LWFA-based betatron sources for medical X-PCi and investigate its resolution limit using numerical simulations based on the FLUKA Monte Carlo code, and present preliminary experimental results.
International Nuclear Information System (INIS)
Silva, Carlos Borges da
2007-05-01
The image acquisition methods applied to nuclear medicine and radiobiology are a valuable research tool for determining thyroid anatomy in order to seek disorders associated with follicular cells. Monte Carlo (MC) simulation has also been used in problems related to radiation detection in order to map medical images, since data processing became compatible with personal computers (PCs). This work presents an innovative study to determine an adequate inorganic scintillation detector array that could be coupled to a specific light photosensor, a charge coupled device (CCD), through a fiber optic plate in order to map the follicles of the thyroid gland. The goal is to choose the type of detector that fits the application suggested here, with a spatial resolution of 10 μm and good detector efficiency. The resulting methodology is useful for mapping a follicle image using gamma radiation emission. A source-detector simulation is performed using the MCNP4B (Monte Carlo for Neutron Photon transport) general code, considering different source energies, detector materials and geometries, including pixel sizes and reflector types. The results demonstrate that the MCNP4B code makes it possible to search for useful parameters related to the systems used in nuclear medicine, specifically in radiobiology applied to endocrine physiology studies for acquiring thyroid follicle images. (author)
X-ray imaging plate performance investigation based on a Monte Carlo simulation tool
Energy Technology Data Exchange (ETDEWEB)
Yao, M., E-mail: philippe.duvauchelle@insa-lyon.fr [Laboratoire Vibration Acoustique (LVA), INSA de Lyon, 25 Avenue Jean Capelle, 69621 Villeurbanne Cedex (France)]; Duvauchelle, Ph.; Kaftandjian, V. [Laboratoire Vibration Acoustique (LVA), INSA de Lyon, 25 Avenue Jean Capelle, 69621 Villeurbanne Cedex (France)]; Peterzol-Parmentier, A. [AREVA NDE-Solutions, 4 Rue Thomas Dumorey, 71100 Chalon-sur-Saône (France)]; Schumm, A. [EDF R&D SINETICS, 1 Avenue du Général de Gaulle, 92141 Clamart Cedex (France)]
2015-01-01
Computed radiography (CR) based on imaging plate (IP) technology represents a potential replacement technique for traditional film-based industrial radiography. For investigating the IP performance, especially at high energies, a Monte Carlo simulation tool based on PENELOPE has been developed. This tool tracks direct and secondary radiations separately and monitors the behavior of different particles. The simulation output provides the 3D distribution of deposited energy in the IP and an evaluation of radiation spectrum propagation, allowing us to visualize the behavior of different particles and the influence of different elements. A detailed analysis of the spectral and spatial responses of the IP at energies up to the MeV range has been performed. - Highlights: • A Monte Carlo tool for imaging plate (IP) performance investigation is presented. • The tool outputs 3D maps of energy deposition in IP due to different signals. • The tool also provides the transmitted spectra along the radiation propagation. • An industrial imaging case is simulated with the presented tool. • A detailed analysis of the spectral and spatial responses of IP is presented.
Monte Carlo simulation of grating-based neutron phase contrast imaging at CPHS
International Nuclear Information System (INIS)
Zhang Ran; Chen Zhiqiang; Huang Zhifeng; Xiao Yongshun; Wang Xuewu; Wie Jie; Loong, C.-K.
2011-01-01
Since the launch of the Compact Pulsed Hadron Source (CPHS) project of Tsinghua University in 2009, work has begun on the design and engineering of an imaging/radiography instrument for the neutron source provided by CPHS. The instrument will perform basic tasks such as transmission imaging and computerized tomography. Additionally, we include in the design the utilization of coded-aperture and grating-based phase contrast methodology, as well as the options of prompt gamma-ray analysis and neutron-energy selective imaging. Previously, we had implemented the hardware and data-analysis software for grating-based X-ray phase contrast imaging. Here, we investigate Geant4-based Monte Carlo simulations of neutron refraction phenomena and then model the grating-based neutron phase contrast imaging system according to the classic-optics-based method. The simulated results of retrieving the phase-shift gradient information by a five-step phase-stepping approach indicate the feasibility of grating-based neutron phase contrast imaging as an option for the cold neutron imaging instrument at the CPHS.
Medical images of patients in voxel structures in high resolution for Monte Carlo simulation
International Nuclear Information System (INIS)
Boia, Leonardo S.; Menezes, Artur F.; Silva, Ademir X.
2011-01-01
This work aims to present a computational process for converting tomographic and MRI medical images of patients into voxel structures in an input file, which is then manipulated by a Monte Carlo simulation code for radiotherapy treatment of tumors. The patient-specific scenario of the problem is simulated by this process, using the volume element (voxel) as the unit of computational tracking. The head voxel structure geometry has voxels with volumetric dimensions of around 1 mm3 and a population of millions, which allows a realistic simulation and reduces the need for digital image processing techniques for adjustments and equalization. With these additional data for the code, a more critical analysis can be developed in order to determine the volume of the tumor and the required protection. The patients' medical images were provided by Clinicas Oncologicas Integradas (COI/RJ), together with the previously performed treatment planning. In order to execute this computational process, the SAPDI computational system is used for digital image processing and data optimization, the conversion program Scan2MCNP manipulates, processes, and converts the medical images into voxel structures in input files, and the graphic visualizer Moritz is used to verify the placement of the image geometry. (author)
Medical images of patients in voxel structures in high resolution for Monte Carlo simulation
Energy Technology Data Exchange (ETDEWEB)
Boia, Leonardo S.; Menezes, Artur F.; Silva, Ademir X., E-mail: lboia@con.ufrj.b, E-mail: ademir@con.ufrj.b [Universidade Federal do Rio de Janeiro (PEN/COPPE/UFRJ), RJ (Brazil). Coordenacao dos Programas de Pos-Graduacao de Engenharia. Programa de Engenharia Nuclear; Salmon Junior, Helio A. [Clinicas Oncologicas Integradas (COI), Rio de Janeiro, RJ (Brazil)
2011-07-01
This work aims to present a computational process for converting tomographic and MRI medical images of patients into voxel structures in an input file, which is then manipulated by a Monte Carlo simulation code for radiotherapy treatment of tumors. The patient-specific scenario of the problem is simulated by this process, using the volume element (voxel) as the unit of computational tracking. The head voxel structure geometry has voxels with volumetric dimensions of around 1 mm3 and a population of millions, which allows a realistic simulation and reduces the need for digital image processing techniques for adjustments and equalization. With these additional data for the code, a more critical analysis can be developed in order to determine the volume of the tumor and the required protection. The patients' medical images were provided by Clinicas Oncologicas Integradas (COI/RJ), together with the previously performed treatment planning. In order to execute this computational process, the SAPDI computational system is used for digital image processing and data optimization, the conversion program Scan2MCNP manipulates, processes, and converts the medical images into voxel structures in input files, and the graphic visualizer Moritz is used to verify the placement of the image geometry. (author)
Optimal design of Anger camera for bremsstrahlung imaging: Monte Carlo evaluation.
Directory of Open Access Journals (Sweden)
Stephan Walrand
2014-06-01
A conventional Anger camera is not adapted to bremsstrahlung imaging and, as a result, even using a reduced energy acquisition window, geometric x-rays represent less than 15% of the recorded events. This increases noise, limits the contrast, and reduces the quantification accuracy. Monte Carlo simulations of energy spectra showed that a camera based on a 30 mm-thick BGO crystal and equipped with a high-energy pinhole collimator is well adapted to bremsstrahlung imaging. The total scatter contamination is reduced by a factor of ten versus a conventional NaI camera equipped with a high-energy parallel-hole collimator, enabling acquisition using an extended energy window ranging from 50 to 350 keV. By using the recorded event energy in the reconstruction method, a shorter acquisition time and reduced orbit range will be usable, allowing the design of a simplified mobile gantry. This is more convenient for use in a busy catheterization room. After injecting a safe activity, a fast SPECT could be performed without moving the catheter tip in order to assess the liver dosimetry and estimate the additional safe activity that could still be injected. Further long-running Monte Carlo simulations of realistic acquisitions will allow assessing the quantification capability of such a system. Simultaneously, a dedicated bremsstrahlung prototype camera reusing PMT-BGO blocks coming from a retired PET system is currently under design for further evaluation.
Nishidate, Izumi; Wiswadarma, Aditya; Hase, Yota; Tanaka, Noriyuki; Maeda, Takaaki; Niizeki, Kyuichi; Aizu, Yoshihisa
2011-08-01
In order to visualize melanin and blood concentrations and oxygen saturation in human skin tissue, a simple imaging technique based on multispectral diffuse reflectance images acquired at six wavelengths (500, 520, 540, 560, 580 and 600 nm) was developed. The technique utilizes multiple regression analysis aided by Monte Carlo simulation of diffuse reflectance spectra. Using the absorbance spectrum as a response variable and the extinction coefficients of melanin, oxygenated hemoglobin, and deoxygenated hemoglobin as predictor variables, multiple regression analysis provides regression coefficients. Concentrations of melanin and total blood are then determined from the regression coefficients using conversion vectors that are deduced numerically in advance, while oxygen saturation is obtained directly from the regression coefficients. Experiments with a tissue-like agar gel phantom validated the method. In vivo experiments with human skin of the hand during upper limb occlusion and of the inner forearm exposed to UV irradiation demonstrated the ability of the method to evaluate physiological reactions of human skin tissue.
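The per-pixel multiple regression step can be sketched directly: the measured absorbance spectrum is regressed onto the chromophore extinction spectra plus a constant offset, and oxygen saturation follows from the hemoglobin coefficients. The extinction values below are placeholders (real values would come from published absorption spectra), and the Monte Carlo-derived conversion vectors that map coefficients to absolute concentrations are omitted.

```python
import numpy as np

wavelengths = np.array([500, 520, 540, 560, 580, 600])   # nm, as in the abstract

# Placeholder extinction coefficients of melanin, HbO2 and Hb at those wavelengths.
eps_mel = np.array([6.0, 5.5, 5.1, 4.7, 4.4, 4.1])
eps_hbo = np.array([1.1, 1.3, 3.2, 2.5, 3.0, 0.3])
eps_hb  = np.array([2.0, 1.6, 2.9, 3.1, 2.6, 0.9])

def regression_coefficients(absorbance):
    """Least-squares regression of one pixel's absorbance spectrum against the
    chromophore extinction spectra plus a constant offset."""
    X = np.column_stack([eps_mel, eps_hbo, eps_hb, np.ones_like(eps_mel)])
    coef, *_ = np.linalg.lstsq(X, absorbance, rcond=None)
    return coef                                           # [a_mel, a_hbo, a_hb, a_0]

# Synthetic pixel: 1 unit melanin, blood that is 70% oxygenated, small offset.
a_meas = 1.0 * eps_mel + 0.7 * eps_hbo + 0.3 * eps_hb + 0.05
a_mel, a_hbo, a_hb, _ = regression_coefficients(a_meas)
so2 = a_hbo / (a_hbo + a_hb)                              # oxygen saturation estimate
print(f"melanin ~ {a_mel:.2f}, total blood ~ {a_hbo + a_hb:.2f}, SO2 ~ {so2:.2f}")
```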
Energy Technology Data Exchange (ETDEWEB)
Lee, Taewoong; Lee, Hyounggun; Kim, Younghak; Lee, Wonho [Korea University, Seoul (Korea, Republic of)
2017-07-15
The performance of a Compton imager using a single three-dimensional position-sensitive LYSO scintillator detector was estimated using a Monte Carlo simulation. The Compton imager consisted of a single LYSO scintillator with a pixelized structure. The sizes of the scintillator and each pixel were 1.3 × 1.3 × 1.3 cm3 and 0.3 × 0.3 × 0.3 cm3, respectively. The order of γ-ray interactions was determined based on the deposited energies in each detector. After the determination of the interaction sequence, various types of reconstruction algorithms such as simple back-projection, filtered back-projection, and list-mode maximum-likelihood expectation maximization (LM-MLEM) were applied and compared with each other in terms of their angular resolution and signal-to-noise ratio (SNR) for several γ-ray energies. The LM-MLEM reconstruction algorithm exhibited the best performance for Compton imaging in maintaining high angular resolution and SNR. Two sources of 137Cs (662 keV) could be distinguished if they were more than 17° apart. The reconstructed Compton images showed the precise position and distribution of various radiation isotopes, which demonstrated the feasibility of the monitoring of nuclear materials in homeland security and radioactive waste management applications.
A Monte Carlo study for optimizing the detector of SPECT imaging using a XCAT human phantom.
Khoshakhlagh, Mohammad; Pirayesh Islamian, Jalil; Abedi, Seyyed Mohammad; Mahmoudian, Babak; Shayesteh Azar, Masoud
2017-01-01
Acquiring a high-quality image is an important concern for obtaining an accurate diagnosis in nuclear medicine. The detector is a critical component of a Single Photon Emission Computed Tomography (SPECT) imaging system for giving accurate information on the exact pattern of radionuclide distribution in the target organ. The images are strongly affected by attenuation, scattering, and the response of the detector. In nuclear medicine imaging, the conventional detector is mainly made from sodium iodide activated by thallium [NaI(Tl)]. This study planned to introduce a suitable detector for optimized SPECT imaging. The SIMIND Monte Carlo program was utilized to simulate a SPECT imaging system with a NaI(Tl) detector and a low-energy high-resolution (LEHR) collimator. Planar and SPECT scans of a 99mTc point source and of an extended cardiac-torso (XCAT) computerized phantom were prepared with both the experimental and simulated systems. After verification and validation of the simulated system, the corresponding phantom scans were compared in terms of image quality for 7 scintillator crystals: NaI(Tl), BGO, YAG:Ce, YAP:Ce, LuAG:Ce, LaBr3 and CZT. The energy and spatial resolution and the sensitivity of the systems were compared. Images were analyzed quantitatively by the SSIM algorithm with the Zhou Wang and Rouse/Hemami methods, and also qualitatively by two nuclear medicine specialists. The energy resolutions obtained for the mentioned crystals were 9.864, 9.8545, 10.229, 10.221, 10.230, 10.131 and 10.223 percent for the 99mTc 140 keV photopeak, respectively. Finally, the SSIM indexes for the related phantom images were calculated as 0.794, 0.738, 0.735, 0.607, 0.760 and 0.811 compared to the NaI(Tl) acquired images, respectively. Medical diagnosis on the SPECT images of the phantom showed that the system with the BGO crystal potentially provides better detectability of hot and cold lesions in the liver of the XCAT phantom. The results showed that BGO
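A quantitative comparison of simulated images against the NaI(Tl) reference can be reproduced in outline with the SSIM implementation in scikit-image, which follows the Wang et al. formulation (the Rouse/Hemami variant used in the paper would need a separate implementation). The random stand-in images below are placeholders for the SIMIND-reconstructed XCAT slices.

```python
import numpy as np
from skimage.metrics import structural_similarity as ssim

def compare_to_reference(reference, candidates):
    """SSIM of each candidate crystal's image against the NaI(Tl) reference."""
    ref = reference.astype(float)
    return {name: ssim(ref, img.astype(float), data_range=ref.max() - ref.min())
            for name, img in candidates.items()}

rng = np.random.default_rng(0)
ref = rng.poisson(100, size=(128, 128)).astype(float)      # stand-in NaI(Tl) image
cands = {"BGO": ref + rng.normal(0, 5, ref.shape),          # stand-in candidate images
         "CZT": ref + rng.normal(0, 12, ref.shape)}
print(compare_to_reference(ref, cands))
```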
Prediction of ICP Pose Uncertainties Using Monte Carlo Simulation with Synthetic Depth Images
DEFF Research Database (Denmark)
Iversen, Thorbjørn Mosekjær; Buch, Anders Glent; Kraft, Dirk
2017-01-01
In robotics, vision sensors are used to estimate the poses of objects in the environment. However, it is a fundamental problem that the estimated poses are not always accurate enough for a given robotic task. Proper sensor placement can mitigate this problem. We present a method which can predict the pose uncertainties in the Iterative Closest Point (ICP) algorithm, which is often used as the last critical pose refinement step in a pose estimation system. With our method we thus provide a crucial tool needed for the optimization of a robust pose estimation system. Our method relies on the generation of synthetic depth images in a Monte Carlo simulation. In this paper we demonstrate our method for depth sensors which rely on Kinect v1 like technology. We evaluate our method using real depth sensor recordings from the publicly available BigBird dataset. The evaluation shows that the uncertainty...
International Nuclear Information System (INIS)
Mallett, M.W.
1991-01-01
Lawrence Livermore National Laboratory (LLNL) is currently investigating a new method for obtaining absolute calibration factors for radiation measurement systems used to measure internally deposited radionuclides in vivo. This method uses magnetic resonance imaging (MRI) to determine the anatomical makeup of an individual. A new MRI technique is also employed that is capable of resolving the fat and water content of the human tissue. This anatomical and biochemical information is used to model a mathematical phantom. Monte Carlo methods are then used to simulate the transport of radiation throughout the phantom. By modeling the detection equipment of the in vivo measurement system into the code, calibration factors are generated that are specific to the individual. Furthermore, this method eliminates the need for surrogate human structures in the calibration process. A demonstration of the proposed method is being performed using a fat/water matrix
Abstract ID: 197 Monte Carlo simulations of X-ray grating interferometry based imaging systems.
Tessarini, Stefan; Fix, Michael K; Volken, Werner; Frei, Daniel; Stampanoni, Marco F M
2018-01-01
Over the last couple of years, the implementation of Monte Carlo (MC) methods for grating-based imaging techniques has been of increasing interest. Several different approaches have been taken to include coherent effects in MC in order to simulate the radiation transport of the image-forming procedure. These include full MC using FLUKA [1], which, however, considers only monochromatic sources. Alternatively, ray-tracing-based MC [2] allows fast simulations but provides only qualitative results, i.e. this technique is not suitable for dose calculation in the imaged object. Finally, hybrid models [3] have been used that allow quantitative results in reasonable computation time, although only two-dimensional implementations are available. Thus, this work aims to develop a full MC framework for X-ray grating interferometry imaging systems using polychromatic sources suitable for large-scale samples. For this purpose the EGSnrc C++ MC code system is extended to take Snell's law, the optical path length and Huygens' principle into account. Thereby the EGSnrc library was modified, e.g. the complex index of refraction has to be assigned to each region depending on the material. The framework is set up to be user-friendly and robust with respect to future updates of the EGSnrc package. These implementations have to be tested using dedicated academic situations. Next steps include validation by comparison of measurements for different setups with the corresponding MC simulations. Furthermore, the newly developed implementation will be compared with other simulation approaches. This framework will then serve as a basis for dose calculation on CT data and has further potential to investigate the image formation process in grating-based imaging systems. Copyright © 2017.
Dana, Nicholas; Sowers, Timothy; Karpiouk, Andrei; Vanderlaan, Donald; Emelianov, Stanislav
2017-10-01
Coronary heart disease (the presence of coronary atherosclerotic plaques) is a significant health problem in the industrialized world. A clinical method to accurately visualize and characterize atherosclerotic plaques is needed. Intravascular photoacoustic (IVPA) imaging is being developed to fill this role, but questions remain regarding optimal imaging wavelengths. We utilized a Monte Carlo optical model to simulate IVPA excitation in coronary tissues, identifying optimal wavelengths for plaque characterization. Near-infrared wavelengths (≤1800 nm) were simulated, and single- and dual-wavelength data were analyzed for accuracy of plaque characterization. Results indicate light penetration is best in the range of 1050 to 1370 nm, where 5% residual fluence can be achieved at clinically relevant depths of ≥2 mm in arteries. Across the arterial wall, fluence may vary by over 10-fold, confounding plaque characterization. For single-wavelength results, plaque segmentation accuracy peaked at 1210 and 1720 nm, though correlation was poor [...] primary wavelength (≈1.0). Results suggest that, without flushing the luminal blood, a primary and secondary wavelength near 1210 and 1350 nm, respectively, may offer the best implementation of dual-wavelength IVPA imaging. These findings could guide the development of a cost-effective clinical system by highlighting optimal wavelengths and improving plaque characterization.
Indian Academy of Sciences (India)
Keywords: Gibbs sampling, Markov chain Monte Carlo, Bayesian inference, stationary distribution, convergence, image restoration. Arnab Chakraborty. We describe the mathematics behind the Markov chain Monte Carlo method of ...
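For readers unfamiliar with the method named in the keywords, the sketch below shows a minimal Gibbs sampler for binary image restoration with an Ising prior, a standard textbook instance of MCMC-based image restoration; the coupling strength and noise level are illustrative values, not taken from the article.

```python
# Minimal sketch of Gibbs sampling for binary image restoration with an Ising prior.
# `noisy` is a 2D array with entries in {-1, +1}; the return value thresholds the
# posterior mean over the sampled sweeps.
import numpy as np

def gibbs_restore(noisy, beta=2.0, noise_flip_prob=0.1, n_sweeps=50, seed=0):
    rng = np.random.default_rng(seed)
    x = noisy.copy()
    h = 0.5 * np.log((1 - noise_flip_prob) / noise_flip_prob)  # data-term weight
    accum = np.zeros(noisy.shape, dtype=float)
    rows, cols = noisy.shape
    for _ in range(n_sweeps):
        for i in range(rows):
            for j in range(cols):
                # sum over the 4-neighbourhood of the current pixel
                s = sum(x[i2, j2] for i2, j2 in ((i-1, j), (i+1, j), (i, j-1), (i, j+1))
                        if 0 <= i2 < rows and 0 <= j2 < cols)
                # conditional log-odds of the pixel being +1 given neighbours and data
                log_odds = 2.0 * (beta * s + h * noisy[i, j])
                p_plus = 1.0 / (1.0 + np.exp(-log_odds))
                x[i, j] = 1 if rng.random() < p_plus else -1
        accum += x
    return np.where(accum / n_sweeps > 0, 1, -1)
```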
DEFF Research Database (Denmark)
Xie, Haiyan; Liu, Haichun; Svenmarker, Pontus
2011-01-01
... in vivo by the fluorescence imaging technique. In this paper we present a novel approach to compensate for the light absorption in homogeneous turbid media, both for the excitation and emission light, utilizing time-resolved fluorescence white Monte Carlo simulations combined with the Beer-Lambert law...
Kim, Hyun Suk; Choi, Hong Yeop; Lee, Gyemin; Ye, Sung-Joon; Smith, Martin B; Kim, Geehyun
2017-12-22
This work is to develop a gamma-ray/neutron dual-particle imager, based on rotating modulation collimators (RMC) and pulse shape discrimination (PSD)-capable scintillators, for possible applications on radioactivity monitoring as well as nuclear security and safeguards. A Monte Carlo simulation study was performed to design an RMC system for the dual-particle imaging, and modulation patterns were obtained for gamma-ray and neutron sources on various configurations. We applied an image reconstruction algorithm utilizing the maximum-likelihood expectation maximization (MLEM) method based on the analytical modeling of source-detector configurations, to the Monte Carlo simulation results. Both gamma-ray and neutron source distributions were reconstructed and evaluated in terms of signal-to-noise ratio (SNR), showing viability of developing an RMC-based gamma-ray/neutron dual-particle imager using PSD-capable scintillators. © 2017 IOP Publishing Ltd.
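The MLEM reconstruction mentioned above follows the standard multiplicative update: the current image estimate is multiplied by the back-projection of the ratio of measured to predicted counts, normalized by the detector sensitivity. A generic sketch is given below; the random system matrix and counts are placeholders for the analytical RMC source-detector model, which is not reproduced here.

```python
# Minimal sketch of the maximum-likelihood expectation maximization (MLEM) update
# for emission image reconstruction from a known system matrix A and counts y.
import numpy as np

def mlem(A, y, n_iter=50):
    """A: (n_meas, n_pix) system matrix, y: (n_meas,) counts. Returns image estimate."""
    x = np.ones(A.shape[1])              # uniform non-negative starting image
    sensitivity = A.sum(axis=0)          # sum_i a_ij
    for _ in range(n_iter):
        expected = A @ x                 # forward projection of the current estimate
        expected[expected == 0] = 1e-12  # guard against division by zero
        x *= (A.T @ (y / expected)) / sensitivity
    return x

rng = np.random.default_rng(0)
A = rng.random((256, 64))                # toy system model
x_true = rng.random(64)
y = rng.poisson(A @ x_true * 100)        # noisy measurements
x_hat = mlem(A, y)                       # estimates 100 * x_true up to noise
```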
Synchrotron imaging and Markov Chain Monte Carlo reveal tooth mineralization patterns.
Directory of Open Access Journals (Sweden)
Daniel R Green
Full Text Available The progressive character of tooth formation records aspects of mammalian life history, diet, seasonal behavior and climate. Tooth mineralization occurs in two stages: secretion and maturation, which overlap to some degree. Despite decades of study, the spatial and temporal pattern of elemental incorporation during enamel mineralization remains poorly characterized. Here we use synchrotron X-ray microtomography and Markov Chain Monte Carlo sampling to estimate mineralization patterns from an ontogenetic series of sheep molars (n = 45 M1s, 18 M2s). We adopt a Bayesian approach that posits a general pattern of maturation estimated from individual- and population-level mineral density variation over time. This approach converts static images of mineral density into a dynamic model of mineralization, and demonstrates that enamel secretion and maturation waves advance at nonlinear rates with distinct geometries. While enamel secretion is ordered, maturation geometry varies within a population and appears to be driven by diffusive processes. Our model yields concrete expectations for the integration of physiological and environmental signals, which is of particular significance for paleoseasonality research. This study also provides an avenue for characterizing mineralization patterns in other taxa. Our synchrotron imaging data and model are available for application to multiple disciplines, including health, material science, and paleontological research.
Adluru, Nagesh; Yang, Xingwei; Latecki, Longin Jan
2015-05-01
We consider a problem of finding maximum weight subgraphs (MWS) that satisfy hard constraints in a weighted graph. The constraints specify the graph nodes that must belong to the solution as well as mutual exclusions of graph nodes, i.e., pairs of nodes that cannot belong to the same solution. Our main contribution is a novel inference approach for solving this problem in a sequential Monte Carlo (SMC) sampling framework. Usually in an SMC framework there is a natural ordering of the states of the samples. The order typically depends on observations about the states or on the annealing setup used. In many applications (e.g., image jigsaw puzzle problems), all observations (e.g., puzzle pieces) are given at once and it is hard to define a natural ordering. Therefore, we relax the assumption of having ordered observations about states and propose a novel SMC algorithm for obtaining a maximum a posteriori estimate of a high-dimensional posterior distribution. This is achieved by exploring different orders of states and selecting the most informative permutations in each step of the sampling. Our experimental results demonstrate that the proposed inference framework significantly outperforms loopy belief propagation in solving the image jigsaw puzzle problem. In particular, our inference quadruples the accuracy of the puzzle assembly compared to that of loopy belief propagation.
Voxel-based Monte Carlo simulation of X-ray imaging and spectroscopy experiments
International Nuclear Information System (INIS)
Bottigli, U.; Brunetti, A.; Golosio, B.; Oliva, P.; Stumbo, S.; Vincze, L.; Randaccio, P.; Bleuet, P.; Simionovici, A.; Somogyi, A.
2004-01-01
A Monte Carlo code for the simulation of X-ray imaging and spectroscopy experiments in heterogeneous samples is presented. The energy spectrum, polarization and profile of the incident beam can be defined so that X-ray tube systems as well as synchrotron sources can be simulated. The sample is modeled as a 3D regular grid. The chemical composition and density is given at each point of the grid. Photoelectric absorption, fluorescent emission, elastic and inelastic scattering are included in the simulation. The core of the simulation is a fast routine for the calculation of the path lengths of the photon trajectory intersections with the grid voxels. The voxel representation is particularly useful for samples that cannot be well described by a small set of polyhedra. This is the case of most naturally occurring samples. In such cases, voxel-based simulations are much less expensive in terms of computational cost than simulations on a polygonal representation. The efficient scheme used for calculating the path lengths in the voxels and the use of variance reduction techniques make the code suitable for the detailed simulation of complex experiments on generic samples in a relatively short time. Examples of applications to X-ray imaging and spectroscopy experiments are discussed
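The fast path-length routine described above is not given in the abstract, but the generic idea of a Siddon-style traversal can be sketched as follows: collect the parametric values at which the ray crosses the grid planes, and the differences between consecutive crossings give the path length in each voxel. The implementation below is a simple illustration under that assumption, not the authors' code.

```python
# Minimal sketch: path lengths of a ray through the voxels of a regular 3D grid,
# obtained from the parametric values at which the ray crosses the grid planes.
import numpy as np

def voxel_path_lengths(origin, direction, grid_min, voxel_size, n_voxels):
    """Return a dict {(ix, iy, iz): length} for a unit-direction ray crossing the grid."""
    origin, direction = np.asarray(origin, float), np.asarray(direction, float)
    grid_min, voxel_size = np.asarray(grid_min, float), np.asarray(voxel_size, float)
    t_values = [0.0]
    for axis in range(3):
        if abs(direction[axis]) < 1e-12:
            continue
        planes = grid_min[axis] + voxel_size[axis] * np.arange(n_voxels[axis] + 1)
        t_values.extend((planes - origin[axis]) / direction[axis])
    t_values = np.unique([t for t in t_values if t >= 0.0])
    lengths = {}
    for t0, t1 in zip(t_values[:-1], t_values[1:]):
        midpoint = origin + direction * (t0 + t1) / 2.0       # locate the segment's voxel
        idx = np.floor((midpoint - grid_min) / voxel_size).astype(int)
        if np.all(idx >= 0) and np.all(idx < n_voxels):
            lengths[tuple(idx)] = lengths.get(tuple(idx), 0.0) + (t1 - t0)
    return lengths

segments = voxel_path_lengths(origin=[-1.0, 0.55, 0.55], direction=[1.0, 0.0, 0.0],
                              grid_min=[0.0, 0.0, 0.0], voxel_size=[0.1, 0.1, 0.1],
                              n_voxels=[10, 10, 10])
```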
A Monte Carlo study of the effect of coded-aperture material and thickness on neutron imaging.
Hayes, S C; Gamage, K A A
2014-10-01
In this paper, a coded-aperture design for a scintillator-based neutron imaging system has been studied using a series of Monte Carlo simulations. The simulations were used to optimise a system based on the EJ-426 neutron scintillator detector. This type of scintillator has a low sensitivity to gamma rays and is therefore particularly useful for neutron detection in a mixed radiation environment. Simulations have been conducted for varying coded-aperture materials and different coded-aperture thicknesses. From these, neutron images have been produced and compared qualitatively and quantitatively for each case to find the best material for the MURA (modified uniformly redundant array) pattern. The generated neutron images also allow observations on how differing coded-aperture thicknesses affect the system. © The Author 2013. Published by Oxford University Press. All rights reserved.
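The MURA patterns referred to above are generated from quadratic residues of a prime number (Gottesman and Fenimore's construction). A minimal sketch is given below; the order p = 11 is an arbitrary illustrative choice and the open/opaque encoding is an assumption.

```python
# Minimal sketch of a square MURA (modified uniformly redundant array) pattern of
# prime order p; 1 marks an open aperture element, 0 an opaque one.
import numpy as np

def mura(p):
    residues = {(k * k) % p for k in range(1, p)}       # quadratic residues mod p
    c = np.array([1 if i in residues else -1 for i in range(p)])
    pattern = np.zeros((p, p), dtype=int)
    for i in range(p):
        for j in range(p):
            if i == 0:
                pattern[i, j] = 0
            elif j == 0:
                pattern[i, j] = 1
            elif c[i] * c[j] == 1:
                pattern[i, j] = 1
    return pattern

aperture = mura(11)
print(aperture)
```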
Sharma, Diksha; Sempau, Josep; Badano, Aldo
2018-02-01
Monte Carlo simulations require a large number of histories to obtain reliable estimates of the quantity of interest and its associated statistical uncertainty. Numerous variance reduction techniques (VRTs) have been employed to increase computational efficiency by reducing the statistical uncertainty. We investigate the effect of two VRTs for optical transport methods on accuracy and computing time for the estimation of variance (noise) in x-ray imaging detectors. We describe two VRTs. In the first, we preferentially alter the direction of the optical photons to increase detection probability. In the second, we follow only a fraction of the total optical photons generated. In both techniques, the statistical weight of photons is altered to maintain the signal mean. We use fastdetect2, an open-source, freely available optical transport routine from the hybridmantis package. We simulate VRTs for a variety of detector models and energy sources. The imaging data from the VRT simulations are then compared to the analog case (no VRT) using pulse height spectra, Swank factor, and the variance of the Swank estimate. We analyze the effect of VRTs on the statistical uncertainty associated with Swank factors. VRTs increased the relative efficiency by as much as a factor of 9. We demonstrate that we can achieve the same variance of the Swank factor with less computing time. With this approach, the simulations can be stopped when the variance of the variance estimates reaches the desired level of uncertainty. We implemented analytic estimates of the variance of the Swank factor and demonstrated the effect of VRTs on image quality calculations. Our findings indicate that the Swank factor is dominated by the x-ray interaction profile as compared to the additional uncertainty introduced in the optical transport by the use of VRTs. For simulation experiments that aim at reducing the uncertainty in the Swank factor estimate, any of the proposed VRTs can be used for increasing the relative
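The Swank factor discussed above is computed from the moments of the detected pulse-height distribution, I = m1^2/(m0*m2). The sketch below illustrates this calculation on a synthetic spectrum; the spectrum is a placeholder and does not reproduce the fastdetect2/hybridmantis output.

```python
# Minimal sketch: Swank factor from a pulse-height spectrum, I = m1**2 / (m0 * m2),
# where m_n is the n-th moment of the detected pulse-height distribution.
import numpy as np

def swank_factor(pulse_heights, counts):
    m0 = np.sum(counts)
    m1 = np.sum(counts * pulse_heights)
    m2 = np.sum(counts * pulse_heights**2)
    return m1**2 / (m0 * m2)

bins = np.linspace(0, 1000, 256)                      # optical photons per x-ray (a.u.)
spectrum = np.exp(-0.5 * ((bins - 600) / 80.0) ** 2)  # toy, roughly Gaussian response
print(f"Swank factor: {swank_factor(bins, spectrum):.3f}")
```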
The use of computed tomography images in Monte Carlo treatment planning
Bazalova, Magdalena
Monte Carlo (MC) dose calculations cannot accurately assess the dose delivered to the patient during radiotherapy unless the patient anatomy is well known. This thesis focuses on the conversion of patient computed tomography (CT) images into MC geometry files. Metal streaking artifacts and their effect on MC dose calculations are first studied. A correction algorithm is applied to artifact-corrupted images and dose errors due to density and tissue mis-assignment are quantified in a phantom and a patient study. The correction algorithm and MC dose calculations for various treatment beams are also investigated using phantoms with real hip prostheses. As a result of this study, we suggest that a metal artifact correction algorithm should be a part of any MC treatment planning. By means of MC simulations, scatter is proven to be a major cause of metal artifacts. The use of dual-energy CT (DECT) for a novel tissue segmentation scheme is thoroughly investigated. First, MC simulations are used to determine the optimal beam filtration for an accurate DECT material extraction. DECT is then tested on a CT scanner with a phantom and a good agreement in the extraction of two material properties, the relative electron density ρe and the effective atomic number Z, is found. Compared to the conventional tissue segmentation based on ρe-differences, the novel tissue segmentation scheme uses differences in both ρe and Z. The phantom study demonstrates that the novel method based on ρe and Z information works well and makes MC dose calculations more accurate. This thesis demonstrates that DECT suppresses streaking artifacts from brachytherapy seeds. Brachytherapy MC dose calculations using single-energy CT images with artifacts and DECT images with suppressed artifacts are performed and the effect of artifact reduction is investigated. The patient and canine DECT studies also show that image noise and object motion are very important factors in DECT. A solution for reduction
Monte Carlo simulations of elemental imaging using the neutron-associated particle technique.
Abel, Michael R; Nie, Linda H
2018-02-06
The purpose of this study is to develop and employ a Monte Carlo (MC) simulation model of associated particle neutron elemental imaging (APNEI) in order to determine the three-dimensional (3D) imaging resolution of such a system by examining relevant physical and technological parameters and to thereby begin to explore the range of clinical applicability of APNEI to fields such as medical diagnostics, intervention, and etiological research. The presented APNEI model was defined in MCNP by a Gaussian-distributed and isotropic surface source emitting deuterium + deuterium (DD) neutrons, iron as the target element, nine iron-containing voxels (1 cm³ volume each) arranged in a 3-by-3 array as the interrogated volume of interest, and finally, by high-purity germanium (HPGe) gamma-ray detectors anterior and posterior to the 9-voxel array. The MCNP f8 pulse height tally was employed in conjunction with the PTRAC particle tracking function to not only determine the signal acquired from iron inelastic scatter gamma-rays but also to quantitate each of the nine target voxels' contribution to the overall iron signal - each detected iron inelastic scatter gamma-ray being traced to the source neutron which incited its emission. With the spatial, vector, and timing information of the series of events for each relevant neutron history as collected by PTRAC, realistic grayscale images of the distribution of iron concentration in the 9-voxel array were simulated in both the projective and depth dimensions. With an overall 225 ps timing resolution, 6.25 mm² imaging plate pixels assumed to have well localized scintillation, and a DD neutron, Gaussian-distributed source spot with a diameter of 2 mm, projective and depth resolutions of [...] The imaging resolution offered by APNEI for target elements such as iron lends itself to potential applications in disease diagnosis and treatment planning (high resolution) as well as to ordnance and contraband detection (low resolution). However
Architectural Ruins and Urban Imaginaries: Carlos Garaicoa’s Images of Havana
Directory of Open Access Journals (Sweden)
Jodi Kovach
2016-11-01
Full Text Available Contemporary Cuban artist Carlos Garaicoa juxtaposes photographic images of Havana’s architectural ruins with timidly articulated drawings that trace the outlines of the dilapidated buildings in empty urbanscapes. Each of these fragile drawings, often composed of delicate threads adhered to a photograph of a site after demolition, serves as a vestige of the sagging structure that the artist photographed prior to destruction. The dialogue that emerges from these photograph/drawing diptychs implies the unmooring of the radical utopian underpinnings of revolutionary ideology that persisted in the policies of Cuba’s Período especial (Special Period) of the 1990s, and suggests a more complicated narrative of Cuba’s modernity, in which the ambiguous drawings—which could indicate construction plans or function as mnemonic images—represent empty promises of economic growth that must negotiate the real socio-economic crises of the present. This article proposes that Garaicoa’s critique of the goals and outcomes of the Special Period through Havana’s ruins suggests a new articulation of the baroque expression—one that calls to mind the anti-authoritative strategies of twentieth-century Neo-Baroque literature and criticism. The artist historically grounds the legacy of the Cuban Revolution’s modernizing project in the country’s real economic decline in the post-Soviet era, but he also takes this approach to representing cities beyond Cuba’s borders, thereby posing broader questions about the architectural symbolism of the 21st-century city in the ideological construction of modern globalizing society.
International Nuclear Information System (INIS)
Milian, F. M.; Attili, A.; Russo, G; Marchetto, F.; Cirio, R.; Bourhaleb, F.
2013-01-01
A novel procedure for the generation of a realistic virtual Computed Tomography (CT) image of a patient, using the advanced Boundary REPresentation (BREP)-based model MASH, has been implemented. This method can be used in radiotherapy assessment. It is shown that it is possible to introduce an artificial cancer, which can be modeled using mesh surfaces. The use of virtual CT images based on BREP models presents several advantages with respect to CT images of actual patients, such as automation, control and flexibility. As an example, two artificial cases, namely a brain and a prostate cancer, were created through the generation of images and tumor/organ contours. As a secondary objective, the described methodology has been used to generate input files for treatment planning system (TPS) and Monte Carlo dose evaluation. In this paper, we consider treatment plans generated assuming dose delivery via active proton beam scanning performed with the INFN-IBA TPS kernel. Additionally, Monte Carlo simulations of the two treatment plans were carried out with GATE/GEANT4. The work demonstrates the feasibility of the approach based on BREP modeling to produce virtual CT images. In conclusion, this study highlights the benefits of using a digital phantom model capable of representing different anatomical structures and varying tumors across different patients. These models could be useful for assessing radiotherapy treatment planning systems (TPS) and computer simulations for the evaluation of the absorbed dose. (author)
Comparison of TOF-PET and Bremsstrahlung SPECT images of Yttrium-90: A Monte Carlo Simulation Study
Directory of Open Access Journals (Sweden)
Akihiko Takahashi
2018-01-01
Full Text Available Objective(s): Yttrium-90 (90Y) is a beta particle nuclide used in targeted radionuclide therapy which is available to both single-photon emission computed tomography (SPECT) and time-of-flight (TOF) positron emission tomography (PET) imaging. The purpose of this study was to assess the image quality of PET and Bremsstrahlung SPECT by simulating PET and SPECT images of 90Y using Monte Carlo simulation codes under the same conditions and to compare them. Methods: In-house Monte Carlo codes, MCEP-PET and MCEP-SPECT, were employed to simulate images. The phantom was a torso-shaped phantom containing six hot spheres of various sizes. The background concentrations of 90Y were set to 50, 100, 150, and 200 kBq/mL, and the concentrations of the hot spheres were 10, 20, and 40 times of those of the background concentrations. The acquisition time was set to 30 min, and the simulated sinogram data were reconstructed using the ordered subset expectation maximization method. The contrast recovery coefficient (CRC) and contrast-to-noise ratio (CNR) were employed to evaluate the image qualities. Results: The CRC values of SPECT images were less than 40%, while those of PET images were more than 40% when the hot sphere was larger than 20 mm in diameter. The CNR values of PET images of hot spheres of diameter smaller than 20 mm were larger than those of SPECT images. The CNR values mostly exceeded 4, which is a criterion to evaluate the discernibility of hot areas. In the case of SPECT, hot spheres of diameter smaller than 20 mm were not discernable. On the contrary, the CNR values of PET images decreased to the level of SPECT, in the case of low concentration. Conclusion: In almost all the cases examined in this investigation, the quantitative indexes of TOF-PET 90Y images were better than those of Bremsstrahlung SPECT images. However, the superiority of PET image became critical in the case of low activity concentrations.
Comparison of TOF-PET and Bremsstrahlung SPECT Images of Yttrium-90: A Monte Carlo Simulation Study.
Takahashi, Akihiko; Himuro, Kazuhiko; Baba, Shingo; Yamashita, Yasuo; Sasaki, Masayuki
2018-01-01
Yttrium-90 ( 90 Y) is a beta particle nuclide used in targeted radionuclide therapy which is available to both single-photon emission computed tomography (SPECT) and time-of-flight (TOF) positron emission tomography (PET) imaging. The purpose of this study was to assess the image quality of PET and Bremsstrahlung SPECT by simulating PET and SPECT images of 90 Y using Monte Carlo simulation codes under the same conditions and to compare them. In-house Monte Carlo codes, MCEP-PET and MCEP-SPECT, were employed to simulate images. The phantom was a torso-shaped phantom containing six hot spheres of various sizes. The background concentrations of 90 Y were set to 50, 100, 150, and 200 kBq/mL, and the concentrations of the hot spheres were 10, 20, and 40 times of those of the background concentrations. The acquisition time was set to 30 min, and the simulated sinogram data were reconstructed using the ordered subset expectation maximization method. The contrast recovery coefficient (CRC) and contrast-to-noise ratio (CNR) were employed to evaluate the image qualities. The CRC values of SPECT images were less than 40%, while those of PET images were more than 40% when the hot sphere was larger than 20 mm in diameter. The CNR values of PET images of hot spheres of diameter smaller than 20 mm were larger than those of SPECT images. The CNR values mostly exceeded 4, which is a criterion to evaluate the discernibility of hot areas. In the case of SPECT, hot spheres of diameter smaller than 20 mm were not discernable. On the contrary, the CNR values of PET images decreased to the level of SPECT, in the case of low concentration. In almost all the cases examined in this investigation, the quantitative indexes of TOF-PET 90 Y images were better than those of Bremsstrahlung SPECT images. However, the superiority of PET image became critical in the case of low activity concentrations.
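The two image-quality indexes used in this study can be written compactly: CRC = (C_hot/C_bkg - 1)/(R_true - 1) and CNR = (C_hot - C_bkg)/sigma_bkg, with C the mean ROI values, R_true the true sphere-to-background activity ratio (10, 20 or 40 here) and sigma_bkg the background standard deviation. The sketch below assumes the ROI statistics have already been extracted; the exact ROI definitions used by the authors are not given in the abstract.

```python
# Minimal sketch of the contrast recovery coefficient (CRC) and contrast-to-noise
# ratio (CNR) for a hot-sphere phantom study; the ROI values are toy placeholders.
import numpy as np

def crc(mean_hot, mean_bkg, true_ratio):
    return (mean_hot / mean_bkg - 1.0) / (true_ratio - 1.0)

def cnr(mean_hot, mean_bkg, std_bkg):
    return (mean_hot - mean_bkg) / std_bkg

hot_roi = np.array([410.0, 395.0, 402.0])              # toy voxel values in a hot sphere
bkg_roi = np.array([100.0, 98.0, 103.0, 99.0, 101.0])  # toy background voxel values
print("CRC:", crc(hot_roi.mean(), bkg_roi.mean(), true_ratio=10.0))
print("CNR:", cnr(hot_roi.mean(), bkg_roi.mean(), bkg_roi.std(ddof=1)))
```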
Jung, Seongmoon; Sung, Wonmo; Ye, Sung-Joon
2017-01-01
This work aims to develop a Monte Carlo (MC) model for pinhole K-shell X-ray fluorescence (XRF) imaging of metal nanoparticles using polychromatic X-rays. The MC model consisted of two-dimensional (2D) position-sensitive detectors and fan-beam X-rays used to stimulate the emission of XRF photons from gadolinium (Gd) or gold (Au) nanoparticles. Four cylindrical columns containing different concentrations of nanoparticles ranging from 0.01% to 0.09% by weight (wt%) were placed in a 5 cm diameter cylindrical water phantom. The images of the columns had detectable contrast-to-noise ratios (CNRs) of 5.7 and 4.3 for 0.01 wt% Gd and for 0.03 wt% Au, respectively. Higher concentrations of nanoparticles yielded higher CNR. For 1×10 11 incident particles, the radiation dose to the phantom was 19.9 mGy for 110 kVp X-rays (Gd imaging) and 26.1 mGy for 140 kVp X-rays (Au imaging). The MC model of a pinhole XRF can acquire direct 2D slice images of the object without image reconstruction. The MC model demonstrated that the pinhole XRF imaging system could be a potential bioimaging modality for nanomedicine.
DEFF Research Database (Denmark)
Slot Thing, Rune; Bernchou, Uffe; Mainegra-Hing, Ernesto
2013-01-01
Abstract Purpose. Cone beam computed tomography (CBCT) image quality is limited by scattered photons. Monte Carlo (MC) simulations provide the ability of predicting the patient-specific scatter contamination in clinical CBCT imaging. Lengthy simulations prevent MC-based scatter correction from...
Roshan, Hoda Rezaei; Mahmoudian, Babak; Gharepapagh, Esmaeil; Azarm, Ahmadreza; Pirayesh Islamian, Jalil
2016-02-01
Treatment efficacy of radioembolization using Yttrium-90 ((90)Y) microspheres is assessed by (90)Y bremsstrahlung single photon emission computed tomography (SPECT) imaging following radioembolization. The radioisotopic image has the potential of providing a reliable activity map of the (90)Y microsphere distribution. One of the main reasons for the poor image quality in (90)Y bremsstrahlung SPECT imaging is the continuous and broad energy spectrum of the related bremsstrahlung photons. Furthermore, collimator geometry plays an important role in the spatial resolution, sensitivity and image contrast. Due to the relatively poor quality of (90)Y bremsstrahlung SPECT images, we intend to optimize the medium-energy (ME) parallel-hole collimator and the energy window. The Siemens e.cam gamma camera equipped with a ME collimator and a voxelized phantom were simulated with the SImulating Medical Imaging Nuclear Detectors (SIMIND) program. We used the SIMIND Monte Carlo program to generate the (90)Y bremsstrahlung SPECT projection of the digital Jaszczak phantom. The phantom consists of six hot spheres ranging from 9.5 to 31.8 mm in diameter, which are used to evaluate the image contrast. In order to assess the effect of the energy window on the image contrast, three energy windows ranging from 60 to 160 keV, 160 to 400 keV, and 60 to 400 keV were set on the (90)Y bremsstrahlung spectrum. As well, the effect of the hole diameter of the ME collimator on the image contrast and bremsstrahlung spectrum was investigated. For the fixed collimator and septal thickness values (3.28 cm and 1.14 mm, respectively), a hole diameter range (2.35-3.3 mm) was chosen based on an appropriate balance between spatial resolution and sensitivity. The optimal energy window for (90)Y bremsstrahlung SPECT imaging was the extended energy window from 60 to 400 keV. Besides, the optimal value of the hole diameter of the ME collimator was obtained as 3.3 mm. Geometry of the ME parallel-hole collimator and energy
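The resolution-sensitivity balance behind the hole-diameter choice can be illustrated with the textbook parallel-hole collimator expressions, R_g ≈ d(L_eff + b)/L_eff and g ≈ K^2 (d/L_eff)^2 d^2/(d + t)^2 with L_eff = L - 2/mu. The sketch below uses the hole length and septal thickness quoted in the abstract; the septal attenuation coefficient, the source distance and the hexagonal-hole constant K are generic assumptions, so the numbers are only indicative.

```python
# Minimal sketch of the textbook parallel-hole collimator trade-off between geometric
# resolution and geometric efficiency, evaluated for a few hole diameters.
def collimator_performance(d, L, t, b, mu=3.0, K=0.26):
    """d: hole diameter, L: hole length, t: septal thickness, b: source-to-collimator
    distance (all cm); mu: assumed septal attenuation coefficient (1/cm)."""
    L_eff = L - 2.0 / mu                         # effective hole length
    resolution = d * (L_eff + b) / L_eff         # geometric resolution (cm)
    efficiency = (K * d / L_eff) ** 2 * d ** 2 / (d + t) ** 2
    return resolution, efficiency

for d_mm in (2.35, 2.8, 3.3):
    r, g = collimator_performance(d=d_mm / 10.0, L=3.28, t=0.114, b=10.0)
    print(f"d = {d_mm} mm: R_g = {r * 10:.1f} mm, efficiency = {g:.2e}")
```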
ROSI--an object-oriented and parallel-computing Monte Carlo simulation for X-ray imaging
International Nuclear Information System (INIS)
Giersch, Juergen; Weidemann, Andreas; Anton, Gisela
2003-01-01
In the field of X-ray imaging, Monte Carlo simulation is an important tool. It gives the possibility of understanding experimental results and it allows the construction of virtual imaging setups with predictions of their quality. For these reasons, we developed the Roentgen Simulation (ROSI) which is based on the object-oriented C++ class library GISMO. The interaction algorithms are based on the established EGS4-code and its current LSCAT-extension. ROSI introduces random variables for modelling physical parameters by a given random distribution, e.g. the source position or the direction and energy of the photons to be emitted. It is possible to run ROSI in parallel on a local computer network (Beowulf cluster) to obtain simulation data in shorter time. Finally, it has an easy-to-use interface. We will present the concept of ROSI and demonstrate its flexibility by an example
Energy Technology Data Exchange (ETDEWEB)
Becchetti, M; Tian, X; Segars, P; Samei, E [Clinical Imaging Physics Group, Department of Radiology, Duke University Me, Durham, NC (United States)
2015-06-15
Purpose: To develop an accurate and fast Monte Carlo (MC) method of simulating CT that is capable of correlating dose with image quality using voxelized phantoms. Methods: A realistic voxelized phantom based on patient CT data, XCAT, was used with a GPU accelerated MC code for helical MDCT. Simulations were done with both uniform density organs and with textured organs. The organ doses were validated using previous experimentally validated simulations of the same phantom under the same conditions. Images acquired by tracking photons through the phantom with MC require lengthy computation times due to the large number of photon histories necessary for accurate representation of noise. A substantial speed up of the process was attained by using a low number of photon histories with kernel denoising of the projections from the scattered photons. These FBP reconstructed images were validated against those that were acquired in simulations using many photon histories by ensuring a minimal normalized root mean square error. Results: Organ doses simulated in the XCAT phantom are within 10% of the reference values. Corresponding images attained using projection kernel smoothing were attained with 3 orders of magnitude less computation time compared to a reference simulation using many photon histories. Conclusion: Combining GPU acceleration with kernel denoising of scattered photon projections in MC simulations allows organ dose and corresponding image quality to be attained with reasonable accuracy and substantially reduced computation time than is possible with standard simulation approaches.
Monte Carlo simulation of image properties of an X-ray intensifying screen
Wang Yi; Wang Kui Lu; Liu Guo Zhi; Liu Ya Qian
2000-01-01
A Monte Carlo simulation program named MCPEP has been developed. Based on the existing simulation program that simulates the transfer of X-ray photons and the secondary electrons, MCPEP also simulates the light photons in the screen. The performance of an intensifying screen (Gd2O2S:Tb) with different thicknesses and different X-ray energies has been analyzed by MCPEP. The calculated light photon probability distribution, average light photon number per absorbed X-ray photon, statistical factor for light emission, X-ray detection efficiency, detective quantum efficiency (DQE) and point spread function (PSF) of the screen are presented.
Image quality assessment of LaBr3-based whole-body 3D PET scanners: a Monte Carlo evaluation
International Nuclear Information System (INIS)
Surti, S; Karp, J S; Muehllehner, G
2004-01-01
The main thrust for this work is the investigation and design of a whole-body PET scanner based on new lanthanum bromide scintillators. We use Monte Carlo simulations to generate data for a 3D PET scanner based on LaBr3 detectors, and to assess the count-rate capability and the reconstructed image quality of phantoms with hot and cold spheres using contrast and noise parameters. Previously we have shown that LaBr3 has very high light output, excellent energy resolution and fast timing properties which can lead to the design of a time-of-flight (TOF) whole-body PET camera. The data presented here illustrate the performance of LaBr3 without the additional benefit of TOF information, although our intention is to develop a scanner with TOF measurement capability. The only drawbacks of LaBr3 are the lower stopping power and photo-fraction which affect both sensitivity and spatial resolution. However, in 3D PET imaging where energy resolution is very important for reducing scattered coincidences in the reconstructed image, the image quality attained in a non-TOF LaBr3 scanner can potentially equal or surpass that achieved with other high sensitivity scanners. Our results show that there is a gain in NEC arising from the reduced scatter and random fractions in a LaBr3 scanner. The reconstructed image resolution is slightly worse than a high-Z scintillator, but at increased count-rates, reduced pulse pileup leads to an image resolution similar to that of LSO. Image quality simulations predict reduced contrast for small hot spheres compared to an LSO scanner, but improved noise characteristics at similar clinical activity levels
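The noise equivalent count rate mentioned above is commonly written NEC = T^2/(T + S + kR), with T, S and R the true, scattered and random coincidence rates and k depending on the randoms-correction method. The toy numbers below are not from the study; they only illustrate how reduced scatter and random fractions raise NEC for the same trues rate.

```python
# Minimal sketch of the noise equivalent count (NEC) rate for PET count-rate studies.
def nec(trues, scatter, randoms, k=1.0):
    return trues**2 / (trues + scatter + k * randoms)

print(nec(trues=100e3, scatter=40e3, randoms=30e3))   # higher scatter/randoms fractions
print(nec(trues=100e3, scatter=25e3, randoms=15e3))   # reduced fractions, higher NEC
```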
International Nuclear Information System (INIS)
Valente, Mauro; Castellano, Gustavo; Sosa, Carlos
2008-01-01
Full text: Radiotherapy is one of the most effective techniques for tumour treatment and control. During recent years, significant developments have been made regarding both irradiation technology and techniques. However, accurate 3D dosimetric techniques are nowadays not commercially available. Due to their intrinsic characteristics, traditional dosimetric techniques like ionisation chamber, film dosimetry or TLD do not offer proper continuous 3D dose mapping. The possibility of using ferrous sulphate (Fricke) dosimeters suitably fixed to a gel matrix, along with dedicated optical analysis methods, based on light transmission measurements for 3D absorbed dose imaging in tissue-equivalent materials, has become of great interest in radiotherapy. Since Gore et al. showed in 1984 that the oxidation of ferrous ions to ferric ions still happens even when fixing the ferrous sulphate solution to a gelatine matrix, important efforts have been dedicated to developing and improving real continuous 3D dosimetric systems based on the Fricke solution. The purpose of this work is to investigate the capability and suitability of Fricke gel dosimetry for arc therapy irradiations. The dosimetric system is mainly composed of Fricke gel dosimeters, suitably shaped in the form of thin layers and optically analysed by means of visible light transmission measurements, acquiring sample images just before and after irradiation by means of a commercial flatbed-like scanner. Image acquisition, conversion to matrices and further analysis are accomplished by means of dedicated developed software, which includes suitable algorithms for optical density difference calculation and corresponding absorbed dose conversion. Dedicated subroutines allow 3D dose imaging reconstruction from single layer information, by means of computer tomography-like algorithms. Also, dedicated Monte Carlo (PENELOPE) subroutines have been adapted in order to achieve accurate simulation of arc therapy irradiation techniques
Boia, L S; Menezes, A F; Cardoso, M A C; da Rosa, L A R; Batista, D V S; Cardoso, S C; Silva, A X; Facure, A
2012-01-01
This paper presents the application of a computational methodology for optimizing the conversion of medical tomographic images in voxel anthropomorphic models for simulation of radiation transport using the MCNP code. A computational system was developed for digital image processing that compresses the information from the DICOM medical image before it is converted to the Scan2MCNP software input file for optimization of the image data. In order to validate the computational methodology, a radiosurgery treatment simulation was performed using the Alderson Rando phantom and the acquisition of DICOM images was performed. The simulation results were compared with data obtained with the BrainLab planning system. The comparison showed good agreement for three orthogonal treatment beams of (60)Co gamma radiation. The percentage differences were 3.07%, 0.77% and 6.15% for axial, coronal and sagital projections, respectively. Copyright © 2011 Elsevier Ltd. All rights reserved.
Energy Technology Data Exchange (ETDEWEB)
Boia, L.S.; Menezes, A.F.; Cardoso, M.A.C. [Programa de Engenharia Nuclear/COPPE (Brazil); Rosa, L.A.R. da [Instituto de Radioprotecao e Dosimetria-IRD, Av. Salvador Allende, s/no Recreio dos Bandeirantes, CP 37760, CEP 22780-160 Rio de Janeiro, RJ (Brazil); Batista, D.V.S. [Instituto de Radioprotecao e Dosimetria-IRD, Av. Salvador Allende, s/no Recreio dos Bandeirantes, CP 37760, CEP 22780-160 Rio de Janeiro, RJ (Brazil); Instituto Nacional de Cancer-Secao de Fisica Medica, Praca Cruz Vermelha, 23-Centro, 20230-130 Rio de Janeiro, RJ (Brazil); Cardoso, S.C. [Departamento de Fisica Nuclear, Instituto de Fisica, Universidade Federal do Rio de Janeiro, Bloco A-Sala 307, CP 68528, CEP 21941-972 Rio de Janeiro, RJ (Brazil); Silva, A.X., E-mail: ademir@con.ufrj.br [Programa de Engenharia Nuclear/COPPE (Brazil); Departamento de Engenharia Nuclear/Escola Politecnica, Universidade Federal do Rio de Janeiro, Ilha do Fundao, Caixa Postal 68509, 21945-970 Rio de Janeiro, RJ (Brazil); Facure, A. [Comissao Nacional de Energia Nuclear, R. Gal. Severiano 90, sala 409, 22294-900 Rio de Janeiro, RJ (Brazil)
2012-01-15
This paper presents the application of a computational methodology for optimizing the conversion of medical tomographic images in voxel anthropomorphic models for simulation of radiation transport using the MCNP code. A computational system was developed for digital image processing that compresses the information from the DICOM medical image before it is converted to the Scan2MCNP software input file for optimization of the image data. In order to validate the computational methodology, a radiosurgery treatment simulation was performed using the Alderson Rando phantom and the acquisition of DICOM images was performed. The simulation results were compared with data obtained with the BrainLab planning system. The comparison showed good agreement for three orthogonal treatment beams of 60Co gamma radiation. The percentage differences were 3.07%, 0.77% and 6.15% for axial, coronal and sagital projections, respectively. - Highlights: • We use a method to optimize the CT image conversion in voxel model for MCNP simulation. • We present a methodology to compress a DICOM image before conversion to input file. • To validate this study an idealized radiosurgery applied to the Alderson phantom was used.
Golosio, Bruno; Schoonjans, Tom; Brunetti, Antonio; Oliva, Piernicola; Masala, Giovanni Luca
2014-03-01
The simulation of X-ray imaging experiments is often performed using deterministic codes, which can be relatively fast and easy to use. However, such codes are generally not suitable for the simulation of even slightly more complex experimental conditions, involving, for instance, first-order or higher-order scattering, X-ray fluorescence emissions, or more complex geometries, particularly for experiments that combine spatial resolution with spectral information. In such cases, simulations are often performed using codes based on the Monte Carlo method. In a simple Monte Carlo approach, the interaction position of an X-ray photon and the state of the photon after an interaction are obtained simply according to the theoretical probability distributions. This approach may be quite inefficient because the final channels of interest may include only a limited region of space or photons produced by a rare interaction, e.g., fluorescent emission from elements with very low concentrations. In the field of X-ray fluorescence spectroscopy, this problem has been solved by combining the Monte Carlo method with variance reduction techniques, which can reduce the computation time by several orders of magnitude. In this work, we present a C++ code for the general simulation of X-ray imaging and spectroscopy experiments, based on the application of the Monte Carlo method in combination with variance reduction techniques, with a description of sample geometry based on quadric surfaces. We describe the benefits of the object-oriented approach in terms of code maintenance, the flexibility of the program for the simulation of different experimental conditions and the possibility of easily adding new modules. Sample applications in the fields of X-ray imaging and X-ray spectroscopy are discussed. Catalogue identifier: AERO_v1_0 Program summary URL:http://cpc.cs.qub.ac.uk/summaries/AERO_v1_0.html Program obtainable from: CPC Program Library, Queen’s University, Belfast, N. Ireland
Energy Technology Data Exchange (ETDEWEB)
Morris, R [Durham, NC (United States); Lakshmanan, M; Fong, G; Kapadia, A [Carl E Ravin Advanced Imaging Laboratories, Durham, NC (United States); Greenberg, J [Duke University, Durham, NC (United States)
2016-06-15
Purpose: Coherent scatter based imaging has shown improved contrast and molecular specificity over conventional digital mammography however the biological risks have not been quantified due to a lack of accurate information on absorbed dose. This study intends to characterize the dose distribution and average glandular dose from coded aperture coherent scatter spectral imaging of the breast. The dose deposited in the breast from this new diagnostic imaging modality has not yet been quantitatively evaluated. Here, various digitized anthropomorphic phantoms are tested in a Monte Carlo simulation to evaluate the absorbed dose distribution and average glandular dose using clinically feasible scan protocols. Methods: Geant4 Monte Carlo radiation transport simulation software is used to replicate the coded aperture coherent scatter spectral imaging system. Energy sensitive, photon counting detectors are used to characterize the x-ray beam spectra for various imaging protocols. This input spectra is cross-validated with the results from XSPECT, a commercially available application that yields x-ray tube specific spectra for the operating parameters employed. XSPECT is also used to determine the appropriate number of photons emitted per mAs of tube current at a given kVp tube potential. With the implementation of the XCAT digital anthropomorphic breast phantom library, a variety of breast sizes with differing anatomical structure are evaluated. Simulations were performed with and without compression of the breast for dose comparison. Results: Through the Monte Carlo evaluation of a diverse population of breast types imaged under real-world scan conditions, a clinically relevant average glandular dose for this new imaging modality is extrapolated. Conclusion: With access to the physical coherent scatter imaging system used in the simulation, the results of this Monte Carlo study may be used to directly influence the future development of the modality to keep breast dose to
Monte Carlo simulation of the imaging properties of scintillator-coated X-ray pixel detectors
International Nuclear Information System (INIS)
Hjelm, M.; Norlin, B.; Nilsson, H.-E.; Froejdh, C.; Badel, X.
2003-01-01
The spatial resolution of scintillator-coated X-ray pixel detectors is usually limited by the isotropic light spread in the scintillator. One way to overcome this limitation is to use a pixellated scintillating layer on top of the semiconductor pixel detector. Using advanced etching and filling techniques, arrays of CsI columns have been successfully fabricated and characterized. Each CsI waveguide matches one pixel of the semiconductor detector, limiting the spatial spread of light. Another concept considered in this study is to detect the light emitted from the scintillator by diodes formed in the silicon pore walls. There is so far no knowledge regarding the theoretical limits for these two approaches, which makes the evaluation of the fabrication process difficult. In this work we present numerical calculations of the signal-to-noise ratio (SNR) for detector designs based on scintillator-filled pores in silicon. The calculations are based on separate Monte Carlo (MC) simulations of X-ray absorption and light transport in scintillator waveguides. The resulting data are used in global MC simulations of flood exposures of the detector array, from which the SNR values are obtained. Results are presented for two scintillator materials, namely CsI(Tl) and GADOX
Jiang, Xu; Deng, Yong; Luo, Zhaoyang; Luo, Qingming
2015-10-05
The excessive time required by fluorescence diffuse optical tomography (fDOT) image reconstruction based on path-history fluorescence Monte Carlo model is its primary limiting factor. Herein, we present a method that accelerates fDOT image reconstruction. We employ three-level parallel architecture including multiple nodes in cluster, multiple cores in central processing unit (CPU), and multiple streaming multiprocessors in graphics processing unit (GPU). Different GPU memories are selectively used, the data-writing time is effectively eliminated, and the data transport per iteration is minimized. Simulation experiments demonstrated that this method can utilize general-purpose computing platforms to efficiently implement and accelerate fDOT image reconstruction, thus providing a practical means of using path-history-based fluorescence Monte Carlo model for fDOT imaging.
Monte Carlo simulations of the dose from imaging with GE eXplore 120 micro-CT using GATE.
Bretin, Florian; Bahri, Mohamed Ali; Luxen, André; Phillips, Christophe; Plenevaux, Alain; Seret, Alain
2015-10-01
Small animals are increasingly used as translational models in preclinical imaging studies involving microCT, during which the subjects can be exposed to large amounts of radiation. While the radiation levels are generally sublethal, studies have shown that low-level radiation can change physiological parameters in mice. In order to rule out any influence of radiation on the outcome of such experiments, or resulting deterministic effects in the subjects, the levels of radiation involved need to be addressed. The aim of this study was to investigate the radiation dose delivered by the GE eXplore 120 microCT non-invasively using Monte Carlo simulations in GATE and to compare results to previously obtained experimental values. Tungsten X-ray spectra were simulated at 70, 80, and 97 kVp using an analytical tool and their half-value layers were simulated for spectra validation against experimentally measured values of the physical X-ray tube. A Monte Carlo model of the microCT system was set up and four protocols that are regularly applied to live animal scanning were implemented. The computed tomography dose index (CTDI) inside a PMMA phantom was derived and multiple field of view acquisitions were simulated using the PMMA phantom, a representative mouse and rat. Simulated half-value layers agreed with experimentally obtained results within a 7% error window. The CTDI ranged from 20 to 56 mGy and closely matched experimental values. Derived organ doses in mice reached 459 mGy in bones and up to 200 mGy in soft tissue organs using the highest energy protocol. Dose levels in rats were lower due to the increased mass of the animal compared to mice. The uncertainty of all dose simulations was below 14%. Monte Carlo simulations proved a valuable tool to investigate the 3D dose distribution in animals from microCT. Small animals, especially mice (due to their small volume), receive large amounts of radiation from the GE eXplore 120 microCT, which might alter physiological
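Two of the quantities used in this validation can be sketched directly: the half-value layer read off a simulated transmission curve, and a CTDI-style integral of the axial dose profile over ±50 mm divided by the beam collimation. The profile, transmission data and collimation width below are synthetic placeholders, not the GATE output of the study.

```python
# Minimal sketch: half-value layer from a transmission curve, and CTDI_100 as the
# integral of the axial dose profile over +/-50 mm divided by the beam collimation.
import numpy as np

def half_value_layer(thicknesses_mm, transmission):
    """Interpolate the absorber thickness at which transmission drops to 0.5."""
    return float(np.interp(0.5, transmission[::-1], thicknesses_mm[::-1]))

def ctdi100(z_mm, dose_profile_mGy, beam_width_mm):
    mask = np.abs(z_mm) <= 50.0
    dz = z_mm[1] - z_mm[0]                           # uniform sampling assumed
    return dose_profile_mGy[mask].sum() * dz / beam_width_mm

thick = np.linspace(0, 10, 50)                       # mm of aluminium
trans = np.exp(-0.15 * thick)                        # toy transmission curve
z = np.linspace(-80, 80, 801)                        # mm along the rotation axis
profile = 30.0 * np.exp(-0.5 * (z / 12.0) ** 2)      # toy dose profile (mGy)
print("HVL     =", half_value_layer(thick, trans), "mm Al")
print("CTDI100 =", ctdi100(z, profile, beam_width_mm=20.0), "mGy")
```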
Directory of Open Access Journals (Sweden)
Retif P
2016-11-01
Full Text Available Paul Retif,1–3 Aurélie Reinhard,2,3 Héna Paquot,2,3 Valérie Jouan-Hureaux,2,3 Alicia Chateau,2,3 Lucie Sancey,4 Muriel Barberi-Heyob,2,3 Sophie Pinel,2,3 Thierry Bastogne2,3,5 1Unité de Physique Médicale, CHR Metz-Thionville, Ars-Laquenexy, 2Université de Lorraine, 3CRAN, UMR 7039, CNRS, Vandoeuvre-lès-Nancy, 4Institut Lumière Matière, UMR 5306, CNRS, Villeurbanne, 5INRIA-BIGS & CRAN, Université de Lorraine, Vandoeuvre-lès-Nancy Cedex, France Abstract: This article addresses the in silico–in vitro prediction issue of organometallic nanoparticles (NPs-based radiosensitization enhancement. The goal was to carry out computational experiments to quickly identify efficient nanostructures and then to preferentially select the most promising ones for the subsequent in vivo studies. To this aim, this interdisciplinary article introduces a new theoretical Monte Carlo computational ranking method and tests it using 3 different organometallic NPs in terms of size and composition. While the ranking predicted in a classical theoretical scenario did not fit the reference results at all, in contrast, we showed for the first time how our accelerated in silico virtual screening method, based on basic in vitro experimental data (which takes into account the NPs cell biodistribution, was able to predict a relevant ranking in accordance with in vitro clonogenic efficiency. This corroborates the pertinence of such a prior ranking method that could speed up the preclinical development of NPs in radiation therapy. Keywords: biomedical applications of radiations, computer simulation, nanomedicine, virtual screening
International Nuclear Information System (INIS)
Rolison, L; Samant, S; Baciak, J; Jordan, K
2016-01-01
Purpose: To develop a Monte Carlo N-Particle (MCNP) model for the validation of a prototype backscatter x-ray (BSX) imager, and optimization of BSX technology for medical applications, including selective object-plane imaging. Methods: BSX is an emerging technology that represents an alternative to conventional computed tomography (CT) and projective digital radiography (DR). It employs detectors located on the same side as the incident x-ray source, making use of backscatter and avoiding ring geometry to enclose the imaging object. Current BSX imagers suffer from low spatial resolution. A MCNP model was designed to replicate a BSX prototype used for flaw detection in industrial materials. This prototype consisted of a 1.5mm diameter 60kVp pencil beam surrounded by a ring of four 5.0cm diameter NaI scintillation detectors. The imaging phantom consisted of a 2.9cm thick aluminum plate with five 0.6cm diameter holes drilled halfway. The experimental image was created using a raster scanning motion (in 1.5mm increments). Results: A qualitative comparison between the physical and simulated images showed very good agreement with 1.5mm spatial resolution in plane perpendicular to incident x-ray beam. The MCNP model developed the concept of radiography by selective plane detection (RSPD) for BSX, whereby specific object planes can be imaged by varying kVp. 10keV increments in mean x-ray energy yielded 4mm thick slice resolution in the phantom. Image resolution in the MCNP model can be further increased by increasing the number of detectors, and decreasing raster step size. Conclusion: MCNP modelling was used to validate a prototype BSX imager and introduce the RSPD concept, allowing for selective object-plane imaging. There was very good visual agreement between the experimental and MCNP imaging. Beyond optimizing system parameters for the existing prototype, new geometries can be investigated for volumetric image acquisition in medical applications. This material is
International Nuclear Information System (INIS)
Bashkatov, A N; Genina, Elina A; Kochubei, V I; Tuchin, Valerii V
2006-01-01
Based on the digital image analysis and inverse Monte-Carlo method, the proximate analysis method is developed and the optical properties of hairs of different types are estimated in three spectral ranges corresponding to three colour components. The scattering and absorption properties of hairs are separated for the first time by using the inverse Monte-Carlo method. The content of different types of melanin in hairs is estimated from the absorption coefficient. It is shown that the dominating type of melanin in dark hairs is eumelanin, whereas in light hairs pheomelanin dominates. (special issue devoted to multiple radiation scattering in random media)
International Nuclear Information System (INIS)
Yorulmaz, N.; Bozkurt, A.
2009-01-01
In nuclear medicine applications, the aim is to obtain diagnostic information about the organs and tissues of the patient with the help of some radiopharmaceuticals administered to him/her. Because some organs of the patient other than those under investigation will also be exposed to the radiation, it is important for radiation risk assessment to know how much radiation is received by the vital or radio-sensitive organs or tissues. In this study, an image-based body model created from the realistic images of a human is used together with the Monte Carlo code MCNP to compute the radiation doses absorbed by organs and tissues for some nuclear medicine procedures at gamma energies of 0.01, 0.015, 0.02, 0.03, 0.05, 0.1, 0.2, 0.5 and 1 MeV. Later, these values are used in conjunction with radiation weighting factors and organ weighting factors to estimate the effective dose for each diagnostic application.
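The organ doses obtained this way are typically combined into an effective dose as E = Σ_T w_T H_T, where for photons the equivalent dose H_T equals the absorbed organ dose (radiation weighting factor 1). The sketch below uses a few ICRP 103 tissue weighting factors for illustration; the study may have used a different factor set, and the dose values shown are placeholders, not results from the study.

```python
# Minimal sketch: combining organ doses into an effective dose, E = sum_T w_T * H_T.
# Only a few ICRP 103 tissue weighting factors are listed; dose values are toy numbers.
tissue_weights = {"gonads": 0.08, "lung": 0.12, "stomach": 0.12,
                  "liver": 0.04, "thyroid": 0.04, "remainder": 0.12}
organ_dose_mGy = {"gonads": 0.10, "lung": 0.25, "stomach": 0.18,
                  "liver": 0.22, "thyroid": 0.05, "remainder": 0.15}

effective_dose_mSv = sum(w * organ_dose_mGy[t] for t, w in tissue_weights.items())
print(f"Effective dose (partial organ list): {effective_dose_mSv:.3f} mSv")
```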
Otsuki, Soichi
2018-04-01
Polarimetric imaging of absorbing, strongly scattering, or birefringent inclusions is investigated in a negligibly absorbing, moderately scattering, and isotropic slab medium. It was proved that the reduced effective scattering Mueller matrix is exactly calculated from experimental or simulated raw matrices even if the medium is anisotropic and/or heterogeneous, or the outgoing light beam exits obliquely to the normal of the slab surface. The calculation also gives a reasonable approximation of the reduced matrix using a light beam with a finite diameter for illumination. The reduced matrix was calculated using a Monte Carlo simulation and was factorized in two dimensions by the Lu-Chipman polar decomposition. The intensity of backscattered light shows clear and modestly clear differences for absorbing and strongly scattering inclusions, respectively, whereas it shows no difference for birefringent inclusions. Conversely, some polarization parameters, for example the selective depolarization coefficients, exhibit only a slight difference for the absorbing inclusions, whereas they show a clear difference for the strongly scattering or birefringent inclusions. Moreover, these quantities become larger as the difference in the optical properties of the inclusions relative to the surrounding medium increases. However, it is difficult to recognize inclusions buried at depths greater than 3 mm below the surface. Thus, the present technique can detect the approximate shape and size of these inclusions and, considering the depth at which inclusions lie, estimate their optical properties. This study reveals the possibility of the polarization-sensitive imaging of turbid inhomogeneous media using a pencil beam for illumination.
Blake, S. J.; McNamara, A. L.; Vial, P.; Holloway, L.; Kuncic, Z.
2014-11-01
A Monte Carlo model of a novel electronic portal imaging device (EPID) has been developed using Geant4 and its performance for imaging and dosimetry applications in radiotherapy has been characterised. The EPID geometry is based on a physical prototype under ongoing investigation and comprises an array of plastic scintillating fibres in place of the metal plate/phosphor screen in standard EPIDs. Geometrical and optical transport parameters were varied to investigate their impact on imaging and dosimetry performance. Detection efficiency was most sensitive to variations in fibre length, achieving a peak value of 36% at 50 mm using 400 keV x-rays for the lengths considered. Increases in efficiency for longer fibres were partially offset by reductions in sensitivity. Removing the extra-mural absorber surrounding individual fibres severely decreased the modulation transfer function (MTF), highlighting its importance in maximising spatial resolution. Field size response and relative dose profile simulations demonstrated a water-equivalent dose response and thus the prototype’s suitability for dosimetry applications. Element-to-element mismatch between scintillating fibres and underlying photodiode pixels resulted in a reduced MTF for high spatial frequencies and quasi-periodic variations in dose profile response. This effect is eliminated when fibres are precisely matched to underlying pixels. Simulations strongly suggest that with further optimisation, this prototype EPID may be capable of simultaneous imaging and dosimetry in radiotherapy.
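The MTF reported above is conventionally obtained as the normalized magnitude of the Fourier transform of a line spread function. The sketch below shows that calculation on a synthetic Gaussian LSF; the sampling pitch and LSF width are illustrative assumptions, not the prototype's values.

```python
# Minimal sketch: presampled MTF as the normalized magnitude of the Fourier
# transform of a line spread function (LSF).
import numpy as np

def mtf_from_lsf(lsf, pixel_pitch_mm):
    spectrum = np.abs(np.fft.rfft(lsf))
    mtf = spectrum / spectrum[0]                           # normalize to zero frequency
    freqs = np.fft.rfftfreq(lsf.size, d=pixel_pitch_mm)    # cycles/mm
    return freqs, mtf

x = np.arange(-64, 64) * 0.4                               # mm, 0.4 mm sampling pitch
lsf = np.exp(-0.5 * (x / 0.6) ** 2)                        # toy LSF, sigma = 0.6 mm
freqs, mtf = mtf_from_lsf(lsf, pixel_pitch_mm=0.4)
```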
Framework for the construction of a Monte Carlo simulated brain PET–MR image database
International Nuclear Information System (INIS)
Thomas, B.A.; Erlandsson, K.; Drobnjak, I.; Pedemonte, S.; Vunckx, K.; Bousse, A.; Reilhac-Laborde, A.; Ourselin, S.; Hutton, B.F.
2014-01-01
Simultaneous PET–MR acquisition reduces the possibility of registration mismatch between the two modalities. This facilitates the application of techniques, either during reconstruction or post-reconstruction, that aim to improve the PET resolution by utilising structural information provided by MR. However, in order to validate such methods for brain PET–MR studies it is desirable to evaluate the performance using data where the ground truth is known. In this work, we present a framework for the production of datasets where simulations of both the PET and MR, based on real data, are generated such that reconstruction and post-reconstruction approaches can be fairly compared. -- Highlights: • A framework for simulating realistic brain PET–MR images is proposed. • The imaging data created is formed from real acquisitions. • Partial volume correction techniques can be fairly compared using this framework
Kano, Masayuki; Nagao, Hiromichi; Nagata, Kenji; Ito, Shin-ichi; Sakai, Shin'ichi; Nakagawa, Shigeki; Hori, Muneo; Hirata, Naoshi
2017-04-01
Earthquakes sometimes cause serious disasters not only directly through ground motion itself but also secondarily through infrastructure damage, particularly in densely populated urban areas. To reduce these secondary disasters, it is important to rapidly evaluate seismic hazards by analyzing the seismic responses of individual structures to the input ground motions. Such input motions are estimated utilizing an array of seismometers that are distributed more sparsely than the structures. We propose a methodology that integrates physics-based and data-driven approaches in order to obtain the seismic wavefield to be input into seismic response analysis. This study adopts the replica exchange Monte Carlo (REMC) method, one of the Markov chain Monte Carlo (MCMC) methods, for the estimation of the seismic wavefield together with the one-dimensional local subsurface structure and source information. Numerical tests show that the REMC method is able to search the parameters related to the source and the local subsurface structure in a broader parameter space than the Metropolis method, an ordinary MCMC method, and that it reproduces a seismic wavefield consistent with the true one. In contrast, ordinary kriging, a classical data-driven interpolation method for spatial data, is hardly able to reproduce the true wavefield even at low frequencies. This indicates that it is essential to take both physics-based and data-driven approaches into consideration for seismic wavefield imaging. The REMC method is then applied to actual waveforms observed by the dense seismic array MeSO-net (Metropolitan Seismic Observation network), in which 296 accelerometers operate continuously at intervals of several kilometers in the Tokyo metropolitan area, Japan. The estimated wavefield within a frequency band of 0.10-0.20 Hz is consistent with the observed waveforms. Further investigation suggests that the seismic wavefield is successfully
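The exchange step that distinguishes REMC (parallel tempering) from the ordinary Metropolis method can be sketched as follows; the log-posterior and temperature ladder below are placeholders, not the study's actual wavefield and source misfit model.

```python
import numpy as np

rng = np.random.default_rng(0)

def log_post(theta):
    # Placeholder log-posterior; the real one involves the wavefield/source misfit.
    return -0.5 * np.sum(theta ** 2)

temps = np.array([1.0, 2.0, 4.0, 8.0])            # temperature ladder
replicas = [rng.normal(size=3) for _ in temps]     # one parameter vector per replica

for it in range(1000):
    # Tempered Metropolis update within each replica.
    for k, T in enumerate(temps):
        prop = replicas[k] + 0.5 * rng.normal(size=3)
        if np.log(rng.random()) < (log_post(prop) - log_post(replicas[k])) / T:
            replicas[k] = prop
    # Exchange attempt between a random pair of neighbouring temperatures.
    k = rng.integers(len(temps) - 1)
    d_beta = 1.0 / temps[k] - 1.0 / temps[k + 1]
    d_logp = log_post(replicas[k + 1]) - log_post(replicas[k])
    if np.log(rng.random()) < d_beta * d_logp:
        replicas[k], replicas[k + 1] = replicas[k + 1], replicas[k]
```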
Energy Technology Data Exchange (ETDEWEB)
Dong, Han; Sharma, Diksha; Badano, Aldo, E-mail: aldo.badano@fda.hhs.gov [Division of Imaging, Diagnostics, and Software Reliability, Center for Devices and Radiological Health, U.S. Food and Drug Administration, Silver Spring, Maryland 20993 (United States)
2014-12-15
Purpose: Monte Carlo simulations play a vital role in the understanding of the fundamental limitations, design, and optimization of existing and emerging medical imaging systems. Efforts in this area have resulted in the development of a wide variety of open-source software packages. One such package, hybridMANTIS, uses a novel hybrid concept to model indirect scintillator detectors by balancing the computational load using dual CPU and graphics processing unit (GPU) processors, obtaining computational efficiency with reasonable accuracy. In this work, the authors describe two open-source visualization interfaces, webMANTIS and visualMANTIS, to facilitate the setup of computational experiments via hybridMANTIS. Methods: The visualization tools visualMANTIS and webMANTIS enable the user to control simulation properties through a user interface. In the case of webMANTIS, control via a web browser allows access through mobile devices such as smartphones or tablets. webMANTIS acts as a server back-end and communicates with an NVIDIA GPU computing cluster that can support multiuser environments where users can execute different experiments in parallel. Results: The output consists of the point response, pulse-height spectrum, and optical transport statistics generated by hybridMANTIS. The users can download the output images and statistics through a zip file for future reference. In addition, webMANTIS provides a visualization window that displays a few selected optical photon paths as they are transported through the detector columns and allows the user to trace the history of the optical photons. Conclusions: The visualization tools visualMANTIS and webMANTIS provide features such as on-the-fly generation of pulse-height spectra and response functions for microcolumnar x-ray imagers while allowing users to save simulation parameters and results from prior experiments. The graphical interfaces simplify the simulation setup and allow the user to go directly from specifying
Dong, Han; Sharma, Diksha; Badano, Aldo
2014-12-01
Monte Carlo simulations play a vital role in the understanding of the fundamental limitations, design, and optimization of existing and emerging medical imaging systems. Efforts in this area have resulted in the development of a wide variety of open-source software packages. One such package, hybridMANTIS, uses a novel hybrid concept to model indirect scintillator detectors by balancing the computational load using dual CPU and graphics processing unit (GPU) processors, obtaining computational efficiency with reasonable accuracy. In this work, the authors describe two open-source visualization interfaces, webMANTIS and visualMANTIS, to facilitate the setup of computational experiments via hybridMANTIS. The visualization tools visualMANTIS and webMANTIS enable the user to control simulation properties through a user interface. In the case of webMANTIS, control via a web browser allows access through mobile devices such as smartphones or tablets. webMANTIS acts as a server back-end and communicates with an NVIDIA GPU computing cluster that can support multiuser environments where users can execute different experiments in parallel. The output consists of the point response, pulse-height spectrum, and optical transport statistics generated by hybridMANTIS. The users can download the output images and statistics through a zip file for future reference. In addition, webMANTIS provides a visualization window that displays a few selected optical photon paths as they are transported through the detector columns and allows the user to trace the history of the optical photons. The visualization tools visualMANTIS and webMANTIS provide features such as on-the-fly generation of pulse-height spectra and response functions for microcolumnar x-ray imagers while allowing users to save simulation parameters and results from prior experiments. The graphical interfaces simplify the simulation setup and allow the user to go directly from specifying input parameters to receiving visual
International Nuclear Information System (INIS)
Dong, Han; Sharma, Diksha; Badano, Aldo
2014-01-01
Purpose: Monte Carlo simulations play a vital role in the understanding of the fundamental limitations, design, and optimization of existing and emerging medical imaging systems. Efforts in this area have resulted in the development of a wide variety of open-source software packages. One such package, hybridMANTIS, uses a novel hybrid concept to model indirect scintillator detectors by balancing the computational load using dual CPU and graphics processing unit (GPU) processors, obtaining computational efficiency with reasonable accuracy. In this work, the authors describe two open-source visualization interfaces, webMANTIS and visualMANTIS, to facilitate the setup of computational experiments via hybridMANTIS. Methods: The visualization tools visualMANTIS and webMANTIS enable the user to control simulation properties through a user interface. In the case of webMANTIS, control via a web browser allows access through mobile devices such as smartphones or tablets. webMANTIS acts as a server back-end and communicates with an NVIDIA GPU computing cluster that can support multiuser environments where users can execute different experiments in parallel. Results: The output consists of the point response, pulse-height spectrum, and optical transport statistics generated by hybridMANTIS. The users can download the output images and statistics through a zip file for future reference. In addition, webMANTIS provides a visualization window that displays a few selected optical photon paths as they are transported through the detector columns and allows the user to trace the history of the optical photons. Conclusions: The visualization tools visualMANTIS and webMANTIS provide features such as on-the-fly generation of pulse-height spectra and response functions for microcolumnar x-ray imagers while allowing users to save simulation parameters and results from prior experiments. The graphical interfaces simplify the simulation setup and allow the user to go directly from specifying
TH-A-18C-09: Ultra-Fast Monte Carlo Simulation for Cone Beam CT Imaging of Brain Trauma
Energy Technology Data Exchange (ETDEWEB)
Sisniega, A; Zbijewski, W; Stayman, J [Department of Biomedical Engineering, Johns Hopkins University (United States); Yorkston, J [Carestream Health (United States); Aygun, N [Department of Radiology, Johns Hopkins University (United States); Koliatsos, V [Department of Neurology, Johns Hopkins University (United States); Siewerdsen, J [Department of Biomedical Engineering, Johns Hopkins University (United States); Department of Radiology, Johns Hopkins University (United States)
2014-06-15
Purpose: Application of cone-beam CT (CBCT) to low-contrast soft tissue imaging, such as in detection of traumatic brain injury, is challenged by high levels of scatter. A fast, accurate scatter correction method based on Monte Carlo (MC) estimation is developed for application in high-quality CBCT imaging of acute brain injury. Methods: The correction involves MC scatter estimation executed on an NVIDIA GTX 780 GPU (MC-GPU), with baseline simulation speed of ~1e7 photons/sec. MC-GPU is accelerated by a novel, GPU-optimized implementation of variance reduction (VR) techniques (forced detection and photon splitting). The number of simulated tracks and projections is reduced for additional speed-up. Residual noise is removed and the missing scatter projections are estimated via kernel smoothing (KS) in the projection plane and across gantry angles. The method is assessed using CBCT images of a head phantom presenting a realistic simulation of fresh intracranial hemorrhage (100 kVp, 180 mAs, 720 projections, source-detector distance 700 mm, source-axis distance 480 mm). Results: For a fixed run-time of ~1 sec/projection, GPU-optimized VR reduces the noise in MC-GPU scatter estimates by a factor of 4. For scatter correction, MC-GPU with VR is executed with 4-fold angular downsampling and 1e5 photons/projection, yielding a 3.5 minute run-time per scan, and de-noised with optimized KS. Corrected CBCT images demonstrate uniformity improvement of 18 HU and contrast improvement of 26 HU compared to no correction, and a 52% increase in contrast-to-noise ratio in simulated hemorrhage compared to “oracle” constant fraction correction. Conclusion: Acceleration of MC-GPU achieved through GPU-optimized variance reduction and kernel smoothing yields an efficient (<5 min/scan) and accurate scatter correction that does not rely on additional hardware or simplifying assumptions about the scatter distribution. The method is undergoing implementation in a novel CBCT dedicated to brain
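The denoising and angular interpolation step described above, kernel smoothing of sparse MC scatter estimates within the projection plane and across gantry angles, can be sketched as follows; the array shapes, smoothing widths, and interpolation scheme are illustrative placeholders, not the values tuned in the study.

```python
import numpy as np
from scipy.ndimage import gaussian_filter
from scipy.interpolate import interp1d

# Hypothetical noisy MC scatter estimates: 180 simulated projections (4-fold angular
# downsampling of 720), each 128 x 128 detector pixels.
rng = np.random.default_rng(1)
scatter_coarse = rng.poisson(50, size=(180, 128, 128)).astype(float)

# Kernel smoothing across gantry angles (axis 0) and within the projection plane.
scatter_smooth = gaussian_filter(scatter_coarse, sigma=(2.0, 5.0, 5.0))

# Estimate the missing projections by interpolating back to all 720 angles.
angles_coarse = np.arange(0, 720, 4)
angles_full = np.arange(720)
scatter_full = interp1d(angles_coarse, scatter_smooth, axis=0,
                        fill_value="extrapolate")(angles_full)
print(scatter_full.shape)
```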
TH-A-18C-09: Ultra-Fast Monte Carlo Simulation for Cone Beam CT Imaging of Brain Trauma
International Nuclear Information System (INIS)
Sisniega, A; Zbijewski, W; Stayman, J; Yorkston, J; Aygun, N; Koliatsos, V; Siewerdsen, J
2014-01-01
Purpose: Application of cone-beam CT (CBCT) to low-contrast soft tissue imaging, such as in detection of traumatic brain injury, is challenged by high levels of scatter. A fast, accurate scatter correction method based on Monte Carlo (MC) estimation is developed for application in high-quality CBCT imaging of acute brain injury. Methods: The correction involves MC scatter estimation executed on an NVIDIA GTX 780 GPU (MC-GPU), with baseline simulation speed of ~1e7 photons/sec. MC-GPU is accelerated by a novel, GPU-optimized implementation of variance reduction (VR) techniques (forced detection and photon splitting). The number of simulated tracks and projections is reduced for additional speed-up. Residual noise is removed and the missing scatter projections are estimated via kernel smoothing (KS) in the projection plane and across gantry angles. The method is assessed using CBCT images of a head phantom presenting a realistic simulation of fresh intracranial hemorrhage (100 kVp, 180 mAs, 720 projections, source-detector distance 700 mm, source-axis distance 480 mm). Results: For a fixed run-time of ~1 sec/projection, GPU-optimized VR reduces the noise in MC-GPU scatter estimates by a factor of 4. For scatter correction, MC-GPU with VR is executed with 4-fold angular downsampling and 1e5 photons/projection, yielding a 3.5 minute run-time per scan, and de-noised with optimized KS. Corrected CBCT images demonstrate uniformity improvement of 18 HU and contrast improvement of 26 HU compared to no correction, and a 52% increase in contrast-to-noise ratio in simulated hemorrhage compared to “oracle” constant fraction correction. Conclusion: Acceleration of MC-GPU achieved through GPU-optimized variance reduction and kernel smoothing yields an efficient (<5 min/scan) and accurate scatter correction that does not rely on additional hardware or simplifying assumptions about the scatter distribution. The method is undergoing implementation in a novel CBCT dedicated to brain
International Nuclear Information System (INIS)
Hristov, D; Schlosser, J; Bazalova, M; Chen, J
2014-01-01
Purpose: To quantify the effect of ultrasound (US) probe beam attenuation for radiation therapy delivered under real-time US image guidance by means of Monte Carlo (MC) simulations. Methods: MC models of two Philips US probes, an X6-1 matrix-array transducer and a C5-2 curved-array transducer, were built based on their CT images in the EGSnrc BEAMnrc and DOSXYZnrc codes. Due to the metal parts, the probes were scanned in a Tomotherapy machine with a 3.5 MV beam. Mass densities in the probes were assigned based on an electron density calibration phantom consisting of cylinders with mass densities between 0.2–8.0 g/cm³. Beam attenuation due to the probes was measured in a solid water phantom for 6 MV and 15 MV 15×15 cm² beams delivered on a Varian Trilogy linear accelerator. The dose was measured with the PTW-729 ionization chamber array at two depths and compared to MC simulations. The extreme-case beam attenuation expected in robotic US image guided radiotherapy for probes in the upright position was quantified by means of MC simulations. Results: The 3.5 MV CT number to mass density calibration curve was found to be linear with R² > 0.99. The maximum mass densities were 4.6 and 4.2 g/cm³ in the C5-2 and X6-1 probe, respectively. Gamma analysis of the simulated and measured doses revealed that over 98% of measurement points passed the 3%/3mm criteria for both probes and measurement depths. The extreme attenuation for probes in the upright position was found to be 25% and 31% for the C5-2 and X6-1 probe, respectively, for both 6 and 15 MV beams at 10 cm depth. Conclusion: MC models of two US probes used for real-time image guidance during radiotherapy have been built. As a result, radiotherapy treatment planning with the imaging probes in place can now be performed. J Schlosser is an employee of SoniTrack Systems, Inc. D Hristov has financial interest in SoniTrack Systems, Inc
Peterson, Mikael; Strand, Sven-Erik; Ljungberg, Michael
2015-04-01
Pinhole collimation is the most common method of high-resolution preclinical single photon emission computed tomography imaging. The collimators are usually constructed from dense materials with high atomic numbers, such as gold and platinum, which are expensive and not always flexible in the fabrication step. In this work, the authors have investigated the properties of a fusible alloy called Rose's metal and its potential in pinhole preclinical imaging. When compared to current standard pinhole materials such as gold and platinum, Rose's metal has a lower density and a relatively low effective atomic number. However, it is inexpensive, has a low melting point, and does not contract when solidifying. Once cast, the piece can be machined with high precision. The aim of this study was to evaluate the imaging properties for Rose's metal and compare them with those of standard materials. After validating their Monte Carlo code by comparing its results with published data and the results from analytical calculations, they investigated different pinhole geometries by varying the collimator material, acceptance angle, aperture diameter, and photon incident angle. The penetration-to-scatter and penetration-to-total component ratios, sensitivity, and the spatial resolution were determined for gold, tungsten, and Rose's metal for two radionuclides, 99mTc and 125I. The Rose's metal pinhole-imaging simulations show higher penetration/total and scatter/total ratios. For example, the penetration/total is 50% for gold and 75% for Rose's metal when simulating 99mTc with a 0.3 mm aperture diameter and a 60° acceptance angle. However, the degradation in spatial resolution remained below 10% relative to the spatial resolution for gold for acceptance angles below 40° and aperture diameters larger than 0.5 mm. Extra penetration and scatter associated with Rose's metal contribute to degradation in the spatial resolution, but this degradation is not always substantial. The
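The pinhole parameters varied above (aperture diameter, acceptance angle, material attenuation) are often first explored with the standard analytical pinhole expressions before running a full Monte Carlo study; the sketch below uses commonly quoted forms (Anger's effective aperture diameter, geometric sensitivity, and geometric resolution) with illustrative numbers that are not those of the paper.

```python
import numpy as np

def effective_diameter(d_mm, mu_per_mm, alpha_rad):
    # Anger's effective aperture diameter, accounting for edge penetration.
    return np.sqrt(d_mm * (d_mm + 2.0 * np.tan(alpha_rad / 2.0) / mu_per_mm))

def pinhole_sensitivity(d_eff_mm, h_mm, theta_rad=np.pi / 2):
    # Geometric efficiency for a point source at distance h from the aperture.
    return d_eff_mm**2 * np.sin(theta_rad)**3 / (16.0 * h_mm**2)

def pinhole_resolution(d_eff_mm, h_mm, f_mm):
    # Geometric resolution for source distance h and pinhole-to-detector distance f.
    return d_eff_mm * (h_mm + f_mm) / f_mm

# Illustrative case: 0.3 mm aperture, 60 degree acceptance angle, and an assumed
# linear attenuation coefficient of ~3.5 /mm for a gold-like material at 140 keV.
d_eff = effective_diameter(0.3, 3.5, np.deg2rad(60.0))
print(d_eff,
      pinhole_sensitivity(d_eff, h_mm=30.0),
      pinhole_resolution(d_eff, h_mm=30.0, f_mm=40.0))
```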
International Nuclear Information System (INIS)
Noblet, Caroline
2014-01-01
Innovative irradiators dedicated to small animals make it possible to mimic clinical treatments in image-guided radiation therapy. Clinical practice is scaled down to the small animal by reducing the beam dimensions (from cm to mm) and the energy (from MeV to keV). Millimetric medium-energy beams (<300 keV) are used to treat the animals. This scaling induces higher constraints than in clinical practice, especially for absorbed dose calculation in the animals. Because of the beam dimensions and the medium energy range, clinical dose calculation methods are not easily applicable to preclinical practice, and Monte Carlo methods are needed. To this aim, a Monte Carlo model of the XRAD225Cx preclinical irradiator has been developed with the GATE (Geant4) framework. This model was validated by comparing simulation results against measurements and against results obtained with a reference Monte Carlo code in external beam radiation therapy, EGSnrc. A specific issue has been highlighted: the significant dosimetric impact of tissue segmentation in the animal CT images. Indeed, in the medium energy range, thresholding based on electron density alone cannot accurately account for heterogeneities; materials should be defined using both the tissue elemental composition and the mass density. An original segmentation method has been developed to obtain realistic dose distributions in small animals. Finally, our Monte Carlo platform has been successfully used for several radiobiological studies with mice and rats. (author) [fr]
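The segmentation issue described above, assigning both an elemental composition and a mass density rather than thresholding on a single density value, can be sketched as a two-step lookup on the CT numbers; the HU intervals and the density mapping below are illustrative placeholders, not the calibration developed in the thesis.

```python
import numpy as np

# Illustrative HU ranges -> material (elemental composition label), plus a
# placeholder piecewise-linear HU -> mass density mapping in g/cm^3.
MATERIALS = [(-1000, -200, "lung"), (-200, 100, "soft_tissue"), (100, 3000, "bone")]

def assign_material(hu):
    for lo, hi, name in MATERIALS:
        if lo <= hu < hi:
            return name
    return "air"

def mass_density(hu):
    # Placeholder bilinear HU-to-density conversion.
    return 1.0 + hu / 1000.0 if hu < 100 else 1.1 + (hu - 100) / 1600.0

ct_slice = np.array([[-700, 20], [300, 1200]])
materials = np.vectorize(assign_material)(ct_slice)   # composition per voxel
densities = np.vectorize(mass_density)(ct_slice)      # density per voxel
print(materials, densities, sep="\n")
```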
Saghamanesh, S.; Aghamiri, S. M.; Kamali-Asl, A.; Yashiro, W.
2017-09-01
An important challenge in real-world biomedical applications of x-ray phase contrast imaging (XPCI) techniques is the efficient use of the photon flux generated by an incoherent and polychromatic x-ray source. This efficiency can directly influence dose and exposure time and ideally should not affect the superior contrast and sensitivity of XPCI. In this paper, we present a quantitative evaluation of the photon detection efficiency of two laboratory-based XPCI methods, grating interferometry (GI) and coded-aperture (CA). We adopt a Monte Carlo approach to simulate existing prototypes of those systems, tailored for mammography applications. Our simulations were validated by means of a simple experiment performed on a CA XPCI system. Our results show that the fraction of detected photons in the standard energy range of mammography are about 1.4% and 10% for the GI and CA techniques, respectively. The simulations indicate that the design of the optical components plays an important role in the higher efficiency of CA compared to the GI method. It is shown that the use of lower absorbing materials as the substrates for GI gratings can improve its flux efficiency by up to four times. Along similar lines, we also show that an optimized and compact configuration of GI could lead to a 3.5 times higher fraction of detected counts compared to a standard and non-optimised GI implementation.
Energy Technology Data Exchange (ETDEWEB)
Fallahpoor, M; Abbasi, M [Tehran University of Medical Sciences, Vali-Asr Hospital, Tehran, Tehran (Iran, Islamic Republic of); Sen, A [University of Houston, Houston, TX (United States); Parach, A [Shahid Sadoughi University of Medical Sciences, Yazd, Yazd (Iran, Islamic Republic of); Kalantari, F [UT Southwestern Medical Center, Dallas, TX (United States)
2015-06-15
Purpose: Patient-specific three-dimensional (3D) internal dosimetry in targeted radionuclide therapy is essential for efficient treatment. Two major steps to achieve reliable results are: 1) generating quantitative 3D images of the radionuclide distribution and attenuation coefficients and 2) using a reliable method for dose calculation based on the activity and attenuation maps. In this work, internal dosimetry for 153-Samarium (153-Sm) was performed using SPECT/CT images coupled with the GATE Monte Carlo package. Methods: A 50-year-old woman with bone metastases from breast cancer was prescribed 153-Sm treatment (gamma: 103 keV, beta: 0.81 MeV). A SPECT/CT scan was performed with a Siemens Symbia T scanner. SPECT and CT images were registered using the default registration software. SPECT quantification was achieved by compensating for all image-degrading factors, including body attenuation, Compton scattering, and the collimator-detector response (CDR). The triple energy window method was used to estimate and eliminate the scattered photons. Iterative ordered-subsets expectation maximization (OSEM) with correction for attenuation and distance-dependent CDR was used for image reconstruction. Bilinear energy mapping was used to convert Hounsfield units in the CT image to an attenuation map. Organ borders were defined by itk-SNAP segmentation on the CT image. GATE was then used for internal dose calculation. The specific absorbed fractions (SAFs) and S values were reported following the MIRD schema. Results: The results showed that the largest SAFs and S values are in osseous organs, as expected. The S value for the lung is the highest after the spine, which can be important in 153-Sm therapy. Conclusion: We presented the utility of SPECT/CT images coupled with Monte Carlo simulation for patient-specific dosimetry as a reliable and accurate method. It has several advantages over template-based methods or simplified dose estimation methods. With the advent of high-speed computers, Monte Carlo can be used for treatment planning
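The triple energy window estimate mentioned above has a simple closed form; the sketch below uses illustrative window widths and counts, since the exact windows used in the study are not given beyond the method name.

```python
import numpy as np

def tew_scatter(counts_lower, counts_upper, w_lower, w_upper, w_main):
    """Triple energy window estimate of scatter counts inside the photopeak window."""
    return (counts_lower / w_lower + counts_upper / w_upper) * w_main / 2.0

# Hypothetical per-pixel counts in two narrow sub-windows flanking the 103 keV
# photopeak window (window widths in keV are illustrative).
c_low, c_up = np.array([120.0]), np.array([40.0])
scatter = tew_scatter(c_low, c_up, w_lower=4.0, w_upper=4.0, w_main=20.0)
primary = np.array([900.0]) - scatter   # photopeak counts minus the scatter estimate
print(scatter, primary)
```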
Wang, Yi; El-Mohri, Youcef; Antonuk, Larry E.; Zhao, Qihua
2010-07-01
The use of thick, segmented scintillators in electronic portal imagers offers the potential for significant improvement in x-ray detection efficiency compared to conventional phosphor screens. Such improvement substantially increases the detective quantum efficiency (DQE), leading to the possibility of achieving soft-tissue visualization at clinically practical (i.e. low) doses using megavoltage (MV) cone-beam computed tomography. While these DQE increases are greatest at zero spatial frequency, they are diminished at higher frequencies as a result of degradation of spatial resolution due to lateral spreading of secondary radiation within the scintillator—an effect that is more pronounced for thicker scintillators. The extent of this spreading is even more accentuated for radiation impinging the scintillator at oblique angles of incidence due to beam divergence. In this paper, Monte Carlo simulations of radiation transport, performed to investigate and quantify the effects of beam divergence on the imaging performance of MV imagers based on two promising scintillators (BGO and CsI:Tl), are reported. In these studies, 10-40 mm thick scintillators, incorporating low-density polymer, or high-density tungsten septal walls, were examined for incident angles corresponding to that encountered at locations up to ~15 cm from the central beam axis (for an imager located 130 cm from a radiotherapy x-ray source). The simulations demonstrate progressively more severe spatial resolution degradation (quantified in terms of the effect on the modulation transfer function) as a function of increasing angle of incidence (as well as of the scintillator thickness). Since the noise power behavior was found to be largely independent of the incident angle, the dependence of the DQE on the incident angle is therefore primarily determined by the spatial resolution. The observed DQE degradation suggests that 10 mm thick scintillators are not strongly affected by beam divergence for
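The frequency-dependent DQE discussed above is commonly estimated from the MTF and the noise power spectrum; the sketch below uses one standard form of that relation, and all numerical inputs (mean signal, fluence, MTF, NPS) are hypothetical rather than taken from the reported simulations.

```python
import numpy as np

def dqe(mtf, nps, mean_signal, incident_fluence):
    """One common estimator: DQE(f) = S^2 * MTF(f)^2 / (q * NPS(f))."""
    return (mean_signal ** 2) * mtf ** 2 / (incident_fluence * nps)

freq = np.linspace(0.0, 1.0, 6)            # cycles/mm (illustrative grid)
mtf = np.exp(-2.0 * freq)                  # hypothetical MTF
nps = np.full_like(freq, 2.0e-6)           # hypothetical NPS in (signal units)^2 * mm^2
print(freq, dqe(mtf, nps, mean_signal=1.0e-3, incident_fluence=1.0e4))
```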
Energy Technology Data Exchange (ETDEWEB)
Safavi-Naeini, M.; Han, Z.; Cutajar, D.; Guatelli, S.; Petasecca, M.; Lerch, M. L. F. [Centre for Medical Radiation Physics, University of Wollongong, Wollongong, NSW 2522 (Australia); Franklin, D. R. [Faculty of Engineering and Information Technology, University of Technology, Sydney, NSW 2007 (Australia); Jakubek, J.; Pospisil, S. [Institute of Experimental and Applied Physics (IEAP), Czech Technical University in Prague (CTU) (Czech Republic); Bucci, J.; Zaider, M.; Rosenfeld, A. B. [St. George Hospital Cancer Care Centre, Gray Street, Kogarah, NSW 2217 (Australia); Memorial Sloan Kettering Cancer Center, 1275 York Avenue, New York, New York 10065 (United States); Centre for Medical Radiation Physics, University of Wollongong, Wollongong, NSW 2522 (Australia)
2013-07-15
Purpose: High dose rate (HDR) brachytherapy is a form of radiation therapy for treating prostate cancer whereby a high activity radiation source is moved between predefined positions inside applicators inserted within the treatment volume. Accurate positioning of the source is essential in delivering the desired dose to the target area while avoiding radiation injury to the surrounding tissue. In this paper, HDR BrachyView, a novel inbody dosimetric imaging system for real time monitoring and verification of the radioactive seed position in HDR prostate brachytherapy treatment is introduced. The current prototype consists of a 15 × 60 mm² silicon pixel detector with a multipinhole tungsten collimator placed 6.5 mm above the detector. Seven identical pinholes allow full imaging coverage of the entire treatment volume. The combined pinhole and pixel sensor arrangement is geometrically designed to be able to resolve the three-dimensional location of the source. The probe may be rotated to keep the whole prostate within the transverse plane. The purpose of this paper is to demonstrate the efficacy of the design through computer simulation, and to estimate the accuracy in resolving the source position (in detector plane and in 3D space) as part of the feasibility study for the BrachyView project. Methods: Monte Carlo simulations were performed using the GEANT4 radiation transport model, with a 192Ir source placed in different locations within a prostate phantom. A geometrically accurate model of the detector and collimator were constructed. Simulations were conducted with a single pinhole to evaluate the pinhole design and the signal to background ratio obtained. Second, a pair of adjacent pinholes were simulated to evaluate the error in calculated source location. Results: Simulation results show that accurate determination of the true source position is easily obtainable within the typical one second source dwell time. The maximum error in
Safavi-Naeini, M; Han, Z; Cutajar, D; Guatelli, S; Petasecca, M; Lerch, M L F; Franklin, D R; Jakubek, J; Pospisil, S; Bucci, J; Zaider, M; Rosenfeld, A B
2013-07-01
High dose rate (HDR) brachytherapy is a form of radiation therapy for treating prostate cancer whereby a high activity radiation source is moved between predefined positions inside applicators inserted within the treatment volume. Accurate positioning of the source is essential in delivering the desired dose to the target area while avoiding radiation injury to the surrounding tissue. In this paper, HDR BrachyView, a novel inbody dosimetric imaging system for real time monitoring and verification of the radioactive seed position in HDR prostate brachytherapy treatment is introduced. The current prototype consists of a 15 × 60 mm² silicon pixel detector with a multipinhole tungsten collimator placed 6.5 mm above the detector. Seven identical pinholes allow full imaging coverage of the entire treatment volume. The combined pinhole and pixel sensor arrangement is geometrically designed to be able to resolve the three-dimensional location of the source. The probe may be rotated to keep the whole prostate within the transverse plane. The purpose of this paper is to demonstrate the efficacy of the design through computer simulation, and to estimate the accuracy in resolving the source position (in detector plane and in 3D space) as part of the feasibility study for the BrachyView project. Monte Carlo simulations were performed using the GEANT4 radiation transport model, with a 192Ir source placed in different locations within a prostate phantom. A geometrically accurate model of the detector and collimator were constructed. Simulations were conducted with a single pinhole to evaluate the pinhole design and the signal to background ratio obtained. Second, a pair of adjacent pinholes were simulated to evaluate the error in calculated source location. Simulation results show that accurate determination of the true source position is easily obtainable within the typical one second source dwell time. The maximum error in the estimated projection position was found to be
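The way two adjacent pinholes resolve the 3D source position can be sketched as intersecting, in the least-squares sense, the two rays defined by each pinhole and the centroid of its source projection on the detector; the geometry values below are illustrative and are not the BrachyView dimensions.

```python
import numpy as np

def closest_point_between_rays(p1, d1, p2, d2):
    """Least-squares 'intersection' of two rays p_i + t_i * d_i (midpoint of the gap)."""
    d1, d2 = d1 / np.linalg.norm(d1), d2 / np.linalg.norm(d2)
    a, b, c = d1 @ d1, d1 @ d2, d2 @ d2
    w = p1 - p2
    denom = a * c - b * b
    t1 = (b * (d2 @ w) - c * (d1 @ w)) / denom
    t2 = (a * (d2 @ w) - b * (d1 @ w)) / denom
    return ((p1 + t1 * d1) + (p2 + t2 * d2)) / 2.0

# Illustrative geometry (mm): two pinholes above the detector plane (z = 0) and the
# measured centroids of the source projections on the detector.
pinhole_a, pinhole_b = np.array([10.0, 0.0, 6.5]), np.array([30.0, 0.0, 6.5])
centroid_a, centroid_b = np.array([8.0, -1.0, 0.0]), np.array([33.0, -1.0, 0.0])
ray_a, ray_b = pinhole_a - centroid_a, pinhole_b - centroid_b  # detector -> pinhole -> source
print(closest_point_between_rays(pinhole_a, ray_a, pinhole_b, ray_b))
```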
Energy Technology Data Exchange (ETDEWEB)
Barrera, C A; Moran, M J
2007-08-21
The Neutron Imaging System (NIS) is one of seven ignition target diagnostics under development for the National Ignition Facility. The NIS is required to record hot-spot (13-15 MeV) and downscattered (6-10 MeV) images with a resolution of 10 microns and a signal-to-noise ratio (SNR) of 10 at the 20% contour. The NIS is a valuable diagnostic since the downscattered neutrons reveal the spatial distribution of the cold fuel during an ignition attempt, providing important information in the case of a failed implosion. The present study explores the parameter space of several line-of-sight (LOS) configurations that could serve as the basis for the final design. Six commercially available organic scintillators were experimentally characterized for their light emission decay profile and neutron sensitivity. The samples showed a long lived decay component that makes direct recording of a downscattered image impossible. The two best candidates for the NIS detector material are: EJ232 (BC422) plastic fibers or capillaries filled with EJ399B. A Monte Carlo-based end-to-end model of the NIS was developed to study the imaging capabilities of several LOS configurations and verify that the recovered sources meet the design requirements. The model includes accurate neutron source distributions, aperture geometries (square pinhole, triangular wedge, mini-penumbral, annular and penumbral), their point spread functions, and a pixelated scintillator detector. The modeling results show that a useful downscattered image can be obtained by recording the primary peak and the downscattered images, and then subtracting a decayed version of the former from the latter. The difference images need to be deconvolved in order to obtain accurate source distributions. The images are processed using a frequency-space modified-regularization algorithm and low-pass filtering. The resolution and SNR of these sources are quantified by using two surrogate sources. The simulations show that all LOS
Chuang, Ching-Cheng; Lee, Chia-Yen; Chen, Chung-Ming; Hsieh, Yao-Sheng; Liu, Tsan-Chi; Sun, Chia-Wei
2012-05-01
This study proposed diffuser-aided diffuse optical imaging (DADOI) as a new approach to improve the performance of conventional diffuse optical tomography (DOT) for breast imaging. The 3-D breast model for the Monte Carlo simulation is remodeled from a clinical MRI image. The modified Beer-Lambert law is adopted in the DADOI approach to replace the complex inverse-problem algorithms for mapping the spatial distribution, and the depth information is obtained from a time-of-flight estimation. The simulation results demonstrate that the time-resolved Monte Carlo method is capable of source-detector separation analysis. The dynamics of photon migration with various source-detector separations are analyzed for the characterization of breast tissue and the estimation of the optode arrangement. The source-detector separations should be less than 4 cm for breast imaging in a DOT system. Meanwhile, the feasibility of DADOI was demonstrated in this study. In the results, the DADOI approach can provide better imaging contrast and faster imaging than conventional DOT measurement. The DADOI approach possesses great potential for detecting breast tumors at an early stage and for chemotherapy monitoring, which implies good feasibility for clinical application.
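The modified Beer-Lambert relation used by the DADOI approach links the change in optical density to an absorption change through the source-detector separation and a differential pathlength factor; a minimal sketch with placeholder values follows (the DPF, separation, and intensities are illustrative, not the study's).

```python
import numpy as np

def delta_mu_a(intensity, intensity_ref, separation_cm, dpf):
    """Modified Beer-Lambert law: delta_OD = delta_mu_a * separation * DPF."""
    delta_od = -np.log(intensity / intensity_ref)
    return delta_od / (separation_cm * dpf)

# Hypothetical detected intensities at a 3 cm source-detector separation,
# with an assumed differential pathlength factor of 5.
print(delta_mu_a(intensity=0.92, intensity_ref=1.0, separation_cm=3.0, dpf=5.0))
```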
International Nuclear Information System (INIS)
Lazaro, D; Buvat, I; Loudos, G; Strul, D; Santin, G; Giokaris, N; Donnarieix, D; Maigne, L; Spanoudaki, V; Styliaris, S; Staelens, S; Breton, V
2004-01-01
Monte Carlo simulations are increasingly used in scintigraphic imaging to model imaging systems and to develop and assess tomographic reconstruction algorithms and correction methods for improved image quantitation. GATE (GEANT4 application for tomographic emission) is a new Monte Carlo simulation platform based on GEANT4 and dedicated to nuclear imaging applications. This paper describes the GATE simulation of a prototype scintillation camera dedicated to small-animal imaging and consisting of a CsI(Tl) crystal array coupled to a position-sensitive photomultiplier tube. The relevance of GATE for modelling the camera prototype was assessed by comparing simulated 99mTc point spread functions, energy spectra, sensitivities, scatter fractions and an image of a capillary phantom with the corresponding experimental measurements. Results showed an excellent agreement between simulated and experimental data: experimental spatial resolutions were predicted with an error of less than 100 μm. The difference between experimental and simulated system sensitivities for different source-to-collimator distances was within 2%. Simulated and experimental scatter fractions in a [98-82 keV] energy window differed by less than 2% for sources located in water. Simulated and experimental energy spectra agreed very well between 40 and 180 keV. These results demonstrate the ability and flexibility of GATE for simulating original detector designs. The main weakness of GATE concerns the long computation time it requires: this issue is currently under investigation by the GEANT4 and the GATE collaborations
International Nuclear Information System (INIS)
Momennezhad, Mehdi; Nasseri, Shahrokh; Zakavi, Seyed Rasoul; Parach, Ali Asghar; Ghorbani, Mahdi; Asl, Ruhollah Ghahraman
2016-01-01
Single-photon emission computed tomography (SPECT)-based tracers are easily available and more widely used than positron emission tomography (PET)-based tracers, and SPECT imaging still remains the most prevalent nuclear medicine imaging modality worldwide. The aim of this study is to implement an image-based Monte Carlo method for patient-specific three-dimensional (3D) absorbed dose calculation in patients after injection of 99mTc-hydrazinonicotinamide (hynic)-Tyr3-octreotide as a SPECT radiotracer. 99mTc patient-specific S values and absorbed doses were calculated with the GATE code for each source-target organ pair in four patients who were imaged for suspected neuroendocrine tumors. Each patient underwent multiple whole-body planar scans as well as SPECT imaging over a period of 1-24 h after intravenous injection of 99mTc-hynic-Tyr3-octreotide. The patient-specific S values calculated by the GATE Monte Carlo code and the corresponding S values obtained with the MIRDOSE program differed by 4.3% on average for self-irradiation and by 69.6% on average for cross-irradiation. However, the agreement between the total organ doses calculated by the GATE code and the MIRDOSE program for all patients was reasonably good (the percentage difference was about 4.6% on average). Normal-tissue and tumor absorbed doses calculated with GATE were slightly higher than those calculated with the MIRDOSE program. The average ratio of GATE absorbed doses to MIRDOSE was 1.07 ± 0.11 (ranging from 0.94 to 1.36). According to the results, it is proposed that when cross-organ irradiation is dominant, a comprehensive approach such as GATE Monte Carlo dosimetry be used, since it provides more reliable dosimetric results
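The MIRD schema used to report the results combines time-integrated activities with S values; the sketch below uses hypothetical numbers, since the study's actual activity and S-value data are not reproduced here.

```python
# MIRD schema sketch: D(target) = sum over sources of A_tilde(source) * S(target <- source).
# Time-integrated activities (MBq*s) and S values (mGy per MBq*s) are hypothetical.
a_tilde = {"kidneys": 5.0e4, "liver": 2.0e4, "tumor": 8.0e3}
s_value = {  # S(target <- source)
    ("kidneys", "kidneys"): 1.1e-4, ("kidneys", "liver"): 6.0e-7, ("kidneys", "tumor"): 2.0e-7,
    ("liver", "kidneys"):   5.0e-7, ("liver", "liver"):   7.0e-5, ("liver", "tumor"):   3.0e-7,
}

def organ_dose(target):
    # Missing source-target pairs are treated as negligible cross-irradiation.
    return sum(a_tilde[src] * s_value.get((target, src), 0.0) for src in a_tilde)

for organ in ("kidneys", "liver"):
    print(organ, organ_dose(organ), "mGy")
```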
Moslemi, Vahid; Ashoor, Mansour
2017-10-01
One of the major problems associated with parallel hole collimators (PCs) is the trade-off between their resolution and sensitivity. To address this problem, a novel PC, namely the extended parallel hole collimator (EPC), was proposed, in which trapezoidal denticles were added to the septa on the detector side. In this study, an EPC was designed and its performance was compared with that of two PCs, PC35 and PC41, with a hole size of 1.5 mm and hole lengths of 35 and 41 mm, respectively. The Monte Carlo method was used to calculate important parameters such as resolution, sensitivity, scattering, and penetration ratio. A Jaszczak phantom was also simulated to evaluate the resolution and contrast of tomographic images produced by the EPC6, PC35, and PC41 using the Monte Carlo N-particle version 5 code, and the tomographic images were reconstructed using a maximum likelihood expectation maximization algorithm. The sensitivity of the EPC6 was increased by 20.3% in comparison with that of the PC41 at identical spatial resolution and full-width at tenth maximum. Moreover, the penetration and scattering ratio of the EPC6 was 1.2% less than that of the PC41. The simulated phantom images show that the EPC6 increases the contrast-resolution and contrast-to-noise ratio compared with those of PC41 and PC35. When compared with PC41 and PC35, the EPC6 improved the trade-off between resolution and sensitivity, reduced the penetration and scattering ratios, and produced images with higher quality. The EPC6 can be used to increase the detectability of finer details in nuclear medicine images.
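The resolution-sensitivity trade-off that motivates the EPC design is usually first quantified with the standard analytical parallel-hole collimator expressions; the sketch below uses commonly quoted forms, and the hole-shape constant and attenuation coefficient are illustrative assumptions rather than values from the paper.

```python
import numpy as np

def collimator_resolution(d_mm, l_mm, b_mm, mu_per_mm):
    """Geometric resolution at source distance b; l_eff corrects for septal penetration."""
    l_eff = l_mm - 2.0 / mu_per_mm
    return d_mm * (l_eff + b_mm) / l_eff

def collimator_efficiency(d_mm, l_mm, t_mm, mu_per_mm, k=0.26):
    """Geometric efficiency (fraction of emitted photons); k ~ 0.26 for hexagonal holes."""
    l_eff = l_mm - 2.0 / mu_per_mm
    return (k * d_mm**2 / (l_eff * (d_mm + t_mm)))**2

# Illustrative comparison of the two hole lengths mentioned above: 1.5 mm holes,
# assumed 0.2 mm septa, and an assumed mu of ~2.5 /mm for lead at 140 keV.
for l in (35.0, 41.0):
    print(l,
          collimator_resolution(1.5, l, 100.0, 2.5),
          collimator_efficiency(1.5, l, 0.2, 2.5))
```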
International Nuclear Information System (INIS)
Shi Chengyu; Xu, X. George
2004-01-01
Assessment of radiation dose and risk to a pregnant woman and her fetus is an important task in radiation protection. Although tomographic models for male and female patients of different ages have been developed using medical images, such models for pregnant women had not been developed to date. This paper reports the construction of a partial-body model of a pregnant woman from a set of computed tomography (CT) images. The patient was 30 weeks into pregnancy, and the CT scan covered the portion of the body from above the liver to below the pubic symphysis in 70 slices. The thickness of each slice is 7 mm, and the image resolution is 512×512 pixels in a 48 cm×48 cm field; thus, the voxel volume is 6.15 mm³. The images were segmented to identify 34 major internal organs and tissues considered sensitive to radiation. Even though the masses are noticeably different from other models, the three-dimensional visualization verified the segmentation and its suitability for Monte Carlo calculations. The model has been implemented into a Monte Carlo code, EGS4-VLSI (very large segmented images), for the calculation of radiation dose to a pregnant woman. The specific absorbed fraction (SAF) results for internal photons were compared with those from a stylized model. Both small and large differences were found, and the differences can be explained by mass differences and by the relative geometry differences between the source and the target organs. The research provides the radiation dosimetry community with the first voxelized tomographic model of a pregnant woman, opening the door to future dosimetry studies
International Nuclear Information System (INIS)
Jarry, Genevieve; Verhaegen, Frank
2007-01-01
Electronic portal imagers have promising dosimetric applications in external beam radiation therapy. In this study a patient dose computation algorithm based on Monte Carlo (MC) simulations and on portal images is developed and validated. The patient exit fluence from primary photons is obtained from the portal image after correction for scattered radiation. The scattered radiation at the portal imager and the spectral energy distribution of the primary photons are estimated from MC simulations at the treatment planning stage. The patient exit fluence and the spectral energy distribution of the primary photons are then used to ray-trace the photons from the portal image towards the source through the CT geometry of the patient. Photon weights which reflect the probability of a photon being transmitted are computed during this step. A dedicated MC code is used to transport back these photons from the source through the patient CT geometry to obtain patient dose. Only Compton interactions are considered. This code also produces a reconstructed portal image which is used as a verification tool to ensure that the dose reconstruction is reliable. The dose reconstruction algorithm is compared against MC dose calculation (MCDC) predictions and against measurements in phantom. The reconstructed absolute absorbed doses and the MCDC predictions in homogeneous and heterogeneous phantoms agree within 3% for simple open fields. Comparison with film-measured relative dose distributions for IMRT fields yields agreement within 3 mm, 5%. This novel dose reconstruction algorithm allows for daily patient-specific dosimetry and verification of patient movement
Energy Technology Data Exchange (ETDEWEB)
Heidary, Saeed, E-mail: saeedheidary@aut.ac.ir; Setayeshi, Saeed, E-mail: setayesh@aut.ac.ir
2015-01-11
This work presents a Monte Carlo simulation study that uses two adaptive neuro-fuzzy inference systems (ANFIS) for cross-talk compensation in simultaneous 99mTc/201Tl dual-radioisotope SPECT imaging. We have compared two neuro-fuzzy systems based on fuzzy c-means (FCM) and subtractive (SUB) clustering. Our approach incorporates image acquisition in eight energy windows from 28 keV to 156 keV, including the two main photopeaks of 201Tl (77 keV ± 10%) and 99mTc (140 keV ± 10%). The Geant4 application for tomographic emission (GATE) is used as the Monte Carlo simulator for three cylindrical phantoms and a NURBS-based cardiac torso (NCAT) phantom study. Three separate acquisitions, two single-isotope and one dual-isotope, were performed in this study. Cross-talk and scatter corrected projections are reconstructed by an iterative ordered-subsets expectation maximization (OSEM) algorithm which models the non-uniform attenuation in the projection/back-projection. The ANFIS-FCM/SUB structures are tuned to create three to sixteen fuzzy rules for modeling the photon cross-talk of the two radioisotopes. Applying seven to nine fuzzy rules leads to an overall improvement of the contrast and the bias compared with the other configurations. The ANFIS-FCM structure is found to outperform ANFIS-SUB owing to its faster convergence and more accurate results.
Energy Technology Data Exchange (ETDEWEB)
Lee, Youngjin, E-mail: radioyoungj@gmail.com [Department of Radiological Science, Eulji University, 553, Sanseong-daero, Sujeong-gu, Seongnam-si, Gyeonggi-do (Korea, Republic of); Lee, Amy Candy [Department of Mathematics and Statistics, McGill University (Canada); Kim, Hee-Joung [Department of Radiological Science and Radiation Convergence Engineering, Yonsei University (Korea, Republic of)
2016-09-11
Recently, significant effort has been devoted to the development of photon counting detectors (PCDs) based on CdTe for applications in X-ray imaging systems. The motivation for developing PCDs is higher image quality. In particular, the K-edge subtraction (KES) imaging technique using a PCD can improve image quality and is useful for increasing the contrast resolution of a target material by utilizing a contrast agent. Based on this technique, we present an idea for an improved K-edge log-subtraction (KELS) imaging technique. The KELS imaging technique based on PCDs can be realized by using different subtraction energy widths of the energy window. In this study, the effects of the KELS imaging technique and of the subtraction energy width of the energy window were investigated with respect to the contrast, standard deviation, and CNR using a Monte Carlo simulation. We simulated a CdTe-based PCD X-ray imaging system and a polymethylmethacrylate (PMMA) phantom containing various iodine contrast agents. To acquire the KELS images, images of the phantom above and below the K-edge absorption energy of the iodine contrast agent (33.2 keV) were acquired over different energy ranges. According to the results, the contrast and standard deviation decreased when the subtraction energy width of the energy window was increased. Also, the CNR obtained with the KELS imaging technique is higher than that of images acquired using the whole energy range. In particular, the maximum differences in CNR between the whole-energy-range and KELS images using 1, 2, and 3 mm diameter iodine contrast agents were 11.33, 8.73, and 8.29 times, respectively. Additionally, the optimum subtraction energy widths of the energy window were found to be 5, 4, and 3 keV for the 1, 2, and 3 mm diameter iodine contrast agents, respectively. In conclusion, we successfully established an improved KELS imaging technique and optimized the subtraction energy width of the energy window, and based on
International Nuclear Information System (INIS)
Lee, Youngjin; Lee, Amy Candy; Kim, Hee-Joung
2016-01-01
Recently, significant effort has been devoted to the development of photon counting detectors (PCDs) based on CdTe for applications in X-ray imaging systems. The motivation for developing PCDs is higher image quality. In particular, the K-edge subtraction (KES) imaging technique using a PCD can improve image quality and is useful for increasing the contrast resolution of a target material by utilizing a contrast agent. Based on this technique, we present an idea for an improved K-edge log-subtraction (KELS) imaging technique. The KELS imaging technique based on PCDs can be realized by using different subtraction energy widths of the energy window. In this study, the effects of the KELS imaging technique and of the subtraction energy width of the energy window were investigated with respect to the contrast, standard deviation, and CNR using a Monte Carlo simulation. We simulated a CdTe-based PCD X-ray imaging system and a polymethylmethacrylate (PMMA) phantom containing various iodine contrast agents. To acquire the KELS images, images of the phantom above and below the K-edge absorption energy of the iodine contrast agent (33.2 keV) were acquired over different energy ranges. According to the results, the contrast and standard deviation decreased when the subtraction energy width of the energy window was increased. Also, the CNR obtained with the KELS imaging technique is higher than that of images acquired using the whole energy range. In particular, the maximum differences in CNR between the whole-energy-range and KELS images using 1, 2, and 3 mm diameter iodine contrast agents were 11.33, 8.73, and 8.29 times, respectively. Additionally, the optimum subtraction energy widths of the energy window were found to be 5, 4, and 3 keV for the 1, 2, and 3 mm diameter iodine contrast agents, respectively. In conclusion, we successfully established an improved KELS imaging technique and optimized the subtraction energy width of the energy window, and based on
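The K-edge log-subtraction underlying the technique takes the difference of log-transformed images acquired in energy windows just below and just above the iodine K-edge, so that background tissue largely cancels while iodine is enhanced; the intensities and window placement in the sketch below are synthetic and purely illustrative.

```python
import numpy as np

# Synthetic open-field and transmitted intensities in the windows just below and
# just above the iodine K-edge (33.2 keV); numbers are illustrative only.
i0_below, i0_above = 1.0e5, 1.0e5
i_below = np.array([[8.0e4, 7.5e4], [7.9e4, 4.0e4]])   # iodine in the last pixel
i_above = np.array([[7.8e4, 7.3e4], [7.7e4, 2.2e4]])   # stronger attenuation above the K-edge

kels = np.log(i_below / i0_below) - np.log(i_above / i0_above)
print(kels)   # large values flag iodine; background tissue largely cancels
```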
International Nuclear Information System (INIS)
Parsons, David; Robar, James L.; Sawkey, Daren
2014-01-01
Purpose: The focus of this work was the demonstration and validation of VirtuaLinac with clinical photon beams and to investigate the implementation of low-Z targets in a TrueBeam linear accelerator (Linac) using Monte Carlo modeling. Methods: VirtuaLinac, a cloud based web application utilizing Geant4 Monte Carlo code, was used to model the Linac treatment head components. Particles were propagated through the lower portion of the treatment head using BEAMnrc. Dose distributions and spectral distributions were calculated using DOSXYZnrc and BEAMdp, respectively. For validation, 6 MV flattened and flattening filter free (FFF) photon beams were generated and compared to measurement for square fields, 10 and 40 cm wide and at d max for diagonal profiles. Two low-Z targets were investigated: a 2.35 MeV carbon target and the proposed 2.50 MeV commercial imaging target for the TrueBeam platform. A 2.35 MeV carbon target was also simulated in a 2100EX Clinac using BEAMnrc. Contrast simulations were made by scoring the dose in the phosphor layer of an IDU20 aSi detector after propagating through a 4 or 20 cm thick phantom composed of water and ICRP bone. Results: Measured and modeled depth dose curves for 6 MV flattened and FFF beams agree within 1% for 98.3% of points at depths greater than 0.85 cm. Ninety three percent or greater of points analyzed for the diagonal profiles had a gamma value less than one for the criteria of 1.5 mm and 1.5%. The two low-Z target photon spectra produced in TrueBeam are harder than that from the carbon target in the Clinac. Percent dose at depth 10 cm is greater by 3.6% and 8.9%; the fraction of photons in the diagnostic energy range (25–150 keV) is lower by 10% and 28%; and contrasts are lower by factors of 1.1 and 1.4 (4 cm thick phantom) and 1.03 and 1.4 (20 cm thick phantom), for the TrueBeam 2.35 MV/carbon and commercial imaging beams, respectively. Conclusions: VirtuaLinac is a promising new tool for Monte Carlo modeling of novel
International Nuclear Information System (INIS)
Setiani, Tia Dwi; Suprijadi; Haryanto, Freddy
2016-01-01
Monte Carlo (MC) is one of the most powerful techniques for simulation in x-ray imaging. The MC method can simulate radiation transport within matter with high accuracy and provides a natural way to simulate radiation transport in complex systems. One of the codes based on the MC algorithm that is widely used for radiographic image simulation is MC-GPU, a code developed by Andreu Badal. This study aimed to investigate the computation time of x-ray imaging simulation on a GPU (Graphics Processing Unit) compared to a standard CPU (Central Processing Unit). Furthermore, the effect of physical parameters on the quality of the radiographic images and a comparison of the image quality resulting from simulation on the GPU and the CPU are evaluated in this paper. The simulations were run on a CPU in serial mode and on two GPUs with 384 cores and 2304 cores. In the GPU simulations, each core tracks one photon, so a large number of photons are calculated simultaneously. Results show that the simulations on the GPU were significantly faster than on the CPU. Simulations on the 2304-core GPU ran about 64-114 times faster than on the CPU, while simulations on the 384-core GPU ran about 20-31 times faster than on a single CPU core. Another result shows that optimum image quality was obtained with at least 10⁸ histories and energies from 60 keV to 90 keV. Analyzed by a statistical approach, the quality of the GPU and CPU images is essentially the same.
Directory of Open Access Journals (Sweden)
Thiago Yamada
2017-11-01
It is well known that conducting experimental research aimed at characterizing the canopy structure of forests can be a difficult and costly task and generally requires an expert to extract relevant information in loco. To ease studies related to canopy structure, several techniques have been proposed in the literature and, among them, various are based on canopy digital image analysis. The research work described in this paper empirically compares two techniques that measure the integrity of the canopy structure of a forest fragment; one of them is based on central parts of canopy cover images and the other on canopy closure images. For the experiments, 22 central parts of canopy cover images and 22 canopy closure images were used. The images were captured along two transects: T1 (located in the conserved area) and T2 (located in the naturally disturbed area). The canopy digital images were computationally processed and analyzed using the MATLAB platform for the canopy cover images and the Gap Light Analyzer (GLA) for the canopy closure images. The results obtained using these two techniques showed that canopy cover images and, among the employed algorithms, Jseg characterize canopy integrity best. It is worth mentioning that part of the analysis can be conducted automatically, as a quick and precise process, with low material costs involved.
Ortigão, C
2004-01-01
A reliable Monte Carlo simulation study is of significant importance for evaluating the performance of a gamma-ray detector and for searching for compromises between spatial resolution, sensitivity and energy resolution. The development of a simulation package for a new compact gamma camera based on GEANT3 is described in this report. The simulation takes into account the interaction of gamma-rays in the crystal and the production and transport of scintillation photons, and allows an accurate radiation transport description of photon attenuation in high-Z collimators for SPECT applications. In order to achieve the best setup configuration, different detector arrangements were explored, namely different scintillation crystals, coatings, reflector properties and polishing types. The conventional detector system, based on PMT light readout, was compared with an HPD system. Different collimators were studied for high-resolution applications with compact gamma cameras.
International Nuclear Information System (INIS)
Hubert-Tremblay, Vincent; Archambault, Louis; Tubic, Dragan; Roy, Rene; Beaulieu, Luc
2006-01-01
The purpose of the present study is to introduce a compression algorithm for the CT (computed tomography) data used in Monte Carlo simulations. Performing simulations on CT data implies large computational costs as well as large memory requirements, since the number of voxels in such data typically reaches into the hundreds of millions. CT data, however, contain homogeneous regions which can be regrouped to form larger voxels without affecting the simulation's accuracy. Based on this property, we propose a compression algorithm based on octrees: in homogeneous regions the algorithm replaces groups of voxels with a smaller number of larger voxels. This reduces the number of voxels while keeping the critical high-density gradient areas. Results obtained using the present algorithm on both phantom and clinical data show that compression rates of up to 75% are possible without losing the dosimetric accuracy of the simulation
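The octree idea described above, recursively merging cubic blocks of voxels whose densities are close enough to be treated as homogeneous, can be sketched as follows; the tolerance and the toy density volume are placeholders, not the paper's data or exact merging criterion.

```python
import numpy as np

def build_octree(vol, origin, size, tol):
    """Recursively merge a cubic block of voxels if its density spread is below tol.

    Returns a list of (origin, size, mean_density) leaf cells."""
    ox, oy, oz = origin
    block = vol[ox:ox + size, oy:oy + size, oz:oz + size]
    if size == 1 or block.max() - block.min() <= tol:
        return [(origin, size, float(block.mean()))]
    half = size // 2
    leaves = []
    for dx in (0, half):
        for dy in (0, half):
            for dz in (0, half):
                leaves.extend(build_octree(vol, (ox + dx, oy + dy, oz + dz), half, tol))
    return leaves

# Toy 8x8x8 density volume: homogeneous water with one dense 2x2x2 insert.
vol = np.full((8, 8, 8), 1.0)
vol[2:4, 2:4, 2:4] = 1.8
cells = build_octree(vol, (0, 0, 0), 8, tol=0.05)
print(len(cells), "cells instead of", vol.size, "voxels")
```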
Energy Technology Data Exchange (ETDEWEB)
Mohammadian-Behbahani, Mohammad-Reza [Department of Energy Engineering and Physics, Amir-Kabir University of Technology (Tehran Polytechnic), Tehran (Iran, Islamic Republic of); School of Particles and Accelerators, Institute for Research in Fundamental Sciences (IPM), Tehran (Iran, Islamic Republic of); Saramad, Shahyar, E-mail: ssaramad@aut.ac.ir [Department of Energy Engineering and Physics, Amir-Kabir University of Technology (Tehran Polytechnic), Tehran (Iran, Islamic Republic of); School of Particles and Accelerators, Institute for Research in Fundamental Sciences (IPM), Tehran (Iran, Islamic Republic of); Mohammadi, Mohammad [Department of Electrical Engineering, Amir-Kabir University of Technology (Tehran Polytechnic), Tehran (Iran, Islamic Republic of); School of Particles and Accelerators, Institute for Research in Fundamental Sciences (IPM), Tehran (Iran, Islamic Republic of)
2017-05-01
A combination of Finite Difference Time Domain (FDTD) and Monte Carlo (MC) methods is proposed for the simulation and analysis of ZnO microscintillators grown in a polycarbonate membrane. A planar 10 keV X-ray source irradiating the detector is simulated by the MC method, which provides the amount of absorbed X-ray energy in the assembly. The transport of the generated UV scintillation light and its propagation in the detector were studied by the FDTD method. Detector responses to different probable scintillation sites and to different X-ray source energies from 10 to 25 keV are reported. Finally, a tapered geometry for the scintillators is proposed, which shows enhanced spatial resolution compared to the cylindrical geometry for imaging applications.
Berradja, Khadidja; Boughanmi, Nabil
2016-09-01
In dynamic cardiac PET FDG studies the assessment of the myocardial metabolic rate of glucose (MMRG) requires knowledge of the blood input function (IF). The IF can be obtained by manual or automatic blood sampling and cross-calibrated with PET. These procedures are cumbersome, invasive and generate uncertainties. The IF is contaminated by spillover of radioactivity from the adjacent myocardium, and this can cause important errors in the estimated MMRG. In this study, we show that the IF can be extracted from the images in a rat heart study with 18F-fluorodeoxyglucose (18F-FDG) by means of Independent Component Analysis (ICA) based on Bayesian theory and a Markov Chain Monte Carlo (MCMC) sampling method (BICA). Images of the heart from rats were acquired with the Sherbrooke small animal PET scanner. A region of interest (ROI) was drawn around the rat image and decomposed into blood and tissue using BICA. The statistical study showed that there is a significant difference (p corrupted with spillover.
Wu, Kui; Li, Guangjun; Bai, Sen
2012-06-01
This paper investigates how different beam energies affect the accuracy of the X-ray Voxel Monte Carlo (XVMC) algorithm when it is applied for dose calculation on kilovoltage cone-beam CT (kV-CBCT) images. The CIRS model 062 phantom was used to calibrate the CT number to relative electron density tables of the CT and CBCT images. CT and CBCT scans were performed with a simulation model of a human head-and-neck placed in the same position to simulate locally advanced nasopharyngeal carcinoma. 6 MV and 15 MV photon beams were selected in the Monaco TPS to design intensity-modulated radiotherapy (IMRT) plans. The XVMC algorithm was selected for dose calculation, the calculation results were compared, and the impact of energy on the calculation accuracy was analyzed. The comparison of dose-volume histograms (DVHs), doses received by targets and organs at risk, and conformity and uniformity indices of the targets indicates a high agreement between CT-based and CBCT-based plans. Most evaluation indicators show higher accuracy when the 15 MV photon beam was selected for dose calculation. Gamma index analysis with a criterion of 2 mm/2% and a threshold of 10% was used for the comparison of dose distributions. The average pass rate of each plane was 99.3% +/- 0.47% for 6 MV and 99.4% +/- 0.44% for 15 MV. CBCT images after calibration allow accurate dose calculation, with higher accuracy when the 15 MV photon beam is selected.
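For reference, a minimal one-dimensional sketch of the gamma-index analysis quoted above (2%/2 mm criterion, 10% dose threshold); the dose profiles, grid spacing and global normalization are illustrative assumptions rather than the authors' implementation.

```python
# 1D gamma-index sketch: gamma = min over evaluated points of
# sqrt((distance/DTA)^2 + (dose difference / (dd * Dmax))^2).
import numpy as np

def gamma_pass_rate(ref, eval_, spacing_mm, dd=0.02, dta_mm=2.0, threshold=0.10):
    x = np.arange(len(ref)) * spacing_mm
    dmax = ref.max()
    keep = ref >= threshold * dmax               # ignore low-dose points
    gammas = []
    for xi, di in zip(x[keep], ref[keep]):
        dist2 = ((x - xi) / dta_mm) ** 2
        dose2 = ((eval_ - di) / (dd * dmax)) ** 2
        gammas.append(np.sqrt(np.min(dist2 + dose2)))
    return 100.0 * np.mean(np.array(gammas) <= 1.0)

ref = np.exp(-((np.arange(100) - 50) / 15.0) ** 2)    # reference profile
ev = np.exp(-((np.arange(100) - 50.5) / 15.0) ** 2)   # slightly shifted profile
print(f"pass rate: {gamma_pass_rate(ref, ev, spacing_mm=1.0):.1f}%")
```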
International Nuclear Information System (INIS)
Roé-Vellvé, N; Pino, F; Cot, A; Ros, D; Falcon, C; Gispert, J D; Pavía, J; Marin, C
2014-01-01
SPECT studies with 123I-ioflupane facilitate the diagnosis of Parkinson's disease (PD). The effect of image degradations on quantification has been extensively evaluated in human studies, but their impact on studies of experimental PD models is still unclear. The aim of this work was to assess the effect of compensating for the degrading phenomena on the quantification of small animal SPECT studies using 123I-ioflupane. This assessment enabled us to evaluate the feasibility of quantitatively detecting small pathological changes using different reconstruction methods and levels of compensation for the image degrading phenomena. Monte Carlo simulated studies of a rat phantom were reconstructed and quantified. Compensations for the point spread function (PSF), scattering, attenuation and partial volume effect were progressively included in the quantification protocol. A linear relationship was found between calculated and simulated specific uptake ratio (SUR) in all cases. In order to significantly distinguish disease stages, noise reduction during the reconstruction process was the most relevant factor, followed by PSF compensation. The smallest detectable SUR interval was determined by biological variability rather than by image degradations or coregistration errors. The quantification methods that gave the best results allowed us to distinguish PD stages with SUR values as close as 0.5 using groups of six rats to represent each stage. (paper)
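The specific uptake ratio quoted above is commonly defined, for dopamine transporter imaging, as the ratio of specific to non-specific binding computed from region means on the reconstructed image. A minimal sketch, assuming target (striatal) and reference region masks are available; mask names are illustrative.

```python
# Hedged sketch: SUR = (C_target - C_reference) / C_reference.
import numpy as np

def specific_uptake_ratio(image, target_mask, reference_mask):
    c_target = image[target_mask].mean()     # mean counts in the specific region
    c_ref = image[reference_mask].mean()     # mean counts in the reference region
    return (c_target - c_ref) / c_ref
```

A linear relationship between calculated and simulated SUR, as reported above, then lets the simulated value be recovered from the measured one by a simple linear calibration.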
Monte Carlo techniques in radiation therapy
Verhaegen, Frank
2013-01-01
Modern cancer treatment relies on Monte Carlo simulations to help radiotherapists and clinical physicists better understand and compute radiation dose from imaging devices as well as exploit four-dimensional imaging data. With Monte Carlo-based treatment planning tools now available from commercial vendors, a complete transition to Monte Carlo-based dose calculation methods in radiotherapy could likely take place in the next decade. Monte Carlo Techniques in Radiation Therapy explores the use of Monte Carlo methods for modeling various features of internal and external radiation sources, including light ion beams. The book, the first of its kind, addresses applications of the Monte Carlo particle transport simulation technique in radiation therapy, mainly focusing on external beam radiotherapy and brachytherapy. It presents the mathematical and technical aspects of the methods in particle transport simulations. The book also discusses the modeling of medical linacs and other irradiation devices; issues specific...
International Nuclear Information System (INIS)
Nikolopoulos, Dimitrios; Kandarakis, Ioannis; Tsantilas, Xenophon; Valais, Ioannis; Cavouras, Dionisios; Louizi, Anna
2006-01-01
The radiation detection efficiency of four scintillators employed, or designed to be employed, in positron emission imaging (PET) was evaluated as a function of the crystal thickness by applying Monte Carlo methods. The scintillators studied were Lu2SiO5 (LSO), LuAlO3 (LuAP), Gd2SiO5 (GSO) and YAlO3 (YAP). Crystal thicknesses ranged from 0 to 50 mm. The study was performed via a previously generated photon transport Monte Carlo code. All photon track and energy histories were recorded, and the energy transferred or absorbed in the scintillator medium was calculated together with the energy redistributed and retransported as secondary characteristic fluorescence radiation. Various parameters were calculated, e.g. the fraction of the incident photon energy absorbed, transmitted or redistributed as fluorescence radiation, the scatter-to-primary ratio, and the photon and energy distribution within each scintillator block. Most significantly, the fraction of the incident photon energy absorbed was found to increase with increasing crystal thickness, tending to form a plateau above 30 mm thickness. For the LSO, LuAP, GSO and YAP scintillators, respectively, this fraction had the value of 44.8, 36.9 and 45.7% at 10 mm thickness and 96.4, 93.2 and 96.9% at 50 mm thickness. Within the plateau area, approximately (57-59)%, (59-63)%, (52-63)% and (58-61)% of this fraction was due to scattered and reabsorbed radiation for the LSO, GSO, YAP and LuAP scintillators, respectively. In all cases, a negligible fraction (<0.1%) of the absorbed energy was found to escape the crystal as fluorescence radiation.
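A toy sketch of the leading-order behaviour described above, the interaction fraction growing with crystal thickness toward a plateau, can be obtained by sampling exponential free paths for normally incident 511 keV photons. The attenuation coefficient below is an assumed value for LSO, and the sketch ignores the scattered and fluorescence contributions tracked in the full simulation.

```python
# Toy Monte Carlo estimate of the fraction of normally incident 511 keV
# photons that interact within a crystal of thickness t (illustrative only).
import numpy as np

rng = np.random.default_rng(0)
mu = 0.087  # assumed linear attenuation coefficient of LSO at 511 keV, 1/mm

def interaction_fraction(thickness_mm, n=100_000):
    free_paths = rng.exponential(1.0 / mu, size=n)   # sampled path lengths
    return np.mean(free_paths < thickness_mm)

for t in (10, 30, 50):
    print(f"{t} mm: {interaction_fraction(t):.3f}")
```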
Zakhnini, Abdelhamid; Kulenkampff, Johannes; Sauerzapf, Sophie; Pietrzyk, Uwe; Lippmann-Pipke, Johanna
2013-08-01
Understanding conservative fluid flow and reactive tracer transport in soils and rock formations requires quantitative transport visualization methods in 3D+t. After a decade of research and development we established GeoPET as a non-destructive method with unrivalled sensitivity and selectivity, and with due spatial and temporal resolution, by applying Positron Emission Tomography (PET), a nuclear medicine imaging method, to dense rock material. The requirements for reaching the physical limit of image resolution of nearly 1 mm are (a) a high-resolution PET camera, like our ClearPET scanner (Raytest), and (b) appropriate correction methods for scatter and attenuation of 511 keV photons in the dense geological material. The latter are by far more significant in dense geological material than in human and small animal body tissue (water). Here we present data from Monte Carlo simulations (MCS) reflecting selected GeoPET experiments. The MCS consider all involved nuclear physical processes of the measurement with the ClearPET system and allow us to quantify the sensitivity of the method and the scatter fractions in geological media as a function of material (quartz, Opalinus clay and anhydrite compared to water), PET isotope (18F, 58Co and 124I), and geometric system parameters. The synthetic data sets obtained by MCS are the basis for detailed performance assessment studies allowing for image quality improvements. A scatter correction method is applied exemplarily by subtracting projections of simulated scattered coincidences from experimental data sets prior to image reconstruction with an iterative reconstruction process.
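A minimal sketch of the scatter correction step described in the last sentence, assuming measured and Monte Carlo-simulated scatter sinograms on the same grid; the array names and the optional scaling factor are illustrative.

```python
# Projection-space scatter correction: subtract simulated scattered
# coincidences from the measured data before iterative reconstruction.
import numpy as np

def scatter_corrected(measured_sino, simulated_scatter_sino, scale=1.0):
    corrected = measured_sino - scale * simulated_scatter_sino
    return np.clip(corrected, 0.0, None)   # avoid negative counts
```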
Biegun, A K; van Goethem, M-J; van der Graaf, E R; van Beuzekom, M; Koffeman, E N; Nakaji, T; Takatsu, J; Visser, J; Brandenburg, S
2017-09-01
Proton radiography is a novel imaging modality that allows direct measurement of the proton energy loss in various tissues. Currently, due to the conversion of so-called Hounsfield units from X-ray Computed Tomography (CT) into relative proton stopping powers (RPSP), the uncertainties of RPSP are 3-5% or higher; they need to be reduced to about 1% to make proton treatment plans more accurate. In this work, we simulated a proton radiography system with position-sensitive detectors (PSDs) and a residual energy detector (RED). The simulations were built using Geant4, a Monte Carlo simulation toolkit. A phantom consisting of several materials was placed between PSDs of various Water Equivalent Thicknesses (WET), corresponding to an ideal detector, a gaseous detector, and silicon and plastic scintillator detectors. The energy-loss radiograph and the scattering angle distributions of the protons were studied for proton beam energies of 150 MeV, 190 MeV and 230 MeV. To improve the image quality deteriorated by multiple Coulomb scattering (MCS), protons with small angles were selected. Two ways of calculating a scattering angle were considered, using the proton's direction and position. A scattering angle cut of 8.7 mrad was applied, giving an optimal balance between quality and efficiency of the radiographic image. For the three proton beam energies, the number of protons used in image reconstruction with the direction method was half the number of protons kept using the position method. Copyright © 2017 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
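The two angle definitions compared above can be sketched as follows; the vectors, tracker-plane positions and variable names are illustrative, while the 8.7 mrad cut is the value quoted in the abstract.

```python
# Scattering-angle selection from either proton directions or positions.
import numpy as np

def angle_from_directions(d_in, d_out):
    cosang = np.dot(d_in, d_out) / (np.linalg.norm(d_in) * np.linalg.norm(d_out))
    return np.arccos(np.clip(cosang, -1.0, 1.0))

def angle_from_positions(p_entry, p_exit, d_in):
    straight = p_exit - p_entry                # straight line through the phantom
    return angle_from_directions(d_in, straight)

def keep_proton(theta_rad, cut_mrad=8.7):
    return theta_rad * 1e3 <= cut_mrad         # small-angle selection
```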
Energy Technology Data Exchange (ETDEWEB)
Eom, Ji Soo; Kang, Soon Cheol; Lee, Seung Wan [Konyang University, Daejeon (Korea, Republic of)
2017-09-15
Mammography is commonly used for screening early breast cancer. However, mammographic images, which depend on the physical properties of the breast components, provide limited information about whether a lesion is malignant or benign. Although a dual-energy subtraction technique decomposes a certain material from a mixture, it increases radiation dose and degrades the accuracy of material decomposition. In this study, we simulated a breast phantom using attenuation characteristics, and we proposed a technique to enable accurate material decomposition by applying weighting factors for dual-energy mammography based on a photon-counting detector, using a Monte Carlo simulation tool. We also evaluated the contrast and noise of the simulated breast images for validating the proposed technique. As a result, the contrast for a malignant tumor in the dual-energy weighted subtraction technique was 0.98 and 1.06 times that in the general mammography and dual-energy subtraction techniques, respectively. However, the contrast between malignant and benign tumors increased dramatically, by a factor of 13.54, owing to the low contrast of the benign tumor. Therefore, the proposed technique can increase the material decomposition accuracy for malignant tumors and improve the diagnostic accuracy of mammography.
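A hedged sketch of a weighted dual-energy subtraction of the kind described above; the logarithmic form and the way the weighting factor is chosen follow standard dual-energy practice and are not the authors' exact implementation.

```python
# Weighted dual-energy subtraction in the log domain (illustrative sketch).
import numpy as np

def weighted_subtraction(low_bin_img, high_bin_img, w, eps=1e-9):
    # w is chosen to cancel a selected background material in the result,
    # e.g. w ~ mu_tissue(high bin) / mu_tissue(low bin).
    return np.log(high_bin_img + eps) - w * np.log(low_bin_img + eps)
```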
Monte Carlo simulation in nuclear medicine
International Nuclear Information System (INIS)
Morel, Ch.
2007-01-01
The Monte Carlo method allows random processes to be simulated using series of pseudo-random numbers. It has become an important tool in nuclear medicine to assist in the design of new medical imaging devices, optimise their use and analyse their data. Presently, the sophistication of the simulation tools allows the introduction of Monte Carlo predictions into data correction and image reconstruction processes. The ability to simulate time-dependent processes opens up new horizons for Monte Carlo simulation in nuclear medicine. In the near future, these developments will allow imaging and dosimetry issues to be tackled simultaneously and, soon, case-by-case Monte Carlo simulations may become part of the nuclear medicine diagnostic process. This paper describes some Monte Carlo method basics and the sampling methods that were developed for it. It gives a referenced list of different simulation software used in nuclear medicine and enumerates some of their present and prospective applications. (author)
Energy Technology Data Exchange (ETDEWEB)
Laliena Bielsa, V.; Jimenez Albericio, F. J.; Gandia Martinez, A.; Font Gomez, J. A.; Mengual Gil, M. A.; Andres Redondo, M. M.
2013-07-01
This source of uncertainty is not exclusive to the Monte Carlo method; it will be present in any algorithm that takes the correction for heterogeneity into account. Although we expect the uncertainty described above to be small, the objective of this work is to try to quantify it depending on the CT study. (Author)
Crespo, Cristina; Gallego, Judith; Cot, Albert; Falcón, Carles; Bullich, Santiago; Pareto, Deborah; Aguiar, Pablo; Sempau, Josep; Lomeña, Francisco; Calviño, Francisco; Pavía, Javier; Ros, Domènec
2008-07-01
(123)I-labelled radioligands are commonly used for single-photon emission computed tomography (SPECT) imaging of the dopaminergic system to study the dopamine transporter binding. The aim of this work was to compare the quantitative capabilities of two different SPECT systems through Monte Carlo (MC) simulation. The SimSET MC code was employed to generate simulated projections of a numerical phantom for two gamma cameras equipped with a parallel and a fan-beam collimator, respectively. A fully 3D iterative reconstruction algorithm was used to compensate for attenuation, the spatially variant point spread function (PSF) and scatter. A post-reconstruction partial volume effect (PVE) compensation was also developed. For both systems, the correction for all degradations and PVE compensation resulted in recovery factors of the theoretical specific uptake ratio (SUR) close to 100%. For a SUR value of 4, the recovered SUR for the parallel imaging system was 33% for a reconstruction without corrections (OSEM), 45% for a reconstruction with attenuation correction (OSEM-A), 56% for a 3D reconstruction with attenuation and PSF corrections (OSEM-AP), 68% for OSEM-AP with scatter correction (OSEM-APS) and 97% for OSEM-APS plus PVE compensation (OSEM-APSV). For the fan-beam imaging system, the recovered SUR was 41% without corrections, 55% for OSEM-A, 65% for OSEM-AP, 75% for OSEM-APS and 102% for OSEM-APSV. Our findings indicate that the correction for degradations increases the quantification accuracy, with PVE compensation playing a major role in the SUR quantification. The proposed methodology allows us to reach similar SUR values for different SPECT systems, thereby allowing a reliable standardisation in multicentric studies.
Kalyagina, N.; Loschenov, V.; Wolf, D.; Daul, C.; Blondel, W.; Savelieva, T.
2011-11-01
We have investigated the influence of scatterer size changes on laser light diffusion, induced by collimated monochromatic laser irradiation, in tissue-like optical phantoms using diffuse-reflectance imaging. For that purpose, three-layer optical phantoms were prepared in which the nano- and microsphere sizes were varied in order to simulate the scattering properties of healthy and cancerous urinary bladder walls. The informative areas of the surface diffuse-reflected light distributions were about 15×18 pixels for the smallest scattering particles of 0.05 μm, about 21×25 pixels for the medium-size particles of 0.53 μm, and about 25×30 pixels for the largest particles of 5.09 μm. The computation of the laser spot areas provided useful information for the analysis of the light distribution, with a measurement accuracy of up to 92%. The lowest stability, 78% accuracy, was observed for superficial scattering signals on the phantoms with the largest particles. The experimental results showed good agreement with the results obtained by Monte Carlo simulations. The presented method shows good potential for tissue-state diagnosis of the urinary bladder.
International Nuclear Information System (INIS)
Kolbun, N.; Leveque, Ph.; Abboud, F.; Bol, A.; Vynckier, S.; Gallez, B.
2010-01-01
Purpose: The experimental determination of doses at proximal distances from radioactive sources is difficult because of the steepness of the dose gradient. The goal of this study was to determine the relative radial dose distribution for a low dose rate 192Ir wire source using electron paramagnetic resonance imaging (EPRI) and to compare the results to those obtained using Gafchromic EBT film dosimetry and Monte Carlo (MC) simulations. Methods: Lithium formate and ammonium formate were chosen as the EPR dosimetric materials and were used to form cylindrical phantoms. The dose distribution of the stable radiation-induced free radicals in the lithium formate and ammonium formate phantoms was assessed by EPRI. EBT films were also inserted inside the ammonium formate phantoms for comparison. MC simulation was performed using the MCNP4C2 software code. Results: The radical signal in irradiated ammonium formate is contained in a single narrow EPR line, with an EPR peak-to-peak linewidth narrower than that of lithium formate (∼0.64 and 1.4 mT, respectively). The spatial resolution of the EPR images was enhanced by a factor of 2.3 using ammonium formate compared to lithium formate because its linewidth is about 0.75 mT narrower than that of lithium formate. The EPRI results were consistent to within 1% with those of Gafchromic EBT films and MC simulations at distances from 1.0 to 2.9 mm. The radial dose values obtained by EPRI were about 4% lower at distances from 2.9 to 4.0 mm than those determined by MC simulation and EBT film dosimetry. Conclusions: Ammonium formate is a suitable material under certain conditions for use in brachytherapy dosimetry using EPRI. In this study, the authors demonstrated that the EPRI technique allows the estimation of the relative radial dose distribution at short distances for a 192Ir wire source.
Energy Technology Data Exchange (ETDEWEB)
Papadimitroulas, P; Kagadis, GC [University of Patras, Rion, Ahaia (Greece); Loudos, G [Technical Educational Institute of Athens, Aigaleo, Attiki (Greece)
2014-06-15
Purpose: Our purpose is to evaluate the absorbed dose administered in pediatric nuclear imaging studies. Monte Carlo simulations incorporating pediatric computational models can serve as a reference for the accurate determination of absorbed dose. The procedure for calculating the dosimetric factors is described, and a dataset of reference doses is created. Methods: Realistic simulations were executed using the GATE toolkit and a series of pediatric computational models developed by the “IT'IS Foundation”. The series of phantoms used in our work includes 6 models in the range of 5-14 years old (3 boys and 3 girls). Pre-processing techniques were applied to the images to incorporate the phantoms into the GATE simulations. The resolution of the phantoms was set to 2 mm3. The most important organ densities were simulated according to the GATE “Materials Database”. Several radiopharmaceuticals commonly used in SPECT and PET applications are tested, following the EANM pediatric dosage protocol. The biodistributions of the isotopes, used as activity maps in the simulations, were derived from the literature. Results: Initial results of absorbed dose per organ (mGy) are presented for a 5-year-old girl from whole-body exposure to 99mTc-sestamibi, 30 minutes after administration. Heart, kidney, liver, ovary, pancreas and brain are the most critical organs, for which the S-factors are calculated. The statistical uncertainty in the simulation procedure was kept below 5%. The S-factors for each target organ are calculated in Gy/(MBq*sec), with the highest dose being absorbed in the kidneys and pancreas (9.29*10^10 and 0.15*10^10, respectively). Conclusion: An approach for accurate dosimetry on pediatric models is presented, creating a reference dosage dataset for several radionuclides in children computational models with the advantages of MC techniques. Our study is ongoing, extending our investigation to other reference models and
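The organ doses above follow the MIRD-style formalism, in which the dose to a target organ is the sum, over source organs, of the time-integrated activity multiplied by the corresponding S-factor. A minimal sketch with placeholder numbers; the dictionary keys and values are illustrative, not taken from the study.

```python
# D(target) = sum over source organs of A_tilde(source) * S(target <- source).
s_factors = {("kidneys", "whole_body"): 1.0e-9}   # Gy per MBq*s (illustrative)
time_integrated_activity = {"whole_body": 3.0e5}  # MBq*s (illustrative)

dose_kidneys = sum(
    time_integrated_activity[source] * s
    for (target, source), s in s_factors.items()
    if target == "kidneys"
)
print(f"kidney dose: {dose_kidneys:.2e} Gy")
```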
International Nuclear Information System (INIS)
Choi, Sang Hyoun
2007-08-01
Ajou University School of Medicine made serially sectioned anatomical images from the Visible Korean Human (VKH) Project in Korea. The VKH images, which are high-resolution color photographic images, show the organs and tissues in the human body very clearly at 0.2 mm intervals. In this study, we constructed a high-quality voxel model (VKH-Man) with a total of 30 organs and tissues by manual and automatic segmentation using the serially sectioned anatomical image data from the Visible Korean Human (VKH) project in Korea. The height and weight of the VKH-Man voxel model are 164 cm and 57.6 kg, respectively, and the voxel resolution is 1.875 x 1.875 x 2 mm3. However, this voxel phantom can be used to calculate the organ and tissue doses of only one person. Therefore, in this study, we adjusted the voxel phantom to the 'Reference Korean' data to construct a voxel phantom that represents the radiation workers in Korea. The height and weight of the voxel model (HDRK-Man) that was finally developed are 171 cm and 68 kg, respectively, and the voxel resolution is 1.981 x 1.981 x 2.0854 mm3. The VKH-Man and HDRK-Man voxel models were implemented in a Monte Carlo particle transport simulation code for calculation of the organ and tissue doses in various irradiation geometries. The calculated values were compared with each other to see the effect of the adjustment and also compared with other computational models (KTMAN-2, ICRP-74 and VIP-Man). According to the results, the adjustment of the voxel model was found to hardly affect the dose calculations, and most of the organ and tissue equivalent doses showed some differences among the models. These results show that differences in body figure and organ topology affect the organ doses more than organ size does. The calculated values of the effective dose from VKH-Man and HDRK-Man according to ICRP-60 and the upcoming ICRP recommendation were compared. For the other radiation geometries (AP, LLAT, RLAT) except for PA
Ding, Aiping; Mille, Matthew M.; Liu, Tianyu; Caracappa, Peter F.; Xu, X. George
2012-05-01
Although it is known that obesity has a profound effect on x-ray computed tomography (CT) image quality and patient organ dose, quantitative data describing this relationship are not currently available. This study examines the effect of obesity on the calculated radiation dose to organs and tissues from CT using newly developed phantoms representing overweight and obese patients. These phantoms were derived from the previously developed RPI-adult male and female computational phantoms. The result was a set of ten phantoms (five males, five females) with body mass indexes ranging from 23.5 (normal body weight) to 46.4 kg m-2 (morbidly obese). The phantoms were modeled using triangular mesh geometry and include specified amounts of the subcutaneous adipose tissue and visceral adipose tissue. The mesh-based phantoms were then voxelized and defined in the Monte Carlo N-Particle Extended code to calculate organ doses from CT imaging. Chest-abdomen-pelvis scanning protocols for a GE LightSpeed 16 scanner operating at 120 and 140 kVp were considered. It was found that for the same scanner operating parameters, radiation doses to organs deep in the abdomen (e.g., colon) can be up to 59% smaller for obese individuals compared to those of normal body weight. This effect was found to be less significant for shallow organs. On the other hand, increasing the tube potential from 120 to 140 kVp for the same obese individual resulted in increased organ doses by as much as 56% for organs within the scan field (e.g., stomach) and 62% for those out of the scan field (e.g., thyroid), respectively. As higher tube currents are often used for larger patients to maintain image quality, it was of interest to quantify the associated effective dose. It was found from this study that when the mAs was doubled for the obese level-I, obese level-II and morbidly-obese phantoms, the effective dose relative to that of the normal weight phantom increased by 57%, 42% and 23%, respectively. This set
Energy Technology Data Exchange (ETDEWEB)
Silva, Carlos Borges da
2007-05-15
The image acquisition methods applied in nuclear medicine and radiobiology are valuable research tools for determining thyroid anatomy in order to seek disorders associated with follicular cells. Monte Carlo (MC) simulation has also been used in problems related to radiation detection in order to map medical images, since data processing compatible with personal computers (PCs) has improved. This work presents an innovative study to find an adequate inorganic scintillation detector array that could be coupled to a specific light photosensor, a charge-coupled device (CCD), through a fiber-optic plate in order to map the follicles of the thyroid gland. The goal is to choose the type of detector that fits the application suggested here, with a spatial resolution of 10 µm and good detector efficiency. The methodology and results are useful for mapping a follicle image using gamma radiation emission. A source-detector simulation is performed by using the MCNP4B (Monte Carlo for Neutron Photon transport) general code, considering different source energies, detector materials and geometries, including pixel sizes and reflector types. The results demonstrate that by using the MCNP4B code it is possible to search for useful parameters related to the systems used in nuclear medicine, specifically in radiobiology applied to endocrine physiology studies for acquiring thyroid follicle images. (author)
International Nuclear Information System (INIS)
Velo, Alexandre F.; Carvalho, Diego V.; Alvarez, Alexandre G.; Hamada, Margarida M.; Mesquita, Carlos H.
2017-01-01
Currently, the greatest impact of tomography technology occurs in medicine. The great success of medical tomography is due to the fact that the human body presents reasonably standardized dimensions with well-established chemical composition. Generally, these favorable conditions are not found in large industrial objects. In industry there is much interest in using tomographic information in order to know the interior of (1) manufactured industrial objects or (2) machines and their means of production. In these cases, the purpose of the tomograph is to (a) control the quality of the final product and (b) optimize production, contributing to the pilot phase of projects and analyzing the quality of the means of production. In different industrial processes, e.g. in chemical reactors and distillation columns, the phenomena related to multiphase processes are usually fast, requiring high temporal resolution of the computed tomography (CT) data acquisition. In this context, instant non-scanning tomographs and fifth-generation tomographs meet these requirements. An instant non-scanning tomography system is being developed at IPEN/CNEN. In this work, in order to optimize the system, this tomograph was simulated with different collimators, using the Monte Carlo method with MCNP4C. The image quality was evaluated with MATLAB® 2013b by analysis of the following parameters: contrast-to-noise ratio (CNR), root mean square error (RMSE), signal-to-noise ratio (SNR) and the spatial resolution through the Modulation Transfer Function (MTF(f)), to determine which collimator best fits the instant non-scanning tomograph. Three situations were simulated: (1) no collimator; (2) a ø25 mm x 50 mm cylindrical collimator with a ø5.0 mm x 50 mm septum; (3) a ø25 mm x 50 mm cylindrical collimator with a 24 mm x 5.0 mm x 50 mm slit septum. (author)
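A short sketch of the figures of merit listed above (CNR, SNR, RMSE); the ROI masks and the reference image are illustrative inputs, and the MTF evaluation is omitted.

```python
# Image-quality metrics used to compare the simulated collimator setups.
import numpy as np

def cnr(img, roi_mask, background_mask):
    return abs(img[roi_mask].mean() - img[background_mask].mean()) / img[background_mask].std()

def snr(img, roi_mask):
    return img[roi_mask].mean() / img[roi_mask].std()

def rmse(img, reference_img):
    return np.sqrt(np.mean((img - reference_img) ** 2))
```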
Variational Monte Carlo Technique
Indian Academy of Sciences (India)
Resonance, August 2014. General Article. Variational Monte Carlo Technique: Ground State Energies of Quantum Mechanical Systems. Sukanta Deb. Keywords: variational methods, Monte Carlo techniques, harmonic oscillators, quantum mechanical systems. Sukanta Deb is an Assistant Professor in the.
Blake, S; Vial, P; Holloway, L; McNamara, A; Greer, P; Kuncic, Z
2012-06-01
To investigate the sensitivity of a Monte Carlo (MC) model of a standard clinical amorphous silicon (a-Si) electron portal imaging device (EPID) to variations in optical photon transport parameters. The Geant4 MC toolkit was used to develop a comprehensive model of an indirect-detection a-Si EPID incorporating x-ray and optical photon transport. The EPID was modeled as a series of uniform layers with properties specified by the manufacturer (PerkinElmer, Santa Clara, CA) of a research EPID at our centre. Optical processes that were modeled include bulk absorption, Rayleigh scattering, and boundary processes (reflection and refraction). Model performance was evaluated by scoring optical photons absorbed by the a-Si photodiode as a function of radial distance from a point source of x-rays on an event-by-event basis (0.025 mm resolution). Primary x-ray energies were sampled from a clinical 6 MV photon spectrum. Simulations were performed by varying optical transport parameters and the resulting point spread functions (PSFs) were compared. The optical parameters investigated include: x-ray transport cutoff thresholds; absorption path length; optical energy spectrum; refractive indices; and the 'roughness' of boundaries within phosphor screen layers. The transport cutoffs and refractive indices studied were found to minimally affect resulting PSFs. A monoenergetic optical spectrum slightly broadened the PSF in comparison with the use of a polyenergetic spectrum. The absorption path length only significantly altered the PSF when decreased drastically. Variations in the treatment of boundaries noticeably broadened resulting PSFs. Variation in optical transport parameters was found to affect resulting PSF calculations. Current work is focusing on repeating this analysis with a coarser resolution more typical of a commercial a-Si EPID to observe if these effects continue to alter the EPID PSF. Experimental measurement of the EPID line spread function to validate these
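A minimal sketch of the point-spread-function scoring described above: optical photons absorbed in the photodiode are histogrammed by radial distance from the x-ray point source axis at 0.025 mm resolution. The hit coordinates and the per-annulus-area normalization are illustrative assumptions.

```python
# Radial binning of absorbed optical photons into a PSF estimate.
import numpy as np

def radial_psf(hit_x_mm, hit_y_mm, bin_mm=0.025, r_max_mm=10.0):
    r = np.hypot(hit_x_mm, hit_y_mm)                    # radial distance of each hit
    edges = np.arange(0.0, r_max_mm + bin_mm, bin_mm)
    counts, _ = np.histogram(r, bins=edges)
    area = np.pi * (edges[1:] ** 2 - edges[:-1] ** 2)   # annulus areas
    return edges[:-1], counts / area                    # per-unit-area PSF
```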
Dunn, William L
2012-01-01
Exploring Monte Carlo Methods is a basic text that describes the numerical methods that have come to be known as "Monte Carlo." The book treats the subject generically through the first eight chapters and, thus, should be of use to anyone who wants to learn to use Monte Carlo. The next two chapters focus on applications in nuclear engineering, which are illustrative of uses in other fields. Five appendices are included, which provide useful information on probability distributions, general-purpose Monte Carlo codes for radiation transport, and other matters. The famous "Buffon's needle proble
Directory of Open Access Journals (Sweden)
Bardenet Rémi
2013-07-01
Full Text Available Bayesian inference often requires integrating some function with respect to a posterior distribution. Monte Carlo methods are sampling algorithms that allow these integrals to be computed numerically when they are not analytically tractable. We review here the basic principles and the most common Monte Carlo algorithms, among which are rejection sampling, importance sampling and Markov chain Monte Carlo (MCMC) methods. We give intuition on the theoretical justification of the algorithms as well as practical advice, trying to relate both. We discuss the application of Monte Carlo in experimental physics, and point to landmarks in the literature for the curious reader.
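As a concrete example of one of the algorithms reviewed above, the sketch below implements plain rejection sampling from a bounded target density; the target, proposal and bound are illustrative choices.

```python
# Rejection sampling: propose uniformly, accept with probability f(x)/f_max.
import numpy as np

rng = np.random.default_rng(1)

def rejection_sample(f, n, x_lo, x_hi, f_max):
    samples = []
    while len(samples) < n:
        x = rng.uniform(x_lo, x_hi)          # proposal from a uniform envelope
        if rng.uniform(0.0, f_max) < f(x):   # accept with prob f(x)/f_max
            samples.append(x)
    return np.array(samples)

# Example: an (unnormalized) standard normal truncated to [-4, 4].
f = lambda x: np.exp(-0.5 * x * x)
xs = rejection_sample(f, 10_000, -4.0, 4.0, f_max=1.0)
print(xs.mean(), xs.std())
```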
Energy Technology Data Exchange (ETDEWEB)
Cramer, S.N.
1984-01-01
The MORSE code is a large general-use multigroup Monte Carlo code system. Although no claims can be made regarding its superiority in either theoretical details or Monte Carlo techniques, MORSE has been, since its inception at ORNL in the late 1960s, the most widely used Monte Carlo radiation transport code. The principal reason for this popularity is that MORSE is relatively easy to use, independent of any installation or distribution center, and it can be easily customized to fit almost any specific need. Features of the MORSE code are described.
Magota, Keiichi; Shiga, Tohru; Asano, Yukari; Shinyama, Daiki; Ye, Jinghan; Perkins, Amy E; Maniawski, Piotr J; Toyonaga, Takuya; Kobayashi, Kentaro; Hirata, Kenji; Katoh, Chietsugu; Hattori, Naoya; Tamaki, Nagara
2017-12-01
In 3-dimensional PET/CT imaging of the brain with 15O-gas inhalation, high radioactivity in the face mask creates cold artifacts and affects the quantitative accuracy when scatter is corrected by conventional methods (e.g., single-scatter simulation [SSS] with tail-fitting scaling [TFS-SSS]). Here we examined the validity of a newly developed scatter-correction method that combines SSS with a scaling factor calculated by Monte Carlo simulation (MCS-SSS). Methods: We performed phantom experiments and patient studies. In the phantom experiments, a plastic bottle simulating a face mask was attached to a cylindrical phantom simulating the brain. The cylindrical phantom was filled with 18F-FDG solution (3.8-7.0 kBq/mL). The bottle was filled with nonradioactive air or various levels of 18F-FDG (0-170 kBq/mL). Images were corrected either by TFS-SSS or MCS-SSS using the CT data of the bottle filled with nonradioactive air. We compared the image activity concentration in the cylindrical phantom with the true activity concentration. We also performed 15O-gas brain PET based on the steady-state method on patients with cerebrovascular disease to obtain quantitative images of cerebral blood flow and oxygen metabolism. Results: In the phantom experiments, a cold artifact was observed immediately next to the bottle on TFS-SSS images, where the image activity concentrations in the cylindrical phantom were underestimated by 18%, 36%, and 70% at bottle radioactivity levels of 2.4, 5.1, and 9.7 kBq/mL, respectively. At higher bottle radioactivity, the image activity concentrations in the cylindrical phantom were underestimated by more than 98%. For MCS-SSS, in contrast, the error was within 5% at each bottle radioactivity level, although the images showed slight high-activity artifacts around the bottle when the bottle contained significantly high radioactivity. In the patient imaging with 15O2 and C15O2 inhalation, cold artifacts were observed on TFS-SSS images, whereas
Variational Monte Carlo Technique
Indian Academy of Sciences (India)
Variational Monte Carlo Technique: Ground State Energies of Quantum Mechanical Systems. Sukanta Deb. General Article, Resonance – Journal of Science Education, Volume 19, Issue 8, August 2014, pp. 713-739.
Coded aperture optimization using Monte Carlo simulations
International Nuclear Information System (INIS)
Martineau, A.; Rocchisani, J.M.; Moretti, J.L.
2010-01-01
Coded apertures using Uniformly Redundant Arrays (URA) have been unsuccessfully evaluated for two-dimensional and three-dimensional imaging in Nuclear Medicine. The images reconstructed from coded projections contain artifacts and suffer from poor spatial resolution in the longitudinal direction. We introduce a Maximum-Likelihood Expectation-Maximization (MLEM) algorithm for three-dimensional coded aperture imaging which uses a projection matrix calculated by Monte Carlo simulations. The aim of the algorithm is to reduce artifacts and improve the three-dimensional spatial resolution in the reconstructed images. Firstly, we present the validation of GATE (Geant4 Application for Emission Tomography) for Monte Carlo simulations of a coded mask installed on a clinical gamma camera. The coded mask modelling was validated by comparison between experimental and simulated data in terms of energy spectra, sensitivity and spatial resolution. In the second part of the study, we use the validated model to calculate the projection matrix with Monte Carlo simulations. A three-dimensional thyroid phantom study was performed to compare the performance of the three-dimensional MLEM reconstruction with conventional correlation method. The results indicate that the artifacts are reduced and three-dimensional spatial resolution is improved with the Monte Carlo-based MLEM reconstruction.
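A minimal sketch of the MLEM update used above, with a system (projection) matrix pre-computed by Monte Carlo simulation of the coded aperture; the matrix shapes, initial image and iteration count are illustrative.

```python
# MLEM reconstruction with a Monte Carlo-estimated projection matrix A.
import numpy as np

def mlem(A, projections, n_iter=50, eps=1e-12):
    """A: (n_detector_bins, n_voxels) projection matrix; projections: measured counts."""
    x = np.ones(A.shape[1])                  # uniform initial image
    sensitivity = A.sum(axis=0) + eps        # backprojection of ones
    for _ in range(n_iter):
        forward = A @ x + eps                # forward projection of current estimate
        x *= (A.T @ (projections / forward)) / sensitivity
    return x
```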
Monte Carlo codes and Monte Carlo simulator program
International Nuclear Information System (INIS)
Higuchi, Kenji; Asai, Kiyoshi; Suganuma, Masayuki.
1990-03-01
Four typical Monte Carlo codes, KENO-IV, MORSE, MCNP and VIM, have been vectorized on the VP-100 at the Computing Center, JAERI. The problems in vector processing of Monte Carlo codes on vector processors have become clear through this work. As a result, it is recognized that there are difficulties in obtaining good performance in vector processing of Monte Carlo codes. A Monte Carlo computing machine, which processes Monte Carlo codes with high performance, has been under development at our Computing Center since 1987. The concept of the Monte Carlo computing machine and its performance have been investigated and estimated by using a software simulator. In this report, the problems in vectorization of Monte Carlo codes, the Monte Carlo pipelines proposed to mitigate these difficulties, and the results of the performance estimation of the Monte Carlo computing machine by the simulator are described. (author)
2009-01-01
Carlo Rubbia turned 75 on March 31, and CERN held a symposium to mark his birthday and pay tribute to his impressive contribution to both CERN and science. Carlo Rubbia, 4th from right, together with the speakers at the symposium. On 7 April CERN hosted a celebration marking Carlo Rubbia’s 75th birthday and 25 years since he was awarded the Nobel Prize for Physics. "Today we will celebrate 100 years of Carlo Rubbia" joked CERN’s Director-General, Rolf Heuer in his opening speech, "75 years of his age and 25 years of the Nobel Prize." Rubbia received the Nobel Prize along with Simon van der Meer for contributions to the discovery of the W and Z bosons, carriers of the weak interaction. During the symposium, which was held in the Main Auditorium, several eminent speakers gave lectures on areas of science to which Carlo Rubbia made decisive contributions. Among those who spoke were Michel Spiro, Director of the French National Insti...
Hrivnacova, I; Berejnov, V V; Brun, R; Carminati, F; Fassò, A; Futo, E; Gheata, A; Caballero, I G; Morsch, Andreas
2003-01-01
The concept of Virtual Monte Carlo (VMC) has been developed by the ALICE Software Project to allow different Monte Carlo simulation programs to run without changing the user code, such as the geometry definition, the detector response simulation or input and output formats. Recently, the VMC classes have been integrated into the ROOT framework, and the other relevant packages have been separated from the AliRoot framework and can be used individually by any other HEP project. The general concept of the VMC and its set of base classes provided in ROOT will be presented. Existing implementations for Geant3, Geant4 and FLUKA and simple examples of usage will be described.
Ponomarev, Artem; Cucinotta, F.
2011-01-01
To create a generalized mechanistic model of DNA damage in human cells that will generate analytical and image data corresponding to experimentally observed DNA damage foci and will help to improve the experimental foci yields by simulating spatial foci patterns and resolving problems with quantitative image analysis. Material and Methods: The analysis of patterns of RIFs (radiation-induced foci) produced by low- and high-LET (linear energy transfer) radiation was conducted by using a Monte Carlo model that combines the heavy ion track structure with characteristics of the human genome on the level of chromosomes. The foci patterns were also simulated in the maximum projection plane for flat nuclei. Some data analysis was done with the help of image segmentation software that identifies individual classes of RIFs and colocalized RIFs, which is of importance to some experimental assays that assign DNA damage a dual phosphorescent signal. Results: The model predicts the spatial and genomic distributions of DNA DSBs (double strand breaks) and associated RIFs in a human cell nucleus for a particular dose of either low- or high-LET radiation. We used the model to do analyses for different irradiation scenarios. In the beam-parallel-to-the-disk-of-a-flattened-nucleus scenario we found that the foci appeared to be merged due to their high density, while, in the perpendicular-beam scenario, the foci appeared as one bright spot per hit. The statistics and spatial distribution of regions of densely arranged foci, termed DNA foci chains, were predicted numerically using this model. Another analysis was done to evaluate the number of ion hits per nucleus, which were visible from streaks of closely located foci. In another analysis, our image segmentation software determined foci yields directly from images with single-class or colocalized foci. Conclusions: We showed that DSB clustering needs to be taken into account to determine the true DNA damage foci yield, which helps to
Variational Monte Carlo Technique
Indian Academy of Sciences (India)
nonprobabilistic) problem [5]. ... In quantum mechanics, the MC methods are used to simulate many-particle systems using random ... D Ceperley, G V Chester and M H Kalos, Monte Carlo simulation of a many-fermion study, Physical Review Vol.
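As a concrete illustration of the variational Monte Carlo technique discussed in this article, the sketch below estimates the ground-state energy of the one-dimensional harmonic oscillator (hbar = m = omega = 1) with a Gaussian trial wavefunction. The trial parameter, step size and local-energy expression follow the standard textbook treatment rather than the article itself.

```python
# Variational Monte Carlo for the 1D harmonic oscillator with
# trial wavefunction psi(x) = exp(-a*x^2/2); a = 1 gives the exact E0 = 0.5.
import numpy as np

rng = np.random.default_rng(2)

def vmc_energy(a, n_steps=50_000, step=1.0):
    x, energies = 0.0, []
    for _ in range(n_steps):
        x_new = x + rng.uniform(-step, step)
        # Metropolis acceptance with probability |psi(x_new)/psi(x)|^2
        if rng.random() < np.exp(-a * (x_new**2 - x**2)):
            x = x_new
        energies.append(0.5 * a + 0.5 * x * x * (1.0 - a * a))  # local energy
    return np.mean(energies)

print(vmc_energy(1.0))   # ~0.5 (exact ground-state energy)
print(vmc_energy(0.6))   # above 0.5, as required by the variational principle
```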
Indian Academy of Sciences (India)
Markov Chain Monte Carlo - Examples. Arnab Chakraborty. General Article, Resonance – Journal of Science Education, Volume 7, Issue 3, March 2002, pp. 25-34. Permanent link: https://www.ias.ac.in/article/fulltext/reso/007/03/0025-0034.
Leonardo Rossi
Carlo Caso (1940 - 2007) Our friend and colleague Carlo Caso passed away on July 7th, after several months of courageous fight against cancer. Carlo spent most of his scientific career at CERN, taking an active part in the experimental programme of the laboratory. His long and fruitful involvement in particle physics started in the sixties, in the Genoa group led by G. Tomasini. He then carried out several experiments using the CERN liquid hydrogen bubble chambers - first the 2000HBC and later BEBC - to study various facets of the production and decay of meson and baryon resonances. He later formed his own group and joined the NA27 Collaboration to exploit the EHS Spectrometer with a rapid cycling bubble chamber as vertex detector. Amongst their many achievements, they were the first to measure, with excellent precision, the lifetime of the charmed D mesons. At the start of the LEP era, Carlo and his group moved to the DELPHI experiment, participating in the construction and running of the HPC electromagnetic c...
Kalos, Melvin H
2008-01-01
This introduction to Monte Carlo methods seeks to identify and study the unifying elements that underlie their effective application. Initial chapters provide a short treatment of the probability and statistics needed as background, enabling those without experience in Monte Carlo techniques to apply these ideas to their research.The book focuses on two basic themes: The first is the importance of random walks as they occur both in natural stochastic systems and in their relationship to integral and differential equations. The second theme is that of variance reduction in general and importance sampling in particular as a technique for efficient use of the methods. Random walks are introduced with an elementary example in which the modeling of radiation transport arises directly from a schematic probabilistic description of the interaction of radiation with matter. Building on this example, the relationship between random walks and integral equations is outlined
Directory of Open Access Journals (Sweden)
Pedro Medina Avendaño
1981-01-01
Full Text Available Carlos Vega Duarte had the simplicity of elemental and pure beings. His heart was as clean as alluvial gold. His direct and colloquial manner revealed an unspoiled man of Santander who loved the gleam of weapons and was dazzled by the sparkle of perfect phrases.
Energy Technology Data Exchange (ETDEWEB)
Wollaber, Allan Benton [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)
2016-06-16
This is a powerpoint presentation which serves as lecture material for the Parallel Computing summer school. It goes over the fundamentals of the Monte Carlo calculation method. The material is presented according to the following outline: Introduction (background, a simple example: estimating π), Why does this even work? (The Law of Large Numbers, The Central Limit Theorem), How to sample (inverse transform sampling, rejection), and An example from particle transport.
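A minimal version of the "estimating π" example mentioned in the lecture outline, with a Central Limit Theorem error bar; the sample size is an arbitrary illustrative choice.

```python
# Estimate pi by sampling points uniformly in the unit square and counting
# the fraction that falls inside the quarter circle.
import numpy as np

rng = np.random.default_rng(3)
n = 1_000_000
x, y = rng.random(n), rng.random(n)
inside = (x * x + y * y) <= 1.0
pi_hat = 4.0 * inside.mean()
sigma = 4.0 * inside.std() / np.sqrt(n)      # Central Limit Theorem error bar
print(f"pi ~ {pi_hat:.4f} +/- {sigma:.4f}")
```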
Wormhole Hamiltonian Monte Carlo
Lan, S; Streets, J; Shahbaba, B
2014-01-01
Copyright © 2014, Association for the Advancement of Artificial Intelligence. In machine learning and statistics, probabilistic inference involving multimodal distributions is quite difficult. This is especially true in high dimensional problems, where most existing algorithms cannot easily move from one mode to another. To address this issue, we propose a novel Bayesian inference approach based on Markov Chain Monte Carlo. Our method can effectively sample from multimodal distributions, espe...
International Nuclear Information System (INIS)
Creutz, M.
1986-01-01
The author discusses a recently developed algorithm for simulating statistical systems. The procedure interpolates between molecular dynamics methods and canonical Monte Carlo. The primary advantages are extremely fast simulations of discrete systems such as the Ising model and a relative insensitivity to random number quality. A variation of the algorithm gives rise to a deterministic dynamics for Ising spins. This model may be useful for high speed simulation of non-equilibrium phenomena
Energy Technology Data Exchange (ETDEWEB)
Ma, Y; Beaulieu, L [Centre Hospitalier University de Quebec, Quebec, QC (Canada); Laprise-Pelletier, M; Lagueux, J; Cote, M; Fordin, M [Dept of Mining, Metallurgy and Materials Engineering, University Laval, Quebec, QC, CA (Canada)
2016-06-15
Purpose: Gold nanoparticles (GNPs) are a promising radiosensitizer that selectively boosts tumor dose in radiotherapy. Transmission electron microscopy (TEM) imaging observations recently revealed for the first time that GNPs exist in vivo in the form of highly localized vesicles, instead of the hypothetical uniform distribution. This work investigates the corresponding difference in energy deposition in proton therapy. Methods: First, single vesicles of various radii were constructed by packing GNPs (as Φ50 nm gold spheres) in spheres and were simulated, as well as a single GNP. The radial energy depositions (REDs) were scored using 100 concentric spherical shells from 0.1 µm to 10 µm, of 0.1 µm thickness each, for both vesicles and the single GNP, and compared. TEM images, 8 days after injection in a PC3 prostate cancer murine model, were used to extract the positions and dimensions of vesicles, as well as the contours of the cytoplasmic and nucleus membranes. Vesicles were then constructed based on the TEM images. A 100 MeV proton beam was studied by using the Geant4-DNA code, which simulates all energy deposition events. Results: The vesicle REDs, normalized to the same proton energy loss as in a single GNP, are larger (smaller) than that of a single GNP when the radius is >2 µm (<2 µm). The peak increase (at about 3 µm radius) is about 10% and 18% for Φ1 µm and Φ1.6 µm vesicles respectively, relative to a single GNP. The TEM-based simulation resulted in a larger energy deposition (by about one order of magnitude) that follows a completely different pattern from that of hypothetical GNP distributions (regular dotted pattern in extracellular and/or extranuclear regions). Conclusion: The in vivo energy deposition, both in pattern and magnitude, of proton therapy is greatly affected by the true distribution of the GNPs, as illustrated by the presence of GNP vesicles compared to hypothetical scenarios. Work supported by NSERC Discovery Grant #435510, Canada.
Energy Technology Data Exchange (ETDEWEB)
Brockway, D.; Soran, P.; Whalen, P.
1985-01-01
A Monte Carlo algorithm to efficiently calculate static alpha eigenvalues, N = n e^(αt), for supercritical systems has been developed and tested. A direct Monte Carlo approach to calculating a static α is to simply follow the buildup in time of neutrons in a supercritical system and evaluate the logarithmic derivative of the neutron population with respect to time. This procedure is expensive, and the solution is very noisy and almost useless for a system near critical. The modified approach is to convert the time-dependent problem to a static α-eigenvalue problem and regress α on solutions of a k-eigenvalue problem. In practice, this procedure is much more efficient than the direct calculation, and produces much more accurate results. Because the Monte Carlo codes are intrinsically three-dimensional and use elaborate continuous-energy cross sections, this technique is now used as a standard for evaluating other calculational techniques in odd geometries or with group cross sections.
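One common way to picture the static approach described above is as repeated k-eigenvalue solves with a time-absorption term α/v added, followed by a regression/root find for the α at which k(α) = 1. The sketch below is our own illustration of that idea, not the authors' algorithm; `k_of_alpha` is a hypothetical stand-in for a full Monte Carlo k-eigenvalue calculation:

```python
def find_static_alpha(k_of_alpha, alpha0=0.0, alpha1=1.0e3, tol=1.0e-6, max_iter=50):
    """Secant iteration for the alpha at which k(alpha) = 1.
    `k_of_alpha` stands in for a Monte Carlo k-eigenvalue solve with a
    time-absorption cross section alpha/v added (hypothetical placeholder)."""
    f0, f1 = k_of_alpha(alpha0) - 1.0, k_of_alpha(alpha1) - 1.0
    for _ in range(max_iter):
        alpha2 = alpha1 - f1 * (alpha1 - alpha0) / (f1 - f0)
        if abs(alpha2 - alpha1) < tol:
            return alpha2
        alpha0, f0 = alpha1, f1
        alpha1, f1 = alpha2, k_of_alpha(alpha2) - 1.0
    return alpha1

# Toy stand-in: k decreases as the time-absorption alpha increases.
k_toy = lambda a: 1.05 / (1.0 + 1.0e-4 * a)
print(find_static_alpha(k_toy))  # positive alpha for this supercritical toy system
```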
Directory of Open Access Journals (Sweden)
Charlie Samuya Veric
2001-12-01
Full Text Available The importance of Carlos Bulosan in Filipino and Filipino-American radical history and literature is indisputable. His eminence spans the Pacific, and he is known, diversely, as a radical poet, fictionist, novelist, and labor organizer. Author of the canonical America Is in the Heart, Bulosan is celebrated for chronicling the conditions in America in his time, such as racism and unemployment. In the history of criticism on Bulosan's life and work, however, there is an undeclared general consensus that views Bulosan and his work as coherent permanent texts of radicalism and anti-imperialism. Central to the existence of such a tradition of critical reception are the generations of critics who, in more ways than one, control the discourse on and of Carlos Bulosan. This essay inquires into the sphere of the critical reception that orders, for our time and for the time ahead, the reading and interpretation of Bulosan. What eye and seeing, the essay asks, determine the perception of Bulosan as the angel of radicalism? What is obscured in constructing Bulosan as an immutable figure of the political? What light does the reader conceive when the personal is brought into the open and situated against the political? The essay explores the answers to these questions in Bulosan's loving letters to various friends, strangers, and white American women. The presence of these interrogations, the essay believes, will ultimately secure the continuing importance of Carlos Bulosan to radical literature and history.
2009-01-01
On 7 April CERN will be holding a symposium to mark the 75th birthday of Carlo Rubbia, who shared the 1984 Nobel Prize for Physics with Simon van der Meer for contributions to the discovery of the W and Z bosons, carriers of the weak interaction. Following a presentation by Rolf Heuer, lectures will be given by eminent speakers on areas of science to which Carlo Rubbia has made decisive contributions. Michel Spiro, Director of the French National Institute of Nuclear and Particle Physics (IN2P3) of the CNRS, Lyn Evans, sLHC Project Leader, and Alan Astbury of the TRIUMF Laboratory will talk about the physics of the weak interaction and the discovery of the W and Z bosons. Former CERN Director-General Herwig Schopper will lecture on CERN’s accelerators from LEP to the LHC. Giovanni Bignami, former President of the Italian Space Agency and Professor at the IUSS School for Advanced Studies in Pavia will speak about his work with Carlo Rubbia. Finally, Hans Joachim Sch...
2009-01-01
On 7 April CERN will be holding a symposium to mark the 75th birthday of Carlo Rubbia, who shared the 1984 Nobel Prize for Physics with Simon van der Meer for contributions to the discovery of the W and Z bosons, carriers of the weak interaction. Following a presentation by Rolf Heuer, lectures will be given by eminent speakers on areas of science to which Carlo Rubbia has made decisive contributions. Michel Spiro, Director of the French National Institute of Nuclear and Particle Physics (IN2P3) of the CNRS, Lyn Evans, sLHC Project Leader, and Alan Astbury of the TRIUMF Laboratory will talk about the physics of the weak interaction and the discovery of the W and Z bosons. Former CERN Director-General Herwig Schopper will lecture on CERN’s accelerators from LEP to the LHC. Giovanni Bignami, former President of the Italian Space Agency, will speak about his work with Carlo Rubbia. Finally, Hans Joachim Schellnhuber of the Potsdam Institute for Climate Research and Sven Kul...
International Nuclear Information System (INIS)
Park, S-J; Yu, A R; Lee, Y-J; Kim, Y-S; Kim, H-J
2014-01-01
Dedicated single-photon-emission computed tomography (SPECT) systems based on pixelated semiconductors such as cadmium telluride (CdTe) are in development to study small animal models of human disease. In an effort to develop a high-resolution, low-dose system for small animal imaging, we compared a CdTe-based SPECT system and a conventional NaI(Tl)-based SPECT system in terms of spatial resolution, sensitivity, contrast, and contrast-to-noise ratio (CNR). In addition, we investigated the radiation absorbed dose and calculated a figure of merit (FOM) for both SPECT systems. Using the conventional NaI(Tl)-based SPECT system, we achieved a spatial resolution of 1.66 mm at a 30 mm source-to-collimator distance, and a resolution of 2.4-mm hot-rods. Using the newly-developed CdTe-based SPECT system, we achieved a spatial resolution of 1.32 mm FWHM at a 30 mm source-to-collimator distance, and a resolution of 1.7-mm hot-rods. The sensitivities at a 30 mm source-to-collimator distance were 115.73 counts/sec/MBq and 83.38 counts/sec/MBq for the CdTe-based SPECT and conventional NaI(Tl)-based SPECT systems, respectively. To compare quantitative measurements in the mouse brain, we calculated the CNR for images from both systems. The CNR from the CdTe-based SPECT system was 4.41, while that from the conventional NaI(Tl)-based SPECT system was 3.11 when the injected striatal dose was 160 Bq/voxel. The CNR increased as a function of injected dose in both systems. The FOM of the CdTe-based SPECT system was superior to that of the conventional NaI(Tl)-based SPECT system, and the highest FOM was achieved with the CdTe-based SPECT at a dose of 40 Bq/voxel injected into the striatum. Thus, a CdTe-based SPECT system showed significant improvement in performance compared with a conventional system in terms of spatial resolution, sensitivity, and CNR, while reducing the radiation dose to the small animal subject. Herein, we discuss the feasibility of a CdTe-based SPECT system for high
Directory of Open Access Journals (Sweden)
Ho Chul Kim
2017-06-01
Full Text Available To avoid imaging artifacts and interpretation mistakes, improving the uniformity of gamma camera systems is very important. Excellent uniformity can be expected from a cadmium zinc telluride (CZT) photon counting detector (PCD) because of the direct conversion of the gamma-ray energy into electrons. In addition, uniformity performance measures such as integral uniformity (IU), differential uniformity (DU), scatter fraction (SF), and contrast-to-noise ratio (CNR) vary according to the energy window setting. In this study, we compared a PCD and a conventional scintillation detector with respect to the energy windows (5%, 10%, 15%, and 20%) using a 99mTc gamma source with the Geant4 Application for Tomographic Emission (GATE) simulation tool. The gamma camera systems used in this work are a CZT PCD and a NaI(Tl) conventional scintillation detector, each with a 1-mm thickness. According to the results, although the IU and DU results improved with increasing energy window width, the SF and CNR results deteriorated. In particular, the uniformity of the PCD was higher than that of the conventional scintillation detector in all cases. In conclusion, our results demonstrated that the uniformity of the CZT PCD was higher than that of the conventional scintillation detector.
International Nuclear Information System (INIS)
Talley, T.L.; Evans, F.
1988-01-01
Prior work demonstrated the importance of nuclear scattering to fusion product energy deposition in hot plasmas. This suggests careful examination of nuclear physics details in burning plasma simulations. An existing Monte Carlo fast ion transport code is being expanded to be a test bed for this examination. An initial extension, the energy deposition of fast alpha particles in a hot deuterium plasma, is reported. The deposition times and deposition ranges are modified by allowing nuclear scattering. Up to 10% of the initial alpha particle energy is carried to greater ranges and times by the more mobile recoil deuterons. 4 refs., 5 figs., 2 tabs
Adaptive sample map for Monte Carlo ray tracing
Teng, Jun; Luo, Lixin; Chen, Zhibo
2010-07-01
Monte Carlo ray tracing is widely used by production-quality renderers to generate synthesized images for films and TV programs, but the resulting synthetic images contain noise artifacts. In this paper, a novel noise artifact detection and noise level representation method is proposed. We first apply a discrete wavelet transform (DWT) to a synthetic image; the high-frequency sub-bands of the DWT encode the noise information. The sub-band coefficients are then combined to generate a noise level description of the synthetic image, called the noise map in this paper. This noise map is subdivided into blocks for robust calculation of a noise level metric. Increasing the samples per pixel in a Monte Carlo ray tracer can reduce the noise of a synthetic image to a visually unnoticeable level. A noise-to-sample-number mapping is therefore performed on each block of the noise map: higher noise values are mapped to larger sample numbers and lower noise values to smaller ones; the result of this mapping is called the sample map. Each pixel in a sample map can be used by a Monte Carlo ray tracer to reduce the noise level in the corresponding block of pixels in the synthetic image. However, this block-based scheme produces blocky artifacts like those seen in video and image compression algorithms. We use a Gaussian filter to smooth the sample map; the result is the adaptive sample map (ASP). The ASP serves two purposes in the rendering process: its statistics can be used as a noise level metric for the synthetic image, and it can be used by a Monte Carlo ray tracer to refine the synthetic image adaptively, reducing the noise to an unnoticeable level with less rendering time than the brute-force method.
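A rough sketch of that pipeline, as we read it from the abstract (not the authors' code), is shown below. The wavelet/filter libraries, block size, and the minimum/maximum sample budgets are our own assumptions:

```python
import numpy as np
import pywt
from scipy.ndimage import gaussian_filter

def adaptive_sample_map(image, min_spp=4, max_spp=64, block=8, sigma=2.0):
    """Estimate per-region noise from the high-frequency DWT sub-bands, map the
    block-wise noise level to a samples-per-pixel budget, then smooth with a
    Gaussian filter to avoid blocky artifacts (illustrative parameters only)."""
    gray = image.mean(axis=2) if image.ndim == 3 else image
    _, (cH, cV, cD) = pywt.dwt2(gray, "haar")
    noise = np.abs(cH) + np.abs(cV) + np.abs(cD)          # half-resolution noise map
    noise = np.kron(noise, np.ones((2, 2)))[:gray.shape[0], :gray.shape[1]]
    h, w = gray.shape
    spp = np.empty_like(noise)
    for by in range(0, h, block):                          # block-wise noise metric
        for bx in range(0, w, block):
            spp[by:by+block, bx:bx+block] = noise[by:by+block, bx:bx+block].mean()
    spp = (spp - spp.min()) / (np.ptp(spp) + 1e-12)        # normalise to [0, 1]
    spp = min_spp + spp * (max_spp - min_spp)              # noise -> sample count
    return gaussian_filter(spp, sigma)                     # adaptive sample map
```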
Monte Carlo Methods in Physics
International Nuclear Information System (INIS)
Santoso, B.
1997-01-01
The method of Monte Carlo integration is reviewed briefly and some of its applications in physics are explained. A numerical experiment on the random number generators used in Monte Carlo techniques is carried out to examine the randomness of the sequences produced by various generation methods. To account for the weight function involved in Monte Carlo integration, the Metropolis method is used. The experiment shows no regular patterns in the generated numbers, indicating that the generators are reasonably good, and the resulting samples follow the expected statistical distributions. Some applications of Monte Carlo methods in physics are then given. The physical problems are chosen so that exact or approximate solutions are available for comparison with the Monte Carlo calculations. The comparisons show that good agreement is obtained for the models considered.
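The Metropolis step mentioned above (sampling with respect to a weight function) can be illustrated in a few lines; the target weight below is a hypothetical example, not one of the models considered in the paper:

```python
import math
import random

def metropolis(log_weight, x0=0.0, step=1.0, n_samples=10_000, seed=1):
    """Generate samples distributed proportionally to exp(log_weight(x))
    using a symmetric random-walk proposal and the Metropolis acceptance rule."""
    rng = random.Random(seed)
    x, samples = x0, []
    for _ in range(n_samples):
        x_new = x + rng.uniform(-step, step)
        # Accept with probability min(1, w(x_new)/w(x)).
        if math.log(rng.random() + 1e-300) < log_weight(x_new) - log_weight(x):
            x = x_new
        samples.append(x)
    return samples

# Example weight function: a standard Gaussian.
samples = metropolis(lambda x: -0.5 * x * x)
```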
Metropolis Methods for Quantum Monte Carlo Simulations
Ceperley, D. M.
2003-01-01
Since its first description fifty years ago, the Metropolis Monte Carlo method has been used in a variety of different ways for the simulation of continuum quantum many-body systems. This paper will consider some of the generalizations of the Metropolis algorithm employed in quantum Monte Carlo: variational Monte Carlo, dynamical methods for projector Monte Carlo (i.e., diffusion Monte Carlo with rejection), multilevel sampling in path integral Monte Carlo, the sampling of permutations, ...
Parallelizing Monte Carlo with PMC
International Nuclear Information System (INIS)
Rathkopf, J.A.; Jones, T.R.; Nessett, D.M.; Stanberry, L.C.
1994-11-01
PMC (Parallel Monte Carlo) is a system of generic interface routines that allows easy porting of Monte Carlo packages of large-scale physics simulation codes to Massively Parallel Processor (MPP) computers. By loading various versions of PMC, simulation code developers can configure their codes to run in several modes: serial, in which Monte Carlo runs on the same processor as the rest of the code; parallel, in which Monte Carlo runs in parallel across many processors of the MPP with the rest of the code running on other MPP processor(s); and distributed, in which Monte Carlo runs in parallel across many processors of the MPP with the rest of the code running on a different machine. This multi-mode approach allows maintenance of a single simulation code source regardless of the target machine. PMC handles passing of messages between nodes on the MPP, passing of messages between a different machine and the MPP, distributing work between nodes, and providing independent, reproducible sequences of random numbers. Several production codes have been parallelized under the PMC system. Excellent parallel efficiency in both the distributed and parallel modes results if sufficient workload is available per processor. Experiences with a Monte Carlo photonics demonstration code and a Monte Carlo neutronics package are described.
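PMC itself is a message-passing interface layer, but the two services highlighted above, distributing work across processors and giving each one an independent, reproducible random-number stream, can be sketched in modern terms. The following is an analogy in Python, not PMC's actual interface:

```python
import numpy as np
from multiprocessing import Pool

def worker(args):
    """Run a chunk of histories with its own reproducible RNG stream."""
    n_histories, child_seed = args
    rng = np.random.default_rng(child_seed)
    # Toy 'tally': mean of an exponentially distributed path length per history.
    return rng.exponential(scale=1.0, size=n_histories).mean()

def run_parallel(total_histories=1_000_000, n_workers=4, master_seed=2024):
    # SeedSequence.spawn gives statistically independent, reproducible streams,
    # one per worker, regardless of how the work is scheduled.
    children = np.random.SeedSequence(master_seed).spawn(n_workers)
    chunks = [(total_histories // n_workers, c) for c in children]
    with Pool(n_workers) as pool:
        partials = pool.map(worker, chunks)
    return float(np.mean(partials))

if __name__ == "__main__":
    print(run_parallel())
```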
Lectures on Monte Carlo methods
Madras, Neal
2001-01-01
Monte Carlo methods form an experimental branch of mathematics that employs simulations driven by random number generators. These methods are often used when others fail, since they are much less sensitive to the "curse of dimensionality", which plagues deterministic methods in problems with a large number of variables. Monte Carlo methods are used in many fields: mathematics, statistics, physics, chemistry, finance, computer science, and biology, for instance. This book is an introduction to Monte Carlo methods for anyone who would like to use these methods to study various kinds of mathemati
Wormhole Hamiltonian Monte Carlo
Lan, Shiwei; Streets, Jeffrey; Shahbaba, Babak
2015-01-01
In machine learning and statistics, probabilistic inference involving multimodal distributions is quite difficult. This is especially true in high dimensional problems, where most existing algorithms cannot easily move from one mode to another. To address this issue, we propose a novel Bayesian inference approach based on Markov Chain Monte Carlo. Our method can effectively sample from multimodal distributions, especially when the dimension is high and the modes are isolated. To this end, it exploits and modifies the Riemannian geometric properties of the target distribution to create wormholes connecting modes in order to facilitate moving between them. Further, our proposed method uses the regeneration technique in order to adapt the algorithm by identifying new modes and updating the network of wormholes without affecting the stationary distribution. To find new modes, as opposed to rediscovering those previously identified, we employ a novel mode searching algorithm that explores a residual energy function obtained by subtracting an approximate Gaussian mixture density (based on previously discovered modes) from the target density function. PMID:25861551
Wormhole Hamiltonian Monte Carlo.
Lan, Shiwei; Streets, Jeffrey; Shahbaba, Babak
2014-07-31
In machine learning and statistics, probabilistic inference involving multimodal distributions is quite difficult. This is especially true in high dimensional problems, where most existing algorithms cannot easily move from one mode to another. To address this issue, we propose a novel Bayesian inference approach based on Markov Chain Monte Carlo. Our method can effectively sample from multimodal distributions, especially when the dimension is high and the modes are isolated. To this end, it exploits and modifies the Riemannian geometric properties of the target distribution to create wormholes connecting modes in order to facilitate moving between them. Further, our proposed method uses the regeneration technique in order to adapt the algorithm by identifying new modes and updating the network of wormholes without affecting the stationary distribution. To find new modes, as opposed to rediscovering those previously identified, we employ a novel mode searching algorithm that explores a residual energy function obtained by subtracting an approximate Gaussian mixture density (based on previously discovered modes) from the target density function.
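The mode-searching ingredient of the abstract, subtracting a Gaussian-mixture approximation built from known modes from the target density, can be written down compactly. The toy below is our own illustration (a 1D stand-in target, not the authors' implementation); an optimizer descending this residual energy would be attracted to regions the mixture does not yet explain:

```python
import numpy as np

def gaussian(x, mu, sigma):
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

def residual_energy(x, target_density, known_modes, sigma=1.0, floor=1e-300):
    """Energy of the residual p(x) - q(x), where q is a Gaussian mixture centred
    on previously discovered modes: low values flag target mass that q does not
    yet explain, i.e. candidate new modes (illustrative, 1D)."""
    q = np.mean([gaussian(x, mu, sigma) for mu in known_modes]) if known_modes else 0.0
    return -np.log(max(target_density(x) - q, floor))

# Hypothetical bimodal target; only the mode at x = -2 has been discovered so far.
target = lambda x: 0.5 * gaussian(x, -2.0, 1.0) + 0.5 * gaussian(x, 3.0, 1.0)
print(residual_energy(3.0, target, known_modes=[-2.0]))   # low: unexplained mode
print(residual_energy(-2.0, target, known_modes=[-2.0]))  # high: already explained
```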
Fixed forced detection for fast SPECT Monte-Carlo simulation
Cajgfinger, T.; Rit, S.; Létang, J. M.; Halty, A.; Sarrut, D.
2018-03-01
Monte-Carlo simulations of SPECT images are notoriously slow to converge due to the large ratio between the number of photons emitted and detected in the collimator. This work proposes a method to accelerate the simulations based on fixed forced detection (FFD) combined with an analytical response of the detector. FFD is based on a Monte-Carlo simulation but forces the detection of a photon in each detector pixel weighted by the probability of emission (or scattering) and transmission to this pixel. The method was evaluated with numerical phantoms and on patient images. We obtained differences with analog Monte Carlo lower than the statistical uncertainty. The overall computing time gain can reach up to five orders of magnitude. Source code and examples are available in the Gate V8.0 release.
Computed radiography simulation using the Monte Carlo code MCNPX
International Nuclear Information System (INIS)
Correa, S.C.A.; Souza, E.M.; Silva, A.X.; Lopes, R.T.
2009-01-01
Simulating x-ray images has been of great interest in recent years as it makes possible an analysis of how x-ray images are affected owing to relevant operating parameters. In this paper, a procedure for simulating computed radiographic images using the Monte Carlo code MCNPX is proposed. The sensitivity curve of the BaFBr image plate detector as well as the characteristic noise of a 16-bit computed radiography system were considered during the methodology's development. The results obtained confirm that the proposed procedure for simulating computed radiographic images is satisfactory, as it allows obtaining results comparable with experimental data. (author)
Computed radiography simulation using the Monte Carlo code MCNPX
Energy Technology Data Exchange (ETDEWEB)
Correa, S.C.A. [Programa de Engenharia Nuclear/COPPE, Universidade Federal do Rio de Janeiro, Ilha do Fundao, Caixa Postal 68509, 21945-970, Rio de Janeiro, RJ (Brazil); Centro Universitario Estadual da Zona Oeste (CCMAT)/UEZO, Av. Manuel Caldeira de Alvarenga, 1203, Campo Grande, 23070-200, Rio de Janeiro, RJ (Brazil); Souza, E.M. [Programa de Engenharia Nuclear/COPPE, Universidade Federal do Rio de Janeiro, Ilha do Fundao, Caixa Postal 68509, 21945-970, Rio de Janeiro, RJ (Brazil); Silva, A.X., E-mail: ademir@con.ufrj.b [PEN/COPPE-DNC/Poli CT, Universidade Federal do Rio de Janeiro, Ilha do Fundao, Caixa Postal 68509, 21945-970, Rio de Janeiro, RJ (Brazil); Cassiano, D.H. [Instituto de Radioprotecao e Dosimetria/CNEN Av. Salvador Allende, s/n, Recreio, 22780-160, Rio de Janeiro, RJ (Brazil); Lopes, R.T. [Programa de Engenharia Nuclear/COPPE, Universidade Federal do Rio de Janeiro, Ilha do Fundao, Caixa Postal 68509, 21945-970, Rio de Janeiro, RJ (Brazil)
2010-09-15
Simulating X-ray images has been of great interest in recent years as it makes possible an analysis of how X-ray images are affected owing to relevant operating parameters. In this paper, a procedure for simulating computed radiographic images using the Monte Carlo code MCNPX is proposed. The sensitivity curve of the BaFBr image plate detector as well as the characteristic noise of a 16-bit computed radiography system were considered during the methodology's development. The results obtained confirm that the proposed procedure for simulating computed radiographic images is satisfactory, as it allows obtaining results comparable with experimental data.
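The two ingredients emphasized in this abstract, an image-plate sensitivity curve and the characteristic noise of a 16-bit CR system, act as a post-processing stage on the energy-deposition map that the Monte Carlo transport produces. A hypothetical version of that post-processing step might look like the following; the sensitivity curve and noise level are placeholders, not the authors' measured data:

```python
import numpy as np

def energy_map_to_cr_image(energy_map, sensitivity, noise_sigma=50.0, seed=0):
    """Convert a Monte Carlo energy-deposition map (e.g. an MCNPX mesh tally)
    into a synthetic 16-bit computed radiography image.
    `sensitivity` maps deposited energy to detector signal (placeholder curve)."""
    rng = np.random.default_rng(seed)
    signal = sensitivity(energy_map)
    signal = signal + rng.normal(0.0, noise_sigma, size=signal.shape)  # system noise
    return np.clip(signal, 0, 2**16 - 1).astype(np.uint16)

# Placeholder sensitivity curve: logarithmic response typical of image plates.
sens = lambda e: 8000.0 * np.log1p(1e4 * e)
image = energy_map_to_cr_image(np.random.default_rng(1).random((128, 128)) * 1e-3, sens)
```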
Advanced Multilevel Monte Carlo Methods
Jasra, Ajay
2017-04-24
This article reviews the application of advanced Monte Carlo techniques in the context of Multilevel Monte Carlo (MLMC). MLMC is a strategy employed to compute expectations which can be biased in some sense, for instance, by using the discretization of an associated probability law. The MLMC approach works with a hierarchy of biased approximations which become progressively more accurate and more expensive. Using a telescoping representation of the most accurate approximation, the method is able to reduce the computational cost for a given level of error versus i.i.d. sampling from this latter approximation. All of these ideas originated for cases where exact sampling from couples in the hierarchy is possible. This article considers the case where such exact sampling is not currently possible. We consider Markov chain Monte Carlo and sequential Monte Carlo methods which have been introduced in the literature, and we describe different strategies which facilitate the application of MLMC within these methods.
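The telescoping idea is easiest to see for the plain (i.i.d.) MLMC estimator. The sketch below uses a synthetic stand-in for the level-l approximation and is only meant to show that structure, not the MCMC/SMC variants the article actually develops:

```python
import numpy as np

def mlmc_estimate(sample_level, n_per_level):
    """Plain multilevel Monte Carlo: E[g_L] = E[g_0] + sum_l E[g_l - g_{l-1}].
    `sample_level(l, n)` must return n coupled samples of (g_l, g_{l-1}),
    with g_{-1} taken as 0 on the coarsest level."""
    estimate = 0.0
    for level, n in enumerate(n_per_level):
        fine, coarse = sample_level(level, n)
        estimate += np.mean(fine - coarse)   # correction term for this level
    return estimate

# Toy example: a real application would put an Euler discretization with 2**l
# steps here; we use a synthetic biased approximation g_l = X + bias/2**l instead.
def sample_level(level, n, rng=np.random.default_rng(3)):
    x = rng.normal(size=n)                          # shared randomness couples the levels
    fine = x + 1.0 / 2 ** level
    coarse = (x + 1.0 / 2 ** (level - 1)) if level > 0 else np.zeros(n)
    return fine, coarse

print(mlmc_estimate(sample_level, n_per_level=[4000, 2000, 1000, 500]))
```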
Handbook of Monte Carlo methods
National Research Council Canada - National Science Library
Kroese, Dirk P; Taimre, Thomas; Botev, Zdravko I
2011-01-01
... in rapid succession, the staggering number of related techniques, ideas, concepts and algorithms makes it difficult to maintain an overall picture of the Monte Carlo approach. This book attempts to encapsulate the emerging dynamics of this field of study"--
TARC: Carlo Rubbia's Energy Amplifier
Laurent Guiraud
1997-01-01
Transmutation by Adiabatic Resonance Crossing (TARC) is Carlo Rubbia's energy amplifier. This CERN experiment demonstrated that long-lived fission fragments, such as 99Tc, can be efficiently destroyed.
Monte Carlo simulation for IRRMA
International Nuclear Information System (INIS)
Gardner, R.P.; Liu Lianyan
2000-01-01
Monte Carlo simulation is fast becoming a standard approach for many radiation applications that were previously treated almost entirely by experimental techniques. This is certainly true for Industrial Radiation and Radioisotope Measurement Applications - IRRMA. The reasons for this include: (1) the increased cost and inadequacy of experimentation for design and interpretation purposes; (2) the availability of low cost, large memory, and fast personal computers; and (3) the general availability of general purpose Monte Carlo codes that are increasingly user-friendly, efficient, and accurate. This paper discusses the history and present status of Monte Carlo simulation for IRRMA including the general purpose (GP) and specific purpose (SP) Monte Carlo codes and future needs - primarily from the experience of the authors
Accelerated GPU based SPECT Monte Carlo simulations.
Garcia, Marie-Paule; Bert, Julien; Benoit, Didier; Bardiès, Manuel; Visvikis, Dimitris
2016-06-07
Monte Carlo (MC) modelling is widely used in the field of single photon emission computed tomography (SPECT) as it is a reliable technique to simulate very high quality scans. This technique provides very accurate modelling of the radiation transport and particle interactions in a heterogeneous medium. Various MC codes exist for nuclear medicine imaging simulations. Recently, new strategies exploiting the computing capabilities of graphical processing units (GPU) have been proposed. This work aims at evaluating the accuracy of such GPU implementation strategies in comparison to standard MC codes in the context of SPECT imaging. GATE was considered the reference MC toolkit and used to evaluate the performance of newly developed GPU Geant4-based Monte Carlo simulation (GGEMS) modules for SPECT imaging. Radioisotopes with different photon energies were used with these various CPU and GPU Geant4-based MC codes in order to assess the best strategy for each configuration. Three different isotopes were considered: (99m) Tc, (111)In and (131)I, using a low energy high resolution (LEHR) collimator, a medium energy general purpose (MEGP) collimator and a high energy general purpose (HEGP) collimator respectively. Point source, uniform source, cylindrical phantom and anthropomorphic phantom acquisitions were simulated using a model of the GE infinia II 3/8" gamma camera. Both simulation platforms yielded a similar system sensitivity and image statistical quality for the various combinations. The overall acceleration factor between GATE and GGEMS platform derived from the same cylindrical phantom acquisition was between 18 and 27 for the different radioisotopes. Besides, a full MC simulation using an anthropomorphic phantom showed the full potential of the GGEMS platform, with a resulting acceleration factor up to 71. The good agreement with reference codes and the acceleration factors obtained support the use of GPU implementation strategies for improving computational
Accelerated GPU based SPECT Monte Carlo simulations
Garcia, Marie-Paule; Bert, Julien; Benoit, Didier; Bardiès, Manuel; Visvikis, Dimitris
2016-06-01
Monte Carlo (MC) modelling is widely used in the field of single photon emission computed tomography (SPECT) as it is a reliable technique to simulate very high quality scans. This technique provides very accurate modelling of the radiation transport and particle interactions in a heterogeneous medium. Various MC codes exist for nuclear medicine imaging simulations. Recently, new strategies exploiting the computing capabilities of graphical processing units (GPU) have been proposed. This work aims at evaluating the accuracy of such GPU implementation strategies in comparison to standard MC codes in the context of SPECT imaging. GATE was considered the reference MC toolkit and used to evaluate the performance of newly developed GPU Geant4-based Monte Carlo simulation (GGEMS) modules for SPECT imaging. Radioisotopes with different photon energies were used with these various CPU and GPU Geant4-based MC codes in order to assess the best strategy for each configuration. Three different isotopes were considered: 99m Tc, 111In and 131I, using a low energy high resolution (LEHR) collimator, a medium energy general purpose (MEGP) collimator and a high energy general purpose (HEGP) collimator respectively. Point source, uniform source, cylindrical phantom and anthropomorphic phantom acquisitions were simulated using a model of the GE infinia II 3/8" gamma camera. Both simulation platforms yielded a similar system sensitivity and image statistical quality for the various combinations. The overall acceleration factor between GATE and GGEMS platform derived from the same cylindrical phantom acquisition was between 18 and 27 for the different radioisotopes. Besides, a full MC simulation using an anthropomorphic phantom showed the full potential of the GGEMS platform, with a resulting acceleration factor up to 71. The good agreement with reference codes and the acceleration factors obtained support the use of GPU implementation strategies for improving computational efficiency
Carlos Chagas: biographical sketch.
Moncayo, Alvaro
2010-01-01
Carlos Chagas was born on 9 July 1878 in the farm "Bon Retiro" located close to the City of Oliveira in the interior of the State of Minas Gerais, Brazil. He started his medical studies in 1897 at the School of Medicine of Rio de Janeiro. In the late nineteenth century, the works of Louis Pasteur and Robert Koch induced a change in the medical paradigm, with emphasis on experimental demonstrations of the causal link between microbes and disease. During the same years, the pathological concept of disease, linking organic lesions with symptoms, appeared in Germany. All these innovations were adopted by the reforms of the medical schools in Brazil and influenced the scientific formation of Chagas. Chagas completed his medical studies between 1897 and 1903, and his examinations during these years were always ranked with high grades. Oswaldo Cruz accepted Chagas as a doctoral candidate and directed his thesis on "Hematological studies of Malaria", which was received with honors by the examiners. In 1903 the director appointed Chagas as research assistant at the Institute. In those years, the Institute of Manguinhos, under the direction of Oswaldo Cruz, initiated a process of institutional growth and gathered a distinguished group of Brazilian and foreign scientists. In 1907, he was requested to investigate and control a malaria outbreak in Lassance, Minas Gerais. At that moment Chagas could not have imagined that this field research was the beginning of one of the most notable medical discoveries. Chagas was, at the age of 28, a Research Assistant at the Institute of Manguinhos and was studying a new flagellate parasite isolated from triatomine insects captured in the State of Minas Gerais. Chagas made his discoveries in this order: first the causal agent, then the vector and finally the human cases. These notable discoveries were carried out by Chagas in twenty months. At the age of 33 Chagas had completed his discoveries and published the scientific articles that gave him world
Adjoint electron Monte Carlo calculations
International Nuclear Information System (INIS)
Jordan, T.M.
1986-01-01
Adjoint Monte Carlo is the most efficient method for accurate analysis of space systems exposed to natural and artificially enhanced electron environments. Recent adjoint calculations for isotropic electron environments include: comparative data for experimental measurements on electronics boxes; benchmark problem solutions for comparing total dose prediction methodologies; preliminary assessment of sectoring methods used during space system design; and total dose predictions on an electronics package. Adjoint Monte Carlo, forward Monte Carlo, and experiment are in excellent agreement for electron sources that simulate space environments. For electron space environments, adjoint Monte Carlo is clearly superior to forward Monte Carlo, requiring one to two orders of magnitude less computer time for relatively simple geometries. The solid-angle sectoring approximations used for routine design calculations can err by more than a factor of 2 on dose in simple shield geometries. For critical space systems exposed to severe electron environments, these potential sectoring errors demand the establishment of large design margins and/or verification of shield design by adjoint Monte Carlo/experiment
Monte Carlo simulation for radiographic applications
International Nuclear Information System (INIS)
Tillack, G.R.; Bellon, C.
2003-01-01
Standard radiography simulators are based on the attenuation law complemented by build-up factors (BUF) to describe the interaction of radiation with material. The assumption of BUF implies that scattered radiation only reduces the contrast in radiographic images. This simplification holds for a wide range of applications like weld inspection, as known from practical experience. But only a detailed description of the different underlying interaction mechanisms is capable of explaining effects like mottling and others that every radiographer has experienced in practice. Monte Carlo models can handle the primary and secondary interaction mechanisms contributing to the image formation process, such as photon interactions (absorption, incoherent and coherent scattering including electron-binding effects, pair production) and electron interactions (electron tracing including X-ray fluorescence and bremsstrahlung production). This opens up possibilities like the separation of influencing factors and an understanding of the functioning of the intensifying screens used in film radiography. The paper discusses the opportunities in applying the Monte Carlo method to investigate special features in radiography in terms of selected examples. (orig.)
Parallel Monte Carlo Search for Hough Transform
Lopes, Raul H. C.; Franqueira, Virginia N. L.; Reid, Ivan D.; Hobson, Peter R.
2017-10-01
We investigate the problem of line detection in digital image processing, in particular how state-of-the-art algorithms behave in the presence of noise and whether CPU efficiency can be improved by the combination of a Monte Carlo Tree Search, hierarchical space decomposition, and parallel computing. The starting point of the investigation is the method introduced in 1962 by Paul Hough for detecting lines in binary images. Extended in the 1970s to the detection of other spatial forms, what came to be known as the Hough Transform (HT) has been proposed, for example, in the context of track fitting in the LHC ATLAS and CMS projects. The Hough Transform turns the problem of line detection into one of optimization: finding the peak in a vote-counting process over cells which contain the possible points of candidate lines. The detection algorithm can be computationally expensive both in the demands made upon the processor and on memory. Additionally, it can have a reduced effectiveness in detection in the presence of noise. Our first contribution consists in an evaluation of the use of a variation of the Radon Transform as a way of improving the effectiveness of line detection in the presence of noise. Then, parallel algorithms for variations of the Hough Transform and the Radon Transform for line detection are introduced. An algorithm for Parallel Monte Carlo Search applied to line detection is also introduced. Their algorithmic complexities are discussed. Finally, implementations on multi-GPU and multicore architectures are discussed.
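The vote-counting step referred to above is the standard Hough accumulator; a minimal serial version (not the parallel MCTS variant the paper introduces) looks like this:

```python
import numpy as np

def hough_lines(points, img_diag, n_theta=180, n_rho=200):
    """Standard Hough transform for lines: each point votes for every
    (theta, rho) cell with rho = x*cos(theta) + y*sin(theta); peaks in the
    accumulator correspond to candidate lines."""
    thetas = np.linspace(0.0, np.pi, n_theta, endpoint=False)
    accumulator = np.zeros((n_theta, n_rho), dtype=np.int32)
    for x, y in points:
        rho = x * np.cos(thetas) + y * np.sin(thetas)
        rho_idx = np.round((rho + img_diag) / (2 * img_diag) * (n_rho - 1)).astype(int)
        accumulator[np.arange(n_theta), rho_idx] += 1
    return accumulator, thetas

# Points roughly on the line y = x should produce a peak near theta = 3*pi/4, rho = 0.
pts = [(i, i) for i in range(50)]
acc, thetas = hough_lines(pts, img_diag=np.hypot(50, 50))
print(np.unravel_index(acc.argmax(), acc.shape))
```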
National Aeronautics and Space Administration — Images for the website main pages and all configurations. The upload and access points for the other images are: Website Template RSW images BSCW Images HIRENASD...
Multilevel sequential Monte Carlo samplers
Beskos, Alexandros
2016-08-29
In this article we consider the approximation of expectations w.r.t. probability distributions associated to the solution of partial differential equations (PDEs); this scenario appears routinely in Bayesian inverse problems. In practice, one often has to solve the associated PDE numerically, using, for instance, finite element methods which depend on the step-size level h_L. In addition, the expectation cannot be computed analytically and one often resorts to Monte Carlo methods. In the context of this problem, it is known that the introduction of the multilevel Monte Carlo (MLMC) method can reduce the amount of computational effort to estimate expectations, for a given level of error. This is achieved via a telescoping identity associated to a Monte Carlo approximation of a sequence of probability distributions with discretization levels ∞ > h_0 > h_1 > ⋯ > h_L. In many practical problems of interest, one cannot achieve an i.i.d. sampling of the associated sequence and a sequential Monte Carlo (SMC) version of the MLMC method is introduced to deal with this problem. It is shown that under appropriate assumptions, the attractive property of a reduction of the amount of computational effort to estimate expectations, for a given level of error, can be maintained within the SMC context, that is, relative to exact sampling and Monte Carlo for the distribution at the finest level h_L. The approach is numerically illustrated on a Bayesian inverse problem. © 2016 Elsevier B.V.
Strategies for CT tissue segmentation for Monte Carlo calculations in nuclear medicine dosimetry
DEFF Research Database (Denmark)
Braad, Poul-Erik; Andersen, Thomas; Hansen, Søren Baarsgaard
2016-01-01
Purpose: CT images are used for patient specific Monte Carlo treatment planning in radionuclide therapy. The authors investigated the impact of tissue classification, CT image segmentation, and CT errors on Monte Carlo calculated absorbed dose estimates in nuclear medicine. Methods: CT errors...... calibration of the CT number-to-density conversion ramp. Tissue segmentation by a 13-tissue CT conversion ramp, calibrated by a stoichiometric method, resulted in low (isotopes. Conclusions: A calibrated CT scanner specific conversion ramp is required for accurate...
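The "CT number-to-density conversion ramp" mentioned in this abstract is a piecewise-linear calibration from Hounsfield units to mass density. A generic version is sketched below; the calibration points are hypothetical illustrations, not the authors' scanner-specific, stoichiometrically calibrated values:

```python
import numpy as np

def hu_to_density(hu, calibration_points):
    """Piecewise-linear conversion ramp from CT number (HU) to mass density.
    `calibration_points` is a list of (HU, g/cm^3) pairs measured on the scanner;
    the values used below are generic placeholders, not a validated calibration."""
    hu_pts, rho_pts = zip(*sorted(calibration_points))
    return np.interp(hu, hu_pts, rho_pts)

# Hypothetical calibration points: air, lung, water/soft tissue, cortical bone.
ramp = [(-1000, 0.001), (-700, 0.30), (0, 1.00), (1300, 1.85)]
densities = hu_to_density(np.array([-1000, -50, 40, 900]), ramp)
```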
Monte Carlo techniques in diagnostic and therapeutic nuclear medicine
International Nuclear Information System (INIS)
Zaidi, H.
2002-01-01
community at large. The application of Monte Carlo techniques in medical physics is an enduring topic of enthusiasm and an area of considerable research interest. Monte Carlo modelling has contributed to a better understanding of the physics of radiation transport in medical physics. As an example, the large number of applications of the Monte Carlo method attests to its usefulness as a research tool in different areas of nuclear medicine imaging, including detector modelling and systems design, image reconstruction and correction techniques, internal dosimetry and pharmacokinetic modelling. In particular, Monte Carlo simulation is a gold standard for the simulation of nuclear medicine imaging systems and is an indispensable research tool to develop and evaluate dose planning algorithms. Recent developments in nuclear medicine instrumentation, including high-resolution SPECT/PET scanners and multimodality imagers, as well as applications in patient-specific dosimetry, are ideal for Monte Carlo modelling techniques because of the stochastic nature of radiation emission, transport and detection processes. Factors which have contributed to the wider use include improved models of radiation transport processes, the practicality of application with the development of acceleration schemes, and the improved speed of computers as well as the availability of multiple-processor parallel processing systems
Markov Chain Monte Carlo Methods-Simple Monte Carlo
Indian Academy of Sciences (India)
Resonance – Journal of Science Education, Volume 8, Issue 4. Markov Chain Monte Carlo ... New York 14853, USA. Indian Statistical Institute, 8th Mile, Mysore Road, Bangalore 560 059, India. Systat Software Asia-Pacific (P) Ltd., Floor 5, 'C' Tower, Golden Enclave, Airport Road, Bangalore 560 017, India.
Exact Monte Carlo for molecules
Energy Technology Data Exchange (ETDEWEB)
Lester, W.A. Jr.; Reynolds, P.J.
1985-03-01
A brief summary of the fixed-node quantum Monte Carlo method is presented. Results obtained for binding energies, the classical barrier height for H + H2, and the singlet-triplet splitting in methylene are presented and discussed. 17 refs.
Markov Chain Monte Carlo Methods
Indian Academy of Sciences (India)
time Technical Consultant to Systat Software Asia-Pacific (P) Ltd., in Bangalore, where the technical work for the development of the statistical software Systat takes place. His research interests have been in statistical pattern recognition and biostatistics. Keywords: Markov chain, Monte Carlo sampling, Markov chain Monte.
Markov Chain Monte Carlo Methods
Indian Academy of Sciences (India)
Markov Chain Monte Carlo Methods. 2. The Markov Chain Case. K B Athreya, Mohan Delampady and T Krishnan. K B Athreya is a Professor at Cornell University. His research interests include mathematical analysis, probability theory and its applications, and statistics. He enjoys writing for Resonance. His spare time is ...
Markov Chain Monte Carlo Methods
Indian Academy of Sciences (India)
Markov Chain Monte Carlo Methods. 3. Statistical Concepts. K B Athreya, Mohan Delampady and T Krishnan. K B Athreya is a Professor at Cornell University. His research interests include mathematical analysis, probability theory and its applications, and statistics. He enjoys writing for Resonance.
Monte Carlo calculations of nuclei
Energy Technology Data Exchange (ETDEWEB)
Pieper, S.C. [Argonne National Lab., IL (United States). Physics Div.
1997-10-01
Nuclear many-body calculations have the complication of strong spin- and isospin-dependent potentials. In these lectures the author discusses the variational and Green's function Monte Carlo techniques that have been developed to address this complication, and presents a few results.
Markov Chain Monte Carlo Methods
Indian Academy of Sciences (India)
ter of the 20th century, due to rapid developments in computing technology ... early part of this development saw a host of Monte ... These iterative Monte Carlo procedures typically generate a random sequence with the Markov property such that the Markov chain is ergodic with a limiting distribution coinciding with the ...
Is Monte Carlo embarrassingly parallel?
International Nuclear Information System (INIS)
Hoogenboom, J. E.
2012-01-01
Monte Carlo is often stated as being embarrassingly parallel. However, running a Monte Carlo calculation, especially a reactor criticality calculation, in parallel using tens of processors shows a serious limitation in speedup, and the execution time may even increase beyond a certain number of processors. In this paper the main causes of the loss of efficiency when using many processors are analyzed using a simple Monte Carlo program for criticality. The basic mechanism for parallel execution is MPI. One of the bottlenecks turns out to be the rendezvous points in the parallel calculation used for synchronization and exchange of data between processors. This happens at least at the end of each cycle for fission source generation, in order to collect the full fission source distribution for the next cycle and to estimate the effective multiplication factor, which is not only part of the requested results but also input to the next cycle for population control. Basic improvements to overcome this limitation are suggested and tested. Other time losses in the parallel calculation are also identified. Moreover, the threading mechanism, which allows the parallel execution of tasks based on shared memory using OpenMP, is analyzed in detail. Recommendations are given to get the maximum efficiency out of a parallel Monte Carlo calculation. (authors)
Monte Carlo - Advances and Challenges
International Nuclear Information System (INIS)
Brown, Forrest B.; Mosteller, Russell D.; Martin, William R.
2008-01-01
Abstract only, full text follows: With ever-faster computers and mature Monte Carlo production codes, there has been tremendous growth in the application of Monte Carlo methods to the analysis of reactor physics and reactor systems. In the past, Monte Carlo methods were used primarily for calculating keff of a critical system. More recently, Monte Carlo methods have been increasingly used for determining reactor power distributions and many design parameters, such as βeff, leff, τ, reactivity coefficients, Doppler defect, dominance ratio, etc. These advanced applications of Monte Carlo methods are now becoming common, not just feasible, and bring new challenges to both developers and users: convergence of 3D power distributions must be assured; confidence interval bias must be eliminated; iterated fission probabilities are required, rather than single-generation probabilities; temperature effects including Doppler and feedback must be represented; isotopic depletion and fission product buildup must be modeled. This workshop focuses on recent advances in Monte Carlo methods and their application to reactor physics problems, and on the resulting challenges faced by code developers and users. The workshop is partly tutorial, partly a review of the current state-of-the-art, and partly a discussion of future work that is needed. It should benefit both novice and expert Monte Carlo developers and users. In each of the topic areas, we provide an overview of needs, perspective on past and current methods, a review of recent work, and discussion of further research and capabilities that are required. Electronic copies of all workshop presentations and material will be available. The workshop is structured as 2 morning and 2 afternoon segments: - Criticality Calculations I - convergence diagnostics, acceleration methods, confidence intervals, and the iterated fission probability, - Criticality Calculations II - reactor kinetics parameters, dominance ratio, temperature
(U) Introduction to Monte Carlo Methods
Energy Technology Data Exchange (ETDEWEB)
Hungerford, Aimee L. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)
2017-03-20
Monte Carlo methods are very valuable for representing solutions to particle transport problems. Here we describe a “cook book” approach to handling the terms in a transport equation using Monte Carlo methods. Focus is on the mechanics of a numerical Monte Carlo code, rather than the mathematical foundations of the method.
Monte Carlo simulation of tomography techniques using the platform Gate
International Nuclear Information System (INIS)
Barbouchi, Asma
2007-01-01
Simulations play a key role in functional imaging, with applications ranging from scanner design and scatter correction to protocol optimisation. GATE (Geant4 Application for Tomographic Emission) is a platform for Monte Carlo simulation. It is based on Geant4 to generate and track particles and to model geometry and physics processes. Explicit modelling of time includes detector motion, time of flight, and tracer kinetics. Interfaces to voxellised models and image reconstruction packages improve the integration of GATE into the global modelling cycle. In this work, Monte Carlo simulations are used to understand and optimise the gamma camera's performance. We study the effect of the distance between the source and the collimator, the diameter of the holes and the thickness of the collimator on the spatial resolution, energy resolution and efficiency of the gamma camera. We also study the reduction of simulation time and implement a model of the left ventricle in GATE. (Author). 7 refs
Shell model Monte Carlo methods
International Nuclear Information System (INIS)
Koonin, S.E.
1996-01-01
We review quantum Monte Carlo methods for dealing with large shell model problems. These methods reduce the imaginary-time many-body evolution operator to a coherent superposition of one-body evolutions in fluctuating one-body fields; resultant path integral is evaluated stochastically. We first discuss the motivation, formalism, and implementation of such Shell Model Monte Carlo methods. There then follows a sampler of results and insights obtained from a number of applications. These include the ground state and thermal properties of pf-shell nuclei, thermal behavior of γ-soft nuclei, and calculation of double beta-decay matrix elements. Finally, prospects for further progress in such calculations are discussed. 87 refs
Zimmerman, George B.
Monte Carlo methods appropriate to simulate the transport of x-rays, neutrons, ions and electrons in Inertial Confinement Fusion targets are described and analyzed. The Implicit Monte Carlo method of x-ray transport handles symmetry within indirect drive ICF hohlraums well, but can be improved 50X in efficiency by angular biasing the x-rays towards the fuel capsule. Accurate simulation of thermonuclear burn and burn diagnostics involves detailed particle source spectra, charged particle ranges, inflight reaction kinematics, corrections for bulk and thermal Doppler effects and variance reduction to obtain adequate statistics for rare events. It is found that the effects of angular Coulomb scattering must be included in models of charged particle transport through heterogeneous materials.
International Nuclear Information System (INIS)
Zimmerman, G.B.
1997-01-01
Monte Carlo methods appropriate to simulate the transport of x-rays, neutrons, ions and electrons in Inertial Confinement Fusion targets are described and analyzed. The Implicit Monte Carlo method of x-ray transport handles symmetry within indirect drive ICF hohlraums well, but can be improved 50X in efficiency by angular biasing the x-rays towards the fuel capsule. Accurate simulation of thermonuclear burn and burn diagnostics involves detailed particle source spectra, charged particle ranges, inflight reaction kinematics, corrections for bulk and thermal Doppler effects and variance reduction to obtain adequate statistics for rare events. It is found that the effects of angular Coulomb scattering must be included in models of charged particle transport through heterogeneous materials. copyright 1997 American Institute of Physics
International Nuclear Information System (INIS)
Zimmerman, George B.
1997-01-01
Monte Carlo methods appropriate to simulate the transport of x-rays, neutrons, ions and electrons in Inertial Confinement Fusion targets are described and analyzed. The Implicit Monte Carlo method of x-ray transport handles symmetry within indirect drive ICF hohlraums well, but can be improved 50X in efficiency by angular biasing the x-rays towards the fuel capsule. Accurate simulation of thermonuclear burn and burn diagnostics involves detailed particle source spectra, charged particle ranges, inflight reaction kinematics, corrections for bulk and thermal Doppler effects and variance reduction to obtain adequate statistics for rare events. It is found that the effects of angular Coulomb scattering must be included in models of charged particle transport through heterogeneous materials
Adaptive Multilevel Monte Carlo Simulation
Hoel, H
2011-08-23
This work generalizes a multilevel forward Euler Monte Carlo method introduced in Michael B. Giles (Michael Giles. Oper. Res. 56(3):607–617, 2008.) for the approximation of expected values depending on the solution to an Itô stochastic differential equation. The work (Michael Giles. Oper. Res. 56(3):607–617, 2008.) proposed and analyzed a forward Euler multilevel Monte Carlo method based on a hierarchy of uniform time discretizations and control variates to reduce the computational effort required by a standard, single level, forward Euler Monte Carlo method. This work introduces an adaptive hierarchy of non-uniform time discretizations, generated by an adaptive algorithm introduced in (Anna Dzougoutov, …, Raúl Tempone. Adaptive Monte Carlo algorithms for stopped diffusion. In Multiscale methods in science and engineering, volume 44 of Lect. Notes Comput. Sci. Eng., pages 59–88. Springer, Berlin, 2005; Kyoung-Sook Moon et al. Stoch. Anal. Appl. 23(3):511–558, 2005; Kyoung-Sook Moon et al. An adaptive algorithm for ordinary, stochastic and partial differential equations. In Recent advances in adaptive computation, volume 383 of Contemp. Math., pages 325–343. Amer. Math. Soc., Providence, RI, 2005.). This form of the adaptive algorithm generates stochastic, path dependent, time steps and is based on a posteriori error expansions first developed in (Anders Szepessy et al. Comm. Pure Appl. Math. 54(10):1169–1214, 2001). Our numerical results for a stopped diffusion problem exhibit savings in the computational cost to achieve an accuracy of O(TOL), from O(TOL⁻³) using a single-level version of the adaptive algorithm to O((TOL⁻¹ log(TOL))²).
Extending canonical Monte Carlo methods
International Nuclear Information System (INIS)
Velazquez, L; Curilef, S
2010-01-01
In this paper, we discuss the implications of a recently obtained equilibrium fluctuation-dissipation relation for the extension of the available Monte Carlo methods on the basis of the consideration of the Gibbs canonical ensemble to account for the existence of an anomalous regime with negative heat capacities C α with α≈0.2 for the particular case of the 2D ten-state Potts model
Parallel Monte Carlo reactor neutronics
International Nuclear Information System (INIS)
Blomquist, R.N.; Brown, F.B.
1994-01-01
The issues affecting implementation of parallel algorithms for large-scale engineering Monte Carlo neutron transport simulations are discussed. For nuclear reactor calculations, these include load balancing, recoding effort, reproducibility, domain decomposition techniques, I/O minimization, and strategies for different parallel architectures. Two codes were parallelized and tested for performance. The architectures employed include SIMD, MIMD-distributed memory, and workstation network with uneven interactive load. Speedups linear with the number of nodes were achieved
Monte Carlo dose calculation in dental amalgam phantom
Mohd Zahri Abdul Aziz; A L Yusoff; N D Osman; R Abdullah; N A Rabaie; M S Salikin
2015-01-01
It has become a great challenge in modern radiation treatment to ensure the accuracy of treatment delivery in electron beam therapy. Tissue inhomogeneity is one of the factors affecting accurate dose calculation, and accounting for it requires complex calculation algorithms such as Monte Carlo (MC). On the other hand, the computed tomography (CT) images used in the treatment planning system need to be trustworthy, as they are the input to radiotherapy treatment. However, with the presence of metal amalgam in treatm...
Optimization of reconstruction algorithms using Monte Carlo simulation
International Nuclear Information System (INIS)
Hanson, K.M.
1989-01-01
A method for optimizing reconstruction algorithms is presented that is based on how well a specified task can be performed using the reconstructed images. Task performance is numerically assessed by a Monte Carlo simulation of the complete imaging process including the generation of scenes appropriate to the desired application, subsequent data taking, reconstruction, and performance of the stated task based on the final image. The use of this method is demonstrated through the optimization of the Algebraic Reconstruction Technique (ART), which reconstructs images from their projections by an iterative procedure. The optimization is accomplished by varying the relaxation factor employed in the updating procedure. In some of the imaging situations studied, it is found that the optimization of constrained ART, in which a non-negativity constraint is invoked, can vastly increase the detectability of objects. There is little improvement attained for unconstrained ART. The general method presented may be applied to the problem of designing neutron-diffraction spectrometers. (author)
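ART's update rule, with the relaxation factor that the study varies and the optional non-negativity constraint, is compact enough to show directly; the sketch below is a generic illustration under those assumptions, not the authors' implementation:

```python
import numpy as np

def art_reconstruct(A, b, n_iters=10, relaxation=0.5, nonneg=True):
    """Algebraic Reconstruction Technique: sweep over projection rows and nudge
    the image x toward satisfying each measured projection b_i, scaled by a
    relaxation factor; optionally clamp to non-negative values."""
    x = np.zeros(A.shape[1])
    row_norms = (A ** 2).sum(axis=1)
    for _ in range(n_iters):
        for i in range(A.shape[0]):
            if row_norms[i] == 0.0:
                continue
            residual = b[i] - A[i] @ x
            x += relaxation * residual / row_norms[i] * A[i]
            if nonneg:
                np.clip(x, 0.0, None, out=x)   # non-negativity constraint
    return x

# Tiny toy system standing in for a projection matrix and its sinogram.
A = np.array([[1.0, 1.0, 0.0], [0.0, 1.0, 1.0], [1.0, 0.0, 1.0]])
x_true = np.array([1.0, 2.0, 3.0])
print(art_reconstruct(A, A @ x_true, n_iters=50, relaxation=1.0))
```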
Monte Carlo simulation of a prototype photodetector used in radiotherapy
Kausch, C; Albers, D; Schmidt, R; Schreiber, B
2000-01-01
The imaging performance of prototype electronic portal imaging devices (EPIDs) has been investigated. Monte Carlo simulations have been applied to calculate the modulation transfer function (MTF(f)), the noise power spectrum (NPS(f)) and the detective quantum efficiency (DQE(f)) for different new types of EPID, which consist of a detector combination of metal or polyethylene (PE), a phosphor layer of Gd2O2S and a flat array of photodiodes. The simulated results agree well with measurements. Based on the simulated results, possible optimizations of these devices are discussed.
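For reference, the three quantities named above are usually tied together, for an x-ray quantum-limited detector, by a textbook relation of the following form (this is a standard identity, not something specific to this paper):

```latex
\mathrm{DQE}(f) \;=\; \frac{\mathrm{SNR}^2_{\mathrm{out}}(f)}{\mathrm{SNR}^2_{\mathrm{in}}(f)}
\;=\; \frac{\bar{d}^{\,2}\,\mathrm{MTF}^2(f)}{\bar{q}\,\mathrm{NPS}(f)},
```

where d̄ is the mean (large-area) detector signal and q̄ the incident photon fluence.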
Monte Carlo methods for medical physics a practical introduction
Schuemann, Jan; Paganetti, Harald
2018-01-01
The Monte Carlo (MC) method, established as the gold standard to predict results of physical processes, is now fast becoming a routine clinical tool for applications that range from quality control to treatment verification. This book provides a basic understanding of the fundamental principles and limitations of the MC method in the interpretation and validation of results for various scenarios. It shows how user-friendly and speed optimized MC codes can achieve online image processing or dose calculations in a clinical setting. It introduces this essential method with emphasis on applications in hardware design and testing, radiological imaging, radiation therapy, and radiobiology.
Optical coherence tomography: Monte Carlo simulation and improvement by optical amplification
DEFF Research Database (Denmark)
Tycho, Andreas
2002-01-01
, and high insertion loss of the fast optical delay-line scanners that are necessary for fast imaging. Correspondingly, an increase in penetration depth of about 30-100% is demonstrated for OCT imaging in skin based on results obtained with the new Monte Carlo model. Accordingly, the two new models...
Cuartel San Carlos. Yacimiento veterano
Directory of Open Access Journals (Sweden)
Mariana Flores
2007-01-01
Full Text Available The Cuartel San Carlos is a national historic monument (1986) dating from the late 18th century (1785-1790), marked by various setbacks during its construction and by having withstood the earthquakes of 1812 and 1900. In 2006, the body responsible for its custody, the Instituto de Patrimonio Cultural of the Ministry of Culture, carried out three stages of archaeological exploration covering the back courtyard, the central courtyard, and the east and west wings of the building. This paper reviews the analysis of the archaeological documentation obtained at the site through that project, called EACUSAC (Archaeological Study of the Cuartel San Carlos), which also represents the third campaign carried out at the site. The importance of this historic site lies in its role in the events that gave rise to power struggles during the emergence of the Republic and in the political events of the 20th century. The site also yielded a broad sample of archaeological materials documenting everyday military life, as well as the internal social dynamics that took place in the San Carlos as a strategic place for the defense of the different regimes the country has passed through, from the era of Spanish imperialism to the present day.
Carlos Battilana: Professor, Manager, Friend
Directory of Open Access Journals (Sweden)
José Pacheco
2009-12-01
Full Text Available The Editorial Committee of Anales has lost one of its most distinguished members. A brilliant teacher of our Faculty, Carlos Alberto Battilana Guanilo (1945-2009) knew how to transmit knowledge and hold the attention of his audiences, whether young students or contemporaries no longer so young. He drew his students into the path of continuing education and research. He also engaged distinguished physicians to form and lead groups interested in science and friendship. His vocation for teaching linked him to medical faculties, academies and scientific societies, where he coordinated fondly remembered courses and congresses. His scientific output was devoted to nephrology, immunology, cancer, and the costs of medical treatment. His capacity for management and leadership, present since his student days, allowed him to become regional director of a highly prestigious pharmaceutical laboratory, to organize a faculty of medicine, and later to hold the post of dean of the faculty of health sciences of that private university. Carlos was an important element in Anales attaining a place of privilege among Peruvian biomedical journals. In the profile we publish, we try to summarize briefly the career of Carlos Battilana, weeks after his departure without return.
Directory of Open Access Journals (Sweden)
Rafael Maya
1979-04-01
Full Text Available Among the poets of the Centenario generation, Luis Carlos López enjoyed great popularity abroad from the publication of his first book. I believe his work attracted the attention of philosophers such as Unamuno and, if I am not mistaken, Darío referred to it in flattering terms. In Colombia it has been praised hyperbolically by some, while others grant it no great merit.
Antitwilight II: Monte Carlo simulations.
Richtsmeier, Steven C; Lynch, David K; Dearborn, David S P
2017-07-01
For this paper, we employ the Monte Carlo scene (MCScene) radiative transfer code to elucidate the underlying physics giving rise to the structure and colors of the antitwilight, i.e., twilight opposite the Sun. MCScene calculations successfully reproduce colors and spatial features observed in videos and still photos of the antitwilight taken under clear, aerosol-free sky conditions. Through simulations, we examine the effects of solar elevation angle, Rayleigh scattering, molecular absorption, aerosol scattering, multiple scattering, and surface reflectance on the appearance of the antitwilight. We also compare MCScene calculations with predictions made by the MODTRAN radiative transfer code for a solar elevation angle of +1°.
Carlos Restrepo. A true Maestro
Pelayo Correa
2009-01-01
Carlos Restrepo was the first professor of Pathology and an illustrious member of the group of pioneers who founded the Faculty of Medicine of the Universidad del Valle. These pioneers converged on Cali in the 1950s, possessed of a renewing and creative spirit that undertook, with great success, the task of changing the academic culture of the Valle del Cauca. They found a peaceable society that enjoyed the generosity of its surroundings, with no desire to break with centuries-old traditions...
Therapeutic Applications of Monte Carlo Calculations in Nuclear Medicine
International Nuclear Information System (INIS)
Coulot, J
2003-01-01
Monte Carlo techniques are involved in many applications in medical physics, and the field of nuclear medicine has seen a great development in the past ten years due to their wider use. Thus, it is of great interest to look at the state of the art in this domain, when improving computer performance allows one to obtain improved results in a dramatically reduced time. The goal of this book is to provide, in 15 chapters, an exhaustive review of the use of Monte Carlo techniques in nuclear medicine, also giving key features which are not necessarily directly related to the Monte Carlo method, but mandatory for its practical application. As the book deals with 'therapeutic' nuclear medicine, it focuses on internal dosimetry. After a general introduction on Monte Carlo techniques and their applications in nuclear medicine (dosimetry, imaging and radiation protection), the authors give an overview of internal dosimetry methods (formalism, mathematical phantoms, quantities of interest). Then, some of the more widely used Monte Carlo codes are described, as well as some treatment planning software packages. Some original techniques are also mentioned, such as dosimetry for boron neutron capture synovectomy. It is generally well written, clearly presented, and very well documented. Each chapter gives an overview of its subject, and it is up to the reader to investigate it further using the extensive bibliography provided. Each topic is discussed from a practical point of view, which is of great help for non-experienced readers. For instance, the chapter about mathematical aspects of Monte Carlo particle transport is very clear and helps one to apprehend the philosophy of the method, which is often a difficulty with a more theoretical approach. Each chapter is put in the general (clinical) context, and this allows the reader to keep in mind the intrinsic limitation of each technique involved in dosimetry (for instance activity quantitation). Nevertheless, there are some minor remarks to
Mean field simulation for Monte Carlo integration
Del Moral, Pierre
2013-01-01
In the last three decades, there has been a dramatic increase in the use of interacting particle methods as a powerful tool in real-world applications of Monte Carlo simulation in computational physics, population biology, computer sciences, and statistical machine learning. Ideally suited to parallel and distributed computation, these advanced particle algorithms include nonlinear interacting jump diffusions; quantum, diffusion, and resampled Monte Carlo methods; Feynman-Kac particle models; genetic and evolutionary algorithms; sequential Monte Carlo methods; adaptive and interacting Markov...
Monte Carlo simulations of neutron scattering instruments
International Nuclear Information System (INIS)
Aestrand, Per-Olof; Copenhagen Univ.; Lefmann, K.; Nielsen, K.
2001-01-01
A Monte Carlo simulation is an important computational tool used in many areas of science and engineering. The use of Monte Carlo techniques for simulating neutron scattering instruments is discussed. The basic ideas, techniques and approximations are presented. Since the construction of a neutron scattering instrument is very expensive, Monte Carlo software used for the design of instruments has to be validated and tested extensively. The McStas software was designed with these aspects in mind and some of the basic principles of the McStas software will be discussed. Finally, some future prospects are discussed for using Monte Carlo simulations in optimizing neutron scattering experiments. (R.P.)
Status of Monte Carlo dose planning
International Nuclear Information System (INIS)
Mackie, T.R.
1995-01-01
Monte Carlo simulation will become increasingly important for treatment planning for radiotherapy. The EGS4 Monte Carlo system, a general particle transport system, has been used most often for simulation tasks in radiotherapy, although ETRAN/ITS and MCNP have also been used. Monte Carlo treatment planning requires that the beam characteristics, such as the energy spectrum and angular distribution of particles emerging from clinical accelerators, be accurately represented. An EGS4 Monte Carlo code, called BEAM, was developed by the OMEGA Project (a collaboration between the University of Wisconsin and the National Research Council of Canada) to transport particles through linear accelerator heads. This information was used as input to simulate the passage of particles through CT-based representations of phantoms or patients using both an EGS4 code (DOSXYZ) and the macro Monte Carlo (MMC) method. Monte Carlo computed 3-D electron beam dose distributions compare well to measurements obtained in simple and complex heterogeneous phantoms. The present drawback with most Monte Carlo codes is that simulation times are slower than those of most non-stochastic dose computation algorithms. This is especially true for photon dose planning. In the future, dedicated Monte Carlo treatment planning systems like Peregrine (from Lawrence Livermore National Laboratory), which will be capable of computing the dose from all beam types, or the macro Monte Carlo (MMC) system, which is an order of magnitude faster than other algorithms, may dominate the field.
Evaluation of cobalt-60 energy deposit in mouse and monkey using Monte Carlo simulation
Energy Technology Data Exchange (ETDEWEB)
Woo, Sang Keun; Kim, Wook; Park, Yong Sung; Kang, Joo Hyun; Lee, Yong Jin [Korea Institute of Radiological and Medical Sciences, KIRAMS, Seoul (Korea, Republic of); Cho, Doo Wan; Lee, Hong Soo; Han, Su Cheol [Jeonbuk Department of Inhalation Research, Korea Institute of toxicology, KRICT, Jeongeup (Korea, Republic of)
2016-12-15
Absorbed doses can be calculated using the Monte Carlo transport code MCNP (Monte Carlo N-Particle transport code). For internal radiotherapy, absorbed dose has conventionally been calculated with software such as OLINDA/EXM or by Monte Carlo simulation. However, OLINDA/EXM cannot calculate individual absorbed doses or doses to non-standard organs such as tumors, whereas Monte Carlo simulation can calculate doses to non-standard organs and subject-specific absorbed doses using individual CT images. For external radiotherapy, absorbed dose can be calculated from the specific absorbed energy in each organ using Monte Carlo simulation. The specific absorbed energy in each organ differs between species, and even within the same species, because organ size, position, and density differ. The aim of this study was to individually evaluate the cobalt-60 energy deposit in a mouse and a monkey using Monte Carlo simulation. The absorbed energy in the mouse heart was 54.6-fold higher than the absorbed energy in the monkey heart; likewise, the lung was 88.4-fold, the liver 16.0-fold, and the urinary bladder 29.4-fold higher than in the monkey. This indicates that the distances between organs and the organ masses affect the absorbed energy. These results may help in calculating absorbed dose and producing more accurate plans for external beam radiotherapy and internal radiotherapy.
Directory of Open Access Journals (Sweden)
Fernando Garavito
1981-06-01
Full Text Available The literary criticism of recent years has grown accustomed to seeing in Guillermo Valencia the emblem of an era, one to which it is necessary to refer, for better or worse, whenever one wishes to set limits on the poetic activity of any of his contemporaries. And although the assertion does not hold entirely for those who consider themselves his disciples, because in that case the august pride of the master of Popayán sets them apart, it does hold, and to a high degree, in the case of Luis Carlos López, who by his tone, his themes and his "breath" has come to be within everyone's reach.
Monte Carlo lattice program KIM
International Nuclear Information System (INIS)
Cupini, E.; De Matteis, A.; Simonini, R.
1980-01-01
The Monte Carlo program KIM solves the steady-state linear neutron transport equation for a fixed-source problem or, by successive fixed-source runs, for the eigenvalue problem, in a two-dimensional thermal reactor lattice. Fluxes and reaction rates are the main quantities computed by the program, from which power distribution and few-group averaged cross sections are derived. The simulation ranges from 10 MeV to zero and includes anisotropic and inelastic scattering in the fast energy region, the epithermal Doppler broadening of the resonances of some nuclides, and the thermalization phenomenon by taking into account the thermal velocity distribution of some molecules. Besides the well known combinatorial geometry, the program allows complex configurations to be represented by a discrete set of points, an approach greatly improving calculation speed
Directory of Open Access Journals (Sweden)
Bárbara Bustamante
2005-01-01
Full Text Available The talent of Carlos Alonso (Argentina, 1929) has succeeded in forging a language with a style of its own. His drawings, paintings, pastels and inks, collages and prints fixed in the visual field the projection of his subjectivity. Both image and word make explicit a critical vision of reality that puts the viewer under tension, forcing a reflective stance committed to the message; this is the aspect most emphasized by art historians. The present study, however, aims to focus on the iconic and plastic aspects of his work.
Monte Carlo simulation of experiments
International Nuclear Information System (INIS)
Opat, G.I.
1977-07-01
An outline of the technique of computer simulation of particle physics experiments by the Monte Carlo method is presented. Useful special purpose subprograms are listed and described. At each stage the discussion is made concrete by direct reference to the program SIMUL8 and its variant MONTE-PION, written to assist in the analysis of the radiative decay experiments μ+ → e+ ν_e ν̄ γ and π+ → e+ ν_e γ, respectively. These experiments were based on the use of two large sodium iodide crystals, TINA and MINA, as e and γ detectors. Instructions for the use of SIMUL8 and MONTE-PION are given. (author)
Monte Carlo Simulation of Phase Transitions
村井, 信行; N., MURAI; 中京大学教養部
1983-01-01
In the Monte Carlo simulation of phase transitions, a simple heat bath method is applied to the classical Heisenberg model in two dimensions. It reproduces the correlation length predicted by the Monte Carlo renormalization group and also computed in the non-linear σ model.
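A minimal heat-bath sweep for the 2D classical Heisenberg model can be written as below: each spin is redrawn directly from the Boltzmann distribution in the local field of its neighbours, using the analytic inverse CDF for the polar angle. Lattice size, coupling, temperature and the number of sweeps are arbitrary illustration values, and this generic update is only assumed to resemble the scheme used in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def heat_bath_sweep(spins, J=1.0, beta=1.0):
    """One heat-bath sweep of an L x L lattice of classical Heisenberg (O(3)) spins.

    Each spin is redrawn from p(s) ~ exp(beta * h . s), where h is the local field
    of the four nearest neighbours (periodic boundaries)."""
    L = spins.shape[0]
    for i in range(L):
        for j in range(L):
            h = J * (spins[(i + 1) % L, j] + spins[(i - 1) % L, j]
                     + spins[i, (j + 1) % L] + spins[i, (j - 1) % L])
            a = beta * np.linalg.norm(h)
            if a < 1e-12:                       # no preferred direction: uniform spin
                s = rng.normal(size=3)
                spins[i, j] = s / np.linalg.norm(s)
                continue
            h_hat = h / np.linalg.norm(h)
            # inverse CDF of p(cos t) ~ exp(a cos t) on [-1, 1]
            u = rng.random()
            cos_t = np.log(np.exp(-a) + u * (np.exp(a) - np.exp(-a))) / a
            sin_t = np.sqrt(max(0.0, 1.0 - cos_t ** 2))
            phi = 2.0 * np.pi * rng.random()
            # orthonormal frame around the local-field direction
            e1 = np.cross(h_hat, [0.0, 0.0, 1.0])
            if np.linalg.norm(e1) < 1e-8:
                e1 = np.cross(h_hat, [1.0, 0.0, 0.0])
            e1 /= np.linalg.norm(e1)
            e2 = np.cross(h_hat, e1)
            spins[i, j] = cos_t * h_hat + sin_t * (np.cos(phi) * e1 + np.sin(phi) * e2)

L, beta = 16, 1.2
spins = rng.normal(size=(L, L, 3))
spins /= np.linalg.norm(spins, axis=2, keepdims=True)
for _ in range(200):
    heat_bath_sweep(spins, beta=beta)
print("magnetisation per spin:", np.linalg.norm(spins.mean(axis=(0, 1))))
```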
Advanced Computational Methods for Monte Carlo Calculations
Energy Technology Data Exchange (ETDEWEB)
Brown, Forrest B. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)
2018-01-12
This course is intended for graduate students who already have a basic understanding of Monte Carlo methods. It focuses on advanced topics that may be needed for thesis research, for developing new state-of-the-art methods, or for working with modern production Monte Carlo codes.
Markov Chain Monte Carlo Methods
Indian Academy of Sciences (India)
Keywords. Markov chain; state space; stationary transition probability; stationary distribution; irreducibility; aperiodicity; stationarity; M-H algorithm; proposal distribution; acceptance probability; image processing; Gibbs sampler.
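The keywords above (proposal distribution, acceptance probability, M-H algorithm) map directly onto a random-walk Metropolis-Hastings sampler. The sketch below targets a standard normal density purely for illustration; the target, step size and burn-in length are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(42)

def metropolis_hastings(log_target, x0, n_samples=10_000, step=1.0):
    """Random-walk Metropolis-Hastings with a symmetric Gaussian proposal."""
    x = x0
    chain = np.empty(n_samples)
    for k in range(n_samples):
        proposal = x + step * rng.normal()            # proposal distribution
        log_alpha = log_target(proposal) - log_target(x)
        if np.log(rng.random()) < log_alpha:          # acceptance probability
            x = proposal
        chain[k] = x
    return chain

log_std_normal = lambda x: -0.5 * x * x               # log density up to a constant
chain = metropolis_hastings(log_std_normal, x0=5.0, step=1.5)
print(chain[2000:].mean(), chain[2000:].std())        # ~0 and ~1 after burn-in
```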
The MC21 Monte Carlo Transport Code
International Nuclear Information System (INIS)
Sutton TM; Donovan TJ; Trumbull TH; Dobreff PS; Caro E; Griesheimer DP; Tyburski LJ; Carpenter DC; Joo H
2007-01-01
MC21 is a new Monte Carlo neutron and photon transport code currently under joint development at the Knolls Atomic Power Laboratory and the Bettis Atomic Power Laboratory. MC21 is the Monte Carlo transport kernel of the broader Common Monte Carlo Design Tool (CMCDT), which is also currently under development. The vision for CMCDT is to provide an automated, computer-aided modeling and post-processing environment integrated with a Monte Carlo solver that is optimized for reactor analysis. CMCDT represents a strategy to push the Monte Carlo method beyond its traditional role as a benchmarking tool or ''tool of last resort'' and into a dominant design role. This paper describes various aspects of the code, including the neutron physics and nuclear data treatments, the geometry representation, and the tally and depletion capabilities
SIMIND Monte Carlo simulation of a single photon emission CT
International Nuclear Information System (INIS)
Bahreyni Toossi, M.T.; Pirayesh Islamian, J.; Naseri, S.H.; Momennezhad, M.; Ljungberg, M.
2010-01-01
In this study, we simulated a Siemens E.CAM SPECT system using the SIMIND Monte Carlo program to obtain its experimental characterization in terms of energy resolution, sensitivity, spatial resolution and phantom imaging with 99mTc. The experimental and simulated SPECT imaging data were acquired from a point source and a Jaszczak phantom. Verification of the simulation was done by comparing the two sets of images and related data obtained from the actual and simulated systems. Image quality was assessed by comparing image contrast and resolution. Simulated and measured energy spectra (with and without a collimator) and spatial resolution from point sources in air were compared. The resulting energy spectra show similar peaks for the 99mTc gamma energy at 140 keV. The FWHM was calculated to be 14.01 keV for the simulation and 13.80 keV for the experimental data, corresponding to energy resolutions of 10.01% and 9.86%, compared to the specified 9.9% for both systems. The sensitivities of the real and virtual gamma cameras were calculated to be 85.11 and 85.39 cps/MBq, respectively. The energy spectra of the simulated and real gamma cameras matched. Images obtained from the Jaszczak phantom, experimentally and by simulation, showed similar contrast and resolution. SIMIND Monte Carlo could successfully simulate the Siemens E.CAM gamma camera. The results validate the use of the simulated system for further investigation, including modification, planning, and development of a SPECT system to improve image quality. (author)
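The energy-resolution figures quoted above follow from dividing the photopeak FWHM by the photopeak energy. The sketch below reproduces that arithmetic for the 140 keV peak using the FWHM values reported in the abstract, together with the standard Gaussian relation FWHM ≈ 2.355σ.

```python
import numpy as np

def energy_resolution(fwhm_kev, peak_kev=140.0):
    """Energy resolution (%) = FWHM / photopeak energy * 100."""
    return 100.0 * fwhm_kev / peak_kev

def fwhm_from_sigma(sigma_kev):
    """FWHM of a Gaussian photopeak from its standard deviation (~2.355 sigma)."""
    return 2.0 * np.sqrt(2.0 * np.log(2.0)) * sigma_kev

print(energy_resolution(14.01))   # ~10.01 %  (simulated spectrum)
print(energy_resolution(13.80))   # ~ 9.86 %  (measured spectrum)
```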
Barr, Catherine, Ed.
1997-01-01
The theme of this month's issue is "Images"--from early paintings and statuary to computer-generated design. Resources on the theme include Web sites, CD-ROMs and software, videos, books, and others. A page of reproducible activities is also provided. Features include photojournalism, inspirational Web sites, art history, pop art, and myths. (AEF)
Monte Carlo Method with Heuristic Adjustment for Irregularly Shaped Food Product Volume Measurement
Directory of Open Access Journals (Sweden)
Joko Siswantoro
2014-01-01
Full Text Available Volume measurement plays an important role in the production and processing of food products. Various methods have been proposed to measure the volume of food products with irregular shapes based on 3D reconstruction. However, 3D reconstruction comes at a high computational cost, and some of the volume measurement methods based on 3D reconstruction have low accuracy. Another method for measuring the volume of objects uses the Monte Carlo method, which performs volume measurements using random points; it only requires information on whether the random points fall inside or outside the object and does not require a 3D reconstruction. This paper proposes volume measurement using a computer vision system for irregularly shaped food products without 3D reconstruction, based on the Monte Carlo method with heuristic adjustment. Five images of a food product were captured using five cameras and processed to produce binary images. Monte Carlo integration with heuristic adjustment was performed to measure the volume based on the information extracted from the binary images. The experimental results show that the proposed method provides high accuracy and precision compared to the water displacement method. In addition, the proposed method is more accurate and faster than the space carving method.
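The core of the hit-or-miss Monte Carlo estimate, classifying random points as inside or outside the object and scaling by the bounding volume, can be sketched in a few lines. Here an analytic ellipsoid stands in for the multi-camera binary-image inside test used in the paper, so the geometry and sample size are assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(1)

def monte_carlo_volume(is_inside, bounds, n_points=200_000):
    """Hit-or-miss Monte Carlo volume estimate.

    is_inside : function mapping an (n, 3) array of points to a boolean mask
    bounds    : ((xmin, xmax), (ymin, ymax), (zmin, zmax)) bounding box
    """
    lo = np.array([b[0] for b in bounds])
    hi = np.array([b[1] for b in bounds])
    pts = lo + (hi - lo) * rng.random((n_points, 3))
    frac = is_inside(pts).mean()            # fraction of points inside the object
    return frac * np.prod(hi - lo)          # scaled by the bounding-box volume

# Stand-in object: an ellipsoid with semi-axes 3, 2, 1 (true volume = 4/3*pi*6).
inside = lambda p: (p[:, 0] / 3) ** 2 + (p[:, 1] / 2) ** 2 + (p[:, 2] / 1) ** 2 <= 1.0
est = monte_carlo_volume(inside, ((-3, 3), (-2, 2), (-1, 1)))
print(est, 4 / 3 * np.pi * 6)
```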
International Nuclear Information System (INIS)
Kellum, C.D.; Fisher, L.M.; Tegtmeyer, C.J.
1987-01-01
This paper examines the advantages of the use of excretory urography for diagnosis. According to the authors, excretory urography remains the basic radiologic examination of the urinary tract and is the foundation for the evaluation of suspected urologic disease. Despite the development of newer diagnostic modalities such as isotope scanning, ultrasonography, CT, and magnetic resonance imaging (MRI), excretory urography has maintained a prominent role in uroradiology. Some indications have been altered and will continue to change with the newer imaging modalities, but the initial evaluation of suspected urinary tract structural abnormalities, hematuria, pyuria, and calculus disease is best performed with excretory urography. The examination is relatively inexpensive and simple to perform, with few contraindications. Excretory urography, when properly performed, can provide valuable information about the renal parenchyma, pelvicalyceal system, ureters, and urinary bladder.
MATLAB platform for Monte Carlo planning and dosimetry experimental evaluation
International Nuclear Information System (INIS)
Baeza, J. A.; Ureba, A.; Jimenez-Ortega, E.; Pereira-Barbeiro, A. R.; Leal, A.
2013-01-01
A new platform is presented for full Monte Carlo planning and independent experimental evaluation that can be integrated into clinical practice. The tool has proved its usefulness and efficiency and now forms part of the workflow of our research group, being the tool used for the generation of results, which are duly reviewed and are being published. This software is an effort to integrate numerous image processing algorithms, along with planning optimization algorithms, allowing the MCTP planning process to be carried out from a single interface. In addition, it becomes a flexible and accurate tool for the evaluation of experimental dosimetric data for the quality control of actual treatments. (Author)
Monte carlo simulation for soot dynamics
Zhou, Kun
2012-01-01
A new Monte Carlo method termed Comb-like frame Monte Carlo is developed to simulate soot dynamics. A detailed stochastic error analysis is provided. Comb-like frame Monte Carlo is coupled with the gas phase solver Chemkin II to simulate soot formation in a 1-D premixed burner-stabilized flame. The simulated soot number density, volume fraction, and particle size distribution all agree well with the measurements available in the literature. The origin of the bimodal particle size distribution is revealed with quantitative proof.
Monte Carlo approaches to light nuclei
International Nuclear Information System (INIS)
Carlson, J.
1990-01-01
Significant progress has been made recently in the application of Monte Carlo methods to the study of light nuclei. We review new Green's function Monte Carlo results for the alpha particle, Variational Monte Carlo studies of 16 O, and methods for low-energy scattering and transitions. Through these calculations, a coherent picture of the structure and electromagnetic properties of light nuclei has arisen. In particular, we examine the effect of the three-nucleon interaction and the importance of exchange currents in a variety of experimentally measured properties, including form factors and capture cross sections. 29 refs., 7 figs
Monte Carlo approaches to light nuclei
Energy Technology Data Exchange (ETDEWEB)
Carlson, J.
1990-01-01
Significant progress has been made recently in the application of Monte Carlo methods to the study of light nuclei. We review new Green's function Monte Carlo results for the alpha particle, Variational Monte Carlo studies of {sup 16}O, and methods for low-energy scattering and transitions. Through these calculations, a coherent picture of the structure and electromagnetic properties of light nuclei has arisen. In particular, we examine the effect of the three-nucleon interaction and the importance of exchange currents in a variety of experimentally measured properties, including form factors and capture cross sections. 29 refs., 7 figs.
Importance iteration in MORSE Monte Carlo calculations
International Nuclear Information System (INIS)
Kloosterman, J.L.; Hoogenboom, J.E.
1994-02-01
An expression to calculate point values (the expected detector response of a particle emerging from a collision or the source) is derived and implemented in the MORSE-SGC/S Monte Carlo code. It is outlined how these point values can be smoothed as a function of energy and as a function of the optical thickness between the detector and the source. The smoothed point values are subsequently used to calculate the biasing parameters of the Monte Carlo runs to follow. The method is illustrated by an example, which shows that the obtained biasing parameters lead to a more efficient Monte Carlo calculation. (orig.)
Adaptive Markov Chain Monte Carlo
Jadoon, Khan
2016-08-08
A substantial interpretation of electromagnetic induction (EMI) measurements requires quantifying optimal model parameters and the uncertainty of a nonlinear inverse problem. For this purpose, an adaptive Bayesian Markov chain Monte Carlo (MCMC) algorithm is used to assess multi-orientation and multi-offset EMI measurements in an agricultural field with non-saline and saline soil. In the MCMC simulations, the posterior distribution was computed using Bayes' rule. The electromagnetic forward model, based on the full solution of Maxwell's equations, was used to simulate the apparent electrical conductivity measured with the configurations of the EMI instrument, the CMD mini-Explorer. The model parameters and uncertainty for the three-layered earth model are investigated by using synthetic data. Our results show that in the scenario of non-saline soil, the layer thickness parameters are not as well estimated as the layer electrical conductivities, because layer thickness in the model exhibits a low sensitivity to the EMI measurements and is hence difficult to resolve. Application of the proposed MCMC-based inversion to the field measurements in a drip irrigation system demonstrates that the parameters of the model can be well estimated for the saline soil as compared to the non-saline soil, and provides useful insight about parameter uncertainty for the assessment of the model outputs.
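A minimal adaptive random-walk Metropolis sampler, which tunes its proposal scale toward a target acceptance rate during burn-in, is sketched below. The two-parameter toy posterior is a stand-in; it is not the EMI forward model based on Maxwell's equations or the CMD mini-Explorer configuration used in the study.

```python
import numpy as np

rng = np.random.default_rng(7)

def adaptive_metropolis(log_post, x0, n_samples=20_000, target_accept=0.3):
    """Random-walk Metropolis with simple step-size adaptation during burn-in."""
    x, step = np.array(x0, float), 0.5
    dim = x.size
    chain = np.empty((n_samples, dim))
    accepted = 0
    for k in range(n_samples):
        prop = x + step * rng.normal(size=dim)
        if np.log(rng.random()) < log_post(prop) - log_post(x):
            x, accepted = prop, accepted + 1
        chain[k] = x
        if k < n_samples // 2 and (k + 1) % 100 == 0:     # adapt only during burn-in
            rate = accepted / (k + 1)
            step *= np.exp(0.1 * (rate - target_accept))  # grow/shrink proposal scale
    return chain

# Toy posterior: two correlated parameters (illustrative stand-in only).
cov_inv = np.linalg.inv(np.array([[1.0, 0.6], [0.6, 1.0]]))
log_post = lambda th: -0.5 * th @ cov_inv @ th
chain = adaptive_metropolis(log_post, [3.0, -3.0])
print(chain[10_000:].mean(axis=0))
print(np.cov(chain[10_000:].T))
```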
Advanced computers and Monte Carlo
International Nuclear Information System (INIS)
Jordan, T.L.
1979-01-01
High-performance parallelism that is currently available is synchronous in nature. It is manifested in such architectures as the Burroughs ILLIAC-IV, CDC STAR-100, TI ASC, CRI CRAY-1, ICL DAP, and many special-purpose array processors designed for signal processing. This form of parallelism has apparently not been of significant value to many important Monte Carlo calculations. Nevertheless, there is much asynchronous parallelism in many of these calculations. A model of a production code that requires up to 20 hours per problem on a CDC 7600 is studied for suitability on some asynchronous architectures that are on the drawing board. The code is described, and some of its properties and resource requirements are identified for comparison with the corresponding properties and resources of some asynchronous multiprocessor architectures. Arguments are made for programmer aids and special syntax to identify and support important asynchronous parallelism. 2 figures, 5 tables
11th International Conference on Monte Carlo and Quasi-Monte Carlo Methods in Scientific Computing
Nuyens, Dirk
2016-01-01
This book presents the refereed proceedings of the Eleventh International Conference on Monte Carlo and Quasi-Monte Carlo Methods in Scientific Computing that was held at the University of Leuven (Belgium) in April 2014. These biennial conferences are major events for Monte Carlo and quasi-Monte Carlo researchers. The proceedings include articles based on invited lectures as well as carefully selected contributed papers on all theoretical aspects and applications of Monte Carlo and quasi-Monte Carlo methods. Offering information on the latest developments in these very active areas, this book is an excellent reference resource for theoreticians and practitioners interested in solving high-dimensional computational problems, arising, in particular, in finance, statistics and computer graphics.
Monte Carlo simulations for plasma physics
International Nuclear Information System (INIS)
Okamoto, M.; Murakami, S.; Nakajima, N.; Wang, W.X.
2000-07-01
Plasma behaviours are very complicated and the analyses are generally difficult. However, when collisional processes play an important role in the plasma behaviour, the Monte Carlo method is often employed as a useful tool. For example, in neutral beam injection (NBI) heating, electron or ion cyclotron heating, and alpha heating, Coulomb collisions slow down highly energetic particles and pitch-angle scatter them. These processes are often studied by the Monte Carlo technique, and good agreement can be obtained with the experimental results. Recently, the Monte Carlo method has been developed to study fast particle transport associated with heating and the generation of the radial electric field. Furthermore, it is applied to investigating neoclassical transport in plasmas with steep density and temperature gradients, which is beyond the conventional neoclassical theory. In this report, we briefly summarize the research done by the present authors utilizing the Monte Carlo method. (author)
Hybrid Monte Carlo methods in computational finance
Leitao Rodriguez, A.
2017-01-01
Monte Carlo methods are highly appreciated and intensively employed in computational finance in the context of financial derivatives valuation or risk management. The method offers valuable advantages like flexibility, easy interpretation and straightforward implementation. Furthermore, the
Simulation and the Monte Carlo method
Rubinstein, Reuven Y
2016-01-01
Simulation and the Monte Carlo Method, Third Edition reflects the latest developments in the field and presents a fully updated and comprehensive account of the major topics that have emerged in Monte Carlo simulation since the publication of the classic First Edition more than a quarter of a century ago. While maintaining its accessible and intuitive approach, this revised edition features a wealth of up-to-date information that facilitates a deeper understanding of problem solving across a wide array of subject areas, such as engineering, statistics, computer science, mathematics, and the physical and life sciences. The book begins with a modernized introduction that addresses the basic concepts of probability, Markov processes, and convex optimization. Subsequent chapters discuss the dramatic changes that have occurred in the field of the Monte Carlo method, with coverage of many modern topics including: Markov Chain Monte Carlo, variance reduction techniques such as the transform likelihood ratio...
"Shaakal" Carlos kaebas arreteerija kohtusse / Margo Pajuste
Pajuste, Margo
2006-01-01
Also published in: Postimees : na russkom jazõke, 3 July, p. 11. The imprisoned notorious terrorist Carlos "the Jackal" has brought a court case against his one-time arrester. He accuses the former head of the French intelligence service of kidnapping
Monte Carlo methods for particle transport
Haghighat, Alireza
2015-01-01
The Monte Carlo method has become the de facto standard in radiation transport. Although powerful, if not understood and used appropriately, the method can give misleading results. Monte Carlo Methods for Particle Transport teaches appropriate use of the Monte Carlo method, explaining the method's fundamental concepts as well as its limitations. Concise yet comprehensive, this well-organized text: * Introduces the particle importance equation and its use for variance reduction * Describes general and particle-transport-specific variance reduction techniques * Presents particle transport eigenvalue issues and methodologies to address these issues * Explores advanced formulations based on the author's research activities * Discusses parallel processing concepts and factors affecting parallel performance Featuring illustrative examples, mathematical derivations, computer algorithms, and homework problems, Monte Carlo Methods for Particle Transport provides nuclear engineers and scientists with a practical guide ...
Monte Carlo code development in Los Alamos
International Nuclear Information System (INIS)
Carter, L.L.; Cashwell, E.D.; Everett, C.J.; Forest, C.A.; Schrandt, R.G.; Taylor, W.M.; Thompson, W.L.; Turner, G.D.
1974-01-01
The present status of Monte Carlo code development at Los Alamos Scientific Laboratory is discussed. A brief summary is given of several of the most important neutron, photon, and electron transport codes. 17 references. (U.S.)
Quantum Monte Carlo approaches for correlated systems
Becca, Federico
2017-01-01
Over the past several decades, computational approaches to studying strongly interacting systems have become increasingly varied and sophisticated. This book provides a comprehensive introduction to state-of-the-art quantum Monte Carlo techniques relevant for applications in correlated systems. It provides a clear overview of variational wave functions and features a detailed presentation of stochastic sampling, including Markov chains and Langevin dynamics, which is developed into a discussion of Monte Carlo methods. The variational technique is described, from foundations to a detailed description of its algorithms. Further topics discussed include optimisation techniques, real-time dynamics and projection methods, including Green's function, reptation and auxiliary-field Monte Carlo, from basic definitions to advanced algorithms for efficient codes, and the book concludes with recent developments on the continuum space. Quantum Monte Carlo Approaches for Correlated Systems provides an extensive reference ...
Monte Carlo Algorithms for Linear Problems
Dimov, Ivan
2000-01-01
MSC Subject Classification: 65C05, 65U05. Monte Carlo methods are a powerful tool in many fields of mathematics, physics and engineering. It is known that these methods give statistical estimates for the functional of the solution by performing random sampling of a certain chance variable whose mathematical expectation is the desired functional. Monte Carlo methods are methods for solving problems using random variables. In the book [16] edited by Yu. A. Shreider one can find the followin...
Multilevel Monte Carlo in Approximate Bayesian Computation
Jasra, Ajay
2017-02-13
In the following article we consider approximate Bayesian computation (ABC) inference. We introduce a method for numerically approximating ABC posteriors using the multilevel Monte Carlo (MLMC). A sequential Monte Carlo version of the approach is developed and it is shown under some assumptions that for a given level of mean square error, this method for ABC has a lower cost than i.i.d. sampling from the most accurate ABC approximation. Several numerical examples are given.
Construction of the quantitative analysis environment using Monte Carlo simulation
International Nuclear Information System (INIS)
Shirakawa, Seiji; Ushiroda, Tomoya; Hashimoto, Hiroshi; Tadokoro, Masanori; Uno, Masaki; Tsujimoto, Masakazu; Ishiguro, Masanobu; Toyama, Hiroshi
2013-01-01
A thoracic phantom image of the axial section was acquired to construct source and density maps for Monte Carlo (MC) simulation. The phantom was a Heart/Liver Type HL (Kyoto Kagaku Co., Ltd.); the single photon emission CT (SPECT)/CT machine was a Symbia T6 (Siemens) with the LMEGP (low-medium energy general purpose) collimator. The maps were constructed from CT images with in-house software written in Visual Studio C Sharp (Microsoft). The SIMIND (simulation of imaging nuclear detectors) code was used for MC simulation, the Prominence processor (Nihon Medi-Physics) for filter processing and image reconstruction, and a DELL Precision T7400 environment for all image processing. For the actual experiment, the phantom was given 15 MBq of 99mTc in its myocardial portion, assuming 2% uptake at a dose of 740 MBq, and the SPECT image was acquired and reconstructed with a Butterworth filter and the filtered back projection method. CT images were similarly obtained in 0.3 mm thick slices, compiled into a single file formatted in Digital Imaging and Communications in Medicine (DICOM), and then processed for application to SIMIND for mapping the source and density. Physical and measurement factors such as attenuation, scattering, spatial resolution deterioration and statistical fluctuation were examined in ideal images by sequential exclusion and simulation of those factors. The gamma energy spectrum, SPECT projections and reconstructed images given by the simulation were found to agree well with the actual data, and the precision of the MC simulation was confirmed. Physical and measurement factors were found to be evaluable individually, suggesting the usefulness of the simulation for assessing the precision of their correction. (T.T.)
Treatment planning for a small animal using Monte Carlo simulation
International Nuclear Information System (INIS)
Chow, James C. L.; Leung, Michael K. K.
2007-01-01
The development of a small animal model for radiotherapy research requires a complete setup of customized imaging equipment, irradiators, and planning software that matches the sizes of the subjects. The purpose of this study is to develop and demonstrate the use of a flexible in-house research environment for treatment planning on small animals. The software package, called DOSCTP, provides a user-friendly platform for DICOM computed tomography-based Monte Carlo dose calculation using the EGSnrcMP-based DOSXYZnrc code. Validation of the treatment planning was performed by comparing the dose distributions for simple photon beam geometries calculated through the Pinnacle3 treatment planning system and measurements. A treatment plan for a mouse based on a CT image set by a 360-deg photon arc is demonstrated. It is shown that it is possible to create 3D conformal treatment plans for small animals with consideration of inhomogeneities using small photon beam field sizes in the diameter range of 0.5-5 cm, with conformal dose covering the target volume while sparing the surrounding critical tissue. It is also found that Monte Carlo simulation is suitable to carry out treatment planning dose calculation for small animal anatomy with voxel size about one order of magnitude smaller than that of the human
Therapeutic Applications of Monte Carlo Calculations in Nuclear Medicine
Sgouros, George
2003-01-01
This book examines the applications of Monte Carlo (MC) calculations in therapeutic nuclear medicine, from basic principles to computer implementations of software packages and their applications in radiation dosimetry and treatment planning. It is written for nuclear medicine physicists and physicians as well as radiation oncologists, and can serve as a supplementary text for medical imaging, radiation dosimetry and nuclear engineering graduate courses in science, medical and engineering faculties. With chapters written by recognised authorities in their particular fields, the book covers the entire range of MC applications in therapeutic medical and health physics, from its use in imaging prior to therapy to dose distribution modelling for targeted radiotherapy. The contributions discuss the fundamental concepts of radiation dosimetry, radiobiological aspects of targeted radionuclide therapy and the various components and steps required for implementing a dose calculation and treatment planning methodology in ...
MONTE CARLO SIMULATION OF MULTIFOCAL STOCHASTIC SCANNING SYSTEM
Directory of Open Access Journals (Sweden)
LIXIN LIU
2014-01-01
Full Text Available Multifocal multiphoton microscopy (MMM has greatly improved the utilization of excitation light and imaging speed due to parallel multiphoton excitation of the samples and simultaneous detection of the signals, which allows it to perform three-dimensional fast fluorescence imaging. Stochastic scanning can provide continuous, uniform and high-speed excitation of the sample, which makes it a suitable scanning scheme for MMM. In this paper, the graphical programming language — LabVIEW is used to achieve stochastic scanning of the two-dimensional galvo scanners by using white noise signals to control the x and y mirrors independently. Moreover, the stochastic scanning process is simulated by using Monte Carlo method. Our results show that MMM can avoid oversampling or subsampling in the scanning area and meet the requirements of uniform sampling by stochastically scanning the individual units of the N × N foci array. Therefore, continuous and uniform scanning in the whole field of view is implemented.
Bayesian statistics and Monte Carlo methods
Koch, K. R.
2018-03-01
The Bayesian approach allows an intuitive way to derive the methods of statistics. Probability is defined as a measure of the plausibility of statements or propositions. Three rules are sufficient to obtain the laws of probability. If the statements refer to the numerical values of variables, the so-called random variables, univariate and multivariate distributions follow. They lead to point estimation, by which unknown quantities, i.e. unknown parameters, are computed from measurements. The unknown parameters are random variables; in traditional statistics, which is not founded on Bayes' theorem, they are fixed quantities. Bayesian statistics therefore recommends itself for Monte Carlo methods, which generate random variates from given distributions. Monte Carlo methods, of course, can also be applied in traditional statistics. The unknown parameters are introduced as functions of the measurements, and the Monte Carlo methods give the covariance matrix and the expectation of these functions. A confidence region is derived where the unknown parameters are situated with a given probability. Following a method of traditional statistics, hypotheses are tested by determining whether a value for an unknown parameter lies inside or outside the confidence region. The error propagation of a random vector by the Monte Carlo methods is presented as an application. If the random vector results from a nonlinearly transformed vector, its covariance matrix and its expectation follow from the Monte Carlo estimate. This saves a considerable number of derivatives to be computed, and errors of linearization are avoided. The Monte Carlo method is therefore efficient. If the functions of the measurements are given by a sum of two or more random vectors with different multivariate distributions, the resulting distribution is generally not known. The Monte Carlo methods are then needed to obtain the covariance matrix and the expectation of the sum.
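The error-propagation application mentioned at the end of the abstract amounts to drawing samples of the input vector, pushing them through the nonlinear transformation, and taking the sample expectation and covariance of the output. The transformation below (Cartesian to polar) and the input moments are arbitrary placeholders.

```python
import numpy as np

rng = np.random.default_rng(3)

def mc_error_propagation(transform, mean, cov, n_samples=100_000):
    """Monte Carlo propagation of a multivariate normal input through a nonlinear
    transformation; returns the sample mean and covariance of the output."""
    x = rng.multivariate_normal(mean, cov, size=n_samples)
    y = transform(x)
    return y.mean(axis=0), np.cov(y, rowvar=False)

# Placeholder nonlinear transformation of a 2-vector: Cartesian to polar coordinates.
def transform(x):
    r = np.hypot(x[:, 0], x[:, 1])
    phi = np.arctan2(x[:, 1], x[:, 0])
    return np.column_stack([r, phi])

mean_y, cov_y = mc_error_propagation(transform, mean=[10.0, 5.0],
                                     cov=[[0.04, 0.01], [0.01, 0.09]])
print(mean_y)
print(cov_y)
```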
The Monte Carlo code MCBEND - where it is and where it's going
International Nuclear Information System (INIS)
Chukas, S.J.; Miller, P.C.; Power, S.W.
1990-05-01
The Monte Carlo method forms a cornerstone of the calculational procedures established in the UK for shielding design and assessment. The emphasis of the work in the shielding area is centred on the Monte Carlo code MCBEND. The work programme in support of the code is broadly directed towards utilisation of new hardware, the development of improved modelling algorithms, the development of new acceleration methods for specific applications and enhancements to the user image. This paper summarises the current status of MCBEND and reviews developments carried out over the past two years and planned for the future. (author)
Characterization of parallel-hole collimator using Monte Carlo Simulation
International Nuclear Information System (INIS)
Pandey, Anil Kumar; Sharma, Sanjay Kumar; Karunanithi, Sellam; Kumar, Praveen; Bal, Chandrasekhar; Kumar, Rakesh
2015-01-01
Accuracy of in vivo activity quantification improves after correction for penetrated and scattered photons. However, accurate assessment is not possible with physical experiments, so we have used Monte Carlo simulation to accurately assess the contribution of penetrated and scattered photons in the photopeak window. Simulations were performed with the Simulation of Imaging Nuclear Detectors Monte Carlo code. The simulations were set up so as to provide geometric, penetration, and scatter components after each simulation and to write binary images to a data file. These components were analyzed graphically using Microsoft Excel (Microsoft Corporation, USA). Each binary image was imported into ImageJ software and a logarithmic transformation was applied for visual assessment of image quality, plotting of profiles across the centers of the images, and calculation of the full width at half maximum (FWHM) in the horizontal and vertical directions. The geometric, penetration, and scatter components at 140 keV for the low-energy general-purpose (LEGP) collimator were 93.20%, 4.13%, and 2.67%, respectively. Similarly, the components at 140 keV for the low-energy high-resolution (LEHR), medium-energy general-purpose (MEGP), and high-energy general-purpose (HEGP) collimators were (94.06%, 3.39%, 2.55%), (96.42%, 1.52%, 2.06%), and (96.70%, 1.45%, 1.85%), respectively. For the MEGP collimator at 245 keV and the HEGP collimator at 364 keV they were 89.10%, 7.08%, 3.82% and 67.78%, 18.63%, 13.59%, respectively. The LEGP and LEHR collimators are best for imaging 140 keV photons. The HEGP collimator can be used for 245 keV and 364 keV; however, corrections for penetration and scatter must be applied if one is interested in quantifying the in vivo activity of 364 keV emitters. Due to heavy penetration and scattering, 511 keV photons should not be imaged with the HEGP collimator.
Successful vectorization - reactor physics Monte Carlo code
International Nuclear Information System (INIS)
Martin, W.R.
1989-01-01
Most particle transport Monte Carlo codes in use today are based on the ''history-based'' algorithm, wherein one particle history at a time is simulated. Unfortunately, the ''history-based'' approach (present in all Monte Carlo codes until recent years) is inherently scalar and cannot be vectorized. In particular, the history-based algorithm cannot take advantage of vector architectures, which characterize the largest and fastest computers at the current time, vector supercomputers such as the Cray X/MP or IBM 3090/600. However, substantial progress has been made in recent years in developing and implementing a vectorized Monte Carlo algorithm. This algorithm follows portions of many particle histories at the same time and forms the basis for all successful vectorized Monte Carlo codes that are in use today. This paper describes the basic vectorized algorithm along with descriptions of several variations that have been developed by different researchers for specific applications. These applications have been mainly in the areas of neutron transport in nuclear reactor and shielding analysis and photon transport in fusion plasmas. The relative merits of the various approach schemes will be discussed and the present status of known vectorization efforts will be summarized along with available timing results, including results from the successful vectorization of 3-D general geometry, continuous energy Monte Carlo. (orig.)
Optimization of reconstruction algorithms using Monte Carlo simulation
International Nuclear Information System (INIS)
Hanson, K.M.
1989-01-01
A method for optimizing reconstruction algorithms is presented that is based on how well a specified task can be performed using the reconstructed images. Task performance is numerically assessed by a Monte Carlo simulation of the complete imaging process including the generation of scenes appropriate to the desired application, subsequent data taking, reconstruction, and performance of the stated task based on the final image. The use of this method is demonstrated through the optimization of the Algebraic Reconstruction Technique (ART), which reconstructs images from their projections by an iterative procedure. The optimization is accomplished by varying the relaxation factor employed in the updating procedure. In some of the imaging situations studied, it is found that the optimization of constrained ART, in which a nonnegativity constraint is invoked, can vastly increase the detectability of objects. There is little improvement attained for unconstrained ART. The general method presented may be applied to the problem of designing neutron-diffraction spectrometers. 11 refs., 6 figs., 2 tabs
Monte Carlo simulation of Markov unreliability models
International Nuclear Information System (INIS)
Lewis, E.E.; Boehm, F.
1984-01-01
A Monte Carlo method is formulated for the evaluation of the unreliability of complex systems with known component failure and repair rates. The formulation is in terms of a Markov process, allowing dependences between components to be modeled and computational efficiencies to be achieved in the Monte Carlo simulation. Two variance reduction techniques, forced transition and failure biasing, are employed to increase the computational efficiency of the random walk procedure. For an example problem these result in improved computational efficiency by more than three orders of magnitude over analog Monte Carlo. The method is generalized to treat problems with distributed failure and repair rate data, and a batching technique is introduced and shown to result in substantial increases in computational efficiency for an example problem. A method for separating the variance due to the data uncertainty from that due to the finite number of random walks is presented. (orig.)
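For orientation, the sketch below is an analog (unbiased, no variance reduction) version of the random walk described above, for a two-component parallel system with constant failure and repair rates. Forced transitions and failure biasing would modify how holding times and transition types are sampled and are deliberately omitted; all rates and the mission time are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(5)

def unreliability(lam, mu, mission_time, n_histories=50_000):
    """Analog Monte Carlo estimate of the probability that a two-component parallel
    system has both components down simultaneously before mission_time.
    lam, mu : per-component failure and repair rates (1/h)."""
    failures = 0
    for _ in range(n_histories):
        t, up = 0.0, np.array([True, True])
        while t < mission_time:
            rates = np.where(up, lam, mu)            # failure if up, repair if down
            total = rates.sum()
            t += rng.exponential(1.0 / total)        # holding time in current state
            if t >= mission_time:
                break
            comp = rng.choice(2, p=rates / total)    # which component transitions
            up[comp] = ~up[comp]
            if not up.any():                         # system failure state reached
                failures += 1
                break
    return failures / n_histories

print(unreliability(lam=1e-3, mu=1e-1, mission_time=1000.0))
```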
Adiabatic optimization versus diffusion Monte Carlo methods
Jarret, Michael; Jordan, Stephen P.; Lackey, Brad
2016-10-01
Most experimental and theoretical studies of adiabatic optimization use stoquastic Hamiltonians, whose ground states are expressible using only real nonnegative amplitudes. This raises a question as to whether classical Monte Carlo methods can simulate stoquastic adiabatic algorithms with polynomial overhead. Here we analyze diffusion Monte Carlo algorithms. We argue that, based on differences between L1 and L2 normalized states, these algorithms suffer from certain obstructions preventing them from efficiently simulating stoquastic adiabatic evolution in generality. In practice however, we obtain good performance by introducing a method that we call Substochastic Monte Carlo. In fact, our simulations are good classical optimization algorithms in their own right, competitive with the best previously known heuristic solvers for MAX-k-SAT at k = 2, 3, 4.
Shell model the Monte Carlo way
International Nuclear Information System (INIS)
Ormand, W.E.
1995-01-01
The formalism for the auxiliary-field Monte Carlo approach to the nuclear shell model is presented. The method is based on a linearization of the two-body part of the Hamiltonian in an imaginary-time propagator using the Hubbard-Stratonovich transformation. The foundation of the method, as applied to the nuclear many-body problem, is discussed. Topics presented in detail include: (1) the density-density formulation of the method, (2) computation of the overlaps, (3) the sign of the Monte Carlo weight function, (4) techniques for performing Monte Carlo sampling, and (5) the reconstruction of response functions from an imaginary-time auto-correlation function using MaxEnt techniques. Results obtained using schematic interactions, which have no sign problem, are presented to demonstrate the feasibility of the method, while an extrapolation method for realistic Hamiltonians is presented. In addition, applications at finite temperature are outlined
Off-diagonal expansion quantum Monte Carlo.
Albash, Tameem; Wagenbreth, Gene; Hen, Itay
2017-12-01
We propose a Monte Carlo algorithm designed to simulate quantum as well as classical systems at equilibrium, bridging the algorithmic gap between quantum and classical thermal simulation algorithms. The method is based on a decomposition of the quantum partition function that can be viewed as a series expansion about its classical part. We argue that the algorithm not only provides a theoretical advancement in the field of quantum Monte Carlo simulations, but is optimally suited to tackle quantum many-body systems that exhibit a range of behaviors from "fully quantum" to "fully classical," in contrast to many existing methods. We demonstrate the advantages, sometimes by orders of magnitude, of the technique by comparing it against existing state-of-the-art schemes such as path integral quantum Monte Carlo and stochastic series expansion. We also illustrate how our method allows for the unification of quantum and classical thermal parallel tempering techniques into a single algorithm and discuss its practical significance.
Random Numbers and Monte Carlo Methods
Scherer, Philipp O. J.
Many-body problems often involve the calculation of integrals of very high dimension which cannot be treated by standard methods. For the calculation of thermodynamic averages, Monte Carlo methods, which sample the integration volume at randomly chosen points, are very useful. After summarizing some basic statistics, we discuss algorithms for the generation of pseudo-random numbers with a given probability distribution, which are essential for all Monte Carlo methods. We show how the efficiency of Monte Carlo integration can be improved by sampling preferentially the important configurations. Finally, the famous Metropolis algorithm is applied to classical many-particle systems. Computer experiments visualize the central limit theorem and apply the Metropolis method to the traveling salesman problem.
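A minimal sketch of the importance-sampling idea mentioned above, assuming a simple one-dimensional integrand and a hand-picked sampling density (nothing here is taken from the book itself):

```python
# Importance-sampled Monte Carlo integration of I = int_0^1 exp(-x)/(1+x^2) dx,
# compared with plain uniform sampling.  The sampling density p(x) is a guess
# chosen for illustration, not a prescription from the chapter.
import math
import random

def plain_mc(f, n):
    # Uniform samples on [0, 1); the estimator is the sample mean of f(x).
    return sum(f(random.random()) for _ in range(n)) / n

def importance_mc(f, n):
    # Sample preferentially where the integrand is large, here from
    # p(x) = exp(-x) / (1 - exp(-1)) on [0, 1) via inverse-transform sampling,
    # and weight every sample by f(x) / p(x).
    norm = 1.0 - math.exp(-1.0)
    total = 0.0
    for _ in range(n):
        x = -math.log(1.0 - random.random() * norm)   # inverse CDF of p
        total += f(x) / (math.exp(-x) / norm)
    return total / n

f = lambda x: math.exp(-x) / (1.0 + x * x)
print(plain_mc(f, 100_000), importance_mc(f, 100_000))
```

Because f/p varies less over the integration interval than f itself, the weighted estimator has a smaller variance at the same sample size.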
Monte Carlo strategies in scientific computing
Liu, Jun S
2008-01-01
This paperback edition is a reprint of the 2001 Springer edition. This book provides a self-contained and up-to-date treatment of the Monte Carlo method and develops a common framework under which various Monte Carlo techniques can be "standardized" and compared. Given the interdisciplinary nature of the topics and a moderate prerequisite for the reader, this book should be of interest to a broad audience of quantitative researchers such as computational biologists, computer scientists, econometricians, engineers, probabilists, and statisticians. It can also be used as the textbook for a graduate-level course on Monte Carlo methods. Many problems discussed in the later chapters can be potential thesis topics for masters' or PhD students in statistics or computer science departments. Jun Liu is Professor of Statistics at Harvard University, with a courtesy Professor appointment at Harvard Biostatistics Department. Professor Liu was the recipient of the 2002 COPSS Presidents' Award, the most prestigious one for sta...
Simulation of transport equations with Monte Carlo
International Nuclear Information System (INIS)
Matthes, W.
1975-09-01
The main purpose of the report is to explain the relation between the transport equation and the Monte Carlo game used for its solution. The introduction of artificial particles carrying a weight provides one with high flexibility in constructing many different games for the solution of the same equation. This flexibility opens a way to construct a Monte Carlo game for the solution of the adjoint transport equation. Emphasis is laid mostly on giving a clear understanding of what to do and not on the details of how to do a specific game
Self-learning Monte Carlo (dynamical biasing)
International Nuclear Information System (INIS)
Matthes, W.
1981-01-01
In many applications the histories of a normal Monte Carlo game rarely reach the target region. An approximate knowledge of the importance (with respect to the target) may be used to guide the particles more frequently into the target region. A Monte Carlo method is presented in which each history contributes to update the importance field such that eventually most target histories are sampled. It is a self-learning method in the sense that the procedure itself: (a) learns which histories are important (reach the target) and increases their probability; (b) reduces the probabilities of unimportant histories; (c) concentrates gradually on the more important target histories. (U.K.)
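As an illustration only (the abstract gives no algorithmic details), here is a toy self-learning game on a one-dimensional random walk: an importance table, updated from histories that reach the target, biases the step choice, while the statistical weight keeps the target estimate unbiased. The problem, the update rule and all constants are assumptions for this sketch, not the scheme of the paper:

```python
# Toy self-learning biasing for a symmetric random walk that starts at site 1
# and ends when it hits site 0 ("lost") or site N ("target").  The analog
# answer for the target probability is 1/N.
import random

N = 12
imp = [1.0] * (N + 1)          # learned importance of each site, initially flat

def one_history():
    site, weight, visited = 1, 1.0, [1]
    while 0 < site < N:
        p_up = imp[site + 1] / (imp[site + 1] + imp[site - 1])   # biased step choice
        if random.random() < p_up:
            weight *= 0.5 / p_up              # analog probability / biased probability
            site += 1
        else:
            weight *= 0.5 / (1.0 - p_up)
            site -= 1
        visited.append(site)
    return (weight if site == N else 0.0), visited

total, n_hist = 0.0, 20_000
for _ in range(n_hist):
    score, visited = one_history()
    total += score
    if score > 0.0:                           # the history reached the target:
        for s in set(visited):                # raise the importance of its sites
            if 0 < s < N:
                imp[s] *= 1.05
print("P(target) ~", total / n_hist, "(analog value:", 1.0 / N, ")")
```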
Monte Carlo electron/photon transport
International Nuclear Information System (INIS)
Mack, J.M.; Morel, J.E.; Hughes, H.G.
1985-01-01
A review of nonplasma coupled electron/photon transport using the Monte Carlo method is presented. Remarks are mainly restricted to linearized formalisms at electron energies from 1 keV to 1000 MeV. Applications involving pulse-height estimation, transport in external magnetic fields, and optical Cerenkov production are discussed to underscore the importance of this branch of computational physics. Advances in electron multigroup cross-section generation are reported, and their impact on future code development is assessed. Progress toward the transformation of MCNP into a generalized neutral/charged-particle Monte Carlo code is described. 48 refs
A keff calculation method by Monte Carlo
International Nuclear Information System (INIS)
Shen, H; Wang, K.
2008-01-01
The effective multiplication factor (keff) is defined as the ratio between the numbers of neutrons in successive generations, a definition adopted by most Monte Carlo codes (e.g. MCNP). Alternatively, it can be thought of as the ratio of the neutron generation rate to the sum of the leakage rate and the absorption rate, where the latter should exclude the effect of neutron reactions such as (n,2n) and (n,3n). This article discusses the Monte Carlo method for keff calculation based on the second definition. A new code has been developed and the results are presented. (author)
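A hedged toy sketch of the second definition, using a one-group homogeneous point model with invented constants (this is not the authors' code, only an illustration of the tally keff = production / (leakage + absorption)):

```python
# One-group "point reactor" toy: each source neutron is followed flight by flight
# until it leaks or is absorbed, and keff is estimated as
# neutron production rate / (leakage rate + absorption rate).
import random

P_LEAK = 0.08                          # probability of leaking on any given flight (made up)
SIG_S, SIG_A, SIG_F = 0.6, 0.4, 0.15   # scattering, absorption, fission cross sections (made up)
NU = 2.43                              # mean neutrons emitted per fission

def estimate_keff(n_source):
    production = leakage = absorption = 0.0
    for _ in range(n_source):
        while True:
            if random.random() < P_LEAK:                      # escaped the system
                leakage += 1
                break
            if random.random() < SIG_A / (SIG_S + SIG_A):     # collision ended in absorption
                absorption += 1
                if random.random() < SIG_F / SIG_A:           # the absorption was a fission
                    production += NU                          # expected progeny
                break
            # otherwise the neutron scattered; keep following it
    return production / (leakage + absorption)

print("keff estimate:", estimate_keff(200_000))
```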
Monte Carlo Treatment Planning for Advanced Radiotherapy
DEFF Research Database (Denmark)
Cronholm, Rickard
This Ph.D. project describes the development of a workflow for Monte Carlo Treatment Planning for clinical radiotherapy plans. The workflow may be utilized to perform an independent dose verification of treatment plans. Modern radiotherapy treatment delivery is often conducted by dynamically modulating the intensity of the field during the irradiation. The workflow described has the potential to fully model the dynamic delivery, including gantry rotation during irradiation, of modern radiotherapy. Three cornerstones of Monte Carlo Treatment Planning are identified: Building, commissioning...
Monte Carlo dose distributions for radiosurgery
International Nuclear Information System (INIS)
Perucha, M.; Leal, A.; Rincon, M.; Carrasco, E.
2001-01-01
The precision of Radiosurgery Treatment planning systems is limited by the approximations of their algorithms and by their dosimetric input data. This fact is especially important in small fields. However, the Monte Carlo method is an accurate alternative, as it considers every aspect of particle transport. In this work an acoustic neurinoma is studied by comparing the dose distributions of a planning system and of Monte Carlo. Relative shifts have been measured and, furthermore, Dose-Volume Histograms have been calculated for the target and adjacent organs at risk. (orig.)
Monte Carlo applications to radiation shielding problems
International Nuclear Information System (INIS)
Subbaiah, K.V.
2009-01-01
Monte Carlo methods are a class of computational algorithms that rely on repeated random sampling of physical and mathematical systems to compute their results. However, basic concepts of MC are both simple and straightforward and can be learned by using a personal computer. Uses of Monte Carlo methods require large amounts of random numbers, and it was their use that spurred the development of pseudorandom number generators, which were far quicker to use than the tables of random numbers which had been previously used for statistical sampling. In Monte Carlo simulation of radiation transport, the history (track) of a particle is viewed as a random sequence of free flights that end with an interaction event where the particle changes its direction of movement, loses energy and, occasionally, produces secondary particles. The Monte Carlo simulation of a given experimental arrangement (e.g., an electron beam, coming from an accelerator and impinging on a water phantom) consists of the numerical generation of random histories. To simulate these histories we need an interaction model, i.e., a set of differential cross sections (DCS) for the relevant interaction mechanisms. The DCSs determine the probability distribution functions (pdf) of the random variables that characterize a track; 1) free path between successive interaction events, 2) type of interaction taking place and 3) energy loss and angular deflection in a particular event (and initial state of emitted secondary particles, if any). Once these pdfs are known, random histories can be generated by using appropriate sampling methods. If the number of generated histories is large enough, quantitative information on the transport process may be obtained by simply averaging over the simulated histories. The Monte Carlo method yields the same information as the solution of the Boltzmann transport equation, with the same interaction model, but is easier to implement. In particular, the simulation of radiation
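A minimal sketch of the two sampling steps listed above (the free path drawn from an exponential pdf, and the interaction type drawn from the relative cross sections) for a one-speed particle in a homogeneous slab. The cross sections, slab thickness and 1-D scattering model are assumptions for illustration, not part of the lecture:

```python
# Follow particles entering a homogeneous slab at x = 0 along +x and classify
# each history as transmitted, reflected, or absorbed.
import math
import random
from collections import Counter

SIGMA_ABS, SIGMA_SCAT = 0.2, 0.8           # per-cm macroscopic cross sections (made up)
SIGMA_TOT = SIGMA_ABS + SIGMA_SCAT
SLAB = 5.0                                  # slab thickness in cm

def history():
    x, mu = 0.0, 1.0                        # position and direction cosine
    while True:
        s = -math.log(1.0 - random.random()) / SIGMA_TOT   # free path ~ exponential pdf
        x += mu * s
        if x >= SLAB:
            return "transmit"
        if x <= 0.0:
            return "reflect"
        if random.random() < SIGMA_ABS / SIGMA_TOT:        # choose the interaction type
            return "absorb"
        mu = 2.0 * random.random() - 1.0                   # isotropic scattering: uniform cosine

print(Counter(history() for _ in range(100_000)))
```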
Nonlinear Spatial Inversion Without Monte Carlo Sampling
Curtis, A.; Nawaz, A.
2017-12-01
High-dimensional, nonlinear inverse or inference problems usually have non-unique solutions. The distribution of solutions is described by probability distributions, and these are usually found using Monte Carlo (MC) sampling methods. These take pseudo-random samples of models in parameter space, calculate the probability of each sample given available data and other information, and thus map out high or low probability values of model parameters. However, such methods would converge to the solution only as the number of samples tends to infinity; in practice, MC is found to be slow to converge, convergence is not guaranteed to be achieved in finite time, and detection of convergence requires the use of subjective criteria. We propose a method for Bayesian inversion of categorical variables such as geological facies or rock types in spatial problems, which requires no sampling at all. The method uses a 2-D Hidden Markov Model over a grid of cells, where observations represent localized data constraining the model in each cell. The data in our example application are seismic properties such as P- and S-wave impedances or rock density; our model parameters are the hidden states and represent the geological rock types in each cell. The observations at each location are assumed to depend on the facies at that location only - an assumption referred to as 'localized likelihoods'. However, the facies at a location cannot be determined solely by the observation at that location as it also depends on prior information concerning its correlation with the spatial distribution of facies elsewhere. Such prior information is included in the inversion in the form of a training image which represents a conceptual depiction of the distribution of local geologies that might be expected, but other forms of prior information can be used in the method as desired. The method provides direct (pseudo-analytic) estimates of posterior marginal probability distributions over each variable
Range uncertainties in proton therapy and the role of Monte Carlo simulations
International Nuclear Information System (INIS)
Paganetti, Harald
2012-01-01
The main advantages of proton therapy are the reduced total energy deposited in the patient as compared to photon techniques and the finite range of the proton beam. The latter adds an additional degree of freedom to treatment planning. The range in tissue is associated with considerable uncertainties caused by imaging, patient setup, beam delivery and dose calculation. Reducing the uncertainties would allow a reduction of the treatment volume and thus allow a better utilization of the advantages of protons. This paper summarizes the role of Monte Carlo simulations when aiming at a reduction of range uncertainties in proton therapy. Differences in dose calculation when comparing Monte Carlo with analytical algorithms are analyzed as well as range uncertainties due to material constants and CT conversion. Range uncertainties due to biological effects and the role of Monte Carlo for in vivo range verification are discussed. Furthermore, the current range uncertainty recipes used at several proton therapy facilities are revisited. We conclude that a significant impact of Monte Carlo dose calculation can be expected in complex geometries where local range uncertainties due to multiple Coulomb scattering will reduce the accuracy of analytical algorithms. In these cases Monte Carlo techniques might reduce the range uncertainty by several mm. (topical review)
Fast sequential Monte Carlo methods for counting and optimization
Rubinstein, Reuven Y; Vaisman, Radislav
2013-01-01
A comprehensive account of the theory and application of Monte Carlo methods Based on years of research in efficient Monte Carlo methods for estimation of rare-event probabilities, counting problems, and combinatorial optimization, Fast Sequential Monte Carlo Methods for Counting and Optimization is a complete illustration of fast sequential Monte Carlo techniques. The book provides an accessible overview of current work in the field of Monte Carlo methods, specifically sequential Monte Carlo techniques, for solving abstract counting and optimization problems. Written by authorities in the
Use of Monte Carlo Methods in brachytherapy; Uso del metodo de Monte Carlo en braquiterapia
Energy Technology Data Exchange (ETDEWEB)
Granero Cabanero, D.
2015-07-01
The Monte Carlo method has become a fundamental tool for brachytherapy dosimetry, mainly because it avoids the difficulties associated with experimental dosimetry. In brachytherapy, the main handicap of experimental dosimetry is the high dose gradient near the sources, which makes small uncertainties in the positioning of the detectors lead to large uncertainties in the dose. This presentation mainly reviews the procedure for calculating dose distributions around a source using the Monte Carlo method, showing the difficulties inherent in these calculations. In addition, we briefly review other applications of the Monte Carlo method in brachytherapy dosimetry, such as its use in advanced calculation algorithms, the calculation of shielding barriers, and the determination of dose distributions around applicators. (Author)
Specialized Monte Carlo codes versus general-purpose Monte Carlo codes
International Nuclear Information System (INIS)
Moskvin, Vadim; DesRosiers, Colleen; Papiez, Lech; Lu, Xiaoyi
2002-01-01
The possibilities of Monte Carlo modeling for dose calculations and treatment optimization are quite limited in radiation oncology applications. The main reason is that the Monte Carlo technique for dose calculations is time-consuming, while treatment planning may require hundreds of possible cases of dose simulations to be evaluated for dose optimization. The second reason is that general-purpose codes widely used in practice require an experienced user to customize them for calculations. This paper discusses the concept of Monte Carlo code design that can avoid the main problems that are preventing widespread use of this simulation technique in medical physics. (authors)
On the use of stochastic approximation Monte Carlo for Monte Carlo integration
Liang, Faming
2009-03-01
The stochastic approximation Monte Carlo (SAMC) algorithm has recently been proposed as a dynamic optimization algorithm in the literature. In this paper, we show in theory that the samples generated by SAMC can be used for Monte Carlo integration via a dynamically weighted estimator by calling some results from the literature of nonhomogeneous Markov chains. Our numerical results indicate that SAMC can yield significant savings over conventional Monte Carlo algorithms, such as the Metropolis-Hastings algorithm, for the problems for which the energy landscape is rugged. © 2008 Elsevier B.V. All rights reserved.
Monte Carlo methods in AB initio quantum chemistry quantum Monte Carlo for molecules
Lester, William A; Reynolds, PJ
1994-01-01
This book presents the basic theory and application of the Monte Carlo method to the electronic structure of atoms and molecules. It assumes no previous knowledge of the subject, only a knowledge of molecular quantum mechanics at the first-year graduate level. A working knowledge of traditional ab initio quantum chemistry is helpful, but not essential.Some distinguishing features of this book are: Clear exposition of the basic theory at a level to facilitate independent study. Discussion of the various versions of the theory: diffusion Monte Carlo, Green's function Monte Carlo, and release n
Monte Carlo dose calculation in dental amalgam phantom
Directory of Open Access Journals (Sweden)
Mohd Zahri Abdul Aziz
2015-01-01
It has become a great challenge in the modern radiation treatment to ensure the accuracy of treatment delivery in electron beam therapy. Tissue inhomogeneity has become one of the factors for accurate dose calculation, and this requires complex algorithm calculation like Monte Carlo (MC). On the other hand, computed tomography (CT) images used in treatment planning system need to be trustful as they are the input in radiotherapy treatment. However, with the presence of metal amalgam in treatment volume, the CT images input showed prominent streak artefact, thus, contributed sources of error. Hence, metal amalgam phantom often creates streak artifacts, which cause an error in the dose calculation. Thus, a streak artifact reduction technique was applied to correct the images, and as a result, better images were observed in terms of structure delineation and density assigning. Furthermore, the amalgam density data were corrected to provide amalgam voxel with accurate density value. As for the errors of dose uncertainties due to metal amalgam, they were reduced from 46% to as low as 2% at d80 (depth of the 80% dose beyond Zmax) using the presented strategies. Considering the number of vital and radiosensitive organs in the head and the neck regions, this correction strategy is suggested in reducing calculation uncertainties through MC calculation.
Monte Carlo dose calculation in dental amalgam phantom.
Aziz, Mohd Zahri Abdul; Yusoff, A L; Osman, N D; Abdullah, R; Rabaie, N A; Salikin, M S
2015-01-01
It has become a great challenge in the modern radiation treatment to ensure the accuracy of treatment delivery in electron beam therapy. Tissue inhomogeneity has become one of the factors for accurate dose calculation, and this requires complex algorithm calculation like Monte Carlo (MC). On the other hand, computed tomography (CT) images used in treatment planning system need to be trustful as they are the input in radiotherapy treatment. However, with the presence of metal amalgam in treatment volume, the CT images input showed prominent streak artefact, thus, contributed sources of error. Hence, metal amalgam phantom often creates streak artifacts, which cause an error in the dose calculation. Thus, a streak artifact reduction technique was applied to correct the images, and as a result, better images were observed in terms of structure delineation and density assigning. Furthermore, the amalgam density data were corrected to provide amalgam voxel with accurate density value. As for the errors of dose uncertainties due to metal amalgam, they were reduced from 46% to as low as 2% at d80 (depth of the 80% dose beyond Zmax) using the presented strategies. Considering the number of vital and radiosensitive organs in the head and the neck regions, this correction strategy is suggested in reducing calculation uncertainties through MC calculation.
Monte Carlo method in neutron activation analysis
International Nuclear Information System (INIS)
Majerle, M.; Krasa, A.; Svoboda, O.; Wagner, V.; Adam, J.; Peetermans, S.; Slama, O.; Stegajlov, V.I.; Tsupko-Sitnikov, V.M.
2009-01-01
Neutron activation detectors are a useful technique for the neutron flux measurements in spallation experiments. The study of the usefulness and the accuracy of this method at similar experiments was performed with the help of Monte Carlo codes MCNPX and FLUKA
Biases in Monte Carlo eigenvalue calculations
Energy Technology Data Exchange (ETDEWEB)
Gelbard, E.M.
1992-12-01
The Monte Carlo method has been used for many years to analyze the neutronics of nuclear reactors. In fact, as the power of computers has increased the importance of Monte Carlo in neutronics has also increased, until today this method plays a central role in reactor analysis and design. Monte Carlo is used in neutronics for two somewhat different purposes, i.e., (a) to compute the distribution of neutrons in a given medium when the neutron source-density is specified, and (b) to compute the neutron distribution in a self-sustaining chain reaction, in which case the source is determined as the eigenvector of a certain linear operator. In (b), then, the source is not given, but must be computed. In the first case (the "fixed-source" case) the Monte Carlo calculation is unbiased. That is to say that, if the calculation is repeated ("replicated") over and over, with independent random number sequences for each replica, then averages over all replicas will approach the correct neutron distribution as the number of replicas goes to infinity. Unfortunately, the computation is not unbiased in the second case, which we discuss here.
Biases in Monte Carlo eigenvalue calculations
Energy Technology Data Exchange (ETDEWEB)
Gelbard, E.M.
1992-01-01
The Monte Carlo method has been used for many years to analyze the neutronics of nuclear reactors. In fact, as the power of computers has increased the importance of Monte Carlo in neutronics has also increased, until today this method plays a central role in reactor analysis and design. Monte Carlo is used in neutronics for two somewhat different purposes, i.e., (a) to compute the distribution of neutrons in a given medium when the neutron source-density is specified, and (b) to compute the neutron distribution in a self-sustaining chain reaction, in which case the source is determined as the eigenvector of a certain linear operator. In (b), then, the source is not given, but must be computed. In the first case (the "fixed-source" case) the Monte Carlo calculation is unbiased. That is to say that, if the calculation is repeated ("replicated") over and over, with independent random number sequences for each replica, then averages over all replicas will approach the correct neutron distribution as the number of replicas goes to infinity. Unfortunately, the computation is not unbiased in the second case, which we discuss here.
Monte Carlo method for random surfaces
International Nuclear Information System (INIS)
Berg, B.
1985-01-01
Previously two of the authors proposed a Monte Carlo method for sampling statistical ensembles of random walks and surfaces with a Boltzmann probabilistic weight. In the present paper we work out the details for several models of random surfaces, defined on d-dimensional hypercubic lattices. (orig.)
Computer system for Monte Carlo experimentation
International Nuclear Information System (INIS)
Grier, D.A.
1986-01-01
A new computer system for Monte Carlo Experimentation is presented. The new system speeds and simplifies the process of coding and preparing a Monte Carlo Experiment; it also encourages the proper design of Monte Carlo Experiments, and the careful analysis of the experimental results. A new functional language is the core of this system. Monte Carlo Experiments, and their experimental designs, are programmed in this new language; those programs are compiled into Fortran output. The Fortran output is then compiled and executed. The experimental results are analyzed with a standard statistics package such as Si, Isp, or Minitab or with a user-supplied program. Both the experimental results and the experimental design may be directly loaded into the workspace of those packages. The new functional language frees programmers from many of the details of programming an experiment. Experimental designs such as factorial, fractional factorial, or latin square are easily described by the control structures and expressions of the language. Specific mathematical models are generated by the routines of the language
Monte Carlo simulation of the microcanonical ensemble
International Nuclear Information System (INIS)
Creutz, M.
1984-01-01
We consider simulating statistical systems with a random walk on a constant energy surface. This combines features of deterministic molecular dynamics techniques and conventional Monte Carlo simulations. For discrete systems the method can be programmed to run an order of magnitude faster than other approaches. It does not require high quality random numbers and may also be useful for nonequilibrium studies. 10 references
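As an illustration of a random walk on a surface of (nearly) constant energy, here is a Creutz-style demon update for a 1-D Ising chain. The chain length, initial demon energy and temperature formula are assumptions chosen for this sketch, not taken from the paper:

```python
# Microcanonical (demon) Monte Carlo for a 1-D Ising chain: a single demon
# exchanges energy with the spins, and the total energy is conserved.  No
# Boltzmann factor is evaluated, so only cheap random site choices are needed.
import math
import random

N = 500
spins = [1] * N               # ordered start: the system begins at minimum energy
demon = 200                   # the demon carries the rest of the (conserved) energy, in units of J

def delta_e(i):
    # Energy change (units of J) of flipping spin i, periodic boundary conditions.
    return 2 * spins[i] * (spins[(i - 1) % N] + spins[(i + 1) % N])

demon_trace = []
for sweep in range(3000):
    for _ in range(N):
        i = random.randrange(N)
        dE = delta_e(i)
        if dE <= demon:       # microcanonical rule: the demon pays for (or absorbs) the change
            spins[i] = -spins[i]
            demon -= dE
    if sweep >= 1000:         # discard equilibration sweeps
        demon_trace.append(demon)

# The demon energy is exponentially distributed, so its mean gives the temperature:
# <E_d> = 4 / (exp(4/T) - 1), with T in units of J/k_B.
mean_ed = sum(demon_trace) / len(demon_trace)
print("mean demon energy:", mean_ed, "->  T ~", 4.0 / math.log(1.0 + 4.0 / mean_ed))
```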
Workshop: Monte Carlo computational performance benchmark - Contributions
International Nuclear Information System (INIS)
Hoogenboom, J.E.; Petrovic, B.; Martin, W.R.; Sutton, T.; Leppaenen, J.; Forget, B.; Romano, P.; Siegel, A.; Hoogenboom, E.; Wang, K.; Li, Z.; She, D.; Liang, J.; Xu, Q.; Qiu, Y.; Yu, J.; Sun, J.; Fan, X.; Yu, G.; Bernard, F.; Cochet, B.; Jinaphanh, A.; Jacquet, O.; Van der Marck, S.; Tramm, J.; Felker, K.; Smith, K.; Horelik, N.; Capellan, N.; Herman, B.
2013-01-01
This series of slides is divided into 3 parts. The first part is dedicated to the presentation of the Monte-Carlo computational performance benchmark (aims, specifications and results). This benchmark aims at performing a full-size Monte Carlo simulation of a PWR core with axial and pin-power distribution. Many different Monte Carlo codes have been used and their results have been compared in terms of computed values and processing speeds. It appears that local power values mostly agree quite well. The first part also includes the presentations of about 10 participants in which they detail their calculations. In the second part, an extension of the benchmark is proposed in order to simulate a more realistic reactor core (for instance non-uniform temperature) and to assess feedback coefficients due to change of some parameters. The third part deals with another benchmark, the BEAVRS benchmark (Benchmark for Evaluation And Validation of Reactor Simulations). BEAVRS is also a full-core PWR benchmark for Monte Carlo simulations
Monte Carlo determination of heteroepitaxial misfit structures
DEFF Research Database (Denmark)
Baker, J.; Lindgård, Per-Anker
1996-01-01
We use Monte Carlo simulations to determine the structure of KBr overlayers on a NaCl(001) substrate, a system with large (17%) heteroepitaxial misfit. The equilibrium relaxation structure is determined for films of 2-6 ML, for which extensive helium-atom scattering data exist for comparison...
Dynamic bounds coupled with Monte Carlo simulations
Rajabali Nejad, Mohammadreza; Meester, L.E.; van Gelder, P.H.A.J.M.; Vrijling, J.K.
2011-01-01
For the reliability analysis of engineering structures a variety of methods is known, of which Monte Carlo (MC) simulation is widely considered to be among the most robust and most generally applicable. To reduce simulation cost of the MC method, variance reduction methods are applied. This paper
Atomistic Monte Carlo simulation of lipid membranes
DEFF Research Database (Denmark)
Wüstner, Daniel; Sklenar, Heinz
2014-01-01
Biological membranes are complex assemblies of many different molecules of which analysis demands a variety of experimental and computational approaches. In this article, we explain challenges and advantages of atomistic Monte Carlo (MC) simulation of lipid membranes. We provide an introduction...... of local-move MC methods in combination with molecular dynamics simulations, for example, for studying multi-component lipid membranes containing cholesterol....
Design and analysis of Monte Carlo experiments
Kleijnen, Jack P.C.; Gentle, J.E.; Haerdle, W.; Mori, Y.
2012-01-01
By definition, computer simulation or Monte Carlo models are not solved by mathematical analysis (such as differential calculus), but are used for numerical experimentation. The goal of these experiments is to answer questions about the real world; i.e., the experimenters may use their models to
Juan Carlos D'Olivo: A portrait
Aguilar-Arévalo, Alexis A.
2013-06-01
This report attempts to give a brief biographical sketch of the academic life of Juan Carlos D'Olivo, researcher and teacher at the Instituto de Ciencias Nucleares of UNAM, devoted to advancing the fields of High Energy Physics and Astroparticle Physics in Mexico and Latin America.
Scalable Domain Decomposed Monte Carlo Particle Transport
Energy Technology Data Exchange (ETDEWEB)
O' Brien, Matthew Joseph [Univ. of California, Davis, CA (United States)
2013-12-05
In this dissertation, we present the parallel algorithms necessary to run domain decomposed Monte Carlo particle transport on large numbers of processors (millions of processors). Previous algorithms were not scalable, and the parallel overhead became more computationally costly than the numerical simulation.
An analysis of Monte Carlo tree search
CSIR Research Space (South Africa)
James, S
2017-02-01
Monte Carlo Tree Search (MCTS) is a family of directed search algorithms that has gained widespread attention in recent years. Despite the vast amount of research into MCTS, the effect of modifications on the algorithm, as well as the manner...
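For readers unfamiliar with the algorithm family, the sketch below shows the standard UCB1 selection rule and the backpropagation step that sit at the heart of most MCTS variants. It is a generic illustration on a toy two-armed problem, not the paper's code:

```python
# Generic MCTS building blocks: UCB1 child selection and statistics backpropagation.
import math
import random
from dataclasses import dataclass, field

@dataclass
class Node:
    visits: int = 0
    total_reward: float = 0.0
    children: dict = field(default_factory=dict)   # action -> Node

def ucb1_select(node, c=1.4):
    """Pick the child maximizing exploitation plus an exploration bonus."""
    def score(child):
        if child.visits == 0:
            return float("inf")                     # try every child at least once
        exploit = child.total_reward / child.visits
        explore = c * math.sqrt(math.log(node.visits) / child.visits)
        return exploit + explore
    return max(node.children.items(), key=lambda kv: score(kv[1]))

def backpropagate(path, reward):
    """After a simulated playout, update the statistics along the visited path."""
    for node in path:
        node.visits += 1
        node.total_reward += reward

# Tiny demo on a one-level tree (a bandit): arm "a" pays more on average.
root = Node(children={"a": Node(), "b": Node()})
for _ in range(1000):
    action, child = ucb1_select(root)
    reward = random.random() * (0.8 if action == "a" else 0.4)
    backpropagate([root, child], reward)
print({a: c.visits for a, c in root.children.items()})    # "a" should dominate
```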
Parallel processing Monte Carlo radiation transport codes
International Nuclear Information System (INIS)
McKinney, G.W.
1994-01-01
Issues related to distributed-memory multiprocessing as applied to Monte Carlo radiation transport are discussed. Measurements of communication overhead are presented for the radiation transport code MCNP which employs the communication software package PVM, and average efficiency curves are provided for a homogeneous virtual machine
Monte Carlo studies of uranium calorimetry
International Nuclear Information System (INIS)
Brau, J.; Hargis, H.J.; Gabriel, T.A.; Bishop, B.L.
1985-01-01
Detailed Monte Carlo calculations of uranium calorimetry are presented which reveal a significant difference in the responses of liquid argon and plastic scintillator in uranium calorimeters. Due to saturation effects, neutrons from the uranium are found to contribute only weakly to the liquid argon signal. Electromagnetic sampling inefficiencies are significant and contribute substantially to compensation in both systems. 17 references
Uncertainty analysis in Monte Carlo criticality computations
International Nuclear Information System (INIS)
Qi Ao
2011-01-01
Highlights: ► Two types of uncertainty methods for keff Monte Carlo computations are examined. ► Sampling method has the least restrictions on perturbation but computing resources. ► Analytical method is limited to small perturbation on material properties. ► Practicality relies on efficiency, multiparameter applicability and data availability. - Abstract: Uncertainty analysis is imperative for nuclear criticality risk assessments when using Monte Carlo neutron transport methods to predict the effective neutron multiplication factor (keff) for fissionable material systems. For the validation of Monte Carlo codes for criticality computations against benchmark experiments, code accuracy and precision are measured by both the computational bias and uncertainty in the bias. The uncertainty in the bias accounts for known or quantified experimental, computational and model uncertainties. For the application of Monte Carlo codes for criticality analysis of fissionable material systems, an administrative margin of subcriticality must be imposed to provide additional assurance of subcriticality for any unknown or unquantified uncertainties. Because of a substantial impact of the administrative margin of subcriticality on economics and safety of nuclear fuel cycle operations, recently increasing interests in reducing the administrative margin of subcriticality make the uncertainty analysis in criticality safety computations more risk-significant. This paper provides an overview of two most popular keff uncertainty analysis methods for Monte Carlo criticality computations: (1) sampling-based methods, and (2) analytical methods. Examples are given to demonstrate their usage in the keff uncertainty analysis due to uncertainties in both neutronic and non-neutronic parameters of fissionable material systems.
The adaptation method in the Monte Carlo simulation for computed tomography
Energy Technology Data Exchange (ETDEWEB)
Lee, Hyoung Gun; Yoon, Chang Yeon; Lee, Won Ho [Dept. of Bio-convergence Engineering, Korea University, Seoul (Korea, Republic of); Cho, Seung Ryong [Dept. of Nuclear and Quantum Engineering, Korea Advanced Institute of Science and Technology, Daejeon (Korea, Republic of); Park, Sung Ho [Dept. of Neurosurgery, Ulsan University Hospital, Ulsan (Korea, Republic of)
2015-06-15
The patient dose incurred from diagnostic procedures during advanced radiotherapy has become an important issue. Many researchers in medical physics are using computational simulations to calculate complex parameters in experiments. However, extended computation times make it difficult for personal computers to run the conventional Monte Carlo method to simulate radiological images with high-flux photons such as images produced by computed tomography (CT). To minimize the computation time without degrading imaging quality, we applied a deterministic adaptation to the Monte Carlo calculation and verified its effectiveness by simulating CT image reconstruction for an image evaluation phantom (Catphan; Phantom Laboratory, New York NY, USA) and a human-like voxel phantom (KTMAN-2) (Los Alamos National Laboratory, Los Alamos, NM, USA). For the deterministic adaptation, the relationship between iteration numbers and the simulations was estimated and the option to simulate scattered radiation was evaluated. The processing times of simulations using the adaptive method were at least 500 times faster than those using a conventional statistical process. In addition, compared with the conventional statistical method, the adaptive method provided images that were more similar to the experimental images, which proved that the adaptive method was highly effective for a simulation that requires a large number of iterations; assuming no radiation scattering in the vicinity of detectors minimized artifacts in the reconstructed image.
The adaptation method in the Monte Carlo simulation for computed tomography
Directory of Open Access Journals (Sweden)
Hyounggun Lee
2015-06-01
The patient dose incurred from diagnostic procedures during advanced radiotherapy has become an important issue. Many researchers in medical physics are using computational simulations to calculate complex parameters in experiments. However, extended computation times make it difficult for personal computers to run the conventional Monte Carlo method to simulate radiological images with high-flux photons such as images produced by computed tomography (CT). To minimize the computation time without degrading imaging quality, we applied a deterministic adaptation to the Monte Carlo calculation and verified its effectiveness by simulating CT image reconstruction for an image evaluation phantom (Catphan; Phantom Laboratory, New York NY, USA) and a human-like voxel phantom (KTMAN-2) (Los Alamos National Laboratory, Los Alamos, NM, USA). For the deterministic adaptation, the relationship between iteration numbers and the simulations was estimated and the option to simulate scattered radiation was evaluated. The processing times of simulations using the adaptive method were at least 500 times faster than those using a conventional statistical process. In addition, compared with the conventional statistical method, the adaptive method provided images that were more similar to the experimental images, which proved that the adaptive method was highly effective for a simulation that requires a large number of iterations; assuming no radiation scattering in the vicinity of detectors minimized artifacts in the reconstructed image.
Optical monitoring of rheumatoid arthritis: Monte Carlo generated reconstruction kernels
Minet, O.; Beuthan, J.; Hielscher, A. H.; Zabarylo, U.
2008-06-01
Optical imaging in biomedicine is governed by light absorption and scattering by microscopic and macroscopic constituents of the medium. Therefore, the light scattering characteristics of human tissue correlate with the stage of some diseases. In the near infrared range scattering, with a coefficient approximately two orders of magnitude greater than absorption, plays a dominant role. When measuring the optical parameters, variations were discovered that correlate with rheumatoid arthritis of a small joint. The potential of an experimental setup for transilluminating the finger joint with a laser diode and the pattern of the stray light detection are demonstrated. The scattering caused by skin contains no useful information and it can be removed by a deconvolution technique to enhance the diagnostic value of this non-invasive optical method. Monte Carlo simulations ensure both the construction of the corresponding point spread function and the theoretical verification of the stray light picture in a rather complex geometry.
Minimum thresholds of Monte Carlo cycles for Nigerian empirical
African Journals Online (AJOL)
2012-11-03
Monte Carlo simulation has proven to be an effective means of incorporating reliability analysis into the ... Monte Carlo simulation cycle of 2,500 thresholds were enough to be used to provide sufficient repeatability for ... rameters using Monte Carlo method with the aid of MATrixLABoratory.
Clear-PEM system counting rates: a Monte Carlo study
Rodrigues, P.; Trindade, A.; Varela, J.
2007-01-01
Positron Emission Mammography (PEM) with 18F-Fluorodeoxyglucose (18F-FDG) is a functional imaging technique for breast cancer detection. The development of dedicated imaging systems with high sensitivity and spatial resolution is crucial for early breast cancer diagnosis and an efficient therapy. Clear-PEM is a dual planar scanner designed for high-resolution breast cancer imaging under development by the Portuguese PET Mammography consortium within the Crystal Clear Collaboration. It brings together a favorable combination of high-density scintillator crystals coupled to compact photodetectors, arranged in a double readout scheme capable of providing depth-of-interaction information. A Monte Carlo study of the Clear-PEM system counting rates is presented in this paper. Hypothetical breast exam scenarios were simulated to estimate the single event rates, true and random coincidence rates. A realistic description of the patient and detector geometry, radiation environment, physics and instrumentation factors was adopted in this work. Special attention was given to the 18F-FDG accumulation in the patient torso organs which, for the Clear-PEM scanner, represents significant activity outside the field-of-view (FOV), contributing to an increase of singles, randoms and scattered coincidences affecting the overall system performance. The potential benefits of patient shielding to minimize the influence of the out-of-field background were explored. The influence of LYSO:Ce crystal intrinsic natural activity, due to the presence of the 176Lu isotope, on the counting rate performance of the proposed scanner was also investigated.
Directory of Open Access Journals (Sweden)
Aline Lopes de Lacerda
2009-07-01
Centered on the organization of the Carlos Chagas archive, the article explores methodological concerns in the technical handling of such material. It discusses the archival organization of photographs and analyzes some groups of images from the Chagas archive, particularly photographs concerning the discovery of Chagas' disease, in Lassance, where the scientist initially went to combat malaria. The goal is to identify processes of production of meaning that are built into the methodology for classifying these documents. The article examines to what extent the images of Chagas' stay in Lassance took on new meaning because of the weight this discovery had on the scientist's career and likewise on the production and dissemination of his memory.
Multilevel sequential Monte-Carlo samplers
Jasra, Ajay
2016-01-05
Multilevel Monte-Carlo methods provide a powerful computational technique for reducing the computational cost of estimating expectations for a given computational effort. They are particularly relevant for computational problems when approximate distributions are determined via a resolution parameter h, with h=0 giving the theoretical exact distribution (e.g. SDEs or inverse problems with PDEs). The method provides a benefit by coupling samples from successive resolutions, and estimating differences of successive expectations. We develop a methodology that brings Sequential Monte-Carlo (SMC) algorithms within the framework of the Multilevel idea, as SMC provides a natural set-up for coupling samples over different resolutions. We prove that the new algorithm indeed preserves the benefits of the multilevel principle, even if samples at all resolutions are now correlated.
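A plain multilevel Monte Carlo sketch of the telescoping-sum idea the abstract builds on, using an Euler-discretized geometric Brownian motion as the resolution-dependent quantity. The model, step counts and sample allocations are assumptions for illustration; the sequential Monte Carlo coupling developed in the paper is not reproduced here:

```python
# Multilevel Monte Carlo for E[X_T] of a geometric Brownian motion under Euler
# discretization, coupling fine and coarse paths with the same Brownian increments.
import math
import random

def euler_pair(level, T=1.0, x0=1.0, mu=0.05, sigma=0.2):
    """One coupled sample (P_fine, P_coarse); the fine path uses 2**level steps."""
    n_f = 2 ** level
    dt_f = T / n_f
    xf = xc = x0
    dw_pair = 0.0
    for step in range(n_f):
        dw = random.gauss(0.0, math.sqrt(dt_f))
        xf += mu * xf * dt_f + sigma * xf * dw        # fine Euler step
        dw_pair += dw
        if level > 0 and step % 2 == 1:               # two fine steps = one coarse step
            xc += mu * xc * (2 * dt_f) + sigma * xc * dw_pair
            dw_pair = 0.0
    return xf, (xc if level > 0 else 0.0)

def mlmc(max_level, samples_per_level):
    """Telescoping sum: E[P_L] ~ E[P_0] + sum_l E[P_l - P_{l-1}]."""
    estimate = 0.0
    for level, n in enumerate(samples_per_level[: max_level + 1]):
        acc = 0.0
        for _ in range(n):
            pf, pc = euler_pair(level)
            acc += pf if level == 0 else pf - pc
        estimate += acc / n
    return estimate

# Many cheap samples on the coarse level, fewer on the expensive fine levels.
print(mlmc(4, [40_000, 10_000, 4_000, 1_500, 600]), "vs exact", math.exp(0.05))
```

Because the coupled correction terms have small variance, only a handful of samples is needed on the fine levels, which is where the cost savings over standard Monte Carlo come from.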
Status of Monte Carlo at Los Alamos
International Nuclear Information System (INIS)
Thompson, W.L.; Cashwell, E.D.
1980-01-01
At Los Alamos the early work of Fermi, von Neumann, and Ulam has been developed and supplemented by many followers, notably Cashwell and Everett, and the main product today is the continuous-energy, general-purpose, generalized-geometry, time-dependent, coupled neutron-photon transport code called MCNP. The Los Alamos Monte Carlo research and development effort is concentrated in Group X-6. MCNP treats an arbitrary three-dimensional configuration of arbitrary materials in geometric cells bounded by first- and second-degree surfaces and some fourth-degree surfaces (elliptical tori). Monte Carlo has evolved into perhaps the main method for radiation transport calculations at Los Alamos. MCNP is used in every technical division at the Laboratory by over 130 users about 600 times a month accounting for nearly 200 hours of CDC-7600 time
Monte Carlo simulation of gas Cerenkov detectors
International Nuclear Information System (INIS)
Mack, J.M.; Jain, M.; Jordan, T.M.
1984-01-01
Theoretical study of selected gamma-ray and electron diagnostics necessitates coupling Cerenkov radiation to electron/photon cascades. A Cerenkov production model and its incorporation into a general geometry Monte Carlo coupled electron/photon transport code is discussed. A special optical photon ray-trace is implemented using bulk optical properties assigned to each Monte Carlo zone. Good agreement exists between experimental and calculated Cerenkov data in the case of a carbon-dioxide gas Cerenkov detector experiment. Cerenkov production and threshold data are presented for a typical carbon-dioxide gas detector that converts a 16.7 MeV photon source to Cerenkov light, which is collected by optics and detected by a photomultiplier
No-compromise reptation quantum Monte Carlo
International Nuclear Information System (INIS)
Yuen, W K; Farrar, Thomas J; Rothstein, Stuart M
2007-01-01
Since its publication, the reptation quantum Monte Carlo algorithm of Baroni and Moroni (1999 Phys. Rev. Lett. 82 4745) has been applied to several important problems in physics, but its mathematical foundations are not well understood. We show that their algorithm is not of typical Metropolis-Hastings type, and we specify conditions required for the generated Markov chain to be stationary and to converge to the intended distribution. The time-step bias may add up, and in many applications it is only the middle of a reptile that is the most important. Therefore, we propose an alternative, 'no-compromise reptation quantum Monte Carlo' to stabilize the middle of the reptile. (fast track communication)
Multilevel Monte Carlo Approaches for Numerical Homogenization
Efendiev, Yalchin R.
2015-10-01
In this article, we study the application of multilevel Monte Carlo (MLMC) approaches to numerical random homogenization. Our objective is to compute the expectation of some functionals of the homogenized coefficients, or of the homogenized solutions. This is accomplished within MLMC by considering different sizes of representative volumes (RVEs). Many inexpensive computations with the smallest RVE size are combined with fewer expensive computations performed on larger RVEs. Likewise, when it comes to homogenized solutions, different levels of coarse-grid meshes are used to solve the homogenized equation. We show that, by carefully selecting the number of realizations at each level, we can achieve a speed-up in the computations in comparison to a standard Monte Carlo method. Numerical results are presented for both one-dimensional and two-dimensional test-cases that illustrate the efficiency of the approach.
EU Commissioner Carlos Moedas visits SESAME
CERN Bulletin
2015-01-01
The European Commissioner for research, science and innovation, Carlos Moedas, visited the SESAME laboratory in Jordan on Monday 13 April. When it begins operation in 2016, SESAME, a synchrotron light source, will be the Middle East’s first major international science centre, carrying out experiments ranging from the physical sciences to environmental science and archaeology. CERN Director-General Rolf Heuer (left) and European Commissioner Carlos Moedas with the model SESAME magnet. © European Union, 2015. Commissioner Moedas was accompanied by a European Commission delegation led by Robert-Jan Smits, Director-General of DG Research and Innovation, as well as Rolf Heuer, CERN Director-General, Jean-Pierre Koutchouk, coordinator of the CERN-EC Support for SESAME Magnets (CESSAMag) project and Princess Sumaya bint El Hassan of Jordan, a leading advocate of science in the region. They toured the SESAME facility together with SESAME Director, Khaled Tou...
Status of Monte Carlo at Los Alamos
International Nuclear Information System (INIS)
Thompson, W.L.; Cashwell, E.D.; Godfrey, T.N.K.; Schrandt, R.G.; Deutsch, O.L.; Booth, T.E.
1980-05-01
Four papers were presented by Group X-6 on April 22, 1980, at the Oak Ridge Radiation Shielding Information Center (RSIC) Seminar-Workshop on Theory and Applications of Monte Carlo Methods. These papers are combined into one report for convenience and because they are related to each other. The first paper (by Thompson and Cashwell) is a general survey about X-6 and MCNP and is an introduction to the other three papers. It can also serve as a resume of X-6. The second paper (by Godfrey) explains some of the details of geometry specification in MCNP. The third paper (by Cashwell and Schrandt) illustrates calculating flux at a point with MCNP; in particular, the once-more-collided flux estimator is demonstrated. Finally, the fourth paper (by Thompson, Deutsch, and Booth) is a tutorial on some variance-reduction techniques. It should be required reading for a fledgling Monte Carlo practitioner
Monte Carlo Particle Transport: Algorithm and Performance Overview
International Nuclear Information System (INIS)
Gentile, N.; Procassini, R.; Scott, H.
2005-01-01
Monte Carlo methods are frequently used for neutron and radiation transport. These methods have several advantages, such as relative ease of programming and dealing with complex meshes. Disadvantages include long run times and statistical noise. Monte Carlo photon transport calculations also often suffer from inaccuracies in matter temperature due to the lack of implicitness. In this paper we discuss the Monte Carlo algorithm as it is applied to neutron and photon transport, detail the differences between neutron and photon Monte Carlo, and give an overview of the ways the numerical method has been modified to deal with issues that arise in photon Monte Carlo simulations
Introduction to the Monte Carlo methods
International Nuclear Information System (INIS)
Uzhinskij, V.V.
1993-01-01
Codes illustrating the use of Monte Carlo methods in high energy physics, such as the inverse transformation method, the rejection method, the particle propagation through the nucleus, the particle interaction with the nucleus, etc., are presented. A set of useful algorithms of random number generators is given (the binomial distribution, the Poisson distribution, β-distribution, γ-distribution and normal distribution). 5 figs., 1 tab
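Minimal generic versions of the first two generators named above, the inverse-transformation method and the rejection method; the target densities are chosen only for illustration and are not taken from the lecture notes:

```python
# Two basic random-number generation techniques.
import math
import random

def sample_exponential(lam):
    """Inverse-transformation method: solve F(x) = u for x, with F the CDF of Exp(lam)."""
    u = random.random()
    return -math.log(1.0 - u) / lam

def sample_beta22():
    """Rejection method for p(x) = 6 x (1 - x) on [0, 1], majorized by M = 1.5."""
    while True:
        x = random.random()                      # proposal from the uniform density
        if random.random() * 1.5 <= 6.0 * x * (1.0 - x):
            return x                             # accepted with probability p(x) / M

samples = [sample_beta22() for _ in range(100_000)]
print("mean (should be ~0.5):", sum(samples) / len(samples))
```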
Monte Carlo modeling of eye iris color
Koblova, Ekaterina V.; Bashkatov, Alexey N.; Dolotov, Leonid E.; Sinichkin, Yuri P.; Kamenskikh, Tatyana G.; Genina, Elina A.; Tuchin, Valery V.
2007-05-01
Based on the presented two-layer eye iris model, the iris diffuse reflectance has been calculated by Monte Carlo technique in the spectral range 400-800 nm. The diffuse reflectance spectra have been recalculated in L*a*b* color coordinate system. Obtained results demonstrated that the iris color coordinates (hue and chroma) can be used for estimation of melanin content in the range of small melanin concentrations, i.e. for estimation of melanin content in blue and green eyes.
Handbook of Markov chain Monte Carlo
Brooks, Steve
2011-01-01
""Handbook of Markov Chain Monte Carlo"" brings together the major advances that have occurred in recent years while incorporating enough introductory material for new users of MCMC. Along with thorough coverage of the theoretical foundations and algorithmic and computational methodology, this comprehensive handbook includes substantial realistic case studies from a variety of disciplines. These case studies demonstrate the application of MCMC methods and serve as a series of templates for the construction, implementation, and choice of MCMC methodology.
Monte Carlo methods for shield design calculations
International Nuclear Information System (INIS)
Grimstone, M.J.
1974-01-01
A suite of Monte Carlo codes is being developed for use on a routine basis in commercial reactor shield design. The methods adopted for this purpose include the modular construction of codes, simplified geometries, automatic variance reduction techniques, continuous energy treatment of cross section data, and albedo methods for streaming. Descriptions are given of the implementation of these methods and of their use in practical calculations. 26 references. (U.S.)
Replica Exchange for Reactive Monte Carlo Simulations
Czech Academy of Sciences Publication Activity Database
Turner, C.H.; Brennan, J.K.; Lísal, Martin
2007-01-01
Vol. 111, No. 43 (2007), pp. 15706-15715. ISSN 1932-7447. R&D Projects: GA ČR GA203/05/0725; GA AV ČR 1ET400720409; GA AV ČR 1ET400720507. Institutional research plan: CEZ:AV0Z40720504. Keywords: Monte Carlo; simulation; reactive system. Subject RIV: CF - Physical; Theoretical Chemistry
Applications of Maxent to quantum Monte Carlo
Energy Technology Data Exchange (ETDEWEB)
Silver, R.N.; Sivia, D.S.; Gubernatis, J.E. (Los Alamos National Lab., NM (USA)); Jarrell, M. (Ohio State Univ., Columbus, OH (USA). Dept. of Physics)
1990-01-01
We consider the application of maximum entropy methods to the analysis of data produced by computer simulations. The focus is the calculation of the dynamical properties of quantum many-body systems by Monte Carlo methods, which is termed the "Analytical Continuation Problem." For the Anderson model of dilute magnetic impurities in metals, we obtain spectral functions and transport coefficients which obey "Kondo Universality." 24 refs., 7 figs.
Monte Carlo methods for preference learning
DEFF Research Database (Denmark)
Viappiani, P.
2012-01-01
Utility elicitation is an important component of many applications, such as decision support systems and recommender systems. Such systems query the users about their preferences and give recommendations based on the system’s belief about the utility function. Critical to these applications is the acquisition of prior distribution about the utility parameters and the possibility of real time Bayesian inference. In this paper we consider Monte Carlo methods for these problems.
General purpose code for Monte Carlo simulations
International Nuclear Information System (INIS)
Wilcke, W.W.
1983-01-01
A general-purpose computer code called MONTHY has been written to perform Monte Carlo simulations of physical systems. To achieve a high degree of flexibility the code is organized like a general purpose computer, operating on a vector describing the time dependent state of the system under simulation. The instruction set of the computer is defined by the user and is therefore adaptable to the particular problem studied. The organization of MONTHY allows iterative and conditional execution of operations
The lund Monte Carlo for jet fragmentation
International Nuclear Information System (INIS)
Sjoestrand, T.
1982-03-01
We present a Monte Carlo program based on the Lund model for jet fragmentation. Quark, gluon, diquark and hadron jets are considered. Special emphasis is put on the fragmentation of colour singlet jet systems, for which energy, momentum and flavour are conserved explicitly. The model for decays of unstable particles, in particular the weak decay of heavy hadrons, is described. The central part of the paper is a detailed description on how to use the FORTRAN 77 program. (Author)
Carlo Rosselli e il socialismo delle autonomie
Calabrò, Carmelo
2008-01-01
Carlo Rosselli's theoretical commitment can be traced back to the various minority experiences (at least at the continental level) that, in the 1920s, aimed at moving beyond the doctrinal framework of Marxist socialism. In both its reformist and its maximalist variants, class politics, holism and collectivism are principles broadly common to the culture of Marxism; principles dichotomous with respect to liberalism and problematic with regard to democracy. Rosselli, against this tra...
Autocorrelations in hybrid Monte Carlo simulations
International Nuclear Information System (INIS)
Schaefer, Stefan; Virotta, Francesco
2010-11-01
Simulations of QCD suffer from severe critical slowing down towards the continuum limit. This problem is known to be prominent in the topological charge; however, all observables are affected to varying degrees by these slow modes in the Monte Carlo evolution. We investigate the slowing down in high statistics simulations and propose a new error analysis method, which gives a realistic estimate of the contribution of the slow modes to the errors. (orig.)
Topological zero modes in Monte Carlo simulations
International Nuclear Information System (INIS)
Dilger, H.
1994-08-01
We present an improvement of global Metropolis updating steps, the instanton hits, used in a hybrid Monte Carlo simulation of the two-flavor Schwinger model with staggered fermions. These hits are designed to change the topological sector of the gauge field. In order to match these hits to an unquenched simulation with pseudofermions, the approximate zero mode structure of the lattice Dirac operator has to be considered explicitly. (orig.)
Monte Carlo simulation of Touschek effect
Directory of Open Access Journals (Sweden)
Aimin Xiao
2010-07-01
We present a Monte Carlo method implementation in the code elegant for simulating Touschek scattering effects in a linac beam. The local scattering rate and the distribution of scattered electrons can be obtained from the code either for a Gaussian-distributed beam or for a general beam whose distribution function is given. In addition, scattered electrons can be tracked through the beam line and the local beam-loss rate and beam halo information recorded.
R and D on automatic modeling methods for Monte Carlo codes FLUKA
International Nuclear Information System (INIS)
Wang Dianxi; Hu Liqin; Wang Guozhong; Zhao Zijia; Nie Fanzhi; Wu Yican; Long Pengcheng
2013-01-01
FLUKA is a fully integrated particle physics Monte Carlo simulation package. It is necessary to create the geometry models before calculation. However, it is time-consuming and error-prone to describe the geometry models manually. This study developed an automatic modeling method which could automatically convert computer-aided design (CAD) geometry models into FLUKA models. The conversion program was integrated into CAD/image-based automatic modeling program for nuclear and radiation transport simulation (MCAM). Its correctness has been demonstrated. (authors)
Monte Carlo simulation of an optical coherence tomography signal in homogeneous turbid media
Yao, Gang; Wang, Lihong V.
1999-01-01
The Monte Carlo technique with angle biasing is used to simulate the optical coherence tomography (OCT) signal from homogeneous turbid media. The OCT signal is divided into two categories: one is from a target imaging layer in the medium (Class I); the other is from the rest of the medium (Class II). These two classes of signal are very different in their spatial distributions, angular distributions and the number of scattering events experienced. Multiply scattered light contributes to the ...
Biased Monte Carlo optimization: the basic approach
International Nuclear Information System (INIS)
Campioni, Luca; Scardovelli, Ruben; Vestrucci, Paolo
2005-01-01
It is well known that the Monte Carlo method is very successful in tackling several kinds of system simulations. It often happens that one has to deal with rare events, and the use of a variance reduction technique is almost mandatory in order to have efficient Monte Carlo applications. The main issue associated with variance reduction techniques is the choice of the value of the biasing parameter. Typically, this task is left to the experience of the Monte Carlo user, who has to make many attempts before achieving an advantageous biasing. A valuable result is provided: a methodology and a practical rule to establish a priori guidance for the choice of the optimal value of the biasing parameter. This result, which has been obtained for a single-component system, has the notable property of being valid for any multicomponent system. In particular, in this paper, the exponential and the uniform biases of exponentially distributed phenomena are investigated thoroughly.
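As an illustration of the kind of exponential biasing discussed above (a generic sketch with assumed parameter values, not the rule derived in the paper), the snippet below estimates the rare-event probability P(T > t) for an exponentially distributed time T by sampling from a biased exponential density and correcting with likelihood-ratio weights; the biasing rate lam_bias plays the role of the free parameter whose choice the methodology addresses.

```python
import numpy as np

def rare_event_prob(lam, t, lam_bias, n=100_000, seed=1):
    """Estimate P(T > t) for T ~ Exp(lam) by sampling T ~ Exp(lam_bias)
    and reweighting each sample with the likelihood ratio f_lam / f_lam_bias."""
    rng = np.random.default_rng(seed)
    samples = rng.exponential(1.0 / lam_bias, size=n)
    weights = (lam / lam_bias) * np.exp(-(lam - lam_bias) * samples)
    scores = (samples > t) * weights
    return scores.mean(), scores.std(ddof=1) / np.sqrt(n)

lam, t = 1.0, 10.0                               # true P(T > 10) = exp(-10) ~ 4.5e-5
print(rare_event_prob(lam, t, lam_bias=1.0))     # analog sampling: large relative error
print(rare_event_prob(lam, t, lam_bias=0.1))     # biased toward the tail: much smaller error
```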
Generalized hybrid Monte Carlo - CMFD methods for fission source convergence
International Nuclear Information System (INIS)
Wolters, Emily R.; Larsen, Edward W.; Martin, William R.
2011-01-01
In this paper, we generalize the recently published 'CMFD-Accelerated Monte Carlo' method and present two new methods that reduce the statistical error in CMFD-Accelerated Monte Carlo. The CMFD-Accelerated Monte Carlo method uses Monte Carlo to estimate nonlinear functionals used in low-order CMFD equations for the eigenfunction and eigenvalue. The Monte Carlo fission source is then modified to match the resulting CMFD fission source in a 'feedback' procedure. The two proposed methods differ from CMFD-Accelerated Monte Carlo in the definition of the required nonlinear functionals, but they have identical CMFD equations. The proposed methods are compared with CMFD-Accelerated Monte Carlo on a high dominance ratio test problem. All hybrid methods converge the Monte Carlo fission source almost immediately, leading to a large reduction in the number of inactive cycles required. The proposed methods stabilize the fission source more efficiently than CMFD-Accelerated Monte Carlo, leading to a reduction in the number of active cycles required. Finally, as in CMFD-Accelerated Monte Carlo, the apparent variance of the eigenfunction is approximately equal to the real variance, so the real error is well-estimated from a single calculation. This is an advantage over standard Monte Carlo, in which the real error can be underestimated due to inter-cycle correlation. (author)
Monte Carlo methods and models in finance and insurance
Korn, Ralf; Kroisandt, Gerald
2010-01-01
Offering a unique balance between applications and calculations, Monte Carlo Methods and Models in Finance and Insurance incorporates the application background of finance and insurance with the theory and applications of Monte Carlo methods. It presents recent methods and algorithms, including the multilevel Monte Carlo method, the statistical Romberg method, and the Heath-Platen estimator, as well as recent financial and actuarial models, such as the Cheyette and dynamic mortality models. The authors separately discuss Monte Carlo techniques, stochastic process basics, and the theoretical background and intuition behind financial and actuarial mathematics, before bringing the topics together to apply the Monte Carlo methods to areas of finance and insurance. This allows for the easy identification of standard Monte Carlo tools and for a detailed focus on the main principles of financial and insurance mathematics. The book describes high-level Monte Carlo methods for standard simulation and the simulation of...
Monte Carlo model of diagnostic X-ray dosimetry
International Nuclear Information System (INIS)
Khrutchinsky, Arkady; Kutsen, Semion; Gatskevich, George
2008-01-01
Full text: A Monte Carlo simulation of the absorbed dose distribution in a patient's tissues is often used in dosimetry assessments of X-ray examinations. The results of such simulations in Belarus are presented in the report, based on an anthropomorphic tissue-equivalent Rando-like physical phantom. The phantom corresponds to an adult 173 cm tall weighing 73 kg and consists of a torso and a head made of tissue-equivalent plastics which model soft (muscular), bone, and lung tissues. It consists of 39 layers (each 25 mm thick), including 10 head and neck layers, 16 chest layers and 13 pelvis layers. A tomographic model of the phantom has been developed from its CT-scan images with a voxel size of 0.88 x 0.88 x 4 mm³. The necessary pixelization was carried out with a Mathematics-based in-house program so that the phantom could be used in the radiation transport code MCNP-4b. A final voxel size of 14.2 x 14.2 x 8 mm³ was used to keep the computing time of the absorbed dose calculations in tissues and organs reasonable for the various diagnostic X-ray examinations. MCNP point detectors allocated through the body slices obtained from the pixelization were used to calculate the absorbed dose. X-ray spectra generated by the empirical TASMIP model were verified on the X-ray units MEVASIM and SIREGRAPH CF. Absorbed dose distributions in the phantom volume were determined by the corresponding Monte Carlo simulations with a set of point detectors. Doses in 22 standard organs of the adult phantom were computed from the absorbed dose distributions by another Mathematics-based in-house program for various standard X-ray examinations. The results of the Monte Carlo simulations were compared with direct measurements of the absorbed dose in the phantom on the X-ray unit SIREGRAPH CF with the calibrated thermo-luminescent dosimeter DTU-01. The measurements were carried out in specified locations of different layers in the heart, lungs, liver, pancreas, and stomach at high voltage of
Monte Carlo simulation for dual head gamma camera
International Nuclear Information System (INIS)
Osman, Yousif Bashir Soliman
2015-12-01
Monte Carlo (MC) simulation techniques are widely used in medical physics applications. In nuclear medicine, MC has been used to design new medical imaging devices such as positron emission tomography (PET), gamma cameras and single photon emission computed tomography (SPECT). It can also be used to study the factors affecting image quality and internal dosimetry. GATE is one of the Monte Carlo codes with a number of advantages for the simulation of SPECT and PET. Access to the machines used in clinics is limited because of their workload, which makes it hard to evaluate some factors affecting machine performance that must be evaluated routinely; this also complicates scientific research and the training of students, so an MC model can be an optimal solution to the problem. The aim of this study was to use the GATE Monte Carlo code to model the Nucline Spirit Mediso dual head gamma camera hosted in the Radiation and Isotopes Centre of Khartoum, which is equipped with low energy general purpose (LEGP) collimators. The model was used to evaluate spatial resolution and sensitivity, which are important factors affecting image quality, and to demonstrate the validity of GATE by comparing experimental results with simulation results on spatial resolution. The GATE model of the Nucline Spirit Mediso dual head gamma camera was developed by applying the manufacturer specifications, and the simulation was run. In the evaluation of spatial resolution, the FWHM was calculated from the image profile of a line source of the Tc-99m gamma emitter (140 keV) at distances of 5, 10, 15, 20, 22, 27, 32 and 37 cm from the modelled camera head; for these distances the spatial resolution was found to be 5.76, 7.73, 10.7, 13.8, 14.01, 16.91, 19.75 and 21.9 mm, respectively. These results show a linear degradation of spatial resolution with increasing distance between the object (line source) and the collimator. The FWHM calculated at 10 cm was compared with experimental results. The
Investigating the impossible: Monte Carlo simulations
International Nuclear Information System (INIS)
Kramer, Gary H.; Crowley, Paul; Burns, Linda C.
2000-01-01
Designing and testing new equipment can be an expensive and time-consuming process, or the desired performance characteristics may preclude its construction due to technological shortcomings. Cost may also prevent equipment from being purchased so that other scenarios can be tested. An alternative is to use Monte Carlo simulations to make the investigations. This presentation exemplifies how Monte Carlo code calculations can be used to fill the gap. An example is given for the investigation of two sizes of germanium detector (70 mm and 80 mm diameter) at four different crystal thicknesses (15, 20, 25, and 30 mm), with predictions of how the size affects the counting efficiency and the Minimum Detectable Activity (MDA). The Monte Carlo simulations have shown that detector efficiencies can be adequately modelled using photon transport if the data are used to investigate trends. The investigation of the effect of detector thickness on the counting efficiency has shown that, for a fixed-diameter detector of either 70 mm or 80 mm, the thickness is unimportant up to 60 keV. At higher photon energies, the counting efficiency begins to decrease as the thickness decreases, as expected. The simulations predict that the MDA of either the 70 mm or 80 mm diameter detectors does not differ by more than a factor of 1.15 at 17 keV or 1.2 at 60 keV when comparing detectors of equivalent thicknesses. The MDA is slightly increased at 17 keV, and rises by about 52% at 660 keV, when the thickness is decreased from 30 mm to 15 mm. One could conclude from this information that the extra cost associated with the larger area Ge detectors may not be justified for the slight improvement predicted in the MDA. (author)
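For readers who want to reproduce the flavour of such MDA comparisons, the sketch below uses the widely quoted Currie expression for the detection limit (an assumption made purely for illustration; the presentation does not state which formula was used) with hypothetical background, yield and counting-time values, and shows how a ~15% efficiency difference propagates into the MDA.

```python
import math

def mda_currie(background_counts, efficiency, emission_yield, live_time_s):
    """Minimum Detectable Activity (Bq) from the Currie detection limit
    L_D = 2.71 + 4.65*sqrt(B), divided by (efficiency * emission yield * counting time)."""
    l_d = 2.71 + 4.65 * math.sqrt(background_counts)
    return l_d / (efficiency * emission_yield * live_time_s)

# Hypothetical values: 400 background counts, the 59.5 keV line of Am-241 (yield ~0.36), 1000 s.
for eff in (0.040, 0.046):            # a ~15% difference in counting efficiency
    print(f"efficiency {eff:.3f} -> MDA {mda_currie(400, eff, 0.36, 1000):.2f} Bq")
```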
Monte Carlo eigenfunction strategies and uncertainties
International Nuclear Information System (INIS)
Gast, R.C.; Candelore, N.R.
1974-01-01
Comparisons of convergence rates for several possible eigenfunction source strategies led to the selection of the 'straight' analog of the analytic power method as the source strategy for Monte Carlo eigenfunction calculations. To ensure a fair game strategy, the number of histories per iteration increases with increasing iteration number. The estimate of eigenfunction uncertainty is obtained from a modification of a proposal by D. B. MacMillan and involves only estimates of the usual purely statistical component of uncertainty and a serial correlation coefficient of lag one. 14 references. (U.S.)
Atomistic Monte Carlo simulation of lipid membranes
DEFF Research Database (Denmark)
Wüstner, Daniel; Sklenar, Heinz
2014-01-01
Biological membranes are complex assemblies of many different molecules, the analysis of which demands a variety of experimental and computational approaches. In this article, we explain challenges and advantages of atomistic Monte Carlo (MC) simulation of lipid membranes. We provide an introduction...... into the various move sets that are implemented in current MC methods for efficient conformational sampling of lipids and other molecules. In the second part, we demonstrate for a concrete example, how an atomistic local-move set can be implemented for MC simulations of phospholipid monomers and bilayer patches...
Monte Carlo method in radiation transport problems
International Nuclear Information System (INIS)
Dejonghe, G.; Nimal, J.C.; Vergnaud, T.
1986-11-01
In neutral radiation transport problems (neutrons, photons), two quantities are important: the flux in phase space and the density of particles. Solving the problem with the Monte Carlo method involves, among other things, building a statistical process (called the play) and assigning a numerical value to a variable x (this assignment is called the score). Sampling techniques are presented, the necessity of biasing the play is demonstrated, and a biased simulation is carried out. Finally, current developments (rewriting of programs, for instance) are presented; two of the motivations are the advent of vector computing and photon and neutron transport in void media [fr]
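The play/score vocabulary is easy to make concrete. The sketch below is a minimal analog (unbiased) simulation under assumptions chosen purely for illustration: a mono-directional source hitting a homogeneous slab with isotropic scattering and absorption. Each history is a play, and the score is 1 if the particle leaks through the far face.

```python
import math
import random

def slab_transmission(sigma_t, sigma_s, thickness, n_histories=100_000, seed=2):
    """Analog Monte Carlo: the 'play' samples free flights and collisions,
    the 'score' is 1 for each particle leaking through the back face of the slab."""
    random.seed(seed)
    transmitted = 0
    for _ in range(n_histories):
        x, mu = 0.0, 1.0                                          # depth and direction cosine
        while True:
            x += -math.log(1.0 - random.random()) / sigma_t * mu  # sampled free flight
            if x >= thickness:
                transmitted += 1                                  # score: leakage through the back
                break
            if x < 0.0:
                break                                             # leaked back out of the front face
            if random.random() < sigma_s / sigma_t:
                mu = 2.0 * random.random() - 1.0                  # isotropic scattering
            else:
                break                                             # absorbed
    return transmitted / n_histories

print("transmitted fraction:", slab_transmission(sigma_t=1.0, sigma_s=0.5, thickness=3.0))
```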
MBR Monte Carlo Simulation in PYTHIA8
Ciesielski, R.
We present the MBR (Minimum Bias Rockefeller) Monte Carlo simulation of (anti)proton-proton interactions and its implementation in the PYTHIA8 event generator. We discuss the total, elastic, and total-inelastic cross sections, and three contributions from diffraction dissociation processes that contribute to the latter: single diffraction, double diffraction, and central diffraction or double-Pomeron exchange. The event generation follows a renormalized-Regge-theory model, successfully tested using CDF data. Based on the MBR-enhanced PYTHIA8 simulation, we present cross-section predictions for the LHC and beyond, up to collision energies of 50 TeV.
Markov chains analytic and Monte Carlo computations
Graham, Carl
2014-01-01
Markov Chains: Analytic and Monte Carlo Computations introduces the main notions related to Markov chains and provides explanations on how to characterize, simulate, and recognize them. Starting with basic notions, this book leads progressively to advanced and recent topics in the field, allowing the reader to master the main aspects of the classical theory. This book also features: numerous exercises with solutions as well as extended case studies; a detailed and rigorous presentation of Markov chains with discrete time and state space; an appendix presenting probabilistic notions that are nec
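As a small companion to the topics listed above (an illustrative sketch, not material from the book), the following code simulates a three-state discrete-time Markov chain from its transition matrix and checks the empirical occupation frequencies against the stationary distribution obtained as the left eigenvector for eigenvalue 1.

```python
import numpy as np

P = np.array([[0.9, 0.1, 0.0],      # row-stochastic transition matrix of a 3-state chain
              [0.2, 0.7, 0.1],
              [0.0, 0.3, 0.7]])

def occupation_frequencies(P, steps, state=0, seed=3):
    """Simulate the chain and return the fraction of time spent in each state."""
    rng = np.random.default_rng(seed)
    counts = np.zeros(len(P))
    for _ in range(steps):
        state = rng.choice(len(P), p=P[state])
        counts[state] += 1
    return counts / steps

# Stationary distribution: normalized left eigenvector of P for eigenvalue 1.
w, v = np.linalg.eig(P.T)
pi = np.real(v[:, np.argmin(np.abs(w - 1.0))])
pi /= pi.sum()

print("empirical :", occupation_frequencies(P, 100_000))
print("stationary:", pi)
```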
Score Bounded Monte-Carlo Tree Search
Cazenave, Tristan; Saffidine, Abdallah
Monte-Carlo Tree Search (MCTS) is a successful algorithm used in many state-of-the-art game engines. We propose to improve an MCTS solver when a game has more than two outcomes, for example games that can end in draw positions. In this case, taking into account bounds on the possible scores of a node when selecting the nodes to explore significantly improves an MCTS solver. We apply our algorithm to solving Seki in the game of Go and to Connect Four.
IN MEMORIAM CARLOS RESTREPO. UN VERDADERO MAESTRO
Pelayo Correa
2009-01-01
Carlos Restrepo was the first professor of Pathology and an illustrious member of the group of pioneers who founded the Faculty of Medicine of the Universidad del Valle. These pioneers converged on Cali in the 1950s, possessed of a renewing and creative spirit that undertook, with great success, the task of changing the academic culture of the Valle del Cauca. They found a peaceful society that enjoyed the generosity of its surroundings, with no desire to break with centuries-old traditions ...
Monte Carlo study of the multiquark systems
International Nuclear Information System (INIS)
Kerbikov, B.O.; Polikarpov, M.I.; Zamolodchikov, A.B.
1986-01-01
Random walks have been used to calculate the energies of the ground states in systems of N=3, 6, 9, 12 quarks. Multiquark states with N>3 are unstable with respect to spontaneous dissociation into color singlet hadrons. A modified Green's function Monte Carlo algorithm, which proved to be simpler and more accurate than the conventional few-body methods, has been employed. In contrast to other techniques, the same equations are used for any number of particles, while the computer time increases only linearly with the number of particles
by means of FLUKA Monte Carlo method
Directory of Open Access Journals (Sweden)
Ermis Elif Ebru
2015-01-01
Calculations of gamma-ray mass attenuation coefficients of various detector materials (crystals) were carried out by means of the FLUKA Monte Carlo (MC) method at different gamma-ray energies. NaI, PVT, GSO, GaAs and CdWO4 detector materials were chosen for the calculations. The calculated coefficients were also compared with the National Institute of Standards and Technology (NIST) values. The results obtained with this method were in close agreement with the NIST values. It was concluded from the study that the FLUKA MC method can be an alternative way to calculate the gamma-ray mass attenuation coefficients of detector materials.
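Once a mass attenuation coefficient is known (from FLUKA, or from the NIST tables mentioned above), it enters the usual narrow-beam exponential attenuation law. A short worked example follows, with an illustrative coefficient value assumed for NaI rather than one taken from the paper.

```python
import math

def transmitted_fraction(mu_over_rho_cm2_g, density_g_cm3, thickness_cm):
    """Narrow-beam attenuation: I/I0 = exp(-(mu/rho) * rho * x)."""
    return math.exp(-mu_over_rho_cm2_g * density_g_cm3 * thickness_cm)

# Assumed illustrative values for a NaI crystal near 662 keV (approximate, not from the paper):
mu_over_rho = 0.077      # cm^2/g
rho = 3.67               # g/cm^3
for x_cm in (1.0, 2.54, 5.0):
    frac = transmitted_fraction(mu_over_rho, rho, x_cm)
    print(f"{x_cm:4.2f} cm of NaI -> transmitted fraction {frac:.3f}")
```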
Pseudo-extended Markov chain Monte Carlo
Nemeth, Christopher; Lindsten, Fredrik; Filippone, Maurizio; Hensman, James
2017-01-01
Sampling from the posterior distribution using Markov chain Monte Carlo (MCMC) methods can require a very large number of iterations to fully explore the correct posterior. This is often the case when the posterior of interest is multi-modal, as the MCMC sampler can become trapped in a local mode for a large number of iterations. In this paper, we introduce the pseudo-extended MCMC method as an approach for improving the mixing of the MCMC sampler in complex posterior distributions. The pseu...
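The mixing problem described above is easy to reproduce with the plain baseline sampler (the sketch below is that baseline random-walk Metropolis algorithm on an assumed bimodal toy target, not the pseudo-extended method itself): started in one of two well-separated Gaussian modes, the chain visits the other mode far less often than the correct 50% of the time.

```python
import numpy as np

def log_target(x):
    """Log-density (up to a constant) of an equal mixture of N(-4, 1) and N(4, 1)."""
    return np.logaddexp(-0.5 * (x + 4.0) ** 2, -0.5 * (x - 4.0) ** 2)

def random_walk_metropolis(n, step=1.0, x0=-4.0, seed=4):
    rng = np.random.default_rng(seed)
    chain = np.empty(n)
    x, lp = x0, log_target(x0)
    for i in range(n):
        proposal = x + step * rng.normal()
        lp_prop = log_target(proposal)
        if np.log(rng.random()) < lp_prop - lp:     # Metropolis accept/reject
            x, lp = proposal, lp_prop
        chain[i] = x
    return chain

chain = random_walk_metropolis(50_000)
print("fraction of samples in the right-hand mode:", np.mean(chain > 0.0))
```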
Diffusion quantum Monte Carlo for molecules
International Nuclear Information System (INIS)
Lester, W.A. Jr.
1986-07-01
A quantum mechanical Monte Carlo method has been used for the treatment of molecular problems. The imaginary-time Schroedinger equation written with a shift in zero energy [E_T - V(R)] can be interpreted as a generalized diffusion equation with a position-dependent rate or branching term. Since diffusion is the continuum limit of a random walk, one may simulate the Schroedinger equation with a function psi (note, not psi²) as a density of 'walks'. The walks undergo an exponential birth and death as given by the rate term. 16 refs., 2 tabs
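A toy version of this walk-with-branching picture, assumed here for the 1D harmonic oscillator purely as an illustration (it is not the molecular calculations of the report, and no importance sampling is used): walkers diffuse, are replicated or killed according to the local rate term, and the reference energy is steered to keep the population stable, converging toward the ground-state energy of 0.5 in oscillator units.

```python
import numpy as np

def dmc_harmonic(n_walkers=2000, n_steps=2000, dt=0.01, seed=5):
    """Crude diffusion Monte Carlo for V(x) = x^2 / 2 without importance sampling."""
    rng = np.random.default_rng(seed)
    x = rng.normal(size=n_walkers)                              # initial walker positions
    e_ref, target = 0.5, n_walkers
    energies = []
    for step in range(n_steps):
        x = x + rng.normal(scale=np.sqrt(dt), size=x.size)      # diffusion step
        weights = np.exp(-dt * (0.5 * x**2 - e_ref))            # birth/death rate term
        copies = (weights + rng.random(x.size)).astype(int)     # stochastic rounding
        x = np.repeat(x, copies)                                # branch (or kill) walkers
        e_ref += 0.1 * np.log(target / max(len(x), 1))          # population control
        if step > n_steps // 2:
            energies.append(e_ref)                              # accumulate after equilibration
    return float(np.mean(energies))

print("ground-state energy estimate:", dmc_harmonic())   # exact value is 0.5
```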
Discrete diffusion Monte Carlo for frequency-dependent radiative transfer
Energy Technology Data Exchange (ETDEWEB)
Densmore, Jeffrey D [Los Alamos National Laboratory]; Thompson, Kelly G [Los Alamos National Laboratory]; Urbatsch, Todd J [Los Alamos National Laboratory]
2010-11-17
Discrete Diffusion Monte Carlo (DDMC) is a technique for increasing the efficiency of Implicit Monte Carlo radiative-transfer simulations. In this paper, we develop an extension of DDMC for frequency-dependent radiative transfer. We base our new DDMC method on a frequency-integrated diffusion equation for frequencies below a specified threshold. Above this threshold we employ standard Monte Carlo. With a frequency-dependent test problem, we confirm the increased efficiency of our new DDMC technique.
Monte Carlo criticality analysis for dissolvers with neutron poison
International Nuclear Information System (INIS)
Yu, Deshun; Dong, Xiufang; Pu, Fuxiang.
1987-01-01
A criticality analysis for dissolvers with neutron poison is given on the basis of the Monte Carlo method. In Monte Carlo calculations of thermal neutron group parameters for fuel pieces, the neutron transport length is determined in terms of the maximum cross-section approach. A set of related effective multiplication factors (k_eff) is calculated by the Monte Carlo method for the three cases. The related numerical results are quite useful for the design and operation of this kind of dissolver in criticality safety analysis. (author)
Monte Carlo Based Framework to Support HAZOP Study
DEFF Research Database (Denmark)
Danko, Matej; Frutiger, Jerome; Jelemenský, Ľudovít
2017-01-01
This study combines Monte Carlo based process simulation features with classical hazard identification techniques to investigate the consequences of deviations from normal operating conditions and to examine process safety. A Monte Carlo based method has been used to sample and evaluate different...... deviations in process parameters simultaneously, thereby bringing an improvement to the Hazard and Operability study (HAZOP), which normally considers only one deviation in process parameters at a time. Furthermore, Monte Carlo filtering was then used to identify operability and hazard issues including...
Validation of variance reduction techniques in Mediso (SPIRIT DH-V) SPECT system by Monte Carlo
International Nuclear Information System (INIS)
Rodriguez Marrero, J. P.; Diaz Garcia, A.; Gomez Facenda, A.
2015-01-01
Monte Carlo simulation of nuclear medicine imaging systems is a widely used method for reproducing their operation as in a real clinical environment. There are several Single Photon Emission Computed Tomography (SPECT) systems in Cuba, so it is clearly necessary to introduce a reliable and fast simulation platform in order to obtain consistent image data that reproduce the original measurement conditions. To fulfil these requirements the Monte Carlo platform GAMOS (Geant4-based Architecture for Medicine-Oriented Simulations) has been used. Due to the sheer size and complex configuration of the parallel-hole collimators in real clinical SPECT systems, Monte Carlo simulation usually consumes excessive time and computing resources. The main goal of the present work is to optimize the efficiency of the calculation by means of new GAMOS functionality. Two GAMOS variance reduction techniques were developed and validated to speed up the calculations; these procedures focus and limit the transport of gamma quanta inside the collimator. The obtained results were assessed experimentally on the Mediso (SPIRIT DH-V) SPECT system. The main quality control parameters, such as sensitivity and spatial resolution, were determined. Differences of 4.6% in sensitivity and 8.7% in spatial resolution were found with respect to the manufacturer values. Simulation time was decreased by a factor of up to 650. Using these techniques it was possible to perform several studies in almost 8 hours each. (Author)
Monte Carlo simulation for the design of industrial gamma-ray transmission tomography
International Nuclear Information System (INIS)
Kim, Jongbum; Jung, Sunghee; Moon, Jinho; Kwon, Taekyong; Cho, Gyuseong
2011-01-01
A Monte Carlo simulation and an experiment were carried out for a large-scale industrial gamma-ray tomographic scanning geometry. The tomographic system has a moving source with 16 stationary detectors, a geometry that is advantageous for the diagnosis of a large-scale industrial plant. The simulation was carried out for the phantom with 32 views, 16 detectors, and different energy bins, and the simulation data were processed for image reconstruction. Image reconstruction was performed with a Diagonally-Scaled Gradient-Ascent algorithm for the simulation data. Experiments were conducted in a 78 cm diameter column filled with polypropylene grains. Sixteen 0.5-inch-thick, 1-inch-long NaI(Tl) cylindrical detectors and a 20 mCi ¹³⁷Cs radioactive source were used. The experimental results were compared to the simulation data and were found to be similar to the Monte Carlo simulation results. This shows that Monte Carlo simulation is useful for predicting the result of the industrial gamma tomographic scan method, and it can also provide a solution for designing the industrial gamma tomography system and preparing the field experiment. (author)
Strategies for CT tissue segmentation for Monte Carlo calculations in nuclear medicine dosimetry.
Braad, P E N; Andersen, T; Hansen, S B; Høilund-Carlsen, P F
2016-12-01
CT images are used for patient-specific Monte Carlo treatment planning in radionuclide therapy. The authors investigated the impact of tissue classification, CT image segmentation, and CT errors on Monte Carlo calculated absorbed dose estimates in nuclear medicine. CT errors as a function of patient size, CT reconstruction, and tube current modulation methods were assessed in a phantom experiment on a clinical CT system. The impact of tissue segmentation methods and CT number variations on EGSnrc Monte Carlo calculated absorbed dose distributions was assessed for ⁹⁹ᵐTc and ¹³¹I in the ICRP/ICRU male phantom and in a patient PET/CT-scanned with ¹²⁴I prior to radioiodine therapy. Despite the CT number variations, segmentation by a 13-tissue CT conversion ramp, calibrated by a stoichiometric method, resulted in low (<4%) dose errors in selected organs for both isotopes. A calibrated, CT-scanner-specific conversion ramp is required for accurate patient-specific dosimetry in nuclear medicine. Accurate dosimetry was obtained with a 13-tissue ramp that included five different bone types.
Energy Technology Data Exchange (ETDEWEB)
Rodrigues, Bruno L.; Tomal, Alessandra [Universidade Estadual de Campinas (UNICAMP), Campinas, SP (Brazil). Instituto de Fisica Gleb Wataghin
2016-07-01
Mammography is the main tool for breast cancer diagnosis, and it is based on the use of X-rays to obtain images. However, the glandular tissue present within the breast is highly sensitive to ionizing radiation, and therefore requires strict quality control in order to minimize the absorbed dose. The quantification of the absorbed dose in the breast tissue can be done by using Monte Carlo simulation, which allows a detailed study of the deposition of energy in different regions of the breast. Besides, the results obtained from the simulation can be associated with experimental data and provide values of dose interest, such as the dose deposited in glandular tissue. (author)
Monte Carlo simulations for instrumentation at SINQ
International Nuclear Information System (INIS)
Filges, U.; Ronnow, H.M.; Zsigmond, G.
2006-01-01
The Paul Scherrer Institut (PSI) operates the spallation source SINQ, equipped with 11 different neutron scattering instruments. Besides the optimization of the existing instruments, extensions with new instruments and devices are continuously carried out at PSI. Different Monte Carlo packages are used for design and performance studies. Presently two major projects are in an advanced stage of planning: the new thermal neutron triple-axis spectrometer Enhanced Intensity and Greater Energy Range (EIGER) and the ultra-cold neutron source (UCN-PSI). The EIGER instrument design is focused on an optimal signal-to-background ratio. A very important design task was to realize a monochromator shielding which combines the best shielding characteristics, low background production and high instrument functionality; the Monte Carlo package MCNPX was used to find the best choice. Due to the sharp energy distribution of ultra-cold neutrons (UCN), which can be Doppler-shifted towards cold neutron energies, a UCN phase space transformation (PST) device could produce highly monochromatic cold and very cold neutrons (VCN). The UCN-PST instrumentation project running at PSI is very timely, since a new-generation superthermal spallation source of UCN with a UCN density of 3000-4000 n cm⁻³ is under construction at PSI. Detailed numerical simulations have been carried out to optimize the UCN density and flux. Recent results on numerical simulations of a UCN-PST-based source of highly monochromatic cold neutrons and VCN are presented
Multilevel Monte Carlo simulation of Coulomb collisions
Energy Technology Data Exchange (ETDEWEB)
Rosin, M.S., E-mail: msr35@math.ucla.edu [Mathematics Department, University of California at Los Angeles, Los Angeles, CA 90036 (United States); Department of Mathematics and Science, Pratt Institute, Brooklyn, NY 11205 (United States); Ricketson, L.F. [Mathematics Department, University of California at Los Angeles, Los Angeles, CA 90036 (United States); Dimits, A.M. [Lawrence Livermore National Laboratory, L-637, P.O. Box 808, Livermore, CA 94511-0808 (United States); Caflisch, R.E. [Mathematics Department, University of California at Los Angeles, Los Angeles, CA 90036 (United States); Institute for Pure and Applied Mathematics, University of California at Los Angeles, Los Angeles, CA 90095 (United States); Cohen, B.I. [Lawrence Livermore National Laboratory, L-637, P.O. Box 808, Livermore, CA 94511-0808 (United States)
2014-10-01
We present a new, for plasma physics, highly efficient multilevel Monte Carlo numerical method for simulating Coulomb collisions. The method separates and optimally minimizes the finite-timestep and finite-sampling errors inherent in the Langevin representation of the Landau–Fokker–Planck equation. It does so by combining multiple solutions to the underlying equations with varying numbers of timesteps. For a desired level of accuracy ε, the computational cost of the method is O(ε⁻²) or O(ε⁻²(ln ε)²), depending on the underlying discretization, Milstein or Euler–Maruyama respectively. This is to be contrasted with a cost of O(ε⁻³) for direct simulation Monte Carlo or binary collision methods. We successfully demonstrate the method with a classic beam diffusion test case in 2D, making use of the Lévy area approximation for the correlated Milstein cross terms, and generating a computational saving of a factor of 100 for ε = 10⁻⁵. We discuss the importance of the method for problems in which collisions constitute the computational rate limiting step, and its limitations.
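The multilevel idea can be shown in miniature with a two-level estimator for a toy SDE (geometric Brownian motion with assumed parameters, discretized with Euler–Maruyama; this stands in for, and is much simpler than, the Langevin dynamics treated in the paper). The key point is that the fine and coarse paths of the correction term share the same Brownian increments, so the correction has small variance.

```python
import numpy as np

mu, sigma, x0, T = 0.05, 0.2, 1.0, 1.0   # assumed GBM parameters: dX = mu*X dt + sigma*X dW

def euler_endpoint(n, steps, dW):
    """Euler-Maruyama endpoint X_T for n paths driven by the increments dW (shape n x steps)."""
    dt = T / steps
    x = np.full(n, x0)
    for k in range(steps):
        x = x + mu * x * dt + sigma * x * dW[:, k]
    return x

rng = np.random.default_rng(6)
n = 200_000

# Level 0: independent coarse paths with 4 timesteps.
dW0 = rng.normal(scale=np.sqrt(T / 4), size=(n, 4))
level0 = euler_endpoint(n, 4, dW0).mean()

# Level 1: fine (8 steps) minus coarse (4 steps) paths driven by the SAME Brownian increments.
dWf = rng.normal(scale=np.sqrt(T / 8), size=(n, 8))
dWc = dWf[:, 0::2] + dWf[:, 1::2]        # pairwise-summed increments for the coarse path
correction = (euler_endpoint(n, 8, dWf) - euler_endpoint(n, 4, dWc)).mean()

print("two-level estimate of E[X_T]:", level0 + correction)
print("exact value                 :", x0 * np.exp(mu * T))
```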
Multi-Index Monte Carlo (MIMC)
Haji Ali, Abdul Lateef
2016-01-06
We propose and analyze a novel Multi-Index Monte Carlo (MIMC) method for weak approximation of stochastic models that are described in terms of differential equations either driven by random measures or with random coefficients. The MIMC method is both a stochastic version of the combination technique introduced by Zenger, Griebel and collaborators and an extension of the Multilevel Monte Carlo (MLMC) method first described by Heinrich and Giles. Inspired by Giles's seminal work, instead of using first-order differences as in MLMC, we use in MIMC high-order mixed differences to reduce the variance of the hierarchical differences dramatically. Under standard assumptions on the convergence rates of the weak error, variance and work per sample, the optimal index set turns out to be of Total Degree (TD) type. When using such sets, MIMC yields new and improved complexity results, which are natural generalizations of Giles's MLMC analysis, and which increase the domain of problem parameters for which we achieve the optimal convergence, O(TOL⁻²).
Multi-Index Monte Carlo (MIMC)
Haji Ali, Abdul Lateef
2015-01-07
We propose and analyze a novel Multi-Index Monte Carlo (MIMC) method for weak approximation of stochastic models that are described in terms of differential equations either driven by random measures or with random coefficients. The MIMC method is both a stochastic version of the combination technique introduced by Zenger, Griebel and collaborators and an extension of the Multilevel Monte Carlo (MLMC) method first described by Heinrich and Giles. Inspired by Giles’s seminal work, instead of using first-order differences as in MLMC, we use in MIMC high-order mixed differences to reduce the variance of the hierarchical differences dramatically. Under standard assumptions on the convergence rates of the weak error, variance and work per sample, the optimal index set turns out to be of Total Degree (TD) type. When using such sets, MIMC yields new and improved complexity results, which are natural generalizations of Giles’s MLMC analysis, and which increase the domain of problem parameters for which we achieve the optimal convergence.
Self-test Monte Carlo method
International Nuclear Information System (INIS)
Ohta, Shigemi
1996-01-01
The Self-Test Monte Carlo (STMC) method resolves the main problems in using algebraic pseudo-random numbers for Monte Carlo (MC) calculations: that they can interfere with MC algorithms and lead to erroneous results, and that such an error often cannot be detected without a known exact solution. STMC is based on the good randomness of about 10¹⁰ bits available from physical noise or transcendental numbers like π = 3.14... Various bit modifiers are available to obtain more bits for applications that demand more than 10¹⁰ random bits, such as lattice quantum chromodynamics (QCD). These modifiers are designed so that a) each of them gives a bit sequence comparable in randomness to the original if used separately from the others, and b) their mutual interference when used jointly in a single MC calculation is adjustable. Intermediate data of the MC calculation itself are used to quantitatively test and adjust the mutual interference of the modifiers with respect to the MC algorithm. STMC is free of systematic error and gives reliable statistical errors. It can also be easily implemented on vector and parallel supercomputers. (author)
Algorithms for Monte Carlo calculations with fermions
International Nuclear Information System (INIS)
Weingarten, D.
1985-01-01
We describe a fermion Monte Carlo algorithm due to Petcher and the present author and another due to Fucito, Marinari, Parisi and Rebbi. For the first algorithm we estimate that the number of arithmetic operations required to evaluate a vacuum expectation value grows as N¹¹/m_q on an N⁴ lattice with fixed periodicity in physical units and renormalized quark mass m_q. For the second algorithm the rate of growth is estimated to be N⁸/m_q². Numerical experiments are presented comparing the two algorithms on a lattice of size 2⁴. With a hopping constant K of 0.15 and β of 4.0 we find the number of operations for the second algorithm is about 2.7 times larger than for the first and about 13 000 times larger than for corresponding Monte Carlo calculations with a pure gauge theory. An estimate is given for the number of operations required for more realistic calculations by each algorithm on a larger lattice. (orig.)
Quantum Monte Carlo for atoms and molecules
International Nuclear Information System (INIS)
Barnett, R.N.
1989-11-01
The diffusion quantum Monte Carlo with fixed nodes (QMC) approach has been employed in studying energy eigenstates for 1-4 electron systems. Previous work employing the diffusion QMC technique yielded energies of high quality for H₂, LiH, Li₂, and H₂O. Here, the range of calculations with this new approach has been extended to include additional first-row atoms and molecules. In addition, improvements in the previously computed fixed-node energies of LiH, Li₂, and H₂O have been obtained using more accurate trial functions. All computations were performed within, but are not limited to, the Born-Oppenheimer approximation. In our computations, the effects of variation of Monte Carlo parameters on the QMC solution of the Schroedinger equation were studied extensively. These parameters include the time step, renormalization time and nodal structure. These studies have been very useful in determining which choices of such parameters will yield accurate QMC energies most efficiently. Generally, very accurate energies (90-100% of the correlation energy is obtained) have been computed with single-determinant trial functions multiplied by simple correlation functions. Improvements in accuracy should be readily obtained using more complex trial functions
Wielandt acceleration for MCNP5 Monte Carlo eigenvalue calculations
International Nuclear Information System (INIS)
Brown, F.
2007-01-01
Monte Carlo criticality calculations use the power iteration method to determine the eigenvalue (k_eff) and eigenfunction (fission source distribution) of the fundamental mode. A recently proposed method for accelerating convergence of the Monte Carlo power iteration using Wielandt's method has been implemented in a test version of MCNP5. The method is shown to provide dramatic improvements in convergence rates and to greatly reduce the possibility of false convergence assessment. The method is effective and efficient, improving the Monte Carlo figure-of-merit for many problems. In addition, the method should eliminate most of the underprediction bias in confidence intervals for Monte Carlo criticality calculations. (authors)
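The effect of Wielandt's shift is easiest to see on a small deterministic analogue, with a matrix standing in for the fission operator (an illustration of the idea only, not the MCNP5 implementation): shifting with an estimate k_e slightly above k_eff shrinks the dominance ratio of the iteration, so the power iteration converges in far fewer cycles, and the true k_eff is recovered from the shifted eigenvalue.

```python
import numpy as np

def power_iteration(B, iters):
    """Dominant-eigenvalue estimate of B after a fixed number of power iterations."""
    v = np.ones(B.shape[0])
    lam = 0.0
    for _ in range(iters):
        w = B @ v
        lam = np.linalg.norm(w) / np.linalg.norm(v)
        v = w / np.linalg.norm(w)
    return lam

rng = np.random.default_rng(7)
Q = np.linalg.qr(rng.normal(size=(6, 6)))[0]
A = Q @ np.diag([1.00, 0.99, 0.5, 0.3, 0.2, 0.1]) @ Q.T   # k_eff = 1.00, dominance ratio 0.99

# Plain power iteration: converges only at the slow rate ~0.99 per cycle.
k_plain = power_iteration(A, iters=20)

# Wielandt shift with an eigenvalue estimate k_e slightly above k_eff.
k_e = 1.01
B = np.linalg.solve(np.eye(6) - A / k_e, A)               # B = (I - A/k_e)^(-1) A
lam = power_iteration(B, iters=20)
k_wielandt = 1.0 / (1.0 / lam + 1.0 / k_e)                # map the shifted eigenvalue back to k_eff

print(f"plain power iteration : {k_plain:.6f}")
print(f"Wielandt-accelerated  : {k_wielandt:.6f}")        # essentially converged to 1.000000
```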
Odd-flavor Simulations by the Hybrid Monte Carlo
Takaishi, Tetsuya; Takaishi, Tetsuya; De Forcrand, Philippe
2001-01-01
The standard hybrid Monte Carlo algorithm is known to simulate only even numbers of quark flavors in QCD. Simulations of odd-flavor QCD, however, can also be performed in the framework of the hybrid Monte Carlo algorithm when the inverse of the fermion matrix is approximated by a polynomial. In this exploratory study we perform three-flavor QCD simulations. We make a comparison of the hybrid Monte Carlo algorithm and the R-algorithm, which also simulates odd-flavor systems but has step-size errors. We find that results from our hybrid Monte Carlo algorithm are in agreement with those from the R-algorithm obtained at very small step size.
Quantum Monte Carlo Endstation for Petascale Computing
Energy Technology Data Exchange (ETDEWEB)
Lubos Mitas
2011-01-26
The NCSU research group has been focused on accomplishing the key goals of this initiative: establishing a new generation of quantum Monte Carlo (QMC) computational tools as part of the Endstation petaflop initiative for use at the DOE ORNL computational facilities and by the computational electronic structure community at large; carrying out high-accuracy quantum Monte Carlo demonstration projects applying these tools to forefront electronic structure problems in molecular and solid systems; expanding the impact of QMC methods and approaches; and explaining and enhancing the impact of these advanced computational approaches. In particular, we have developed the quantum Monte Carlo code QWalk (www.qwalk.org), which was significantly expanded and optimized using funds from this support and has at present become an actively used tool in the petascale regime by ORNL researchers and beyond. These developments have been built upon efforts undertaken by the PI's group and collaborators over the period of the last decade. The code was optimized and tested extensively on a number of parallel architectures including the petaflop ORNL Jaguar machine. We have developed and redesigned a number of code modules, such as the evaluation of wave functions and orbitals, calculations of pfaffians and the introduction of backflow coordinates, together with the overall organization of the code and random walker distribution over multicore architectures. We have addressed several bottlenecks, such as load balancing, and verified the efficiency and accuracy of the calculations with the other groups of the Endstation team. The QWalk package contains about 50,000 lines of high-quality object-oriented C++ and also includes interfaces to data files from other conventional electronic structure codes such as Gamess, Gaussian, Crystal and others. This grant supported the PI for one month during summers, a full-time postdoc and partially three graduate students over the period of the grant duration; it has resulted in 13
Clear-PEM system counting rates: a Monte Carlo study
Energy Technology Data Exchange (ETDEWEB)
Rodrigues, P [Laboratorio de Instrumentacao e Fisica Experimental de Particulas (LIP), Av. Elias Garcia 14-1 1000-149 Lisbon (Portugal); Trindade, A [Laboratorio de Instrumentacao e Fisica Experimental de Particulas (LIP), Av. Elias Garcia 14-1 1000-149 Lisbon (Portugal); Varela, J [Laboratorio de Instrumentacao e Fisica Experimental de Particulas (LIP), Av. Elias Garcia 14-1 1000-149 Lisbon (Portugal)
2007-01-15
Positron Emission Mammography (PEM) with ¹⁸F-Fluorodeoxyglucose (¹⁸F-FDG) is a functional imaging technique for breast cancer detection. The development of dedicated imaging systems with high sensitivity and spatial resolution is crucial for early breast cancer diagnosis and efficient therapy. Clear-PEM is a dual planar scanner designed for high-resolution breast cancer imaging, under development by the Portuguese PET Mammography consortium within the Crystal Clear Collaboration. It brings together a favorable combination of high-density scintillator crystals coupled to compact photodetectors, arranged in a double readout scheme capable of providing depth-of-interaction information. A Monte Carlo study of the Clear-PEM system counting rates is presented in this paper. Hypothetical breast exam scenarios were simulated to estimate the single event rates and the true and random coincidence rates. A realistic description of the patient and detector geometry, radiation environment, physics and instrumentation factors was adopted in this work. Special attention was given to the ¹⁸F-FDG accumulation in the patient torso organs which, for the Clear-PEM scanner, represents significant activity outside the field-of-view (FOV), contributing to an increase of singles, randoms and scattered coincidences and affecting the overall system performance. The potential benefits of patient shielding to minimize the influence of the out-of-field background were explored. The influence of the intrinsic natural activity of the LYSO:Ce crystals, due to the presence of the ¹⁷⁶Lu isotope, on the counting rate performance of the proposed scanner was also investigated.
International Nuclear Information System (INIS)
Chow, J
2015-01-01
Purpose: This study evaluated the efficiency of 4D lung radiation treatment planning using Monte Carlo simulation on the cloud. The EGSnrc Monte Carlo code was used for dose calculation on the 4D-CT image set. Methods: The 4D lung radiation treatment plan was created by the DOSCTP linked to the cloud, based on the Amazon Elastic Compute Cloud platform. Dose calculation was carried out by Monte Carlo simulation on the 4D-CT image set on the cloud, and the results were sent to the FFD4D image deformation program for dose reconstruction. The dependence of computing time for the treatment plan on the number of compute nodes was optimized with variations of the number of CT image sets in the breathing cycle and the dose reconstruction time of the FFD4D. Results: It is found that the dependence of computing time on the number of compute nodes was affected by the diminishing returns of the number of nodes used in the Monte Carlo simulation. Moreover, the performance of the 4D treatment planning could be optimized by using fewer than 10 compute nodes on the cloud. The effects of the number of image sets and the dose reconstruction time on the dependence of computing time on the number of nodes were not significant when more than 15 compute nodes were used in the Monte Carlo simulations. Conclusion: The issue of long computing time in the 4D treatment plan, which requires Monte Carlo dose calculations in all CT image sets in the breathing cycle, can be solved using cloud computing technology. It is concluded that the optimized number of compute nodes selected in the simulation should be between 5 and 15, as the dependence of computing time on the number of nodes is significant
Monte Carlo simulation of the ARGO
International Nuclear Information System (INIS)
Depaola, G.O.
1997-01-01
We use the GEANT Monte Carlo code to design an outline of the geometry and simulate the performance of the Argentine gamma-ray observer (ARGO), a telescope based on silicon strip detector technology. The γ-ray direction is determined by geometrical means and the angular resolution is calculated for small variations of the basic design. The results show that the angular resolution varies from a few degrees at low energies (~50 MeV) to approximately 0.2° at high energies (>500 MeV). We also made simulations using as incoming γ-rays the energy spectra of the quasars PKS 0208-512 and PKS 0528+134. Moreover, a method based on multiple scattering theory is also used to determine the incoming energy. We show that this method is applicable to energy spectra. (orig.)
San Carlos Apache Tribe - Energy Organizational Analysis
Energy Technology Data Exchange (ETDEWEB)
Rapp, James; Albert, Steve
2012-04-01
The San Carlos Apache Tribe (SCAT) was awarded $164,000 in late 2011 by the U.S. Department of Energy (U.S. DOE) Tribal Energy Program's "First Steps Toward Developing Renewable Energy and Energy Efficiency on Tribal Lands" Grant Program. This grant funded: the analysis and selection of preferred form(s) of tribal energy organization (this Energy Organization Analysis, hereinafter referred to as "EOA"); start-up staffing and other costs associated with the Phase 1 SCAT energy organization; an intern program; staff training; and tribal outreach and workshops regarding the new organization and SCAT energy programs and projects, including two annual tribal energy summits (2011 and 2012). This report documents the analysis and selection of preferred form(s) of a tribal energy organization.
CARLOS MARTÍ ARÍS: CABOS SUELTOS
Directory of Open Access Journals (Sweden)
Ángel Martínez García-Posada
2012-11-01
Waving in the wind of its own title, this autumnal book displays its diverse character and multiple directions: with the appearance of a classic compilation of presentations, lectures and articles, prompted in recent years by outside causes and elective affinities, this edition gathers comments, prefaces and notes from scattered pages by Professor Carlos Martí, and composes a silent order, a secret self-portrait, veiled behind the weave of a dense cartography of gentle but firm ties.
Methods for Monte Carlo simulations of biomacromolecules.
Vitalis, Andreas; Pappu, Rohit V
2009-01-01
The state of the art for Monte Carlo (MC) simulations of biomacromolecules is reviewed. Available methodologies for sampling conformational equilibria and associations of biomacromolecules in the canonical ensemble, given a continuum description of the solvent environment, are reviewed. Detailed sections are provided dealing with the choice of degrees of freedom, the efficiencies of MC algorithms and algorithmic peculiarities, as well as the optimization of simple move sets. The issue of introducing correlations into elementary MC moves, and the applicability of such methods to simulations of biomacromolecules, is discussed. A brief discussion of multicanonical methods and an overview of recent simulation work highlighting the potential of MC methods are also provided. It is argued that MC simulations, while underutilized by the biomacromolecular simulation community, hold promise for simulations of complex systems and phenomena that span multiple length scales, especially when used in conjunction with implicit solvation models or other coarse-graining strategies.
Variational Monte Carlo study of pentaquark states
Energy Technology Data Exchange (ETDEWEB)
Mark W. Paris
2005-07-01
Accurate numerical solution of the five-body Schrodinger equation is effected via variational Monte Carlo. The spectrum is assumed to exhibit a narrow resonance with strangeness S=+1. A fully antisymmetrized and pair-correlated five-quark wave function is obtained for the assumed non-relativistic Hamiltonian which has spin, isospin, and color dependent pair interactions and many-body confining terms which are fixed by the non-exotic spectra. Gauge field dynamics are modeled via flux tube exchange factors. The energy determined for the ground states with J=1/2 and negative (positive) parity is 2.22 GeV (2.50 GeV). A lower energy negative parity state is consistent with recent lattice results. The short-range structure of the state is analyzed via its diquark content.
Monte Carlo simulation of a CZT detector
International Nuclear Information System (INIS)
Chun, Sung Dae; Park, Se Hwan; Ha, Jang Ho; Kim, Han Soo; Cho, Yoon Ho; Kang, Sang Mook; Kim, Yong Kyun; Hong, Duk Geun
2008-01-01
The CZT detector is one of the most promising radiation detectors for hard X-ray and γ-ray measurement. The energy spectrum of a CZT detector has to be simulated to optimize the detector design. A CZT detector was fabricated with dimensions of 5 x 5 x 2 mm³. A Peltier cooler with a size of 40 x 40 mm² was installed below the fabricated CZT detector to reduce the operation temperature of the detector. Energy spectra were measured with the 59.5 keV γ-rays from ²⁴¹Am. A Monte Carlo code was developed to simulate the CZT energy spectrum measured with the planar-type CZT detector, and the result was compared with the measured one. The simulation was extended to a CZT detector with strip electrodes. (author)
Linear stories in Carlo Scarpa's architectural drawings
DEFF Research Database (Denmark)
Dayer, Carolina
2017-01-01
...an architect guides the viewer's imagination into another not-yet-real world that is projected much like the divinatory practices of reading palms or tarot cards. The magic-real field of facts and fictions coexisting in one realm can be understood as a confabulation. A confabulation brings together both fact...... and fiction through fārī, a fable, meaning 'to speak'. In the field of neurology, a mental patient's confabulation may be that he convinces himself he is in Venice, although he also admits that the town he is seeing through the window is Alexandria. He knows both places, he feels both places and, despite...... the contradiction, both places constitute his reality. The Venetian architect and storyteller par excellence, Carlo Scarpa, exercised the power of confabulations throughout his practice of drawing and building. While architectural historians have attempted to explain Scarpa's work as layers coming together, very little...
Monte Carlo and detector simulation in OOP
International Nuclear Information System (INIS)
Atwood, W.B.; Blankenbecler, R.; Kunz, P.; Burnett, T.; Storr, K.M.
1990-01-01
Object-Oriented Programming techniques are explored with an eye towards applications in High Energy Physics codes. Two prototype examples are given: MCOOP (a particle Monte Carlo generator) and GISMO (a detector simulation/analysis package). The OOP programmer does no explicit or detailed memory management nor other bookkeeping chores; hence, the writing, modification, and extension of the code is considerably simplified. Inheritance can be used to simplify the class definitions as well as the instance variables and action methods of each class; thus the work required to add new classes, parameters, or new methods is minimal. The software industry is moving rapidly to OOP since it has been proven to improve programmer productivity, and promises even more for the future by providing truly reusable software. The High Energy Physics community clearly needs to follow this trend
Geometric Monte Carlo and black Janus geometries
Energy Technology Data Exchange (ETDEWEB)
Bak, Dongsu, E-mail: dsbak@uos.ac.kr [Physics Department, University of Seoul, Seoul 02504 (Korea, Republic of); B.W. Lee Center for Fields, Gravity & Strings, Institute for Basic Sciences, Daejeon 34047 (Korea, Republic of); Kim, Chanju, E-mail: cjkim@ewha.ac.kr [Department of Physics, Ewha Womans University, Seoul 03760 (Korea, Republic of); Kim, Kyung Kiu, E-mail: kimkyungkiu@gmail.com [Department of Physics, Sejong University, Seoul 05006 (Korea, Republic of); Department of Physics, College of Science, Yonsei University, Seoul 03722 (Korea, Republic of); Min, Hyunsoo, E-mail: hsmin@uos.ac.kr [Physics Department, University of Seoul, Seoul 02504 (Korea, Republic of); Song, Jeong-Pil, E-mail: jeong_pil_song@brown.edu [Department of Chemistry, Brown University, Providence, RI 02912 (United States)
2017-04-10
We describe an application of the Monte Carlo method to the Janus deformation of the black brane background. We present numerical results for three- and five-dimensional black Janus geometries with planar and spherical interfaces. In particular, we argue that the 5D geometry with a spherical interface has an application in understanding the finite temperature bag-like QCD model via the AdS/CFT correspondence. The accuracy and convergence of the algorithm are evaluated with respect to the grid spacing. The systematic errors of the method are determined using an exact solution of 3D black Janus. This numerical approach for solving linear problems is unaffected by the initial guess of a trial solution and can handle an arbitrary geometry under various boundary conditions in the presence of source fields.
Morse Monte Carlo Radiation Transport Code System
Energy Technology Data Exchange (ETDEWEB)
Emmett, M.B.
1975-02-01
The report contains sections with descriptions of the MORSE and PICTURE codes, input descriptions, sample problems, derivations of the physical equations and explanations of the various error messages. The MORSE code is a multipurpose neutron and gamma-ray transport Monte Carlo code. Time dependence for both shielding and criticality problems is provided. General three-dimensional geometry may be used, with an albedo option available at any material surface. The PICTURE code provides aid in preparing correct input data for the combinatorial geometry package CG. It provides a printed view of arbitrary two-dimensional slices through the geometry. By inspecting these pictures one may determine whether the geometry specified by the input cards is indeed the desired geometry. 23 refs. (WRF)
Monte Carlo modeling and meteor showers
International Nuclear Information System (INIS)
Kulikova, N.V.
1987-01-01
Prediction of short-lived increases in the cosmic dust influx, the concentration in the lower thermosphere of atoms and ions of meteor origin, and the determination of the frequency of micrometeor impacts on spacecraft are all of scientific and practical interest, and all require adequate models of meteor showers at an early stage of their existence. A Monte Carlo model of meteor matter ejection from a parent body at any point of space was worked out by other researchers. This scheme is described. According to the scheme, the formation of ten well-known meteor streams was simulated and the possibility of genetic affinity of each of them with the most probable parent comet was analyzed. Some of the results are presented
[Chagas Carlos Justiniano Ribeiro (1879-1934)].
Pays, J F
2009-12-01
The story of the life of Carlos Chagas is closely associated with the discovery of American Human Trypanosomiasis, caused by Trypanosoma cruzi. Indeed, he worked on this for almost all of his life. Nowadays he is considered as a national hero, but, when he was alive, he was criticised more severely in his own country than elsewhere, often unjustly and motivated by jealousy, but sometimes with good reason. Cases of Chagas disease in non-endemic countries became such a concern that public health measures have had to be taken. In this article we give a short account of the scientific journey of this man, who can be said to occupy his very own place in the history of Tropical Medicine.
Angular biasing in implicit Monte-Carlo
International Nuclear Information System (INIS)
Zimmerman, G.B.
1994-01-01
Calculations of indirect drive Inertial Confinement Fusion target experiments require an integrated approach in which laser irradiation and radiation transport in the hohlraum are solved simultaneously with the symmetry, implosion and burn of the fuel capsule. The Implicit Monte Carlo method has proved to be a valuable tool for the two-dimensional radiation transport within the hohlraum, but the impact of statistical noise on the symmetric implosion of the small fuel capsule is difficult to overcome. We present an angular biasing technique in which an increased number of low-weight photons are directed at the imploding capsule. For typical parameters this reduces the required computer time for an integrated calculation by a factor of 10. An additional factor of 5 can also be achieved by directing even smaller-weight photons at the polar regions of the capsule, where small mass zones are most sensitive to statistical noise
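The essence of the technique can be sketched with a toy source-to-capsule problem (assumed geometry and numbers, not the production implementation): instead of emitting photons isotropically, a chosen fraction is forced into the small cone subtending the capsule, and each photon carries a weight equal to the analog probability of its emission region divided by the biased probability, so the tally stays unbiased while its variance drops sharply.

```python
import numpy as np

def capsule_tally(n, theta_c, bias_fraction, seed=8):
    """Estimate the fraction of isotropically emitted photons entering a cone of
    half-angle theta_c, using angular biasing with likelihood-ratio weights."""
    rng = np.random.default_rng(seed)
    p_cone = 0.5 * (1.0 - np.cos(theta_c))        # analog probability of emission into the cone

    in_cone = rng.random(n) < bias_fraction        # biased choice of the emission region
    weights = np.where(in_cone,
                       p_cone / bias_fraction,                  # analog pdf / biased pdf (in cone)
                       (1.0 - p_cone) / (1.0 - bias_fraction))  # analog pdf / biased pdf (outside)
    scores = np.where(in_cone, 1.0, 0.0) * weights
    return scores.mean(), scores.std(ddof=1) / np.sqrt(n)

theta_c = np.deg2rad(5.0)                          # capsule subtends a 5-degree half-angle
analog_fraction = 0.5 * (1.0 - np.cos(theta_c))
print("analog:", capsule_tally(100_000, theta_c, bias_fraction=analog_fraction))
print("biased:", capsule_tally(100_000, theta_c, bias_fraction=0.5))
```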
Monte Carlo simulations on SIMD computer architectures
International Nuclear Information System (INIS)
Burmester, C.P.; Gronsky, R.; Wille, L.T.
1992-01-01
In this paper algorithmic considerations regarding the implementation of various materials science applications of the Monte Carlo technique to single instruction multiple data (SIMD) computer architectures are presented. In particular, implementation of the Ising model with nearest, next nearest, and long range screened Coulomb interactions on the SIMD architecture MasPar MP-1 (DEC mpp-12000) series of massively parallel computers is demonstrated. Methods of code development which optimize processor array use and minimize inter-processor communication are presented including lattice partitioning and the use of processor array spanning tree structures for data reduction. Both geometric and algorithmic parallel approaches are utilized. Benchmarks in terms of Monte Carlo updates per second for the MasPar architecture are presented and compared to values reported in the literature from comparable studies on other architectures
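A minimal NumPy sketch of the kind of lattice-partitioned update the abstract alludes to, restricted to the nearest-neighbour Ising model (the next-nearest and screened-Coulomb variants are not shown); array-wide operations stand in for the per-processor-element updates of a SIMD machine. This is an editorial illustration, not the authors' MasPar code.

import numpy as np

def checkerboard_sweep(spins, beta, rng):
    """One Metropolis sweep of the 2D nearest-neighbour Ising model,
    updating the two checkerboard sub-lattices in turn so that no two
    simultaneously updated sites interact (the SIMD-friendly partition)."""
    ii, jj = np.indices(spins.shape)
    for parity in (0, 1):
        mask = (ii + jj) % 2 == parity
        # sum of the four nearest neighbours with periodic boundaries
        nbr = (np.roll(spins, 1, 0) + np.roll(spins, -1, 0) +
               np.roll(spins, 1, 1) + np.roll(spins, -1, 1))
        dE = 2.0 * spins * nbr                       # energy change if the spin flips (J = 1)
        accept = rng.random(spins.shape) < np.exp(-beta * dE)
        spins = np.where(mask & accept, -spins, spins)
    return spins

rng = np.random.default_rng(0)
spins = rng.choice([-1, 1], size=(64, 64))
for _ in range(200):
    spins = checkerboard_sweep(spins, beta=0.5, rng=rng)
print("magnetisation per site:", spins.mean())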
Monte Carlo modelling of TRIGA research reactor
International Nuclear Information System (INIS)
El Bakkari, B.; Nacir, B.; El Bardouni, T.; El Younoussi, C.; Merroun, O.; Htet, A.; Boulaich, Y.; Zoubair, M.; Boukhal, H.; Chakir, M.
2010-01-01
The Moroccan 2 MW TRIGA MARK II research reactor at Centre des Etudes Nucleaires de la Maamora (CENM) achieved initial criticality on May 2, 2007. The reactor is designed to effectively implement the various fields of basic nuclear research, manpower training, and production of radioisotopes for their use in agriculture, industry, and medicine. This study deals with the neutronic analysis of the 2-MW TRIGA MARK II research reactor at CENM and validation of the results by comparisons with the experimental, operational, and available final safety analysis report (FSAR) values. The study was prepared in collaboration between the Laboratory of Radiation and Nuclear Systems (ERSN-LMR) of the Faculty of Sciences of Tetuan (Morocco) and CENM. The 3-D continuous energy Monte Carlo code MCNP (version 5) was used to develop a versatile and accurate full model of the TRIGA core. The model represents in detail all components of the core with practically no physical approximation. Continuous energy cross-section data from recent nuclear data evaluations (ENDF/B-VI.8, ENDF/B-VII.0, JEFF-3.1, and JENDL-3.3) as well as S(α, β) thermal neutron scattering functions distributed with the MCNP code were used. The cross-section libraries were generated by using the NJOY99 system updated to its most recent patch file 'up259'. The consistency and accuracy of both the Monte Carlo simulation and the neutron transport physics were established by benchmarking the TRIGA experiments. Core excess reactivity, total and integral control rod worths as well as power peaking factors were used in the validation process. Results of the calculations are analysed and discussed.
Monte carlo analysis of multicolour LED light engine
DEFF Research Database (Denmark)
Chakrabarti, Maumita; Thorseth, Anders; Jepsen, Jørgen
2015-01-01
A new Monte Carlo simulation tool for analysing colour feedback systems is presented here and used to analyse the colour uncertainties and achievable stability in a multicolour dynamic LED system. The Monte Carlo analysis is based on an experimental investigation of a multicolour LED...
Projector Quantum Monte Carlo without minus-sign problem
Frick, M.; Raedt, H. De
Quantum Monte Carlo techniques often suffer from the so-called minus-sign problem. This paper explores a possibility to circumvent this fundamental problem by combining the Projector Quantum Monte Carlo method with the variational principle. Results are presented for the two-dimensional Hubbard
Multiple histogram method and static Monte Carlo sampling
Inda, M.A.; Frenkel, D.
2004-01-01
We describe an approach to use multiple-histogram methods in combination with static, biased Monte Carlo simulations. To illustrate this, we computed the force-extension curve of an athermal polymer from multiple histograms constructed in a series of static Rosenbluth Monte Carlo simulations. From
Monte Carlo methods for pricing ﬁnancial options
Indian Academy of Sciences (India)
Monte Carlo methods have increasingly become a popular computational tool to price complex financial options, especially when the underlying space of assets has a large dimensionality, as the performance of other numerical methods typically suffers from the 'curse of dimensionality'. However, even Monte-Carlo ...
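As a concrete, hedged illustration of the kind of computation referred to above (not taken from the article), the following sketch prices a European call under geometric Brownian motion by plain Monte Carlo; the strike, rate and volatility values are arbitrary.

import numpy as np

def mc_european_call(s0, strike, rate, sigma, maturity, n_paths, seed=0):
    """Plain Monte Carlo price of a European call under geometric Brownian motion."""
    rng = np.random.default_rng(seed)
    z = rng.standard_normal(n_paths)
    s_t = s0 * np.exp((rate - 0.5 * sigma**2) * maturity + sigma * np.sqrt(maturity) * z)
    payoff = np.maximum(s_t - strike, 0.0)
    disc = np.exp(-rate * maturity)
    price = disc * payoff.mean()
    stderr = disc * payoff.std(ddof=1) / np.sqrt(n_paths)   # one-sigma statistical error
    return price, stderr

print(mc_european_call(s0=100, strike=105, rate=0.03, sigma=0.2, maturity=1.0, n_paths=200_000))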
A MONTE CARLO COMPARISON OF PARAMETRIC AND ...
African Journals Online (AJOL)
kernel nonparametric method is proposed and developed for estimating low flow quantiles. Based on annual minimum low flow data and Monte Carlo Simulation Experiments, the proposed model is compared with ... Carlo simulation technique using the criteria of the descriptive ability and predictive ability of a model.
New Approaches and Applications for Monte Carlo Perturbation Theory
Energy Technology Data Exchange (ETDEWEB)
Aufiero, Manuele; Bidaud, Adrien; Kotlyar, Dan; Leppänen, Jaakko; Palmiotti, Giuseppe; Salvatores, Massimo; Sen, Sonat; Shwageraus, Eugene; Fratoni, Massimiliano
2017-02-01
This paper presents some of the recent and new advancements in the extension of Monte Carlo Perturbation Theory methodologies and applications. In particular, the discussed problems involve burnup calculation, perturbation calculation based on continuous energy functions, and Monte Carlo Perturbation Theory in loosely coupled systems.
Forecasting with nonlinear time series model: A Monte-Carlo ...
African Journals Online (AJOL)
In this paper, we propose a new method of forecasting with nonlinear time series model using Monte-Carlo Bootstrap method. This new method gives better result in terms of forecast root mean squared error (RMSE) when compared with the traditional Bootstrap method and Monte-Carlo method of forecasting using a ...
Exponential convergence on a continuous Monte Carlo transport problem
International Nuclear Information System (INIS)
Booth, T.E.
1997-01-01
For more than a decade, it has been known that exponential convergence on discrete transport problems was possible using adaptive Monte Carlo techniques. An adaptive Monte Carlo method that empirically produces exponential convergence on a simple continuous transport problem is described
A Monte Carlo approach to combating delayed completion of ...
African Journals Online (AJOL)
The objective of this paper is to unveil the relevance of Monte Carlo critical path analysis in resolving problem of delays in scheduled completion of development projects. Commencing with deterministic network scheduling, Monte Carlo critical path analysis was advanced by assigning probability distributions to task times.
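To make the idea concrete, here is an editorial sketch of Monte Carlo critical path analysis for a small, hypothetical four-task network; the triangular duration parameters and the deadline are invented and do not come from the paper.

import numpy as np

rng = np.random.default_rng(1)
n = 20_000  # number of Monte Carlo trials

# Hypothetical project: tasks A and B run in parallel, C needs both, D follows C.
# Task durations (days) drawn from triangular(best, most likely, worst) distributions.
dur_a = rng.triangular(4, 6, 12, n)
dur_b = rng.triangular(5, 8, 15, n)
dur_c = rng.triangular(3, 4, 9, n)
dur_d = rng.triangular(2, 3, 6, n)

completion = np.maximum(dur_a, dur_b) + dur_c + dur_d   # critical path length per trial

deadline = 22.0
print("mean completion (days):", completion.mean())
print("P(delay beyond deadline):", (completion > deadline).mean())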
Debating the Social Thinking of Carlos Nelson Coutinho
Directory of Open Access Journals (Sweden)
Bruno Bruziguessi
2017-10-01
Full Text Available BRAZ, Marcelo; RODRIGUES, Mavi (Org.). Cultura, democracia e socialismo: as idéias de Carlos Nelson Coutinho em debate. [Culture, democracy and socialism: The ideas of Carlos Nelson Coutinho in debate]. Rio de Janeiro: Mórula, 2016. 248 p.
Quantum Monte Carlo method for attractive Coulomb potentials
Kole, J.S.; Raedt, H. De
2001-01-01
Starting from an exact lower bound on the imaginary-time propagator, we present a path-integral quantum Monte Carlo method that can handle singular attractive potentials. We illustrate the basic ideas of this quantum Monte Carlo algorithm by simulating the ground state of hydrogen and helium.
Forest canopy BRDF simulation using Monte Carlo method
Huang, J.; Wu, B.; Zeng, Y.; Tian, Y.
2006-01-01
The Monte Carlo method is a stochastic statistical method which has been widely used to simulate the Bidirectional Reflectance Distribution Function (BRDF) of vegetation canopies in the field of visible remote sensing. The random interaction process between photons and the forest canopy was designed using the Monte Carlo method.
Crop canopy BRDF simulation and analysis using Monte Carlo method
Huang, J.; Wu, B.; Tian, Y.; Zeng, Y.
2006-01-01
The authors design the random interaction process between photons and the crop canopy. A Monte Carlo model has been developed to simulate the Bi-directional Reflectance Distribution Function (BRDF) of the crop canopy. Comparing the Monte Carlo model to the MCRM model, this paper analyzes the variations of different LAD and
Efficiency and accuracy of Monte Carlo (importance) sampling
Waarts, P.H.
2003-01-01
Monte Carlo Analysis is often regarded as the simplest and most accurate reliability method. Besides, it is the most transparent method. The only problem is the trade-off between accuracy and efficiency: Monte Carlo gets less efficient or less accurate when very low probabilities are to be computed
Nuclear data treatment for SAM-CE Monte Carlo calculations
International Nuclear Information System (INIS)
Lichtenstein, H.; Troubetzkoy, E.S.; Beer, M.
1980-01-01
The treatment of nuclear data by the SAM-CE Monte Carlo code system is presented. The retrieval of neutron, gamma production, and photon data from the ENDF/B files is described. Integral cross sections as well as differential data are utilized in the Monte Carlo calculations, and the processing procedures for the requisite data are summarized
Approximating Sievert Integrals to Monte Carlo Methods to calculate ...
African Journals Online (AJOL)
Radiation dose rates along the transverse axis of a miniature 192Ir source were calculated using the Sievert Integral (considered simple and inaccurate), and by the sophisticated and accurate Monte Carlo method. Using data obtained by the Monte Carlo method as benchmark and applying least squares regression curve ...
On the Markov Chain Monte Carlo (MCMC) method
Indian Academy of Sciences (India)
In this article, we give an introduction to Monte Carlo techniques with special emphasis on Markov Chain Monte Carlo (MCMC). Since the latter needs Markov chains with state space that is R or R^d and most textbooks on Markov chains do not discuss such chains, we have included a short appendix that gives basic ...
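As an illustrative aside (not from the article), the simplest MCMC scheme such an introduction covers, random-walk Metropolis, can be written in a few lines; here the target is a standard normal density, chosen arbitrarily.

import math
import random

def metropolis_chain(log_density, x0, step, n_samples, seed=0):
    """Random-walk Metropolis sampler on R (the simplest MCMC scheme)."""
    random.seed(seed)
    x, logp = x0, log_density(x0)
    samples = []
    for _ in range(n_samples):
        y = x + random.gauss(0.0, step)            # symmetric proposal
        logq = log_density(y)
        # accept with probability min(1, p(y)/p(x))
        if logq - logp >= 0 or random.random() < math.exp(logq - logp):
            x, logp = y, logq
        samples.append(x)
    return samples

# Target: standard normal density (up to an additive constant in the log).
samples = metropolis_chain(lambda x: -0.5 * x * x, x0=0.0, step=1.0, n_samples=50_000)
print(sum(samples) / len(samples))   # close to 0, the target mean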
Neutron point-flux calculation by Monte Carlo
International Nuclear Information System (INIS)
Eichhorn, M.
1986-04-01
A survey of the usual methods for estimating flux at a point is given. The associated variance-reducing techniques in direct Monte Carlo games are explained. The multigroup Monte Carlo codes MC for critical systems and PUNKT for point source-point detector-systems are represented, and problems in applying the codes to practical tasks are discussed. (author)
Monte Carlo simulation for radiation dose in children radiology
International Nuclear Information System (INIS)
Mendes, Hitalo R.; Tomal, Alessandra
2016-01-01
Dosimetry in pediatric radiology is essential due to the higher radiation risk that children have in comparison to adults. The focus of this study is to present how the dose varies with depth in a 10 year old and in a newborn; for this purpose, simulations were made using the Monte Carlo method. Tube potentials of 70 and 90 kVp were considered for the 10 year old, and 70 and 80 kVp for the newborn. The results show that in both cases the dose at the skin surface is larger for the smaller potential value; however, it decreases faster for larger potential values. Another observation is that, because the newborn is thinner, the ratio between the entrance dose and the exit dose is lower than for the 10 year old, showing that it is possible to make an image using a smaller entrance skin dose while keeping the same level of exposure at the detector. (author)
Barão, Fernando; Nakagawa, Masayuki; Távora, Luis; Vaz, Pedro
2001-01-01
This book focuses on the state of the art of Monte Carlo methods in radiation physics and particle transport simulation and applications, the latter involving, in particular, the use and development of electron-gamma, neutron-gamma and hadronic codes. Besides the basic theory and the methods employed, special attention is paid to algorithm development for modeling, and the analysis of experiments and measurements in a variety of fields ranging from particle to medical physics.
Research on perturbation based Monte Carlo reactor criticality search
International Nuclear Information System (INIS)
Li Zeguang; Wang Kan; Li Yangliu; Deng Jingkang
2013-01-01
Criticality search is a very important aspect of reactor physics analysis. Due to the advantages of the Monte Carlo method and the development of computer technologies, Monte Carlo criticality search is becoming more and more necessary and feasible. The traditional Monte Carlo criticality search method suffers from the large number of individual criticality runs required and from the uncertainty and fluctuation of Monte Carlo results. A new Monte Carlo criticality search method based on perturbation calculation is put forward in this paper to overcome the disadvantages of the traditional method. By using only one criticality run to get the initial k_eff and the differential coefficients of the concerned parameter, the polynomial estimator of the k_eff response function is solved to get the critical value of the concerned parameter. The feasibility of this method was tested. The results show that the accuracy and efficiency of the perturbation based criticality search method are quite inspiring and the method overcomes the disadvantages of the traditional one. (authors)
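A hedged sketch of the idea, with invented numbers: a single criticality run is assumed to return k_eff at a nominal search-parameter value together with its first and second derivatives from perturbation estimators, and the resulting polynomial is solved for k_eff = 1. The boron-concentration framing and all coefficient values are illustrative assumptions, not results from the paper.

import numpy as np

# Hypothetical output of a single criticality run: k_eff at the nominal boron
# concentration c0, plus first and second derivatives dk/dc and d2k/dc2
# obtained from Monte Carlo perturbation estimators (values invented here).
c0, k0 = 600.0, 1.0240          # ppm, k_eff
dk_dc, d2k_dc2 = -8.0e-5, 1.0e-8

# Second-order polynomial estimator k(c) = k0 + dk*(c-c0) + 0.5*d2k*(c-c0)^2,
# solved for k(c) = 1 to estimate the critical concentration.
coeffs = [0.5 * d2k_dc2, dk_dc, k0 - 1.0]       # a*(dc)^2 + b*dc + c = 0
roots = np.roots(coeffs)
dc = min((r.real for r in roots if abs(r.imag) < 1e-12), key=abs)
print("estimated critical boron concentration:", c0 + dc, "ppm")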
Howell, Robert T.
2004-01-01
With all the talk today about accountability, budget cuts, and the closing of programs in public education, teachers cannot overlook the importance of image in the field of industrial technology. It is very easy for administrators to cut ITE (industrial technology education) programs to save school money--money they might shift to teaching the…
International Nuclear Information System (INIS)
Yeh, C.Y.; Lee, C.C.; Chao, T.C.; Lin, M.H.; Lai, P.A.; Liu, F.H.; Tung, C.J.
2014-01-01
This study aims to utilize a measurement-based Monte Carlo (MBMC) method to evaluate the accuracy of dose distributions calculated using the Eclipse radiotherapy treatment planning system (TPS) based on the anisotropic analytical algorithm. Dose distributions were calculated for the nasopharyngeal carcinoma (NPC) patients treated with the intensity modulated radiotherapy (IMRT). Ten NPC IMRT plans were evaluated by comparing their dose distributions with those obtained from the in-house MBMC programs for the same CT images and beam geometry. To reconstruct the fluence distribution of the IMRT field, an efficiency map was obtained by dividing the energy fluence of the intensity modulated field by that of the open field, both acquired from an aS1000 electronic portal imaging device. The integrated image of the non-gated mode was used to acquire the full dose distribution delivered during the IMRT treatment. This efficiency map redistributed the particle weightings of the open field phase-space file for IMRT applications. Dose differences were observed in the tumor and air cavity boundary. The mean difference between MBMC and TPS in terms of the planning target volume coverage was 0.6% (range: 0.0–2.3%). The mean difference for the conformity index was 0.01 (range: 0.0–0.01). In conclusion, the MBMC method serves as an independent IMRT dose verification tool in a clinical setting. - Highlights: ► The patient-based Monte Carlo method serves as a reference standard to verify IMRT doses. ► 3D Dose distributions for NPC patients have been verified by the Monte Carlo method. ► Doses predicted by the Monte Carlo method matched closely with those by the TPS. ► The Monte Carlo method predicted a higher mean dose to the middle ears than the TPS. ► Critical organ doses should be confirmed to avoid overdose to normal organs
Monte Carlo dose calculations for phantoms with hip prostheses
International Nuclear Information System (INIS)
Bazalova, M; Verhaegen, F; Coolens, C; Childs, P; Cury, F; Beaulieu, L
2008-01-01
Computed tomography (CT) images of patients with hip prostheses are severely degraded by metal streaking artefacts. The low image quality makes organ contouring more difficult and can result in large dose calculation errors when Monte Carlo (MC) techniques are used. In this work, the extent of streaking artefacts produced by three common hip prosthesis materials (Ti-alloy, stainless steel, and Co-Cr-Mo alloy) was studied. The prostheses were tested in a hypothetical prostate treatment with five 18 MV photon beams. The dose distributions for unilateral and bilateral prosthesis phantoms were calculated with the EGSnrc/DOSXYZnrc MC code. This was done in three phantom geometries: in the exact geometry, in the original CT geometry, and in an artefact-corrected geometry. The artefact-corrected geometry was created using a modified filtered back-projection correction technique. It was found that unilateral prosthesis phantoms do not show large dose calculation errors, as long as the beams miss the artefact-affected volume. This is possible to achieve in the case of unilateral prosthesis phantoms (except for the Co-Cr-Mo prosthesis which gives a 3% error) but not in the case of bilateral prosthesis phantoms. The largest dose discrepancies were obtained for the bilateral Co-Cr-Mo hip prosthesis phantom, up to 11% in some voxels within the prostate. The artefact correction algorithm worked well for all phantoms and resulted in dose calculation errors below 2%. In conclusion, a MC treatment plan should include an artefact correction algorithm when treating patients with hip prostheses
Energy Technology Data Exchange (ETDEWEB)
Gallego Franco, P.; Garcia Marcos, R.
2015-07-01
The GAMOS simulation code, based on Geant4, is a very powerful tool for design and modeling optimization of Positron Emission Tomography (PET) systems. In order to obtain proper image quality, it is extremely important to determine the optimal activity to be delivered. For this reason, a study of the internal system parameters that affect image quality, such as the scatter fraction (SF) and the noise equivalent count rate (NEC), has been carried out. The study involves the comparison of experimental measurements of both parameters with those obtained by Monte Carlo simulation of the Siemens PET Biograph 6 True Point scanner with the True V option. Based on the simulation results, a paralyzable dead-time model was developed that adjusts the detector dead-time according to the delivered activity. The variation of this dead-time with activity has also been studied. (Author)
Energy Technology Data Exchange (ETDEWEB)
Marcus, A.
1980-07-01
The role of images of information (charts, diagrams, maps, and symbols) for effective presentation of facts and concepts is expanding dramatically because of advances in computer graphics technology, increasingly hetero-lingual, hetero-cultural world target populations of information providers, the urgent need to convey more efficiently vast amounts of information, the broadening population of (non-expert) computer users, the decrease of available time for reading texts and for decision making, and the general level of literacy. A coalition of visual performance experts, human engineering specialists, computer scientists, and graphic designers/artists is required to resolve human factors aspects of images of information. The need for, nature of, and benefits of interdisciplinary effort are discussed. The results of an interdisciplinary collaboration are demonstrated in a product for visualizing complex information about global energy interdependence. An invited panel will respond to the presentation.
Energy Technology Data Exchange (ETDEWEB)
Tringe, J.W., E-mail: tringe2@llnl.gov [Lawrence Livermore National Laboratory, 7000 East Avenue, Livermore, CA (United States); Ileri, N. [Lawrence Livermore National Laboratory, 7000 East Avenue, Livermore, CA (United States); Department of Chemical Engineering & Materials Science, University of California, Davis, CA (United States); Levie, H.W. [Lawrence Livermore National Laboratory, 7000 East Avenue, Livermore, CA (United States); Stroeve, P.; Ustach, V.; Faller, R. [Department of Chemical Engineering & Materials Science, University of California, Davis, CA (United States); Renaud, P. [Swiss Federal Institute of Technology, Lausanne, (EPFL) (Switzerland)
2015-08-18
Highlights: • WGA proteins in nanochannels modeled by Molecular Dynamics and Monte Carlo. • Protein surface coverage characterized by atomic force microscopy. • Models indicate transport characteristics depend strongly on surface coverage. • Results resolve a four orders of magnitude difference in diffusion coefficient values. - Abstract: We use Molecular Dynamics and Monte Carlo simulations to examine molecular transport phenomena in nanochannels, explaining a four orders of magnitude difference in wheat germ agglutinin (WGA) protein diffusion rates observed by fluorescence correlation spectroscopy (FCS) and by direct imaging of fluorescently-labeled proteins. We first use the ESPResSo Molecular Dynamics code to estimate the surface transport distance for neutral and charged proteins. We then employ a Monte Carlo model to calculate the paths of protein molecules on surfaces and in the bulk liquid transport medium. Our results show that the transport characteristics depend strongly on the degree of molecular surface coverage. Atomic force microscope characterization of surfaces exposed to WGA proteins for 1000 s shows large protein aggregates consistent with the predicted coverage. These calculations and experiments provide useful insight into the details of molecular motion in confined geometries.
Djibrilla saley, Abdoulazizi; Jardani, Abderrahim; Soueid Ahmed, Abdellahi; Raphael, Antoine; Dupont, Jean Paul
2017-04-01
Estimating spatial distributions of the hydraulic conductivity in heterogeneous aquifers has always been an important and challenging task in hydrology. Generally, the hydraulic conductivity field is determined from hydraulic head or pressure measurements. In the present study, we propose to use temperature data as source of information for characterizing the spatial distributions of the hydraulic conductivity field. In this way, we performed a laboratory sandbox experiment with the aim of imaging the heterogeneities of the hydraulic conductivity field from thermal monitoring. During the laboratory experiment, we injected a hot water pulse, which induces a heat plume motion into the sandbox. The induced plume was followed by a set of thermocouples placed in the sandbox. After the temperature data acquisition, we performed a hydraulic tomography using the stochastic Hybrid Monte Carlo approach, also called the Hamiltonian Monte Carlo (HMC) algorithm to invert the temperature data. This algorithm is based on a combination of the Metropolis Monte Carlo method and the Hamiltonian dynamics approach. The parameterization of the inverse problem was done with the Karhunen-Loève (KL) expansion to reduce the dimensionality of the unknown parameters. Our approach has provided successful reconstruction of the hydraulic conductivity field with low computational effort.
PeneloPET, a Monte Carlo PET simulation tool based on PENELOPE: features and validation
Energy Technology Data Exchange (ETDEWEB)
Espana, S; Herraiz, J L; Vicente, E; Udias, J M [Grupo de Fisica Nuclear, Departmento de Fisica Atomica, Molecular y Nuclear, Universidad Complutense de Madrid, Madrid (Spain); Vaquero, J J; Desco, M [Unidad de Medicina y CirugIa Experimental, Hospital General Universitario Gregorio Maranon, Madrid (Spain)], E-mail: jose@nuc2.fis.ucm.es
2009-03-21
Monte Carlo simulations play an important role in positron emission tomography (PET) imaging, as an essential tool for the research and development of new scanners and for advanced image reconstruction. PeneloPET, a PET-dedicated Monte Carlo tool, is presented and validated in this work. PeneloPET is based on PENELOPE, a Monte Carlo code for the simulation of the transport in matter of electrons, positrons and photons, with energies from a few hundred eV to 1 GeV. PENELOPE is robust, fast and very accurate, but it may be unfriendly to people not acquainted with the FORTRAN programming language. PeneloPET is an easy-to-use application which allows comprehensive simulations of PET systems within PENELOPE. Complex and realistic simulations can be set by modifying a few simple input text files. Different levels of output data are available for analysis, from sinogram and lines-of-response (LORs) histogramming to fully detailed list mode. These data can be further exploited with the preferred programming language, including ROOT. PeneloPET simulates PET systems based on crystal array blocks coupled to photodetectors and allows the user to define radioactive sources, detectors, shielding and other parts of the scanner. The acquisition chain is simulated in high level detail; for instance, the electronic processing can include pile-up rejection mechanisms and time stamping of events, if desired. This paper describes PeneloPET and shows the results of extensive validations and comparisons of simulations against real measurements from commercial acquisition systems. PeneloPET is being extensively employed to improve the image quality of commercial PET systems and for the development of new ones.
Contrast to Noise Ratio and Contrast Detail Analysis in Mammography:A Monte Carlo Study
International Nuclear Information System (INIS)
Metaxas, V; Delis, H; Panayiotakis, G; Kalogeropoulou, C; Zampakis, P
2015-01-01
The mammographic spectrum is one of the major factors affecting image quality in mammography. In this study, a Monte Carlo (MC) simulation model was used to evaluate image quality characteristics of various mammographic spectra. The anode/filter combinations evaluated, were those traditionally used in mammography, for tube voltages between 26 and 30 kVp. The imaging performance was investigated in terms of Contrast to Noise Ratio (CNR) and Contrast Detail (CD) analysis, by involving human observers, utilizing a mathematical CD phantom. Soft spectra provided the best characteristics in terms of both CNR and CD scores, while tube voltage had a limited effect. W-anode spectra filtered with k-edge filters demonstrated an improved performance, that sometimes was better compared to softer x-ray spectra, produced by Mo or Rh anode. Regarding the filter material, k-edge filters showed superior performance compared to Al filters. (paper)
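For readers unfamiliar with the figure of merit, the following editorial sketch computes a CNR in one common convention (signal difference over background noise) from two regions of interest; the pixel values are synthetic and the convention may differ from the one used in the study.

import numpy as np

def contrast_to_noise_ratio(detail_roi, background_roi):
    """CNR as commonly defined for a detail/background pair:
    signal difference over background noise (one of several conventions)."""
    return abs(detail_roi.mean() - background_roi.mean()) / background_roi.std(ddof=1)

# Toy example with invented pixel values standing in for a simulated image.
rng = np.random.default_rng(2)
background = rng.normal(100.0, 5.0, size=(50, 50))
detail = rng.normal(92.0, 5.0, size=(10, 10))
print("CNR:", contrast_to_noise_ratio(detail, background))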
Carlos Gardel, el patrimonio que sonrie
Directory of Open Access Journals (Sweden)
María Julia Carozzi
2003-10-01
Full Text Available Analysing the ways in which the people of Buenos Aires remembered Carlos Gardel in the month of the 68th anniversary of his death, the article attempts to account for one of the ways in which the inhabitants of the city of Buenos Aires conceive what is memorable, identify that in which they recognise themselves as porteños, and single out that towards which they experience feelings of collective belonging. The work points out the central role that the miracle, mimesis and direct contact with his body play in the preservation of the memory of Gardel, who embodies both the tango and its success in the world. The case of Gardel is presented as an example of how the memory and identity of the porteños in particular, and of Argentines in general, are organised around real persons to whom an extraordinary value is assigned. By asserting their deep rootedness in concrete human bodies, these figures make the local adoption of the globally accepted concepts of historical and cultural heritage problematic. The article analyses one of the ways in which the inhabitants of Buenos Aires conceive that which is memorable, source of positive identification and origin of feelings of communitas by examining their commemoration of the 68th anniversary of the death of Carlos Gardel. It underscores the central role that miracles, mimesis and direct bodily contact play in the preservation of the memory of the star, who incarnates both the tango and its world-wide success. The case of Gardel is presented as an example of the centrality that real persons of extraordinary value have in the organization of local memory and collective identity. Since they are embedded in concrete human bodies, they reveal problems in the local adoption of globally accepted concepts of historical and cultural heritage.
Evaluation of high packing density powder X-ray screens by Monte Carlo methods
Energy Technology Data Exchange (ETDEWEB)
Liaparinos, P. [Department of Medical Physics, Medical School, University of Patras, 26500 Patras (Greece); Kandarakis, I.; Cavouras, D. [Department of Medical Instruments Technology, Technological Educational Institution of Athens, Ag. Spyridonos Street, Aigaleo, 12210 Athens (Greece); Kalivas, N. [Greek Atomic Energy Commission, 15310 Athens (Greece); Delis, H. [Department of Medical Physics, Medical School, University of Patras, 26500 Patras (Greece); Panayiotakis, G. [Department of Medical Physics, Medical School, University of Patras, 26500 Patras (Greece)], E-mail: panayiot@upatras.gr
2007-09-21
Phosphor materials are employed in intensifying screens of both digital and conventional X-ray imaging detectors. High packing density powder screens have been developed (e.g. screens in ceramic form) exhibiting high-resolution and light emission properties, and thus contributing to improved image transfer characteristics and higher radiation to light conversion efficiency. For the present study, a custom Monte Carlo simulation program was used in order to examine the performance of ceramic powder screens, under various radiographic conditions. The model was developed using Mie scattering theory for the description of light interactions, based on the physical characteristics (e.g. complex refractive index, light wavelength) of the phosphor material. Monte Carlo simulations were carried out assuming: (a) X-ray photon energy ranging from 18 up to 49 keV, (b) Gd2O2S:Tb phosphor material with packing density of 70% and grain size of 7 μm and (c) phosphor thickness ranging between 30 and 70 mg/cm2. The variation of the Modulation Transfer Function (MTF) and the Luminescence Efficiency (LE) with respect to the X-ray energy and the phosphor thickness was evaluated. Both aforementioned imaging characteristics were shown to take high values at 49 keV X-ray energy and 70 mg/cm2 phosphor thickness. It was found that high packing density screens may be appropriate for use in medical radiographic systems.
Iterative acceleration methods for Monte Carlo and deterministic criticality calculations
Energy Technology Data Exchange (ETDEWEB)
Urbatsch, T.J.
1995-11-01
If you have ever given up on a nuclear criticality calculation and terminated it because it took so long to converge, you might find this thesis of interest. The author develops three methods for improving the fission source convergence in nuclear criticality calculations for physical systems with high dominance ratios for which convergence is slow. The Fission Matrix Acceleration Method and the Fission Diffusion Synthetic Acceleration (FDSA) Method are acceleration methods that speed fission source convergence for both Monte Carlo and deterministic methods. The third method is a hybrid Monte Carlo method that also converges for difficult problems where the unaccelerated Monte Carlo method fails. The author tested the feasibility of all three methods in a test bed consisting of idealized problems. He has successfully accelerated fission source convergence in both deterministic and Monte Carlo criticality calculations. By filtering statistical noise, he has incorporated deterministic attributes into the Monte Carlo calculations in order to speed their source convergence. He has used both the fission matrix and a diffusion approximation to perform unbiased accelerations. The Fission Matrix Acceleration method has been implemented in the production code MCNP and successfully applied to a real problem. When the unaccelerated calculations are unable to converge to the correct solution, they cannot be accelerated in an unbiased fashion. A Hybrid Monte Carlo method weds Monte Carlo and a modified diffusion calculation to overcome these deficiencies. The Hybrid method additionally possesses reduced statistical errors.
Present status and future prospects of neutronics Monte Carlo
International Nuclear Information System (INIS)
Gelbard, E.M.
1990-01-01
It is fair to say that the Monte Carlo method, over the last decade, has grown steadily more important as a neutronics computational tool. Apparently this has happened for assorted reasons. Thus, for example, as the power of computers has increased, the cost of the method has dropped, steadily becoming less and less of an obstacle to its use. In addition, more and more sophisticated input processors have now made it feasible to model extremely complicated systems routinely with really remarkable fidelity. Finally, as we demand greater and greater precision in reactor calculations, Monte Carlo is often found to be the only method accurate enough for use in benchmarking. Cross section uncertainties are now almost the only inherent limitations in our Monte Carlo capabilities. For this reason Monte Carlo has come to occupy a special position, interposed between experiment and other computational techniques. More and more often deterministic methods are tested by comparison with Monte Carlo, and cross sections are tested by comparing Monte Carlo with experiment. In this way one can distinguish very clearly between errors due to flaws in our numerical methods, and those due to deficiencies in cross section files. The special role of Monte Carlo as a benchmarking tool, often the only available benchmarking tool, makes it crucially important that this method should be polished to perfection. Problems relating to Eigenvalue calculations, variance reduction and the use of advanced computers are reviewed in this paper. (author)
Monte Carlo simulations support non-Cerenkov radioluminescence production in tissue
Ackerman, Nicole L.; Boschi, Federico; Spinelli, Antonello E.
2017-08-01
There is experimental evidence for the production of non-Cerenkov radioluminescence in a variety of materials, including tissue. We constructed a Geant4 Monte Carlo simulation of the radiation from 32P and 99mTc interacting in chicken breast and used experimental imaging data to model a scintillation-like emission. The same radioluminescence spectrum is visible from both isotopes and cannot otherwise be explained through fluorescence or filter miscalibration. We conclude that chicken breast has a near-infrared scintillation-like response with a light yield three orders of magnitude smaller than BGO.
Characterization of materials for prosthetic implants using the BEAMnrc Monte Carlo code
International Nuclear Information System (INIS)
Spezi, E; Palleri, F; Angelini, A L; Ferri, A; Baruffaldi, F
2007-01-01
Metallic implants degrade image quality and perturb severely the patient dose distribution in external beam radiotherapy. Furthermore, conventional treatment planning systems (TPS) do not accurately account for tissue heterogeneities, especially at the interfaces where high Z gradients are present. This work deals with the accurate and systematic characterization of materials used for prosthetic implants. The dose calculation engine used in this investigation is the BEAMnrc Monte Carlo code. A detailed comparison versus experimental data was carried out for two clinical photon beam energies (6MV and 18MV). Our results show that in both cases a very good agreement (within ± 2%) between calculations and experiments was achieved
SU-E-J-144: Low Activity Studies of Carbon 11 Activation Via GATE Monte Carlo
Energy Technology Data Exchange (ETDEWEB)
Elmekawy, A; Ewell, L [Hampton University, Hampton, VA (United States); Butuceanu, C; Qu, L [Hampton University Proton Therapy Institute, Hampton, VA (United States)
2015-06-15
Purpose: To investigate the behavior of a Monte Carlo simulation code with low levels of activity (∼1,000 Bq). Such activity levels are expected from phantoms and patients activated via a proton therapy beam. Methods: Three different ranges for a therapeutic proton radiation beam were examined in a Monte Carlo simulation code: 13.5, 17.0 and 21.0 cm. For each range, the decay of an equivalent-length 11C source and additional sources of length plus or minus one cm was studied in a benchmark PET simulation for activities of 1000, 2000 and 3000 Bq. The ranges were chosen to coincide with a previous activation study, and the activities were chosen to coincide with the approximate level of isotope creation expected in a phantom or patient irradiated by a therapeutic proton beam. The GATE 7.0 simulation was completed on a cluster node, running Scientific Linux Carbon 6 (Red Hat©). The resulting Monte Carlo data were investigated with the ROOT (CERN) analysis tool. The half-life of 11C was extracted via a histogram fit to the number of simulated PET events vs. time. Results: The average slope of the deviation of the extracted carbon half-life from the expected/nominal value vs. activity showed a generally positive value. This was unexpected, as the deviation should, in principle, decrease with increased activity and lower statistical uncertainty. Conclusion: For activity levels on the order of 1,000 Bq, the behavior of a benchmark PET test was somewhat unexpected. It is important to be aware of the limitations of low activity PET images, and low activity Monte Carlo simulations. This work was funded in part by the Philips corporation.
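An editorial sketch of the half-life extraction step on synthetic data (a simple log-linear least-squares fit rather than the ROOT histogram fit used in the work); the bin width, counting time and normalization are invented.

import numpy as np

# Synthetic decay data standing in for the simulated PET event-rate histogram.
true_half_life = 20.36 * 60.0                 # 11C half-life in seconds
lam = np.log(2.0) / true_half_life
rng = np.random.default_rng(3)

t_edges = np.arange(0.0, 3600.0 + 60.0, 60.0) # one-minute bins over an hour
t_mid = 0.5 * (t_edges[:-1] + t_edges[1:])
expected = 1000.0 * np.exp(-lam * t_mid)      # expected counts per bin (low activity)
counts = rng.poisson(expected)

# Fit log(counts) vs time with a straight line; the slope gives -lambda.
mask = counts > 0
slope, intercept = np.polyfit(t_mid[mask], np.log(counts[mask]), 1)
print("extracted half-life (min):", np.log(2.0) / (-slope) / 60.0)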
An evaluation of the Monte Carlo simulation of SPECT projection data using MCNP and SimSPECT
International Nuclear Information System (INIS)
Selcow, E.C.; Dobrzeniecki, A.B.; Yanch, J.C.; Lu, A.; Belanger, M.J.
1996-01-01
Simulation of the complete nuclear medicine imaging situation for SPECT (Single Photon Emission Computed Tomography) produces synthetic images that are useful in the analysis and improvement of existing imaging systems and in the design of new and improved systems. The simulation methods the authors employ are based on probabilistic numerical calculations (Monte Carlo); they require enormous amounts of computer time and employ highly complex models (the tomographic acquisition of images through intricate collimators). The presentation consists of three parts. In the first, they describe the techniques developed to achieve reasonable simulation times and the tools built to allow interactive and effective analysis and processing of the resultant synthetic images. In the next part, they explore the limitations of such techniques for performing simulations of medical imaging situations. In the final part, they describe the areas of research that are promising for increasing the quality and breadth of the simulation process
Simulation and the Monte Carlo Method, Student Solutions Manual
Rubinstein, Reuven Y
2012-01-01
This accessible new edition explores the major topics in Monte Carlo simulation Simulation and the Monte Carlo Method, Second Edition reflects the latest developments in the field and presents a fully updated and comprehensive account of the major topics that have emerged in Monte Carlo simulation since the publication of the classic First Edition over twenty-five years ago. While maintaining its accessible and intuitive approach, this revised edition features a wealth of up-to-date information that facilitates a deeper understanding of problem solving across a wide array of subject areas, suc
Multiple Monte Carlo Testing with Applications in Spatial Point Processes
DEFF Research Database (Denmark)
Mrkvička, Tomáš; Myllymäki, Mari; Hahn, Ute
The rank envelope test (Myllymäki et al., Global envelope tests for spatial processes, arXiv:1307.0239 [stat.ME]) is proposed as a solution to the multiple testing problem for Monte Carlo tests. Three different situations are recognized: 1) a few univariate Monte Carlo tests, 2) a Monte Carlo test ... for one group of point patterns, comparison of several groups of point patterns, test of dependence of components in a multi-type point pattern, and test of Boolean assumption for random closed sets....
The Monte Carlo method the method of statistical trials
Shreider, YuA
1966-01-01
The Monte Carlo Method: The Method of Statistical Trials is a systematic account of the fundamental concepts and techniques of the Monte Carlo method, together with its range of applications. Some of these applications include the computation of definite integrals, neutron physics, and in the investigation of servicing processes. This volume is comprised of seven chapters and begins with an overview of the basic features of the Monte Carlo method and typical examples of its application to simple problems in computational mathematics. The next chapter examines the computation of multi-dimensio
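As a minimal illustration of the first application mentioned, computing a definite integral, here is a plain Monte Carlo estimator with its one-sigma statistical uncertainty; this is an editorial sketch, not material from the book.

import random
import math

def mc_integral(f, a, b, n, seed=0):
    """Estimate the definite integral of f over [a, b] by uniform sampling,
    returning the estimate and its one-sigma statistical uncertainty."""
    random.seed(seed)
    values = [f(random.uniform(a, b)) for _ in range(n)]
    mean = sum(values) / n
    var = sum((v - mean) ** 2 for v in values) / (n - 1)
    return (b - a) * mean, (b - a) * math.sqrt(var / n)

estimate, sigma = mc_integral(math.sin, 0.0, math.pi, 100_000)
print(estimate, "+/-", sigma)   # the exact value is 2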
[Carlos Chagas Filho's choice of biological physics: reason and motivations].
de Almeida, Darcy Fontoura
2008-01-01
This study investigates the reasons and motivations behind Carlos Chagas Filho's choice to abandon the line of study developed by his father, Carlos Chagas, and brother, Evandro Chagas, both of whom had very successful careers researching tropical diseases. Though Carlos Chagas Filho first worked on anatomical pathology, he suddenly shifted his attention to the physicochemical aspects of vital processes. Extant sources show that a number of unforeseen circumstances took place from early on in Chagas Filho's education. There was a chance he could carry out work of a similar import in a different area, and he set his sights, with uncommon luck, on the introduction of scientific research at the university.
Simulation of scintillating fiber gamma ray detectors for medical imaging
International Nuclear Information System (INIS)
Chaney, R.C.; Fenyves, E.J.; Antich, P.P.
1990-01-01
This paper reports on plastic scintillating fibers which have been shown to be effective for high spatial and time resolution detection of gamma rays. They may be expected to significantly improve the resolution of current medical imaging systems such as PET and SPECT. Monte Carlo simulation of imaging systems using these detectors provides a means to optimize their performance in this application, as well as demonstrate their resolution and efficiency. Monte Carlo results are presented for PET and SPECT systems constructed using these detectors
Quantum Monte Carlo on graphical processing units
Anderson, Amos G.; Goddard, William A.; Schröder, Peter
2007-08-01
Quantum Monte Carlo (QMC) is among the most accurate methods for solving the time-independent Schrödinger equation. Unfortunately, the method is very expensive and requires a vast array of computing resources in order to obtain results of a reasonable convergence level. On the other hand, the method is not only easily parallelizable across CPU clusters, but as we report here, it also has a high degree of data parallelism. This facilitates the use of recent technological advances in Graphical Processing Units (GPUs), a powerful type of processor well known to computer gamers. In this paper we report on an end-to-end QMC application with core elements of the algorithm running on a GPU. With individual kernels achieving as much as 30× speed up, the overall application performs up to 6× faster than an optimized CPU implementation, yet requires only a modest increase in hardware cost. This demonstrates the speedup improvements possible for QMC in running on advanced hardware, thus exploring a path toward providing QMC level accuracy as a more standard tool. The major current challenge in running codes of this type on the GPU arises from the lack of fully compliant IEEE floating point implementations. To achieve better accuracy we propose the use of the Kahan summation formula in matrix multiplications. While this drops overall performance, we demonstrate that the proposed new algorithm can match CPU single precision.
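The compensated-summation trick mentioned in the last sentence can be sketched in a few lines; it is shown here as a scalar inner product in plain Python for clarity, not as the authors' GPU kernel.

def kahan_dot(xs, ys):
    """Inner product accumulated with Kahan (compensated) summation, the
    trick proposed above to recover accuracy in single-precision
    matrix multiplications; plain Python stand-in for a GPU kernel."""
    total = 0.0
    comp = 0.0                      # running compensation for lost low-order bits
    for x, y in zip(xs, ys):
        term = x * y - comp
        tentative = total + term    # low-order bits of term may be lost here...
        comp = (tentative - total) - term   # ...and are recovered into comp
        total = tentative
    return total

print(kahan_dot([1e8, 1.0, -1e8], [1.0, 1.0, 1.0]))   # 1.0, as expected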
Monte Carlo simulations for heavy ion dosimetry
Energy Technology Data Exchange (ETDEWEB)
Geithner, O.
2006-07-26
Water-to-air stopping power ratio (s_w,air) calculations for the ionization chamber dosimetry of clinically relevant ion beams with initial energies from 50 to 450 MeV/u have been performed using the Monte Carlo technique. To simulate the transport of a particle in water the computer code SHIELD-HIT v2 was used, which is a substantially modified version of its predecessor SHIELD-HIT v1. The code was partially rewritten, replacing formerly used single precision variables with double precision variables. The lowest particle transport specific energy was decreased from 1 MeV/u down to 10 keV/u by modifying the Bethe-Bloch formula, thus widening its range for medical dosimetry applications. Optional MSTAR and ICRU-73 stopping power data were included. The fragmentation model was verified using all available experimental data and some parameters were adjusted. The present code version shows excellent agreement with experimental data. In addition to the calculations of stopping power ratios, s_w,air, the influence of fragments and I-values on s_w,air for carbon ion beams was investigated. The value of s_w,air deviates by as much as 2.3% at the Bragg peak from the constant value of 1.130 recommended by TRS-398 for an energy of 50 MeV/u. (orig.)
The GENIE neutrino Monte Carlo generator
International Nuclear Information System (INIS)
Andreopoulos, C.; Bell, A.; Bhattacharya, D.; Cavanna, F.; Dobson, J.; Dytman, S.; Gallagher, H.; Guzowski, P.; Hatcher, R.; Kehayias, P.; Meregaglia, A.; Naples, D.; Pearce, G.; Rubbia, A.; Whalley, M.; Yang, T.
2010-01-01
GENIE is a new neutrino event generator for the experimental neutrino physics community. The goal of the project is to develop a 'canonical' neutrino interaction physics Monte Carlo whose validity extends to all nuclear targets and neutrino flavors from MeV to PeV energy scales. Currently, emphasis is on the few-GeV energy range, the challenging boundary between the non-perturbative and perturbative regimes, which is relevant for the current and near future long-baseline precision neutrino experiments using accelerator-made beams. The design of the package addresses many challenges unique to neutrino simulations and supports the full life-cycle of simulation and generator-related analysis tasks. GENIE is a large-scale software system, consisting of ∼120000 lines of C++ code, featuring a modern object-oriented design and extensively validated physics content. The first official physics release of GENIE was made available in August 2007, and at the time of the writing of this article, the latest available version was v2.4.4.
Pseudopotentials for quantum-Monte-Carlo-calculations
International Nuclear Information System (INIS)
Burkatzki, Mark Thomas
2008-01-01
The author presents scalar-relativistic energy-consistent Hartree-Fock pseudopotentials for the main-group and 3d-transition-metal elements. The pseudopotentials do not exhibit a singularity at the nucleus and are therefore suitable for quantum Monte Carlo (QMC) calculations. The author demonstrates their transferability through extensive benchmark calculations of atomic excitation spectra as well as molecular properties. In particular, the author computes the vibrational frequencies and binding energies of 26 first- and second-row diatomic molecules using post Hartree-Fock methods, finding excellent agreement with the corresponding all-electron values. The author shows that the presented pseudopotentials give superior accuracy than other existing pseudopotentials constructed specifically for QMC. The localization error and the efficiency in QMC are discussed. The author also presents QMC calculations for selected atomic and diatomic 3d-transitionmetal systems. Finally, valence basis sets of different sizes (VnZ with n=D,T,Q,5 for 1st and 2nd row; with n=D,T for 3rd to 5th row; with n=D,T,Q for the 3d transition metals) optimized for the pseudopotentials are presented. (orig.)
Parallel Monte Carlo Simulation of Aerosol Dynamics
Directory of Open Access Journals (Sweden)
Kun Zhou
2014-02-01
Full Text Available A highly efficient Monte Carlo (MC) algorithm is developed for the numerical simulation of aerosol dynamics, that is, nucleation, surface growth, and coagulation. Nucleation and surface growth are handled with deterministic means, while coagulation is simulated with a stochastic method (Marcus-Lushnikov stochastic process). Operator splitting techniques are used to synthesize the deterministic and stochastic parts in the algorithm. The algorithm is parallelized using the Message Passing Interface (MPI). The parallel computing efficiency is investigated through numerical examples. Near 60% parallel efficiency is achieved for the maximum testing case with 3.7 million MC particles running on 93 parallel computing nodes. The algorithm is verified through simulating various testing cases and comparing the simulation results with available analytical and/or other numerical solutions. Generally, it is found that only a small number (hundreds or thousands) of MC particles is necessary to accurately predict the aerosol particle number density, volume fraction, and so forth, that is, low order moments of the Particle Size Distribution (PSD) function. Accurately predicting the high order moments of the PSD requires a dramatic increase in the number of MC particles.
Rare event simulation using Monte Carlo methods
Rubino, Gerardo
2009-01-01
In a probabilistic model, a rare event is an event with a very small probability of occurrence. The forecasting of rare events is a formidable task but is important in many areas: for instance, a catastrophic failure in a transport system or in a nuclear power plant, or the failure of an information processing system in a bank, or in the communication network of a group of banks, leading to financial losses. Being able to evaluate the probability of rare events is therefore a critical issue. Monte Carlo methods, based on the simulation of the corresponding models, are used to analyze rare events. This book sets out to present the mathematical tools available for the efficient simulation of rare events. Importance sampling and splitting are presented along with an exposition of how to apply these tools to a variety of fields ranging from performance and dependability evaluation of complex systems, typically in computer science or in telecommunications, to chemical reaction analysis in biology or particle transport in physics. ...
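As a toy illustration of why importance sampling helps with rare events, the sketch below estimates a Gaussian tail probability of order 10^-7 by sampling from a shifted density and reweighting; the model and all parameters are assumptions chosen for the example, not material from the book.

```python
import math, random

def rare_event_is(a=5.0, n=100_000, seed=1):
    """Importance-sampling sketch: estimate p = P(X > a) for X ~ N(0, 1) by
    sampling from N(a, 1) and reweighting each hit with the likelihood ratio
    w(x) = exp(-a*x + a^2/2).  Crude Monte Carlo would need on the order of
    1/p samples just to observe one success; the shifted sampler does not."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        x = rng.gauss(a, 1.0)                        # sample from the shifted density
        if x > a:
            total += math.exp(-a * x + 0.5 * a * a)  # likelihood-ratio weight
    est = total / n
    exact = 0.5 * math.erfc(a / math.sqrt(2.0))      # exact tail probability for comparison
    return est, exact

print(rare_event_is())   # estimate vs exact value (~2.87e-7)
```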
Parallel Monte Carlo simulation of aerosol dynamics
Zhou, K.
2014-01-01
A highly efficient Monte Carlo (MC) algorithm is developed for the numerical simulation of aerosol dynamics, that is, nucleation, surface growth, and coagulation. Nucleation and surface growth are handled with deterministic means, while coagulation is simulated with a stochastic method (Marcus-Lushnikov stochastic process). Operator splitting techniques are used to synthesize the deterministic and stochastic parts in the algorithm. The algorithm is parallelized using the Message Passing Interface (MPI). The parallel computing efficiency is investigated through numerical examples. Near 60% parallel efficiency is achieved for the maximum testing case with 3.7 million MC particles running on 93 parallel computing nodes. The algorithm is verified through simulating various testing cases and comparing the simulation results with available analytical and/or other numerical solutions. Generally, it is found that only a small number (hundreds or thousands) of MC particles is necessary to accurately predict the aerosol particle number density, volume fraction, and so forth, that is, low order moments of the Particle Size Distribution (PSD) function. Accurately predicting the high order moments of the PSD needs to dramatically increase the number of MC particles. © 2014 Kun Zhou et al.
Atomistic Monte Carlo Simulation of Lipid Membranes
Directory of Open Access Journals (Sweden)
Daniel Wüstner
2014-01-01
Full Text Available Biological membranes are complex assemblies of many different molecules whose analysis demands a variety of experimental and computational approaches. In this article, we explain challenges and advantages of atomistic Monte Carlo (MC) simulation of lipid membranes. We provide an introduction into the various move sets that are implemented in current MC methods for efficient conformational sampling of lipids and other molecules. In the second part, we demonstrate, for a concrete example, how an atomistic local-move set can be implemented for MC simulations of phospholipid monomers and bilayer patches. We use our recently devised chain breakage/closure (CBC) local move set in the bond-/torsion angle space with the constant-bond-length approximation (CBLA) for the phospholipid dipalmitoylphosphatidylcholine (DPPC). We demonstrate rapid conformational equilibration for a single DPPC molecule, as assessed by calculation of molecular energies and entropies. We also show transition from a crystalline-like to a fluid DPPC bilayer by the CBC local-move MC method, as indicated by the electron density profile, head group orientation, area per lipid, and whole-lipid displacements. We discuss the potential of local-move MC methods in combination with molecular dynamics simulations, for example, for studying multi-component lipid membranes containing cholesterol.
A continuation multilevel Monte Carlo algorithm
Collier, Nathan
2014-09-05
We propose a novel Continuation Multilevel Monte Carlo (CMLMC) algorithm for weak approximation of stochastic models. The CMLMC algorithm solves the given approximation problem for a sequence of decreasing tolerances, ending when the required error tolerance is satisfied. CMLMC assumes discretization hierarchies that are defined a priori for each level and are geometrically refined across levels. The actual choice of computational work across levels is based on parametric models for the average cost per sample and the corresponding variance and weak error. These parameters are calibrated using Bayesian estimation, taking particular notice of the deepest levels of the discretization hierarchy, where only a few realizations are available to produce the estimates. The resulting CMLMC estimator exhibits a non-trivial splitting between bias and statistical contributions. We also show the asymptotic normality of the statistical error in the MLMC estimator and justify in this way our error estimate, which allows prescribing both the required accuracy and confidence in the final result. Numerical results substantiate the above results and illustrate the corresponding computational savings in examples that are described in terms of differential equations either driven by random measures or with random coefficients. © 2014, Springer Science+Business Media Dordrecht.
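For readers new to the multilevel idea that CMLMC builds on, the following is a plain (non-continuation) multilevel Monte Carlo sketch for a toy SDE expectation; the geometric Brownian motion model, the Euler-Maruyama discretization and the per-level sample counts are illustrative assumptions, and the continuation logic of the paper (adaptive tolerances, Bayesian calibration of cost/variance models) is deliberately not reproduced.

```python
import math, random

def gbm_coupled(level, rng, T=1.0, s0=1.0, mu=0.05, sigma=0.2):
    """Return coupled Euler-Maruyama estimates (P_fine, P_coarse) of S_T for a
    geometric Brownian motion, using 2**level fine steps and 2**(level-1) coarse
    steps driven by the same Brownian increments (the standard MLMC coupling)."""
    n = 2 ** level
    dt = T / n
    s_fine, s_coarse, dw_pair = s0, s0, 0.0
    for k in range(n):
        dw = rng.gauss(0.0, math.sqrt(dt))
        s_fine += mu * s_fine * dt + sigma * s_fine * dw
        dw_pair += dw
        if level > 0 and k % 2 == 1:          # two fine steps make one coarse step
            s_coarse += mu * s_coarse * (2 * dt) + sigma * s_coarse * dw_pair
            dw_pair = 0.0
    return (s_fine, 0.0) if level == 0 else (s_fine, s_coarse)

def mlmc(max_level, samples_per_level, seed=0):
    """Plain multilevel estimator: E[P_L] = E[P_0] + sum_l E[P_l - P_{l-1}]."""
    rng = random.Random(seed)
    estimate = 0.0
    for level, n_samp in zip(range(max_level + 1), samples_per_level):
        diffs = []
        for _ in range(n_samp):
            fine, coarse = gbm_coupled(level, rng)
            diffs.append(fine - coarse if level > 0 else fine)
        estimate += sum(diffs) / len(diffs)   # level-l correction term
    return estimate

print(mlmc(4, [20000, 10000, 5000, 2500, 1250]))  # should be close to exp(0.05) ~ 1.051
```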
Los motivos del lobo: an interview with Carlos Alazraki
Guzmán, Héctor; Alazraki, Carlos
1995-01-01
Interview with the Mexican advertising executive Carlos Alazraki, touching on the themes of humor and Mexican culture in advertising. Includes visual work by the painter Davis Birks, reproduced in black and white.
On the Markov Chain Monte Carlo (MCMC) method
Indian Academy of Sciences (India)
Markov Chain Monte Carlo (MCMC) is a popular method used to generate samples from arbitrary distributions, which may be specified indirectly. In this article, we give an introduction to this method along with some examples.
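A minimal sketch of the Metropolis algorithm, the prototypical MCMC method introduced in articles such as this one; the Gaussian random-walk proposal and the example target distribution are arbitrary illustrative choices.

```python
import math, random

def metropolis(log_target, x0, n_samples, step=1.0, seed=0):
    """Random-walk Metropolis sampler: draws (correlated) samples from a
    distribution specified only through an unnormalised log-density."""
    rng = random.Random(seed)
    x = x0
    logp = log_target(x)
    samples = []
    for _ in range(n_samples):
        y = x + rng.gauss(0.0, step)              # symmetric proposal
        logq = log_target(y)
        if math.log(rng.random()) < logq - logp:  # Metropolis acceptance rule
            x, logp = y, logq
        samples.append(x)
    return samples

# Example: sample from a standard normal specified only up to a constant
draws = metropolis(lambda x: -0.5 * x * x, x0=0.0, n_samples=50_000)
print(sum(draws) / len(draws))   # sample mean should be close to 0
```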
Usefulness of the Monte Carlo method in reliability calculations
International Nuclear Information System (INIS)
Lanore, J.M.; Kalli, H.
1977-01-01
Three examples of reliability Monte Carlo programs developed in the LEP (Laboratory for Radiation Shielding Studies in the Nuclear Research Center at Saclay) are presented. First, an uncertainty analysis is given for a simplified spray system; the Monte Carlo program PATREC-MC has been written to solve the problem with the system components given in the fault tree representation. The second program, MONARC 2, has been written to solve the problem of complex-system reliability by Monte Carlo simulation; here again the system (a residual heat removal system) is in the fault tree representation. Third, the Monte Carlo program MONARC was used instead of the Markov diagram to solve the simulation problem of an electric power supply including two nets and two stand-by diesels.
3D SURVEY OF THE SAN CARLO THEATRE IN NAPLES
Directory of Open Access Journals (Sweden)
V. Cappellini
2012-09-01
Full Text Available The article reports the approach developed for the 3D modeling of an important monument in Naples: San Carlo Theatre, the oldest Opera House in Europe recognized as a UNESCO World Heritage site.
Carlo Ginzburg: the anomaly points to the norm / interviewed by Marek Tamm
Ginzburg, Carlo, 1939-
2014-01-01
Interview with the Italian historian Carlo Ginzburg on the occasion of the publication in Estonian of his book "Ükski saar pole saar: neli pilguheitu inglise kirjandusele globaalsest vaatenurgast" (No Island Is an Island: Four Glances at English Literature in a World Perspective). The work was published by Tallinn University Press.
The Monte Carlo simulation of the Ladon photon beam facility
International Nuclear Information System (INIS)
Strangio, C.
1976-01-01
The backward Compton scattering of laser light against high energy electrons has been simulated with a Monte Carlo method. The main features of the produced photon beam are reported, as well as a careful description of the numerical calculation.
Monte Carlo methods for the self-avoiding walk
International Nuclear Information System (INIS)
Janse van Rensburg, E J
2009-01-01
The numerical simulation of self-avoiding walks remains a significant component in the study of random objects in lattices. In this review, I give a comprehensive overview of the current state of Monte Carlo simulations of models of self-avoiding walks. The self-avoiding walk model is revisited, and the motivations for Monte Carlo simulations of this model are discussed. Efficient sampling of self-avoiding walks remains an elusive objective, but significant progress has been made over the last three decades. The model still poses challenging numerical questions, however, and I review specific Monte Carlo methods for improved sampling, including general Monte Carlo techniques such as Metropolis sampling, umbrella sampling and multiple Markov chain sampling. In addition, specific static and dynamic algorithms for walks are presented, and I give an overview of recent innovations in this field, including algorithms such as flatPERM, flatGARM and flatGAS. (topical review)
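As a baseline against which the specialised algorithms in the review can be appreciated, here is a sketch of the simplest possible Monte Carlo for self-avoiding walks, plain simple sampling with rejection; the step count and trial number are illustrative assumptions.

```python
import random

def sample_saw_attempt(n_steps, rng):
    """Try to grow a self-avoiding walk on the square lattice by simple sampling:
    take uniformly random steps and reject the whole walk as soon as it intersects
    itself.  Correct but exponentially inefficient in n_steps, which is exactly why
    the specialised algorithms reviewed in the article (pivot, PERM, flatPERM, ...)
    were developed."""
    steps = [(1, 0), (-1, 0), (0, 1), (0, -1)]
    pos = (0, 0)
    visited = {pos}
    for _ in range(n_steps):
        dx, dy = rng.choice(steps)
        pos = (pos[0] + dx, pos[1] + dy)
        if pos in visited:
            return None          # walk intersected itself: reject the attempt
        visited.add(pos)
    return visited

def saw_acceptance_rate(n_steps=15, trials=100_000, seed=0):
    rng = random.Random(seed)
    accepted = sum(1 for _ in range(trials) if sample_saw_attempt(n_steps, rng))
    return accepted / trials     # estimates c_n / 4^n, the SAW fraction among all walks

print(saw_acceptance_rate())
```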
Bayesian phylogeny analysis via stochastic approximation Monte Carlo
Cheon, Sooyoung
2009-11-01
Monte Carlo methods have received much attention in the recent literature of phylogeny analysis. However, the conventional Markov chain Monte Carlo algorithms, such as the Metropolis-Hastings algorithm, tend to get trapped in a local mode in simulating from the posterior distribution of phylogenetic trees, rendering the inference ineffective. In this paper, we apply an advanced Monte Carlo algorithm, the stochastic approximation Monte Carlo (SAMC) algorithm, to Bayesian phylogeny analysis. Our method is compared with two popular Bayesian phylogeny software packages, BAMBE and MrBayes, on simulated and real datasets. The numerical results indicate that our method outperforms BAMBE and MrBayes. Among the three methods, SAMC produces the consensus trees with the highest similarity to the true trees and the model parameter estimates with the smallest mean square errors, while requiring the least CPU time. © 2009 Elsevier Inc. All rights reserved.
Combinatorial nuclear level density by a Monte Carlo method
International Nuclear Information System (INIS)
Cerf, N.
1994-01-01
We present a new combinatorial method for the calculation of the nuclear level density. It is based on a Monte Carlo technique, in order to avoid a direct counting procedure which is generally impracticable for high-A nuclei. The Monte Carlo simulation, making use of the Metropolis sampling scheme, allows a computationally fast estimate of the level density for many-fermion systems in large shell model spaces. We emphasize the advantages of this Monte Carlo approach, particularly concerning the prediction of the spin and parity distributions of the excited states, and compare our results with those derived from a traditional combinatorial or a statistical method. Such a Monte Carlo technique seems very promising to determine accurate level densities in a large energy range for nuclear reaction calculations.
Monte Carlo techniques for analyzing deep penetration problems
International Nuclear Information System (INIS)
Cramer, S.N.; Gonnord, J.; Hendricks, J.S.
1985-01-01
A review of current methods and difficulties in Monte Carlo deep-penetration calculations is presented. Statistical uncertainty is discussed, and recent adjoint optimization of splitting, Russian roulette, and exponential transformation biasing is reviewed. Other aspects of the random walk and estimation processes are covered, including the relatively new DXANG angular biasing technique. Specific items summarized are albedo scattering, Monte Carlo coupling techniques with discrete ordinates and other methods, adjoint solutions, and multi-group Monte Carlo. The topic of code-generated biasing parameters is presented, including the creation of adjoint importance functions from forward calculations. Finally, current and future work in the area of computer learning and artificial intelligence is discussed in connection with Monte Carlo applications. 29 refs
Time step length versus efficiency of Monte Carlo burnup calculations
International Nuclear Information System (INIS)
Dufek, Jan; Valtavirta, Ville
2014-01-01
Highlights: • Time step length largely affects efficiency of MC burnup calculations. • Efficiency of MC burnup calculations improves with decreasing time step length. • Results were obtained from SIE-based Monte Carlo burnup calculations. - Abstract: We demonstrate that efficiency of Monte Carlo burnup calculations can be largely affected by the selected time step length. This study employs the stochastic implicit Euler based coupling scheme for Monte Carlo burnup calculations that performs a number of inner iteration steps within each time step. In a series of calculations, we vary the time step length and the number of inner iteration steps; the results suggest that Monte Carlo burnup calculations get more efficient as the time step length is reduced. More time steps must be simulated as they get shorter; however, this is more than compensated by the decrease in computing cost per time step needed for achieving a certain accuracy
Herwig: The Evolution of a Monte Carlo Simulation
CERN. Geneva
2015-01-01
Monte Carlo event generation has seen significant developments in the last 10 years, starting with preparation for the LHC and continuing through the first LHC run. I will discuss the basic ideas behind Monte Carlo event generators and then go on to discuss these developments, focussing on the developments in the Herwig(++) event generator. I will conclude by presenting the current status of event generation together with some results of the forthcoming new version of Herwig, Herwig 7.
NUEN-618 Class Project: Actually Implicit Monte Carlo
Energy Technology Data Exchange (ETDEWEB)
Vega, R. M. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Brunner, T. A. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)
2017-12-14
This research describes a new method for the solution of the thermal radiative transfer (TRT) equations that is implicit in time which will be called Actually Implicit Monte Carlo (AIMC). This section aims to introduce the TRT equations, as well as the current workhorse method which is known as Implicit Monte Carlo (IMC). As the name of the method proposed here indicates, IMC is a misnomer in that it is only semi-implicit, which will be shown in this section as well.
Studies of Monte Carlo Modelling of Jets at ATLAS
Kar, Deepak; The ATLAS collaboration
2017-01-01
The predictions of different Monte Carlo generators for QCD jet production, both in multijets and for jets produced in association with other objects, are presented. Recent improvements in showering Monte Carlos provide new tools for assessing systematic uncertainties associated with these jets. Studies of the dependence of physical observables on the choice of shower tune parameters and new prescriptions for assessing systematic uncertainties associated with the choice of shower model and tune are presented.
Multiscale Monte Carlo equilibration: Pure Yang-Mills theory
Endres, Michael G.; Brower, Richard C.; Detmold, William; Orginos, Kostas; Pochinsky, Andrew V.
2015-12-01
We present a multiscale thermalization algorithm for lattice gauge theory, which enables efficient parallel generation of uncorrelated gauge field configurations. The algorithm combines standard Monte Carlo techniques with ideas drawn from real space renormalization group and multigrid methods. We demonstrate the viability of the algorithm for pure Yang-Mills gauge theory for both heat bath and hybrid Monte Carlo evolution, and show that it ameliorates the problem of topological freezing up to controllable lattice spacing artifacts.
The sine Gordon model perturbation theory and cluster Monte Carlo
Hasenbusch, M; Pinn, K
1994-01-01
We study the expansion of the surface thickness in the 2-dimensional lattice Sine Gordon model in powers of the fugacity z. Using the expansion to order z**2, we derive lines of constant physics in the rough phase. We describe and test a VMR cluster algorithm for the Monte Carlo simulation of the model. The algorithm shows nearly no critical slowing down. We apply the algorithm in a comparison of our perturbative results with Monte Carlo data.
Monte Carlo methods and applications in nuclear physics
International Nuclear Information System (INIS)
Carlson, J.
1990-01-01
Monte Carlo methods for studying few- and many-body quantum systems are introduced, with special emphasis given to their applications in nuclear physics. Variational and Green's function Monte Carlo methods are presented in some detail. The status of calculations of light nuclei is reviewed, including discussions of the three-nucleon interaction, charge and magnetic form factors, the Coulomb sum rule, and studies of low-energy radiative transitions. 58 refs., 12 figs
Monte Carlo methods and applications in nuclear physics
Energy Technology Data Exchange (ETDEWEB)
Carlson, J.
1990-01-01
Monte Carlo methods for studying few- and many-body quantum systems are introduced, with special emphasis given to their applications in nuclear physics. Variational and Green's function Monte Carlo methods are presented in some detail. The status of calculations of light nuclei is reviewed, including discussions of the three-nucleon interaction, charge and magnetic form factors, the Coulomb sum rule, and studies of low-energy radiative transitions. 58 refs., 12 figs.
Monte Carlo method for solving a parabolic problem
Directory of Open Access Journals (Sweden)
Tian Yi
2016-01-01
Full Text Available In this paper, we present a numerical method based on random sampling for a parabolic problem. The method combines the Crank-Nicolson method with the Monte Carlo method. In the numerical algorithm, we first discretize the governing equations with the Crank-Nicolson method, obtaining a large sparse system of linear algebraic equations, and then use the Monte Carlo method to solve this linear system. To illustrate the usefulness of this technique, we apply it to some test problems.
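The abstract does not specify which Monte Carlo linear solver is used, so the sketch below shows one classical option: the random-walk (Neumann-series) estimator for a fixed-point system x = Hx + f, the kind of system a Crank-Nicolson discretization yields after rearrangement; the matrix, stopping probability, and walk counts are illustrative assumptions, not details from the paper.

```python
import random

def mc_solve_component(H, f, i, walks=20_000, p_stop=0.5, seed=0):
    """Estimate the i-th component of the solution of x = H x + f by the classical
    random-walk (Neumann-series) Monte Carlo method.  Requires the Neumann series
    sum_k H^k f to converge (spectral radius of H below 1)."""
    rng = random.Random(seed)
    n = len(f)
    total = 0.0
    for _ in range(walks):
        state, weight, score = i, 1.0, 0.0
        while True:
            score += weight * f[state]
            if rng.random() < p_stop:          # terminate the walk
                break
            nxt = rng.randrange(n)             # uniform transition to the next index
            # importance weight: kernel entry divided by transition probability
            weight *= H[state][nxt] / ((1.0 - p_stop) / n)
            state = nxt
        total += score
    return total / walks

# Tiny toy system x = Hx + f; compare against a direct solve of (I - H) x = f
H = [[0.1, 0.2], [0.3, 0.1]]
f = [1.0, 2.0]
print([mc_solve_component(H, f, i) for i in range(2)])
```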
Monte Carlos of the new generation: status and progress
International Nuclear Information System (INIS)
Frixione, Stefano
2005-01-01
Standard parton shower Monte Carlos are designed to give reliable descriptions of low-pT physics. In the very high-energy regime of modern colliders, this may lead to largely incorrect predictions of the basic reaction processes. This motivated the recent theoretical efforts aimed at improving Monte Carlos through the inclusion of matrix elements computed beyond the leading order in QCD. I briefly review the progress made, and discuss bottom production at the Tevatron.
The computation of Greeks with multilevel Monte Carlo
Sylvestre Burgos; M. B. Giles
2011-01-01
In mathematical finance, the sensitivities of option prices to various market parameters, also known as the “Greeks”, reflect the exposure to different sources of risk. Computing these is essential to predict the impact of market moves on portfolios and to hedge them adequately. This is commonly done using Monte Carlo simulations. However, obtaining accurate estimates of the Greeks can be computationally costly. Multilevel Monte Carlo offers complexity improvements over standard Monte Carl...
Modern analysis of ion channeling data by Monte Carlo simulations
Energy Technology Data Exchange (ETDEWEB)
Nowicki, Lech [Andrzej SoItan Institute for Nuclear Studies, ul. Hoza 69, 00-681 Warsaw (Poland)]. E-mail: lech.nowicki@fuw.edu.pl; Turos, Andrzej [Institute of Electronic Materials Technology, Wolczynska 133, 01-919 Warsaw (Poland); Ratajczak, Renata [Andrzej SoItan Institute for Nuclear Studies, ul. Hoza 69, 00-681 Warsaw (Poland); Stonert, Anna [Andrzej SoItan Institute for Nuclear Studies, ul. Hoza 69, 00-681 Warsaw (Poland); Garrido, Frederico [Centre de Spectrometrie Nucleaire et Spectrometrie de Masse, CNRS-IN2P3-Universite Paris-Sud, 91405 Orsay (France)
2005-10-15
The basic scheme of Monte Carlo simulation of ion channeling spectra is reformulated in terms of statistical sampling. The McChasy simulation code is described and two examples of the code applications are presented. These are: calculation of the projectile flux in a uranium dioxide crystal and defect analysis for an ion-implanted InGaAsP/InP superlattice. Virtues and pitfalls of defect analysis using Monte Carlo simulations are discussed.
Advances in Monte Carlo electron transport
International Nuclear Information System (INIS)
Bielajew, Alex F.
1995-01-01
Notwithstanding the success of Monte Carlo (MC) calculations for determining ion chamber correction factors for air-kerma standards and radiotherapy applications, a great challenge remains. MC is unable to calculate ion chamber response to better than 1% for low-Z and 3% for high-Z wall materials. Moreover, the two major MC code systems employed in radiation dosimetry, the EGS and ITS codes, differ in opposite directions from ion chamber experiments. The discrepancy with experiment is due to inadequacies in the underlying e⁻ condensed-history algorithms. As modeled by MC calculations, the e⁻ step-lengths in the chamber walls and the ionisation cavity differ in terms of material traversed by about three orders of magnitude. This demands that the underlying e⁻ transport algorithms be very stable over a great dynamic range. Otherwise a spurious e⁻ disequilibrium may be generated. The multiple-scattering (MS) algorithms, Moliere in the case of EGS and Goudsmit-Saunderson (GS) in the case of ITS, are either mathematically or numerically unstable in the plural-scattering environment of the ionisation cavity. Recently, a new MS theory has been developed that is an exact solution of the Wentzel small-angle formalism using a screened Rutherford cross section. This new MS theory is mathematically, physically and numerically stable from the no-scattering to the MS regimes. This theory is the small-angle equivalent of the GS equation for a Rutherford cross section. Large-angle corrections connecting this theory to GS theory have been derived by Bethe. The Moliere theory is the large-pathlength limit of this theory. The strategy for employing this new theory for ion chamber and radiotherapy calculations is described.
Monte carlo sampling of fission multiplicity.
Energy Technology Data Exchange (ETDEWEB)
Hendricks, J. S. (John S.)
2004-01-01
Two new methods have been developed for fission multiplicity modeling in Monte Carlo calculations. The traditional method of sampling neutron multiplicity from fission is to sample the number of neutrons above or below the average. For example, if there are 2.7 neutrons per fission, three would be chosen 70% of the time and two would be chosen 30% of the time. For many applications, particularly ³He coincidence counting, a better estimate of the true number of neutrons per fission is required. Generally, this number is estimated by sampling a Gaussian distribution about the average. However, because the tail of the Gaussian distribution is negative and negative neutrons cannot be produced, a slight positive bias can be found in the average value. For criticality calculations, the result of rejecting the negative neutrons is an increase in k_eff of 0.1% in some cases. For spontaneous fission, where the average number of neutrons emitted from fission is low, the error also can be unacceptably large. If the Gaussian width approaches the average number of neutrons per fission, 10% too many fission neutrons are produced by not treating the negative Gaussian tail adequately. The first method to treat the Gaussian tail is to determine a correction offset, which then is subtracted from all sampled values of the number of neutrons produced. This offset depends on the average value for any given fission at any energy and must be computed efficiently at each fission from the non-integrable error function. The second method is to determine a corrected zero point so that all neutrons sampled between zero and the corrected zero point are killed to compensate for the negative Gaussian tail bias. Again, the zero point must be computed efficiently at each fission. Both methods give excellent results with a negligible computing time penalty. It is now possible to include the full effects of fission multiplicity without the negative Gaussian tail bias.
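A hedged sketch of the two sampling strategies the abstract contrasts: the traditional integer bracketing of the mean (exactly unbiased) and a naive truncated-Gaussian sampler whose rejected negative tail biases the mean upward. The numerical values and the rounding rule are illustrative assumptions, and the paper's offset and zero-point corrections are not reproduced here.

```python
import math, random

def sample_nu_integer(nu_bar, rng):
    """Traditional multiplicity sampling described in the abstract: choose the two
    integers bracketing the mean with probabilities that reproduce it exactly
    (e.g. nu_bar = 2.7 -> 3 with probability 0.7, 2 with probability 0.3)."""
    lo = math.floor(nu_bar)
    return lo + (1 if rng.random() < (nu_bar - lo) else 0)

def sample_nu_gaussian(nu_bar, width, rng):
    """Naive Gaussian multiplicity sampling with rejection of negative values.
    As the abstract notes, simply rejecting the negative tail biases the mean
    upward; the paper's correction schemes are designed to remove that bias."""
    while True:
        nu = rng.gauss(nu_bar, width)
        if nu >= 0.0:
            return max(0, round(nu))

rng = random.Random(0)
n = 200_000
print(sum(sample_nu_integer(2.7, rng) for _ in range(n)) / n)        # ~2.7, unbiased
print(sum(sample_nu_gaussian(0.8, 1.1, rng) for _ in range(n)) / n)  # noticeably above 0.8
```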
Monte Carlo Volcano Seismic Moment Tensors
Waite, G. P.; Brill, K. A.; Lanza, F.
2015-12-01
Inverse modeling of volcano seismic sources can provide insight into the geometry and dynamics of volcanic conduits. But given the logistical challenges of working on an active volcano, seismic networks are typically deficient in spatial and temporal coverage; this potentially leads to large errors in source models. In addition, uncertainties in the centroid location and moment-tensor components, including volumetric components, are difficult to constrain from the linear inversion results, which leads to a poor understanding of the model space. In this study, we employ a nonlinear inversion using a Monte Carlo scheme with the objective of defining robustly resolved elements of model space. The model space is randomized by centroid location and moment tensor eigenvectors. Point sources densely sample the summit area and moment tensors are constrained to a randomly chosen geometry within the inversion; Green's functions for the random moment tensors are all calculated from modeled single forces, making the nonlinear inversion computationally reasonable. We apply this method to very-long-period (VLP) seismic events that accompany minor eruptions at Fuego volcano, Guatemala. The library of single-force Green's functions is computed with a 3D finite-difference modeling algorithm through a homogeneous velocity-density model that includes topography, for a 3D grid of nodes, spaced 40 m apart, within the summit region. The homogeneous velocity and density model is justified by the long wavelengths of the VLP data. The nonlinear inversion reveals well-resolved model features and informs the interpretation through a better understanding of the possible models. This approach can also be used to evaluate possible station geometries in order to optimize networks prior to deployment.
A Monte Carlo Evaluation of Weighted Community Detection Algorithms
Directory of Open Access Journals (Sweden)
Kathleen Gates
2016-11-01
Full Text Available The past decade has been marked by a proliferation of community detection algorithms that aim to organize nodes (e.g., individuals, brain regions, variables) into modular structures that indicate subgroups, clusters, or communities. Motivated by the emergence of big data across many fields of inquiry, these methodological developments have primarily focused on the detection of communities of nodes from matrices that are very large. However, it remains unknown if the algorithms can reliably detect communities in smaller graph sizes (i.e., 1000 nodes and fewer), which are commonly used in brain research. More importantly, these algorithms have predominantly been tested only on binary or sparse count matrices, and it remains unclear to what degree the algorithms can recover community structure for different types of matrices, such as the often used cross-correlation matrices representing functional connectivity across predefined brain regions. Of the publicly available approaches for weighted graphs that can detect communities in graph sizes of at least 1000, prior research has demonstrated that Newman's spectral approach (i.e., Leading Eigenvalue), Walktrap, Fast Modularity, the Louvain method (i.e., the multilevel community method), Label Propagation, and Infomap all recover communities exceptionally well in certain circumstances. The purpose of the present Monte Carlo simulation study is to test these methods across a large number of conditions, including varied graph sizes and types of matrix (sparse count, correlation, and reflected Euclidean distance), to identify which algorithm is optimal for specific types of data matrices. The results indicate that when the data are in the form of sparse count networks (such as those seen in diffusion tensor imaging), Label Propagation and Walktrap surfaced as the most reliable methods for community detection. For dense, weighted networks such as correlation matrices capturing functional connectivity
Hydrogen analysis depth calibration by CORTEO Monte-Carlo simulation
Energy Technology Data Exchange (ETDEWEB)
Moser, M., E-mail: marcus.moser@unibw.de [Universität der Bundeswehr München, Institut für Angewandte Physik und Messtechnik LRT2, Fakultät für Luft- und Raumfahrttechnik, 85577 Neubiberg (Germany); Reichart, P.; Bergmaier, A.; Greubel, C. [Universität der Bundeswehr München, Institut für Angewandte Physik und Messtechnik LRT2, Fakultät für Luft- und Raumfahrttechnik, 85577 Neubiberg (Germany); Schiettekatte, F. [Université de Montréal, Département de Physique, Montréal, QC H3C 3J7 (Canada); Dollinger, G., E-mail: guenther.dollinger@unibw.de [Universität der Bundeswehr München, Institut für Angewandte Physik und Messtechnik LRT2, Fakultät für Luft- und Raumfahrttechnik, 85577 Neubiberg (Germany)
2016-03-15
Hydrogen imaging with sub-μm lateral resolution and sub-ppm sensitivity has become possible with coincident proton–proton (pp) scattering analysis (Reichart et al., 2004). Depth information is evaluated from the energy sum signal with respect to the energy loss of both protons on their path through the sample. In first order, there is no angular dependence due to elastic scattering. In second order, a path length effect due to different energy loss on the paths of the protons causes an angular dependence of the energy sum. Therefore, the energy sum signal has to be de-convoluted depending on the matrix composition, i.e. mainly the atomic number Z, in order to get a depth-calibrated hydrogen profile. Although the path effect can be calculated analytically in first order, multiple scattering effects lead to significant deviations in the depth profile. Hence, in our new approach, we use the CORTEO Monte-Carlo code (Schiettekatte, 2008) in order to calculate the depth of a coincidence event depending on the scattering angle. The code takes the individual detector geometry into account. In this paper we show that the code correctly reproduces measured pp-scattering energy spectra with roughness effects considered. With more than 100 μm thick Mylar-sandwich targets (Si, Fe, Ge) we demonstrate the deconvolution of the energy spectra on our current multistrip detector at the microprobe SNAKE at the Munich tandem accelerator lab. As a result, hydrogen profiles can be evaluated with an accuracy in depth of about 1% of the sample thickness.
Mean field theory of the swap Monte Carlo algorithm.
Ikeda, Harukuni; Zamponi, Francesco; Ikeda, Atsushi
2017-12-21
The swap Monte Carlo algorithm combines translational motion with the exchange of particle species and is unprecedentedly efficient for some models of glass formers. In order to clarify the physics underlying this acceleration, we study the problem within the mean field replica liquid theory. We extend the Gaussian ansatz so as to take into account the exchange of particles of different species, and we calculate analytically the dynamical glass transition points corresponding to the swap and standard Monte Carlo algorithms. We show that the system evolved with the standard Monte Carlo algorithm exhibits the dynamical transition before that of the swap Monte Carlo algorithm. We also test the result by performing computer simulations of a binary mixture of the Mari-Kurchan model, both with standard and swap Monte Carlo. This scenario provides a possible explanation for the efficiency of the swap Monte Carlo algorithm. Finally, we discuss how the thermodynamic theory of the glass transition should be modified based on our results.
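To illustrate what a swap move is, here is a toy Metropolis simulation of a two-dimensional binary soft-sphere mixture in which ordinary translational moves are interleaved with attempted exchanges of two particles' diameters; the potential, particle numbers and move parameters are illustrative assumptions, and this is not the Mari-Kurchan model studied in the paper.

```python
import math, random

def total_energy(pos, sig, box):
    """Total energy of a 2D periodic binary mixture with a soft repulsive
    potential u(r) = (d_ij / r)^12, d_ij = (sig_i + sig_j) / 2 (toy model)."""
    e, n = 0.0, len(pos)
    for i in range(n):
        for j in range(i + 1, n):
            dx = pos[i][0] - pos[j][0]; dx -= box * round(dx / box)  # minimum image
            dy = pos[i][1] - pos[j][1]; dy -= box * round(dy / box)
            d = 0.5 * (sig[i] + sig[j])
            e += (d * d / (dx * dx + dy * dy)) ** 6
    return e

def swap_mc(pos, sig, box, beta, sweeps, seed=0):
    """Minimal swap Monte Carlo: translational Metropolis moves mixed with
    attempted diameter exchanges, both accepted with the Metropolis rule.
    (Recomputing the full energy each move is wasteful but keeps the sketch short.)"""
    rng = random.Random(seed)
    e = total_energy(pos, sig, box)
    n = len(pos)
    for _ in range(sweeps * n):
        if rng.random() < 0.8:                       # translational move
            i = rng.randrange(n)
            old = pos[i]
            pos[i] = ((old[0] + rng.uniform(-0.1, 0.1)) % box,
                      (old[1] + rng.uniform(-0.1, 0.1)) % box)
            e_new = total_energy(pos, sig, box)
            de = e_new - e
            if de <= 0 or rng.random() < math.exp(-beta * de):
                e = e_new
            else:
                pos[i] = old                         # reject: restore position
        else:                                        # swap move: exchange diameters
            i, j = rng.sample(range(n), 2)
            sig[i], sig[j] = sig[j], sig[i]
            e_new = total_energy(pos, sig, box)
            de = e_new - e
            if de <= 0 or rng.random() < math.exp(-beta * de):
                e = e_new
            else:
                sig[i], sig[j] = sig[j], sig[i]      # reject: swap back
    return e

# Usage: 20 particles, half small and half large, in a periodic box
rng = random.Random(1)
box = 6.0
pos = [(rng.uniform(0, box), rng.uniform(0, box)) for _ in range(20)]
sig = [0.9] * 10 + [1.1] * 10
print(swap_mc(pos, sig, box, beta=1.0, sweeps=50))
```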
Bayesian Optimal Experimental Design Using Multilevel Monte Carlo
Ben Issaid, Chaouki
2015-01-07
Experimental design is very important since experiments are often resource-intensive and time-consuming. We carry out experimental design in the Bayesian framework. To measure the amount of information that can be extracted from the data in an experiment, we use the expected information gain as the utility function, which specifically is the expected logarithmic ratio between the posterior and prior distributions. Optimizing this utility function enables us to design experiments that yield the most informative data for our purpose. One of the major difficulties in evaluating the expected information gain is that the integral is nested and can be high dimensional. We propose using Multilevel Monte Carlo techniques to accelerate the computation of this nested high-dimensional integral. The advantages are twofold. First, Multilevel Monte Carlo can significantly reduce the cost of the nested integral for a given tolerance by using an optimal sample distribution among the different sample averages of the inner integrals. Second, the Multilevel Monte Carlo method imposes fewer assumptions, such as the concentration of measure required by the Laplace method. We test our Multilevel Monte Carlo technique using a numerical example on the design of sensor deployment for a Darcy flow problem governed by a one-dimensional Laplace equation. We also compare the performance of Multilevel Monte Carlo, the Laplace approximation and direct double-loop Monte Carlo.
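For concreteness, the sketch below evaluates the expected information gain for a toy linear-Gaussian experiment with the plain double-loop (nested) Monte Carlo estimator that the multilevel method is designed to accelerate; the model, noise level and sample sizes are illustrative assumptions, not the Darcy flow example of the paper.

```python
import math, random

def eig_dlmc(design, sigma=0.5, n_outer=2000, n_inner=2000, seed=0):
    """Double-loop Monte Carlo estimator of the expected information gain
    EIG(d) = E_{theta,y}[ log p(y|theta,d) - log p(y|d) ] for the toy model
    y = d*theta + noise, with theta ~ N(0,1) and noise ~ N(0, sigma^2)."""
    rng = random.Random(seed)

    def log_like(y, theta):
        return (-0.5 * ((y - design * theta) / sigma) ** 2
                - math.log(sigma * math.sqrt(2.0 * math.pi)))

    total = 0.0
    for _ in range(n_outer):
        theta = rng.gauss(0.0, 1.0)                   # draw parameter from the prior
        y = design * theta + rng.gauss(0.0, sigma)    # simulate the corresponding data
        # inner loop: Monte Carlo estimate of the evidence p(y | d)
        inner = sum(math.exp(log_like(y, rng.gauss(0.0, 1.0)))
                    for _ in range(n_inner)) / n_inner
        total += log_like(y, theta) - math.log(inner)
    return total / n_outer

d = 2.0
# estimate vs the analytic value 0.5*log(1 + d^2/sigma^2) for this Gaussian model
print(eig_dlmc(d), 0.5 * math.log(1.0 + d * d / 0.25))
```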
Implications of Monte Carlo Statistical Errors in Criticality Safety Assessments
International Nuclear Information System (INIS)
Pevey, Ronald E.
2005-01-01
Most criticality safety calculations are performed using Monte Carlo techniques because of Monte Carlo's ability to handle complex three-dimensional geometries. For Monte Carlo calculations, the more histories sampled, the lower the standard deviation of the resulting estimates. The common intuition is, therefore, that the more histories, the better; as a result, analysts tend to run Monte Carlo analyses as long as possible (or at least to a minimum acceptable uncertainty). For Monte Carlo criticality safety analyses, however, the optimization situation is complicated by the fact that procedures usually require that an extra margin of safety be added because of the statistical uncertainty of the Monte Carlo calculations. This additional safety margin affects the impact of the choice of the calculational standard deviation, both on production and on safety. This paper shows that, under the assumptions of normally distributed benchmarking calculational errors and exact compliance with the upper subcritical limit (USL), the standard deviation that optimizes production is zero, but there is a non-zero value of the calculational standard deviation that minimizes the risk of inadvertently labeling a supercritical configuration as subcritical. Furthermore, this value is shown to be a simple function of the typical benchmarking step outcomes--the bias, the standard deviation of the bias, the upper subcritical limit, and the number of standard deviations added to calculated k-effectives before comparison to the USL
Present status of transport code development based on Monte Carlo method
International Nuclear Information System (INIS)
Nakagawa, Masayuki
1985-01-01
The present status of development in Monte Carlo codes is briefly reviewed. The main items are the following: application fields; methods used in Monte Carlo codes (geometry specification, nuclear data, estimators and variance reduction techniques) and unfinished work; typical Monte Carlo codes; and merits of continuous-energy Monte Carlo codes. (author)
Monte Carlo Techniques for Nuclear Systems - Theory Lectures
Energy Technology Data Exchange (ETDEWEB)
Brown, Forrest B. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States). Monte Carlo Methods, Codes, and Applications Group; Univ. of New Mexico, Albuquerque, NM (United States). Nuclear Engineering Dept.
2016-11-29
These are lecture notes for a Monte Carlo class given at the University of New Mexico. The following topics are covered: course information; nuclear eng. review & MC; random numbers and sampling; computational geometry; collision physics; tallies and statistics; eigenvalue calculations I; eigenvalue calculations II; eigenvalue calculations III; variance reduction; parallel Monte Carlo; parameter studies; fission matrix and higher eigenmodes; doppler broadening; Monte Carlo depletion; HTGR modeling; coupled MC and T/H calculations; fission energy deposition. Solving particle transport problems with the Monte Carlo method is simple - just simulate the particle behavior. The devil is in the details, however. These lectures provide a balanced approach to the theory and practice of Monte Carlo simulation codes. The first lectures provide an overview of Monte Carlo simulation methods, covering the transport equation, random sampling, computational geometry, collision physics, and statistics. The next lectures focus on the state-of-the-art in Monte Carlo criticality simulations, covering the theory of eigenvalue calculations, convergence analysis, dominance ratio calculations, bias in Keff and tallies, bias in uncertainties, a case study of a realistic calculation, and Wielandt acceleration techniques. The remaining lectures cover advanced topics, including HTGR modeling and stochastic geometry, temperature dependence, fission energy deposition, depletion calculations, parallel calculations, and parameter studies. This portion of the class focuses on using MCNP to perform criticality calculations for reactor physics and criticality safety applications. It is an intermediate level class, intended for those with at least some familiarity with MCNP. Class examples provide hands-on experience at running the code, plotting both geometry and results, and understanding the code output. The class includes lectures & hands-on computer use for a variety of Monte Carlo calculations
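In the spirit of "just simulate the particle behavior", here is a purely pedagogical analog Monte Carlo for neutral-particle transmission through a homogeneous slab; the cross section, scattering probability and slab thickness are arbitrary illustrative values, and the code is not an excerpt from the lecture notes or from MCNP.

```python
import math, random

def slab_transmission(thickness, sigma_t, scatter_prob, histories=100_000, seed=0):
    """Analog Monte Carlo transport in a 1D slab: sample the distance to the next
    collision from an exponential, then choose absorption or isotropic scattering,
    and tally the fraction of particles transmitted through the far face."""
    rng = random.Random(seed)
    transmitted = 0
    for _ in range(histories):
        x, mu = 0.0, 1.0                             # start at the left face, moving right
        while True:
            # sample the optical path to the next collision (1 - random() avoids log(0))
            x += mu * (-math.log(1.0 - rng.random()) / sigma_t)
            if x >= thickness:
                transmitted += 1                     # leaked through the far face
                break
            if x < 0.0:
                break                                # leaked back out of the near face
            if rng.random() > scatter_prob:
                break                                # absorbed at the collision site
            mu = rng.uniform(-1.0, 1.0)              # isotropic scatter in slab geometry
    return transmitted / histories

print(slab_transmission(thickness=5.0, sigma_t=1.0, scatter_prob=0.5))
```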
Monte Carlo systems used for treatment planning and dose verification
Energy Technology Data Exchange (ETDEWEB)
Brualla, Lorenzo [Universitaetsklinikum Essen, NCTeam, Strahlenklinik, Essen (Germany); Rodriguez, Miguel [Centro Medico Paitilla, Balboa (Panama); Lallena, Antonio M. [Universidad de Granada, Departamento de Fisica Atomica, Molecular y Nuclear, Granada (Spain)
2017-04-15
General-purpose radiation transport Monte Carlo codes have been used for estimation of the absorbed dose distribution in external photon and electron beam radiotherapy patients for several decades. Results obtained with these codes are usually more accurate than those provided by treatment planning systems based on non-stochastic methods. Traditionally, absorbed dose computations based on general-purpose Monte Carlo codes have been used only for research, owing to the difficulties associated with setting up a simulation and the long computation time required. To take advantage of radiation transport Monte Carlo codes applied to routine clinical practice, researchers and private companies have developed treatment planning and dose verification systems that are partly or fully based on fast Monte Carlo algorithms. This review presents a comprehensive list of the currently existing Monte Carlo systems that can be used to calculate or verify an external photon and electron beam radiotherapy treatment plan. Particular attention is given to those systems that are distributed, either freely or commercially, and that do not require programming tasks from the end user. These systems are compared in terms of features and the simulation time required to compute a set of benchmark calculations. (orig.)
DEFF Research Database (Denmark)
Iglesias, Juan Eugenio; Sabuncu, Mert Rory; Van Leemput, Koen
2013-01-01
Many segmentation algorithms in medical image analysis use Bayesian modeling to augment local image appearance with prior anatomical knowledge. Such methods often contain a large number of free parameters that are first estimated and then kept fixed during the actual segmentation process. However, a faithful Bayesian analysis would marginalize over such parameters, accounting for their uncertainty by considering all possible values they may take. Here we propose to incorporate this uncertainty into Bayesian segmentation methods in order to improve the inference process. In particular, we approximate the required marginalization over model parameters using computationally efficient Markov chain Monte Carlo techniques. We illustrate the proposed approach using a recently developed Bayesian method for the segmentation of hippocampal subfields in brain MRI scans, showing a significant improvement...
Acceleration of Monte Carlo-based scatter compensation for cardiac SPECT
Energy Technology Data Exchange (ETDEWEB)
Sohlberg, A; Watabe, H; Iida, H [National Cardiovascular Center Research Institute, 5-7-1 Fujishiro-dai, Suita City, 565-8565 Osaka (Japan)], E-mail: antti.sohlberg@hermesmedical.com
2008-07-21
Single photon emission computed tomography (SPECT) images are degraded by photon scatter, making scatter compensation essential for accurate reconstruction. Reconstruction-based scatter compensation with Monte Carlo (MC) modelling of scatter shows promise for accurate scatter correction, but it is normally hampered by long computation times. The aim of this work was to accelerate the MC-based scatter compensation using coarse grid and intermittent scatter modelling. The acceleration methods were compared to an un-accelerated implementation using MC-simulated projection data of the mathematical cardiac torso (MCAT) phantom modelling 99mTc uptake and clinical myocardial perfusion studies. The results showed that, when combined, the acceleration methods reduced the reconstruction time for 10 ordered-subset expectation maximization (OS-EM) iterations from 56 to 11 min without a significant reduction in image quality, indicating that coarse grid and intermittent scatter modelling are suitable for MC-based scatter compensation in cardiac SPECT. (note)
Prediction of beam hardening artefacts in computed tomography using Monte Carlo simulations
DEFF Research Database (Denmark)
Thomsen, M.; Bergbäck Knudsen, Erik; Willendrup, Peter Kjær
2015-01-01
We show how radiological images of both single and multi material samples can be simulated using the Monte Carlo simulation tool McXtrace and how these images can be used to make a three dimensional reconstruction. Good numerical agreement between the X-ray attenuation coefficient in experimental and simulated data can be obtained, which allows us to use simulated projections in the linearisation procedure for single material samples and in that way reduce beam hardening artefacts. The simulations can be used to predict beam hardening artefacts in multi material samples with complex geometry, illustrated with an example. Linearisation requires knowledge about the X-ray transmission at varying sample thickness, but in some cases homogeneous calibration phantoms are hard to manufacture, which affects the accuracy of the calibration. Using simulated data overcomes the manufacturing problems...
Calibration of lung counter using a CT model of Torso phantom and Monte Carlo method
International Nuclear Information System (INIS)
Zhang Binquan; Ma Jizeng; Yang Duanjie; Liu Liye; Cheng Jianping
2006-01-01
Tomographic images of a Torso phantom were obtained from a CT scan. The Torso phantom represents the trunk of an adult man, 170 cm tall and weighing 65 kg. After these images were segmented, cropped, and resized, a 3-dimensional voxel phantom was created. The voxel phantom includes more than 2 million voxels, with a voxel size of 2.73 mm x 2.73 mm x 3 mm. This model could be used for the calibration of a lung counter with the Monte Carlo method. On the assumption that radioactive material was homogeneously distributed throughout the lung, counting efficiencies of a HPGe detector in different positions were calculated for different values of the adipose mass fraction (AMF) in the soft tissue of the chest. The results showed that the counting efficiencies of the lung counter changed by up to 67% for the 17.5 keV γ ray and 20% for the 25 keV γ ray when the AMF changed from 0 to 40%. (authors)
In silico imaging: Definition, possibilities and challenges
International Nuclear Information System (INIS)
Badano, Aldo
2011-01-01
The capability to simulate the imaging performance of new detector concepts is crucial to develop the next generation of medical imaging systems. Proper modeling tools allow for optimal designs that maximize image quality while minimizing patient and occupational radiation doses. In this context, in silico imaging has become an emerging field of imaging research. This paper reviews current progress and challenges in the simulation of imaging systems with a focus on Monte Carlo approaches to X-ray detector modeling, acceleration approaches, and validation strategies.
Energy Technology Data Exchange (ETDEWEB)
Boudou, C
2006-09-15
High grade gliomas are extremely aggressive brain tumours. Specific techniques combining the presence of high atomic number elements within the tumour with irradiation by a low-energy (below 100 keV) x-ray beam from a synchrotron source have been proposed. With a view to clinical trials, the use of a treatment planning system has to be foreseen, as well as tailored dosimetry protocols. The objectives of this thesis work were (1) the development of a dose calculation tool based on a Monte Carlo particle transport code and (2) the implementation of an experimental method for the three-dimensional verification of the delivered dose. The dosimetric tool is an interface between tomographic images of the patient or sample and the MCNPX general-purpose code. In addition, dose distributions were measured with a radiosensitive polymer gel, providing acceptable agreement with the calculations.
Khrushcheva, O; Malerba, L; Becquart, C S; Domain, C; Hou, M
2003-01-01
Several variants are possible in the suite of programs forming multiscale predictive tools to estimate the yield strength increase caused by irradiation in RPV steels. For instance, at the atomic scale, both the Metropolis and the lattice kinetic Monte Carlo methods (MMC and LKMC, respectively) allow predicting copper precipitation under irradiation conditions. Since these methods are based on different physical models, the present contribution discusses their consistency on the basis of a realistic case study. Cascade debris in iron containing 0.2% copper was modelled by molecular dynamics with the DYMOKA code, which is part of the REVE suite. We use this debris as input for both the MMC and the LKMC simulations. Thermal motion and lattice relaxation can be avoided in the MMC, making the model closer to the LKMC (the LMMC method). The predictions and the complementarity of the three methods for modelling the same phenomenon are then discussed.
Monte-Carlo Application for Nondestructive Nuclear Waste Analysis
Carasco, C.; Engels, R.; Frank, M.; Furletov, S.; Furletova, J.; Genreith, C.; Havenith, A.; Kemmerling, G.; Kettler, J.; Krings, T.; Ma, J.-L.; Mauerhofer, E.; Neike, D.; Payan, E.; Perot, B.; Rossbach, M.; Schitthelm, O.; Schumann, M.; Vasquez, R.
2014-06-01
, neutron flux distribution. The validation of the measurement simulations with Monte Carlo transport codes for the design, optimization and data analysis of further P&DGNAA facilities is performed in collaboration with LMN CEA Cadarache. The performance of prompt gamma neutron activation analysis (PGNAA) for the nondestructive determination of actinides in small samples is investigated. The quantitative determination of actinides relies on the precise knowledge of partial neutron capture cross sections. To date, these cross sections are not accurate enough for analytical purposes. The goal of the TANDEM (Trans-uranium Actinides' Nuclear Data - Evaluation and Measurement) Collaboration is the evaluation of these cross sections. Cross sections are measured using prompt gamma activation analysis facilities in Budapest and Munich. Geant4 is used to optimally design the detection system with Compton suppression. Furthermore, for the evaluation of the cross sections it is essential to correct the results for the self-attenuation of the prompt gammas within the sample. In the framework of a cooperation, RWTH Aachen University, Forschungszentrum Jülich and Siemens AG will study the feasibility of a compact Neutron Imaging System for Radioactive waste Analysis (NISRA). The system is based on a 14 MeV neutron source and an advanced detector system (a-Si flat panel) linked to an exclusive converter/scintillator for fast neutrons. For shielding and radioprotection studies the codes MCNPX and Geant4 were used. The two codes were benchmarked in processing time and accuracy in the neutron and gamma fluxes. Also the detector response was simulated with Geant4 to optimize components of the system.
Papadimitroulas, Panagiotis; Loudos, George; Nikiforidis, George C; Kagadis, George C
2012-08-01
GATE is a Monte Carlo simulation toolkit based on the Geant4 package, widely used for many medical physics applications, including SPECT and PET image simulation and more recently CT image simulation and patient dosimetry. The purpose of the current study was to calculate dose point kernels (DPKs) using GATE, compare them against reference data, and finally produce a complete dataset of the total DPKs for the most commonly used radionuclides in nuclear medicine. Patient-specific absorbed dose calculations can be carried out using Monte Carlo simulations. The latest version of GATE extends its applications to Radiotherapy and Dosimetry. Comparison of the proposed method for the generation of DPKs was performed for (a) monoenergetic electron sources, with energies ranging from 10 keV to 10 MeV, (b) beta emitting isotopes, e.g., (177)Lu, (90)Y, and (32)P, and (c) gamma emitting isotopes, e.g., (111)In, (131)I, (125)I, and (99m)Tc. Point isotropic sources were simulated at the center of a sphere phantom, and the absorbed dose was stored in concentric spherical shells around the source. Evaluation was performed with already published studies for different Monte Carlo codes namely MCNP, EGS, FLUKA, ETRAN, GEPTS, and PENELOPE. A complete dataset of total DPKs was generated for water (equivalent to soft tissue), bone, and lung. This dataset takes into account all the major components of radiation interactions for the selected isotopes, including the absorbed dose from emitted electrons, photons, and all secondary particles generated from the electromagnetic interactions. GATE comparison provided reliable results in all cases (monoenergetic electrons, beta emitting isotopes, and photon emitting isotopes). The observed differences between GATE and other codes are less than 10% and comparable to the discrepancies observed among other packages. The produced DPKs are in very good agreement with the already published data, which allowed us to produce a unique DPKs dataset using
Fornaciari, G; Fontecchio, G; Ventura, L; Papola, F; Trombetta, I; Giuffra, V
2012-01-01
The paleopathological study of the skeletal remains belonging to Cardinal Carlo de' Medici (1595-1666), son of Ferdinando I (1549-1609) and Cristina of Lorena (1565-1637), has been presented previously. A diagnosis of Klippel-Feil syndrome, tuberculosis and a polyarthropathy, interpreted as rheumatoid arthritis, was suggested. A revision of this case based on the analysis of the historical documents and of some radiological images of Carlo's bones has been proposed recently; according to the authors, the Cardinal was affected by the 'Medici syndrome', a combined Psoriatic-DISH arthropathy. This revision offers us the opportunity to discuss this complex case, comparing different points of view, and to present the results of the molecular analyses carried out on Carlo's bone samples. We looked for the genetic risk factors of rheumatoid arthritis (RA) and psoriatic arthritis (PsA). We also searched for the primary candidate genes of RA and PsA, i.e. DR4 or DR1 and Cw6 or DR7 respectively, the latter also predisposing to psoriasis. An original molecular protocol was applied to obtain aDNA uncontaminated by exogenous sources and almost intact, starting from one of the Cardinal's rib pieces. The allele risk factors for both diseases were identified by PCR-SSP assay as the HLA genotyping methodology. Our data assigned Carlo the genotype DRB1*04/*11 for the HLA-DRB locus and Cw*04/*12 for the HLA-C locus. Since Carlo was infected by M. tuberculosis during infancy and was carrying the DR4 variant but not the Cw6, he surely had a predisposition to RA, not to PsA and/or psoriasis. The diagnosis of RA is thus confirmed.
REVIEW: Fifty years of Monte Carlo simulations for medical physics
Rogers, D. W. O.
2006-07-01
Monte Carlo techniques have become ubiquitous in medical physics over the last 50 years with a doubling of papers on the subject every 5 years between the first PMB paper in 1967 and 2000 when the numbers levelled off. While recognizing the many other roles that Monte Carlo techniques have played in medical physics, this review emphasizes techniques for electron-photon transport simulations. The broad range of codes available is mentioned but there is special emphasis on the EGS4/EGSnrc code system which the author has helped develop for 25 years. The importance of the 1987 Erice Summer School on Monte Carlo techniques is highlighted. As an illustrative example of the role Monte Carlo techniques have played, the history of the correction for wall attenuation and scatter in an ion chamber is presented as it demonstrates the interplay between a specific problem and the development of tools to solve the problem which in turn leads to applications in other areas. This paper is dedicated to W Ralph Nelson and to the memory of Martin J Berger, two men who have left indelible marks on the field of Monte Carlo simulation of electron-photon transport.
Numerical integration of detector response functions via Monte Carlo simulations
Kelly, K. J.; O'Donnell, J. M.; Gomez, J. A.; Taddeucci, T. N.; Devlin, M.; Haight, R. C.; White, M. C.; Mosby, S. M.; Neudecker, D.; Buckner, M. Q.; Wu, C. Y.; Lee, H. Y.
2017-09-01
Calculations of detector response functions are complicated because they include the intricacies of signal creation from the detector itself as well as a complex interplay between the detector, the particle-emitting target, and the entire experimental environment. As such, these functions are typically only accessible through time-consuming Monte Carlo simulations. Furthermore, the output of thousands of Monte Carlo simulations can be necessary in order to extract a physics result from a single experiment. Here we describe a method to obtain a full description of the detector response function using Monte Carlo simulations. We also show that a response function calculated in this way can be used to create Monte Carlo simulation output spectra a factor of ∼ 1000 × faster than running a new Monte Carlo simulation. A detailed discussion of the proper treatment of uncertainties when using this and other similar methods is provided as well. This method is demonstrated and tested using simulated data from the Chi-Nu experiment, which measures prompt fission neutron spectra at the Los Alamos Neutron Science Center.
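The speed-up described above amounts to reusing a pre-tabulated response: once Monte Carlo runs have filled a matrix mapping source-energy bins to pulse-height channels, producing a simulated spectrum for a new trial input is just a matrix-vector product. The sketch below shows that folding step; the array layout and the toy numbers are assumptions for illustration, not the Chi-Nu response itself.

```python
def fold_response(response, spectrum):
    """Fold an incident spectrum with a pre-tabulated detector response matrix.
    response[i][j]: probability that a particle emitted in source-energy bin j is
    recorded in pulse-height channel i (tabulated once from Monte Carlo runs).
    Reusing the matrix like this, instead of rerunning the transport simulation
    for every trial spectrum, is the kind of reuse that yields the large speed-up."""
    n_chan = len(response)
    n_bins = len(spectrum)
    out = [0.0] * n_chan
    for i in range(n_chan):
        row = response[i]
        out[i] = sum(row[j] * spectrum[j] for j in range(n_bins))
    return out

# Toy 3-channel, 2-bin response matrix and a trial source spectrum
R = [[0.6, 0.1],
     [0.3, 0.5],
     [0.1, 0.4]]
print(fold_response(R, [100.0, 50.0]))
```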
Monte Carlo studies of high-transverse-energy hadronic interactions
International Nuclear Information System (INIS)
Corcoran, M.D.
1985-01-01
A four-jet Monte Carlo calculation has been used to simulate hadron-hadron interactions which deposit high transverse energy into a large-solid-angle calorimeter and limited solid-angle regions of the calorimeter. The calculation uses first-order QCD cross sections to generate two scattered jets and also produces beam and target jets. Field-Feynman fragmentation has been used in the hadronization. The sensitivity of the results to a few features of the Monte Carlo program has been studied. The results are found to be very sensitive to the method used to ensure overall energy conservation after the fragmentation of the four jets is complete. Results are also sensitive to the minimum momentum transfer in the QCD subprocesses and to the distribution of p_T to the jet axis and the multiplicities in the fragmentation. With reasonable choices of these features of the Monte Carlo program, good agreement with data at Fermilab/CERN SPS energies is obtained, comparable to the agreement achieved with more sophisticated parton-shower models. With other choices, however, the calculation gives qualitatively different results which are in strong disagreement with the data. These results have important implications for extracting physics conclusions from Monte Carlo calculations. It is not possible to test the validity of a particular model or distinguish between different models unless the Monte Carlo results are unambiguous and different models exhibit clearly different behavior.
Monte Carlo capabilities of the SCALE code system
International Nuclear Information System (INIS)
Rearden, B.T.; Petrie, L.M.; Peplow, D.E.; Bekar, K.B.; Wiarda, D.; Celik, C.; Perfetti, C.M.; Ibrahim, A.M.; Hart, S.W.D.; Dunn, M.E.; Marshall, W.J.
2015-01-01
Highlights: • Foundational Monte Carlo capabilities of SCALE are described. • Improvements in continuous-energy treatments are detailed. • New methods for problem-dependent temperature corrections are described. • New methods for sensitivity analysis and depletion are described. • Nuclear data, users interfaces, and quality assurance activities are summarized. - Abstract: SCALE is a widely used suite of tools for nuclear systems modeling and simulation that provides comprehensive, verified and validated, user-friendly capabilities for criticality safety, reactor physics, radiation shielding, and sensitivity and uncertainty analysis. For more than 30 years, regulators, licensees, and research institutions around the world have used SCALE for nuclear safety analysis and design. SCALE provides a “plug-and-play” framework that includes three deterministic and three Monte Carlo radiation transport solvers that can be selected based on the desired solution, including hybrid deterministic/Monte Carlo simulations. SCALE includes the latest nuclear data libraries for continuous-energy and multigroup radiation transport as well as activation, depletion, and decay calculations. SCALE’s graphical user interfaces assist with accurate system modeling, visualization, and convenient access to desired results. SCALE 6.2 will provide several new capabilities and significant improvements in many existing features, especially with expanded continuous-energy Monte Carlo capabilities for criticality safety, shielding, depletion, and sensitivity and uncertainty analysis. An overview of the Monte Carlo capabilities of SCALE is provided here, with emphasis on new features for SCALE 6.2
Parallel MCNP Monte Carlo transport calculations with MPI
International Nuclear Information System (INIS)
Wagner, J.C.; Haghighat, A.
1996-01-01
The steady increase in computational performance has made Monte Carlo calculations for large/complex systems possible. However, in order to make these calculations practical, order-of-magnitude increases in performance are necessary. The Monte Carlo method is inherently parallel (particles are simulated independently) and thus has the potential for near-linear speedup with respect to the number of processors. Further, the ever-increasing accessibility of parallel computers, such as workstation clusters, facilitates the practical use of parallel Monte Carlo. Recognizing the nature of the Monte Carlo method and the trends in available computing, the code developers at Los Alamos National Laboratory implemented message passing in the general-purpose Monte Carlo radiation transport code MCNP (version 4A). The PVM package was chosen by the MCNP code developers because it supports a variety of communication networks, several UNIX platforms, and heterogeneous computer systems. This PVM version of MCNP has been shown to produce speedups that approach the number of processors and thus is a very useful tool for transport analysis. Due to software incompatibilities on the local IBM SP2, PVM has not been available, and thus it is not possible to take advantage of this useful tool. Hence, it became necessary to implement an alternative message-passing library package into MCNP. Because the message-passing interface (MPI) is supported on the local system, takes advantage of the high-speed communication switches in the SP2, and is considered to be the emerging standard, it was selected
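The pattern described above — independent batches of histories on each processor, followed by a reduction of the tallies — can be illustrated with a toy mpi4py script. This is only a sketch of the message-passing idea, not MCNP; the script name in the run command is hypothetical.

```python
# Illustrative sketch of the message-passing pattern described above,
# using mpi4py rather than MCNP itself: each rank simulates an independent
# batch of histories with its own seed, and the tallies are summed with a reduce.
# Run with e.g.:  mpiexec -n 4 python parallel_pi.py   (file name hypothetical)
import numpy as np
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

histories_per_rank = 1_000_000
rng = np.random.default_rng(seed=rank)          # independent stream per rank

# Toy "transport" problem: estimate pi by rejection sampling in the unit square.
x, y = rng.random(histories_per_rank), rng.random(histories_per_rank)
local_hits = np.count_nonzero(x * x + y * y < 1.0)

total_hits = comm.reduce(local_hits, op=MPI.SUM, root=0)
if rank == 0:
    n_total = histories_per_rank * size
    print("pi estimate:", 4.0 * total_hits / n_total)
```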
Introduction to Monte Carlo methods: sampling techniques and random numbers
International Nuclear Information System (INIS)
Bhati, Sharda; Patni, H.K.
2009-01-01
The Monte Carlo method describes a very broad area of science, in which many processes, physical systems and phenomena that are statistical in nature and are difficult to solve analytically are simulated by statistical methods employing random numbers. The general idea of Monte Carlo analysis is to create a model that is as similar as possible to the real physical system of interest, and to create interactions within that system based on known probabilities of occurrence, with random sampling of the probability density functions. As the number of individual events (called histories) is increased, the quality of the reported average behavior of the system improves, meaning that the statistical uncertainty decreases. Assuming that the behavior of the physical system can be described by probability density functions, the Monte Carlo simulation can proceed by sampling from these probability density functions, which necessitates a fast and effective way to generate random numbers uniformly distributed on the interval (0,1). Particles are generated within the source region and are transported by sampling from probability density functions through the scattering media until they are absorbed or escape the volume of interest. The outcomes of these random samplings, or trials, must be accumulated or tallied in an appropriate manner to produce the desired result, but the essential characteristic of Monte Carlo is the use of random sampling techniques to arrive at a solution of the physical problem. The major components of Monte Carlo methods for random sampling for a given event are described in the paper
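The sampling step described above can be made concrete with the exponential free-flight distribution used in particle transport. This is a generic sketch of inverse-transform sampling from a uniform variate on (0,1), not code from the paper.

```python
import numpy as np

rng = np.random.default_rng(42)

def sample_path_length(sigma_t, n):
    """Inverse-transform sampling of the free-flight distance.

    The pdf of the distance to the next collision is p(s) = sigma_t * exp(-sigma_t * s),
    whose CDF inverts to s = -ln(1 - u) / sigma_t with u uniform on (0, 1).
    """
    u = rng.random(n)
    return -np.log(1.0 - u) / sigma_t

samples = sample_path_length(sigma_t=0.5, n=1_000_000)
print("sample mean:", samples.mean(), "  expected mean:", 1.0 / 0.5)
```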
Study on random number generator in Monte Carlo code
International Nuclear Information System (INIS)
Oya, Kentaro; Kitada, Takanori; Tanaka, Shinichi
2011-01-01
A Monte Carlo code uses a sequence of pseudo-random numbers produced by a random number generator (RNG) to simulate particle histories. A pseudo-random sequence has a period that depends on the generation method, and this period should be long enough that it is not exhausted during a single Monte Carlo calculation; otherwise the correctness of the results, and in particular of their standard deviations, is compromised. The linear congruential generator (LCG) is widely used as a Monte Carlo RNG, but its period is not especially long given the growth in the number of simulated histories made possible by the remarkable enhancement of computer performance. Recently, many kinds of RNG have been developed, and some of their features are better than those of the LCG. In this study, we investigate appropriate RNGs for a Monte Carlo code as alternatives to the LCG, especially for the case of enormous numbers of histories. It is found that xorshift has desirable features compared with the LCG: a larger period, a comparable speed of random number generation, better randomness, and good applicability to parallel calculation. (author)
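For reference, minimal implementations of the two generator families compared in the study might look as follows. The parameters are standard textbook choices and are not necessarily those evaluated by the authors.

```python
# Minimal generators of the two kinds compared above (standard textbook
# parameters; not necessarily those evaluated in the paper).
MASK64 = (1 << 64) - 1

def lcg64(state):
    """64-bit linear congruential generator (Knuth's MMIX multiplier)."""
    while True:
        state = (6364136223846793005 * state + 1442695040888963407) & MASK64
        yield state / 2**64                      # uniform in [0, 1)

def xorshift64(state):
    """Marsaglia's xorshift64 (period 2**64 - 1 for any nonzero seed)."""
    while True:
        state ^= (state << 13) & MASK64
        state ^= state >> 7
        state ^= (state << 17) & MASK64
        yield state / 2**64

lcg, xs = lcg64(1), xorshift64(1)
print([next(lcg) for _ in range(3)])
print([next(xs) for _ in range(3)])
```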
Monte Carlo dose calculations in advanced radiotherapy
Bush, Karl Kenneth
The remarkable accuracy of Monte Carlo (MC) dose calculation algorithms has led to the widely accepted view that these methods should and will play a central role in the radiotherapy treatment verification and planning of the future. The advantages of using MC clinically are particularly evident for radiation fields passing through inhomogeneities, such as lung and air cavities, and for small fields, including those used in today's advanced intensity modulated radiotherapy techniques. Many investigators have reported significant dosimetric differences between MC and conventional dose calculations in such complex situations, and have demonstrated experimentally the unmatched ability of MC calculations in modeling charged particle disequilibrium. The advantages of using MC dose calculations do come at a cost. The nature of MC dose calculations requires a highly detailed, in-depth representation of the physical system (accelerator head geometry/composition, anatomical patient geometry/composition and particle interaction physics) to allow accurate modeling of external beam radiation therapy treatments. Performing such simulations is computationally demanding and has only recently become feasible within mainstream radiotherapy practices. In addition, the output of the accelerator head simulation can be highly sensitive to inaccuracies within a model that may not be known with sufficient detail. The goal of this dissertation is to both improve and advance the implementation of MC dose calculations in modern external beam radiotherapy. To begin, a novel method is proposed to fine-tune the output of an accelerator model to better represent the measured output. In this method an intensity distribution of the electron beam incident on the model is inferred by employing a simulated annealing algorithm. The method allows an investigation of arbitrary electron beam intensity distributions and is not restricted to the commonly assumed Gaussian intensity. In a second component of
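The fine-tuning step described above — inferring an incident intensity distribution by simulated annealing — can be sketched generically. The measured profile and forward model below are cheap placeholders standing in for the measured accelerator output and the accelerator-head Monte Carlo simulation; only the annealing loop itself is the point of the example.

```python
import numpy as np

rng = np.random.default_rng(7)

# Stand-in "measurement" and forward model: in the dissertation the forward model
# would be a Monte Carlo accelerator-head simulation; here it is a cheap placeholder.
x = np.linspace(-1.0, 1.0, 41)
measured_profile = np.exp(-x**2 / (2 * 0.3**2))

def forward_model(intensity):
    # Placeholder: a real implementation would fold the intensity through
    # pre-simulated dose kernels; here we just smooth it slightly.
    kernel = np.array([0.25, 0.5, 0.25])
    return np.convolve(intensity, kernel, mode="same")

def cost(intensity):
    return np.sum((forward_model(intensity) - measured_profile) ** 2)

# Simulated annealing over an arbitrary (not necessarily Gaussian) intensity map.
intensity = np.ones_like(x)
temperature = 1.0
for step in range(20000):
    trial = intensity.copy()
    i = rng.integers(intensity.size)
    trial[i] = max(0.0, trial[i] + rng.normal(scale=0.05))
    delta = cost(trial) - cost(intensity)
    if delta < 0 or rng.random() < np.exp(-delta / temperature):
        intensity = trial
    temperature *= 0.9997                       # geometric cooling schedule

print("final misfit:", cost(intensity))
```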
Directory of Open Access Journals (Sweden)
Joko Siswantoro
2014-11-01
Volume is one of the important issues in the production and processing of food products. Traditionally, volume can be measured by the water displacement method based on Archimedes' principle, but this method is inaccurate and is considered destructive. Computer vision offers an accurate and nondestructive method for measuring the volume of food products. This paper proposes an algorithm for volume measurement of irregularly shaped food products using computer vision based on the Monte Carlo method. Five images of the object were acquired from five different views and then processed to obtain the silhouettes of the object. From these silhouettes, the Monte Carlo method was used to approximate the volume of the object. The simulation results show that the algorithm achieves high accuracy and precision for volume measurement.
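A minimal sketch of the silhouette-based Monte Carlo volume estimate reads as follows. The five camera views are replaced here by analytic silhouette tests for a known cube so that the answer can be checked; the real algorithm would instead test each projected sample point against a segmented image from each view.

```python
import numpy as np

rng = np.random.default_rng(3)

# Sketch of the silhouette-based Monte Carlo volume estimate described above.
# The five camera views of the paper are replaced here by analytic silhouette
# tests of an axis-aligned 4x4x4 cm cube, so the exact volume (64 cm^3) is known;
# in the real algorithm each test would be a pixel lookup in one segmented image.
def inside_silhouette(p, view_axes, half_width=2.0):
    a, b = view_axes
    return abs(p[a]) <= half_width and abs(p[b]) <= half_width

views = [(0, 1), (0, 2), (1, 2), (0, 1), (1, 2)]      # projection planes of the 5 views

box_min, box_max = -3.0, 3.0                          # bounding box enclosing the object
box_volume = (box_max - box_min) ** 3

n = 200_000
points = rng.uniform(box_min, box_max, size=(n, 3))
hits = sum(all(inside_silhouette(p, v) for v in views) for p in points)

print("estimated volume:", box_volume * hits / n)     # converges to 64
```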
International Nuclear Information System (INIS)
Oliveira, Monica G. Nunes; Braz, Delson; Silva, Regina Cely B. da S.
2005-01-01
Computer simulation has been widely used in physics research in recent decades, owing both to the availability of codes and to the growth of computing power. The EGS4 Monte Carlo code is a simulation program used in the area of radiation transport. Simulators (surrogate tissues, or phantoms) are objects used for studies of dosimetric quantities and for quality testing of images; they have radiation scattering and absorption characteristics similar to those of the tissues that make up the body. The aim of this work is to model the effects of radiation interactions in real breast tissue, both healthy and diseased, and in simulators using the EGS4 Monte Carlo simulation code
Monte Carlo techniques for real-time quantum dynamics
International Nuclear Information System (INIS)
Dowling, Mark R.; Davis, Matthew J.; Drummond, Peter D.; Corney, Joel F.
2007-01-01
The stochastic-gauge representation is a method of mapping the equation of motion for the quantum mechanical density operator onto a set of equivalent stochastic differential equations. One of the stochastic variables is termed the 'weight', and its magnitude is related to the importance of the stochastic trajectory. We investigate the use of Monte Carlo algorithms to improve the sampling of the weighted trajectories and thus reduce sampling error in a simulation of quantum dynamics. The method can be applied to calculations in real time, as well as in imaginary time, for which Monte Carlo algorithms are more commonly used. The Monte Carlo algorithms are applicable when the weight is guaranteed to be real, and we demonstrate how to ensure this is the case. Examples are given for the anharmonic oscillator, where large improvements over stochastic sampling are observed
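One common way to improve the sampling of weighted trajectories is to resample (branch) them according to their weights, so that no small subset of trajectories dominates the ensemble. The sketch below shows such a generic population step under that assumption; it is not necessarily the specific gauge-dependent scheme used by the authors.

```python
import numpy as np

rng = np.random.default_rng(11)

def resample_by_weight(trajectories, weights):
    """Generic weight-based resampling (branching) of stochastic trajectories.

    Trajectories with large weights are duplicated, low-weight ones are dropped,
    and all weights are reset to the mean, which reduces the sampling error that
    a few dominant weights would otherwise cause.  This is a standard population
    Monte Carlo step, not necessarily the specific scheme used in the paper.
    """
    n = len(weights)
    probs = weights / weights.sum()
    idx = rng.choice(n, size=n, p=probs)
    return trajectories[idx], np.full(n, weights.mean())

# Toy ensemble: 1000 scalar trajectories whose weights have spread out.
trajs = rng.normal(size=1000)
weights = np.exp(rng.normal(scale=2.0, size=1000))
trajs, weights = resample_by_weight(trajs, weights)
```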
Monte Carlo simulation of neutron counters for safeguards applications
International Nuclear Information System (INIS)
Looman, Marc; Peerani, Paolo; Tagziria, Hamid
2009-01-01
MCNP-PTA is a new Monte Carlo code for the simulation of neutron counters for nuclear safeguards applications developed at the Joint Research Centre (JRC) in Ispra (Italy). After some preliminary considerations outlining the general aspects involved in the computational modelling of neutron counters, this paper describes the specific details and approximations which make up the basis of the model implemented in the code. One of the major improvements allowed by the use of Monte Carlo simulation is a considerable reduction in both the experimental work and the reference materials required for the calibration of the instruments. This new approach to the calibration of counters using Monte Carlo simulation techniques is also discussed.
Monte Carlo simulated dynamical magnetization of single-chain magnets
Energy Technology Data Exchange (ETDEWEB)
Li, Jun; Liu, Bang-Gui, E-mail: bgliu@iphy.ac.cn
2015-03-15
Here, a dynamical Monte Carlo (DMC) method is used to study the temperature-dependent dynamical magnetization of the well-known Mn2Ni system, a typical example of single-chain magnets with strong magnetic anisotropy. Simulated magnetization curves are in good agreement with experimental results at typical temperatures and sweeping rates, and simulated coercive fields as functions of temperature are also consistent with experimental curves. Further analysis indicates that the magnetization reversal is determined by both thermally activated effects and quantum spin tunneling. These results can help explore the basic properties and applications of such important magnetic systems. - Highlights: • Monte Carlo simulated magnetization curves are in good agreement with experimental results. • Simulated coercive fields as functions of temperature are consistent with experimental results. • The magnetization reversal is understood in terms of the Monte Carlo simulations.
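The qualitative behaviour described here — hysteresis loops whose opening depends on temperature and sweep rate — can be reproduced by a toy Metropolis sweep of a one-dimensional Ising chain in a swept field. This is only a qualitative sketch; it is neither the Mn2Ni Hamiltonian nor the authors' dynamical Monte Carlo scheme.

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy illustration of sweep-rate- and temperature-dependent hysteresis in a
# single spin chain, using Metropolis dynamics on a ferromagnetic Ising chain.
N, J, T = 200, 1.0, 0.5
spins = np.ones(N)

def sweep(spins, field):
    for _ in range(N):
        i = rng.integers(N)
        dE = 2.0 * spins[i] * (J * (spins[(i - 1) % N] + spins[(i + 1) % N]) + field)
        if dE <= 0 or rng.random() < np.exp(-dE / T):
            spins[i] = -spins[i]

magnetization = []
for field in np.concatenate([np.linspace(2, -2, 200), np.linspace(-2, 2, 200)]):
    sweep(spins, field)                       # one Metropolis sweep per field step
    magnetization.append(spins.mean())
# Plotting magnetization against the swept field shows an open hysteresis loop
# that narrows as T grows or as the sweep is made slower (more sweeps per step).
```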
Exploring cluster Monte Carlo updates with Boltzmann machines.
Wang, Lei
2017-11-01
Boltzmann machines are physics informed generative models with broad applications in machine learning. They model the probability distribution of an input data set with latent variables and generate new samples accordingly. Applying the Boltzmann machines back to physics, they are ideal recommender systems to accelerate the Monte Carlo simulation of physical systems due to their flexibility and effectiveness. More intriguingly, we show that the generative sampling of the Boltzmann machines can even give different cluster Monte Carlo algorithms. The latent representation of the Boltzmann machines can be designed to mediate complex interactions and identify clusters of the physical system. We demonstrate these findings with concrete examples of the classical Ising model with and without four-spin plaquette interactions. In the future, automatic searches in the algorithm space parametrized by Boltzmann machines may discover more innovative Monte Carlo updates.
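For context, the hand-crafted cluster move that such learned proposals generalize is the Wolff update for the classical Ising model. A reference sketch follows; this is the standard textbook algorithm, not the paper's Boltzmann-machine sampler.

```python
import numpy as np
from collections import deque

rng = np.random.default_rng(5)

# Reference implementation of the standard Wolff cluster update for the 2D Ising
# model -- the kind of hand-crafted cluster move that the Boltzmann-machine
# approach above learns to propose.  Not the paper's RBM sampler.
L, T = 32, 2.3
spins = rng.choice([-1, 1], size=(L, L))
p_add = 1.0 - np.exp(-2.0 / T)                # bond-activation probability (J = 1)

def wolff_update(spins):
    start = tuple(rng.integers(L, size=2))
    cluster_spin = spins[start]
    in_cluster = {start}
    queue = deque([start])
    while queue:
        i, j = queue.popleft()
        for ni, nj in (((i + 1) % L, j), ((i - 1) % L, j), (i, (j + 1) % L), (i, (j - 1) % L)):
            if (ni, nj) not in in_cluster and spins[ni, nj] == cluster_spin \
                    and rng.random() < p_add:
                in_cluster.add((ni, nj))
                queue.append((ni, nj))
    for site in in_cluster:                   # flip the whole cluster at once
        spins[site] = -spins[site]

for _ in range(100):
    wolff_update(spins)
print("magnetization per spin:", spins.mean())
```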
A Multivariate Time Series Method for Monte Carlo Reactor Analysis
International Nuclear Information System (INIS)
Taro Ueki
2008-01-01
A robust multivariate time series method has been established for the Monte Carlo calculation of neutron multiplication problems. The method, termed the Coarse Mesh Projection Method (CMPM), can be implemented using coarse statistical bins for acquisition of nuclear fission source data. A novel aspect of CMPM is the combination of the general technical principle of projection pursuit from the signal processing discipline with the neutron multiplication eigenvalue problem of the nuclear engineering discipline. CMPM enables reactor physicists to accurately evaluate major eigenvalue separations of nuclear reactors with continuous-energy Monte Carlo calculations. CMPM was incorporated in the MCNP Monte Carlo particle transport code of Los Alamos National Laboratory. The great advantage of CMPM over the traditional fission matrix method is demonstrated for the three-dimensional modeling of the initial core of a pressurized water reactor
Application of biasing techniques to the contributon Monte Carlo method
Energy Technology Data Exchange (ETDEWEB)
Dubi, A.; Gerstl, S.A.W.
1980-01-01
Recently, a new Monte Carlo method called the contributon Monte Carlo method was developed. The method is based on the theory of contributons and uses a new recipe for estimating target responses by a volume integral over the contributon current. The analog features of the new method were discussed in previous publications. The application of some biasing methods to the new contributon scheme is examined here. A theoretical model is developed that enables an analytic prediction of the benefit to be expected when these biasing schemes are applied to both the contributon method and regular Monte Carlo. This model is verified by a variety of numerical experiments and is shown to yield satisfactory results, especially for deep-penetration problems. Other considerations regarding the efficient use of the new method are also discussed, and remarks are made as to the application of other biasing methods. 14 figures, 1 table.
Minimum variance Monte Carlo importance sampling with parametric dependence
International Nuclear Information System (INIS)
Ragheb, M.M.H.; Halton, J.; Maynard, C.W.
1981-01-01
An approach to Monte Carlo importance sampling with parametric dependence is proposed. It depends upon obtaining, by proper weighting over a single stage, the overall functional dependence of the variance on the importance function parameter over a broad range of its values. Results corresponding to minimum variance are adopted and other results rejected. Numerical calculations for the estimation of integrals are compared to crude Monte Carlo. The results explain the occurrence of effective biases (even though the theoretical bias is zero) and infinite variances which arise in calculations involving severe biasing and a moderate number of histories. Extension to particle transport applications is briefly discussed. The approach constitutes an extension of a theory on the application of Monte Carlo for the calculation of functional dependences, introduced by Frolov and Chentsov, to biasing, or importance sampling, calculations; and it is a generalization which avoids nonconvergence to the optimal values in some cases of a multistage method for variance reduction introduced by Spanier. (orig.) [de]
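The idea of scanning a parametric family of importance functions and keeping the minimum-variance member can be illustrated on a one-dimensional integral. The exponential family and the target integral below are chosen only for the example and are not taken from the paper; note how the empirical variance blows up for strongly biased parameters, echoing the infinite-variance behaviour mentioned above.

```python
import numpy as np

rng = np.random.default_rng(8)

# Toy version of the parametric scan described above: estimate
#   I = \int_0^\infty x^2 exp(-x) dx = 2
# by importance sampling from the family q_lambda(x) = lambda * exp(-lambda * x),
# then keep the parameter with the smallest sample variance.  For lambda >= 2 the
# weight x^2 * exp(-x) / q_lambda(x) has infinite variance -- the severe-biasing
# situation the abstract warns about.
def is_estimate(lam, n=100_000):
    x = rng.exponential(1.0 / lam, size=n)
    weights = x**2 * np.exp(-x) / (lam * np.exp(-lam * x))
    return weights.mean(), weights.var(ddof=1) / n

for lam in (0.2, 1.0 / 3.0, 0.5, 1.0, 1.5, 1.9):
    mean, var = is_estimate(lam)
    print(f"lambda = {lam:4.2f}:  estimate = {mean:6.4f},  variance of estimate = {var:.2e}")
# The minimum-variance member of this family lies near lambda = 1/3; as lambda
# approaches 2 the empirical variance grows erratically even though each
# individual estimate remains unbiased.
```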
MORET: Version 4.B. A multigroup Monte Carlo criticality code
International Nuclear Information System (INIS)
Jacquet, Olivier; Miss, Joachim; Courtois, Gerard
2003-01-01
MORET 4 is a three-dimensional multigroup Monte Carlo code which calculates the effective multiplication factor (keff) of configurations of arbitrary complexity, as well as reaction rates in the different volumes of the geometry and the leakage out of the system. MORET 4 is the Monte Carlo code of the APOLLO2-MORET 4 standard route of CRISTAL, the French criticality package, and it is the most commonly used Monte Carlo code for French criticality calculations. During the last four years, the MORET 4 team has developed or improved the following major points: modernization of the geometry, implementation of perturbation algorithms, source distribution convergence, statistical detection of stationarity, unbiased variance estimation, and creation of pre-processing and post-processing tools. The purpose of this paper is not only to present the new features of MORET but also to detail clearly the physical models and the mathematical methods used in the code. (author)
Stabilization effect of fission source in coupled Monte Carlo simulations
Energy Technology Data Exchange (ETDEWEB)
Olsen, Borge; Dufek, Jan [Div. of Nuclear Reactor Technology, KTH Royal Institute of Technology, AlbaNova University Center, Stockholm (Sweden)
2017-08-15
A fission source can act as a stabilization element in coupled Monte Carlo simulations. We have observed this while studying numerical instabilities in nonlinear steady-state simulations performed by a Monte Carlo criticality solver that is coupled to a xenon feedback solver via fixed-point iteration. While fixed-point iteration is known to be numerically unstable for some problems, resulting in large spatial oscillations of the neutron flux distribution, we show that it is possible to stabilize it by reducing the number of Monte Carlo criticality cycles simulated within each iteration step. While global convergence is ensured, development of any possible numerical instability is prevented by not allowing the fission source to converge fully within a single iteration step, which is achieved by setting a small number of criticality cycles per iteration step. Moreover, under these conditions, the fission source may converge even faster than in criticality calculations with no feedback, as we demonstrate in our numerical test simulations.
Monte Carlo Simulation in Statistical Physics An Introduction
Binder, Kurt
2010-01-01
Monte Carlo Simulation in Statistical Physics deals with the computer simulation of many-body systems in condensed-matter physics and related fields of physics and chemistry, and beyond (traffic flows, stock market fluctuations, etc.). Using random numbers generated by a computer, probability distributions are calculated, allowing the estimation of the thermodynamic properties of various systems. This book describes the theoretical background to several variants of these Monte Carlo methods and gives a systematic presentation from which newcomers can learn to perform such simulations and to analyze their results. The fifth edition covers classical as well as quantum Monte Carlo methods. Furthermore, a new chapter on the sampling of free-energy landscapes has been added. To help students in their work a special web server has been installed to host programs and discussion groups (http://wwwcp.tphys.uni-heidelberg.de). Prof. Binder was awarded the Berni J. Alder CECAM Award for Computational Physics 2001 as well ...
Two proposed convergence criteria for Monte Carlo solutions
International Nuclear Information System (INIS)
Forster, R.A.; Pederson, S.P.; Booth, T.E.
1992-01-01
The central limit theorem (CLT) can be applied to a Monte Carlo solution if two requirements are satisfied: (1) The random variable has a finite mean and a finite variance; and (2) the number N of independent observations grows large. When these two conditions are satisfied, a confidence interval (CI) based on the normal distribution with a specified coverage probability can be formed. The first requirement is generally satisfied by the knowledge of the Monte Carlo tally being used. The Monte Carlo practitioner has a limited number of marginal methods to assess the fulfillment of the second requirement, such as statistical error reduction proportional to 1/√N with error magnitude guidelines. Two proposed methods are discussed in this paper to assist in deciding if N is large enough: estimating the relative variance of the variance (VOV) and examining the empirical history score probability density function (pdf)
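Both diagnostics can be computed directly from the history scores. The sketch below uses a commonly quoted estimator for the relative variance of the variance and compares a well-behaved tally with a heavy-tailed one; both sets of scores are synthetic, and the guideline thresholds in the comment are typical rules of thumb rather than values from this paper.

```python
import numpy as np

rng = np.random.default_rng(13)

def tally_statistics(scores):
    """Relative error and relative variance of the variance (VOV) of a tally.

    Uses a commonly quoted VOV estimator,
        VOV = sum((x - xbar)**4) / (sum((x - xbar)**2))**2 - 1/N;
    guideline values of roughly 0.1 for both quantities are often required
    before a confidence interval is trusted.
    """
    n = scores.size
    mean = scores.mean()
    centered = scores - mean
    rel_error = np.sqrt(centered.var(ddof=1) / n) / abs(mean)
    vov = np.sum(centered**4) / np.sum(centered**2) ** 2 - 1.0 / n
    return rel_error, vov

# A well-behaved tally (bounded scores) versus a heavy-tailed one, where a large
# VOV signals that N is not yet big enough for the central limit theorem to apply.
for name, scores in [("bounded", rng.random(10_000)),
                     ("heavy-tailed", rng.pareto(1.5, 10_000))]:
    r, vov = tally_statistics(scores)
    print(f"{name:12s}  relative error = {r:.4f}   VOV = {vov:.4f}")
```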
Applicability of quasi-Monte Carlo for lattice systems
Energy Technology Data Exchange (ETDEWEB)
Ammon, Andreas [Berlin Humboldt-Univ. (Germany). Dept. of Physics; Deutsches Elektronen-Synchrotron (DESY), Zeuthen (Germany). John von Neumann-Inst. fuer Computing NIC; Hartung, Tobias [King's College London (United Kingdom). Dept. of Mathematics; Jansen, Karl [Deutsches Elektronen-Synchrotron (DESY), Zeuthen (Germany). John von Neumann-Inst. fuer Computing NIC; Leovey, Hernan; Griewank, Andreas [Berlin Humboldt-Univ. (Germany). Dept. of Mathematics; Mueller-Preussker, Michael [Berlin Humboldt-Univ. (Germany). Dept. of Physics
2013-11-15
This project investigates the applicability of quasi-Monte Carlo methods to Euclidean lattice systems in order to improve the asymptotic error scaling of observables for such theories. The error of an observable calculated by averaging over random observations generated from ordinary Monte Carlo simulations scales like N^(-1/2), where N is the number of observations. By means of quasi-Monte Carlo methods it is possible to improve this scaling for certain problems to N^(-1), or even further if the problems are regular enough. We adapted and applied this approach to simple systems like the quantum harmonic and anharmonic oscillator and verified an improved error scaling of all investigated observables in both cases.
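The difference in error scaling can be demonstrated with a hand-rolled van der Corput sequence on a smooth one-dimensional integrand. This toy comparison is not the lattice application of the paper, but it shows the N^(-1/2) versus N^(-1) behaviour referred to above.

```python
import numpy as np

rng = np.random.default_rng(21)

def van_der_corput(n, base=2):
    """First n points of the base-b van der Corput low-discrepancy sequence."""
    points = np.zeros(n)
    for i in range(n):
        k, f, x = i + 1, 1.0 / base, 0.0
        while k > 0:
            x += f * (k % base)
            k //= base
            f /= base
        points[i] = x
    return points

# Smooth 1D test integrand: integral of exp(x) over [0, 1] equals e - 1.
exact = np.e - 1.0
for n in (1_000, 10_000, 100_000):
    mc_err = abs(np.exp(rng.random(n)).mean() - exact)
    qmc_err = abs(np.exp(van_der_corput(n)).mean() - exact)
    print(f"N = {n:6d}   plain MC error = {mc_err:.2e}   quasi-MC error = {qmc_err:.2e}")
# The pseudo-random error falls off roughly like N^(-1/2), while the
# low-discrepancy points achieve close to N^(-1) for this regular integrand.
```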